Deterministic modelling and stochastic simulation of biochemical pathways using MATLAB.
Ullah, M; Schmidt, H; Cho, K H; Wolkenhauer, O
2006-03-01
The analysis of complex biochemical networks is conducted in two popular conceptual frameworks for modelling. The deterministic approach requires the solution of ordinary differential equations (ODEs, reaction rate equations) with concentrations as continuous state variables. The stochastic approach involves the simulation of differential-difference equations (chemical master equations, CMEs) with probabilities as variables, generating counts of molecules for chemical species as realisations of random variables drawn from the probability distribution described by the CMEs. Although there are numerous tools available, many of them free, the modelling and simulation environment MATLAB is widely used in the physical and engineering sciences. We describe a collection of MATLAB functions to construct and solve ODEs for deterministic simulation and to implement realisations of CMEs for stochastic simulation using advanced MATLAB coding (Release 14). The program was successfully applied to pathway models from the literature for both cases. The results were compared to implementations using alternative tools for dynamic modelling and simulation of biochemical networks. The aim is to provide a concise set of MATLAB functions that encourages experimentation with systems biology models. All the script files are available from www.sbi.uni-rostock.de/publications_matlab-paper.html.
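The stochastic route described above is typically realised with Gillespie's stochastic simulation algorithm, which draws exact sample paths consistent with the CME. As a minimal illustration (sketched in Python rather than MATLAB, and not taken from the authors' code), consider a single irreversible conversion A → B with rate constant k:

```python
import math
import random

def gillespie_decay(n_a, k, t_end, seed=1):
    """Exact SSA sample path for A -> B with propensity a = k * n_A."""
    rng = random.Random(seed)
    t, times, counts = 0.0, [0.0], [n_a]
    while n_a > 0:
        a = k * n_a                             # propensity of the sole reaction
        t += -math.log(1.0 - rng.random()) / a  # exponential waiting time
        if t > t_end:
            break
        n_a -= 1                                # one A molecule becomes B
        times.append(t)
        counts.append(n_a)
    return times, counts

times, counts = gillespie_decay(n_a=100, k=0.5, t_end=20.0)
```

Averaging many such realisations recovers the mean behaviour described by the corresponding deterministic ODE, dA/dt = -kA.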
NASA Astrophysics Data System (ADS)
Asinari, Pietro
2010-10-01
The homogeneous isotropic Boltzmann equation (HIBE) is a fundamental dynamic model for many applications in thermodynamics, econophysics and sociodynamics. Despite recent hardware improvements, the solution of the Boltzmann equation remains extremely challenging from the computational point of view, in particular by deterministic methods (free of stochastic noise). This work aims to improve a deterministic direct method recently proposed [V.V. Aristov, Kluwer Academic Publishers, 2001] for solving the HIBE with a generic collisional kernel and, in particular, for taking care of the late dynamics of the relaxation towards equilibrium. Essentially, (a) the original problem is reformulated in terms of particle kinetic energy (exact particle number and energy conservation during microscopic collisions) and (b) the computation of the relaxation rates is improved by a DVM-like correction, where DVM stands for Discrete Velocity Model (ensuring that the macroscopic conservation laws are exactly satisfied). Both corrections make it possible to derive very accurate reference solutions for this test case. Moreover, this work aims to distribute an open-source program (called HOMISBOLTZ), which can be redistributed and/or modified for dealing with different applications, under the terms of the GNU General Public License. The program has been purposely designed to be minimal, not only with regard to the reduced number of lines (less than 1000), but also with regard to the coding style (as simple as possible). Program summary. Program title: HOMISBOLTZ. Catalogue identifier: AEGN_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGN_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License. No. of lines in distributed program, including test data, etc.: 23 340. No. 
of bytes in distributed program, including test data, etc.: 7 635 236. Distribution format: tar.gz. Programming language: tested with Matlab version ⩽ 6.5; however, in principle, any recent version of Matlab or Octave should work. Computer: all supporting Matlab or Octave. Operating system: all supporting Matlab or Octave. RAM: 300 MBytes. Classification: 23. Nature of problem: integrating the homogeneous Boltzmann equation for a generic collisional kernel in the case of isotropic symmetry, by a deterministic direct method. Difficulties arise from the multi-dimensionality of the collisional operator and from satisfying the conservation of particle number and energy (momentum is trivial for this test case) as accurately as possible, in order to preserve the late dynamics. Solution method: the solution is based on the method proposed by Aristov (2001) [1], but with two substantial improvements: (a) the original problem is reformulated in terms of particle kinetic energy (this allows one to ensure exact particle number and energy conservation during microscopic collisions) and (b) a DVM-like correction (where DVM stands for Discrete Velocity Model) is adopted for improving the relaxation rates (this allows one to satisfy the conservation laws exactly at the macroscopic level, which is particularly important for describing the late dynamics of the relaxation towards equilibrium). Both corrections make it possible to derive very accurate reference solutions for this test case. Restrictions: the nonlinear Boltzmann equation is extremely challenging from the computational point of view, in particular for deterministic methods, despite the increased computational power of recent hardware. In this work, only the homogeneous isotropic case is considered, to make possible the development of a minimal program (in a simple scripting language) and to allow the user to check the advantages of the proposed improvements beyond Aristov's (2001) method [1]. 
The initial conditions are assumed to be parameterized according to a fixed analytical expression, but this can easily be modified. Running time: from minutes to hours (depending on the adopted discretization of the kinetic energy space). For example, on a 64-bit workstation with an Intel Core™ i7-820Q quad-core CPU at 1.73 GHz and 8 MBytes of RAM, the provided test run (with the corresponding binary data file storing the pre-computed relaxation rates) requires 154 seconds. References: V.V. Aristov, Direct Methods for Solving the Boltzmann Equation and Study of Nonequilibrium Flows, Kluwer Academic Publishers, 2001.
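The effect of the conservation-enforcing correction can be illustrated apart from the full collision operator. The sketch below (Python, with a hypothetical energy discretization; not the HOMISBOLTZ code) applies a crude non-conservative update to a discrete kinetic-energy distribution, then restores particle number and energy exactly with a two-multiplier linear correction, in the spirit of the DVM-like step:

```python
def correct_moments(f, e, n0, E0):
    """Rescale f_i -> f_i*(a + b*e_i) so that number and energy are exact.

    Solves the 2x2 linear system  a*S0 + b*S1 = n0,  a*S1 + b*S2 = E0.
    """
    S0 = sum(f)
    S1 = sum(fi * ei for fi, ei in zip(f, e))
    S2 = sum(fi * ei * ei for fi, ei in zip(f, e))
    det = S0 * S2 - S1 * S1          # > 0 by Cauchy-Schwarz for f > 0
    a = (n0 * S2 - E0 * S1) / det
    b = (E0 * S0 - n0 * S1) / det
    return [fi * (a + b * ei) for fi, ei in zip(f, e)]

e = [0.5 + i for i in range(8)]      # discrete kinetic-energy bins (invented)
f = [1.0, 2.0, 3.0, 2.5, 1.5, 0.8, 0.4, 0.2]
n0 = sum(f)                                           # exact particle number
E0 = sum(fi * ei for fi, ei in zip(f, e))             # exact energy

# a crude "collision" step that does NOT conserve the moments exactly
f_star = [0.95 * fi + 0.01 for fi in f]
f_new = correct_moments(f_star, e, n0, E0)
```

After the correction, the macroscopic invariants hold to machine precision, which is exactly what preserves the late relaxation dynamics.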
The Deterministic Mine Burial Prediction System
2009-01-12
…or below the water-line, initial linear and angular velocities, and fall angle relative to the mine's axis of symmetry. Other input data needed… c. Run_DMBP.m: start-up MATLAB script for the program. 2. C:\DMBP\DMBP_src: this directory contains source code, geotechnical databases, and… approved for public release). b. \Impact_35: the IMPACT35 model. c. \MakeTPARfiles: scripts for creating wave height and wave period input data from…
Deterministic Intracellular Modeling
2003-03-01
eukaryotes encompass all plants, animals, fungi and protists [6:71]. Structures in this class are more defined. For example, cells in this class possess a… affect cells. 5.3 Recommendations: further research into the construction and evaluation of intracellular models would benefit Air Force toxicology studies… manual220/indexE.html. 16. MathWorks, "The Benefits of MATLAB." Internet, 2003. http://www.mathworks.com/products/matlab/description1.jsp. 17. Mendes…
Deterministic seismic hazard macrozonation of India
NASA Astrophysics Data System (ADS)
Kolathayar, Sreevalsa; Sitharam, T. G.; Vipin, K. S.
2012-10-01
Earthquakes are known to have occurred in the Indian subcontinent from ancient times. This paper presents the results of seismic hazard analysis of India (6°-38°N and 68°-98°E) based on the deterministic approach using the latest seismicity data (up to 2010). The hazard analysis was done using two different source models (linear sources and point sources) and 12 well-recognized attenuation relations covering the varied tectonic provinces of the region. The earthquake data obtained from different sources were homogenized and declustered, and a total of 27,146 earthquakes of moment magnitude 4 and above were listed in the study area. The seismotectonic map of the study area was prepared by considering the faults, lineaments and shear zones associated with earthquakes of magnitude 4 and above. A new program was developed in MATLAB for smoothing of the point sources. For assessing the seismic hazard, the study area was divided into small grids of size 0.1° × 0.1° (approximately 10 × 10 km), and the hazard parameters were calculated at the center of each grid cell by considering all the seismic sources within a radius of 300 to 400 km. Rock-level peak horizontal acceleration (PHA) and spectral accelerations for periods of 0.1 and 1 s have been calculated for all the grid points with a deterministic approach using a code written in MATLAB. Epistemic uncertainty in the hazard definition has been tackled within a logic-tree framework considering two types of sources and three attenuation models for each grid point. Hazard evaluation without the logic-tree approach has also been carried out for comparison of the results. Contour maps showing the spatial variation of hazard values are presented in the paper.
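The per-grid-point computation reduces to taking, over all sources within range, the maximum ground motion predicted by a (logic-tree-weighted) set of attenuation relations. A schematic Python version with invented attenuation coefficients and an invented source list (the actual study uses 12 published relations and real source geometry):

```python
import math

def atten_a(m, r):
    """Hypothetical attenuation relation 1: PHA as a function of Mw, distance."""
    return math.exp(-1.0 + 0.9 * m - 1.2 * math.log(r + 10.0))

def atten_b(m, r):
    """Hypothetical attenuation relation 2."""
    return math.exp(-0.8 + 0.85 * m - 1.1 * math.log(r + 15.0))

# invented scenario sources near one grid point: (moment magnitude, distance km)
sources = [(6.5, 40.0), (7.2, 120.0), (5.8, 25.0)]

def deterministic_pha(sources, gmpes, weights):
    """Worst case over sources; logic-tree weighted mean over attenuation branches."""
    pha = 0.0
    for m, r in sources:
        branch = sum(w * g(m, r) for g, w in zip(gmpes, weights))
        pha = max(pha, branch)
    return pha

pha = deterministic_pha(sources, [atten_a, atten_b], [0.5, 0.5])
```

In the full analysis this evaluation is repeated for every 0.1° × 0.1° cell to produce the contour maps.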
Dynamical modeling and multi-experiment fitting with PottersWheel
Maiwald, Thomas; Timmer, Jens
2008-01-01
Motivation: Modelers in systems biology need a flexible framework that allows them to easily create new dynamic models, investigate their properties and fit several experimental datasets simultaneously. Multi-experiment fitting is a powerful approach to estimate parameter values, to check the validity of a given model, and to discriminate competing model hypotheses. It requires high-performance integration of ordinary differential equations and robust optimization. Results: We here present the comprehensive modeling framework PottersWheel (PW), including novel functionalities to satisfy these requirements, with strong emphasis on the inverse problem, i.e. data-based modeling of partially observed and noisy systems like signal transduction pathways and metabolic networks. PW is designed as a MATLAB toolbox and includes numerous user interfaces. Deterministic and stochastic optimization routines are combined by fitting in logarithmic parameter space, allowing for robust parameter calibration. Model investigation includes statistical tests for model-data compliance, model discrimination, identifiability analysis and calculation of Hessian- and Monte-Carlo-based parameter confidence limits. A rich application programming interface is available for customization within the user's own MATLAB code. Within an extensive performance analysis, we identified and significantly improved an integrator-optimizer pair which decreases the fitting duration for a realistic benchmark model by a factor of over 3000 compared to MATLAB with the optimization toolbox. Availability: PottersWheel is freely available for academic usage at http://www.PottersWheel.de/. The website contains a detailed documentation and introductory videos. The program has been intensively used since 2005 on Windows, Linux and Macintosh computers and does not require special MATLAB toolboxes. Contact: maiwald@fdm.uni-freiburg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18614583
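Fitting in logarithmic parameter space, as PottersWheel does, means the optimizer searches over log k rather than k, which equalizes scales across parameters spanning orders of magnitude. A toy multi-experiment fit, sketched in pure Python with an invented one-parameter decay model (PottersWheel's own integrators and optimizers are far more sophisticated):

```python
import math

def model(k, t):
    """Invented one-parameter decay model shared by all experiments."""
    return math.exp(-k * t)

K_TRUE = 0.3
times_sets = [[0.0, 1.0, 2.0, 4.0, 8.0], [0.5, 1.5, 3.0, 6.0]]
# two noise-free "datasets" generated with the same rate constant
experiments = [(ts, [model(K_TRUE, t) for t in ts]) for ts in times_sets]

def cost(log_k):
    """Multi-experiment chi^2; the parameter is searched in log10 space."""
    k = 10.0 ** log_k
    return sum((model(k, t) - y) ** 2
               for ts, ys in experiments for t, y in zip(ts, ys))

# crude deterministic refinement of a bracket in log10(k)
lo, hi = -3.0, 1.0
for _ in range(40):
    grid = [lo + i * (hi - lo) / 20.0 for i in range(21)]
    best = min(grid, key=cost)
    step = (hi - lo) / 20.0
    lo, hi = best - step, best + step

k_fit = 10.0 ** best
```

Because both datasets constrain the same parameter, the joint chi-square has a single sharp minimum at the shared rate constant.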
NASA Astrophysics Data System (ADS)
Ivanova, Violeta M.; Sousa, Rita; Murrihy, Brian; Einstein, Herbert H.
2014-06-01
This paper presents results from research conducted at MIT during 2010-2012 on modeling of natural rock fracture systems with the GEOFRAC three-dimensional stochastic model. Following a background summary of discrete fracture network models and a brief introduction of GEOFRAC, the paper provides a thorough description of the newly developed mathematical and computer algorithms for fracture intensity, aperture, and intersection representation, which have been implemented in MATLAB. The new methods optimize, in particular, the representation of fracture intensity in terms of cumulative fracture area per unit volume, P32, via the Poisson-Voronoi Tessellation of planes into polygonal fracture shapes. In addition, fracture apertures now can be represented probabilistically or deterministically whereas the newly implemented intersection algorithms allow for computing discrete pathways of interconnected fractures. In conclusion, results from a statistical parametric study, which was conducted with the enhanced GEOFRAC model and the new MATLAB-based Monte Carlo simulation program FRACSIM, demonstrate how fracture intensity, size, and orientations influence fracture connectivity.
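Fracture intensity P32 is simply cumulative fracture area per unit volume, so once the tessellation has produced polygonal fractures the measure is a sum of polygon areas divided by the region volume. A small Python sketch with made-up fractures, using the shoelace formula on each polygon's in-plane coordinates (a simplification of GEOFRAC's 3D representation):

```python
def polygon_area(pts):
    """Shoelace area of a planar polygon given in-plane (u, v) coordinates."""
    n = len(pts)
    s = 0.0
    for i in range(n):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# hypothetical fractures, each given in its own fracture plane (units: m)
fractures = [
    [(0, 0), (2, 0), (2, 1), (0, 1)],   # 2 m^2 rectangle
    [(0, 0), (3, 0), (0, 4)],           # 6 m^2 right triangle
]
volume = 10.0 * 10.0 * 10.0             # 1000 m^3 sample region

p32 = sum(polygon_area(f) for f in fractures) / volume   # m^2 / m^3
```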
MatLab Script and Functional Programming
NASA Technical Reports Server (NTRS)
Shaykhian, Gholam Ali
2007-01-01
MatLab Script and Functional Programming: MatLab is one of the most widely used very high-level programming languages for scientific and engineering computations. It is very user-friendly and requires practically no formal programming knowledge. Presented here are MatLab programming aspects, not just MatLab commands, for scientists and engineers who do not have formal programming training and have no significant time to spare for learning programming to solve their real-world problems. Specifically provided are programs for visualization. The MatLab seminar covers the functional and script programming aspects of the MatLab language. Specific expectations are: a) recognize MatLab commands, scripts and functions; b) create and run a MatLab function; c) read, recognize, and describe MatLab syntax; d) recognize decisions, loops and matrix operators; e) evaluate scope among multiple files, and multiple functions within a file; f) declare, define and use scalar variables, vectors and matrices.
GPELab, a Matlab toolbox to solve Gross-Pitaevskii equations II: Dynamics and stochastic simulations
NASA Astrophysics Data System (ADS)
Antoine, Xavier; Duboscq, Romain
2015-08-01
GPELab is a free Matlab toolbox for modeling and numerically solving large classes of systems of Gross-Pitaevskii equations that arise in the physics of Bose-Einstein condensates. The aim of this second paper, which follows (Antoine and Duboscq, 2014), is to first present the various pseudospectral schemes available in GPELab for computing the deterministic and stochastic nonlinear dynamics of Gross-Pitaevskii equations (Antoine, et al., 2013). Next, the corresponding GPELab functions are explained in detail. Finally, some numerical examples are provided to show how the code works for the complex dynamics of BEC problems.
MatLab Programming for Engineers Having No Formal Programming Knowledge
NASA Technical Reports Server (NTRS)
Shaykhian, Linda H.; Shaykhian, Gholam Ali
2007-01-01
MatLab is one of the most widely used very high-level programming languages for scientific and engineering computations. It is very user-friendly and requires practically no formal programming knowledge. Presented here are MatLab programming aspects, not just MatLab commands, for scientists and engineers who do not have formal programming training and have no significant time to spare for learning programming to solve their real-world problems. Specifically provided are programs for visualization. Also stated are the current limitations of MatLab, which possibly can be addressed by MathWorks Inc. in a future version to make MatLab more versatile.
Atmospheric Downscaling using Genetic Programming
NASA Astrophysics Data System (ADS)
Zerenner, Tanja; Venema, Victor; Simmer, Clemens
2013-04-01
Coupling models for the different components of the Soil-Vegetation-Atmosphere system requires up- and downscaling procedures. The subject of our work is the downscaling scheme used to derive high-resolution forcing data for land-surface and subsurface models from coarser atmospheric model output. The current downscaling scheme [Schomburg et al. 2010, 2012] combines a bi-quadratic spline interpolation, deterministic rules and autoregressive noise. For the development of the scheme, training and validation data sets were created by carrying out high-resolution runs of the atmospheric model. The deterministic rules in this scheme are partly based on known physical relations and partly determined by an automated search for linear relationships between the high-resolution fields of the atmospheric model output and high-resolution data on surface characteristics. Up to now, deterministic rules are available for downscaling surface pressure and partially, depending on the prevailing weather conditions, for near-surface temperature and radiation. The aim of our work is to improve those rules and to find deterministic rules for the remaining variables which require downscaling, e.g. precipitation or near-surface specific humidity. To accomplish that, we broaden the search by allowing for interdependencies between different atmospheric parameters, non-linear relations, and non-local and time-lagged relations. To cope with the vast number of possible solutions, we use genetic programming, a method from machine learning based on the principles of natural evolution. We are currently working with GPLAB, a genetic programming toolbox for Matlab. At first we tested the GP system's ability to retrieve the known physical rule for downscaling surface pressure, i.e. the hydrostatic equation, from our training data, and found this to be a simple task for the GP system. 
Furthermore, we have improved the accuracy and efficiency of the GP solution by implementing constant variation and optimization as genetic operators. Next we worked on an improvement of the downscaling rule for the two-meter temperature. We added an if-function with four input arguments to the function set. Since this has been shown to increase bloat, we additionally modified our fitness function by including penalty terms for both the size of the solutions and the number of intron nodes, i.e. program parts that are never evaluated. Starting from the known downscaling rule for the two-meter temperature, which linearly exploits the orography anomalies, allowed or disallowed by a certain temperature gradient, our GP system has been able to find an improvement. The rule produced by the GP clearly shows better performance concerning the reproduced small-scale variability.
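The hydrostatic rule recovered by the GP system for surface pressure is, in essence, the barometric relation p(z) = p0 · exp(-g(z - z0)/(R·T)). A compact Python sketch with invented coarse-cell values (the constants are standard; all numbers are illustrative only):

```python
import math

G = 9.81        # gravitational acceleration, m s^-2
R_DRY = 287.05  # specific gas constant of dry air, J kg^-1 K^-1

def downscale_pressure(p_coarse, z_coarse, z_fine, t_fine):
    """Hydrostatic adjustment of surface pressure to fine-scale orography.

    p(z) = p0 * exp(-g * (z - z0) / (R * T)), temperatures in kelvin.
    """
    return [p_coarse * math.exp(-G * (z - z_coarse) / (R_DRY * t))
            for z, t in zip(z_fine, t_fine)]

# one coarse cell at mean height 500 m, four fine cells with orography anomalies
p_fine = downscale_pressure(95000.0, 500.0,
                            [400.0, 480.0, 520.0, 600.0],
                            [284.0, 283.5, 283.2, 282.5])
```

Fine cells below the coarse mean height receive higher pressure, cells above receive lower pressure, exactly the small-scale signal the downscaling rule reintroduces.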
Sobie, Eric A
2011-09-13
This two-part lecture introduces students to the scientific computing language MATLAB. Prior computer programming experience is not required. The lectures present basic concepts of computer programming logic that tend to cause difficulties for beginners in addition to concepts that relate specifically to the MATLAB language syntax. The lectures begin with a discussion of vectors, matrices, and arrays. Because many types of biological data, such as fluorescence images and DNA microarrays, are stored as two-dimensional objects, processing these data is a form of array manipulation, and MATLAB is especially adept at handling such array objects. The students are introduced to basic commands in MATLAB, as well as built-in functions that provide useful shortcuts. The second lecture focuses on the differences between MATLAB scripts and MATLAB functions and describes when one method of programming organization might be preferable to the other. The principles are illustrated through the analysis of experimental data, specifically measurements of intracellular calcium concentration in live cells obtained using confocal microscopy.
Sobie, Eric A.
2014-01-01
This two-part lecture introduces students to the scientific computing language MATLAB. Prior computer programming experience is not required. The lectures present basic concepts of computer programming logic that tend to cause difficulties for beginners in addition to concepts that relate specifically to the MATLAB language syntax. The lectures begin with a discussion of vectors, matrices, and arrays. Because many types of biological data, such as fluorescence images and DNA microarrays, are stored as two-dimensional objects, processing these data is a form of array manipulation, and MATLAB is especially adept at handling such array objects. The students are introduced to basic commands in MATLAB, as well as built-in functions that provide useful shortcuts. The second lecture focuses on the differences between MATLAB scripts and MATLAB functions and describes when one method of programming organization might be preferable to the other. The principles are illustrated through the analysis of experimental data, specifically measurements of intracellular calcium concentration in live cells obtained using confocal microscopy. PMID:21934110
A Summary of the Naval Postgraduate School Research Program and Recent Publications
1990-09-01
…principles to divide the spectrum of a wide-band spread-spectrum signal into sub-… MATLAB computer program on a 386-type computer. Because of the high rf… original signal. Effects due the fiber optic pickup array… in time and a large data sample was required. An extended version of MATLAB that allows… application, such as orbital mechanics and weather prediction. Professor Gragg has also developed numerous MATLAB programs for linear programming problems
NASA Technical Reports Server (NTRS)
Sen, Syamal K.; Shaykhian, Gholam Ali
2011-01-01
MatLab™ (MATrix LABoratory) is a numerical computation and simulation tool that is used by thousands of scientists and engineers in many countries. MatLab does purely numerical calculations and can be used as a glorified calculator or an interpreted programming language; its real strength is in matrix manipulations. Computer algebra functionalities are achieved within the MatLab environment using the "symbolic" toolbox. This feature is similar to computer algebra programs, such as Maple or Mathematica, which calculate with mathematical equations using symbolic operations. In its interpreted-language form (command interface), MatLab is similar to well-known programming languages such as C/C++, and supports data structures, cell arrays and class definitions for object-oriented programming. As such, MatLab is equipped with most of the essential constructs of a higher-level programming language. MatLab is packaged with an editor and debugging functionality useful for analysing large MatLab programs and finding errors. We believe there are many ways to approach real-world problems; prescribed methods that ensure the foregoing solutions are incorporated into the design and analysis of data processing and visualization can help engineers and scientists gain wider insight into the actual implementation of their prospective experiments. This presentation will focus on the data processing and visualization aspects of engineering and scientific applications. Specifically, it will discuss methods and techniques to perform intermediate-level data processing covering engineering and scientific problems. MatLab programming techniques will be discussed, including reading various data file formats to produce customized publication-quality graphics, importing engineering and/or scientific data, organizing data in tabular format, exporting data to be used by other software programs such as Microsoft Excel, and data presentation and visualization.
Fault Diagnosis System of Wind Turbine Generator Based on Petri Net
NASA Astrophysics Data System (ADS)
Zhang, Han
Petri nets are an important tool for modeling and analysing discrete-event dynamic systems, with a strong ability to handle concurrent and non-deterministic phenomena. Petri nets used in wind turbine fault diagnosis have so far not been deployed in actual systems. This article combines existing fuzzy Petri net algorithms to build a wind turbine control system simulation based on a Siemens S7-1200 PLC, with a MATLAB GUI interface to ease migration of the system to different platforms.
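A fuzzy Petri net reasons with truth degrees on places and certainty factors on transitions; a common textbook firing rule (not necessarily the exact algorithm used in this system) fires a transition when the minimum of its input degrees exceeds a threshold, propagating that degree scaled by the certainty factor. A hypothetical Python fragment for one diagnosis rule:

```python
def fire(marking, transitions):
    """One round of fuzzy Petri net inference.

    marking: dict place -> truth degree in [0, 1]
    transitions: list of (input_places, output_place, threshold, certainty)
    """
    new = dict(marking)
    for inputs, output, thresh, cf in transitions:
        degree = min(marking.get(p, 0.0) for p in inputs)
        if degree >= thresh:
            # rule fires: propagate the degree scaled by the certainty factor
            new[output] = max(new.get(output, 0.0), degree * cf)
    return new

# invented diagnosis fragment: high vibration AND high bearing temperature
# imply a bearing fault, with certainty factor 0.9
marking = {"vibration_high": 0.8, "bearing_temp_high": 0.7}
rules = [(["vibration_high", "bearing_temp_high"], "bearing_fault", 0.5, 0.9)]
diagnosis = fire(marking, rules)
```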
Estimating aquifer transmissivity from specific capacity using MATLAB.
McLin, Stephen G
2005-01-01
Historically, specific capacity information has been used to calculate aquifer transmissivity when pumping test data are unavailable. This paper presents a simple computer program written in the MATLAB programming language that estimates transmissivity from specific capacity data while correcting for aquifer partial penetration and well efficiency. The program graphically plots transmissivity as a function of these factors so that the user can visually estimate their relative importance in a particular application. The program is compatible with any computer operating system running MATLAB, including Windows, Macintosh OS, Linux, and Unix. Two simple examples illustrate program usage.
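The classic specific-capacity relation behind such estimators is implicit in T: the Cooper-Jacob approximation gives T = Q/(4πs) · ln(2.25·T·t/(r²·S)), which is typically solved by fixed-point iteration. A Python sketch with invented well data (the published program additionally corrects for partial penetration and well efficiency, which is omitted here):

```python
import math

def transmissivity(q, s, t, r, stor, t0=100.0, iters=50):
    """Iterate T = Q/(4*pi*s) * ln(2.25*T*t / (r^2*S)) to a fixed point.

    q: pumping rate [m^3/d], s: drawdown [m], t: pumping time [d],
    r: well radius [m], stor: storativity [-], t0: starting guess [m^2/d].
    """
    T = t0
    for _ in range(iters):
        T = q / (4.0 * math.pi * s) * math.log(2.25 * T * t / (r * r * stor))
    return T

# hypothetical well: Q = 500 m^3/d with 2 m of drawdown after 1 day
T = transmissivity(q=500.0, s=2.0, t=1.0, r=0.15, stor=1e-4)
```

The iteration converges quickly because the right-hand side depends on T only logarithmically.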
Ant Lion Optimization algorithm for kidney exchanges.
Hamouda, Eslam; El-Metwally, Sara; Tarek, Mayada
2018-01-01
Kidney exchange programs bring new insights to the field of organ transplantation. They make transplants for incompatible patient-donor pairs, previously not possible, feasible on a large scale. Mathematically, kidney exchange is an optimization problem over the number of possible exchanges among the incompatible pairs in a given pool. The optimization model should also consider the expected quality-adjusted life of transplant candidates and the shortage of computational and operational hospital resources. In this article, we introduce a bio-inspired, stochastic Ant Lion Optimization (ALO) algorithm to the kidney exchange space to maximize the number of feasible cycles and chains among the pool pairs. The Ant Lion Optimizer-based program achieves kidney exchange results comparable to deterministic approaches such as integer programming. ALO also outperforms other stochastic methods such as genetic algorithms in terms of efficient usage of computational resources and the number of resulting exchanges. The Ant Lion Optimization algorithm can be adopted easily for on-line exchanges and the integration of weights for hard-to-match patients, which will improve future decisions of kidney exchange programs. A reference implementation of the ALO algorithm for kidney exchanges is written in MATLAB and is GPL licensed. It is available as free open-source software from: https://github.com/SaraEl-Metwally/ALO_algorithm_for_Kidney_Exchanges.
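The underlying combinatorial problem can be seen in miniature by enumerating two-way exchanges in a directed compatibility graph. The Python sketch below uses an invented pool and a greedy baseline (not ALO); its suboptimality on even this tiny instance is precisely the gap that optimizers such as ALO or integer programming are meant to close:

```python
from itertools import combinations

# compat[i][j] = True if the donor of pair i matches the patient of pair j
compat = [
    [False, True,  False, True ],
    [True,  False, True,  False],
    [False, True,  False, False],
    [True,  False, False, False],
]

# a 2-cycle exists when i's donor helps j's patient and vice versa
two_cycles = [(i, j) for i, j in combinations(range(len(compat)), 2)
              if compat[i][j] and compat[j][i]]

# greedy selection of vertex-disjoint cycles (each 2-cycle = 2 transplants)
used, chosen = set(), []
for i, j in two_cycles:
    if i not in used and j not in used:
        chosen.append((i, j))
        used.update((i, j))

transplants = 2 * len(chosen)
```

On this pool, greedy commits to cycle (0, 1) first and achieves only 2 transplants, whereas selecting cycles (0, 3) and (1, 2) would achieve 4.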
MNPBEM - A Matlab toolbox for the simulation of plasmonic nanoparticles
NASA Astrophysics Data System (ADS)
Hohenester, Ulrich; Trügler, Andreas
2012-02-01
MNPBEM is a Matlab toolbox for the simulation of metallic nanoparticles (MNP), using a boundary element method (BEM) approach. The main purpose of the toolbox is to solve Maxwell's equations for a dielectric environment where bodies with homogeneous and isotropic dielectric functions are separated by abrupt interfaces. Although the approach is in principle suited for arbitrary body sizes and photon energies, it is tested (and probably works best) for metallic nanoparticles with sizes ranging from a few to a few hundred nanometers, and for frequencies in the optical and near-infrared regime. The toolbox has been implemented with Matlab classes. These classes can be easily combined, which has the advantage that one can adapt the simulation programs flexibly for various applications. Program summary. Program title: MNPBEM. Catalogue identifier: AEKJ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKJ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License v2. No. of lines in distributed program, including test data, etc.: 15 700. No. of bytes in distributed program, including test data, etc.: 891 417. Distribution format: tar.gz. Programming language: Matlab 7.11.0 (R2010b). Computer: any which supports Matlab 7.11.0 (R2010b). Operating system: any which supports Matlab 7.11.0 (R2010b). RAM: ⩾ 1 GByte. Classification: 18. Nature of problem: solve Maxwell's equations for dielectric particles with homogeneous dielectric functions separated by abrupt interfaces. Solution method: boundary element method using electromagnetic potentials. Running time: depending on surface discretization, between seconds and hours.
NASA Technical Reports Server (NTRS)
Sen, Syamal K.; Shaykhian, Gholam Ali
2011-01-01
MatLab® (MATrix LABoratory) is a numerical computation and simulation tool that is used by thousands of scientists and engineers in many countries. MatLab does purely numerical calculations and can be used as a glorified calculator or an interpreted programming language; its real strength is in matrix manipulations. Computer algebra functionalities are achieved within the MatLab environment using the "symbolic" toolbox. This feature is similar to computer algebra programs, such as Maple or Mathematica, which calculate with mathematical equations using symbolic operations. In its interpreted-language form (command interface), MatLab is similar to well-known programming languages such as C/C++, and supports data structures, cell arrays and class definitions for object-oriented programming. As such, MatLab is equipped with most of the essential constructs of a higher-level programming language. MatLab is packaged with an editor and debugging functionality useful for analysing large MatLab programs and finding errors. We believe there are many ways to approach real-world problems; prescribed methods that ensure the foregoing solutions are incorporated into the design and analysis of data processing and visualization can help engineers and scientists gain wider insight into the actual implementation of their prospective experiments. This presentation will focus on the data processing and visualization aspects of engineering and scientific applications. Specifically, it will discuss methods and techniques to perform intermediate-level data processing covering engineering and scientific problems. MatLab programming techniques will be discussed, including reading various data file formats to produce customized publication-quality graphics, importing engineering and/or scientific data, organizing data in tabular format, exporting data to be used by other software programs such as Microsoft Excel, and data presentation and visualization. 
The presentation will emphasize creating practical scripts (programs) that extend the basic features of MatLab. Topics include: (1) matrix and vector analysis and manipulations; (2) mathematical functions; (3) symbolic calculations and functions; (4) import/export of data files; (5) program logic and flow control; (6) writing functions and passing parameters; (7) test application programs.
Test Generator for MATLAB Simulations
NASA Technical Reports Server (NTRS)
Henry, Joel
2011-01-01
MATLAB Automated Test Tool, version 3.0 (MATT 3.0) is a software package that provides automated tools that reduce the time needed for extensive testing of simulation models that have been constructed in the MATLAB programming language by use of the Simulink and Real-Time Workshop programs. MATT 3.0 runs on top of the MATLAB engine application-program interface to communicate with the Simulink engine. MATT 3.0 automatically generates source code from the models, generates custom input data for testing both the models and the source code, and generates graphs and other presentations that facilitate comparison of the outputs of the models and the source code for the same input data. Context-sensitive and fully searchable help is provided in HyperText Markup Language (HTML) format.
Improve Problem Solving Skills through Adapting Programming Tools
NASA Technical Reports Server (NTRS)
Shaykhian, Linda H.; Shaykhian, Gholam Ali
2007-01-01
There are numerous ways for engineers and students to become better problem-solvers. The use of command-line and visual programming tools can help to model a problem and formulate a solution through visualization. The analysis of problem attributes and constraints provides insight into the scope and complexity of the problem. The visualization aspect of the problem-solving approach tends to make students and engineers more systematic in their thought process and helps them catch errors before proceeding too far in the wrong direction. The problem-solver identifies and defines important terms, variables, rules, and procedures required for solving a problem. Every step required to construct the problem solution can be defined in program commands that produce intermediate output. This paper advocates improving problem-solving skills through the use of a programming tool. MatLab, created by MathWorks, is an interactive numerical computing environment and programming language. It is a matrix-based system that easily lends itself to matrix manipulation and to plotting of functions and data. MatLab can be used as an interactive command line or as a sequence of commands saved in a file as a script or named functions. Prior programming experience is not required to use MatLab commands. GNU Octave, part of the GNU Project and a free program for performing numerical computations, is comparable to MatLab. MatLab visual and command programming are presented here.
Calculus Demonstrations Using MATLAB
ERIC Educational Resources Information Center
Dunn, Peter K.; Harman, Chris
2002-01-01
The note discusses ways in which technology can be used in the calculus learning process. In particular, five MATLAB programs are detailed for use by instructors or students that demonstrate important concepts in introductory calculus: Newton's method, differentiation and integration. Two of the programs are animated. The programs and the…
Comparison of cyclic correlation algorithm implemented in matlab and python
NASA Astrophysics Data System (ADS)
Carr, Richard; Whitney, James
Simulation is a necessary step in all engineering projects. Simulation gives engineers an approximation of how their devices will perform under different circumstances, without having to build a physical prototype, or before building one. This is especially true for space-bound devices such as space communication systems, where the impact of system malfunction or failure is several orders of magnitude greater than in terrestrial applications. Having a reliable simulation tool is therefore key in developing these devices and systems. MathWorks' Matrix Laboratory (MATLAB) is matrix-based software used by scientists and engineers to solve problems and perform complex simulations. MATLAB has applications in a wide variety of fields, including communications, signal processing, image processing, mathematics, economics and physics. Because of its many uses, MATLAB has become the preferred software for many engineers; it is also very expensive, especially for students and startups. One alternative to MATLAB is Python, a powerful, easy-to-use, open-source programming environment that can be used to perform many of the same functions as MATLAB. The Python programming environment has been steadily gaining popularity in niche programming circles. While it does not include as many built-in functions as MATLAB, many open-source functions have been developed and are available for free download. This paper illustrates how Python can implement the cyclic correlation algorithm and compares the results to the cyclic correlation algorithm implemented in the MATLAB environment. Some of the characteristics to be compared are the accuracy and precision of the results, and the length of the programs. The paper demonstrates that Python is capable of performing simulations of complex algorithms such as cyclic correlation.
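The abstract does not reproduce either implementation, but the core computation is standard. A minimal NumPy sketch of cyclic (circular) correlation, both as a direct sum and via the FFT correlation theorem, might look like this (function names are illustrative, not taken from the paper):

```python
import numpy as np

def cyclic_correlation(x, y):
    """Circular cross-correlation r[k] = sum_n x[n] * y[(n + k) mod N]."""
    n = len(x)
    return np.array([np.sum(x * np.roll(y, -k)) for k in range(n)])

def cyclic_correlation_fft(x, y):
    """Same result in O(n log n) via the correlation theorem:
    r = IFFT(conj(FFT(x)) * FFT(y))."""
    return np.real(np.fft.ifft(np.conj(np.fft.fft(x)) * np.fft.fft(y)))
```

Both routines agree to floating-point precision, which is the kind of accuracy and precision comparison the paper makes between the MATLAB and Python environments.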
The Realization of Drilling Fault Diagnosis Based on Hybrid Programming with Matlab and VB
NASA Astrophysics Data System (ADS)
Wang, Jiangping; Hu, Yingcai
This paper presents a method that uses hybrid programming with Matlab and VB, based on ActiveX, to design a system for drilling accident prediction and diagnosis, so that the powerful calculation and graphical display functions of Matlab are fully combined with the visual development interface of VB. The main interface of the diagnosis system is compiled in VB, and the analysis and fault diagnosis are implemented with neural network toolboxes in Matlab. The system has a favorable interactive interface, and validation against fault examples shows that the diagnosis results are feasible and can meet the demands of drilling accident prediction and diagnosis.
High-Speed GPU-Based Fully Three-Dimensional Diffuse Optical Tomographic System
Saikia, Manob Jyoti; Kanhirodan, Rajan; Mohan Vasu, Ram
2014-01-01
We have developed a graphics processor unit (GPU)-based high-speed fully 3D system for diffuse optical tomography (DOT). The reduction in execution time of the 3D DOT algorithm, a severely ill-posed problem, is made possible through the use of (1) an algorithmic improvement that uses the Broyden approach for updating the Jacobian matrix and thereby updating the parameter matrix and (2) the multinode multithreaded GPU and CUDA (Compute Unified Device Architecture) software architecture. Two different GPU implementations of DOT programs were developed in this study: (1) a conventional C language program augmented by GPU CUDA and CULA routines (C GPU), and (2) a MATLAB program supported by the MATLAB parallel computing toolkit for GPU (MATLAB GPU). The computation time of the algorithm on the host CPU and the GPU system is presented for both the C and MATLAB implementations. The forward computation uses the finite element method (FEM), and the problem domain is discretized into 14610, 30823, and 66514 tetrahedral elements. The reconstruction time achieved for one iteration of the DOT reconstruction with 14610 elements is 0.52 seconds for the C based GPU program for 2-plane measurements; the corresponding MATLAB based GPU program took 0.86 seconds. The maximum number of reconstructed frames achieved is 2 frames per second. PMID:24891848
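The speedup is attributed partly to replacing full Jacobian recomputation with Broyden's rank-one update. A generic NumPy sketch of that update (not the authors' DOT code; names are illustrative) is:

```python
import numpy as np

def broyden_update(J, dx, df):
    """Broyden's 'good' rank-one update of a Jacobian estimate.

    Given the previous estimate J, a step dx in the parameters and the
    observed change df in the residuals, return J_new satisfying the
    secant condition J_new @ dx = df exactly.
    """
    dx = dx.reshape(-1, 1)
    df = df.reshape(-1, 1)
    return J + (df - J @ dx) @ dx.T / float(dx.T @ dx)
```

Each update costs O(n^2) instead of a full Jacobian reassembly, which is the kind of saving the abstract describes for the ill-posed 3D reconstruction.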
OXlearn: a new MATLAB-based simulation tool for connectionist models.
Ruh, Nicolas; Westermann, Gert
2009-11-01
OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.
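OXlearn itself is a MATLAB toolbox, but the class of models it runs, small feedforward networks trained by backpropagation, can be sketched in a few lines of NumPy. The following toy is not OXlearn code; the 2-4-1 architecture, learning rate and XOR task are arbitrary illustrative choices. It trains a sigmoid network and returns the per-epoch error history, the kind of output such simulators expose for analysis:

```python
import numpy as np

def train_xor(epochs=4000, lr=0.5, seed=0):
    """Tiny 2-4-1 sigmoid network trained by batch backprop on XOR.
    Returns the per-epoch sum-of-squares error."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
    T = np.array([[0], [1], [1], [0]], float)        # XOR targets
    W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)
    W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    errs = []
    for _ in range(epochs):
        H = sig(X @ W1 + b1)                         # forward pass
        Y = sig(H @ W2 + b2)
        errs.append(float(((Y - T) ** 2).sum()))
        dY = (Y - T) * Y * (1 - Y)                   # backprop deltas
        dH = (dY @ W2.T) * H * (1 - H)
        W2 -= lr * H.T @ dY; b2 -= lr * dY.sum(0)    # gradient steps
        W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(0)
    return errs
```

Plotting the returned error trace is the typical first analysis step in connectionist simulation tools of this kind.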
[Application of the mixed programming with Labview and Matlab in biomedical signal analysis].
Yu, Lu; Zhang, Yongde; Sha, Xianzheng
2011-01-01
This paper introduces the method of mixed programming with Labview and Matlab, and applies this method to a pulse wave pre-processing and feature detection system. The method has proved suitable, efficient and accurate, providing a new kind of approach for biomedical signal analysis.
Enhancing Student Writing and Computer Programming with LATEX and MATLAB in Multivariable Calculus
ERIC Educational Resources Information Center
Sullivan, Eric; Melvin, Timothy
2016-01-01
Written communication and computer programming are foundational components of an undergraduate degree in the mathematical sciences. All lower-division mathematics courses at our institution are paired with computer-based writing, coding, and problem-solving activities. In multivariable calculus we utilize MATLAB and LATEX to have students explore…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haas, Nicholas Q; Gillen, Robert E; Karnowski, Thomas P
MathWorks' MATLAB is widely used in academia and industry for prototyping, data analysis, data processing, etc. Many users compile their programs using the MATLAB Compiler to run on workstations/computing clusters via the free MATLAB Compiler Runtime (MCR). The MCR facilitates the execution of code calling Application Programming Interface (API) functions from both base MATLAB and MATLAB toolboxes. In a Linux environment, a sizable number of third-party runtime dependencies (i.e. shared libraries) are necessary. Unfortunately, to the MATLAB community's knowledge, these dependencies are not documented, leaving system administrators and/or end-users to find and install the necessary libraries, either in response to runtime errors caused by their absence or by inspecting the header information of the Executable and Linkable Format (ELF) libraries of the MCR to determine which ones are missing from the system. To address these shortcomings, Docker images based on Community Enterprise Operating System (CentOS) 7, a derivative of Red Hat Enterprise Linux (RHEL) 7, containing recent (2015-2017) MCR releases and their dependencies were created. These images, along with a provided sample Docker Compose YAML script, can be used to create a simulated computing cluster where binaries created by the MATLAB Compiler can be executed using a sample Slurm Workload Manager script.
Identification of gene regulation models from single-cell data
NASA Astrophysics Data System (ADS)
Weber, Lisa; Raymond, William; Munsky, Brian
2018-09-01
In quantitative analyses of biological processes, one may use many different scales of models (e.g. spatial or non-spatial, deterministic or stochastic, time-varying or at steady-state) or many different approaches to match models to experimental data (e.g. model fitting or parameter uncertainty/sloppiness quantification with different experiment designs). These different analyses can lead to surprisingly different results, even when applied to the same data and the same model. We use a simplified gene regulation model to illustrate many of these concerns, especially for ODE analyses of deterministic processes, chemical master equation and finite state projection analyses of heterogeneous processes, and stochastic simulations. For each analysis, we employ MATLAB and Python software to consider a time-dependent input signal (e.g. a kinase nuclear translocation) and several model hypotheses, along with simulated single-cell data. We illustrate different approaches (e.g. deterministic and stochastic) to identify the mechanisms and parameters of the same model from the same simulated data. For each approach, we explore how uncertainty in parameter space varies with respect to the chosen analysis approach or specific experiment design. We conclude with a discussion of how our simulated results relate to the integration of experimental and computational investigations to explore signal-activated gene expression models in yeast (Neuert et al 2013 Science 339 584–7) and human cells (Senecal et al 2014 Cell Rep. 8 75–83).
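As a hedged illustration of the deterministic-versus-stochastic contrast the abstract describes, consider constitutive mRNA expression with production rate k and degradation rate g, a far simpler model than the paper's, with illustrative names throughout. The ODE analysis gives the mean level in closed form, while Gillespie's algorithm draws exact sample paths of the underlying chemical master equation:

```python
import numpy as np

def ode_mean(k, g, x0, t):
    """Deterministic solution of dx/dt = k - g*x (mean mRNA level)."""
    return k / g + (x0 - k / g) * np.exp(-g * t)

def gillespie(k, g, x0, t_end, seed=0):
    """Exact stochastic simulation (SSA) of the same birth-death process;
    returns the molecule count at time t_end."""
    rng = np.random.default_rng(seed)
    t, x = 0.0, x0
    while True:
        a1, a2 = k, g * x               # propensities: production, degradation
        a0 = a1 + a2
        t += rng.exponential(1.0 / a0)  # waiting time to next reaction
        if t > t_end:
            return x
        x += 1 if rng.random() < a1 / a0 else -1
```

Averaging many SSA runs recovers the ODE mean k/g, while individual runs exhibit the cell-to-cell variability that CME and finite state projection analyses quantify.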
ERIC Educational Resources Information Center
Ocak, Mehmet A.
2006-01-01
This correlation study examined the relationship between gender and the students' attitude and prior knowledge of using one of the mathematical software programs (MATLAB). Participants were selected from one community college, one state university and one private college. Students were volunteers from three Calculus I classrooms (one class from…
Image Algebra Matlab language version 2.3 for image processing and compression research
NASA Astrophysics Data System (ADS)
Schmalz, Mark S.; Ritter, Gerhard X.; Hayden, Eric
2010-08-01
Image algebra is a rigorous, concise notation that unifies linear and nonlinear mathematics in the image domain. Image algebra was developed under DARPA and US Air Force sponsorship at the University of Florida for over 15 years, beginning in 1984. Image algebra has been implemented in a variety of programming languages designed specifically to support the development of image processing and computer vision algorithms and software. The University of Florida has been associated with development of the languages FORTRAN, Ada, Lisp, and C++. The latter implementation involved a class library, iac++, that supported image algebra programming in C++. Since image processing and computer vision are generally performed with operands that are array-based, the Matlab™ programming language is ideal for implementing the common subset of image algebra. Objects include sets and set operations, images and operations on images, as well as templates and image-template convolution operations. This implementation, called Image Algebra Matlab (IAM), has been found to be useful for research in data, image, and video compression, as described herein. Due to the widespread acceptance of the Matlab programming language in the computing community, IAM offers exciting possibilities for supporting a large group of users. The control over an object's computational resources provided to the algorithm designer by Matlab means that IAM programs can employ versatile representations for the operands and operations of the algebra, which are supported by the underlying libraries written in Matlab. In a previous publication, we showed how the functionality of iac++ could be carried forth into a Matlab implementation, and provided practical details of a prototype implementation called IAM Version 1.
In this paper, we further elaborate the purpose and structure of image algebra, then present a maturing implementation of Image Algebra Matlab called IAM Version 2.3, which extends the previous implementation of IAM to include polymorphic operations over different point sets, as well as recursive convolution operations and functional composition. We also show how image algebra and IAM can be employed in image processing and compression research, as well as algorithm development and analysis.
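Although IAM is implemented in Matlab, the image-template operation at the algebra's core translates directly to any array language. A NumPy sketch (illustrative only, not IAM code) of a template sum with zero padding at the border:

```python
import numpy as np

def image_template_product(image, template):
    """Generalized template sum: each output pixel is the sum of
    template-weighted neighbours, with zero padding at the border."""
    th, tw = template.shape
    ph, pw = th // 2, tw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)))
    out = np.zeros_like(image, dtype=float)
    for i in range(th):
        for j in range(tw):
            # shifted view of the padded image, weighted by one template entry
            out += template[i, j] * padded[i:i + image.shape[0],
                                           j:j + image.shape[1]]
    return out
```

With template entries chosen as weights, this single primitive expresses smoothing, edge detection and the other neighbourhood operations that image algebra unifies.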
NASA Astrophysics Data System (ADS)
Wessel, Paul; Luis, Joaquim F.
2017-02-01
The GMT/MATLAB toolbox is a basic interface between MATLAB® (or Octave) and GMT, the Generic Mapping Tools, which allows MATLAB users full access to all GMT modules. Data may be passed between the two programs using intermediate MATLAB structures that organize the metadata needed; these are produced when GMT modules are run. In addition, standard MATLAB matrix data can be used directly as input to GMT modules. The toolbox improves interoperability between two widely used tools in the geosciences and extends the capability of both tools: GMT gains access to the powerful computational capabilities of MATLAB while the latter gains the ability to access specialized gridding algorithms and can produce publication-quality PostScript-based illustrations. The toolbox is available on all platforms and may be downloaded from the GMT website.
NASA Astrophysics Data System (ADS)
Adams, Mike; Smalian, Silva
2017-09-01
For nuclear waste packages, the expected dose rates and nuclide inventory are calculated beforehand. Depending on the packaging of the nuclear waste, deterministic programs like MicroShield® provide a range of results for each type of packaging. Stochastic programs like the Monte Carlo N-Particle Transport Code System (MCNP®), on the other hand, provide reliable results for complex geometries; however, this type of program requires a fully trained operator, and calculations are time consuming. The problem is to choose an appropriate program for a specific geometry. We therefore compared the results of deterministic programs like MicroShield® and stochastic programs like MCNP®. These comparisons enable us to make a statement about the applicability of the various programs for chosen types of containers. We conclude that for thin-walled geometries, deterministic programs like MicroShield® are well suited to calculating the dose rate; for cylindrical containers with inner shielding, however, deterministic programs reach their limits. Furthermore, we investigate the effect of an inhomogeneous material and activity distribution on the results. The calculations are still ongoing; results will be presented in the final abstract.
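The deterministic/stochastic distinction can be illustrated with a toy slab-shielding calculation (a sketch only; real MicroShield® and MCNP® models include buildup, scattering and full 3D geometry). The uncollided transmission through a slab of attenuation coefficient mu and thickness x is exp(-mu*x); a Monte Carlo estimate instead samples exponential free paths and counts photons that cross the slab:

```python
import numpy as np

def transmission_deterministic(mu, x):
    """Uncollided fraction through a slab of thickness x (point-kernel style)."""
    return np.exp(-mu * x)

def transmission_monte_carlo(mu, x, n=200_000, seed=1):
    """Stochastic estimate: sample distances to first interaction and
    count the photons whose free path exceeds the slab thickness."""
    rng = np.random.default_rng(seed)
    paths = rng.exponential(1.0 / mu, size=n)
    return np.mean(paths > x)
```

For this simple geometry both approaches agree to within sampling noise, mirroring the paper's finding that deterministic codes suffice for thin-walled packages while complex inner shielding favours Monte Carlo.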
Operating a Geiger-Muller Tube Using a PC Sound Card
ERIC Educational Resources Information Center
Azooz, A. A.
2009-01-01
In this paper, a simple MATLAB-based PC program that enables the computer to function as a replacement for the electronic scalar-counter system associated with a Geiger-Muller (GM) tube is described. The program utilizes the ability of MATLAB to acquire data directly from the computer sound card. The signal from the GM tube is applied to the…
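The essential signal-processing step in such a sound-card counter, turning the digitized GM pulses into counts, reduces to detecting threshold crossings. A NumPy sketch on a synthetic trace (illustrative; the paper's actual MATLAB code is not reproduced in the abstract):

```python
import numpy as np

def count_pulses(samples, threshold):
    """Count rising edges through `threshold` (one count per GM pulse)."""
    above = samples > threshold
    rising = above[1:] & ~above[:-1]
    return int(rising.sum())

# Synthetic test trace: 0.1 s of a 48 kHz "recording" with three short pulses.
fs = 48_000
trace = np.zeros(fs // 10)
for start in (1000, 2000, 3500):
    trace[start:start + 20] = 1.0          # 20-sample rectangular pulse
```

Dividing the count by the trace duration gives the count rate that the replaced scaler-counter system would report.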
Effective approach to spectroscopy and spectral analysis techniques using Matlab
NASA Astrophysics Data System (ADS)
Li, Xiang; Lv, Yong
2017-08-01
With the development of electronic information, computers and networks, modern educational technology has entered a new era and has had a great impact on the teaching process. Spectroscopy and spectral analysis is an elective course for Optoelectronic Information Science and Engineering. The teaching objective of this course is to master the basic concepts and principles of spectroscopy and the basic technical means of spectral analysis and testing, and then to let students use the principles and technology of spectroscopy to study the structure and state of materials and the development of the technology. MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment and fourth-generation programming language developed by MathWorks; it allows matrix manipulations and the plotting of functions and data. Based on teaching practice, this paper summarizes the application of Matlab to the teaching of spectroscopy, an approach suitable for much of the current multimedia-assisted teaching in schools.
Blueprint XAS: a Matlab-based toolbox for the fitting and analysis of XAS spectra.
Delgado-Jaime, Mario Ulises; Mewis, Craig Philip; Kennepohl, Pierre
2010-01-01
Blueprint XAS is a new Matlab-based program developed to fit and analyse X-ray absorption spectroscopy (XAS) data, most specifically in the near-edge region of the spectrum. The program is based on a methodology that introduces a novel background model into the complete fit model and that is capable of generating any number of independent fits with minimal introduction of user bias [Delgado-Jaime & Kennepohl (2010), J. Synchrotron Rad. 17, 119-128]. The functions and settings on the five panels of its graphical user interface are designed to suit the needs of near-edge XAS data analyzers. A batch function allows for the setting of multiple jobs to be run with Matlab in the background. A unique statistics panel allows the user to analyse a family of independent fits, to evaluate fit models and to draw statistically supported conclusions. The version introduced here (v0.2) is currently a toolbox for Matlab. Future stand-alone versions of the program will also incorporate several other new features to create a full package of tools for XAS data processing.
The past, present and future of cyber-physical systems: a focus on models.
Lee, Edward A
2015-02-26
This paper is about better engineering of cyber-physical systems (CPSs) through better models. Deterministic models have historically proven extremely useful and arguably form the kingpin of the industrial revolution and the digital and information technology revolutions. Key deterministic models that have proven successful include differential equations, synchronous digital logic and single-threaded imperative programs. Cyber-physical systems, however, combine these models in such a way that determinism is not preserved. Two projects show that deterministic CPS models with faithful physical realizations are possible and practical. The first project is PRET, which shows that the timing precision of synchronous digital logic can be practically made available at the software level of abstraction. The second project is Ptides (programming temporally-integrated distributed embedded systems), which shows that deterministic models for distributed cyber-physical systems have practical faithful realizations. These projects are existence proofs that deterministic CPS models are possible and practical.
Karpievitch, Yuliya V; Almeida, Jonas S
2006-01-01
Background: Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results: mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion: Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple.
Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet. PMID:16539707
2013-12-01
Implementation of the current NPS TPL design procedure, which uses commercial-off-the-shelf (COTS) software (MATLAB, SolidWorks, and ANSYS CFX) for geometric rendering and analysis, was modified. [Remainder of abstract garbled; surviving fragments define CFX as the CFD simulation program in ANSYS Workbench, CFX-Pre as its boundary-conditions and solver-settings module, and CFX-Solver as its solver program.]
Photon-Limited Information in High Resolution Laser Ranging
2014-05-28
This project is an effort under the Information in a Photon (InPho) program at DARPA/DSO. Its purpose is to investigate the use of entangled photons, generated by spontaneous parametric down-conversion of a chirped source, to perform ranging measurements. The work included a MATLAB program to collect the photon counts from the time-to-digital converter (TDC), which entailed setting up MATLAB to talk to the TDC. [Remainder of abstract garbled; report-form residue removed.]
The Julia programming language: the future of scientific computing
NASA Astrophysics Data System (ADS)
Gibson, John
2017-11-01
Julia is an innovative new open-source programming language for high-level, high-performance numerical computing. Julia combines the general-purpose breadth and extensibility of Python, the ease-of-use and numeric focus of Matlab, the speed of C and Fortran, and the metaprogramming power of Lisp. Julia uses type inference and just-in-time compilation to compile high-level user code to machine code on the fly. A rich set of numeric types and extensive numerical libraries are built-in. As a result, Julia is competitive with Matlab for interactive graphical exploration and with C and Fortran for high-performance computing. This talk interactively demonstrates Julia's numerical features and benchmarks Julia against C, C++, Fortran, Matlab, and Python on a spectral time-stepping algorithm for a 1d nonlinear partial differential equation. The Julia code is nearly as compact as Matlab and nearly as fast as Fortran. This material is based upon work supported by the National Science Foundation under Grant No. 1554149.
OPTICON: Pro-Matlab software for large order controlled structure design
NASA Technical Reports Server (NTRS)
Peterson, Lee D.
1989-01-01
A software package for large order controlled structure design is described and demonstrated. The primary program, called OPTICON, uses both Pro-Matlab M-file routines and selected compiled FORTRAN routines linked into the Pro-Matlab structure. The program accepts structural model information in the form of state-space matrices and performs three basic design functions on the model: (1) open loop analyses; (2) closed loop reduced order controller synthesis; and (3) closed loop stability and performance assessment. The controller synthesis methods currently implemented in this software are based on the Generalized Linear Quadratic Gaussian theory of Bernstein. In particular, a reduced order Optimal Projection synthesis algorithm based on a homotopy solution method was successfully applied to an experimental truss structure using a 58-state dynamic model. These results are presented and discussed. Current plans to expand the practical size of the design model to several hundred states and the intention to interface Pro-Matlab to a supercomputing environment are discussed.
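The synthesis machinery described is the reduced-order Optimal Projection/LQG method of Bernstein; as a hedged, minimal point of reference, the full-order LQR piece of the LQG problem can be sketched in NumPy by solving the continuous algebraic Riccati equation via the Hamiltonian eigenvector method (a textbook technique, not OPTICON's homotopy algorithm; the double-integrator model is an illustrative choice):

```python
import numpy as np

def care(A, B, Q, R):
    """Solve A'X + XA - X B inv(R) B' X + Q = 0 via the stable invariant
    subspace of the associated Hamiltonian matrix."""
    Rinv = np.linalg.inv(R)
    H = np.block([[A, -B @ Rinv @ B.T],
                  [-Q, -A.T]])
    w, V = np.linalg.eig(H)
    stable = V[:, w.real < 0]            # eigenvectors of stable eigenvalues
    n = A.shape[0]
    X = stable[n:, :] @ np.linalg.inv(stable[:n, :])
    return X.real

A = np.array([[0.0, 1.0], [0.0, 0.0]])   # double integrator
B = np.array([[0.0], [1.0]])
Q = np.eye(2); R = np.eye(1)
X = care(A, B, Q, R)
K = np.linalg.inv(R) @ B.T @ X           # optimal state-feedback gain
```

For this test model the computed gain is [1, sqrt(3)], the classical LQR result; reduced-order Optimal Projection adds coupled projection equations on top of two Riccati equations of this form.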
Xu, Jingping; Lightsom, Fran; Noble, Marlene A.; Denham, Charles
2002-01-01
During the past several years, the sediment transport group in the Coastal and Marine Geology Program (CMGP) of the U.S. Geological Survey has made major revisions to its methodology of processing, analyzing, and maintaining a variety of oceanographic time-series data. First, CMGP completed the transition of its oceanographic time-series database to the self-documenting NetCDF (Rew et al., 1997) data format. Second, CMGP's oceanographic data variety and complexity have been greatly expanded from traditional 2-dimensional, single-point time-series measurements (e.g., electromagnetic current meters, transmissometers) to more advanced 3-dimensional and profiling time-series measurements, owing to many new acquisitions of modern instruments such as the Acoustic Doppler Current Profiler (RDI, 1996), Acoustic Doppler Velocimeter, Pulse-Coherent Acoustic Doppler Profiler (SonTek, 2001), and Acoustic Backscatter Sensor (Aquatec). In order to accommodate the NetCDF format of data from the new instruments, a software package for processing, analyzing, and visualizing time-series oceanographic data was developed. It is named CMGTooL. The CMGTooL package contains two basic components: a user-friendly GUI for NetCDF file analysis, processing and manipulation, and a data-analysis program library. Most of the routines in the library are stand-alone programs suitable for batch processing. CMGTooL is written in the MATLAB computing language (The MathWorks, 1997); therefore users must have MATLAB installed on their computer in order to use this software package. In addition, MATLAB's Signal Processing Toolbox is also required by some of CMGTooL's routines. Like most MATLAB programs, all CMGTooL code is compatible with different computing platforms including PC, Mac, and UNIX machines (Note: CMGTooL has been tested on different platforms running MATLAB 5.2 (Release 10) or lower versions.
Some of the commands related to the Mac may not be compatible with later releases of MATLAB). The GUI and some of the library routines call low-level NetCDF file I/O, variable, and attribute functions. These NetCDF-exclusive functions are supported by a MATLAB toolbox named NetCDF, created by Dr. Charles Denham. This toolbox has to be installed in order to use the CMGTooL GUI. The CMGTooL GUI calls several routines that were initially developed by others. The authors would like to acknowledge the following scientists for their ideas and code: Dr. Rich Signell (USGS), Dr. Chris Sherwood (USGS), and Dr. Bob Beardsley (WHOI). Many terms that carry special meanings in either MATLAB or the NetCDF toolbox are used in this manual. Users are encouraged to read the MATLAB and NetCDF documentation for reference.
MATLAB as an incentive for student learning of skills
NASA Astrophysics Data System (ADS)
Bank, C. G.; Ghent, R. R.
2016-12-01
Our course "Computational Geology" takes a holistic approach to student learning by using MATLAB as a focal point to increase students' computing, quantitative reasoning, data analysis, report writing, and teamwork skills. The course, taught since 2007 with recent enrollments around 35 and aimed at 2nd- to 3rd-year students, is required for the Geology and Earth and Environmental Systems major programs, and can be chosen as an elective in our other programs, including Geophysics. The course is divided into five projects: Pacific plate velocity from the Hawaiian hotspot track, predicting CO2 concentration in the atmosphere, volume of Earth's oceans and sea-level rise, comparing wind directions for Vancouver and Squamish, and groundwater flow. Each project is based on real data, focuses on a mathematical concept (linear interpolation, gradients, descriptive statistics, differential equations) and highlights a programming task (arrays, functions, text file input/output, curve fitting). Working in teams of three, students need to develop a conceptual model to explain the data, and write MATLAB code to visualize the data and match it to their conceptual model. The programming is guided, and students work individually on different aspects (for example: reading the data, fitting a function, unit conversion) which they need to put together to solve the problem. They then synthesize their thought process in a paper. Anecdotal evidence shows that students continue using MATLAB in other courses.
Ada programming guidelines for deterministic storage management
NASA Technical Reports Server (NTRS)
Auty, David
1988-01-01
Previous reports have established that a program can be written in the Ada language such that the program's storage management requirements are determinable prior to its execution. Specific guidelines for ensuring such deterministic usage of Ada dynamic storage requirements are described. Because requirements may vary from one application to another, guidelines are presented in a most-restrictive to least-restrictive fashion to allow the reader to match appropriate restrictions to the particular application area under investigation.
Case studies on optimization problems in MATLAB and COMSOL multiphysics by means of the livelink
NASA Astrophysics Data System (ADS)
Ozana, Stepan; Pies, Martin; Docekal, Tomas
2016-06-01
LiveLink for COMSOL is a tool that integrates COMSOL Multiphysics with MATLAB to extend one's modeling with scripting programming in the MATLAB environment. It allows the user to utilize the full power of MATLAB and its toolboxes in preprocessing, model manipulation, and postprocessing. At first, the head script launches COMSOL with MATLAB, defines the initial values of all parameters, refers to the objective function J, and creates and runs the defined optimization task. Once the task is launched, the COMSOL model is called in the iteration loop (from the MATLAB environment via the API interface), changing the defined optimization parameters so that the objective function is minimized, using the fmincon function to find a local or global minimum of a constrained linear or nonlinear multivariable function. Once the minimum is found, the routine returns an exit flag, terminates the optimization, and returns the optimized values of the parameters. The cooperation with MATLAB via LiveLink combines a powerful computational environment with complex multiphysics simulations. The paper introduces the use of LiveLink for COMSOL in chosen case studies in the field of technical cybernetics and bioengineering.
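The iteration loop described above can be sketched in Python with SciPy's `minimize` standing in for MATLAB's fmincon, and a cheap analytic function standing in for the COMSOL model call; the model function, parameter names, and bounds here are hypothetical, not the paper's actual case studies.

```python
import numpy as np
from scipy.optimize import minimize

def run_model(params):
    """Stand-in for the COMSOL model evaluation. In the described workflow,
    this call would re-run the COMSOL model through the LiveLink API with
    the current parameter values. The quadratic below is hypothetical."""
    x, y = params
    return (x - 1.0) ** 2 + 10.0 * (y + 2.0) ** 2

def objective(params):
    # The objective function J is computed from the model output.
    return run_model(params)

# Initial parameter values and bound constraints, as in a constrained
# fmincon call; the optimizer perturbs the parameters each iteration.
x0 = np.array([0.0, 0.0])
bounds = [(-5.0, 5.0), (-5.0, 5.0)]

result = minimize(objective, x0, method="L-BFGS-B", bounds=bounds)
# result.x holds the optimized parameter values; result.status is the
# counterpart of fmincon's exit flag.
```

The same pattern (head script defines parameters, loop re-evaluates the model, optimizer drives the parameters) carries over regardless of how expensive the inner model call is.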
The Waveform Suite: A robust platform for accessing and manipulating seismic waveforms in MATLAB
NASA Astrophysics Data System (ADS)
Reyes, C. G.; West, M. E.; McNutt, S. R.
2009-12-01
The Waveform Suite, developed at the University of Alaska Geophysical Institute, is an open-source collection of MATLAB classes that provide a means to import, manipulate, display, and share waveform data while ensuring integrity of the data and stability for programs that incorporate them. Data may be imported from a variety of sources, such as Antelope, Winston databases, SAC files, SEISAN, .mat files, or other user-defined file formats. The waveforms being manipulated in MATLAB are isolated from their stored representations, relieving the overlying programs of the responsibility of understanding the specific format in which data are stored or retrieved. The waveform class provides an object-oriented framework that simplifies manipulations of waveform data. Working with data becomes easier because the tedious aspects of data manipulation have been automated. The user is able to change multiple waveforms simultaneously using standard mathematical operators and other syntactically familiar functions. Unlike MATLAB structs or workspace variables, the data stored within waveform class objects are protected from modification, and instead are accessed through standardized functions, such as get and set; these are already familiar to users of MATLAB's graphical features. This prevents accidental or nonsensical modifications to the data, which in turn simplifies troubleshooting of complex programs. Upgrades to the internal structure of the waveform class are invisible to applications which use it, making maintenance easier. We demonstrate the Waveform Suite's capabilities on seismic data from Okmok and Redoubt volcanoes. Years of data from Okmok were retrieved from Antelope and Winston databases. Using the Waveform Suite, we built a tremor-location program. Because the program was built on the Waveform Suite, modifying it to operate on real-time data from Redoubt involved only minimal code changes.
The utility of the Waveform Suite as a foundation for large developments is demonstrated with the Correlation Toolbox for MATLAB. This mature package contains 50+ codes for carrying out various types of waveform correlation analyses (multiplet analysis, clustering, interferometry, …). This package is greatly strengthened by delegating numerous book-keeping and signal processing tasks to the underlying Waveform Suite. The Waveform Suite's built-in tools for searching arbitrary directory/file structures are demonstrated with matched video and audio from the recent eruption of Redoubt Volcano. These tools were used to find subsets of photo images corresponding to specific seismic traces. Using Waveform's audio file routines, matched video and audio were assembled to produce outreach-quality eruption products. The Waveform Suite is not designed as a ready-to-go replacement for more comprehensive packages such as SAC or AH. Rather, it is a suite of classes which provide core time series functionality in a MATLAB environment. It is designed to be a more robust alternative to the numerous ad hoc MATLAB formats that exist. Complex programs may be created upon the Waveform Suite's framework, while existing programs may be modified to take advantage of the Waveform Suite's capabilities.
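The design idea behind the waveform class (private data reached only through get/set-style accessors, with whole-waveform operators) can be illustrated with a small Python sketch; this is an illustration of the pattern, not the Waveform Suite's actual MATLAB code, and all field names are hypothetical.

```python
import numpy as np

class Waveform:
    """Minimal sketch of the pattern: samples are held privately and
    accessed only through get/set, so overlying programs never depend
    on the stored representation and cannot corrupt the data."""

    def __init__(self, data, fs, station="UNK"):
        self._data = np.asarray(data, dtype=float).copy()  # private copy
        self._fs = float(fs)          # sampling frequency in Hz
        self._station = station

    def get(self, field):
        # Return a copy so callers cannot mutate the internal samples.
        return {"data": self._data.copy(), "fs": self._fs,
                "station": self._station}[field]

    def set(self, field, value):
        # Produce a new object; the original waveform is never mutated.
        kwargs = {"data": self._data, "fs": self._fs,
                  "station": self._station}
        kwargs[field] = value
        return Waveform(kwargs["data"], kwargs["fs"], kwargs["station"])

    def __add__(self, other):
        # Standard operators act on the whole waveform at once.
        return self.set("data", self._data + other)

w = Waveform([0.0, 1.0, -1.0], fs=100.0, station="OKER")
w2 = w + 5.0  # offset every sample; w itself is unchanged
```

Because the internals are reachable only through `get`/`set`, the stored layout can change without breaking any program built on the class, which is the maintenance property the abstract emphasizes.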
MATLAB-Based Program for Teaching Autocorrelation Function and Noise Concepts
ERIC Educational Resources Information Center
Jovanovic Dolecek, G.
2012-01-01
An attractive MATLAB-based tool for teaching the basics of autocorrelation function and noise concepts is presented in this paper. This tool enhances traditional in-classroom lecturing. The demonstrations of the tool described here highlight the description of the autocorrelation function (ACF) in a general case for wide-sense stationary (WSS)…
Arc_Mat: a Matlab-based spatial data analysis toolbox
NASA Astrophysics Data System (ADS)
Liu, Xingjian; Lesage, James
2010-03-01
This article presents an overview of Arc_Mat, a Matlab-based spatial data analysis software package whose source code has been placed in the public domain. An earlier version of the Arc_Mat toolbox was developed to extract map polygon and database information from ESRI shapefiles and provide high-quality mapping in the Matlab software environment. We discuss revisions to the toolbox that: utilize enhanced computing and graphing capabilities of more recent versions of Matlab, restructure the toolbox with object-oriented programming features, and provide more comprehensive functions for spatial data analysis. The Arc_Mat toolbox functionality includes basic choropleth mapping; exploratory spatial data analysis that provides exploratory views of spatial data through various graphs, for example, histogram, Moran scatterplot, three-dimensional scatterplot, density distribution plot, and parallel coordinate plots; and more formal spatial data modeling that draws on the extensive Spatial Econometrics Toolbox functions. The design aspects of the revised Arc_Mat are briefly reviewed, and we provide some illustrative examples that highlight representative uses of the toolbox. Finally, we discuss programming with and customizing the Arc_Mat toolbox functionalities.
Rodríguez, J; Premier, G C; Dinsdale, R; Guwy, A J
2009-01-01
Mathematical modelling in environmental biotechnology has traditionally been a difficult resource to access for researchers and students without programming expertise. The great degree of flexibility required of model implementation platforms to be suitable for research applications restricts their use to expert programmers. More user-friendly software packages, however, do not normally incorporate the flexibility needed for most research applications. This work presents a methodology based on Excel and Matlab-Simulink for both flexible and accessible implementation of mathematical models by researchers with and without programming expertise. The models are almost fully defined in an Excel file in which the names and values of the state variables and parameters are easily created. This information is automatically processed in Matlab to create the model structure, and almost immediate model simulation is possible after only a minimal Matlab code definition. The framework proposed also provides expert programmers with a highly flexible and modifiable platform on which to base more complex model implementations. The method takes advantage of structural generalities in most mathematical models of environmental bioprocesses while enabling the integration of advanced elements (e.g. heuristic functions, correlations). The methodology has already been successfully used in a number of research studies.
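The table-driven idea (state and parameter names with values defined in a spreadsheet, the model structure built automatically) can be sketched in Python; plain dicts stand in for the Excel file, and the Monod-type kinetics and parameter values below are hypothetical examples, not the paper's models.

```python
import numpy as np
from scipy.integrate import solve_ivp

# In the described framework these tables would be read from an Excel
# file; here plain dicts stand in for the spreadsheet contents.
states = {"S": 10.0, "X": 0.5}                 # substrate, biomass (g/L)
params = {"mu_max": 0.4, "Ks": 2.0, "Y": 0.5}  # hypothetical Monod values

def make_rhs(state_names, p):
    """Build the ODE right-hand side generically from the named tables."""
    def rhs(t, y):
        v = dict(zip(state_names, y))           # name -> current value
        mu = p["mu_max"] * v["S"] / (p["Ks"] + v["S"])  # Monod growth rate
        dS = -mu * v["X"] / p["Y"]              # substrate consumption
        dX = mu * v["X"]                        # biomass growth
        return [dS, dX]
    return rhs

names = list(states)
sol = solve_ivp(make_rhs(names, params), (0.0, 24.0),
                list(states.values()), rtol=1e-8)
```

Because the solver only ever sees names and values from the tables, changing the model means editing the tables (the "Excel file"), not the integration code, which is the accessibility argument the abstract makes.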
MOFA Software for the COBRA Toolbox
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griesemer, Marc; Navid, Ali
MOFA-COBRA is a software code for Matlab that performs Multi-Objective Flux Analysis (MOFA), the solution of multi-objective linear programming problems. The leading software package for conducting different types of analyses using constraint-based models is the COBRA Toolbox for Matlab. MOFA-COBRA is an added tool for COBRA that solves multi-objective problems using a novel algorithm.
Kinematic simulation and analysis of robot based on MATLAB
NASA Astrophysics Data System (ADS)
Liao, Shuhua; Li, Jiong
2018-03-01
The history of industrial automation is characterized by rapidly changing technology, and the industrial robot is, without a doubt, a special kind of equipment within it. With the help of MATLAB's matrix and plotting capabilities, the coordinate system of each link is set up in the MATLAB environment using the Denavit-Hartenberg (D-H) parameter method, together with the equations of motion of the structure. The Robotics Toolbox and GUIDE are applied jointly to the analysis and simulation of inverse kinematics and path planning, providing a preliminary solution to the positioning theory of a car-mounted mechanical arm for a student project, so as to achieve the intended aim.
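Forward kinematics from D-H parameters, the construction the abstract refers to, can be sketched in a few lines; the paper's own code (MATLAB Robotics Toolbox) is not shown, and the two-link planar arm below is a hypothetical example.

```python
import numpy as np

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows):
    """Chain the per-link transforms to get the base-to-end-effector pose."""
    T = np.eye(4)
    for row in dh_rows:               # row = (theta, d, a, alpha)
        T = T @ dh_matrix(*row)
    return T

# Hypothetical two-link planar arm, both links of length 1:
# joint 1 at 0 rad, joint 2 at pi/2 rad.
T = forward_kinematics([(0.0, 0.0, 1.0, 0.0),
                        (np.pi / 2, 0.0, 1.0, 0.0)])
# T[:3, 3] is the end-effector position in the base frame.
```

Inverse kinematics and path planning build on exactly this chain of transforms, which is what the Robotics Toolbox automates.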
Frequency Domain Identification Toolbox
NASA Technical Reports Server (NTRS)
Horta, Lucas G.; Juang, Jer-Nan; Chen, Chung-Wen
1996-01-01
This report documents software written in MATLAB programming language for performing identification of systems from frequency response functions. MATLAB is a commercial software environment which allows easy manipulation of data matrices and provides other intrinsic matrix functions capabilities. Algorithms programmed in this collection of subroutines have been documented elsewhere but all references are provided in this document. A main feature of this software is the use of matrix fraction descriptions and system realization theory to identify state space models directly from test data. All subroutines have templates for the user to use as guidelines.
Algorithm 971: An Implementation of a Randomized Algorithm for Principal Component Analysis
LI, HUAMIN; LINDERMAN, GEORGE C.; SZLAM, ARTHUR; STANTON, KELLY P.; KLUGER, YUVAL; TYGERT, MARK
2017-01-01
Recent years have witnessed intense development of randomized methods for low-rank approximation. These methods target principal component analysis and the calculation of truncated singular value decompositions. The present article presents an essentially black-box, foolproof implementation for Mathworks’ MATLAB, a popular software platform for numerical computation. As illustrated via several tests, the randomized algorithms for low-rank approximation outperform or at least match the classical deterministic techniques (such as Lanczos iterations run to convergence) in basically all respects: accuracy, computational efficiency (both speed and memory usage), ease-of-use, parallelizability, and reliability. However, the classical procedures remain the methods of choice for estimating spectral norms and are far superior for calculating the least singular values and corresponding singular vectors (or singular subspaces). PMID:28983138
Simulation of anaerobic digestion processes using stochastic algorithm.
Palanichamy, Jegathambal; Palani, Sundarambal
2014-01-01
The Anaerobic Digestion (AD) processes involve numerous complex biological and chemical reactions occurring simultaneously. Appropriate and efficient models are to be developed for the simulation of anaerobic digestion systems. Although several models have been developed, they mostly suffer from a lack of knowledge of constants, from complexity, and from weak generalization. The basis of the deterministic approach for modelling the physicochemical and biochemical reactions occurring in the AD system is the law of mass action, which gives a simple relationship between the reaction rates and the species concentrations. The assumptions made in the deterministic models do not hold true for reactions involving chemical species of low concentration. The stochastic behaviour of the physicochemical processes can be modelled at the mesoscopic level by application of stochastic algorithms. In this paper a stochastic algorithm (Gillespie Tau Leap Method) developed in MATLAB was applied to predict the concentrations of glucose, acids and methane formation at different time intervals; by this means the performance of the digester system can be controlled. The processes given by ADM1 (Anaerobic Digestion Model 1) were taken for verification of the model. The proposed model was verified by comparing the results of Gillespie's algorithm with the deterministic solution for the conversion of glucose into methane through degraders. At a higher value of 'τ' (timestep), the computational time required for reaching the steady state is longer since the number of chosen reactions is smaller. When the simulation time step is reduced, the results are similar to those of the ODE solver. It was concluded that the stochastic algorithm is a suitable approach for the simulation of complex anaerobic digestion processes. The accuracy of the results depends on the optimum selection of the tau value.
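The tau-leaping idea, firing a Poisson-distributed number of reaction events per fixed time step instead of simulating each event, can be sketched for a single toy reaction A → B; this stands in for the paper's glucose-to-methane chain and is not its MATLAB code.

```python
import numpy as np

def tau_leap(x0, c, tau, t_end, seed=0):
    """Gillespie tau-leaping sketch for the single mass-action reaction
    A -> B with propensity a(x) = c * A. Each leap of length tau fires
    Poisson(a * tau) reaction events at once."""
    rng = np.random.default_rng(seed)
    A, B = x0
    t, traj = 0.0, [(0.0, A, B)]
    while t < t_end and A > 0:
        a = c * A                      # propensity of the reaction
        k = rng.poisson(a * tau)       # number of firings in this leap
        k = min(k, A)                  # never fire more than A molecules
        A, B = A - k, B + k
        t += tau
        traj.append((t, A, B))
    return traj

traj = tau_leap((1000, 0), c=0.1, tau=0.1, t_end=20.0)
```

Shrinking tau recovers the exact-SSA (and, for large counts, the ODE) behaviour at the cost of more steps, which is the accuracy/run-time trade-off the abstract describes.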
Integrating products of Bessel functions with an additional exponential or rational factor
NASA Astrophysics Data System (ADS)
Van Deun, Joris; Cools, Ronald
2008-04-01
We provide two MATLAB programs to compute integrals of the form ∫₀^∞ x^m e^(−cx) ∏_{i=1}^{k} J_{ν_i}(a_i x) dx and ∫₀^∞ (x^m/(r+x)) ∏_{i=1}^{k} J_{ν_i}(a_i x) dx, with J_{ν_i}(x) the Bessel function of the first kind and (real) order ν_i. The parameter m is a real number such that ∑_i ν_i + m > −1 (to assure integrability near zero), r is real and the numbers c and a_i are all strictly positive. The programs can deliver accurate error estimates. Program summaryProgram title: BESSELINTR, BESSELINTC Catalogue identifier: AEAH_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEAH_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 1601 No. of bytes in distributed program, including test data, etc.: 13 161 Distribution format: tar.gz Programming language: Matlab (version ⩾6.5), Octave (version ⩾2.1.69) Computer: All supporting Matlab or Octave Operating system: All supporting Matlab or Octave RAM: For k Bessel functions our program needs approximately ( 500+140k) double precision variables Classification: 4.11 Nature of problem: The problem consists in integrating an arbitrary product of Bessel functions with an additional rational or exponential factor over a semi-infinite interval. Difficulties arise from the irregular oscillatory behaviour and the possible slow decay of the integrand, which prevents truncation at a finite point. Solution method: The interval of integration is split into a finite and infinite part. The integral over the finite part is computed using Gauss-Legendre quadrature. The integrand on the infinite part is approximated using asymptotic expansions and this approximation is integrated exactly with the aid of the upper incomplete gamma function. In the case where a rational factor is present, this factor is first expanded in a Taylor series around infinity.
Restrictions: Some (and eventually all) numerical accuracy is lost when one or more of the parameters r, c, a or ν grow very large, or when r becomes small. Running time: Less than 0.02 s for a simple problem (two Bessel functions, small parameters), a few seconds for a more complex problem (more than six Bessel functions, large parameters), in Matlab 7.4 (R2007a) on a 2.4 GHz AMD Opteron Processor 250. References:J. Van Deun, R. Cools, Algorithm 858: Computing infinite range integrals of an arbitrary product of Bessel functions, ACM Trans. Math. Software 32 (4) (2006) 580-596.
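As a quick sanity check on the simplest member of this family (k = 1, m = 0, exponential factor), the closed form ∫₀^∞ e^(−cx) J₀(ax) dx = 1/√(a²+c²) can be reproduced by naive truncated quadrature when the exponential decay is fast; this brute-force check is exactly what the asymptotic treatment above makes unnecessary for hard parameter ranges.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import jv

# Simplest case: one Bessel factor, m = 0, exponential weight e^{-cx}.
a, c = 1.0, 1.0
# Truncate at x = 200: e^{-x} makes the tail negligible for these values.
val, _ = quad(lambda x: np.exp(-c * x) * jv(0, a * x),
              0.0, 200.0, limit=500)
exact = 1.0 / np.sqrt(a * a + c * c)   # known closed form for this case
```

For small c (slow decay) or many Bessel factors, truncation fails and the paper's split-interval, asymptotic-expansion approach is needed.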
OMPC: an Open-Source MATLAB-to-Python Compiler.
Jurica, Peter; van Leeuwen, Cees
2009-01-01
Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose the Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open-source platform that, in many respects, surpasses commonly used, expensive commercial closed-source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.
Rose, Jonas; Otto, Tobias; Dittrich, Lars
2008-10-30
The Biopsychology-Toolbox is a free, open-source Matlab-toolbox for the control of behavioral experiments. The major aim of the project was to provide a set of basic tools that allow programming novices to control basic hardware used for behavioral experimentation without limiting the power and flexibility of the underlying programming language. The modular design of the toolbox allows portation of parts as well as entire paradigms between different types of hardware. In addition to the toolbox, this project offers a platform for the exchange of functions, hardware solutions and complete behavioral paradigms.
Operating a Geiger Müller tube using a PC sound card
NASA Astrophysics Data System (ADS)
Azooz, A. A.
2009-01-01
In this paper, a simple MATLAB-based PC program that enables the computer to function as a replacement for the electronic scaler-counter system associated with a Geiger-Müller (GM) tube is described. The program utilizes the ability of MATLAB to acquire data directly from the computer sound card. The signal from the GM tube is applied to the computer sound card via the line-in port. All standard GM experiments, including pulse-shape and statistical-analysis experiments, can be carried out using this system. A new visual demonstration of dead time effects is also presented.
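The core operation of such a software scaler, counting GM pulses as threshold crossings in the sampled line-in signal, can be sketched in Python on a synthetic record; the paper's MATLAB acquisition code is not shown, and the pulse shape and threshold below are illustrative assumptions.

```python
import numpy as np

def count_pulses(samples, threshold):
    """Count GM pulses as upward threshold crossings in a sampled signal,
    the basic operation a sound-card-based scaler performs."""
    above = samples > threshold
    # A pulse starts wherever the signal rises from below to above threshold.
    return int(np.sum(above[1:] & ~above[:-1]))

# Synthetic one-second "line in" record at a typical sound-card rate,
# with 7 short rectangular pulses on a flat baseline.
fs = 44100
signal = np.zeros(fs)
pulse_starts = np.linspace(1000, 40000, 7, dtype=int)
for s in pulse_starts:
    signal[s:s + 20] = 0.8          # 20-sample pulse, well above threshold
n = count_pulses(signal, threshold=0.5)
```

Dead-time effects show up in this framework as pulses closer together than the detector's recovery interval, which is why pulse-interval statistics fall out of the same sampled record.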
Intelligent traffic lights based on MATLAB
NASA Astrophysics Data System (ADS)
Nie, Ying
2018-04-01
In this paper, I describe a traffic-light system and some of its problems. Through analysis, I used MATLAB to transform camera photographs into digital signals, and divided road congestion into three levels: heavy congestion, moderate congestion, and light congestion. Through MCU programming, different roads are given different delay times. This method saves time and resources, and thus reduces road congestion.
GRAFLAB 2.3 for UNIX - A MATLAB database, plotting, and analysis tool: User's guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunn, W.N.
1998-03-01
This report is a user's manual for GRAFLAB, which is a new database, analysis, and plotting package that has been written entirely in the MATLAB programming language. GRAFLAB is currently used for data reduction, analysis, and archival. GRAFLAB was written to replace GRAFAID, which is a FORTRAN database, analysis, and plotting package that runs on VAX/VMS.
NASA Astrophysics Data System (ADS)
Sylwestrzak, Marcin; Szlag, Daniel; Marchand, Paul J.; Kumar, Ashwin S.; Lasser, Theo
2017-08-01
We present an application of massively parallel processing of quantitative flow measurement data acquired using spectral optical coherence microscopy (SOCM). The need for massive signal processing of these particular datasets has been a major hurdle for many applications based on SOCM. In view of this difficulty, we implemented and adapted quantitative total flow estimation algorithms on graphics processing units (GPU) and achieved a 150-fold reduction in processing time when compared to a former CPU implementation. As SOCM constitutes the microscopy counterpart to spectral optical coherence tomography (SOCT), the developed processing procedure can be applied to both imaging modalities. We present the developed DLL library integrated in MATLAB (with an example) and have included the source code for adaptations and future improvements. Catalogue identifier: AFBT_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AFBT_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU GPLv3 No. of lines in distributed program, including test data, etc.: 913552 No. of bytes in distributed program, including test data, etc.: 270876249 Distribution format: tar.gz Programming language: CUDA/C, MATLAB. Computer: Intel x64 CPU, GPU supporting CUDA technology. Operating system: 64-bit Windows 7 Professional. Has the code been vectorized or parallelized?: Yes, CPU code has been vectorized in MATLAB, CUDA code has been parallelized. RAM: Dependent on the user's parameters, typically between several gigabytes and several tens of gigabytes Classification: 6.5, 18.
Nature of problem: Speed up of data processing in optical coherence microscopy Solution method: Utilization of GPU for massively parallel data processing Additional comments: Compiled DLL library with source code and documentation, example of utilization (MATLAB script with raw data) Running time: 1.8 s for one B-scan (150 × faster in comparison to the CPU data processing time)
A general spectral method for the numerical simulation of one-dimensional interacting fermions
NASA Astrophysics Data System (ADS)
Clason, Christian; von Winckel, Gregory
2012-08-01
This software implements a general framework for the direct numerical simulation of systems of interacting fermions in one spatial dimension. The approach is based on a specially adapted nodal spectral Galerkin method, where the basis functions are constructed to obey the antisymmetry relations of fermionic wave functions. An efficient Matlab program for the assembly of the stiffness and potential matrices is presented, which exploits the combinatorial structure of the sparsity pattern arising from this discretization to achieve optimal run-time complexity. This program allows the accurate discretization of systems with multiple fermions subject to arbitrary potentials, e.g., for verifying the accuracy of multi-particle approximations such as Hartree-Fock in the few-particle limit. It can be used for eigenvalue computations or numerical solutions of the time-dependent Schrödinger equation. The new version includes a Python implementation of the presented approach. New version program summaryProgram title: assembleFermiMatrix Catalogue identifier: AEKO_v1_1 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEKO_v1_1.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 332 No. of bytes in distributed program, including test data, etc.: 5418 Distribution format: tar.gz Programming language: MATLAB/GNU Octave, Python Computer: Any architecture supported by MATLAB, GNU Octave or Python Operating system: Any supported by MATLAB, GNU Octave or Python RAM: Depends on the data Classification: 4.3, 2.2. External routines: Python 2.7+, NumPy 1.3+, SciPy 0.10+ Catalogue identifier of previous version: AEKO_v1_0 Journal reference of previous version: Comput. Phys. Commun. 
183 (2012) 405 Does the new version supersede the previous version?: Yes Nature of problem: The direct numerical solution of the multi-particle one-dimensional Schrödinger equation in a quantum well is challenging due to the exponential growth in the number of degrees of freedom with increasing particles. Solution method: A nodal spectral Galerkin scheme is used where the basis functions are constructed to obey the antisymmetry relations of the fermionic wave function. The assembly of these matrices is performed efficiently by exploiting the combinatorial structure of the sparsity patterns. Reasons for new version: A Python implementation is now included. Summary of revisions: Added a Python implementation; small documentation fixes in Matlab implementation. No change in features of the package. Restrictions: Only one-dimensional computational domains with homogeneous Dirichlet or periodic boundary conditions are supported. Running time: Seconds to minutes.
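The combinatorial structure that the assembly routine exploits can be illustrated in Python: the antisymmetric basis for p fermions over n one-particle nodal functions is indexed by strictly increasing index tuples, so its dimension is the binomial coefficient C(n, p). This is an illustration of the indexing idea only, not the package's assembly code.

```python
from itertools import combinations
from math import comb

def fermionic_basis(n_points, n_particles):
    """Enumerate antisymmetric (Slater-determinant-like) basis states as
    strictly increasing index tuples over the one-particle nodal basis;
    any permutation of a tuple is the same state up to sign."""
    return list(combinations(range(n_points), n_particles))

basis = fermionic_basis(10, 2)
# The dimension grows as C(n, p) -- the exponential growth in degrees of
# freedom with particle number that the abstract mentions.
dim = len(basis)
```

Mapping each basis state to such a tuple is what gives the stiffness and potential matrices their predictable sparsity pattern.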
NASA Astrophysics Data System (ADS)
Konnik, Mikhail V.; Welsh, James
2012-09-01
Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing software code of a numerical simulator makes it difficult to continue to support the code itself. The problem of adequate documentation of astronomical software for adaptive optics simulators may complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for the description of a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LATEX, Mercurial, Doxygen, and Perl. Using a Perl script that translates MATLAB M-file comments into C-like syntax, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for the framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
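The translation step described above, turning MATLAB `%` comment lines into C-style comment lines that Doxygen can parse, can be sketched with a single regex; the actual framework uses a Perl script, so this Python version only shows the idea, and the sample M-code is hypothetical.

```python
import re

def matlab_comments_to_doxygen(src):
    """Toy version of the comment-translation step: rewrite leading
    MATLAB '%' comment markers as C-style '///' markers so that Doxygen
    treats them as documentation lines."""
    return re.sub(r"(?m)^(\s*)%+\s?", r"\1/// ", src)

m_source = ("% ADSIM Simulate the adaptive optics loop.\n"
            "%   Detailed help text.\n"
            "x = 1;\n")
c_like = matlab_comments_to_doxygen(m_source)
```

A production script additionally has to leave `%` inside strings and code untouched and preserve Doxygen tags, which is why the real pipeline is more than one substitution.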
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poon, Justin; Sabondjian, Eric; Sankreacha, Raxa
Purpose: A robust Quality Assurance (QA) program is essential for prostate brachytherapy ultrasound systems due to the importance of imaging accuracy during treatment and planning. Task Group 128 of the American Association of Physicists in Medicine has recommended a set of QA tests covering grayscale visibility, depth of penetration, axial and lateral resolution, distance measurement, area measurement, volume measurement, and template/electronic grid alignment. Making manual measurements on the ultrasound system can be slow and inaccurate, so a MATLAB program was developed for automation of the described tests. Methods: Test images were acquired using a BK Medical Flex Focus 400 ultrasound scanner and 8848 transducer with the CIRS Brachytherapy QA Phantom – Model 045A. For each test, the program automatically segments the inputted image(s), makes the appropriate measurements, and indicates if the test passed or failed. The program was tested by analyzing two sets of images, where the measurements from the first set were used as baseline values. Results: The program successfully analyzed the images for each test and determined if any action limits were exceeded. All tests passed – the measurements made by the program were consistent and met the requirements outlined by Task Group 128. Conclusions: The MATLAB program we have developed can be used for automated QA of an ultrasound system for prostate brachytherapy. The GUI provides a user-friendly way to analyze images without the need for any manual measurement, potentially removing intra- and inter-user variability for more consistent results.
Probabilistic Design and Analysis Framework
NASA Technical Reports Server (NTRS)
Strack, William C.; Nagpal, Vinod K.
2010-01-01
PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level, and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages like System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
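The response surface method mentioned above amounts to fitting an inexpensive surrogate, commonly a quadratic polynomial, to a handful of deterministic runs; a minimal Python sketch of that fitting step, with a synthetic stand-in for the FEA stress response (not PRODAF's actual models or data):

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Least-squares quadratic response surface in two input variables:
    y ~ b0 + b1*x1 + b2*x2 + b3*x1*x2 + b4*x1^2 + b5*x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2,
                         x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, x1, x2):
    return coef @ np.array([1.0, x1, x2, x1 * x2, x1**2, x2**2])

# Samples from a known quadratic "response" standing in for FEA runs
# over perturbed geometry/load inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 2))
y = 2.0 + 0.5 * X[:, 0] - X[:, 1] + 3.0 * X[:, 0] ** 2
coef = fit_quadratic_surface(X, y)
```

Once fitted, the surrogate is cheap enough to sample many thousands of times for reliability and sensitivity estimates, which is what makes extrapolating component responses to the system level tractable.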
HEART: an automated beat-to-beat cardiovascular analysis package using Matlab.
Schroeder, Mark J.; Perreault, Bill; Ewert, Daniel L.; Koenig, Steven C.
2004-07-01
A computer program is described for beat-to-beat analysis of cardiovascular parameters from high-fidelity pressure and flow waveforms. The Hemodynamic Estimation and Analysis Research Tool (HEART) is a post-processing analysis software package developed in Matlab that enables scientists and clinicians to document, load, view, calibrate, and analyze experimental data that have been digitally saved in ASCII or binary format. Analysis routines include traditional hemodynamic parameter estimates as well as more sophisticated analyses such as lumped arterial model parameter estimation and vascular impedance frequency spectra. Cardiovascular parameter values of all analyzed beats can be viewed and statistically analyzed. An attractive feature of the HEART program is the ability to analyze data with visual quality assurance throughout the process, thus establishing a framework toward which Good Laboratory Practice (GLP) compliance can be obtained. Additionally, the development of HEART on the Matlab platform provides users with the flexibility to adapt or create study-specific analysis files according to their specific needs. Copyright 2003 Elsevier Ltd.
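Beat-to-beat parameter extraction of the kind HEART performs can be sketched compactly. The fragment below (in Python rather than Matlab, with a synthetic sinusoidal "pressure" wave standing in for real high-fidelity data) segments a waveform at diastolic troughs and reports per-beat systolic, diastolic, and mean pressures:

```python
import numpy as np
from scipy.signal import find_peaks

def beat_parameters(p, fs):
    """Split a pressure waveform into beats at diastolic troughs and
    report systolic, diastolic and mean pressure for each beat."""
    troughs, _ = find_peaks(-p, distance=int(0.4 * fs))  # >= 0.4 s apart
    beats = []
    for a, b in zip(troughs[:-1], troughs[1:]):
        seg = p[a:b]
        beats.append({"sys": seg.max(), "dia": seg.min(), "map": seg.mean()})
    return beats

# Synthetic 1 Hz "pressure" wave oscillating between 80 and 120 mmHg
fs = 250
t = np.arange(0, 5, 1 / fs)
p = 100 + 20 * np.sin(2 * np.pi * 1.0 * t)
beats = beat_parameters(p, fs)
```

A real implementation would add calibration, artifact rejection, and the visual quality-assurance step the abstract emphasizes.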
Earth Science Curriculum Enrichment Through Matlab!
NASA Astrophysics Data System (ADS)
Salmun, H.; Buonaiuto, F. S.
2016-12-01
The use of Matlab in Earth Science undergraduate courses in the Department of Geography at Hunter College began as a pilot project in Fall 2008 and has evolved into a significant component of an Advanced Oceanography course, the selected tool for data analysis in other courses, and the main focus of a graduate course for doctoral students at The City University of New York (CUNY) working on research related to geophysical, oceanic and atmospheric dynamics. The primary objectives of these efforts were to enhance the Earth Science curriculum through course-specific applications, to increase undergraduate programming and data analysis skills, and to develop a Matlab users network within the Department and the broader Hunter College and CUNY community. Students have had the opportunity to learn Matlab as a stand-alone course, within an independent study group, or as a laboratory component within related STEM classes. All of these instructional efforts incorporated the use of prepackaged Matlab exercises and a research project. Initial exercises were designed to cover basic scripting and data visualization techniques. Students were provided data and a skeleton script to modify and improve upon based on the laboratory instructions. As students' programming skills increased throughout the semester, more advanced scripting, data mining and data analysis were assigned. In order to illustrate the range of applications within the Earth Sciences, laboratory exercises were constructed around topics selected from the disciplines of Geology, Physics, Oceanography, Meteorology and Climatology. In addition, the structure of the research component of the courses included both individual and team projects.
Multiscale Hy3S: hybrid stochastic simulation for supercomputers.
Salis, Howard; Sotiropoulos, Vassilios; Kaznessis, Yiannis N
2006-02-24
Stochastic simulation has become a useful tool to both study natural biological systems and design new synthetic ones. By capturing the intrinsic molecular fluctuations of "small" systems, these simulations produce a more accurate picture of single cell dynamics, including interesting phenomena missed by deterministic methods, such as noise-induced oscillations and transitions between stable states. However, the computational cost of the original stochastic simulation algorithm can be high, motivating the use of hybrid stochastic methods. Hybrid stochastic methods partition the system into multiple subsets and describe each subset as a different representation, such as a jump Markov, Poisson, continuous Markov, or deterministic process. By applying valid approximations and self-consistently merging disparate descriptions, a method can be considerably faster, while retaining accuracy. In this paper, we describe Hy3S, a collection of multiscale simulation programs. Building on our previous work on developing novel hybrid stochastic algorithms, we have created the Hy3S software package to enable scientists and engineers to both study and design extremely large well-mixed biological systems with many thousands of reactions and chemical species. We have added adaptive stochastic numerical integrators to permit the robust simulation of dynamically stiff biological systems. In addition, Hy3S has many useful features, including embarrassingly parallelized simulations with MPI; special discrete events, such as transcriptional and translational elongation and cell division; mid-simulation perturbations in both the number of molecules of species and reaction kinetic parameters; combinatorial variation of both initial conditions and kinetic parameters to enable sensitivity analysis; use of NetCDF optimized binary format to quickly read and write large datasets; and a simple graphical user interface, written in Matlab, to help users create biological systems and analyze data.
We demonstrate the accuracy and efficiency of Hy3S with examples, including a large-scale system benchmark and a complex bistable biochemical network with positive feedback. The software itself is open-sourced under the GPL license and is modular, allowing users to modify it for their own purposes. Hy3S is a powerful suite of simulation programs for simulating the stochastic dynamics of networks of biochemical reactions. Its first public version enables computational biologists to more efficiently investigate the dynamics of realistic biological systems.
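The jump-process limit of such hybrid schemes is Gillespie's stochastic simulation algorithm (SSA). As a minimal illustration (sketched in Python; the birth-death reaction pair and rate constants are hypothetical, not taken from Hy3S):

```python
import numpy as np

def ssa_birth_death(k_prod, k_deg, x0, t_end, rng):
    """Gillespie SSA for  0 -> X (rate k_prod)  and  X -> 0 (rate k_deg*X).
    Returns the molecule count at time t_end."""
    t, x = 0.0, x0
    while True:
        a1, a2 = k_prod, k_deg * x          # reaction propensities
        a0 = a1 + a2
        t += rng.exponential(1.0 / a0)      # exponential time to next event
        if t > t_end:
            return x
        x += 1 if rng.random() * a0 < a1 else -1

rng = np.random.default_rng(1)
finals = [ssa_birth_death(10.0, 1.0, 0, 20.0, rng) for _ in range(200)]
mean_x = np.mean(finals)   # stationary distribution is Poisson(k_prod/k_deg)
```

For this system the stationary mean is k_prod/k_deg = 10, which the ensemble average approaches; hybrid methods like those in Hy3S replace fast reaction channels with continuous approximations to avoid simulating every jump.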
Flight Dynamics and Control of a Morphing UAV: Bio inspired by Natural Fliers
2017-02-17
Approved for public release: distribution unlimited. Modelling and Sizing: the Tornado Vortex Lattice Method (VLM) was used for aerodynamic prediction. Tornado is Vortex Lattice Method software programmed in MATLAB; it was selected due to its fast solving time and its ability to be controlled through custom MATLAB scripts. Tornado VLM models the wing as a thin sheet of discrete vortices and computes the pressure and force distributions around the …
2004-11-16
MATLAB Algorithms for Rapid Detection and Embedding of Palindrome and Emordnilap Electronic Watermarks in Simulated Chemical and Biological Image Data. Presented at the Conference on Chemical and Biological Defense Research, held in Hunt Valley, Maryland, 15-17 November 2004. The original document contains color images.
Nuclear Fuel Depletion Analysis Using Matlab Software
NASA Astrophysics Data System (ADS)
Faghihi, F.; Nematollahi, M. R.
Coupled first-order initial value problems (IVPs) arise frequently in many areas of engineering and science. In this article, we present a code comprising three computer programs, written for the Matlab environment, to solve and plot the solutions of first-order coupled stiff or non-stiff IVPs. Several engineering and scientific problems formulated as IVPs are given, and fuel depletion (production of the 239Pu isotope) in a Pressurized Water Nuclear Reactor (PWR) is computed with the present code.
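In Python, the analogous stiff-IVP computation takes a few lines with an implicit integrator. The two-step chain and rate constants below are hypothetical stand-ins for a depletion chain, chosen with widely separated scales to make the system stiff, not reactor data:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical chain N1 -> N2 -> (removed) with widely separated rates
lam1, lam2 = 1.0, 1.0e4

def rhs(t, y):
    n1, n2 = y
    return [-lam1 * n1, lam1 * n1 - lam2 * n2]

# BDF is an implicit method suited to stiff systems
sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 0.0], method="BDF",
                rtol=1e-8, atol=1e-12, t_eval=[5.0])
n1_exact = np.exp(-lam1 * 5.0)   # analytic solution for the first species
```

An explicit solver would need step sizes of order 1/lam2 here; the implicit method steps over the fast transient while keeping the slow component accurate.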
Ravi, Keerthi Sravan; Potdar, Sneha; Poojar, Pavan; Reddy, Ashok Kumar; Kroboth, Stefan; Nielsen, Jon-Fredrik; Zaitsev, Maxim; Venkatesan, Ramesh; Geethanath, Sairam
2018-03-11
To provide a single open-source platform for comprehensive MR algorithm development inclusive of simulations, pulse sequence design and deployment, reconstruction, and image analysis. We integrated the "Pulseq" platform for vendor-independent pulse programming with Graphical Programming Interface (GPI), a scientific development environment based on Python. Our integrated platform, Pulseq-GPI, permits sequences to be defined visually and exported to the Pulseq file format for execution on an MR scanner. For comparison, Pulseq files using either MATLAB only ("MATLAB-Pulseq") or Python only ("Python-Pulseq") were generated. We demonstrated three fundamental sequences on a 1.5 T scanner. Execution times of the three variants of implementation were compared on two operating systems. In vitro phantom images indicate equivalence with the vendor-supplied implementations and MATLAB-Pulseq. The examples demonstrated in this work illustrate the unifying capability of Pulseq-GPI. The execution times of all three implementations were fast (a few seconds). The software is capable of user-interface based development and/or command line programming. The tool demonstrated here, Pulseq-GPI, integrates the open-source simulation, reconstruction and analysis capabilities of GPI Lab with the pulse sequence design and deployment features of Pulseq. Current and future work includes providing an ISMRMRD interface and incorporating Specific Absorption Rate and Peripheral Nerve Stimulation computations. Copyright © 2018 Elsevier Inc. All rights reserved.
A Simulation Program for Dynamic Infrared (IR) Spectra
ERIC Educational Resources Information Center
Zoerb, Matthew C.; Harris, Charles B.
2013-01-01
A free program for the simulation of dynamic infrared (IR) spectra is presented. The program simulates the spectrum of two exchanging IR peaks based on simple input parameters. Larger systems can be simulated with minor modifications. The program is available as an executable program for PCs or can be run in MATLAB on any operating system. Source…
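A standard lineshape model for two exchanging peaks (not necessarily the exact one used by the program) is the classical two-site exchange picture, in which exchange at rate k couples two equally populated Lorentzian bands; for k = 0 the spectrum shows two distinct peaks, and for large k they coalesce into one. A Python sketch:

```python
import numpy as np

def exchange_spectrum(w, w1, w2, k, T2=1.0):
    """Absorption lineshape for two equally populated bands at w1 and w2
    exchanging at rate k (classical two-site Bloch-McConnell picture)."""
    p = np.array([0.5, 0.5])
    out = np.empty_like(w)
    for i, wi in enumerate(w):
        # Complex line matrix: relaxation + exchange coupling + detuning
        A = np.array([[1/T2 + k + 1j*(wi - w1), -k],
                      [-k, 1/T2 + k + 1j*(wi - w2)]])
        out[i] = np.linalg.solve(A, p).sum().real
    return out

w = np.linspace(-10, 10, 401)
slow = exchange_spectrum(w, -5.0, 5.0, k=0.0)    # two resolved Lorentzians
fast = exchange_spectrum(w, -5.0, 5.0, k=100.0)  # one coalesced band at 0
```

Sweeping k between these limits reproduces the familiar broadening-and-coalescence sequence of dynamic spectra.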
Do rational numbers play a role in selection for stochasticity?
Sinclair, Robert
2014-01-01
When a given tissue must, to be able to perform its various functions, consist of different cell types, each fairly evenly distributed and with specific probabilities, then there are at least two quite different developmental mechanisms which might achieve the desired result. Let us begin with the case of two cell types, and first imagine that the proportion of numbers of cells of these types should be 1:3. Clearly, a regular structure composed of repeating units of four cells, three of which are of the dominant type, will easily satisfy the requirements, and a deterministic mechanism may lend itself to the task. What if, however, the proportion should be 10:33? The same simple, deterministic approach would now require a structure of repeating units of 43 cells, and this certainly seems to require a far more complex and potentially prohibitive deterministic developmental program. Stochastic development, replacing regular units with random distributions of given densities, might not be evolutionarily competitive in comparison with the deterministic program when the proportions should be 1:3, but it has the property that, whatever developmental mechanism underlies it, its complexity does not need to depend very much upon target cell densities at all. We are immediately led to speculate that proportions which correspond to fractions with large denominators (such as the 33 of 10/33) may be more easily achieved by stochastic developmental programs than by deterministic ones, and this is the core of our thesis: that stochastic development may tend to occur more often in cases involving rational numbers with large denominators. To be imprecise: that simple rationality and determinism belong together, as do irrationality and randomness.
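The arithmetic behind the argument is simply that a proportion a:b, once reduced to lowest terms, requires a deterministic repeating unit of a + b cells; a one-line sketch in Python:

```python
from math import gcd

def repeating_unit_size(a, b):
    """Smallest repeating unit realising cell-type proportion a:b
    deterministically: reduce the ratio, then count both kinds of cell."""
    g = gcd(a, b)
    return a // g + b // g
```

This reproduces the abstract's examples: 1:3 needs a 4-cell unit, while 10:33 needs a 43-cell unit.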
NASA Astrophysics Data System (ADS)
Caplan, R. M.
2013-04-01
We present a simple-to-use, yet powerful code package called NLSEmagic to numerically integrate the nonlinear Schrödinger equation in one, two, and three dimensions. NLSEmagic is a high-order finite-difference code package which utilizes graphic processing unit (GPU) parallel architectures. The codes running on the GPU are many times faster than their serial counterparts, and are much cheaper to run than on standard parallel clusters. The codes are developed with usability and portability in mind, and therefore are written to interface with MATLAB utilizing custom GPU-enabled C codes with the MEX-compiler interface. The packages are freely distributed, including user manuals and set-up files. Catalogue identifier: AEOJ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOJ_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 124453 No. of bytes in distributed program, including test data, etc.: 4728604 Distribution format: tar.gz Programming language: C, CUDA, MATLAB. Computer: PC, MAC. Operating system: Windows, MacOS, Linux. Has the code been vectorized or parallelized?: Yes. Number of processors used: Single CPU, number of GPU processors dependent on chosen GPU card (max is currently 3072 cores on GeForce GTX 690). Supplementary material: Setup guide, Installation guide. RAM: Highly dependent on dimensionality and grid size. For typical medium-large problem size in three dimensions, 4 GB is sufficient. Keywords: Nonlinear Schrödinger Equation, GPU, high-order finite difference, Bose-Einstein condensates. Classification: 4.3, 7.7. Nature of problem: Integrate solutions of the time-dependent one-, two-, and three-dimensional cubic nonlinear Schrödinger equation.
Solution method: The integrators utilize a fully-explicit fourth-order Runge-Kutta scheme in time and both second- and fourth-order differencing in space. The integrators are written to run on NVIDIA GPUs and are interfaced with MATLAB including built-in visualization and analysis tools. Restrictions: The main restriction for the GPU integrators is the amount of RAM on the GPU as the code is currently only designed for running on a single GPU. Unusual features: Ability to visualize real-time simulations through the interaction of MATLAB and the compiled GPU integrators. Additional comments: Setup guide and Installation guide provided. Program has a dedicated web site at www.nlsemagic.com. Running time: A three-dimensional run with a grid dimension of 87×87×203 for 3360 time steps (100 non-dimensional time units) takes about one and a half minutes on a GeForce GTX 580 GPU card.
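The time-stepping scheme is easy to sketch on a CPU. The following Python fragment integrates the 1D focusing cubic NLSE with second-order central differences and classical RK4 (the package itself also offers fourth-order differencing and runs the equivalent loop in CUDA); the grid, step sizes, and initial profile are illustrative:

```python
import numpy as np

# Focusing cubic NLSE  i*psi_t + psi_xx + |psi|^2 psi = 0  on a periodic grid
N, L, dt = 128, 20.0, 1e-3
x = np.linspace(-L/2, L/2, N, endpoint=False)
dx = x[1] - x[0]

def rhs(psi):
    # 2nd-order central-difference Laplacian with periodic boundaries
    lap = (np.roll(psi, 1) - 2*psi + np.roll(psi, -1)) / dx**2
    return 1j * (lap + np.abs(psi)**2 * psi)

psi = 1.0 / np.cosh(x)                  # sech initial profile
mass0 = np.sum(np.abs(psi)**2) * dx     # L2 "mass", conserved by the PDE
for _ in range(50):
    k1 = rhs(psi)
    k2 = rhs(psi + 0.5*dt*k1)
    k3 = rhs(psi + 0.5*dt*k2)
    k4 = rhs(psi + dt*k3)
    psi = psi + (dt/6)*(k1 + 2*k2 + 2*k3 + k4)
mass = np.sum(np.abs(psi)**2) * dx      # drifts only at O(dt^4) per unit time
```

Because the explicit RK4 scheme is only conditionally stable, dt must stay below roughly 2.8·dx²/4 for the linear part; checking conservation of the mass integral is a cheap sanity test on any run.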
ALGORITHMS AND PROGRAMS FOR STRONG GRAVITATIONAL LENSING IN KERR SPACE-TIME INCLUDING POLARIZATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Bin; Maddumage, Prasad; Kantowski, Ronald
2015-05-15
Active galactic nuclei (AGNs) and quasars are important astrophysical objects to understand. Recently, microlensing observations have constrained the size of the quasar X-ray emission region to be of the order of 10 gravitational radii of the central supermassive black hole. For distances within a few gravitational radii, light paths are strongly bent by the strong gravity field of the central black hole. If the central black hole has nonzero angular momentum (spin), then a photon’s polarization plane will be rotated by the gravitational Faraday effect. The observed X-ray flux and polarization will then be influenced significantly by the strong gravity field near the source. Consequently, linear gravitational lensing theory is inadequate for such extreme circumstances. We present simple algorithms computing the strong lensing effects of Kerr black holes, including the effects on polarization. Our algorithms are realized in a program “KERTAP” in two versions: MATLAB and Python. The key ingredients of KERTAP are a graphic user interface, a backward ray-tracing algorithm, a polarization propagator dealing with gravitational Faraday rotation, and algorithms computing observables such as flux magnification and polarization angles. Our algorithms can be easily realized in other programming languages such as FORTRAN, C, and C++. The MATLAB version of KERTAP is parallelized using the MATLAB Parallel Computing Toolbox and the Distributed Computing Server. The Python code was sped up using Cython and supports full implementation of MPI using the “mpi4py” package. As an example, we investigate the inclination angle dependence of the observed polarization and the strong lensing magnification of AGN X-ray emission. We conclude that it is possible to perform complex numerical-relativity related computations using interpreted languages such as MATLAB and Python.
NASA Astrophysics Data System (ADS)
Bakhtavar, E.
2015-09-01
In this study, the transition from open-pit to block-cave mining has been considered as a challenging problem. For this purpose, a linear integer programming code was initially developed in Matlab on the basis of the binary integer model proposed by Bakhtavar et al. (2012). Then a program with a graphical user interface (GUI) was built and named "Op-Ug TD Optimizer". It is a useful tool for straightforward application of the model in all situations where open-pit mining is considered together with the block-caving method for extracting an ore deposit. Finally, the use of Op-Ug TD Optimizer is explained step by step by solving the open-pit-to-block-caving transition problem for a case ore deposit. This paper considers the complex problem of the transition from open-pit to block-cave mining. To this end, a linear programming code was developed in the MATLAB environment based on the binary integer model proposed by Bakhtavar (2012). A program with a graphical user interface, named Op-Ug TD Optimizer, was then created; it is a valuable tool enabling application of the model in all situations where both open-pit and block-cave extraction of an ore deposit are under consideration. The final part of the paper gives step-by-step instructions for using the Optimizer program on an example of the transition from open-pit to block-cave mining of an iron ore deposit.
Algorithms and Programs for Strong Gravitational Lensing In Kerr Space-time Including Polarization
NASA Astrophysics Data System (ADS)
Chen, Bin; Kantowski, Ronald; Dai, Xinyu; Baron, Eddie; Maddumage, Prasad
2015-05-01
Active galactic nuclei (AGNs) and quasars are important astrophysical objects to understand. Recently, microlensing observations have constrained the size of the quasar X-ray emission region to be of the order of 10 gravitational radii of the central supermassive black hole. For distances within a few gravitational radii, light paths are strongly bent by the strong gravity field of the central black hole. If the central black hole has nonzero angular momentum (spin), then a photon’s polarization plane will be rotated by the gravitational Faraday effect. The observed X-ray flux and polarization will then be influenced significantly by the strong gravity field near the source. Consequently, linear gravitational lensing theory is inadequate for such extreme circumstances. We present simple algorithms computing the strong lensing effects of Kerr black holes, including the effects on polarization. Our algorithms are realized in a program “KERTAP” in two versions: MATLAB and Python. The key ingredients of KERTAP are a graphic user interface, a backward ray-tracing algorithm, a polarization propagator dealing with gravitational Faraday rotation, and algorithms computing observables such as flux magnification and polarization angles. Our algorithms can be easily realized in other programming languages such as FORTRAN, C, and C++. The MATLAB version of KERTAP is parallelized using the MATLAB Parallel Computing Toolbox and the Distributed Computing Server. The Python code was sped up using Cython and supports full implementation of MPI using the “mpi4py” package. As an example, we investigate the inclination angle dependence of the observed polarization and the strong lensing magnification of AGN X-ray emission. We conclude that it is possible to perform complex numerical-relativity related computations using interpreted languages such as MATLAB and Python.
Synthesis of multi-loop automatic control systems by the nonlinear programming method
NASA Astrophysics Data System (ADS)
Voronin, A. V.; Emelyanova, T. A.
2017-01-01
The article deals with the problem of calculating optimal tuning parameters for multi-loop control systems using numerical methods of nonlinear programming. For this purpose, the Optimization Toolbox of Matlab is used.
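The same kind of optimal-tuning computation can be sketched with any general-purpose optimizer. Below is a hypothetical single-loop example in Python: a PI controller on a first-order plant, tuned by minimizing the integral of squared error with a Nelder-Mead search (the plant, initial gains, and horizon are all illustrative, not taken from the article):

```python
import numpy as np
from scipy.optimize import minimize

def ise(gains, tau=1.0, dt=0.01, t_end=10.0):
    """Integral-of-squared-error cost for a PI controller driving the
    first-order plant  tau*dy/dt = -y + u  to a unit setpoint."""
    kp, ki = gains
    if kp < 0 or ki < 0:
        return 1e9                       # keep the search feasible
    y, integ, cost = 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):     # forward-Euler closed-loop simulation
        e = 1.0 - y
        integ += e * dt
        u = kp * e + ki * integ
        y += dt * (-y + u) / tau
        cost += e * e * dt
    return cost

x0 = np.array([1.0, 1.0])
res = minimize(ise, x0, method="Nelder-Mead")   # tuned (kp, ki) in res.x
```

Multi-loop tuning proceeds the same way, with one simulated cost over all loops and one gain vector covering every controller.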
TU-AB-303-11: Predict Parotids Deformation Applying SIS Epidemiological Model in H&N Adaptive RT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maffei, N; Guidi, G; University of Bologna, Bologna, Bologna
2015-06-15
Purpose: The aim is to investigate the use of epidemiological models to predict morphological variations in patients undergoing radiation therapy (RT). The deterministic susceptible-infected-susceptible (SIS) model was applied to simulate warping within a focused region of interest (ROI). The hypothesis is to treat each voxel as a single subject of the whole sample, and displacement vector fields as an infection. Methods: Using RayStation hybrid deformation algorithms and automatic re-contouring based on a mesh grid, we post-processed 360 MVCT images of 12 H&N patients treated with Tomotherapy. The study focused on the parotid glands, identified by the literature and previous analysis as the ROI most susceptible to warping in the H&N region. Susceptible (S) and infectious (I) cases were identified in voxels with inter-fraction movement respectively under and over a set threshold. IronPython scripting allowed the export of positions and displacement data of surface voxels for every fraction. A homemade MATLAB toolbox was developed to model the SIS. Results: The SIS model was validated by simulating organ motion on a QUASAR phantom. Applying the model to patients, within a [0–1 cm] range, a single-voxel movement of 0.4 cm was selected as the displacement threshold. SIS indexes were evaluated by MATLAB simulations. A dynamic time warping algorithm was used to assess the match between the model and parotid behaviour across treatment days. The best fit of the model was obtained with a contact rate of 7.89±0.94 and a recovery rate of 2.36±0.21. Conclusion: The SIS model can follow daily structure evolution, making it possible to compare warping conditions and highlighting challenges due to abnormal variations and set-up errors. With the epidemiological approach, organ motion could be assessed and predicted not as an average over the whole ROI, but as a voxel-by-voxel deterministic trend.
Identifying the anatomical regions subject to variation would make it possible to focus clinical controls on a cohort of pre-selected patients eligible for adaptive RT. The research is partially co-funded by the Italian Research Grant: Dose warping methods for IGRT and Adaptive RT: dose accumulation based on organ motion and anatomical variations of the patients during radiation therapy treatments, MoH (GR-2010-2318757) and Tecnologie Avanzate S.r.l. (Italy).
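For the deterministic SIS dynamics described above, the infected ("warped") fraction I obeys dI/dt = βI(1−I) − γI, whose endemic equilibrium is I* = 1 − γ/β. A minimal Python sketch, using the fitted contact and recovery rates reported in the abstract (the initial condition and horizon are illustrative):

```python
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma = 7.89, 2.36    # contact and recovery rates from the fit above

def sis(t, y):
    i = y[0]                # infected ("warped") fraction; susceptible = 1 - i
    return [beta * i * (1 - i) - gamma * i]

sol = solve_ivp(sis, (0.0, 10.0), [0.01], rtol=1e-8, atol=1e-10)
i_final = sol.y[0, -1]
i_star = 1 - gamma / beta   # endemic equilibrium, ~0.70 for these rates
```

With β > γ the model predicts a stable endemic level of warped voxels rather than decay to zero, consistent with persistent inter-fraction deformation.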
Coded Modulation in C and MATLAB
NASA Technical Reports Server (NTRS)
Hamkins, Jon; Andrews, Kenneth S.
2011-01-01
This software, written separately in C and MATLAB as stand-alone packages with equivalent functionality, implements encoders and decoders for a set of nine error-correcting codes and modulators and demodulators for five modulation types. The software can be used as a single program to simulate the performance of such coded modulation. The error-correcting codes implemented are the nine accumulate repeat-4 jagged accumulate (AR4JA) low-density parity-check (LDPC) codes, which have been approved for international standardization by the Consultative Committee for Space Data Systems, and which are scheduled to fly on a series of NASA missions in the Constellation Program. The software implements the encoder and decoder functions, and contains compressed versions of generator and parity-check matrices used in these operations.
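The end-to-end mechanics of simulating coded modulation (encode, modulate, add noise, demodulate, decode) can be shown with a deliberately tiny Python sketch using a rate-1/3 repetition code and BPSK. This is a toy stand-in chosen for brevity, not the AR4JA LDPC codes the package actually implements:

```python
import numpy as np

def encode(bits, n=3):
    """Rate-1/n repetition code (a toy; real systems use LDPC codes)."""
    return np.repeat(bits, n)

def modulate(code_bits):
    return 1.0 - 2.0 * code_bits        # BPSK: 0 -> +1, 1 -> -1

def decode(symbols, n=3):
    hard = (symbols < 0).astype(int)    # hard-decision demodulation
    return (hard.reshape(-1, n).sum(axis=1) > n // 2).astype(int)

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 100)
rx = modulate(encode(bits)) + 0.3 * rng.normal(size=300)  # AWGN channel
decoded = decode(rx)
```

Majority voting corrects any single flipped symbol per triplet; performance curves for real coded modulation are produced by sweeping the noise level and counting residual bit errors, exactly as the C/MATLAB packages do at far greater sophistication.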
ERIC Educational Resources Information Center
Arrabal-Campos, Francisco M.; Cortés-Villena, Alejandro; Fernández, Ignacio
2017-01-01
This paper presents a programming project named NMRviewer that allows students to visualize transformed and processed 1H NMR data in an accessible, interactive format while allowing instructors to incorporate programming content into the chemistry curricula. Using the MATLAB graphical user interface development environment (GUIDE), students can…
How to get students to love (or not hate) MATLAB and programming
NASA Astrophysics Data System (ADS)
Reckinger, Shanon; Reckinger, Scott
2014-11-01
An effective programming course geared toward engineering students requires the utilization of modern teaching philosophies. A newly designed course that focuses on programming in MATLAB involves flipping the classroom and integrating various active teaching techniques. Vital aspects of the new course design include: lengthening in-class contact hours, Process-Oriented Guided Inquiry Learning (POGIL) method worksheets (self-guided instruction), student created video content posted on YouTube, clicker questions (used in class to practice reading and debugging code), programming exams that don't require computers, integrating oral exams into the classroom, fostering an environment for formal and informal peer learning, and designing in a broader theme to tie together assignments. However, possibly the most important piece to this programming course puzzle: the instructor needs to be able to find programming mistakes very fast and then lead individuals and groups through the steps to find their mistakes themselves. The effectiveness of the new course design is demonstrated through pre- and post- concept exam results and student evaluation feedback. Students reported that the course was challenging and required a lot of effort, but left largely positive feedback.
A Compilation of MATLAB Scripts and Functions for MACGMC Analyses
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Bednarcyk, Brett A.; Mital, Subodh K.
2017-01-01
The primary aim of the current effort is to provide scripts that automate many of the repetitive pre- and post-processing tasks associated with composite materials analyses using the Micromechanics Analysis Code with the Generalized Method of Cells (MACGMC). This document consists of a compilation of hundreds of scripts that were developed in the MATLAB (The Mathworks, Inc., Natick, MA) programming language and consolidated into 16 MATLAB functions. MACGMC is a composite material and laminate analysis software code developed at NASA Glenn Research Center. The software package has been built around the generalized method of cells (GMC) family of micromechanics theories. The computer code is developed with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make it more robust in terms of input preparation and code execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The pre-processing tasks include generation of a multitude of different repeating unit cells (RUCs) for CMCs and PMCs, visualization of RUCs from MACGMC input and output files, and generation of the RUC section of a MACGMC input file. The post-processing tasks include visualization of the predicted composite response, such as local stress and strain contours, damage initiation and progression, stress-strain behavior, and fatigue response.
In addition to the above, several miscellaneous scripts have been developed that can be used to perform repeated Monte-Carlo simulations to enable probabilistic simulations with minimal manual intervention. This document is formatted to provide MATLAB source files and descriptions of how to utilize them. It is assumed that the user has a basic understanding of how MATLAB scripts work and some MATLAB programming experience.
NASA Astrophysics Data System (ADS)
Orr, C. H.; Mcfadden, R. R.; Manduca, C. A.; Kempler, L. A.
2016-12-01
Teaching with data, simulations, and models in the geosciences can increase many facets of student success in the classroom and in the workforce. Teaching undergraduates about programming and improving students' quantitative and computational skills expands their perception of Geoscience beyond field-based studies. Processing data and developing quantitative models are critically important for Geoscience students. Students need to be able to perform calculations, analyze data, create numerical models and visualizations, and more deeply understand complex systems—all essential aspects of modern science. These skills require students to have comfort and skill with languages and tools such as MATLAB. To achieve comfort and skill, computational and quantitative thinking must build over a 4-year degree program across courses and disciplines. However, in courses focused on Geoscience content it can be challenging to get students comfortable with using computational methods to answer Geoscience questions. To help bridge this gap, we have partnered with MathWorks to develop two workshops focused on collecting and developing strategies and resources to help faculty teach students to incorporate data, simulations, and models into the curriculum at the course and program levels. We brought together faculty members from the sciences, including Geoscience and allied fields, who teach computation and quantitative thinking skills using MATLAB to build a resource collection for teaching. These materials and the outcomes of the workshops are freely available on our website. The workshop outcomes include a collection of teaching activities, essays, and course descriptions that can help faculty incorporate computational skills at the course or program level. The teaching activities include in-class assignments, problem sets, labs, projects, and toolboxes. These activities range from programming assignments to creating and using models.
The outcomes also include workshop syntheses that highlight best practices, a set of webpages to support teaching with software such as MATLAB, and an interest group actively discussing aspects of these issues in Geoscience and allied fields. Learn more and view the resources at http://serc.carleton.edu/matlab_computation2016/index.html
NASA Astrophysics Data System (ADS)
Reimer, Ashton S.; Cheviakov, Alexei F.
2013-03-01
A Matlab-based finite-difference numerical solver for the Poisson equation for a rectangle and a disk in two dimensions, and a spherical domain in three dimensions, is presented. The solver is optimized for handling an arbitrary combination of Dirichlet and Neumann boundary conditions, and allows for full user control of mesh refinement. The solver routines utilize effective and parallelized sparse vector and matrix operations. Computations exhibit high speeds, numerical stability with respect to mesh size and mesh refinement, and acceptable error values even on desktop computers. Catalogue identifier: AENQ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENQ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License v3.0 No. of lines in distributed program, including test data, etc.: 102793 No. of bytes in distributed program, including test data, etc.: 369378 Distribution format: tar.gz Programming language: Matlab 2010a. Computer: PC, Macintosh. Operating system: Windows, OSX, Linux. RAM: 8 GB (8,589,934,592 bytes) Classification: 4.3. Nature of problem: To solve the Poisson problem in a standard domain with “patchy surface”-type (strongly heterogeneous) Neumann/Dirichlet boundary conditions. Solution method: Finite difference with mesh refinement. Restrictions: Spherical domain in 3D; rectangular domain or a disk in 2D. Unusual features: Choice between mldivide/iterative solver for the solution of the large systems of linear algebraic equations that arise. Full user control of Neumann/Dirichlet boundary conditions and mesh refinement. Running time: Depending on the number of points taken and the geometry of the domain, the routine may take from less than a second to several hours to execute.
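The finite-difference idea behind such a solver is easiest to see in one dimension (the solver itself handles 2D rectangles/disks and 3D spheres with mesh refinement; the Python sketch below is only the 1D Dirichlet analogue, with a manufactured solution for checking the error):

```python
import numpy as np

# Second-order finite differences for  -u'' = f  on (0,1), u(0) = u(1) = 0
n = 99                                   # interior points, h = 1/(n+1)
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)

# Tridiagonal stiffness matrix [ -1, 2, -1 ] / h^2 (dense here for brevity;
# a real solver would use sparse storage as the abstract describes)
A = (np.diag(np.full(n, 2.0)) + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1)) / h**2

f = np.pi**2 * np.sin(np.pi * x)         # chosen so u_exact = sin(pi*x)
u = np.linalg.solve(A, f)
err = np.max(np.abs(u - np.sin(np.pi * x)))   # O(h^2) discretization error
```

Halving h reduces the error roughly fourfold, which is the second-order convergence that mesh-refinement studies verify.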
Vellela, Melissa; Qian, Hong
2009-10-06
Schlögl's model is the canonical example of a chemical reaction system that exhibits bistability. Because the biological examples of bistability and switching behaviour are increasingly numerous, this paper presents an integrated deterministic, stochastic and thermodynamic analysis of the model. After a brief review of the deterministic and stochastic modelling frameworks, the concepts of chemical and mathematical detailed balances are discussed and non-equilibrium conditions are shown to be necessary for bistability. Thermodynamic quantities such as the flux, chemical potential and entropy production rate are defined and compared across the two models. In the bistable region, the stochastic model exhibits an exchange of the global stability between the two stable states under changes in the pump parameters and volume size. The stochastic entropy production rate shows a sharp transition that mirrors this exchange. A new hybrid model that includes continuous diffusion and discrete jumps is suggested to deal with the multiscale dynamics of the bistable system. Accurate approximations of the exponentially small eigenvalue associated with the time scale of this switching and the full time-dependent solution are calculated using Matlab. A breakdown of previously known asymptotic approximations on small volume scales is observed through comparison with these and Monte Carlo results. Finally, in the appendix section is an illustration of how the diffusion approximation of the chemical master equation can fail to represent correctly the mesoscopically interesting steady-state behaviour of the system.
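For readers who want to reproduce the stochastic side, a minimal Gillespie (SSA) realisation of the Schlögl reactions takes only a few lines. The sketch below is in Python rather than Matlab, and the rate constants and pump parameters are illustrative placeholders, not the values analysed in the paper:

```python
import numpy as np

def gillespie_schlogl(x0, t_end, rng, k1=3e-7, k2=1e-4, k3=1e-3, k4=3.5,
                      A=1e5, B=2e5):
    """Minimal Gillespie (SSA) run of the Schlogl reactions
       A + 2X -> 3X,  3X -> A + 2X,  B -> X,  X -> B.
    Rate constants and pump species counts A, B are illustrative
    placeholders, not the paper's parameter values."""
    t, x = 0.0, x0
    while t < t_end:
        a = np.array([k1 * A * x * (x - 1) / 2.0,        # A + 2X -> 3X
                      k2 * x * (x - 1) * (x - 2) / 6.0,  # 3X -> A + 2X
                      k3 * B,                             # B -> X
                      k4 * x])                            # X -> B
        a0 = a.sum()
        if a0 == 0.0:
            break
        t += rng.exponential(1.0 / a0)                    # waiting time
        r = rng.choice(4, p=a / a0)                       # which reaction fires
        x += (1, -1, 1, -1)[r]
    return x

rng = np.random.default_rng(0)
finals = [gillespie_schlogl(250, 0.5, rng) for _ in range(10)]
```

Histogramming many such end states over long runs is how the bimodal stationary distribution discussed above would be observed empirically.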
NASA Astrophysics Data System (ADS)
Vienhage, Paul; Barcomb, Heather; Marshall, Karel; Black, William A.; Coons, Amanda; Tran, Hien T.; Nguyen, Tien M.; Guillen, Andy T.; Yoh, James; Kizer, Justin; Rogers, Blake A.
2017-05-01
The paper describes the MATLAB (MathWorks) programs that were developed during the REU workshop to implement The Aerospace Corporation's Unified Game-based Acquisition Framework and Advanced Game-based Mathematical Framework (UGAF-AGMF) and its associated War-Gaming Engine (WGE) models. Each game can be played from the perspectives of the Department of Defense Acquisition Authority (DAA) or of an individual contractor (KTR). The programs also implement Aerospace's optimum "Program and Technical Baseline (PTB) and associated acquisition" strategy that combines low Total Ownership Cost (TOC) with innovative designs while still meeting warfighter needs. The paper also describes the Bayesian Acquisition War-Gaming approach using Monte Carlo simulations, a numerical analysis technique to account for uncertainty in decision making, which simulates the PTB development and acquisition processes, and details the procedure of the implementation and the interactions between the games.
2015-03-26
and the realistic space. These plots were generated using Matlab to run the simulations. (Figure 67: Position 1, Scenario 1.)
ELRIS2D: A MATLAB Package for the 2D Inversion of DC Resistivity/IP Data
NASA Astrophysics Data System (ADS)
Akca, Irfan
2016-04-01
ELRIS2D is an open source code written in MATLAB for the two-dimensional inversion of direct current resistivity (DCR) and time domain induced polarization (IP) data. The user interface of the program is designed for functionality and ease of use. All available settings of the program can be reached from the main window. The subsurface is discretized using a hybrid mesh generated by the combination of structured and unstructured meshes, which reduces the computational cost of the whole inversion procedure. The inversion routine is based on the smoothness-constrained least squares method. In order to verify the program, responses of two test models and field data sets were inverted. The models inverted from the synthetic data sets are consistent with the original test models in both DC resistivity and IP cases. A field data set acquired in an archaeological site is also used for the verification of outcomes of the program in comparison with the excavation results.
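The smoothness-constrained least-squares step at the heart of such inversions can be illustrated independently of the resistivity forward problem. The Python/NumPy sketch below is a generic Occam-style step with a hypothetical sensitivity matrix, not ELRIS2D's code; it damps a noisy least-squares solution toward spatial smoothness via a first-difference roughening matrix:

```python
import numpy as np

def smoothness_constrained_step(J, d, lam):
    """One smoothness-constrained least-squares step: minimize
    ||J m - d||^2 + lam * ||C m||^2, where C is a first-difference
    roughening matrix. This is the generic scheme, not ELRIS2D's code."""
    n = J.shape[1]
    C = (np.eye(n) - np.eye(n, k=1))[:-1]          # first differences
    lhs = J.T @ J + lam * C.T @ C                  # regularized normal equations
    return np.linalg.solve(lhs, J.T @ d)

# Hypothetical linearised problem: smooth-ish model with one sharp jump.
rng = np.random.default_rng(1)
m_true = np.array([1.0, 1.2, 1.1, 3.0, 3.1, 2.9])
J = rng.normal(size=(30, 6))                       # stand-in sensitivity matrix
d = J @ m_true + 0.01 * rng.normal(size=30)        # data with small noise
m_est = smoothness_constrained_step(J, d, lam=0.1)
```

In a real inversion J would be the Jacobian of the DCR/IP forward response and the step would be iterated with updated linearisations.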
Nuutinen, Mikko; Virtanen, Toni; Rummukainen, Olli; Häkkinen, Jukka
2016-03-01
This article presents VQone, a graphical experiment builder, written as a MATLAB toolbox, developed for image and video quality ratings. VQone contains the main elements needed for the subjective image and video quality rating process. This includes building and conducting experiments and data analysis. All functions can be controlled through graphical user interfaces. The experiment builder includes many standardized image and video quality rating methods. Moreover, it enables the creation of new methods or modified versions from standard methods. VQone is distributed free of charge under the terms of the GNU general public license and allows code modifications to be made so that the program's functions can be adjusted according to a user's requirements. VQone is available for download from the project page (http://www.helsinki.fi/psychology/groups/visualcognition/).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiong, Z; Vijayan, S; Rana, V
2015-06-15
Purpose: A system was developed that automatically calculates the organ and effective dose for individual fluoroscopically-guided procedures using a log of the clinical exposure parameters. Methods: We have previously developed a dose tracking system (DTS) to provide a real-time color-coded 3D mapping of skin dose. This software produces a log file of all geometry and exposure parameters for every x-ray pulse during a procedure. The data in the log files is input into PCXMC, a Monte Carlo program that calculates organ and effective dose for projections and exposure parameters set by the user. We developed a MATLAB program to read data from the log files produced by the DTS and to automatically generate the definition files in the format used by PCXMC. The processing is done at the end of a procedure after all exposures are completed. Since there are thousands of exposure pulses with various parameters for fluoroscopy, DA and DSA and at various projections, the data for exposures with similar parameters is grouped prior to entry into PCXMC to reduce the number of Monte Carlo calculations that need to be performed. Results: The software developed automatically transfers data from the DTS log file to PCXMC and runs the program for each grouping of exposure pulses. When the doses from all exposure events have been calculated, the doses for each organ and all effective doses are summed to obtain procedure totals. For a complicated interventional procedure, the calculations can be completed on a PC without manual intervention in less than 30 minutes, depending on the level of data grouping. Conclusion: This system allows organ dose to be calculated for individual procedures for every patient without tedious calculations or data entry, so that estimates of stochastic risk can be obtained in addition to the deterministic risk estimate provided by the DTS. Partial support from NIH grant R01EB002873 and Toshiba Medical Systems Corp.
MatTAP: A MATLAB toolbox for the control and analysis of movement synchronisation experiments.
Elliott, Mark T; Welchman, Andrew E; Wing, Alan M
2009-02-15
Investigating movement timing and synchronisation at the sub-second range relies on an experimental setup that has high temporal fidelity, is able to deliver output cues and can capture corresponding responses. Modern, multi-tasking operating systems make this increasingly challenging when using standard PC hardware and programming languages. This paper describes a new free suite of tools (available from http://www.snipurl.com/mattap) for use within the MATLAB programming environment, compatible with Microsoft Windows and a range of data acquisition hardware. The toolbox allows flexible generation of timing cues with high temporal accuracy, the capture and automatic storage of corresponding participant responses and an integrated analysis module for the rapid processing of results. A simple graphical user interface is used to navigate the toolbox and so can be operated easily by users not familiar with programming languages. However, it is also fully extensible and customisable, allowing adaptation for individual experiments and facilitating the addition of new modules in future releases. Here we discuss the relevance of the MatTAP (MATLAB Timing Analysis Package) toolbox to current timing experiments and compare its use to alternative methods. We validate the accuracy of the analysis module through comparison to manual observation methods and replicate a previous sensorimotor synchronisation experiment to demonstrate the versatility of the toolbox features demanded by such movement synchronisation paradigms.
Automated Flight Routing Using Stochastic Dynamic Programming
NASA Technical Reports Server (NTRS)
Ng, Hok K.; Morando, Alex; Grabbe, Shon
2010-01-01
Airspace capacity reduction due to convective weather impedes air traffic flows and causes traffic congestion. This study presents an algorithm, based on stochastic dynamic programming, that reroutes flights in the presence of winds, en-route convective weather, and congested airspace. A stochastic disturbance model incorporates capacity uncertainty into the reroute design process. A trajectory-based airspace demand model is employed for calculating current and future airspace demand. The optimal routes minimize the total expected travelling time, weather incursion, and induced congestion costs. They are compared to weather-avoidance routes calculated using deterministic dynamic programming. The stochastic reroutes have a smaller deviation probability than their deterministic counterparts when both reroutes have similar total flight distance. The stochastic rerouting algorithm takes into account all convective weather fields at all severity levels, while the deterministic algorithm only accounts for convective weather systems exceeding a specified level of severity. When the stochastic reroutes are compared to the actual flight routes, they have similar total flight time, and both have about 1% of travel time crossing congested en-route sectors on average. The actual flight routes induce slightly less traffic congestion than the stochastic reroutes but intercept more severe convective weather.
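The backward-recursion idea can be illustrated on a toy grid in which each cell is blocked by convective weather with some probability. The Python sketch below is a deliberately simplified stand-in for the paper's algorithm, with illustrative cell costs and a right/down-only move set:

```python
import numpy as np

def stochastic_route_cost(p_block, penalty=10.0):
    """Backward dynamic programming on a grid: expected cost-to-go when
    each cell is weather-blocked with probability p_block[i, j] (incurring
    `penalty`), moving only right or down. A toy version of the stochastic
    reroute idea, not the paper's algorithm."""
    n, m = p_block.shape
    cell = 1.0 + penalty * p_block          # expected cost of entering a cell
    V = np.zeros((n, m))
    V[-1, -1] = cell[-1, -1]
    for i in range(n - 1, -1, -1):          # sweep backwards from the goal
        for j in range(m - 1, -1, -1):
            if i == n - 1 and j == m - 1:
                continue
            best = np.inf
            if i + 1 < n:
                best = min(best, V[i + 1, j])
            if j + 1 < m:
                best = min(best, V[i, j + 1])
            V[i, j] = cell[i, j] + best
    return V

p = np.zeros((3, 3))
p[1, 1] = 0.9                               # storm likely in the centre cell
V = stochastic_route_cost(p)                # optimal policy routes around it
```

Here the expected cost-to-go from the start, `V[0, 0]`, equals the clear-weather path cost because the policy detours around the likely storm, which is the essence of the expected-cost minimisation described above.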
Developing Matlab scripts for image analysis and quality assessment
NASA Astrophysics Data System (ADS)
Vaiopoulos, A. D.
2011-11-01
Image processing is a very helpful tool in many fields of modern sciences that involve digital imaging examination and interpretation. Processed images however, often need to be correlated with the original image, in order to ensure that the resulting image fulfills its purpose. Aside from the visual examination, which is mandatory, image quality indices (such as correlation coefficient, entropy and others) are very useful, when deciding which processed image is the most satisfactory. For this reason, a single program (script) was written in Matlab language, which automatically calculates eight indices by utilizing eight respective functions (independent function scripts). The program was tested in both fused hyperspectral (Hyperion-ALI) and multispectral (ALI, Landsat) imagery and proved to be efficient. Indices were found to be in agreement with visual examination and statistical observations.
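Two of the indices named above, the correlation coefficient and entropy, are simple to compute. The Python/NumPy sketch below illustrates the kind of independent function scripts described, but is not the author's Matlab code:

```python
import numpy as np

def corr_coef(a, b):
    """Pearson correlation coefficient between two images, one of the
    classic quality indices for comparing a processed image to the original."""
    a = a.ravel().astype(float)
    b = b.ravel().astype(float)
    return np.corrcoef(a, b)[0, 1]

def entropy(img, bins=256):
    """Shannon entropy (bits) of an image's grey-level histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0, bins))
    p = hist / hist.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# Synthetic original and a lightly degraded version of it.
rng = np.random.default_rng(2)
img = rng.integers(0, 256, size=(64, 64))
noisy = np.clip(img + rng.normal(0.0, 5.0, img.shape), 0, 255)
r = corr_coef(img, noisy)    # near 1 for a mild degradation
h = entropy(img)             # near 8 bits for a full-range random image
```

The remaining indices in such a script would follow the same pattern: one small function per index, applied to the original/processed image pair.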
NASA Astrophysics Data System (ADS)
Chęciński, Jakub; Frankowski, Marek
2016-10-01
We present a tool for fully-automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows for fast, error-proof and easy creation of Mifs, without any of the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions that complement it with magnetoresistance and spin-transfer-torque calculations, as well as local magnetization data selection for output. Our software allows for the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronous excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing for automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.
Kotze, Ben; Jordaan, Gerrit
2014-08-25
Automatic Guided Vehicles (AGVs) are navigated utilising multiple types of sensors for detecting the environment. In this investigation such sensors are replaced and/or minimized by the use of a single omnidirectional camera picture stream. An area of interest is extracted, and by using image processing the vehicle is navigated on a set path. Reconfigurability is added to the route layout by signs incorporated in the navigation process. The result is the possible manipulation of a number of AGVs, each on its own designated colour-signed path. This route is reconfigurable by the operator with no programming alteration or intervention. A low resolution camera and a Matlab® software development platform are utilised. The use of Matlab® lends itself to speedy evaluation and implementation of image processing options on the AGV, but its functioning in such an environment needs to be assessed.
Parallel Computation of the Jacobian Matrix for Nonlinear Equation Solvers Using MATLAB
NASA Technical Reports Server (NTRS)
Rose, Geoffrey K.; Nguyen, Duc T.; Newman, Brett A.
2017-01-01
Demonstrating speedup for parallel code on a multicore shared-memory PC can be challenging in MATLAB due to underlying parallel operations that are often opaque to the user. This can limit the potential for improvement of serial code, even for so-called embarrassingly parallel applications. One such application is the computation of the Jacobian matrix inherent to most nonlinear equation solvers. Computation of this matrix represents the primary bottleneck in nonlinear solver speed, such that commercial finite element (FE) and multi-body-dynamic (MBD) codes attempt to minimize these computations. A timing study using MATLAB's Parallel Computing Toolbox was performed for numerical computation of the Jacobian. Several approaches for implementing parallel code were investigated, but only the single program multiple data (spmd) method using composite objects provided positive results. Parallel code speedup is demonstrated, but the goal of linear speedup through the addition of processors was not achieved due to PC architecture.
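The structure that makes the Jacobian embarrassingly parallel is easy to see in code: each column depends only on one extra evaluation of the residual function. A serial Python sketch of the forward-difference Jacobian follows (in the Matlab setting, the column loop is what spmd would distribute):

```python
import numpy as np

def numerical_jacobian(f, x, h=1e-6):
    """Forward-difference Jacobian of f at x. Each column needs only one
    perturbed evaluation of f, independent of the others, which is why the
    computation is embarrassingly parallel across columns."""
    f0 = np.asarray(f(x))
    J = np.empty((f0.size, x.size))
    for j in range(x.size):          # in MATLAB these iterations can be
        xp = x.copy()                # distributed with spmd over workers
        xp[j] += h
        J[:, j] = (np.asarray(f(xp)) - f0) / h
    return J

# Hypothetical residual function with a known analytic Jacobian.
f = lambda x: np.array([x[0]**2 + x[1], np.sin(x[1])])
J = numerical_jacobian(f, np.array([1.0, 0.0]))
# Analytic Jacobian at (1, 0) is [[2, 1], [0, 1]].
```

The trade-off the paper measures is whether the cost of distributing these independent evaluations (worker start-up, data transfer to composites) is repaid by the per-column savings.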
TRIAC II. A MatLab code for track measurements from SSNT detectors
NASA Astrophysics Data System (ADS)
Patiris, D. L.; Blekas, K.; Ioannides, K. G.
2007-08-01
A computer program named TRIAC II, written in MATLAB and running with a friendly GUI, has been developed for the recognition and parameter measurement of particle tracks from images of Solid State Nuclear Track Detectors. The program, using image analysis tools, counts the number of tracks and, depending on the current working mode, classifies them according to their radii (Mode I—circular tracks) or their axes (Mode II—elliptical tracks), their mean intensity value (brightness) and their orientation. Images of the detectors' surfaces are input to the code, which generates text files as output, including the number of counted tracks with the associated track parameters. Hough transform techniques are used for the estimation of the number of tracks and their parameters, providing results even in cases of overlapping tracks. Finally, it is possible for the user to obtain informative histograms as well as output files for each image and/or group of images. Program summary Title of program: TRIAC II Catalogue identifier: ADZC_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZC_v1_0 Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer: Pentium III, 600 MHz Installations: MATLAB 7.0 Operating system under which the program has been tested: Windows XP Programming language used: MATLAB Memory required to execute with typical data: 256 MB No. of bits in a word: 32 No. of processors used: one Has the code been vectorized or parallelized?: no No. of lines in distributed program, including test data, etc.: 25 964 No. of bytes in distributed program, including test data, etc.: 4 354 510 Distribution format: tar.gz Additional comments: This program requires the MATLAB Statistics Toolbox and the Image Processing Toolbox to be installed. Nature of physical problem: Following the passage of a charged particle (protons and heavier) through a Solid State Nuclear Track Detector (SSNTD), a damage region is created, usually named a latent track.
After the chemical etching of the detectors in aqueous NaOH or KOH solutions, latent tracks can be sufficiently enlarged (with diameters of 1 μm or more) to become visible under an optical microscope. Using the appropriate apparatus, one can record images of the SSNTD's surface. The shapes of the particle's tracks are strongly dependent on their charge, energy and the angle of incidence. Generally, they have elliptical shapes and in the special case of vertical incidence, they are circular. The manual counting of tracks is a tedious and time-consuming task. An automatic system is needed to speed up the process and to increase the accuracy of the results. Method of solution: TRIAC II is based on a segmentation method that groups image pixels according to their intensity value (brightness) in a number of grey level groups. After the segmentation of pixels, the program recognizes and separates the track from the background, subsequently performing image morphology, where oversized objects or objects smaller than a threshold value are removed. Finally, using the appropriate Hough transform technique, the program counts the tracks, even those which overlap and classifies them according to their shape parameters and brightness. Typical running time: The analysis of an image with a PC (Intel Pentium III processor running at 600 MHz) requires 2 to 10 minutes, depending on the number of observed tracks and the digital resolution of the image. Unusual features of the program: This program has been tested with images of CR-39 detectors exposed to alpha particles. Also, in low contrast images with few or small tracks, background pixels can be recognized as track pixels. To avoid this problem the brightness of the background pixels should be sufficiently higher than that of the track pixels.
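The Hough-transform step can be illustrated for the circular (Mode I) case with a known radius. The Python/NumPy sketch below is a heavily simplified stand-in for the program's technique: each edge pixel votes for all candidate centres at distance r, and the accumulator peak locates the track centre even when tracks overlap:

```python
import numpy as np

def hough_circle_center(edge_points, r, shape):
    """Accumulator-space vote for circle centres of known radius r:
    every edge point votes for all centres at distance r from it, and the
    accumulator maximum is the most supported centre. A minimal sketch of
    the circular-Hough idea, not TRIAC II's implementation."""
    acc = np.zeros(shape)
    thetas = np.linspace(0.0, 2.0 * np.pi, 180, endpoint=False)
    for y, x in edge_points:
        ys = np.round(y - r * np.sin(thetas)).astype(int)
        xs = np.round(x - r * np.cos(thetas)).astype(int)
        ok = (ys >= 0) & (ys < shape[0]) & (xs >= 0) & (xs < shape[1])
        np.add.at(acc, (ys[ok], xs[ok]), 1)          # cast the votes
    return np.unravel_index(acc.argmax(), shape)

# Synthetic circular "track" edge: centre (30, 40), radius 10.
t = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
pts = [(int(round(30 + 10 * np.sin(a))), int(round(40 + 10 * np.cos(a))))
       for a in t]
cy, cx = hough_circle_center(pts, 10, (64, 64))
```

Searching over a range of radii, or fitting ellipse parameters for Mode II, extends the same voting idea to higher-dimensional accumulators.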
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tumuluru, Jaya Shankar; McCulloch, Richard Chet James
In this work a new hybrid genetic algorithm was developed which combines a rudimentary adaptive steepest ascent hill climbing algorithm with a sophisticated evolutionary algorithm in order to optimize complex multivariate design problems. By combining a highly stochastic algorithm (evolutionary) with a simple deterministic optimization algorithm (adaptive steepest ascent), computational resources are conserved and the solution converges rapidly when compared to either algorithm alone. In genetic algorithms natural selection is mimicked by random events such as breeding and mutation. In the adaptive steepest ascent algorithm each variable is perturbed by a small amount and the variable that caused the most improvement is incremented by a small step. If the direction of most benefit is exactly opposite of the previous direction with the most benefit, then the step size is reduced by a factor of 2; thus the step size adapts to the terrain. A graphical user interface was created in MATLAB to provide an interface between the hybrid genetic algorithm and the user. Additional features such as bounding the solution space and weighting the objective functions individually are also built into the interface. The algorithm developed was tested to optimize the functions developed for a wood pelleting process. Using process variables (such as feedstock moisture content, die speed, and preheating temperature) pellet properties were appropriately optimized. Specifically, variables were found which maximized unit density, bulk density, tapped density, and durability while minimizing pellet moisture content and specific energy consumption. The time and computational resources required for the optimization were dramatically decreased using the hybrid genetic algorithm when compared to MATLAB's native evolutionary optimization tool.
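The adaptive steepest-ascent component can be sketched directly from the description above. The following Python fragment is an interpretation of the abstract, not the authors' code, applied to a hypothetical test function: perturb each variable, move along the best coordinate, and halve the step when the best direction reverses or no move helps:

```python
import numpy as np

def adaptive_hill_climb(f, x0, step=0.5, iters=500):
    """Adaptive steepest-ascent hill climbing in the spirit of the abstract:
    perturb each variable in turn, move along the single coordinate giving
    the most improvement, and halve the step when no move helps or the best
    direction is the exact reverse of the previous one. A sketch only."""
    x = np.asarray(x0, dtype=float)
    prev = None
    for _ in range(iters):
        if step < 1e-9:
            break
        best_gain, best_move = 0.0, None
        fx = f(x)
        for j in range(x.size):                  # probe each coordinate
            for s in (1, -1):
                xt = x.copy()
                xt[j] += s * step
                gain = f(xt) - fx
                if gain > best_gain:
                    best_gain, best_move = gain, (j, s)
        if best_move is None or (prev is not None
                                 and best_move == (prev[0], -prev[1])):
            step /= 2.0                          # terrain got harder: shrink step
            prev = None
            continue
        x[best_move[0]] += best_move[1] * step
        prev = best_move
    return x

# Maximise a concave bowl with its optimum at (1, -2).
xopt = adaptive_hill_climb(lambda v: -((v[0] - 1.0)**2 + (v[1] + 2.0)**2),
                           [0.0, 0.0])
```

In the hybrid scheme this deterministic refinement would polish candidates produced by the evolutionary search, which is where the reported savings come from.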
EEGgui: a program used to detect electroencephalogram anomalies after traumatic brain injury.
Sick, Justin; Bray, Eric; Bregy, Amade; Dietrich, W Dalton; Bramlett, Helen M; Sick, Thomas
2013-05-21
Identifying and quantifying pathological changes in brain electrical activity is important for investigations of brain injury and neurological disease. An example is the development of epilepsy, a secondary consequence of traumatic brain injury. While certain epileptiform events can be identified visually from electroencephalographic (EEG) or electrocorticographic (ECoG) records, quantification of these pathological events has proved to be more difficult. In this study we developed MATLAB-based software that would assist detection of pathological brain electrical activity following traumatic brain injury (TBI) and present our MATLAB code used for the analysis of the ECoG. Software was developed using MATLAB™ and features of the open-access EEGLAB. EEGgui is a graphical user interface in the MATLAB programming platform that allows scientists who are not proficient in computer programming to perform a number of elaborate analyses on ECoG signals. The different analyses include Power Spectral Density (PSD), Short Time Fourier analysis and Spectral Entropy (SE). ECoG records used for demonstration of this software were derived from rats that had undergone traumatic brain injury one year earlier. The software provided in this report provides a graphical user interface for displaying ECoG activity and calculating normalized power density using the fast Fourier transform of the major brain wave frequencies (Delta, Theta, Alpha, Beta1, Beta2 and Gamma). The software further detects events in which power density for these frequency bands exceeds normal ECoG by more than 4 standard deviations. We found that epileptic events could be identified and distinguished from a variety of ECoG phenomena associated with normal changes in behavior. We further found that analysis of spectral entropy was less effective in distinguishing epileptic from normal changes in ECoG activity.
The software presented here was a successful modification of EEGLAB in the Matlab environment that allows detection of epileptiform ECoG signals in animals after TBI. The code allows import of large EEG or ECoG data records as standard text files and uses the fast Fourier transform as a basis for detection of abnormal events. The software can also be used to monitor injury-induced changes in spectral entropy if required. We hope that the software will be useful for other investigators in the field of traumatic brain injury and will stimulate future advances in the quantitative analysis of brain electrical activity after neurological injury or disease.
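The band-power-plus-threshold rule is straightforward to reproduce. The Python/NumPy sketch below illustrates the described 4-standard-deviation rule on synthetic one-second windows; it is not the EEGgui code, and the signal and band edges are illustrative:

```python
import numpy as np

def band_power(sig, fs, f_lo, f_hi):
    """Power of `sig` in [f_lo, f_hi) Hz from the FFT power spectrum,
    in the spirit of the per-band (delta ... gamma) measures described."""
    spec = np.abs(np.fft.rfft(sig))**2 / len(sig)
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
    sel = (freqs >= f_lo) & (freqs < f_hi)
    return spec[sel].sum()

def detect_events(powers, n_sd=4.0):
    """Flag windows whose band power exceeds the mean by more than n_sd
    standard deviations (the 4-SD rule from the abstract)."""
    p = np.asarray(powers, dtype=float)
    return np.where(p > p.mean() + n_sd * p.std())[0]

# Nineteen quiet one-second windows plus one artificial high-power burst.
fs = 256
t = np.arange(fs) / fs
windows = [np.sin(2 * np.pi * 6 * t) for _ in range(19)]
windows.append(20 * np.sin(2 * np.pi * 6 * t))   # "epileptiform" burst
powers = [band_power(w, fs, 4, 8) for w in windows]   # theta band
events = detect_events(powers)                         # flags only window 19
```

A real pipeline would first normalise per-band power against a baseline epoch, as the abstract's "normal ECoG" comparison implies.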
Castaño-Díez, Daniel
2017-01-01
Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and long-term software maintenance.
PMID:28580909
tweezercalib 2.0: Faster version of MatLab package for precise calibration of optical tweezers
NASA Astrophysics Data System (ADS)
Hansen, Poul Martin; Tolić-Nørrelykke, Iva Marija; Flyvbjerg, Henrik; Berg-Sørensen, Kirstine
2006-03-01
We present a vectorized version of the MatLab (MathWorks Inc.) package tweezercalib for calibration of optical tweezers with precision. The calibration is based on the power spectrum of the Brownian motion of a dielectric bead trapped in the tweezers. Precision is achieved by accounting for a number of factors that affect this power spectrum, as described in version 1 of the package [I.M. Tolić-Nørrelykke, K. Berg-Sørensen, H. Flyvbjerg, Matlab program for precision calibration of optical tweezers, Comput. Phys. Comm. 159 (2004) 225-240]. The graphical user interface allows the user to include or leave out each of these factors. Several "health tests" are applied to the experimental data during calibration, and test results are displayed graphically. Thus, the user can easily see whether the data comply with the theory used for their interpretation. Final calibration results are given with statistical errors and covariance matrix. New version program summary Title of program: tweezercalib Catalogue identifier: ADTV_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADTV_v2_0 Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Reference in CPC to previous version: I.M. Tolić-Nørrelykke, K. Berg-Sørensen, H. Flyvbjerg, Comput. Phys. Comm. 159 (2004) 225 Catalogue identifier of previous version: ADTV Does the new version supersede the original program: Yes Computer for which the program is designed and others on which it has been tested: General computer running MatLab (MathWorks Inc.) Operating systems under which the program has been tested: Windows2000, Windows-XP, Linux Programming language used: MatLab (MathWorks Inc.), standard license Memory required to execute with typical data: Of order four times the size of the data file High speed storage required: none No. of lines in distributed program, including test data, etc.: 135 989 No.
of bytes in distributed program, including test data, etc.: 1 527 611 Distribution format: tar.gz Nature of physical problem: Calibrate optical tweezers with precision by fitting theory to experimental power spectrum of position of bead doing Brownian motion in incompressible fluid, possibly near microscope cover slip, while trapped in optical tweezers. Thereby determine spring constant of optical trap and conversion factor for arbitrary-units-to-nanometers for detection system. Method of solution: Elimination of cross-talk between quadrant photo-diode's output channels for positions (optional). Check that distribution of recorded positions agrees with Boltzmann distribution of bead in harmonic trap. Data compression and noise reduction by blocking method applied to power spectrum. Full accounting for hydrodynamic effects: Frequency-dependent drag force and interaction with nearby cover slip (optional). Full accounting for electronic filters (optional), for "virtual filtering" caused by detection system (optional). Full accounting for aliasing caused by finite sampling rate (optional). Standard non-linear least-squares fitting. Statistical support for fit is given, with several plots facilitating inspection of consistency and quality of data and fit. Summary of revisions: A faster fitting routine, adapted from [J. Nocedal, Y.x. Yuan, Combining trust region and line search techniques, Technical Report OTC 98/04, Optimization Technology Center, 1998; W.H. Press, B.P. Flannery, S.A. Teukolsky, W.T. Vetterling, Numerical Recipes. The Art of Scientific Computing, Cambridge University Press, Cambridge, 1986], is applied. It uses fewer function evaluations, and the remaining function evaluations have been vectorized. Calls to routines in Toolboxes not included with a standard MatLab license have been replaced by calls to routines that are included in the present package.
Fitting parameters are rescaled to ensure that they are all of roughly the same size (of order 1) while being fitted. Generally, the program package has been updated to comply with MatLab version 7.0, and optimized for speed. Restrictions on the complexity of the problem: Data should be positions of a bead doing Brownian motion while held by optical tweezers. For high precision in final results, data should be time series measured over a long time, with sufficiently high experimental sampling rate: The sampling rate should be well above the characteristic frequency of the trap, the so-called corner frequency. Thus, the sampling frequency should typically be larger than 10 kHz. The Fast Fourier Transform used works optimally when the time series contain 2^n data points, and long measurement time is obtained with n > 12-15. Finally, the optics should be set to ensure a harmonic trapping potential in the range of positions visited by the bead. The fitting procedure checks for harmonic potential. Typical running time: Seconds Unusual features of the program: None References: The theoretical underpinnings for the procedure are found in [K. Berg-Sørensen, H. Flyvbjerg, Power spectrum analysis for optical tweezers, Rev. Sci. Instrum. 75 (2004) 594-612].
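The blocking step mentioned under "Method of solution" (data compression and noise reduction of the power spectrum) amounts to averaging non-overlapping groups of spectral points. A minimal sketch, in Python rather than the package's MATLAB; the function name and interface are invented for illustration:

```python
def block_average(freqs, powers, n_block):
    """Average non-overlapping blocks of (frequency, power) pairs.

    Blocking compresses the spectrum and reduces the scatter of each
    remaining point by roughly a factor sqrt(n_block). Trailing points
    that do not fill a whole block are dropped (illustrative choice).
    """
    fb, pb = [], []
    for i in range(0, len(freqs) - n_block + 1, n_block):
        fb.append(sum(freqs[i:i + n_block]) / n_block)
        pb.append(sum(powers[i:i + n_block]) / n_block)
    return fb, pb

# Example: six spectrum points blocked in groups of three
f, P = block_average([1, 2, 3, 4, 5, 6], [10, 12, 14, 20, 22, 24], 3)
# f == [2.0, 5.0], P == [12.0, 22.0]
```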
A Comparison of Four Software Programs for Implementing Decision Analytic Cost-Effectiveness Models.
Hollman, Chase; Paulden, Mike; Pechlivanoglou, Petros; McCabe, Christopher
2017-08-01
The volume and technical complexity of both academic and commercial research using decision analytic modelling has increased rapidly over the last two decades. The range of software programs used for their implementation has also increased, but it remains true that a small number of programs account for the vast majority of cost-effectiveness modelling work. We report a comparison of four software programs: TreeAge Pro, Microsoft Excel, R and MATLAB. Our focus is on software commonly used for building Markov models and decision trees to conduct cohort simulations, given their predominance in the published literature around cost-effectiveness modelling. Our comparison uses three qualitative criteria as proposed by Eddy et al.: "transparency and validation", "learning curve" and "capability". In addition, we introduce the quantitative criterion of processing speed. We also consider the cost of each program to academic users and commercial users. We rank the programs based on each of these criteria. We find that, whilst Microsoft Excel and TreeAge Pro are good programs for educational purposes and for producing the types of analyses typically required by health technology assessment agencies, the efficiency and transparency advantages of programming languages such as MATLAB and R become increasingly valuable when more complex analyses are required.
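The cohort simulations on which these programs are compared reduce to repeated multiplication of a state distribution by a transition matrix. A minimal sketch of a Markov cohort trace, assuming a hypothetical three-state model; the states, probabilities, and Python implementation are illustrative, not taken from the paper:

```python
def cohort_trace(transition, start, cycles):
    """Propagate a cohort through a discrete-time Markov model.

    transition[i][j] is the per-cycle probability of moving from
    state i to state j; start is the initial state distribution.
    Returns the state distribution after each cycle.
    """
    n = len(start)
    trace = [list(start)]
    for _ in range(cycles):
        prev = trace[-1]
        nxt = [sum(prev[i] * transition[i][j] for i in range(n))
               for j in range(n)]
        trace.append(nxt)
    return trace

# Hypothetical three-state model: Well -> Sick -> Dead (Dead absorbing)
T = [[0.9, 0.08, 0.02],
     [0.0, 0.7,  0.3],
     [0.0, 0.0,  1.0]]
trace = cohort_trace(T, [1.0, 0.0, 0.0], 2)
```

Costs and QALYs are then accumulated by weighting each cycle's distribution, which is exactly the spreadsheet layout Excel users build by hand.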
Accurate modeling of switched reluctance machine based on hybrid trained WNN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Song, Shoujun, E-mail: sunnyway@nwpu.edu.cn; Ge, Lefei; Ma, Shaojie
2014-04-15
According to the strong nonlinear electromagnetic characteristics of the switched reluctance machine (SRM), a novel accurate modeling method is proposed based on a hybrid trained wavelet neural network (WNN), which combines an improved genetic algorithm (GA) with the gradient descent (GD) method to train the network. In the novel method, the WNN is trained by the GD method starting from the initial weights obtained by the improved GA optimization, so that the global parallel searching capability of the stochastic algorithm and the local convergence speed of the deterministic algorithm are combined to enhance the training accuracy, stability and speed. Based on the measured electromagnetic characteristics of a 3-phase 12/8-pole SRM, the nonlinear simulation model is built by the hybrid trained WNN in Matlab. The phase current and mechanical characteristics from simulation under different working conditions agree well with those from experiments, which indicates the accuracy of the model for dynamic and static performance evaluation of the SRM and verifies the effectiveness of the proposed modeling method.
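The hybrid training idea (a stochastic global stage seeding a deterministic local stage) can be sketched on a toy one-dimensional problem. Here a crude random search stands in for the improved GA, and plain gradient descent for the GD stage, so this is only a caricature of the paper's WNN training:

```python
import random

def hybrid_train(f, grad, lo, hi, n_seed=50, steps=200, lr=0.1, seed=0):
    """Two-stage optimisation sketch: a global random search picks the
    best of n_seed candidates (stand-in for the GA stage), then plain
    gradient descent refines it (stand-in for the GD stage)."""
    rng = random.Random(seed)
    w = min((rng.uniform(lo, hi) for _ in range(n_seed)), key=f)
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# Toy objective (w - 3)^2 with known gradient 2(w - 3)
w_opt = hybrid_train(lambda w: (w - 3.0) ** 2,
                     lambda w: 2.0 * (w - 3.0), -10.0, 10.0)
# w_opt is close to the true minimiser w = 3
```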
Terminal altitude maximization for Mars entry considering uncertainties
NASA Astrophysics Data System (ADS)
Cui, Pingyuan; Zhao, Zeduan; Yu, Zhengshi; Dai, Juan
2018-04-01
Uncertainties present in the Mars atmospheric entry process may cause state deviations from the nominal designed values, which will lead to unexpected performance degradation if the trajectory is designed merely on the basis of the deterministic dynamic model. In this paper, a linear-covariance-based entry trajectory optimization method is proposed that accounts for the uncertainties present in the initial states and parameters. By extending the elements of the state covariance matrix as augmented states, the statistical behavior of the trajectory is captured and used to reformulate the performance metrics and path constraints. The optimization problem is solved with the GPOPS-II toolbox in the MATLAB environment. Monte Carlo simulations are also conducted to demonstrate the capability of the proposed method. Primary trade-offs between the nominal deployment altitude and its dispersion can be observed by modulating the weights on the dispersion penalty, and a compromise result that maximizes the 3σ lower bound of the terminal altitude is achieved. The resulting path constraints also show better satisfaction in a disturbed environment compared with the nominal situation.
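The core of the linear-covariance idea is propagating the state covariance through the (linearized) dynamics alongside the nominal state, via P' = A P Aᵀ at each step. A toy discrete-time sketch with an invented two-state system; the paper's entry dynamics and state augmentation are far richer:

```python
def propagate_cov(A, P, steps):
    """Propagate a state covariance through linear dynamics x' = A x,
    applying P' = A P A^T at each step (pure-Python matrix products)."""
    n = len(A)
    for _ in range(steps):
        AP = [[sum(A[i][k] * P[k][j] for k in range(n)) for j in range(n)]
              for i in range(n)]
        P = [[sum(AP[i][k] * A[j][k] for k in range(n)) for j in range(n)]
             for i in range(n)]
    return P

# Invented position-velocity system with dt = 0.1; initial velocity
# uncertainty feeds into position uncertainty after one step.
A = [[1.0, 0.1], [0.0, 1.0]]
P1 = propagate_cov(A, [[1.0, 0.0], [0.0, 4.0]], 1)
# P1[0][0] grows from 1.0 to 1.04 and a cross-covariance 0.4 appears
```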
Enabling On-Demand Database Computing with MIT SuperCloud Database Management System
2015-09-15
arc.liv.ac.uk/trac/SGE) provides these services and is independent of programming language (C, Fortran, Java, Matlab, etc.) or parallel programming...a MySQL database to store DNS records. The DNS records are controlled via a simple web service interface that allows records to be created
Automating an integrated spatial data-mining model for landfill site selection
NASA Astrophysics Data System (ADS)
Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Aziz, Hamidi Abdul
2017-10-01
An integrated programming environment represents a robust approach to building a valid model for landfill site selection. One of the main challenges in an integrated model is the complicated processing and modelling caused by the programming stages and their several limitations. An automation process helps avoid these limitations and improves the interoperability between integrated programming environments. This work targets the automation of a spatial data-mining model for landfill site selection by integrating a spatial programming environment (Python-ArcGIS) with a non-spatial environment (MATLAB). The model was constructed using neural networks and is divided into nine stages distributed between Matlab and Python-ArcGIS. A case study was taken from the northern part of Peninsular Malaysia. 22 criteria were selected to serve as input data and to build the training and testing datasets. The outcomes show a high performance accuracy of 98.2% on the testing dataset using 10-fold cross-validation. The automated spatial data-mining model provides a solid platform for decision makers to perform landfill site selection and planning operations on a regional scale.
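The 10-fold cross-validation used to score the testing dataset partitions the samples so that each one appears in exactly one test fold. A plain-Python sketch of the index bookkeeping; the interface is invented for illustration:

```python
def kfold_indices(n, k):
    """Split sample indices 0..n-1 into k folds for cross-validation.

    Returns k (train, test) index pairs; each index lands in exactly
    one test fold, so every sample is held out exactly once.
    """
    folds = [list(range(i, n, k)) for i in range(k)]
    splits = []
    for j in range(k):
        test = set(folds[j])
        train = [i for i in range(n) if i not in test]
        splits.append((train, sorted(test)))
    return splits

splits = kfold_indices(10, 5)
# 5 splits; every index appears in exactly one test fold
```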
Boudet, Samuel; Peyrodie, Laurent; Gallois, Philippe; de l'Aulnoit, Denis Houzé; Cao, Hua; Forzy, Gérard
2013-01-01
This paper presents a Matlab-based software package (MathWorks Inc.) called BioSigPlot for the visualization of multi-channel biomedical signals, particularly the EEG. This tool is designed for researchers in both engineering and medicine who have to collaborate to visualize and analyze signals. It aims to provide a highly customizable interface for signal processing experimentation that can plot several kinds of signals while integrating the tools commonly used by physicians. The main advantages compared to other existing programs are multi-dataset display, synchronization with video, and online processing. In addition, the program uses object-oriented programming, so that the interface can be controlled by both graphic controls and command lines. It can be used as an EEGLAB plug-in but, since it is not limited to EEG, it is distributed separately. BioSigPlot is distributed free of charge (http://biosigplot.sourceforge.net), under the terms of the GNU Public License for non-commercial use and open-source development.
A high throughput MATLAB program for automated force-curve processing using the AdG polymer model.
O'Connor, Samantha; Gaddis, Rebecca; Anderson, Evan; Camesano, Terri A; Burnham, Nancy A
2015-02-01
Research in understanding biofilm formation depends on accurate and representative measurements of the steric forces related to the polymer brush on bacterial surfaces. A MATLAB program has been developed to analyze force curves from an AFM efficiently, accurately, and with minimal user bias. The analysis is based on a modified version of the Alexander and de Gennes (AdG) polymer model, which is a function of equilibrium polymer brush length, probe radius, temperature, separation distance, and a density variable. Automating the analysis reduces the time required to process 100 force curves from several days to less than 2 min. The use of this program to crop and fit force curves to the AdG model will allow researchers to ensure proper processing of large amounts of experimental data and reduce the time required for analysis and comparison of data, thereby enabling higher-quality results in a shorter period of time. Copyright © 2014 Elsevier B.V. All rights reserved.
DynaSim: A MATLAB Toolbox for Neural Modeling and Simulation
Sherfey, Jason S.; Soplata, Austin E.; Ardid, Salva; Roberts, Erik A.; Stanley, David A.; Pittman-Polletta, Benjamin R.; Kopell, Nancy J.
2018-01-01
DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems. The design of DynaSim incorporates a novel schema for model specification to facilitate future interoperability with other specifications (e.g., NeuroML, SBML), simulators (e.g., NEURON, Brian, NEST), and web-based applications (e.g., Geppetto) outside MATLAB. DynaSim is freely available at http://dynasimtoolbox.org. This tool promises to reduce barriers for investigating dynamics in large neural models, facilitate collaborative modeling, and complement other tools being developed in the neuroinformatics community. PMID:29599715
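A DynaSim model is specified by its equations; under the hood, a simulator ultimately just integrates them. As a toy stand-in (not DynaSim's code or schema), forward-Euler integration of two reciprocally coupled rate units with invented parameters:

```python
def simulate(steps, dt=0.001):
    """Forward-Euler integration of a two-unit rate network:
        dr1/dt = -r1 + w12*r2 + I1
        dr2/dt = -r2 + w21*r1
    Parameters are invented; the fixed point is r1 = 4/3, r2 = 2/3.
    """
    r1, r2 = 0.0, 0.0
    w12, w21, I1 = 0.5, 0.5, 1.0
    for _ in range(steps):
        d1 = -r1 + w12 * r2 + I1
        d2 = -r2 + w21 * r1
        r1, r2 = r1 + dt * d1, r2 + dt * d2
    return r1, r2

# Integrate 20 time units; the state settles onto the fixed point
r1, r2 = simulate(20000)
```

Parameter sweeps, which DynaSim automates over clusters, amount to calling such a simulation once per point of a parameter grid.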
Eastman, Kyler M; Huk, Alexander C
2012-01-01
Neurophysiological studies in awake, behaving primates (both human and non-human) have focused with increasing scrutiny on the temporal relationship between neural signals and behaviors. Consequently, laboratories are often faced with the problem of developing experimental equipment that can support data recording with high temporal precision and also be flexible enough to accommodate a wide variety of experimental paradigms. To this end, we have developed a MATLAB toolbox that integrates several modern pieces of equipment, but still grants experimenters the flexibility of a high-level programming language. Our toolbox takes advantage of three popular and powerful technologies: the Plexon apparatus for neurophysiological recordings (Plexon, Inc., Dallas, TX, USA), a Datapixx peripheral (Vpixx Technologies, Saint-Bruno, QC, Canada) for control of analog, digital, and video input-output signals, and the Psychtoolbox MATLAB toolbox for stimulus generation (Brainard, 1997; Pelli, 1997; Kleiner et al., 2007). The PLDAPS ("Platypus") system is designed to support the study of the visual systems of awake, behaving primates during multi-electrode neurophysiological recordings, but can be easily applied to other related domains. Despite its wide range of capabilities and support for cutting-edge video displays and neural recording systems, the PLDAPS system is simple enough for someone with basic MATLAB programming skills to design their own experiments.
ERIC Educational Resources Information Center
Karagiannis, P.; Markelis, I.; Paparrizos, K.; Samaras, N.; Sifaleras, A.
2006-01-01
This paper presents new web-based educational software (webNetPro) for "Linear Network Programming." It includes many algorithms for "Network Optimization" problems, such as shortest path problems, minimum spanning tree problems, maximum flow problems and other search algorithms. Therefore, webNetPro can assist the teaching process of courses such…
Stan: A Probabilistic Programming Language for Bayesian Inference and Optimization
ERIC Educational Resources Information Center
Gelman, Andrew; Lee, Daniel; Guo, Jiqiang
2015-01-01
Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models and can be called from the command line, R, Python, Matlab, or Julia and has great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from users' and developers'…
Robichaud, Guillaume; Garrard, Kenneth P; Barry, Jeremy A; Muddiman, David C
2013-05-01
During the past decade, the field of mass spectrometry imaging (MSI) has greatly evolved, to a point where it has now been fully integrated by most vendors as an optional or dedicated platform that can be purchased with their instruments. However, the technology is not mature, and multiple research groups in both academia and industry are still very actively studying the fundamentals of imaging techniques, adapting the technology to new ionization sources, and developing new applications. As a result, there is an important variety of data file formats used to store mass spectrometry imaging data and, concurrent with the development of MSI, collaborative efforts have been undertaken to introduce common imaging data file formats. However, few free software packages to read and analyze files of these different formats are readily available. We introduce here MSiReader, a free open-source application to read and analyze high-resolution MSI data from the most common MSI data formats. The application is built on the Matlab platform (MathWorks, Natick, MA, USA) and includes a large selection of data analysis tools and features. People who are unfamiliar with the Matlab language will have little difficulty navigating the user-friendly interface, and users with Matlab programming experience can adapt and customize MSiReader for their own needs.
MATLAB for laser speckle contrast analysis (LASCA): a practice-based approach
NASA Astrophysics Data System (ADS)
Postnikov, Eugene B.; Tsoy, Maria O.; Postnov, Dmitry E.
2018-04-01
Laser Speckle Contrast Analysis (LASCA) is one of the most powerful modern methods for revealing blood dynamics. The experimental design and theory for this method are well established, and the computational recipe is often regarded as trivial. However, the achieved performance and spatial resolution may differ considerably between implementations. We provide a mini-review of known approaches to spatial laser speckle contrast data processing and their realization in MATLAB code, with an explicit correspondence to the mathematical representation and a discussion of available implementations. We also present an algorithm based on the 2D Haar wavelet transform, likewise supplied with program code. This new method provides an opportunity to introduce horizontal, vertical and diagonal speckle contrasts; it may be used for processing highly anisotropic images of vascular trees. We provide a comparative analysis of the accuracy of vascular pattern detection and of the processing times, with special attention to details of the MATLAB procedures used.
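The basic LASCA quantity is the spatial speckle contrast K = σ/⟨I⟩ computed in a sliding window. A plain-Python sketch of that definition; the paper's MATLAB procedures and Haar-wavelet variant are not reproduced here:

```python
def local_contrast(img, w):
    """Spatial speckle contrast K = sigma/mean in each w-by-w window
    (the standard LASCA definition, on a 2D list of intensities).
    Output is smaller than the input by w-1 in each dimension."""
    H, W = len(img), len(img[0])
    out = []
    for i in range(H - w + 1):
        row = []
        for j in range(W - w + 1):
            vals = [img[i + a][j + b] for a in range(w) for b in range(w)]
            m = sum(vals) / len(vals)
            var = sum((v - m) ** 2 for v in vals) / len(vals)
            row.append(var ** 0.5 / m if m else 0.0)
        out.append(row)
    return out

K = local_contrast([[1, 1, 1], [1, 1, 1], [1, 1, 1]], 3)
# A uniform image has zero contrast: K == [[0.0]]
```

High contrast indicates static speckle; blood flow blurs the speckle within the exposure time and lowers K.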
NASA Astrophysics Data System (ADS)
Kim, Hojin; Choi, In Ho; Lee, Sanghyun; Won, Dong-Joon; Oh, Yong Suk; Kwon, Donghoon; Sung, Hyung Jin; Jeon, Sangmin; Kim, Joonwon
2017-04-01
This paper presents a deterministic bead-in-droplet ejection (BIDE) technique that regulates the precise distribution of microbeads in an ejected droplet. The deterministic BIDE was realized through the effective integration of a microfluidic single-particle handling technique with a liquid dispensing system. The integrated bead dispenser facilitates the transfer of the desired number of beads into a dispensing volume and the on-demand ejection of bead-encapsulated droplets. Single bead-encapsulated droplets were ejected every 3 s without any failure. Multiple-bead dispensing with deterministic control of the number of beads was demonstrated to emphasize the originality and quality of the proposed dispensing technique. The dispenser was mounted using a plug-socket type connection, and the dispensing process was completely automated using a programmed sequence without any microscopic observation. To demonstrate a potential application of the technique, a bead-based streptavidin-biotin binding assay in an evaporating droplet was conducted using ultralow numbers of beads. The results evidenced that the number of beads in the droplet crucially influences the reliability of the assay. Therefore, the proposed deterministic bead-in-droplet technology can be utilized to deliver desired beads onto a reaction site, particularly to reliably and efficiently enrich and detect target biomolecules.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Portmann, Greg; /LBL, Berkeley; Safranek, James
The LOCO algorithm has been used by many accelerators around the world. Although the uses for LOCO vary, the most common use has been to find calibration errors and correct the optics functions. The light source community in particular has made extensive use of the LOCO algorithms to tightly control the beta function and coupling. Maintaining high-quality beam parameters requires constant attention, so a relatively large effort was put into software development for the LOCO application. The LOCO code was originally written in FORTRAN. This code worked fine but was somewhat awkward to use. For instance, the FORTRAN code itself did not calculate the model response matrix: it required a separate modeling code such as MAD to calculate the model matrix, after which one manually loaded the data into the LOCO code. As the number of people interested in LOCO grew, it became necessary to make it easier to use. The decision to port LOCO to Matlab was relatively easy: it is best to use a matrix programming language with good graphics capability; Matlab was also being used for high-level machine control; and the accelerator modeling code AT [5] was already developed for Matlab. Since LOCO requires collecting and processing a relatively large amount of data, it is very helpful to have the LOCO code compatible with the high-level machine control [3]. A number of new features were added while porting the code from FORTRAN, and new methods continue to evolve [7][9]. Although Matlab LOCO was written with AT as the underlying tracking code, a mechanism to connect to other modeling codes has been provided.
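At its core, LOCO adjusts model parameters until the model response matrix matches the measured one in a least-squares sense. A drastic one-parameter caricature (a single overall gain, solved in closed form) is sketched below; the function and data are invented for illustration:

```python
def fit_gain(measured, model):
    """One-parameter LOCO-style fit: the gain g minimising
    ||measured - g*model||^2 over all response-matrix entries.
    For a single linear parameter the least-squares solution is
    the ratio of inner products <measured, model> / <model, model>."""
    num = sum(m * t for row_m, row_t in zip(measured, model)
              for m, t in zip(row_m, row_t))
    den = sum(t * t for row in model for t in row)
    return num / den

# Toy 2x2 response matrices: measured is exactly twice the model
g = fit_gain([[2.0, 4.0], [6.0, 8.0]], [[1.0, 2.0], [3.0, 4.0]])
# g == 2.0
```

Real LOCO fits hundreds of parameters (quadrupole gradients, BPM gains, corrector calibrations) with iterative linearized least squares rather than this closed form.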
Simulations of pattern dynamics for reaction-diffusion systems via SIMULINK.
Wang, Kaier; Steyn-Ross, Moira L; Steyn-Ross, D Alistair; Wilson, Marcus T; Sleigh, Jamie W; Shiraishi, Yoichi
2014-04-11
Investigation of the nonlinear pattern dynamics of a reaction-diffusion system almost always requires numerical solution of the system's set of defining differential equations. Traditionally, this would be done by selecting an appropriate differential equation solver from a library of such solvers, then writing computer codes (in a programming language such as C or Matlab) to access the selected solver and display the integrated results as a function of space and time. This "code-based" approach is flexible and powerful, but requires a certain level of programming sophistication. A modern alternative is to use a graphical programming interface such as Simulink to construct a data-flow diagram by assembling and linking appropriate code blocks drawn from a library. The result is a visual representation of the inter-relationships between the state variables whose output can be made completely equivalent to the code-based solution. As a tutorial introduction, we first demonstrate application of the Simulink data-flow technique to the classical van der Pol nonlinear oscillator, and compare Matlab and Simulink coding approaches to solving the van der Pol ordinary differential equations. We then show how to introduce space (in one and two dimensions) by solving numerically the partial differential equations for two different reaction-diffusion systems: the well-known Brusselator chemical reactor, and a continuum model for a two-dimensional sheet of human cortex whose neurons are linked by both chemical and electrical (diffusive) synapses. We compare the relative performances of the Matlab and Simulink implementations. The pattern simulations by Simulink are in good agreement with theoretical predictions. Compared with traditional coding approaches, the Simulink block-diagram paradigm reduces the time and programming burden required to implement a solution for reaction-diffusion systems of equations. 
Construction of the block-diagram does not require high-level programming skills, and the graphical interface lends itself to easy modification and use by non-experts.
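The van der Pol system used in the tutorial, dx/dt = y, dy/dt = μ(1 − x²)y − x, can be integrated with a fixed-step RK4 scheme in a few lines; this Python sketch is a stand-in for the Matlab and Simulink implementations discussed:

```python
def vdp_step(x, y, mu, dt):
    """One classical RK4 step of the van der Pol oscillator:
        dx/dt = y,  dy/dt = mu*(1 - x^2)*y - x."""
    def f(x, y):
        return y, mu * (1.0 - x * x) * y - x
    k1x, k1y = f(x, y)
    k2x, k2y = f(x + dt * k1x / 2, y + dt * k1y / 2)
    k3x, k3y = f(x + dt * k2x / 2, y + dt * k2y / 2)
    k4x, k4y = f(x + dt * k3x, y + dt * k3y)
    x += dt * (k1x + 2 * k2x + 2 * k3x + k4x) / 6
    y += dt * (k1y + 2 * k2y + 2 * k3y + k4y) / 6
    return x, y

# Integrate to t = 10 with dt = 0.01; the trajectory stays on the
# bounded limit cycle rather than decaying to the origin.
x, y = 2.0, 0.0
for _ in range(1000):
    x, y = vdp_step(x, y, 1.0, 0.01)
```

Replacing x by a field x(r, t) and adding a Laplacian term is exactly the step from this ODE to the reaction-diffusion PDEs treated in the paper.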
Dung Tuan Nguyen
2012-01-01
Forest harvest scheduling has been modeled using deterministic and stochastic programming models. Past models seldom address explicit spatial forest management concerns under the influence of natural disturbances. In this research study, we employ multistage full recourse stochastic programming models to explore the challenges and advantages of building spatial...
Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.
Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N
2009-10-27
The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined rational algorithmic analysis workflows or batch standardized processing to incorporate all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided heavily depend on the programming skills of the user, whereas GUI-embedded solutions do not provide direct support for various raw image analysis formats or a versatile and simultaneously flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB-implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports two-colour cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma-separated values, or tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools, plus customizable export data formats for seamless integration with other analysis tools or MATLAB for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with the MATLAB Component Runtime. 
Gene ARMADA provides a highly adaptable, integrative, yet flexible tool which can be used for automated quality control, analysis, annotation and visualization of microarray data, constituting a starting point for further data interpretation and integration with numerous other tools.
A novel approach based on preference-based index for interval bilevel linear programming problem.
Ren, Aihong; Wang, Yuping; Xue, Xingsi
2017-01-01
This paper proposes a new methodology for solving the interval bilevel linear programming problem, in which all coefficients of both objective functions and constraints are considered as interval numbers. In order to keep as much uncertainty of the original constraint region as possible, the original problem is first converted into an interval bilevel programming problem with interval coefficients in the objective functions only, through the normal variation of interval numbers and chance-constrained programming. To account for the different preferences of different decision makers, the concept of the preference level at which the interval objective function is preferred to a target interval is defined based on the preference-based index. A preference-based deterministic bilevel programming problem is then constructed in terms of the preference level and the order relation [Formula: see text]. Furthermore, the concept of a preference δ-optimal solution is given. Subsequently, the constructed deterministic nonlinear bilevel problem is solved with the help of an estimation of distribution algorithm. Finally, several numerical examples are provided to demonstrate the effectiveness of the proposed approach.
A Switching-Mode Power Supply Design Tool to Improve Learning in a Power Electronics Course
ERIC Educational Resources Information Center
Miaja, P. F.; Lamar, D. G.; de Azpeitia, M.; Rodriguez, A.; Rodriguez, M.; Hernando, M. M.
2011-01-01
The static design of ac/dc and dc/dc switching-mode power supplies (SMPS) relies on a simple but repetitive process. Although specific spreadsheets, available in various computer-aided design (CAD) programs, are widely used, they are difficult to use in educational applications. In this paper, a graphic tool programmed in MATLAB is presented,…
Pandey, Anil Kumar; Saroha, Kartik; Sharma, Param Dev; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh
2017-01-01
In this study, we have developed a simple image processing application in MATLAB that uses suprathreshold stochastic resonance (SSR) and helps the user visualize abdominopelvic tumors on exported prediuretic positron emission tomography/computed tomography (PET/CT) images. A brainstorming session was conducted for requirement analysis for the program. It was decided that the program should load the screen-captured PET/CT images and produce output images in a window with a slider control, enabling the user to view the image that best visualizes the tumor, if present. The program was implemented on a personal computer using Microsoft Windows and MATLAB R2013b. The program has an option for the user to select the input image. For the selected image, it displays output images generated using SSR in a separate window with a slider control. The slider enables the user to view the images and select the one that seems to provide the best visualization of the area(s) of interest. The developed application enables the user to select, process, and view output images in the process of utilizing SSR to detect the presence of abdominopelvic tumor on prediuretic PET/CT images.
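The SSR operation the record describes can be sketched in a few lines of MATLAB: independent noise is added to many copies of a faint image, each copy is binarized against a fixed threshold, and the binary outputs are averaged. The synthetic test image and all parameter values below are illustrative assumptions, not the published application's data or settings:

```matlab
% Suprathreshold stochastic resonance (SSR) sketch. A slider in the cited
% tool would, in effect, vary a parameter such as the noise level sigma.
p     = peaks(256);
p     = (p - min(p(:))) / (max(p(:)) - min(p(:)));
img   = 0.45 + 0.08*p;   % synthetic low-contrast image near the threshold
theta = 0.50;            % detection threshold of each 1-bit "detector"
sigma = 0.10;            % standard deviation of the added noise
nT    = 64;              % number of noisy realisations to average

acc = zeros(size(img));
for k = 1:nT
    acc = acc + double(img + sigma*randn(size(img)) > theta);
end
ssrImg = acc / nT;       % ensemble mean approximates P(pixel + noise > theta)
imagesc(ssrImg); colormap gray; axis image off;
```

The averaged output recovers sub-threshold structure because the probability that a noisy pixel crosses the threshold increases monotonically with the underlying intensity.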
Using MATLAB Software on the Peregrine System | High-Performance Computing
| NREL. Learn how to use MATLAB software on the Peregrine system: running MATLAB in batch mode, and understanding the available MATLAB software versions and licenses.
General MACOS Interface for Modeling and Analysis for Controlled Optical Systems
NASA Technical Reports Server (NTRS)
Sigrist, Norbert; Basinger, Scott A.; Redding, David C.
2012-01-01
The General MACOS Interface (GMI) for Modeling and Analysis for Controlled Optical Systems (MACOS) enables the use of MATLAB as a front-end for JPL's critical optical modeling package, MACOS. MACOS is JPL's in-house optical modeling software, which has proven to be a superb tool for advanced systems engineering of optical systems. GMI, coupled with MACOS, allows for seamless interfacing with modeling tools from other disciplines to make possible integration of dynamics, structures, and thermal models with the addition of control systems for deformable optics and other actuated optics. This software package is designed as a tool for analysts to quickly and easily use MACOS without needing to be an expert at programming MACOS. The strength of MACOS is its ability to interface with various modeling/development platforms, allowing evaluation of system performance with thermal, mechanical, and optical modeling parameter variations. GMI provides an improved means for accessing selected key MACOS functionalities. The main objective of GMI is to marry the vast mathematical and graphical capabilities of MATLAB with the powerful optical analysis engine of MACOS, thereby providing a useful tool to anyone who can program in MATLAB. GMI also improves modeling efficiency by eliminating the need to write an interface function for each task/project, reducing error sources, speeding up user/modeling tasks, and making MACOS well suited for fast prototyping.
2013-12-01
Programming code in the Python language used in AIS data preprocessing is contained in Appendix A. The MATLAB programming code used to apply the Hough...described in Chapter III is applied to archived AIS data in this chapter. The implementation of the method, including programming techniques used, is...is contained in the second. To provide a proof of concept for the algorithm described in Chapter III, the PYTHON programming language was used for
Biyikli, Emre; To, Albert C.
2015-01-01
A new topology optimization method called Proportional Topology Optimization (PTO) is presented. As a non-sensitivity method, PTO is simple to understand, easy to implement, and at the same time efficient and accurate. It is implemented in two MATLAB programs to solve the stress-constrained and minimum compliance problems. Descriptions of the algorithm and computer programs are provided in detail. The method is applied to three numerical examples for both types of problems, and shows efficiency and accuracy comparable to an existing optimality criteria method that computes sensitivities. Also, the PTO stress-constrained algorithm and minimum compliance algorithm are compared by feeding the output of one algorithm to the other in an alternating manner, where the former yields lower maximum stress and volume fraction but higher compliance compared to the latter. Advantages and disadvantages of the proposed method and future work are discussed. The computer programs are self-contained and publicly shared at the website www.ptomethod.org. PMID:26678849
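The core of the non-sensitivity update the abstract describes can be sketched as follows; the random "compliance" vector stands in for a finite-element solve, and the exponent, blending factor, and sizes are illustrative assumptions rather than the authors' published parameters:

```matlab
% Proportional Topology Optimization (PTO) update, sketched for the
% minimum-compliance variant: the target amount of material is shared
% among elements in proportion to their compliance raised to an exponent,
% with no sensitivity computation.
nElem   = 100;  volFrac = 0.4;  xMin = 1e-3;
q       = 2;    alpha   = 0.5;             % proportion exponent, history blend
x       = volFrac * ones(1, nElem);        % initial uniform densities
c       = rand(1, nElem);                  % per-element compliance (placeholder)

xTarget = volFrac * nElem;                 % total material to distribute
xNew    = xTarget * c.^q / sum(c.^q);      % proportional distribution
xNew    = min(max(xNew, xMin), 1);         % clamp to box constraints
% (the full method redistributes any clamped remainder iteratively)
x       = alpha*x + (1 - alpha)*xNew;      % blended update toward xNew
```

In the full algorithm this update sits inside a loop that re-solves the finite-element model with the new densities until the density field converges.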
Weight optimization of plane truss using genetic algorithm
NASA Astrophysics Data System (ADS)
Neeraja, D.; Kamireddy, Thejesh; Santosh Kumar, Potnuru; Simha Reddy, Vijay
2017-11-01
Optimizing a structure on the basis of weight has many practical benefits in every engineering field. A structure's efficiency is directly related to its weight, and hence weight optimization gains prime importance. In civil engineering, weight-optimized structural elements are economical and easier to transport to site. In this study, a genetic optimization algorithm for weight optimization of a steel truss, considering its shape, size, and topology, has been developed in MATLAB. Material strength and buckling stability criteria have been adopted from the IS 800-2007 code for construction steel. The constraints considered in the present study are fabrication, basic nodes, displacements, and compatibility. The genetic algorithm is a natural-selection search technique intended to combine good solutions to a problem over many generations to improve the results. All solutions are generated randomly and represented individually by binary strings, analogous to natural chromosomes. The outcome of the study is a MATLAB program which can optimize a steel truss and display the optimized topology along with element shapes, deflections, and stress results.
Peer Learning in a MATLAB Programming Course
NASA Astrophysics Data System (ADS)
Reckinger, Shanon
2016-11-01
Three forms of research-based peer learning were implemented in the design of a MATLAB programming course for mechanical engineering undergraduate students. First, a peer learning program was initiated: the undergraduate peer learning leaders played two roles in the course, (I) helping students with their work in the classroom, and (II) leading optional two-hour help sessions outside of class time. The second form of peer learning was implemented through the inclusion of a peer discussion period following in-class clicker quizzes. In the third form, students created video project assignments and posted them on YouTube to explain course topics to their peers. Several other, more informal techniques were used to encourage peer learning. Student feedback in the form of both instructor-designed survey responses and formal course evaluations (quantitative and narrative) will be presented. Finally, effectiveness will be measured by formal assessment, both direct and indirect, of these peer learning methods, including academic data/grades and pre/post test scores. Overall, the course design and its inclusion of these peer learning techniques demonstrate effectiveness.
NASA Technical Reports Server (NTRS)
Wieseman, Carol D.; Christhilf, David; Perry, Boyd, III
2012-01-01
An important objective of the Semi-Span Super-Sonic Transport (S4T) wind-tunnel model program was the demonstration of Flutter Suppression (FS), Gust Load Alleviation (GLA), and Ride Quality Enhancement (RQE). It was critical to evaluate the stability and robustness of these control laws analytically before testing them, and experimentally while testing them, to ensure the safety of the model and the wind tunnel. MATLAB-based software was applied to evaluate the performance of closed-loop systems in terms of stability and robustness. Existing software tools were extended to use analytical representations of the S4T and the control laws to analyze and evaluate the control laws prior to testing. Lessons were learned about the complex wind-tunnel model and experimental testing. The open-loop flutter boundary was determined from the closed-loop systems. A MATLAB/Simulink simulation developed under the program is available for future work to improve the CPE process. This paper is one of a series that comprises a special session summarizing the S4T wind-tunnel program.
Using the Gurobi Solvers on the Peregrine System | High-Performance
Peregrine System. Gurobi Optimizer is a suite of solvers for mathematical programming. From MATLAB, the interface directory ('GRB_MATLAB_PATH') is added to the path with >> path(path,grb). GAMS is a high-level modeling system for mathematical
76 FR 72220 - Incorporation of Risk Management Concepts in Regulatory Programs
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-22
... and support the adoption of improved designs or processes. \\1\\ A deterministic approach to regulation... longstanding goal to move toward more risk-informed, performance- based approaches in its regulatory programs... regulatory approach that would continue to ensure the safe and secure use of nuclear material. As part of...
More-Realistic Digital Modeling of a Human Body
NASA Technical Reports Server (NTRS)
Rogge, Renee
2010-01-01
A MATLAB computer program has been written to enable improved (relative to an older program) modeling of a human body for purposes of designing space suits and other hardware with which an astronaut must interact. The older program implements a kinematic model based on traditional anthropometric measurements that do provide important volume and surface information. The present program generates a three-dimensional (3D) whole-body model from 3D body-scan data. The program utilizes thin-plate spline theory to reposition the model without need for additional scans.
Some selected quantitative methods of thermal image analysis in Matlab.
Koprowski, Robert
2016-05-01
The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurement of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.) © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
PSYCHOACOUSTICS: a comprehensive MATLAB toolbox for auditory testing.
Soranzo, Alessandro; Grassi, Massimo
2014-01-01
PSYCHOACOUSTICS is a new MATLAB toolbox which implements three classic adaptive procedures for auditory threshold estimation. The first includes those of the Staircase family (method of limits, simple up-down and transformed up-down); the second is the Parameter Estimation by Sequential Testing (PEST); and the third is the Maximum Likelihood Procedure (MLP). The toolbox comes with more than twenty built-in experiments each provided with the recommended (default) parameters. However, if desired, these parameters can be modified through an intuitive and user friendly graphical interface and stored for future use (no programming skills are required). Finally, PSYCHOACOUSTICS is very flexible as it comes with several signal generators and can be easily extended for any experiment.
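The Staircase family mentioned above includes the transformed up-down rule, whose logic is compact enough to sketch here; the logistic "listener" model, starting level, step size, and trial count are illustrative assumptions, not the toolbox's defaults:

```matlab
% Two-down/one-up transformed staircase: the signal level drops after two
% consecutive correct responses and rises after each error, converging
% near the 70.7%-correct point of the psychometric function.
level = 40;  step = 4;  runOfCorrect = 0;  nTrials = 60;
levels = zeros(1, nTrials);
for t = 1:nTrials
    pCorrect  = 1 ./ (1 + exp(-(level - 30)/3));  % simulated listener
    correct   = rand < pCorrect;
    levels(t) = level;
    if correct
        runOfCorrect = runOfCorrect + 1;
        if runOfCorrect == 2          % two in a row: make the task harder
            level = level - step;  runOfCorrect = 0;
        end
    else                              % any error: make the task easier
        level = level + step;  runOfCorrect = 0;
    end
end
threshold = mean(levels(end-19:end)); % threshold estimate from final trials
```

Adaptive toolboxes of this kind typically estimate the threshold from the mean of the last reversals rather than the last trials; the simpler tail average is used here to keep the sketch short.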
Mathematical model of ambulance resources in Saint-Petersburg
NASA Astrophysics Data System (ADS)
Shavidze, G. G.; Balykina, Y. E.; Lejnina, E. A.; Svirkin, M. V.
2016-06-01
The emergency medical system is one of the main elements of city infrastructure. The article analyses the existing system of ambulance resource distribution and considers the idea of using multiperiodicity as a tool to increase the efficiency of the Emergency Medical Services. A program developed in the Matlab programming environment helps to evaluate changes in the functioning of the emergency medical service system.
Zrenner, Christoph; Eytan, Danny; Wallach, Avner; Thier, Peter; Marom, Shimon
2010-01-01
Distinct modules of the neural circuitry interact with each other and (through the motor-sensory loop) with the environment, forming a complex dynamic system. Neuro-prosthetic devices seeking to modulate or restore CNS function need to interact with the information flow at the level of neural modules electrically, bi-directionally and in real-time. A set of freely available generic tools is presented that allow computationally demanding multi-channel short-latency bi-directional interactions to be realized in in vivo and in vitro preparations using standard PC data acquisition and processing hardware and software (Mathworks Matlab and Simulink). A commercially available 60-channel extracellular multi-electrode recording and stimulation set-up connected to an ex vivo developing cortical neuronal culture is used as a model system to validate the method. We demonstrate how complex high-bandwidth (>10 MBit/s) neural recording data can be analyzed in real-time while simultaneously generating specific complex electrical stimulation feedback with deterministically timed responses at sub-millisecond resolution. PMID:21060803
Nonlinear mode decomposition: A noise-robust, adaptive decomposition method
NASA Astrophysics Data System (ADS)
Iatsenko, Dmytro; McClintock, Peter V. E.; Stefanovska, Aneta
2015-09-01
The signals emanating from complex systems are usually composed of a mixture of different oscillations which, for a reliable analysis, should be separated from each other and from the inevitable background of noise. Here we introduce an adaptive decomposition tool—nonlinear mode decomposition (NMD)—which decomposes a given signal into a set of physically meaningful oscillations for any wave form, simultaneously removing the noise. NMD is based on the powerful combination of time-frequency analysis techniques—which, together with the adaptive choice of their parameters, make it extremely noise robust—and surrogate data tests used to identify interdependent oscillations and to distinguish deterministic from random activity. We illustrate the application of NMD to both simulated and real signals and demonstrate its qualitative and quantitative superiority over other approaches, such as (ensemble) empirical mode decomposition, Karhunen-Loève expansion, and independent component analysis. We point out that NMD is likely to be applicable and useful in many different areas of research, such as geophysics, finance, and the life sciences. The necessary matlab codes for running NMD are freely available for download.
Using Probabilistic Information in Solving Resource Allocation Problems for a Decentralized Firm
1978-09-01
deterministic equivalent form of HIQ’s problem (5) by an approach similar to the one used in stochastic programming with simple recourse. See Ziemba [38] or, in...(1964). 38. Ziemba , W.T., "Stochastic Programs with Simple Recourse," Technical Report 72-15, Stanford University, Department of Operations Research
Xue, Zhaoguo; Sun, Mei; Dong, Taige; Tang, Zhiqiang; Zhao, Yaolong; Wang, Junzhuan; Wei, Xianlong; Yu, Linwei; Chen, Qing; Xu, Jun; Shi, Yi; Chen, Kunji; Roca I Cabarrocas, Pere
2017-12-13
Line-shape engineering is a key strategy to endow extra stretchability to 1D silicon nanowires (SiNWs) grown with self-assembly processes. We here demonstrate a deterministic line-shape programming of in-plane SiNWs into extremely stretchable springs or arbitrary 2D patterns with the aid of indium droplets that absorb amorphous Si precursor thin film to produce ultralong c-Si NWs along programmed step edges. A reliable and faithful single run growth of c-SiNWs over turning tracks with different local curvatures has been established, while high resolution transmission electron microscopy analysis reveals a high quality monolike crystallinity in the line-shaped engineered SiNW springs. Excitingly, in situ scanning electron microscopy stretching and current-voltage characterizations also demonstrate a superelastic and robust electric transport carried by the SiNW springs even under large stretching of more than 200%. We suggest that this highly reliable line-shape programming approach holds a strong promise to extend the mature c-Si technology into the development of a new generation of high performance biofriendly and stretchable electronics.
Nanofiber Nerve Guide for Peripheral Nerve Repair and Regeneration
2014-01-01
observing cell migration using live-cell imaging microscopy, and analyzing cell migration with our MATLAB-based programs. Our studies...are then pipetted into the chamber and their path of migration is observed using a live-cell imaging microscope (Fig. 6d). Utilizing this migration
Optical Excitations and Energy Transfer in Nanoparticle Waveguides
2009-03-01
All calculations were performed using our own codes given in the Appendix section. The calculations were performed using the Scilab programming package...January 2007, invited Speaker) 12. Scilab is free software compatible with the well-known Matlab package. It can be found at their webpage http
NASA Astrophysics Data System (ADS)
Himr, D.
2013-04-01
The article describes the simulation of unsteady flow during water hammer with two programs that use different numerical approaches to solve the ordinary one-dimensional differential equations describing the dynamics of hydraulic elements and pipes. The first is Matlab-Simulink-SimHydraulics, commercial software developed to solve the dynamics of general hydraulic systems, which it defines with block elements. The other program, called HYDRA, is based on the Lax-Wendroff numerical method, which serves as a tool to solve the momentum and continuity equations; it was developed in Matlab by Brno University of Technology. Experimental measurements were performed on a simple test rig consisting of an elastic pipe with strong damping connecting two reservoirs. Water hammer is induced by rapidly closing the valve. Physical properties of the liquid and pipe elasticity parameters were considered in both simulations, which are in very good agreement with each other; differences from the experimental data are minimal.
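The Lax-Wendroff stencil named above is second-order in space and time; it can be sketched for the scalar advection equation u_t + a*u_x = 0 on a periodic grid, the same update HYDRA applies to the coupled momentum and continuity equations of the pipe. Grid sizes, the Courant number, and the initial pulse below are illustrative assumptions:

```matlab
% One-step Lax-Wendroff update for scalar advection on a periodic grid.
a  = 1;  L = 1;  nx = 200;  dx = L/nx;
nu = 0.9;                          % Courant number a*dt/dx (stable for nu <= 1)
x  = linspace(0, L, nx).';
u  = exp(-200*(x - 0.3).^2);       % initial pressure-like pulse
for n = 1:100
    up = circshift(u, -1);  um = circshift(u, 1);   % periodic neighbours
    u  = u - 0.5*nu*(up - um) + 0.5*nu^2*(up - 2*u + um);
end
plot(x, u);  xlabel('x');  ylabel('u');
```

The first difference term is the centred advection flux; the second, diffusion-like term is what raises the scheme to second-order accuracy and controls the dispersive error of a purely centred update.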
MATLAB implementation of a dynamic clamp with bandwidth >125 kHz capable of generating INa at 37°C
Clausen, Chris; Valiunas, Virginijus; Brink, Peter R.; Cohen, Ira S.
2012-01-01
We describe the construction of a dynamic clamp with bandwidth >125 kHz that utilizes a high-performance, yet low-cost, standard home/office PC interfaced with a high-speed (16 bit) data acquisition module. High bandwidth is achieved by exploiting recently available software advances (code-generation technology, optimized real-time kernel). Dynamic-clamp programs are constructed using Simulink, a visual programming language. Blocks for computation of membrane currents are written in the high-level MATLAB language; no programming in C is required. The instrument can be used in single- or dual-cell configurations, with the capability to modify programs while experiments are in progress. We describe an algorithm for computing the fast transient Na+ current (INa) in real time, and test its accuracy and stability using rate constants appropriate for 37°C. We then construct a program capable of supplying three currents to a cell preparation: INa, the hyperpolarization-activated inward pacemaker current (If), and an inward-rectifier K+ current (IK1). The program corrects for the IR drop due to electrode current flow, and also records all voltages and currents. We tested this program on dual patch-clamped HEK293 cells, where the dynamic clamp controls a current-clamp amplifier and a voltage-clamp amplifier controls membrane potential, and on current-clamped HEK293 cells, where the dynamic clamp produces spontaneous pacing behavior exhibiting Na+ spikes in otherwise passive cells. PMID:23224681
An interactive program on digitizing historical seismograms
NASA Astrophysics Data System (ADS)
Xu, Yihe; Xu, Tao
2014-02-01
Retrieving information from analog seismograms is of great importance since they are the unique sources that provide quantitative information about historical earthquakes. We present an algorithm that treats automatic digitization of seismograms as an inversion problem, implemented as an interactive program with a Matlab® GUI. The program integrates automatic digitization with manual digitization; users can easily switch between the two modalities and carry out different combinations of them for optimal results. Several examples of applying the interactive program are given to illustrate the merits of the method.
Elliptical quantum dots as on-demand single photons sources with deterministic polarization states
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teng, Chu-Hsiang; Demory, Brandon; Ku, Pei-Cheng, E-mail: peicheng@umich.edu
In quantum information, control of the single photon's polarization is essential. Here, we demonstrate single photon generation in a pre-programmed and deterministic polarization state, on a chip-scale platform, utilizing site-controlled elliptical quantum dots (QDs) synthesized by a top-down approach. The polarization from the QD emission is found to be linear with a high degree of linear polarization and parallel to the long axis of the ellipse. Single photon emission with orthogonal polarizations is achieved, and the dependence of the degree of linear polarization on the QD geometry is analyzed.
PETSc Users Manual Revision 3.3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balay, S.; Brown, J.; Buschelman, K.
This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication. PETSc includes an expanding suite of parallel linear and nonlinear equation solvers and time integrators that may be used in application codes written in Fortran, C, C++, Python, and MATLAB (sequential). PETSc provides many of the mechanisms needed within parallel application codes, such as parallel matrix and vector assembly routines. The library is organized hierarchically, enabling users to employ the level of abstraction that is most appropriate for a particular problem. By using techniques of object-oriented programming, PETSc provides enormous flexibility for users. PETSc is a sophisticated set of software tools; as such, for some users it initially has a much steeper learning curve than a simple subroutine library. In particular, for individuals without some computer science background, experience programming in C, C++ or Fortran, and experience using a debugger such as gdb or dbx, it may require a significant amount of time to take full advantage of the features that enable efficient software use. However, the power of the PETSc design and the algorithms it incorporates may make the efficient implementation of many application codes simpler than “rolling them” yourself. For many tasks a package such as MATLAB is often the best tool; PETSc is not intended for the classes of problems for which effective MATLAB code can be written. PETSc also has a MATLAB interface, so portions of your code can be written in MATLAB to “try out” the PETSc solvers. 
The resulting code will not be scalable, however, because MATLAB is currently inherently not scalable. PETSc should not be used to attempt to provide a “parallel linear solver” in an otherwise sequential code. Certainly not all parts of a previously sequential code need be parallelized, but the matrix generation portion must be parallelized to expect any kind of reasonable performance. Do not expect to generate your matrix sequentially and then “use PETSc” to solve the linear system in parallel. Since PETSc is under continued development, small changes in usage and calling sequences of routines will occur. PETSc is supported; see the web site http://www.mcs.anl.gov/petsc for information on contacting support. A list of publications and web sites that feature work involving PETSc may be found at http://www.mcs.anl.gov/petsc/publications. We welcome any reports of corrections for this document.
PETSc Users Manual Revision 3.4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balay, S.; Brown, J.; Buschelman, K.
This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication. PETSc includes an expanding suite of parallel linear and nonlinear equation solvers and time integrators that may be used in application codes written in Fortran, C, C++, Python, and MATLAB (sequential). PETSc provides many of the mechanisms needed within parallel application codes, such as parallel matrix and vector assembly routines. The library is organized hierarchically, enabling users to employ the level of abstraction that is most appropriate for a particular problem. By using techniques of object-oriented programming, PETSc provides enormous flexibility for users. PETSc is a sophisticated set of software tools; as such, for some users it initially has a much steeper learning curve than a simple subroutine library. In particular, for individuals without some computer science background, experience programming in C, C++ or Fortran, and experience using a debugger such as gdb or dbx, it may require a significant amount of time to take full advantage of the features that enable efficient software use. However, the power of the PETSc design and the algorithms it incorporates may make the efficient implementation of many application codes simpler than “rolling them” yourself. For many tasks a package such as MATLAB is often the best tool; PETSc is not intended for the classes of problems for which effective MATLAB code can be written. PETSc also has a MATLAB interface, so portions of your code can be written in MATLAB to “try out” the PETSc solvers. 
The resulting code will not be scalable, however, because MATLAB is currently inherently not scalable. PETSc should not be used to attempt to provide a “parallel linear solver” in an otherwise sequential code. Certainly not all parts of a previously sequential code need be parallelized, but the matrix generation portion must be parallelized to expect any kind of reasonable performance. Do not expect to generate your matrix sequentially and then “use PETSc” to solve the linear system in parallel. Since PETSc is under continued development, small changes in usage and calling sequences of routines will occur. PETSc is supported; see the web site http://www.mcs.anl.gov/petsc for information on contacting support. A list of publications and web sites that feature work involving PETSc may be found at http://www.mcs.anl.gov/petsc/publications. We welcome any reports of corrections for this document.
PETSc Users Manual Revision 3.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balay, S.; Abhyankar, S.; Adams, M.
This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication. PETSc includes an expanding suite of parallel linear, nonlinear equation solvers and time integrators that may be used in application codes written in Fortran, C, C++, Python, and MATLAB (sequential). PETSc provides many of the mechanisms neededmore » within parallel application codes, such as parallel matrix and vector assembly routines. The library is organized hierarchically, enabling users to employ the level of abstraction that is most appropriate for a particular problem. By using techniques of object-oriented programming, PETSc provides enormous flexibility for users. PETSc is a sophisticated set of software tools; as such, for some users it initially has a much steeper learning curve than a simple subroutine library. In particular, for individuals without some computer science background, experience programming in C, C++ or Fortran and experience using a debugger such as gdb or dbx, it may require a significant amount of time to take full advantage of the features that enable efficient software use. However, the power of the PETSc design and the algorithms it incorporates may make the efficient implementation of many application codes simpler than “rolling them” yourself. ;For many tasks a package such as MATLAB is often the best tool; PETSc is not intended for the classes of problems for which effective MATLAB code can be written. PETSc also has a MATLAB interface, so portions of your code can be written in MATLAB to “try out” the PETSc solvers. 
The resulting code will not be scalable, however, because MATLAB is currently inherently not scalable; and PETSc should not be used to attempt to provide a “parallel linear solver” in an otherwise sequential code. Certainly not all parts of a previously sequential code need be parallelized, but the matrix generation portion must be parallelized to expect any kind of reasonable performance. Do not expect to generate your matrix sequentially and then “use PETSc” to solve the linear system in parallel. Since PETSc is under continued development, small changes in usage and calling sequences of routines will occur. PETSc is supported; see the web site http://www.mcs.anl.gov/petsc for information on contacting support. A list of publications and web sites that feature work involving PETSc may be found at http://www.mcs.anl.gov/petsc/publications. We welcome any reports of corrections for this document.
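The manual's central advice — assemble the matrix in the same parallel environment that will solve it — can be sketched in miniature. The snippet below uses SciPy's sequential sparse solver as a stand-in for PETSc's assemble-then-solve workflow; the 1-D Poisson matrix and right-hand side are illustrative only.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

# Assemble a sparse 1-D Poisson system A x = b and solve it. SciPy plays the
# role of the solver library here; in PETSc both the assembly and the Krylov
# solve would be distributed across MPI ranks.
n = 1000
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
x = spsolve(A, b)

residual = np.linalg.norm(A @ x - b)
```

In PETSc proper, the analogous steps would be a distributed `MatSetValues` assembly followed by a `KSPSolve`, with each rank generating its own rows of the matrix — which is exactly why the manual warns against generating the matrix sequentially.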
Navigation Constellation Design Using a Multi-Objective Genetic Algorithm
2015-03-26
programs. This specific tool not only offers high fidelity simulations, but it also offers the visual aid provided by STK. The ability to...MATLAB and STK. STK is a program that allows users to model, analyze, and visualize space systems. Users can create objects such as satellites and...position dilution of precision (PDOP) and system cost. This thesis utilized Satellite Tool Kit (STK) to calculate PDOP values of navigation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mou, Q.; Benmore, C. J.; Yarger, J. L.
2015-06-01
XISF is a MATLAB program developed to separate intermolecular structure factors from total X-ray scattering structure factors for molecular liquids and amorphous solids. The program is built on a trust-region-reflective optimization routine with the r.m.s. deviations of atoms physically constrained. XISF has been optimized for performance and can separate intermolecular structure factors of complex molecules.
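The trust-region-reflective routine the abstract names is also exposed by SciPy as `method='trf'` in `scipy.optimize.least_squares`. The sketch below separates a toy "intermolecular" remainder by fitting a bounded model to a synthetic total signal; the model form, parameters, and bounds are invented for illustration and are not XISF's actual molecular form factors.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic "total" signal: a damped molecular form factor plus a broad bump
# standing in for the intermolecular contribution (all invented).
q = np.linspace(0.5, 10.0, 200)
total = np.sinc(1.5 * q / np.pi) * np.exp(-0.02 * q**2) + 0.1 * np.exp(-(q - 2.0) ** 2)

def intra(p, q):
    # sin(r q)/(r q) * exp(-sigma q^2): a toy intramolecular model.
    r, sigma = p
    return np.sinc(r * q / np.pi) * np.exp(-sigma * q**2)

# Trust-region-reflective fit with physical bounds on the parameters.
fit = least_squares(lambda p: intra(p, q) - total, x0=[1.0, 0.05],
                    bounds=([0.1, 0.0], [5.0, 1.0]), method="trf")
inter = total - intra(fit.x, q)  # remainder approximates the intermolecular part
```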
Theory research of seam recognition and welding torch pose control based on machine vision
NASA Astrophysics Data System (ADS)
Long, Qiang; Zhai, Peng; Liu, Miao; He, Kai; Wang, Chunyang
2017-03-01
At present, automation requirements in welding are becoming higher, so a method for extracting welding information with a vision sensor is proposed in this paper, and a simulation with MATLAB has been conducted. Besides, in order to improve the quality of robotic automatic welding, an information retrieval method for welding torch pose control by visual sensor is attempted. Considering the demands of welding technology and engineering practice, the relative coordinate systems and variables are strictly defined, a mathematical model of the welding pose is established, and its feasibility is verified using MATLAB simulation; these works lay a foundation for the development of a welding off-line programming system with high precision and quality.
Teaching Science and Mathematics Subjects Using the Excel Spreadsheet Package
ERIC Educational Resources Information Center
Ibrahim, Dogan
2009-01-01
The teaching of scientific subjects usually requires laboratories where students can put the theory they have learned into practice. Traditionally, electronic programmable calculators, dedicated software, or expensive software simulation packages, such as MATLAB, have been used to simulate scientific experiments. Recently, spreadsheet programs have…
Programming with non-heap memory in the real time specification for Java
NASA Technical Reports Server (NTRS)
Bollella, G.; Canham, T.; Carson, V.; Champlin, V.; Dvorak, D.; Giovannoni, B.; Indictor, M.; Meyer, K.; Reinholtz, A.; Murray, K.
2003-01-01
The Real-Time Specification for Java (RTSJ) provides facilities for deterministic, real-time execution in a language that is otherwise subject to variable latencies in memory allocation and garbage collection.
Simulating electron energy loss spectroscopy with the MNPBEM toolbox
NASA Astrophysics Data System (ADS)
Hohenester, Ulrich
2014-03-01
Within the MNPBEM toolbox, we show how to simulate electron energy loss spectroscopy (EELS) of plasmonic nanoparticles using a boundary element method approach. The methodology underlying our approach closely follows the concepts developed by García de Abajo and coworkers (Garcia de Abajo, 2010). We introduce two classes eelsret and eelsstat that allow in combination with our recently developed MNPBEM toolbox for a simple, robust, and efficient computation of EEL spectra and maps. The classes are accompanied by a number of demo programs for EELS simulation of metallic nanospheres, nanodisks, and nanotriangles, and for electron trajectories passing by or penetrating through the metallic nanoparticles. We also discuss how to compute electric fields induced by the electron beam and cathodoluminescence. Catalogue identifier: AEKJ_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEKJ_v2_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 38886 No. of bytes in distributed program, including test data, etc.: 1222650 Distribution format: tar.gz Programming language: Matlab 7.11.0 (R2010b). Computer: Any which supports Matlab 7.11.0 (R2010b). Operating system: Any which supports Matlab 7.11.0 (R2010b). RAM:≥1 GB Classification: 18. Catalogue identifier of previous version: AEKJ_v1_0 Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 370 External routines: MESH2D available at www.mathworks.com Does the new version supersede the previous version?: Yes Nature of problem: Simulation of electron energy loss spectroscopy (EELS) for plasmonic nanoparticles. Solution method: Boundary element method using electromagnetic potentials. 
Reasons for new version: The new version of the toolbox includes two additional classes for the simulation of electron energy loss spectroscopy (EELS) of plasmonic nanoparticles, and corrects a few minor bugs and inconsistencies. Summary of revisions: New classes “eelsstat” and “eelsret” for the simulation of electron energy loss spectroscopy (EELS) of plasmonic nanoparticles have been added. A few minor errors in the implementation of dipole excitation have been corrected. Running time: Depending on surface discretization between seconds and hours.
Using MATLAB Software on the Peregrine System | High-Performance Computing
Learn how to run MATLAB software in batch mode on the Peregrine system. Below is an example MATLAB job in batch (non-interactive) mode. To try the example out, create both matlabTest.sub and /$USER. In this example, it is also the directory into which MATLAB will write the output file x.dat
Assessing and Upgrading Ocean Mixing for the Study of Climate Change
NASA Astrophysics Data System (ADS)
Howard, A. M.; Fells, J.; Lindo, F.; Tulsee, V.; Canuto, V.; Cheng, Y.; Dubovikov, M. S.; Leboissetier, A.
2016-12-01
Climate is critical. Climate variability affects us all; Climate Change is a burning issue. Droughts, floods, other extreme events, and Global Warming's effects on these and problems such as sea-level rise and ecosystem disruption threaten lives. Citizens must be informed to make decisions concerning climate, such as "business as usual" vs. mitigating emissions to keep warming within bounds. Medgar Evers undergraduates aid NASA research while learning climate science and developing computer and math skills. To make useful predictions we must realistically model each component of the climate system, including the ocean, whose critical role includes transporting and storing heat and dissolved CO2. We need physically based parameterizations of key ocean processes that can't be put explicitly in a global climate model, e.g. vertical and lateral mixing. The NASA-GISS turbulence group uses theory to model mixing, including: 1) a comprehensive scheme for small-scale vertical mixing, including convection and shear, internal waves and double-diffusion, and bottom tides; 2) a new parameterization for the lateral and vertical mixing by mesoscale eddies. For better understanding we write our own programs. To assess the modelling, MATLAB programs visualize and calculate statistics, including means, standard deviations and correlations, on NASA-GISS OGCM output with different mixing schemes and help us study drift from observations. We also try to upgrade the schemes, e.g. the bottom tidal mixing parameterization's roughness, calculated from high-resolution topographic data using Gaussian weighting functions with cut-offs. We study the effects of their parameters to improve them. A FORTRAN program extracts topography data subsets of manageable size for a MATLAB program, tested on idealized cases, to visualize and calculate roughness on.
Students are introduced to modeling a complex system and gain a deeper appreciation of climate science, programming skills, and familiarity with MATLAB, while furthering the science by improving our mixing schemes. We are incorporating climate research into our college curriculum. The PI is both a member of the turbulence group at NASA-GISS and an associate professor at Medgar Evers College of CUNY, an urban minority-serving institution in central Brooklyn. Supported by NSF Award AGS-1359293.
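The roughness computation described — Gaussian weighting functions with cut-offs applied to high-resolution topography — can be sketched as a weighted local standard deviation. This is a hypothetical reconstruction in Python; the actual weighting widths, cut-offs, and grid handling of the NASA-GISS codes are not reproduced.

```python
import numpy as np

def gaussian_roughness(topo, sigma=2.0, cutoff=3):
    # Roughness as a Gaussian-weighted local standard deviation of elevation,
    # with the weighting truncated at `cutoff` grid cells (placeholder values).
    ny, nx = topo.shape
    yy, xx = np.mgrid[-cutoff:cutoff + 1, -cutoff:cutoff + 1]
    w = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    pad = np.pad(np.asarray(topo, dtype=float), cutoff, mode="edge")
    rough = np.zeros((ny, nx))
    for i in range(ny):
        for j in range(nx):
            win = pad[i:i + 2 * cutoff + 1, j:j + 2 * cutoff + 1]
            mean = (w * win).sum() / w.sum()
            rough[i, j] = np.sqrt((w * (win - mean)**2).sum() / w.sum())
    return rough
```

A flat grid yields zero roughness everywhere; any elevation variation within the cut-off radius raises the local value.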
Stochastic Dynamic Mixed-Integer Programming (SD-MIP)
2015-05-05
stochastic linear programming (SLP) problems. By using a combination of ideas from cutting plane theory of deterministic MIP (especially disjunctive...developed to date. b) As part of this project, we have also developed tools for very large scale Stochastic Linear Programming (SLP). There are...several reasons for this. First, SLP models continue to challenge many of the fastest computers to date, and many applications within the DoD (e.g
EEGVIS: A MATLAB Toolbox for Browsing, Exploring, and Viewing Large Datasets.
Robbins, Kay A
2012-01-01
Recent advances in data monitoring and sensor technology have accelerated the acquisition of very large data sets. Streaming data sets from instrumentation such as multi-channel EEG recording usually must undergo substantial pre-processing and artifact removal. Even when using automated procedures, most scientists engage in laborious manual examination and processing to assure high quality data and to identify interesting or problematic data segments. Researchers also do not have a convenient method of visually assessing the effects of applying any stage in a processing pipeline. EEGVIS is a MATLAB toolbox that allows users to quickly explore multi-channel EEG and other large array-based data sets using multi-scale drill-down techniques. Customizable summary views reveal potentially interesting sections of data, which users can explore further by clicking to examine using detailed viewing components. The viewer and a companion browser are built on our MoBBED framework, which has a library of modular viewing components that can be mixed and matched to best reveal structure. Users can easily create new viewers for their specific data without any programming during the exploration process. These viewers automatically support pan, zoom, resizing of individual components, and cursor exploration. The toolbox can be used directly in MATLAB at any stage in a processing pipeline, as a plug-in for EEGLAB, or as a standalone precompiled application without MATLAB. EEGVIS and its supporting packages are freely available under the GNU general public license at http://visual.cs.utsa.edu/eegvis.
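The "summary view" idea — collapse a long multi-channel record into per-block statistics so that high-variance segments stand out for drill-down — can be sketched in a few lines. A Python/NumPy illustration (EEGVIS itself is MATLAB; the block statistic and sizes here are arbitrary):

```python
import numpy as np

def block_summary(data, block):
    # Collapse each channel into per-block standard deviations: a coarse
    # "summary view" in which high-variance segments stand out for drill-down.
    n_ch, n_samp = data.shape
    n_blocks = n_samp // block
    trimmed = data[:, :n_blocks * block].reshape(n_ch, n_blocks, block)
    return trimmed.std(axis=2)

# Two quiet channels with one "interesting" burst on channel 0.
data = np.zeros((2, 1000))
data[0, 500:600] = np.tile([1.0, -1.0], 50)
summary = block_summary(data, block=100)
```

In the resulting 2x10 summary, only the block containing the burst is nonzero, which is the section a user would click to drill into.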
A Collection of Nonlinear Aircraft Simulations in MATLAB
NASA Technical Reports Server (NTRS)
Garza, Frederico R.; Morelli, Eugene A.
2003-01-01
Nonlinear six degree-of-freedom simulations for a variety of aircraft were created using MATLAB. Data for aircraft geometry, aerodynamic characteristics, mass / inertia properties, and engine characteristics were obtained from open literature publications documenting wind tunnel experiments and flight tests. Each nonlinear simulation was implemented within a common framework in MATLAB, and includes an interface with another commercially-available program to read pilot inputs and produce a three-dimensional (3-D) display of the simulated airplane motion. Aircraft simulations include the General Dynamics F-16 Fighting Falcon, Convair F-106B Delta Dart, Grumman F-14 Tomcat, McDonnell Douglas F-4 Phantom, NASA Langley Free-Flying Aircraft for Sub-scale Experimental Research (FASER), NASA HL-20 Lifting Body, NASA / DARPA X-31 Enhanced Fighter Maneuverability Demonstrator, and the Vought A-7 Corsair II. All nonlinear simulations and 3-D displays run in real time in response to pilot inputs, using contemporary desktop personal computer hardware. The simulations can also be run in batch mode. Each nonlinear simulation includes the full nonlinear dynamics of the bare airframe, with a scaled direct connection from pilot inputs to control surface deflections to provide adequate pilot control. Since all the nonlinear simulations are implemented entirely in MATLAB, user-defined control laws can be added in a straightforward fashion, and the simulations are portable across various computing platforms. Routines for trim, linearization, and numerical integration are included. The general nonlinear simulation framework and the specifics for each particular aircraft are documented.
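The trim and linearization routines mentioned can be illustrated on a toy one-degree-of-freedom "airframe". In this sketch (Python standing in for the MATLAB routines; the damped-pendulum dynamics and constants are invented) a root solver finds an equilibrium and central differences linearize about it:

```python
import numpy as np
from scipy.optimize import fsolve

def f(x, u):
    # Toy 1-DOF "airframe": a damped pendulum with control torque u
    # (dynamics and constants invented for illustration).
    theta, omega = x
    return np.array([omega, -9.81 * np.sin(theta) - 0.5 * omega + u])

def trim(u):
    # Trim: find the state where the dynamics vanish for a fixed input.
    return fsolve(lambda x: f(x, u), x0=[0.1, 0.0])

def linearize(x_eq, u, eps=1e-6):
    # Numerical linearization about the trim point by central differences.
    A = np.zeros((2, 2))
    for j in range(2):
        dx = np.zeros(2)
        dx[j] = eps
        A[:, j] = (f(x_eq + dx, u) - f(x_eq - dx, u)) / (2.0 * eps)
    return A
```

For zero input the trim point is the hanging equilibrium, and the Jacobian recovers the familiar small-angle linear model.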
H-Bridge Inverter Loading Analysis for an Energy Management System
2013-06-01
In order to accomplish the stated objectives, a physics-based model of the system was developed in MATLAB/Simulink. The system was also implemented ...functional architecture and then compile the high level design down to VHDL in order to program the designed functions to the FPGA. B. INSULATED
Small Internal Combustion Engine Testing for a Hybrid-Electric Remotely-Piloted Aircraft
2011-03-01
differential equations (ODEs) were formed and solved numerically using various solvers in MATLAB. From these solutions, engine performance...program 5. □ Make sure eddy-current absorber and sprockets are free of debris and that no loose materials are close enough to become entangled
An Elementary Algorithm to Evaluate Trigonometric Functions to High Precision
ERIC Educational Resources Information Center
Johansson, B. Tomas
2018-01-01
Evaluation of the cosine function is done via a simple Cordic-like algorithm, together with a package for handling arbitrary-precision arithmetic in the computer program Matlab. Approximations to the cosine function having hundreds of correct decimals are presented with a discussion around errors and implementation.
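One way to realize such an elementary scheme — not necessarily the paper's exact algorithm — is argument halving, a short Taylor series on the tiny angle, and repeated use of the double-angle identity, all in arbitrary-precision arithmetic (Python's `decimal` standing in for MATLAB's arbitrary-precision package):

```python
from decimal import Decimal, getcontext

def cos_high_precision(x, digits=50):
    # Guard digits absorb the rounding amplified by the doubling steps.
    getcontext().prec = digits + 10
    x = Decimal(x)
    # Halve the argument until it is tiny, so a short Taylor series suffices.
    k = 0
    while abs(x) > Decimal("1e-5"):
        x /= 2
        k += 1
    # Taylor series for cos on the tiny argument.
    term = Decimal(1)
    total = Decimal(1)
    x2 = x * x
    n = 0
    while abs(term) > Decimal(10) ** (-(digits + 10)):
        n += 1
        term *= -x2 / ((2 * n - 1) * (2 * n))
        total += term
    # Undo the halvings with cos(2t) = 2 cos^2(t) - 1.
    for _ in range(k):
        total = 2 * total * total - 1
    getcontext().prec = digits
    return +total  # unary plus rounds to the requested precision
```

Raising `digits` into the hundreds reproduces the many-correct-decimals regime the abstract describes, at the cost of a few more guard digits.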
Interactive Reliability Model for Whisker-toughened Ceramics
NASA Technical Reports Server (NTRS)
Palko, Joseph L.
1993-01-01
Wider use of ceramic matrix composites (CMC) will require the development of advanced structural analysis technologies. The use of an interactive model to predict the time-independent reliability of a component subjected to multiaxial loads is discussed. The deterministic, three-parameter Willam-Warnke failure criterion serves as the theoretical basis for the reliability model. The strength parameters defining the model are assumed to be random variables, thereby transforming the deterministic failure criterion into a probabilistic criterion. The ability of the model to account for multiaxial stress states with the same unified theory is an improvement over existing models. The new model was coupled with a public-domain finite element program through an integrated design program. This allows a design engineer to predict the probability of failure of a component. A simple structural problem is analyzed using the new model, and the results are compared to existing models.
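The key move in the abstract — keep the deterministic failure criterion but treat its strength parameters as random variables — can be caricatured with a one-parameter Monte Carlo toy. The Weibull strength distribution and all numbers below are invented; the actual model uses the three-parameter Willam-Warnke criterion under multiaxial stress.

```python
import numpy as np

rng = np.random.default_rng(42)

def fails(stress, strength):
    # Deterministic criterion: failure when stress exceeds strength.
    return stress > strength

# Treating the strength parameter as a random variable turns the same
# deterministic criterion into a probability of failure.
applied_stress = 80.0
strength = 100.0 * rng.weibull(10.0, size=100_000)  # invented distribution
p_fail = fails(applied_stress, strength).mean()
```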
Strategic Mobility 21: Modeling, Simulation, and Analysis
2010-04-14
using AnyLogic , which is a Java programmed, multi-method simulation modeling tool developed by XJ Technologies. The last section examines the academic... simulation model from an Arena platform to an AnyLogic based Web Service. MATLAB is useful for small problems with few nodes, but GAMS/CPLEX is better... Transportation Modeling Studio TM . The SCASN modeling and simulation program was designed to be generic in nature to allow for use by both commercial and
Failed rib region prediction in a human body model during crash events with precrash braking.
Guleyupoglu, B; Koya, B; Barnard, R; Gayzik, F S
2018-02-28
The objective of this study is 2-fold. We used a validated human body finite element model to study the predicted chest injury (focusing on rib fracture as a function of element strain) based on varying levels of simulated precrash braking. Furthermore, we compare deterministic and probabilistic methods of rib injury prediction in the computational model. The Global Human Body Models Consortium (GHBMC) M50-O model was gravity settled in the driver position of a generic interior equipped with an advanced 3-point belt and airbag. Twelve cases were investigated with permutations for failure, precrash braking system, and crash severity. The severities used were median (17 kph), severe (34 kph), and New Car Assessment Program (NCAP; 56.4 kph). Cases with failure enabled removed rib cortical bone elements once 1.8% effective plastic strain was exceeded. Alternatively, a probabilistic framework found in the literature was used to predict rib failure. Both the probabilistic and deterministic methods take into consideration location (anterior, lateral, and posterior). The deterministic method is based on a rubric that defines failed rib regions dependent on a threshold for contiguous failed elements. The probabilistic method depends on age-based strain and failure functions. Kinematics between both methods were similar (peak max deviation: ΔX head = 17 mm; ΔZ head = 4 mm; ΔX thorax = 5 mm; ΔZ thorax = 1 mm). Seat belt forces at the time of probabilistic failed region initiation were lower than those at deterministic failed region initiation. The probabilistic method for rib fracture predicted more failed regions in the rib (an analog for fracture) than the deterministic method in all but 1 case where they were equal. The failed region patterns between models are similar; however, there are differences that arise due to stress reduced from element elimination that cause probabilistic failed regions to continue to rise after no deterministic failed region would be predicted. 
Both the probabilistic and deterministic methods indicate similar trends with regards to the effect of precrash braking; however, there are tradeoffs. The deterministic failed region method is more spatially sensitive to failure and is more sensitive to belt loads. The probabilistic failed region method allows for increased capability in postprocessing with respect to age. The probabilistic failed region method predicted more failed regions than the deterministic failed region method due to force distribution differences.
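The two prediction styles compared in the study can be sketched side by side: a deterministic rule that fails elements past a strain threshold (the 1.8% effective-plastic-strain value is from the abstract), and a probabilistic rule mapping strain and age to a failure probability. The logistic form and its constants below are invented placeholders, not the GHBMC framework's actual age-based functions.

```python
import numpy as np

def deterministic_failed(strains, threshold=0.018):
    # Elements are removed once effective plastic strain exceeds the
    # 1.8% threshold used in the failure-enabled cases.
    return np.asarray(strains) > threshold

def probabilistic_fail_prob(strains, age=50.0):
    # Hypothetical logistic strain-failure function with an invented age
    # shift of its midpoint; the real age-based functions are not reproduced.
    midpoint = 0.025 - 0.0002 * (age - 50.0)
    return 1.0 / (1.0 + np.exp(-(np.asarray(strains) - midpoint) / 0.004))
```

The qualitative difference the study reports falls out of the shapes: the deterministic rule is all-or-nothing at the threshold, while the probabilistic rule keeps accumulating failure probability as strain rises.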
Tyson, Reny B; Nowacek, Douglas P; Miller, Patrick J O
2007-09-01
Nonlinear phenomena or nonlinearities in animal vocalizations include features such as subharmonics, deterministic chaos, biphonation, and frequency jumps that until recently were generally ignored in acoustic analyses. Recent documentation of these phenomena in several species suggests that they may play a communicative role, though the exact function is still under investigation. Here, qualitative descriptions and quantitative analyses of nonlinearities in the vocalizations of killer whales (Orcinus orca) and North Atlantic right whales (Eubalaena glacialis) are provided. All four nonlinear features were present in both species, with at least one feature occurring in 92.4% of killer and 65.7% of right whale vocalizations analyzed. Occurrence of biphonation varied the most between species, being present in 89.0% of killer whale vocalizations and only 20.4% of right whale vocalizations. Because deterministic chaos is qualitatively and quantitatively different from random or Gaussian noise, a program designed specifically to identify deterministic chaos (TISEAN) was used to confirm the presence of this nonlinearity. All segments tested in this software indicate that both species do indeed exhibit deterministic chaos. The results of this study provide confirmation that such features are common in the vocalizations of cetacean species and lay the groundwork for future studies.
Sridhar, Vishnu B; Tian, Peifang; Dale, Anders M; Devor, Anna; Saisan, Payam A
2014-01-01
We present a database client software, Neurovascular Network Explorer 1.0 (NNE 1.0), that uses a MATLAB®-based Graphical User Interface (GUI) for interaction with a database of 2-photon single-vessel diameter measurements from our previous publication (Tian et al., 2010). These data are of particular interest for modeling the hemodynamic response. NNE 1.0 is downloaded by the user and then runs either as a MATLAB script or as a standalone program on a Windows platform. The GUI allows browsing the database according to parameters specified by the user, simple manipulation and visualization of the retrieved records (such as averaging and peak-normalization), and export of the results. Further, we provide the NNE 1.0 source code. With this source code, the user can enter their own experimental results into a database, given the appropriate data structure and naming conventions, and thus share their data in a user-friendly format with other investigators. NNE 1.0 provides an example of a seamless and low-cost solution for sharing of experimental data by a regular-size neuroscience laboratory and may serve as a general template, facilitating dissemination of biological results and accelerating data-driven modeling approaches.
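The record manipulations the GUI offers — averaging and peak-normalization of retrieved time courses — are simple to state precisely. A Python sketch (NNE 1.0 itself is MATLAB; equal record lengths are assumed):

```python
import numpy as np

def peak_normalize(trace):
    # Scale one diameter time course so its peak response equals 1.
    trace = np.asarray(trace, dtype=float)
    return trace / np.abs(trace).max()

def average_records(records):
    # Sample-by-sample average of peak-normalized records
    # (equal record lengths are assumed).
    return np.mean([peak_normalize(r) for r in records], axis=0)

avg = average_records([[0.0, 1.0, 2.0], [0.0, 2.0, 4.0]])
```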
MultiElec: A MATLAB Based Application for MEA Data Analysis.
Georgiadis, Vassilis; Stephanou, Anastasis; Townsend, Paul A; Jackson, Thomas R
2015-01-01
We present MultiElec, an open source MATLAB based application for data analysis of microelectrode array (MEA) recordings. MultiElec displays an extremely user-friendly graphic user interface (GUI) that allows the simultaneous display and analysis of voltage traces for 60 electrodes and includes functions for activation-time determination, the production of activation-time heat maps with activation time and isoline display. Furthermore, local conduction velocities are semi-automatically calculated along with their corresponding vector plots. MultiElec allows ad hoc signal suppression, enabling the user to easily and efficiently handle signal artefacts and for incomplete data sets to be analysed. Voltage traces and heat maps can be simply exported for figure production and presentation. In addition, our platform is able to produce 3D videos of signal progression over all 60 electrodes. Functions are controlled entirely by a single GUI with no need for command line input or any understanding of MATLAB code. MultiElec is open source under the terms of the GNU General Public License as published by the Free Software Foundation, version 3. Both the program and source code are available to download from http://www.cancer.manchester.ac.uk/MultiElec/.
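Activation-time determination for extracellular traces is commonly taken as the instant of steepest negative deflection (maximum −dV/dt). Whether MultiElec uses exactly this convention is not stated, so treat the sketch below as a generic illustration rather than the toolbox's algorithm:

```python
import numpy as np

def activation_time(voltage, fs):
    # Activation time estimated as the instant of steepest negative
    # deflection (maximum -dV/dt), a common convention for MEA traces.
    v = np.asarray(voltage, dtype=float)
    dvdt = np.diff(v) * fs
    return int(np.argmin(dvdt)) / fs

# Synthetic trace: a sharp downstroke centred at 50 ms.
fs = 1000.0
t = np.arange(0, 0.1, 1.0 / fs)
v = 1.0 / (1.0 + np.exp((t - 0.05) * 500.0))
```

Applying the estimator per electrode yields the per-channel activation times from which heat maps and conduction-velocity vectors can be derived.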
SEQassembly: A Practical Tools Program for Coding Sequences Splicing
NASA Astrophysics Data System (ADS)
Lee, Hongbin; Yang, Hang; Fu, Lei; Qin, Long; Li, Huili; He, Feng; Wang, Bo; Wu, Xiaoming
CDS (Coding Sequence) is the portion of an mRNA sequence that is composed of a number of exon sequence segments. The construction of the CDS sequence is important for profound genetic analyses such as genotyping. A program in the MATLAB environment is presented, which can process batches of sample sequences into code segments under the guide of reference exon models, and splice the code segments from the same sample source into a CDS according to the exon order in the queue file. This program is useful in transcriptional polymorphism detection and gene function study.
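The splicing step itself — concatenating the per-exon code segments of one sample in the order given by the queue file — reduces to an ordered join. A minimal Python sketch (exon names and error handling are illustrative):

```python
def splice_cds(segments, exon_order):
    # Join the per-exon code segments of one sample into a CDS, following
    # the exon order from the queue file (names are illustrative).
    missing = [name for name in exon_order if name not in segments]
    if missing:
        raise ValueError(f"missing exon segments: {missing}")
    return "".join(segments[name] for name in exon_order)

cds = splice_cds({"exon1": "ATG", "exon2": "GGC", "exon3": "TAA"},
                 ["exon1", "exon2", "exon3"])
```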
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baca, Renee Nicole; Congdon, Michael L.; Brake, Matthew Robert
In 2012, a Matlab GUI for the prediction of the coefficient of restitution was developed in order to enable the formulation of more accurate Finite Element Analysis (FEA) models of components. This report details the development of a new Rebound Dynamics GUI and how it differs from the previously developed program. The new GUI includes several new features, such as source and citation documentation for the material database, as well as a multiple-materials impact modeler for use with LMS Virtual.Lab Motion (LMS VLM), a rigid body dynamics modeling software package. The Rebound Dynamics GUI has been designed to work with LMS VLM to enable straightforward incorporation of velocity-dependent coefficients of restitution in rigid body dynamics simulations.
Spectrum image analysis tool - A flexible MATLAB solution to analyze EEL and CL spectrum images.
Schmidt, Franz-Philipp; Hofer, Ferdinand; Krenn, Joachim R
2017-02-01
Spectrum imaging techniques, which gain structural (image) and spectroscopic data simultaneously, require appropriate and careful processing to extract the information in the dataset. In this article we introduce a MATLAB-based software that uses three-dimensional data (an EEL/CL spectrum image in dm3 format (Gatan Inc.'s DigitalMicrograph®)) as input. A graphical user interface enables fast and easy mapping of spectrally dependent images and position-dependent spectra. First, data processing such as background subtraction, deconvolution and denoising; second, multiple display options including an EEL/CL movie maker; and third, applicability to a large number of data sets with a small workload make this program an interesting tool to visualize otherwise hidden details. Copyright © 2016 Elsevier Ltd. All rights reserved.
Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB.
Lee, Leng-Feng; Umberger, Brian R
2016-01-01
Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1-2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. 
This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility.
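The direct collocation approach can be made concrete with the smallest possible example: a 1-degree-of-freedom double integrator driven from rest to rest with minimum control effort, transcribed with trapezoidal defect constraints. The sketch below uses SciPy's SLSQP in place of IPOPT or fmincon, and the model is a stand-in, not the paper's musculoskeletal system:

```python
import numpy as np
from scipy.optimize import minimize

def solve_rest_to_rest(N=21, T=1.0):
    # Decision vector z = [x(0..N-1), v(0..N-1), u(0..N-1)].
    h = T / (N - 1)

    def unpack(z):
        return z[:N], z[N:2 * N], z[2 * N:]

    def effort(z):
        _, _, u = unpack(z)
        # Trapezoidal quadrature of the control-effort integral.
        return h * (0.5 * u[0]**2 + np.sum(u[1:-1]**2) + 0.5 * u[-1]**2)

    def defects(z):
        x, v, u = unpack(z)
        # Trapezoidal collocation defects for x' = v, v' = u,
        # plus boundary conditions x(0)=v(0)=0, x(T)=1, v(T)=0.
        dx = x[1:] - x[:-1] - 0.5 * h * (v[1:] + v[:-1])
        dv = v[1:] - v[:-1] - 0.5 * h * (u[1:] + u[:-1])
        bc = np.array([x[0], v[0], x[-1] - 1.0, v[-1]])
        return np.concatenate([dx, dv, bc])

    return minimize(effort, np.zeros(3 * N), method="SLSQP",
                    constraints={"type": "eq", "fun": defects},
                    options={"maxiter": 500})

res = solve_rest_to_rest()
```

For this problem the continuous-time minimum effort is ∫u² dt = 12, which the discrete solution approaches as the grid is refined; the sparsity the paper exploits in IPOPT lives in the banded Jacobian of these defect constraints.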
NASA Astrophysics Data System (ADS)
Scally, Lawrence J.
This program was implemented by Lawrence J. Scally for a Ph.D. in the EECE department at the University of Colorado at Boulder, with most funding provided by the U.S. Army. Professor Gasiewski is the advisor and guide for the entire program; he has a strong history, going back decades, with this type of program. This program is developing a transmissometer more advanced than those of previous years, called the Terahertz Atmospheric and Ionospheric Propagation, Absorption and Scattering System (TAIPAS), on an open path between the University of Colorado EE building roof and the mesa owned by the National Institute of Standards and Technology (NIST); NIST has invested money, location and support in the program. Besides designing and building the transmissometer, which has never been accomplished at this level, the system also analyzes the atmospheric propagation of frequencies by scanning between 320 GHz and 340 GHz, which includes the peak absorption frequency at 325.1529 GHz due to water absorption. The processing and characterization of the deterministic and random propagation characteristics of the atmosphere in the real world was significantly advanced; this will be carried out with various aerosols for decades on the permanently mounted system, which is accessible 24/7 over the CU Virtual Private Network (VPN).
The role of predictive uncertainty in the operational management of reservoirs
NASA Astrophysics Data System (ADS)
Todini, E.
2014-09-01
The present work deals with the operational management of multi-purpose reservoirs, whose optimisation-based rules are derived, in the planning phase, via deterministic (linear and nonlinear programming, dynamic programming, etc.) or via stochastic (generally stochastic dynamic programming) approaches. In operation, the resulting deterministic or stochastic optimised operating rules are then triggered based on inflow predictions. In order to fully benefit from predictions, one must avoid using them as direct inputs to the reservoirs, but rather assess the "predictive knowledge" in terms of a predictive probability density to be operationally used in the decision making process for the estimation of expected benefits and/or expected losses. Using a theoretical and extremely simplified case, it will be shown why directly using model forecasts instead of the full predictive density leads to less robust reservoir management decisions. Moreover, the effectiveness and the tangible benefits for using the entire predictive probability density instead of the model predicted values will be demonstrated on the basis of the Lake Como management system, operational since 1997, as well as on the basis of a case study on the lake of Aswan.
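The advantage of deciding on the full predictive density rather than on a point forecast can be conveyed with a toy newsvendor-style example. The Python sketch below is illustrative only — the costs and the Gaussian predictive density are hypothetical, not Todini's Lake Como or Aswan case studies: under an asymmetric loss, the decision that minimizes expected loss over the predictive density beats the decision that treats the model forecast as if it were the truth.

```python
import numpy as np

# Hypothetical setting: a single release decision d must cover a random
# inflow x; a shortfall costs a per unit, over-preparation costs b per
# unit. Both costs and the density are assumptions for illustration.
rng = np.random.default_rng(0)
a, b = 9.0, 1.0                        # asymmetric unit costs (assumed)
x = rng.normal(100.0, 20.0, 200_000)   # samples of the predictive density

def expected_loss(d, x):
    return np.mean(a*np.maximum(x - d, 0.0) + b*np.maximum(d - x, 0.0))

d_point = 100.0                          # act on the model forecast (mean)
d_density = np.quantile(x, a/(a + b))    # act on the full density
loss_point = expected_loss(d_point, x)
loss_density = expected_loss(d_density, x)
# the density-based decision hedges against the costly tail, so its
# expected loss is lower than that of the point-forecast decision
```

The density-based decision shifts toward the costly tail (here, the a/(a+b) quantile), which is exactly the information a point forecast throws away.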
NASA Astrophysics Data System (ADS)
Wang, Fengyu
Traditional deterministic reserve requirements rely on ad-hoc, rule-of-thumb methods to determine the reserve needed to ensure a reliable unit commitment. Since congestion and uncertainties exist in the system, both the quantity and the location of reserves are essential to system reliability and market efficiency. Existing deterministic reserve requirements acquire operating reserves on a zonal basis and do not fully capture the impact of congestion. The purpose of a reserve zone is to ensure that operating reserves are spread across the network. Operating reserves are shared inside each reserve zone, but intra-zonal congestion may block the deliverability of operating reserves within a zone. Thus, improving reserve policies such as reserve zones may improve the location and deliverability of reserve. As more non-dispatchable renewable resources are integrated into the grid, it will become increasingly difficult to predict transfer capabilities and network congestion. At the same time, renewable resources require operators to acquire more operating reserves. With existing deterministic reserve requirements unable to ensure optimal reserve locations, the importance of reserve location and reserve deliverability will increase. While stochastic programming can determine reserves by explicitly modelling uncertainties, scalability and pricing issues remain. Therefore, new methods to improve existing deterministic reserve requirements are desired. One key barrier to improving existing deterministic reserve requirements is their potential market impact. A metric, quality of service, is proposed in this thesis to evaluate the price signal and market impacts of proposed hourly reserve zones.
The three main goals of this thesis are: 1) to develop a theoretical and mathematical model that better locates reserves while maintaining the deterministic unit commitment and economic dispatch structure, especially with consideration of renewables; 2) to develop a market settlement scheme for the proposed dynamic reserve policies such that market efficiency is improved; and 3) to evaluate the market impacts and price signal of the proposed dynamic reserve policies.
Using MATLAB software with Tomcat server and Java platform for remote image analysis in pathology.
Markiewicz, Tomasz
2011-03-30
Matlab is one of the most advanced development tools for applications in engineering practice. From our point of view the most important part is the image processing toolbox, which offers many built-in functions, including mathematical morphology, and implementations of many artificial neural networks. It is a very popular platform for creating specialized programs for image analysis, including in pathology. Based on the latest version of the Matlab Builder Java toolbox, it is possible to create software serving as a remote system for image analysis in pathology over an internet connection. The internet platform can be realized with JavaServer Pages (JSP) using the Tomcat server as the servlet container. In the presented software implementation we propose remote image analysis realized by Matlab algorithms. These algorithms can be compiled to an executable jar file with the help of the Matlab Builder Java toolbox. Each Matlab function must be declared with a set of input data, an output structure with numerical results, and a Matlab web figure. Any function prepared in this manner can be used as a Java function in JSP. The graphical user interface providing the input data and displaying the results (also in graphical form) must be implemented in JSP. Additionally, data storage to a database can be implemented within the Matlab algorithm, with the help of the Matlab Database Toolbox, directly alongside the image processing. The complete JSP page can be run by the Tomcat server. The proposed tool for remote image analysis was tested on the Computerized Analysis of Medical Images (CAMI) software developed by the author. The user provides an image and case information (diagnosis, staining, image parameters, etc.). When analysis is initialized, the input data and image are sent to the servlet on Tomcat. When analysis is done, the client obtains the graphical results as an image with the recognized cells marked, along with the quantitative output.
Additionally, the results are stored in a server database. The internet platform was tested on a PC server (Intel Core2 Duo T9600, 2.8 GHz, 4 GB RAM) with 768x576-pixel, 1.28 MB tiff-format images referring to a meningioma tumour (x400, Ki-67/MIB-1). The timings were as follows: analysis by CAMI locally on the server – 3.5 seconds; remote analysis – 26 seconds, of which 22 seconds were used for data transfer over the internet connection. For a jpg-format image (102 KB) the time was reduced to 14 seconds. The results confirmed that the designed remote platform can be useful for pathology image analysis. The time consumed depends mainly on the image size and the speed of the internet connection. The presented implementation can be used for many types of analysis with different stainings, tissues, morphometry approaches, etc. A significant remaining problem is implementing the JSP page in multithreaded form so that it can be used by many users in parallel. The presented platform for image analysis in pathology can be especially useful for small laboratories without their own image analysis system.
Mechanisms and Mitigation of Hearing Loss from Blast Injury
2012-10-01
[Fragmentary report excerpt] The matlab program controlled the stimulus presentation (Figure 2: cochleostomies in scala ...). Grant number: W81XWH-10-2-0112. Tympanic membrane rupture data were highly variable, with one rupture at 7 PSI, another at
USDA-ARS's Scientific Manuscript database
A rapid computer-aided program for profiling glucosinolates, "GLS-Finder", was developed. GLS-Finder is a Matlab-script-based expert system capable of qualitative and semi-quantitative analysis of glucosinolates in samples, using data generated by ultra-high performance liquid chromatograph...
Automatic Rock Detection and Mapping from HiRISE Imagery
NASA Technical Reports Server (NTRS)
Huertas, Andres; Adams, Douglas S.; Cheng, Yang
2008-01-01
This system includes a C-code software program and a set of MATLAB software tools for statistical analysis and rock distribution mapping. The major functions include rock detection and rock detection validation. The rock detection code has evolved into a production tool that can be used by engineers and geologists with minor training.
2014-12-01
[Fragmentary report excerpt] 5.1 Training Component: the training component of this research has been split into breast imaging and image ... exclusively with Matlab (the programming language that will be used for the research component of this project). Additionally, the PI attended
2011-06-22
[Fragmentary report excerpt] A high degree of symmetry directly leads to a symmetry-enforced selection rule that can produce quantum entanglement [21, 22]. Using a Matlab program, we converted the microscope image to a binary bitmap, from which we extract the fiber radius at any given location.
Simulated Analysis of Linear Reversible Enzyme Inhibition with SCILAB
ERIC Educational Resources Information Center
Antuch, Manuel; Ramos, Yaquelin; Álvarez, Rubén
2014-01-01
SCILAB is a less well-known program than MATLAB for numeric simulations and has the advantage of being free software. A challenging software-based activity for analyzing the most common linear reversible inhibition types with SCILAB is described. Students establish typical values for the concentrations of enzyme, substrate, and inhibitor to simulate…
Large-Scale Dynamic Observation Planning for Unmanned Surface Vessels
2007-06-01
[Fragmentary report excerpt] ... programming language. In addition, the useful development software NetBeans IDE is free and makes the use of Java very user-friendly. We implemented the greedy and 3PAA algorithms in Java using NetBeans IDE version 5.5. The test datasets were generated in MATLAB.
ISMRM Raw Data Format: A Proposed Standard for MRI Raw Datasets
Inati, Souheil J.; Naegele, Joseph D.; Zwart, Nicholas R.; Roopchansingh, Vinai; Lizak, Martin J.; Hansen, David C.; Liu, Chia-Ying; Atkinson, David; Kellman, Peter; Kozerke, Sebastian; Xue, Hui; Campbell-Washburn, Adrienne E.; Sørensen, Thomas S.; Hansen, Michael S.
2015-01-01
Purpose This work proposes the ISMRM Raw Data (ISMRMRD) format as a common MR raw data format, which promotes algorithm and data sharing. Methods A file format consisting of a flexible header and tagged frames of k-space data was designed. Application Programming Interfaces were implemented in C/C++, MATLAB, and Python. Converters for Bruker, General Electric, Philips, and Siemens proprietary file formats were implemented in C++. Raw data were collected using MRI scanners from four vendors, converted to ISMRMRD format, and reconstructed using software implemented in three programming languages (C++, MATLAB, Python). Results Images were obtained by reconstructing the raw data from all vendors. The source code, raw data, and images comprising this work are shared online, serving as an example of an image reconstruction project following a paradigm of reproducible research. Conclusion The proposed raw data format solves a practical problem for the MRI community. It may serve as a foundation for reproducible research and collaborations. The ISMRMRD format is a completely open and community-driven format, and the scientific community is invited (including commercial vendors) to participate either as users or developers. PMID:26822475
Diagnosis of Lung Cancer by Fractal Analysis of Damaged DNA
Namazi, Hamidreza; Kiminezhadmalaie, Mona
2015-01-01
Cancer starts when cells in a part of the body begin to grow out of control; cells become cancerous because of DNA damage. A DNA walk of a genome represents how the frequency of each nucleotide of a pairing nucleotide couple changes locally. In this research, in order to study cancer genes, DNA walk plots of the genomes of patients with lung cancer were generated using a program written in the MATLAB language. The data so obtained were checked for fractal properties by computing the fractal dimension, also in MATLAB, and the correlation of damaged DNA was studied using the Hurst exponent. We found that damaged DNA sequences exhibit a higher degree of fractality and less correlation than normal DNA sequences, confirming that the method can be used for early detection of lung cancer. The method introduced in this research is not only useful for diagnosing lung cancer but can also be applied to the detection and growth analysis of other types of cancer. PMID:26539245
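The two ingredients of this analysis, a DNA walk and a Hurst-exponent estimate, can be sketched as follows. This is an illustrative Python analogue of the MATLAB programs described above, applied to a random synthetic sequence and using a simple rescaled-range (R/S) estimator; the paper's own estimators may differ.

```python
import numpy as np

# DNA walk: +1 for purines (A, G), -1 for pyrimidines (C, T), cumulated
# along the sequence. The sequence here is synthetic and random, so the
# estimated Hurst exponent should sit near 0.5 (no long-range memory).
rng = np.random.default_rng(1)
seq = "".join(rng.choice(list("ACGT"), 4096))
steps = np.array([1 if s in "AG" else -1 for s in seq])
walk = np.cumsum(steps)

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    """Crude rescaled-range (R/S) estimate of the Hurst exponent."""
    rs, ns = [], []
    for n in window_sizes:
        vals = []
        for i in range(len(x) // n):
            w = x[i*n:(i+1)*n]
            z = np.cumsum(w - w.mean())
            r, s = z.max() - z.min(), w.std()
            if s > 0:
                vals.append(r / s)
        rs.append(np.mean(vals)); ns.append(n)
    # slope of log(R/S) against log(n) estimates H
    return np.polyfit(np.log(ns), np.log(rs), 1)[0]

H = hurst_rs(steps)
```

For real genomic data the interest is in how H and the fractal dimension of the walk differ between damaged and normal sequences, as the abstract describes.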
Boyd, O.S.
2006-01-01
We have created a second-order finite-difference solution to the anisotropic elastic wave equation in three dimensions and implemented the solution as an efficient Matlab script. This program allows the user to generate synthetic seismograms for three-dimensional anisotropic earth structure. The code was written for teleseismic wave propagation in the 1-0.1 Hz frequency range but is of general utility and can be used at all scales of space and time. This program was created to help distinguish among various types of lithospheric structure given the uneven distribution of sources and receivers commonly utilized in passive source seismology. Several successful implementations have resulted in a better appreciation for subduction zone structure, the fate of a transform fault with depth, lithospheric delamination, and the effects of wavefield focusing and defocusing on attenuation. Companion scripts are provided which help the user prepare input to the finite-difference solution. Boundary conditions including specification of the initial wavefield, absorption and two types of reflection are available. © 2005 Elsevier Ltd. All rights reserved.
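To give a flavour of the numerical scheme, here is a minimal one-dimensional scalar-wave analogue in Python; the actual program solves the three-dimensional anisotropic elastic equations in MATLAB, so everything below is an assumed simplification.

```python
import numpy as np

# 1-D scalar wave equation u_tt = c^2 u_xx, second-order centred in
# space, explicit leapfrog in time (analogue of a second-order FD
# elastic solver; parameters are illustrative).
nx, nt = 400, 300
dx, c = 1.0, 1.0
dt = 0.5*dx/c                  # satisfies the CFL stability condition
r2 = (c*dt/dx)**2

x = np.arange(nx)*dx
u_prev = np.exp(-((x - 200.0)/10.0)**2)  # Gaussian pulse, zero velocity
u = u_prev.copy()
for _ in range(nt):
    u_next = np.zeros_like(u)            # fixed (u = 0) boundaries
    u_next[1:-1] = (2*u[1:-1] - u_prev[1:-1]
                    + r2*(u[2:] - 2*u[1:-1] + u[:-2]))
    u_prev, u = u, u_next
# after nt*dt = 150 time units the initial pulse has split into two
# half-amplitude pulses centred near x = 50 and x = 350
```

The real code generalizes this stencil to a 3-D staggered elastic medium with anisotropic stiffness and the absorbing/reflecting boundary options mentioned in the abstract.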
NASA Astrophysics Data System (ADS)
France, Lydéric; Nicollet, Christian
2010-06-01
MetaRep is a program based on our earlier program CMAS 3D and is developed as a MATLAB® script. MetaRep's objective is to visualize and project the major-element compositions of mafic and pelitic rocks and their minerals in the pseudo-quaternary projections of the ACF-S, ACF-N, CMAS, AFM-K, AFM-S and AKF-S systems. These six systems are commonly used to describe metamorphic mineral assemblages and magmatic evolutions. Each system, made of four apices, can be represented as a tetrahedron that can be visualized in three dimensions with MetaRep; the four tetrahedron apices represent oxides, or combinations of oxides, that define the composition of the projected rock or mineral. The three-dimensional representation gives a better understanding of the topology of the relationships between rocks and minerals. From these systems, MetaRep can also project data in ternary plots (for example, the ACF, AFM and AKF ternary projections can be generated). A functional interface makes it easy to use and requires no knowledge of MATLAB® programming. To facilitate use, MetaRep loads, from the main interface, data compiled in a Microsoft Excel™ spreadsheet. Although designed for scientific research, the program is also a powerful teaching tool. We propose an application example in which using two combined systems (ACF-S and ACF-N) provides strong support for the petrological interpretation.
Using STOQS and stoqstoolbox for in situ Measurement Data Access in Matlab
NASA Astrophysics Data System (ADS)
López-Castejón, F.; Schlining, B.; McCann, M. P.
2012-12-01
This poster presents the stoqstoolbox, an extension to Matlab that simplifies the loading of in situ measurement data directly from STOQS databases. STOQS (Spatial Temporal Oceanographic Query System) is a geospatial database tool designed to provide efficient access to data following the CF-NetCDF Discrete Samples Geometries convention. Data are loaded from CF-NetCDF files into a STOQS database where indexes are created on depth, spatial coordinates and other parameters, e.g. platform type. STOQS provides consistent, simple and efficient methods to query for data. For example, we can request all measurements with a standard_name of sea_water_temperature between two times and from between two depths. Data access is simpler because the data are retrieved by parameter irrespective of platform or mission file names. Access is more efficient because data are retrieved via the index on depth and only the requested data are retrieved from the database and transferred into the Matlab workspace. Applications in the stoqstoolbox query the STOQS database via an HTTP REST application programming interface; they follow the Data Access Object pattern, enabling highly customizable query construction. Data are loaded into Matlab structures that clearly indicate latitude, longitude, depth, measurement data value, and platform name. The stoqstoolbox is designed to be used in concert with other tools, such as nctoolbox, which can load data from any OPeNDAP data source. With these two toolboxes a user can easily work with in situ and other gridded data, such as from numerical models and remote sensing platforms. In order to show the capability of stoqstoolbox we will show an example of model validation using data collected during the May-June 2012 field experiment conducted by the Monterey Bay Aquarium Research Institute (MBARI) in Monterey Bay, California. The data are available from the STOQS server at http://odss.mbari.org/canon/stoqs_may2012/query/. 
Over 14 million data points of 18 parameters from 6 platforms, measured over a 3-week period, are available on this server. The model used for comparison is the Regional Ocean Modeling System developed by the Jet Propulsion Laboratory for Monterey Bay. The model output is loaded into Matlab using nctoolbox from the JPL server at http://ourocean.jpl.nasa.gov:8080/thredds/dodsC/MBNowcast. Model validation with in situ measurements can be difficult because of differing file formats and because data may be spread across separate data systems for each platform. With stoqstoolbox the researcher needs to know only the URL of the STOQS server and the OPeNDAP URL of the model output. Given selected depth and time constraints, the user's Matlab program searches for all in situ measurements available for the same time, depth and variable as the model. STOQS and stoqstoolbox are open-source software projects supported by MBARI and the David and Lucile Packard Foundation. For more information please see http://code.google.com/p/stoqs.
INTEGRATED PLANNING MODEL - EPA APPLICATIONS
The Integrated Planning Model (IPM) is a multi-regional, dynamic, deterministic linear programming (LP) model of the electric power sector in the continental lower 48 states and the District of Columbia. It provides forecasts up to year 2050 of least-cost capacity expansion, elec...
SIGNUM: A Matlab, TIN-based landscape evolution model
NASA Astrophysics Data System (ADS)
Refice, A.; Giachetta, E.; Capolongo, D.
2012-08-01
Several numerical landscape evolution models (LEMs) have been developed to date, and many are available as open source codes. Most are written in efficient programming languages such as Fortran or C, but often require additional coding effort to plug into more user-friendly data analysis and/or visualization tools that ease interpretation and scientific insight. In this paper, we present an effort to port a common core of accepted physical principles governing landscape evolution directly into a high-level language and data analysis environment such as Matlab. SIGNUM (an acronym for Simple Integrated Geomorphological Numerical Model) is an independent and self-contained Matlab, TIN-based landscape evolution model, built to simulate topography development at various space and time scales. SIGNUM is presently capable of simulating hillslope processes such as linear and nonlinear diffusion, fluvial incision into bedrock, spatially varying surface uplift (which can be used to simulate changes in base level), thrusting and faulting, as well as effects of climate changes. Although based on accepted and well-known processes and algorithms in its present version, it is built with a modular structure that allows the simulated physical processes to be easily modified and upgraded to suit virtually any user's needs. The code is conceived as an open-source project, and is thus an ideal tool for both research and didactic purposes, thanks to the high-level nature of the Matlab environment and its popularity in the scientific community. In this paper the simulation code is presented together with some simple examples of surface evolution, and guidelines for the development of new modules and algorithms are proposed.
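The linear hillslope diffusion that SIGNUM simulates can be sketched in a few lines. The Python snippet below is a regular-grid analogue (an assumed simplification: SIGNUM itself operates on a TIN, in Matlab), integrating dz/dt = κ∇²z with an explicit forward-time, centred-space update under periodic boundaries, which conserves mass.

```python
import numpy as np

# Regular-grid analogue of linear hillslope diffusion, dz/dt = k*lap(z).
# kappa, dx, dt and the initial bump are illustrative values; the
# explicit scheme is stable because dt <= dx^2 / (4*kappa).
def diffuse(z, kappa, dx, dt, steps):
    z = z.copy()
    for _ in range(steps):
        lap = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
               np.roll(z, 1, 1) + np.roll(z, -1, 1) - 4.0*z)/dx**2
        z += dt*kappa*lap      # FTCS update, periodic borders via roll
    return z

z0 = np.zeros((64, 64))
z0[32, 32] = 100.0             # hypothetical initial topographic bump
z1 = diffuse(z0, kappa=1.0, dx=1.0, dt=0.2, steps=200)
# diffusion smooths the bump while conserving total "mass" (sum of z)
```

On a TIN the Laplacian is assembled from the irregular node neighbourhoods rather than a fixed 4-point stencil, but the time-stepping logic is the same.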
Tools for Integrating Data Access from the IRIS DMC into Research Workflows
NASA Astrophysics Data System (ADS)
Reyes, C. G.; Suleiman, Y. Y.; Trabant, C.; Karstens, R.; Weertman, B. R.
2012-12-01
Web service interfaces at the IRIS Data Management Center (DMC) provide access to a vast archive of seismological and related geophysical data. These interfaces are designed to easily incorporate data access into data processing workflows. Examples of data that may be accessed include: time series data, related metadata, and earthquake information. The DMC has developed command line scripts, MATLAB® interfaces and a Java library to support a wide variety of data access needs. Users of these interfaces do not need to concern themselves with web service details, networking, or even (in most cases) data conversion. Fetch scripts allow access to the DMC archive and are a comfortable fit for command line users. These scripts are written in Perl and are well suited for automation and integration into existing workflows on most operating systems. For metadata and event information, the Fetch scripts even parse the returned data into simple text summaries. The IRIS Java Web Services Library (IRIS-WS Library) gives Java developers the ability to create programs that access the DMC archives seamlessly. By returning the data and information as native Java objects the Library insulates the developer from data formats, network programming and web service details. The MATLAB interfaces leverage this library to allow users access to the DMC archive directly from within MATLAB (r2009b or newer), returning data into variables for immediate use. Data users and research groups are developing other toolkits that use the DMC's web services. Notably, the ObsPy framework developed at LMU Munich is a Python toolbox that allows seamless access to data and information via the DMC services. Another example is the MATLAB-based GISMO and Waveform Suite developments that can now access data via web services. In summary, there now exist a host of ways that researchers can bring IRIS DMC data directly into their workflows.
MATLAB users can use irisFetch.m, command line users can use the various Fetch scripts, Java users can use the IRIS-WS library, and Python users may request data through ObsPy. To learn more about any of these clients see http://www.iris.edu/ws/wsclients/.
Giorli, Giacomo; Au, Whitlow W L; Ou, Hui; Jarvis, Susan; Morrissey, Ronald; Moretti, David
2015-05-01
The temporal occurrence of deep-diving cetaceans in the Josephine Seamount High Seas Marine Protected Area (JSHSMPA), south-west Portugal, was monitored using a passive acoustic recorder. The recorder was deployed on 13 May 2010 at a depth of 814 m during the North Atlantic Treaty Organization Centre for Maritime Research and Experimentation cruise "Sirena10" and recovered on 6 June 2010. The recorder was programmed to record 40 s of data every 2 min. Acoustic data analysis for the detection and classification of echolocation clicks was performed using automatic detector/classification systems: M3R (Marine Mammal Monitoring on Navy Ranges) and a custom matlab program, with an operator-supervised custom matlab program used to assess the classification performance of the detector/classification systems. The M3R CS-SVM algorithm contains templates to detect beaked whales, sperm whales, blackfish (pilot and false killer whales), and Risso's dolphins. The detections of each group of odontocetes were monitored as a function of time. Blackfish and Risso's dolphins were detected every day, while beaked whales and sperm whales were detected almost every day. The hourly distribution of detections reveals that blackfish and Risso's dolphins were more active at night, while beaked whales and sperm whales were more active during daylight hours.
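A minimal energy-threshold click detector conveys the kind of automatic detection involved. The Python sketch below is a hypothetical illustration, not the M3R CS-SVM classifier or the custom matlab programs, which additionally use spectral templates; all parameters are assumed.

```python
import numpy as np

# Toy energy detector: smooth the squared signal with a short moving
# average and flag rising edges where the energy exceeds a multiple of
# its median (the noise floor). Window and threshold are assumptions.
def detect_clicks(signal, fs, win_s=0.001, thresh_factor=10.0):
    win = max(1, int(win_s * fs))
    energy = np.convolve(signal**2, np.ones(win)/win, mode="same")
    thresh = thresh_factor * np.median(energy)
    above = energy > thresh
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1  # rising edges
    return onsets / fs                                    # onset times (s)

fs = 96_000
rng = np.random.default_rng(2)
x = 0.01 * rng.standard_normal(fs)          # 1 s of background noise
for t0 in (0.2, 0.5, 0.8):                  # three synthetic 1 ms clicks
    i = int(t0 * fs)
    x[i:i+96] += np.sin(2*np.pi*30_000*np.arange(96)/fs)
clicks = detect_clicks(x, fs)
```

Real classifiers must then separate species by click spectrum and inter-click interval, which is where template-based systems such as M3R come in.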
Toward an Integrated Design, Inspection and Redundancy Research Program.
1984-01-01
[Fragmentary report excerpt] Contributors include William Creelman (National Marine Service, St. Louis, Missouri) and William H. Silcox (Standard Oil Company of California, San Francisco, California). The program aims to develop physical models and generic tools for analyzing the effects of redundancy, reserve strength, and residual strength on the system behavior of marine structures; for probabilistic analyses to be applicable to real-world problems, this program needs to provide the deterministic physical models and generic tools upon
End of inevitability: programming and reprogramming.
Turksen, Kursad
2013-08-01
Stem cell commitment and differentiation leading to functional cell types and organs have generally been considered unidirectional and deterministic. Starting with a landmark study 50 years ago, and now with more recent observations, this paradigm has been challenged, necessitating a rethink of what constitutes both programming and reprogramming processes, and of how we can use this new understanding in new approaches to drug discovery and regenerative medicine.
Enhancements and Algorithms for Avionic Information Processing System Design Methodology.
1982-06-16
[Fragmentary report excerpt] The programming algorithm is enhanced by incorporating task precedence constraints and hardware failures. Stochastic network methods are used to analyze allocations in the presence of random fluctuations. Graph-theoretic methods are used to analyze hardware designs, and new designs are constructed. Previously, spatial dynamic programming (SDP) was used to solve a static, deterministic software allocation problem; under the current contract the SDP
Discrete Time McKean–Vlasov Control Problem: A Dynamic Programming Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pham, Huyên, E-mail: pham@math.univ-paris-diderot.fr; Wei, Xiaoli, E-mail: tyswxl@gmail.com
We consider the stochastic optimal control problem of nonlinear mean-field systems in discrete time. We reformulate the problem into a deterministic control problem with marginal distribution as controlled state variable, and prove that dynamic programming principle holds in its general form. We apply our method for solving explicitly the mean-variance portfolio selection and the multivariate linear-quadratic McKean–Vlasov control problem.
SMASH - semi-automatic muscle analysis using segmentation of histology: a MATLAB application.
Smith, Lucas R; Barton, Elisabeth R
2014-01-01
Histological assessment of skeletal muscle tissue is commonly applied in many areas of skeletal muscle physiology research. Histological parameters including fiber distribution, fiber type, centrally nucleated fibers, and capillary density are all frequently quantified measures of skeletal muscle. These parameters reflect functional properties of muscle and undergo adaptation in many muscle diseases and injuries. While standard operating procedures have been developed to guide analysis of many of these parameters, software to analyze them freely, efficiently, and consistently is not readily available. In order to provide this service to the muscle research community we developed an open-source MATLAB script to analyze immunofluorescent muscle sections, incorporating user controls for muscle histological analysis. The software consists of multiple functions designed to provide tools for the analysis selected. Initial segmentation and fiber-filter functions segment the image and remove non-fiber elements based on user-defined parameters to create a fiber mask. Using parameters set by the user, the software outputs data on fiber size and type, centrally nucleated fibers, and other structures. These functions were evaluated on stained soleus muscle sections from 1-year-old wild-type and mdx mice, a model of Duchenne muscular dystrophy. In accordance with previously published data, fiber size was not different between groups, but mdx muscles had much higher fiber size variability. The mdx muscle had a significantly greater proportion of type I fibers, but type I fibers did not change in size relative to type II fibers. Centrally nucleated fibers were highly prevalent in mdx muscle and were significantly larger than peripherally nucleated fibers. The MATLAB code described and provided along with this manuscript is designed for image processing of skeletal muscle immunofluorescent histological sections.
The program allows for semi-automated fiber detection along with user correction. The output of the code provides data in accordance with established standards of practice. The results of the program have been validated using a small set of wild-type and mdx muscle sections. This program is the first freely available and open source image processing program designed to automate analysis of skeletal muscle histological sections.
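SMASH itself is a MATLAB application; as a language-agnostic illustration of the segmentation step described above, the following Python sketch labels a binary fiber mask into connected components and discards components below a user-defined minimum area, in the spirit of the "fiber filter" function (the function names, 4-connectivity, and threshold are illustrative assumptions, not SMASH's actual implementation):

```python
from collections import deque

def label_components(mask):
    """4-connected component labeling on a binary grid (list of lists of 0/1)."""
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not labels[r][c]:
                current += 1
                labels[r][c] = current
                queue = deque([(r, c)])
                while queue:  # breadth-first flood fill of one component
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels, current

def fiber_filter(mask, min_area):
    """Remove components smaller than min_area (non-fiber debris)."""
    labels, n = label_components(mask)
    areas = [0] * (n + 1)
    for row in labels:
        for lab in row:
            areas[lab] += 1
    return [[1 if mask[r][c] and areas[labels[r][c]] >= min_area else 0
             for c in range(len(mask[0]))] for r in range(len(mask))]
```

A real pipeline would build the mask from the immunofluorescent fiber-boundary stain before this filtering step.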
Estimating haplotype frequencies by combining data from large DNA pools with database information.
Gasbarra, Dario; Kulathinal, Sangita; Pirinen, Matti; Sillanpää, Mikko J
2011-01-01
We assume that allele frequency data have been extracted from several large DNA pools, each containing genetic material of up to hundreds of sampled individuals. Our goal is to estimate the haplotype frequencies among the sampled individuals by combining the pooled allele frequency data with prior knowledge about the set of possible haplotypes. Such prior information can be obtained, for example, from a database such as HapMap. We present a Bayesian haplotyping method for pooled DNA based on a continuous approximation of the multinomial distribution. The proposed method is applicable when the sizes of the DNA pools and/or the number of considered loci exceed the limits of several earlier methods. In the example analyses, the proposed model clearly outperforms a deterministic greedy algorithm on real data from the HapMap database. With a small number of loci, the performance of the proposed method is similar to that of an EM-algorithm, which uses a multinormal approximation for the pooled allele frequencies, but which does not utilize prior information about the haplotypes. The method has been implemented using Matlab and the code is available upon request from the authors.
Patrick, Matthew R.; Kauahikaua, James P.; Antolik, Loren
2010-01-01
Webcams are now standard tools for volcano monitoring and are used at observatories in Alaska, the Cascades, Kamchatka, Hawai'i, Italy, and Japan, among other locations. Webcam images allow invaluable documentation of activity and provide a powerful comparative tool for interpreting other monitoring data streams, such as seismicity and deformation. Automated image processing can improve the time efficiency and rigor of Webcam image interpretation, and potentially extract more information on eruptive activity. For instance, Lovick and others (2008) provided a suite of processing tools that performed such tasks as noise reduction, eliminating uninteresting images from an image collection, and detecting incandescence, with an application to dome activity at Mount St. Helens during 2007. In this paper, we present two very simple automated approaches for improved characterization and quantification of volcanic incandescence in Webcam images at Kilauea Volcano, Hawai'i. The techniques are implemented in MATLAB (version 2009b, Copyright: The Mathworks, Inc.) to take advantage of the ease of matrix operations. Incandescence is a useful indicator of the location and extent of active lava flows and also a potentially powerful proxy for activity levels at open vents. We apply our techniques to a period covering both summit and east rift zone activity at Kilauea during 2008-2009 and compare the results to complementary datasets (seismicity, tilt) to demonstrate their integrative potential. A great strength of this study is the demonstrated success of these tools in an operational setting at the Hawaiian Volcano Observatory (HVO) over the course of more than a year. Although applied only to Webcam images here, the techniques could be applied to any type of sequential images, such as time-lapse photography.
We expect that these tools are applicable to many other volcano monitoring scenarios, and the two MATLAB scripts, as they are implemented at HVO, are included in the appendixes. These scripts would require minor to moderate modifications for use elsewhere, primarily to customize directory navigation. If the user has some familiarity with MATLAB, or programming in general, these modifications should be easy. Although we originally anticipated needing the Image Processing Toolbox, the scripts in the appendixes do not require it. Thus, only the base installation of MATLAB is needed. Because fairly basic MATLAB functions are used, we expect that the script can be run successfully by versions earlier than 2009b.
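The HVO scripts themselves are given in the appendixes in MATLAB; a minimal Python sketch of the simplest incandescence measure such matrix operations support (thresholding an RGB frame and counting "hot" pixels as an activity proxy) might look as follows, with threshold values that are purely illustrative:

```python
def incandescent_pixel_count(frame, red_min=200, red_excess=60):
    """Count pixels that are bright in the red channel and redder than blue
    by a margin, a crude proxy for incandescence in a night-time image.
    frame: nested list of (R, G, B) tuples with 8-bit values."""
    count = 0
    for row in frame:
        for r, g, b in row:
            if r >= red_min and (r - b) >= red_excess:
                count += 1
    return count
```

Tracking this count through an image sequence gives a time series that can be compared against seismicity or tilt records.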
NASA Astrophysics Data System (ADS)
Králik, Juraj; Králik, Juraj
2017-07-01
The paper presents the results of the deterministic and probabilistic analysis of the accidental torsional effect of reinforced concrete tall buildings due to earthquake events. A core-column structural system was considered with various configurations in plan. The methodology of the seismic analysis of building structures in Eurocode 8 and JCSS 2000 is discussed. The possibility of utilizing the LHS method to analyze extensive and robust tasks in FEM is presented. The influence of various input parameters (material, geometry, soil, masses and others) is considered. The deterministic and probabilistic analysis of the seismic resistance of the structure was calculated in the ANSYS program.
Loudspeaker equalization for auditory research.
MacDonald, Justin A; Tran, Phuong K
2007-02-01
The equalization of loudspeaker frequency response is necessary to conduct many types of well-controlled auditory experiments. This article introduces a program that includes functions to measure a loudspeaker's frequency response, design equalization filters, and apply the filters to a set of stimuli to be used in an auditory experiment. The filters can compensate for both magnitude and phase distortions introduced by the loudspeaker. A MATLAB script is included in the Appendix to illustrate the details of the equalization algorithm used in the program.
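The article's MATLAB script appears in its Appendix; the core idea it describes, inverting a measured impulse response in the frequency domain so that both magnitude and phase distortions are compensated, can be sketched in Python as follows (the direct DFT and the regularization constant `eps`, which keeps the inverse stable at bins where the response is weak, are illustrative assumptions, not the article's algorithm):

```python
import cmath

def dft(x):
    """Direct discrete Fourier transform (O(N^2), fine for short responses)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Inverse DFT, returning the real part of each sample."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N)
                for k in range(N)).real / N
            for n in range(N)]

def inverse_filter(impulse_response, eps=1e-3):
    """Frequency-domain inverse of a measured loudspeaker impulse response.
    G(k) = conj(H(k)) / (|H(k)|^2 + eps) inverts both magnitude and phase;
    eps regularizes bins where |H(k)| is near zero."""
    H = dft(impulse_response)
    G = [Hk.conjugate() / (abs(Hk) ** 2 + eps) for Hk in H]
    return idft(G)
```

Convolving a stimulus with this filter before playback would, in the ideal case, cancel the loudspeaker's own response.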
Computer aided design environment for the analysis and design of multi-body flexible structures
NASA Technical Reports Server (NTRS)
Ramakrishnan, Jayant V.; Singh, Ramen P.
1989-01-01
A computer aided design environment consisting of the programs NASTRAN, TREETOPS and MATLAB is presented in this paper. With links for data transfer between these programs, the integrated design of multi-body flexible structures is significantly enhanced. The CAD environment is used to model the Space Shuttle/Pinhole Occulater Facility. Then a controller is designed and evaluated in the nonlinear time history sense. Recent enhancements and ongoing research to add more capabilities are also described.
ERIC Educational Resources Information Center
Ocak, Mehmet
2008-01-01
This correlational study examined the relationship between gender and the students' attitude and prior knowledge of using one of the mathematical software programs (MATLAB). Participants were selected from one community college, one state university and one private college. Students were volunteers from three Calculus I classrooms (one class from…
NASA Astrophysics Data System (ADS)
Tolson, B.; Matott, L. S.; Gaffoor, T. A.; Asadzadeh, M.; Shafii, M.; Pomorski, P.; Xu, X.; Jahanpour, M.; Razavi, S.; Haghnegahdar, A.; Craig, J. R.
2015-12-01
We introduce asynchronous parallel implementations of the Dynamically Dimensioned Search (DDS) family of algorithms, including DDS, discrete DDS, PA-DDS and DDS-AU. These parallel algorithms are unique among existing parallel optimization algorithms in the water resources field in that parallel DDS is asynchronous and does not require an entire population (set of candidate solutions) to be evaluated before generating and then sending a new candidate solution for evaluation. One key advance in this study is developing the first parallel PA-DDS multi-objective optimization algorithm. The other key advance is enhancing the computational efficiency of solving optimization problems (such as model calibration) by combining a parallel optimization algorithm with the deterministic model pre-emption concept. These two efficiency techniques can only be combined because of the asynchronous nature of parallel DDS. Model pre-emption functions to terminate simulation model runs early, prior to completely simulating the model calibration period for example, when intermediate results indicate the candidate solution is so poor that it will definitely have no influence on the generation of further candidate solutions. The computational savings of deterministic model pre-emption available in serial implementations of population-based algorithms (e.g., PSO) disappear in synchronous parallel implementations of these algorithms. In addition to the key advances above, we implement the algorithms across a range of computation platforms (Windows and Unix-based operating systems, from multi-core desktops to a supercomputer system) and package them for future modellers within a model-independent calibration software package called Ostrich, as well as in MATLAB versions.
Results across multiple platforms and multiple case studies (from 4 to 64 processors) demonstrate the vast improvement over serial DDS-based algorithms and highlight the important role model pre-emption plays in the performance of parallel, pre-emptable DDS algorithms. Case studies include single- and multiple-objective optimization problems in water resources model calibration and in many cases linear or near linear speedups are observed.
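The serial DDS neighborhood rule these parallel variants build on (from Tolson and Shoemaker's published algorithm) can be sketched in Python; in the asynchronous versions, each candidate produced this way is dispatched to an idle worker immediately, and pre-emption aborts a model run once its partial objective is provably worse than the current best:

```python
import math, random

def dds_candidate(x_best, bounds, i, max_iter, r=0.2, rng=random):
    """One DDS perturbation of the current best solution. Each dimension is
    included with probability 1 - ln(i)/ln(max_iter) (so the search narrows
    over time); at least one dimension is always perturbed."""
    p = 1.0 - math.log(i) / math.log(max_iter)
    dims = [j for j in range(len(x_best)) if rng.random() < p]
    if not dims:
        dims = [rng.randrange(len(x_best))]
    cand = list(x_best)
    for j in dims:
        lo, hi = bounds[j]
        cand[j] = x_best[j] + rng.gauss(0.0, 1.0) * r * (hi - lo)
        # reflect at the bounds, per the published DDS neighborhood rule
        if cand[j] < lo:
            cand[j] = lo + (lo - cand[j])
            if cand[j] > hi:
                cand[j] = lo
        elif cand[j] > hi:
            cand[j] = hi - (cand[j] - hi)
            if cand[j] < lo:
                cand[j] = hi
    return cand
```

The sketch shows only candidate generation; the evaluation loop, the parallel dispatch, and the pre-emption test are the paper's contribution and are not reproduced here.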
Stochastic Optimization for Unit Commitment-A Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Qipeng P.; Wang, Jianhui; Liu, Andrew L.
2015-07-01
Optimization models have been widely used in the power industry to aid the decision-making process of scheduling and dispatching electric power generation resources, a process known as unit commitment (UC). Since UC's birth, there have been two major waves of revolution in UC research and real-life practice. The first wave made mixed integer programming stand out from the early solution and modeling approaches for deterministic UC, such as priority lists, dynamic programming, and Lagrangian relaxation. With the high penetration of renewable energy, increasing deregulation of the electricity industry, and growing demands on system reliability, the next wave is focused on transitioning from traditional deterministic approaches to stochastic optimization for unit commitment. Since the literature has grown rapidly in the past several years, this paper reviews the works that have contributed to the modeling and computational aspects of stochastic optimization (SO) based UC. Relevant lines of future research are also discussed to help transform research advances into real-world applications.
NASA Astrophysics Data System (ADS)
Zarindast, Atousa; Seyed Hosseini, Seyed Mohamad; Pishvaee, Mir Saman
2017-06-01
A robust supplier selection problem is proposed in a scenario-based approach, in which demand and exchange rates are subject to uncertainty. First, a deterministic multi-objective mixed integer linear program is developed; then, the robust counterpart of the proposed mixed integer linear program is presented using recent extensions in robust optimization theory. We discuss the decision variables, respectively, via a two-stage stochastic planning model, a robust stochastic optimization planning model which integrates the worst-case scenario into the modeling approach, and finally an equivalent deterministic planning model. An experimental study is carried out to compare the performances of the three models. The robust model resulted in remarkable cost savings, illustrating that to cope with such uncertainties we should account for them in advance in our planning. In our case study, different suppliers were selected because of these uncertainties, and since supplier selection is a strategic decision, it is crucial to consider these uncertainties in the planning approach.
A Low-Cost Method of Ciliary Beat Frequency Measurement Using iPhone and MATLAB: Rabbit Study.
Chen, Jason J; Lemieux, Bryan T; Wong, Brian J F
2016-08-01
(1) To determine ciliary beat frequency (CBF) using a consumer-grade cellphone camera and MATLAB and (2) to evaluate the effectiveness and accuracy of the proposed method. Prospective animal study. Academic otolaryngology department research laboratory. Five ex vivo tracheal samples were extracted from 3 freshly euthanized (<3 hours postmortem) New Zealand white rabbits and incubated for 30 minutes in buffer at 23°C, buffer at 37°C, or 10% formalin at 23°C. Samples were sectioned transversely and observed under a phase-contrast microscope. Cilia movement was recorded through the eyepiece using an iPhone 6 at 240 frames per second (fps). Through MATLAB programming, the video of the 23°C sample was downsampled to 120, 60, and 30 fps, and Fourier analysis was performed on videos of all frame rates and conditions to determine CBF. CBF of the 23°C sample was also calculated manually frame by frame for verification. Recorded at 240 fps, the CBF at 23°C was 5.03 ± 0.4 Hz, and the CBF at 37°C was 9.08 ± 0.49 Hz (P < .001). The sample with 10% formalin did not display any data beyond DC noise. Compared with 240 fps, the means of other frame rates/methods (120, 60, 30 fps; manual counting) at 23°C all showed no statistical difference (P > .05). There is no significant difference between CBF measured via visual inspection and that analyzed by the developed program. Furthermore, all tested acquisition rates are shown to be effective, providing a fast and inexpensive alternative to current CBF measurement protocols. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2016.
Advances in Engineering Software for Lift Transportation Systems
NASA Astrophysics Data System (ADS)
Kazakoff, Alexander Borisoff
2012-03-01
In this paper, an attempt at computer modelling of ropeway ski lift systems is presented. The logic of these systems is based on a form of travel between two terminals, operating with high-capacity cabins, chairs, gondolas or draw-bars. The computer codes AUTOCAD, MATLAB and Compaq Visual Fortran (version 6.6) are used in the computer modelling. The rope-system computer modelling is organized in two stages in this paper. The first stage is the organization of the ground relief profile and the design of the lift system as a whole, according to the terrain profile and the climatic and atmospheric conditions. The ground profile is prepared by geodesists and is presented in an AUTOCAD view. The next step is the design of the lift itself, which is performed by programs using the computer code MATLAB. The second stage of the computer modelling is performed after the optimization of the co-ordinates and the lift profile using the computer code MATLAB. Then the co-ordinates and the parameters are inserted into a program written in Compaq Visual Fortran (version 6.6), which calculates 171 lift parameters, organized in 42 tables. The objective of the work presented in this paper is an attempt at computer modelling of the design and parameter derivation of ropeway systems and their computer variation and optimization.
CPMC-Lab: A MATLAB package for Constrained Path Monte Carlo calculations
NASA Astrophysics Data System (ADS)
Nguyen, Huy; Shi, Hao; Xu, Jie; Zhang, Shiwei
2014-12-01
We describe CPMC-Lab, a MATLAB program for the constrained-path and phaseless auxiliary-field Monte Carlo methods. These methods have allowed applications ranging from the study of strongly correlated models, such as the Hubbard model, to ab initio calculations in molecules and solids. The present package implements the full ground-state constrained-path Monte Carlo (CPMC) method in MATLAB with a graphical interface, using the Hubbard model as an example. The package can perform calculations in finite supercells in any dimensions, under periodic or twist boundary conditions. Importance sampling and all other algorithmic details of a total energy calculation are included and illustrated. This open-source tool allows users to experiment with various model and run parameters and visualize the results. It provides a direct and interactive environment to learn the method and study the code with minimal overhead for setup. Furthermore, the package can be easily generalized for auxiliary-field quantum Monte Carlo (AFQMC) calculations in many other models for correlated electron systems, and can serve as a template for developing a production code for AFQMC total energy calculations in real materials. Several illustrative studies are carried out in one- and two-dimensional lattices on total energy, kinetic energy, potential energy, and charge- and spin-gaps.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staschus, K.
1985-01-01
In this dissertation, efficient algorithms for electric-utility capacity expansion planning with renewable energy are developed. The algorithms include a deterministic phase that quickly finds a near-optimal expansion plan using derating and a linearized approximation to the time-dependent availability of nondispatchable energy sources. A probabilistic second phase needs comparatively few computer-time-consuming probabilistic simulation iterations to modify this solution towards the optimal expansion plan. For the deterministic first phase, two algorithms, based on a Lagrangian Dual decomposition and a Generalized Benders Decomposition, are developed. The probabilistic second phase uses a Generalized Benders Decomposition approach. Extensive computational tests of the algorithms are reported. Among the deterministic algorithms, the one based on Lagrangian Duality proves fastest. The two-phase approach is shown to save up to 80% in computing time as compared to a purely probabilistic algorithm. The algorithms are applied to determine the optimal expansion plan for the Tijuana-Mexicali subsystem of the Mexican electric utility system. A strong recommendation to push conservation programs in the desert city of Mexicali results from this implementation.
A cost-effectiveness analysis of two different antimicrobial stewardship programs.
Okumura, Lucas Miyake; Riveros, Bruno Salgado; Gomes-da-Silva, Monica Maria; Veroneze, Izelandia
2016-01-01
There is a lack of formal economic analysis to assess the efficiency of antimicrobial stewardship programs. Herein, we conducted a cost-effectiveness study to assess two different strategies of Antimicrobial Stewardship Programs. A 30-day Markov model was developed to analyze how cost-effective a Bundled Antimicrobial Stewardship Program implemented in a university hospital in Brazil was. Clinical data derived from a historical cohort that compared two different antimicrobial stewardship strategies, with 30-day mortality as the main outcome. Selected costs included workload, cost of defined daily doses, length of stay, and laboratory and imaging resources used to diagnose infections. Data were analyzed by deterministic and probabilistic sensitivity analysis to assess the model's robustness, a tornado diagram, and a cost-effectiveness acceptability curve. The Bundled Strategy was more expensive (cost difference US$ 2119.70) but more efficient (US$ 27,549.15 vs 29,011.46). Deterministic and probabilistic sensitivity analyses suggested that critical variables did not alter the final incremental cost-effectiveness ratio. The Bundled Strategy had a higher probability of being cost-effective, which was endorsed by the cost-effectiveness acceptability curve. As health systems call for efficient technologies, this study concludes that the Bundled Antimicrobial Stewardship Program was more cost-effective, meaning that stewardship strategies with such characteristics would be of special interest from a societal and clinical perspective. Copyright © 2016 Elsevier Editora Ltda. All rights reserved.
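The headline comparisons above are ratio statistics; as a minimal sketch of the two standard quantities involved (the numbers in the usage note are hypothetical, not taken from the study):

```python
def cost_effectiveness_ratio(cost, effectiveness):
    """Average cost per unit of effectiveness (e.g., US$ per 30-day survivor)."""
    return cost / effectiveness

def icer(cost_new, effect_new, cost_ref, effect_ref):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    effect when moving from the reference strategy to the new one."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)
```

For example, a strategy costing US$ 100 more per patient while raising survival from 0.80 to 0.90 would have an ICER of US$ 1000 per additional survivor; the sensitivity analyses above probe how such a ratio shifts as the input costs and probabilities vary.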
Introduction to TAFI - A Matlab® toolbox for analysis of flexural isostasy
NASA Astrophysics Data System (ADS)
Jha, S.; Harry, D. L.; Schutt, D.
2016-12-01
The isostatic response to vertical tectonic loads emplaced on thin elastic plates overlying an inviscid substrate, and the corresponding gravity anomalies, are commonly modeled using well-established theories and methodologies of flexural analysis. However, such analysis requires some mathematical and coding expertise on the part of users. With that in mind, we designed a new interactive Matlab® toolbox called Toolbox for Analysis of Flexural Isostasy (TAFI). TAFI allows users to create forward models (2-D and 3-D) of flexural deformation of the lithosphere and the resulting gravity anomaly. TAFI computes Green's functions for flexure of the elastic plate subjected to point or line loads, and an analytical solution for harmonic loads. Flexure due to non-impulsive, distributed 2-D or 3-D loads is computed by convolving the appropriate Green's function with a user-supplied, spatially discretized load function. The gravity anomaly associated with each density interface is calculated by taking the Fourier transform of the flexural deflection of these interfaces and estimating the gravity in the wavenumber domain. All models created in TAFI are based on Matlab's intrinsic functions and do not require any specialized toolbox, function or library except those distributed with TAFI. Modeling functions within TAFI can be called from the Matlab workspace, from within user-written programs, or from TAFI's graphical user interface (GUI). The GUI enables the user to model the flexural deflection of the lithosphere interactively, enabling real-time comparison of the model fit with observed data constraining flexural deformation and gravity, and facilitating a rapid search for the best-fitting flexural model. TAFI is a very useful teaching and research tool and has been tested rigorously in graduate-level teaching and basic research environments.
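TAFI is a Matlab toolbox; the analytical harmonic-load solution it mentions is the standard thin-elastic-plate result, which a Python sketch can state directly (the default parameter values here are illustrative, not TAFI's defaults): for a surface load q0·cos(kx), the deflection amplitude is w0 = q0 / (D·k⁴ + Δρ·g), with flexural rigidity D = E·Te³ / (12(1 − ν²)).

```python
import math

def harmonic_flexure(load_amplitude, wavelength, Te,
                     E=70e9, nu=0.25, rho_m=3300.0, rho_fill=0.0, g=9.81):
    """Deflection amplitude (m) of a thin elastic plate under a harmonic
    surface load: w0 = q0 / (D*k^4 + (rho_m - rho_fill)*g), where
    D = E*Te^3 / (12*(1 - nu^2)) is the flexural rigidity and
    k = 2*pi/wavelength. Parameter values are illustrative."""
    D = E * Te ** 3 / (12.0 * (1.0 - nu ** 2))
    k = 2.0 * math.pi / wavelength
    return load_amplitude / (D * k ** 4 + (rho_m - rho_fill) * g)
```

At long wavelengths the D·k⁴ term vanishes and the plate behaves as in local (Airy) isostasy, w0 ≈ q0/(Δρ·g); at short wavelengths the plate's rigidity supports the load and the deflection is strongly suppressed.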
Saulnier, Dell D; Persson, Lars-Åke; Streatfield, Peter Kim; Faruque, A S G; Rahman, Anisur
2016-01-01
Cholera outbreaks are a continuing problem in Bangladesh, and the timely detection of an outbreak is important for reducing morbidity and mortality. In Matlab, the ongoing Health and Demographic Surveillance System (HDSS) data records symptoms of diarrhea in children under the age of 5 years at the community level. Cholera surveillance in Matlab currently uses hospital-based data. The objective of this study is to determine whether increases in cholera in Matlab can be detected earlier by using HDSS diarrhea symptom data in a syndromic surveillance analysis, when compared to hospital admissions for cholera. HDSS diarrhea symptom data and hospital admissions for cholera in children under 5 years of age over a 2-year period were analyzed with the syndromic surveillance statistical program EARS (Early Aberration Reporting System). Dates when significant increases in either symptoms or cholera cases occurred were compared to one another. The analysis revealed that there were 43 days over 16 months when the cholera cases or diarrhea symptoms increased significantly. There were 8 months when both data sets detected days with significant increases. In 5 of the 8 months, increases in diarrheal symptoms occurred before increases of cholera cases. The increases in symptoms occurred between 1 and 15 days before the increases in cholera cases. The results suggest that the HDSS survey data may be able to detect an increase in cholera before an increase in hospital admissions is seen. However, there was no direct link between diarrheal symptom increases and cholera cases, and this, as well as other methodological weaknesses, should be taken into consideration.
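The EARS family of aberration-detection statistics is built on short-baseline control charts; a hedged Python sketch of a C1-style check (seven-day baseline and a three-standard-deviation threshold are the common defaults, not necessarily the settings used in this study) is:

```python
import statistics

def ears_c1_flags(counts, baseline=7, threshold=3.0):
    """Flag days whose count exceeds the mean of the preceding `baseline`
    days by more than `threshold` standard deviations, in the spirit of
    the EARS C1 statistic. counts: daily case or symptom counts."""
    flags = []
    for t in range(baseline, len(counts)):
        window = counts[t - baseline:t]
        mu = statistics.mean(window)
        sigma = statistics.pstdev(window)
        if sigma == 0:
            flagged = counts[t] > mu  # flat baseline: any excess is a signal
        else:
            flagged = (counts[t] - mu) / sigma > threshold
        flags.append((t, flagged))
    return flags
```

Running such a check on the HDSS diarrhea-symptom series and on hospital cholera admissions, then comparing the dates of the flagged days, is the kind of comparison the study reports.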
A Series of MATLAB Learning Modules to Enhance Numerical Competency in Applied Marine Sciences
NASA Astrophysics Data System (ADS)
Fischer, A. M.; Lucieer, V.; Burke, C.
2016-12-01
Enhanced numerical competency to navigate massive data landscapes is a critical skill students need to effectively explore, analyse and visualize complex patterns in high-dimensional data when addressing many of the world's problems. This is especially the case for interdisciplinary undergraduate applied marine science programs, where students are required to demonstrate competency in methods and ideas across multiple disciplines. In response to this challenge, we have developed a series of repository-based data exploration, analysis and visualization modules in MATLAB for integration across various attending and online classes within the University of Tasmania. The primary focus of these modules is to teach students to collect, aggregate and interpret data from large online marine scientific data repositories in order to 1) gain technical skills in discovering, accessing, managing and visualising large, numerous data sources, 2) interpret, analyse and design approaches to visualise these data, and 3) address, through numerical approaches, complex real-world problems that traditional scientific methods cannot address. All modules, implemented through a MATLAB live script, include a short recorded lecture to introduce the topic, a handout that gives an overview of the activities, an instructor's manual with a detailed methodology and discussion points, a student assessment (quiz and level-specific challenge task), and a survey. The marine science themes addressed through these modules include biodiversity, habitat mapping, algal blooms and sea surface temperature change, and utilize a series of marine science and oceanographic data portals.
Through these modules, students with minimal experience in MATLAB or numerical methods are introduced to array indexing, concatenation, sorting, and reshaping, principal component analysis, spectral analysis and unsupervised classification within the context of oceanographic processes, marine geology and marine community ecology.
The Office of Pesticide Programs models daily aquatic pesticide exposure values for 30 years in its risk assessments. However, only a fraction of that information is typically used in these assessments. The population model employed herein is a deterministic, density-dependent pe...
Probabilistic structural analysis methods for space transportation propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Moore, N.; Anis, C.; Newell, J.; Nagpal, V.; Singhal, S.
1991-01-01
Information on probabilistic structural analysis methods for space propulsion systems is given in viewgraph form. Information is given on deterministic certification methods, probability of failure, component response analysis, stress responses for 2nd stage turbine blades, Space Shuttle Main Engine (SSME) structural durability, and program plans.
Pinto Mariano, Adriano; Bastos Borba Costa, Caliane; de Franceschi de Angelis, Dejanira; Maugeri Filho, Francisco; Pires Atala, Daniel Ibraim; Wolf Maciel, Maria Regina; Maciel Filho, Rubens
2009-11-01
In this work, the mathematical optimization of a continuous flash fermentation process for the production of biobutanol was studied. The process consists of three interconnected units: a fermentor, a cell-retention system (tangential microfiltration), and a vacuum flash vessel (responsible for the continuous recovery of butanol from the broth). The objective of the optimization was to maximize butanol productivity for a desired substrate conversion. Two strategies were compared for the optimization of the process. In one, the process was represented by a deterministic model with kinetic parameters determined experimentally and, in the other, by a statistical model obtained using the factorial design technique combined with simulation. For both strategies, the problem was written as a nonlinear programming problem and was solved with the sequential quadratic programming technique. The results showed that, despite the very similar solutions obtained with both strategies, the problems found with the deterministic-model strategy, such as lack of convergence and high computational time, make the optimization strategy based on the statistical model, which proved to be robust and fast, more suitable for the flash fermentation process and recommended for real-time applications coupling optimization and control.
Teaching Computational Geophysics Classes using Active Learning Techniques
NASA Astrophysics Data System (ADS)
Keers, H.; Rondenay, S.; Harlap, Y.; Nordmo, I.
2016-12-01
We give an overview of our experience in teaching two computational geophysics classes at the undergraduate level. The first class is, for most students, their first programming class, and assumes that the students have had an introductory course in geophysics. In this class the students are introduced to basic Matlab skills: use of variables, basic array and matrix definition and manipulation, basic statistics, 1D integration, plotting of lines and surfaces, making of .m files, and basic debugging techniques. All of these concepts are applied to elementary but important concepts in earthquake and exploration geophysics (including epicentre location, computation of travel time curves for simple layered media, plotting of 1D and 2D velocity models, etc.). It is important to integrate the geophysics with the programming concepts: we found that this enhances students' understanding. Moreover, as this is a 3-year Bachelor program, and this class is taught in the 2nd semester, there is little time for a class that focuses only on programming. In the second class, which is optional and can be taken in the 4th or 6th semester, but is often also taken by Master students, we extend the Matlab programming to include signal processing and ordinary and partial differential equations, again with emphasis on geophysics (such as ray tracing and solving the acoustic wave equation). This class also contains a project in which the students have to write a brief paper on a topic in computational geophysics, preferably with programming examples. When teaching these classes it was found that active learning techniques, in which the students actively participate in the class, either individually, in pairs or in groups, are indispensable. We give a brief overview of the various activities that we have developed when teaching these classes.
The integrated model for solving the single-period deterministic inventory routing problem
NASA Astrophysics Data System (ADS)
Rahim, Mohd Kamarul Irwan Abdul; Abidin, Rahimi; Iteng, Rosman; Lamsali, Hendrik
2016-08-01
This paper discusses the problem of efficiently managing inventory and routing in a two-level supply chain system. Vendor Managed Inventory (VMI) is a policy that integrates decisions between a supplier and his customers. We assume that the demand at each customer is stationary and that the warehouse implements a VMI policy. The objective of this paper is to minimize the inventory and transportation costs of the customers in a two-level supply chain. The problem is to determine the delivery quantities, delivery times and routes to the customers for the single-period deterministic inventory routing problem (SP-DIRP) system. As a result, a linear mixed-integer program is developed for the solution of the SP-DIRP problem.
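The paper solves the full SP-DIRP as a mixed-integer program; as a hedged illustration of just the routing component it integrates, a brute-force search over visiting orders for a single vehicle can be sketched in Python (exact only for a handful of customers; the distance matrix in the usage example is hypothetical):

```python
from itertools import permutations

def route_cost(route, dist):
    """Cost of one vehicle route depot -> customers -> depot (depot = node 0).
    dist: square matrix of pairwise travel costs."""
    cost, prev = 0.0, 0
    for c in route:
        cost += dist[prev][c]
        prev = c
    return cost + dist[prev][0]  # close the tour back at the depot

def best_single_route(customers, dist):
    """Cheapest visiting order for one vehicle serving all given customers,
    found by exhaustive enumeration of permutations."""
    return min((route_cost(p, dist), p) for p in permutations(customers))
```

In the integrated model, this routing cost would be traded off against the customers' inventory holding costs when choosing delivery quantities and times, which is what the MILP formulation captures.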
NASA Astrophysics Data System (ADS)
Butykai, A.; Domínguez-García, P.; Mor, F. M.; Gaál, R.; Forró, L.; Jeney, S.
2017-11-01
The present document is an update of the previously published MatLab code for the calibration of optical tweezers in the high-resolution detection of the Brownian motion of non-spherical probes [1]. In this instance, an alternative version of the original code, based on the same physical theory [2] but focused on automating the calibration of measurements using spherical probes, is outlined. The newly added code is useful for high-frequency microrheology studies, where the probe radius is known but the viscosity of the surrounding fluid may not be. This extended calibration methodology is automatic, without the need for a user interface. A code for calibration by means of thermal noise analysis [3] is also included; this is a method that can be applied when using viscoelastic fluids if the trap stiffness is previously estimated [4]. The new code can be executed in MatLab and in GNU Octave. Program Files doi:http://dx.doi.org/10.17632/s59f3gz729.1 Licensing provisions: GPLv3 Programming language: MatLab 2016a (MathWorks Inc.) and GNU Octave 4.0 Operating system: Linux and Windows. Supplementary material: A new document README.pdf includes basic running instructions for the new code. Journal reference of previous version: Computer Physics Communications, 196 (2015) 599 Does the new version supersede the previous version?: No. It adds alternative but compatible code while providing similar calibration factors. Nature of problem (approx. 50-250 words): The original code uses a MatLab-provided user interface, which is not available in GNU Octave, and cannot be used outside proprietary software such as MatLab. Besides, calibration with spherical probes requires an automatic method when processing large amounts of data for microrheology. Solution method (approx. 50-250 words): The new code can be executed in the latest version of MatLab and in GNU Octave, a free and open-source alternative to MatLab.
This code implements an automatic calibration process that requires only that the input data be entered in the main script. Additionally, we include a calibration method based on thermal noise statistics, which can be used with viscoelastic fluids if the trap stiffness has previously been estimated. Reasons for the new version: This version extends the functionality of PFMCal to the particular case of spherical probes and unknown fluid viscosities. The extended code is automatic, works on different operating systems, and is compatible with GNU Octave. Summary of revisions: The original MatLab program in the previous version, which is executed by PFMCal.m, is not changed. Here, we have added two additional main scripts named PFMCal_auto.m and PFMCal_histo.m, which implement, respectively, automatic calculation of the calibration process and calibration through Boltzmann statistics. The process of calibration using this code for spherical beads is described in the README.pdf file provided in the new code submission. Here, we obtain different calibration factors, β (given in μm/V), according to [2], related to two statistical quantities: the mean-squared displacement (MSD), βMSD, and the velocity autocorrelation function (VAF), βVAF. Using this methodology, the trap stiffness, k, and the zero-shear viscosity of the fluid, η, can be calculated if the particle's radius, a, is known beforehand. For comparison, we include in the extended code the method of calibration using the corner frequency of the power spectral density (PSD) [5], providing a calibration factor βPSD. Moreover, with a prior estimate of the trap stiffness, along with the known value of the particle's radius, we can use thermal noise statistics to obtain calibration factors β according to the quadratic form of the optical potential, βE, and related to the Gaussian distribution of the bead's positions, βσ2.
This method has been demonstrated to be applicable to the calibration of optical tweezers using non-Newtonian viscoelastic polymeric liquids [4]. An example of the results obtained with this calibration process is summarized in Table 1. Using the data provided in the new code submission, for water and acetone, we calculate all the calibration factors using the original PFMCal.m and the new non-GUI code PFMCal_auto.m and PFMCal_histo.m. Regarding the new code, PFMCal_auto.m returns η, k, βMSD, βVAF and βPSD, while PFMCal_histo.m provides βσ2 and βE. Table 1 shows that we obtain the expected viscosity of the two fluids at this temperature and that the different methods provide good agreement between trap stiffnesses and calibration factors. Additional comments including restrictions and unusual features (approx. 50-250 words): The original code, PFMCal.m, runs under MatLab using the Statistics Toolbox. The extended code, PFMCal_auto.m and PFMCal_histo.m, can be executed without modification in MatLab or GNU Octave. The code has been tested on Linux and Windows operating systems.
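The thermal-noise route described in this record rests on the equipartition relation k·σx² = kB·T with x = β·V. The following Python sketch is a minimal illustration of that relation only, not the PFMCal implementation; the function name, trap stiffness, and calibration factor are invented for the example:

```python
import numpy as np

def beta_from_thermal_noise(v, k_trap, T=298.0):
    # Equipartition: k_trap * var(x) = kB * T with x = beta * v,
    # so beta = sqrt(kB * T / (k_trap * var(v))), in m/V.
    kB = 1.380649e-23  # J/K
    return np.sqrt(kB * T / (k_trap * np.var(v)))

# Synthetic check: trapped-bead positions with the equilibrium spread,
# read out through a detector with a known (invented) factor.
rng = np.random.default_rng(0)
k = 1e-6                                   # trap stiffness, N/m
beta_true = 2.5e-7                         # m/V
sigma_x = np.sqrt(1.380649e-23 * 298.0 / k)
v = rng.normal(0.0, sigma_x, 200_000) / beta_true   # readout in volts
print(beta_from_thermal_noise(v, k))       # ~2.5e-7
```

The same algebra applies whether β is reported in m/V, as here, or in μm/V, as in the record.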
Software For Computer-Aided Design Of Control Systems
NASA Technical Reports Server (NTRS)
Wette, Matthew
1994-01-01
Computer Aided Engineering System (CAESY) software developed to provide means to evaluate methods for dealing with users' needs in computer-aided design of control systems. Interpreter program for performing engineering calculations. Incorporates features of both Ada and MATLAB. Designed to be flexible and powerful. Includes internally defined functions, procedures and provides for definition of functions and procedures by user. Written in C language.
Coding in Senior School Mathematics with Live Editing
ERIC Educational Resources Information Center
Thompson, Ian
2017-01-01
In this paper, an example is offered of a problem-solving task for senior secondary school students which was given in the context of a story. As the story unfolds, the task requires progressively more complex forms of linear programming to be applied. Coding in MATLAB is used throughout the task in such a way that it supports the increasing…
Matpar: Parallel Extensions for MATLAB
NASA Technical Reports Server (NTRS)
Springer, P. L.
1998-01-01
Matpar is a set of client/server software that allows a MATLAB user to take advantage of a parallel computer for very large problems. The user can replace calls to certain built-in MATLAB functions with calls to Matpar functions.
Fully Automated Sunspot Detection and Classification Using SDO HMI Imagery in MATLAB
2014-03-27
Thesis: Fully Automated Sunspot Detection and Classification Using SDO HMI Imagery in MATLAB, by Gordon M. Spahr, Second Lieutenant, USAF (AFIT-ENP-14-M-34). Presented to the Faculty, Department of Engineering Physics, Graduate School of Engineering and Management, Air Force Institute of Technology. Distribution unlimited.
2008-09-01
Contents excerpt: Behavioural Point Process Data; Appendix B: Matlab Code. Matlab code used in Chapter 2 (porpoise prey capture analysis): click extraction and measurement of click properties; envelope-based click detector. Matlab code used in Chapter 3 (transmission loss in porpoise habitats): click extraction from data wavefiles; click level determination (Grand Manan datasets); click level determination (Danish datasets).
Flexible missile autopilot design studies with PC-MATLAB/386
NASA Technical Reports Server (NTRS)
Ruth, Michael J.
1989-01-01
Development of a responsive, high-bandwidth missile autopilot for airframes which have structural modes of unusually low frequency presents a challenging design task. Such systems are viable candidates for modern, state-space control design methods. The PC-MATLAB interactive software package provides an environment well-suited to the development of candidate linear control laws for flexible missile autopilots. The strengths of MATLAB include: (1) exceptionally high speed (MATLAB's version for 80386-based PCs offers benchmarks approaching minicomputer and mainframe performance); (2) ability to handle large design models of several hundred degrees of freedom, if necessary; and (3) broad extensibility through user-defined functions. To characterize MATLAB capabilities, a simplified design example is presented. This involves interactive definition of an observer-based state-space compensator for a flexible missile autopilot design task. MATLAB capabilities and limitations, in the context of this design task, are then summarized.
Computing Across the Physics and Astrophysics Curriculum
NASA Astrophysics Data System (ADS)
DeGioia Eastwood, Kathy; James, M.; Dolle, E.
2012-01-01
Computational skills are essential in today's marketplace. Bachelors entering the STEM workforce report that their undergraduate education does not adequately prepare them to use scientific software and to write programs. Computation can also increase student learning; not only are the students actively engaged, but computational problems allow them to explore physical problems that are more realistic than the few that can be solved analytically. We have received a grant from the NSF CCLI Phase I program to integrate computing into our upper division curriculum. Our language of choice is Matlab; this language had already been chosen for our required sophomore course in Computational Physics because of its prevalence in industry. For two summers we have held faculty workshops to help our professors develop the needed expertise, and we are now in the implementation and evaluation stage. The end product will be a set of learning materials in the form of computational modules that we will make freely available. These modules will include the assignment, pedagogical goals, Matlab code, samples of student work, and instructor comments. At this meeting we present an overview of the project as well as modules written for a course in upper division stellar astrophysics. We acknowledge the support of the NSF through DUE-0837368.
Dynamic optimization case studies in DYNOPT tool
NASA Astrophysics Data System (ADS)
Ozana, Stepan; Pies, Martin; Docekal, Tomas
2016-06-01
Dynamic programming is typically applied to optimization problems. Because analytical solutions are generally difficult to obtain, software tools are widely used. These software packages are often third-party products bound to standard simulation tools on the market. TOMLAB and DYNOPT are typical examples of such tools that can be effectively applied to dynamic optimization problems. DYNOPT is presented in this paper because of its licensing policy (a free product under the GPL) and its simplicity of use. DYNOPT is a set of MATLAB functions for determining an optimal control trajectory given a description of the process, the cost to be minimized, and equality and inequality constraints, using the method of orthogonal collocation on finite elements. The optimal control problem is solved by complete parameterization of both the control and state profile vectors. It is assumed that the dynamic model to be optimized may be described by a set of ordinary differential equations (ODEs) or differential-algebraic equations (DAEs). This collection of functions extends the capability of the MATLAB Optimization Toolbox. The paper introduces the use of DYNOPT in the field of dynamic optimization problems through case studies of selected laboratory physical educational models.
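As a rough illustration of the class of problem DYNOPT addresses, the following Python sketch solves a toy optimal control problem by direct parameterization of the control profile and a nonlinear programming solver, a cruder scheme than DYNOPT's orthogonal collocation; the system, horizon, and grid size are invented for the example:

```python
import numpy as np
from scipy.optimize import minimize

# Double integrator x'' = u on t in [0, 1]: drive (x, v) from (0, 0)
# to (1, 0) while minimizing the control energy J = sum(u_k^2) * dt.
N, T = 40, 1.0
dt = T / N

def simulate(u):
    x = v = 0.0
    for uk in u:          # forward Euler on x' = v, v' = u
        x += v * dt
        v += uk * dt
    return x, v

def cost(u):
    return np.sum(np.asarray(u) ** 2) * dt

def terminal(u):          # equality constraints at the final time
    x, v = simulate(u)
    return [x - 1.0, v]

res = minimize(cost, np.zeros(N), method='SLSQP',
               constraints={'type': 'eq', 'fun': terminal})
x_f, v_f = simulate(res.x)
print(res.success, round(cost(res.x), 2))   # cost ~12 (analytic optimum)
```

The analytic optimum for this problem is the linear control u(t) = 6 - 12t with cost 12, which the discretized solution approaches as the grid is refined.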
Simulation for Wind Turbine Generators -- With FAST and MATLAB-Simulink Modules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, M.; Muljadi, E.; Jonkman, J.
This report presents the work done to develop generator and gearbox models in the Matrix Laboratory (MATLAB) environment and couple them to the National Renewable Energy Laboratory's Fatigue, Aerodynamics, Structures, and Turbulence (FAST) program. The goal of this project was to interface the superior aerodynamic and mechanical models of FAST to the excellent electrical generator models found in various Simulink libraries and applications. The scope was limited to Type 1, Type 2, and Type 3 generators and fairly basic gear-train models. Future work will include models of Type 4 generators and more-advanced gear-train models with increased degrees of freedom. As described in this study, implementation of the developed drivetrain model enables the software tool to be used in many ways. Several case studies are presented as examples of the many types of studies that can be performed using this tool.
Extraction of RBM frequency of a (10, 0) SWNT using MATLAB
NASA Astrophysics Data System (ADS)
Yadav, Monika; Negi, Sunita
2018-05-01
In recent decades, carbon nanotubes (CNTs) have attracted great interest for future applications in electronics, composite materials, and drug delivery. The normal modes associated with a carbon nanotube are therefore important to understand for a better understanding of its behavior. In this paper we report the frequency of the most important mode, the radial breathing mode (RBM), of a single-walled carbon nanotube, obtained with the help of a MATLAB program. We observe an RBM frequency of 340 cm-1 for a (10, 0) single-walled carbon nanotube. A much slower frequency component associated with the RBM of the carbon nanotube is also observed, as reported previously. This method for extracting the RBM frequency is simple and fast compared with other methods.
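The extraction step described here presumably amounts to locating the dominant spectral peak of the tube's radial coordinate over time. A hedged sketch of that idea (synthetic signal; the sampling step and amplitude are invented, not taken from the paper):

```python
import numpy as np

def dominant_wavenumber(signal, dt_fs):
    # Strongest non-DC Fourier component, converted from Hz to cm^-1.
    spec = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), dt_fs * 1e-15)   # Hz
    f_dom = freqs[np.argmax(spec[1:]) + 1]                # skip DC bin
    return f_dom / 2.99792458e10                          # c in cm/s

# Synthetic radius trace oscillating at 340 cm^-1 (the reported RBM).
dt_fs = 1.0                                # sampling step, fs
t = np.arange(20000) * dt_fs * 1e-15       # time in seconds
r = 0.39 + 0.002 * np.cos(2 * np.pi * 340 * 2.99792458e10 * t)
print(round(dominant_wavenumber(r, dt_fs)))   # ~340
```

With a 20 ps record the frequency resolution is about 1.7 cm-1, so the recovered peak lands within one bin of the input value.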
GRace: a MATLAB-based application for fitting the discrimination-association model.
Stefanutti, Luca; Vianello, Michelangelo; Anselmi, Pasquale; Robusto, Egidio
2014-10-28
The Implicit Association Test (IAT) is a computerized two-choice discrimination task in which stimuli have to be categorized as belonging to target categories or attribute categories by pressing, as quickly and accurately as possible, one of two response keys. The discrimination-association model has recently been proposed for the analysis of the reaction time and accuracy of an individual respondent to the IAT. The model disentangles the influences of three qualitatively different components on the responses to the IAT: stimulus discrimination, automatic association, and termination criterion. The article presents General Race (GRace), a MATLAB-based application for fitting the discrimination-association model to IAT data. GRace has been developed for Windows as a standalone application. It is user-friendly and does not require any programming experience. The use of GRace is illustrated on the data of a Coca Cola-Pepsi Cola IAT, and the results of the analysis are interpreted and discussed.
Splitting parameter yield (SPY): A program for semiautomatic analysis of shear-wave splitting
NASA Astrophysics Data System (ADS)
Zaccarelli, Lucia; Bianco, Francesca; Zaccarelli, Riccardo
2012-03-01
SPY is a Matlab algorithm that analyzes seismic waveforms semiautomatically, providing estimates of the two observables of anisotropy: the shear-wave splitting parameters. We chose to exploit computational processes that require less intervention by the user, gaining objectivity and reliability as a result. The algorithm combines the covariance-matrix and cross-correlation techniques, and all computation steps are interspersed with several automatic checks intended to verify the reliability of the results. The resulting semiautomation offers two advantages in the field of anisotropy studies: handling a huge amount of data at once, and comparing different results. From this perspective, SPY has been developed in the Matlab environment, which is widespread, versatile, and user-friendly. Our intention is to provide the scientific community with a new monitoring tool for tracking temporal variations of the crustal stress field.
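The covariance-matrix technique mentioned above can be illustrated by estimating the polarization azimuth of particle motion from two horizontal components, one ingredient of a splitting analysis. This toy Python sketch is not SPY's code, and the data are synthetic:

```python
import numpy as np

def polarization_azimuth(e, n):
    # Eigenvector of the 2x2 covariance matrix of the horizontal
    # components gives the dominant particle-motion direction.
    C = np.cov(np.vstack([e, n]))
    w, V = np.linalg.eigh(C)
    ve, vn = V[:, np.argmax(w)]                      # largest eigenvalue
    return np.degrees(np.arctan2(ve, vn)) % 180.0    # deg east of north

# Synthetic shear arrival polarized 30 degrees east of north, plus noise.
rng = np.random.default_rng(1)
s = np.sin(np.linspace(0, 6 * np.pi, 400))
e = np.sin(np.radians(30)) * s + 0.02 * rng.normal(size=400)
n = np.cos(np.radians(30)) * s + 0.02 * rng.normal(size=400)
print(round(polarization_azimuth(e, n)))   # ~30
```

The modulo-180 step removes the sign ambiguity of the eigenvector, since a polarization direction is a line rather than a vector.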
Deterministic Execution of Ptides Programs
2013-05-15
at a time no later than 30+1+5 = 36. Assume the maximum clock synchronization error is . Therefore, the AddSubtract adder must delay processing the...the synchronization of the platform real-time clock to its peers in other system platforms. The portions of PtidyOS code that implement access to the...interesting opportunities for future research. References [1] Y. Zhao, E. A. Lee, and J. Liu, "A programming model for time-synchronized distributed real
NASA Astrophysics Data System (ADS)
Umansky, Moti; Weihs, Daphne
2012-08-01
In many physical and biophysical studies, single-particle tracking is utilized to reveal interactions, diffusion coefficients, active modes of driving motion, dynamic local structure, micromechanics, and microrheology. The basic analysis applied to those data is to determine the time-dependent mean-square displacement (MSD) of particle trajectories and perform time- and ensemble-averaging of similar motions. The motion of particles typically exhibits time-dependent power-law scaling, and only trajectories with qualitatively and quantitatively comparable MSDs should be ensembled. Ensemble-averaging trajectories that arise from different mechanisms, e.g., actively driven and diffusive, is incorrect and can result in inaccurate correlations between structure, mechanics, and activity. We have developed an algorithm to automatically and accurately determine the power-law scaling of experimentally measured single-particle MSDs. Trajectories can then be categorized and grouped according to user-defined cutoffs of time, amplitude, scaling-exponent values, or combinations thereof. Power-law fits are then provided for each trajectory, alongside categorized groups of trajectories, histograms of power laws, and the ensemble-averaged MSD of each group. The codes are designed to be easily incorporated into existing user codes. We expect that this algorithm and program will be invaluable to anyone performing single-particle tracking, be it in physical or biophysical systems. Catalogue identifier: AEMD_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMD_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 25 892 No. of bytes in distributed program, including test data, etc.: 5 572 780 Distribution format: tar.gz Programming language: MATLAB (MathWorks Inc.)
version 7.11 (2010b) or higher; the program should also be backwards compatible. The Symbolic Math Toolbox (5.5) is required. The Curve Fitting Toolbox (3.0) is recommended. Computer: Tested on Windows only, yet should work on any computer running MATLAB. In Windows 7 the program should be run as administrator; otherwise it may not be able to save outputs and temporary outputs to all locations. Operating system: Any supporting MATLAB (MathWorks Inc.) v7.11 / 2010b or higher. Supplementary material: Sample output files (approx. 30 MBytes) are available. Classification: 12 External routines: Several MATLAB subfunctions (m-files), freely available on the web, were used as part of, and included in, this code: count, NaN suite, parseArgs, roundsd, subaxis, wcov, wmean, and the executable pdfTK.exe. Nature of problem: In many physical and biophysical areas employing single-particle tracking, obtaining the time-dependent power laws governing the time-averaged mean-square displacement (MSD) of a single particle is crucial. Those power laws determine the mode of motion and hint at the underlying mechanisms driving motion. Accurate determination of the power laws that describe each trajectory allows categorization into groups for further analysis of single trajectories or ensemble analysis, e.g., ensemble- and time-averaged MSD. Solution method: The algorithm in the provided program automatically analyzes and fits time-dependent power laws to single-particle trajectories, then groups particles according to user-defined cutoffs. It accepts time-dependent trajectories of several particles; each trajectory is run through the program, its time-averaged MSD is calculated, and power laws are determined in regions where the MSD is linear on a log-log scale. Our algorithm searches for high-curvature points in experimental data, here the time-dependent MSD. Those serve as anchor points for determining the ranges of the power-law fits.
Power-law scaling is then accurately determined, and error estimates of the parameters and quality of fit are provided. After all single-trajectory time-averaged MSDs are fit, we obtain cutoffs from the user to categorize and segment the power laws into groups; cutoffs are either in the exponents of the power laws, the time of appearance of the fits, or both together. The trajectories are sorted according to the cutoffs, and the time- and ensemble-averaged MSD of each group is provided, with histograms of the distributions of the exponents in each group. The program then allows the user to generate new trajectory files with trajectories segmented according to the determined groups, for any further required analysis. Additional comments: A README file giving the names and a brief description of all the files that make up the package, with clear instructions on the installation and execution of the program, is included in the distribution package. Running time: On an i5 Windows 7 machine with 4 GB RAM, the automated parts of the run (excluding data loading and user input) take less than 45 minutes to analyze and save all stages for an 844-trajectory file, including the optional PDF save. Trajectory length did not affect run time (tested up to 3600 frames/trajectory), which was on average 3.2±0.4 seconds per trajectory.
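The core computation this program automates, a time-averaged MSD followed by a log-log power-law fit, can be sketched as follows. This is a simplified stand-in for the described algorithm, without its curvature-based anchor-point search, and all names and data are invented:

```python
import numpy as np

def time_averaged_msd(xy, max_lag):
    # Time-averaged MSD of one 2D trajectory (N x 2) for lags 1..max_lag.
    msd = np.empty(max_lag)
    for lag in range(1, max_lag + 1):
        d = xy[lag:] - xy[:-lag]
        msd[lag - 1] = np.mean(np.sum(d**2, axis=1))
    return msd

def powerlaw_exponent(lags, msd):
    # Slope of log(MSD) vs log(lag): MSD ~ t^alpha
    # (alpha ~ 1 diffusive, ~2 directed motion).
    alpha, _ = np.polyfit(np.log(lags), np.log(msd), 1)
    return alpha

# A plain 2D random walk should give alpha close to 1.
rng = np.random.default_rng(2)
xy = np.cumsum(rng.normal(size=(100_000, 2)), axis=0)
lags = np.arange(1, 51)
print(round(powerlaw_exponent(lags, time_averaged_msd(xy, 50)), 2))  # ~1.0
```

Categorization then amounts to binning trajectories by the fitted alpha, e.g. keeping only those with alpha near 1 for an ensemble-averaged diffusive MSD.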
An energy management for series hybrid electric vehicle using improved dynamic programming
NASA Astrophysics Data System (ADS)
Peng, Hao; Yang, Yaoquan; Liu, Chunyu
2018-02-01
With the increasing number of hybrid electric vehicles (HEVs), the management of the two energy sources, engine and battery, is increasingly important for achieving minimum fuel consumption. This paper first introduces several working modes of the series hybrid electric vehicle (SHEV) and then describes the mathematical models of the main components of an SHEV. On the basis of this model, dynamic programming is applied in MATLAB to distribute energy between the engine and battery, achieving lower fuel consumption than a traditional control strategy. In addition, a control rule for recovering energy during braking is added to the dynamic programming, and shorter computing times are realized by the improved dynamic programming through optimization of the algorithm.
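A minimal backward dynamic program for the power-split idea described above might look like the following; the drive cycle, fuel map, SOC dynamics, and limits are invented toy values, not the paper's model:

```python
import numpy as np

# Toy backward DP for a series-HEV power split: at each step the engine
# supplies p_eng from a discrete set and the battery the remainder;
# minimize total fuel subject to SOC limits and a charge-sustaining
# terminal condition. All values are illustrative.
p_dem = [20, 40, 30, 10, 50, 30]            # demanded power per step, kW
eng_opts = [0, 20, 40, 60]                  # engine operating points, kW
fuel = {0: 0.0, 20: 1.2, 40: 2.0, 60: 3.2}  # fuel cost per step
soc_grid = np.arange(20, 81)                # SOC states, % (1% steps)
eta = 0.1                                   # % SOC per kW per step

INF = 1e9
V = np.where(soc_grid >= 50, 0.0, INF)      # terminal: end at >= 50% SOC
for p in reversed(p_dem):                   # backward value iteration
    V_new = np.full(len(soc_grid), INF)
    for i, soc in enumerate(soc_grid):
        for pe in eng_opts:
            nxt = soc - eta * (p - pe)      # battery covers p - pe
            j = int(round(nxt - soc_grid[0]))
            if 0 <= j < len(soc_grid):
                V_new[i] = min(V_new[i], fuel[pe] + V[j])
    V = V_new
best = V[np.searchsorted(soc_grid, 50)]     # optimal fuel from 50% SOC
print(best)
```

For this toy cycle the optimum spends most steps at the engine's most efficient point (40 kW here), using the battery as a buffer while ending charge-sustaining.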
BasinVis 1.0: A MATLAB®-based program for sedimentary basin subsidence analysis and visualization
NASA Astrophysics Data System (ADS)
Lee, Eun Young; Novotny, Johannes; Wagreich, Michael
2016-06-01
Stratigraphic and structural mapping is important to understand the internal structure of sedimentary basins. Subsidence analysis provides significant insights for basin evolution. We designed a new software package to process and visualize the stratigraphic setting and subsidence evolution of sedimentary basins from well data. BasinVis 1.0 is implemented in MATLAB®, a multi-paradigm numerical computing environment, and employs two numerical methods: interpolation and subsidence analysis. Five different interpolation methods (linear, natural, cubic spline, Kriging, and thin-plate spline) are provided in this program for surface modeling. The subsidence analysis consists of decompaction and backstripping techniques. BasinVis 1.0 incorporates five main processing steps: (1) setup (study area and stratigraphic units), (2) loading well data, (3) stratigraphic setting visualization, (4) subsidence parameter input, and (5) subsidence analysis and visualization. For in-depth analysis, our software provides cross-section and dip-slip fault backstripping tools. The graphical user interface guides users through the workflow and provides tools to analyze and export the results. Interpolation and subsidence results are cached to minimize redundant computations and improve the interactivity of the program. All 2D and 3D visualizations are created by using MATLAB plotting functions, which enables users to fine-tune the results using the full range of available plot options in MATLAB. We demonstrate all functions in a case study of Miocene sediment in the central Vienna Basin.
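The decompaction step of backstripping can be sketched with the standard exponential porosity-depth law; this is a generic textbook formulation, not BasinVis code, and the shale-like parameter values are illustrative only:

```python
import numpy as np

def decompact(z1, z2, z1_new, phi0=0.63, c=0.51):
    # Porosity-depth law phi(z) = phi0 * exp(-c z), depths in km.
    # The grain (solid) thickness is conserved during decompaction,
    # so solve for the restored thickness by fixed-point iteration.
    grain = (z2 - z1) - phi0 / c * (np.exp(-c * z1) - np.exp(-c * z2))
    z2_new = z1_new + (z2 - z1)                 # initial guess
    for _ in range(60):
        z2_new = z1_new + grain + phi0 / c * (
            np.exp(-c * z1_new) - np.exp(-c * z2_new))
    return z2_new - z1_new

# A 0.5 km layer now buried at 2 km thickens when restored to surface.
print(round(decompact(2.0, 2.5, 0.0), 3))   # ~0.82 km
```

Backstripping then repeats this for each layer at each time step before applying the isostatic (Airy) correction for sediment load.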
Transverse emittance and phase space program developed for use at the Fermilab A0 Photoinjector
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thurman-Keup, R.; Johnson, A.S.; Lumpkin, A.H.
2011-03-01
The Fermilab A0 Photoinjector is a 16 MeV high-intensity, high-brightness electron linac developed for advanced accelerator R&D. One of the key parameters for the electron beam is the transverse beam emittance. Here we report on a newly developed MATLAB-based GUI program used for transverse emittance measurements using the multi-slit technique. This program combines the image acquisition and post-processing tools for determining the transverse phase space parameters with uncertainties. An integral part of accelerator research is a measurement of the beam phase space. Measurements of the transverse phase space can be accomplished by a variety of methods, including multiple screens separated by drift spaces, or by sampling phase space via pepper pots or slits. In any case, the measurement of the phase space parameters, in particular the emittance, can be drastically simplified and sped up by automating the measurement in an intuitive fashion utilizing a graphical interface. At the A0 Photoinjector (A0PI), the control system is DOOCS, which originated at DESY. In addition, there is a library for interfacing to MATLAB, a graphically capable numerical analysis package sold by The Mathworks. It is this graphical package which was chosen as the basis for a graphical phase space measurement system due to its combination of analysis and display capabilities.
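Downstream of the slit analysis, the quantity ultimately reported is the rms emittance built from second moments of the reconstructed phase space. A small sketch of that final step (not the A0PI GUI code; synthetic beam with invented sizes):

```python
import numpy as np

def rms_emittance(x, xp):
    # eps_rms = sqrt(<x^2><x'^2> - <x x'>^2), moments about the mean.
    x = x - np.mean(x)
    xp = xp - np.mean(xp)
    return np.sqrt(np.mean(x**2) * np.mean(xp**2) - np.mean(x * xp)**2)

# Uncorrelated Gaussian beam: eps = sigma_x * sigma_x' = 1e-7 m rad.
rng = np.random.default_rng(3)
x = rng.normal(0.0, 1e-3, 500_000)    # positions, 1 mm rms
xp = rng.normal(0.0, 1e-4, 500_000)   # divergences, 0.1 mrad rms
print(rms_emittance(x, xp))           # ~1e-7
```

In a multi-slit measurement the x samples come from the slit positions and the x' distribution from the beamlet profiles on the downstream screen; the moment formula is the same.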
Two deterministic models (US EPA’s Office of Pesticide Programs Residential Standard Operating Procedures (OPP Residential SOPs) and Draft Protocol for Measuring Children’s Non-Occupational Exposure to Pesticides by all Relevant Pathways (Draft Protocol)) and four probabilistic mo...
ImageJ-MATLAB: a bidirectional framework for scientific image analysis interoperability.
Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W
2017-02-15
ImageJ-MATLAB is a lightweight Java library facilitating bi-directional interoperability between MATLAB and ImageJ. By defining a standard for translation between matrix and image data structures, researchers are empowered to select the best tool for their image-analysis tasks. Freely available extension to ImageJ2 ( http://imagej.net/Downloads ). Installation and use instructions available at http://imagej.net/MATLAB_Scripting. Tested with ImageJ 2.0.0-rc-54, Java 1.8.0_66 and MATLAB R2015b. eliceiri@wisc.edu. Supplementary data are available at Bioinformatics online.
AutoMicromanager: A microscopy scripting toolkit for LABVIEW and other programming environments
NASA Astrophysics Data System (ADS)
Ashcroft, Brian Alan; Oosterkamp, Tjerk
2010-11-01
We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.
Robust planning of dynamic wireless charging infrastructure for battery electric buses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Zhaocai; Song, Ziqi
2017-10-01
Battery electric buses with zero tailpipe emissions have great potential in improving environmental sustainability and livability of urban areas. However, the problems of high cost and limited range associated with on-board batteries have substantially limited the popularity of battery electric buses. The technology of dynamic wireless power transfer (DWPT), which provides bus operators with the ability to charge buses while in motion, may be able to effectively alleviate the drawbacks of electric buses. In this paper, we address the problem of simultaneously selecting the optimal location of the DWPT facilities and designing the optimal battery sizes of electric buses for a DWPT electric bus system. The problem is first constructed as a deterministic model in which the uncertainty of energy consumption and travel time of electric buses is neglected. The methodology of robust optimization (RO) is then adopted to address the uncertainty of energy consumption and travel time. The affinely adjustable robust counterpart (AARC) of the deterministic model is developed, and its equivalent tractable mathematical programming is derived. Both the deterministic model and the robust model are demonstrated with a real-world bus system. The results of our study demonstrate that the proposed deterministic model can effectively determine the allocation of DWPT facilities and the battery sizes of electric buses for a DWPT electric bus system; and the robust model can further provide optimal designs that are robust against the uncertainty of energy consumption and travel time for electric buses.
NASA Astrophysics Data System (ADS)
Song, Yiliao; Qin, Shanshan; Qu, Jiansheng; Liu, Feng
2015-10-01
The issue of air quality regarding PM pollution levels in China is a focus of public attention. To address that issue, a series of studies is in progress, including PM monitoring programs, PM source apportionment, and the enactment of new ambient air quality index standards. However, related research concerning computer modeling for estimating future PM trends is rare, despite its significance for forecasting and early-warning systems. Therefore, a study of deterministic and interval forecasts of PM was performed. In this study, data on hourly and 12 h-averaged air pollutants are applied to forecast PM concentrations within the Yangtze River Delta (YRD) region of China. The characteristics of PM emissions are first examined and analyzed using different distribution functions. To improve the distribution fitting that is crucial for estimating PM levels, an artificial intelligence algorithm is incorporated to select the optimal parameters. Following that step, an ANF model is used to conduct deterministic forecasts of PM. With the identified distributions and deterministic forecasts, different levels of PM intervals are estimated. The results indicate that the lognormal or gamma distributions are highly representative of the recorded PM data, with a goodness-of-fit R2 of approximately 0.998. Furthermore, the evaluation metrics (MSE, MAPE and CP, AW) also show high accuracy for the deterministic and interval forecasts of PM, indicating that this method enables informative and effective quantification of future PM trends.
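The distribution-fitting step described above can be sketched by fitting a lognormal to PM-like concentration data and deriving a central interval from it; the data and parameters below are synthetic, not the study's:

```python
import numpy as np
from scipy import stats

# Fit a lognormal to synthetic PM-like concentrations, then form a
# central 90% interval and check its empirical coverage.
rng = np.random.default_rng(4)
pm = rng.lognormal(mean=4.0, sigma=0.5, size=5000)   # "observed" PM, ug/m3

shape, loc, scale = stats.lognorm.fit(pm, floc=0)    # shape ~ sigma of logs
lo, hi = stats.lognorm.ppf([0.05, 0.95], shape, loc=loc, scale=scale)
coverage = np.mean((pm >= lo) & (pm <= hi))
print(round(shape, 2), round(coverage, 2))           # ~0.5, ~0.9
```

In the study's scheme an interval like [lo, hi] would be centered on the deterministic forecast rather than on the historical data, but the distributional machinery is the same.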
Analysis of whisker-toughened CMC structural components using an interactive reliability model
NASA Technical Reports Server (NTRS)
Duffy, Stephen F.; Palko, Joseph L.
1992-01-01
Realizing wider utilization of ceramic matrix composites (CMC) requires the development of advanced structural analysis technologies. This article focuses on the use of interactive reliability models to predict component probability of failure. The deterministic William-Warnke failure criterion serves as the theoretical basis for the reliability model presented here. The model has been implemented into a test-bed software program. This computer program has been coupled to a general-purpose finite element program. A simple structural problem is presented to illustrate the reliability model and the computer algorithm.
NASA Astrophysics Data System (ADS)
Zhang, Hanqing; Stangner, Tim; Wiklund, Krister; Rodriguez, Alvaro; Andersson, Magnus
2017-10-01
We present a versatile and fast MATLAB program (UmUTracker) that automatically detects and tracks particles by analyzing video sequences acquired by either light microscopy or digital in-line holographic microscopy. Our program detects the 2D lateral positions of particles with an algorithm based on the isosceles triangle transform, and reconstructs their 3D axial positions by a fast implementation of the Rayleigh-Sommerfeld model using a radial intensity profile. To validate the accuracy and performance of our program, we first track the 2D position of polystyrene particles using bright field and digital holographic microscopy. Second, we determine the 3D particle position by analyzing synthetic and experimentally acquired holograms. Finally, to highlight the full program features, we profile the microfluidic flow in a 100 μm high flow chamber. This result agrees with computational fluid dynamics simulations. On a regular desktop computer UmUTracker can detect, analyze, and track multiple particles at 5 frames per second for a template size of 201 × 201 in a 1024 × 1024 image. To enhance usability and to make it easy to implement new functions, we used object-oriented programming. UmUTracker is suitable for studies related to particle dynamics, cell localization, colloids and microfluidic flow measurement. Program Files doi: http://dx.doi.org/10.17632/fkprs4s6xp.1 Licensing provisions: Creative Commons by 4.0 (CC by 4.0) Programming language: MATLAB Nature of problem: 3D multi-particle tracking is a common technique in physics, chemistry and biology. However, in terms of accuracy, reliable particle tracking is a challenging task, since results depend on sample illumination, particle overlap, motion blur and noise from recording sensors. Additionally, computational performance is also an issue if, for example, a computationally expensive process is executed, such as axial particle position reconstruction from digital holographic microscopy data.
Versatile, robust tracking programs that handle these concerns and provide powerful post-processing options are significantly limited. Solution method: UmUTracker is a multi-functional tool to extract particle positions from long video sequences acquired with either light microscopy or digital holographic microscopy. The program provides an easy-to-use graphical user interface (GUI) for both tracking and post-processing that does not require any programming skills to analyze data from particle tracking experiments. UmUTracker first conducts automatic 2D particle detection, even under noisy conditions, using a novel circle detector based on the isosceles triangle sampling technique with a multi-scale strategy. To reduce the computational load for 3D tracking, it uses an efficient implementation of the Rayleigh-Sommerfeld light propagation model. To analyze and visualize the data, an efficient data analysis step, which can for example show 4D flow visualization using 3D trajectories, is included. Additionally, UmUTracker is easy to modify with user-customized modules due to the object-oriented programming style. Additional comments: Program obtainable from https://sourceforge.net/projects/umutracker/
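The radial intensity profile used above for axial reconstruction can be illustrated in isolation. The sketch below (Python, on a synthetic ring pattern standing in for a hologram) is a hypothetical re-implementation of the general idea, not UmUTracker's MATLAB code:

```python
import math

def radial_profile(img, cx, cy, nbins):
    """Average intensity as a function of distance from (cx, cy).
    img is a list of rows (lists of floats)."""
    sums = [0.0] * nbins
    counts = [0] * nbins
    h, w = len(img), len(img[0])
    # Largest possible distance from the centre to any pixel.
    rmax = math.hypot(max(cx, w - 1 - cx), max(cy, h - 1 - cy))
    for y in range(h):
        for x in range(w):
            r = math.hypot(x - cx, y - cy)
            b = min(int(r / rmax * nbins), nbins - 1)
            sums[b] += img[y][x]
            counts[b] += 1
    return [s / c if c else 0.0 for s, c in zip(sums, counts)]

# Synthetic concentric-ring pattern (a crude stand-in for hologram fringes).
size, cx, cy = 64, 32, 32
img = [[math.cos(0.5 * math.hypot(x - cx, y - cy)) ** 2 for x in range(size)]
       for y in range(size)]
profile = radial_profile(img, cx, cy, 16)
print([round(p, 2) for p in profile])
```

In the actual program such a profile, compared against Rayleigh-Sommerfeld back-propagated intensities, is what pins down the particle's axial position.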
Likelihood Ratio Test Polarimetric SAR Ship Detection Application
2005-12-01
menu. Under the Matlab menu, the user can export an area of an image to the MatlabTM MAT file format, as well as call RGB image and Pauli...must specify various parameters such as the area of the image to analyze. Export Image Area to MatlabTM (PolGASP & COASP) Generates a MatlabTM file...represented by the Minister of National Defence, 2005 (© Sa majesté la reine, représentée par le ministre de la Défense nationale, 2005) Abstract This
NASA Astrophysics Data System (ADS)
Macian-Sorribes, Hector; Pulido-Velazquez, Manuel; Tilmant, Amaury
2015-04-01
Stochastic programming methods are better suited to deal with the inherent uncertainty of inflow time series in water resource management. However, one of the most important hurdles in their use in practical implementations is the lack of generalized Decision Support System (DSS) shells, usually based on a deterministic approach. The purpose of this contribution is to present a general-purpose DSS shell, named Explicit Stochastic Programming Advanced Tool (ESPAT), able to build and solve stochastic programming problems for most water resource systems. It implements a hydro-economic approach, optimizing the total system benefits as the sum of the benefits obtained by each user. It has been coded using GAMS, and implements a Microsoft Excel interface with a GAMS-Excel link that allows the user to introduce the required data and recover the results. Therefore, no GAMS skills are required to run the program. The tool is divided into four modules according to its capabilities: 1) the ESPATR module, which performs stochastic optimization procedures in surface water systems using a Stochastic Dual Dynamic Programming (SDDP) approach; 2) the ESPAT_RA module, which optimizes coupled surface-groundwater systems using a modified SDDP approach; 3) the ESPAT_SDP module, capable of performing stochastic optimization procedures in small-size surface systems using a standard SDP approach; and 4) the ESPAT_DET module, which implements a deterministic programming procedure using non-linear programming, able to solve deterministic optimization problems in complex surface-groundwater river basins. The case study of the Mijares river basin (Spain) is used to illustrate the method. It consists of two reservoirs in series, one aquifer and four agricultural demand sites currently managed using historical (XIV century) rights, which give priority to the most traditional irrigation district over the XX century agricultural developments.
Its size makes it possible to use either the SDP or the SDDP methods. The independent use of surface and groundwater can be examined with and without the aquifer. The ESPAT_DET, ESPATR and ESPAT_SDP modules were executed for the surface system, while the ESPAT_RA and the ESPAT_DET modules were run for the surface-groundwater system. The surface system's results show a similar performance between the ESPAT_SDP and ESPATR modules, which outperforms that of the current policies while being outperformed by the ESPAT_DET results, which have the advantage of perfect foresight. The surface-groundwater system's results show a robust situation in which the differences between the modules' results and the current policies are smaller, due to the use of pumped groundwater in the XX century crops when surface water is scarce. The results are realistic, with the deterministic optimization outperforming the stochastic one, which in turn outperforms the current policies, showing that the tool is able to stochastically optimize river-aquifer water resource systems. We are currently working on the application of these tools to the analysis of changes in systems' operation under global change conditions. ACKNOWLEDGEMENT: This study has been partially supported by the IMPADAPT project (CGL2013-48424-C2-1-R) with Spanish MINECO (Ministerio de Economía y Competitividad) funds.
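The standard SDP approach used by the ESPAT_SDP module amounts to backward induction over a discretized storage grid with probabilistic inflows. A toy single-reservoir version is sketched below (Python; the benefit function, inflow scenarios and grid are made up for illustration, not the Mijares data, and ESPAT itself is GAMS-based):

```python
import math

S = range(0, 11)                 # discretized storage levels (hypothetical units)
inflows = [(2, 0.5), (5, 0.5)]   # (inflow, probability) scenarios per stage
T = 12                           # number of stages (e.g. months)
cap = 10                         # reservoir capacity

def benefit(release):
    return math.sqrt(release)    # concave benefit of delivered water (assumed)

V = {s: 0.0 for s in S}          # terminal value function
for t in range(T):               # backward induction over stages
    Vn = {}
    for s in S:
        best = -1.0
        for r in range(0, s + 1):             # feasible releases from storage s
            exp_val = 0.0
            for q, p in inflows:
                s2 = min(s - r + q, cap)      # storage balance, spill at capacity
                exp_val += p * (benefit(r) + V[s2])
            best = max(best, exp_val)
        Vn[s] = best
    V = Vn
print({s: round(v, 2) for s, v in V.items()})
```

The resulting value function is non-decreasing in storage, and the maximizing release at each state gives the operating policy; SDDP replaces the exhaustive state enumeration with sampled cuts so that larger systems stay tractable.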
Simulation Concept - How to Exploit Tools for Computing Hybrids
2010-06-01
Excerpts from the list of figures: ...biomolecular reactions; Figure 30: Overview of MATLAB Implementation; ...Figure 50: Adenine graphed using MATLAB (left) and OpenGL (right); Figure 51: An overhead view of a thymine and adenine base; ...Figure 68: Response frequency solution from MATLAB
Vessel-Mounted ADCP Data Calibration and Correction
NASA Astrophysics Data System (ADS)
de Andrade, A. F.; Barreira, L. M.; Violante-Carvalho, N.
2013-05-01
A set of scripts for vessel-mounted ADCP (Acoustic Doppler Current Profiler) data processing is presented. The need for corrections in the data measured by a ship-mounted ADCP, together with the complexities found during installation, implementation and identification of tasks performed by currently available processing systems, constitutes the main motivation for developing a system that is more practical to use, open-source, and more manageable for the user. The proposed processing system consists of a set of scripts developed in the MatlabTM programming language. The system is able to read the binary files provided by the data acquisition program VMDAS (Vessel Mounted Data Acquisition System), proprietary to Teledyne RDInstruments, calculate calibration factors, correct the data and visualize them after correction. To use the new system, it is only necessary that the ADCP data collected with the VMDAS program be placed in a processing directory and that MatlabTM be installed on the user's computer. The developed algorithms were extensively tested with ADCP data obtained during the Oceano Sul III (Southern Ocean III - OSIII) cruise, conducted by the Brazilian Navy aboard the R/V "Antares" from March 26th to May 10th 2007, in the oceanic region between the states of São Paulo and Rio Grande do Sul. To read the data, the function rdradcp.m, developed by Rich Pawlowicz and available on his website (http://www.eos.ubc.ca/~rich/#RDADCP), was used. To calculate the calibration factors, the alignment error (α) and sensitivity error (β) in Water Tracking and Bottom Tracking modes, equations deduced by Joyce (1998), Pollard & Read (1989) and Trump & Marmorino (1996) were implemented in Matlab. To validate the calibration factors obtained with the processing system developed, the parameters were compared with the factors provided by CODAS (Common Ocean Data Access System, available at http://currents.soest.hawaii.edu/docs/doc/index.html), a post-processing program.
For the same data, the factors provided by both systems were similar. The values obtained were then used to correct the data, and the corrected matrices were saved so that they can be plotted. The volume transport of the Brazil Current (BC) was calculated using the data corrected by both systems; the values proved quite close, confirming the quality of the correction performed by the system.
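Once the alignment error α and sensitivity error β have been estimated, applying them to a velocity sample is a rotation and a rescaling. The sketch below (Python, complex-number form) shows a common shape of this correction from the ADCP calibration literature; it is an illustration, not the authors' Matlab code, and the sign conventions of α vary between references:

```python
import cmath
import math

def correct_adcp(u, v, alpha_deg, beta):
    """Apply alignment (alpha, degrees) and sensitivity (beta) corrections
    to one horizontal velocity sample (u, v): rotate by alpha and scale
    by (1 + beta). Hedged sketch of the rotation-and-scale form."""
    z = complex(u, v)
    zc = (1.0 + beta) * z * cmath.exp(1j * math.radians(alpha_deg))
    return zc.real, zc.imag

# A 2-degree misalignment and a 1% sensitivity error applied to (1.0, 0.0) m/s:
u, v = correct_adcp(1.0, 0.0, 2.0, 0.01)
print(round(u, 4), round(v, 4))
```

In practice α and β are fitted by comparing bottom-track (or water-track) velocities against a reference, and the same pair is then applied to the whole cruise's data.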
A Survey of Quantum Programming Languages: History, Methods, and Tools
2008-01-01
and entanglement, to achieve computational solutions to certain problems in less time (fewer computational cycles) than is possible using classical...superposition of quantum bits, entanglement, destructive measurement, and the no-cloning theorem. These differences must be thoroughly understood and even...computers using well-known languages such as C, C++, Java, and rapid prototyping languages such as Maple, Mathematica, and Matlab. A good on-line
Using the Parallel Computing Toolbox with MATLAB on the Peregrine System
parallel pool took %g seconds.\\n', toc)
% "single program multiple data"
spmd
    fprintf('Worker %d says Hello World!\\n', labindex)
end
delete(gcp); % close the parallel pool
exit

To run the script on a compute node, create the file helloWorld.sub:

#!/bin/bash
#PBS -l walltime=05:00
#PBS -l nodes=1
#PBS -N
Numerical study of fluid motion in bioreactor with two mixers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheleva, I., E-mail: izheleva@uni-ruse.bg; Lecheva, A., E-mail: alecheva@uni-ruse.bg
2015-10-28
A numerical study of the laminar hydrodynamic behavior of a viscous fluid in a bioreactor with multiple mixers is provided in the present paper. The reactor is equipped with two disk impellers. The fluid motion is studied in the stream function-vorticity formulation. The calculations are made by a computer program written in MATLAB. The fluid structure is described, and the numerical results are graphically presented and commented on.
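A central ingredient of the stream function-vorticity formulation mentioned above is solving the Poisson equation ∇²ψ = -ω for the stream function at each step. A minimal iterative solve is sketched below (Python, Gauss-Seidel on a small uniform grid with ψ = 0 on the boundary; an illustrative fragment, not the authors' MATLAB program):

```python
# Solve  del^2 psi = -omega  on the unit square with psi = 0 on the boundary,
# using the standard 5-point stencil and Gauss-Seidel sweeps.
N = 20
h = 1.0 / (N - 1)
omega = [[1.0] * N for _ in range(N)]   # uniform vorticity source (toy case)
psi = [[0.0] * N for _ in range(N)]     # boundary rows/columns stay at zero

for sweep in range(2000):
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            psi[i][j] = 0.25 * (psi[i + 1][j] + psi[i - 1][j]
                                + psi[i][j + 1] + psi[i][j - 1]
                                + h * h * omega[i][j])

print(f"psi near the centre = {psi[N // 2][N // 2]:.5f}")
```

For this toy case the value near the centre approaches the known continuum value (about 0.0737 for -∇²ψ = 1 on the unit square); in a full stream function-vorticity code this solve alternates with a vorticity transport update.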
Modeling of Habitat and Foraging Behavior of Beaked Whales in the Southern California Bight
2014-09-30
preference. APPROACH High-Frequency Acoustic Recording Packages (HARPs, Wiggins & Hildebrand 2007) have collected acoustic data at 17 sites...signal processing for HARP data is performed using the MATLAB (Mathworks, Natick, MA) based custom program Triton (Wiggins & Hildebrand 2007) and...HARP data are stored with the remainder of metadata (e.g. project name, instrument location, detection settings, detection effort) in the database
NASA Astrophysics Data System (ADS)
Telang, Aparna S.; Bedekar, P. P.
2017-09-01
Load flow analysis is the initial and essential step for any power system computation. It is required for choosing better options for power system expansion to meet ever-increasing load demand. Implementation of a Flexible AC Transmission System (FACTS) device like the STATCOM, which has fast and very flexible control, in the load flow is one of the important tasks for power system researchers. This paper presents a simple and systematic approach for steady-state power flow calculations with the FACTS controller static synchronous compensator (STATCOM) using command-line usage of the MATLAB tool Power System Analysis Toolbox (PSAT). The complexity of MATLAB language programming increases due to the incorporation of a STATCOM in an existing Newton-Raphson load flow algorithm. Thus, the main contribution of this paper is to show how command-line usage of the user-friendly MATLAB tool PSAT can extensively be used for quicker and wider interpretation of the results of load flow with a STATCOM. The novelty of this paper lies in the method of applying the load-increase pattern, where the active and reactive loads have been changed simultaneously at all the load buses under consideration to create stressed conditions for load flow analysis with the STATCOM. The performance has been evaluated on many standard IEEE test systems, and the results for the standard IEEE 30-bus, IEEE 57-bus, and IEEE 118-bus systems are presented.
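The Newton-Raphson load flow that PSAT extends reduces, in its simplest form, to a mismatch-and-update loop. The two-bus sketch below (Python, numeric Jacobian, no STATCOM model; the line impedance and load are made-up values) shows that core loop:

```python
import cmath

# Two-bus Newton-Raphson load flow: slack bus 1 feeds a PQ load at bus 2.
y = 1.0 / complex(0.02, 0.1)     # series admittance of the line (p.u., assumed)
V1 = complex(1.0, 0.0)           # slack-bus voltage
P_load, Q_load = 0.8, 0.3        # PQ-bus demand (p.u., assumed)

def mismatch(theta, vm):
    """Active/reactive power mismatch at bus 2 for voltage vm*exp(j*theta)."""
    V2 = cmath.rect(vm, theta)
    S2 = V2 * (y * (V2 - V1)).conjugate()   # power injected into the network
    return S2.real + P_load, S2.imag + Q_load

theta, vm = 0.0, 1.0             # flat start
for _ in range(20):
    f1, f2 = mismatch(theta, vm)
    if abs(f1) < 1e-10 and abs(f2) < 1e-10:
        break
    h = 1e-6                     # finite-difference Jacobian (2 x 2)
    j11 = (mismatch(theta + h, vm)[0] - f1) / h
    j12 = (mismatch(theta, vm + h)[0] - f1) / h
    j21 = (mismatch(theta + h, vm)[1] - f2) / h
    j22 = (mismatch(theta, vm + h)[1] - f2) / h
    det = j11 * j22 - j12 * j21
    theta -= (j22 * f1 - j12 * f2) / det    # x <- x - J^{ -1 } f
    vm -= (-j21 * f1 + j11 * f2) / det
print(f"theta={theta:.4f} rad, |V2|={vm:.4f} p.u.")
```

The load draws the bus-2 voltage slightly below 1 p.u. with a small negative angle; a STATCOM model would add a controlled reactive injection (and its own equation) at that bus.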
Frith, Amy L; Naved, Ruchira T; Persson, Lars Ake; Rasmussen, Kathleen M; Frongillo, Edward A
2012-06-01
Food insecurity is detrimental to child development, yet little is known about the combined influence of food insecurity and nutritional interventions on child development in low-income countries. We proposed that women assigned to an early invitation time to start a prenatal food supplementation program could reduce the negative influence of food insecurity on maternal-infant interaction. A cohort of 180 mother-infant dyads were studied (born between May and October 2003) from among 3267 in the randomized controlled trial Maternal Infant Nutritional Interventions Matlab, which was conducted in Matlab, Bangladesh. At 8 wk gestation, women were randomly assigned an invitation time to start receiving food supplements (2.5 MJ/d; 6 d/wk) either early (~9 wk gestation; early-invitation group) or at the usual start time (~20 wk gestation; usual-invitation group) for the government program. Maternal-infant interaction was observed in homes with the use of the Nursing Child Assessment Satellite Training Feeding Scale, and food-insecurity status was obtained from questionnaires completed when infants were 3.4-4.0 mo old. By using a general linear model for maternal-infant interaction, we found a significant interaction (P = 0.012) between invitation time to start a prenatal food supplementation program and food insecurity. Those in the usual-invitation group with higher food insecurity scores (i.e., more food insecure) had a lower quality of maternal-infant interaction, but this relationship was ameliorated among those in the early-invitation group. Food insecurity limits the ability of mothers and infants to interact well, but an early invitation time to start a prenatal food supplementation program can support mother-infant interaction among those who are food insecure.
libRoadRunner: a high performance SBML simulation and analysis library
Somogyi, Endre T.; Bouteiller, Jean-Marie; Glazier, James A.; König, Matthias; Medley, J. Kyle; Swat, Maciej H.; Sauro, Herbert M.
2015-01-01
Motivation: This article presents libRoadRunner, an extensible, high-performance, cross-platform, open-source software library for the simulation and analysis of models expressed using Systems Biology Markup Language (SBML). SBML is the most widely used standard for representing dynamic networks, especially biochemical networks. libRoadRunner is fast enough to support large-scale problems such as tissue models, studies that require large numbers of repeated runs and interactive simulations. Results: libRoadRunner is a self-contained library, able to run both as a component inside other tools via its C++ and C bindings, and interactively through its Python interface. Its Python Application Programming Interface (API) is similar to the APIs of MATLAB (www.mathworks.com) and SciPy (http://www.scipy.org/), making it fast and easy to learn. libRoadRunner uses a custom Just-In-Time (JIT) compiler built on the widely used LLVM JIT compiler framework. It compiles SBML-specified models directly into native machine code for a variety of processors, making it appropriate for solving extremely large models or repeated runs. libRoadRunner is flexible, supporting the bulk of the SBML specification (except for delay and non-linear algebraic equations) including several SBML extensions (composition and distributions). It offers multiple deterministic and stochastic integrators, as well as tools for steady-state analysis, stability analysis and structural analysis of the stoichiometric matrix. Availability and implementation: libRoadRunner binary distributions are available for Mac OS X, Linux and Windows. The library is licensed under Apache License Version 2.0. libRoadRunner is also available for ARM-based computers such as the Raspberry Pi. http://www.libroadrunner.org provides online documentation, full build instructions, binaries and a git source repository. 
Contacts: hsauro@u.washington.edu or somogyie@indiana.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26085503
Computer program for analysis of hemodynamic response to head-up tilt test
NASA Astrophysics Data System (ADS)
Świątek, Eliza; Cybulski, Gerard; Koźluk, Edward; Piątkowska, Agnieszka; Niewiadomski, Wiktor
2014-11-01
The aim of this work was to create a computer program, written in the MATLAB environment, which enables the visualization and analysis of hemodynamic parameters recorded during a passive tilt test using the CNS Task Force Monitor System. The application was created to help in the assessment of the relationship between the values and dynamics of changes of the selected parameters and the risk of orthostatic syncope. The signal analysis included: R-R intervals (RRI), heart rate (HR), systolic blood pressure (sBP), diastolic blood pressure (dBP), mean blood pressure (mBP), stroke volume (SV), stroke index (SI), cardiac output (CO), cardiac index (CI), total peripheral resistance (TPR), total peripheral resistance index (TPRI), left ventricular ejection time (LVET) and thoracic fluid content (TFC). The program enables the user to visualize waveforms for a selected parameter and to perform smoothing with selected moving-average parameters. It allows one to construct the graph of means for any range, and the Poincaré plot for a selected time range. The program automatically determines the average value of the parameter before tilt, its minimum and maximum values immediately after the change of position and the times of their occurrence. It is possible to correct the automatically detected points manually. For the RR interval, it determines the acceleration index (AI) and the brake index (BI). It is possible to save the calculated values to an XLS file with a name specified by the user. The application has a user-friendly graphical interface and can run on a computer that has no MATLAB software.
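The Poincaré plot mentioned above is usually summarized by the SD1/SD2 descriptors of short- and long-term RR variability. A minimal computation is sketched below (Python, on a synthetic RR series; illustrative only, not the MATLAB application's code):

```python
import math
import random

def poincare_sd(rr):
    """SD1/SD2 of the Poincare plot of successive RR intervals:
    standard deviations along the plot's minor and major axes."""
    x, y = rr[:-1], rr[1:]
    minor = [(b - a) / math.sqrt(2) for a, b in zip(x, y)]
    major = [(b + a) / math.sqrt(2) for a, b in zip(x, y)]
    def sd(v):
        m = sum(v) / len(v)
        return math.sqrt(sum((t - m) ** 2 for t in v) / (len(v) - 1))
    return sd(minor), sd(major)

random.seed(0)
rr = [800 + random.gauss(0, 30) for _ in range(300)]   # synthetic RR series (ms)
sd1, sd2 = poincare_sd(rr)
print(f"SD1={sd1:.1f} ms, SD2={sd2:.1f} ms")
```

For an uncorrelated synthetic series SD1 and SD2 come out similar; in real tilt-test recordings their ratio shifts with the autonomic response to the change of position.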
MICRO-U 70.1: Training Model of an Instructional Institution, Users Manual.
ERIC Educational Resources Information Center
Springer, Colby H.
MICRO-U is a student demand driven deterministic model. Student enrollment, by degree program, is used to develop an Instructional Work Load Matrix. Linear equations using Weekly Student Contact Hours (WSCH), Full Time Equivalent (FTE) students, FTE faculty, and number of disciplines determine library, central administration, and physical plant…
A MATLAB-Aided Method for Teaching Calculus-Based Business Mathematics
ERIC Educational Resources Information Center
Liang, Jiajuan; Pan, William S. Y.
2009-01-01
MATLAB is a powerful package for numerical computation. MATLAB contains a rich pool of mathematical functions and provides flexible plotting functions for illustrating mathematical solutions. The course of calculus-based business mathematics consists of two major topics: 1) derivative and its applications in business; and 2) integration and its…
Using Matlab in a Multivariable Calculus Course.
ERIC Educational Resources Information Center
Schlatter, Mark D.
The benefits of high-level mathematics packages such as Matlab include both a computer algebra system and the ability to provide students with concrete visual examples. This paper discusses how both capabilities of Matlab were used in a multivariate calculus class. Graphical user interfaces which display three-dimensional surfaces, contour plots,…
Sparse Matrices in MATLAB: Design and Implementation
NASA Technical Reports Server (NTRS)
Gilbert, John R.; Moler, Cleve; Schreiber, Robert
1992-01-01
The matrix computation language and environment MATLAB is extended to include sparse matrix storage and operations. The only change to the outward appearance of the MATLAB language is a pair of commands to create full or sparse matrices. Nearly all the operations of MATLAB now apply equally to full or sparse matrices, without any explicit action by the user. The sparse data structure represents a matrix in space proportional to the number of nonzero entries, and most of the operations compute sparse results in time proportional to the number of arithmetic operations on nonzeros.
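The storage idea described above (space proportional to the number of nonzeros, operations proportional to arithmetic on nonzeros) can be illustrated with a toy dictionary-of-keys structure. This Python sketch is an illustration of the principle, not MATLAB's actual compressed-column implementation:

```python
# Toy dictionary-of-keys sparse matrix: storage is proportional to the
# number of nonzero entries, and matvec touches only the nonzeros.
class SparseDOK:
    def __init__(self, m, n):
        self.m, self.n = m, n
        self.data = {}                       # (i, j) -> nonzero value

    def __setitem__(self, ij, v):
        if v:
            self.data[ij] = v
        else:
            self.data.pop(ij, None)          # assigning zero frees the entry

    def matvec(self, x):
        y = [0.0] * self.m
        for (i, j), v in self.data.items():  # O(nnz), not O(m*n)
            y[i] += v * x[j]
        return y

A = SparseDOK(3, 3)
A[0, 0] = 2.0
A[1, 2] = -1.0
A[2, 1] = 4.0
print(A.matvec([1.0, 2.0, 3.0]))
```

MATLAB's design keeps the same outward syntax for full and sparse matrices; the dictionary here stands in for its compressed-column arrays purely for readability.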
Mathematical Formulation used by MATLAB Code to Convert FTIR Interferograms to Calibrated Spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Armstrong, Derek Elswick
This report discusses the mathematical procedures used to convert raw interferograms from Fourier transform infrared (FTIR) sensors to calibrated spectra. The work discussed in this report was completed as part of the Helios project at Los Alamos National Laboratory. MATLAB code was developed to convert the raw interferograms to calibrated spectra. The report summarizes the developed MATLAB scripts and functions, along with a description of the mathematical methods used by the code. The first step in working with raw interferograms is to convert them to uncalibrated spectra by applying an apodization function to the raw data and then by performing a Fourier transform. The developed MATLAB code also addresses phase error correction by applying the Mertz method. This report provides documentation for the MATLAB scripts.
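The first step described above (apodize, then Fourier transform) can be sketched on a synthetic interferogram. The Python example below uses a triangular window and a direct DFT; Mertz phase correction and radiometric calibration, the later steps of the report's pipeline, are not shown:

```python
import cmath
import math

N = 256
# Synthetic two-line interferogram: two cosines of different strengths
# (a stand-in for real FTIR data, which is one-sided and phase-distorted).
igram = [math.cos(2 * math.pi * 20 * n / N)
         + 0.5 * math.cos(2 * math.pi * 55 * n / N) for n in range(N)]

# Triangular apodization window centred on the record.
apod = [1.0 - abs(n - N / 2) / (N / 2) for n in range(N)]
windowed = [a * s for a, s in zip(apod, igram)]

def dft_mag(x):
    """Magnitude of the first half of the DFT (direct O(N^2) form)."""
    n_pts = len(x)
    return [abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / n_pts)
                    for n in range(n_pts)))
            for k in range(n_pts // 2)]

spec = dft_mag(windowed)
peak = max(range(len(spec)), key=lambda k: spec[k])
print("strongest line at bin", peak)
```

The stronger cosine dominates the uncalibrated spectrum at its frequency bin; apodization trades a wider line for suppressed sidelobes, which is why the window choice matters before calibration.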
Zimmermann, Patric; Green, Robert J; Haverkort, Maurits W; de Groot, Frank M F
2018-05-01
Some initial instructions for the Quanty4RIXS program written in MATLAB® are provided. The program assists in the calculation of 1s2p RIXS and 1s2p RIXS-MCD spectra using Quanty. Furthermore, 1s XAS and 2p3d RIXS calculations in different symmetries can also be performed. It includes the Hartree-Fock values for the Slater integrals and spin-orbit interactions for several 3d transition metal ions that are required to create the .lua scripts containing all necessary parameters and quantum mechanical definitions for the calculations. The program can be used free of charge and is designed to allow for further adjustments of the scripts. Open access.
Deterministic chaotic dynamics of Raba River flow (Polish Carpathian Mountains)
NASA Astrophysics Data System (ADS)
Kędra, Mariola
2014-02-01
Is the underlying dynamics of river flow random or deterministic? If it is deterministic, is it deterministic chaotic? This issue is still controversial. The application of several independent methods, techniques and tools to daily river flow data gives consistent, reliable and clear-cut answers to the question. The outcomes indicate that the investigated discharge dynamics is not random but deterministic. Moreover, the results completely confirm the nonlinear deterministic chaotic nature of the studied process. The research was conducted on daily discharge data from two selected gauging stations on a mountain river in southern Poland, the Raba River.
High-Fidelity Real-Time Trajectory Optimization for Reusable Launch Vehicles
2006-12-01
Excerpts from the list of figures: Figure 6.20: Max DR Yawing Moment History; Figure 6.21: Snapshot from MATLAB "Profile...; ...Propagation using "ode45" (Euler Angles); Figure 6.114: Interpolated Elevon Controls using Various MATLAB Schemes; Figure 6.115: Interpolated Flap Controls using Various MATLAB Schemes; Figure 6.116: Interpolated...
A Matlab/Simulink-Based Interactive Module for Servo Systems Learning
ERIC Educational Resources Information Center
Aliane, N.
2010-01-01
This paper presents an interactive module for learning both the fundamental and practical issues of servo systems. This module, developed using Simulink in conjunction with the Matlab graphical user interface (Matlab-GUI) tool, is used to supplement conventional lectures in control engineering and robotics subjects. First, the paper introduces the…
Robust Programming Problems Based on the Mean-Variance Model Including Uncertainty Factors
NASA Astrophysics Data System (ADS)
Hasuike, Takashi; Ishii, Hiroaki
2009-01-01
This paper considers robust programming problems based on the mean-variance model, including uncertainty sets and fuzzy factors. Since these problems are not well-defined due to the fuzzy factors, it is hard to solve them directly. Therefore, by introducing chance constraints, fuzzy goals and possibility measures, the proposed models are transformed into deterministic equivalent problems. Furthermore, in order to solve these equivalent problems efficiently, a solution method is constructed by introducing the mean-absolute deviation and performing equivalent transformations.
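The chance-constraint-to-deterministic-equivalent step mentioned above has a simple one-dimensional prototype: a Gaussian constraint P(ξx ≤ b) ≥ α becomes the deterministic inequality μx + z_α σx ≤ b. The sketch below (Python; generic numbers, not the paper's fuzzy mean-variance model) illustrates that transformation:

```python
from statistics import NormalDist

# Deterministic equivalent of P(xi * x <= b) >= alpha for Gaussian xi and
# x >= 0: replace the random coefficient by mu + z_alpha * sigma.
mu, sigma = 1.0, 0.2        # mean and std of the uncertain coefficient (assumed)
b, alpha = 1.5, 0.95
z = NormalDist().inv_cdf(alpha)           # 95% standard-normal quantile

def feasible(x):
    return (mu + z * sigma) * x <= b      # the deterministic equivalent

x_max = b / (mu + z * sigma)              # largest feasible x
print(round(x_max, 4))
```

The same idea, generalized with possibility measures in place of probabilities, is what turns the paper's fuzzy formulations into tractable deterministic programs.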
Image Analysis Using Quantum Entropy Scale Space and Diffusion Concepts
2009-11-01
images using a combination of analytic methods and prototype Matlab and Mathematica programs. We investigated concepts of generalized entropy and...Schmidt strength from quantum logic gate decomposition. This form of entropy gives a measure of the nonlocal content of an entangling logic gate...We recall that the Schmidt number is an indicator of entanglement, but not a measure of entanglement. For instance, let us compare
[Design of hand-held heart rate variability acquisition and analysis system].
Li, Kaiyuan; Wang, Buqing; Wang, Weidong
2012-07-01
A design of a handheld heart rate variability acquisition and analysis system is proposed. The system collects and stores the patient's ECG every five minutes through electrodes touched by both hands, and then uploads the data to a PC through a USB port. The system uses software written in LabVIEW to analyze heart rate variability parameters; the parameter-calculation functions are programmed in Matlab and generated as components.
ISMRM Raw data format: A proposed standard for MRI raw datasets.
Inati, Souheil J; Naegele, Joseph D; Zwart, Nicholas R; Roopchansingh, Vinai; Lizak, Martin J; Hansen, David C; Liu, Chia-Ying; Atkinson, David; Kellman, Peter; Kozerke, Sebastian; Xue, Hui; Campbell-Washburn, Adrienne E; Sørensen, Thomas S; Hansen, Michael S
2017-01-01
This work proposes the ISMRM Raw Data format as a common MR raw data format, which promotes algorithm and data sharing. A file format consisting of a flexible header and tagged frames of k-space data was designed. Application Programming Interfaces were implemented in C/C++, MATLAB, and Python. Converters for Bruker, General Electric, Philips, and Siemens proprietary file formats were implemented in C++. Raw data were collected using magnetic resonance imaging scanners from four vendors, converted to ISMRM Raw Data format, and reconstructed using software implemented in three programming languages (C++, MATLAB, Python). Images were obtained by reconstructing the raw data from all vendors. The source code, raw data, and images comprising this work are shared online, serving as an example of an image reconstruction project following a paradigm of reproducible research. The proposed raw data format solves a practical problem for the magnetic resonance imaging community. It may serve as a foundation for reproducible research and collaborations. The ISMRM Raw Data format is a completely open and community-driven format, and the scientific community is invited (including commercial vendors) to participate either as users or developers. Magn Reson Med 77:411-421, 2017. © 2016 Wiley Periodicals, Inc.
Monitoring and Acquisition Real-time System (MARS)
NASA Technical Reports Server (NTRS)
Holland, Corbin
2013-01-01
MARS is a graphical user interface (GUI) written in MATLAB and Java, allowing the user to configure and control the Scalable Parallel Architecture for Real-Time Acquisition and Analysis (SPARTAA) data acquisition system. SPARTAA not only acquires data, but also allows for complex algorithms to be applied to the acquired data in real time. The MARS client allows the user to set up and configure all settings regarding the data channels attached to the system, as well as have complete control over starting and stopping data acquisition. It provides a unique "Test" programming environment, allowing the user to create tests consisting of a series of alarms, each of which contains any number of data channels. Each alarm is configured with a particular algorithm, determining the type of processing that will be applied on each data channel and tested against a defined threshold. Tests can be uploaded to SPARTAA, thereby teaching it how to process the data. The uniqueness of MARS is in its capability to be adaptable easily to many test configurations. MARS sends and receives protocols via TCP/IP, which allows for quick integration into almost any test environment. The use of MATLAB and Java as the programming languages allows for developers to integrate the software across multiple operating platforms.
Turner, Travis H
2005-03-30
An increasingly large corpus of clinical and experimental neuropsychological research has demonstrated the utility of measuring visual contrast sensitivity. Unfortunately, existing means of measuring contrast sensitivity can be prohibitively expensive, difficult to standardize, or lacking in reliability. Additionally, most existing tests do not allow full control over important characteristics, such as off-angle rotations, waveform, contrast, and spatial frequency. Ideally, researchers could manipulate characteristics and display stimuli in a computerized task designed to meet experimental needs. Thus far, the 256-level (8-bit) color limitation of standard cathode ray tube (CRT) monitors has been preclusive. To this end, the pointillism method (PM) was developed. Using MATLAB software, stimuli are created based on both mathematical and stochastic components, such that differences in regional luminance values of the gradient field closely approximate the desired contrast. This paper describes the method and examines its performance on sine- and square-wave image sets across a range of contrast values. Results suggest the utility of the method for most experimental applications. Weaknesses in the current version, the need for validation and reliability studies, and considerations regarding applications are discussed. Syntax for the program is provided in an appendix, and a version of the program independent of MATLAB is available from the author.
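The core stochastic idea, setting each pixel to one of two luminance levels with a probability that tracks the desired grating, amounts to probabilistic dithering, which sidesteps the limited number of displayable gray levels. A minimal sketch of that idea in Python/NumPy (the PM itself is MATLAB; function and parameter names here are illustrative, not the author's code):

```python
import numpy as np

def pointillist_grating(width, height, cycles, contrast, rng=None):
    """Binary-dithered sine-wave grating: each pixel is black or white,
    with P(white) tracking the desired local luminance, so that regional
    averages approximate the target contrast."""
    rng = np.random.default_rng(rng)
    x = np.linspace(0, 2 * np.pi * cycles, width)
    # Desired luminance in [0, 1]: mean 0.5, modulation depth = contrast.
    target = 0.5 * (1.0 + contrast * np.sin(x))
    target = np.tile(target, (height, 1))
    return (rng.random((height, width)) < target).astype(float)

img = pointillist_grating(256, 256, cycles=4, contrast=0.8, rng=0)
# Column means should track the sinusoidal target luminance.
col_means = img.mean(axis=0)
```

Averaging down each column recovers the intended sinusoid, which is the sense in which "differences in regional luminance values closely approximate the desired contrast".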
Acoustics Research of Propulsion Systems
NASA Technical Reports Server (NTRS)
Gao, Ximing; Houston, Janice
2014-01-01
The liftoff phase induces high acoustic loading over a broad frequency range for a launch vehicle. These external acoustic environments are used in the prediction of the internal vibration responses of the vehicle and components. Present liftoff vehicle acoustic environment prediction methods utilize stationary data from previously conducted hold-down tests to generate 1/3 octave band Sound Pressure Level (SPL) spectra. In an effort to update the accuracy and quality of liftoff acoustic loading predictions, non-stationary flight data from the Ares I-X were processed in PC-Signal in two flight phases: simulated hold-down and liftoff. In conjunction, the Prediction of Acoustic Vehicle Environments (PAVE) program was developed in MATLAB to allow for efficient predictions of sound pressure levels (SPLs) as a function of station number along the vehicle using semi-empirical methods. This consisted of generating the Dimensionless Spectrum Function (DSF) and Dimensionless Source Location (DSL) curves from the Ares I-X flight data. These are then used in the MATLAB program to generate the 1/3 octave band SPL spectra. Concluding results show major differences in SPLs between the hold-down test data and the processed Ares I-X flight data making the Ares I-X flight data more practical for future vehicle acoustic environment predictions.
Increasing the computational efficiency of digital cross correlation by a vectorization method
NASA Astrophysics Data System (ADS)
Chang, Ching-Yuan; Ma, Chien-Ching
2017-08-01
This study presents a vectorization method for use in MATLAB programming aimed at increasing the computational efficiency of digital cross correlation in sound and images, resulting in speedups of 6.387 and 36.044 times, respectively, compared with looped implementations. This work bridges the gap between matrix operations and loop iteration, preserving flexibility and efficiency in program testing. This paper uses numerical simulation to verify the speedup of the proposed vectorization method, as well as experiments to measure the quantitative transient displacement response subjected to dynamic impact loading. The experiment involved the use of a high-speed camera as well as a fiber optic system to measure the transient displacement of a cantilever beam under impact from a steel ball. Experimental measurement data obtained from the two methods are in excellent agreement in both the time and frequency domains, with discrepancies of only 0.68%. Numerical and experimental results demonstrate the efficacy of the proposed vectorization method with regard to computational speed in signal processing and high precision in the correlation algorithm. We also present the source code with which to build MATLAB-executable functions on Windows as well as Linux platforms, and provide a series of examples to demonstrate the application of the proposed vectorization method.
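The principle behind the reported speedup is replacing an explicit lag loop with a single vectorized library call that computes the same cross-correlation. A Python/NumPy analogue of the two styles (illustrative only, not the authors' MATLAB code):

```python
import numpy as np

def xcorr_loop(a, b):
    """Full cross-correlation via an explicit lag loop (slow reference).
    out[k] corresponds to lag = k - (n - 1)."""
    n = len(a)
    out = np.zeros(2 * n - 1)
    for k, lag in enumerate(range(-(n - 1), n)):
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                out[k] += a[j] * b[i]
    return out

def xcorr_vec(a, b):
    """Same result via a single vectorized call."""
    return np.correlate(a, b, mode="full")

rng = np.random.default_rng(1)
a, b = rng.standard_normal(64), rng.standard_normal(64)
```

Both routines return identical results; the vectorized form dispatches the whole computation to compiled library code, which is the source of the order-of-magnitude gains the paper reports.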
SPSS and SAS programs for generalizability theory analyses.
Mushquash, Christopher; O'Connor, Brian P
2006-08-01
The identification and reduction of measurement errors is a major challenge in psychological testing. Most investigators rely solely on classical test theory for assessing reliability, whereas most experts have long recommended using generalizability theory instead. One reason for the common neglect of generalizability theory is the absence of analytic facilities for this purpose in popular statistical software packages. This article provides a brief introduction to generalizability theory, describes easy to use SPSS, SAS, and MATLAB programs for conducting the recommended analyses, and provides an illustrative example, using data (N = 329) for the Rosenberg Self-Esteem Scale. Program output includes variance components, relative and absolute errors and generalizability coefficients, coefficients for D studies, and graphs of D study results.
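The quantities listed (variance components and generalizability coefficients) are straightforward to compute for the simplest one-facet crossed design. A hedged Python/NumPy sketch for a persons-by-items design (the article's programs are SPSS/SAS/MATLAB; this is an illustrative analogue, not their code):

```python
import numpy as np

def g_study(X):
    """One-facet crossed p x i G study: X[p, i] = person p's score on item i.
    Estimates variance components from ANOVA mean squares and returns the
    relative G coefficient E(rho^2) = var_p / (var_p + var_res / n_i)."""
    n_p, n_i = X.shape
    grand = X.mean()
    pm = X.mean(axis=1)                      # person means
    im = X.mean(axis=0)                      # item means
    ms_p = n_i * np.sum((pm - grand) ** 2) / (n_p - 1)
    ms_i = n_p * np.sum((im - grand) ** 2) / (n_i - 1)
    resid = X - pm[:, None] - im[None, :] + grand
    ms_res = np.sum(resid ** 2) / ((n_p - 1) * (n_i - 1))
    var_res = ms_res
    var_p = max((ms_p - ms_res) / n_i, 0.0)  # negative estimates set to 0
    var_i = max((ms_i - ms_res) / n_p, 0.0)
    g_rel = var_p / (var_p + var_res / n_i)
    return var_p, var_i, var_res, g_rel

# Synthetic data with known person (true-score) variance ~1 and error var ~0.49.
rng = np.random.default_rng(0)
n_p, n_i = 200, 10
person = rng.normal(0, 1.0, size=(n_p, 1))
item = rng.normal(0, 0.5, size=(1, n_i))
X = person + item + rng.normal(0, 0.7, size=(n_p, n_i))
var_p, var_i, var_res, g_rel = g_study(X)
```

The same mean-square bookkeeping underlies the D-study output the article describes: re-evaluating `g_rel` with a hypothetical number of items simply changes `n_i` in the denominator.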
COSMIC monthly progress report
NASA Technical Reports Server (NTRS)
1994-01-01
Activities of the Computer Software Management and Information Center (COSMIC) are summarized for the month of April 1994. Tables showing the current inventory of programs available from COSMIC are presented and program processing and evaluation activities are summarized. Five articles were prepared for publication in the NASA Tech Brief Journal. These articles (included in this report) describe the following software items: GAP 1.0 - Groove Analysis Program, Version 1.0; SUBTRANS - Subband/Transform MATLAB Functions for Image Processing; CSDM - COLD-SAT Dynamic Model; CASRE - Computer Aided Software Reliability Estimation; and XOPPS - OEL Project Planner/Scheduler Tool. Activities in the areas of marketing, customer service, benefits identification, maintenance and support, and disseminations are also described along with a budget summary.
InterFace: A software package for face image warping, averaging, and principal components analysis.
Kramer, Robin S S; Jenkins, Rob; Burton, A Mike
2017-12-01
We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
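The PCA step can be illustrated compactly: vectorize each (shape-normalized) image, subtract the mean face, and take the SVD; the leading components span the "face space" and each face gets a coordinate vector in it. A hypothetical Python/NumPy sketch (InterFace itself is a MATLAB app; this is not its code):

```python
import numpy as np

def face_pca(images, n_components):
    """PCA of vectorized images: rows of `images` are flattened faces.
    Returns the mean face, the principal components, and each image's
    coordinates in the resulting 'face space'."""
    mean_face = images.mean(axis=0)
    centered = images - mean_face
    # Economy SVD: components are rows of Vt; coordinates are projections.
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    components = Vt[:n_components]
    coords = centered @ components.T
    return mean_face, components, coords

rng = np.random.default_rng(2)
faces = rng.random((20, 32 * 32))            # 20 stand-in 32x32 "faces"
mean_face, comps, coords = face_pca(faces, n_components=5)
recon = mean_face + coords @ comps           # 5-component reconstruction
```

Morphs and averages in such a space reduce to arithmetic on `coords`, which is why a PCA "face space" is a convenient substrate for the warping and averaging tools the package provides.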
Genetic algorithms using SISAL parallel programming language
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tejada, S.
1994-05-06
Genetic algorithms are a mathematical optimization technique developed by John Holland at the University of Michigan [1]. The SISAL programming language possesses many of the characteristics desired for implementing genetic algorithms. SISAL is a deterministic, functional programming language which is inherently parallel. Because SISAL is functional and based on mathematical concepts, genetic algorithms can be efficiently translated into the language. Several of the steps involved in genetic algorithms, such as mutation, crossover, and fitness evaluation, can be parallelized using SISAL. In this paper I discuss the implementation and performance of parallel genetic algorithms in SISAL.
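The steps named above (fitness evaluation, crossover, mutation) fit in a few lines; SISAL would express the per-individual work as parallel array operations, whereas this illustrative Python sketch just maps over the population sequentially:

```python
import random

def evolve(fitness, length=20, pop_size=40, generations=60, p_mut=0.02, seed=0):
    """Minimal generational GA over bit strings: tournament selection,
    one-point crossover, and per-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                              # tournament of two
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, length)       # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]  # mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve(fitness=sum)   # "one-max": maximize the number of 1 bits
```

The fitness evaluations inside each generation are independent, which is exactly the structure a deterministic parallel language like SISAL exploits.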
NASA Technical Reports Server (NTRS)
Rauw, Marc O.
1993-01-01
The design of advanced Automatic Aircraft Control Systems (AACS's) can be improved upon considerably if the designer can access all models and tools required for control system design and analysis through a graphical user interface, from within one software environment. This MSc-thesis presents the first step in the development of such an environment, which is currently being done at the Section for Stability and Control of Delft University of Technology, Faculty of Aerospace Engineering. The environment is implemented within the commercially available software package MATLAB/SIMULINK. The report consists of two parts. Part 1 gives a detailed description of the AACS design environment. The heart of this environment is formed by the SIMULINK implementation of a nonlinear aircraft model in block-diagram format. The model has been worked out for the old laboratory aircraft of the Faculty, the DeHavilland DHC-2 'Beaver', but due to its modular structure, it can easily be adapted for other aircraft. Part 1 also describes MATLAB programs which can be applied for finding steady-state trimmed-flight conditions and for linearization of the aircraft model, and it shows how the built-in simulation routines of SIMULINK have been used for open-loop analysis of the aircraft dynamics. Apart from the implementation of the models and tools, a thorough treatment of the theoretical backgrounds is presented. Part 2 of this report presents a part of an autopilot design process for the 'Beaver' aircraft, which clearly demonstrates the power and flexibility of the AACS design environment from part 1. Evaluations of all longitudinal and lateral control laws by means of nonlinear simulations are treated in detail. A floppy disk containing all relevant MATLAB programs and SIMULINK models is provided as a supplement.
A Tutorial on Parallel and Concurrent Programming in Haskell
NASA Astrophysics Data System (ADS)
Peyton Jones, Simon; Singh, Satnam
This practical tutorial introduces the features available in Haskell for writing parallel and concurrent programs. We first describe how to write semi-explicit parallel programs by using annotations to express opportunities for parallelism and to help control the granularity of parallelism for effective execution on modern operating systems and processors. We then describe the mechanisms provided by Haskell for writing explicitly parallel programs, with a focus on the use of software transactional memory to help share information between threads. Finally, we show how nested data parallelism can be used to write deterministically parallel programs, allowing programmers to use rich data types in data-parallel programs that are automatically transformed into flat data-parallel versions for efficient execution on multi-core processors.
A parameterization of nuclear track profiles in CR-39 detector
NASA Astrophysics Data System (ADS)
Azooz, A. A.; Al-Nia'emi, S. H.; Al-Jubbori, M. A.
2012-11-01
In this work, the empirical parameterization describing the alpha particles’ track depth in CR-39 detectors is extended to describe longitudinal track profiles against etching time for protons and alpha particles. MATLAB based software is developed for this purpose. The software calculates and plots the depth, diameter, range, residual range, saturation time, and etch rate versus etching time. The software predictions are compared with other experimental data and with results of calculations using the original software, TRACK_TEST, developed for alpha track calculations. The software related to this work is freely downloadable and performs calculations for protons in addition to alpha particles. Program summary Program title: CR39 Catalog identifier: AENA_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AENA_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Copyright (c) 2011, Aasim Azooz Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met • Redistributions of source code must retain the above copyright, this list of conditions and the following disclaimer. • Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution This software is provided by the copyright holders and contributors “as is” and any express or implied warranties, including, but not limited to, the implied warranties of merchantability and fitness for a particular purpose are disclaimed. 
In no event shall the copyright owner or contributors be liable for any direct, indirect, incidental, special, exemplary, or consequential damages (including, but not limited to, procurement of substitute goods or services; loss of use, data, or profits; or business interruption) however caused and on any theory of liability, whether in contract, strict liability, or tort (including negligence or otherwise) arising in any way out of the use of this software, even if advised of the possibility of such damage. No. of lines in distributed program, including test data, etc.: 15598 No. of bytes in distributed program, including test data, etc.: 3933244 Distribution format: tar.gz Programming language: MATLAB. Computer: Any Desktop or Laptop. Operating system: Windows 98 or above (with MATLAB R13 or above installed). RAM: 512 Megabytes or higher Classification: 17.5. Nature of problem: A new semispherical parameterization of charged particle tracks in CR-39 SSNTD is carried out in a previous paper. This parameterization is developed here into MATLAB-based software to calculate the track length and track profile for any proton or alpha particle energy or etching time. This software is intended to compete with the TRACK_TEST [1] and TRACK_VISION [2] software currently in use by all people working in the field of SSNTD. Solution method: Based on fitting of experimental results of proton and alpha particle track lengths for various energies and etching times to a new semispherical formula with four free fitting parameters, the best set of energy-independent parameters was found. These parameters are introduced into the software and the software is programmed to solve the set of equations to calculate the track depth, track etching rate as a function of both time and residual range for particles of normal and oblique incidence, the track longitudinal profile at both normal and oblique incidence, and the three dimensional track profile at normal incidence.
Running time: 1-8 s on Pentium (4) 2 GHz CPU, 3 GB of RAM depending on the etching time value References: [1] ADWT_v1_0 Track_Test Computer program TRACK_TEST for calculating parameters and plotting profiles for etch pits in nuclear track materials. D. Nikezic, K.N. Yu Comput. Phys. Commun. 174(2006)160 [2] AEAF_v1_0 TRACK_VISION Computer program TRACK_VISION for simulating optical appearance of etched tracks in CR-39 nuclear track detectors. D. Nikezic, K.N. Yu Comput. Phys. Commun. 178(2008)591
Borse, N N; Hyder, A A; Bishai, D; Baker, T; Arifeen, S E
2011-11-01
Childhood drowning is a major public health problem that has been neglected in many low- and middle-income countries. In Matlab, rural Bangladesh, more than 40% of child deaths at ages 1-4 years are due to drowning. The main objective of this paper was to develop and evaluate a childhood drowning risk prediction index. A literature review was carried out to document risk factors identified for childhood drowning in Bangladesh. The Newacheck model for special health care needs for children was adapted and applied to construct a childhood drowning risk index called "Potential Risk Estimation Drowning Index for Children" (PREDIC). Finally, the proposed PREDIC index was applied to childhood drowning deaths and compared with a comparison group of children living in Matlab, Bangladesh. This pilot study used t-tests and Receiver Operating Characteristic (ROC) curves to analyze the results. The PREDIC index was applied to 302 drowning deaths and 624 children 0-4 years old living in Matlab. The results of the t-test indicate that the drowned children had a statistically significantly higher mean PREDIC score (6.01) than the comparison group (5.26) (t=-8.58, p=0.0001). A PREDIC score of 6 or more was observed in 68% of the drowning cases but only 43% of the comparison group, a statistically significant difference (t=-7.36, p<0.001). The area under the Receiver Operating Characteristic curve was 0.662. Index score construction was scientifically plausible, and the index is relatively complete, fairly accurate, and practical. The risk index can help identify and target high-risk children with drowning prevention programs. The PREDIC index needs to be further tested for its accuracy, feasibility, and effectiveness in drowning risk reduction in Bangladesh and other countries. Copyright © 2011 Elsevier Ltd. All rights reserved.
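The reported discrimination (ROC area of 0.662) can be computed from raw index scores without any curve-fitting, using the rank-statistic identity AUC = P(case outscores control), with ties counting one half. An illustrative Python sketch with made-up toy scores (not the study's data or code):

```python
def roc_auc(case_scores, control_scores):
    """AUC via the Mann-Whitney identity: the probability that a randomly
    chosen case outscores a randomly chosen control (ties count 1/2)."""
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Toy index scores: cases tend to score higher than controls.
cases = [6, 7, 6, 8, 5, 6, 7]
controls = [5, 4, 6, 5, 7, 4, 5]
auc = roc_auc(cases, controls)
```

An AUC of 0.5 means no discrimination and 1.0 perfect separation, which puts the study's 0.662 in context as modest but real predictive value.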
ERIC Educational Resources Information Center
Ortega, Leticia E.
2008-01-01
My research explores the challenges and questions that pre-service teachers in two English Education programs confronted with respect to the role of technology in their professional practices and identities. It is evident from the data that the decision to incorporate different technologies in their professional practices implied much more than…
System Simulation of Nuclear Power Plant by Coupling RELAP5 and Matlab/Simulink
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meng Lin; Dong Hou; Zhihong Xu
2006-07-01
Since the RELAP5 code has general and advanced features in thermal-hydraulic computation, it has been widely used in transient and accident safety analysis, experiment planning analysis, and system simulation, etc. We therefore wish to design, analyze, and verify a new Instrumentation And Control (I and C) system of a Nuclear Power Plant (NPP) based on this best-estimate code, and eventually develop our own engineering simulator. But because RELAP5's capability to simulate control and protection systems is limited, it is necessary to extend this functionality for highly efficient, accurate, and flexible design and simulation of I and C systems. Matlab/Simulink, a scientific computing package and a powerful tool for research and simulation of plant process control, can compensate for this limitation. This software was selected as the I and C part to be coupled with the RELAP5 code to realize system simulation of NPPs. There are two key techniques to be solved. One is dynamic data exchange, by which Matlab/Simulink receives plant parameters and returns control results. A database is used for communication between the two codes. Accordingly, a Dynamic Link Library (DLL) is applied to link the database in RELAP5, while a DLL and an S-Function are applied in Matlab/Simulink. The other problem is synchronization between the two codes to ensure consistency in global simulation time. Because Matlab/Simulink always computes faster than RELAP5, the simulation time is sent by RELAP5 and received by Matlab/Simulink. A time control subroutine is added into the simulation procedure of Matlab/Simulink to control its simulation advancement. Through these ways, Matlab/Simulink is dynamically coupled with RELAP5. Thus, in Matlab/Simulink, we can freely design control and protection logic of NPPs and test it with best-estimate plant model feedback. A test is shown to demonstrate that the results of the coupled calculation are nearly the same as those of RELAP5 alone with its built-in control logic.
In practice, a real Pressurized Water Reactor (PWR) is modeled by the RELAP5 code, and its main control and protection system is duplicated in Matlab/Simulink. Some steady states and transients are calculated under control of these I and C systems, and the results are compared with plant test curves. The application showed that accurate system simulation of NPPs can be achieved by coupling RELAP5 and Matlab/Simulink. This paper mainly focuses on the coupling method, the plant thermal-hydraulic model, the main control logics, and test and application results. (authors)
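The synchronization scheme described, where the faster code blocks until the slower code publishes its simulation time, can be sketched abstractly. This illustrative Python toy stands in for the RELAP5/Simulink pair; the class names and shared-time mechanism are hypothetical, not the paper's DLL/database machinery:

```python
class CoSimClock:
    """Shared simulation time: the slow side (RELAP5 stand-in) advances the
    clock; the fast side (Simulink stand-in) may never run ahead of it."""
    def __init__(self):
        self.published_time = 0.0

class FastSide:
    def __init__(self, clock, dt):
        self.clock, self.dt, self.t = clock, dt, 0.0
        self.steps = 0
    def run_until_blocked(self):
        # Advance only while we stay at or behind the published time.
        while self.t + self.dt <= self.clock.published_time:
            self.t += self.dt
            self.steps += 1

clock = CoSimClock()
fast = FastSide(clock, dt=0.25)
for _ in range(10):                  # slow side takes ten 0.5 s steps
    clock.published_time += 0.5
    fast.run_until_blocked()         # fast side catches up, then waits
```

The invariant enforced, fast-side time never exceeds slow-side time, is exactly the consistency-in-global-simulation-time property the coupling requires.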
Elasto-limited plastic analysis of structures for probabilistic conditions
NASA Astrophysics Data System (ADS)
Movahedi Rad, M.
2018-06-01
By applying plastic analysis and design methods, significant savings in material can be obtained. However, as a result of this benefit, excessive plastic deformations and large residual displacements might develop, which in turn might lead to unserviceability and collapse of the structure. In this study, for the deterministic problem, the residual deformation of structures is limited by imposing a constraint on the complementary strain energy of the residual forces. For the probabilistic problem, the bound on the complementary strain energy of the residual forces is given randomly, and the critical stresses are updated during the iteration. Limit curves are presented for the plastic limit load factors. The results show that these constraints have significant effects on the load factors. The formulations of the deterministic and probabilistic problems lead to mathematical programming problems, which are solved by a nonlinear algorithm.
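The constraint described is usually written as a bound on the complementary strain energy of the residual internal forces; in a commonly used notation (reconstructed from the standard limit-design literature, not taken from this abstract) it reads:

```latex
W_c \;=\; \tfrac{1}{2}\,\mathbf{Q}_r^{\mathsf{T}}\,\mathbf{F}\,\mathbf{Q}_r \;\le\; W_0 ,
```

where $\mathbf{Q}_r$ collects the residual internal forces, $\mathbf{F}$ is the flexibility matrix, and $W_0$ is the prescribed bound. The limit analysis then maximizes the load factor subject to equilibrium, the yield conditions, and this energy bound; in the probabilistic problem $W_0$ is treated as a random variable and the constraint is enforced at a prescribed probability level.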
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huthmacher, Klaus; Molberg, Andreas K.; Rethfeld, Bärbel
2016-10-01
A split-step numerical method for calculating ultrafast free-electron dynamics in dielectrics is introduced. The two split steps, independently programmed in C++11 and FORTRAN 2003, are interfaced via the presented open-source wrapper. The first step solves a deterministic extended multi-rate equation for the ionization, electron–phonon collisions, and single-photon absorption by free carriers. The second step is stochastic and models electron–electron collisions using Monte-Carlo techniques. This combination of deterministic and stochastic approaches is a unique and efficient method of calculating the nonlinear dynamics of 3D materials exposed to high-intensity ultrashort pulses. Results from simulations solving the proposed model demonstrate how electron–electron scattering relaxes the non-equilibrium electron distribution on the femtosecond time scale.
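The deterministic/stochastic splitting can be illustrated on a toy ensemble: each time step first applies a deterministic rate-equation relaxation, then a Monte-Carlo pass in which random particle pairs "collide" and repartition their energy while conserving the pair total. A hypothetical Python sketch of the splitting pattern only, not the interfaced C++/Fortran code:

```python
import random

def split_step(energies, dt, tau, e_eq, n_pairs, rng):
    """One split step: deterministic relaxation toward e_eq with time
    constant tau, then stochastic pairwise energy exchange."""
    decay = dt / tau
    energies = [e + (e_eq - e) * decay for e in energies]   # deterministic step
    for _ in range(n_pairs):                                # stochastic step
        i, j = rng.sample(range(len(energies)), 2)
        total = energies[i] + energies[j]
        split = rng.random()            # random repartition, pair total conserved
        energies[i], energies[j] = split * total, (1 - split) * total
    return energies

rng = random.Random(3)
e = [0.0] * 50 + [2.0] * 50            # non-equilibrium initial distribution
mean0 = sum(e) / len(e)                # = 1.0
for _ in range(200):
    e = split_step(e, dt=0.01, tau=1.0, e_eq=mean0, n_pairs=25, rng=rng)
mean_after = sum(e) / len(e)
```

Because the relaxation targets the ensemble mean and each collision conserves the pair total, the total energy is preserved while the distribution relaxes, mirroring the conservation properties the method is designed around.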
NASA Astrophysics Data System (ADS)
Windhari, Ayuty; Handayani, Gunawan
2015-04-01
A 3D inversion of the gravity anomaly was computed to estimate the topography of a density interface from gridded data, using MATLAB source code implementing the Parker–Oldenburg algorithm, which is based on the fast Fourier transform. We extended and improved the source code 3DINVERT.M of Gomez Ortiz and Agarwal (2005), which uses the relationship between the Fourier transform of the gravity anomaly and the sum of the Fourier transforms of powers of the interface topography. A density contrast between the two media was specified to apply the inversion. An FFT routine was implemented to construct the amplitude spectrum for the given mean depth. The results are presented as new graphics of the inverted topography, the gravity anomaly due to the inverted topography, and the difference between the input gravity data and the computed data. The iteration terminates when the RMS error is lower than a pre-assigned value used as the convergence criterion, or when the maximum number of iterations is reached. As an example, we applied the MATLAB program to gravity data of the Banten region, Indonesia.
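The relation iterated by such programs (as given by Parker and rearranged by Oldenburg; notation reconstructed from the standard references, not from this abstract) expresses the Fourier transform of the interface topography $h$ in terms of the transform of the gravity anomaly $\Delta g$:

```latex
\mathcal{F}[h] \;=\; -\,\frac{\mathcal{F}[\Delta g]\, e^{k z_0}}{2\pi G\, \Delta\rho}
\;-\; \sum_{n=2}^{\infty} \frac{k^{\,n-1}}{n!}\, \mathcal{F}\!\left[h^{n}\right],
```

where $k$ is the radial wavenumber, $z_0$ the mean interface depth, $\Delta\rho$ the density contrast, and $G$ the gravitational constant. The first term gives an initial estimate of $h$; the sum is then re-evaluated with that estimate and the equation iterated until the RMS change falls below the convergence criterion mentioned above.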
Fluidica CFD software for fluids instruction
NASA Astrophysics Data System (ADS)
Colonius, Tim
2008-11-01
Fluidica is an open-source, freely available Matlab graphical user interface (GUI) to an immersed-boundary Navier-Stokes solver. The algorithm is programmed in Fortran and compiled into Matlab as a MEX-function. The user can create external flows about arbitrarily complex bodies and collections of free vortices. The code runs fast enough for complex 2D flows to be computed and visualized in real time on the screen. This facilitates its use in homework and in the classroom for demonstrations of various potential-flow and viscous-flow phenomena. The GUI has been written with the goal of allowing the student to learn how to use the software as she goes along. The user can select which quantities are viewed on the screen, including contours of various scalars, velocity vectors, streamlines, particle trajectories, streaklines, and finite-time Lyapunov exponents. In this talk, we demonstrate the software in the context of worked classroom examples demonstrating lift and drag, starting vortices, separation, and vortex dynamics.
Ensemble: a web-based system for psychology survey and experiment management.
Tomic, Stefan T; Janata, Petr
2007-08-01
We provide a description of Ensemble, a suite of Web-integrated modules for managing and analyzing data associated with psychology experiments in a small research lab. The system delivers interfaces via a Web browser for creating and presenting simple surveys without the need to author Web pages and with little or no programming effort. The surveys may be extended by selecting and presenting auditory and/or visual stimuli with MATLAB and Flash to enable a wide range of psychophysical and cognitive experiments which do not require the recording of precise reaction times. Additionally, one is provided with the ability to administer and present experiments remotely. The software technologies employed by the various modules of Ensemble are MySQL, PHP, MATLAB, and Flash. The code for Ensemble is open source and available to the public, so that its functions can be readily extended by users. We describe the architecture of the system, the functionality of each module, and provide basic examples of the interfaces.
IB2d: a Python and MATLAB implementation of the immersed boundary method.
Battista, Nicholas A; Strickland, W Christopher; Miller, Laura A
2017-03-29
The development of fluid-structure interaction (FSI) software involves trade-offs between ease of use, generality, performance, and cost. Typically there are large learning curves when using low-level software to model the interaction of an elastic structure immersed in a uniform density fluid. Many existing codes are not publicly available, and the commercial software that exists usually requires expensive licenses and may not be as robust or allow the necessary flexibility that in house codes can provide. We present an open source immersed boundary software package, IB2d, with full implementations in both MATLAB and Python, that is capable of running a vast range of biomechanics models and is accessible to scientists who have experience in high-level programming environments. IB2d contains multiple options for constructing material properties of the fiber structure, as well as the advection-diffusion of a chemical gradient, muscle mechanics models, and artificial forcing to drive boundaries with a preferred motion.
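A core ingredient of any immersed boundary implementation is spreading the Lagrangian boundary forces onto the fluid grid through a regularized delta function. An illustrative Python/NumPy sketch of that single step, using Peskin's 4-point delta as one common choice (this is not IB2d's actual routine):

```python
import numpy as np

def delta_1d(r):
    """Peskin's 4-point regularized delta function, r in grid units."""
    r = np.abs(r)
    out = np.zeros_like(r)
    m1 = r < 1.0
    m2 = (r >= 1.0) & (r < 2.0)
    out[m1] = (3 - 2 * r[m1] + np.sqrt(1 + 4 * r[m1] - 4 * r[m1] ** 2)) / 8
    out[m2] = (5 - 2 * r[m2] - np.sqrt(-7 + 12 * r[m2] - 4 * r[m2] ** 2)) / 8
    return out

def spread_forces(xb, yb, fb, nx, ny, h):
    """Spread point forces fb at Lagrangian points (xb, yb) onto an
    nx-by-ny grid with spacing h; returns the Eulerian force density."""
    F = np.zeros((ny, nx))
    gx = np.arange(nx) * h
    gy = np.arange(ny) * h
    for x, y, f in zip(xb, yb, fb):
        wx = delta_1d((gx - x) / h) / h      # tensor-product delta weights
        wy = delta_1d((gy - y) / h) / h
        F += f * np.outer(wy, wx)
    return F

F = spread_forces([1.23, 2.71], [1.57, 2.0], [1.0, -0.5], nx=16, ny=16, h=0.25)
```

Because the 4-point delta forms a partition of unity on the grid, the total spread force equals the total Lagrangian force, a conservation property immersed boundary schemes rely on.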
A Series of Computational Neuroscience Labs Increases Comfort with MATLAB.
Nichols, David F
2015-01-01
Computational simulations allow for a low-cost, reliable means to demonstrate complex and oftentimes inaccessible concepts to undergraduates. However, students without prior computer programming training may find working with code-based simulations to be intimidating and distracting. A series of computational neuroscience labs involving the Hodgkin-Huxley equations, an Integrate-and-Fire model, and a Hopfield Memory network were used in an undergraduate neuroscience laboratory component of an introductory-level course. Using short focused surveys before and after each lab, student comfort levels were shown to increase drastically: from a majority of students being uncomfortable or neutral about working in the MATLAB environment to a vast majority being comfortable working in it. Though change was reported within each lab, a series of labs was necessary in order to establish a lasting high level of comfort. Comfort working with code is an important first step in acquiring the computational skills that are required to address many questions within neuroscience.
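Part of why such labs work is that the models fit in a screenful of script. For instance, the integrate-and-fire exercise mentioned above reduces to a single Euler loop; a comparable Python sketch (the labs themselves are MATLAB; parameter values here are typical textbook choices, not the course's):

```python
import numpy as np

def lif(I, dt=0.1, tau=10.0, v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0):
    """Leaky integrate-and-fire neuron: tau * dV/dt = -(V - v_rest) + I(t).
    Euler integration; returns the membrane trace and spike times (ms)."""
    v = v_rest
    trace, spikes = [], []
    for step, i_in in enumerate(I):
        v += dt / tau * (-(v - v_rest) + i_in)
        if v >= v_thresh:                 # threshold crossing: spike and reset
            spikes.append(step * dt)
            v = v_reset
        trace.append(v)
    return np.array(trace), spikes

# 500 ms of constant suprathreshold drive -> regular spiking.
I = np.full(5000, 20.0)
trace, spikes = lif(I)
```

Students can vary the drive `I` or `tau` and immediately see the firing rate change, which is the kind of direct manipulation the surveyed labs used to build comfort with code.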
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.
2006-01-01
This report provides a user guide for the Compressible Flow Toolbox, a collection of algorithms that solve almost 300 linear and nonlinear classical compressible flow relations. The algorithms, implemented in the popular MATLAB programming language, are useful for analysis of one-dimensional steady flow with constant entropy, friction, heat transfer, or shock discontinuities. The solutions do not include any gas dissociative effects. The toolbox also contains functions for comparing and validating the equation-solving algorithms against solutions previously published in the open literature. The classical equations solved by the Compressible Flow Toolbox are: isentropic-flow equations, Fanno flow equations (pertaining to flow of an ideal gas in a pipe with friction), Rayleigh flow equations (pertaining to frictionless flow of an ideal gas, with heat transfer, in a pipe of constant cross section), normal-shock equations, oblique-shock equations, and Prandtl-Meyer expansion equations. At the time this report was published, the Compressible Flow Toolbox was available without cost from the NASA Software Repository.
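As an example of the relations such a toolbox implements, the isentropic-flow stagnation ratios for a perfect gas follow directly from constant entropy and the ideal gas law. A brief Python check of two of them (illustrative only, not the toolbox's MATLAB code):

```python
def isentropic_ratios(mach, gamma=1.4):
    """Stagnation-to-static ratios for isentropic flow of a perfect gas:
    T0/T = 1 + (gamma - 1)/2 * M^2, and p0/p = (T0/T)^(gamma/(gamma-1)),
    rho0/rho = (T0/T)^(1/(gamma-1))."""
    t_ratio = 1.0 + 0.5 * (gamma - 1.0) * mach ** 2
    p_ratio = t_ratio ** (gamma / (gamma - 1.0))
    rho_ratio = t_ratio ** (1.0 / (gamma - 1.0))
    return t_ratio, p_ratio, rho_ratio

t1, p1, r1 = isentropic_ratios(1.0)   # sonic conditions in air (gamma = 1.4)
```

At Mach 1 in air these give the familiar values T0/T = 1.2 and p0/p ≈ 1.893, the kind of published reference values the toolbox's validation functions compare against.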
Using 3-D Numerical Weather Data in Piloted Simulations
NASA Technical Reports Server (NTRS)
Daniels, Taumi S.
2016-01-01
This report describes the process of acquiring and using 3-D numerical model weather data sets in NASA Langley's Research Flight Deck (RFD). A set of software tools implement the process and can be used for other purposes as well. Given time and location information of a weather phenomenon of interest, the user can download associated numerical weather model data. These data are created by the National Oceanic and Atmospheric Administration (NOAA) High Resolution Rapid Refresh (HRRR) model, and are then processed using a set of Mathworks' Matlab(TradeMark) scripts to create the usable 3-D weather data sets. Each data set includes radar reflectivity, water vapor, component winds, temperature, supercooled liquid water, turbulence, pressure, altitude, land elevation, relative humidity, and water phases. An open-source data processing program, wgrib2, is available from NOAA online, and is used along with Matlab scripts. These scripts are described with sufficient detail to make future modifications. These software tools have been used to generate 3-D weather data for various RFD experiments.
Enhanced modeling features within TREETOPS
NASA Technical Reports Server (NTRS)
Vandervoort, R. J.; Kumar, Manoj N.
1989-01-01
The original motivation for TREETOPS was to build a generic multi-body simulation and remove the burden of writing multi-body equations from the engineers. The motivation for the enhancement was twofold: (1) to extend the menu of built-in features (sensors, actuators, constraints, etc.) that do not require user code; and (2) to extend the control system design capabilities by linking with other government-funded software (NASTRAN and MATLAB). These enhancements also serve to bridge the gap between structures and controls groups. It is common on large space programs for the structures group to build high-fidelity models of the structure using NASTRAN, and for the controls group to build lower-order models because it lacks the tools to incorporate the former into its analysis. Now the controls engineers can accept the high-fidelity NASTRAN models into TREETOPS, add sensors and actuators, perform model reduction, and couple the result directly into MATLAB to perform their design. The controller can then be imported directly into TREETOPS for non-linear, time-history simulation.
MOCCASIN: converting MATLAB ODE models to SBML.
Gómez, Harold F; Hucka, Michael; Keating, Sarah M; Nudelman, German; Iber, Dagmar; Sealfon, Stuart C
2016-06-15
MATLAB is popular in biological research for creating and simulating models that use ordinary differential equations (ODEs). However, sharing or using these models outside of MATLAB is often problematic. A community standard such as the Systems Biology Markup Language (SBML) can serve as a neutral exchange format, but translating models from MATLAB to SBML can be challenging, especially for legacy models not written with translation in mind. We developed MOCCASIN (Model ODE Converter for Creating Automated SBML INteroperability) to help. MOCCASIN can convert ODE-based MATLAB models of biochemical reaction networks into the SBML format. MOCCASIN is available under the terms of the LGPL 2.1 license (http://www.gnu.org/licenses/lgpl-2.1.html). Source code, binaries and test cases can be freely obtained from https://github.com/sbmlteam/moccasin. Contact: mhucka@caltech.edu. More information is available at https://github.com/sbmlteam/moccasin. © The Author 2016. Published by Oxford University Press.
Teaching Computational Thinking: Deciding to Take Small Steps in a Curriculum
NASA Astrophysics Data System (ADS)
Madoff, R. D.; Putkonen, J.
2016-12-01
While computational thinking and reasoning are not necessarily the same as computer programming, programs such as MATLAB can provide the medium through which the logical and computational thinking at the foundation of science can be taught, learned, and experienced. And while math and computer anxiety are often discussed as critical obstacles to students' progress through the geoscience curriculum, it is suggested here that unfamiliarity with computational and logical reasoning poses the first stumbling block, in addition to the effort of learning to translate a computational problem into the appropriate computer syntax to achieve the intended results. Because computational thinking is vital for all fields, there is a need to introduce many students to it and to build support for it in the curriculum. This presentation focuses on elements to bring into the teaching of computational thinking, intended as additions to learning MATLAB programming as a basic tool. Such elements include: highlighting a key concept; discussing a basic geoscience problem in which the concept appears; having students draw or outline a sketch of what they think an operation needs to do in order to produce a desired result; and then finding the relevant syntax to work with. This iterative pedagogy simulates what someone with more programming experience does, and so discloses the thinking process behind the black box of a result. Intended only as a very early-stage introduction, advanced applications would need to be developed as students progress through an academic program. The objective is to expose majors and non-majors to computational thinking and to alleviate some of the math and computer anxiety, so that students choose to advance with programming or modeling, whether or not it is built into a 4-year curriculum.
Kinematic analysis of the finger exoskeleton using MATLAB/Simulink.
Nasiłowski, Krzysztof; Awrejcewicz, Jan; Lewandowski, Donat
2014-01-01
A paralyzed or not fully functional part of the human body can be supported by a properly designed exoskeleton system with motoric abilities, which can help in rehabilitation or in moving a disabled/paralyzed limb. Both the suitably selected geometry and the specialized software are studied in the MATLAB environment. A finger exoskeleton was the basis for the MATLAB/Simulink model. Specialized software such as MATLAB/Simulink gives us an opportunity to optimize calculations and reach precise results, which helps in the next steps of the design process. The calculations carried out yield information on the movement relations between three functionally connected actuators, and show the distance and velocity changes during the whole simulation time.
Di Nardo, Francesco; Mengoni, Michele; Morettini, Micaela
2013-05-01
The present study provides a novel MATLAB-based parameter estimation procedure for the individual assessment of the hepatic insulin degradation (HID) process from standard frequently-sampled intravenous glucose tolerance test (FSIGTT) data. Direct access to the source code, offered by MATLAB, enabled us to design an optimization procedure based on the alternating use of the Gauss-Newton and Levenberg-Marquardt algorithms, which assures full convergence of the process and contains the computational time. Reliability was tested by direct comparison with the application, in eighteen non-diabetic subjects, of the well-known kinetic analysis software package SAAM II, and by application to different data. Agreement between MATLAB and SAAM II was warranted by intraclass correlation coefficients ≥0.73; no significant differences between corresponding mean parameter estimates and predictions of the HID rate; and consistent residual analysis. Moreover, the MATLAB optimization procedure resulted in a significant 51% reduction of the CV% for the parameter worst-estimated by SAAM II, and maintained all model-parameter CV% below 20%. In conclusion, our MATLAB-based procedure is suggested as a suitable tool for the individual assessment of the HID process. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
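The alternation between Gauss-Newton-like and gradient-descent-like steps can be sketched with a minimal Levenberg-Marquardt loop (the damping schedule and function names here are illustrative assumptions, not the authors' procedure):

```python
import numpy as np

def lm_fit(residual, jacobian, p0, n_iter=50, lam0=1e-3):
    """Minimal Levenberg-Marquardt least-squares sketch.

    Small damping lambda gives Gauss-Newton-like steps; large lambda
    gives scaled gradient-descent-like steps.  lambda is adapted by
    whether the sum of squared residuals decreased.
    """
    p = np.asarray(p0, dtype=float)
    lam = lam0
    cost = np.sum(residual(p) ** 2)
    for _ in range(n_iter):
        r = residual(p)
        J = jacobian(p)
        JtJ = J.T @ J
        A = JtJ + lam * np.diag(np.diag(JtJ))   # Marquardt diagonal damping
        step = np.linalg.solve(A, -J.T @ r)
        new_cost = np.sum(residual(p + step) ** 2)
        if new_cost < cost:                     # accept: toward Gauss-Newton
            p, cost, lam = p + step, new_cost, lam / 10.0
        else:                                   # reject: toward gradient descent
            lam *= 10.0
    return p
```

Full-convergence guarantees of the kind the abstract mentions come from this damping: a rejected step only increases lambda and shrinks the next trial step, so the cost never increases.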
NASA One-Dimensional Combustor Simulation--User Manual for S1D_ML
NASA Technical Reports Server (NTRS)
Stueber, Thomas J.; Paxson, Daniel E.
2014-01-01
The work presented in this paper aims to promote research leading to a closed-loop control system to actively suppress thermo-acoustic instabilities. To serve as a model for such a closed-loop control system, a one-dimensional combustor simulation has been written using MATLAB software tools. This MATLAB-based simulation is similar to a precursor one-dimensional combustor simulation written as FORTRAN 77 source code. The previous simulation process required modifying the FORTRAN 77 source code, compiling, and linking whenever a new combustor simulation executable file was created. The MATLAB-based simulation does not require making changes to the source code, recompiling, or linking. Furthermore, the MATLAB-based simulation can be run from script files within the MATLAB environment, or with a compiled copy of the executable file running in the Command Prompt window without requiring a licensed copy of MATLAB. This report presents a general simulation overview. Details regarding how to set up and initiate a simulation are also presented. Finally, the post-processing section describes the two types of files created while running the simulation, and includes simulation results for a default simulation provided with the source code.
NASA Astrophysics Data System (ADS)
Lee, Eun Young; Novotny, Johannes; Wagreich, Michael
2015-04-01
In recent years, 3D visualization of sedimentary basins has become increasingly popular. Stratigraphic and structural mapping is highly important for understanding the internal setting of sedimentary basins, and subsequent subsidence analysis provides significant insights into basin evolution. This study focused on developing a simple and user-friendly program which allows geologists to analyze and model sedimentary basin data. The developed program is aimed at stratigraphic and subsidence modelling of sedimentary basins from wells or stratigraphic profile data. This program is mainly based on two numerical methods: surface interpolation and subsidence analysis. For surface visualization, four different interpolation techniques (Linear, Natural, Cubic Spline, and Thin-Plate Spline) are provided in this program. The subsidence analysis consists of decompaction and backstripping techniques. The numerical methods are computed in MATLAB®, a multi-paradigm numerical computing environment used extensively in academic, research, and industrial fields. This program consists of five main processing steps: 1) setup (study area and stratigraphic units), 2) loading of well data, 3) stratigraphic modelling (depth distribution and isopach plots), 4) subsidence parameter input, and 5) subsidence modelling (subsided depth and subsidence rate plots). The graphical user interface intuitively guides users through all process stages and provides tools to analyse and export the results. Interpolation and subsidence results are cached to minimize redundant computations and improve the interactivity of the program. All 2D and 3D visualizations are created using MATLAB plotting functions, which enables users to fine-tune the visualization results using the full range of available plot options in MATLAB. All functions of this program are illustrated with a case study of Miocene sediments in the Vienna Basin.
The basin is an ideal place to test this program, because sufficient data is available to analyse and model stratigraphic setting and subsidence evolution of the basin. The study area covers approximately 1200 km2 including 110 data points in the central part of the Vienna Basin.
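The decompaction and backstripping steps of such a subsidence analysis can be sketched as follows (exponential-porosity decompaction and Airy isostasy; the parameter values, function names, and fixed-point scheme are illustrative assumptions, not the program's code):

```python
import math

# Illustrative densities (kg/m^3); real analyses use lithology-specific values.
RHO_M, RHO_W = 3300.0, 1000.0   # mantle and water

def decompacted_thickness(z1, z2, phi0, c, n_iter=50):
    """Restore a layer now buried between depths z1..z2 (km) to the surface.

    Uses the standard exponential porosity profile phi(z) = phi0*exp(-c*z):
    the solid (grain) thickness is conserved, and the layer's surface
    thickness t is found by fixed-point iteration.
    """
    # solid thickness currently contained in the buried layer
    solid = (z2 - z1) - (phi0 / c) * (math.exp(-c * z1) - math.exp(-c * z2))
    t = z2 - z1                      # initial guess for surface thickness
    for _ in range(n_iter):
        t = solid + (phi0 / c) * (1.0 - math.exp(-c * t))
    return t

def airy_tectonic_subsidence(total_thickness, mean_sed_density):
    """Airy-isostasy backstripping: the basement subsidence attributable to
    tectonics alone, after replacing the sediment load with water."""
    return total_thickness * (RHO_M - mean_sed_density) / (RHO_M - RHO_W)
```

A layer decompacts to more than its buried thickness (porosity is restored), and the water-loaded tectonic subsidence is always less than the total sediment thickness; both properties make convenient sanity checks.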
Reproductive preferences in Matlab, Bangladesh: levels, motivation and differentials.
Razzaque, A
1996-03-01
This study provides evidence that aspirations for a smaller family and poverty both determined the reduction in family size preferences in the Matlab area of Bangladesh. Data are obtained from a variety of data sets: the 1990 Knowledge, Attitude, and Practice Survey; the 1982 Socioeconomic Survey; and the 1991 Qualitative Survey. Both treatment and nontreatment areas of Matlab experienced a fertility decline during 1976-90, from 6.9 to 3.6 children/woman in the treatment area and from 7.2 to 5.2 in the control area. In this study, multiple classification analysis and logistic regression analysis were conducted. Findings indicate that mean desired family sizes were similar in both areas and slightly higher in the treatment area. Desired family size declined during 1975-90. Most of the decline probably occurred prior to 1985. Findings from qualitative interviews indicate that most women reported that the smaller desired family size was related to the direct economic cost of children. Women also reported that family planning was now available and that in the past there were more resources for caring for large families. Mothers-in-law were open to informing their daughters-in-law about the desire for small families. This motivation for a small family among older and younger women was not present 10 years ago. Findings reveal that desired family size did not vary by age, family size, socioeconomic group, or existence of the Family Planning and Health Services Program.
Nagy, Peter; Szabó, Ágnes; Váradi, Tímea; Kovács, Tamás; Batta, Gyula; Szöllősi, János
2016-04-01
Fluorescence or Förster resonance energy transfer (FRET) remains one of the most widely used methods for assessing protein clustering and conformation. Although it is a method with solid physical foundations, many applications of FRET fall short of providing quantitative results due to inappropriate calibration and controls. This shortcoming is especially valid for microscopy where currently available tools have limited or no capability at all to display parameter distributions or to perform gating. Since users of multiparameter flow cytometry usually apply these tools, the absence of these features in applications developed for microscopic FRET analysis is a significant limitation. Therefore, we developed a graphical user interface-controlled Matlab application for the evaluation of ratiometric, intensity-based microscopic FRET measurements. The program can calculate all the necessary overspill and spectroscopic correction factors and the FRET efficiency and it displays the results on histograms and dot plots. Gating on plots and mask images can be used to limit the calculation to certain parts of the image. It is an important feature of the program that the calculated parameters can be determined by regression methods, maximum likelihood estimation (MLE) and from summed intensities in addition to pixel-by-pixel evaluation. The confidence interval of calculated parameters can be estimated using parameter simulations if the approximate average number of detected photons is known. The program is not only user-friendly, but it provides rich output, it gives the user freedom to choose from different calculation modes and it gives insight into the reliability and distribution of the calculated parameters. © 2016 International Society for Advancement of Cytometry.
A Matlab Program for Textural Classification Using Neural Networks
NASA Astrophysics Data System (ADS)
Leite, E. P.; de Souza, C.
2008-12-01
A new MATLAB code that provides tools to perform classification of textural images for applications in the Geosciences is presented. The program, here coined TEXTNN, comprises the computation of variogram maps in the frequency domain for specific lag distances in the neighborhood of a pixel. The result is then converted back to the spatial domain, where directional or omnidirectional semivariograms are extracted. Feature vectors are built with textural information composed of the semivariance values at these lag distances and, moreover, with histogram measures of mean, standard deviation and weighted fill-ratio. This procedure is applied to a selected group of pixels or to all pixels in an image using a moving window. A feed-forward back-propagation neural network can then be designed and trained on feature vectors of predefined classes (the training set). The training phase minimizes the mean-squared error on the training set; additionally, at each iteration, the mean-squared error on a validation set is assessed and a test set is evaluated. The program also calculates contingency matrices, global accuracy and the kappa coefficient for the three data sets, allowing a quantitative appraisal of the predictive power of the neural network models. The interpreter is able to select the best model obtained from a k-fold cross-validation or to use a unique split-sample data set for classification of all pixels in a given textural image. The code is open to the geoscientific community and is very flexible, allowing the experienced user to modify it as necessary. The performance of the algorithms and the end-user program was tested using synthetic images, orbital SAR (RADARSAT) imagery for oil-seepage detection, and airborne multi-polarimetric SAR imagery for geologic mapping. The overall results proved very promising.
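The feature-vector construction can be sketched in the spatial domain (a simplified illustration: TEXTNN computes variogram maps in the frequency domain, and the weighted fill-ratio measure is omitted here):

```python
import numpy as np

def semivariogram_features(window, lags=(1, 2, 3)):
    """Texture feature vector for one moving-window position:
    omnidirectional semivariance at a few lag distances, plus simple
    histogram measures (mean, standard deviation).

    gamma(h) = mean of 0.5*(z(x) - z(x+h))^2 over pixel pairs at lag h,
    here averaged over the horizontal and vertical directions.
    """
    w = np.asarray(window, dtype=float)
    feats = []
    for h in lags:
        dh = 0.5 * (w[:, h:] - w[:, :-h]) ** 2   # horizontal pairs
        dv = 0.5 * (w[h:, :] - w[:-h, :]) ** 2   # vertical pairs
        feats.append((dh.mean() + dv.mean()) / 2.0)
    feats += [w.mean(), w.std()]
    return np.array(feats)
```

A constant window yields zero semivariance at every lag, while a one-pixel checkerboard yields maximal semivariance at odd lags and zero at even lags, which is exactly the kind of contrast a texture classifier exploits.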
NASA Astrophysics Data System (ADS)
Ozgun, Ozlem; Apaydin, Gökhan; Kuzuoglu, Mustafa; Sevgi, Levent
2011-12-01
A MATLAB-based one-way and two-way split-step parabolic equation software tool (PETOOL) has been developed with a user-friendly graphical user interface (GUI) for the analysis and visualization of radio-wave propagation over variable terrain and through homogeneous and inhomogeneous atmosphere. The tool has a unique feature over existing one-way parabolic equation (PE)-based codes, because it utilizes the two-way split-step parabolic equation (SSPE) approach with a wide-angle propagator, a recursive forward-backward algorithm that incorporates both forward and backward waves into the solution in the presence of variable terrain. First, the formulation of the classical one-way SSPE and the relatively novel two-way SSPE is presented, with particular emphasis on their capabilities and limitations. Next, the structure and the GUI capabilities of the PETOOL software tool are discussed in detail. The calibration of PETOOL is performed and demonstrated via analytical comparisons and/or representative canonical tests against Geometrical Optics (GO) plus the Uniform Theory of Diffraction (UTD). The tool can be used for research and/or educational purposes to investigate the effects of a variety of user-defined terrain and range-dependent refractivity profiles on electromagnetic wave propagation. Program summary: Program title: PETOOL (Parabolic Equation Toolbox) Catalogue identifier: AEJS_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJS_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 143 349 No. of bytes in distributed program, including test data, etc.: 23 280 251 Distribution format: tar.gz Programming language: MATLAB (MathWorks Inc.) 2010a.
Partial Differential Equation Toolbox and Curve Fitting Toolbox required. Computer: PC. Operating system: Windows XP and Vista. Classification: 10. Nature of problem: Simulation of radio-wave propagation over variable terrain on the Earth's surface, and through homogeneous and inhomogeneous atmosphere. Solution method: The program implements the one-way and two-way Split-Step Parabolic Equation (SSPE) algorithm with a wide-angle propagator. The SSPE is, in general, an initial-value problem starting from a reference range (typically from an antenna) and marching out in range by obtaining the field along the vertical direction at each range step, through the use of step-by-step Fourier transformations. The two-way algorithm incorporates the backward-propagating waves into the standard one-way SSPE by utilizing an iterative forward-backward scheme for modeling multipath effects over a staircase-approximated terrain. Unusual features: This is the first software package implementing a recursive forward-backward SSPE algorithm to account for multipath effects during radio-wave propagation, and it enables the user to easily analyze and visualize the results of the two-way propagation with GUI capabilities. Running time: Problem dependent. Typically about 1.5 ms (for conducting ground) and 4 ms (for lossy ground) per range step for a vertical field profile of vector length 1500, on an Intel Core 2 Duo 1.6 GHz with 2 GB RAM under Windows Vista.
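The Fourier marching scheme described above can be sketched for the standard narrow-angle case (an illustration of the split-step idea only, not PETOOL's wide-angle or two-way implementation):

```python
import numpy as np

def sspe_step(u, dx, dz, k0, n_refr):
    """One range step of the standard (narrow-angle) split-step PE.

    Diffraction is applied in the vertical-wavenumber domain via FFT;
    refraction by the index profile n(z) is applied in the spatial domain.
    u is the reduced field sampled on the vertical grid with spacing dz.
    """
    nz = u.size
    kz = 2.0 * np.pi * np.fft.fftfreq(nz, d=dz)           # vertical wavenumbers
    U = np.fft.fft(u)
    U *= np.exp(-1j * kz ** 2 * dx / (2.0 * k0))          # free-space diffraction
    u = np.fft.ifft(U)
    u *= np.exp(1j * k0 * (n_refr ** 2 - 1.0) * dx / 2.0) # refraction phase screen
    return u
```

Both factors are pure phase screens, so in a lossless homogeneous atmosphere each range step conserves the field's total power, which is a useful unit test for any SSPE marcher.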
NASA Technical Reports Server (NTRS)
Davis, M. W.
1984-01-01
A Real-Time Self-Adaptive (RTSA) active vibration controller was used as the framework in developing a computer program for a generic controller that can be used to alleviate helicopter vibration. Based upon on-line identification of system parameters, the generic controller minimizes vibration in the fuselage by closed-loop implementation of higher harmonic control in the main rotor system. The new generic controller incorporates a set of improved algorithms that give the capability to readily define many different configurations by selecting one of three controller types (deterministic, cautious, and dual), one of two linear system models (local and global), and one or more of several methods of applying limits on control inputs (external and/or internal limits on higher harmonic pitch amplitude and rate). A helicopter rotor simulation analysis was used to evaluate the algorithms associated with the alternative controller types as applied to the four-bladed H-34 rotor mounted on the NASA Ames Rotor Test Apparatus (RTA), which represents the fuselage. After proper tuning, all three controllers provide more effective vibration reduction, and converge more quickly and smoothly with smaller control inputs, than the initial RTSA controller (deterministic with external pitch-rate limiting). It is demonstrated that internal limiting of the control inputs significantly improves the overall performance of the deterministic controller.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Fei; Huang, Yongxi
2018-02-04
Here, we develop a multistage, stochastic mixed-integer model to support biofuel supply chain expansion under evolving uncertainties. By utilizing the block-separable recourse property, we reformulate the multistage program in an equivalent two-stage program and solve it using an enhanced nested decomposition method with maximal non-dominated cuts. We conduct extensive numerical experiments and demonstrate the application of the model and algorithm in a case study based on the South Carolina settings. The value of multistage stochastic programming method is also explored by comparing the model solution with the counterparts of an expected value based deterministic model and a two-stage stochastic model.
ERIC Educational Resources Information Center
Rogers, Mandy; Rowan, Leonie; Walker, Rachel
2014-01-01
Within literature relating to the broad field of boys' education attention is regularly drawn to the significant difference between essentialist and anti-essentialist accounts of "the boy problem" and the limitations of gender-based educational reforms which rely upon deterministic notions of what boys are "really" like and, by…
Multi-Strain Deterministic Chaos in Dengue Epidemiology, A Challenge for Computational Mathematics
NASA Astrophysics Data System (ADS)
Aguiar, Maíra; Kooi, Bob W.; Stollenwerk, Nico
2009-09-01
Recently, we have analysed epidemiological models of competing strains of pathogens, and hence differences in transmission for primary versus secondary infection due to interaction of the strains with previously acquired immunities, as has been described for dengue fever and is known as antibody-dependent enhancement (ADE). These models show a rich variety of dynamics through bifurcations up to deterministic chaos. Including temporary cross-immunity even enlarges the parameter range of such chaotic attractors, and also gives rise to various coexisting attractors, which are difficult to identify by standard numerical bifurcation programs using continuation methods. A combination of techniques, including classical bifurcation plots and Lyapunov exponent spectra, has to be applied to gain further insight into such dynamical structures. In particular, Lyapunov spectra, which quantify the predictability horizon in the epidemiological system, are computationally very demanding. We show ways to speed up the computation of such Lyapunov spectra by a factor of more than ten by parallelizing previously used sequential C programs. Such fast computations of Lyapunov spectra will be especially useful in future investigations of seasonally forced versions of the present models, as they are needed for data analysis.
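The Lyapunov spectra mentioned above are classically computed by evolving a tangent-space basis along the trajectory and reorthonormalizing it each step; a minimal serial sketch of that standard QR method for a discrete-time map (illustrative only, not the authors' C code):

```python
import numpy as np

def lyapunov_spectrum(f, jac, x0, n_steps=5000, n_skip=100):
    """Lyapunov spectrum of a discrete-time map x -> f(x) by QR
    (Gram-Schmidt) reorthonormalization.

    The basis Q is pushed forward with the Jacobian and re-orthonormalized
    every step; the exponents are the averaged logs of |diag(R)|.
    """
    x = np.array(x0, dtype=float)
    d = x.size
    Q = np.eye(d)
    sums = np.zeros(d)
    for i in range(n_steps):
        Q, R = np.linalg.qr(jac(x) @ Q)
        x = f(x)
        if i >= n_skip:                      # discard transient
            sums += np.log(np.abs(np.diag(R)))
    return sums / (n_steps - n_skip)
```

For a continuous-time epidemiological model the same loop runs on the flow's variational equations; the cost of integrating d+1 coupled systems per step is what makes these spectra expensive and worth parallelizing.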
Scalable Replay with Partial-Order Dependencies for Message-Logging Fault Tolerance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lifflander, Jonathan; Meneses, Esteban; Menon, Harshita
2014-09-22
Deterministic replay of a parallel application is commonly used for discovering bugs or for recovering from a hard fault with message-logging fault tolerance. For message-passing programs, a major source of overhead during forward execution is recording the order in which messages are sent and received. During replay, this ordering must be used to deterministically reproduce the execution. Previous work on replay algorithms often makes minimal assumptions about the programming model and application in order to maintain generality. However, in many cases only a partial order must be recorded, due to determinism intrinsic in the code, ordering constraints imposed by the execution model, and events that are commutative (their relative execution order during replay does not need to be reproduced exactly). In this paper, we present a novel algebraic framework for reasoning about the minimum dependencies required to represent the partial order for different concurrent orderings and interleavings. By exploiting this theory, we improve on an existing scalable message-logging fault tolerance scheme. The improved scheme scales to 131,072 cores on an IBM BlueGene/P with up to 2x lower overhead than one that records a total order.
Parameter estimation in spiking neural networks: a reverse-engineering approach.
Rostro-Gonzalez, H; Cessac, B; Vieville, T
2012-04-01
This paper presents a reverse-engineering approach for parameter estimation in spiking neural networks (SNNs). We consider the deterministic evolution of a time-discretized network of spiking neurons with delayed synaptic transmission, modeled as a neural network of the generalized integrate-and-fire type. Parameter estimation in SNNs is a non-deterministic polynomial-time (NP)-hard problem when delays are considered; our approach bypasses this difficulty by reformulating the estimation as a linear programming (LP) problem, so that a solution can be computed in polynomial time. Moreover, the LP formulation makes explicit the fact that the reverse engineering of a neural network can be performed from the observation of the spike times alone. Furthermore, we point out that the LP adjustment mechanism is local to each neuron and has the same structure as a 'Hebbian' rule. Finally, we present a generalization of this approach to the design of input-output (I/O) transformations as a practical method to 'program' a spiking network, i.e. to find a set of parameters allowing us to exactly reproduce the network output, given an input. Numerical verifications and illustrations are provided.
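The LP reformulation can be illustrated with a toy threshold-neuron version (the model, the margin, and the function name are simplified assumptions for illustration, not the paper's exact formulation; SciPy's linprog is used as the LP solver):

```python
import numpy as np
from scipy.optimize import linprog

def fit_weights(inputs, spikes, theta=1.0, margin=0.1):
    """Toy LP version of spike-train reverse engineering: find weights w
    such that a threshold neuron reproduces an observed 0/1 spike train.

    inputs: (T, n) matrix of presynaptic activity; spikes: length-T 0/1.
    A spike at time t imposes   w . s(t) >= theta;
    silence at time t imposes   w . s(t) <= theta - margin.
    Any feasible point of this LP reproduces the observed spike train.
    """
    A_ub, b_ub = [], []
    for s_t, fired in zip(inputs, spikes):
        if fired:
            A_ub.append(-s_t); b_ub.append(-theta)          # w.s >= theta
        else:
            A_ub.append(s_t); b_ub.append(theta - margin)   # w.s <= theta - margin
    res = linprog(c=np.zeros(inputs.shape[1]),              # pure feasibility
                  A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] * inputs.shape[1])
    return res.x if res.success else None
```

Each constraint row involves only one neuron's inputs, which mirrors the paper's observation that the adjustment is local to each neuron.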
[Stochastic model of infectious disease transmission].
Ruiz-Ramírez, Juan; Hernández-Rodríguez, Gabriela Eréndira
2009-01-01
To propose a mathematical model that shows how population structure affects the size of infectious disease epidemics. This study was conducted during 2004 at the University of Colima. It used a generalized small-world network topology to represent contacts occurring within and between families. To that end, two MATLAB programs were written to calculate the efficiency of the network. A program in the C programming language was also developed to implement the stochastic susceptible-infectious-removed model, and simultaneous results were obtained for the number of infected people. An increased number of families connected by meeting sites enlarged the epidemics by roughly 400%. Population structure influences the rapid spread of infectious diseases, reaching epidemic effects.
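The stochastic susceptible-infectious-removed dynamics can be sketched with a Gillespie simulation of the well-mixed model (the study couples this dynamics to a small-world contact network; this sketch omits that structure):

```python
import random

def sir_gillespie(beta, gamma, s0, i0, t_max, rng=random.Random(42)):
    """Gillespie (stochastic simulation algorithm) run of the well-mixed
    SIR model.  Events: infection S+I -> 2I at rate beta*S*I/N, and
    recovery I -> R at rate gamma*I.  Returns final (S, I, R)."""
    s, i, r, t = s0, i0, 0, 0.0
    n = s0 + i0
    while i > 0 and t < t_max:
        rate_inf = beta * s * i / n
        rate_rec = gamma * i
        total = rate_inf + rate_rec
        t += rng.expovariate(total)            # exponential waiting time
        if rng.random() < rate_inf / total:    # choose which event fired
            s, i = s - 1, i + 1
        else:
            i, r = i - 1, r + 1
    return s, i, r
```

Replacing the rate `beta*S*I/N` with a sum over edges of a contact graph turns this well-mixed sketch into a network simulation of the kind the study describes.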
Competitive Facility Location with Random Demands
NASA Astrophysics Data System (ADS)
Uno, Takeshi; Katagiri, Hideki; Kato, Kosuke
2009-10-01
This paper proposes a new location problem for competitive facilities, e.g. shops and stores, with uncertain demands in the plane. By representing the demands for facilities as random variables, the location problem is formulated as a stochastic programming problem, and for finding its solution three deterministic programming problems are considered: an expectation maximizing problem, a probability maximizing problem, and a satisfying-level maximizing problem. After showing that their optimal solutions can be found by solving 0-1 programming problems, a solution method is proposed that improves the tabu search algorithm with strategic vibration. The efficiency of the solution method is shown by application to numerical examples of the facility location problems.
Fuzzy linear model for production optimization of mining systems with multiple entities
NASA Astrophysics Data System (ADS)
Vujic, Slobodan; Benovic, Tomo; Miljanovic, Igor; Hudej, Marjan; Milutinovic, Aleksandar; Pavlovic, Petar
2011-12-01
Planning and production optimization within mining systems comprising multiple mines or several work sites (entities) using fuzzy linear programming (LP) was studied. LP is the most commonly used operations research method in mining engineering. After an introductory review of the properties and limitations of applying LP, short reviews of the general settings of deterministic and fuzzy LP models are presented. For the purpose of comparative analysis, the application of both LP models is presented using the example of the Bauxite Basin Niksic with five mines. On this assessment, LP is an efficient mathematical modeling tool for production planning and for solving many other single-criteria optimization problems in mining engineering. After comparing the advantages and deficiencies of the deterministic and fuzzy LP models, the conclusion presents the benefits of the fuzzy LP model, while also noting that finding the optimal production plan requires an overall analysis encompassing both LP modeling approaches.
The value of believing in free will: encouraging a belief in determinism increases cheating.
Vohs, Kathleen D; Schooler, Jonathan W
2008-01-01
Does moral behavior draw on a belief in free will? Two experiments examined whether inducing participants to believe that human behavior is predetermined would encourage cheating. In Experiment 1, participants read either text that encouraged a belief in determinism (i.e., that portrayed behavior as the consequence of environmental and genetic factors) or neutral text. Exposure to the deterministic message increased cheating on a task in which participants could passively allow a flawed computer program to reveal answers to mathematical problems that they had been instructed to solve themselves. Moreover, increased cheating behavior was mediated by decreased belief in free will. In Experiment 2, participants who read deterministic statements cheated by overpaying themselves for performance on a cognitive task; participants who read statements endorsing free will did not. These findings suggest that the debate over free will has societal, as well as scientific and theoretical, implications.
On optimal control of linear systems in the presence of multiplicative noise
NASA Technical Reports Server (NTRS)
Joshi, S. M.
1976-01-01
This correspondence considers the problem of optimal regulator design for discrete-time linear systems subjected to white state-dependent and control-dependent noise, in addition to additive white noise in the input and the observations. A pseudo-deterministic problem is first defined in which multiplicative and additive input disturbances are present, but noise-free measurements of the complete state vector are available. This problem is solved via discrete dynamic programming. Next, the problem is formulated in which the number of measurements is less than the number of state variables and the measurements are contaminated with state-dependent noise. The inseparability of control and estimation is brought into focus, and an 'enforced separation' solution is obtained via heuristic reasoning, in which the control gains are shown to be the same as those in the pseudo-deterministic problem. An optimal linear state estimator is given in order to implement the controller.
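The enforced-separation result rests on a Riccati-type recursion modified by the multiplicative-noise variances. A scalar toy version can sketch the effect (a hedged illustration, not the paper's formulation; all numbers are made up): the noise terms inflate the denominator of the optimal gain, so the regulator is more cautious than its noise-free counterpart.

```python
import math

# Scalar toy system: x[k+1] = (a + da)*x[k] + (b + db)*u[k] + w[k],
# where da, db are zero-mean multiplicative noises with variances
# alpha2 and beta2 (illustrative values, not from the paper).
# Value iteration on the modified Riccati recursion:
#   P <- q + (a^2 + alpha2)*P - (a*b*P)^2 / (r + (b^2 + beta2)*P)
a, b = 0.9, 0.5
q, r = 1.0, 0.1
alpha2, beta2 = 0.05, 0.05

P = q
for _ in range(500):
    P_next = q + (a**2 + alpha2) * P - (a * b * P)**2 / (r + (b**2 + beta2) * P)
    if abs(P_next - P) < 1e-12:
        P = P_next
        break
    P = P_next

# Optimal gain for u = -K*x; the beta2 term in the denominator makes
# K smaller than the gain K_det that ignores control-dependent noise.
K = (a * b * P) / (r + (b**2 + beta2) * P)
K_det = (a * b * P) / (r + b**2 * P)
print(P, K, K_det)
```

Running the recursion shows the closed-loop mean dynamics a - b*K remain stable while the gain is deliberately de-tuned relative to the deterministic design.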
Stochastic Watershed Models for Risk Based Decision Making
NASA Astrophysics Data System (ADS)
Vogel, R. M.
2017-12-01
Over half a century ago, the Harvard Water Program introduced the field of operational or synthetic hydrology, providing stochastic streamflow models (SSMs) that could generate ensembles of synthetic streamflow traces useful for hydrologic risk management. The application of SSMs, based on streamflow observations alone, revolutionized water resources planning activities, yet they have fallen out of favor due, in part, to their inability to account for the now nearly ubiquitous anthropogenic influences on streamflow. This commentary advances the modern equivalent of SSMs, termed `stochastic watershed models' (SWMs), useful as input to nearly all modern risk-based water resource decision making approaches. SWMs are deterministic watershed models implemented using stochastic meteorological series, model parameters and model errors, to generate ensembles of streamflow traces that represent the variability in possible future streamflows. SWMs combine deterministic watershed models, which are ideally suited to accounting for anthropogenic influences, with recent developments in uncertainty analysis and principles of stochastic simulation.
NASA Astrophysics Data System (ADS)
Ma, Yuan-Zhuo; Li, Hong-Shuang; Yao, Wei-Xing
2018-05-01
The evaluation of the probabilistic constraints in reliability-based design optimization (RBDO) problems has always been significant and challenging work, which strongly affects the performance of RBDO methods. This article deals with RBDO problems using a recently developed generalized subset simulation (GSS) method and a posterior approximation approach. The posterior approximation approach is used to transform all the probabilistic constraints into ordinary constraints as in deterministic optimization. The assessment of multiple failure probabilities required by the posterior approximation approach is achieved by GSS in a single run at all supporting points, which are selected by a proper experimental design scheme combining Sobol' sequences and Bucher's design. Sequentially, the transformed deterministic design optimization problem can be solved by optimization algorithms, for example, the sequential quadratic programming method. Three optimization problems are used to demonstrate the efficiency and accuracy of the proposed method.
Using Agent Base Models to Optimize Large Scale Network for Large System Inventories
NASA Technical Reports Server (NTRS)
Shameldin, Ramez Ahmed; Bowling, Shannon R.
2010-01-01
The aim of this paper is to use Agent Base Models (ABM) to optimize large scale network handling capabilities for large system inventories and to implement strategies for the purpose of reducing capital expenses. The models used in this paper rely on either computational algorithms or procedural implementations, developed in MATLAB as the principal programming language together with mathematical theory, to simulate agent-based models on clusters; these clusters serve as a high-performance computing platform to run the program in parallel. In both cases, a model is defined as a compilation of a set of structures and processes assumed to underlie the behavior of a network system.
NASA Technical Reports Server (NTRS)
Steck, Daniel
2009-01-01
This report documents the generation of an outbound Earth to Moon transfer preliminary database consisting of four cases calculated twice a day for a 19 year period. The database was desired as the first step in order for NASA to rapidly generate Earth to Moon trajectories for the Constellation Program using the Mission Assessment Post Processor. The completed database was created by running a flight trajectory and optimization program, called Copernicus, in batch mode with the use of newly created Matlab functions. The database is accurate and has high data resolution. The techniques and scripts developed to generate the trajectory information will also be directly used in generating a comprehensive database.
Structural Deterministic Safety Factors Selection Criteria and Verification
NASA Technical Reports Server (NTRS)
Verderaime, V.
1992-01-01
Though current deterministic safety factors are arbitrarily and unaccountably specified, their ratio is rooted in the resistive and applied stress probability distributions. This study approached the deterministic method from a probabilistic concept, leading to a more systematic and coherent philosophy and criterion for designing more uniform and reliable high-performance structures. The deterministic method was noted to consist of three safety factors: a standard deviation multiplier of the applied stress distribution; a K-factor for the A- or B-basis material ultimate stress; and the conventional safety factor to ensure that the applied stress does not operate in the inelastic zone of metallic materials. The conventional safety factor is specifically defined as the ratio of ultimate-to-yield stresses. A deterministic safety index of the combined safety factors was derived, from which the corresponding reliability showed that the deterministic method is not reliability-sensitive. The bases for selecting safety factors are presented and verification requirements are discussed. The suggested deterministic approach is applicable to all NASA, DOD, and commercial high-performance structures under static stresses.
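The probabilistic concept behind the deterministic factors can be sketched with a minimal normal-stress model (an illustrative sketch with made-up numbers, not the report's exact derivation): the same central safety factor, the ratio of mean stresses, can correspond to very different reliabilities depending on the scatter, which is the reliability-insensitivity the abstract points to.

```python
import math

# Illustrative normal-stress model: resistive stress R ~ N(mu_R, sig_R),
# applied stress S ~ N(mu_S, sig_S).  The safety margin M = R - S is
# normal, and the reliability index
#   beta = (mu_R - mu_S) / sqrt(sig_R^2 + sig_S^2)
# counts how many standard deviations the mean margin sits above zero.
mu_R, sig_R = 60.0, 4.0   # illustrative values
mu_S, sig_S = 40.0, 5.0

beta = (mu_R - mu_S) / math.sqrt(sig_R**2 + sig_S**2)

# Probability of failure P(M < 0) via the standard normal CDF.
p_fail = 0.5 * math.erfc(beta / math.sqrt(2.0))

# The central (deterministic) safety factor is just the ratio of means;
# it carries no information about sig_R and sig_S, hence the same
# factor can hide very different failure probabilities.
sf_central = mu_R / mu_S
print(beta, p_fail, sf_central)
```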
Standardizing Methods for Weapons Accuracy and Effectiveness Evaluation
2014-06-01
[Search-hit fragments from the report's table of contents and figures: B. Monte Carlo Approach; C. Expected Value Theorem; D. PHIT/PNM Methodology; MATLAB code appendices SR_CDF_DATA, GE_EXTRACT, and PHIT/PNM; figures showing a Normal fit and a Double Normal fit to test data, and the PHIT/PNM methodology.]
Subband/Transform MATLAB Functions For Processing Images
NASA Technical Reports Server (NTRS)
Glover, D.
1995-01-01
SUBTRANS software is a package of routines implementing image-data-processing functions for use with MATLAB (TM) software. Provides capability to transform image data with block transforms and to produce spatial-frequency subbands of transformed data. Functions can be cascaded to provide further decomposition into more subbands. Also used in image-data-compression systems; for example, transforms used to prepare data for lossy compression. Written for use in MATLAB mathematical-analysis environment.
Detection and Classification of Objects in Synthetic Aperture Radar Imagery
2006-02-01
[Search-hit fragments from the report: ... a higher False Alarm Rate (FAR). Currently, a standard edge detector is the Canny algorithm, which is available with the mathematics package MATLAB ... the algorithm used to calculate the Radon transform. The MATLAB implementation uses the built-in Radon transform procedure, which is extremely ... MATLAB code for a faster forward-backwards selection process has also been provided. In both cases, the feature selection was accomplished by using ...]
Kemeny, Steven Frank; Clyne, Alisa Morss
2011-04-01
Fiber alignment plays a critical role in the structure and function of cells and tissues. While fiber alignment quantification is important to experimental analysis and several different methods for quantifying fiber alignment exist, many studies focus on qualitative rather than quantitative analysis perhaps due to the complexity of current fiber alignment methods. Speed and sensitivity were compared in edge detection and fast Fourier transform (FFT) for measuring actin fiber alignment in cells exposed to shear stress. While edge detection using matrix multiplication was consistently more sensitive than FFT, image processing time was significantly longer. However, when MATLAB functions were used to implement edge detection, MATLAB's efficient element-by-element calculations and fast filtering techniques reduced computation cost 100 times compared to the matrix multiplication edge detection method. The new computation time was comparable to the FFT method, and MATLAB edge detection produced well-distributed fiber angle distributions that statistically distinguished aligned and unaligned fibers in half as many sample images. When the FFT sensitivity was improved by dividing images into smaller subsections, processing time grew larger than the time required for MATLAB edge detection. Implementation of edge detection in MATLAB is simpler, faster, and more sensitive than FFT for fiber alignment quantification.
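The gradient-based orientation idea behind such edge-detection alignment measures can be sketched in a few lines of NumPy (the study itself used MATLAB; the synthetic striped image and the structure-tensor estimator below are illustrative, not the authors' code): gradients of a striped pattern point along a single direction, and averaging the outer products of the gradients recovers that direction.

```python
import numpy as np

# Synthetic "fibers": sinusoidal stripes whose intensity gradient points
# along theta_true; the stripes themselves run perpendicular to it.
theta_true = np.deg2rad(30.0)
y, x = np.mgrid[0:128, 0:128]
img = np.sin(2 * np.pi * 0.1 * (x * np.cos(theta_true) + y * np.sin(theta_true)))

# Image gradients (the edge-detection step) and the averaged structure
# tensor; note np.gradient returns the axis-0 (y) derivative first.
gy, gx = np.gradient(img)
Jxx, Jyy, Jxy = (gx * gx).mean(), (gy * gy).mean(), (gx * gy).mean()

# Dominant gradient orientation (modulo 180 degrees); fibers run at 90
# degrees to this direction.
theta_est = 0.5 * np.arctan2(2 * Jxy, Jxx - Jyy)
print(np.rad2deg(theta_est))
```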
2013-03-01
[Search-hit fragments from the report: the canine experiment (n = 1) was performed as a validation of the correlation of visible reflectance imaging measurements with actual blood oxygenation; the canine laparotomy was part of an animal protocol approved by the Institutional Animal Care and ...; all data analysis was performed using algorithms and software written in-house in the programming languages Matlab and IDL/ENVI (ITT Visual ...).]
Simulation of Optical Resonators for Vertical-Cavity Surface-Emitting Lasers (vcsel)
NASA Astrophysics Data System (ADS)
Mansour, Mohy S.; Hassen, Mahmoud F. M.; El-Nozahey, Adel M.; Hafez, Alaa S.; Metry, Samer F.
2010-04-01
Simulation and modeling of the reflectivity and transmissivity of the multilayer DBR of a VCSEL, as well as of the active-region quantum well, are analyzed using the characteristic matrix method. The electric field intensity distributions inside such a vertical-cavity structure are calculated. A software program in the MATLAB environment was constructed for the simulation. The study was performed for two specific Bragg wavelengths, 980 nm and 370 nm, for achieving a resonant periodic gain (RPG).
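The characteristic (transfer) matrix method for a quarter-wave DBR can be sketched as follows (a Python/NumPy sketch rather than the paper's MATLAB program; the refractive indices, pair count, and the 980 nm design wavelength below are illustrative GaAs/AlAs-like values, not taken from the paper):

```python
import numpy as np

def dbr_reflectance(n_lo, n_hi, n_in, n_sub, pairs, lam_bragg, lam):
    """Normal-incidence reflectance of a quarter-wave DBR via the
    characteristic matrix method (illustrative parameters)."""
    M = np.eye(2, dtype=complex)
    for _ in range(pairs):
        for n in (n_hi, n_lo):
            d = lam_bragg / (4 * n)          # quarter-wave thickness
            delta = 2 * np.pi * n * d / lam  # phase thickness at lam
            layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                              [1j * n * np.sin(delta), np.cos(delta)]])
            M = M @ layer
    # Standard thin-film result: [B, C] = M @ [1, n_sub]; r from B, C.
    B, C = M @ np.array([1.0, n_sub], dtype=complex)
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2

# Illustrative indices around a 980 nm Bragg wavelength.
R_bragg = dbr_reflectance(2.9, 3.5, 1.0, 3.5, 20, 980e-9, 980e-9)
R_off = dbr_reflectance(2.9, 3.5, 1.0, 3.5, 20, 980e-9, 850e-9)
print(R_bragg, R_off)
```

At the Bragg wavelength every layer is exactly a quarter wave, so the reflectance climbs toward unity as pairs are added, while a wavelength well outside the stop band reflects far less.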
2006-06-01
Scientific Research. 5PAM-Crash is a trademark of the ESI Group . 6MATLAB and SIMULINK are registered trademarks of the MathWorks. 14 maneuvers...Laboratory (ARL) to develop methodologies to evaluate robotic behavior algorithms that control the actions of individual robots or groups of robots...methodologies to evaluate robotic behavior algorithms that control the actions of individual robots or groups of robots acting as a team to perform a
NASA Technical Reports Server (NTRS)
Brenner, Richard; Lala, Jaynarayan H.; Nagle, Gail A.; Schor, Andrei; Turkovich, John
1994-01-01
This program demonstrated the integration of a number of technologies that can increase the availability and reliability of launch vehicles while lowering costs. Availability is increased with an advanced guidance algorithm that adapts trajectories in real-time. Reliability is increased with fault-tolerant computers and communication protocols. Costs are reduced by automatically generating code and documentation. This program was realized through the cooperative efforts of academia, industry, and government. NASA-LaRC coordinated the effort, while Draper performed the integration. Georgia Institute of Technology supplied a weak Hamiltonian finite element method for optimal control problems. Martin Marietta used MATLAB to apply this method to a launch vehicle (FENOC). Draper supplied the fault-tolerant computing and software automation technology. The fault-tolerant technology includes sequential and parallel fault-tolerant processors (FTP & FTPP) and authentication protocols (AP) for communication. Fault-tolerant technology was incrementally incorporated. Development culminated with a heterogeneous network of workstations and fault-tolerant computers using AP. Draper's software automation system, ASTER, was used to specify a static guidance system based on FENOC, navigation, flight control (GN&C), models, and the interface to a user interface for mission control. ASTER generated Ada code for GN&C and C code for models. An algebraic transform engine (ATE) was developed to automatically translate MATLAB scripts into ASTER.
A Computer Program for Drip Irrigation System Design for Small Plots
NASA Astrophysics Data System (ADS)
Philipova, Nina; Nicheva, Olga; Kazandjiev, Valentin; Chilikova-Lubomirova, Mila
2012-12-01
A computer program has been developed for the design of surface drip irrigation systems. It can be applied to the calculation of small-scale fields with an area of up to 10 ha. The program includes two main parts: crop water requirements and hydraulic calculations of the system. It has been developed with a Graphical User Interface in MATLAB and gives the opportunity to select some parameters from tables, such as agro-physical soil properties, characteristics of the corresponding crop, and climatic data. It allows the user of the program to assume and set a definite value, for example the emitter discharge, plot parameters, etc. Eight cases of system layout, according to the water source layout and the number of plots in system operation, are laid into the hydraulic section of the program. It includes the design of the lateral, manifold, main line, and pump calculations. The program has been compiled to work in Windows.
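One representative hydraulic step, friction loss along a multi-outlet drip lateral, can be sketched as follows (illustrative numbers; the Hazen-Williams formula and the Christiansen reduction factor are standard design choices, not necessarily the exact correlations used in the program described above):

```python
# Hazen-Williams friction loss along a drip lateral with N equally
# spaced emitters, reduced by the Christiansen factor F (illustrative
# values throughout).
N = 40              # emitters on the lateral
q_e = 4.0 / 3.6e6   # emitter discharge: 4 L/h converted to m^3/s
L = 60.0            # lateral length, m
D = 0.016           # inside diameter, m
C = 150             # Hazen-Williams coefficient for PE pipe

Q = N * q_e                                     # inlet flow, m^3/s
# SI form of Hazen-Williams head loss for the full flow over full length
hf_full = 10.67 * L * (Q / C) ** 1.852 / D ** 4.87
# Christiansen reduction for N outlets, velocity exponent m = 1.852:
#   F = 1/(m+1) + 1/(2N) + sqrt(m-1)/(6N^2)
m = 1.852
F = 1 / (m + 1) + 1 / (2 * N) + (m - 1) ** 0.5 / (6 * N ** 2)
hf = F * hf_full
print(F, hf)
```

Because the flow in the lateral decreases past each emitter, the actual loss is roughly a third of the full-flow figure, which is what F captures.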
Schlaier, Juergen R; Beer, Anton L; Faltermeier, Rupert; Fellner, Claudia; Steib, Kathrin; Lange, Max; Greenlee, Mark W; Brawanski, Alexander T; Anthofer, Judith M
2017-06-01
This study compared tractography approaches for identifying cerebellar-thalamic fiber bundles relevant to planning target sites for deep brain stimulation (DBS). In particular, probabilistic and deterministic tracking of the dentate-rubro-thalamic tract (DRTT) and differences between the spatial courses of the DRTT and the cerebello-thalamo-cortical (CTC) tract were compared. Six patients with movement disorders were examined by magnetic resonance imaging (MRI), including two sets of diffusion-weighted images (12 and 64 directions). Probabilistic and deterministic tractography was applied on each diffusion-weighted dataset to delineate the DRTT. Results were compared with regard to their sensitivity in revealing the DRTT and additional fiber tracts and processing time. Two sets of regions-of-interests (ROIs) guided deterministic tractography of the DRTT or the CTC, respectively. Tract distances to an atlas-based reference target were compared. Probabilistic fiber tracking with 64 orientations detected the DRTT in all twelve hemispheres. Deterministic tracking detected the DRTT in nine (12 directions) and in only two (64 directions) hemispheres. Probabilistic tracking was more sensitive in detecting additional fibers (e.g. ansa lenticularis and medial forebrain bundle) than deterministic tracking. Probabilistic tracking lasted substantially longer than deterministic. Deterministic tracking was more sensitive in detecting the CTC than the DRTT. CTC tracts were located adjacent but consistently more posterior to DRTT tracts. These results suggest that probabilistic tracking is more sensitive and robust in detecting the DRTT but harder to implement than deterministic approaches. Although sensitivity of deterministic tracking is higher for the CTC than the DRTT, targets for DBS based on these tracts likely differ. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
Matlab-Excel Interface for OpenDSS
DOE Office of Scientific and Technical Information (OSTI.GOV)
The software allows users of the OpenDSS grid modeling software to access their load flow models using a GUI interface developed in MATLAB. The circuit definitions are entered into a Microsoft Excel spreadsheet which makes circuit creation and editing a much simpler process than the basic text-based editors used in the native OpenDSS interface. Plot tools have been developed which can be accessed through a MATLAB GUI once the desired parameters have been simulated.
NASA Technical Reports Server (NTRS)
Howard, Joseph
2007-01-01
The viewgraph presentation provides an introduction to the James Webb Space Telescope (JWST). The first part provides a brief overview of Matlab toolkits including CodeV, OSLO, and Zemax Toolkits. The toolkit overview examines purpose, layout, how Matlab gets data from CodeV, function layout, and using cvHELP. The second part provides examples of use with JWST, including wavefront sensitivities and alignment simulations.
Integration of MATLAB Simulink(Registered Trademark) Models with the Vertical Motion Simulator
NASA Technical Reports Server (NTRS)
Lewis, Emily K.; Vuong, Nghia D.
2012-01-01
This paper describes the integration of MATLAB Simulink(Registered TradeMark) models into the Vertical Motion Simulator (VMS) at NASA Ames Research Center. The VMS is a high-fidelity, large motion flight simulator that is capable of simulating a variety of aerospace vehicles. Integrating MATLAB Simulink models into the VMS needed to retain the development flexibility of the MATLAB environment and allow rapid deployment of model changes. The process developed at the VMS was used successfully in a number of recent simulation experiments. This accomplishment demonstrated that the model integrity was preserved, while working within the hard real-time run environment of the VMS architecture, and maintaining the unique flexibility of the VMS to meet diverse research requirements.
Changing patient population in Dhaka Hospital and Matlab Hospital of icddr,b.
Das, S K; Rahman, A; Chisti, M J; Ahmed, S; Malek, M A; Salam, M A; Bardhan, P K; Faruque, A S G
2014-02-01
The Diarrhoeal Disease Surveillance System of icddr,b noted an increasing number of patients aged ≥60 years at urban Dhaka and rural Matlab from 2001 to 2012. Shigella and Vibrio cholerae were more frequently isolated from elderly people than from children under 5 years and adults aged 5-59 in both areas. The resistance of Shigella observed to various drugs in Dhaka and Matlab was trimethoprim-sulphamethoxazole (72-63%), ampicillin (43-55%), nalidixic acid (58-61%), mecillinam (12-9%), azithromycin (13-0%), ciprofloxacin (11-13%) and ceftriaxone (11-0%). Vibrio cholerae isolated in Dhaka and Matlab was resistant to trimethoprim-sulphamethoxazole (98-94%), furazolidone (100%), erythromycin (71-53%), tetracycline (46-44%), ciprofloxacin (3-10%) and azithromycin (3-0%). © 2013 John Wiley & Sons Ltd.
GEO2D - Two-Dimensional Computer Model of a Ground Source Heat Pump System
James Menart
2013-06-07
This file contains a zipped file that contains many files required to run GEO2D. GEO2D is a computer code for simulating ground source heat pump (GSHP) systems in two dimensions. GEO2D performs a detailed finite difference simulation of the heat transfer occurring within the working fluid, the tube wall, the grout, and the ground. Both horizontal and vertical wells can be simulated with this program, but it should be noted that the vertical well is modeled as a single tube. This program also models the heat pump in conjunction with the heat transfer occurring. GEO2D simulates the heat pump and ground loop as a system. Many results are produced by GEO2D as a function of time and position, such as heat transfer rates, temperatures, and heat pump performance. On top of this, GEO2D provides an economic comparison between the simulated geothermal system and a comparable air heat pump system, or comparable gas, oil, or propane heating systems with a vapor compression air conditioner. The version of GEO2D in the attached file has been coupled to the DOE heating and cooling load software called ENERGYPLUS. This is a great convenience for the user because heating and cooling loads are an input to GEO2D. GEO2D is a user friendly program that uses a graphical user interface for inputs and outputs. These make entering data simple, and they produce many plotted results that are easy to understand. In order to run GEO2D, access to MATLAB is required. If this program is not available on your computer, you can download the program MCRInstaller.exe, the 64-bit version, from the MATLAB website or from this geothermal depository. This is a free download which will enable you to run GEO2D.
Binomial tau-leap spatial stochastic simulation algorithm for applications in chemical kinetics.
Marquez-Lago, Tatiana T; Burrage, Kevin
2007-09-14
In cell biology, cell signaling pathway problems are often tackled with deterministic temporal models, well mixed stochastic simulators, and/or hybrid methods. But, in fact, three-dimensional stochastic spatial modeling of reactions happening inside the cell is needed in order to fully understand these cell signaling pathways. This is because noise effects, low molecular concentrations, and spatial heterogeneity can all affect the cellular dynamics. However, there are ways in which important effects can be accounted for without going to the extent of using highly resolved spatial simulators (such as single-particle software), hence reducing the overall computation time significantly. We present a new coarse-grained modified version of the next subvolume method that allows the user to consider both diffusion and reaction events in relatively long simulation time spans as compared with the original method and other commonly used fully stochastic computational methods. Benchmarking of the simulation algorithm was performed through comparison with the next subvolume method and well mixed models (MATLAB), as well as stochastic particle reaction and transport simulations (CHEMCELL, Sandia National Laboratories). Additionally, we construct a model based on a set of chemical reactions in the epidermal growth factor receptor pathway. For this particular application and a bistable chemical system example, we analyze and outline the advantages of our presented binomial tau-leap spatial stochastic simulation algorithm, in terms of efficiency and accuracy, in scenarios of both molecular homogeneity and heterogeneity.
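The binomial-leaping idea can be sketched for the simplest possible case, a single first-order reaction A → B (far simpler than the paper's spatial next-subvolume algorithm; rate, leap size, and populations below are illustrative): over a leap of length tau each A molecule reacts independently with probability 1 - exp(-c*tau), so the number of firings is binomial and can never exceed the available population, which is the key advantage over Poisson leaping.

```python
import numpy as np

# Binomial tau-leap for A -> B with rate c, run as an ensemble of
# independent trajectories (all parameters illustrative).
rng = np.random.default_rng(0)
c, tau, t_end = 0.5, 0.05, 4.0
n_runs, A0 = 2000, 100

n_steps = round(t_end / tau)
p = 1.0 - np.exp(-c * tau)   # per-molecule reaction probability per leap
A = np.full(n_runs, A0)
for _ in range(n_steps):
    # Binomial draw is bounded by A, so populations stay non-negative.
    A = A - rng.binomial(A, p)

mean_A = A.mean()
exact = A0 * np.exp(-c * t_end)   # mean of the exact CME solution
print(mean_A, exact)
```

For pure first-order decay this leap is exact in distribution, so the ensemble mean tracks the analytic decay curve.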
Approximate Dynamic Programming and Aerial Refueling
2007-06-01
[Search-hit fragments from the thesis: ... by two Army Air Corps de Havilland DH-4Bs; while crude by modern standards, the passing of hoses between planes is effectively the same approach ...; captions on incorporating stochastic data sets and on total cost of stochastically trained versus deterministically trained simulations; to create meaningful results when testing stochastic data, the data sets are averaged so that conclusions are not ...]
Optimal control of hydroelectric facilities
NASA Astrophysics Data System (ADS)
Zhao, Guangzhi
This thesis considers a simple yet realistic model of pump-assisted hydroelectric facilities operating in a market with time-varying but deterministic power prices. Both deterministic and stochastic water inflows are considered. The fluid mechanical and engineering details of the facility are described by a model containing several parameters. We present a dynamic programming algorithm for optimizing either the total energy produced or the total cash generated by these plants. The algorithm allows us to give the optimal control strategy as a function of time and to see how this strategy, and the associated plant value, varies with water inflow and electricity price. We investigate various cases. For a single pumped storage facility experiencing deterministic power prices and water inflows, we investigate the varying behaviour for an oversimplified constant turbine- and pump-efficiency model with simple reservoir geometries. We then generalize this simple model to include more realistic turbine efficiencies, situations with more complicated reservoir geometry, and the introduction of dissipative switching costs between various control states. We find many results which reinforce our physical intuition about this complicated system as well as results which initially challenge, though later deepen, this intuition. One major lesson of this work is that the optimal control strategy does not differ much between two differing objectives of maximizing energy production and maximizing its cash value. We then turn our attention to the case of stochastic water inflows. We present a stochastic dynamic programming algorithm which can find an on-average optimal control in the face of this randomness. As the operator of a facility must be more cautious when inflows are random, the randomness destroys facility value. Following this insight we quantify exactly how much a perfect hydrological inflow forecast would be worth to a dam operator. 
In our final chapter we discuss the challenging problem of optimizing a sequence of two hydro dams sharing the same river system. The complexity of this problem is magnified and we just scratch its surface here. The thesis concludes with suggestions for future work in this fertile area. Keywords: dynamic programming, hydroelectric facility, optimization, optimal control, switching cost, turbine efficiency.
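The backward-induction idea underlying the thesis can be sketched with a toy single-reservoir program (vastly simplified relative to the model described above; no pumping, no turbine-efficiency curve, and all prices, capacities, and inflows are made up): discretize the water level, step backward in time, and at each state pick the release that maximizes immediate revenue plus the value-to-go.

```python
import numpy as np

# Toy backward-induction DP for one reservoir.  State: water level
# 0..S_MAX; action: release 0..R_MAX units per stage; revenue =
# price[t] * release; deterministic inflow of 1 unit per stage;
# water above S_MAX spills.  All numbers illustrative.
S_MAX, R_MAX, T = 10, 3, 8
price = np.array([30, 10, 50, 20, 60, 15, 40, 55], dtype=float)
inflow = 1

V = np.zeros(S_MAX + 1)                   # terminal value
policy = np.zeros((T, S_MAX + 1), dtype=int)
for t in range(T - 1, -1, -1):
    V_new = np.empty_like(V)
    for s in range(S_MAX + 1):
        best, best_r = -np.inf, 0
        for r in range(0, min(R_MAX, s) + 1):
            s_next = min(s - r + inflow, S_MAX)   # spill above S_MAX
            val = price[t] * r + V[s_next]
            if val > best:
                best, best_r = val, r
        V_new[s] = best
        policy[t, s] = best_r
    V = V_new

# V[s] is the optimal revenue-to-go from level s at t = 0, and
# policy[t, s] is the optimal release as a function of time and state.
print(V[5], policy[0, 5])
```

Even this toy exhibits the qualitative behavior the thesis describes: the value function grows with stored water, and the policy releases hardest at price peaks.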
Hybrid photovoltaic/thermal (PV/T) solar systems simulation with Simulink/Matlab
DOE Office of Scientific and Technical Information (OSTI.GOV)
da Silva, R.M.; Fernandes, J.L.M.
The purpose of this work is the thermodynamic modeling of hybrid photovoltaic-thermal (PV/T) solar systems, pursuing a modular strategy approach provided by Simulink/Matlab. PV/T solar systems are a recently emerging solar technology that allows for the simultaneous conversion of solar energy into both electricity and heat. This type of technology presents some interesting advantages over conventional "side-by-side" thermal and PV solar systems, such as higher combined electrical/thermal energy outputs per unit area, and a more uniform and aesthetically pleasing roof area. Despite the fact that early research on PV/T systems can be traced back to the seventies, only recently has it gained renewed impetus. In this work, parametric studies and annual transient simulations of PV/T systems are undertaken in Simulink/Matlab. The obtained results show an average annual solar fraction of 67%, and a global overall efficiency of 24% (i.e. 15% thermal and 9% electrical), for a typical four-person single-family residence in Lisbon, with p-Si cells and a collector area of 6 m². A sensitivity analysis performed on the PV/T collector suggests that the most important variable to address for improving thermal performance is the photovoltaic (PV) module emittance. Based on those results, some additional improvements are proposed, such as the use of vacuum, or a noble gas at low pressure, to allow for the removal of the PV cells' encapsulation without air oxidation and degradation, thus reducing the PV module emittance. Preliminary results show that this option allows for an 8% increase in optical thermal efficiency and a substantial reduction of thermal losses, suggesting the possibility of working at higher fluid temperatures. The negative effect of higher working temperatures on electrical efficiency was negligible, due to compensation by improved optical properties.
The simulation results are compared with experimental data obtained by other authors and agree reasonably well. The Simulink modeling platform has mainly been used worldwide for simulation of control systems, digital signal processing, and electric circuits, but there are very few examples of application to solar energy systems modeling. This work uses the modular environment of Simulink/Matlab to model individual PV/T system components and to assemble the entire installation layout. The results show that the modular approach strategy provided by the Matlab/Simulink environment is applicable to solar systems modeling, providing good code scalability, faster development time, and simpler integration with external computational tools, compared with traditional imperative-oriented programming languages. (author)
CellSegm - a MATLAB toolbox for high-throughput 3D cell segmentation.
Hodneland, Erlend; Kögel, Tanja; Frei, Dominik Michael; Gerdes, Hans-Hermann; Lundervold, Arvid
2013-08-09
The application of fluorescence microscopy in cell biology often generates a huge amount of imaging data. Automated whole cell segmentation of such data enables the detection and analysis of individual cells, where a manual delineation is often time consuming, or practically not feasible. Furthermore, compared to manual analysis, automation normally has a higher degree of reproducibility. CellSegm, the software presented in this work, is a Matlab based command line software toolbox providing an automated whole cell segmentation of images showing surface stained cells, acquired by fluorescence microscopy. It has options for both fully automated and semi-automated cell segmentation. Major algorithmic steps are: (i) smoothing, (ii) Hessian-based ridge enhancement, (iii) marker-controlled watershed segmentation, and (iv) feature-based classification of cell candidates. Using a wide selection of image recordings and code snippets, we demonstrate that CellSegm has the ability to detect various types of surface stained cells in 3D. After detection and outlining of individual cells, the cell candidates can be subject to software based analysis, specified and programmed by the end-user, or they can be analyzed by other software tools. A segmentation of tissue samples with appropriate characteristics is also shown to be resolvable in CellSegm. The command-line interface of CellSegm facilitates scripting of the separate tools, all implemented in Matlab, offering a high degree of flexibility and tailored workflows for the end-user. The modularity and scripting capabilities of CellSegm enable automated workflows and quantitative analysis of microscopic data, suited for high-throughput image based screening.
Ali, Mohammad; You, Young Ae; Sur, Dipika; Kanungo, Suman; Kim, Deok Ryun; Deen, Jacqueline; Lopez, Anna Lena; Wierzba, Thomas F; Bhattacharya, Sujit K; Clemens, John D
2016-01-20
The test-negative design (TND) has emerged as a simple method for evaluating vaccine effectiveness (VE). Its utility for evaluating oral cholera vaccine (OCV) effectiveness is unknown. We examined this method's validity in assessing OCV effectiveness by comparing the results of TND analyses with those of conventional cohort analyses. Randomized controlled trials of OCV were conducted in Matlab (Bangladesh) and Kolkata (India), and an observational cohort design was used in Zanzibar (Tanzania). For all three studies, VE using the TND was estimated from the odds ratio (OR) relating vaccination status to fecal test status (Vibrio cholerae O1 positive or negative) among diarrheal patients enrolled during surveillance (VE = (1 - OR) × 100%). In cohort analyses of these studies, we employed the Cox proportional hazard model for estimating VE (= (1 - hazard ratio) × 100%). OCV effectiveness estimates obtained using the TND (Matlab: 51%, 95% CI: 37-62%; Kolkata: 67%, 95% CI: 57-75%) were similar to the cohort analyses of these RCTs (Matlab: 52%, 95% CI: 43-60%; Kolkata: 66%, 95% CI: 55-74%). The TND VE estimate for the Zanzibar data was 94% (95% CI: 84-98%) compared with 82% (95% CI: 58-93%) in the cohort analysis. After adjusting for residual confounding in the cohort analysis of the Zanzibar study, using a bias indicator condition, we observed almost no difference in the two estimates. Our findings suggest that the TND is a valid approach for evaluating OCV effectiveness in routine vaccination programs. Copyright © 2015 Elsevier Ltd. All rights reserved.
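The TND estimator reduces to one line once the 2x2 surveillance counts are in hand; a sketch with purely illustrative counts (not the trial data):

```python
def ve_test_negative(vacc_pos, unvacc_pos, vacc_neg, unvacc_neg):
    """Test-negative design: odds ratio relating vaccination status to
    test status among enrolled patients, then VE = (1 - OR) x 100%."""
    odds_ratio = (vacc_pos / unvacc_pos) / (vacc_neg / unvacc_neg)
    return (1.0 - odds_ratio) * 100.0

# hypothetical counts for illustration: OR = (30/100)/(90/150) = 0.5
print(ve_test_negative(30, 100, 90, 150))  # 50.0
```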
Evaluation and testing of image quality of the Space Solar Extreme Ultraviolet Telescope
NASA Astrophysics Data System (ADS)
Peng, Jilong; Yi, Zhong; Zhou, Shuhong; Yu, Qian; Hou, Yinlong; Wang, Shanshan
2018-01-01
For the space solar extreme ultraviolet telescope, the star point test cannot be performed in the working band (19.5 nm) because no light source is bright enough. In this paper, the point spread function of the optical system is calculated to evaluate the imaging performance of the telescope system. Combined with the actual processing surface errors, such as those from small grinding head processing and magnetorheological processing, the optical design software Zemax and the data analysis software Matlab are used to directly calculate the point spread function of the space solar extreme ultraviolet telescope. Matlab code is written to generate the required surface error grid data. These surface error data are loaded onto the specified surface of the telescope system using DDE (Dynamic Data Exchange), which connects Zemax and Matlab. Because different processing methods lead to surface errors of different size, distribution and spatial frequency, their impact on imaging also differs. Therefore, the characteristics of the surface errors of the different machining methods are studied, combined with their position in the optical system, and their influence on image quality is simulated; this is of great significance for a reasonable choice of processing technology. Additionally, we analyze the relationship between the surface error and the image quality evaluation. To ensure that the final processing of the mirror meets the image quality requirements, one or several evaluation methods should be chosen according to the different spatial frequency characteristics of the surface error.
A platform for dynamic simulation and control of movement based on OpenSim and MATLAB.
Mansouri, Misagh; Reinbolt, Jeffrey A
2012-05-11
Numerical simulations play an important role in solving complex engineering problems and have the potential to revolutionize medical decision making and treatment strategies. In this paper, we combine the rapid model-based design, control systems and powerful numerical method strengths of MATLAB/Simulink with the simulation and human movement dynamics strengths of OpenSim by developing a new interface between the two software tools. OpenSim is integrated with Simulink using the MATLAB S-function mechanism, and the interface is demonstrated using both open-loop and closed-loop control systems. While the open-loop system uses MATLAB/Simulink to separately reproduce the OpenSim Forward Dynamics Tool, the closed-loop system adds the unique feature of feedback control to OpenSim, which is necessary for most human movement simulations. An arm model example was successfully used in both open-loop and closed-loop cases. For the open-loop case, the simulation reproduced results from the OpenSim Forward Dynamics Tool with root mean square (RMS) differences of 0.03° for the shoulder elevation angle and 0.06° for the elbow flexion angle. MATLAB's variable step-size integrator reduced the time required to generate the forward dynamic simulation from 7.1 s (OpenSim) to 2.9 s (MATLAB). For the closed-loop case, a proportional-integral-derivative controller was used to successfully balance a pole on the model's hand despite random force disturbances on the pole. The new interface presented here not only integrates the OpenSim and MATLAB/Simulink software tools, but will also allow neuroscientists, physiologists, biomechanists, and physical therapists to adapt and generate new solutions as treatments for musculoskeletal conditions. Copyright © 2012 Elsevier Ltd. All rights reserved.
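The closed-loop idea — a proportional-integral-derivative controller wrapped around forward dynamics — can be sketched independently of OpenSim. The plant below is a trivial first-order integrator rather than the arm model, and the gains and step size are made-up illustrative values:

```python
def pid_step(state, error, dt, kp, ki, kd):
    """One update of a PID controller; `state` carries
    (integral, previous_error) between calls."""
    integral, prev_err = state
    integral += error * dt
    deriv = (error - prev_err) / dt
    u = kp * error + ki * integral + kd * deriv
    return u, (integral, error)

def simulate(setpoint=1.0, steps=500, dt=0.01):
    """Drive a simple first-order plant x' = u toward the setpoint."""
    x, state = 0.0, (0.0, 0.0)
    for _ in range(steps):
        u, state = pid_step(state, setpoint - x, dt, kp=5.0, ki=1.0, kd=0.1)
        x += u * dt
    return x
```

After five simulated seconds the plant output settles close to the setpoint; in the paper's setup the same loop structure balances a pole on the arm model's hand.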
Binning in Gaussian Kernel Regularization
2005-04-01
… (71.40%) on 966 randomly sampled data. Using the OSU-SVM Matlab package, the SVM trained on 966 bins has a test classification rate comparable to that of the SVM trained on 27,179 samples, but reduces the …
Dust Tsunamis, Blackouts and 50 deg C: Teaching MATLAB in East Africa
NASA Astrophysics Data System (ADS)
Trauth, M. H.
2016-12-01
MATLAB is the tool of choice when analyzing earth and environmental data from East Africa. The software and companion toolboxes help to process satellite images and digital elevation models, to detect trends, cycles, and recurrent, characteristic types of climate transitions in climate time series, and to model the hydrological balance of ancient lakes. The advantage of MATLAB is that the user can do many different types of analyses with the same software, making it very attractive for young scientists at African universities. Since 2009 we have been organizing summer schools on data analysis with various tools, including MATLAB, in Ethiopia, Kenya and Tanzania. Throughout the summer school, participants are instructed by teams of senior researchers together with young scientists, some of whom were participants of an earlier summer school. The participants are themselves integrated in teaching, depending on previous knowledge, so that the boundary between teachers and learners constantly shifts or even dissolves. I report on the extraordinarily positive experience, but also on the difficulties, of teaching data analysis methods with MATLAB in East Africa.
Messier, Erik
2016-08-01
A Multichannel Systems (MCS) microelectrode array data acquisition (DAQ) unit is used to collect multichannel electrograms (EGM) from a Langendorff-perfused rabbit heart system to study sudden cardiac death (SCD). MCS provides software through which data being processed by the DAQ unit can be displayed and saved, but this software's combined utility with MATLAB is not very effective. MCS's software stores recorded EGM data in a MathCad (MCD) format, which is then converted to a text file format. These text files are very large, so importing the EGM data into MATLAB for real-time analysis is very time consuming. Therefore, customized MATLAB software was developed to control the acquisition of data from the MCS DAQ unit and provide specific laboratory accommodations for this study of SCD. The developed DAQ unit control software will be able to accurately provide real-time display of EGM signals, record and save EGM signals in MATLAB in a desired format, and produce real-time analysis of the EGM signals, all through an intuitive GUI.
Deterministic quantum dense coding networks
NASA Astrophysics Data System (ADS)
Roy, Saptarshi; Chanda, Titas; Das, Tamoghna; Sen(De), Aditi; Sen, Ujjwal
2018-07-01
We consider the scenario of deterministic classical information transmission between multiple senders and a single receiver, when they a priori share a multipartite quantum state - an attempt towards building a deterministic dense coding network. Specifically, we prove that in the case of two or three senders and a single receiver, generalized Greenberger-Horne-Zeilinger (gGHZ) states are not beneficial for sending classical information deterministically beyond the classical limit, except when the shared state is the GHZ state itself. On the other hand, three- and four-qubit generalized W (gW) states with specific parameters as well as the four-qubit Dicke states can provide a quantum advantage of sending the information in deterministic dense coding. Interestingly however, numerical simulations in the three-qubit scenario reveal that the percentage of states from the GHZ-class that are deterministic dense codeable is higher than that of states from the W-class.
Characterizing and Quantifying Time Dependent Night Sky Brightness In and Around Tucson, Arizona
NASA Astrophysics Data System (ADS)
Nydegger, Rachel
2014-01-01
As part of a Research Experience for Undergraduates (REU) program with the National Optical Astronomy Observatory (NOAO), I (with mentor Dr. Constance Walker of NOAO) characterized light pollution in and near Tucson, Arizona using eight Sky Quality Meters (SQMs). In order to analyze the data in a consistent way for comparison, we created a standard procedure for reduction and analysis using python and MATLAB. The series of python scripts remove faulty data and examine specifically anthropogenic light pollution by excluding contributions made by the sun, moon, and the Milky Way. We then use MATLAB codes to illustrate how the light pollution changes in relation to time, distance from the city, and airglow. Data are then analyzed by a recently developed sky brightness model created by Dan Duriscoe of the National Park Service. To quantify the measurements taken by SQMs, we tested the wavelength sensitivity of the devices used for the data collection. The findings from the laboratory testing have prompted innovations for the SQMs as well as given a sense of how data gathered by these devices should be treated.
Lawhern, Vernon; Hairston, W David; Robbins, Kay
2013-01-01
Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration. PMID:23638169
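DETECT itself trains a classifier on labelled windows; as a dependency-free illustration of the event-interval idea only, the sketch below scores fixed windows against a global baseline and merges flagged windows into (start, end) sample intervals. The windowing scheme and threshold are assumptions for illustration, not the toolbox's method:

```python
def detect_events(signal, window, thresh):
    """Flag non-overlapping windows whose mean absolute deviation from
    the global baseline exceeds `thresh`, then merge consecutive
    flagged windows into (start, end) intervals."""
    baseline = sum(signal) / len(signal)  # crude baseline estimate
    flags = []
    for i in range(0, len(signal) - window + 1, window):
        seg = signal[i:i + window]
        score = sum(abs(v - baseline) for v in seg) / window
        flags.append(score > thresh)
    events, start = [], None
    for k, f in enumerate(flags):
        if f and start is None:
            start = k * window
        elif not f and start is not None:
            events.append((start, k * window))
            start = None
    if start is not None:
        events.append((start, len(flags) * window))
    return events
```

For a flat signal with a burst of alternating spikes in the middle, the function returns a single interval covering the burst.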
DOE Office of Scientific and Technical Information (OSTI.GOV)
Günay, E.
In this study, the modulus of elasticity and shear modulus values of single-walled carbon nanotubes (SWCNTs) were modelled by using both the finite element method and Matlab code. Initially, cylindrical armchair and zigzag single-walled 3D space frames were demonstrated as carbon nanostructures. Thereafter, macro programs were written in Matlab to produce the space truss for the zigzag and armchair models. The 3D space frames were introduced to the ANSYS software, and then tension, compression and torsion tests were performed on zigzag and armchair carbon nanotubes with the BEAM4 element to obtain the elastic and shear modulus values. In this study, two different boundary conditions were tested and used especially in torsion loading. The equivalent shear modulus data were found by averaging the corresponding values obtained from ten different nodal points on the nanotube path. Finally, it was determined that the elastic constant values changed proportionally with increasing carbon nanotube diameter up to a certain level, but beyond this level the values remained stable.
Remote sensing image segmentation based on Hadoop cloud platform
NASA Astrophysics Data System (ADS)
Li, Jie; Zhu, Lingling; Cao, Fubin
2018-01-01
To address the slow speed and poor real-time performance of remote sensing image segmentation, this paper studies a segmentation method based on the Hadoop platform. On the basis of analyzing the structural characteristics of the Hadoop cloud platform and its MapReduce programming component, this paper proposes an image segmentation method combining OpenCV with the Hadoop cloud platform. Firstly, the MapReduce image processing model of the Hadoop cloud platform is designed, the image input and output are customized, and the splitting method for the data file is rewritten. Then the Mean Shift image segmentation algorithm is implemented. Finally, a segmentation experiment is performed on a remote sensing image and compared with the same experiment using a MATLAB implementation of the Mean Shift algorithm. The experimental results show that, while maintaining good segmentation quality, the segmentation rate on the Hadoop cloud platform is greatly improved compared with the standalone MATLAB implementation, and the effectiveness of image segmentation also improves considerably.
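The Mean Shift step at the core of the method can be illustrated in one dimension with a flat kernel — a sketch of the algorithm itself, not the paper's OpenCV/Hadoop implementation:

```python
def mean_shift_1d(points, bandwidth, iters=50):
    """Shift each point to the mean of its neighbours within
    `bandwidth` until it settles on a local density mode
    (flat-kernel mean shift); points sharing a mode form a cluster."""
    modes = []
    for p in points:
        x = float(p)
        for _ in range(iters):
            neigh = [q for q in points if abs(q - x) <= bandwidth]
            x = sum(neigh) / len(neigh)
        modes.append(round(x, 6))
    return modes
```

Two well-separated groups of values collapse onto two modes; in image segmentation the same iteration runs over pixel feature vectors (colour, position) instead of scalars.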
DOE Office of Scientific and Technical Information (OSTI.GOV)
Latanision, R.M.
1990-12-01
Electrochemical corrosion is pervasive in virtually all engineering systems and in virtually all industrial circumstances. Although engineers now understand how to design systems to minimize corrosion in many instances, many fundamental questions remain poorly understood and, therefore, the development of corrosion control strategies is based more on empiricism than on a deep understanding of the processes by which metals corrode in electrolytes. Fluctuations in potential, or current, in electrochemical systems have been observed for many years. To date, all investigations of this phenomenon have utilized non-deterministic analyses. In this work it is proposed to study electrochemical noise from a deterministic viewpoint by comparison of experimental parameters, such as first and second order moments (non-deterministic), with computer simulation of corrosion at metal surfaces. In this way it is proposed to analyze the origins of these fluctuations and to elucidate the relationship between these fluctuations and kinetic parameters associated with metal dissolution and cathodic reduction reactions. This research program addresses in essence two areas of interest: (a) computer modeling of corrosion processes in order to study the electrochemical processes on an atomistic scale, and (b) experimental investigations of fluctuations in electrochemical systems and correlation of experimental results with computer modeling. In effect, the noise generated by mathematical modeling will be analyzed and compared to experimental noise in electrochemical systems. 1 fig.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bianchini, G.; Burgio, N.; Carta, M.
The GUINEVERE experiment (Generation of Uninterrupted Intense Neutrons at the lead Venus Reactor) is an experimental program in support of the ADS technology presently carried out at SCK-CEN in Mol (Belgium). In the experiment a modified lay-out of the original thermal VENUS critical facility is coupled to an accelerator, built by the French body CNRS in Grenoble, working in both continuous and pulsed mode and delivering 14 MeV neutrons by bombardment of deuterons on a tritium-target. The modified lay-out of the facility consists of a fast subcritical core made of 30% U-235 enriched metallic Uranium in a lead matrix. Several off-line and on-line reactivity measurement techniques will be investigated during the experimental campaign. This report is focused on the simulation by deterministic (ERANOS French code) and Monte Carlo (MCNPX US code) calculations of three reactivity measurement techniques, Slope ({alpha}-fitting), Area-ratio and Source-jerk, applied to a GUINEVERE subcritical configuration (namely SC1). The inferred reactivity, in dollar units, by the Area-ratio method shows an overall agreement between the two deterministic and Monte Carlo computational approaches, whereas the MCNPX Source-jerk results are affected by large uncertainties and allow only partial conclusions about the comparison. Finally, no particular spatial dependence of the results is observed in the case of the GUINEVERE SC1 subcritical configuration. (authors)
Impact of Periodic Unsteadiness on Performance and Heat Load in Axial Flow Turbomachines
NASA Technical Reports Server (NTRS)
Sharma, Om P.; Stetson, Gary M.; Daniels, William A.; Greitzer, Edward M.; Blair, Michael F.; Dring, Robert P.
1997-01-01
Results of an analytical and experimental investigation, directed at understanding the impact of periodic unsteadiness on the time-averaged flows in axial flow turbomachines, are presented. Analysis of available experimental data from a large-scale rotating rig (LSRR, a low-speed rig) shows that in the time-averaged axisymmetric equations the magnitudes of the terms representing the effect of periodic unsteadiness (deterministic stresses) are as large as or larger than those due to random unsteadiness (turbulence). Numerical experiments, conducted to highlight physical mechanisms associated with the migration of combustor-generated hot streaks in turbine rotors, indicated that the effect can be simulated by accounting for deterministic stress-like terms in the time-averaged mass and energy conservation equations. The experimental portion of this program shows that the aerodynamic loss for the second stator in a 1-1/2 stage turbine is influenced by the axial spacing between the second stator leading edge and the rotor trailing edge. However, the axial spacing has little impact on the heat transfer coefficient. These performance changes are believed to be associated with the change in deterministic stress at the inlet to the second stator. Data were also acquired to quantify the impact of indexing the first stator relative to the second stator. For the range of parameters examined, this effect was found to be of the same order as the effect of axial spacing.
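The distinction between deterministic (periodic, blade-passing-locked) and random (turbulent) stresses comes from the triple decomposition of an unsteady signal into a time mean, a phase-locked periodic part, and a random remainder. A minimal numerical sketch with a toy sampled signal rather than LSRR data:

```python
def decompose(signal, period):
    """Triple decomposition of a sampled signal: time mean, periodic
    (phase-averaged) part, and random remainder. The 'deterministic
    stress' here is the mean square of the periodic part, the 'random
    stress' the mean square of the remainder."""
    n = len(signal)
    mean = sum(signal) / n
    # phase average: mean of samples sharing the same phase index
    phase, counts = [0.0] * period, [0] * period
    for i, v in enumerate(signal):
        phase[i % period] += v
        counts[i % period] += 1
    phase = [p / c for p, c in zip(phase, counts)]
    periodic = [phase[i % period] - mean for i in range(n)]
    randpart = [v - mean - periodic[i] for i, v in enumerate(signal)]
    det_stress = sum(p * p for p in periodic) / n
    rand_stress = sum(r * r for r in randpart) / n
    return det_stress, rand_stress
```

A purely periodic signal yields all of its fluctuation energy as deterministic stress and none as random stress, which is the limiting case the abstract's comparison refers to.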
2013-06-01
GNU Radio is a software development toolkit that provides signal processing blocks to drive the SDR. GNU Radio has many strong points – it is actively maintained with a large user base, new capabilities are constantly being added, and compiled C code is fast for many real-time applications such as … Its programming interface (API) makes learning the architecture a daunting task, even for the experienced software developer. This requirement poses many …
Extraction of CT dose information from DICOM metadata: automated Matlab-based approach.
Dave, Jaydev K; Gingold, Eric L
2013-01-01
The purpose of this study was to extract exposure parameters and dose-relevant indexes of CT examinations from information embedded in DICOM metadata. DICOM dose report files were identified and retrieved from a PACS. An automated software program was used to extract, from the structured elements in the DICOM metadata of these files, the information relevant to exposure. Extracting information from DICOM metadata eliminated potential errors inherent in techniques based on optical character recognition, yielding 100% accuracy.
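A real implementation would read the structured report with a DICOM library; as a library-free illustration of the recursive walk over structured content, the sketch below uses plain dicts whose 'name'/'value'/'children' keys are an assumed simplification of the real coded content items, not the DICOM data model itself:

```python
def extract_ctdi(content_items):
    """Collect every 'Mean CTDIvol' measurement from a (simplified)
    dose-report content tree. Each item is assumed to be a dict with
    'name', optional 'value', and optional nested 'children'."""
    found = []
    for item in content_items:
        if item.get("name") == "Mean CTDIvol":
            found.append(float(item["value"]))
        found.extend(extract_ctdi(item.get("children", [])))
    return found

# toy report: two acquisitions, each carrying dose-relevant items
report = [
    {"name": "CT Acquisition", "children": [
        {"name": "Mean CTDIvol", "value": "12.5"}]},
    {"name": "CT Acquisition", "children": [
        {"name": "Mean CTDIvol", "value": "8.0"},
        {"name": "DLP", "value": "320"}]},
]
```

The same depth-first pattern extends to any other structured element (DLP, exposure time) by matching on its coded name.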
United States Air Force Summer Faculty Research Program. Management Report. Volume 3
1988-12-01
xyz values are basically correct, we plotted the perspective view of a target using the xyz values with MATLAB (a matrix-based mathematics software) … initially included all the pixels of the image in calculating votes for each accumulator. Two target types, the M-60 tank and the fuel truck, were used … J.F., Gouin, H. and Gaviglio, J., "Evolution of the Reynolds Stress Tensor in a Shock Wave-Turbulence Interaction," Indian Journal of Technology, Vol
Spacecraft Guidance, Navigation, and Control Visualization Tool
NASA Technical Reports Server (NTRS)
Mandic, Milan; Acikmese, Behcet; Blackmore, Lars
2011-01-01
G-View is a 3D visualization tool for supporting spacecraft guidance, navigation, and control (GN&C) simulations relevant to small-body exploration and sampling (see figure). The tool is developed in MATLAB using Virtual Reality Toolbox and provides users with the ability to visualize the behavior of their simulations, regardless of which programming language (or machine) is used to generate simulation results. The only requirement is that multi-body simulation data is generated and placed in the proper format before applying G-View.
Hyperspectral imaging in medicine: image pre-processing problems and solutions in Matlab.
Koprowski, Robert
2015-11-01
The paper presents problems and solutions related to hyperspectral image pre-processing. New methods of preliminary image analysis are proposed. The paper shows problems occurring in Matlab when trying to analyse this type of images. Moreover, new methods are discussed which provide the source code in Matlab that can be used in practice without any licensing restrictions. The proposed application and a sample result of hyperspectral image analysis are also presented. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
2012-01-01
Schematic of the generator and power converters built in PLECS. Figure 3.12: Block diagram of the MPPT control built in Matlab/Simulink … validated by simulation results done in Matlab/Simulink R2010a and PLECS. Figure 3.9 shows the block diagram of the hydrokinetic system built in Matlab … rectifier, boost converter and battery model built in PLECS. The battery bank on the load side is simulated by a constant dc voltage source.
An approach to modeling and optimization of integrated renewable energy system (ires)
NASA Astrophysics Data System (ADS)
Maheshwari, Zeel
The purpose of this study was to cost-optimize the electrical part of IRES (Integrated Renewable Energy Systems) using HOMER and to maximize the utilization of resources using MATLAB programming. IRES is an effective and viable strategy that can be employed to harness renewable energy resources to energize remote rural areas of developing countries. The resource-need matching, which is the basis for IRES, makes it possible to provide energy in an efficient and cost-effective manner. Modeling and optimization of IRES for a selected study area makes IRES more advantageous when compared to hybrid concepts. A remote rural area with a population of 700 in 120 households and 450 cattle is considered as an example for cost analysis and optimization. Mathematical models for key components of IRES, such as the biogas generator, hydropower generator, wind turbine, PV system and battery banks, are developed. A discussion of the size of the required water reservoir is also presented. Modeling of IRES on the basis of need-to-resource and resource-to-need matching is pursued to help in the optimum use of resources for the needs. Fixed resources such as biogas and water are used in prioritized order, whereas movable resources such as wind and solar can be used simultaneously for different priorities. IRES is cost-optimized for the electricity demand using the HOMER software developed by NREL (National Renewable Energy Laboratory). HOMER optimizes the configuration for the electrical demand only and does not consider other demands, such as biogas for cooking and water for domestic and irrigation purposes. Hence an optimization program based on the need-resource modeling of IRES is implemented in MATLAB, optimizing the utilization of resources for several needs. Results obtained from MATLAB clearly show that the available resources can fulfill the demand of the rural areas. Introduction of IRES in rural communities has many socio-economic implications.
It brings about improvement in the living environment and community welfare by supplying basic needs such as biogas for cooking, water for domestic and irrigation purposes, and electrical energy for lighting, communication, cold storage, educational and small-scale industrial purposes.
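The resource-need matching idea — serve prioritized needs from the available resources in order, spilling over to the next resource when one is exhausted — can be sketched as a greedy allocation. The resource names, magnitudes and priority order below are invented for illustration; this is not the thesis's MATLAB program:

```python
def allocate(needs, resources, priority):
    """Serve each need (in priority order) from the resource pool,
    drawing resources in their listed order and spilling to the next
    when one runs out. Mutates `resources` to the remaining amounts."""
    served = {}
    for need in priority:
        demand = needs[need]
        got = 0.0
        for res in resources:
            take = min(demand - got, resources[res])
            resources[res] -= take
            got += take
            if got >= demand:
                break
        served[need] = got
    return served
```

With a hypothetical pool of 6 units of biogas and 4 of solar, cooking (priority 1, demand 5) drains biogas first, and lighting (demand 3) takes the leftover biogas plus solar.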
EASAMS' Ariane 5 on-board software experience
NASA Astrophysics Data System (ADS)
Birnie, Steven Andrew
The design and development of the prototype flight software for the Ariane 5 satellite launch vehicle is considered. This was specified as being representative of the eventual real flight program in terms of timing constraints and target computer loading. The usability of HOOD (Hierarchical Object Oriented Design) and Ada for development of such preemptive multitasking computer programs was verified. Features of the prototype development included: design methods supplementary to HOOD for representation of concurrency aspects; visibility of Ada enumerated type literals across HOOD parent-child interfaces; deterministic timings achieved by modification of Ada delays; and linking of interrupts to Ada task entries.
Active adaptive management for reintroduction of an animal population
Runge, Michael C.
2013-01-01
Captive animals are frequently reintroduced to the wild in the face of uncertainty, but that uncertainty can often be reduced over the course of the reintroduction effort, providing the opportunity for adaptive management. One common uncertainty in reintroductions is the short-term survival rate of released adults (a release cost), an important factor because it can affect whether releasing adults or juveniles is better. Information about this rate can improve the success of the reintroduction program, but does the expected gain offset the costs of obtaining the information? I explored this question for reintroduction of the griffon vulture (Gyps fulvus) by framing the management question as a belief Markov decision process, characterizing uncertainty about release cost with 2 information state variables, and finding the solution using stochastic dynamic programming. For a reintroduction program of fixed length (e.g., 5 years of releases), the optimal policy in the final release year resembles the deterministic solution: release either all adults or all juveniles depending on whether the point estimate for the survival rate in question is above or below a specific threshold. But the optimal policy in the earlier release years 1) includes release of a mixture of juveniles and adults under some circumstances, and 2) recommends release of adults even when the point estimate of survival is much less than the deterministic threshold. These results show that in an iterated decision setting, the optimal decision in early years can be quite different from that in later years because of the value of learning.
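The structure described — a belief-state dynamic program whose two information states are pseudo-counts summarising what has been learned about adult release survival — can be sketched with backward induction. The payoff numbers and horizon are illustrative assumptions, not the paper's vulture parameters; what the sketch reproduces is the qualitative result that adults can be optimal early even at the myopic break-even belief:

```python
from functools import lru_cache

R_ADULT, R_JUV = 2.0, 1.0   # illustrative payoffs per released bird

@lru_cache(maxsize=None)
def value(year, horizon, s, f):
    """Optimal expected payoff from `year` onward, with the belief
    about adult post-release survival summarised by pseudo-counts
    (s, f) -- the two information state variables."""
    if year == horizon:
        return 0.0
    p = s / (s + f)  # current point estimate of adult survival
    # juveniles: known payoff, belief unchanged (nothing is learned)
    v_juv = R_JUV + value(year + 1, horizon, s, f)
    # adults: expected payoff p*R_ADULT, and the outcome updates the belief
    v_adult = (p * (R_ADULT + value(year + 1, horizon, s + 1, f))
               + (1 - p) * value(year + 1, horizon, s, f + 1))
    return max(v_juv, v_adult)

def best_action(year, horizon, s, f):
    """Ties break toward the known option (juveniles)."""
    p = s / (s + f)
    v_juv = R_JUV + value(year + 1, horizon, s, f)
    v_adult = (p * (R_ADULT + value(year + 1, horizon, s + 1, f))
               + (1 - p) * value(year + 1, horizon, s, f + 1))
    return "adults" if v_adult > v_juv else "juveniles"
```

With pseudo-counts (1, 1) the survival estimate is 0.5, exactly the myopic break-even point for these payoffs: the final release year picks juveniles, but an earlier year picks adults because the survival information improves later decisions.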
MatLab program for precision calibration of optical tweezers
NASA Astrophysics Data System (ADS)
Tolić-Nørrelykke, Iva Marija; Berg-Sørensen, Kirstine; Flyvbjerg, Henrik
2004-06-01
Optical tweezers are used as force transducers in many types of experiments. The force they exert in a given experiment is known only after a calibration. Computer codes that calibrate optical tweezers with high precision and reliability in the (x, y)-plane orthogonal to the laser beam axis were written in MatLab (MathWorks Inc.) and are presented here. The calibration is based on the power spectrum of the Brownian motion of a dielectric bead trapped in the tweezers. Precision is achieved by accounting for a number of factors that affect this power spectrum. First, cross-talk between channels in 2D position measurements is tested for, and eliminated if detected. Then, the Lorentzian power spectrum that results from the Einstein-Ornstein-Uhlenbeck theory is fitted to the low-frequency part of the experimental spectrum in order to obtain an initial guess for parameters to be fitted. Finally, a more complete theory is fitted, a theory that optionally accounts for the frequency dependence of the hydrodynamic drag force and hydrodynamic interaction with a nearby cover slip, for effects of finite sampling frequency (aliasing), for effects of anti-aliasing filters in the data acquisition electronics, and for unintended "virtual" filtering caused by the position detection system. Each of these effects can be left out or included as the user prefers, with user-defined parameters. Several tests are applied to the experimental data during calibration to ensure that the data comply with the theory used for their interpretation: Independence of x- and y-coordinates, Hooke's law, exponential distribution of power spectral values, uncorrelated Gaussian scatter of residual values. Results are given with statistical errors and covariance matrix. Program summary Title of program: tweezercalib Catalogue identifier: ADTV Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland.
Program Summary URL:http://cpc.cs.qub.ac.uk/summaries/ADTV Computer for which the program is designed and others on which it has been tested: General computer running MatLab (MathWorks Inc.). Programming language used: MatLab (MathWorks Inc.). Uses "Optimization Toolbox" and "Statistics Toolbox". Memory required to execute with typical data: Of order 4 times the size of the data file. High speed storage required: None No. of lines in distributed program, including test data, etc.: 133 183 No. of bytes in distributed program, including test data, etc.: 1 043 674 Distribution format: tar gzip file Nature of physical problem: Calibrate optical tweezers with precision by fitting theory to experimental power spectrum of position of bead doing Brownian motion in incompressible fluid, possibly near microscope cover slip, while trapped in optical tweezers. Thereby determine spring constant of optical trap and conversion factor for arbitrary-units-to-nanometers for detection system. Method of solution: Elimination of cross-talk between quadrant photo-diode's output channels for positions (optional). Check that distribution of recorded positions agrees with Boltzmann distribution of bead in harmonic trap. Data compression and noise reduction by blocking method applied to power spectrum. Full accounting for hydrodynamic effects: Frequency-dependent drag force and interaction with nearby cover slip (optional). Full accounting for electronic filters (optional), for "virtual filtering" caused by detection system (optional). Full accounting for aliasing caused by finite sampling rate (optional). Standard non-linear least-squares fitting. Statistical support for fit is given, with several plots suitable for inspection of consistency and quality of data and fit. Restrictions on the complexity of the problem: Data should be positions of bead doing Brownian motion while held by optical tweezers. 
For high precision in final results, data should be a time series measured over a long time, with a sufficiently high experimental sampling rate: the sampling rate should be well above the characteristic frequency of the trap, the so-called corner frequency. Thus, the sampling frequency should typically be larger than 10 kHz. The Fast Fourier Transform applied requires the time series to contain 2^n data points, and a long measurement time is obtained with n = 12-15 or more. Finally, the optics should be set to ensure a harmonic trapping potential in the range of positions visited by the bead. The fitting procedure checks for a harmonic potential. Typical running time: (Tens of) minutes Unusual features of the program: None References: The theoretical underpinnings for the procedure are found in [K. Berg-Sørensen, H. Flyvbjerg, Rev. Sci. Instrum. 75 (3) (2004) 594].
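The core of the calibration described above is fitting a Lorentzian, P(f) = D / (2π²(f_c² + f²)), to the measured power spectrum to extract the corner frequency f_c. The following is a minimal Python sketch of that one step (not the tweezercalib code, which is MATLAB and far more complete); the parameter values and the block size are invented for the example.

```python
# Hedged sketch: recover the corner frequency of an optical trap by
# fitting a Lorentzian to a noisy power spectrum. Power-spectral values
# of a sampled Ornstein-Uhlenbeck process scatter exponentially around
# the Lorentzian mean; block-averaging makes the scatter near-Gaussian,
# as the tweezercalib paper describes.
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(f, D, fc):
    return D / (2.0 * np.pi**2 * (fc**2 + f**2))

rng = np.random.default_rng(0)
f = np.linspace(10.0, 5000.0, 2000)      # Hz; spans the corner frequency
true_D, true_fc = 0.5, 600.0             # illustrative values only
P = lorentzian(f, true_D, true_fc) * rng.exponential(1.0, size=f.size)

nb = 20                                   # block size for averaging
fb = f[: f.size // nb * nb].reshape(-1, nb).mean(axis=1)
Pb = P[: P.size // nb * nb].reshape(-1, nb).mean(axis=1)

popt, _ = curve_fit(lorentzian, fb, Pb, p0=[1.0, 300.0])
D_fit, fc_fit = popt
```

The real program additionally weights the fit, accounts for aliasing, filters and hydrodynamics, and reports statistical errors; none of that is attempted here.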
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lai, Jih-Sheng
This paper introduces the control-system-design software packages SIMNON and MATLAB/SIMULINK for power electronics system simulation. A complete power electronics system typically consists of a rectifier bridge along with its smoothing capacitor, an inverter, and a motor. The system components, which may be discrete or continuous and linear or nonlinear, are modeled by mathematical equations. Inverter control methods, such as pulse-width modulation and hysteresis current control, are expressed in either computer algorithms or digital circuits. After describing component models and control methods, computer programs are then developed for complete system simulation. Simulation results are mainly used for studying system performance, such as input and output current harmonics, torque ripples, and speed responses. Key computer programs and simulation results are demonstrated for educational purposes.
Constrained multiple indicator kriging using sequential quadratic programming
NASA Astrophysics Data System (ADS)
Soltani-Mohammadi, Saeed; Erhan Tercan, A.
2012-11-01
Multiple indicator kriging (MIK) is a nonparametric method used to estimate conditional cumulative distribution functions (CCDFs). Indicator estimates produced by MIK may not satisfy the order relations of a valid CCDF, which is ordered and bounded between 0 and 1. In this paper a new method is presented that guarantees the order relations of the cumulative distribution functions estimated by multiple indicator kriging. The method is based on minimizing the sum of kriging variances for each cutoff under unbiasedness and order-relation constraints, and solving the constrained indicator kriging system by sequential quadratic programming. A computer code is written in the Matlab environment to implement the developed algorithm, and the method is applied to thickness data.
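The order-relation constraint above requires the per-cutoff estimates to form a nondecreasing sequence in [0, 1]. The paper enforces this jointly with the kriging system via SQP; as a much simpler illustration of the constraint itself, the sketch below repairs a raw set of indicator estimates with the pool-adjacent-violators algorithm (isotonic regression) plus clipping. The numbers are invented.

```python
# Hedged sketch: project raw per-cutoff MIK estimates onto a valid
# (nondecreasing, [0,1]-bounded) CCDF. This is NOT the paper's SQP
# method, only a minimal illustration of the order relations.
def repair_ccdf(values):
    """Pool-adjacent-violators: merge blocks while monotonicity fails."""
    merged = []
    for v in values:
        merged.append([v, 1.0])                 # [block mean, block weight]
        while len(merged) > 1 and merged[-2][0] > merged[-1][0]:
            m2, m1 = merged.pop(), merged.pop()
            w = m1[1] + m2[1]
            merged.append([(m1[0] * m1[1] + m2[0] * m2[1]) / w, w])
    out = []
    for mean, w in merged:
        out.extend([min(1.0, max(0.0, mean))] * int(w))  # bound in [0,1]
    return out

raw = [0.10, 0.35, 0.30, 0.60, 0.55, 1.05]      # violates order relations
fixed = repair_ccdf(raw)
```

Each violating pair is replaced by its weighted mean, which is the least-squares projection onto the monotone cone; the SQP formulation instead folds these constraints into the kriging-variance minimization.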
DEM interpolation weight calculation modulus based on maximum entropy
NASA Astrophysics Data System (ADS)
Chen, Tian-wei; Yang, Xia
2015-12-01
Traditional interpolation of gridded DEMs can produce negative weights. In this article, the principle of maximum entropy is used to analyze the model system on which the spatial weight modulus depends. The negative-weight problem of DEM interpolation is addressed by building a maximum entropy model and adding non-negativity and first- and second-order moment constraints, which solves the negative-weight problem. The correctness and accuracy of the method were validated against a genetic algorithm implemented in a MATLAB program. The method is compared with the Yang Chizhong interpolation method and with quadratic programming. The comparison shows that the magnitude and scaling of the maximum-entropy weights fit the spatial relations, and that the accuracy is superior to that of the latter two methods.
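Maximum-entropy weights of the kind described above are non-negative by construction: maximizing entropy subject to normalization and a moment constraint yields weights of exponential form, w_i ∝ exp(-λ d_i). The sketch below (a generic illustration, not the article's model, with invented node "distances") finds λ by bisection so that the first-moment constraint is met.

```python
# Hedged sketch: max-entropy interpolation weights. Maximizing
# -sum(w*log w) subject to sum(w)=1 and sum(w*d)=target gives
# w_i ∝ exp(-lam*d_i); non-negativity is automatic, which is the
# point of the max-entropy formulation.
import math

def weights_for(lam, d):
    t = [-lam * x for x in d]
    m = max(t)                       # shift exponents for stability
    e = [math.exp(ti - m) for ti in t]
    s = sum(e)
    return [ei / s for ei in e]

def maxent_weights(d, target, lo=-50.0, hi=50.0, iters=200):
    """Bisect on lam: the constrained mean decreases as lam increases."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        w = weights_for(mid, d)
        mean = sum(wi * x for wi, x in zip(w, d))
        if mean > target:
            lo = mid
        else:
            hi = mid
    return weights_for(0.5 * (lo + hi), d)

# hypothetical node distances and a desired first moment
w = maxent_weights([1.0, 2.0, 3.0, 4.0], 2.0)
```

Second-order moment constraints, as in the article, add a second multiplier and a quadratic term in the exponent; the one-constraint case shown keeps the bisection trivially monotone.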
Research on an augmented Lagrangian penalty function algorithm for nonlinear programming
NASA Technical Reports Server (NTRS)
Frair, L.
1978-01-01
The augmented Lagrangian (ALAG) Penalty Function Algorithm for optimizing nonlinear mathematical models is discussed. The mathematical models of interest are deterministic in nature and finite dimensional optimization is assumed. A detailed review of penalty function techniques in general and the ALAG technique in particular is presented. Numerical experiments are conducted utilizing a number of nonlinear optimization problems to identify an efficient ALAG Penalty Function Technique for computer implementation.
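The ALAG scheme alternates an unconstrained minimization of f(x) + λh(x) + (μ/2)h(x)² with a multiplier update λ ← λ + μh(x). The sketch below applies it to a toy equality-constrained problem (minimize x₁² + x₂² subject to x₁ + x₂ = 1, optimum (0.5, 0.5), multiplier -1); the problem and step sizes are illustrative, not from the report's test set.

```python
# Hedged sketch of the augmented-Lagrangian (ALAG) penalty method on a
# toy problem: min x1^2 + x2^2  s.t.  x1 + x2 = 1.
def alag_solve(mu=10.0, outer=20, inner=2000, step=1e-3):
    x = [0.0, 0.0]
    lam = 0.0
    for _ in range(outer):
        for _ in range(inner):              # inner unconstrained descent
            h = x[0] + x[1] - 1.0
            g = [2 * x[0] + lam + mu * h,   # gradient of the ALAG function
                 2 * x[1] + lam + mu * h]
            x = [x[0] - step * g[0], x[1] - step * g[1]]
        lam += mu * (x[0] + x[1] - 1.0)     # multiplier update
    return x, lam

x, lam = alag_solve()
```

Unlike a pure penalty method, the multiplier update lets μ stay moderate while the constraint violation is driven to zero, avoiding the ill-conditioning of μ → ∞; that trade-off is exactly what the report's numerical experiments probe.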
LAVA: Large scale Automated Vulnerability Addition
2016-05-23
memory copy, e.g., are reasonable attack points. If the goal is to inject divide-by-zero, then arithmetic operations involving division will be...ways. First, it introduces deterministic record and replay, which can be used for iterated and expensive analyses that cannot be performed online... memory. Since our approach records the correspondence between source lines and program basic block execution, it would be just as easy to figure out
NASA Technical Reports Server (NTRS)
Gordon, T. E.
1995-01-01
The mirror assembly of the AXAF observatory consists of four concentric, confocal, Wolter type 1 telescopes. Each telescope includes two conical grazing incidence mirrors, a paraboloid followed by a hyperboloid. Fabrication of these state-of-the-art optics is now complete, with predicted performance that surpasses the goals of the program. The fabrication of these optics, whose size and requirements exceed those of any previous x-ray mirrors, presented a challenging task requiring the use of precision engineering in many different forms. Virtually all of the equipment used for this effort required precision engineering. Accurate metrology required deterministic support of the mirrors in order to model the gravity distortions which will not be present on orbit. The primary axial instrument, known as the Precision Metrology Station (PMS), was a unique scanning Fizeau interferometer. After metrology was complete, the optics were placed in specially designed Glass Support Fixtures (GSFs) for installation on the Automated Cylindrical Grinder/Polishers (ACG/Ps). The GSFs were custom molded for each mirror element to match the shape of the outer surface to minimize distortions of the inner surface. The final performance of the telescope is expected to far exceed the original goals and expectations of the program.
Automatic design of synthetic gene circuits through mixed integer non-linear programming.
Huynh, Linh; Kececioglu, John; Köppe, Matthias; Tagkopoulos, Ilias
2012-01-01
Automatic design of synthetic gene circuits poses a significant challenge to synthetic biology, primarily due to the complexity of biological systems and the lack of rigorous optimization methods that can cope with the combinatorial explosion as the number of biological parts increases. Current optimization methods for synthetic gene design rely on heuristic algorithms that are usually not deterministic, deliver sub-optimal solutions, and provide no guarantees on convergence or error bounds. Here, we introduce an optimization framework for the problem of part selection in synthetic gene circuits that is based on mixed integer non-linear programming (MINLP), a deterministic method that finds the globally optimal solution and guarantees convergence in finite time. Given a synthetic gene circuit, a library of characterized parts, and user-defined constraints, our method can find the optimal selection of parts that satisfies the constraints and best approximates the objective function given by the user. We evaluated the proposed method in the design of three synthetic circuits (a toggle switch, a transcriptional cascade, and a band detector), with both experimentally constructed and synthetic promoter libraries. Scalability and robustness analysis shows that the proposed framework scales well with the library size and the solution space. The work described here is a step towards a unifying, realistic framework for the automated design of biological circuits.
Development of an Operational TS Dataset Production System for the Data Assimilation System
NASA Astrophysics Data System (ADS)
Kim, Sung Dae; Park, Hyuk Min; Kim, Young Ho; Park, Kwang Soon
2017-04-01
An operational TS (temperature and salinity) dataset production system was developed to provide near-real-time data to the data assimilation system periodically. It collects the latest 15 days' TS data for the northwestern Pacific area (20°N - 55°N, 110°E - 150°E), applies QC tests to the archived data and supplies them to numerical prediction models of KIOST (Korea Institute of Ocean Science and Technology). The latest real-time TS data are collected from the Argo GDAC and the GTSPP data server every week. Argo data are downloaded from the /latest_data directory of the Argo GDAC. Because many duplicated data exist when all profile data are extracted from all Argo netCDF files, a database (DB) system is used to avoid duplication. All metadata (float ID, location, observation date and time, etc.) of all Argo floats are stored in the DB system, and a Matlab program was developed to manipulate the DB data, check for duplication and exclude duplicated data. GTSPP data are downloaded from the /realtime directory of the GTSPP data service. The latest data except Argo data are extracted from the original data. Another Matlab program was coded to inspect all collected data using 10 QC tests and produce the final dataset, which can be used by the assimilation system. Three regional range tests to inspect annual, seasonal and monthly variations are included in the QC procedures. A C program was developed to provide regional ranges to data managers. It can calculate the upper and lower limits of temperature and salinity at depths from 0 to 1550 m. The final TS dataset contains the latest 15 days' TS data in netCDF format. It is updated every week and transmitted to numerical modelers at KIOST for operational use.
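The regional range tests described above compare each observation against depth-dependent limits. A minimal Python sketch of such a check is below; the limit table, profile values and flag convention (1 = good, 4 = bad, loosely following common oceanographic QC flag schemes) are invented for illustration and are not the system's actual tables.

```python
# Hedged sketch of a depth-dependent range QC test for T/S profiles.
def range_test(profile, limits):
    """profile: list of (depth_m, temp_C).
    limits: list of (depth_min, depth_max, t_low, t_high).
    Returns one flag per observation: 1 = good, 4 = out of range."""
    flags = []
    for depth, temp in profile:
        flag = 1
        for dmin, dmax, lo, hi in limits:
            if dmin <= depth < dmax and not (lo <= temp <= hi):
                flag = 4                     # outside the regional range
        flags.append(flag)
    return flags

# hypothetical regional limits: surface layer vs. deep layer
limits = [(0, 100, -2.0, 35.0), (100, 1550, -2.0, 20.0)]
flags = range_test([(10, 18.5), (500, 25.0), (1200, 3.1)], limits)
```

The operational system layers annual, seasonal and monthly variants of such limits, which amounts to swapping in different `limits` tables per test.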
The relationship between stochastic and deterministic quasi-steady state approximations.
Kim, Jae Kyoung; Josić, Krešimir; Bennett, Matthew R
2015-11-23
The quasi-steady-state approximation (QSSA) is frequently used to reduce deterministic models of biochemical networks. The resulting equations provide a simplified description of the network in terms of non-elementary reaction functions (e.g. Hill functions). Such deterministic reductions are frequently a basis for heuristic stochastic models in which non-elementary reaction functions are used to define reaction propensities. Despite their popularity, it remains unclear when such stochastic reductions are valid. It is frequently assumed that the stochastic reduction can be trusted whenever its deterministic counterpart is accurate. However, a number of recent examples show that this is not necessarily the case. Here we explain the origin of these discrepancies, and demonstrate a clear relationship between the accuracy of the deterministic and the stochastic QSSA for examples widely used in biological systems. With an analysis of a two-state promoter model, and numerical simulations for a variety of other models, we find that the stochastic QSSA is accurate whenever its deterministic counterpart provides an accurate approximation over a range of initial conditions which covers the likely fluctuations from the quasi steady state (QSS). We conjecture that this relationship provides a simple and computationally inexpensive way to test the accuracy of reduced stochastic models using deterministic simulations. The stochastic QSSA is one of the most popular multi-scale stochastic simulation methods. While the use of the QSSA and the resulting non-elementary functions has been justified in the deterministic case, it is not clear when their stochastic counterparts are accurate. In this study, we show how the accuracy of the stochastic QSSA can be tested using its deterministic counterpart, providing a concrete method to test when non-elementary rate functions can be used in stochastic simulations.
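The deterministic QSSA the paper builds on can be illustrated with the textbook Michaelis-Menten case: the full mass-action model E + S ⇌ C → E + P is reduced to dS/dt = -V_max·S/(K_m + S). The sketch below (plain Euler stepping in Python; rate constants are invented, chosen so E₀ ≪ K_m + S₀ and the reduction is valid) compares the two.

```python
# Hedged illustration, not the paper's models: deterministic QSSA for
# Michaelis-Menten kinetics. Full model state: substrate S and complex C
# (free enzyme E = E0 - C); reduced model: Michaelis-Menten rate law.
k1, km1, k2 = 100.0, 100.0, 10.0     # made-up rate constants
E0, S0 = 0.01, 1.0                   # E0 << Km + S0, so QSSA holds
Vmax, Km = k2 * E0, (km1 + k2) / k1

dt, T = 1e-4, 5.0
S, C = S0, 0.0                       # full model
Sr = S0                              # reduced (QSSA) model
for _ in range(int(T / dt)):
    E = E0 - C
    dS = -k1 * E * S + km1 * C
    dC = k1 * E * S - (km1 + k2) * C
    S, C = S + dt * dS, C + dt * dC
    Sr += dt * (-Vmax * Sr / (Km + Sr))
```

With these parameters the two trajectories agree to within about E₀/(K_m + S₀) ≈ 0.5%; the paper's point is that agreement of this deterministic pair, over the range of likely fluctuations, is what licenses using the non-elementary rate in a stochastic simulation.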
Competitive Facility Location with Fuzzy Random Demands
NASA Astrophysics Data System (ADS)
Uno, Takeshi; Katagiri, Hideki; Kato, Kosuke
2010-10-01
This paper proposes a new location problem for competitive facilities, e.g. shops, in a plane, in which the demands for the facilities involve both uncertainty and vagueness. By representing the demands for facilities as fuzzy random variables, the location problem can be formulated as a fuzzy random programming problem. To solve the fuzzy random programming problem, the α-level sets for fuzzy numbers are first used to transform it into a stochastic programming problem; second, by using their expectations and variances, it can be reformulated as a deterministic programming problem. After showing that one of the optimal solutions can be found by solving 0-1 programming problems, a solution method is proposed that improves the tabu search algorithm with strategic oscillation. The efficiency of the proposed method is shown by applying it to numerical examples of facility location problems.
NASA Astrophysics Data System (ADS)
Vasant, P.; Ganesan, T.; Elamvazuthi, I.
2012-11-01
Fairly reasonable results were obtained in the past for non-linear engineering problems using optimization techniques such as neural networks, genetic algorithms, and fuzzy logic independently. Increasingly, hybrid techniques are being used to solve non-linear problems to obtain better output. This paper discusses the use of a neuro-genetic hybrid technique to optimize geological structure mapping, known as seismic survey. It involves the minimization of an objective function subject to geophysical and operational constraints. In this work, the optimization was initially performed using genetic programming, followed by a hybrid neuro-genetic programming approach. Comparative studies and analysis were then carried out on the optimized results. The results indicate that the hybrid neuro-genetic technique produced better results compared to the stand-alone genetic programming method.
SU-E-T-22: A Deterministic Solver of the Boltzmann-Fokker-Planck Equation for Dose Calculation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, X; Gao, H; Paganetti, H
2015-06-15
Purpose: The Boltzmann-Fokker-Planck equation (BFPE) accurately models the migration of photons/charged particles in tissues. While the Monte Carlo (MC) method is popular for solving the BFPE in a statistical manner, we aim to develop a deterministic BFPE solver based on various state-of-the-art numerical acceleration techniques for rapid and accurate dose calculation. Methods: Our BFPE solver is based on a structured grid that is maximally parallelizable, with discretization in energy, angle and space, and its cross-section coefficients are derived or directly imported from the Geant4 database. The physical processes that are taken into account are Compton scattering, the photoelectric effect and pair production for photons, and elastic scattering, ionization and bremsstrahlung for charged particles. While the spatial discretization is based on the diamond scheme, the angular discretization synergizes the finite element method (FEM) and spherical harmonics (SH). Thus, SH is used to globally expand the scattering kernel and FEM is used to locally discretize the angular sphere. As a result, this hybrid method (FEM-SH) is both accurate in dealing with forward-peaked scattering via FEM, and efficient for multi-energy-group computation via SH. In addition, FEM-SH enables the analytical integration in the energy variable of the delta scattering kernel for elastic scattering, with reduced truncation error compared to the numerical integration of the classic SH-based multi-energy-group method. Results: The accuracy of the proposed BFPE solver was benchmarked against Geant4 for photon dose calculation. In particular, FEM-SH had improved accuracy compared to FEM, while both were within 2% of the results obtained with Geant4. Conclusion: A deterministic solver of the Boltzmann-Fokker-Planck equation is developed for dose calculation, and benchmarked against Geant4.
Xiang Hong and Hao Gao were partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000) and the Shanghai Pujiang Talent Program (#14PJ1404500).
Effect of Streamflow Forecast Uncertainty on Real-Time Reservoir Operation
NASA Astrophysics Data System (ADS)
Zhao, T.; Cai, X.; Yang, D.
2010-12-01
Various hydrological forecast products have been applied to real-time reservoir operation, including deterministic streamflow forecast (DSF), DSF-based probabilistic streamflow forecast (DPSF), and ensemble streamflow forecast (ESF), which represent forecast uncertainty in the form of deterministic forecast error, deterministic-forecast-error-based uncertainty distribution, and ensemble forecast errors, respectively. Compared to previous studies that treat these forecast products as ad hoc inputs for reservoir operation models, this paper attempts to model the uncertainties involved in the various forecast products and explores their effect on real-time reservoir operation decisions. In hydrology, there are various indices reflecting the magnitude of streamflow forecast uncertainty, but few models describe how that uncertainty evolves over time. This research introduces the Martingale Model of Forecast Evolution (MMFE) from supply chain management and justifies its assumptions for quantifying the evolution of uncertainty in streamflow forecasts as time progresses. Based on MMFE, this research simulates the evolution of forecast uncertainty in DSF, DPSF, and ESF, and applies reservoir operation models (dynamic programming, DP; stochastic dynamic programming, SDP; and standard operation policy, SOP) to assess the effect of different forms of forecast uncertainty on real-time reservoir operation. Through a hypothetical single-objective real-time reservoir operation model, the results illustrate that forecast uncertainty exerts significant effects. Reservoir operation efficiency, as measured by a utility function, decreases as forecast uncertainty increases. Meanwhile, these effects also depend on the type of forecast product being used. In general, the utility of reservoir operation with ESF is nearly as high as the utility obtained with a perfect forecast; the utilities of DSF and DPSF are similar to each other but not as efficient as ESF.
Moreover, streamflow variability and reservoir capacity can change the magnitude of the effects of forecast uncertainty, but not the relative merit of DSF, DPSF, and ESF. Schematic diagram of the increase in forecast uncertainty with forecast lead-time and the dynamic updating property of real-time streamflow forecast
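The MMFE idea used above is that a forecast is refined by a sequence of independent, zero-mean updates, so forecast-error variance shrinks as lead time decreases. The sketch below simulates that property directly; the number of update stages, sample count and variance are invented for illustration, not taken from the study.

```python
# Hedged sketch of the Martingale Model of Forecast Evolution (MMFE):
# the realised flow anomaly is the sum of independent updates, and the
# forecast after k updates is the partial sum, so the remaining
# (unrevealed) updates constitute the forecast error.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_stages, sigma = 20000, 5, 1.0
updates = rng.normal(0.0, sigma, size=(n_samples, n_stages))
actual = updates.sum(axis=1)                      # realised flow anomaly
# forecast-error variance at each stage k = 0..n_stages
err_var = [np.var(actual - updates[:, :k].sum(axis=1))
           for k in range(n_stages + 1)]
```

The martingale property makes err_var decrease linearly with the number of updates already seen (here from about 5σ² down to 0), which is the "increase in forecast uncertainty with forecast lead-time" shown in the study's schematic.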
Space Trajectories Error Analysis (STEAP) Programs. Volume 1: Analytic manual, update
NASA Technical Reports Server (NTRS)
1971-01-01
Manual revisions are presented for the modified and expanded STEAP series. The STEAP 2 is composed of three independent but related programs: NOMAL for the generation of n-body nominal trajectories performing a number of deterministic guidance events; ERRAN for the linear error analysis and generalized covariance analysis along specific targeted trajectories; and SIMUL for testing the mathematical models used in the navigation and guidance process. The analytic manual provides general problem description, formulation, and solution and the detailed analysis of subroutines. The programmers' manual gives descriptions of the overall structure of the programs as well as the computational flow and analysis of the individual subroutines. The user's manual provides information on the input and output quantities of the programs. These are updates to N69-36472 and N69-36473.
NASA Astrophysics Data System (ADS)
Perraud, Jean-Michel; Bennett, James C.; Bridgart, Robert; Robertson, David E.
2016-04-01
Research undertaken through the Water Information Research and Development Alliance (WIRADA) has laid the foundations for continuous deterministic and ensemble short-term forecasting services. One output of this research is the software Short-term Water Information Forecasting Tools version 2 (SWIFT2). SWIFT2 is developed for use in research on short-term streamflow forecasting techniques as well as operational forecasting services at the Australian Bureau of Meteorology. The variety of uses in research and operations requires a modular software system whose components can be arranged in applications that are fit for each particular purpose, without unnecessary software duplication. SWIFT2 modelling structures consist of sub-areas of hydrologic models, nodes and links with in-stream routing, and reservoirs. While this modelling structure is customary, SWIFT2 is built from the ground up for computationally and data-intensive applications such as the ensemble forecasts necessary for estimating the uncertainty in forecasts. Support for parallel computation on multiple processors or on a compute cluster is a primary use case. A convention is defined to store large multi-dimensional forecasting data and its metadata using the netCDF library. SWIFT2 is written in modern C++ with state-of-the-art software engineering techniques and practices. A salient technical feature is a well-defined application programming interface (API) to facilitate access from different applications and technologies. SWIFT2 is already seamlessly accessible on Windows and Linux via packages in R, Python, Matlab and .NET languages such as C# and F#. Command-line or graphical front-end applications are also feasible. This poster gives an overview of the technology stack, and illustrates the resulting features of SWIFT2 for users. Research and operational uses share the same common core C++ modelling shell for consistency, augmented by different software modules suitable for each context.
The accessibility via interactive modelling languages is particularly amenable to using SWIFT2 in exploratory research, with a dynamic and versatile experimental modelling workflow. This does not come at the expense of the stability and reliability required for use in operations, where only mature and stable components are used.
Deterministic Walks with Choice
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beeler, Katy E.; Berenhaut, Kenneth S.; Cooper, Joshua N.
2014-01-10
This paper studies deterministic movement over toroidal grids, integrating local information, bounded memory and choice at individual nodes. The research is motivated by recent work on deterministic random walks, and applications in multi-agent systems. Several results regarding passing tokens through toroidal grids are discussed, as well as some open questions.
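A standard example of the deterministic movement the paper studies is the rotor-router walk: each node keeps a rotor over its neighbours and forwards the token in fixed round-robin order, so the trajectory is fully reproducible yet still explores the grid. The sketch below (a generic rotor-router on a small torus, not the paper's choice-augmented model) illustrates this.

```python
# Hedged sketch: a rotor-router ("deterministic random") walk on an
# n-by-n toroidal grid. Every node cycles through its four neighbours
# in a fixed order, so repeated runs give identical trajectories.
def rotor_walk(n=4, steps=2000):
    dirs = [(0, 1), (1, 0), (0, -1), (-1, 0)]   # E, S, W, N
    rotor = {}                                   # per-node rotor state
    pos = (0, 0)
    visited = {pos}
    for _ in range(steps):
        r = rotor.get(pos, 0)
        rotor[pos] = (r + 1) % 4                 # advance the local rotor
        dy, dx = dirs[r]
        pos = ((pos[0] + dy) % n, (pos[1] + dx) % n)
        visited.add(pos)
    return visited, pos

visited, end = rotor_walk()
```

The paper's setting adds bounded memory and per-node choice on top of this kind of local deterministic rule; the basic rotor walk already covers the small torus above within the step budget.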
DOE Office of Scientific and Technical Information (OSTI.GOV)
Broderick, Robert; Quiroz, Jimmy; Grijalva, Santiago
2014-07-15
Matlab Toolbox for simulating the impact of solar energy on the distribution grid. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving GridPV Toolbox information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulations functions are included to show potential uses of the toolbox functions.
MPBEC, a Matlab Program for Biomolecular Electrostatic Calculations
NASA Astrophysics Data System (ADS)
Vergara-Perez, Sandra; Marucho, Marcelo
2016-01-01
One of the most used and efficient approaches to compute electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation. There are several software packages available that solve the PB equation for molecules in aqueous electrolyte solutions. Most of these software packages are useful for scientists with specialized training and expertise in computational biophysics. However, the user is usually required to manually make several important choices, depending on the complexity of the biological system, to successfully obtain the numerical solution of the PB equation. This may become an obstacle for researchers, experimentalists, and even students with no special training in computational methodologies. Aiming to overcome this limitation, in this article we present MPBEC, a free, cross-platform, open-source software that provides non-experts in the field an easy and efficient way to perform biomolecular electrostatic calculations on single-processor computers. MPBEC is a Matlab script based on the Adaptive Poisson-Boltzmann Solver, one of the most popular approaches used to solve the PB equation. MPBEC does not require any user programming, text editing or extensive statistical skills, and comes with detailed user-guide documentation. As a unique feature, MPBEC includes a useful graphical user interface (GUI) application which helps and guides users to configure and set up the optimal parameters and approximations to successfully perform the required biomolecular electrostatic calculations. The GUI also incorporates visualization tools to facilitate users' pre- and post-analysis of structural and electrical properties of biomolecules.
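The simplest instance of the PB equation solved numerically is its linearized (Debye-Hückel) 1D form, φ'' = κ²φ, whose solution decays as exp(-κx) away from a charged plane. The sketch below (a generic finite-difference illustration, not MPBEC or APBS; all parameter values are arbitrary) solves it with a tridiagonal system and checks against the analytic decay.

```python
# Hedged sketch: linearized 1D Poisson-Boltzmann, phi'' = kappa^2 phi,
# with phi(0) = phi0 and phi(L) ~ 0, by second-order finite differences.
import numpy as np

kappa, phi0, L, N = 2.0, 1.0, 10.0, 500
x = np.linspace(0.0, L, N)
h = x[1] - x[0]

# interior equations: (phi[i-1] - 2 phi[i] + phi[i+1]) / h^2 = kappa^2 phi[i]
main = -(2.0 + (kappa * h) ** 2) * np.ones(N - 2)
off = np.ones(N - 3)
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
b = np.zeros(N - 2)
b[0] = -phi0                       # known boundary value moved to the RHS

phi = np.empty(N)
phi[0], phi[-1] = phi0, 0.0
phi[1:-1] = np.linalg.solve(A, b)

exact = phi0 * np.exp(-kappa * x)  # analytic Debye screening profile
```

The full nonlinear PB equation in 3D, with molecular charge distributions and dielectric boundaries, is what APBS-based tools such as MPBEC handle; the structure (discretize, solve a large sparse linear(ized) system) is the same.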
Web-based health services and clinical decision support.
Jegelevicius, Darius; Marozas, Vaidotas; Lukosevicius, Arunas; Patasius, Martynas
2004-01-01
The purpose of this study was the development of a Web-based e-health service for comprehensive assistance and clinical decision support. The service structure consists of a Web server, a PHP-based Web interface linked to a clinical SQL database, Java applets for interactive manipulation and visualization of signals, and a Matlab server linked with signal and data processing algorithms implemented as Matlab programs. The service ensures clinical decision support based on diagnostic signal and image analysis. Using the discussed methodology, a pilot service for pathology specialists for automatic calculation of the proliferation index has been developed. Physicians use a simple Web interface for uploading the pictures under investigation to the server; subsequently a Java applet interface is used for outlining the region of interest and, after processing on the server, the requested proliferation index value is calculated. There is also an "expert corner", where experts can submit their index estimates and comments on particular images, which is especially important for system developers. These expert evaluations are used for optimization and verification of the automatic analysis algorithms. Decision support trials have been conducted for ECG and for ophthalmological ultrasonic investigations of intraocular tumor differentiation. Data mining algorithms have been applied and decision support trees constructed. These services are also being implemented in the Web-based system. The study has shown that the Web-based structure ensures more effective, flexible and accessible services compared with standalone programs and is very convenient for biomedical engineers and physicians, especially in the development phase.
Chiang, Fu-Tsai; Li, Pei-Jung; Chung, Shih-Ping; Pan, Lung-Fa; Pan, Lung-Kwang
2016-01-01
This study analyzed multiple biokinetic models using a dynamic water phantom. The phantom was custom-made from acrylic materials to model metabolic mechanisms in the human body. It had 4 spherical chambers of different sizes, connected by 8 ditches to form a complex and adjustable water loop. Infusion and drain ports connected the chambers to auxiliary silicone hoses. The radioactive compound solution (Tc-99m-MDP labeled) formed a sealed and static water loop inside the phantom. As clean feed water was infused to replace the original solution, the system mimicked metabolic mechanisms for data acquisition. Five cases with different water loop settings were tested and analyzed, with case settings changed by controlling valve poles located in the ditches. The phantom could also be changed from model A to model B by altering its vertical configuration. The phantom was surveyed with a clinical gamma camera to determine the time-dependent intensity of every chamber. The recorded counts per pixel in each chamber were analyzed and normalized for comparison with theoretical estimations from the MATLAB program. Every preset case was represented by uniquely defined, time-dependent, simultaneous differential equations, and a corresponding MATLAB program optimized the solutions by comparing theoretical calculations and practical measurements. A dimensionless agreement (AT) index was recommended to evaluate the comparison in each case. ATs varied from 5.6 to 48.7 over the 5 cases, indicating that this work presented an acceptable feasibility study. PMID:27286096
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goreac, Dan, E-mail: Dan.Goreac@u-pem.fr; Kobylanski, Magdalena, E-mail: Magdalena.Kobylanski@u-pem.fr; Martinez, Miguel, E-mail: Miguel.Martinez@u-pem.fr
2016-10-15
We study optimal control problems in infinite horizon when the dynamics belong to a specific class of piecewise deterministic Markov processes constrained to star-shaped networks (corresponding to a toy traffic model). We adapt the results in Soner (SIAM J Control Optim 24(6):1110–1122, 1986) to prove the regularity of the value function and the dynamic programming principle. Extending Krylov's “shaking the coefficients” method to networks, we prove that the value function can be seen as the solution to a linearized optimization problem set on a convenient set of probability measures. The approach relies entirely on viscosity arguments. As a by-product, the dual formulation guarantees that the value function is the pointwise supremum over regular subsolutions of the associated Hamilton–Jacobi integrodifferential system. This ensures that the value function satisfies Perron's preconization for the (unique) candidate to viscosity solution.
Nanotransfer and nanoreplication using deterministically grown sacrificial nanotemplates
Melechko, Anatoli V [Oak Ridge, TN]; McKnight, Timothy E.; Guillorn, Michael A.; Ilic, Bojan [Ithaca, NY]; Merkulov, Vladimir I [Knoxville, TN]; Doktycz, Mitchel J [Knoxville, TN]; Lowndes, Douglas H [Knoxville, TN]; Simpson, Michael L [Knoxville, TN]
2011-05-17
Methods, manufactures, machines and compositions are described for nanotransfer and nanoreplication using deterministically grown sacrificial nanotemplates. A method includes depositing a catalyst particle on a surface of a substrate to define a deterministically located position; growing an aligned elongated nanostructure on the substrate, an end of the aligned elongated nanostructure coupled to the substrate at the deterministically located position; coating the aligned elongated nanostructure with a conduit material; removing a portion of the conduit material to expose the catalyst particle; removing the catalyst particle; and removing the elongated nanostructure to define a nanoconduit.
NASA Astrophysics Data System (ADS)
Itoh, Kosuke; Nakada, Tsutomu
2013-04-01
Deterministic nonlinear dynamical processes are ubiquitous in nature. Chaotic sounds generated by such processes may appear irregular and random in waveform, but these sounds are mathematically distinguished from random stochastic sounds in that they contain deterministic short-time predictability in their temporal fine structure. We show that the human brain distinguishes deterministic chaotic sounds from spectrally matched stochastic sounds in neural processing and perception. Deterministic chaotic sounds, even without being attended to, elicited greater cerebral cortical responses than the surrogate control sounds at latencies beyond about 150 ms after sound onset. Listeners also clearly discriminated these sounds in perception. The results support the hypothesis that the human auditory system is sensitive to the subtle short-time predictability embedded in the temporal fine structure of sounds.
A deterministic particle method for one-dimensional reaction-diffusion equations
NASA Technical Reports Server (NTRS)
Mascagni, Michael
1995-01-01
We derive a deterministic particle method for the solution of nonlinear reaction-diffusion equations in one spatial dimension. This deterministic method is an analog of a Monte Carlo method for the solution of these problems that has been previously investigated by the author. The deterministic method leads to the consideration of a system of ordinary differential equations for the positions of suitably defined particles. We then consider time-explicit and time-implicit methods for this system of ordinary differential equations and study Picard and Newton iterations for the solution of the implicit system. Next we solve this system numerically and study the discretization error both analytically and numerically. Numerical computation shows that this deterministic method is automatically adaptive to large gradients in the solution.
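The Picard and Newton iterations mentioned for the implicit system can be sketched on a scalar model problem. The reaction term u' = -u² below is a hypothetical stand-in, not the paper's particle equations; each function advances one implicit-Euler step:

```python
def implicit_euler_step_picard(u, dt, iters=50):
    """Picard (fixed-point) iteration for v = u - dt*v**2,
    i.e. one implicit-Euler step of u' = -u**2."""
    v = u
    for _ in range(iters):
        v = u - dt * v**2
    return v

def implicit_euler_step_newton(u, dt, iters=8):
    """Newton iteration for g(v) = v - u + dt*v**2 = 0."""
    v = u
    for _ in range(iters):
        g = v - u + dt * v**2
        dg = 1.0 + 2.0 * dt * v   # g'(v)
        v -= g / dg
    return v

p_step = implicit_euler_step_picard(1.0, 0.1)
n_step = implicit_euler_step_newton(1.0, 0.1)
```

Both iterations converge to the same root of the implicit update; Newton needs far fewer iterations, while Picard only converges when the fixed-point map is contractive (here |2·dt·v| < 1), which mirrors the trade-off studied in the paper.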
Piezoelectric Actuator Modeling Using MSC/NASTRAN and MATLAB
NASA Technical Reports Server (NTRS)
Reaves, Mercedes C.; Horta, Lucas G.
2003-01-01
This paper presents a procedure for modeling structures containing piezoelectric actuators using MSC/NASTRAN and MATLAB. The paper describes the utility and functionality of one set of validated modeling tools. The tools described herein use MSC/NASTRAN to model the structure with piezoelectric actuators, using a thermally induced strain to model straining of the actuators due to an applied voltage field. MATLAB scripts are used to assemble the dynamic equations and to generate frequency response functions. The application of these tools is discussed using a cantilever aluminum beam with a surface-mounted piezoelectric actuator as a sample problem. Software in the form of MSC/NASTRAN DMAP input commands, MATLAB scripts, and a step-by-step procedure to solve the example problem are provided. Analysis results are generated in terms of frequency response functions from deflection and strain data as a function of input voltage to the actuator.
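The thermal-strain analogy invoked here rests on matching the piezoelectric free strain of a layer under a through-thickness field to a fictitious thermal strain; in generic symbols (our notation, not necessarily the paper's),

```latex
\varepsilon_{\text{piezo}} = d_{31}\,\frac{V}{t}
\quad\Longleftrightarrow\quad
\varepsilon_{\text{thermal}} = \alpha\,\Delta T,
\qquad\text{so choosing}\quad
\alpha = \frac{d_{31}}{t},\quad \Delta T = V
```

reproduces the actuation strain, where \(d_{31}\) is the piezoelectric strain coefficient, \(t\) the layer thickness, and \(V\) the applied voltage. This is what lets a purely structural-thermal solver model voltage-driven actuation.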
Valdes-Abellan, Javier; Pachepsky, Yakov; Martinez, Gonzalo
2018-01-01
Data assimilation is becoming a promising technique in hydrologic modelling to update not only model states but also to infer model parameters, specifically to infer soil hydraulic properties in Richard-equation-based soil water models. The Ensemble Kalman Filter method is one of the most widely employed method among the different data assimilation alternatives. In this study the complete Matlab© code used to study soil data assimilation efficiency under different soil and climatic conditions is shown. The code shows the method how data assimilation through EnKF was implemented. Richards equation was solved by the used of Hydrus-1D software which was run from Matlab. •MATLAB routines are released to be used/modified without restrictions for other researchers•Data assimilation Ensemble Kalman Filter method code.•Soil water Richard equation flow solved by Hydrus-1D.
Peng, Henry T; Edginton, Andrea N; Cheung, Bob
2013-10-01
Physiologically based pharmacokinetic models were developed using MATLAB Simulink® and PK-Sim®. We compared the capability and usefulness of these two models by simulating pharmacokinetic changes of midazolam under exercise and heat stress to verify the usefulness of MATLAB Simulink® as a generic PBPK modeling software. Although both models show good agreement with experimental data obtained under resting condition, their predictions of pharmacokinetics changes are less accurate in the stressful conditions. However, MATLAB Simulink® may be more flexible to include physiologically based processes such as oral absorption and simulate various stress parameters such as stress intensity, duration and timing of drug administration to improve model performance. Further work will be conducted to modify algorithms in our generic model developed using MATLAB Simulink® and to investigate pharmacokinetics under other physiological stress such as trauma. © The Author(s) 2013.
Introduction to multifractal detrended fluctuation analysis in matlab.
Ihlen, Espen A F
2012-01-01
Fractal structures are found in biomedical time series from a wide range of physiological phenomena. The multifractal spectrum identifies the deviations in fractal structure within time periods with large and small fluctuations. The present tutorial is an introduction to multifractal detrended fluctuation analysis (MFDFA), which estimates the multifractal spectrum of biomedical time series. The tutorial presents MFDFA step-by-step in an interactive Matlab session. All Matlab tools needed are available in the Introduction to MFDFA folder at the website www.ntnu.edu/inm/geri/software. MFDFA is introduced in Matlab code boxes where the reader can apply pieces of, or the entire, MFDFA to example time series. After introducing MFDFA, the tutorial discusses the best practice of MFDFA in biomedical signal processing. The main aim of the tutorial is to give the reader a simple, self-contained guide to the implementation of MFDFA and the interpretation of the resulting multifractal spectra.
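The core of MFDFA is the scaling of detrended fluctuations with segment size. A minimal monofractal (q = 2) version, which MFDFA generalizes to arbitrary q, might look like this in Python (an illustration of the idea, not the tutorial's Matlab code):

```python
import numpy as np

def dfa(signal, scales):
    """Order-1 detrended fluctuation analysis; MFDFA generalizes the
    q = 2 fluctuation function computed here to arbitrary q."""
    profile = np.cumsum(signal - np.mean(signal))
    F = []
    for s in scales:
        n_seg = len(profile) // s
        ms = []
        for i in range(n_seg):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)           # local linear detrend
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.array(F)

rng = np.random.default_rng(1)
scales = np.array([16, 32, 64, 128, 256])
F = dfa(rng.normal(size=20000), scales)
# scaling exponent from the log-log slope; ~0.5 for white noise
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

Multifractal analysis repeats this computation for a range of q values and converts the q-dependent exponents into the multifractal spectrum; for white noise the fitted exponent is close to 0.5.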
Interactive Supercomputing’s Star-P Platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edelman, Alan; Husbands, Parry; Leibman, Steve
2006-09-19
The thesis of this extended abstract is simple. High productivity comes from high level infrastructures. To measure this, we introduce a methodology that goes beyond the tradition of timing software in serial and tuned parallel modes. We perform a classroom productivity study involving 29 students who have written a homework exercise in a low level language (MPI message passing) and a high level language (Star-P with MATLAB client). Our conclusions indicate what perhaps should be of little surprise: (1) the high level language is always far easier on the students than the low level language. (2) The early versions of the high level language perform inadequately compared to the tuned low level language, but later versions substantially catch up. Asymptotically, the analogy must hold that message passing is to high level language parallel programming as assembler is to high level environments such as MATLAB, Mathematica, Maple, or even Python. We follow the Kepner method that correctly realizes that traditional speedup numbers without some discussion of the human cost of reaching these numbers can fail to reflect the true human productivity cost of high performance computing. Traditional data compares low level message passing with serial computation. With the benefit of a high level language system in place, in our case Star-P running with MATLAB client, and with the benefit of a large data pool: 29 students, each running the same code ten times on three evolutions of the same platform, we can methodically demonstrate the productivity gains. To date we are not aware of any high level system as extensive and interoperable as Star-P, nor are we aware of an experiment of this kind performed with this volume of data.
Sinha, Shriprakash
2016-12-01
Simulation studies in systems biology involving computational experiments on Wnt signaling pathways abound in the literature but often lack a pedagogical perspective that might ease the understanding of beginner students and researchers in transition who intend to work on the modeling of the pathway. This paucity might happen due to restrictive business policies which enforce an unwanted embargo on the sharing of important scientific knowledge. A tutorial introduction to computational modeling of the Wnt signaling pathway in a human colorectal cancer dataset using static Bayesian network models is provided. The walkthrough, interleaved with exposition of the Matlab code and causal models from the Bayesian network toolbox, might aid biologists/informaticians in understanding the design of computational experiments. The manuscript elucidates the coding contents of the advance article by Sinha (Integr. Biol. 6:1034-1048, 2014) and takes the reader in a step-by-step process through how (a) the collection and transformation of the available biological information from the literature is done, (b) the integration of the heterogeneous data and prior biological knowledge in the network is achieved, (c) the simulation study is designed, (d) a hypothesis regarding a biological phenomenon is transformed into a computational framework, and (e) results and inferences drawn using d-connectivity/separability are reported. The manuscript finally ends with a programming assignment to help readers get hands-on experience of a perturbation project. Description of the Matlab files is made available under the GNU GPL v3 license at the Google Code project https://code.google.com/p/static-bn-for-wnt-signaling-pathway and at https://sites.google.com/site/shriprakashsinha/shriprakashsinha/projects/static-bn-for-wnt-signaling-pathway. Latest updates can be found at the latter website.
An algorithm for fast elastic wave simulation using a vectorized finite difference operator
NASA Astrophysics Data System (ADS)
Malkoti, Ajay; Vedanti, Nimisha; Tiwari, Ram Krishna
2018-07-01
Modern geophysical imaging techniques exploit the full wavefield information which can be simulated numerically. These numerical simulations are computationally expensive due to several factors, such as a large number of time steps and nodes, large derivative stencils, and huge model sizes. Besides these constraints, it is also important to reformulate the numerical derivative operator for improved efficiency. In this paper, we have introduced a vectorized derivative operator over the staggered grid with shifted coordinate systems. The operator increases the efficiency of simulation by exploiting the fact that each variable can be represented in the form of a matrix. This operator allows updating all nodes of a variable defined on the staggered grid, in a manner similar to the collocated grid scheme, thereby reducing the computational run-time considerably. Here we demonstrate an application of this operator to simulate seismic wave propagation in elastic media (Marmousi model), by discretizing the equations on a staggered grid. We have compared the performance of this operator in three programming languages, which reveals that it can increase the execution speed by a factor of at least 2-3 for FORTRAN and MATLAB, and nearly 100 for Python. We have further carried out various tests in MATLAB to analyze the effect of model size and the number of time steps on total simulation run-time. We find that there is an additional, though small, computational overhead for each step and it depends on the total number of time steps used in the simulation. A MATLAB code package, 'FDwave', for the proposed simulation scheme is available upon request.
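The idea of updating all nodes at once can be shown with a central difference on a collocated grid in Python/NumPy, a simplification of the paper's staggered-grid operator (function names are ours):

```python
import numpy as np

def dx_loop(f, h):
    """Second-order central x-derivative via explicit loops,
    the style that is slow in interpreted languages."""
    out = np.zeros_like(f)
    for i in range(1, f.shape[0] - 1):
        for j in range(f.shape[1]):
            out[i, j] = (f[i + 1, j] - f[i - 1, j]) / (2.0 * h)
    return out

def dx_vec(f, h):
    """Same stencil applied to the whole array at once via slicing;
    one vectorized statement updates every interior node."""
    out = np.zeros_like(f)
    out[1:-1, :] = (f[2:, :] - f[:-2, :]) / (2.0 * h)
    return out

rng = np.random.default_rng(0)
f = rng.normal(size=(200, 150))
d1 = dx_loop(f, 0.01)
d2 = dx_vec(f, 0.01)
```

Both routines produce identical output, but the vectorized form dispatches one array operation instead of tens of thousands of interpreted iterations, which is why the reported speed-up is largest for Python and MATLAB.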
Ben Abdallah, Emna; Folschette, Maxime; Roux, Olivier; Magnin, Morgan
2017-01-01
This paper addresses the problem of finding attractors in biological regulatory networks. We focus here on non-deterministic synchronous and asynchronous multi-valued networks, modeled using automata networks (AN). AN is a general and well-suited formalism for studying complex interactions between different components (genes, proteins, ...). An attractor is a minimal trap domain, that is, a part of the state-transition graph that cannot be escaped. Such structures are terminal components of the dynamics and take the form of steady states (singletons) or complex compositions of cycles (non-singletons). Studying the effect of a disease or a mutation on an organism requires finding the attractors of the model to understand the long-term behaviors. We present a computational logical method based on answer set programming (ASP) to identify all attractors. Performed without any network reduction, the method can be applied to any dynamical semantics. In this paper, we present the two most widespread non-deterministic semantics: the asynchronous and the synchronous updating modes. The logical approach goes through a complete enumeration of the states of the network in order to find the attractors, without the need to construct the whole state-transition graph. We performed extensive computational experiments which show good performance and fit the expected theoretical results in the literature. The originality of our approach lies in the exhaustive enumeration of all possible (sets of) states verifying the properties of an attractor thanks to the use of ASP. Our method is applied to non-deterministic semantics in two different schemes (asynchronous and synchronous). The merits of our methods are illustrated by applying them to biological examples of various sizes and comparing the results with some existing approaches. It turns out that our approach succeeds in exhaustively enumerating, on a desktop computer, all existing attractors up to a given size (20 states) in a large model (100 components). This size is only limited by memory and computation time.
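For the synchronous updating mode, attractor enumeration can be sketched by brute force over the full state space. This is practical only for tiny networks, unlike the ASP approach, which avoids building the state-transition graph; the three-node Boolean rules below are invented purely for illustration:

```python
from itertools import product

def sync_attractors(update, n):
    """Enumerate attractors of a synchronous Boolean network by
    following every state until its trajectory revisits a state;
    the revisited state lies on the attractor cycle."""
    attractors = set()
    for state in product((0, 1), repeat=n):
        seen = set()
        s = state
        while s not in seen:
            seen.add(s)
            s = update(s)
        cycle, t = [], s   # collect the cycle starting at the repeat
        while True:
            cycle.append(t)
            t = update(t)
            if t == s:
                break
        attractors.add(frozenset(cycle))
    return attractors

def toy_update(s):
    # hypothetical synchronous rules: a' = b, b' = a, c' = a AND NOT c
    a, b, c = s
    return (b, a, int(a and not c))

atts = sync_attractors(toy_update, 3)
```

For these rules the method finds one steady state, (0,0,0), and two 2-cycles, illustrating both attractor forms (singleton and non-singleton) discussed in the abstract; the ASP method reaches the same kind of answer without visiting all 2^n states.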
Women's lives in transition: a qualitative analysis of the fertility decline in Bangladesh.
Simmons, R
1996-01-01
The fertility decline that began in Bangladesh in the late 1980s and continues has prompted diverse theories to explain it. In this qualitative analysis of 21 focus-group sessions with rural women ranging in age from the teens to late 40s and living in the villages of the Matlab area, the women's perceptions of their changing society and of the influence of the family planning program are examined. The women's statements reveal their awareness of the social and economic transition they are undergoing and their interest in family-size limitation, which is bolstered by a strong family planning program. Although the shifts in economic and social circumstances are not large, in conjunction with the strong family planning program they constitute a powerful force for change in attitudes, ideas, and behavior among these women.
NASA Astrophysics Data System (ADS)
Mehanee, Salah A.
2015-01-01
This paper describes a new method for tracing paleo-shear zones of the continental crust by self-potential (SP) data inversion. The method falls within the deterministic inversion framework, and it is exclusively applicable to the interpretation of SP anomalies measured along a profile over sheet-type structures such as conductive thin films of interconnected graphite precipitations formed on shear planes. The inverse method fits a residual SP anomaly by a single thin sheet and recovers the characteristic parameters (depth to the top h, extension in depth a, amplitude coefficient k, and amount and direction of dip θ) of the sheet. This method minimizes an objective functional in the space of the logarithmed and non-logarithmed model parameters (log(h), log(a), log(k), and θ) successively by the steepest descent (SD) and Gauss-Newton (GN) techniques in order to maintain the stability and convergence of the inverse method. Prior to applying the method to real data, its accuracy, convergence, and stability are successfully verified on numerical examples with and without noise. The method is then applied to SP profiles from the German Continental Deep Drilling Program (Kontinentales Tiefbohrprogramm der Bundesrepublik Deutschland, KTB), Rittsteig, and Grossensees sites in Germany for tracing paleo-shear planes coated with graphitic deposits. The comparisons of geologic sections constructed in this paper (based on the proposed deterministic approach) against the existing published interpretations (obtained by trial-and-error modeling) for the SP data of the KTB and Rittsteig sites have revealed that the deterministic approach suggests some new details that are of geological significance. The findings of the proposed inverse scheme are supported by available drilling and other geophysical data.
Furthermore, the real SP data of the Grossensees site have been interpreted (apparently for the first time) by the deterministic inverse scheme, from which interpretive geologic cross sections are suggested. The computational efficiency, the analysis of the numerical examples investigated, and the comparisons on the real data inverted here have demonstrated that the developed deterministic approach is advantageous over the existing interpretation methods and is suitable for meaningful interpretation of SP data acquired elsewhere over graphitic occurrences on fault planes.
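The use of logarithmed parameters with Gauss-Newton can be illustrated on a toy two-parameter exponential-decay model. The model and data below are hypothetical, not the SP thin-sheet formula, and a fixed damping factor is added for robustness:

```python
import numpy as np

def model(x, p):
    # p = (log k, log a); exponentiating keeps k and a positive
    k, a = np.exp(p)
    return k * np.exp(-x / a)

def gauss_newton_log(x, y, p, iters=200, step=0.5):
    """Damped Gauss-Newton in logarithmed parameters: the same device
    as inverting for log(h), log(a), log(k) to enforce positivity."""
    for _ in range(iters):
        k, a = np.exp(p)
        f = model(x, p)
        r = y - f
        # Jacobian wrt (log k, log a): df/dlogk = f, df/dloga = f*x/a
        J = np.column_stack([f, f * x / a])
        p = p + step * np.linalg.solve(J.T @ J, J.T @ r)
    return p

x = np.linspace(0.1, 5.0, 40)
y = 2.0 * np.exp(-x / 1.5)              # noise-free synthetic data (k=2, a=1.5)
p_hat = gauss_newton_log(x, y, np.zeros(2))
k_hat, a_hat = np.exp(p_hat)
```

Working in log-parameters both enforces physical positivity and rescales the problem, which is one reason such schemes remain stable where plain parameter-space iterations can wander outside physical limits.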
Faust: Flexible Acquistion and Understanding System for Text
2013-07-01
The second version is still underway and it will continue in development as part of the DARPA DEFT program; it is written in Java and Clojure with MySQL and ... SUTime, a Java library that recognizes and normalizes temporal expressions using deterministic patterns [101]. UIUC made another such framework ... a Java-based, large-scale inference engine called Tuffy. It leverages the full power of a relational optimizer in an RDBMS to perform the grounding of MLN
Proceedings of the Expert Systems Workshop Held in Pacific Grove, California on 16-18 April 1986
1986-04-18
[Report documentation page; only text fragments are recoverable:] ... are distributed and parallel. Features unimplemented at present; scheduled for phase 2. Table 1-1: Key design characteristics of ABE. ... data structuring techniques and a semi-deterministic scheduler. A program for the DF framework consists of a number of independent processing modules.
NASA Technical Reports Server (NTRS)
Chin, Jeffrey C.; Csank, Jeffrey T.; Haller, William J.; Seidel, Jonathan A.
2016-01-01
This document outlines methodologies designed to improve the interface between the Numerical Propulsion System Simulation framework and various control and dynamic analyses developed in the Matlab and Simulink environment. Although NPSS is most commonly used for steady-state modeling, this paper is intended to supplement the relatively sparse documentation on its transient analysis functionality. Matlab has become an extremely popular engineering environment, and better methodologies are necessary to develop tools that leverage the benefits of these disparate frameworks. Transient analysis is not a new feature of the Numerical Propulsion System Simulation (NPSS), but transient considerations are becoming more pertinent as multidisciplinary trade-offs begin to play a larger role in advanced engine designs. This paper covers the budding convergence between NPSS and Matlab-based modeling toolsets. The following sections explore various design patterns to rapidly develop transient models. Each approach starts with a base model built with NPSS, and assumes the reader already has a basic understanding of how to construct a steady-state model. The second half of the paper focuses on further enhancements required to subsequently interface NPSS with Matlab codes. The first method is the simplest and most straightforward but performance constrained, while the last is the most abstract. These methods aren't mutually exclusive, and the specific implementation details could vary greatly based on the designer's discretion. Basic recommendations are provided to organize model logic in a format most easily amenable to integration with existing Matlab control toolsets.
System IDentification Programs for AirCraft (SIDPAC)
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
2002-01-01
A collection of computer programs for aircraft system identification is described and demonstrated. The programs, collectively called System IDentification Programs for AirCraft, or SIDPAC, were developed in MATLAB as m-file functions. SIDPAC has been used successfully at NASA Langley Research Center with data from many different flight test programs and wind tunnel experiments. SIDPAC includes routines for experiment design, data conditioning, data compatibility analysis, model structure determination, equation-error and output-error parameter estimation in both the time and frequency domains, real-time and recursive parameter estimation, low order equivalent system identification, estimated parameter error calculation, linear and nonlinear simulation, plotting, and 3-D visualization. An overview of SIDPAC capabilities is provided, along with a demonstration of the use of SIDPAC with real flight test data from the NASA Glenn Twin Otter aircraft. The SIDPAC software is available without charge to U.S. citizens by request to the author, contingent on the requestor completing a NASA software usage agreement.
A Personal Navigation System Based on Inertial and Magnetic Field Measurements
2010-09-01
[Thesis table-of-contents excerpt; only section titles are recoverable: MATLAB implementation; a model for pendulum motion sensor data (pendulum model for MATLAB simulation; sensor data generated with the pendulum model); filter performance with real pendulum data.]
MILAMIN 2 - Fast MATLAB FEM solver
NASA Astrophysics Data System (ADS)
Dabrowski, Marcin; Krotkiewski, Marcin; Schmid, Daniel W.
2013-04-01
MILAMIN is a free and efficient MATLAB-based two-dimensional FEM solver utilizing unstructured meshes [Dabrowski et al., G-cubed (2008)]. The code consists of steady-state thermal diffusion and incompressible Stokes flow solvers implemented in approximately 200 lines of native MATLAB code. The brevity makes the code easily customizable. An important quality of MILAMIN is speed - it can handle millions of nodes within minutes on one CPU core of a standard desktop computer, and is faster than many commercial solutions. The new MILAMIN 2 allows three-dimensional modeling. It is designed as a set of functional modules that can be used as building blocks for efficient FEM simulations using MATLAB. The utilities are largely implemented as native MATLAB functions. For performance critical parts we use MUTILS - a suite of compiled MEX functions optimized for shared memory multi-core computers. The most important features of MILAMIN 2 are: 1. Modular approach to defining, tracking, and discretizing the geometry of the model 2. Interfaces to external mesh generators (e.g., Triangle, Fade2d, T3D) and mesh utilities (e.g., element type conversion, fast point location, boundary extraction) 3. Efficient computation of the stiffness matrix for a wide range of element types, anisotropic materials and three-dimensional problems 4. Fast global matrix assembly using a dedicated MEX function 5. Automatic integration rules 6. Flexible prescription (spatial, temporal, and field functions) and efficient application of Dirichlet, Neuman, and periodic boundary conditions 7. Treatment of transient and non-linear problems 8. Various iterative and multi-level solution strategies 9. Post-processing tools (e.g., numerical integration) 10. Visualization primitives using MATLAB, and VTK export functions We provide a large number of examples that show how to implement a custom FEM solver using the MILAMIN 2 framework. 
The examples are MATLAB scripts of increasing complexity that address a given technical topic (e.g., creating meshes, reordering nodes, applying boundary conditions), a given numerical topic (e.g., using various solution strategies, non-linear iterations), or that present a fully-developed solver designed to address a scientific topic (e.g., performing Stokes flow simulations in synthetic porous medium). References: Dabrowski, M., M. Krotkiewski, and D. W. Schmid, MILAMIN: MATLAB-based finite element method solver for large problems, Geochem. Geophys. Geosyst., 9, Q04030, 2008.
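The assemble-then-solve pattern that MILAMIN implements in a few hundred lines can be reduced to its 1D essence. The sketch below is our own naming and Python rather than MILAMIN's MATLAB, solving -u'' = 1 on [0,1] with homogeneous Dirichlet conditions and linear elements:

```python
import numpy as np

def fem_poisson_1d(n_el=8):
    """Assemble and solve -u'' = 1 on [0,1], u(0)=u(1)=0, with linear
    elements: the 1D analogue of a MILAMIN-style FEM assembly loop."""
    n = n_el + 1
    h = 1.0 / n_el
    K = np.zeros((n, n))
    f = np.zeros(n)
    ke = np.array([[1.0, -1.0], [-1.0, 1.0]]) / h   # element stiffness
    fe = np.array([0.5, 0.5]) * h                   # element load (f = 1)
    for e in range(n_el):
        idx = [e, e + 1]
        K[np.ix_(idx, idx)] += ke                   # scatter into global K
        f[idx] += fe
    u = np.zeros(n)                                 # Dirichlet BCs: u = 0
    u[1:-1] = np.linalg.solve(K[1:-1, 1:-1], f[1:-1])
    return np.linspace(0.0, 1.0, n), u

x, u = fem_poisson_1d()
```

For this load the linear-element solution is nodally exact, u(x_i) = x_i(1 - x_i)/2, which makes the assembly easy to verify; MILAMIN's contribution is doing the same scatter-assembly for millions of 2D/3D elements efficiently in native MATLAB.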
Deterministic and Stochastic Analysis of a Prey-Dependent Predator-Prey System
ERIC Educational Resources Information Center
Maiti, Alakes; Samanta, G. P.
2005-01-01
This paper reports on studies of the deterministic and stochastic behaviours of a predator-prey system with a prey-dependent response function. The first part of the paper deals with the deterministic analysis of uniform boundedness, permanence, stability and bifurcation. In the second part the reproductive and mortality factors of the prey and…
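A concrete prey-dependent model of the kind analyzed in such studies is the Rosenzweig-MacArthur system with a Holling type-II response. The sketch below uses invented parameter values (not the paper's) and a hand-rolled RK4 integrator:

```python
import numpy as np

def rm_rhs(t, z, r=1.0, K=5.0, a=1.0, h=0.5, e=0.5, m=0.2):
    """Prey-dependent predator-prey model with Holling type-II
    functional response; parameter values are illustrative."""
    x, y = z
    fx = a * x / (1.0 + a * h * x)      # prey-dependent response function
    return np.array([r * x * (1.0 - x / K) - fx * y,
                     e * fx * y - m * y])

def rk4(f, z0, t0, t1, n):
    """Classical 4th-order Runge-Kutta integration of z' = f(t, z)."""
    dt = (t1 - t0) / n
    z = np.array(z0, dtype=float)
    traj = [z.copy()]
    for i in range(n):
        t = t0 + i * dt
        k1 = f(t, z)
        k2 = f(t + dt / 2, z + dt / 2 * k1)
        k3 = f(t + dt / 2, z + dt / 2 * k2)
        k4 = f(t + dt, z + dt * k3)
        z = z + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj.append(z.copy())
    return np.array(traj)

traj = rk4(rm_rhs, [1.0, 0.5], 0.0, 100.0, 10000)
```

The deterministic trajectory stays positive and the prey remains bounded by its carrying capacity, which is the numerical counterpart of the uniform-boundedness and permanence properties the deterministic analysis establishes; a stochastic variant would add noise terms to the same vector field.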
ShinyGPAS: interactive genomic prediction accuracy simulator based on deterministic formulas.
Morota, Gota
2017-12-20
Deterministic formulas for the accuracy of genomic predictions highlight the relationships between prediction accuracy and the potential factors influencing it, prior to performing computationally intensive cross-validation. Visualizing such deterministic formulas in an interactive manner may lead to a better understanding of how genetic factors control prediction accuracy. The software to simulate deterministic formulas for genomic prediction accuracy was implemented in R and encapsulated as a web-based Shiny application. The Shiny genomic prediction accuracy simulator (ShinyGPAS) simulates various deterministic formulas and delivers dynamic scatter plots of prediction accuracy versus genetic factors impacting prediction accuracy, while requiring only mouse navigation in a web browser. ShinyGPAS is available at https://chikudaisei.shinyapps.io/shinygpas/. ShinyGPAS is a Shiny-based interactive genomic prediction accuracy simulator using deterministic formulas. It can be used for interactively exploring potential factors that influence prediction accuracy in genome-enabled prediction, simulating achievable prediction accuracy prior to genotyping individuals, or supporting in-class teaching. ShinyGPAS is open-source software and is hosted online as a freely available web-based resource with an intuitive graphical user interface.
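One example of such a deterministic formula is the Daetwyler-type expression r = sqrt(N h² / (N h² + M_e)), relating expected accuracy r to training-population size N, heritability h², and the number of independent chromosome segments M_e. It is one of several formulas a simulator like ShinyGPAS can display; the sketch below is ours:

```python
import math

def prediction_accuracy(n, h2, me):
    """Deterministic expectation of genomic prediction accuracy
    (Daetwyler-type formula): r = sqrt(n*h2 / (n*h2 + me))."""
    return math.sqrt(n * h2 / (n * h2 + me))

# accuracy rises with training size, all else equal
acc_small = prediction_accuracy(1000, 0.5, 5000)
acc_large = prediction_accuracy(10000, 0.5, 5000)
```

Sweeping one argument while holding the others fixed reproduces exactly the kind of accuracy-versus-factor scatter plot the simulator draws interactively.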
Automated Calibration For Numerical Models Of Riverflow
NASA Astrophysics Data System (ADS)
Fernandez, Betsaida; Kopmann, Rebekka; Oladyshkin, Sergey
2017-04-01
Calibration of numerical models has been fundamental since the beginning of all types of hydro-system modelling, to approximate the parameters that can mimic the overall system behaviour. Thus, an assessment of different deterministic and stochastic optimization methods is undertaken to compare their robustness, computational feasibility, and global search capacity. Also, the uncertainty of the most suitable methods is analyzed. These optimization methods minimize an objective function that comprises synthetic measurements and simulated data. Synthetic measurement data replace the observed data set to guarantee an existing parameter solution. The input data for the objective function derive from a hydro-morphological dynamics numerical model which represents a 180-degree bend channel. The hydro-morphological numerical model shows a high level of ill-posedness in the mathematical problem. The minimization of the objective function by the different candidate optimization methods indicates a failure of some of the gradient-based methods, such as Newton Conjugate Gradient and BFGS. Others reveal partial convergence, such as Nelder-Mead, Polak-Ribière, L-BFGS-B, Truncated Newton Conjugate Gradient, and Trust-Region Newton Conjugate Gradient. Further ones yield parameter solutions that range outside the physical limits, such as Levenberg-Marquardt and LeastSquareRoot. Moreover, there is a significant computational demand for genetic optimization methods, such as Differential Evolution and Basin-Hopping, as well as for brute-force methods. The deterministic Sequential Least Squares Programming and the stochastic Bayesian inference methods present the optimal optimization results. Keywords: automated calibration of hydro-morphological dynamic numerical models, Bayesian inference theory, deterministic optimization methods.
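The mechanics of such a comparison (several optimizers minimizing one misfit between synthetic measurements and model output) can be sketched with SciPy on a deliberately simple model, where, unlike the ill-posed morphodynamic case, every method should succeed:

```python
import numpy as np
from scipy.optimize import minimize

def objective(p):
    """Least-squares misfit between synthetic 'measurements' generated
    with known true parameters (slope 2, intercept 1) and the model."""
    x = np.linspace(0.0, 1.0, 50)
    synthetic = 2.0 * x + 1.0        # synthetic data: guaranteed solution
    prediction = p[0] * x + p[1]
    return np.sum((prediction - synthetic) ** 2)

# one derivative-free and one gradient-based candidate, as in the study
res_nm = minimize(objective, [0.0, 0.0], method="Nelder-Mead")
res_lb = minimize(objective, [0.0, 0.0], method="L-BFGS-B")
```

Using synthetic rather than observed data guarantees a known parameter solution to recover, which is exactly why the study substitutes synthetic measurements; the interesting behaviour (divergence, partial convergence, out-of-bounds solutions) only appears once the forward model is the ill-posed morphodynamic simulator.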
PSQP: Puzzle Solving by Quadratic Programming.
Andalo, Fernanda A; Taubin, Gabriel; Goldenstein, Siome
2017-02-01
In this article we present the first effective global-optimization method for the reconstruction of image puzzles comprising rectangular pieces: Puzzle Solving by Quadratic Programming (PSQP). The proposed mathematical formulation reduces the problem to the maximization of a constrained quadratic function, which is solved via a gradient ascent approach. The method is deterministic and can deal with arbitrary identical rectangular pieces. We provide experimental results showing its effectiveness compared to state-of-the-art approaches. Although the method was developed to solve image puzzles, we also show how to apply it to the reconstruction of simulated strip-shredded documents, broadening its applicability.
SDIO Producibility and Manufacturing Intelligent Processing Programs
NASA Technical Reports Server (NTRS)
Stottlemyer, Greg
1992-01-01
SDIO must fashion a comprehensive strategy for inserting industrial-base capability into ongoing design tradeoffs. There is a need not only to determine whether something can be made to the precision required for system performance, but also to identify what changes that industry sector must make to develop a deterministic approach to fabricating precision components. Developing and introducing advanced production and quality control systems is part of this success. To address this situation, SDIO developed the MODIL (Manufacturing Operations Development and Integration Labs) program. MODILs were developed in three areas: Survivable Optics, Electronics and Sensors, and Spacecraft Fabrication and Test.
KEGGParser: parsing and editing KEGG pathway maps in Matlab.
Arakelyan, Arsen; Nersisyan, Lilit
2013-02-15
The KEGG pathway database is a collection of manually drawn pathway maps accompanied by KGML-format files intended for use in automatic analysis. KGML files, however, do not contain the information required to completely reproduce all the events indicated in the static image of a pathway map. Several parsers and editors of KEGG pathways exist for processing KGML files. We introduce KEGGParser, a MATLAB-based tool for KEGG pathway parsing, semi-automatic fixing, editing, visualization and analysis in the MATLAB environment. It also works with Scilab. The source code is available at http://www.mathworks.com/matlabcentral/fileexchange/37561.
An Open-source Toolbox for Analysing and Processing PhysioNet Databases in MATLAB and Octave.
Silva, Ikaro; Moody, George B
The WaveForm DataBase (WFDB) Toolbox for MATLAB/Octave enables integrated access to PhysioNet's software and databases. Using the WFDB Toolbox for MATLAB/Octave, users have access to over 50 physiological databases in PhysioNet, comprising over 4 TB of biomedical signals including ECG, EEG, EMG, and PLETH. Additionally, most signals are accompanied by metadata such as medical annotations of clinical events: arrhythmias, sleep stages, seizures, hypotensive episodes, etc. Users of this toolbox should easily be able to reproduce, validate, and compare results published based on PhysioNet's software and databases.
Visualizing the inner product space ℝm×n in a MATLAB-assisted linear algebra classroom
NASA Astrophysics Data System (ADS)
Caglayan, Günhan
2018-05-01
This linear algebra note offers teaching and learning ideas for the treatment of the inner product space ℝm×n in a technology-supported learning environment. The classroom activities proposed in this note demonstrate creative ways of integrating MATLAB technology into various properties of the Frobenius inner product, as visualization tools that complement the algebraic approach. The article also incorporates algebraic and visual work of students who experienced these activities with MATLAB software in linear algebra lessons at a university in the United States. The connection between the Frobenius norm and the Euclidean norm is also emphasized.
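The central objects of the note are easy to compute directly. As a sketch (in Python rather than MATLAB), the Frobenius inner product ⟨A,B⟩ = trace(AᵀB) is just the sum of entrywise products, and the Frobenius norm of A equals the Euclidean norm of A's entries flattened into a vector, which is the connection the note emphasizes:

```python
import math

def frobenius_inner(A, B):
    # <A, B> = trace(A^T B) = sum of entrywise products
    return sum(a * b for ra, rb in zip(A, B) for a, b in zip(ra, rb))

def frobenius_norm(A):
    # induced norm: sqrt(<A, A>), equal to the Euclidean norm of vec(A)
    return math.sqrt(frobenius_inner(A, A))

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(frobenius_inner(A, B))   # 1*5 + 2*6 + 3*7 + 4*8
print(frobenius_norm(A))
```

In MATLAB the same quantities are one-liners (sum(sum(A.*B)) and norm(A, 'fro')), which is what makes the space ℝm×n so amenable to the visual, technology-assisted treatment the note describes.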
Das, Sumon Kumar; Klontz, Erik H; Azmi, Ishrat J; Ud-Din, Abu I M S; Chisti, Mohammod Jobayer; Afrad, Mokibul Hassan; Malek, Mohammad Abdul; Ahmed, Shahnawaz; Das, Jui; Talukder, Kaisar Ali; Salam, Mohammed Abdus; Bardhan, Pradip Kumar; Faruque, Abu Syed Golam; Klontz, Karl C
2013-12-22
We determined the frequency of multidrug-resistant (MDR) infections with Shigella spp. and Vibrio cholerae O1 at an urban (Dhaka) and a rural (Matlab) hospital in Bangladesh. We also compared sociodemographic and clinical features of patients with MDR infections to those with antibiotic-susceptible infections at both sites. Analyses were conducted using surveillance data from the International Centre for Diarrhoeal Disease Research, Bangladesh (icddr,b), for the years 2000-2012. Compared to patients with antibiotic-susceptible Shigella infections, those in Dhaka with MDR shigellosis were more likely to experience diarrhea for >24 hours, while in Matlab they were more likely to stay in hospital >24 hours. For MDR shigellosis, Dhaka patients were more likely than those in Matlab to have dehydration, stool frequency >10/day, and diarrheal duration >24 hours. Patients with MDR Vibrio cholerae O1 infections in Dhaka were more likely than those in Matlab to experience dehydration and stool frequency >10/day. Thus, patients with MDR shigellosis and Vibrio cholerae O1 infection exhibited features suggesting more severe illness than those with antibiotic-susceptible infections. Moreover, Dhaka patients with MDR shigellosis and Vibrio cholerae O1 infections exhibited features indicating more severe illness than patients in Matlab.
Modeling of diatomic molecule using the Morse potential and the Verlet algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fidiani, Elok
Molecular modeling usually relies on specialized Molecular Dynamics (MD) software such as GROMACS, NAMD, or JMOL. Molecular dynamics is a computational method for calculating the time-dependent behavior of a molecular system. In this work, MATLAB was used as a numerical tool for simple modeling of some diatomic molecules: HCl, H{sub 2} and O{sub 2}. MATLAB is matrix-based numerical software, so all the functions and equations describing the properties of atoms and molecules must be developed manually. A Morse potential was generated to describe the bond interaction between the two atoms, and to analyze the simultaneous motion of the molecules the Verlet algorithm, derived from Newton's equations of motion (classical mechanics), was applied. Both the Morse potential and the Verlet algorithm were integrated in MATLAB to derive physical properties and the trajectories of the molecules. Because the data computed by MATLAB are always in matrix form, Visual Molecular Dynamics (VMD) was used for visualization. This method is useful for developing and testing some types of interaction on a molecular scale, and can be very helpful for explaining basic principles of molecular interaction for educational purposes.
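The two ingredients combine as follows; this is a minimal sketch in Python rather than the authors' MATLAB code, with illustrative (not physical) values for the well depth, stiffness, equilibrium distance, and reduced mass. The bond is reduced to one relative coordinate r, the Morse potential V(r) = De(1 - e^(-a(r-re)))² supplies the force, and velocity Verlet advances the motion.

```python
import math

De, a, re = 4.0, 1.5, 1.0     # well depth, stiffness, equilibrium distance
mu = 1.0                      # reduced mass of the diatomic

def force(r):
    # F = -dV/dr for the Morse potential
    u = math.exp(-a * (r - re))
    return -2.0 * De * a * u * (1.0 - u)

def energy(r, v):
    # kinetic + Morse potential energy
    u = math.exp(-a * (r - re))
    return 0.5 * mu * v * v + De * (1.0 - u) ** 2

def verlet(r, v, dt, steps):
    # velocity Verlet: position update, then velocity from averaged forces
    f = force(r)
    for _ in range(steps):
        r += v * dt + 0.5 * (f / mu) * dt * dt
        f_new = force(r)
        v += 0.5 * (f + f_new) / mu * dt
        f = f_new
    return r, v

r0, v0 = 1.2, 0.0             # start slightly stretched, at rest
r1, v1 = verlet(r0, v0, 1e-3, 5000)
# velocity Verlet is symplectic: total energy should be well conserved
print(abs(energy(r1, v1) - energy(r0, v0)) < 1e-4)
```

The bond oscillates between its turning points while the total energy stays essentially constant, which is the behavior the MATLAB implementation verifies before exporting trajectories to VMD.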
Development of an energy storage tank model
NASA Astrophysics Data System (ADS)
Buckley, Robert Christopher
A linearized, one-dimensional model of energy storage tanks employing an implicit finite-difference method is developed, programmed in MATLAB, and demonstrated for different applications. A set of nodal energy equations is developed by considering the energy interactions on a small control volume. The general method of solving these equations is described, as are other features of the simulation program. Two modeling applications are presented: the first uses a hot-water storage tank with a solar collector and an absorption chiller to cool a building in the summer; the second uses a molten-salt storage system with a solar collector and steam power plant to generate electricity. Recommendations for further study, as well as all of the source code generated in the project, are also provided.
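An implicit (backward-Euler) finite-difference step of the kind described yields a tridiagonal system at each time level. A minimal sketch, assuming pure one-dimensional conduction with fixed-temperature boundaries (the tank model's actual nodal energy equations include more interactions than this):

```python
def thomas(a, b, c, d):
    # Thomas algorithm for a tridiagonal system:
    # a = sub-diagonal (a[0] unused), b = diagonal, c = super-diagonal, d = rhs
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def implicit_step(T, r, t_top, t_bot):
    # one backward-Euler step of (I - r*L) T_new = T_old, r = alpha*dt/dx^2
    n = len(T)
    a = [-r] * n; b = [1 + 2 * r] * n; c = [-r] * n
    d = T[:]
    # Dirichlet boundaries: pin first and last node
    b[0] = b[-1] = 1.0; a[0] = c[0] = a[-1] = c[-1] = 0.0
    d[0], d[-1] = t_top, t_bot
    return thomas(a, b, c, d)

T = [20.0] * 11                 # uniform initial temperature profile
for _ in range(200):
    T = implicit_step(T, r=0.5, t_top=80.0, t_bot=20.0)
# steady state of 1-D conduction is a linear profile; the midpoint tends to 50
print(round(T[5], 1))
```

Unlike an explicit update, the implicit step is unconditionally stable, so r is not limited by the usual diffusion stability bound; this is the main reason tank models of this kind use the implicit formulation.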
The Non-Signalling theorem in generalizations of Bell's theorem
NASA Astrophysics Data System (ADS)
Walleczek, J.; Grössing, G.
2014-04-01
Does "epistemic non-signalling" ensure the peaceful coexistence of special relativity and quantum nonlocality? The possibility of an affirmative answer is of great importance to deterministic approaches to quantum mechanics given recent developments towards generalizations of Bell's theorem. By generalizations of Bell's theorem we here mean efforts that seek to demonstrate the impossibility of any deterministic theories to obey the predictions of Bell's theorem, including not only local hidden-variables theories (LHVTs) but, critically, of nonlocal hidden-variables theories (NHVTs) also, such as de Broglie-Bohm theory. Naturally, in light of the well-established experimental findings from quantum physics, whether or not a deterministic approach to quantum mechanics, including an emergent quantum mechanics, is logically possible, depends on compatibility with the predictions of Bell's theorem. With respect to deterministic NHVTs, recent attempts to generalize Bell's theorem have claimed the impossibility of any such approaches to quantum mechanics. The present work offers arguments showing why such efforts towards generalization may fall short of their stated goal. In particular, we challenge the validity of the use of the non-signalling theorem as a conclusive argument in favor of the existence of free randomness, and therefore reject the use of the non-signalling theorem as an argument against the logical possibility of deterministic approaches. We here offer two distinct counter-arguments in support of the possibility of deterministic NHVTs: one argument exposes the circularity of the reasoning which is employed in recent claims, and a second argument is based on the inconclusive metaphysical status of the non-signalling theorem itself. 
We proceed by presenting an entirely informal treatment of key physical and metaphysical assumptions, and of their interrelationship, in attempts seeking to generalize Bell's theorem on the basis of an ontic, foundational interpretation of the non-signalling theorem. We here argue that the non-signalling theorem must instead be viewed as an epistemic, operational theorem i.e. one that refers exclusively to what epistemic agents can, or rather cannot, do. That is, we emphasize that the non-signalling theorem is a theorem about the operational inability of epistemic agents to signal information. In other words, as a proper principle, the non-signalling theorem may only be employed as an epistemic, phenomenological, or operational principle. Critically, our argument emphasizes that the non-signalling principle must not be used as an ontic principle about physical reality as such, i.e. as a theorem about the nature of physical reality independently of epistemic agents e.g. human observers. One major reason in favor of our conclusion is that any definition of signalling or of non-signalling invariably requires a reference to epistemic agents, and what these agents can actually measure and report. Otherwise, the non-signalling theorem would equal a general "no-influence" theorem. In conclusion, under the assumption that the non-signalling theorem is epistemic (i.e. "epistemic non-signalling"), the search for deterministic approaches to quantum mechanics, including NHVTs and an emergent quantum mechanics, continues to be a viable research program towards disclosing the foundations of physical reality at its smallest dimensions.
The Communications Toolbox for MATLAB and E0 3513 Laboratory Design
1994-03-24
(The original text here is OCR-garbled MATLAB laboratory code. The recoverable content concerns measuring quantization noise as a signal-to-noise ratio of a quantized signal, μ-law compression and expansion of the quantized signal using the maximum signal value, and plotting the spectra of two types of pulse-modulated signals, from Programming Laboratory 3B.)
Artificial Intelligence in Prediction of Secondary Protein Structure Using CB513 Database
Avdagic, Zikrija; Purisevic, Elvir; Omanovic, Samir; Coralic, Zlatan
2009-01-01
In this paper we describe CB513, a non-redundant dataset suitable for developing algorithms for the prediction of secondary protein structure. A program was written in Borland Delphi to transform data from the dataset into a form suitable for training a neural network for secondary-structure prediction implemented in the MATLAB Neural Network Toolbox. Network learning (training and testing) is investigated with different window sizes, different numbers of neurons in the hidden layer, and different numbers of training epochs, using the CB513 dataset. PMID:21347158
The research of "blind" spot in the LVQ network
NASA Astrophysics Data System (ADS)
Guo, Zhanjie; Nan, Shupo; Wang, Xiaoli
2017-04-01
Competitive neural networks are now widely used in pattern recognition, classification, and other areas, and show clear advantages over traditional clustering methods. They remain inadequate in several respects, however, and need further improvement. Based on the Learning Vector Quantization (LVQ) network proposed by Kohonen [1], this paper addresses the large training error that arises when a network contains "blind" spots, by introducing threshold-value learning rules, and implements the result in MATLAB.
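For context, a single LVQ1 update step looks as follows (a generic sketch, not the paper's modified rule): the winning prototype moves toward the sample when its class label matches, and away otherwise. A "blind" spot corresponds to a prototype that never wins and is therefore never updated.

```python
def dist2(p, x):
    # squared Euclidean distance between prototype p and sample x
    return sum((pi - xi) ** 2 for pi, xi in zip(p, x))

def lvq1_step(protos, labels, x, y, lr=0.1):
    # find the winning (nearest) prototype
    w = min(range(len(protos)), key=lambda i: dist2(protos[i], x))
    # attract on class match, repel on mismatch
    sign = 1.0 if labels[w] == y else -1.0
    protos[w] = [p + sign * lr * (xi - p) for p, xi in zip(protos[w], x)]
    return w

protos = [[0.0, 0.0], [1.0, 1.0]]
labels = [0, 1]
w = lvq1_step(protos, labels, [0.2, 0.0], 0)
print(w, [round(v, 2) for v in protos[0]])
```

Only the winner is touched on each step, which is precisely why a badly placed prototype can starve: the threshold-value rules the paper introduces are one way to pull such never-winning prototypes back into play.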
Nearly Interactive Parabolized Navier-Stokes Solver for High Speed Forebody and Inlet Flows
NASA Technical Reports Server (NTRS)
Benson, Thomas J.; Liou, May-Fun; Jones, William H.; Trefny, Charles J.
2009-01-01
A system of computer programs is being developed for the preliminary design of high speed inlets and forebodies. The system comprises four functions: geometry definition, flow grid generation, flow solver, and graphics post-processor. The system runs on a dedicated personal computer using the Windows operating system and is controlled by graphical user interfaces written in MATLAB (The Mathworks, Inc.). The flow solver uses the Parabolized Navier-Stokes equations to compute millions of mesh points in several minutes. Sample two-dimensional and three-dimensional calculations are demonstrated in the paper.
NASA Astrophysics Data System (ADS)
Mann, Christopher; Narasimhamurthi, Natarajan
1998-08-01
This paper discusses a specific implementation of a web- and component-based simulation system. The overall simulation container is implemented within a web page viewed with Microsoft's Internet Explorer 4.0 web browser. Microsoft's ActiveX/Distributed Component Object Model object interfaces are used in conjunction with the Microsoft DirectX graphics APIs to provide visualization functionality for the simulation. The MathWorks' MATLAB computer-aided control system design program is used as an ActiveX automation server to provide the compute engine for the simulations.
Nyström, Pär; Falck-Ytter, Terje; Gredebäck, Gustaf
2016-06-01
This article describes a new open source scientific workflow system, the TimeStudio Project, dedicated to the behavioral and brain sciences. The program is written in MATLAB and features a graphical user interface for the dynamic pipelining of computer algorithms developed as TimeStudio plugins. TimeStudio includes both a set of general plugins (for reading data files, modifying data structures, visualizing data structures, etc.) and a set of plugins specifically developed for the analysis of event-related eyetracking data as a proof of concept. It is possible to create custom plugins to integrate new or existing MATLAB code anywhere in a workflow, making TimeStudio a flexible workbench for organizing and performing a wide range of analyses. The system also features an integrated sharing and archiving tool for TimeStudio workflows, which can be used to share workflows both during the data analysis phase and after scientific publication. TimeStudio thus facilitates the reproduction and replication of scientific studies, increases the transparency of analyses, and reduces individual researchers' analysis workload. The project website ( http://timestudioproject.com ) contains the latest releases of TimeStudio, together with documentation and user forums.
Optimization Routine for Generating Medical Kits for Spaceflight Using the Integrated Medical Model
NASA Technical Reports Server (NTRS)
Graham, Kimberli; Myers, Jerry; Goodenow, Deb
2017-01-01
The Integrated Medical Model (IMM) is a MATLAB model that provides probabilistic assessment of the medical risk associated with human spaceflight missions. Different simulations or profiles can be run in which input conditions regarding both mission characteristics and crew characteristics may vary. For each simulation, the IMM records the total medical events that occur and “treats” each event with resources drawn from import scripts. IMM outputs include Total Medical Events (TME), Crew Health Index (CHI), probability of Evacuation (pEVAC), and probability of Loss of Crew Life (pLOCL). The Crew Health Index is determined by the amount of quality time lost (QTL). Previously, an optimization code was implemented in order to efficiently generate medical kits. The kits were optimized to have the greatest benefit possible, given a mass and/or volume constraint. A 6-crew, 14-day lunar mission was chosen for the simulation and run through the IMM for 100,000 trials. A built-in MATLAB solver, mixed-integer linear programming, was used for the optimization routine. Kits were generated in 10% increments ranging from 10%-100% of the benefit constraints. Conditions where mass alone was minimized, where volume alone was minimized, and where mass and volume were minimized jointly were tested.
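The kit-generation problem has the flavor of a knapsack: maximize total medical benefit subject to a mass (and/or volume) budget. The IMM uses MATLAB's mixed-integer linear programming solver; the sketch below solves a toy version by dynamic programming, with entirely made-up items and numbers.

```python
def best_kit(items, mass_budget):
    # items: list of (name, mass, benefit) with integer masses;
    # 0/1 knapsack by DP over used-mass states
    best = {0: (0.0, [])}                      # used mass -> (benefit, names)
    for name, mass, benefit in items:
        # iterate a snapshot, heaviest states first, so each item is used once
        for used, (b, chosen) in sorted(best.items(), reverse=True):
            m = used + mass
            if m <= mass_budget and b + benefit > best.get(m, (-1,))[0]:
                best[m] = (b + benefit, chosen + [name])
    return max(best.values())

items = [("bandage", 2, 3.0), ("analgesic", 3, 4.0),
         ("iv_kit", 4, 5.0), ("splint", 5, 6.0)]
benefit, chosen = best_kit(items, mass_budget=9)
print(benefit, sorted(chosen))
```

The real routine is richer (benefits come from 100,000 IMM trials, and mass and volume can be constrained jointly), but the structure is the same: a discrete selection problem whose optimum a mixed-integer solver can certify.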
Optimization design of wind turbine drive train based on Matlab genetic algorithm toolbox
NASA Astrophysics Data System (ADS)
Li, R. N.; Liu, X.; Liu, S. J.
2013-12-01
To ensure high efficiency of the whole flexible drive train of a front-end speed-adjusting wind turbine, the working principle of the main part of the drive train is analyzed. The rotating speed ratios of three planetary gear trains are selected as the critical parameters under study. A mathematical model of the torque converter speed ratio is established based on these three critical variables, and the effect of the key parameters on the efficiency of the hydraulic-mechanical transmission is analyzed. Based on torque balance and energy balance, and with reference to the hydraulic-mechanical transmission characteristics, a transmission-efficiency expression for the whole drive train is established. The fitness function and the constraint functions are established from the drive-train transmission efficiency and the admissible range of the torque converter speed ratio, respectively, and the optimization is carried out using the MATLAB genetic algorithm toolbox. The optimization method and results provide a program for exactly matching the wind turbine rotor, gearbox, hydraulic-mechanical transmission, hydraulic torque converter and synchronous generator, ensure that the drive train works at high efficiency, and give a reference for the selection of the torque converter and hydraulic-mechanical transmission.
Solving delay differential equations in S-ADAPT by method of steps.
Bauer, Robert J; Mo, Gary; Krzyzanski, Wojciech
2013-09-01
S-ADAPT is a version of the ADAPT program that contains additional simulation and optimization abilities such as parametric population analysis. S-ADAPT uses LSODA, an algorithm designed for large-dimension non-stiff and stiff problems, to solve ordinary differential equations (ODEs). However, S-ADAPT does not have a solver for delay differential equations (DDEs). Our objective was to implement in S-ADAPT a DDE solver using the method of steps, which allows one to solve virtually any DDE system by transforming it into an ODE system. The solver was validated for scalar linear DDEs with one delay and bolus and infusion inputs, for which explicit analytic solutions were derived. Solutions of nonlinear DDE problems coded in S-ADAPT were validated by comparing them with those obtained by the MATLAB DDE solver dde23. Parameter estimation was tested on MATLAB-simulated population pharmacodynamic data. S-ADAPT solutions of the DDE problems agreed with both the explicit solutions and the MATLAB-produced solutions to at least 7 significant digits. The population parameter estimates obtained using importance sampling expectation-maximization in S-ADAPT agreed with those used to generate the data. Published by Elsevier Ireland Ltd.
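The method of steps can be illustrated on the scalar linear test equation y'(t) = -y(t-1) with history y(t) = 1 for t ≤ 0 (a standard textbook example, not necessarily one of the paper's validation cases): on each interval of length one, the delayed term is already known from the previous interval, so the DDE reduces to an ODE. In the Python sketch below, the grid step divides the delay, so the delayed value is always a stored grid point.

```python
def solve_dde(t_end, tau=1.0, n_per_tau=1000):
    # method of steps on y'(t) = -y(t - tau), history y = 1 on [-tau, 0];
    # because h divides tau, y(t - tau) is an already-computed grid value
    h = tau / n_per_tau
    y = [1.0] * (n_per_tau + 1)          # history samples on [-tau, 0]
    for _ in range(int(round(t_end / h))):
        i = len(y) - 1                   # index of the current time t
        f0 = -y[i - n_per_tau]           # y'(t)     from y(t - tau)
        f1 = -y[i + 1 - n_per_tau]       # y'(t + h) from y(t + h - tau)
        y.append(y[i] + 0.5 * h * (f0 + f1))   # trapezoidal step
    return y[-1]

# by hand via the method of steps: y = 1 - t on [0,1], so y(1) = 0,
# and integrating y' = t - 2 on [1,2] gives y(2) = -1/2
print(solve_dde(1.0), solve_dde(2.0))
```

Since the right-hand side depends only on the (known) delayed value, the trapezoidal step is exact for the piecewise-polynomial solution here; a general solver such as dde23 does the same interval-by-interval bookkeeping with adaptive Runge-Kutta steps and interpolated history.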
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Y M; Bush, K; Han, B
Purpose: Accurate and fast dose calculation is a prerequisite of precision radiation therapy in modern photon and particle therapy. While Monte Carlo (MC) dose calculation provides high dosimetric accuracy, the drastically increased computational time hinders its routine use. Deterministic dose calculation methods are fast, but problematic in the presence of tissue density inhomogeneity. We leverage the useful features of deterministic methods and MC to develop a hybrid dose calculation platform with autonomous utilization of MC and deterministic calculation depending on the local geometry, for optimal accuracy and speed. Methods: Our platform utilizes a Geant4-based “localized Monte Carlo” (LMC) method that isolates MC dose calculations to only those volumes that have potential for dosimetric inaccuracy. In our approach, additional structures are created encompassing heterogeneous volumes. Deterministic methods calculate dose and energy fluence up to the volume surfaces, where the energy fluence distribution is sampled into discrete histories and transported using MC. Histories exiting the volume are converted back into energy fluence and transported deterministically. By matching boundary conditions at both interfaces, the deterministic dose calculation accounts for dose perturbations “downstream” of localized heterogeneities. Hybrid dose calculation was performed for water and anthropomorphic phantoms. Results: We achieved <1% agreement between deterministic and MC calculations in the water benchmark for photon and proton beams, and dose differences of 2%-15% could be observed in heterogeneous phantoms. The saving in computational time (a factor of ∼4-7 compared to a full Monte Carlo dose calculation) was found to be approximately proportional to the volume of the heterogeneous region.
Conclusion: Our hybrid dose calculation approach takes advantage of the computational efficiency of deterministic methods and the accuracy of MC, providing a practical tool for high-performance dose calculation in modern RT. The approach is generalizable to all modalities where heterogeneities play a large role, notably particle therapy.
Three-dimensional rendering of segmented object using matlab - biomed 2010.
Anderson, Jeffrey R; Barrett, Steven F
2010-01-01
The three-dimensional rendering of microscopic objects is a difficult and challenging task that often requires specialized image-processing techniques. Previous work described a semi-automatic segmentation process for fluorescently stained neurons collected as a sequence of slice images with a confocal laser scanning microscope. Once properly segmented, each individual object can be rendered and studied as a three-dimensional virtual object. This paper describes the design and development of Matlab files to create three-dimensional images from the segmented object data previously mentioned. Part of the motivation for this work is to integrate both the segmentation and rendering processes into one software application, providing a seamless transition from the segmentation tasks to the rendering and visualization tasks. Previously these tasks were accomplished on two different computer systems, Windows and Linux, which limits the usefulness of the segmentation and rendering applications to those who have both systems readily available. The focus of this work is to create custom Matlab image-processing algorithms for object rendering and visualization, and to merge these capabilities with the Matlab files that were developed especially for the image segmentation task. The completed Matlab application will contain both the segmentation and rendering processes in a single graphical user interface (GUI). Rendering three-dimensional images in Matlab requires that a sequence of two-dimensional binary images, each representing a cross-sectional slice of the object, be reassembled in a 3D space and covered with a surface. Additional segmented objects can be rendered in the same 3D space. The surface properties of each object can be varied by the user to aid in the study and analysis of the objects. This interactive process becomes a powerful visual tool to study and understand microscopic objects.
A platform for dynamic simulation and control of movement based on OpenSim and MATLAB
Mansouri, Misagh; Reinbolt, Jeffrey A.
2013-01-01
Numerical simulations play an important role in solving complex engineering problems and have the potential to revolutionize medical decision making and treatment strategies. In this paper, we combine the rapid model-based design, control systems and powerful numerical method strengths of MATLAB/Simulink with the simulation and human movement dynamics strengths of OpenSim by developing a new interface between the two software tools. OpenSim is integrated with Simulink using the MATLAB S-function mechanism, and the interface is demonstrated using both open-loop and closed-loop control systems. While the open-loop system uses MATLAB/Simulink to separately reproduce the OpenSim Forward Dynamics Tool, the closed-loop system adds the unique feature of feedback control to OpenSim, which is necessary for most human movement simulations. An arm model example was successfully used in both open-loop and closed-loop cases. For the open-loop case, the simulation reproduced results from the OpenSim Forward Dynamics Tool with root mean square (RMS) differences of 0.03° for the shoulder elevation angle and 0.06° for the elbow flexion angle. MATLAB’s variable step-size integrator reduced the time required to generate the forward dynamic simulation from 7.1 s (OpenSim) to 2.9 s (MATLAB). For the closed-loop case, a proportional–integral–derivative controller was used to successfully balance a pole on the model’s hand despite random force disturbances on the pole. The new interface presented here not only integrates the OpenSim and MATLAB/Simulink software tools, but also will allow neuroscientists, physiologists, biomechanists, and physical therapists to adapt and generate new solutions as treatments for musculoskeletal conditions. PMID:22464351
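The closed-loop idea can be sketched independently of OpenSim: a proportional-integral-derivative law stabilizing a linearized inverted pendulum (theta'' = (g/L)·theta - u), which is the essence of balancing a pole. The gains, geometry, and fixed-step integrator below are illustrative choices, not taken from the paper's arm-and-pole model.

```python
def simulate_pid(kp=30.0, ki=20.0, kd=8.0, dt=1e-3, t_end=10.0):
    # linearized inverted pendulum with unit length: theta'' = 9.81*theta - u
    g_over_l = 9.81
    theta, omega, integral = 0.1, 0.0, 0.0   # initial tilt of 0.1 rad
    for _ in range(int(t_end / dt)):
        u = kp * theta + ki * integral + kd * omega   # PID control law
        alpha = g_over_l * theta - u                  # angular acceleration
        omega += alpha * dt                           # semi-implicit Euler
        theta += omega * dt
        integral += theta * dt
    return theta

# with the controller the tilt decays to ~0; without it the pendulum falls
print(abs(simulate_pid()) < 1e-3)
```

Note that kp must exceed g/L for the proportional term alone to overcome the destabilizing gravity torque; the derivative term damps the response and the integral term removes steady-state offset, the same roles the controller plays against the random pole disturbances in the paper's demo.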
Developing and Implementing the Data Mining Algorithms in RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea
The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. It is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data; post-processing and analyzing such data might, in some cases, take longer than the initial software runtime. Data mining algorithms help in recognizing and understanding patterns in the data, and thus in discovering knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation-controller codes such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation, while the system/physics codes model the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is analyzing the large number of scenarios generated; data mining techniques are typically used to better organize and understand the data, i.e., to recognize patterns in it. This report focuses on the development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and on the application of these algorithms to different databases.
Rule-based programming paradigm: a formal basis for biological, chemical and physical computation.
Krishnamurthy, V; Krishnamurthy, E V
1999-03-01
A rule-based programming paradigm is described as a formal basis for biological, chemical and physical computations. In this paradigm, computations are interpreted as the outcome of interactions among elements in an object space. The interactions can create new elements (or the same elements with modified attributes) or annihilate old elements according to specific rules. Since the interaction rules are inherently parallel, any number of actions can be performed cooperatively or competitively among subsets of elements, so that the elements evolve toward an equilibrium, unstable, or chaotic state. Such an evolution may retain certain invariant properties of the attributes of the elements. The object space resembles a Gibbsian ensemble, corresponding to a distribution of points in the space of positions and momenta (phase space), which permits the introduction of probabilities in rule applications. As each element of the ensemble changes over time, its phase point is carried into a new phase point, and the evolution of this probability cloud in phase space corresponds to a distributed probabilistic computation. Thus, the paradigm can handle deterministic exact computation when the initial conditions are exactly specified and the trajectory of evolution is deterministic, and it can handle a probabilistic mode of computation when we want to derive macroscopic or bulk properties of matter. We also explain how to support this rule-based paradigm using relational-database-style query processing and transactions.
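A minimal sketch of the paradigm (in Python, with a deterministic application order rather than the parallel, probabilistic one the paper formalizes): an interaction rule consumes two elements of the object space and creates one new element, repeating until no further interaction is possible.

```python
def rewrite(pool, rule):
    # multiset rewriting: each interaction annihilates two elements and
    # creates rule(x, y); with a deterministic order the result is exact
    pool = list(pool)
    while len(pool) > 1:
        x = pool.pop()
        y = pool.pop()
        pool.append(rule(x, y))
    return pool[0]

nums = [3, 1, 4, 1, 5]
print(rewrite(nums, lambda x, y: x + y))      # total as invariant-preserving rule
print(rewrite(nums, max))                     # maximum survives all interactions
```

Both rules are associative and commutative, so the outcome is independent of the interaction order, which is what lets the same computation be read as cooperative parallel interactions in the object space; non-commutative rules are where the probabilistic, ensemble view of the paper becomes essential.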
Automatic Design of Synthetic Gene Circuits through Mixed Integer Non-linear Programming
Huynh, Linh; Kececioglu, John; Köppe, Matthias; Tagkopoulos, Ilias
2012-01-01
Automatic design of synthetic gene circuits poses a significant challenge to synthetic biology, primarily due to the complexity of biological systems and the lack of rigorous optimization methods that can cope with the combinatorial explosion as the number of biological parts increases. Current optimization methods for synthetic gene design rely on heuristic algorithms that are usually not deterministic, deliver sub-optimal solutions, and provide no guarantees of convergence or error bounds. Here, we introduce an optimization framework for the problem of part selection in synthetic gene circuits that is based on mixed integer non-linear programming (MINLP), a deterministic method that finds the globally optimal solution and guarantees convergence in finite time. Given a synthetic gene circuit, a library of characterized parts, and user-defined constraints, our method can find the optimal selection of parts that satisfies the constraints and best approximates the objective function given by the user. We evaluated the proposed method in the design of three synthetic circuits (a toggle switch, a transcriptional cascade, and a band detector), with both experimentally constructed and synthetic promoter libraries. Scalability and robustness analysis shows that the proposed framework scales well with the library size and the solution space. The work described here is a step towards a unifying, realistic framework for the automated design of biological circuits. PMID:22536398
Linking Advanced Visualization and MATLAB for the Analysis of 3D Gene Expression Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruebel, Oliver; Keranen, Soile V.E.; Biggin, Mark
Three-dimensional gene expression PointCloud data generated by the Berkeley Drosophila Transcription Network Project (BDTNP) provides quantitative information about the spatial and temporal expression of genes in early Drosophila embryos at cellular resolution. The BDTNP team visualizes and analyzes PointCloud data using the software application PointCloudXplore (PCX). To maximize the impact of novel, complex data sets, such as PointClouds, the data needs to be accessible to biologists and comprehensible to developers of analysis functions. We address this challenge by linking PCX and Matlab via a dedicated interface, thereby providing biologists seamless access to advanced data analysis functions and giving bioinformatics researchers the opportunity to integrate their analysis directly into the visualization application. To demonstrate the usefulness of this approach, we computationally model parts of the expression pattern of the gene even skipped using a genetic algorithm implemented in Matlab and integrated into PCX via our Matlab interface.
NASA Astrophysics Data System (ADS)
Vlasayevsky, Stanislav; Klimash, Stepan; Klimash, Vladimir
2017-10-01
A set of mathematical modules was developed for evaluating energy performance in studies of electrical systems and complexes in MatLab. The SimPowerSystems electrotechnical library of the MatLab software contains no measuring modules for the energy coefficients characterizing the quality of electricity or the energy efficiency of electrical apparatus. The modules presented here calculate the energy coefficients characterizing the quality of electricity (current distortion and voltage distortion) and energy efficiency indicators (power factor and efficiency). The methods and principles of building the modules are described. Detailed schemes of the modules, built from elements of the Simulink Library, are presented; as a consequence, these modules are compatible with mathematical models of electrical systems and complexes in MatLab. Also presented are the results of testing the developed modules and of their verification against schemes that have analytical expressions for the energy indicators.
DSISoft—a MATLAB VSP data processing package
NASA Astrophysics Data System (ADS)
Beaty, K. S.; Perron, G.; Kay, I.; Adam, E.
2002-05-01
DSISoft is a public domain vertical seismic profile processing software package developed at the Geological Survey of Canada. DSISoft runs under MATLAB version 5.0 and above and hence is portable between computer operating systems supported by MATLAB (i.e. Unix, Windows, Macintosh, Linux). The package includes modules for reading and writing various standard seismic data formats, data editing, sorting, filtering, and other basic processing tasks. The processing sequence can be scripted, allowing batch processing and easy documentation. A structured format has been developed to ensure future additions to the package are compatible with existing modules. Interactive modules have been created using MATLAB's graphical user interface builder for displaying seismic data, picking first break times, examining frequency spectra, doing f-k filtering, and plotting the trace header information. DSISoft's modular design facilitates the incorporation of new processing algorithms as they are developed. This paper gives an overview of the scope of the software and serves as a guide for the addition of new modules.
Stability analysis of multi-group deterministic and stochastic epidemic models with vaccination rate
NASA Astrophysics Data System (ADS)
Wang, Zhi-Gang; Gao, Rui-Mei; Fan, Xiao-Ming; Han, Qi-Xing
2014-09-01
We discuss in this paper a deterministic multi-group MSIR epidemic model with a vaccination rate. The basic reproduction number ℛ0, a key parameter in epidemiology, is a threshold that determines the persistence or extinction of the disease. Using Lyapunov function techniques, we show that if ℛ0 is greater than 1 and the deterministic model obeys some conditions, then the disease prevails: the infective persists and the endemic state is asymptotically stable in a feasible region. If ℛ0 is less than or equal to 1, the infective disappears and the disease dies out. In addition, stochastic noise around the endemic equilibrium is added to the deterministic MSIR model, extending it to a system of stochastic ordinary differential equations. In the stochastic version, we carry out a detailed analysis of the asymptotic behavior of the stochastic model. Regarding the value of ℛ0, when the stochastic system obeys some conditions and ℛ0 is greater than 1, we deduce that the stochastic system is stochastically asymptotically stable. Finally, the deterministic and stochastic model dynamics are illustrated through computer simulations.
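The threshold role of ℛ0 can be seen in a minimal Python sketch of a single-group SIR model with vaccination at birth (an illustrative simplification, not the paper's multi-group MSIR system; parameter values are assumptions chosen for demonstration):

```python
def r0(beta, gamma, mu, p):
    """Basic reproduction number when a fraction p is vaccinated at birth."""
    return beta * (1.0 - p) / (gamma + mu)

def simulate(beta, gamma, mu, p, t_end=2000.0, dt=0.01):
    """Forward-Euler integration of the vaccinated SIR equations;
    returns the fraction of infectives at t_end."""
    s, i = 1.0 - p, 1e-3
    for _ in range(int(t_end / dt)):
        ds = mu * (1.0 - p) - beta * s * i - mu * s
        di = beta * s * i - (gamma + mu) * i
        s += dt * ds
        i += dt * di
    return i
```

With no vaccination the disease settles at a positive endemic level (ℛ0 > 1); raising the vaccination rate pushes ℛ0 below 1 and the infective fraction decays to zero, matching the threshold behavior the abstract describes.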
NASA Astrophysics Data System (ADS)
Błażejewski, Ryszard; Murat-Błażejewska, Sadżide; Jędrkowiak, Martyna
2014-09-01
The paper presents a water balance of a flow-through, dammed lake, consisting of the following terms: surface inflow, underground inflow/outflow based on the Dupuit equation, precipitation on the lake surface, evaporation from the water surface, and outflow from the lake at a cross-section closed by a damming weir. Because these balance components depend non-linearly on the lake water level, the balance equation was implemented in Matlab-Simulink®. The applicability of the model was assessed on the example of the Sławianowskie Lake, of surface area 276 ha and mean depth 6.6 m, which was divided into two basins of differing depth. Water balances, performed for monthly time intervals in the hydrological year 2009, showed good agreement with measurements for the first three months only. It is concluded that the balancing time interval should be shortened to 1 day to minimize the errors. Calibration would be easier and more adequate if the hydraulic conductivity of the soils and bottom sediments adjacent to the lake were estimated from water levels measured in piezometers located in several transects perpendicular to the shoreline.
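A daily-time-step water balance of this kind can be sketched in a few lines of Python (a minimal illustration with assumed simplified terms; the paper's Simulink model includes a Dupuit groundwater term and a weir rating curve not reproduced here):

```python
AREA = 2.76e6  # lake surface area in m^2 (276 ha, from the abstract)

def step_storage(storage, inflow, gw_net, precip_m, evap_m, outflow,
                 dt=86400.0):
    """Advance lake storage (m^3) by one daily time step.
    inflow, gw_net, outflow are in m^3 s^-1; precip_m and evap_m are
    depths in metres per day applied over the lake surface."""
    dv = (inflow + gw_net - outflow) * dt + (precip_m - evap_m) * AREA
    return storage + dv
```

In the real model each flux would itself be a (non-linear) function of the simulated water level, which is why the authors used a Simulink feedback loop rather than a simple explicit update.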
Computer programming in the UK undergraduate mathematics curriculum
NASA Astrophysics Data System (ADS)
Sangwin, Christopher J.; O'Toole, Claire
2017-11-01
This paper reports a study which investigated the extent to which undergraduate mathematics students in the United Kingdom are currently taught to programme a computer as a core part of their mathematics degree programme. We undertook an online survey, with significant follow-up correspondence, to gather data on current curricula and received replies from 46 (63%) of the departments that teach a BSc mathematics degree. We found that 78% of BSc degree courses in mathematics included computer programming in a compulsory module, but 11% of mathematics degree programmes do not teach programming to all their undergraduate mathematics students. In 2016, programming is most commonly taught to undergraduate mathematics students through imperative languages, notably MATLAB, using numerical analysis as the underlying (or parallel) mathematical subject matter. Statistics is a very popular choice in optional courses, using the package R. Computer algebra systems appear to be significantly less popular for compulsory first-year courses than a decade ago, and there was no mention of logic programming, functional programming or automatic theorem proving software. The modal form of assessment of computing modules is entirely by coursework (i.e. no examination).
Wang, Dongmei; Yu, Liniu; Zhou, Xianlian; Wang, Chengtao
2004-02-01
Four types of 3D mathematical models of the muscle groups applied to the human mandible have been developed. One is based on electromyography (EMG) and the others are based on linear programming with different objective functions. Each model contains 26 muscle forces and two joint forces, allowing simulation of static bite forces and concomitant joint reaction forces for various bite point locations and mandibular positions. In this paper, an image-processing method was used to measure the positions and directions of the muscle forces on a 3D CAD model built from CT data. The Matlab optimization toolbox is applied to solve the three models based on linear programming. Results show that the model with an objective function requiring a minimum sum of the tensions in the muscles is reasonable and agrees very well with normal physiological activity.
Excel spreadsheet in teaching numerical methods
NASA Astrophysics Data System (ADS)
Djamila, Harimi
2017-09-01
One of the important objectives in teaching numerical methods to undergraduate students is to bring about comprehension of numerical method algorithms. Although manual calculation is important for understanding the procedure, it is time consuming and prone to error; this is specifically the case for the iteration procedures used in many numerical methods. Currently, many commercial programs are useful in teaching numerical methods, such as Matlab, Maple, and Mathematica, but these are usually not user-friendly for the uninitiated. An Excel spreadsheet offers an initial level of programming, which can be used either on or off campus, and the students will not be distracted by writing code. It must be emphasized that general commercial software should still be introduced later for more elaborate problems. This article reports on a strategy for teaching numerical methods in undergraduate engineering programmes. It is directed at students, lecturers and researchers in the engineering field.
NASA Astrophysics Data System (ADS)
Popov, Vyacheslav; Ermakov, Alexander; Mukhamedzhanova, Olga
2017-10-01
Sustainable development of trailering in Russia needs energy-efficient and environmentally safe localization of the supporting infrastructure, which includes customer services such as enterprises of hospitality (campings). Their rational placement minimizes not only the fuel consumption of vehicles but also the emissions of harmful substances into the atmosphere. The article presents the rational localization of sites for the construction of such enterprises using a MATLAB program. The program provides several levels of solution of the task: from the overall characterization of the territory (the head interface) to the analysis of possible forwarding charges for visits to car-service enterprises (petrol stations, automobile spare-parts shops, car-repair enterprises, cafes, campings and so on). The optimization implemented by the program, by the criterion of decreased energy costs, allows the preferable fields of rational localization to be established.
Automated calculation of surface energy fluxes with high-frequency lake buoy data
Woolway, R. Iestyn; Jones, Ian D; Hamilton, David P.; Maberly, Stephen C; Muroaka, Kohji; Read, Jordan S.; Smyth, Robyn L; Winslow, Luke A.
2015-01-01
Lake Heat Flux Analyzer is a program used for calculating the surface energy fluxes in lakes according to established literature methodologies. The program was developed in MATLAB for the rapid analysis of high-frequency data from instrumented lake buoys in support of the emerging field of aquatic sensor network science. To calculate the surface energy fluxes, the program requires a number of input variables, such as air and water temperature, relative humidity, wind speed, and short-wave radiation. Available outputs for Lake Heat Flux Analyzer include the surface fluxes of momentum, sensible heat, and latent heat, their corresponding transfer coefficients, and incoming and outgoing long-wave radiation. Lake Heat Flux Analyzer is open source and can be used to process data from multiple lakes rapidly. It provides a means of calculating the surface fluxes using a consistent method, thereby facilitating global comparisons of high-frequency data from lake buoys.
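The bulk-transfer formulas underlying such surface-flux calculations can be sketched in Python (standard textbook forms with nominal constants and assumed fixed transfer coefficients; Lake Heat Flux Analyzer's actual algorithms add atmospheric stability corrections and other refinements omitted here):

```python
RHO_AIR = 1.2      # air density, kg m^-3 (nominal)
CP_AIR = 1005.0    # specific heat of air, J kg^-1 K^-1
LV = 2.45e6        # latent heat of vaporisation, J kg^-1
CH = CE = 1.3e-3   # bulk transfer coefficients (assumed, neutral stability)

def sensible_heat_flux(wind, t_air, t_water):
    """Bulk sensible heat flux in W m^-2; positive = heat lost by the lake."""
    return RHO_AIR * CP_AIR * CH * wind * (t_water - t_air)

def latent_heat_flux(wind, q_air, q_surface):
    """Bulk latent heat flux in W m^-2; q_* are specific humidities
    (kg kg^-1), with q_surface the saturation value at the water surface."""
    return RHO_AIR * LV * CE * wind * (q_surface - q_air)
```

For a 5 m/s wind over water 5 K warmer than the air, the sensible flux is on the order of a few tens of W m^-2, which is the magnitude range typically reported for lake buoys.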
Acoustics Research of Propulsion Systems
NASA Technical Reports Server (NTRS)
Gao, Ximing; Houston, Janice D.
2014-01-01
The liftoff phase induces some of the highest acoustic loading over a broad frequency range for a launch vehicle. These external acoustic environments are used in the prediction of the internal vibration responses of the vehicle and components. Thus, predicting these liftoff acoustic environments is critical to the design requirements of any launch vehicle, but there are challenges. Present liftoff vehicle acoustic environment prediction methods utilize stationary data from previously conducted hold-down tests, i.e. static firings conducted in the 1960s, to generate 1/3 octave band Sound Pressure Level (SPL) spectra. These data sets are used to predict the liftoff acoustic environments for launch vehicles. To improve the accuracy and quality of acoustic loading predictions at liftoff for future launch vehicles such as the Space Launch System (SLS), non-stationary flight data from the Ares I-X were processed in PC-Signal in two forms, which included a simulated hold-down phase and the entire launch phase. In conjunction, the Prediction of Acoustic Vehicle Environments (PAVE) program was developed in MATLAB to allow for efficient predictions of sound pressure levels (SPLs) as a function of station number along the vehicle using semiempirical methods. This consisted, initially, of generating the Dimensionless Spectrum Function (DSF) and Dimensionless Source Location (DSL) curves from the Ares I-X flight data. These are then used in the MATLAB program to generate the 1/3 octave band SPL spectra. Concluding results show major differences in SPLs between the hold-down test data and the processed Ares I-X flight data, making the Ares I-X flight data more practical for future vehicle acoustic environment predictions.
fissioncore: A desktop-computer simulation of a fission-bomb core
NASA Astrophysics Data System (ADS)
Cameron Reed, B.; Rohe, Klaus
2014-10-01
A computer program, fissioncore, has been developed to deterministically simulate the growth of the number of neutrons within an exploding fission-bomb core. The program allows users to explore the dependence of criticality conditions on parameters such as nuclear cross-sections, core radius, number of secondary neutrons liberated per fission, and the distance between nuclei. Simulations clearly illustrate the existence of a critical radius given a particular set of parameter values, as well as how the exponential growth of the neutron population (the condition that characterizes criticality) depends on these parameters. No understanding of neutron diffusion theory is necessary to appreciate the logic of the program or the results. The code is freely available in FORTRAN, C, and Java and is configured so that modifications to accommodate more refined physical conditions are possible.
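The existence of a critical radius can be illustrated with a deliberately crude one-parameter model in Python (an illustrative toy, not the fissioncore algorithm: it assumes a neutron born in a core of radius R causes a fission with probability 1 - exp(-R/lam) before escaping, where lam is an assumed fission mean free path and nu the number of secondaries per fission):

```python
import math

def multiplication_factor(radius, lam, nu):
    """Expected secondaries per neutron in this toy model."""
    return nu * (1.0 - math.exp(-radius / lam))

def critical_radius(lam, nu):
    """Radius at which the multiplication factor equals 1, i.e. the
    solution of nu * (1 - exp(-R/lam)) = 1 for R."""
    return -lam * math.log(1.0 - 1.0 / nu)

def neutron_population(radius, lam, nu, n0=1.0, generations=20):
    """Deterministic generation-by-generation growth N_{g+1} = k * N_g."""
    k = multiplication_factor(radius, lam, nu)
    return n0 * k ** generations
```

Above the critical radius the population grows exponentially with generation number (the signature of criticality noted in the abstract); below it, the population dies away.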
Raju, Leo; Milton, R. S.; Mahadevan, Senthilkumaran
2016-01-01
The objective of this paper is the implementation of a multiagent system (MAS) for advanced distributed energy management and demand side management of a solar microgrid. Initially, the Java agent development environment (JADE) framework is used to implement MAS-based dynamic energy management of the solar microgrid. Due to the unstable nature of MATLAB when dealing with a multithreading environment, the MAS operating in JADE is linked with MATLAB using a middleware called Multiagent Control Using Simulink with Jade Extension (MACSimJX). MACSimJX allows the solar microgrid components designed with MATLAB to be controlled by the corresponding agents of the MAS. The microgrid environment variables are captured through sensors and given to the agents through MATLAB/Simulink; after the agent operations in JADE, the results are given to the actuators through MATLAB for the implementation of dynamic operation in the solar microgrid. The MAS operating in JADE maximizes the operational efficiency of the solar microgrid through a decentralized approach and increases runtime efficiency due to JADE. Autonomous demand side management is implemented for optimizing the power exchange between the main grid and the microgrid, given the intermittent nature of solar power, the randomness of the load, and variations in noncritical load and grid price. These dynamics are considered at every time step, and a complex environment simulation is designed to emulate the distributed microgrid operations and evaluate the impact of agent operations. PMID:27127802
A flexible tool for diagnosing water, energy, and entropy budgets in climate models
NASA Astrophysics Data System (ADS)
Lembo, Valerio; Lucarini, Valerio
2017-04-01
We have developed a new, flexible software tool for studying the global energy budget, the hydrological cycle, and the material entropy production of global climate models. The program receives as input radiative, latent and sensible energy fluxes, with the requirement that the variable names are in agreement with the Climate and Forecast (CF) conventions for the production of NetCDF datasets. Annual mean maps, meridional sections and time series are computed by means of the Climate Data Operators (CDO) collection of command-line operators developed at the Max Planck Institute for Meteorology (MPI-M). If a land-sea mask is provided, the program also computes the required quantities separately over the continents and oceans. Depending on the user's choice, the program also calls MATLAB to compute meridional heat transports and the locations and intensities of their peaks in the two hemispheres. We are currently planning to adapt the program for inclusion in the Earth System Model eValuation Tool (ESMValTool) community diagnostics.
Comparison of optimization algorithms in intensity-modulated radiation therapy planning
NASA Astrophysics Data System (ADS)
Kendrick, Rachel
Intensity-modulated radiation therapy is used to better conform the radiation dose to the target while avoiding healthy tissue. Planning programs employ optimization methods to search for the best fluence of each photon beam, and therefore to create the best treatment plan. The Computational Environment for Radiotherapy Research (CERR), a program written in MATLAB, was used to examine some commonly-used algorithms for one 5-beam plan. Algorithms include the genetic algorithm, quadratic programming, pattern search, constrained nonlinear optimization, simulated annealing, the optimization method used in Varian Eclipse™, and some hybrids of these. Quadratic programming, simulated annealing, and a quadratic/simulated annealing hybrid were also separately compared using different prescription doses. The results of each dose-volume histogram as well as the visual dose color wash were used to compare the plans. CERR's built-in quadratic programming provided the best overall plan, but its avoidance of the organ-at-risk was rivaled by other programs. Hybrids of quadratic programming with some of these algorithms suggest the possibility of better planning programs, as shown by the improved quadratic/simulated annealing plan compared to the simulated annealing algorithm alone. Further experimentation will be done to improve cost functions and computational time.
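The flavor of such fluence optimization can be conveyed with a toy Python sketch (an assumed problem, not CERR's actual cost function): find non-negative beam weights minimizing the squared deviation of a linear dose model from prescribed target doses, here by simulated annealing with a geometric cooling schedule.

```python
import math
import random

A = [[1.0, 0.2], [0.2, 1.0], [0.5, 0.5]]   # assumed dose per unit beam weight
TARGET = [1.0, 1.0, 0.8]                    # assumed prescribed doses

def cost(w):
    """Sum of squared deviations of delivered dose from the prescription."""
    return sum((sum(a * x for a, x in zip(row, w)) - t) ** 2
               for row, t in zip(A, TARGET))

def simulated_annealing(steps=20000, seed=3):
    """Candidate moves are clipped at zero to keep beam weights feasible."""
    rng = random.Random(seed)
    w = [0.0, 0.0]
    best, best_cost = list(w), cost(w)
    for s in range(steps):
        temp = max(0.999 ** s, 1e-12)
        cand = [max(0.0, x + rng.gauss(0.0, 0.1)) for x in w]
        dc = cost(cand) - cost(w)
        if dc < 0 or rng.random() < math.exp(-dc / temp):
            w = cand
            if cost(w) < best_cost:
                best, best_cost = list(w), cost(w)
    return best, best_cost
```

Because this toy cost is convex quadratic, the normal equations give the exact quadratic-programming optimum (w1 = w2 = 1.6/1.94 ≈ 0.825), which is the kind of benchmark against which the stochastic methods in the abstract were compared.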
An Innovative Learning Model for Computation in First Year Mathematics
ERIC Educational Resources Information Center
Tonkes, E. J.; Loch, B. I.; Stace, A. W.
2005-01-01
MATLAB is a sophisticated software tool for numerical analysis and visualization. The University of Queensland has adopted Matlab as its official teaching package across large first year mathematics courses. In the past, the package has met severe resistance from students who have not appreciated their computational experience. Several main…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swyer, Michael; Davatzes, Nicholas; Cladouhos, Trenton
Matlab scripts, functions, and data used to build Poly3D models and create permeability-potential layers for three geothermal prospect areas located in Washington state: 1) the St. Helens Shear Zone, 2) Wind River Valley, and 3) Mount Baker.
Equilibrium-Staged Separations Using Matlab and Mathematica
ERIC Educational Resources Information Center
Binous, Housam
2008-01-01
We show a new approach, based on the use of Matlab and Mathematica, for solving liquid-liquid extraction and binary distillation problems. In addition, the author shares his experience using these two software packages to teach equilibrium-staged separations at the National Institute of Applied Sciences and Technology. (Contains 7 figures.)
Deterministic and stochastic CTMC models from Zika disease transmission
NASA Astrophysics Data System (ADS)
Zevika, Mona; Soewono, Edy
2018-03-01
Zika infection is one of the most important mosquito-borne diseases in the world. Zika virus (ZIKV) is transmitted by many Aedes-type mosquitoes including Aedes aegypti. Pregnant women infected with the Zika virus are at risk of having a fetus or infant with a congenital defect such as microcephaly. Here, we formulate a Zika disease transmission model using two approaches, a deterministic model and a continuous-time Markov chain (CTMC) stochastic model. The basic reproduction ratio is constructed from the deterministic model. Meanwhile, the CTMC stochastic model yields an estimate of the probability of extinction and outbreaks of Zika disease. Dynamical simulations and analysis of the disease transmission are shown for the deterministic and stochastic models.
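The contrast between the two approaches can be sketched in Python for a much simpler infection process (illustrative only; the paper's Zika model has vector-host structure omitted here). In a CTMC with infection rate beta·I and recovery rate gamma·I, branching-process theory predicts an extinction probability of (gamma/beta)^I0 when beta > gamma, which a Monte Carlo estimate over the embedded jump chain reproduces:

```python
import random

def extinct(beta, gamma, i0, outbreak=200, rng=random):
    """One CTMC realisation via its embedded jump chain: True if the
    infective count I hits 0 before reaching the outbreak threshold."""
    i = i0
    p_up = beta / (beta + gamma)  # P(next event is an infection)
    while 0 < i < outbreak:
        i += 1 if rng.random() < p_up else -1
    return i == 0

def extinction_probability(beta, gamma, i0, trials=5000, seed=1):
    """Monte Carlo estimate of the probability of disease extinction."""
    rng = random.Random(seed)
    return sum(extinct(beta, gamma, i0, rng=rng) for _ in range(trials)) / trials
```

A deterministic ODE model with the same rates would always predict an outbreak when beta > gamma; only the stochastic formulation captures the non-zero chance of early die-out, which is exactly the extra information the abstract attributes to the CTMC model.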
Distinguishing between stochasticity and determinism: Examples from cell cycle duration variability.
Pearl Mizrahi, Sivan; Sandler, Oded; Lande-Diner, Laura; Balaban, Nathalie Q; Simon, Itamar
2016-01-01
We describe a recent approach for distinguishing between stochastic and deterministic sources of variability, focusing on the mammalian cell cycle. Variability between cells is often attributed to stochastic noise, although it may be generated by deterministic components. Interestingly, lineage information can be used to distinguish between variability and determinism. Analysis of correlations within a lineage of the mammalian cell cycle duration revealed its deterministic nature. Here, we discuss the sources of such variability and the possibility that the underlying deterministic process is due to the circadian clock. Finally, we discuss the "kicked cell cycle" model and its implication on the study of the cell cycle in healthy and cancerous tissues. © 2015 WILEY Periodicals, Inc.
Legacy model integration for enhancing hydrologic interdisciplinary research
NASA Astrophysics Data System (ADS)
Dozier, A.; Arabi, M.; David, O.
2013-12-01
Many challenges are introduced to interdisciplinary research in and around the hydrologic science community due to advances in computing technology and modeling capabilities in different programming languages, across different platforms and frameworks by researchers in a variety of fields with a variety of experience in computer programming. Many new hydrologic models as well as optimization, parameter estimation, and uncertainty characterization techniques are developed in scripting languages such as Matlab, R, Python, or in newer languages such as Java and the .Net languages, whereas many legacy models have been written in FORTRAN and C, which complicates inter-model communication for two-way feedbacks. However, most hydrologic researchers and industry personnel have little knowledge of the computing technologies that are available to address the model integration process. Therefore, the goal of this study is to address these new challenges by utilizing a novel approach based on a publish-subscribe-type system to enhance modeling capabilities of legacy socio-economic, hydrologic, and ecologic software. Enhancements include massive parallelization of executions and access to legacy model variables at any point during the simulation process by another program without having to compile all the models together into an inseparable 'super-model'. Thus, this study provides two-way feedback mechanisms between multiple different process models that can be written in various programming languages and can run on different machines and operating systems. Additionally, a level of abstraction is given to the model integration process that allows researchers and other technical personnel to perform more detailed and interactive modeling, visualization, optimization, calibration, and uncertainty analysis without requiring deep understanding of inter-process communication. 
To be compatible, a program must be written in a programming language with bindings to a common implementation of the message passing interface (MPI), which includes FORTRAN, C, Java, the .NET languages, Python, R, Matlab, and many others. The system is tested on a longstanding legacy hydrologic model, the Soil and Water Assessment Tool (SWAT), to observe and enhance speed-up capabilities for various optimization, parameter estimation, and model uncertainty characterization techniques, which is particularly important for computationally intensive hydrologic simulations. Initial results indicate that the legacy extension system significantly decreases developer time, computation time, and the cost of purchasing commercial parallel processing licenses, while enhancing interdisciplinary research by providing detailed two-way feedback mechanisms between various process models with minimal changes to legacy code.
Dini-Andreote, Francisco; Stegen, James C.; van Elsas, Jan D.; ...
2015-03-17
Despite growing recognition that deterministic and stochastic factors simultaneously influence bacterial communities, little is known about mechanisms shifting their relative importance. To better understand underlying mechanisms, we developed a conceptual model linking ecosystem development during primary succession to shifts in the stochastic/deterministic balance. To evaluate the conceptual model we coupled spatiotemporal data on soil bacterial communities with environmental conditions spanning 105 years of salt marsh development. At the local scale there was a progression from stochasticity to determinism due to Na accumulation with increasing ecosystem age, supporting a main element of the conceptual model. At the regional scale, soil organic matter (SOM) governed the relative influence of stochasticity and the type of deterministic ecological selection, suggesting scale-dependency in how deterministic ecological selection is imposed. Analysis of a new ecological simulation model supported these conceptual inferences. Looking forward, we propose an extended conceptual model that integrates primary and secondary succession in microbial systems.
Guymon, Gary L.; Yen, Chung-Cheng
1990-01-01
The applicability of a deterministic-probabilistic model for predicting water tables in southern Owens Valley, California, is evaluated. The model is based on a two-layer deterministic model that is cascaded with a two-point probability model. To reduce the potentially large number of uncertain variables in the deterministic model, lumping of uncertain variables was evaluated by sensitivity analysis, reducing the total number of uncertain variables to three: hydraulic conductivity, storage coefficient or specific yield, and a source-sink function. Results demonstrate that lumping of uncertain parameters reduces computational effort while providing sufficient precision for the case studied. Simulated spatial coefficients of variation for water table temporal position are small in most of the basin, which suggests that deterministic models can predict water tables in these areas with good precision. However, in several important areas where pumping occurs or the geology is complex, the simulated spatial coefficients of variation are overestimated by the two-point probability method.
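A common reading of such a "two-point probability model" is a Rosenblueth-style point-estimate method, which can be sketched generically in Python (an assumed generic form for independent, symmetric uncertain inputs; the deterministic model below is a hypothetical water-table response, not the paper's two-layer model):

```python
from itertools import product
from statistics import mean

def two_point_estimate(model, means, sigmas):
    """Evaluate the deterministic model at mu +/- sigma for each of the
    n uncertain inputs (2**n runs) and combine the outputs with equal
    weights to estimate the output mean and variance."""
    outputs = []
    for signs in product((-1.0, 1.0), repeat=len(means)):
        x = [m + sg * s for m, s, sg in zip(means, sigmas, signs)]
        outputs.append(model(x))
    m = mean(outputs)
    return m, mean((y - m) ** 2 for y in outputs)

def water_table(x):
    # Hypothetical response to conductivity K, storage coefficient S
    # and a source-sink term Q (illustrative only).
    K, S, Q = x
    return 10.0 + 2.0 * Q / K - 0.5 * S
```

With three uncertain variables this costs only 2^3 = 8 deterministic runs per estimate, which is why lumping the uncertain parameters down to three, as the abstract describes, keeps the computational effort modest.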
NASA Astrophysics Data System (ADS)
Guymon, Gary L.; Yen, Chung-Cheng
1990-07-01
NASA Technical Reports Server (NTRS)
Bollman, W. E.; Chadwick, C.
1982-01-01
A number of interplanetary missions now being planned involve placing deterministic maneuvers along the flight path to alter the trajectory. Lee and Boain (1973) examined the statistics of trajectory correction maneuver (TCM) magnitude with no deterministic ('bias') component. The Delta v vector magnitude statistics were generated for several values of random Delta v standard deviation using expansions in terms of infinite hypergeometric series. The present investigation uses a different technique (Monte Carlo simulation) to generate Delta v magnitude statistics for a wider selection of random Delta v standard deviations and also extends the analysis to the case of nonzero deterministic Delta v's. These Delta v magnitude statistics are plotted parametrically. The plots are useful in assisting the analyst in quickly answering questions about the statistics of Delta v magnitude for single TCMs consisting of both a deterministic and a random component. The plots provide quick insight into the nature of the Delta v magnitude distribution for the TCM.
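The Monte Carlo technique is straightforward to sketch in Python (an illustrative reconstruction, not the original code: the deterministic component is placed on the x-axis and the random execution error is modelled as an isotropic 3-D Gaussian with per-axis standard deviation sigma):

```python
import math
import random

def dv_magnitude_stats(bias, sigma, n=20000, seed=7):
    """Monte Carlo estimate of the mean and 99th percentile of the
    Delta v magnitude |bias + random| for a single TCM."""
    rng = random.Random(seed)
    mags = sorted(
        math.sqrt((bias + rng.gauss(0.0, sigma)) ** 2
                  + rng.gauss(0.0, sigma) ** 2
                  + rng.gauss(0.0, sigma) ** 2)
        for _ in range(n)
    )
    mean = sum(mags) / n
    p99 = mags[int(0.99 * n) - 1]
    return mean, p99
```

With zero bias the magnitude follows a Maxwell distribution (mean 2·sigma·sqrt(2/π)); as the deterministic component grows, the distribution tightens around the bias magnitude, which is the parametric behavior the plots in the paper summarize.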
Urban stormwater investigations by the U.S. Geological Survey.
Jennings, Marshall E.
1985-01-01
Urban stormwater hydrology studies in the U.S. Geological Survey are currently focused on compilation of national databases containing flood-peak and short time-interval rainfall, discharge and water-quality information for urban watersheds. Current databases, updated annually, are nationwide in scope. Supplementing the national data files are published reports of interpretative analyses, a map report and research products including improved instrumentation and deterministic modeling capabilities. New directions of Survey investigations include gaging programs for very small catchments and for stormwater detention facilities.
Intelligent guidance and control for wind shear encounter
NASA Technical Reports Server (NTRS)
Stengel, Robert F.
1988-01-01
The principal objective is to develop methods for assessing the likelihood of wind shear encounter, for deciding what flight path to pursue, and for using the aircraft's full potential for combating wind shear. This study requires the definition of both deterministic and statistical techniques for fusing internal and external information, for making go/no-go decisions, and for generating commands to the aircraft's cockpit displays and autopilot for both manually controlled and automatic flight. The program has begun with the development of a real-time expert system for pilot aiding that is based on the results of the FAA Windshear Training Aids Program. A two-volume manual that presents an overview, pilot guide, training program, and substantiating data provides guidelines for this initial development. The Expert System to Avoid Wind Shear (ESAWS) currently contains over 140 rules and is coded in the LISP programming language for implementation on a Symbolics 3670 LISP machine.
An Algebra-Based Introductory Computational Neuroscience Course with Lab.
Fink, Christian G
2017-01-01
A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.
WTO — a deterministic approach to 4-fermion physics
NASA Astrophysics Data System (ADS)
Passarino, Giampiero
1996-09-01
The program WTO, which is designed for computing cross sections and other relevant observables in the e+e- annihilation into four fermions, is described. The various quantities are computed over both a completely inclusive experimental set-up and a realistic one, i.e. with cuts on the final state energies, final state angles, scattering angles and final state invariant masses. Initial state QED corrections are included by means of the structure function approach while final state QCD corrections are applicable in their naive formulation. A gauge restoring mechanism is included according to the Fermion-Loop scheme. The program structure is highly modular and particular care has been devoted to computing efficiency and speed.
NASA Astrophysics Data System (ADS)
García, Constantino A.; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G.
2018-07-01
In the past few decades, it has been recognized that 1/f fluctuations are ubiquitous in nature. The most widely used mathematical models to capture the long-term memory properties of 1/f fluctuations have been stochastic fractal models. However, physical systems do not usually consist of just stochastic fractal dynamics, but they often also show some degree of deterministic behavior. The present paper proposes a model based on fractal stochastic and deterministic components that can provide a valuable basis for the study of complex systems with long-term correlations. The fractal stochastic component is assumed to be a fractional Brownian motion process and the deterministic component is assumed to be a band-limited signal. We also provide a method that, under the assumptions of this model, is able to characterize the fractal stochastic component and to provide an estimate of the deterministic components present in a given time series. The method is based on a Bayesian wavelet shrinkage procedure that exploits the self-similar properties of the fractal processes in the wavelet domain. This method has been validated over simulated signals and over real signals with economic and biological origin. Real examples illustrate how our model may be useful for exploring the deterministic-stochastic duality of complex systems, and uncovering interesting patterns present in time series.
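A minimal illustration of how wavelet shrinkage can separate a deterministic component from noise, assuming a plain Haar transform with hard thresholding rather than the paper's Bayesian procedure and fractional Brownian motion model:

```python
import numpy as np

def haar_dwt(x):
    """Full Haar decomposition of a length-2**k signal: a list of detail
    arrays from fine to coarse, plus the final approximation."""
    coeffs, a = [], np.asarray(x, float)
    while len(a) > 1:
        s = (a[0::2] + a[1::2]) / np.sqrt(2)   # approximation
        d = (a[0::2] - a[1::2]) / np.sqrt(2)   # detail
        coeffs.append(d)
        a = s
    coeffs.append(a)
    return coeffs

def haar_idwt(coeffs):
    """Inverse of haar_dwt (the Haar transform is orthonormal)."""
    a = coeffs[-1]
    for d in reversed(coeffs[:-1]):
        up = np.empty(2 * len(a))
        up[0::2] = (a + d) / np.sqrt(2)
        up[1::2] = (a - d) / np.sqrt(2)
        a = up
    return a

def shrink(x, thresh):
    """Hard-threshold the detail coefficients; the survivors approximate
    the deterministic (band-limited) component."""
    c = haar_dwt(x)
    c = [np.where(np.abs(d) > thresh, d, 0.0) for d in c[:-1]] + [c[-1]]
    return haar_idwt(c)

rng = np.random.default_rng(1)
t = np.arange(256)
det = np.sin(2 * np.pi * t / 64)               # deterministic component
noisy = det + 0.3 * rng.standard_normal(256)   # plus stochastic component
est = shrink(noisy, 0.8)
```

Because the transform is orthonormal, white noise stays small and spread out in the wavelet domain while the band-limited component concentrates in a few large coefficients, which is the property the paper's Bayesian shrinkage exploits for self-similar processes as well.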
Inferring Fitness Effects from Time-Resolved Sequence Data with a Delay-Deterministic Model
Nené, Nuno R.; Dunham, Alistair S.; Illingworth, Christopher J. R.
2018-01-01
A common challenge arising from the observation of an evolutionary system over time is to infer the magnitude of selection acting upon a specific genetic variant, or variants, within the population. The inference of selection may be confounded by the effects of genetic drift in a system, leading to the development of inference procedures to account for these effects. However, recent work has suggested that deterministic models of evolution may be effective in capturing the effects of selection even under complex models of demography, suggesting the more general application of deterministic approaches to inference. Responding to this literature, we here note a case in which a deterministic model of evolution may give highly misleading inferences, resulting from the nondeterministic properties of mutation in a finite population. We propose an alternative approach that acts to correct for this error, and which we denote the delay-deterministic model. Applying our model to a simple evolutionary system, we demonstrate its performance in quantifying the extent of selection acting within that system. We further consider the application of our model to sequence data from an evolutionary experiment. We outline scenarios in which our model may produce improved results for the inference of selection, noting that such situations can be easily identified via the use of a regular deterministic model. PMID:29500183
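The deterministic selection model the paper builds on can be sketched with the standard haploid recursion, plus a crude "delay" variant in which the variant contributes nothing until an establishment delay has elapsed. The delay mechanics here are a simplified stand-in for the paper's model, and all parameter values are illustrative.

```python
def deterministic_traj(p0, s, gens):
    """Classic deterministic haploid selection recursion:
    p' = p(1 + s) / (1 + s p), i.e. the odds multiply by (1 + s)."""
    p, traj = p0, [p0]
    for _ in range(gens):
        p = p * (1 + s) / (1 + s * p)
        traj.append(p)
    return traj

def delay_deterministic_traj(p0, s, gens, delay):
    """Delay variant (simplified): the mutant lineage is absent until an
    establishment delay has elapsed, then follows the same recursion."""
    traj, p = [], p0
    for t in range(gens + 1):
        if t < delay:
            traj.append(0.0)
        else:
            traj.append(p)
            p = p * (1 + s) / (1 + s * p)
    return traj

plain = deterministic_traj(0.01, 0.1, 100)
delayed = delay_deterministic_traj(0.01, 0.1, 100, delay=20)
```

The delayed trajectory lags the plain one by the establishment time, which is the kind of correction that prevents a deterministic fit from misattributing the lag to weaker selection.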
ShareSync: A Solution for Deterministic Data Sharing over Ethernet
NASA Technical Reports Server (NTRS)
Dunn, Daniel J., II; Koons, William A.; Kennedy, Richard D.; Davis, Philip A.
2007-01-01
As part of upgrading the Contact Dynamics Simulation Laboratory (CDSL) at the NASA Marshall Space Flight Center (MSFC), a simple, cost-effective method was needed to communicate data among the networked simulation machines and I/O controllers used to run the facility. To fill this need and similar applicable situations, a generic protocol was developed, called ShareSync. ShareSync is a lightweight, real-time, publish-subscribe Ethernet protocol for simple and deterministic data sharing across diverse machines and operating systems. ShareSync provides a simple Application Programming Interface (API) for simulation programmers to incorporate into their code. The protocol is compatible with virtually all Ethernet-capable machines, is flexible enough to support a variety of applications, is fast enough to provide soft real-time determinism, and is a low-cost resource for distributed simulation development, deployment, and maintenance. The first design cycle iteration of ShareSync has been completed, and the protocol has undergone several testing procedures including endurance and benchmarking tests and approaches the 2001ts data synchronization design goal for the CDSL.
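The publish-subscribe pattern at the heart of ShareSync can be illustrated with a minimal in-process sketch. This is Python pseudocode for the pattern only: the real protocol runs over Ethernet, and every name below is hypothetical rather than part of the ShareSync API.

```python
from collections import defaultdict

class PubSub:
    """Minimal publish-subscribe bus: writers publish named data items,
    readers subscribe by name and are called back on every update; the
    latest value of each item is also retained for polling readers."""

    def __init__(self):
        self._subs = defaultdict(list)
        self._latest = {}

    def subscribe(self, name, callback):
        self._subs[name].append(callback)

    def publish(self, name, value):
        self._latest[name] = value
        for cb in self._subs[name]:
            cb(name, value)

    def read(self, name):
        return self._latest.get(name)

bus = PubSub()
seen = []
bus.subscribe('sim.position', lambda n, v: seen.append(v))
bus.publish('sim.position', (1.0, 2.0, 3.0))
```

Determinism in the real system comes from bounding the time each publish-to-callback hop can take over the network, which an in-process sketch cannot capture.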
NASA Astrophysics Data System (ADS)
Tubman, Norm; Whaley, Birgitta
The development of exponential scaling methods has seen great progress in tackling larger systems than previously thought possible. One such technique, full configuration interaction quantum Monte Carlo, allows exact diagonalization through stochastic sampling of determinants. The method derives its utility from the information in the matrix elements of the Hamiltonian, together with a stochastic projected wave function, which are used to explore the important parts of Hilbert space. However, a stochastic representation of the wave function is not required to search Hilbert space efficiently and new deterministic approaches have recently been shown to efficiently find the important parts of determinant space. We shall discuss the technique of Adaptive Sampling Configuration Interaction (ASCI) and the related heat-bath Configuration Interaction approach for ground state and excited state simulations. We will present several applications for strongly correlated Hamiltonians. This work was supported through the Scientific Discovery through Advanced Computing (SciDAC) program funded by the U.S. Department of Energy, Office of Science, Advanced Scientific Computing Research and Basic Energy Sciences.
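The selection idea behind ASCI-style methods, ranking outside determinants by their coupling to the current wave function, can be sketched on a toy matrix. This is not a quantum-chemistry Hamiltonian, just a small symmetric matrix with a dominant diagonal, and the selection rule is a simplified caricature of the published algorithms.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
# toy symmetric "Hamiltonian": sorted diagonal plus weak off-diagonal coupling
H = np.diag(np.sort(rng.uniform(0.0, 10.0, n)))
off = 0.1 * rng.standard_normal((n, n))
H += np.triu(off, 1) + np.triu(off, 1).T

def selected_ci_ground_state(H, core=5, target=15):
    """ASCI-flavoured selection sketch: diagonalize in a small core space,
    repeatedly add the outside determinant with the largest coupling
    |sum_j H_ij c_j| to the current ground-state vector, re-diagonalize."""
    sel = list(range(core))
    while len(sel) < target:
        w, v = np.linalg.eigh(H[np.ix_(sel, sel)])
        c = v[:, 0]                      # current ground-state coefficients
        score = np.abs(H[:, sel] @ c)    # importance of every determinant
        score[sel] = -np.inf             # never re-select
        sel.append(int(np.argmax(score)))
    return np.linalg.eigh(H[np.ix_(sel, sel)])[0][0]

e_sel = selected_ci_ground_state(H)
e_exact = np.linalg.eigh(H)[0][0]
```

Because the selected space is a subspace, the estimate is variational: it can only approach the exact ground-state energy from above, and growing the selected set can only lower it.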
Ibrahim, Ahmad M.; Wilson, Paul P.H.; Sawan, Mohamed E.; ...
2015-06-30
The CADIS and FW-CADIS hybrid Monte Carlo/deterministic techniques dramatically increase the efficiency of neutronics modeling, but their use in the accurate design analysis of very large and geometrically complex nuclear systems has been limited by the large number of processors and memory requirements for their preliminary deterministic calculations and final Monte Carlo calculation. Three mesh adaptivity algorithms were developed to reduce the memory requirements of CADIS and FW-CADIS without sacrificing their efficiency improvement. First, a macromaterial approach enhances the fidelity of the deterministic models without changing the mesh. Second, a deterministic mesh refinement algorithm generates meshes that capture as much geometric detail as possible without exceeding a specified maximum number of mesh elements. Finally, a weight window coarsening algorithm decouples the weight window mesh and energy bins from the mesh and energy group structure of the deterministic calculations in order to remove the memory constraint of the weight window map from the deterministic mesh resolution. The three algorithms were used to enhance an FW-CADIS calculation of the prompt dose rate throughout the ITER experimental facility. Using these algorithms resulted in a 23.3% increase in the number of mesh tally elements in which the dose rates were calculated in a 10-day Monte Carlo calculation and, additionally, increased the efficiency of the Monte Carlo simulation by a factor of at least 3.4. The three algorithms enabled this difficult calculation to be accurately solved using an FW-CADIS simulation on a regular computer cluster, eliminating the need for a world-class super computer.
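The weight-window construction and the coarsening idea can be caricatured in one dimension: CADIS-style window centers vary inversely with the adjoint (importance) flux, and coarsening averages blocks of cells so the window map occupies less memory than the deterministic mesh. The numbers and normalization below are illustrative, not the production algorithm.

```python
import numpy as np

def cadis_weight_windows(adjoint_flux, source, target_weight=1.0):
    """Weight-window centers inversely proportional to the adjoint
    (importance) flux, normalized so that the strongest source cell
    sits at the target weight (toy 1-D mesh)."""
    phi = np.asarray(adjoint_flux, float)
    src = int(np.argmax(source))
    return target_weight * phi[src] / phi

def coarsen(ww, factor):
    """Weight-window coarsening: average blocks of `factor` cells so the
    window map is decoupled from (and smaller than) the deterministic
    mesh it was derived from."""
    ww = np.asarray(ww, float)
    n = len(ww) // factor * factor
    return ww[:n].reshape(-1, factor).mean(axis=1)

phi_adj = [8.0, 4.0, 2.0, 1.0]   # hypothetical adjoint flux on a 1-D mesh
ww = cadis_weight_windows(phi_adj, source=[1, 0, 0, 0])
ww_coarse = coarsen(ww, 2)       # half as many cells to store
```

The memory saving is the point: the coarse map keeps the importance gradient while dropping the resolution that only the deterministic solve needed.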
Improving ground-penetrating radar data in sedimentary rocks using deterministic deconvolution
Xia, J.; Franseen, E.K.; Miller, R.D.; Weis, T.V.; Byrnes, A.P.
2003-01-01
Resolution is key to confidently identifying unique geologic features using ground-penetrating radar (GPR) data. Source wavelet "ringing" (related to bandwidth) in a GPR section limits resolution because of wavelet interference, and can smear reflections in time and/or space. The resultant potential for misinterpretation limits the usefulness of GPR. Deconvolution offers the ability to compress the source wavelet and improve temporal resolution. Unlike statistical deconvolution, deterministic deconvolution is mathematically simple and stable while providing the highest possible resolution because it uses the source wavelet unique to the specific radar equipment. Source wavelets generated in, transmitted through and acquired from air allow successful application of deterministic approaches to wavelet suppression. We demonstrate the validity of using a source wavelet acquired in air as the operator for deterministic deconvolution in a field application using "400-MHz" antennas at a quarry site characterized by interbedded carbonates with shale partings. We collected GPR data on a bench adjacent to cleanly exposed quarry faces in which we placed conductive rods to provide conclusive ground truth for this approach to deconvolution. The best deconvolution results, which are confirmed by the conductive rods for the 400-MHz antenna tests, were observed for wavelets acquired when the transmitter and receiver were separated by 0.3 m. Applying deterministic deconvolution to GPR data collected in sedimentary strata at our study site resulted in an improvement in resolution (50%) and improved spatial location (0.10-0.15 m) of geologic features compared to the same data processed without deterministic deconvolution. 
The effectiveness of deterministic deconvolution for increased resolution and spatial accuracy of specific geologic features is further demonstrated by comparing results of deconvolved data with nondeconvolved data acquired along a 30-m transect immediately adjacent to a fresh quarry face. The results at this site support using deterministic deconvolution, which incorporates the GPR instrument's unique source wavelet, as a standard part of routine GPR data processing. © 2003 Elsevier B.V. All rights reserved.
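Deterministic deconvolution with a known source wavelet amounts to stabilized spectral division. A sketch with a synthetic ringing wavelet and a water-level stabilizer follows; the wavelet shape, spike positions and parameter values are illustrative, not those used in the study.

```python
import numpy as np

def deterministic_deconv(trace, wavelet, water=0.01):
    """Deterministic deconvolution: divide the trace spectrum by the
    known source-wavelet spectrum, with a water-level floor on the
    denominator to avoid blowing up where the wavelet has little energy."""
    n = len(trace)
    T = np.fft.rfft(trace, n)
    W = np.fft.rfft(wavelet, n)
    denom = (np.conj(W) * W).real
    denom = np.maximum(denom, water * denom.max())
    return np.fft.irfft(np.conj(W) * T / denom, n)

# synthetic test: two reflectivity spikes convolved with a ringing wavelet
t = np.arange(64)
wavelet = np.exp(-0.2 * t) * np.sin(0.9 * t)     # damped "ringy" source
refl = np.zeros(256)
refl[40], refl[140] = 1.0, -0.6                  # true reflector positions
trace = np.convolve(refl, wavelet)[:256]         # smeared observed section
est = deterministic_deconv(trace, wavelet, water=1e-3)
```

On this synthetic example the smeared trace collapses back to the two spikes, which is exactly the wavelet compression the abstract describes; on field data the measured in-air wavelet plays the role of `wavelet`.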
Expansion or extinction: deterministic and stochastic two-patch models with Allee effects.
Kang, Yun; Lanchier, Nicolas
2011-06-01
We investigate the impact of Allee effect and dispersal on the long-term evolution of a population in a patchy environment. Our main focus is on whether a population already established in one patch either successfully invades an adjacent empty patch or undergoes a global extinction. Our study is based on the combination of analytical and numerical results for both a deterministic two-patch model and a stochastic counterpart. The deterministic model has either two, three or four attractors. The existence of a regime with exactly three attractors only appears when patches have distinct Allee thresholds. In the presence of weak dispersal, the analysis of the deterministic model shows that a high-density and a low-density population can coexist at equilibrium in nearby patches, whereas the analysis of the stochastic model indicates that this equilibrium is metastable, thus leading after a large random time to either a global expansion or a global extinction. Up to some critical dispersal, increasing the intensity of the interactions leads to an increase of both the basin of attraction of the global extinction and the basin of attraction of the global expansion. Above this threshold, for both the deterministic and the stochastic models, the patches tend to synchronize as the intensity of the dispersal increases. This results in either a global expansion or a global extinction. For the deterministic model, there are only two attractors, while the stochastic model no longer exhibits a metastable behavior. In the presence of strong dispersal, the limiting behavior is entirely determined by the value of the Allee thresholds as the global population size in the deterministic and the stochastic models evolves as dictated by their single-patch counterparts. 
For all values of the dispersal parameter, Allee effects promote global extinction in terms of an expansion of the basin of attraction of the extinction equilibrium for the deterministic model and an increase of the probability of extinction for the stochastic model.
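The qualitative regimes described, coexistence of an occupied and an empty patch under weak dispersal versus synchronization under strong dispersal, can be reproduced with a generic two-patch Allee model. The cubic growth function below is a standard strong-Allee form chosen for illustration, not necessarily the paper's equations, and all parameter values are hypothetical.

```python
def simulate(n1, n2, d, a=0.2, k=1.0, r=1.0, dt=0.01, steps=100_000):
    """Forward-Euler integration of a two-patch strong-Allee model:
    growth r*n*(n/a - 1)*(1 - n/k) is negative below the Allee
    threshold a, and the patches exchange migrants at rate d."""
    for _ in range(steps):
        f1 = r * n1 * (n1 / a - 1.0) * (1.0 - n1 / k)
        f2 = r * n2 * (n2 / a - 1.0) * (1.0 - n2 / k)
        n1, n2 = (n1 + dt * (f1 + d * (n2 - n1)),
                  n2 + dt * (f2 + d * (n1 - n2)))
    return n1, n2

# weak dispersal: the occupied patch persists, the empty patch stays empty
hi, lo = simulate(1.0, 0.0, d=0.001)

# strong dispersal from the same initial state: the patches synchronize
s1, s2 = simulate(1.0, 0.0, d=1.0)
```

With weak coupling the trickle of migrants into the empty patch stays below the Allee threshold, so the asymmetric equilibrium survives; with strong coupling the patches average out above the threshold and both expand, matching the synchronization regime the abstract describes.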
NASA Astrophysics Data System (ADS)
Mattie, P. D.; Knowlton, R. G.; Arnold, B. W.; Tien, N.; Kuo, M.
2006-12-01
Sandia National Laboratories (Sandia), a U.S. Department of Energy National Laboratory, has over 30 years' experience in radioactive waste disposal and is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. International technology transfer efforts are often hampered by small budgets, time schedule constraints, and a lack of experienced personnel in countries with small radioactive waste disposal programs. In an effort to surmount these difficulties, Sandia has developed a system that utilizes a combination of commercially available codes and existing legacy codes for probabilistic safety assessment modeling that facilitates the technology transfer and maximizes limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission and codes developed and maintained by United States Department of Energy are generally available to foreign countries after addressing import/export control and copyright requirements. From a programmatic view, it is easier to utilize existing codes than to develop new codes. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software, which meets the rigors of both domestic regulatory requirements and international peer review. Therefore, re-vitalization of deterministic legacy codes, as well as an adaptation of contemporary deterministic codes, provides a credible and solid computational platform for constructing probabilistic safety assessment models. External model linkage capabilities in GoldSim and the techniques applied to facilitate this process will be presented using example applications, including Breach, Leach, and Transport-Multiple Species (BLT-MS), a U.S. 
NRC-sponsored code simulating release and transport of contaminants from a subsurface low-level waste disposal facility used in a cooperative technology transfer project between Sandia National Laboratories and Taiwan's Institute of Nuclear Energy Research (INER) for the preliminary assessment of several candidate low-level waste repository sites. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
Mills, Fergal P; Ford, Nathan; Nachega, Jean B; Bansback, Nicholas; Nosyk, Bohdan; Yaya, Sanni; Mills, Edward J
2012-11-01
Raising the guidelines for the initiation of antiretroviral therapy in resource-limited settings at CD4 T-cell counts of 350 cells per microliter raises concerns about feasibility and cost. We examined costs of this shift using almost 10 years of data from Uganda. We projected total costs of earlier initiation with combined antiretroviral therapy, including inpatient and outpatient services, antiretroviral treatment and treatment for limited HIV-related opportunistic diseases, and benefits expressed in years-of-life-saved over 5- and 30-year time horizons using a deterministic economic model to examine the incremental cost-effectiveness ratio (ICER), expressed in cost per year-of-life-saved (YLS). The model generated ICERs for 5- and 30-year time horizons. Discounting both costs and benefits at 3% annually, the ICER was $695/YLS for the 5-year analysis and $769 for the 30-year analysis. The results were most sensitive to program cost and the discount rate applied, but they were less sensitive to opportunistic infection treatment costs or the relative-risk reduction from earlier initiation. Program costs were varied from 25% to 125%, and the ICER for the lower bound decreased to $491/YLS at 5 years and $574/YLS at 30 years. For the upper bound, the ICER increased to $899 for 5 years and $964 at 30 years. The budget impact of adoption, assuming the same level of program penetration in the community, is $261,651,942 for 5 years and $872,685,561 for 30 years. Our model showed that earlier initiation of combined antiretroviral therapy in Uganda is associated with improved long-term survival and is highly cost-effective, as defined by WHO-CHOICE.
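The ICER computation reduces to discounted incremental cost over discounted incremental effectiveness. A sketch with made-up annual figures (not the study's Ugandan cost data) shows the mechanics, including the 3% annual discounting applied to both streams:

```python
def discounted(stream, rate=0.03):
    """Present value of an annual stream discounted at `rate`."""
    return sum(x / (1.0 + rate) ** t for t, x in enumerate(stream))

def icer(cost_new, eff_new, cost_old, eff_old, rate=0.03):
    """Incremental cost-effectiveness ratio: extra discounted cost per
    extra discounted year of life saved."""
    dc = discounted(cost_new, rate) - discounted(cost_old, rate)
    de = discounted(eff_new, rate) - discounted(eff_old, rate)
    return dc / de

# hypothetical 5-year horizon: earlier ART costs more but saves more life-years
cost_early = [900.0] * 5   # annual program cost per patient, earlier start
cost_late = [650.0] * 5    # annual program cost per patient, later start
ly_early = [1.0] * 5       # life-years accrued per year, earlier start
ly_late = [0.62] * 5       # life-years accrued per year, later start
ratio = icer(cost_early, ly_early, cost_late, ly_late)   # cost per YLS
```

With constant annual streams the discount factors cancel, so the sketch's ratio is simply the incremental annual cost over the incremental annual life-years; with time-varying streams, as in the study, the discounting matters.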
NASA Astrophysics Data System (ADS)
Sheldon, W.; Chamblee, J.; Cary, R. H.
2013-12-01
Environmental scientists are under increasing pressure from funding agencies and journal publishers to release quality-controlled data in a timely manner, as well as to produce comprehensive metadata for submitting data to long-term archives (e.g. DataONE, Dryad and BCO-DMO). At the same time, the volume of digital data that researchers collect and manage is increasing rapidly due to advances in high frequency electronic data collection from flux towers, instrumented moorings and sensor networks. However, few pre-built software tools are available to meet these data management needs, and those tools that do exist typically focus on part of the data management lifecycle or one class of data. The GCE Data Toolbox has proven to be both a generalized and effective software solution for environmental data management in the Long Term Ecological Research Network (LTER). This open source MATLAB software library, developed by the Georgia Coastal Ecosystems LTER program, integrates metadata capture, creation and management with data processing, quality control and analysis to support the entire data lifecycle. Raw data can be imported directly from common data logger formats (e.g. SeaBird, Campbell Scientific, YSI, Hobo), as well as delimited text files, MATLAB files and relational database queries. Basic metadata are derived from the data source itself (e.g. parsed from file headers) and by value inspection, and then augmented using editable metadata templates containing boilerplate documentation, attribute descriptors, code definitions and quality control rules. Data and metadata content, quality control rules and qualifier flags are then managed together in a robust data structure that supports database functionality and ensures data validity throughout processing. 
A growing suite of metadata-aware editing, quality control, analysis and synthesis tools is provided with the software to support managing data using graphical forms and command-line functions, as well as developing automated workflows for unattended processing. Finalized data and structured metadata can be exported in a wide variety of text and MATLAB formats or uploaded to a relational database for long-term archiving and distribution. The GCE Data Toolbox can be used as a complete, lightweight solution for environmental data and metadata management, but it can also be used in conjunction with other cyber infrastructure to provide a more comprehensive solution. For example, newly acquired data can be retrieved from a Data Turbine or Campbell LoggerNet Database server for quality control and processing, then transformed to CUAHSI Observations Data Model format and uploaded to a HydroServer for distribution through the CUAHSI Hydrologic Information System. The GCE Data Toolbox can also be leveraged in analytical workflows developed using Kepler or other systems that support MATLAB integration or tool chaining. This software can therefore be leveraged in many ways to help researchers manage, analyze and distribute the data they collect.
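The rule-based quality control described above can be sketched generically: each value is run through an ordered list of tests that assign qualifier flags, with the first failing test winning. This is Python rather than MATLAB, and the rule format and flag codes below are hypothetical, not the GCE Data Toolbox syntax.

```python
def qc_flags(values, rules):
    """Apply ordered QC rules to a data column. Each rule is a (test,
    flag) pair; a value receives the flag of the first test it fails,
    or '' if it passes every rule. Missing values are always flagged."""
    flags = []
    for v in values:
        flag = ''
        for test, code in rules:
            if v is None or test(v):
                flag = code
                break
        flags.append(flag)
    return flags

# range checks for a water-temperature column (assumed limits, deg C)
rules = [
    (lambda v: v < -5 or v > 40, 'I'),   # physically impossible -> invalid
    (lambda v: v < 0 or v > 35, 'Q'),    # outside expected range -> questionable
]
temps = [12.3, 38.0, 55.0, 21.7]
flags = qc_flags(temps, rules)
```

Keeping the flags alongside the data, rather than deleting suspect values, is the design choice that lets downstream users apply their own tolerance for questionable observations.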