Sample records for custom matlab program

  1. Test Generator for MATLAB Simulations

    NASA Technical Reports Server (NTRS)

    Henry, Joel

    2011-01-01

    MATLAB Automated Test Tool, version 3.0 (MATT 3.0) is a software package that provides automated tools that reduce the time needed for extensive testing of simulation models that have been constructed in the MATLAB programming language by use of the Simulink and Real-Time Workshop programs. MATT 3.0 runs on top of the MATLAB engine application-program interface to communicate with the Simulink engine. MATT 3.0 automatically generates source code from the models, generates custom input data for testing both the models and the source code, and generates graphs and other presentations that facilitate comparison of the outputs of the models and the source code for the same input data. Context-sensitive and fully searchable help is provided in HyperText Markup Language (HTML) format.

  2. Engineering and Scientific Applications: Using MatLab(Registered Trademark) for Data Processing and Visualization

    NASA Technical Reports Server (NTRS)

    Sen, Syamal K.; Shaykhian, Gholam Ali

    2011-01-01

    MatLab(TradeMark) (MATrix LABoratory) is a numerical computation and simulation tool that is used by thousands of scientists and engineers in many countries. MatLab performs purely numerical calculations and can be used as a glorified calculator or as an interpreted programming language; its real strength is in matrix manipulations. Computer algebra functionalities are achieved within the MatLab environment using the "symbolic" toolbox. This feature is similar to computer algebra programs, such as Maple or Mathematica, that calculate with mathematical equations using symbolic operations. In its interpreted programming-language form (command interface), MatLab is similar to well-known programming languages such as C/C++ and supports data structures and cell arrays as well as classes for object-oriented programming. As such, MatLab is equipped with most of the essential constructs of a higher-level programming language. MatLab is packaged with an editor and debugging functionality useful for analyzing large MatLab programs and finding errors. We believe there are many ways to approach real-world problems; prescribed methods that ensure the foregoing solutions are incorporated into the design and analysis of data processing and visualization can help engineers and scientists gain wider insight into the actual implementation of their respective experiments. This presentation will focus on the data processing and visualization aspects of engineering and scientific applications. Specifically, it will discuss methods and techniques for performing intermediate-level data processing covering engineering and scientific problems. MatLab programming techniques will be discussed, including reading various data file formats, producing customized publication-quality graphics, importing engineering and/or scientific data, organizing data in tabular format, exporting data for use by other software programs such as Microsoft Excel, and data presentation and visualization.
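
    A minimal MATLAB sketch of the kind of workflow described above (reading a delimited data file, producing a publication-quality plot, and exporting a summary table for Microsoft Excel). The file name and column names are illustrative assumptions, not material from the presentation.

      % Illustrative only: 'sensor_log.csv' with columns 'time' and 'temperature' is assumed example data.
      T = readtable('sensor_log.csv');                    % import tabular data from a text file

      figure;
      plot(T.time, T.temperature, 'LineWidth', 1.5);
      xlabel('Time (s)'); ylabel('Temperature (\circC)');
      title('Sensor temperature history'); grid on;
      print('-dpng', '-r300', 'temperature_plot.png');    % export a 300-dpi figure for publication

      % Organize derived results in tabular form and export them for use in Microsoft Excel.
      summaryT = table(mean(T.temperature), std(T.temperature), ...
                       'VariableNames', {'MeanTemp', 'StdTemp'});
      writetable(summaryT, 'summary.xlsx');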

  3. Flight Dynamics and Control of a Morphing UAV: Bio inspired by Natural Fliers

    DTIC Science & Technology

    2017-02-17

    Approved for public release: distribution unlimited. IV Modelling and Sizing Tornado Vortex Lattice Method (VLM) was used for aerodynamic prediction... Tornado is a Vortex Lattice Method software programmed in MATLAB; it was selected due to its fast solving time and ability to be controlled through...custom MATLAB scripts. Tornado VLM models the wing as a thin sheet of discrete vortices and computes the pressure and force distributions around the

  4. Engineering and Scientific Applications: Using MatLab(Registered Trademark) for Data Processing and Visualization

    NASA Technical Reports Server (NTRS)

    Sen, Syamal K.; Shaykhian, Gholam Ali

    2011-01-01

    MatLab(R) (MATrix LABoratory) is a numerical computation and simulation tool that is used by thousands of scientists and engineers in many countries. MatLab performs purely numerical calculations and can be used as a glorified calculator or as an interpreted programming language; its real strength is in matrix manipulations. Computer algebra functionalities are achieved within the MatLab environment using the "symbolic" toolbox. This feature is similar to computer algebra programs, such as Maple or Mathematica, that calculate with mathematical equations using symbolic operations. In its interpreted programming-language form (command interface), MatLab is similar to well-known programming languages such as C/C++ and supports data structures and cell arrays as well as classes for object-oriented programming. As such, MatLab is equipped with most of the essential constructs of a higher-level programming language. MatLab is packaged with an editor and debugging functionality useful for analyzing large MatLab programs and finding errors. We believe there are many ways to approach real-world problems; prescribed methods that ensure the foregoing solutions are incorporated into the design and analysis of data processing and visualization can help engineers and scientists gain wider insight into the actual implementation of their respective experiments. This presentation will focus on the data processing and visualization aspects of engineering and scientific applications. Specifically, it will discuss methods and techniques for performing intermediate-level data processing covering engineering and scientific problems. MatLab programming techniques will be discussed, including reading various data file formats, producing customized publication-quality graphics, importing engineering and/or scientific data, organizing data in tabular format, exporting data for use by other software programs such as Microsoft Excel, and data presentation and visualization. The presentation will emphasize creating practical scripts (programs) that extend the basic features of MatLab. Topics include: (1) matrix and vector analysis and manipulations; (2) mathematical functions; (3) symbolic calculations and functions; (4) import/export of data files; (5) program logic and flow control; (6) writing functions and passing parameters; (7) test application programs.
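
    As a companion to topics (5) and (6) in the list above (program logic and flow control; writing functions and passing parameters), here is a small illustrative function; its name, inputs, and thresholding task are assumptions for demonstration only.

      function flags = classify_samples(x, threshold)
      %CLASSIFY_SAMPLES  Return 1 where x exceeds threshold, 0 elsewhere.
      % Demonstrates parameter passing, a default argument, and loop-based flow control.
          if nargin < 2
              threshold = 0;              % default value when only x is supplied
          end
          flags = zeros(size(x));
          for k = 1:numel(x)              % explicit loop form
              if x(k) > threshold
                  flags(k) = 1;
              end
          end
          % Equivalent vectorized one-liner, idiomatic in MatLab:
          % flags = double(x > threshold);
      end

    A call such as classify_samples([-1 2 5], 1) returns [0 1 1].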

  5. Arc_Mat: a Matlab-based spatial data analysis toolbox

    NASA Astrophysics Data System (ADS)

    Liu, Xingjian; Lesage, James

    2010-03-01

    This article presents an overview of Arc_Mat, a Matlab-based spatial data analysis software package whose source code has been placed in the public domain. An earlier version of the Arc_Mat toolbox was developed to extract map polygon and database information from ESRI shapefiles and provide high quality mapping in the Matlab software environment. We discuss revisions to the toolbox that: utilize enhanced computing and graphing capabilities of more recent versions of Matlab, restructure the toolbox with object-oriented programming features, and provide more comprehensive functions for spatial data analysis. The Arc_Mat toolbox functionality includes basic choropleth mapping; exploratory spatial data analysis that provides exploratory views of spatial data through various graphs, for example, histogram, Moran scatterplot, three-dimensional scatterplot, density distribution plot, and parallel coordinate plots; and more formal spatial data modeling that draws on the extensive Spatial Econometrics Toolbox functions. We briefly review the design aspects of the revised Arc_Mat and provide some illustrative examples that highlight representative uses of the toolbox. Finally, we discuss programming with and customizing the Arc_Mat toolbox functionalities.
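
    One of the exploratory views mentioned above, the Moran scatterplot, can be sketched in base Matlab without the toolbox itself. In the sketch below, the attribute vector y and spatial weight matrix W are assumed inputs, and the code is illustrative rather than part of Arc_Mat.

      % y: n-by-1 attribute vector; W: n-by-n spatial weight matrix (assumed given).
      Ws   = diag(1 ./ sum(W, 2)) * W;        % row-standardize the weights
      z    = (y - mean(y)) / std(y);          % standardized attribute
      lagz = Ws * z;                          % spatial lag of the standardized attribute

      figure;
      scatter(z, lagz, 25, 'filled');
      xlabel('z (standardized attribute)'); ylabel('Wz (spatial lag)');
      title('Moran scatterplot');
      p = polyfit(z, lagz, 1);                % slope of the fitted line approximates Moran's I
      hold on; plot(z, polyval(p, z), 'r-');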

  6. AutoMicromanager: A microscopy scripting toolkit for LABVIEW and other programming environments

    NASA Astrophysics Data System (ADS)

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.

  7. AutoMicromanager: a microscopy scripting toolkit for LABVIEW and other programming environments.

    PubMed

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.

  8. Acoustic detection of biosonar activity of deep diving odontocetes at Josephine Seamount High Seas Marine Protected Area.

    PubMed

    Giorli, Giacomo; Au, Whitlow W L; Ou, Hui; Jarvis, Susan; Morrissey, Ronald; Moretti, David

    2015-05-01

    The temporal occurrence of deep diving cetaceans in the Josephine Seamount High Seas Marine Protected Area (JSHSMPA), south-west Portugal, was monitored using a passive acoustic recorder. The recorder was deployed on 13 May 2010 at a depth of 814 m during the North Atlantic Treaty Organization Centre for Maritime Research and Experimentation cruise "Sirena10" and recovered on 6 June 2010. The recorder was programmed to record 40 s of data every 2 min. Acoustic data analysis, for the detection and classification of echolocation clicks, was performed using automatic detector/classification systems: M3R (Marine Mammal Monitoring on Navy Ranges) and a custom MATLAB program, with an operator-supervised custom MATLAB program used to assess the classification performance of the detector/classification systems. The M3R CS-SVM algorithm contains templates to detect beaked whales, sperm whales, blackfish (pilot and false killer whales), and Risso's dolphins. The detections of each group of odontocetes were monitored as a function of time. Blackfish and Risso's dolphins were detected every day, while beaked whales and sperm whales were detected almost every day. The hourly distribution of detections reveals that blackfish and Risso's dolphins were more active at night, while beaked whales and sperm whales were more active during daylight hours.
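
    A minimal illustration of the kind of energy-threshold click detection described above, using only base MATLAB. It is a generic sketch with assumed variable names (x for the recording, fs for the sampling rate); it is not the M3R system or the custom MATLAB programs used in the study.

      % x: acoustic pressure time series (column vector); fs: sampling rate in Hz (assumed given).
      x   = x - mean(x);                                  % remove DC offset
      env = sqrt(conv(x.^2, ones(64,1)/64, 'same'));      % short-time RMS envelope, 64-sample window
      thr = mean(env) + 5*std(env);                       % simple adaptive threshold

      isClick    = env > thr;                             % samples exceeding the threshold
      onsets     = find(diff([0; isClick]) == 1);         % rising edges mark candidate click onsets
      clickTimes = onsets / fs;                           % onset times in seconds
      fprintf('Detected %d candidate clicks\n', numel(clickTimes));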

  9. Design and Demonstration of a Miniature Lidar System for Rover Applications

    NASA Technical Reports Server (NTRS)

    Robinson, Benjamin

    2010-01-01

    A basic small and portable lidar system for rover applications has been designed. It uses a 20 Hz Nd:YAG pulsed laser, a 4-inch diameter telescope receiver, a custom-built power distribution unit (PDU), and a custom-built 532 nm photomultiplier tube (PMT) to measure the lidar signal. The receiving optics have been designed, but not constructed yet. LabVIEW and MATLAB programs have also been written to control the system, acquire data, and analyze data. The proposed system design, along with some measurements, is described. Future work to be completed is also discussed.

  10. Modeling of Habitat and Foraging Behavior of Beaked Whales in the Southern California Bight

    DTIC Science & Technology

    2014-09-30

    preference. APPROACH High-Frequency Acoustic Recording Packages (HARPs, Wiggins & Hildebrand 2007) have collected acoustic data at 17 sites...signal processing for HARP data is performed using the MATLAB (Mathworks, Natick, MA) based custom program Triton (Wiggins & Hildebrand 2007) and... HARP data are stored with the remainder of metadata (e.g. project name, instrument location, detection settings, detection effort) in the database

  11. MatLab Script and Functional Programming

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali

    2007-01-01

    MatLab Script and Functional Programming: MatLab is one of the most widely used very high level programming languages for scientific and engineering computations. It is very user-friendly and needs practically no formal programming knowledge. Presented here are MatLab programming aspects and not just the MatLab commands for scientists and engineers who do not have formal programming training and also have no significant time to spare for learning programming to solve their real world problems. Specifically provided are programs for visualization. The MatLab seminar covers the functional and script programming aspect of MatLab language. Specific expectations are: a) Recognize MatLab commands, script and function. b) Create, and run a MatLab function. c) Read, recognize, and describe MatLab syntax. d) Recognize decisions, loops and matrix operators. e) Evaluate scope among multiple files, and multiple functions within a file. f) Declare, define and use scalar variables, vectors and matrices.
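
    To make expectations (a) and (b) concrete, here is a minimal pairing of a MatLab function with a script that calls it. In practice the function and the script live in separate files; they are shown together here only for illustration, and the names are assumptions.

      % --- circle_area.m : a function (own workspace, inputs in, outputs out) ---
      function A = circle_area(r)
      %CIRCLE_AREA  Area of a circle of radius r (scalar, vector, or matrix input).
          A = pi * r.^2;
      end

      % --- demo_script.m : a script (runs in the base workspace, no arguments) ---
      radii = 0:0.5:2;             % vector of radii
      areas = circle_area(radii);  % call the function
      disp([radii(:) areas(:)]);   % display radius/area pairs side by side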

  12. MSiReader: an open-source interface to view and analyze high resolving power MS imaging files on Matlab platform.

    PubMed

    Robichaud, Guillaume; Garrard, Kenneth P; Barry, Jeremy A; Muddiman, David C

    2013-05-01

    During the past decade, the field of mass spectrometry imaging (MSI) has greatly evolved, to a point where it has now been fully integrated by most vendors as an optional or dedicated platform that can be purchased with their instruments. However, the technology is not mature and multiple research groups in both academia and industry are still very actively studying the fundamentals of imaging techniques, adapting the technology to new ionization sources, and developing new applications. As a result, there are important varieties of data file formats used to store mass spectrometry imaging data and, concurrent to the development of MSi, collaborative efforts have been undertaken to introduce common imaging data file formats. However, few free software packages to read and analyze files of these different formats are readily available. We introduce here MSiReader, a free open source application to read and analyze high resolution MSI data from the most common MSi data formats. The application is built on the Matlab platform (Mathworks, Natick, MA, USA) and includes a large selection of data analysis tools and features. People who are unfamiliar with the Matlab language will have little difficulty navigating the user-friendly interface, and users with Matlab programming experience can adapt and customize MSiReader for their own needs.
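
    As a generic illustration of what viewing an MSI data set amounts to computationally, the base-Matlab sketch below bins flat (x, y, m/z, intensity) peak records into an ion image for a single m/z window. The variable names, target m/z, and tolerance are assumptions; this is not MSiReader's interface.

      % xs, ys: integer pixel coordinates; mzs, ints: m/z and intensity per peak (assumed given).
      targetMZ = 760.585;                 % example m/z of interest (assumed)
      tol      = 0.01;                    % +/- window in Da (assumed)

      sel = abs(mzs - targetMZ) <= tol;   % peaks falling inside the m/z window
      img = accumarray([ys(sel), xs(sel)], ints(sel), [max(ys), max(xs)], @sum, 0);

      figure;
      imagesc(img); axis image; colorbar;
      title(sprintf('Ion image, m/z %.3f +/- %.2f', targetMZ, tol));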

  13. MSiReader: An Open-Source Interface to View and Analyze High Resolving Power MS Imaging Files on Matlab Platform

    NASA Astrophysics Data System (ADS)

    Robichaud, Guillaume; Garrard, Kenneth P.; Barry, Jeremy A.; Muddiman, David C.

    2013-05-01

    During the past decade, the field of mass spectrometry imaging (MSI) has greatly evolved, to a point where it has now been fully integrated by most vendors as an optional or dedicated platform that can be purchased with their instruments. However, the technology is not mature and multiple research groups in both academia and industry are still very actively studying the fundamentals of imaging techniques, adapting the technology to new ionization sources, and developing new applications. As a result, there are important varieties of data file formats used to store mass spectrometry imaging data and, concurrent to the development of MSi, collaborative efforts have been undertaken to introduce common imaging data file formats. However, few free software packages to read and analyze files of these different formats are readily available. We introduce here MSiReader, a free open source application to read and analyze high resolution MSI data from the most common MSi data formats. The application is built on the Matlab platform (Mathworks, Natick, MA, USA) and includes a large selection of data analysis tools and features. People who are unfamiliar with the Matlab language will have little difficulty navigating the user-friendly interface, and users with Matlab programming experience can adapt and customize MSiReader for their own needs.

  14. MatLab Programming for Engineers Having No Formal Programming Knowledge

    NASA Technical Reports Server (NTRS)

    Shaykhian, Linda H.; Shaykhian, Gholam Ali

    2007-01-01

    MatLab is one of the most widely used very high level programming languages for Scientific and engineering computations. It is very user-friendly and needs practically no formal programming knowledge. Presented here are MatLab programming aspects and not just the MatLab commands for scientists and engineers who do not have formal programming training and also have no significant time to spare for learning programming to solve their real world problems. Specifically provided are programs for visualization. Also, stated are the current limitations of the MatLab, which possibly can be taken care of by Mathworks Inc. in a future version to make MatLab more versatile.

  15. COSMIC monthly progress report

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Activities of the Computer Software Management and Information Center (COSMIC) are summarized for the month of April 1994. Tables showing the current inventory of programs available from COSMIC are presented and program processing and evaluation activities are summarized. Five articles were prepared for publication in the NASA Tech Brief Journal. These articles (included in this report) describe the following software items: GAP 1.0 - Groove Analysis Program, Version 1.0; SUBTRANS - Subband/Transform MATLAB Functions for Image Processing; CSDM - COLD-SAT Dynamic Model; CASRE - Computer Aided Software Reliability Estimation; and XOPPS - OEL Project Planner/Scheduler Tool. Activities in the areas of marketing, customer service, benefits identification, maintenance and support, and disseminations are also described along with a budget summary.

  16. DynaSim: A MATLAB Toolbox for Neural Modeling and Simulation

    PubMed Central

    Sherfey, Jason S.; Soplata, Austin E.; Ardid, Salva; Roberts, Erik A.; Stanley, David A.; Pittman-Polletta, Benjamin R.; Kopell, Nancy J.

    2018-01-01

    DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems. The design of DynaSim incorporates a novel schema for model specification to facilitate future interoperability with other specifications (e.g., NeuroML, SBML), simulators (e.g., NEURON, Brian, NEST), and web-based applications (e.g., Geppetto) outside MATLAB. DynaSim is freely available at http://dynasimtoolbox.org. This tool promises to reduce barriers for investigating dynamics in large neural models, facilitate collaborative modeling, and complement other tools being developed in the neuroinformatics community. PMID:29599715
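
    As a generic illustration of what specifying a model directly by its equations means (using base MATLAB's ode45, not DynaSim's own specification syntax), the sketch below integrates the FitzHugh-Nagumo membrane model; parameter values are arbitrary example choices.

      % FitzHugh-Nagumo model: dv/dt = v - v^3/3 - w + Iapp;  dw/dt = epsw*(v + a - b*w)
      Iapp = 0.5; epsw = 0.08; a = 0.7; b = 0.8;           % example parameters (assumed)
      rhs = @(t, y) [ y(1) - y(1)^3/3 - y(2) + Iapp; ...
                      epsw * (y(1) + a - b*y(2)) ];

      [t, y] = ode45(rhs, [0 200], [-1; 1]);               % integrate from t = 0 to t = 200

      figure;
      plot(t, y(:,1));
      xlabel('time'); ylabel('v (membrane variable)');
      title('FitzHugh-Nagumo spiking dynamics (generic example)');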

  17. DynaSim: A MATLAB Toolbox for Neural Modeling and Simulation.

    PubMed

    Sherfey, Jason S; Soplata, Austin E; Ardid, Salva; Roberts, Erik A; Stanley, David A; Pittman-Polletta, Benjamin R; Kopell, Nancy J

    2018-01-01

    DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems. The design of DynaSim incorporates a novel schema for model specification to facilitate future interoperability with other specifications (e.g., NeuroML, SBML), simulators (e.g., NEURON, Brian, NEST), and web-based applications (e.g., Geppetto) outside MATLAB. DynaSim is freely available at http://dynasimtoolbox.org. This tool promises to reduce barriers for investigating dynamics in large neural models, facilitate collaborative modeling, and complement other tools being developed in the neuroinformatics community.

  18. Dynamical modeling and multi-experiment fitting with PottersWheel

    PubMed Central

    Maiwald, Thomas; Timmer, Jens

    2008-01-01

    Motivation: Modelers in Systems Biology need a flexible framework that allows them to easily create new dynamic models, investigate their properties and fit several experimental datasets simultaneously. Multi-experiment-fitting is a powerful approach to estimate parameter values, to check the validity of a given model, and to discriminate competing model hypotheses. It requires high-performance integration of ordinary differential equations and robust optimization. Results: We here present the comprehensive modeling framework PottersWheel (PW) including novel functionalities to satisfy these requirements with strong emphasis on the inverse problem, i.e. data-based modeling of partially observed and noisy systems like signal transduction pathways and metabolic networks. PW is designed as a MATLAB toolbox and includes numerous user interfaces. Deterministic and stochastic optimization routines are combined by fitting in logarithmic parameter space allowing for robust parameter calibration. Model investigation includes statistical tests for model-data-compliance, model discrimination, identifiability analysis and calculation of Hessian- and Monte-Carlo-based parameter confidence limits. A rich application programming interface is available for customization within the user's own MATLAB code. Within an extensive performance analysis, we identified and significantly improved an integrator-optimizer pair which decreases the fitting duration for a realistic benchmark model by a factor of over 3000 compared to MATLAB with the optimization toolbox. Availability: PottersWheel is freely available for academic usage at http://www.PottersWheel.de/. The website contains a detailed documentation and introductory videos. The program has been intensively used since 2005 on Windows, Linux and Macintosh computers and does not require special MATLAB toolboxes. Contact: maiwald@fdm.uni-freiburg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18614583
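
    The point about fitting in logarithmic parameter space can be sketched with base MATLAB's fminsearch: optimize over q = log(p) so the parameters stay positive and can span orders of magnitude. The toy exponential-decay model and the data variable names below are assumptions, not PottersWheel code.

      % tdata, ydata: measured time points and observations (assumed given).
      model = @(p, t) p(1) * exp(-p(2) * t);              % toy two-parameter decay model

      % Sum-of-squares objective evaluated at p = exp(q), i.e. q = log(p).
      sse = @(q) sum((ydata - model(exp(q), tdata)).^2);

      q0   = log([1; 0.1]);                               % initial guess in log space
      qfit = fminsearch(sse, q0);                         % unconstrained search over log-parameters
      pfit = exp(qfit);                                   % back-transform to positive parameters
      fprintf('Fitted amplitude %.3g, rate %.3g\n', pfit(1), pfit(2));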

  19. NLSEmagic: Nonlinear Schrödinger equation multi-dimensional Matlab-based GPU-accelerated integrators using compact high-order schemes

    NASA Astrophysics Data System (ADS)

    Caplan, R. M.

    2013-04-01

    We present a simple to use, yet powerful code package called NLSEmagic to numerically integrate the nonlinear Schrödinger equation in one, two, and three dimensions. NLSEmagic is a high-order finite-difference code package which utilizes graphic processing unit (GPU) parallel architectures. The codes running on the GPU are many times faster than their serial counterparts, and are much cheaper to run than on standard parallel clusters. The codes are developed with usability and portability in mind, and therefore are written to interface with MATLAB utilizing custom GPU-enabled C codes with the MEX-compiler interface. The packages are freely distributed, including user manuals and set-up files. Catalogue identifier: AEOJ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEOJ_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 124453 No. of bytes in distributed program, including test data, etc.: 4728604 Distribution format: tar.gz Programming language: C, CUDA, MATLAB. Computer: PC, MAC. Operating system: Windows, MacOS, Linux. Has the code been vectorized or parallelized?: Yes. Number of processors used: Single CPU, number of GPU processors dependent on chosen GPU card (max is currently 3072 cores on GeForce GTX 690). Supplementary material: Setup guide, Installation guide. RAM: Highly dependent on dimensionality and grid size. For typical medium-large problem size in three dimensions, 4GB is sufficient. Keywords: Nonlinear Schrödinger Equation, GPU, high-order finite difference, Bose-Einstein condensates. Classification: 4.3, 7.7. Nature of problem: Integrate solutions of the time-dependent one-, two-, and three-dimensional cubic nonlinear Schrödinger equation. Solution method: The integrators utilize a fully-explicit fourth-order Runge-Kutta scheme in time and both second- and fourth-order differencing in space. The integrators are written to run on NVIDIA GPUs and are interfaced with MATLAB including built-in visualization and analysis tools. Restrictions: The main restriction for the GPU integrators is the amount of RAM on the GPU as the code is currently only designed for running on a single GPU. Unusual features: Ability to visualize real-time simulations through the interaction of MATLAB and the compiled GPU integrators. Additional comments: Setup guide and Installation guide provided. Program has a dedicated web site at www.nlsemagic.com. Running time: A three-dimensional run with a grid dimension of 87×87×203 for 3360 time steps (100 non-dimensional time units) takes about one and a half minutes on a GeForce GTX 580 GPU card.
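
    A minimal one-dimensional analogue of what NLSEmagic implements: second-order central differences in space, a classical fourth-order Runge-Kutta scheme in time, and a cubic nonlinearity, written here in plain MATLAB rather than the package's CUDA/MEX integrators. The grid, time step, and bright-soliton initial condition are illustrative choices.

      % Focusing cubic NLSE: i*psi_t + 0.5*psi_xx + |psi|^2*psi = 0, periodic in x
      N = 512; L = 40; dx = L/N; x = (-L/2:dx:L/2-dx)';   % periodic spatial grid
      dt = 0.5*dx^2;                                      % conservative explicit time step
      psi = sech(x);                                      % bright-soliton initial condition

      lap = @(u) (circshift(u,1) - 2*u + circshift(u,-1)) / dx^2;   % 2nd-order periodic Laplacian
      rhs = @(u) 1i * (0.5*lap(u) + abs(u).^2 .* u);                % psi_t = i*(0.5*psi_xx + |psi|^2*psi)

      for n = 1:round(5/dt)                               % integrate to roughly t = 5 with classical RK4
          k1 = rhs(psi);
          k2 = rhs(psi + 0.5*dt*k1);
          k3 = rhs(psi + 0.5*dt*k2);
          k4 = rhs(psi + dt*k3);
          psi = psi + dt*(k1 + 2*k2 + 2*k3 + k4)/6;
      end

      plot(x, abs(psi).^2); xlabel('x'); ylabel('|psi|^2');
      title('1D bright soliton (illustrative run)');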

  20. Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.

    PubMed

    Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N

    2009-10-27

    The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined, rational algorithmic analysis workflows or batch standardized processing to incorporate all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided heavily depend on the programming skills of the user, whereas in the case of GUI-embedded solutions, they do not provide direct support of various raw image analysis formats or a versatile and simultaneously flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB-implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports two-coloured cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma separated values, tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools plus customizable export data formats for seamless integration by other analysis tools or MATLAB, for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with MATLAB Component Runtime. Gene ARMADA provides a highly adaptable, integrative, yet flexible tool which can be used for automated quality control, analysis, annotation and visualization of microarray data, constituting a starting point for further data interpretation and integration with numerous other tools.

  1. DataPflex: a MATLAB-based tool for the manipulation and visualization of multidimensional datasets.

    PubMed

    Hendriks, Bart S; Espelin, Christopher W

    2010-02-01

    DataPflex is a MATLAB-based application that facilitates the manipulation and visualization of multidimensional datasets. The strength of DataPflex lies in the intuitive graphical user interface for the efficient incorporation, manipulation and visualization of high-dimensional data that can be generated by multiplexed protein measurement platforms including, but not limited to Luminex or Meso-Scale Discovery. Such data can generally be represented in the form of multidimensional datasets [for example (time x stimulation x inhibitor x inhibitor concentration x cell type x measurement)]. For cases where measurements are made in a combinational fashion across multiple dimensions, there is a need for a tool to efficiently manipulate and reorganize such data for visualization. DataPflex accepts data consisting of up to five arbitrary dimensions in addition to a measurement dimension. Data are imported from a simple .xls format and can be exported to MATLAB or .xls. Data dimensions can be reordered, subdivided, merged, normalized and visualized in the form of collections of line graphs, bar graphs, surface plots, heatmaps, IC50's and other custom plots. Open source implementation in MATLAB enables easy extension for custom plotting routines and integration with more sophisticated analysis tools. DataPflex is distributed under the GPL license (http://www.gnu.org/licenses/) together with documentation, source code and sample data files at: http://code.google.com/p/datapflex. Supplementary data available at Bioinformatics online.
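
    A brief base-MATLAB illustration of the kind of multidimensional reorganization DataPflex automates: a four-dimensional (time x stimulation x concentration x measurement) array is permuted and sliced for plotting. The array and the chosen indices are assumptions for illustration.

      % data: 4-D array with dimensions (time x stimulation x concentration x measurement), assumed given.
      [nT, nS, nC, nM] = size(data);

      % Reorder so concentration is last, then plot the dose response of
      % measurement 1 under stimulation 2 at the final time point.
      d    = permute(data, [1 2 4 3]);       % now (time x stimulation x measurement x concentration)
      dose = squeeze(d(end, 2, 1, :));       % nC-by-1 vector of responses

      figure;
      plot(1:nC, dose, 'o-');
      xlabel('inhibitor concentration index'); ylabel('response (a.u.)');
      title('Dose-response slice from a 4-D dataset');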

  2. Optimal service using Matlab - simulink controlled Queuing system at call centers

    NASA Astrophysics Data System (ADS)

    Balaji, N.; Siva, E. P.; Chandrasekaran, A. D.; Tamilazhagan, V.

    2018-04-01

    This paper presents graphical, integrated, model-based academic research on telephone call centres. It introduces the important feature of impatient customers and abandonments in the queue system. However, the modern call centre is a complex socio-technical system. Queuing theory has now become a suitable application in the telecom industry to provide better online services. Through these Matlab-Simulink multi-queue structured models, better solutions are provided for complex situations at call centres. Service performance measures are analyzed at an optimal level through the Simulink queuing model.

  3. MATLAB Algorithms for Rapid Detection and Embedding of Palindrome and Emordnilap Electronic Watermarks in Simulated Chemical and Biological Image Data

    DTIC Science & Technology

    2004-12-01

    digital watermarking http://ww*.petitcolas.net/fabien/steganography/ email: fapp2@cl.cam.ac.uk a=double(imread('custom-a.jpg')); %load in image ...MATLAB Algorithms for Rapid Detection and Embedding of Palindrome and Emordnilap Electronic Watermarks in Simulated Chemical and Biological Image ...approach (Ref 2-4) to watermarking involves be used to inform the viewer of data (such as photographs putting the cover image in the first 4

  4. Optical cell tracking analysis using a straight-forward approach to minimize processing time for high frame rate data

    NASA Astrophysics Data System (ADS)

    Seeto, Wen Jun; Lipke, Elizabeth Ann

    2016-03-01

    Tracking of rolling cells via in vitro experiment is now commonly performed using customized computer programs. In most cases, two critical challenges continue to limit analysis of cell rolling data: long computation times due to the complexity of tracking algorithms and difficulty in accurately correlating a given cell with itself from one frame to the next, which is typically due to errors caused by cells that either come close in proximity to each other or come in contact with each other. In this paper, we have developed a sophisticated, yet simple and highly effective, rolling cell tracking system to address these two critical problems. This optical cell tracking analysis (OCTA) system first employs ImageJ for cell identification in each frame of a cell rolling video. A custom MATLAB code was written to use the geometric and positional information of all cells as the primary parameters for matching each individual cell with itself between consecutive frames and to avoid errors when tracking cells that come within close proximity to one another. Once the cells are matched, rolling velocity can be obtained for further analysis. The use of ImageJ for cell identification eliminates the need for high level MATLAB image processing knowledge. As a result, only fundamental MATLAB syntax is necessary for cell matching. OCTA has been implemented in the tracking of endothelial colony forming cell (ECFC) rolling under shear. The processing time needed to obtain tracked cell data from a 2 min ECFC rolling video recorded at 70 frames per second with a total of over 8000 frames is less than 6 min using a computer with an Intel® Core™ i7 CPU 2.80 GHz (8 CPUs). This cell tracking system benefits cell rolling analysis by substantially reducing the time required for post-acquisition data processing of high frame rate video recordings and preventing tracking errors when individual cells come in close proximity to one another.
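
    The geometric matching step described above can be reduced to a nearest-centroid assignment between consecutive frames with a distance gate that rejects ambiguous pairings when cells come close together. The sketch below is an illustrative simplification with assumed variable names, not the authors' OCTA code.

      % c1, c2: k1-by-2 and k2-by-2 centroid lists [x y] for frames n and n+1 (assumed given).
      maxDisp = 15;                               % largest plausible displacement in pixels (assumed)
      match   = zeros(size(c1,1), 1);             % match(i) = index in c2 matched to cell i, or 0

      for i = 1:size(c1, 1)
          d = hypot(c2(:,1) - c1(i,1), c2(:,2) - c1(i,2));   % distances to all frame n+1 centroids
          [dmin, j] = min(d);
          if dmin <= maxDisp && sum(d <= maxDisp) == 1       % accept only unambiguous, nearby matches
              match(i) = j;
          end
      end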

  5. An introduction to MATLAB.

    PubMed

    Sobie, Eric A

    2011-09-13

    This two-part lecture introduces students to the scientific computing language MATLAB. Prior computer programming experience is not required. The lectures present basic concepts of computer programming logic that tend to cause difficulties for beginners in addition to concepts that relate specifically to the MATLAB language syntax. The lectures begin with a discussion of vectors, matrices, and arrays. Because many types of biological data, such as fluorescence images and DNA microarrays, are stored as two-dimensional objects, processing these data is a form of array manipulation, and MATLAB is especially adept at handling such array objects. The students are introduced to basic commands in MATLAB, as well as built-in functions that provide useful shortcuts. The second lecture focuses on the differences between MATLAB scripts and MATLAB functions and describes when one method of programming organization might be preferable to the other. The principles are illustrated through the analysis of experimental data, specifically measurements of intracellular calcium concentration in live cells obtained using confocal microscopy.
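
    A small example of the array-manipulation point made above: a grayscale fluorescence image is simply a two-dimensional matrix, so background subtraction and region statistics are ordinary matrix operations. The file name and threshold rule are illustrative assumptions.

      img  = double(imread('cell_frame.png'));    % assumed grayscale image; it loads as a 2-D matrix
      img  = img - median(img(:));                % crude background subtraction
      mask = img > 3 * std(img(:));               % logical matrix marking bright (fluorescent) pixels

      meanIntensity = mean(img(mask));            % statistics restricted to the masked region
      fprintf('Bright pixels: %d, mean intensity: %.1f\n', nnz(mask), meanIntensity);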

  6. An Introduction to MATLAB

    PubMed Central

    Sobie, Eric A.

    2014-01-01

    This two-part lecture introduces students to the scientific computing language MATLAB. Prior computer programming experience is not required. The lectures present basic concepts of computer programming logic that tend to cause difficulties for beginners in addition to concepts that relate specifically to the MATLAB language syntax. The lectures begin with a discussion of vectors, matrices, and arrays. Because many types of biological data, such as fluorescence images and DNA microarrays, are stored as two-dimensional objects, processing these data is a form of array manipulation, and MATLAB is especially adept at handling such array objects. The students are introduced to basic commands in MATLAB, as well as built-in functions that provide useful shortcuts. The second lecture focuses on the differences between MATLAB scripts and MATLAB functions and describes when one method of programming organization might be preferable to the other. The principles are illustrated through the analysis of experimental data, specifically measurements of intracellular calcium concentration in live cells obtained using confocal microscopy. PMID:21934110

  7. Development of MATLAB software to control data acquisition from a multichannel systems multi-electrode array.

    PubMed

    Messier, Erik

    2016-08-01

    A Multichannel Systems (MCS) microelectrode array data acquisition (DAQ) unit is used to collect multichannel electrograms (EGM) from a Langendorff perfused rabbit heart system to study sudden cardiac death (SCD). MCS provides software through which data being processed by the DAQ unit can be displayed and saved, but this software's combined utility with MATLAB is not very effective. MCS's software stores recorded EGM data in a MathCad (MCD) format, which is then converted to a text file format. These text files are very large, and it is therefore very time consuming to import the EGM data into MATLAB for real-time analysis. Therefore, customized MATLAB software was developed to control the acquisition of data from the MCS DAQ unit and to provide specific laboratory accommodations for this study of SCD. The developed DAQ unit control software will be able to accurately: provide real-time display of EGM signals; record and save EGM signals in MATLAB in a desired format; and produce real-time analysis of the EGM signals; all through an intuitive GUI.

  8. Remote sensing image segmentation based on Hadoop cloud platform

    NASA Astrophysics Data System (ADS)

    Li, Jie; Zhu, Lingling; Cao, Fubin

    2018-01-01

    To solve the problem that remote sensing image segmentation is slow and its real-time performance is poor, this paper studies a method of remote sensing image segmentation based on the Hadoop platform. On the basis of analyzing the structural characteristics of the Hadoop cloud platform and its MapReduce programming component, this paper proposes a method of image segmentation based on the combination of OpenCV and the Hadoop cloud platform. Firstly, the MapReduce image processing model of the Hadoop cloud platform is designed, the input and output of the image are customized, and the segmentation method of the data file is rewritten. Then the Mean Shift image segmentation algorithm is implemented. Finally, this paper performs a segmentation experiment on a remote sensing image and uses MATLAB to realize the Mean Shift image segmentation algorithm for comparison on the same image. The experimental results show that, while ensuring a good segmentation result, the segmentation rate of remote sensing image segmentation based on the Hadoop cloud platform is greatly improved compared with standalone MATLAB image segmentation, and there is a great improvement in the effectiveness of image segmentation.

  9. Resource-saving accommodation of the enterprises of service for travelling by car in the context of sustainable development of territories

    NASA Astrophysics Data System (ADS)

    Popov, Vyacheslav; Ermakov, Alexander; Mukhamedzhanova, Olga

    2017-10-01

    Sustainable development of trailering in Russia needs energy-efficient and environmentally safe localization of the supporting infrastructure, which includes customer services such as enterprises of hospitality (campings). Their rational placement minimizes not only the fuel consumption of vehicles but also emissions of harmful substances into the atmosphere. The article presents rational localization of sites for the construction of such enterprises using the MATLAB program. The program provides several levels of the task solution: from the total characteristic of the territory (the head interface) to the analysis of the possibility of forwarding charges for visits to the enterprises of car service (petrol stations, automobile spare parts shops, car repair enterprises, cafes, campings and so on). The optimization implemented in the program, based on the criterion of decreasing energy costs, allows the preferable areas for their rational localization to be established.

  10. A Summary of the Naval Postgraduate School Research Program and Recent Publications

    DTIC Science & Technology

    1990-09-01

    principles to divide the spectrum of a wide-band spread-spectrum signal into sub-... MATLAB computer program on a 386-type computer. Because of the high rf... original signal. Effects due the fiber optic pickup array and... in time and a large data sample was required. An extended version of MATLAB that allows... application, such as orbital mechanics and weather prediction. Professor Gragg has also developed numerous MATLAB programs for linear programming problems

  11. Estimating aquifer transmissivity from specific capacity using MATLAB.

    PubMed

    McLin, Stephen G

    2005-01-01

    Historically, specific capacity information has been used to calculate aquifer transmissivity when pumping test data are unavailable. This paper presents a simple computer program written in the MATLAB programming language that estimates transmissivity from specific capacity data while correcting for aquifer partial penetration and well efficiency. The program graphically plots transmissivity as a function of these factors so that the user can visually estimate their relative importance in a particular application. The program is compatible with any computer operating system running MATLAB, including Windows, Macintosh OS, Linux, and Unix. Two simple examples illustrate program usage.
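
    A standard way to make such an estimate is the Cooper-Jacob relation s = (Q/(4*pi*T))*ln(2.25*T*t/(r^2*S)), which must be solved iteratively because transmissivity T appears on both sides. The sketch below shows that fixed-point iteration with illustrative inputs; it is a generic textbook form, not necessarily McLin's exact formulation, and it omits the partial-penetration and well-efficiency corrections the program adds.

      % Illustrative inputs (SI units): pumping rate Q, drawdown s, time t, well radius r, storativity S.
      Q = 0.01;  s = 2.0;  t = 86400;  r = 0.1;  S = 1e-4;

      T = 1e-3;                                    % initial guess for transmissivity (m^2/s)
      for k = 1:50                                 % fixed-point iteration on the Cooper-Jacob relation
          T = (Q / (4*pi*s)) * log(2.25 * T * t / (r^2 * S));
      end
      fprintf('Estimated transmissivity: %.3e m^2/s\n', T);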

  12. PScan 1.0: flexible software framework for polygon based multiphoton microscopy

    NASA Astrophysics Data System (ADS)

    Li, Yongxiao; Lee, Woei Ming

    2016-12-01

    Multiphoton laser scanning microscopes exhibit highly localized nonlinear optical excitation and are powerful instruments for in-vivo deep tissue imaging. Customized multiphoton microscopy has significantly superior performance for in-vivo imaging because of precise control over the scanning and detection system. To date, there have been several flexible software platforms catering to custom-built microscopy systems, e.g., ScanImage, HelioScan, and MicroManager, that perform at imaging speeds of 30-100 fps. In this paper, we describe a flexible software framework for high speed imaging systems capable of operating from 5 fps to 1600 fps. The software is based on the MATLAB image processing toolbox. It has the capability to communicate directly with a high performing imaging card (Matrox Solios eA/XA), thus retaining high speed acquisition. The program is also designed to communicate with LabVIEW and Fiji for instrument control and image processing. PScan 1.0 can handle high imaging rates and contains sufficient flexibility for users to adapt it to their high speed imaging systems.

  13. MNPBEM - A Matlab toolbox for the simulation of plasmonic nanoparticles

    NASA Astrophysics Data System (ADS)

    Hohenester, Ulrich; Trügler, Andreas

    2012-02-01

    MNPBEM is a Matlab toolbox for the simulation of metallic nanoparticles (MNP), using a boundary element method (BEM) approach. The main purpose of the toolbox is to solve Maxwell's equations for a dielectric environment where bodies with homogeneous and isotropic dielectric functions are separated by abrupt interfaces. Although the approach is in principle suited for arbitrary body sizes and photon energies, it is tested (and probably works best) for metallic nanoparticles with sizes ranging from a few to a few hundreds of nanometers, and for frequencies in the optical and near-infrared regime. The toolbox has been implemented with Matlab classes. These classes can be easily combined, which has the advantage that one can adapt the simulation programs flexibly for various applications. Program summary Program title: MNPBEM Catalogue identifier: AEKJ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEKJ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License v2 No. of lines in distributed program, including test data, etc.: 15 700 No. of bytes in distributed program, including test data, etc.: 891 417 Distribution format: tar.gz Programming language: Matlab 7.11.0 (R2010b) Computer: Any which supports Matlab 7.11.0 (R2010b) Operating system: Any which supports Matlab 7.11.0 (R2010b) RAM: ⩾1 GByte Classification: 18 Nature of problem: Solve Maxwell's equations for dielectric particles with homogeneous dielectric functions separated by abrupt interfaces. Solution method: Boundary element method using electromagnetic potentials. Running time: Depending on surface discretization between seconds and hours.

  14. Improve Problem Solving Skills through Adapting Programming Tools

    NASA Technical Reports Server (NTRS)

    Shaykhian, Linda H.; Shaykhian, Gholam Ali

    2007-01-01

    There are numerous ways for engineers and students to become better problem-solvers. The use of command line and visual programming tools can help to model a problem and formulate a solution through visualization. The analysis of problem attributes and constraints provides insight into the scope and complexity of the problem. The visualization aspect of the problem-solving approach tends to make students and engineers more systematic in their thought process and helps them catch errors before proceeding too far in the wrong direction. The problem-solver identifies and defines important terms, variables, rules, and procedures required for solving a problem. Every step required to construct the problem solution can be defined in program commands that produce intermediate output. This paper advocates improved problem solving skills through using a programming tool. MatLab, created by MathWorks, is an interactive numerical computing environment and programming language. It is a matrix-based system that easily lends itself to matrix manipulation, and plotting of functions and data. MatLab can be used as an interactive command line or as a sequence of commands that can be saved in a file as a script or named functions. Prior programming experience is not required to use MatLab commands. GNU Octave, part of the GNU project and a free computer program for performing numerical computations, is comparable to MatLab. MatLab visual and command programming are presented here.

  15. The TimeStudio Project: An open source scientific workflow system for the behavioral and brain sciences.

    PubMed

    Nyström, Pär; Falck-Ytter, Terje; Gredebäck, Gustaf

    2016-06-01

    This article describes a new open source scientific workflow system, the TimeStudio Project, dedicated to the behavioral and brain sciences. The program is written in MATLAB and features a graphical user interface for the dynamic pipelining of computer algorithms developed as TimeStudio plugins. TimeStudio includes both a set of general plugins (for reading data files, modifying data structures, visualizing data structures, etc.) and a set of plugins specifically developed for the analysis of event-related eyetracking data as a proof of concept. It is possible to create custom plugins to integrate new or existing MATLAB code anywhere in a workflow, making TimeStudio a flexible workbench for organizing and performing a wide range of analyses. The system also features an integrated sharing and archiving tool for TimeStudio workflows, which can be used to share workflows both during the data analysis phase and after scientific publication. TimeStudio thus facilitates the reproduction and replication of scientific studies, increases the transparency of analyses, and reduces individual researchers' analysis workload. The project website ( http://timestudioproject.com ) contains the latest releases of TimeStudio, together with documentation and user forums.

  16. Calculus Demonstrations Using MATLAB

    ERIC Educational Resources Information Center

    Dunn, Peter K.; Harman, Chris

    2002-01-01

    The note discusses ways in which technology can be used in the calculus learning process. In particular, five MATLAB programs are detailed for use by instructors or students that demonstrate important concepts in introductory calculus: Newton's method, differentiation and integration. Two of the programs are animated. The programs and the…
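
    One of the demonstrations mentioned, Newton's method, fits in a few lines of MATLAB; the example function and starting guess are assumptions for illustration.

      f  = @(x) x.^3 - 2*x - 5;        % example function whose root is sought
      fp = @(x) 3*x.^2 - 2;            % its derivative

      x = 2;                           % initial guess
      for k = 1:8                      % Newton iteration: x <- x - f(x)/f'(x)
          x = x - f(x)/fp(x);
          fprintf('iteration %d: x = %.10f\n', k, x);
      end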

  17. EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis.

    PubMed

    Delorme, Arnaud; Makeig, Scott

    2004-03-15

    We have developed a toolbox and graphic user interface, EEGLAB, running under the crossplatform MATLAB environment (The Mathworks, Inc.) for processing collections of single-trial and/or averaged EEG data of any number of channels. Available functions include EEG data, channel and event information importing, data visualization (scrolling, scalp map and dipole model plotting, plus multi-trial ERP-image plots), preprocessing (including artifact rejection, filtering, epoch selection, and averaging), independent component analysis (ICA) and time/frequency decompositions including channel and component cross-coherence supported by bootstrap statistical methods based on data resampling. EEGLAB functions are organized into three layers. Top-layer functions allow users to interact with the data through the graphic interface without needing to use MATLAB syntax. Menu options allow users to tune the behavior of EEGLAB to available memory. Middle-layer functions allow users to customize data processing using command history and interactive 'pop' functions. Experienced MATLAB users can use EEGLAB data structures and stand-alone signal processing functions to write custom and/or batch analysis scripts. Extensive function help and tutorial information are included. A 'plug-in' facility allows easy incorporation of new EEG modules into the main menu. EEGLAB is freely available (http://www.sccn.ucsd.edu/eeglab/) under the GNU public license for noncommercial use and open source development, together with sample data, user tutorial and extensive documentation.
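
    For readers unfamiliar with the 'pop' functions mentioned above, a typical scripted EEGLAB pipeline looks roughly like the following. The file name and filter band are assumptions, and exact function signatures can vary between EEGLAB releases, so treat this as a sketch rather than canonical EEGLAB usage.

      % Assumes EEGLAB is installed and on the MATLAB path; 'subject01.set' is an illustrative file name.
      [ALLEEG, EEG, CURRENTSET] = eeglab;                 % start EEGLAB and initialize its structures
      EEG = pop_loadset('filename', 'subject01.set');     % import a saved dataset
      EEG = pop_eegfiltnew(EEG, 1, 40);                   % band-pass filter, 1-40 Hz (example band)
      EEG = pop_runica(EEG, 'icatype', 'runica');         % independent component analysis
      pop_topoplot(EEG, 0, 1:8);                          % scalp maps of the first eight components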

  18. Comparison of cyclic correlation algorithm implemented in matlab and python

    NASA Astrophysics Data System (ADS)

    Carr, Richard; Whitney, James

    Simulation is a necessary step for all engineering projects. Simulation gives engineers an approximation of how their devices will perform under different circumstances, without having to build, or before building, a physical prototype. This is especially true for space-bound devices, i.e., space communication systems, where the impact of system malfunction or failure is several orders of magnitude over that of terrestrial applications. Therefore, having a reliable simulation tool is key in developing these devices and systems. MathWorks Matrix Laboratory (MATLAB) is a matrix-based software used by scientists and engineers to solve problems and perform complex simulations. MATLAB has a number of applications in a wide variety of fields which include communications, signal processing, image processing, mathematics, economics and physics. Because of its many uses MATLAB has become the preferred software for many engineers; it is also very expensive, especially for students and startups. One alternative to MATLAB is Python. Python is a powerful, easy to use, open source programming environment that can be used to perform many of the same functions as MATLAB. The Python programming environment has been steadily gaining popularity in niche programming circles. While there are not as many functions included in the software as in MATLAB, there are many open source functions that have been developed and are available to be downloaded for free. This paper illustrates how Python can implement the cyclic correlation algorithm and compares the results to the cyclic correlation algorithm implemented in the MATLAB environment. Some of the characteristics to be compared are the accuracy and precision of the results, and the length of the programs. The paper will demonstrate that Python is capable of performing simulations of complex algorithms such as cyclic correlation.
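
    The cyclic (circular) correlation being compared can be computed identically in either environment through the FFT; below is the MATLAB form, matching the document's language, with synthetic test signals standing in for real data.

      % Circular cross-correlation via the convolution theorem: r(l) = sum_n x*(n) y(n+l), indices mod N.
      x = randn(256, 1);                        % example signal (assumed; replace with real data)
      y = circshift(x, 10) + 0.1*randn(256, 1); % delayed, noisy copy of x

      r = ifft(fft(y) .* conj(fft(x)));         % length-N cyclic correlation
      [~, m] = max(abs(r));
      fprintf('Estimated circular lag: %d samples\n', m - 1);   % expected to recover the 10-sample shift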

  19. MATLAB tools for improved characterization and quantification of volcanic incandescence in Webcam imagery; applications at Kilauea Volcano, Hawai'i

    USGS Publications Warehouse

    Patrick, Matthew R.; Kauahikaua, James P.; Antolik, Loren

    2010-01-01

    Webcams are now standard tools for volcano monitoring and are used at observatories in Alaska, the Cascades, Kamchatka, Hawai'i, Italy, and Japan, among other locations. Webcam images allow invaluable documentation of activity and provide a powerful comparative tool for interpreting other monitoring datastreams, such as seismicity and deformation. Automated image processing can improve the time efficiency and rigor of Webcam image interpretation, and potentially extract more information on eruptive activity. For instance, Lovick and others (2008) provided a suite of processing tools that performed such tasks as noise reduction, eliminating uninteresting images from an image collection, and detecting incandescence, with an application to dome activity at Mount St. Helens during 2007. In this paper, we present two very simple automated approaches for improved characterization and quantification of volcanic incandescence in Webcam images at Kilauea Volcano, Hawai'i. The techniques are implemented in MATLAB (version 2009b, Copyright: The Mathworks, Inc.) to take advantage of the ease of matrix operations. Incandescence is a useful indicator of the location and extent of active lava flows and also a potentially powerful proxy for activity levels at open vents. We apply our techniques to a period covering both summit and east rift zone activity at Kilauea during 2008-2009 and compare the results to complementary datasets (seismicity, tilt) to demonstrate their integrative potential. A great strength of this study is the demonstrated success of these tools in an operational setting at the Hawaiian Volcano Observatory (HVO) over the course of more than a year. Although applied only to Webcam images here, the techniques could be applied to any type of sequential images, such as time-lapse photography. We expect that these tools are applicable to many other volcano monitoring scenarios, and the two MATLAB scripts, as they are implemented at HVO, are included in the appendixes. These scripts would require minor to moderate modifications for use elsewhere, primarily to customize directory navigation. If the user has some familiarity with MATLAB, or programming in general, these modifications should be easy. Although we originally anticipated needing the Image Processing Toolbox, the scripts in the appendixes do not require it. Thus, only the base installation of MATLAB is needed. Because fairly basic MATLAB functions are used, we expect that the script can be run successfully by versions earlier than 2009b.
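
    The simplest form of the incandescence measure described above counts bright, red-dominated pixels in each webcam frame. The file name, channel rule, and threshold below are illustrative assumptions; the MATLAB scripts in the report's appendixes are the authoritative versions used at HVO.

      frame = imread('webcam_frame.jpg');               % illustrative file name (RGB night image)
      R = double(frame(:,:,1)); G = double(frame(:,:,2)); B = double(frame(:,:,3));

      glow   = (R > 200) & (R > 1.5*G) & (R > 1.5*B);   % bright, red-dominated pixels (assumed rule)
      metric = nnz(glow);                               % incandescence index: count of glowing pixels
      fprintf('Incandescent pixels: %d (%.2f%% of frame)\n', metric, 100*metric/numel(glow));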

  20. The Realization of Drilling Fault Diagnosis Based on Hybrid Programming with Matlab and VB

    NASA Astrophysics Data System (ADS)

    Wang, Jiangping; Hu, Yingcai

    This paper presents a method that uses hybrid programming with Matlab and VB, based on ActiveX, to design a system for drilling accident prediction and diagnosis, so that the powerful calculation and graphical display functions of Matlab are fully combined with the visual interface development of VB. The main interface of the diagnosis system is written in VB, and the analysis and fault diagnosis are implemented with the neural network toolboxes in Matlab. The system has a favorable interactive interface, and validation against fault examples shows that the diagnosis results are feasible and can meet the demands of drilling accident prediction and diagnosis.
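
    The abstract does not detail the network; as orientation only, a pattern-recognition network of the kind the Matlab neural network toolbox provides could be trained on drilling features as sketched below (the feature matrix, labels and layer size are placeholders, not the authors' design).

      % Hedged sketch: small pattern-recognition network for fault classification.
      X = rand(6, 200);                     % placeholder: 6 drilling features, 200 samples
      T = full(ind2vec(randi(3, 1, 200)));  % placeholder: one-hot labels for 3 fault classes
      net = patternnet(10);                 % one hidden layer with 10 neurons
      net = train(net, X, T);               % train with default settings
      Y = net(X);                           % network outputs (class scores)
      predictedClass = vec2ind(Y);          % predicted fault class per sample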

  1. SBEToolbox: A Matlab Toolbox for Biological Network Analysis

    PubMed Central

    Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J.

    2013-01-01

    We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases. PMID:24027418
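
    For orientation, the kind of centrality calculation such a toolbox automates can also be done with base MATLAB graph objects; the adjacency matrix below is a made-up example, not SBEToolbox code.

      % Hedged sketch: degree and closeness centrality from an adjacency matrix.
      A = [0 1 1 0 0;
           1 0 1 1 0;
           1 1 0 1 1;
           0 1 1 0 1;
           0 0 1 1 0];                      % small undirected example network
      G   = graph(A);                       % graph object (R2015b or later)
      deg = centrality(G, 'degree');        % degree centrality of each node
      clo = centrality(G, 'closeness');     % closeness centrality of each node
      table((1:5)', deg, clo, 'VariableNames', {'Node', 'Degree', 'Closeness'})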

  2. SBEToolbox: A Matlab Toolbox for Biological Network Analysis.

    PubMed

    Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J

    2013-01-01

    We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases.

  3. ESF-X: a low-cost modular experiment computer for space flight experiments

    NASA Astrophysics Data System (ADS)

    Sell, Steven; Zapetis, Joseph; Littlefield, Jim; Vining, Joanne

    2004-08-01

    The high cost associated with spaceflight research often compels experimenters to scale back their research goals significantly purely for budgetary reasons; among experiment systems, control and data collection electronics are a major contributor to total project cost. ESF-X was developed as an architecture demonstration in response to this need: it is a highly capable, radiation-protected experiment support computer, designed to be configurable on demand to each investigator's particular experiment needs, and operational in LEO for missions lasting up to several years (e.g., ISS EXPRESS) without scheduled service or maintenance. ESF-X can accommodate up to 255 data channels (I/O, A/D, D/A, etc.), allocated per customer request, with data rates up to 40kHz. Additionally, ESF-X can be programmed using the graphical block-diagram based programming languages Simulink and MATLAB. This represents a major cost saving opportunity for future investigators, who can now obtain a customized, space-qualified experiment controller at steeply reduced cost compared to 'new' design, and without the performance compromises associated with using preexisting 'generic' systems. This paper documents the functional benchtop prototype, which utilizes a combination of COTS and space-qualified components, along with unit-gravity-specific provisions appropriate to laboratory environment evaluation of the ESF-X design concept and its physical implementation.

  4. High-Speed GPU-Based Fully Three-Dimensional Diffuse Optical Tomographic System

    PubMed Central

    Saikia, Manob Jyoti; Kanhirodan, Rajan; Mohan Vasu, Ram

    2014-01-01

    We have developed a graphics processor unit (GPU-) based high-speed fully 3D system for diffuse optical tomography (DOT). The reduction in execution time of 3D DOT algorithm, a severely ill-posed problem, is made possible through the use of (1) an algorithmic improvement that uses Broyden approach for updating the Jacobian matrix and thereby updating the parameter matrix and (2) the multinode multithreaded GPU and CUDA (Compute Unified Device Architecture) software architecture. Two different GPU implementations of DOT programs are developed in this study: (1) conventional C language program augmented by GPU CUDA and CULA routines (C GPU), (2) MATLAB program supported by MATLAB parallel computing toolkit for GPU (MATLAB GPU). The computation time of the algorithm on host CPU and the GPU system is presented for C and Matlab implementations. The forward computation uses finite element method (FEM) and the problem domain is discretized into 14610, 30823, and 66514 tetrahedral elements. The reconstruction time, so achieved for one iteration of the DOT reconstruction for 14610 elements, is 0.52 seconds for a C based GPU program for 2-plane measurements. The corresponding MATLAB based GPU program took 0.86 seconds. The maximum number of reconstructed frames so achieved is 2 frames per second. PMID:24891848
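
    The MATLAB GPU route mentioned above relies on the Parallel Computing Toolbox; a generic sketch of that pattern (not the authors' DOT code, and assuming a supported NVIDIA GPU) is:

      % Hedged sketch: offload a dense linear-algebra step to the GPU.
      A  = rand(2000, 'single');            % system matrix on the host
      b  = rand(2000, 1, 'single');
      Ag = gpuArray(A);                     % copy operands to GPU memory
      bg = gpuArray(b);
      xg = Ag \ bg;                         % solve on the GPU
      x  = gather(xg);                      % bring the result back to the host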

  5. High-Speed GPU-Based Fully Three-Dimensional Diffuse Optical Tomographic System.

    PubMed

    Saikia, Manob Jyoti; Kanhirodan, Rajan; Mohan Vasu, Ram

    2014-01-01

    We have developed a graphics processor unit (GPU-) based high-speed fully 3D system for diffuse optical tomography (DOT). The reduction in execution time of 3D DOT algorithm, a severely ill-posed problem, is made possible through the use of (1) an algorithmic improvement that uses Broyden approach for updating the Jacobian matrix and thereby updating the parameter matrix and (2) the multinode multithreaded GPU and CUDA (Compute Unified Device Architecture) software architecture. Two different GPU implementations of DOT programs are developed in this study: (1) conventional C language program augmented by GPU CUDA and CULA routines (C GPU), (2) MATLAB program supported by MATLAB parallel computing toolkit for GPU (MATLAB GPU). The computation time of the algorithm on host CPU and the GPU system is presented for C and Matlab implementations. The forward computation uses finite element method (FEM) and the problem domain is discretized into 14610, 30823, and 66514 tetrahedral elements. The reconstruction time, so achieved for one iteration of the DOT reconstruction for 14610 elements, is 0.52 seconds for a C based GPU program for 2-plane measurements. The corresponding MATLAB based GPU program took 0.86 seconds. The maximum number of reconstructed frames so achieved is 2 frames per second.

  6. Quantitative analysis of multiple biokinetic models using a dynamic water phantom: A feasibility study

    PubMed Central

    Chiang, Fu-Tsai; Li, Pei-Jung; Chung, Shih-Ping; Pan, Lung-Fa; Pan, Lung-Kwang

    2016-01-01

    This study analyzed multiple biokinetic models using a dynamic water phantom. The phantom was custom-made with acrylic materials to model metabolic mechanisms in the human body. It had 4 spherical chambers of different sizes, connected by 8 ditches to form a complex and adjustable water loop. An infusion pole and a drain pole each connected the chambers to an auxiliary silicone hose. The radioactive compound solution (Tc-99m-MDP labeled) formed a sealed and static water loop inside the phantom. As clean feed water was infused to replace the original solution, the system mimicked metabolic mechanisms for data acquisition. Five cases with different water loop settings were tested and analyzed, with case settings changed by controlling valve poles located in the ditches. The phantom could also be changed from model A to model B by altering its vertical configuration. The phantom was surveyed with a clinical gamma camera to determine the time-dependent intensity of every chamber. The recorded counts per pixel in each chamber were analyzed and normalized to compare with theoretical estimations from the MATLAB program. Every preset case was represented by uniquely defined, time-dependent, simultaneous differential equations, and a corresponding MATLAB program optimized the solutions by comparing theoretical calculations and practical measurements. A dimensionless agreement (AT) index was recommended to evaluate the comparison in each case. ATs varied from 5.6 to 48.7 over the 5 cases, indicating that this work presented an acceptable feasibility study. PMID:27286096
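
    The chamber activities in such a phantom follow coupled first-order washout equations; a much-simplified two-chamber sketch of how such a system can be integrated in MATLAB (the rate constants and structure are made up, not the authors' model):

      % Hedged sketch: two-chamber washout model solved with ode45.
      k12   = 0.05;                         % transfer rate, chamber 1 -> 2 (1/min), made up
      k2out = 0.02;                         % clearance rate from chamber 2 (1/min), made up
      rhs = @(t, A) [ -k12*A(1);
                       k12*A(1) - k2out*A(2) ];
      A0 = [100; 0];                        % initial activities (arbitrary units)
      [t, A] = ode45(rhs, [0 180], A0);     % simulate 180 minutes
      plot(t, A); xlabel('time (min)'); ylabel('activity (a.u.)');
      legend('chamber 1', 'chamber 2');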

  7. OXlearn: a new MATLAB-based simulation tool for connectionist models.

    PubMed

    Ruh, Nicolas; Westermann, Gert

    2009-11-01

    OXlearn is a free, platform-independent MATLAB toolbox in which standard connectionist neural network models can be set up, run, and analyzed by means of a user-friendly graphical interface. Due to its seamless integration with the MATLAB programming environment, the inner workings of the simulation tool can be easily inspected and/or extended using native MATLAB commands or components. This combination of usability, transparency, and extendability makes OXlearn an efficient tool for the implementation of basic research projects or the prototyping of more complex research endeavors, as well as for teaching. Both the MATLAB toolbox and a compiled version that does not require access to MATLAB can be downloaded from http://psych.brookes.ac.uk/oxlearn/.

  8. Quantification of larval zebrafish motor function in multiwell plates using open-source MATLAB applications.

    PubMed

    Zhou, Yangzhong; Cattley, Richard T; Cario, Clinton L; Bai, Qing; Burton, Edward A

    2014-07-01

    This article describes a method to quantify the movements of larval zebrafish in multiwell plates, using the open-source MATLAB applications LSRtrack and LSRanalyze. The protocol comprises four stages: generation of high-quality, flatly illuminated video recordings with exposure settings that facilitate object recognition; analysis of the resulting recordings using tools provided in LSRtrack to optimize tracking accuracy and motion detection; analysis of tracking data using LSRanalyze or custom MATLAB scripts; and implementation of validation controls. The method is reliable, automated and flexible, requires <1 h of hands-on work for completion once optimized and shows excellent signal:noise characteristics. The resulting data can be analyzed to determine the following: positional preference; displacement, velocity and acceleration; and duration and frequency of movement events and rest periods. This approach is widely applicable to the analysis of spontaneous or stimulus-evoked zebrafish larval neurobehavioral phenotypes resulting from a broad array of genetic and environmental manipulations, in a multiwell plate format suitable for high-throughput applications.
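
    The displacement, velocity and acceleration measures listed above reduce to finite differences on the per-well position traces; a generic sketch with a synthetic track (plain MATLAB, not LSRanalyze code):

      % Hedged sketch: kinematic measures from one larva's position track.
      t  = (0:0.1:60)';                        % placeholder 10 Hz timebase (s)
      xy = cumsum(0.05*randn(numel(t), 2));    % placeholder random-walk track (mm)
      step = sqrt(sum(diff(xy).^2, 2));        % frame-to-frame displacement (mm)
      vel  = step ./ diff(t);                  % speed (mm/s)
      acc  = diff(vel) ./ diff(t(2:end));      % acceleration (mm/s^2)
      moving = vel > 0.5;                      % movement events above an arbitrary threshold
      fprintf('total distance %.1f mm, time moving %.1f s\n', ...
              sum(step), sum(moving)*median(diff(t)));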

  9. [Application of the mixed programming with Labview and Matlab in biomedical signal analysis].

    PubMed

    Yu, Lu; Zhang, Yongde; Sha, Xianzheng

    2011-01-01

    This paper introduces the method of mixed programming with Labview and Matlab, and applies this method in a pulse wave pre-processing and feature detecting system. The method has been proved suitable, efficient and accurate, which has provided a new kind of approach for biomedical signal analysis.

  10. Enhancing Student Writing and Computer Programming with LATEX and MATLAB in Multivariable Calculus

    ERIC Educational Resources Information Center

    Sullivan, Eric; Melvin, Timothy

    2016-01-01

    Written communication and computer programming are foundational components of an undergraduate degree in the mathematical sciences. All lower-division mathematics courses at our institution are paired with computer-based writing, coding, and problem-solving activities. In multivariable calculus we utilize MATLAB and LATEX to have students explore…

  11. NASA Scientific Data Purchase Project: From Collection to User

    NASA Technical Reports Server (NTRS)

    Nicholson, Lamar; Policelli, Fritz; Fletcher, Rose

    2002-01-01

    NASA's Scientific Data Purchase (SDP) project is currently a $70 million operation managed by the Earth Science Applications Directorate at Stennis Space Center. The SDP project was developed in 1997 to purchase scientific data from commercial sources for distribution to NASA Earth science researchers. Our current data holdings include 8TB of remote sensing imagery consisting of 18 products from 4 companies. Our anticipated data volume is 60 TB by 2004, and we will be receiving new data products from several additional companies. Our current system capacity is 24 TB, expandable to 89 TB. Operations include tasking of new data collections, archive ordering, shipment verification, data validation, distribution, metrics, finances, customer feedback, and technical support. The program has been included in the Stennis Space Center Commercial Remote Sensing ISO 9001 registration since its inception. Our operational system includes automatic quality control checks on data received (with MatLab analysis); internally developed, custom Web-based interfaces that tie into commercial-off-the-shelf software; and an integrated relational database that links and tracks all data through operations. We've distributed nearly 1500 datasets, and almost 18,000 data files have been downloaded from our public web site; on a 10-point scale, our customer satisfaction index is 8.32 at a 23% response level. More information about the SDP is available on our Web site.

  12. MCR Container Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haas, Nicholas Q; Gillen, Robert E; Karnowski, Thomas P

    MathWorks' MATLAB is widely used in academia and industry for prototyping, data analysis, data processing, etc. Many users compile their programs using the MATLAB Compiler to run on workstations/computing clusters via the free MATLAB Compiler Runtime (MCR). The MCR facilitates the execution of code calling Application Programming Interface (API) functions from both base MATLAB and MATLAB toolboxes. In a Linux environment, a sizable number of third-party runtime dependencies (i.e. shared libraries) are necessary. Unfortunately, to the MATLAB community's knowledge, these dependencies are not documented, leaving system administrators and/or end-users to find/install the necessary libraries either from runtime errors caused by missing libraries or by inspecting the header information of Executable and Linkable Format (ELF) libraries of the MCR to determine which ones are missing from the system. To address these shortcomings, Docker Images based on Community Enterprise Operating System (CentOS) 7, a derivative of Redhat Enterprise Linux (RHEL) 7, containing recent (2015-2017) MCR releases and their dependencies were created. These images, along with a provided sample Docker Compose YAML Script, can be used to create a simulated computing cluster where MATLAB Compiler created binaries can be executed using a sample Slurm Workload Manager script.

  13. The Relationship between Gender and Students' Attitude and Experience of Using a Mathematical Software Program (MATLAB)

    ERIC Educational Resources Information Center

    Ocak, Mehmet A.

    2006-01-01

    This correlation study examined the relationship between gender and the students' attitude and prior knowledge of using one of the mathematical software programs (MATLAB). Participants were selected from one community college, one state university and one private college. Students were volunteers from three Calculus I classrooms (one class from…

  14. Image Algebra Matlab language version 2.3 for image processing and compression research

    NASA Astrophysics Data System (ADS)

    Schmalz, Mark S.; Ritter, Gerhard X.; Hayden, Eric

    2010-08-01

    Image algebra is a rigorous, concise notation that unifies linear and nonlinear mathematics in the image domain. Image algebra was developed under DARPA and US Air Force sponsorship at the University of Florida for over 15 years beginning in 1984. Image algebra has been implemented in a variety of programming languages designed specifically to support the development of image processing and computer vision algorithms and software. The University of Florida has been associated with image algebra implementations in the languages FORTRAN, Ada, Lisp, and C++. The latter implementation involved a class library, iac++, that supported image algebra programming in C++. Since image processing and computer vision are generally performed with operands that are array-based, the Matlab™ programming language is ideal for implementing the common subset of image algebra. Objects include sets and set operations, images and operations on images, as well as templates and image-template convolution operations. This implementation, called Image Algebra Matlab (IAM), has been found to be useful for research in data, image, and video compression, as described herein. Due to the widespread acceptance of the Matlab programming language in the computing community, IAM offers exciting possibilities for supporting a large group of users. The control over an object's computational resources provided to the algorithm designer by Matlab means that IAM programs can employ versatile representations for the operands and operations of the algebra, which are supported by the underlying libraries written in Matlab. In a previous publication, we showed how the functionality of IAC++ could be carried forth into a Matlab implementation, and provided practical details of a prototype implementation called IAM Version 1. In this paper, we further elaborate the purpose and structure of image algebra, then present a maturing implementation of Image Algebra Matlab called IAM Version 2.3, which extends the previous implementation of IAM to include polymorphic operations over different point sets, as well as recursive convolution operations and functional composition. We also show how image algebra and IAM can be employed in image processing and compression research, as well as algorithm development and analysis.
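
    Because Matlab operates natively on arrays, the basic image-template operation reduces to ordinary 2-D convolution; the following is plain Matlab for illustration, not IAM code.

      % Hedged sketch: image-template (kernel) convolution with conv2.
      img  = magic(8);                      % small example "image"
      tmpl = [1 0 -1;
              2 0 -2;
              1 0 -1];                      % simple horizontal-gradient template
      out  = conv2(img, tmpl, 'same');      % convolve, keeping the original image size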

  15. The GMT/MATLAB Toolbox

    NASA Astrophysics Data System (ADS)

    Wessel, Paul; Luis, Joaquim F.

    2017-02-01

    The GMT/MATLAB toolbox is a basic interface between MATLAB® (or Octave) and GMT, the Generic Mapping Tools, which allows MATLAB users full access to all GMT modules. Data may be passed between the two programs using intermediate MATLAB structures that organize the metadata needed; these are produced when GMT modules are run. In addition, standard MATLAB matrix data can be used directly as input to GMT modules. The toolbox improves interoperability between two widely used tools in the geosciences and extends the capability of both tools: GMT gains access to the powerful computational capabilities of MATLAB while the latter gains the ability to access specialized gridding algorithms and can produce publication-quality PostScript-based illustrations. The toolbox is available on all platforms and may be downloaded from the GMT website.

  16. Three-dimensional rendering of segmented object using matlab - biomed 2010.

    PubMed

    Anderson, Jeffrey R; Barrett, Steven F

    2010-01-01

    The three-dimensional rendering of microscopic objects is a difficult and challenging task that often requires specialized image processing techniques. Previous work described a semi-automatic segmentation process for fluorescently stained neurons collected as a sequence of slice images with a confocal laser scanning microscope. Once properly segmented, each individual object can be rendered and studied as a three-dimensional virtual object. This paper describes the work associated with the design and development of Matlab files to create three-dimensional images from the segmented object data previously mentioned. Part of the motivation for this work is to integrate both the segmentation and rendering processes into one software application, providing a seamless transition from the segmentation tasks to the rendering and visualization tasks. Previously these tasks were accomplished on two different computer systems, Windows and Linux. This division basically limits the usefulness of the segmentation and rendering applications to those who have both computer systems readily available. The focus of this work is to create custom Matlab image processing algorithms for object rendering and visualization, and to merge these capabilities into the Matlab files that were developed especially for the image segmentation task. The completed Matlab application will contain both the segmentation and rendering processes in a single graphical user interface, or GUI. This process for rendering three-dimensional images in Matlab requires that a sequence of two-dimensional binary images, each representing a cross-sectional slice of the object, be reassembled in a 3D space and covered with a surface. Additional segmented objects can be rendered in the same 3D space. The surface properties of each object can be varied by the user to aid in the study and analysis of the objects. This interactive process becomes a powerful visual tool to study and understand microscopic objects.
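
    Reassembling binary slices into a surface-covered object can be done with base Matlab volume-visualization functions; a minimal sketch with a synthetic object standing in for the segmented neuron data (not the authors' GUI code):

      % Hedged sketch: stack binary slices into a volume and render its surface.
      [x, y, z] = meshgrid(-1:0.05:1);               % synthetic volume grid
      vol = double(x.^2 + y.^2 + z.^2 < 0.5);        % binary "object": a ball
      vol = smooth3(vol);                            % light smoothing before surfacing
      fv  = isosurface(vol, 0.5);                    % extract the bounding surface
      patch(fv, 'FaceColor', 'red', 'EdgeColor', 'none');
      daspect([1 1 1]); camlight; lighting gouraud; view(3);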

  17. Operating a Geiger-Muller Tube Using a PC Sound Card

    ERIC Educational Resources Information Center

    Azooz, A. A.

    2009-01-01

    In this paper, a simple MATLAB-based PC program that enables the computer to function as a replacement for the electronic scaler-counter system associated with a Geiger-Muller (GM) tube is described. The program utilizes the ability of MATLAB to acquire data directly from the computer sound card. The signal from the GM tube is applied to the…
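
    A minimal sketch of the underlying idea (not the article's program): record the sound-card input with an audiorecorder object and count threshold crossings as GM pulses. The threshold and recording duration are assumptions that depend on the hardware.

      % Hedged sketch: count GM-tube pulses arriving on the sound-card input.
      Fs  = 44100;                             % sampling rate (Hz)
      rec = audiorecorder(Fs, 16, 1);          % 16-bit, single-channel recorder
      recordblocking(rec, 10);                 % acquire 10 seconds of signal
      y   = getaudiodata(rec);                 % samples in the range [-1, 1]
      thr = 0.2;                               % pulse threshold (hardware dependent)
      hits = find(diff(abs(y) > thr) == 1);    % rising edges above the threshold
      fprintf('Counted %d pulses in 10 s (%.1f counts per second)\n', ...
              numel(hits), numel(hits)/10);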

  18. Effective approach to spectroscopy and spectral analysis techniques using Matlab

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Lv, Yong

    2017-08-01

    With the development of electronic information technology, computers, and networks, modern education technology has entered a new era, which has had a great impact on the teaching process. Spectroscopy and spectral analysis is an elective course for Optoelectronic Information Science and Engineering. The teaching objective of this course is to master the basic concepts and principles of spectroscopy and spectral analysis and the basic technical means of testing, and then to let students learn how the principles and technology of spectroscopy are used to study the structure and state of materials and how the technology is developing. MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment and fourth-generation programming language developed by MathWorks; MATLAB allows matrix manipulations and plotting of functions and data. Based on teaching practice, this paper summarizes the application of Matlab to the teaching of spectroscopy, which should suit most current multimedia-assisted teaching in schools.

  19. SU-E-T-157: CARMEN: A MatLab-Based Research Platform for Monte Carlo Treatment Planning (MCTP) and Customized System for Planning Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baeza, J.A.; Ureba, A.; Jimenez-Ortega, E.

    Purpose: Although there exist several radiotherapy research platforms, such as CERR, the most widely used and referenced; SlicerRT, which allows treatment plan comparison from various sources; and MMCTP, a full MCTP system; a full MCTP toolset is still needed that provides users complete control of calculation grids, interpolation methods and filters in order to “fairly” compare results from different TPSs, supporting verification with experimental measurements. Methods: This work presents CARMEN, a MatLab-based platform including multicore and GPGPU accelerated functions for loading RT data, designing treatment plans, and evaluating dose matrices and experimental data. CARMEN supports anatomic and functional imaging in DICOM format, as well as RTSTRUCT, RTPLAN and RTDOSE. In addition, it contains numerous tools to accomplish the MCTP process, managing egs4phant and phase-space files. The CARMEN planning mode assists in designing IMRT, VMAT and MERT treatments via both inverse and direct optimization. The evaluation mode contains a comprehensive toolset (e.g. 2D/3D gamma evaluation, difference matrices, profiles, DVH, etc.) to compare datasets from commercial TPS, MC simulations (i.e. 3ddose) and radiochromic film in a user-controlled manner. Results: CARMEN has been validated against commercial RTPs and well-established evaluation tools, showing coherent behavior of its multiple algorithms. Furthermore, the CARMEN platform has been used to generate competitive complex treatments that have been published in comparative studies. Conclusion: A new research-oriented MCTP platform with a customized validation toolset has been presented. Despite being coded in a high-level programming language, CARMEN is agile due to the use of parallel algorithms. The widespread use of MatLab provides straightforward access to CARMEN’s algorithms for most researchers. Similarly, our platform can benefit from the MatLab community's scientific developments, such as filters, registration algorithms, etc. Finally, CARMEN highlights the importance of grid and filtering control in treatment plan comparison.

  20. Blueprint XAS: a Matlab-based toolbox for the fitting and analysis of XAS spectra.

    PubMed

    Delgado-Jaime, Mario Ulises; Mewis, Craig Philip; Kennepohl, Pierre

    2010-01-01

    Blueprint XAS is a new Matlab-based program developed to fit and analyse X-ray absorption spectroscopy (XAS) data, most specifically in the near-edge region of the spectrum. The program is based on a methodology that introduces a novel background model into the complete fit model and that is capable of generating any number of independent fits with minimal introduction of user bias [Delgado-Jaime & Kennepohl (2010), J. Synchrotron Rad. 17, 119-128]. The functions and settings on the five panels of its graphical user interface are designed to suit the needs of near-edge XAS data analyzers. A batch function allows for the setting of multiple jobs to be run with Matlab in the background. A unique statistics panel allows the user to analyse a family of independent fits, to evaluate fit models and to draw statistically supported conclusions. The version introduced here (v0.2) is currently a toolbox for Matlab. Future stand-alone versions of the program will also incorporate several other new features to create a full package of tools for XAS data processing.

  1. UmUTracker: A versatile MATLAB program for automated particle tracking of 2D light microscopy or 3D digital holography data

    NASA Astrophysics Data System (ADS)

    Zhang, Hanqing; Stangner, Tim; Wiklund, Krister; Rodriguez, Alvaro; Andersson, Magnus

    2017-10-01

    We present a versatile and fast MATLAB program (UmUTracker) that automatically detects and tracks particles by analyzing video sequences acquired by either light microscopy or digital in-line holographic microscopy. Our program detects the 2D lateral positions of particles with an algorithm based on the isosceles triangle transform, and reconstructs their 3D axial positions by a fast implementation of the Rayleigh-Sommerfeld model using a radial intensity profile. To validate the accuracy and performance of our program, we first track the 2D position of polystyrene particles using bright field and digital holographic microscopy. Second, we determine the 3D particle position by analyzing synthetic and experimentally acquired holograms. Finally, to highlight the full program features, we profile the microfluidic flow in a 100 μm high flow chamber. This result agrees with computational fluid dynamic simulations. On a regular desktop computer UmUTracker can detect, analyze, and track multiple particles at 5 frames per second for a template size of 201 × 201 in a 1024 × 1024 image. To enhance usability and to make it easy to implement new functions we used object-oriented programming. UmUTracker is suitable for studies related to: particle dynamics, cell localization, colloids and microfluidic flow measurement. Program Files doi: http://dx.doi.org/10.17632/fkprs4s6xp.1 Licensing provisions: Creative Commons by 4.0 (CC by 4.0) Programming language: MATLAB Nature of problem: 3D multi-particle tracking is a common technique in physics, chemistry and biology. However, in terms of accuracy, reliable particle tracking is a challenging task since results depend on sample illumination, particle overlap, motion blur and noise from recording sensors. Additionally, the computational performance is also an issue if, for example, a computationally expensive process is executed, such as axial particle position reconstruction from digital holographic microscopy data. Versatile robust tracking programs handling these concerns and providing a powerful post-processing option are significantly limited. Solution method: UmUTracker is a multi-functional tool to extract particle positions from long video sequences acquired with either light microscopy or digital holographic microscopy. The program provides an easy-to-use graphical user interface (GUI) for both tracking and post-processing that does not require any programming skills to analyze data from particle tracking experiments. UmUTracker first conducts automatic 2D particle detection even under noisy conditions using a novel circle detector based on the isosceles triangle sampling technique with a multi-scale strategy. To reduce the computational load for 3D tracking, it uses an efficient implementation of the Rayleigh-Sommerfeld light propagation model. To analyze and visualize the data, an efficient data analysis step, which can for example show 4D flow visualization using 3D trajectories, is included. Additionally, UmUTracker is easy to modify with user-customized modules due to the object-oriented programming style. Additional comments: Program obtainable from https://sourceforge.net/projects/umutracker/

  2. Cross-species 3D virtual reality toolbox for visual and cognitive experiments.

    PubMed

    Doucet, Guillaume; Gulli, Roberto A; Martinez-Trujillo, Julio C

    2016-06-15

    Although simplified visual stimuli, such as dots or gratings presented on homogeneous backgrounds, provide strict control over the stimulus parameters during visual experiments, they fail to approximate visual stimulation in natural conditions. Adoption of virtual reality (VR) in neuroscience research has been proposed to circumvent this problem, by combining strict control of experimental variables and behavioral monitoring within complex and realistic environments. We have created a VR toolbox that maximizes experimental flexibility while minimizing implementation costs. A free VR engine (Unreal 3) has been customized to interface with any control software via text commands, allowing seamless introduction into pre-existing laboratory data acquisition frameworks. Furthermore, control functions are provided for the two most common programming languages used in visual neuroscience: Matlab and Python. The toolbox offers the millisecond time resolution necessary for electrophysiological recordings and is flexible enough to support cross-species usage across a wide range of paradigms. Unlike previously proposed VR solutions whose implementation is complex and time-consuming, our toolbox requires minimal customization or technical expertise to interface with pre-existing data acquisition frameworks as it relies on already familiar programming environments. Moreover, as it is compatible with a variety of display and input devices, identical VR testing paradigms can be used across species, from rodents to humans. This toolbox facilitates the addition of VR capabilities to any laboratory without perturbing pre-existing data acquisition frameworks, or requiring any major hardware changes.

  3. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    PubMed Central

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-01-01

    Background: Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results: mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion: Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet. PMID:16539707

  4. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    PubMed

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet.

  5. Enhanced Modeling of First-Order Plant Equations of Motion for Aeroelastic and Aeroservoelastic Applications

    NASA Technical Reports Server (NTRS)

    Pototzky, Anthony S.

    2010-01-01

    A methodology is described for generating first-order plant equations of motion for aeroelastic and aeroservoelastic applications. The description begins with the process of generating data files representing specialized mode-shapes, such as rigid-body and control surface modes, using both PATRAN and NASTRAN analysis. NASTRAN executes the 146 solution sequence using numerous Direct Matrix Abstraction Program (DMAP) calls to import the mode-shape files and to perform the aeroelastic response analysis. The aeroelastic response analysis calculates and extracts structural frequencies, generalized masses, frequency-dependent generalized aerodynamic force (GAF) coefficients, sensor deflections and load coefficients data as text-formatted data files. The data files are then re-sequenced and re-formatted using a custom written FORTRAN program. The text-formatted data files are stored and coefficients for s-plane equations are fitted to the frequency-dependent GAF coefficients using two Interactions of Structures, Aerodynamics and Controls (ISAC) programs. With tabular files from stored data created by ISAC, MATLAB generates the first-order aeroservoelastic plant equations of motion. These equations include control-surface actuator, turbulence, sensor and load modeling. Altitude varying root-locus plot and PSD plot results for a model of the F-18 aircraft are presented to demonstrate the capability.

  6. Performance Improvements to the Naval Postgraduate School Turbopropulsion Labs Transonic Axially Splittered Rotor

    DTIC Science & Technology

    2013-12-01

    Implementation of the current NPS TPL design procedure, which uses commercial-off-the-shelf (COTS) software (MATLAB, SolidWorks, and ANSYS CFX) for the geometric rendering and analysis, was modified and... CFX is the CFD simulation program in ANSYS Workbench; CFX-Pre is the CFX boundary-conditions and solver-settings module; CFX-Solver is the CFX solver program.

  7. Photon-Limited Information in High Resolution Laser Ranging

    DTIC Science & Technology

    2014-05-28

    This project is an effort under the Information in a Photon (InPho) program at DARPA/DSO; its purpose is to investigate entangled photons generated by spontaneous parametric down-conversion of a chirped source to perform ranging measurements. A Matlab program was written to collect the photon counts from the time-to-digital converter (TDC); this entailed setting up Matlab to talk to the TDC to get the...

  8. The Julia programming language: the future of scientific computing

    NASA Astrophysics Data System (ADS)

    Gibson, John

    2017-11-01

    Julia is an innovative new open-source programming language for high-level, high-performance numerical computing. Julia combines the general-purpose breadth and extensibility of Python, the ease-of-use and numeric focus of Matlab, the speed of C and Fortran, and the metaprogramming power of Lisp. Julia uses type inference and just-in-time compilation to compile high-level user code to machine code on the fly. A rich set of numeric types and extensive numerical libraries are built-in. As a result, Julia is competitive with Matlab for interactive graphical exploration and with C and Fortran for high-performance computing. This talk interactively demonstrates Julia's numerical features and benchmarks Julia against C, C++, Fortran, Matlab, and Python on a spectral time-stepping algorithm for a 1d nonlinear partial differential equation. The Julia code is nearly as compact as Matlab and nearly as fast as Fortran. This material is based upon work supported by the National Science Foundation under Grant No. 1554149.

  9. OPTICON: Pro-Matlab software for large order controlled structure design

    NASA Technical Reports Server (NTRS)

    Peterson, Lee D.

    1989-01-01

    A software package for large order controlled structure design is described and demonstrated. The primary program, called OPTICON, uses both Pro-Matlab M-file routines and selected compiled FORTRAN routines linked into the Pro-Matlab structure. The program accepts structural model information in the form of state-space matrices and performs three basic design functions on the model: (1) open loop analyses; (2) closed loop reduced order controller synthesis; and (3) closed loop stability and performance assessment. The current controller synthesis methods which were implemented in this software are based on the Generalized Linear Quadratic Gaussian theory of Bernstein. In particular, a reduced order Optimal Projection synthesis algorithm based on a homotopy solution method was successfully applied to an experimental truss structure using a 58-state dynamic model. These results are presented and discussed. Current plans to expand the practical size of the design model to several hundred states and the intention to interface Pro-Matlab to a supercomputing environment are discussed.

  10. CMGTooL user's manual

    USGS Publications Warehouse

    Xu, Jingping; Lightsom, Fran; Noble, Marlene A.; Denham, Charles

    2002-01-01

    During the past several years, the sediment transport group in the Coastal and Marine Geology Program (CMGP) of the U. S. Geological Survey has made major revisions to its methodology of processing, analyzing, and maintaining the variety of oceanographic time-series data. First, CMGP completed the transition of its oceanographic time-series database to a self-documenting NetCDF (Rew et al., 1997) data format. Second, CMGP's oceanographic data variety and complexity have been greatly expanded from traditional 2-dimensional, single-point time-series measurements (e.g., electromagnetic current meters, transmissometers) to more advanced 3-dimensional and profiling time-series measurements due to many new acquisitions of modern instruments such as the Acoustic Doppler Current Profiler (RDI, 1996), Acoustic Doppler Velocimeter, Pulse-Coherent Acoustic Doppler Profiler (SonTek, 2001), and Acoustic Backscatter Sensor (Aquatec). In order to accommodate the NetCDF format of data from the new instruments, a software package for processing, analyzing, and visualizing time-series oceanographic data was developed. It is named CMGTooL. The CMGTooL package contains two basic components: a user-friendly GUI for NetCDF file analysis, processing and manipulation; and a data analyzing program library. Most of the routines in the library are stand-alone programs suitable for batch processing. CMGTooL is written in the MATLAB computing language (The Mathworks, 1997), therefore users must have MATLAB installed on their computer in order to use this software package. In addition, MATLAB's Signal Processing Toolbox is also required by some of CMGTooL's routines. Like most MATLAB programs, all CMGTooL codes are compatible with different computing platforms including PC, MAC, and UNIX machines (Note: CMGTooL has been tested on different platforms that run MATLAB 5.2 (Release 10) or lower versions. Some of the commands related to MAC may not be compatible with later releases of MATLAB). The GUI and some of the library routines call low-level NetCDF file I/O, variable and attribute functions. These NetCDF exclusive functions are supported by a MATLAB toolbox named NetCDF, created by Dr. Charles Denham. This toolbox has to be installed in order to use the CMGTooL GUI. The CMGTooL GUI calls several routines that were initially developed by others. The authors would like to acknowledge the following scientists for their ideas and codes: Dr. Rich Signell (USGS), Dr. Chris Sherwood (USGS), and Dr. Bob Beardsley (WHOI). Many special terms that carry special meanings in either MATLAB or the NetCDF Toolbox are used in this manual. Users are encouraged to read the documents of MATLAB and NetCDF for references.

  11. MILAMIN 2 - Fast MATLAB FEM solver

    NASA Astrophysics Data System (ADS)

    Dabrowski, Marcin; Krotkiewski, Marcin; Schmid, Daniel W.

    2013-04-01

    MILAMIN is a free and efficient MATLAB-based two-dimensional FEM solver utilizing unstructured meshes [Dabrowski et al., G-cubed (2008)]. The code consists of steady-state thermal diffusion and incompressible Stokes flow solvers implemented in approximately 200 lines of native MATLAB code. The brevity makes the code easily customizable. An important quality of MILAMIN is speed - it can handle millions of nodes within minutes on one CPU core of a standard desktop computer, and is faster than many commercial solutions. The new MILAMIN 2 allows three-dimensional modeling. It is designed as a set of functional modules that can be used as building blocks for efficient FEM simulations using MATLAB. The utilities are largely implemented as native MATLAB functions. For performance critical parts we use MUTILS - a suite of compiled MEX functions optimized for shared memory multi-core computers. The most important features of MILAMIN 2 are: 1. Modular approach to defining, tracking, and discretizing the geometry of the model 2. Interfaces to external mesh generators (e.g., Triangle, Fade2d, T3D) and mesh utilities (e.g., element type conversion, fast point location, boundary extraction) 3. Efficient computation of the stiffness matrix for a wide range of element types, anisotropic materials and three-dimensional problems 4. Fast global matrix assembly using a dedicated MEX function 5. Automatic integration rules 6. Flexible prescription (spatial, temporal, and field functions) and efficient application of Dirichlet, Neumann, and periodic boundary conditions 7. Treatment of transient and non-linear problems 8. Various iterative and multi-level solution strategies 9. Post-processing tools (e.g., numerical integration) 10. Visualization primitives using MATLAB, and VTK export functions We provide a large number of examples that show how to implement a custom FEM solver using the MILAMIN 2 framework. The examples are MATLAB scripts of increasing complexity that address a given technical topic (e.g., creating meshes, reordering nodes, applying boundary conditions), a given numerical topic (e.g., using various solution strategies, non-linear iterations), or that present a fully-developed solver designed to address a scientific topic (e.g., performing Stokes flow simulations in synthetic porous medium). References: Dabrowski, M., M. Krotkiewski, and D. W. Schmid MILAMIN: MATLAB-based finite element method solver for large problems, Geochem. Geophys. Geosyst., 9, Q04030, 2008
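
    Much of MILAMIN's speed comes from vectorized, triplet-based assembly of the global sparse matrix; the idiom is sketched below on a toy three-element mesh (generic MATLAB, not MILAMIN code).

      % Hedged sketch: triplet-based assembly of a global sparse FEM matrix.
      nNod      = 5;                                     % number of mesh nodes
      ELEM2NODE = [1 2 3; 2 4 3; 3 4 5];                 % connectivity of three triangles
      nel = size(ELEM2NODE, 1);
      npe = size(ELEM2NODE, 2);
      Ke  = repmat([2 -1 -1 -1 2 -1 -1 -1 2], nel, 1);   % toy element matrices, one per row
      I = reshape(repmat(ELEM2NODE, 1, npe)', [], 1);    % global row index of every entry
      J = reshape(repelem(ELEM2NODE, 1, npe)', [], 1);   % global column index of every entry
      K = sparse(I, J, reshape(Ke', [], 1), nNod, nNod); % duplicate triplets are summed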

  12. Wireless photoplethysmographic device for heart rate variability signal acquisition and analysis.

    PubMed

    Reyes, Ivan; Nazeran, Homer; Franco, Mario; Haltiwanger, Emily

    2012-01-01

    The photoplethysmographic (PPG) signal has the potential to aid in the acquisition and analysis of heart rate variability (HRV) signal: a non-invasive quantitative marker of the autonomic nervous system that could be used to assess cardiac health and other physiologic conditions. A low-power wireless PPG device was custom-developed to monitor, acquire and analyze the arterial pulse in the finger. The system consisted of an optical sensor to detect arterial pulse as variations in reflected light intensity, signal conditioning circuitry to process the reflected light signal, a microcontroller to control PPG signal acquisition, digitization and wireless transmission, a receiver to collect the transmitted digital data and convert them back to their analog representations. A personal computer was used to further process the captured PPG signals and display them. A MATLAB program was then developed to capture the PPG data, detect the RR peaks, perform spectral analysis of the PPG data, and extract the HRV signal. A user-friendly graphical user interface (GUI) was developed in LabView to display the PPG data and their spectra. The performance of each module (sensing unit, signal conditioning, wireless transmission/reception units, and graphical user interface) was assessed individually and the device was then tested as a whole. Consequently, PPG data were obtained from five healthy individuals to test the utility of the wireless system. The device was able to reliably acquire the PPG signals from the volunteers. To validate the accuracy of the MATLAB codes, RR peak information from each subject was fed into Kubios software as a text file. Kubios was able to generate a report sheet with the time domain and frequency domain parameters of the acquired data. These features were then compared against those calculated by MATLAB. The preliminary results demonstrate that the prototype wireless device could be used to perform HRV signal acquisition and analysis.
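
    As a rough illustration of the peak-detection and time-domain HRV step described above (generic MATLAB with a synthetic pulse wave, not the authors' program):

      % Hedged sketch: pulse-peak detection and basic time-domain HRV measures.
      Fs  = 100;                                    % sampling rate (Hz), placeholder
      t   = (0:1/Fs:60)';                           % one minute of data
      ppg = sin(2*pi*1.2*t) + 0.05*randn(size(t));  % synthetic ~72 bpm pulse wave
      [~, locs] = findpeaks(ppg, 'MinPeakDistance', 0.5*Fs);   % one peak per beat
      rr    = diff(locs)/Fs*1000;                   % beat-to-beat intervals (ms)
      sdnn  = std(rr);                              % SDNN
      rmssd = sqrt(mean(diff(rr).^2));              % RMSSD
      fprintf('mean RR %.0f ms, SDNN %.1f ms, RMSSD %.1f ms\n', mean(rr), sdnn, rmssd);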

  13. MATLAB as an incentive for student learning of skills

    NASA Astrophysics Data System (ADS)

    Bank, C. G.; Ghent, R. R.

    2016-12-01

    Our course "Computational Geology" takes a holistic approach to student learning by using MATLAB as a focal point to increase students' computing, quantitative reasoning, data analysis, report writing, and teamwork skills. The course, taught since 2007 with recent enrollments around 35 and aimed at 2nd to 3rd-year students, is required for the Geology and Earth and Environmental Systems major programs, and can be chosen as elective in our other programs, including Geophysics. The course is divided into five projects: Pacific plate velocity from the Hawaiian hotspot track, predicting CO2 concentration in the atmosphere, volume of Earth's oceans and sea-level rise, comparing wind directions for Vancouver and Squamish, and groundwater flow. Each project is based on real data, focusses on a mathematical concept (linear interpolation, gradients, descriptive statistics, differential equations) and highlights a programming task (arrays, functions, text file input/output, curve fitting). Working in teams of three, students need to develop a conceptional model to explain the data, and write MATLAB code to visualize the data and match it to their conceptional model. The programming is guided, and students work individually on different aspects (for example: reading the data, fitting a function, unit conversion) which they need to put together to solve the problem. They then synthesize their thought process in a paper. Anecdotal evidence shows that students continue using MATLAB in other courses.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Amanda M.; Daly, Don S.; Willse, Alan R.

    The Automated Microarray Image Analysis (AMIA) Toolbox for MATLAB is a flexible, open-source microarray image analysis tool that allows the user to customize analysis of sets of microarray images. This tool provides several methods of identifying and quantifying spot statistics, as well as extensive diagnostic statistics and images to identify poor data quality or processing. The open nature of this software allows researchers to understand the algorithms used to provide intensity estimates and to modify them easily if desired.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thorne, N; Kassaee, A

    Purpose: To develop an algorithm which can calculate the Full Width Half Maximum (FWHM) of a Proton Pencil Beam from a 2D ion chamber array (IBA Matrixx) with limited spatial resolution (7.6 mm inter-chamber distance). The algorithm would allow beam FWHM measurements to be taken during daily QA without an appreciable time increase. Methods: Combinations of 147 MeV single spot beams were delivered onto an IBA Matrixx and concurrently on EBT3 films for a standard. Data were collected around the Bragg Peak region and evaluated by a custom MATLAB script based on our algorithm using a least-squares analysis. A set of artificial data, modified with random noise, was also processed to test for robustness. Results: The Matlab-processed Matrixx data show acceptable agreement (within 5%) with film measurements, with no single measurement differing by more than 1.8 mm. In cases where the spots show some degree of asymmetry, the algorithm is able to resolve the differences. The algorithm was able to process artificial data with noise up to 15% of the maximum value. Each measurement took less than 3 minutes to perform, indicating that such measurements may be efficiently added to daily QA. Conclusion: The developed algorithm can be implemented in a daily QA program for Proton Pencil Beam Scanning (PBS) beams with the Matrixx to extract spot size and position information. The developed algorithm may be extended to small field sizes in the photon clinic.
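
    The FWHM of a coarsely sampled spot can be estimated by interpolating the measured profile and reading off the width at half of its maximum; a simplified one-dimensional sketch (not the authors' least-squares algorithm, with a synthetic Gaussian profile):

      % Hedged sketch: FWHM of a coarsely sampled beam profile by interpolation.
      x  = -38:7.6:38;                              % chamber positions (mm), 7.6 mm pitch
      y  = exp(-x.^2 / (2*9^2));                    % synthetic Gaussian spot, sigma = 9 mm
      xi = -38:0.1:38;                              % fine grid for interpolation
      yi = interp1(x, y, xi, 'spline');             % upsample the measured profile
      above = find(yi >= max(yi)/2);                % samples at or above half maximum
      fwhm  = xi(above(end)) - xi(above(1));        % width at half maximum (mm)
      fprintf('FWHM = %.1f mm (expected %.1f mm)\n', fwhm, 2*sqrt(2*log(2))*9);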

  16. Case studies on optimization problems in MATLAB and COMSOL multiphysics by means of the livelink

    NASA Astrophysics Data System (ADS)

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    LiveLink for COMSOL is a tool that integrates COMSOL Multiphysics with MATLAB to extend modeling with script programming in the MATLAB environment. It allows the user to utilize the full power of MATLAB and its toolboxes in preprocessing, model manipulation, and post-processing. First, the head script launches COMSOL with MATLAB, defines the initial values of all parameters, refers to the objective function J, and creates and runs the defined optimization task. Once the task is launched, the COMSOL model is called in the iteration loop (from the MATLAB environment through the API interface), changing the defined optimization parameters so that the objective function is minimized, using the fmincon function to find a local or global minimum of a constrained linear or nonlinear multivariable function. Once the minimum is found, the routine returns an exit flag, terminates the optimization and returns the optimized values of the parameters. The cooperation with MATLAB via LiveLink enhances a powerful computational environment with complex multiphysics simulations. The paper introduces the use of LiveLink for COMSOL for chosen case studies in the field of technical cybernetics and bioengineering.
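
    A skeletal version of such an optimization loop, with the COMSOL evaluation replaced by a placeholder objective (the function name evalObjective and the bounds are assumptions for illustration, not the paper's setup):

      % Hedged sketch: constrained minimization with fmincon; evalObjective stands
      % in for the COMSOL model evaluation called through the LiveLink API.
      evalObjective = @(p) (p(1) - 2).^2 + (p(2) + 1).^2;   % placeholder objective J(p)
      p0 = [0; 0];                          % initial parameter values
      lb = [-5; -5];                        % lower bounds on the parameters
      ub = [ 5;  5];                        % upper bounds
      opts = optimoptions('fmincon', 'Display', 'iter');
      [pOpt, Jmin, exitflag] = fmincon(evalObjective, p0, [], [], [], [], ...
                                       lb, ub, [], opts);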

  17. The Waveform Suite: A robust platform for accessing and manipulating seismic waveforms in MATLAB

    NASA Astrophysics Data System (ADS)

    Reyes, C. G.; West, M. E.; McNutt, S. R.

    2009-12-01

    The Waveform Suite, developed at the University of Alaska Geophysical Institute, is an open-source collection of MATLAB classes that provide a means to import, manipulate, display, and share waveform data while ensuring integrity of the data and stability for programs that incorporate them. Data may be imported from a variety of sources, such as Antelope, Winston databases, SAC files, SEISAN, .mat files, or other user-defined file formats. The waveforms being manipulated in MATLAB are isolated from their stored representations, relieving the overlying programs from the responsibility of understanding the specific format in which data is stored or retrieved. The waveform class provides an object oriented framework that simplifies manipulations to waveform data. Playing with data becomes easier because the tedious aspects of data manipulation have been automated. The user is able to change multiple waveforms simultaneously using standard mathematical operators and other syntactically familiar functions. Unlike MATLAB structs or workspace variables, the data stored within waveform class objects are protected from modification, and instead are accessed through standardized functions, such as get and set; these are already familiar to users of MATLAB’s graphical features. This prevents accidental or nonsensical modifications to the data, which in turn simplifies troubleshooting of complex programs. Upgrades to the internal structure of the waveform class are invisible to applications which use it, making maintenance easier. We demonstrate the Waveform Suite’s capabilities on seismic data from Okmok and Redoubt volcanoes. Years of data from Okmok were retrieved from Antelope and Winston databases. Using the Waveform Suite, we built a tremor-location program. Because the program was built on the Waveform Suite, modifying it to operate on real-time data from Redoubt involved only minimal code changes. The utility of the Waveform Suite as a foundation for large developments is demonstrated with the Correlation Toolbox for MATLAB. This mature package contains 50+ codes for carrying out various types of waveform correlation analyses (multiplet analysis, clustering, interferometry, …). This package is greatly strengthened by delegating numerous book-keeping and signal processing tasks to the underlying Waveform Suite. The Waveform Suite’s built-in tools for searching arbitrary directory/file structures are demonstrated with matched video and audio from the recent eruption of Redoubt Volcano. These tools were used to find subsets of photo images corresponding to specific seismic traces. Using Waveform’s audio file routines, matched video and audio were assembled to produce outreach-quality eruption products. The Waveform Suite is not designed as a ready-to-go replacement for more comprehensive packages such as SAC or AH. Rather, it is a suite of classes which provide core time series functionality in a MATLAB environment. It is designed to be a more robust alternative to the numerous ad hoc MATLAB formats that exist. Complex programs may be created upon the Waveform Suite’s framework, while existing programs may be modified to take advantage of the Waveform Suite’s capabilities.

  18. MATLAB-Based Program for Teaching Autocorrelation Function and Noise Concepts

    ERIC Educational Resources Information Center

    Jovanovic Dolecek, G.

    2012-01-01

    An attractive MATLAB-based tool for teaching the basics of autocorrelation function and noise concepts is presented in this paper. This tool enhances traditional in-classroom lecturing. The demonstrations of the tool described here highlight the description of the autocorrelation function (ACF) in a general case for wide-sense stationary (WSS)…

  19. An implementation framework for wastewater treatment models requiring a minimum programming expertise.

    PubMed

    Rodríguez, J; Premier, G C; Dinsdale, R; Guwy, A J

    2009-01-01

    Mathematical modelling in environmental biotechnology has been a traditionally difficult resource to access for researchers and students without programming expertise. The great degree of flexibility required from model implementation platforms to be suitable for research applications restricts their use to expert programmers. More user-friendly software packages, however, do not normally incorporate the necessary flexibility for most research applications. This work presents a methodology based on Excel and Matlab-Simulink for both flexible and accessible implementation of mathematical models by researchers with and without programming expertise. The models are almost fully defined in an Excel file in which the names and values of the state variables and parameters are easily created. This information is automatically processed in Matlab to create the model structure, and almost immediate model simulation is possible after only a minimal amount of Matlab code. The proposed framework also gives researchers with programming expertise a highly flexible and modifiable platform on which to base more complex model implementations. The method takes advantage of structural generalities in most mathematical models of environmental bioprocesses while enabling the integration of advanced elements (e.g. heuristic functions, correlations). The methodology has already been successfully used in a number of research studies.
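
    A generic sketch of the approach (not the authors' framework): parameter names and values are read from a spreadsheet, collected into a struct, and used in an ODE simulation. The file name, sheet name, and the Monod-type model are illustrative assumptions.

      % Generic illustration: model parameters defined in a spreadsheet,
      % simulated as an ODE model in MATLAB.  'model_def.xlsx' is a
      % hypothetical file with columns Name and Value on sheet 'Parameters'.
      def = readtable('model_def.xlsx', 'Sheet', 'Parameters');
      p = struct();
      for i = 1:height(def)
          p.(def.Name{i}) = def.Value(i);      % e.g. p.mu_max, p.Ks, p.Y
      end

      % Simple Monod-type growth model built from the spreadsheet parameters.
      odefun = @(t, x) [ p.mu_max * x(2) / (p.Ks + x(2)) * x(1); ...
                        -p.mu_max * x(2) / (p.Ks + x(2)) * x(1) / p.Y ];
      [t, x] = ode15s(odefun, [0 24], [0.1; 10]);   % biomass, substrate
      plot(t, x); legend('biomass', 'substrate'); xlabel('time (h)');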

  20. MOFA Software for the COBRA Toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griesemer, Marc; Navid, Ali

    MOFA-COBRA is a software code for Matlab that performs Multi-Objective Flux Analysis (MOFA), the solving of linear programming problems with multiple objectives. The leading software package for conducting different types of analyses using constraint-based models is the COBRA Toolbox for Matlab. MOFA-COBRA is an added tool for COBRA that solves multi-objective problems using a novel algorithm.
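
    The record does not describe the novel algorithm itself, so the following is only a generic weighted-sum illustration of a multi-objective flux problem solved with MATLAB's linprog; the toy stoichiometry, bounds, and objectives are made up for the example.

      % Weighted-sum treatment of two competing flux objectives: a generic
      % multi-objective linear-programming illustration (not the MOFA algorithm).
      S  = [1 -1  0;                 % toy stoichiometric matrix, steady state S*v = 0
            0  1 -1];
      lb = [0; 0; 0];                % flux lower bounds
      ub = [10; 10; 10];             % flux upper bounds
      c1 = [0; 0; -1];               % maximize v3 (linprog minimizes, hence the sign)
      c2 = [0; -1; 0];               % maximize v2

      for w = 0:0.25:1               % sweep the trade-off between the objectives
          f = w * c1 + (1 - w) * c2;
          v = linprog(f, [], [], S, zeros(2, 1), lb, ub);
          fprintf('w = %.2f: v = [%s]\n', w, num2str(v.', '%6.2f '));
      end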

  1. Kinematic simulation and analysis of robot based on MATLAB

    NASA Astrophysics Data System (ADS)

    Liao, Shuhua; Li, Jiong

    2018-03-01

    The history of industrial automation is characterized by rapidly changing technology, and the industrial robot is without doubt a special kind of equipment. Using MATLAB's matrix and plotting capabilities, a coordinate system is set up for each link with the D-H parameter method, and the kinematic equations of the structure are formulated. The Robotics Toolbox and GUIDE are then applied jointly to analyze inverse kinematics and to plan and simulate paths, providing a preliminary solution to the positioning problem of a student-built mechanical arm.
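
    For readers unfamiliar with the D-H formulation mentioned above, here is a compact, self-contained forward-kinematics sketch in plain MATLAB (it does not use the Robotics Toolbox or GUIDE code from the paper); the two-link parameters and joint angles are arbitrary example values.

      % Forward kinematics from standard Denavit-Hartenberg parameters,
      % shown for an arbitrary planar two-link arm (a1 = a2 = 1 m).
      function dh_forward_demo()
          q  = [pi/6, pi/4];                       % example joint angles
          DH = [q(1), 0, 1, 0;                     % rows: [theta d a alpha]
                q(2), 0, 1, 0];
          T = eye(4);
          for i = 1:size(DH, 1)
              T = T * dh_transform(DH(i,1), DH(i,2), DH(i,3), DH(i,4));
          end
          disp('End-effector pose:'); disp(T);
      end

      function A = dh_transform(theta, d, a, alpha)
          % Homogeneous transform of one link, standard D-H convention.
          A = [cos(theta) -sin(theta)*cos(alpha)  sin(theta)*sin(alpha) a*cos(theta);
               sin(theta)  cos(theta)*cos(alpha) -cos(theta)*sin(alpha) a*sin(theta);
               0           sin(alpha)             cos(alpha)            d;
               0           0                      0                     1];
      end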

  2. An advanced software suite for the processing and analysis of silicon luminescence images

    NASA Astrophysics Data System (ADS)

    Payne, D. N. R.; Vargas, C.; Hameiri, Z.; Wenham, S. R.; Bagnall, D. M.

    2017-06-01

    Luminescence imaging is a versatile characterisation technique used for a broad range of research and industrial applications, particularly for the field of photovoltaics where photoluminescence and electroluminescence imaging is routinely carried out for materials analysis and quality control. Luminescence imaging can reveal a wealth of material information, as detailed in extensive literature, yet these techniques are often only used qualitatively instead of being utilised to their full potential. Part of the reason for this is the time and effort required for image processing and analysis in order to convert image data to more meaningful results. In this work, a custom built, Matlab based software suite is presented which aims to dramatically simplify luminescence image processing and analysis. The suite includes four individual programs which can be used in isolation or in conjunction to achieve a broad array of functionality, including but not limited to, point spread function determination and deconvolution, automated sample extraction, image alignment and comparison, minority carrier lifetime calibration and iron impurity concentration mapping.

  3. Tip-Clearance Measurement in the First Stage of the Compressor of an Aircraft Engine.

    PubMed

    García, Iker; Przysowa, Radosław; Amorebieta, Josu; Zubia, Joseba

    2016-11-11

    In this article, we report the design of a reflective intensity-modulated optical fiber sensor for blade tip-clearance measurement, and the experimental results for the first stage of a compressor of an aircraft engine operating in real conditions. The tests were performed in a ground test cell, where the engine completed four cycles from idling state to takeoff and back to idling state. During these tests, the rotational speed of the compressor ranged between 7000 and 15,600 rpm. The main component of the sensor is a tetrafurcated bundle of optical fibers, with which the resulting precision of the experimental measurements was 12 µm for a measurement range from 2 to 4 mm. To get this precision the effect of temperature on the optoelectronic components of the sensor was compensated by calibrating the sensor in a climate chamber. A custom-designed MATLAB program was employed to simulate the behavior of the sensor prior to its manufacture.

  4. Tip-Clearance Measurement in the First Stage of the Compressor of an Aircraft Engine

    PubMed Central

    García, Iker; Przysowa, Radosław; Amorebieta, Josu; Zubia, Joseba

    2016-01-01

    In this article, we report the design of a reflective intensity-modulated optical fiber sensor for blade tip-clearance measurement, and the experimental results for the first stage of a compressor of an aircraft engine operating in real conditions. The tests were performed in a ground test cell, where the engine completed four cycles from idling state to takeoff and back to idling state. During these tests, the rotational speed of the compressor ranged between 7000 and 15,600 rpm. The main component of the sensor is a tetrafurcated bundle of optical fibers, with which the resulting precision of the experimental measurements was 12 µm for a measurement range from 2 to 4 mm. To get this precision the effect of temperature on the optoelectronic components of the sensor was compensated by calibrating the sensor in a climate chamber. A custom-designed MATLAB program was employed to simulate the behavior of the sensor prior to its manufacture. PMID:27845709

  5. Frequency Domain Identification Toolbox

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Juang, Jer-Nan; Chen, Chung-Wen

    1996-01-01

    This report documents software written in the MATLAB programming language for performing identification of systems from frequency response functions. MATLAB is a commercial software environment which allows easy manipulation of data matrices and provides intrinsic matrix-function capabilities. Algorithms programmed in this collection of subroutines have been documented elsewhere, but all references are provided in this document. A main feature of this software is the use of matrix fraction descriptions and system realization theory to identify state space models directly from test data. All subroutines have templates for the user to use as guidelines.

  6. Simscape Modeling Verification in the Simulink Development Environment

    NASA Technical Reports Server (NTRS)

    Volle, Christopher E. E.

    2014-01-01

    The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the Ground Subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using Mathworks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time-consuming and costly because rigorous testing and peer reviews must be conducted for each custom-built block. Using Mathworks Simscape tools, modeling time can be reduced, since no custom code would need to be developed. After careful research, the group came to the conclusion that it is feasible to use Simscape's blocks in MatLab's Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.

  7. A Remote Lab for Experiments with a Team of Mobile Robots

    PubMed Central

    Casini, Marco; Garulli, Andrea; Giannitrapani, Antonio; Vicino, Antonio

    2014-01-01

    In this paper, a remote lab for experimenting with a team of mobile robots is presented. Robots are built with the LEGO Mindstorms technology and user-defined control laws can be directly coded in the Matlab programming language and validated on the real system. The lab is versatile enough to be used for both teaching and research purposes. Students can easily go through a number of predefined mobile robotics experiences without having to worry about robot hardware or low-level programming languages. More advanced experiments can also be carried out by uploading custom controllers. The capability to have full control of the vehicles, together with the possibility to define arbitrarily complex environments through the definition of virtual obstacles, makes the proposed facility well suited to quickly test and compare different control laws in a real-world scenario. Moreover, the user can simulate the presence of different types of exteroceptive sensors on board of the robots or a specific communication architecture among the agents, so that decentralized control strategies and motion coordination algorithms can be easily implemented and tested. A number of possible applications and real experiments are presented in order to illustrate the main features of the proposed mobile robotics remote lab. PMID:25192316

  8. A remote lab for experiments with a team of mobile robots.

    PubMed

    Casini, Marco; Garulli, Andrea; Giannitrapani, Antonio; Vicino, Antonio

    2014-09-04

    In this paper, a remote lab for experimenting with a team of mobile robots is presented. Robots are built with the LEGO Mindstorms technology and user-defined control laws can be directly coded in the Matlab programming language and validated on the real system. The lab is versatile enough to be used for both teaching and research purposes. Students can easily go through a number of predefined mobile robotics experiences without having to worry about robot hardware or low-level programming languages. More advanced experiments can also be carried out by uploading custom controllers. The capability to have full control of the vehicles, together with the possibility to define arbitrarily complex environments through the definition of virtual obstacles, makes the proposed facility well suited to quickly test and compare different control laws in a real-world scenario. Moreover, the user can simulate the presence of different types of exteroceptive sensors on board of the robots or a specific communication architecture among the agents, so that decentralized control strategies and motion coordination algorithms can be easily implemented and tested. A number of possible applications and real experiments are presented in order to illustrate the main features of the proposed mobile robotics remote lab.

  9. Deterministic modelling and stochastic simulation of biochemical pathways using MATLAB.

    PubMed

    Ullah, M; Schmidt, H; Cho, K H; Wolkenhauer, O

    2006-03-01

    The analysis of complex biochemical networks is conducted in two popular conceptual frameworks for modelling. The deterministic approach requires the solution of ordinary differential equations (ODEs, reaction rate equations) with concentrations as continuous state variables. The stochastic approach involves the simulation of differential-difference equations (chemical master equations, CMEs) with probabilities as variables. This is to generate counts of molecules for chemical species as realisations of random variables drawn from the probability distribution described by the CMEs. Although there are numerous tools available, many of them free, the modelling and simulation environment MATLAB is widely used in the physical and engineering sciences. We describe a collection of MATLAB functions to construct and solve ODEs for deterministic simulation and to implement realisations of CMEs for stochastic simulation using advanced MATLAB coding (Release 14). The program was successfully applied to pathway models from the literature for both cases. The results were compared to implementations using alternative tools for dynamic modelling and simulation of biochemical networks. The aim is to provide a concise set of MATLAB functions that encourage the experimentation with systems biology models. All the script files are available from www.sbi.uni-rostock.de/publications_matlab-paper.html.
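
    As a minimal illustration of a stochastic realisation of a chemical master equation, the following Gillespie-type sketch simulates a single degradation reaction in plain MATLAB; it is generic and does not use the functions distributed by the authors.

      % Gillespie SSA for the single reaction X -> 0 with rate constant k:
      % one realisation of the corresponding chemical master equation.
      k  = 0.1;            % reaction rate constant (1/s)
      x  = 100;            % initial molecule count
      t  = 0;  T = 60;     % current time and stopping time (s)
      ts = t;  xs = x;     % recorded trajectory

      while t < T && x > 0
          a   = k * x;                   % propensity of the reaction
          tau = -log(rand) / a;          % exponential waiting time
          t   = t + tau;
          x   = x - 1;                   % fire the reaction
          ts(end+1) = t;  xs(end+1) = x; %#ok<SAGROW>
      end
      stairs(ts, xs); xlabel('time (s)'); ylabel('molecules of X');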

  10. Applying CBR to machine tool product configuration design oriented to customer requirements

    NASA Astrophysics Data System (ADS)

    Wang, Pengjia; Gong, Yadong; Xie, Hualong; Liu, Yongxian; Nee, Andrew Yehching

    2017-01-01

    Product customization is a trend in the current market-oriented manufacturing environment. However, deduction from customer requirements to design results and evaluation of design alternatives are still heavily reliant on the designer's experience and knowledge. To solve the problem of fuzziness and uncertainty of customer requirements in product configuration, an analysis method based on the grey rough model is presented. The customer requirements can be converted into technical characteristics effectively. In addition, an optimization decision model for product planning is established to help the enterprises select the key technical characteristics under the constraints of cost and time to serve the customer to maximal satisfaction. A new case retrieval approach that combines the self-organizing map and fuzzy similarity priority ratio method is proposed in case-based design. The self-organizing map can reduce the retrieval range and increase the retrieval efficiency, and the fuzzy similarity priority ratio method can evaluate the similarity of cases comprehensively. To ensure that the final case has the best overall performance, an evaluation method of similar cases based on grey correlation analysis is proposed to evaluate similar cases to select the most suitable case. Furthermore, a computer-aided system is developed using MATLAB GUI to assist the product configuration design. The actual example and result on an ETC series machine tool product show that the proposed method is effective, rapid and accurate in the process of product configuration. The proposed methodology provides a detailed instruction for the product configuration design oriented to customer requirements.

  11. visPIG--a web tool for producing multi-region, multi-track, multi-scale plots of genetic data.

    PubMed

    Scales, Matthew; Jäger, Roland; Migliorini, Gabriele; Houlston, Richard S; Henrion, Marc Y R

    2014-01-01

    We present VISual Plotting Interface for Genetics (visPIG; http://vispig.icr.ac.uk), a web application to produce multi-track, multi-scale, multi-region plots of genetic data. visPIG has been designed to allow users not well versed with mathematical software packages and/or programming languages such as R, Matlab®, Python, etc., to integrate data from multiple sources for interpretation and to easily create publication-ready figures. While web tools such as the UCSC Genome Browser or the WashU Epigenome Browser allow custom data uploads, such tools are primarily designed for data exploration. This is also true for the desktop-run Integrative Genomics Viewer (IGV). Other locally run data visualisation software such as Circos require significant computer skills of the user. The visPIG web application is a menu-based interface that allows users to upload custom data tracks and set track-specific parameters. Figures can be downloaded as PDF or PNG files. For sensitive data, the underlying R code can also be downloaded and run locally. visPIG is multi-track: it can display many different data types (e.g association, functional annotation, intensity, interaction, heat map data,…). It also allows annotation of genes and other custom features in the plotted region(s). Data tracks can be plotted individually or on a single figure. visPIG is multi-region: it supports plotting multiple regions, be they kilo- or megabases apart or even on different chromosomes. Finally, visPIG is multi-scale: a sub-region of particular interest can be 'zoomed' in. We describe the various features of visPIG and illustrate its utility with examples. visPIG is freely available through http://vispig.icr.ac.uk under a GNU General Public License (GPLv3).

  12. Integrating products of Bessel functions with an additional exponential or rational factor

    NASA Astrophysics Data System (ADS)

    Van Deun, Joris; Cools, Ronald

    2008-04-01

    We provide two MATLAB programs to compute integrals of the form $\int_0^\infty x^m e^{-cx} \prod_{i=1}^{k} J_{\nu_i}(a_i x)\,\mathrm{d}x$ and $\int_0^\infty \frac{x^m}{r+x} \prod_{i=1}^{k} J_{\nu_i}(a_i x)\,\mathrm{d}x$, with $J_{\nu_i}(x)$ the Bessel function of the first kind and (real) order $\nu_i$. The parameter $m$ is a real number such that $\sum_i \nu_i + m > -1$ (to assure integrability near zero), $r$ is real, and the numbers $c$ and $a_i$ are all strictly positive. The program can deliver accurate error estimates. Program summary: Program title: BESSELINTR, BESSELINTC Catalogue identifier: AEAH_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAH_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 1601 No. of bytes in distributed program, including test data, etc.: 13 161 Distribution format: tar.gz Programming language: Matlab (version ⩾6.5), Octave (version ⩾2.1.69) Computer: All supporting Matlab or Octave Operating system: All supporting Matlab or Octave RAM: For k Bessel functions our program needs approximately (500+140k) double precision variables Classification: 4.11 Nature of problem: The problem consists in integrating an arbitrary product of Bessel functions with an additional rational or exponential factor over a semi-infinite interval. Difficulties arise from the irregular oscillatory behaviour and the possible slow decay of the integrand, which prevents truncation at a finite point. Solution method: The interval of integration is split into a finite and infinite part. The integral over the finite part is computed using Gauss-Legendre quadrature. The integrand on the infinite part is approximated using asymptotic expansions and this approximation is integrated exactly with the aid of the upper incomplete gamma function. In the case where a rational factor is present, this factor is first expanded in a Taylor series around infinity. Restrictions: Some (and eventually all) numerical accuracy is lost when one or more of the parameters r, c, a or ν grow very large, or when r becomes small. Running time: Less than 0.02 s for a simple problem (two Bessel functions, small parameters), a few seconds for a more complex problem (more than six Bessel functions, large parameters), in Matlab 7.4 (R2007a) on a 2.4 GHz AMD Opteron Processor 250. References: J. Van Deun, R. Cools, Algorithm 858: Computing infinite range integrals of an arbitrary product of Bessel functions, ACM Trans. Math. Software 32 (4) (2006) 580-596.
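
    The calling sequences of BESSELINTR and BESSELINTC are not reproduced here, but the exponential case with a single Bessel factor can be sanity-checked in MATLAB against a known closed form using the built-in integral and besselj functions:

      % Check the exponential case with a single Bessel factor against the
      % closed form  int_0^inf exp(-c*x)*J_0(a*x) dx = 1/sqrt(c^2 + a^2).
      c = 2;  a = 3;
      numeric = integral(@(x) exp(-c*x) .* besselj(0, a*x), 0, Inf);
      exact   = 1 / sqrt(c^2 + a^2);
      fprintf('numeric = %.12f  exact = %.12f  diff = %.1e\n', ...
              numeric, exact, abs(numeric - exact));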

  13. OMPC: an Open-Source MATLAB-to-Python Compiler.

    PubMed

    Jurica, Peter; van Leeuwen, Cees

    2009-01-01

    Free access to scientific information facilitates scientific progress. Open-access scientific journals are a first step in this direction; a further step is to make auxiliary and supplementary materials that accompany scientific publications, such as methodological procedures and data-analysis tools, open and accessible to the scientific community. To this purpose it is instrumental to establish a software base, which will grow toward a comprehensive free and open-source language of technical and scientific computing. Endeavors in this direction are met with an important obstacle. MATLAB®, the predominant computation tool in many fields of research, is a closed-source commercial product. To facilitate the transition to an open computation platform, we propose the Open-source MATLAB®-to-Python Compiler (OMPC), a platform that uses syntax adaptation and emulation to allow transparent import of existing MATLAB® functions into Python programs. The imported MATLAB® modules will run independently of MATLAB®, relying on Python's numerical and scientific libraries. Python offers a stable and mature open source platform that, in many respects, surpasses commonly used, expensive commercial closed source packages. The proposed software will therefore facilitate the transparent transition towards a free and general open-source lingua franca for scientific computation, while enabling access to the existing methods and algorithms of technical computing already available in MATLAB®. OMPC is available at http://ompc.juricap.com.

  14. The Biopsychology-Toolbox: a free, open-source Matlab-toolbox for the control of behavioral experiments.

    PubMed

    Rose, Jonas; Otto, Tobias; Dittrich, Lars

    2008-10-30

    The Biopsychology-Toolbox is a free, open-source Matlab-toolbox for the control of behavioral experiments. The major aim of the project was to provide a set of basic tools that allow programming novices to control basic hardware used for behavioral experimentation without limiting the power and flexibility of the underlying programming language. The modular design of the toolbox allows porting of parts as well as entire paradigms between different types of hardware. In addition to the toolbox, this project offers a platform for the exchange of functions, hardware solutions and complete behavioral paradigms.

  15. Operating a Geiger Müller tube using a PC sound card

    NASA Astrophysics Data System (ADS)

    Azooz, A. A.

    2009-01-01

    In this paper, a simple MATLAB-based PC program that enables the computer to function as a replacement for the electronic scaler-counter system associated with a Geiger-Müller (GM) tube is described. The program utilizes the ability of MATLAB to acquire data directly from the computer sound card. The signal from the GM tube is applied to the computer sound card via the line-in port. All standard GM experiments, pulse shape and statistical analysis experiments can be carried out using this system. A new visual demonstration of dead time effects is also presented.
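
    A rough sketch of the sound-card acquisition idea using MATLAB's built-in audiorecorder; the recording length, threshold, and pulse polarity are arbitrary assumptions rather than the values used in the paper.

      % Acquire a few seconds of the GM pulse train through the sound card's
      % line-in and count pulses by simple threshold crossing.
      fs  = 44100;                        % sampling rate (Hz)
      rec = audiorecorder(fs, 16, 1);     % 16-bit, single channel
      recordblocking(rec, 10);            % record for 10 s
      y   = getaudiodata(rec);

      thr    = 0.2;                               % arbitrary pulse threshold
      above  = y > thr;
      pulses = sum(diff(double(above)) == 1);     % count rising edges
      fprintf('counted %d pulses in 10 s (%.1f counts/s)\n', pulses, pulses/10);
      plot((0:numel(y)-1)/fs, y); xlabel('time (s)'); ylabel('amplitude');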

  16. Intelligent traffic lights based on MATLAB

    NASA Astrophysics Data System (ADS)

    Nie, Ying

    2018-04-01

    In this paper, an intelligent traffic light system based on MATLAB is described. Camera photographs are converted into digital signals in MATLAB, and road traffic is classified into three congestion levels: heavy, moderate and light. Through MCU programming, different roads are then assigned different delay times. This method saves time and resources and thereby reduces road congestion.

  17. GRAFLAB 2.3 for UNIX - A MATLAB database, plotting, and analysis tool: User's guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunn, W.N.

    1998-03-01

    This report is a user's manual for GRAFLAB, which is a new database, analysis, and plotting package that has been written entirely in the MATLAB programming language. GRAFLAB is currently used for data reduction, analysis, and archival. GRAFLAB was written to replace GRAFAID, which is a FORTRAN database, analysis, and plotting package that runs on VAX/VMS.

  18. Massively parallel data processing for quantitative total flow imaging with optical coherence microscopy and tomography

    NASA Astrophysics Data System (ADS)

    Sylwestrzak, Marcin; Szlag, Daniel; Marchand, Paul J.; Kumar, Ashwin S.; Lasser, Theo

    2017-08-01

    We present an application of massively parallel processing of quantitative flow measurement data acquired using spectral optical coherence microscopy (SOCM). The need for massive signal processing of these particular datasets has been a major hurdle for many applications based on SOCM. In view of this difficulty, we implemented and adapted quantitative total flow estimation algorithms on graphics processing units (GPU) and achieved a 150-fold reduction in processing time when compared to a former CPU implementation. As SOCM constitutes the microscopy counterpart to spectral optical coherence tomography (SOCT), the developed processing procedure can be applied to both imaging modalities. We present the developed DLL library integrated in MATLAB (with an example) and have included the source code for adaptations and future improvements. Catalogue identifier: AFBT_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AFBT_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU GPLv3 No. of lines in distributed program, including test data, etc.: 913552 No. of bytes in distributed program, including test data, etc.: 270876249 Distribution format: tar.gz Programming language: CUDA/C, MATLAB. Computer: Intel x64 CPU, GPU supporting CUDA technology. Operating system: 64-bit Windows 7 Professional. Has the code been vectorized or parallelized?: Yes, CPU code has been vectorized in MATLAB, CUDA code has been parallelized. RAM: Dependent on the user's parameters, typically between several gigabytes and several tens of gigabytes Classification: 6.5, 18. Nature of problem: Speed-up of data processing in optical coherence microscopy Solution method: Utilization of GPU for massively parallel data processing Additional comments: Compiled DLL library with source code and documentation, example of utilization (MATLAB script with raw data) Running time: 1.8 s for one B-scan (150 × faster in comparison to the CPU data processing time)
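
    The reported speed-up comes from a custom CUDA/C DLL called from MATLAB; as a loose illustration of the same idea of off-loading array processing to the GPU, the sketch below uses MATLAB's own gpuArray (Parallel Computing Toolbox) rather than the distributed library.

      % Off-load a batch of FFTs (a typical step in OCT/OCM processing) to the
      % GPU and compare against the CPU.  Requires a CUDA-capable GPU and the
      % Parallel Computing Toolbox.
      spectra = rand(2048, 4096, 'single');        % synthetic spectral frames

      tic;  cpuResult = abs(fft(spectra));  tCPU = toc;

      g = gpuArray(spectra);
      tic;  gpuResult = gather(abs(fft(g)));  tGPU = toc;

      fprintf('CPU: %.3f s   GPU: %.3f s   max diff: %.2g\n', ...
              tCPU, tGPU, max(abs(cpuResult(:) - gpuResult(:))));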

  19. Extracting the Data From the LCM vk4 Formatted Output File

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, James G.

    These are slides about extracting the data from the LCM vk4 formatted output file. The following is covered: vk4 file produced by Keyence VK Software, custom analysis, no off-the-shelf way to read the file, reading the binary data in a vk4 file, various offsets in decimal lines, finding the height image data directly in MATLAB, binary output beginning of height image data, color image information, color image binary data, color image decimal and binary data, MATLAB code to read vk4 file (choose a file, read the file, compute offsets, read optical image, laser optical image, read and compute laser intensity image, read height image, timing, display height image, display laser intensity image, display RGB laser optical images, display RGB optical images, display beginning data and save images to workspace, gamma correction subroutine), reading intensity from the vk4 file, linear in the low range, linear in the high range, gamma correction for vk4 files, computing the gamma intensity correction, observations.
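
    A generic sketch of the fopen/fread pattern behind such a binary reader; the file name, offsets, and field sizes below are hypothetical placeholders, not the actual vk4 layout documented in the slides.

      % Generic pattern for pulling a data block out of a binary file at a
      % known byte offset.  File name, offsets and sizes are placeholders only.
      fid = fopen('example.vk4', 'r');            % hypothetical file name
      assert(fid > 0, 'could not open file');

      header    = fread(fid, 12, 'uint8');        % e.g. a fixed-size header
      tblOffset = fread(fid, 1, 'uint32');        % e.g. one offset-table entry

      fseek(fid, tblOffset, 'bof');               % jump to the data block
      width  = fread(fid, 1, 'uint32');
      height = fread(fid, 1, 'uint32');
      img    = fread(fid, width * height, 'uint16');
      fclose(fid);

      img = reshape(img, width, height).';        % row-major storage assumed
      imagesc(img); axis image; colorbar;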

  20. A general spectral method for the numerical simulation of one-dimensional interacting fermions

    NASA Astrophysics Data System (ADS)

    Clason, Christian; von Winckel, Gregory

    2012-08-01

    This software implements a general framework for the direct numerical simulation of systems of interacting fermions in one spatial dimension. The approach is based on a specially adapted nodal spectral Galerkin method, where the basis functions are constructed to obey the antisymmetry relations of fermionic wave functions. An efficient Matlab program for the assembly of the stiffness and potential matrices is presented, which exploits the combinatorial structure of the sparsity pattern arising from this discretization to achieve optimal run-time complexity. This program allows the accurate discretization of systems with multiple fermions subject to arbitrary potentials, e.g., for verifying the accuracy of multi-particle approximations such as Hartree-Fock in the few-particle limit. It can be used for eigenvalue computations or numerical solutions of the time-dependent Schrödinger equation. The new version includes a Python implementation of the presented approach. New version program summaryProgram title: assembleFermiMatrix Catalogue identifier: AEKO_v1_1 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEKO_v1_1.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 332 No. of bytes in distributed program, including test data, etc.: 5418 Distribution format: tar.gz Programming language: MATLAB/GNU Octave, Python Computer: Any architecture supported by MATLAB, GNU Octave or Python Operating system: Any supported by MATLAB, GNU Octave or Python RAM: Depends on the data Classification: 4.3, 2.2. External routines: Python 2.7+, NumPy 1.3+, SciPy 0.10+ Catalogue identifier of previous version: AEKO_v1_0 Journal reference of previous version: Comput. Phys. Commun. 183 (2012) 405 Does the new version supersede the previous version?: Yes Nature of problem: The direct numerical solution of the multi-particle one-dimensional Schrödinger equation in a quantum well is challenging due to the exponential growth in the number of degrees of freedom with increasing particles. Solution method: A nodal spectral Galerkin scheme is used where the basis functions are constructed to obey the antisymmetry relations of the fermionic wave function. The assembly of these matrices is performed efficiently by exploiting the combinatorial structure of the sparsity patterns. Reasons for new version: A Python implementation is now included. Summary of revisions: Added a Python implementation; small documentation fixes in Matlab implementation. No change in features of the package. Restrictions: Only one-dimensional computational domains with homogeneous Dirichlet or periodic boundary conditions are supported. Running time: Seconds to minutes.

  1. Upgrading Custom Simulink Library Components for Use in Newer Versions of Matlab

    NASA Technical Reports Server (NTRS)

    Stewart, Camiren L.

    2014-01-01

    The Spaceport Command and Control System (SCCS) at Kennedy Space Center (KSC) is a control system for monitoring and launching manned launch vehicles. Simulations of ground support equipment (GSE) and the launch vehicle systems are required throughout the life cycle of SCCS to test software, hardware, and procedures and to train the launch team. The simulations of the GSE at the launch site, in conjunction with off-line processing locations, are developed using Simulink, a piece of Commercial Off-The-Shelf (COTS) software. The simulations that are built are then converted into code and run in a simulation engine called Trick, a Government Off-The-Shelf (GOTS) piece of software developed by NASA. In the world of hardware and software, it is not uncommon to see the products that are utilized get upgraded and patched, or eventually fade away into obsolescence. In the case of the SCCS simulation software, MathWorks, the maker of Matlab, has released a number of stable versions of Simulink since the deployment of the software on the Development Work Stations in the Linux environment (DWLs). The upgraded versions of Simulink have introduced a number of new tools and resources that, if utilized fully and correctly, will save time and resources during the overall development of the GSE simulation and its correlating documentation. Unfortunately, simply importing the already built simulations into the new Matlab environment will not suffice, as it can produce results that differ from those expected in the version currently being utilized. Thus, an upgrade execution plan was developed and executed to fully upgrade the simulation environment to one of the latest versions of Matlab.

  2. Open-source framework for documentation of scientific software written on MATLAB-compatible programming languages

    NASA Astrophysics Data System (ADS)

    Konnik, Mikhail V.; Welsh, James

    2012-09-01

    Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing software code of a numerical simulator makes it difficult to continue to support the code itself. The problem of adequate documentation of astronomical software for adaptive optics simulators may complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for the description of a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates MATLAB M-file comments into C-like comments, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for the framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.

  3. Poster - 09: A MATLAB-based Program for Automated Quality Assurance of a Prostate Brachytherapy Ultrasound System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poon, Justin; Sabondjian, Eric; Sankreacha, Raxa

    Purpose: A robust Quality Assurance (QA) program is essential for prostate brachytherapy ultrasound systems due to the importance of imaging accuracy during treatment and planning. Task Group 128 of the American Association of Physicists in Medicine has recommended a set of QA tests covering grayscale visibility, depth of penetration, axial and lateral resolution, distance measurement, area measurement, volume measurement, and template/electronic grid alignment. Making manual measurements on the ultrasound system can be slow and inaccurate, so a MATLAB program was developed for automation of the described tests. Methods: Test images were acquired using a BK Medical Flex Focus 400 ultrasound scanner and 8848 transducer with the CIRS Brachytherapy QA Phantom – Model 045A. For each test, the program automatically segments the inputted image(s), makes the appropriate measurements, and indicates if the test passed or failed. The program was tested by analyzing two sets of images, where the measurements from the first set were used as baseline values. Results: The program successfully analyzed the images for each test and determined if any action limits were exceeded. All tests passed – the measurements made by the program were consistent and met the requirements outlined by Task Group 128. Conclusions: The MATLAB program we have developed can be used for automated QA of an ultrasound system for prostate brachytherapy. The GUI provides a user-friendly way to analyze images without the need for any manual measurement, potentially removing intra- and inter-user variability for more consistent results.
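
    A generic illustration of the kind of automated measurement such a QA script performs (not the authors' program): two targets in a synthetic phantom image are segmented by thresholding and the distance between their centroids is measured with regionprops; the pixel calibration is an assumed value.

      % Synthetic "phantom" with two bright targets 40 px apart: segment them
      % and measure the centroid-to-centroid distance automatically.
      img = zeros(128);
      img(60:68, 30:38) = 1;                      % target 1
      img(60:68, 70:78) = 1;                      % target 2
      img = img + 0.05 * randn(128);              % a little noise

      bw    = img > 0.5;                          % simple threshold segmentation
      stats = regionprops(bw, 'Centroid');        % Image Processing Toolbox
      c     = vertcat(stats.Centroid);
      dPix  = norm(c(1,:) - c(2,:));              % distance in pixels

      pixelSizeMM = 0.1;                          % assumed calibration (mm/pixel)
      fprintf('measured distance: %.2f mm\n', dPix * pixelSizeMM);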

  4. HEART: an automated beat-to-beat cardiovascular analysis package using Matlab.

    PubMed

    Schroeder, M J Mark J; Perreault, Bill; Ewert, D L Daniel L; Koenig, S C Steven C

    2004-07-01

    A computer program is described for beat-to-beat analysis of cardiovascular parameters from high-fidelity pressure and flow waveforms. The Hemodynamic Estimation and Analysis Research Tool (HEART) is a post-processing analysis software package developed in Matlab that enables scientists and clinicians to document, load, view, calibrate, and analyze experimental data that have been digitally saved in ascii or binary format. Analysis routines include traditional hemodynamic parameter estimates as well as more sophisticated analyses such as lumped arterial model parameter estimation and vascular impedance frequency spectra. Cardiovascular parameter values of all analyzed beats can be viewed and statistically analyzed. An attractive feature of the HEART program is the ability to analyze data with visual quality assurance throughout the process, thus establishing a framework toward which Good Laboratory Practice (GLP) compliance can be obtained. Additionally, the development of HEART on the Matlab platform provides users with the flexibility to adapt or create study specific analysis files according to their specific needs. Copyright 2003 Elsevier Ltd.

  5. Earth Science Curriculum Enrichment Through Matlab!

    NASA Astrophysics Data System (ADS)

    Salmun, H.; Buonaiuto, F. S.

    2016-12-01

    The use of Matlab in Earth Science undergraduate courses in the Department of Geography at Hunter College began as a pilot project in Fall 2008 and has evolved into a significant component of an Advanced Oceanography course, the selected tool for data analysis in other courses, and the main focus of a graduate course for doctoral students at The City University of New York (CUNY) working on research related to geophysical, oceanic and atmospheric dynamics. The primary objectives of these efforts were to enhance the Earth Science curriculum through course-specific applications, to increase undergraduate programming and data analysis skills, and to develop a Matlab users network within the Department and the broader Hunter College and CUNY community. Students have had the opportunity to learn Matlab as a stand-alone course, within an independent study group, or as a laboratory component within related STEM classes. All of these instructional efforts incorporated the use of prepackaged Matlab exercises and a research project. Initial exercises were designed to cover basic scripting and data visualization techniques. Students were provided data and a skeleton script to modify and improve upon based on the laboratory instructions. As students' programming skills increased throughout the semester, more advanced scripting, data mining and data analysis were assigned. In order to illustrate the range of applications within the Earth Sciences, laboratory exercises were constructed around topics selected from the disciplines of Geology, Physics, Oceanography, Meteorology and Climatology. In addition, the research component of the courses included both individual and team projects.

  6. Stress-oriented driver assistance system for electric vehicles.

    PubMed

    Athanasiou, Georgia; Tsotoulidis, Savvas; Mitronikas, Epaminondas; Lymberopoulos, Dimitrios

    2014-01-01

    Stress is a physiological and physical reaction that appears in highly demanding situations and affects human perception and reaction capability. The occurrence of stress events within a highly dynamic road environment could lead to life-threatening situations. With the aim of providing safe and comfortable driving to anxious drivers, a stress-oriented Driver Assistance System (DAS) is proposed in this paper. The DAS is deployed on an electric vehicle. When stress is detected, this novel DAS customizes the driving command signal with respect to the road context. The effectiveness of this novel DAS is verified by simulation in the MATLAB/Simulink environment.

  7. Using the Generic Mapping Tools From Within the MATLAB, Octave and Julia Computing Environments

    NASA Astrophysics Data System (ADS)

    Luis, J. M. F.; Wessel, P.

    2016-12-01

    The Generic Mapping Tools (GMT) is a widely used software infrastructure tool set for analyzing and displaying geoscience data. Its power to analyze and process data and produce publication-quality graphics has made it one of several standard processing toolsets used by a large segment of the Earth and Ocean Sciences. GMT's strengths lie in superior publication-quality vector graphics, geodetic-quality map projections, robust data processing algorithms scalable to enormous data sets, and ability to run under all common operating systems. The GMT tool chest offers over 120 modules sharing a common set of command options, file structures, and documentation. GMT modules are command line tools that accept input and write output, and this design allows users to write scripts in which one module's output becomes another module's input, creating highly customized GMT workflows. With the release of GMT 5, these modules are high-level functions with a C API, potentially allowing users access to high-level GMT capabilities from any programmable environment. Many scientists who use GMT also use other computational tools, such as MATLAB® and its clone Octave. We have built a MATLAB/Octave interface on top of the GMT 5 C API. Thus, MATLAB or Octave now has full access to all GMT modules as well as fundamental input/output of GMT data objects via a MEX function. Internally, the GMT/MATLAB C API defines six high-level composite data objects that handle input and output of data via individual GMT modules. These are data tables, grids, text tables (text/data mixed records), color palette tables, raster images (1-4 color bands), and PostScript. The API is responsible for translating between the six GMT objects and the corresponding native MATLAB objects. References to data arrays are passed if transposing of matrices is not required. The GMT and MATLAB/Octave combination is extremely flexible, letting the user harvest the general numerical and graphical capabilities of both systems, and represents a giant step forward in interoperability between GMT and other software package. We will present examples of the symbiotic benefits of combining these platforms. Two other extensions are also in the works: a nearly finished Julia wrapper and an embryonic Python module. Publication supported by FCT- project UID/GEO/50019/2013 - Instituto D. Luiz
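
    The record does not spell out the wrapper's calling convention; assuming the MEX interface is exposed as a single gmt() function that takes a module name plus an option string and exchanges MATLAB arrays with GMT, a call might look like the sketch below (the module options shown are ordinary GMT syntax).

      % Assumed calling convention (see above): pass a MATLAB matrix to a GMT
      % module through a single gmt() MEX entry point and get the result back.
      x    = (0:0.1:2*pi).';
      tbl  = [x, sin(x)];                % ordinary two-column MATLAB data
      info = gmt('gmtinfo -C', tbl);     % per-column min/max returned to MATLAB
      disp(info);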

  8. Rapid-X - An FPGA Development Toolset Using a Custom Simulink Library for MTCA.4 Modules

    NASA Astrophysics Data System (ADS)

    Prędki, Paweł; Heuer, Michael; Butkowski, Łukasz; Przygoda, Konrad; Schlarb, Holger; Napieralski, Andrzej

    2015-06-01

    The recent introduction of advanced hardware architectures such as the Micro Telecommunications Computing Architecture (MTCA) caused a change in the approach to implementation of control schemes in many fields. The development has been moving away from traditional programming languages ( C/C++), to hardware description languages (VHDL, Verilog), which are used in FPGA development. With MATLAB/Simulink it is possible to describe complex systems with block diagrams and simulate their behavior. Those diagrams are then used by the HDL experts to implement exactly the required functionality in hardware. Both the porting of existing applications and adaptation of new ones require a lot of development time from them. To solve this, Xilinx System Generator, a toolbox for MATLAB/Simulink, allows rapid prototyping of those block diagrams using hardware modelling. It is still up to the firmware developer to merge this structure with the hardware-dependent HDL project. This prevents the application engineer from quickly verifying the proposed schemes in real hardware. The framework described in this article overcomes these challenges, offering a hardware-independent library of components that can be used in Simulink/System Generator models. The components are subsequently translated into VHDL entities and integrated with a pre-prepared VHDL project template. Furthermore, the entire implementation process is run in the background, giving the user an almost one-click path from control scheme modelling and simulation to bit-file generation. This approach allows the application engineers to quickly develop new schemes and test them in real hardware environment. The applications may range from simple data logging or signal generation ones to very advanced controllers. Taking advantage of the Simulink simulation capabilities and user-friendly hardware implementation routines, the framework significantly decreases the development time of FPGA-based applications.

  9. Validation of a Custom-made Software for DQE Assessment in Mammography Digital Detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayala-Dominguez, L.; Perez-Ponce, H.; Brandan, M. E.

    2010-12-07

    This work presents the validation of custom-made software, designed and developed in Matlab, intended for routine evaluation of detective quantum efficiency DQE, according to algorithms described in the IEC 62220-1-2 standard. DQE, normalized noise power spectrum NNPS and pre-sampling modulation transfer function MTF were calculated from RAW images from a GE Senographe DS (FineView disabled) and a Siemens Novation system. The calculated MTF is in close agreement with results obtained with alternative codes: MTF_tool (Maidment), ImageJ plug-in (Perez-Ponce) and MIQuaELa (Ayala). Overall agreement better than approximately 90% was found in MTF; the largest differences were observed at frequencies close to the Nyquist limit. For the measurement of NNPS and DQE, agreement is similar to that obtained in the MTF. These results suggest that the developed software can be used with confidence for image quality assessment.
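
    As a simplified, generic illustration of the edge-based MTF estimation underlying such software (not the IEC 62220-1-2 procedure, which uses a slanted edge and sub-pixel binning), the following plain-MATLAB sketch builds an edge-spread function from a synthetic blurred edge, differentiates it to a line-spread function, and Fourier-transforms it.

      % Simplified edge-method MTF estimate on a synthetic blurred edge
      % (illustrative only; the standard uses a slanted edge and sub-pixel binning).
      n    = 256;  px = 0.05;                      % image size and pixel pitch (mm)
      edgeImg = double(repmat(1:n, n, 1) > n/2);   % ideal vertical edge
      g    = exp(-(-7:7).^2 / (2*2^2));            % 1-D Gaussian blur kernel
      g    = g / sum(g);
      img  = conv2(edgeImg, g, 'same');

      esf = mean(img, 1);                          % edge-spread function
      lsf = diff(esf);                             % line-spread function
      lsf = lsf / sum(lsf);                        % unit area -> MTF(0) = 1
      mtf = abs(fft(lsf, 1024));
      f   = (0:511) / (1024 * px);                 % spatial frequency (cycles/mm)
      plot(f, mtf(1:512)); xlim([0, 1/(2*px)]);
      xlabel('spatial frequency (cycles/mm)'); ylabel('MTF');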

  10. Tool for Analysis and Reduction of Scientific Data

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    The Automated Scheduling and Planning Environment (ASPEN) computer program has been updated to version 3.0. ASPEN as a whole (up to version 2.0) has been summarized, and selected aspects of ASPEN have been discussed in several previous NASA Tech Briefs articles. Restated briefly, ASPEN is a modular, reconfigurable, application software framework for solving batch problems that involve reasoning about time, activities, states, and resources. Applications of ASPEN can include planning spacecraft missions, scheduling of personnel, and managing supply chains, inventories, and production lines. ASPEN 3.0 can be customized for a wide range of applications and for a variety of computing environments that include various central processing units and random-access memories. Domain-specific reasoning modules (e.g., modules for determining orbits for spacecraft) can easily be plugged into ASPEN 3.0. Improvements over other, similar software that have been incorporated into ASPEN 3.0 include a provision for more expressive time-line values, new parsing capabilities afforded by an ASPEN language based on Extensible Markup Language, improved search capabilities, and improved interfaces to other, utility-type software (notably including MATLAB).

  11. Dexterity: A MATLAB-based analysis software suite for processing and visualizing data from tasks that measure arm or forelimb function.

    PubMed

    Butensky, Samuel D; Sloan, Andrew P; Meyers, Eric; Carmel, Jason B

    2017-07-15

    Hand function is critical for independence, and neurological injury often impairs dexterity. To measure hand function in people or forelimb function in animals, sensors are employed to quantify manipulation. These sensors make assessment easier and more quantitative and allow automation of these tasks. While automated tasks improve objectivity and throughput, they also produce large amounts of data that can be burdensome to analyze. We created software called Dexterity that simplifies data analysis of automated reaching tasks. Dexterity is MATLAB software that enables quick analysis of data from forelimb tasks. Through a graphical user interface, files are loaded and data are identified and analyzed. These data can be annotated or graphed directly. Analysis is saved, and the graph and corresponding data can be exported. For additional analysis, Dexterity provides access to custom scripts created by other users. To determine the utility of Dexterity, we performed a study to evaluate the effects of task difficulty on the degree of impairment after injury. Dexterity analyzed two months of data and allowed new users to annotate the experiment, visualize results, and save and export data easily. Previous analysis of tasks was performed with custom data analysis, requiring expertise with analysis software. Dexterity made the tools required to analyze, visualize and annotate data easy to use by investigators without data science experience. Dexterity increases accessibility to automated tasks that measure dexterity by making analysis of large data intuitive, robust, and efficient. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. 31 CFR 1024.220 - Customer identification programs for mutual funds.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... mutual funds. 1024.220 Section 1024.220 Money and Finance: Treasury Regulations Relating to Money and... FUNDS Programs § 1024.220 Customer identification programs for mutual funds. (a) Customer identification program: minimum requirements—(1) In general. A mutual fund must implement a written Customer...

  13. MATLAB Algorithms for Rapid Detection and Embedding of Palindrome and Emordnilap Electronic Watermarks in Simulated Chemical and Biological Image Data

    DTIC Science & Technology

    2004-11-16

    MATLAB Algorithms for Rapid Detection and Embedding of Palindrome and Emordnilap Electronic Watermarks in Simulated Chemical and Biological Image Data. Presented at the Conference on Chemical and Biological Defense Research, held in Hunt Valley, Maryland on 15-17 November 2004. The original document contains color images.

  14. Nuclear Fuel Depletion Analysis Using Matlab Software

    NASA Astrophysics Data System (ADS)

    Faghihi, F.; Nematollahi, M. R.

    Coupled first-order IVPs are frequently used in many parts of engineering and the sciences. In this article, we present a code consisting of three computer programs, written for the Matlab environment, to solve and plot the solutions of first-order coupled stiff or non-stiff IVPs. Some engineering and scientific problems related to IVPs are given, and fuel depletion (production of the 239Pu isotope) in a Pressurized Water Nuclear Reactor (PWR) is computed by the present code.
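
    A small sketch of the kind of coupled first-order IVP the code addresses, solved with MATLAB's stiff solver ode15s; the two-nuclide production/decay chain and its rate constants are illustrative, not the paper's 239Pu depletion model.

      % Toy two-nuclide production/decay chain as a coupled stiff IVP:
      %   dN1/dt = -lambda1*N1 + R,   dN2/dt = lambda1*N1 - lambda2*N2
      lambda1 = 1e-2;  lambda2 = 1e-5;  R = 1.0;   % illustrative constants
      f = @(t, N) [ -lambda1*N(1) + R;
                     lambda1*N(1) - lambda2*N(2) ];

      [t, N] = ode15s(f, [0 1e5], [0; 0]);
      plot(t, N); xlabel('time'); ylabel('number of nuclei');
      legend('N_1', 'N_2', 'Location', 'northwest');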

  15. Pulseq-Graphical Programming Interface: Open source visual environment for prototyping pulse sequences and integrated magnetic resonance imaging algorithm development.

    PubMed

    Ravi, Keerthi Sravan; Potdar, Sneha; Poojar, Pavan; Reddy, Ashok Kumar; Kroboth, Stefan; Nielsen, Jon-Fredrik; Zaitsev, Maxim; Venkatesan, Ramesh; Geethanath, Sairam

    2018-03-11

    To provide a single open-source platform for comprehensive MR algorithm development inclusive of simulations, pulse sequence design and deployment, reconstruction, and image analysis. We integrated the "Pulseq" platform for vendor-independent pulse programming with Graphical Programming Interface (GPI), a scientific development environment based on Python. Our integrated platform, Pulseq-GPI, permits sequences to be defined visually and exported to the Pulseq file format for execution on an MR scanner. For comparison, Pulseq files using either MATLAB only ("MATLAB-Pulseq") or Python only ("Python-Pulseq") were generated. We demonstrated three fundamental sequences on a 1.5 T scanner. Execution times of the three variants of implementation were compared on two operating systems. In vitro phantom images indicate equivalence with the vendor supplied implementations and MATLAB-Pulseq. The examples demonstrated in this work illustrate the unifying capability of Pulseq-GPI. The execution times of all the three implementations were fast (a few seconds). The software is capable of user-interface based development and/or command line programming. The tool demonstrated here, Pulseq-GPI, integrates the open-source simulation, reconstruction and analysis capabilities of GPI Lab with the pulse sequence design and deployment features of Pulseq. Current and future work includes providing an ISMRMRD interface and incorporating Specific Absorption Ratio and Peripheral Nerve Stimulation computations. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. A Simulation Program for Dynamic Infrared (IR) Spectra

    ERIC Educational Resources Information Center

    Zoerb, Matthew C.; Harris, Charles B.

    2013-01-01

    A free program for the simulation of dynamic infrared (IR) spectra is presented. The program simulates the spectrum of two exchanging IR peaks based on simple input parameters. Larger systems can be simulated with minor modifications. The program is available as an executable program for PCs or can be run in MATLAB on any operating system. Source…

  17. Laying the cornerstone: an employee-driven customer service program.

    PubMed

    Davis, Stephen M; Chinnis, Ann S; Dunmire, J Erin

    2006-01-01

    In the 21st-century healthcare environment, customer service remains critical to the fiscal viability of healthcare organizations. Continued competition for patients and diminishing reimbursements have necessitated the establishment of customer service programs to attract patients and retain outstanding employees. These programs should increase quality experiences for both internal customers (employees) and external customers (patients). This article describes a unique employee-driven customer service initiative titled Serving Together Achieving Results. Obstacles to implementing a customer service program in a multifaceted academic setting are highlighted, and the use of a novel tool, Q technique, to prioritize employee feedback is discussed.

  18. Investigating Customers' Experiences with Their Financial Services Customer Education Programs as It Impacts Customer Loyalty to the Financial Firm

    ERIC Educational Resources Information Center

    Islam, Kaliym A.

    2017-01-01

    The problem addressed in this study was that customer education programs are intended to strengthen customer loyalty; however, research on the effects of customer education on customer loyalty remains insufficient. This phenomenological study investigated how the lived experiences of customers' participating in financial services' customer…

  19. ALGORITHMS AND PROGRAMS FOR STRONG GRAVITATIONAL LENSING IN KERR SPACE-TIME INCLUDING POLARIZATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Bin; Maddumage, Prasad; Kantowski, Ronald

    2015-05-15

    Active galactic nuclei (AGNs) and quasars are important astrophysical objects to understand. Recently, microlensing observations have constrained the size of the quasar X-ray emission region to be of the order of 10 gravitational radii of the central supermassive black hole. For distances within a few gravitational radii, light paths are strongly bent by the strong gravity field of the central black hole. If the central black hole has nonzero angular momentum (spin), then a photon’s polarization plane will be rotated by the gravitational Faraday effect. The observed X-ray flux and polarization will then be influenced significantly by the strong gravity field near the source. Consequently, linear gravitational lensing theory is inadequate for such extreme circumstances. We present simple algorithms computing the strong lensing effects of Kerr black holes, including the effects on polarization. Our algorithms are realized in a program “KERTAP” in two versions: MATLAB and Python. The key ingredients of KERTAP are a graphic user interface, a backward ray-tracing algorithm, a polarization propagator dealing with gravitational Faraday rotation, and algorithms computing observables such as flux magnification and polarization angles. Our algorithms can be easily realized in other programming languages such as FORTRAN, C, and C++. The MATLAB version of KERTAP is parallelized using the MATLAB Parallel Computing Toolbox and the Distributed Computing Server. The Python code was sped up using Cython and supports full implementation of MPI using the “mpi4py” package. As an example, we investigate the inclination angle dependence of the observed polarization and the strong lensing magnification of AGN X-ray emission. We conclude that it is possible to perform complex numerical-relativity related computations using interpreted languages such as MATLAB and Python.
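
    As a loose illustration of the MATLAB-side parallelization pattern mentioned above (Parallel Computing Toolbox), the sketch below distributes independent per-pixel computations over workers with parfor; traceRay is a hypothetical stand-in for a backward ray-tracing routine, not part of KERTAP.

      % Illustration of the parfor parallelization pattern (Parallel Computing
      % Toolbox).  traceRay is a hypothetical stand-in for a per-pixel backward
      % ray-tracing routine; here it only returns a toy value.
      function kertap_parfor_demo()
          n    = 256;                        % image resolution
          flux = zeros(n, n);
          parfor ix = 1:n
              row = zeros(1, n);
              for iy = 1:n
                  row(iy) = traceRay(ix, iy, n);   % rays are independent
              end
              flux(ix, :) = row;
          end
          imagesc(flux); axis image; colorbar;
      end

      function val = traceRay(ix, iy, n)
          % Toy placeholder for the per-ray computation.
          r   = hypot(ix - n/2, iy - n/2) / n;
          val = 1 / (0.05 + r^2);
      end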

  20. Op-Ug TD Optimizer Tool Based on Matlab Code to Find Transition Depth From Open Pit to Block Caving / Narzędzie Optymalizacyjne Oparte O Kod Matlab Wykorzystane Do Określania Głębokości Przejściowej Od Wydobycia Odkrywkowego Do Wybierania Komorami

    NASA Astrophysics Data System (ADS)

    Bakhtavar, E.

    2015-09-01

    In this study, the transition from open pit to block caving has been considered as a challenging problem. For this purpose, a linear integer programming code was initially developed in Matlab on the basis of the binary integer model proposed by Bakhtavar et al. (2012). Then a program based on a graphical user interface (GUI) was set up and named "Op-Ug TD Optimizer". It is a beneficial tool for simple application of the model in all situations where open pit mining is considered together with the block caving method for mining an ore deposit. Finally, Op-Ug TD Optimizer has been explained step by step through solving the transition from open pit to block caving problem for a case ore deposit. This paper considers the challenging problem of the transition from open pit mining to block caving. For this purpose, a linear programming code was developed in the MATLAB environment based on the binary integer model proposed by Bakhtavar (2012). A program with a graphical user interface, named the Op-Ug TD Optimizer, was then developed. It is a highly valuable tool that allows the model to be applied in all situations where open pit mining is considered together with block caving for the exploitation of ore deposits. The final part of the paper gives step-by-step instructions for using the Optimizer program on a case-study transition from open pit mining to block caving of an ore deposit.
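
    A minimal sketch of how such a binary integer model can be posed and solved with MATLAB's intlinprog is shown below; the objective and constraint coefficients are illustrative placeholders, not values from the Op-Ug TD model:

        % Toy binary integer program in intlinprog form; coefficients are made up
        % and only illustrate the structure of a block-level selection decision.
        f      = -[3; 5; 4];              % maximize value of selected blocks -> minimize -value
        intcon = 1:3;                     % all three decision variables are integer
        A      = [2 4 3];  b = 6;         % one aggregate resource/capacity constraint
        lb     = zeros(3,1);  ub = ones(3,1);   % 0/1 bounds make the integers binary
        x      = intlinprog(f, intcon, A, b, [], [], lb, ub);
        disp(x')                          % selected (1) / rejected (0) blocks

    Earlier MATLAB releases exposed this class of problem through bintprog, which the Optimization Toolbox has since replaced with intlinprog.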

  1. Algorithms and Programs for Strong Gravitational Lensing In Kerr Space-time Including Polarization

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Kantowski, Ronald; Dai, Xinyu; Baron, Eddie; Maddumage, Prasad

    2015-05-01

    Active galactic nuclei (AGNs) and quasars are important astrophysical objects to understand. Recently, microlensing observations have constrained the size of the quasar X-ray emission region to be of the order of 10 gravitational radii of the central supermassive black hole. For distances within a few gravitational radii, light paths are strongly bent by the strong gravity field of the central black hole. If the central black hole has nonzero angular momentum (spin), then a photon’s polarization plane will be rotated by the gravitational Faraday effect. The observed X-ray flux and polarization will then be influenced significantly by the strong gravity field near the source. Consequently, linear gravitational lensing theory is inadequate for such extreme circumstances. We present simple algorithms computing the strong lensing effects of Kerr black holes, including the effects on polarization. Our algorithms are realized in a program “KERTAP” in two versions: MATLAB and Python. The key ingredients of KERTAP are a graphic user interface, a backward ray-tracing algorithm, a polarization propagator dealing with gravitational Faraday rotation, and algorithms computing observables such as flux magnification and polarization angles. Our algorithms can be easily realized in other programming languages such as FORTRAN, C, and C++. The MATLAB version of KERTAP is parallelized using the MATLAB Parallel Computing Toolbox and the Distributed Computing Server. The Python code was sped up using Cython and supports full implementation of MPI using the “mpi4py” package. As an example, we investigate the inclination angle dependence of the observed polarization and the strong lensing magnification of AGN X-ray emission. We conclude that it is possible to perform complex numerical-relativity related computations using interpreted languages such as MATLAB and Python.
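
    To illustrate the per-ray parallelism described above, the sketch below builds a simple weak-field point-lens magnification map with parfor; it is not KERTAP's Kerr strong-lensing calculation, only a stand-in showing how pixel-by-pixel lensing work maps onto the Parallel Computing Toolbox:

        % Point-lens (weak-field) magnification mu(u) = (u^2+2)/(u*sqrt(u^2+4)),
        % evaluated on a source-plane grid row by row inside a parfor loop.
        n = 400;
        [x, y] = meshgrid(linspace(-3, 3, n));          % grid in Einstein radii
        mu = zeros(n);
        parfor j = 1:n                                  % requires the Parallel Computing Toolbox
            u = hypot(x(j,:), y(j,:));                  % impact parameter of each pixel
            mu(j,:) = (u.^2 + 2) ./ (u .* sqrt(u.^2 + 4));
        end
        imagesc(mu, [1 10]); axis image; colorbar       % clip the central divergence for display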

  2. C++ tensor toolbox user manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd D.; Kolda, Tamara Gibson

    2012-04-01

    The C++ Tensor Toolbox is a software package for computing tensor decompositions. It is based on the Matlab Tensor Toolbox, and is particularly optimized for sparse data sets. This user manual briefly overviews tensor decomposition mathematics, software capabilities, and installation of the package. Tensors (also known as multidimensional arrays or N-way arrays) are used in a variety of applications ranging from chemometrics to network analysis. The Tensor Toolbox provides classes for manipulating dense, sparse, and structured tensors in C++. The Toolbox compiles into libraries and is intended for use with custom applications written by users.

  3. Synthesis of multi-loop automatic control systems by the nonlinear programming method

    NASA Astrophysics Data System (ADS)

    Voronin, A. V.; Emelyanova, T. A.

    2017-01-01

    The article deals with the problem of calculating optimal tuning parameters for multi-loop control systems by numerical and nonlinear programming methods. For this purpose, the Optimization Toolbox of Matlab is used.
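
    A hedged sketch of the kind of Optimization Toolbox call involved is given below; the cost function is a made-up quadratic surrogate standing in for the closed-loop performance index of the multi-loop system:

        % Tune two controller parameters by constrained minimization with fmincon.
        cost = @(k) (k(1) - 2).^2 + 5*(k(2) - 0.5).^2 + 0.1*k(1)*k(2);   % surrogate J(Kp, Ki)
        k0   = [1; 1];                    % initial guess for [Kp; Ki]
        lb   = [0; 0];   ub = [10; 10];   % keep the gains nonnegative and bounded
        opts = optimoptions('fmincon', 'Display', 'iter');
        kOpt = fmincon(cost, k0, [], [], [], [], lb, ub, [], opts);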

  4. 75 FR 19464 - Interagency Guidance on Response Programs for Unauthorized Access to Customer Information and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-14

    ... for Unauthorized Access to Customer Information and Customer Notice AGENCY: Office of Thrift...: Interagency Guidance on Response Programs for Unauthorized Access to Customer Information and Customer Notice... physical safeguards to: (1) Ensure the security and confidentiality of customer records and information; (2...

  5. 17 CFR 270.0-11 - Customer identification programs.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 17 Commodity and Securities Exchanges 3 2010-04-01 2010-04-01 false Customer identification... (CONTINUED) RULES AND REGULATIONS, INVESTMENT COMPANY ACT OF 1940 § 270.0-11 Customer identification programs... implementing regulation at 31 CFR 103.131, which requires a customer identification program to be implemented...

  6. Coded Modulation in C and MATLAB

    NASA Technical Reports Server (NTRS)

    Hamkins, Jon; Andrews, Kenneth S.

    2011-01-01

    This software, written separately in C and MATLAB as stand-alone packages with equivalent functionality, implements encoders and decoders for a set of nine error-correcting codes and modulators and demodulators for five modulation types. The software can be used as a single program to simulate the performance of such coded modulation. The error-correcting codes implemented are the nine accumulate repeat-4 jagged accumulate (AR4JA) low-density parity-check (LDPC) codes, which have been approved for international standardization by the Consultative Committee for Space Data Systems, and which are scheduled to fly on a series of NASA missions in the Constellation Program. The software implements the encoder and decoder functions, and contains compressed versions of generator and parity-check matrices used in these operations.
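
    The listing below is a stripped-down, uncoded BPSK-over-AWGN bit-error-rate loop in MATLAB; it is not the package's LDPC implementation, only a sketch of the Monte Carlo structure that such coded-modulation performance simulations share:

        % Monte Carlo BER estimate for uncoded BPSK over an AWGN channel.
        EbN0dB = 0:8;  nBits = 1e5;  ber = zeros(size(EbN0dB));
        for k = 1:numel(EbN0dB)
            bits  = randi([0 1], nBits, 1);
            s     = 1 - 2*bits;                          % map 0 -> +1, 1 -> -1
            sigma = sqrt(1 / (2*10^(EbN0dB(k)/10)));     % noise std for the given Eb/N0
            r     = s + sigma*randn(nBits, 1);
            ber(k) = mean((r < 0) ~= bits);              % hard-decision bit errors
        end
        semilogy(EbN0dB, ber, '-o'); grid on; xlabel('E_b/N_0 (dB)'); ylabel('BER')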

  7. Building "My First NMRviewer": A Project Incorporating Coding and Programming Tasks in the Undergraduate Chemistry Curricula

    ERIC Educational Resources Information Center

    Arrabal-Campos, Francisco M.; Cortés-Villena, Alejandro; Fernández, Ignacio

    2017-01-01

    This paper presents a programming project named NMRviewer that allows students to visualize transformed and processed 1 H NMR data in an accessible, interactive format while allowing instructors to incorporate programming content into the chemistry curricula. Using the MATLAB graphical user interface development environment (GUIDE), students can…

  8. How to get students to love (or not hate) MATLAB and programming

    NASA Astrophysics Data System (ADS)

    Reckinger, Shanon; Reckinger, Scott

    2014-11-01

    An effective programming course geared toward engineering students requires the utilization of modern teaching philosophies. A newly designed course that focuses on programming in MATLAB involves flipping the classroom and integrating various active teaching techniques. Vital aspects of the new course design include: lengthening in-class contact hours, Process-Oriented Guided Inquiry Learning (POGIL) method worksheets (self-guided instruction), student created video content posted on YouTube, clicker questions (used in class to practice reading and debugging code), programming exams that don't require computers, integrating oral exams into the classroom, fostering an environment for formal and informal peer learning, and designing in a broader theme to tie together assignments. However, possibly the most important piece to this programming course puzzle: the instructor needs to be able to find programming mistakes very fast and then lead individuals and groups through the steps to find their mistakes themselves. The effectiveness of the new course design is demonstrated through pre- and post- concept exam results and student evaluation feedback. Students reported that the course was challenging and required a lot of effort, but left largely positive feedback.

  9. A Compilation of MATLAB Scripts and Functions for MACGMC Analyses

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Bednarcyk, Brett A.; Mital, Subodh K.

    2017-01-01

    The primary aim of the current effort is to provide scripts that automate many of the repetitive pre- and post-processing tasks associated with composite materials analyses using the Micromechanics Analysis Code with the Generalized Method of Cells (MACGMC). This document consists of a compilation of hundreds of scripts that were developed in the MATLAB (The MathWorks, Inc., Natick, MA) programming language and consolidated into 16 MATLAB functions. MACGMC is a composite material and laminate analysis software code developed at NASA Glenn Research Center. The software package has been built around the generalized method of cells (GMC) family of micromechanics theories. The computer code is developed with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make it more robust in terms of input preparation and code execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The pre-processing tasks include generation of a multitude of different repeating unit cells (RUCs) for CMCs and PMCs, visualization of RUCs from MACGMC input and output files and generation of the RUC section of a MACGMC input file. The post-processing tasks include visualization of the predicted composite response, such as local stress and strain contours, damage initiation and progression, stress-strain behavior, and fatigue response. In addition to the above, several miscellaneous scripts have been developed that can be used to perform repeated Monte-Carlo simulations to enable probabilistic simulations with minimal manual intervention. This document is formatted to provide MATLAB source files and descriptions of how to utilize them. It is assumed that the user has a basic understanding of how MATLAB scripts work and some MATLAB programming experience.

  10. Resources and Approaches for Teaching Quantitative and Computational Skills in the Geosciences and Allied Fields

    NASA Astrophysics Data System (ADS)

    Orr, C. H.; Mcfadden, R. R.; Manduca, C. A.; Kempler, L. A.

    2016-12-01

    Teaching with data, simulations, and models in the geosciences can increase many facets of student success in the classroom, and in the workforce. Teaching undergraduates about programming and improving students' quantitative and computational skills expands their perception of Geoscience beyond field-based studies. Processing data and developing quantitative models are critically important for Geoscience students. Students need to be able to perform calculations, analyze data, create numerical models and visualizations, and more deeply understand complex systems, all essential aspects of modern science. These skills require students to have comfort and skill with languages and tools such as MATLAB. To achieve comfort and skill, computational and quantitative thinking must build over a 4-year degree program across courses and disciplines. However, in courses focused on Geoscience content it can be challenging to get students comfortable with using computational methods to answer Geoscience questions. To help bridge this gap, we have partnered with MathWorks to develop two workshops focused on collecting and developing strategies and resources to help faculty teach students to incorporate data, simulations, and models into the curriculum at the course and program levels. We brought together faculty members from the sciences, including Geoscience and allied fields, who teach computation and quantitative thinking skills using MATLAB to build a resource collection for teaching. These materials and the outcomes of the workshops are freely available on our website. The workshop outcomes include a collection of teaching activities, essays, and course descriptions that can help faculty incorporate computational skills at the course or program level. The teaching activities include in-class assignments, problem sets, labs, projects, and toolboxes. These activities range from programming assignments to creating and using models. The outcomes also include workshop syntheses that highlight best practices, a set of webpages to support teaching with software such as MATLAB, and an interest group actively discussing aspects of these issues in Geoscience and allied fields. Learn more and view the resources at http://serc.carleton.edu/matlab_computation2016/index.html

  11. Repeat Customer Success in Extension

    ERIC Educational Resources Information Center

    Bess, Melissa M.; Traub, Sarah M.

    2013-01-01

    Four multi-session research-based programs were offered by two Extension specialists in one rural Missouri county. Eleven participants who came to multiple Extension programs could be called "repeat customers." Based on the total number of participants for all four programs, 25% could be deemed repeat customers. Repeat customers had…

  12. NSR&D Program Fiscal Year 2015 Funded Research Stochastic Modeling of Radioactive Material Releases Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrus, Jason P.; Pope, Chad; Toston, Mary

    2016-12-01

    Nonreactor nuclear facilities operating under the approval authority of the U.S. Department of Energy use unmitigated hazard evaluations to determine if potential radiological doses associated with design basis events challenge or exceed dose evaluation guidelines. Unmitigated design basis events that sufficiently challenge dose evaluation guidelines or exceed the guidelines for members of the public or workers, merit selection of safety structures, systems, or components or other controls to prevent or mitigate the hazard. Idaho State University, in collaboration with Idaho National Laboratory, has developed a portable and simple to use software application called SODA (Stochastic Objective Decision-Aide) that stochastically calculates the radiation dose distribution associated with hypothetical radiological material release scenarios. Rather than producing a point estimate of the dose, SODA produces a dose distribution result to allow a deeper understanding of the dose potential. SODA allows users to select the distribution type and parameter values for all of the input variables used to perform the dose calculation. Users can also specify custom distributions through a user defined distribution option. SODA then randomly samples each distribution input variable and calculates the overall resulting dose distribution. In cases where an input variable distribution is unknown, a traditional single point value can be used. SODA, developed using the MATLAB coding framework, has a graphical user interface and can be installed on both Windows and Mac computers. SODA is a standalone software application and does not require MATLAB to function. SODA provides improved risk understanding leading to better informed decision making associated with establishing nuclear facility material-at-risk limits and safety structure, system, or component selection. It is important to note that SODA does not replace or compete with codes such as MACCS or RSAC; rather it is viewed as an easy to use supplemental tool to help improve risk understanding and support better informed decisions. The SODA development project was funded through a grant from the DOE Nuclear Safety Research and Development Program.
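
    A minimal MATLAB sketch of the stochastic sampling idea described above is given below; the dose expression and the distributions chosen here are placeholders, not SODA's actual models or parameter values:

        % Sample each input from a user-chosen distribution and accumulate a dose
        % distribution instead of a single point estimate (all values illustrative;
        % lognrnd/betarnd/prctile require the Statistics and Machine Learning Toolbox).
        N    = 1e5;
        MAR  = lognrnd(log(10), 0.5, N, 1);      % material at risk (arbitrary units)
        ARF  = betarnd(2, 50, N, 1);             % airborne release fraction
        RF   = betarnd(5, 5, N, 1);              % respirable fraction
        chiQ = 1e-4;                             % dispersion factor, kept as a point value
        DCF  = 1.0;                              % dose conversion factor (placeholder)
        dose = MAR .* ARF .* RF .* chiQ .* DCF;  % simplified release-to-dose chain
        fprintf('median dose %.3g, 95th percentile %.3g\n', median(dose), prctile(dose, 95));
        histogram(dose, 100)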

  14. A Matlab-based finite-difference solver for the Poisson problem with mixed Dirichlet-Neumann boundary conditions

    NASA Astrophysics Data System (ADS)

    Reimer, Ashton S.; Cheviakov, Alexei F.

    2013-03-01

    A Matlab-based finite-difference numerical solver for the Poisson equation for a rectangle and a disk in two dimensions, and a spherical domain in three dimensions, is presented. The solver is optimized for handling an arbitrary combination of Dirichlet and Neumann boundary conditions, and allows for full user control of mesh refinement. The solver routines utilize effective and parallelized sparse vector and matrix operations. Computations exhibit high speeds, numerical stability with respect to mesh size and mesh refinement, and acceptable error values even on desktop computers. Catalogue identifier: AENQ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENQ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License v3.0 No. of lines in distributed program, including test data, etc.: 102793 No. of bytes in distributed program, including test data, etc.: 369378 Distribution format: tar.gz Programming language: Matlab 2010a. Computer: PC, Macintosh. Operating system: Windows, OSX, Linux. RAM: 8 GB (8,589,934,592 bytes) Classification: 4.3. Nature of problem: To solve the Poisson problem in a standard domain with “patchy surface”-type (strongly heterogeneous) Neumann/Dirichlet boundary conditions. Solution method: Finite difference with mesh refinement. Restrictions: Spherical domain in 3D; rectangular domain or a disk in 2D. Unusual features: Choice between mldivide/iterative solver for the solution of large system of linear algebraic equations that arise. Full user control of Neumann/Dirichlet boundary conditions and mesh refinement. Running time: Depending on the number of points taken and the geometry of the domain, the routine may take from less than a second to several hours to execute.
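
    The fragment below shows a much-simplified version of the underlying approach, a 5-point finite-difference Laplacian assembled as a sparse matrix and solved with mldivide on the unit square with homogeneous Dirichlet conditions; the published solver adds Neumann patches, disk and spherical domains, and mesh refinement on top of this:

        % Solve Laplacian(u) = -1 on the unit square, u = 0 on the boundary.
        n  = 50;  h = 1/(n+1);
        e  = ones(n,1);
        L1 = spdiags([e -2*e e], -1:1, n, n) / h^2;   % 1D second-difference operator
        I  = speye(n);
        A  = kron(I, L1) + kron(L1, I);               % 2D Laplacian, Dirichlet boundaries
        f  = -ones(n^2, 1);                           % right-hand side
        u  = reshape(A \ f, n, n);                    % sparse direct solve (mldivide)
        [x, y] = meshgrid(h:h:1-h);
        surf(x, y, u), shading interp, title('Finite-difference Poisson solution')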

  15. War-gaming application for future space systems acquisition: MATLAB implementation of war-gaming acquisition models and simulation results

    NASA Astrophysics Data System (ADS)

    Vienhage, Paul; Barcomb, Heather; Marshall, Karel; Black, William A.; Coons, Amanda; Tran, Hien T.; Nguyen, Tien M.; Guillen, Andy T.; Yoh, James; Kizer, Justin; Rogers, Blake A.

    2017-05-01

    The paper describes the MATLAB (MathWorks) programs that were developed during the REU workshop to implement The Aerospace Corporation's Unified Game-based Acquisition Framework and Advanced Game-based Mathematical Framework (UGAF-AGMF) and its associated War-Gaming Engine (WGE) models. Each game can be played from the perspective of the Department of Defense Acquisition Authority (DAA) or of an individual contractor (KTR). The programs also implement Aerospace's optimum "Program and Technical Baseline (PTB) and associated acquisition" strategy, which combines low Total Ownership Cost (TOC) with innovative designs while still meeting warfighter needs. The paper also describes the Bayesian Acquisition War-Gaming approach using Monte Carlo simulations, a numerical analysis technique to account for uncertainty in decision making, which simulates the PTB development and acquisition processes, and details the procedure of the implementation and the interactions between the games.
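
    As a hedged illustration of the Monte Carlo element mentioned above, the snippet below propagates made-up cost uncertainties into a total-ownership-cost distribution; the cost breakdown, distributions, and parameters are assumptions, not the WGE models:

        % Propagate uncertain cost elements into a TOC distribution (illustrative only;
        % lognrnd/gamrnd/prctile require the Statistics and Machine Learning Toolbox).
        N        = 1e5;
        devCost  = lognrnd(log(100), 0.25, N, 1);    % development cost, $M
        prodCost = gamrnd(50, 2, N, 1);              % production cost, $M
        opsCost  = 30 + 10*rand(N, 1);               % operations cost, $M
        TOC      = devCost + prodCost + opsCost;     % total ownership cost samples
        fprintf('median TOC %.0f $M, 90th percentile %.0f $M\n', median(TOC), prctile(TOC, 90));
        histogram(TOC, 100)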

  16. Counter Weapon Control

    DTIC Science & Technology

    2015-03-26

    These plots were generated using Matlab as the program to run the simulations.

  17. ELRIS2D: A MATLAB Package for the 2D Inversion of DC Resistivity/IP Data

    NASA Astrophysics Data System (ADS)

    Akca, Irfan

    2016-04-01

    ELRIS2D is an open source code written in MATLAB for the two-dimensional inversion of direct current resistivity (DCR) and time domain induced polarization (IP) data. The user interface of the program is designed for functionality and ease of use. All available settings of the program can be reached from the main window. The subsurface is discretized using a hybrid mesh generated by the combination of structured and unstructured meshes, which reduces the computational cost of the whole inversion procedure. The inversion routine is based on the smoothness constrained least squares method. In order to verify the program, responses of two test models and field data sets were inverted. The models inverted from the synthetic data sets are consistent with the original test models in both DC resistivity and IP cases. A field data set acquired in an archaeological site is also used for the verification of outcomes of the program in comparison with the excavation results.
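
    The core of a smoothness-constrained least squares update can be written in a few MATLAB lines, as in the sketch below; the Jacobian, data misfit, and regularization weight are synthetic placeholders rather than quantities produced by ELRIS2D:

        % One regularized Gauss-Newton-style model update:
        % solve (J'J + lambda*C'C) dm = J'dd with a first-difference roughness operator C.
        nCell  = 30;
        J      = randn(40, nCell);          % sensitivity (Jacobian) matrix, synthetic
        dd     = randn(40, 1);              % data misfit vector, synthetic
        C      = diff(speye(nCell));        % first-difference smoothness operator
        lambda = 0.1;                       % regularization weight
        dm     = (J'*J + lambda*(C'*C)) \ (J'*dd);
        plot(dm, '-o'), xlabel('model cell'), ylabel('\Deltam')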

  18. VQone MATLAB toolbox: A graphical experiment builder for image and video quality evaluations.

    PubMed

    Nuutinen, Mikko; Virtanen, Toni; Rummukainen, Olli; Häkkinen, Jukka

    2016-03-01

    This article presents VQone, a graphical experiment builder, written as a MATLAB toolbox, developed for image and video quality ratings. VQone contains the main elements needed for the subjective image and video quality rating process. This includes building and conducting experiments and data analysis. All functions can be controlled through graphical user interfaces. The experiment builder includes many standardized image and video quality rating methods. Moreover, it enables the creation of new methods or modified versions from standard methods. VQone is distributed free of charge under the terms of the GNU general public license and allows code modifications to be made so that the program's functions can be adjusted according to a user's requirements. VQone is available for download from the project page (http://www.helsinki.fi/psychology/groups/visualcognition/).

  19. Observing System Simulation Experiment (OSSE) for the HyspIRI Spectrometer Mission

    NASA Technical Reports Server (NTRS)

    Turmon, Michael J.; Block, Gary L.; Green, Robert O.; Hua, Hook; Jacob, Joseph C.; Sobel, Harold R.; Springer, Paul L.; Zhang, Qingyuan

    2010-01-01

    The OSSE software provides an integrated end-to-end environment to simulate an Earth observing system by iteratively running a distributed modeling workflow based on the HyspIRI Mission, including atmospheric radiative transfer, surface albedo effects, detection, and retrieval for agile exploration of the mission design space. The software enables an Observing System Simulation Experiment (OSSE) and can be used for design trade space exploration of science return for proposed instruments by modeling the whole ground truth, sensing, and retrieval chain and to assess retrieval accuracy for a particular instrument and algorithm design. The OSSE infrastructure is extensible to future National Research Council (NRC) Decadal Survey concept missions where integrated modeling can improve the fidelity of coupled science and engineering analyses for systematic analysis and science return studies. This software has a distributed architecture that gives it a distinct advantage over other similar efforts. The workflow modeling components are typically legacy computer programs implemented in a variety of programming languages, including MATLAB, Excel, and FORTRAN. Integration of these diverse components is difficult and time-consuming. In order to hide this complexity, each modeling component is wrapped as a Web Service, and each component is able to pass analysis parameterizations, such as reflectance or radiance spectra, on to the next component downstream in the service workflow chain. In this way, the interface to each modeling component becomes uniform and the entire end-to-end workflow can be run using any existing or custom workflow processing engine. The architecture lets users extend workflows as new modeling components become available, chain together the components using any existing or custom workflow processing engine, and distribute them across any Internet-accessible Web Service endpoints. The workflow components can be hosted on any Internet-accessible machine. This has the advantages that the computations can be distributed to make best use of the available computing resources, and each workflow component can be hosted and maintained by their respective domain experts.

  20. 78 FR 60966 - Self-Regulatory Organizations: Miami International Securities Exchange LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-02

    .... As such, marketing fee programs,\\7\\ and customer posting incentive programs,\\8\\ are based on... its current Priority Customer Rebate Program (the ``Program'') until October 31, 2013.\\3\\ The Program... Priority Customer \\6\\ order transmitted by that Member which is executed on the Exchange in all multiply...

  1. MatTAP: A MATLAB toolbox for the control and analysis of movement synchronisation experiments.

    PubMed

    Elliott, Mark T; Welchman, Andrew E; Wing, Alan M

    2009-02-15

    Investigating movement timing and synchronisation at the sub-second range relies on an experimental setup that has high temporal fidelity, is able to deliver output cues and can capture corresponding responses. Modern, multi-tasking operating systems make this increasingly challenging when using standard PC hardware and programming languages. This paper describes a new free suite of tools (available from http://www.snipurl.com/mattap) for use within the MATLAB programming environment, compatible with Microsoft Windows and a range of data acquisition hardware. The toolbox allows flexible generation of timing cues with high temporal accuracy, the capture and automatic storage of corresponding participant responses and an integrated analysis module for the rapid processing of results. A simple graphical user interface is used to navigate the toolbox and so can be operated easily by users not familiar with programming languages. However, it is also fully extensible and customisable, allowing adaptation for individual experiments and facilitating the addition of new modules in future releases. Here we discuss the relevance of the MatTAP (MATLAB Timing Analysis Package) toolbox to current timing experiments and compare its use to alternative methods. We validate the accuracy of the analysis module through comparison to manual observation methods and replicate a previous sensorimotor synchronisation experiment to demonstrate the versatility of the toolbox features demanded by such movement synchronisation paradigms.

  2. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing a single spectrum or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software package, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to its capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.

  3. Developing Matlab scripts for image analysis and quality assessment

    NASA Astrophysics Data System (ADS)

    Vaiopoulos, A. D.

    2011-11-01

    Image processing is a very helpful tool in many fields of modern sciences that involve digital imaging examination and interpretation. Processed images, however, often need to be correlated with the original image in order to ensure that the resulting image fulfills its purpose. Aside from visual examination, which is mandatory, image quality indices (such as the correlation coefficient, entropy, and others) are very useful when deciding which processed image is the most satisfactory. For this reason, a single program (script) was written in the Matlab language, which automatically calculates eight indices by utilizing eight respective functions (independent function scripts). The program was tested on both fused hyperspectral (Hyperion-ALI) and multispectral (ALI, Landsat) imagery and proved to be efficient. The indices were found to be in agreement with visual examination and statistical observations.
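
    Two of the indices mentioned (correlation coefficient and entropy) can be computed directly with Image Processing Toolbox functions, as in the hedged example below; the test image and the Gaussian blur stand in for an original/processed pair and are not the imagery used in the paper:

        % Compare an image with a "processed" (blurred) version of itself.
        I  = im2double(imread('cameraman.tif'));   % sample image shipped with the toolbox
        Ip = imgaussfilt(I, 2);                    % stand-in for a processed product
        cc = corr2(I, Ip);                         % correlation coefficient index
        H  = entropy(Ip);                          % Shannon entropy of the processed image
        fprintf('correlation = %.3f, entropy = %.3f bits\n', cc, H)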

  4. 20 CFR 669.330 - How are services delivered to the customer?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false How are services delivered to the customer... Farmworker Jobs Program Customers and Available Program Services § 669.330 How are services delivered to the customer? To ensure that all services are focused on the customer's needs, services are provided through a...

  5. 20 CFR 669.330 - How are services delivered to the customer?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false How are services delivered to the customer... Farmworker Jobs Program Customers and Available Program Services § 669.330 How are services delivered to the customer? To ensure that all services are focused on the customer's needs, services are provided through a...

  6. The impact of customer focus on program participation rates in the Virginia WIC Program (Special Supplemental Nutrition Program for Women, Infants, and Children).

    PubMed

    Chance, K G; Green, C G

    2001-01-01

    It has been shown in the for-profit sector (business, service, and manufacturing) that the success of an organization depends on its ability to satisfy customer requirements while eliminating waste and reducing costs. The purpose of this article was to examine the impact of current practices in customer focus on program participation rates in the Virginia WIC Program. The results of this study showed that the use of customer-focused strategies was correlated to program participation rates in the WIC Program. The mean data showed that teamwork and accessibility were at unsatisfactory levels in Virginia.

  7. Temporal and spatial variation of beaked and sperm whales foraging activity in Hawai'i, as determined with passive acoustics.

    PubMed

    Giorli, Giacomo; Neuheimer, Anna; Copeland, Adrienne; Au, Whitlow W L

    2016-10-01

    Beaked and sperm whales are top predators living in the waters off the Kona coast of Hawai'i. Temporal and spatial analyses of the foraging activity of these two species were studied with passive acoustics techniques. Three passive acoustics recorders moored to the ocean floor were used to monitor the foraging activity of these whales in three locations along the Kona coast of the island of Hawaii. Data were analyzed using automatic detector/classification systems: M3R (Marine Mammal Monitoring on Navy Ranges), and custom-designed Matlab programs. The temporal variation in foraging activity was species-specific: beaked whales foraged more at night in the north, and more during the day-time off Kailua-Kona. No day-time/night-time preference was found in the southern end of the sampling range. Sperm whales foraged mainly at night in the north, but no day-time/night-time preference was observed off Kailua-Kona and in the south. A Generalized Linear Model was then applied to assess whether location and chlorophyll concentration affected the foraging activity of each species. Chlorophyll concentration and location influenced the foraging activity of both these species of deep-diving odontocetes.
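
    A hedged sketch of a GLM of the type described, relating detection counts to site and chlorophyll concentration, is shown below; the data are synthetic and the Poisson link is an assumption, not necessarily the model used in the study:

        % Fit a Poisson GLM of foraging-click counts on site and chlorophyll
        % (fitglm and poissrnd require the Statistics and Machine Learning Toolbox).
        n      = 90;
        site   = categorical(randi(3, n, 1), 1:3, {'North', 'Kona', 'South'});  % hypothetical sites
        chl    = 0.05 + 0.10*rand(n, 1);                 % chlorophyll concentration, mg m^-3
        clicks = poissrnd(20 + 50*chl);                  % synthetic detection counts
        tbl    = table(site, chl, clicks);
        mdl    = fitglm(tbl, 'clicks ~ site + chl', 'Distribution', 'poisson');
        disp(mdl)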

  8. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    NASA Astrophysics Data System (ADS)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

    We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows for fast, error-proof and easy creation of Mifs, without any of the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions complementing it with magnetoresistance and spin-transfer-torque calculations, as well as local magnetization data selection for output. Our software allows for the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronous excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing for automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.
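
    The kind of Matlab post-processing the generated scripts perform can be sketched as an FFT of an averaged-magnetization time trace; the time step, mode frequency, and damping below are synthetic stand-ins for an OOMMF output table:

        % Spectrum of a synthetic damped precession signal standing in for m_z(t)
        % (hann requires the Signal Processing Toolbox).
        dt = 5e-12;                                   % assumed simulation time step, s
        t  = (0:4095)*dt;
        mz = 0.1*sin(2*pi*8e9*t).*exp(-t/2e-9) + 0.01*randn(size(t));   % damped 8 GHz mode
        Mz = fft(mz .* hann(numel(mz))');             % windowed FFT
        f  = (0:numel(t)-1)/(numel(t)*dt);
        half = 1:numel(t)/2;
        plot(f(half)/1e9, abs(Mz(half))), xlabel('f (GHz)'), ylabel('|FFT(m_z)|')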

  9. Investigation of Matlab® as platform in navigation and control of an Automatic Guided Vehicle utilising an omnivision sensor.

    PubMed

    Kotze, Ben; Jordaan, Gerrit

    2014-08-25

    Automatic Guided Vehicles (AGVs) are navigated utilising multiple types of sensors for detecting the environment. In this investigation such sensors are replaced and/or minimized by the use of a single omnidirectional camera picture stream. An area of interest is extracted, and by using image processing the vehicle is navigated on a set path. Reconfigurability is added to the route layout by signs incorporated in the navigation process. The result is the possible manipulation of a number of AGVs, each on its own designated colour-signed path. This route is reconfigurable by the operator with no programming alteration or intervention. A low resolution camera and a Matlab® software development platform are utilised. The use of Matlab® lends itself to speedy evaluation and implementation of image processing options on the AGV, but its functioning in such an environment needs to be assessed.
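
    The colour-based path extraction described above can be approximated with a few Image Processing Toolbox calls; the test image and the HSV thresholds below are assumptions, not the authors' camera frames or settings:

        % Segment a coloured region and take its centroid as a steering reference.
        rgb  = imread('peppers.png');                 % stand-in for an omnivision frame
        hsv  = rgb2hsv(rgb);
        mask = hsv(:,:,1) > 0.55 & hsv(:,:,1) < 0.75 & hsv(:,:,2) > 0.4;   % "blue path" pixels
        mask = bwareaopen(mask, 200);                 % remove small speckles
        stats = regionprops(mask, 'Centroid');        % centroid of the detected region
        imshow(rgb), hold on
        if ~isempty(stats)
            plot(stats(1).Centroid(1), stats(1).Centroid(2), 'r+', 'MarkerSize', 12)
        end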

  10. Investigation of Matlab® as Platform in Navigation and Control of an Automatic Guided Vehicle Utilising an Omnivision Sensor

    PubMed Central

    Kotze, Ben; Jordaan, Gerrit

    2014-01-01

    Automatic Guided Vehicles (AGVs) are navigated utilising multiple types of sensors for detecting the environment. In this investigation such sensors are replaced and/or minimized by the use of a single omnidirectional camera picture stream. An area of interest is extracted, and by using image processing the vehicle is navigated on a set path. Reconfigurability is added to the route layout by signs incorporated in the navigation process. The result is the possible manipulation of a number of AGVs, each on its own designated colour-signed path. This route is reconfigurable by the operator with no programming alteration or intervention. A low resolution camera and a Matlab® software development platform are utilised. The use of Matlab® lends itself to speedy evaluation and implementation of image processing options on the AGV, but its functioning in such an environment needs to be assessed. PMID:25157548

  11. Parallel Computation of the Jacobian Matrix for Nonlinear Equation Solvers Using MATLAB

    NASA Technical Reports Server (NTRS)

    Rose, Geoffrey K.; Nguyen, Duc T.; Newman, Brett A.

    2017-01-01

    Demonstrating speedup for parallel code on a multicore shared-memory PC can be challenging in MATLAB due to underlying parallel operations that are often opaque to the user. This can limit the potential for improvement of serial code even for so-called embarrassingly parallel applications. One such application is the computation of the Jacobian matrix inherent to most nonlinear equation solvers. Computation of this matrix represents the primary bottleneck in nonlinear solver speed, such that commercial finite element (FE) and multi-body-dynamic (MBD) codes attempt to minimize these computations. A timing study using MATLAB's Parallel Computing Toolbox was performed for numerical computation of the Jacobian. Several approaches for implementing parallel code were investigated, but only the single program multiple data (spmd) method using composite objects provided positive results. Parallel code speedup is demonstrated, but the goal of linear speedup through the addition of processors was not achieved due to PC architecture.
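
    For comparison with the spmd approach reported in the paper, a forward-difference Jacobian is embarrassingly parallel over its columns; the parfor sketch below illustrates that structure with a toy three-equation system (it is not the authors' spmd/composite implementation):

        % Numerical Jacobian of F at x0, one column per perturbed variable.
        F  = @(x) [x(1)^2 + x(2); sin(x(1)) + x(2)*x(3); x(3)^2 - x(1)];   % toy system
        x0 = [1; 2; 3];
        h  = 1e-7;
        F0 = F(x0);
        J  = zeros(numel(F0), numel(x0));
        parfor k = 1:numel(x0)                 % requires the Parallel Computing Toolbox
            xk     = x0;
            xk(k)  = xk(k) + h;                % perturb the k-th component
            J(:,k) = (F(xk) - F0) / h;         % forward-difference column
        end
        disp(J)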

  12. TRIAC II. A MatLab code for track measurements from SSNT detectors

    NASA Astrophysics Data System (ADS)

    Patiris, D. L.; Blekas, K.; Ioannides, K. G.

    2007-08-01

    A computer program named TRIAC II written in MATLAB and running with a friendly GUI has been developed for recognition and parameters measurements of particles' tracks from images of Solid State Nuclear Track Detectors. The program, using image analysis tools, counts the number of tracks and depending on the current working mode classifies them according to their radii (Mode I—circular tracks) or their axis (Mode II—elliptical tracks), their mean intensity value (brightness) and their orientation. Images of the detectors' surfaces are input to the code, which generates text files as output, including the number of counted tracks with the associated track parameters. Hough transform techniques are used for the estimation of the number of tracks and their parameters, providing results even in cases of overlapping tracks. Finally, it is possible for the user to obtain informative histograms as well as output files for each image and/or group of images. Program summaryTitle of program:TRIAC II Catalogue identifier:ADZC_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADZC_v1_0 Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Computer: Pentium III, 600 MHz Installations: MATLAB 7.0 Operating system under which the program has been tested: Windows XP Programming language used:MATLAB Memory required to execute with typical data:256 MB No. of bits in a word:32 No. of processors used:one Has the code been vectorized or parallelized?:no No. of lines in distributed program, including test data, etc.:25 964 No. of bytes in distributed program including test data, etc.: 4 354 510 Distribution format:tar.gz Additional comments: This program requires the MatLab Statistical toolbox and the Image Processing Toolbox to be installed. Nature of physical problem: Following the passage of a charged particle (protons and heavier) through a Solid State Nuclear Track Detector (SSNTD), a damage region is created, usually named latent track. After the chemical etching of the detectors in aqueous NaOH or KOH solutions, latent tracks can be sufficiently enlarged (with diameters of 1 μm or more) to become visible under an optical microscope. Using the appropriate apparatus, one can record images of the SSNTD's surface. The shapes of the particle's tracks are strongly dependent on their charge, energy and the angle of incidence. Generally, they have elliptical shapes and in the special case of vertical incidence, they are circular. The manual counting of tracks is a tedious and time-consuming task. An automatic system is needed to speed up the process and to increase the accuracy of the results. Method of solution: TRIAC II is based on a segmentation method that groups image pixels according to their intensity value (brightness) in a number of grey level groups. After the segmentation of pixels, the program recognizes and separates the track from the background, subsequently performing image morphology, where oversized objects or objects smaller than a threshold value are removed. Finally, using the appropriate Hough transform technique, the program counts the tracks, even those which overlap and classifies them according to their shape parameters and brightness. Typical running time: The analysis of an image with a PC (Intel Pentium III processor running at 600 MHz) requires 2 to 10 minutes, depending on the number of observed tracks and the digital resolution of the image. 
Unusual features of the program: This program has been tested with images of CR-39 detectors exposed to alpha particles. Also, in low contrast images with few or small tracks, background pixels can be recognized as track pixels. To avoid this problem the brightness of the background pixels should be sufficiently higher than that of the track pixels.
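
    A minimal circular-object detector in the spirit of TRIAC II's Mode I can be built from the Image Processing Toolbox's Hough-based circle finder; the coin image and radius range below are placeholders for an etched-detector micrograph:

        % Detect roughly circular objects and report how many were found.
        img = imread('coins.png');                    % stand-in for a CR-39 image
        bw  = imbinarize(img);                        % separate objects from background
        [centers, radii] = imfindcircles(bw, [15 40], 'Sensitivity', 0.9);
        imshow(img), viscircles(centers, radii, 'EdgeColor', 'r');
        fprintf('%d circular objects detected\n', size(centers, 1))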

  13. EEGgui: a program used to detect electroencephalogram anomalies after traumatic brain injury.

    PubMed

    Sick, Justin; Bray, Eric; Bregy, Amade; Dietrich, W Dalton; Bramlett, Helen M; Sick, Thomas

    2013-05-21

    Identifying and quantifying pathological changes in brain electrical activity is important for investigations of brain injury and neurological disease. An example is the development of epilepsy, a secondary consequence of traumatic brain injury. While certain epileptiform events can be identified visually from electroencephalographic (EEG) or electrocorticographic (ECoG) records, quantification of these pathological events has proved to be more difficult. In this study we developed MATLAB-based software that would assist detection of pathological brain electrical activity following traumatic brain injury (TBI) and present our MATLAB code used for the analysis of the ECoG. The software was developed using MATLAB and features of the open-access EEGLAB. EEGgui is a graphical user interface in the MATLAB programming platform that allows scientists who are not proficient in computer programming to perform a number of elaborate analyses on ECoG signals. The different analyses include Power Spectral Density (PSD), Short Time Fourier analysis and Spectral Entropy (SE). ECoG records used for demonstration of this software were derived from rats that had undergone traumatic brain injury one year earlier. The software described in this report provides a graphical user interface for displaying ECoG activity and calculating normalized power density using the fast Fourier transform of the major brain wave frequencies (Delta, Theta, Alpha, Beta1, Beta2 and Gamma). The software further detects events in which power density for these frequency bands exceeds normal ECoG by more than 4 standard deviations. We found that epileptic events could be identified and distinguished from a variety of ECoG phenomena associated with normal changes in behavior. We further found that analysis of spectral entropy was less effective in distinguishing epileptic from normal changes in ECoG activity. The software presented here was a successful modification of EEGLAB in the Matlab environment that allows detection of epileptiform ECoG signals in animals after TBI. The code allows import of large EEG or ECoG data records as standard text files and uses the fast Fourier transform as a basis for detection of abnormal events. The software can also be used to monitor injury-induced changes in spectral entropy if required. We hope that the software will be useful for other investigators in the field of traumatic brain injury and will stimulate future advances in quantitative analysis of brain electrical activity after neurological injury or disease.
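
    The band-power thresholding described above can be sketched in a few lines of MATLAB; the sampling rate, epoch length, and the synthetic signal are assumptions, and the listing is not the EEGgui source:

        % Flag 2-s epochs whose delta-band (1-4 Hz) power exceeds the mean by > 4 SD
        % (periodogram/bandpower/hamming require the Signal Processing Toolbox).
        fs   = 500;                                   % assumed sampling rate, Hz
        ecog = randn(600*fs, 1);                      % stand-in for an imported ECoG record
        win  = 2*fs;                                  % 2-second analysis epochs
        nEp  = floor(numel(ecog)/win);
        deltaPow = zeros(1, nEp);
        for k = 1:nEp
            seg = ecog((k-1)*win+1 : k*win);
            [pxx, f] = periodogram(seg, hamming(win), win, fs);   % epoch PSD
            deltaPow(k) = bandpower(pxx, f, [1 4], 'psd');        % delta-band power
        end
        thr    = mean(deltaPow) + 4*std(deltaPow);
        events = find(deltaPow > thr);                % candidate abnormal epochs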

  14. 19 CFR 115.14 - Meeting on program.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 1 2012-04-01 2012-04-01 false Meeting on program. 115.14 Section 115.14 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY CARGO CONTAINER AND ROAD VEHICLE CERTIFICATION PURSUANT TO INTERNATIONAL CUSTOMS CONVENTIONS...

  15. 19 CFR 115.14 - Meeting on program.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 1 2014-04-01 2014-04-01 false Meeting on program. 115.14 Section 115.14 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY CARGO CONTAINER AND ROAD VEHICLE CERTIFICATION PURSUANT TO INTERNATIONAL CUSTOMS CONVENTIONS...

  16. 19 CFR 115.14 - Meeting on program.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 1 2013-04-01 2013-04-01 false Meeting on program. 115.14 Section 115.14 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY CARGO CONTAINER AND ROAD VEHICLE CERTIFICATION PURSUANT TO INTERNATIONAL CUSTOMS CONVENTIONS...

  17. 19 CFR 115.14 - Meeting on program.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 1 2010-04-01 2010-04-01 false Meeting on program. 115.14 Section 115.14 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY CARGO CONTAINER AND ROAD VEHICLE CERTIFICATION PURSUANT TO INTERNATIONAL CUSTOMS CONVENTIONS...

  18. 19 CFR 115.14 - Meeting on program.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 1 2011-04-01 2011-04-01 false Meeting on program. 115.14 Section 115.14 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY CARGO CONTAINER AND ROAD VEHICLE CERTIFICATION PURSUANT TO INTERNATIONAL CUSTOMS CONVENTIONS...

  19. Customer Education: The Silent Revolution.

    ERIC Educational Resources Information Center

    Zemke, Ron

    1985-01-01

    Discusses the marketing value and strategic necessity of planned and promoted customer education. The article examines customer training by the manufacturer as a definite trend in the microcomputer industry. Elements of a good customer training program are described along with suggestions for starting such a program. (CT)

  20. The Dynamo package for tomography and subtomogram averaging: components for MATLAB, GPU computing and EC2 Amazon Web Services

    PubMed Central

    Castaño-Díez, Daniel

    2017-01-01

    Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and longterm software maintenance. PMID:28580909
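
    The MATLAB GPU pathway that Dynamo builds on can be illustrated with generic gpuArray calls (this is plain MATLAB, not Dynamo's own API): a volume is moved to the GPU, Fourier-transformed there, and the result gathered back to the host.

        % FFT-based autocorrelation of a random volume computed on the GPU
        % (requires the Parallel Computing Toolbox and a supported GPU).
        v  = rand(128, 128, 128, 'single');
        vg = gpuArray(v);                              % transfer the volume to GPU memory
        c  = fftshift(real(ifftn(fftn(vg) .* conj(fftn(vg)))));   % autocorrelation on the GPU
        c  = gather(c);                                % bring the result back to the host
        imagesc(c(:, :, 64)); axis image; colorbar     % central slice of the correlation volume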

  1. The Dynamo package for tomography and subtomogram averaging: components for MATLAB, GPU computing and EC2 Amazon Web Services.

    PubMed

    Castaño-Díez, Daniel

    2017-06-01

    Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and longterm software maintenance.

  2. tweezercalib 2.0: Faster version of MatLab package for precise calibration of optical tweezers

    NASA Astrophysics Data System (ADS)

    Hansen, Poul Martin; Tolić-Nørrelykke, Iva Marija; Flyvbjerg, Henrik; Berg-Sørensen, Kirstine

    2006-03-01

    We present a vectorized version of the MatLab (MathWorks Inc.) package tweezercalib for calibration of optical tweezers with precision. The calibration is based on the power spectrum of the Brownian motion of a dielectric bead trapped in the tweezers. Precision is achieved by accounting for a number of factors that affect this power spectrum, as described in vs. 1 of the package [I.M. Tolić-Nørrelykke, K. Berg-Sørensen, H. Flyvbjerg, Matlab program for precision calibration of optical tweezers, Comput. Phys. Comm. 159 (2004) 225-240]. The graphical user interface allows the user to include or leave out each of these factors. Several "health tests" are applied to the experimental data during calibration, and test results are displayed graphically. Thus, the user can easily see whether the data comply with the theory used for their interpretation. Final calibration results are given with statistical errors and covariance matrix. New version program summary Title of program: tweezercalib Catalogue identifier: ADTV_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADTV_v2_0 Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Reference in CPC to previous version: I.M. Tolić-Nørrelykke, K. Berg-Sørensen, H. Flyvbjerg, Comput. Phys. Comm. 159 (2004) 225 Catalogue identifier of previous version: ADTV Does the new version supersede the original program: Yes Computer for which the program is designed and others on which it has been tested: General computer running MatLab (Mathworks Inc.) Operating systems under which the program has been tested: Windows2000, Windows-XP, Linux Programming language used: MatLab (Mathworks Inc.), standard license Memory required to execute with typical data: Of order four times the size of the data file High speed storage required: none No. of lines in distributed program, including test data, etc.: 135 989 No. of bytes in distributed program, including test data, etc.: 1 527 611 Distribution format: tar.gz Nature of physical problem: Calibrate optical tweezers with precision by fitting theory to experimental power spectrum of position of bead doing Brownian motion in incompressible fluid, possibly near microscope cover slip, while trapped in optical tweezers. Thereby determine spring constant of optical trap and conversion factor for arbitrary-units-to-nanometers for detection system. Method of solution: Elimination of cross-talk between quadrant photo-diode's output channels for positions (optional). Check that distribution of recorded positions agrees with Boltzmann distribution of bead in harmonic trap. Data compression and noise reduction by blocking method applied to power spectrum. Full accounting for hydrodynamic effects: Frequency-dependent drag force and interaction with nearby cover slip (optional). Full accounting for electronic filters (optional), for "virtual filtering" caused by detection system (optional). Full accounting for aliasing caused by finite sampling rate (optional). Standard non-linear least-squares fitting. Statistical support for fit is given, with several plots facilitating inspection of consistency and quality of data and fit. Summary of revisions: A faster fitting routine, adapted from [J. Nocedal, Y.x. Yuan, Combining trust region and line search techniques, Technical Report OTC 98/04, Optimization Technology Center, 1998; W.H. Press, B.P. Flannery, S.A. Teukolsky, W.T. Vetterling, Numerical Recipes. The Art of Scientific Computing, Cambridge University Press, Cambridge, 1986], is applied.
It uses fewer function evaluations, and the remaining function evaluations have been vectorized. Calls to routines in Toolboxes not included with a standard MatLab license have been replaced by calls to routines that are included in the present package. Fitting parameters are rescaled to ensure that they are all of roughly the same size (of order 1) while being fitted. Generally, the program package has been updated to comply with MatLab, vs. 7.0, and optimized for speed. Restrictions on the complexity of the problem: Data should be positions of bead doing Brownian motion while held by optical tweezers. For high precision in final results, data should be time series measured over a long time, with sufficiently high experimental sampling rate: The sampling rate should be well above the characteristic frequency of the trap, the so-called corner frequency. Thus, the sampling frequency should typically be larger than 10 kHz. The Fast Fourier Transform used works optimally when the time series contain 2 data points, and long measurement time is obtained with n>12-15. Finally, the optics should be set to ensure a harmonic trapping potential in the range of positions visited by the bead. The fitting procedure checks for harmonic potential. Typical running time: Seconds Unusual features of the program: None References: The theoretical underpinnings for the procedure are found in [K. Berg-Sørensen, H. Flyvbjerg, Power spectrum analysis for optical tweezers, Rev. Sci. Ins. 75 (2004) 594-612].
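
    For orientation, the simplest (unfiltered, non-aliased) theory being fitted is a Lorentzian power spectrum, P(f) = D / (2*pi^2*(fc^2 + f^2)), whose corner frequency fc and diffusion coefficient D give the trap stiffness as kappa = 2*pi*gamma*fc, with gamma the Stokes drag. The following is only a minimal illustrative sketch of that core fit, not the tweezercalib code itself; x is an assumed position time series (arbitrary units) sampled at fsamp, and lsqcurvefit requires the Optimization Toolbox.

        % Minimal Lorentzian fit to a block-averaged power spectrum (illustrative sketch only).
        % x: bead position time series [arb. units], fsamp: sampling rate [Hz] (assumed inputs).
        N    = numel(x);
        xf   = fft(x - mean(x));
        P    = abs(xf(2:floor(N/2))).^2 / (fsamp*N);     % periodogram estimate (DC excluded)
        f    = (1:floor(N/2)-1)' * fsamp / N;            % frequency axis [Hz]
        nblk = 500;                                      % points per block (noise reduction)
        m    = floor(numel(f)/nblk);
        fb   = mean(reshape(f(1:m*nblk), nblk, m))';     % block-averaged frequencies
        Pb   = mean(reshape(P(1:m*nblk), nblk, m))';     % block-averaged spectrum
        lor  = @(p, f) p(1) ./ (2*pi^2*(p(2)^2 + f.^2)); % p = [D, fc]: Lorentzian model
        p0   = [2*pi^2*500^2*Pb(1), 500];                % rough initial guess, fc ~ 500 Hz
        phat = lsqcurvefit(lor, p0, fb, Pb);
        fprintf('Corner frequency fc = %.1f Hz\n', phat(2));
        % Trap stiffness then follows as kappa = 2*pi*gamma*fc, gamma = 6*pi*eta*r (Stokes drag).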

  3. A Comparison of Four Software Programs for Implementing Decision Analytic Cost-Effectiveness Models.

    PubMed

    Hollman, Chase; Paulden, Mike; Pechlivanoglou, Petros; McCabe, Christopher

    2017-08-01

    The volume and technical complexity of both academic and commercial research using decision analytic modelling has increased rapidly over the last two decades. The range of software programs used for their implementation has also increased, but it remains true that a small number of programs account for the vast majority of cost-effectiveness modelling work. We report a comparison of four software programs: TreeAge Pro, Microsoft Excel, R and MATLAB. Our focus is on software commonly used for building Markov models and decision trees to conduct cohort simulations, given their predominance in the published literature around cost-effectiveness modelling. Our comparison uses three qualitative criteria as proposed by Eddy et al.: "transparency and validation", "learning curve" and "capability". In addition, we introduce the quantitative criterion of processing speed. We also consider the cost of each program to academic users and commercial users. We rank the programs based on each of these criteria. We find that, whilst Microsoft Excel and TreeAge Pro are good programs for educational purposes and for producing the types of analyses typically required by health technology assessment agencies, the efficiency and transparency advantages of programming languages such as MATLAB and R become increasingly valuable when more complex analyses are required.
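
    To make the comparison concrete, the cohort-level Markov models discussed here reduce, in a scripting language such as MATLAB, to repeated multiplication of a state-occupancy vector by a transition-probability matrix while accumulating discounted costs and QALYs. Below is a minimal, purely hypothetical three-state sketch; all transition probabilities, costs, and utilities are invented for illustration and are not taken from the paper.

        % Hypothetical 3-state Markov cohort model (healthy, sick, dead); illustrative only.
        P = [0.85 0.10 0.05;         % annual transition probabilities (rows sum to 1)
             0.00 0.70 0.30;
             0.00 0.00 1.00];
        stateCost = [500; 4000; 0];  % annual cost per state (assumed values)
        stateQALY = [0.90; 0.60; 0]; % annual utility per state (assumed values)
        disc = 0.035;                % annual discount rate
        occ  = [1 0 0];              % whole cohort starts in the healthy state
        cost = 0; qaly = 0;
        for t = 1:40                               % 40 annual cycles
            occ  = occ * P;                        % advance the cohort one cycle
            df   = 1/(1+disc)^t;                   % discount factor
            cost = cost + df * (occ * stateCost);  % accumulate discounted cost
            qaly = qaly + df * (occ * stateQALY);  % accumulate discounted QALYs
        end
        fprintf('Discounted cost: %.0f, discounted QALYs: %.2f\n', cost, qaly);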

  4. Extreme Programming in a Research Environment

    NASA Technical Reports Server (NTRS)

    Wood, William A.; Kleb, William L.

    2002-01-01

    This article explores the applicability of Extreme Programming in a scientific research context. The cultural environment at a government research center differs from the customer-centric business view. The chief theoretical difficulty lies in defining the customer to developer relationship. Specifically, can Extreme Programming be utilized when the developer and customer are the same person? Eight of Extreme Programming's 12 practices are perceived to be incompatible with the existing research culture. Further, six of the nine 'environments that I know don't do well with XP' apply. A pilot project explores the use of Extreme Programming in scientific research. The applicability issues are addressed and it is concluded that Extreme Programming can function successfully in situations for which it appears to be ill-suited. A strong discipline for mentally separating the customer and developer roles is found to be key for applying Extreme Programming in a field that lacks a clear distinction between the customer and the developer.

  5. 78 FR 53466 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-29

    ... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image System (DIS) and Simplified Entry (SE); Correction AGENCY: U.S. Customs and Border Protection, Department...

  6. A time scheduling model of logistics service supply chain based on the customer order decoupling point: a perspective from the constant service operation time.

    PubMed

    Liu, Weihua; Yang, Yi; Xu, Haitao; Liu, Xiaoyan; Wang, Yijia; Liang, Zhicheng

    2014-01-01

    In mass customization logistics service, reasonable scheduling of the logistics service supply chain (LSSC), especially time scheduling, is beneficial to increasing its competitiveness. Therefore, the effect of a customer order decoupling point (CODP) on time scheduling performance should be considered. To minimize the total order operation cost of the LSSC, minimize the difference between the expected and actual times of completing the service orders, and maximize the satisfaction of functional logistics service providers, this study establishes an LSSC time scheduling model based on the CODP. Matlab 7.8 software is used in the numerical analysis of a specific example. Results show that the order completion time of the LSSC can be delayed or brought forward, but not infinitely advanced or infinitely delayed. Optimal comprehensive performance can be obtained if the expected order completion time is appropriately delayed. The increase in supply chain comprehensive performance caused by an increase in the relationship coefficient of the logistics service integrator (LSI) is limited. The relative degree of concern of the LSI for cost and service delivery punctuality leads not only to changes in the CODP but also to changes in the scheduling performance of the LSSC.

  7. A Time Scheduling Model of Logistics Service Supply Chain Based on the Customer Order Decoupling Point: A Perspective from the Constant Service Operation Time

    PubMed Central

    Yang, Yi; Xu, Haitao; Liu, Xiaoyan; Wang, Yijia; Liang, Zhicheng

    2014-01-01

    In mass customization logistics service, reasonable scheduling of the logistics service supply chain (LSSC), especially time scheduling, is beneficial to increasing its competitiveness. Therefore, the effect of a customer order decoupling point (CODP) on time scheduling performance should be considered. To minimize the total order operation cost of the LSSC, minimize the difference between the expected and actual times of completing the service orders, and maximize the satisfaction of functional logistics service providers, this study establishes an LSSC time scheduling model based on the CODP. Matlab 7.8 software is used in the numerical analysis of a specific example. Results show that the order completion time of the LSSC can be delayed or brought forward, but not infinitely advanced or infinitely delayed. Optimal comprehensive performance can be obtained if the expected order completion time is appropriately delayed. The increase in supply chain comprehensive performance caused by an increase in the relationship coefficient of the logistics service integrator (LSI) is limited. The relative degree of concern of the LSI for cost and service delivery punctuality leads not only to changes in the CODP but also to changes in the scheduling performance of the LSSC. PMID:24715818

  8. PROPOSAL FOR A SIMPLE AND EFFICIENT MONTHLY QUALITY MANAGEMENT PROGRAM ASSESSING THE CONSISTENCY OF ROBOTIC IMAGE-GUIDED SMALL ANIMAL RADIATION SYSTEMS

    PubMed Central

    Brodin, N. Patrik; Guha, Chandan; Tomé, Wolfgang A.

    2015-01-01

    Modern pre-clinical radiation therapy (RT) research requires high precision and accurate dosimetry to facilitate the translation of research findings into clinical practice. Several systems are available that provide precise delivery and on-board imaging capabilities, highlighting the need for a quality management program (QMP) to ensure consistent and accurate radiation dose delivery. An ongoing, simple, and efficient QMP for image-guided robotic small animal irradiators used in pre-clinical RT research is described. Protocols were developed and implemented to assess the dose output constancy (based on the AAPM TG-61 protocol), cone-beam computed tomography (CBCT) image quality and object representation accuracy (using a custom-designed imaging phantom), CBCT-guided target localization accuracy and consistency of the CBCT-based dose calculation. To facilitate an efficient read-out and limit the user dependence of the QMP data analysis, a semi-automatic image analysis and data representation program was developed using the technical computing software MATLAB. The results of the first six months' experience using the suggested QMP for a Small Animal Radiation Research Platform (SARRP) are presented, with data collected on a bi-monthly basis. The dosimetric output constancy was established to be within ±1%, the consistency of the image resolution was within ±0.2 mm, the accuracy of CBCT-guided target localization was within ±0.5 mm, and dose calculation consistency was within ±2 s (±3%) per treatment beam. Based on these results, this simple quality assurance program allows for the detection of inconsistencies in dosimetric or imaging parameters that are beyond the acceptable variability for a reliable and accurate pre-clinical RT system, on a monthly or bi-monthly basis. PMID:26425981
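
    As a hedged illustration of the kind of semi-automatic analysis such a MATLAB program performs (not the authors' code; the baseline, readings, and tolerance below are invented placeholders), a monthly output-constancy check reduces to computing percent deviations from a reference value and flagging anything outside tolerance:

        % Illustrative output-constancy check; all numbers are hypothetical placeholders.
        baseline  = 2.145;                       % reference dose rate from commissioning [Gy/min]
        readings  = [2.150 2.139 2.160 2.148];   % monthly TG-61-style measurements [Gy/min]
        tolerance = 1.0;                         % acceptable deviation [%]
        devPct = 100 * (readings - baseline) / baseline;
        flag   = abs(devPct) > tolerance;        % true where the tolerance is exceeded
        status = repmat({'PASS'}, size(readings));
        status(flag) = {'FAIL'};
        for k = 1:numel(readings)
            fprintf('Month %d: deviation %+.2f%% -> %s\n', k, devPct(k), status{k});
        end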

  9. Proposal for a Simple and Efficient Monthly Quality Management Program Assessing the Consistency of Robotic Image-Guided Small Animal Radiation Systems.

    PubMed

    Brodin, N Patrik; Guha, Chandan; Tomé, Wolfgang A

    2015-11-01

    Modern pre-clinical radiation therapy (RT) research requires high precision and accurate dosimetry to facilitate the translation of research findings into clinical practice. Several systems are available that provide precise delivery and on-board imaging capabilities, highlighting the need for a quality management program (QMP) to ensure consistent and accurate radiation dose delivery. An ongoing, simple, and efficient QMP for image-guided robotic small animal irradiators used in pre-clinical RT research is described. Protocols were developed and implemented to assess the dose output constancy (based on the AAPM TG-61 protocol), cone-beam computed tomography (CBCT) image quality and object representation accuracy (using a custom-designed imaging phantom), CBCT-guided target localization accuracy and consistency of the CBCT-based dose calculation. To facilitate an efficient read-out and limit the user dependence of the QMP data analysis, a semi-automatic image analysis and data representation program was developed using the technical computing software MATLAB. The results of the first 6-mo experience using the suggested QMP for a Small Animal Radiation Research Platform (SARRP) are presented, with data collected on a bi-monthly basis. The dosimetric output constancy was established to be within ±1 %, the consistency of the image resolution was within ±0.2 mm, the accuracy of CBCT-guided target localization was within ±0.5 mm, and dose calculation consistency was within ±2 s (±3%) per treatment beam. Based on these results, this simple quality assurance program allows for the detection of inconsistencies in dosimetric or imaging parameters that are beyond the acceptable variability for a reliable and accurate pre-clinical RT system, on a monthly or bi-monthly basis.

  10. Enabling On-Demand Database Computing with MIT SuperCloud Database Management System

    DTIC Science & Technology

    2015-09-15

    arc.liv.ac.uk/trac/SGE) provides these services and is independent of programming language (C, Fortran, Java, Matlab, etc.) or parallel programming...a MySQL database to store DNS records. The DNS records are controlled via a simple web service interface that allows records to be created

  11. Automating an integrated spatial data-mining model for landfill site selection

    NASA Astrophysics Data System (ADS)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Aziz, Hamidi Abdul

    2017-10-01

    An integrated programming environment represents a robust approach to building a valid model for landfill site selection. One of the main challenges in an integrated model is the complicated processing and modelling required across the programming stages, together with several limitations. An automation process helps avoid these limitations and improves the interoperability between the integrated programming environments. This work automates a spatial data-mining model for landfill site selection by integrating a spatial programming environment (Python-ArcGIS) with a non-spatial environment (MATLAB). The model was constructed using neural networks and is divided into nine stages distributed between MATLAB and Python-ArcGIS. A case study was taken from the northern part of Peninsular Malaysia. Twenty-two criteria were selected as input data and used to build the training and testing datasets. The outcomes show a high performance accuracy of 98.2% on the testing dataset using 10-fold cross-validation. The automated spatial data-mining model provides a solid platform for decision makers performing landfill site selection and planning operations on a regional scale.
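
    A hedged sketch of what the MATLAB side of such a pipeline could look like (not the authors' code): training a small classification network on a criteria matrix X (samples by 22 criteria) with binary suitability labels y, and estimating accuracy by 10-fold cross-validation. It assumes the Deep Learning and Statistics toolboxes; X and y are placeholders.

        % Illustrative 10-fold cross-validation of a small neural-network classifier.
        % X: nSamples-by-22 matrix of criteria values, y: nSamples-by-1 binary labels (assumed inputs).
        cv  = cvpartition(numel(y), 'KFold', 10);
        acc = zeros(cv.NumTestSets, 1);
        for k = 1:cv.NumTestSets
            trIdx = training(cv, k);  teIdx = test(cv, k);
            net = patternnet(10);                         % one hidden layer of 10 neurons
            net.trainParam.showWindow = false;            % suppress the training GUI
            net = train(net, X(trIdx,:)', y(trIdx)');     % patternnet expects samples as columns
            yhat = net(X(teIdx,:)') > 0.5;                % threshold the network output
            acc(k) = mean(yhat(:) == y(teIdx));
        end
        fprintf('Mean 10-fold accuracy: %.1f%%\n', 100*mean(acc));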

  12. BioSigPlot: an opensource tool for the visualization of multi-channel biomedical signals with Matlab.

    PubMed

    Boudet, Samuel; Peyrodie, Laurent; Gallois, Philippe; de l'Aulnoit, Denis Houzé; Cao, Hua; Forzy, Gérard

    2013-01-01

    This paper presents Matlab-based software (MathWorks Inc.) called BioSigPlot for the visualization of multi-channel biomedical signals, particularly EEG. The tool is designed for researchers in both engineering and medicine who must collaborate to visualize and analyze signals. It aims to provide a highly customizable interface for signal processing experimentation, plotting several kinds of signals while integrating the tools physicians commonly use. The main advantages compared with other existing programs are multi-dataset display, synchronization with video, and online processing. In addition, the program uses object-oriented programming, so the interface can be controlled through both graphical controls and command lines. It can be used as an EEGLAB plug-in but, since it is not limited to EEG, it is distributed separately. BioSigPlot is distributed free of charge (http://biosigplot.sourceforge.net), under the terms of the GNU Public License for non-commercial use and open-source development.
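
    As a minimal illustration of the kind of multi-channel display such tools provide (a generic sketch, not BioSigPlot's API), stacked traces can be drawn in plain MATLAB by offsetting each channel vertically; data and fs are assumed inputs.

        % Illustrative stacked multi-channel plot; data: nSamples-by-nChannels, fs: sampling rate [Hz].
        [nSamples, nChannels] = size(data);
        t      = (0:nSamples-1) / fs;                 % time axis [s]
        offset = 4 * median(std(data));               % vertical spacing between traces
        figure; hold on;
        for ch = 1:nChannels
            plot(t, data(:,ch) + (nChannels - ch)*offset, 'k');
        end
        yticks((0:nChannels-1)*offset);
        yticklabels(compose('ch %d', nChannels:-1:1)); % top trace is channel 1
        xlabel('Time (s)'); ylabel('Channel');
        hold off;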

  13. A high throughput MATLAB program for automated force-curve processing using the AdG polymer model.

    PubMed

    O'Connor, Samantha; Gaddis, Rebecca; Anderson, Evan; Camesano, Terri A; Burnham, Nancy A

    2015-02-01

    Research in understanding biofilm formation depends on accurate and representative measurements of the steric forces related to the polymer brush on bacterial surfaces. A MATLAB program to analyze force curves from an AFM efficiently, accurately, and with minimal user bias has been developed. The analysis is based on a modified version of the Alexander and de Gennes (AdG) polymer model, which is a function of equilibrium polymer brush length, probe radius, temperature, separation distance, and a density variable. Automating the analysis reduces the time required to process 100 force curves from several days to less than 2 min. The use of this program to crop and fit force curves to the AdG model will allow researchers to ensure proper processing of large amounts of experimental data and reduce the time required for analysis and comparison of data, thereby enabling higher quality results in a shorter period of time. Copyright © 2014 Elsevier B.V. All rights reserved.
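
    For orientation, a commonly used sphere-flat form of the AdG steric force is F(D) = (16*pi*kB*T*R*L0*Gamma^(3/2)/35) * [7*(L0/D)^(5/4) + 5*(D/L0)^(7/4) - 12] for D < L0, with brush length L0 and grafting density Gamma as fit parameters. The following is a hedged sketch of fitting that expression to one cropped force curve with lsqcurvefit, not the published program; D (separation, m) and F (force, N) are assumed inputs and the probe radius is an example value.

        % Illustrative AdG fit to a single AFM force curve; D [m] and F [N] are assumed inputs.
        kB = 1.380649e-23;      % Boltzmann constant [J/K]
        T  = 298;               % temperature [K]
        R  = 2e-8;              % probe radius [m] (assumed value)
        % p = [L0, Gamma]: equilibrium brush length [m] and grafting density [m^-2]
        adg = @(p,D) (16*pi*kB*T*R*p(1)*p(2)^1.5/35) .* ...
                     (7*(p(1)./D).^(5/4) + 5*(D./p(1)).^(7/4) - 12);
        p0   = [100e-9, 1e16];                       % initial guess for [L0, Gamma]
        lb   = [1e-9, 1e12];  ub = [1e-6, 1e20];     % loose physical bounds
        mask = D < p0(1);                            % fit only the brush-contact region (simplified crop)
        phat = lsqcurvefit(adg, p0, D(mask), F(mask), lb, ub);
        fprintf('Brush length L0 = %.0f nm, grafting density = %.2e m^-2\n', phat(1)*1e9, phat(2));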

  14. Is NIPARS Working as Advertised? An Analysis of NIPARS Program Customer Service

    DTIC Science & Technology

    1992-09-01

    AD-A259 733. AFIT/GLM/LSM/92S-17. IS NIPARS WORKING AS ADVERTISED? AN ANALYSIS OF NIPARS PROGRAM CUSTOMER SERVICE. THESIS...IS NIPARS WORKING AS ADVERTISED? AN ANALYSIS OF NIPARS PROGRAM CUSTOMER SERVICE. THESIS Presented to the...measures. IS NIPARS WORKING AS ADVERTISED? AN ANALYSIS OF NIPARS PROGRAM CUSTOMER SERVICE. I. Introduction 1.1 Overview Foreign policy must start with

  15. Sustained Energy Savings Achieved through Successful Industrial Customer Interaction with Ratepayer Programs: Case Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Amelie; Hedman, Bruce; Taylor, Robert P.

    Many states have implemented ratepayer-funded programs to acquire energy efficiency as a predictable and reliable resource for meeting existing and future energy demand. These programs have become a fixture in many U.S. electricity and natural gas markets as they help postpone or eliminate the need for expensive generation and transmission investments. Industrial energy efficiency (IEE) is an energy efficiency resource that is not only a low cost option for many of these efficiency programs, but offers productivity and competitive benefits to manufacturers as it reduces their energy costs. However, some industrial customers are less enthusiastic about participating in these programs. IEE ratepayer programs suffer low participation by industries across many states today despite a continual increase in energy efficiency program spending across all types of customers, and significant energy efficiency funds can often go unused for industrial customers. This paper provides four detailed case studies of companies that benefited from participation in their utility's energy efficiency program offerings and highlights the business value brought to them by participation in these programs. The paper is designed both for rate-payer efficiency program administrators interested in improving the attractiveness and effectiveness of industrial efficiency programs for their industrial customers and for industrial customers interested in maximizing the value of participating in efficiency programs.

  16. Extreme Programming: Maestro Style

    NASA Technical Reports Server (NTRS)

    Norris, Jeffrey; Fox, Jason; Rabe, Kenneth; Shu, I-Hsiang; Powell, Mark

    2009-01-01

    "Extreme Programming: Maestro Style" is the name of a computer programming methodology that has evolved as a custom version of a methodology, called extreme programming that has been practiced in the software industry since the late 1990s. The name of this version reflects its origin in the work of the Maestro team at NASA's Jet Propulsion Laboratory that develops software for Mars exploration missions. Extreme programming is oriented toward agile development of software resting on values of simplicity, communication, testing, and aggressiveness. Extreme programming involves use of methods of rapidly building and disseminating institutional knowledge among members of a computer-programming team to give all the members a shared view that matches the view of the customers for whom the software system is to be developed. Extreme programming includes frequent planning by programmers in collaboration with customers, continually examining and rewriting code in striving for the simplest workable software designs, a system metaphor (basically, an abstraction of the system that provides easy-to-remember software-naming conventions and insight into the architecture of the system), programmers working in pairs, adherence to a set of coding standards, collaboration of customers and programmers, frequent verbal communication, frequent releases of software in small increments of development, repeated testing of the developmental software by both programmers and customers, and continuous interaction between the team and the customers. The environment in which the Maestro team works requires the team to quickly adapt to changing needs of its customers. In addition, the team cannot afford to accept unnecessary development risk. Extreme programming enables the Maestro team to remain agile and provide high-quality software and service to its customers. However, several factors in the Maestro environment have made it necessary to modify some of the conventional extreme-programming practices. The single most influential of these factors is that continuous interaction between customers and programmers is not feasible.

  17. PLDAPS: A Hardware Architecture and Software Toolbox for Neurophysiology Requiring Complex Visual Stimuli and Online Behavioral Control.

    PubMed

    Eastman, Kyler M; Huk, Alexander C

    2012-01-01

    Neurophysiological studies in awake, behaving primates (both human and non-human) have focused with increasing scrutiny on the temporal relationship between neural signals and behaviors. Consequently, laboratories are often faced with the problem of developing experimental equipment that can support data recording with high temporal precision and also be flexible enough to accommodate a wide variety of experimental paradigms. To this end, we have developed a MATLAB toolbox that integrates several modern pieces of equipment, but still grants experimenters the flexibility of a high-level programming language. Our toolbox takes advantage of three popular and powerful technologies: the Plexon apparatus for neurophysiological recordings (Plexon, Inc., Dallas, TX, USA), a Datapixx peripheral (Vpixx Technologies, Saint-Bruno, QC, Canada) for control of analog, digital, and video input-output signals, and the Psychtoolbox MATLAB toolbox for stimulus generation (Brainard, 1997; Pelli, 1997; Kleiner et al., 2007). The PLDAPS ("Platypus") system is designed to support the study of the visual systems of awake, behaving primates during multi-electrode neurophysiological recordings, but can be easily applied to other related domains. Despite its wide range of capabilities and support for cutting-edge video displays and neural recording systems, the PLDAPS system is simple enough for someone with basic MATLAB programming skills to design their own experiments.

  18. Making It Count: Understanding the Value of Energy Efficiency Financing Programs Funded by Utility Customers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kramer, Chris; Fadrhonc, Emily Martin; Goldman, Charles

    Utility customer-supported financing programs are receiving increased attention as a strategy for achieving energy saving goals. Rationales for using utility customer funds to support financing initiatives

  19. 78 FR 66094 - Self-Regulatory Organizations: Miami International Securities Exchange LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-04

    ... Priority Customer Rebate Program (the ``Program'') until November 30, 2013.\\3\\ The Program currently... Customer \\6\\ order transmitted by that Member which is executed on the Exchange in all multiply-listed... thresholds are calculated based on the customer average daily volume over the course of the month. Volume...

  20. [Effects of a Customized Birth Control Program for Married Immigrant Postpartum Mothers].

    PubMed

    Kim, So Young; Choi, So Young

    2016-12-01

    This study was conducted to develop a customized birth control program and identify its effects on attitude, subjective norm, behavioral control, intention, and behavior of contraception among immigrant postpartum mothers. In this experimental study, Vietnamese, Filipino or Cambodian married immigrant postpartum mothers were recruited. They were assigned to the experiment group (n=21) or control group (n=21). The customized birth control program was provided to the experimental group for 4 weeks. The experimental group showed a significant increase in the score of attitude, subjective norm, behavioral control, intention, and behavior of contraception. Findings in this study indicate that the customized postpartum birth control program, a systematic and integrative intervention program composed of customized health education, counseling and telephone monitoring, is able to provide effective planning for postpartum health promotion and birth control behavior practice in married immigrant women.

  1. Customer response to day-ahead wholesale market electricity prices: Case study of RTP program experience in New York

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldman, C.; Hopper, N.; Sezgen, O.

    2004-07-01

    There is growing interest in policies, programs and tariffs that encourage customer loads to provide demand response (DR) to help discipline wholesale electricity markets. Proposals at the retail level range from eliminating fixed rate tariffs as the default service for some or all customer groups to reinstituting utility-sponsored load management programs with market-based inducements to curtail. Alternative rate designs include time-of-use (TOU), day-ahead real-time pricing (RTP), critical peak pricing, and even pricing usage at real-time market balancing prices. Some Independent System Operators (ISOs) have implemented their own DR programs whereby load curtailment capabilities are treated as a system resource and are paid an equivalent value. The resulting load reductions from these tariffs and programs provide a variety of benefits, including limiting the ability of suppliers to increase spot and long-term market-clearing prices above competitive levels (Neenan et al., 2002; Borenstein, 2002; Ruff, 2002). Unfortunately, there is little information in the public domain to characterize and quantify how customers actually respond to these alternative dynamic pricing schemes. A few empirical studies of large customer RTP response have shown modest results for most customers, with a few very price-responsive customers providing most of the aggregate response (Herriges et al., 1993; Schwarz et al., 2002). However, these studies examined response to voluntary, two-part RTP programs implemented by utilities in states without retail competition. Furthermore, the researchers had limited information on customer characteristics so they were unable to identify the drivers to price response. In the absence of a compelling characterization of why customers join RTP programs and how they respond to prices, many initiatives to modernize retail electricity rates seem to be stymied.

  2. Focus on Customer Service. Service Management: How to Plan for it Rather Than Hope for It [and] Learning to Say "Yes": A Customer Service Program for Library Staff [and] Maintaining Momentum in a Quality Improvement Process.

    ERIC Educational Resources Information Center

    Brewer, Julie; And Others

    1995-01-01

    Presents three articles that discuss customer service in libraries, with a focus on planning for service management, a customer service program for library staff, and a quality improvement process. Highlights include developing and implementing service strategies, dealing with requests, redefining work relationships, coworkers as customers,…

  3. E-Learning Technologies: Employing Matlab Web Server to Facilitate the Education of Mathematical Programming

    ERIC Educational Resources Information Center

    Karagiannis, P.; Markelis, I.; Paparrizos, K.; Samaras, N.; Sifaleras, A.

    2006-01-01

    This paper presents new web-based educational software (webNetPro) for "Linear Network Programming." It includes many algorithms for "Network Optimization" problems, such as shortest path problems, minimum spanning tree problems, maximum flow problems and other search algorithms. Therefore, webNetPro can assist the teaching process of courses such…

  4. Stan: A Probabilistic Programming Language for Bayesian Inference and Optimization

    ERIC Educational Resources Information Center

    Gelman, Andrew; Lee, Daniel; Guo, Jiqiang

    2015-01-01

    Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models, can be called from the command line, R, Python, Matlab, or Julia, and has great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from users' and developers'…

  5. 78 FR 42138 - Self-Regulatory Organizations; Miami International Securities Exchange LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-15

    ... Effectiveness of Proposed Rule Change To Adopt a Priority Customer Rebate Program July 9, 2013. Pursuant to... of the Proposed Rule Change The Exchange is filing a proposal to adopt a Priority Customer Rebate... Priority Customer Rebate Program (the ``Program'') for the period beginning July 1, 2013 and ending...

  6. ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.

    PubMed

    Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain actual usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing a single spectrum or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus should run on any platform providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request. Copyright © 2011 Elsevier Inc. All rights reserved.
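
    As a small, hedged example of the downstream MATLAB analysis that such a CSV export enables (the file name and column layout below are hypothetical placeholders, not ImatraNMR's actual output schema):

        % Illustrative post-processing of a batch-integration CSV export in MATLAB.
        % 'integrals.csv' and its columns are hypothetical: one row per spectrum, first column an ID.
        T      = readtable('integrals.csv');
        signal = T.Properties.VariableNames(2:end);   % signal names from the header row
        vals   = T{:, 2:end};                         % integral values as a numeric matrix
        plot(vals, '-o');
        legend(signal, 'Interpreter', 'none');
        xlabel('Spectrum index'); ylabel('Integral (a.u.)');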

  7. A customer-friendly Space Station

    NASA Technical Reports Server (NTRS)

    Pivirotto, D. S.

    1984-01-01

    This paper discusses the relationship of customers to the Space Station Program currently being defined by NASA. Emphasis is on definition of the Program such that the Space Station will be conducive to use by customers, that is by people who utilize the services provided by the Space Station and its associated platforms and vehicles. Potential types of customers are identified. Scenarios are developed for ways in which different types of customers can utilize the Space Station. Both management and technical issues involved in making the Station 'customer friendly' are discussed.

  8. MATLAB for laser speckle contrast analysis (LASCA): a practice-based approach

    NASA Astrophysics Data System (ADS)

    Postnikov, Eugene B.; Tsoy, Maria O.; Postnov, Dmitry E.

    2018-04-01

    Laser Speckle Contrast Analysis (LASCA) is one of the most powerful modern methods for revealing blood dynamics. The experimental design and theory for this method are well established, and the computational recipe is often regarded as trivial. However, the achieved performance and spatial resolution may differ considerably between implementations. We provide a mini-review of known approaches to spatial laser speckle contrast data processing and their realization in MATLAB code, with an explicit correspondence to the mathematical representation and a discussion of available implementations. We also present an algorithm based on the 2D Haar wavelet transform, likewise supplied with program code. This new method provides an opportunity to introduce horizontal, vertical and diagonal speckle contrasts; it may be used for processing highly anisotropic images of vascular trees. We provide a comparative analysis of the accuracy of vascular pattern detection and of the processing times, with special attention to the details of the MATLAB procedures used.
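
    The basic spatial LASCA quantity is the local contrast K = sigma/mean computed in a small sliding window over the raw speckle image. A minimal sketch of one of the standard approaches reviewed (not the authors' exact code), assuming I is a raw speckle frame in double precision and the Image Processing Toolbox is available:

        % Illustrative spatial speckle contrast map with a 7x7 sliding window.
        % I: raw speckle image (double), assumed input.
        w       = 7;                                   % window size in pixels
        kernel  = ones(w) / w^2;
        localMu = conv2(I, kernel, 'same');            % local mean
        localSd = stdfilt(I, ones(w));                 % local standard deviation
        K       = localSd ./ max(localMu, eps);        % speckle contrast, guarded against division by zero
        imagesc(K); axis image; colorbar;
        title('Spatial speckle contrast K = \sigma / \langle I \rangle');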

  9. The Customer Comes First: Implementing a Customer Service Program at the University of Minnesota, Twin Cities Libraries

    ERIC Educational Resources Information Center

    Bayer, Jerrie; Llewellyn, Steven

    2011-01-01

    Library customers have more remote information choices than ever before, so we must ensure that when they do come to the library, they experience a welcoming environment, a high standard of service, and receive equitable levels of service across campus. Developing a customer service program was a logical next step to reinforce the ongoing…

  10. Matlab Based LOCO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Portmann, Greg; /LBL, Berkeley; Safranek, James

    The LOCO algorithm has been used by many accelerators around the world. Although the uses for LOCO vary, the most common use has been to find calibration errors and correct the optics functions. The light source community in particular has made extensive use of the LOCO algorithms to tightly control the beta function and coupling. Maintaining high quality beam parameters requires constant attention, so a relatively large effort was put into software development for the LOCO application. The LOCO code was originally written in FORTRAN. This code worked fine but it was somewhat awkward to use. For instance, the FORTRAN code itself did not calculate the model response matrix. It required a separate modeling code such as MAD to calculate the model matrix, after which one manually loaded the data into the LOCO code. As the number of people interested in LOCO grew, it became necessary to make it easier to use. The decision to port LOCO to Matlab was relatively easy: it is best to use a matrix programming language with good graphics capability; Matlab was also being used for high level machine control; and the accelerator modeling code AT [5] was already developed for Matlab. Since LOCO requires collecting and processing a relatively large amount of data, it is very helpful to have the LOCO code compatible with the high level machine control [3]. A number of new features were added while porting the code from FORTRAN and new methods continue to evolve [7][9]. Although Matlab LOCO was written with AT as the underlying tracking code, a mechanism to connect to other modeling codes has been provided.
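
    Conceptually, the fit at the heart of LOCO adjusts model parameters (for example quadrupole gradient errors) until the model orbit response matrix matches the measured one. The sketch below is only an illustration of that idea in generic MATLAB, not the Matlab LOCO code; modelResponseMatrix is a hypothetical user-supplied function (for example built on the AT tracking code), and the data file and the number of gradients are placeholders.

        % Conceptual response-matrix fit; all names here are assumptions, not LOCO's API.
        S     = load('measured_response.mat');          % assumed to contain the measured matrix M
        Mmeas = S.M;
        nQuad = 72;                                     % number of fitted gradients (illustrative)
        p0    = zeros(nQuad, 1);                        % start from zero gradient errors
        resid = @(p) reshape(modelResponseMatrix(p) - Mmeas, [], 1);
        pfit  = lsqnonlin(resid, p0);                   % requires the Optimization Toolbox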

  11. Creating a successful relationship with customers.

    PubMed

    Cotton, L; Sparrow, E

    1998-01-01

    In 1997, several employers commissioned an inpatient survey for a group of businesses that included hospitals in southeast Michigan. Its results indicated that the University of Michigan Health System (UMHS) needed to become more customer-focused. To meet this challenge, UMHS mandated that customer service to its patients and their families should be its first priority. A pilot project in the radiology department's pediatric division was established to recognize and reward employees for outstanding service to customers. The program is now used to reward employees throughout the radiology department, on the assumption that when employees feel special, so will their customers. Management's focus is on employees--they are the health system. The department also invested in employee development, a continuous training program that centers on customer service and teaches tools and skills for better communication. The goal of the development program at UMHS is to exceed the needs of its customers.

  12. State formulating lifeline program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-09-01

    The Board of Public Utilities (BPU) of New Jersey is formulating a lifeline program which would provide low-income and elderly customers with reduced utility rates. It is estimated that 30% of the households in New Jersey will qualify for the program. While the legislation calls for the lowest effective rate of any customer class, each utility would have its own lifeline program because of differing rates among utility companies. Eligibility requirements would be applied statewide. The utilities will fund the new program by restructuring the existing rates for regular customers, in which case lifeline recipients' rates would decrease while regular customers' bills would increase. Eventually, the BPU expects to fund about 10% of the senior citizens' portion of the program with the state's casino gambling revenues.

  13. Customer satisfaction planning and industrial engineering move hospital towards in-house stockless program.

    PubMed

    Burton, R; Mauk, D

    1993-03-01

    By integrating customer satisfaction planning and industrial engineering techniques when examining internal costs and efficiencies, materiel managers are able to better realize what concepts will best meet their customers' needs. Defining your customer(s), applying industrial engineering techniques, completing work sampling studies, itemizing recommendations and benefits to each alternative, performing feasibility and cost-analysis matrixes and utilizing resources through productivity monitoring will get you on the right path toward selecting concepts to use. This article reviews the above procedures as they applied to one hospital's decision-making process to determine whether to incorporate a stockless inventory program. Through an analysis of customer demand, the hospital realized that stockless was the way to go, but not by outsourcing the function--the hospital incorporated an in-house stockless inventory program.

  14. Simulations of pattern dynamics for reaction-diffusion systems via SIMULINK

    PubMed Central

    2014-01-01

    Background Investigation of the nonlinear pattern dynamics of a reaction-diffusion system almost always requires numerical solution of the system’s set of defining differential equations. Traditionally, this would be done by selecting an appropriate differential equation solver from a library of such solvers, then writing computer codes (in a programming language such as C or Matlab) to access the selected solver and display the integrated results as a function of space and time. This “code-based” approach is flexible and powerful, but requires a certain level of programming sophistication. A modern alternative is to use a graphical programming interface such as Simulink to construct a data-flow diagram by assembling and linking appropriate code blocks drawn from a library. The result is a visual representation of the inter-relationships between the state variables whose output can be made completely equivalent to the code-based solution. Results As a tutorial introduction, we first demonstrate application of the Simulink data-flow technique to the classical van der Pol nonlinear oscillator, and compare Matlab and Simulink coding approaches to solving the van der Pol ordinary differential equations. We then show how to introduce space (in one and two dimensions) by solving numerically the partial differential equations for two different reaction-diffusion systems: the well-known Brusselator chemical reactor, and a continuum model for a two-dimensional sheet of human cortex whose neurons are linked by both chemical and electrical (diffusive) synapses. We compare the relative performances of the Matlab and Simulink implementations. Conclusions The pattern simulations by Simulink are in good agreement with theoretical predictions. Compared with traditional coding approaches, the Simulink block-diagram paradigm reduces the time and programming burden required to implement a solution for reaction-diffusion systems of equations. Construction of the block-diagram does not require high-level programming skills, and the graphical interface lends itself to easy modification and use by non-experts. PMID:24725437

  15. Simulations of pattern dynamics for reaction-diffusion systems via SIMULINK.

    PubMed

    Wang, Kaier; Steyn-Ross, Moira L; Steyn-Ross, D Alistair; Wilson, Marcus T; Sleigh, Jamie W; Shiraishi, Yoichi

    2014-04-11

    Investigation of the nonlinear pattern dynamics of a reaction-diffusion system almost always requires numerical solution of the system's set of defining differential equations. Traditionally, this would be done by selecting an appropriate differential equation solver from a library of such solvers, then writing computer codes (in a programming language such as C or Matlab) to access the selected solver and display the integrated results as a function of space and time. This "code-based" approach is flexible and powerful, but requires a certain level of programming sophistication. A modern alternative is to use a graphical programming interface such as Simulink to construct a data-flow diagram by assembling and linking appropriate code blocks drawn from a library. The result is a visual representation of the inter-relationships between the state variables whose output can be made completely equivalent to the code-based solution. As a tutorial introduction, we first demonstrate application of the Simulink data-flow technique to the classical van der Pol nonlinear oscillator, and compare Matlab and Simulink coding approaches to solving the van der Pol ordinary differential equations. We then show how to introduce space (in one and two dimensions) by solving numerically the partial differential equations for two different reaction-diffusion systems: the well-known Brusselator chemical reactor, and a continuum model for a two-dimensional sheet of human cortex whose neurons are linked by both chemical and electrical (diffusive) synapses. We compare the relative performances of the Matlab and Simulink implementations. The pattern simulations by Simulink are in good agreement with theoretical predictions. Compared with traditional coding approaches, the Simulink block-diagram paradigm reduces the time and programming burden required to implement a solution for reaction-diffusion systems of equations. Construction of the block-diagram does not require high-level programming skills, and the graphical interface lends itself to easy modification and use by non-experts.
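
    For readers who want the code-based baseline referred to here, the van der Pol oscillator is a two-equation ODE system that MATLAB's stock solvers handle directly; the value mu = 1 below is an arbitrary illustrative choice. The equivalent Simulink model simply wires two integrator blocks and a gain/product network implementing the same right-hand side.

        % Code-based solution of the van der Pol oscillator with ode45 (illustrative).
        mu  = 1;                                         % damping parameter (arbitrary choice)
        vdp = @(t, y) [y(2); mu*(1 - y(1)^2)*y(2) - y(1)];
        [t, y] = ode45(vdp, [0 20], [2; 0]);             % integrate from t = 0 to 20, y(0) = [2; 0]
        plot(t, y(:,1));
        xlabel('t'); ylabel('y_1(t)');
        title('van der Pol oscillator, \mu = 1');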

  16. Customized Training Marketing Plan.

    ERIC Educational Resources Information Center

    Lay, Ted

    This report outlines Oregon's Lane Community College's (LCC's) plan for marketing its customized training program for business, community organizations, public agencies, and their employees. Following a mission statement for the customized training program, a brief analysis is provided of the economic environment; of competition from educational…

  17. Using MountainsMap (Digital Surf) surface analysis software as an analysis tool for x-ray mirror optical metrology data

    NASA Astrophysics Data System (ADS)

    Duffy, Alan; Yates, Brian; Takacs, Peter

    2012-09-01

    The Optical Metrology Facility at the Canadian Light Source (CLS) has recently purchased the MountainsMap surface analysis software from Digital Surf, and we report here our experiences with this package and its usefulness as a tool for examining metrology data of synchrotron x-ray mirrors. The package has a number of operators that are useful for determining surface roughness and slope error, including compliance with ISO standards (viz. ISO 4287 and ISO 25178). The software is extensible with MATLAB scripts, either by loading an m-file or through a user-written script. This makes it possible to apply a custom operator to measurement data sets. Using this feature we have applied the simple six-line MATLAB code for the direct least-squares fitting of ellipses developed by Fitzgibbon et al. to investigate the residual slope error of elliptical mirrors upon removal of the best-fit ellipse. The software includes support for many instruments (e.g. Zygo, MicroMap, etc.) and can import ASCII data (e.g. LTP data). The stitching module allows the user to assemble overlapping images, and we report on our experiences with this feature applied to MicroMap surface roughness data. The power spectral density function was determined for the stitched and unstitched data and compared.
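
    For reference, the direct least-squares ellipse fit mentioned here amounts to solving a generalized eigenvalue problem for the conic coefficients. A compact MATLAB rendering in the spirit of the published Fitzgibbon et al. snippet follows; treat it as a sketch rather than a verbatim reproduction.

        % Direct least-squares ellipse fit (after Fitzgibbon, Pilu and Fisher), sketch.
        % x, y: column vectors of profile coordinates; returns conic coefficients
        % a = [A B C D E F] with A*x^2 + B*x*y + C*y^2 + D*x + E*y + F = 0.
        function a = fit_ellipse_direct(x, y)
            D = [x.*x, x.*y, y.*y, x, y, ones(size(x))];       % design matrix
            S = D' * D;                                        % scatter matrix
            C = zeros(6); C(1,3) = 2; C(3,1) = 2; C(2,2) = -1; % ellipse constraint 4AC - B^2 = 1
            [gevec, geval] = eig(S \ C);                       % generalized eigenvectors
            [~, idx] = max(diag(geval));                       % generically the single positive eigenvalue
            a = gevec(:, idx);
        end

    Typical use on residual-profile coordinates would be a = fit_ellipse_direct(x, y), after which the geometric parameters of the best-fit ellipse can be recovered from the conic coefficients.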

  18. Neurofeedback training aimed to improve focused attention and alertness in children with ADHD: a study of relative power of EEG rhythms using custom-made software application.

    PubMed

    Hillard, Brent; El-Baz, Ayman S; Sears, Lonnie; Tasman, Allan; Sokhadze, Estate M

    2013-07-01

    Neurofeedback is a nonpharmacological treatment for attention-deficit hyperactivity disorder (ADHD). We propose that operant conditioning of the electroencephalogram (EEG) in neurofeedback training aimed at mitigating inattention and low arousal in ADHD will be accompanied by changes in the relative power of EEG bands. Patients were 18 children diagnosed with ADHD. The neurofeedback protocol ("Focus/Alertness" by Peak Achievement Trainer) has focused attention and alertness training modes, one for Focus and one for Alertness; it does not allow collection of information on changes in the power of specific EEG bands (delta, theta, alpha, low and high beta, and gamma) within the 2 to 45 Hz range. Quantitative EEG analysis was therefore completed on each of twelve 25-minute-long sessions using a custom-made MATLAB application to determine the relative power of each of the aforementioned EEG bands throughout each session, and from the first session to the last session. Additional statistical analysis determined significant changes in relative power within sessions (from minute 1 to minute 25) and between sessions (from session 1 to session 12). The analysis covered the relative power of theta, alpha, low and high beta, and the theta/alpha, theta/beta, theta/low beta and theta/high beta ratios. Additional secondary measures of patients' post-neurofeedback outcomes were assessed using an audiovisual selective attention test (IVA+Plus) and behavioral evaluation scores from the Aberrant Behavior Checklist. Analysis of data computed in the MATLAB application determined that the theta/low beta and theta/alpha ratios decreased significantly from session 1 to session 12, and from minute 1 to minute 25 within sessions. The findings regarding EEG changes resulting from brain wave self-regulation training, along with the behavioral evaluations, will help elucidate the neural mechanisms of neurofeedback aimed at improving focused attention and alertness in ADHD.
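
    A hedged sketch of the kind of per-band relative-power computation such a custom application performs (generic Signal Processing Toolbox calls, not the authors' program); eeg is one channel of data and fs the sampling rate, both assumed inputs, and the band edges follow the ranges named in the abstract.

        % Illustrative relative EEG band power for one channel; eeg (vector) and fs [Hz] assumed inputs.
        bands = struct('theta',[4 8], 'alpha',[8 12], 'lowbeta',[12 20], ...
                       'highbeta',[20 30], 'gamma',[30 45]);
        total = bandpower(eeg, fs, [2 45]);            % power in the full 2-45 Hz analysis range
        names = fieldnames(bands);
        rel   = struct();
        for k = 1:numel(names)
            rel.(names{k}) = bandpower(eeg, fs, bands.(names{k})) / total;  % relative band power
        end
        fprintf('theta/alpha ratio: %.2f\n', rel.theta / rel.alpha);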

  19. 31 CFR 1023.220 - Customer identification programs for broker-dealers.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Finance (Continued) FINANCIAL CRIMES ENFORCEMENT NETWORK, DEPARTMENT OF THE TREASURY RULES FOR BROKERS OR DEALERS IN SECURITIES Programs § 1023.220 Customer identification programs for broker-dealers. (a...

  20. 31 CFR 1023.220 - Customer identification programs for broker-dealers.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Finance (Continued) FINANCIAL CRIMES ENFORCEMENT NETWORK, DEPARTMENT OF THE TREASURY RULES FOR BROKERS OR DEALERS IN SECURITIES Programs § 1023.220 Customer identification programs for broker-dealers. (a...

  1. 31 CFR 1023.220 - Customer identification programs for broker-dealers.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Finance (Continued) FINANCIAL CRIMES ENFORCEMENT NETWORK, DEPARTMENT OF THE TREASURY RULES FOR BROKERS OR DEALERS IN SECURITIES Programs § 1023.220 Customer identification programs for broker-dealers. (a...

  2. 31 CFR 1023.220 - Customer identification programs for broker-dealers.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Finance (Continued) FINANCIAL CRIMES ENFORCEMENT NETWORK, DEPARTMENT OF THE TREASURY RULES FOR BROKERS OR DEALERS IN SECURITIES Programs § 1023.220 Customer identification programs for broker-dealers. (a...

  3. 77 FR 67865 - Enhancing Protections Afforded Customers and Customer Funds Held by Futures Commission Merchants...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-14

    ... Parts 1, 3, 22 et al. Enhancing Protections Afforded Customers and Customer Funds Held by Futures... Customers and Customer Funds Held by Futures Commission Merchants and Derivatives Clearing Organizations... amend existing regulations to require enhanced customer protections, risk management programs, internal...

  4. A Switching-Mode Power Supply Design Tool to Improve Learning in a Power Electronics Course

    ERIC Educational Resources Information Center

    Miaja, P. F.; Lamar, D. G.; de Azpeitia, M.; Rodriguez, A.; Rodriguez, M.; Hernando, M. M.

    2011-01-01

    The static design of ac/dc and dc/dc switching-mode power supplies (SMPS) relies on a simple but repetitive process. Although specific spreadsheets, available in various computer-aided design (CAD) programs, are widely used, they are difficult to use in educational applications. In this paper, a graphic tool programmed in MATLAB is presented,…
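
    To give a sense of the "simple but repetitive" static design calculations such a tool automates, here is a hedged MATLAB sketch of the standard continuous-conduction-mode equations for an ideal buck converter; all component values and ripple targets below are illustrative, not taken from the paper.

        % Illustrative static design of an ideal buck converter in continuous conduction mode.
        Vin   = 24;      Vout = 5;        % input and output voltages [V] (example values)
        Iout  = 2;                        % load current [A]
        fsw   = 200e3;                    % switching frequency [Hz]
        dIpct = 0.3;     dVout = 0.05;    % allowed inductor ripple (30% of Iout) and output ripple [V]
        D     = Vout / Vin;                       % duty cycle (ideal, CCM)
        dIL   = dIpct * Iout;                     % peak-to-peak inductor current ripple [A]
        L     = Vout * (1 - D) / (fsw * dIL);     % required inductance [H]
        C     = dIL / (8 * fsw * dVout);          % required output capacitance [F]
        fprintf('D = %.2f, L = %.1f uH, C = %.1f uF\n', D, L*1e6, C*1e6);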

  5. Development of a Simple Image Processing Application that Makes Abdominopelvic Tumor Visible on Positron Emission Tomography/Computed Tomography Image.

    PubMed

    Pandey, Anil Kumar; Saroha, Kartik; Sharma, Param Dev; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh

    2017-01-01

    In this study, we have developed a simple image processing application in MATLAB that uses suprathreshold stochastic resonance (SSR) and helps the user to visualize abdominopelvic tumors on exported prediuretic positron emission tomography/computed tomography (PET/CT) images. A brainstorming session was conducted for requirement analysis for the program. It was decided that the program should load the screen-captured PET/CT images and then produce output images in a window with a slider control that enables the user to view the image that best visualizes the tumor, if present. The program was implemented on a personal computer using Microsoft Windows and MATLAB R2013b. The program has an option for the user to select the input image. For the selected image, it displays output images generated using SSR in a separate window with a slider control. The slider control enables the user to view the images and select the one which seems to provide the best visualization of the area(s) of interest. The developed application enables the user to select, process, and view output images in the process of utilizing SSR to detect the presence of abdominopelvic tumor on a prediuretic PET/CT image.
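
    Suprathreshold stochastic resonance, in its simplest image-processing form, adds independent noise realizations to the image, thresholds each realization, and averages the binary outputs; varying the noise level (the quantity a slider would control) changes how well faint structure emerges. A hedged sketch of that core step (not the authors' application), assuming img is a grayscale image scaled to [0, 1]:

        % Illustrative SSR enhancement of a grayscale image img (double, scaled to [0, 1]).
        theta = 0.5;      % threshold (assumed)
        sigma = 0.2;      % noise standard deviation; in a GUI this is what the slider would vary
        nRuns = 64;       % number of independent noise realizations
        acc = zeros(size(img));
        for k = 1:nRuns
            noisy = img + sigma * randn(size(img));    % add one noise realization
            acc   = acc + double(noisy > theta);       % threshold and accumulate
        end
        out = acc / nRuns;                             % SSR output: mean of thresholded realizations
        imshowpair(img, out, 'montage');
        title('Input (left) vs. SSR-processed output (right)');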

  6. Using MATLAB Software on the Peregrine System | High-Performance Computing

    Science.gov Websites

    Learn how to use MATLAB software on the Peregrine system: running MATLAB in batch mode, and understanding the available MATLAB software versions and licenses.

  7. 19 CFR 24.22 - Fees for certain services.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... following address: U.S. Customs and Border Protection, Attn: DTOPS Program Administrator, 6650 Telecom Drive... address: U.S. Customs and Border Protection, Attn: DTOPS Program Administrator, 6650 Telecom Drive, Suite....S. Customs and Border Protection, Revenue Division, Attn: User Fee Team, 6650 Telecom Drive, Suite...

  8. 19 CFR 24.22 - Fees for certain services.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... following address: U.S. Customs and Border Protection, Attn: DTOPS Program Administrator, 6650 Telecom Drive... address: U.S. Customs and Border Protection, Attn: DTOPS Program Administrator, 6650 Telecom Drive, Suite....S. Customs and Border Protection, Revenue Division, Attn: User Fee Team, 6650 Telecom Drive, Suite...

  9. General MACOS Interface for Modeling and Analysis for Controlled Optical Systems

    NASA Technical Reports Server (NTRS)

    Sigrist, Norbert; Basinger, Scott A.; Redding, David C.

    2012-01-01

    The General MACOS Interface (GMI) for Modeling and Analysis for Controlled Optical Systems (MACOS) enables the use of MATLAB as a front-end for JPL's critical optical modeling package, MACOS. MACOS is JPL's in-house optical modeling software, which has proven to be a superb tool for advanced systems engineering of optical systems. GMI, coupled with MACOS, allows for seamless interfacing with modeling tools from other disciplines to make possible integration of dynamics, structures, and thermal models with the addition of control systems for deformable optics and other actuated optics. This software package is designed as a tool for analysts to quickly and easily use MACOS without needing to be an expert at programming MACOS. The strength of MACOS is its ability to interface with various modeling/development platforms, allowing evaluation of system performance with thermal, mechanical, and optical modeling parameter variations. GMI provides an improved means for accessing selected key MACOS functionalities. The main objective of GMI is to marry the vast mathematical and graphical capabilities of MATLAB with the powerful optical analysis engine of MACOS, thereby providing a useful tool to anyone who can program in MATLAB. GMI also improves modeling efficiency by eliminating the need to write an interface function for each task/project, reducing error sources, speeding up user/modeling tasks, and making MACOS well suited for fast prototyping.

  10. Traffic Pattern Detection Using the Hough Transformation for Anomaly Detection to Improve Maritime Domain Awareness

    DTIC Science & Technology

    2013-12-01

    Programming code in the Python language used in AIS data preprocessing is contained in Appendix A. The MATLAB programming code used to apply the Hough...described in Chapter III is applied to archived AIS data in this chapter. The implementation of the method, including programming techniques used, is...is contained in the second. To provide a proof of concept for the algorithm described in Chapter III, the PYTHON programming language was used for
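
    For orientation, the standard MATLAB Hough workflow for detecting line-like patterns in a binary image (Image Processing Toolbox calls; BW is an assumed binary image of candidate track pixels, not the thesis data) looks roughly like this:

        % Illustrative Hough-transform line detection on a binary image BW (assumed input).
        [H, theta, rho] = hough(BW);                           % accumulate the Hough transform
        peaks = houghpeaks(H, 10, 'Threshold', 0.3*max(H(:))); % up to 10 strongest peaks
        lines = houghlines(BW, theta, rho, peaks, 'FillGap', 5, 'MinLength', 20);
        imshow(BW); hold on;
        for k = 1:numel(lines)                                 % overlay detected line segments
            xy = [lines(k).point1; lines(k).point2];
            plot(xy(:,1), xy(:,2), 'LineWidth', 2);
        end
        hold off;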

  11. 31 CFR 103.131 - Customer identification programs for mutual funds.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Finance FINANCIAL RECORDKEEPING AND REPORTING OF CURRENCY AND FOREIGN TRANSACTIONS Anti-Money Laundering Programs Anti-Money Laundering Programs § 103.131 Customer identification programs for mutual funds. (a... mutual fund's anti-money laundering program required under the regulations implementing 31 U.S.C. 5318(h...

  12. 31 CFR 103.122 - Customer identification programs for broker-dealers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Finance FINANCIAL RECORDKEEPING AND REPORTING OF CURRENCY AND FOREIGN TRANSACTIONS Anti-Money Laundering Programs Anti-Money Laundering Programs § 103.122 Customer identification programs for broker-dealers. (a... anti-money laundering compliance program required under 31 U.S.C. 5318(h). (2) Identity verification...

  13. Proportional Topology Optimization: A New Non-Sensitivity Method for Solving Stress Constrained and Minimum Compliance Problems and Its Implementation in MATLAB

    PubMed Central

    Biyikli, Emre; To, Albert C.

    2015-01-01

    A new topology optimization method called the Proportional Topology Optimization (PTO) is presented. As a non-sensitivity method, PTO is simple to understand, easy to implement, and is also efficient and accurate at the same time. It is implemented into two MATLAB programs to solve the stress constrained and minimum compliance problems. Descriptions of the algorithm and computer programs are provided in detail. The method is applied to solve three numerical examples for both types of problems. The method shows comparable efficiency and accuracy with an existing optimality criteria method which computes sensitivities. Also, the PTO stress constrained algorithm and minimum compliance algorithm are compared by feeding output from one algorithm to the other in an alternative manner, where the former yields lower maximum stress and volume fraction but higher compliance compared to the latter. Advantages and disadvantages of the proposed method and future works are discussed. The computer programs are self-contained and publicly shared in the website www.ptomethod.org. PMID:26678849
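
    The defining step of a proportional, non-sensitivity method is easy to state: at each iteration the available material is distributed among elements in proportion to a power of each element's stress (or compliance) measure, then clipped to the density bounds. The following is only a hedged sketch of that inner update, not the code distributed at www.ptomethod.org; elemStress and volFrac are assumed inputs.

        % Illustrative proportional material-distribution update for one iteration.
        % elemStress: element stress measures, volFrac: target volume fraction (assumed inputs).
        p      = 1;                                             % proportion exponent (method parameter)
        xMin   = 1e-3;  xMax = 1;                               % density bounds
        nElem  = numel(elemStress);
        target = volFrac * nElem;                               % total material to distribute
        x      = target * elemStress.^p / sum(elemStress.^p);   % proportional assignment
        x      = min(max(x, xMin), xMax);                       % clip to bounds
        % A full implementation redistributes any material lost to clipping and blends the new
        % densities with the previous iteration's values before the next finite-element solve.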

  14. Weight optimization of plane truss using genetic algorithm

    NASA Astrophysics Data System (ADS)

    Neeraja, D.; Kamireddy, Thejesh; Santosh Kumar, Potnuru; Simha Reddy, Vijay

    2017-11-01

    Optimization of a structure on the basis of weight has many practical benefits in every engineering field: structural efficiency is closely related to weight, so weight optimization gains prime importance. In civil engineering, weight-optimized structural elements are economical and easier to transport to the site. In this study, a genetic optimization algorithm for the weight optimization of a steel truss, considering its shape, size, and topology, has been developed in MATLAB. Material strength and buckling stability requirements have been adopted from IS 800-2007, the code of practice for construction in steel. The constraints considered in the present study are fabrication, basic nodes, displacements, and compatibility. A genetic algorithm is a natural-selection search technique that combines good solutions to a problem over many generations to improve the results; candidate solutions are generated randomly and each is represented by a binary string analogous to a natural chromosome. The outcome of the study is a MATLAB program that can optimise a steel truss and display the optimised topology along with element shapes, deflections, and stress results.
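
    A minimal sketch of this kind of binary-coded genetic algorithm is given below. It is not the study's program: the two-member truss, the loads, the area catalogue, and the penalty factor are placeholders chosen only to make the skeleton runnable.

      % Illustrative GA skeleton: select member areas from a discrete catalogue for a
      % toy two-member truss, minimizing weight with a penalty on over-stressed members.
      function best = ga_truss_sketch()
          rng(1);
          areas = [2 4 6 8 10 12 16 20] * 1e-4;          % candidate areas, m^2 (3 bits/member)
          L = [2.0; 2.5]; F = [5e4; 3e4];                % member lengths (m), axial forces (N)
          rho = 7850; sigAllow = 150e6;                  % steel density, allowable stress
          nm = numel(L); nb = 3; nGenes = nm*nb;
          popSize = 30; nGen = 100; pc = 0.8; pm = 0.02;

          pop = rand(popSize, nGenes) > 0.5;             % random initial population of bit strings
          for g = 1:nGen
              cost = arrayfun(@(i) penalizedWeight(pop(i,:)), 1:popSize)';
              [~, order] = sort(cost); pop = pop(order, :);     % rank by penalized weight
              newPop = pop(1:2, :);                      % elitism: keep the two best
              while size(newPop,1) < popSize
                  p = pop(randi(ceil(popSize/2), 2, 1), :);     % mate within the better half
                  cut = randi(nGenes-1);
                  child = p(1,:);
                  if rand < pc, child(cut+1:end) = p(2, cut+1:end); end   % one-point crossover
                  child = xor(child, rand(1,nGenes) < pm);                % bit-flip mutation
                  newPop(end+1, :) = child;              %#ok<AGROW>
              end
              pop = newPop;
          end
          best = pop(1,:);                               % best chromosome found (elite)

          function w = penalizedWeight(chrom)
              bits = reshape(chrom, nb, nm)';
              A = areas(double(bits)*[4;2;1] + 1); A = A(:);    % decode 3 bits -> catalogue area
              w = sum(rho .* A .* L);                    % truss weight, kg
              viol = max(abs(F)./A - sigAllow, 0);       % stress violations, Pa
              w = w + 1e-3 * sum(viol);                  % penalty (placeholder factor)
          end
      end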

  15. Peer Learning in a MATLAB Programming Course

    NASA Astrophysics Data System (ADS)

    Reckinger, Shanon

    2016-11-01

    Three forms of research-based peer learning were implemented in the design of a MATLAB programming course for mechanical engineering undergraduate students. First, a peer learning program was initiated. These undergraduate peer learning leaders played two roles in the course: (I) they were in the classroom helping students with their work, and (II) they led optional two-hour help sessions outside of class time. The second form of peer learning was implemented through the inclusion of a peer discussion period following in-class clicker quizzes. The third form of peer learning had the students creating video project assignments and posting them on YouTube to explain course topics to their peers. Several other more informal techniques were used to encourage peer learning. Student feedback in the form of both instructor-designed survey responses and formal course evaluations (quantitative and narrative) will be presented. Finally, effectiveness will be measured by formal assessment, direct and indirect, of these peer learning methods. This will include both academic data/grades and pre/post test scores. Overall, the course design and its inclusion of these peer learning techniques demonstrate effectiveness.

  16. Analytical and Experimental Evaluation of Digital Control Systems for the Semi-Span Super-Sonic Transport (S4T) Wind Tunnel Model

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D.; Christhilf, David; Perry, Boyd, III

    2012-01-01

    An important objective of the Semi-Span Super-Sonic Transport (S4T) wind tunnel model program was the demonstration of Flutter Suppression (FS), Gust Load Alleviation (GLA), and Ride Quality Enhancement (RQE). It was critical to evaluate the stability and robustness of these control laws analytically before testing them, and experimentally while testing them, to ensure the safety of the model and the wind tunnel. MATLAB-based software was applied to evaluate the performance of closed-loop systems in terms of stability and robustness. Existing software tools were extended to use analytical representations of the S4T and the control laws to analyze and evaluate the control laws prior to testing. Lessons were learned about the complex wind-tunnel model and experimental testing. The open-loop flutter boundary was determined from the closed-loop systems. A MATLAB/Simulink simulation developed under the program is available for future work to improve the CPE process. This paper is one of a series that comprises a special session summarizing the S4T wind-tunnel program.
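
    The closed-loop stability and robustness checks described above are commonly expressed as gain and phase margins. The sketch below shows such a check in MATLAB with the Control System Toolbox; G and K are placeholder models, not the S4T plant or control laws.

      % Illustrative stability/robustness check (Control System Toolbox assumed).
      s = tf('s');
      G = 100 / (s^2 + 2*0.02*10*s + 10^2);      % lightly damped aeroelastic-like mode (placeholder)
      K = 0.05 * (s + 5) / (s + 50);             % simple lead compensator as a stand-in control law

      L = K * G;                                  % loop transfer function
      [gm, pm, wcg, wcp] = margin(L);             % gain and phase margins for robustness
      T = feedback(L, 1);                         % closed-loop system
      fprintf('Gain margin %.1f dB, phase margin %.1f deg, closed-loop stable: %d\n', ...
              20*log10(gm), pm, isstable(T));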

  17. tweezercalib 2.1: Faster version of MatLab package for precise calibration of optical tweezers

    NASA Astrophysics Data System (ADS)

    Hansen, Poul Martin; Tolic-Nørrelykke, Iva Marija; Flyvbjerg, Henrik; Berg-Sørensen, Kirstine

    2006-10-01

    New version program summary. Title of program: tweezercalib. Catalogue identifier: ADTV_v2_1. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADTV_v2_1. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Licensing provisions: no. No. of lines in distributed program, including test data, etc.: 134 188. No. of bytes in distributed program, including test data, etc.: 1 050 368. Distribution format: tar.gz. Programming language: MatLab (Mathworks Inc.), standard license. Computer: General computer running MatLab (Mathworks Inc.). Operating system: Windows 2000, Windows XP, Linux. RAM: Of order four times the size of the data file. Classification: 3, 4.14, 18, 23. Catalogue identifier of previous version: ADTV_v2_0. Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 518. Does the new version supersede the previous version?: yes. Nature of problem: Calibrate optical tweezers with precision by fitting theory to the experimental power spectrum of the position of a bead doing Brownian motion in an incompressible fluid, possibly near a microscope cover slip, while trapped in optical tweezers. Thereby determine the spring constant of the optical trap and the arbitrary-units-to-nanometers conversion factor for the detection system. The theoretical underpinnings of the procedure may be found in Ref. [3]. Solution method: Elimination of cross-talk between quadrant photo-diode output channels for positions (optional). Check that the distribution of recorded positions agrees with the Boltzmann distribution of a bead in a harmonic trap. Data compression and noise reduction by the blocking method applied to the power spectrum. Full accounting for hydrodynamic effects: frequency-dependent drag force and interaction with a nearby cover slip (optional). Full accounting for electronic filters (optional) and for "virtual filtering" caused by the detection system (optional). Full accounting for aliasing caused by the finite sampling rate (optional). Standard non-linear least-squares fitting with custom-written routines based on Refs. [1,2]. Statistical support for the fit is given, with several plots facilitating inspection of the consistency and quality of data and fit. Reasons for the new version: Recent progress in the field has demonstrated a better approximation of the formula for the theoretical power spectrum with corrections due to the frequency dependence of motion and the distance to a nearby surface. Summary of revisions: The expression for the theoretical power spectrum when accounting for corrections to Stokes law, P(f), has been updated to agree with a better approximation of the theoretical spectrum, as discussed in Ref. [4]. The units of the kinematic viscosity applied in the program are now stated in the input window. Greek letters and exponents are inserted in the input window. The graphical output has improved: the figures now bear a meaningful title, and four figures that test the quality of the fit are now combined in one figure with four parts. Restrictions: Data should be positions of a bead doing Brownian motion while held by optical tweezers. For high precision in final results, data should be time series measured over a long time, with a sufficiently high experimental sampling rate: the sampling rate should be well above the characteristic frequency of the trap, the so-called corner frequency. Thus, the sampling frequency should typically be larger than 10 kHz. The Fast Fourier Transform used works optimally when the time series contains 2^n data points, and long measurement time is obtained with n > 12-15. Finally, the optics should be set to ensure a harmonic trapping potential in the range of positions visited by the bead. The fitting procedure checks for a harmonic potential. Running time: seconds. References: [1] J. Nocedal, Y.-X. Yuan, Combining trust region and line search techniques, Technical Report OTC 98/04, Optimization Technology Center, 1998. [2] W.H. Press, B.P. Flannery, S.A. Teukolsky, W.T. Vetterling, Numerical Recipes. The Art of Scientific Computing, Cambridge University Press, Cambridge, 1986. [3] (the theoretical underpinnings for the procedure) K. Berg-Sørensen and H. Flyvbjerg, Power spectrum analysis for optical tweezers, Rev. Sci. Instrum. 75 (2004) 594-612. [4] S.F. Tolic-Nørrelykke, et al., Calibration of optical tweezers with positional detection in the back focal plane, arXiv:physics/0603037 v2, 2006.
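
    The heart of the calibration is a least-squares fit of a theoretical power spectrum to the blocked experimental spectrum. The sketch below is not the tweezercalib code: it fits only the simplest Lorentzian spectrum P(f) = D / (2*pi^2*(fc^2 + f^2)), with no hydrodynamic, filter, or aliasing corrections, and the block size and input names are assumptions.

      % Minimal sketch: estimate corner frequency fc and diffusion coefficient D from a
      % long position time series x (arbitrary units) sampled at fsample (Hz).
      function [fc, D] = lorentzian_fit_sketch(x, fsample)
          x  = x(:);
          N  = numel(x);
          X  = fft(x - mean(x));
          P  = (abs(X).^2) / (fsample * N);          % periodogram estimate of the PSD (a.u.^2/Hz)
          f  = (0:N-1)' * fsample / N;
          keep = f > 0 & f < fsample/2;              % positive frequencies below Nyquist
          f = f(keep);  P = P(keep);

          % Blocking: average neighbouring points to compress data and reduce noise
          nb = 200;                                  % points per block (assumes a long series)
          nBlocks = floor(numel(f)/nb);
          fb = mean(reshape(f(1:nb*nBlocks), nb, nBlocks))';
          Pb = mean(reshape(P(1:nb*nBlocks), nb, nBlocks))';

          % Least-squares fit of the Lorentzian by simplex search (base MATLAB)
          model = @(p, f) p(2) ./ (2*pi^2 * (p(1)^2 + f.^2));   % p = [fc, D]
          cost  = @(p) sum((Pb - model(p, fb)).^2);
          p = fminsearch(cost, [100, Pb(1)*2*pi^2*100^2]);      % crude initial guess
          fc = p(1);  D = p(2);
      end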

  18. Nighttime foraging by deep diving echolocating odontocetes off the Hawaiian islands of Kauai and Ni'ihau as determined by passive acoustic monitors.

    PubMed

    Au, Whitlow W L; Giorli, Giacomo; Chen, Jessica; Copeland, Adrienne; Lammers, Marc; Richlen, Michael; Jarvis, Susan; Morrissey, Ronald; Moretti, David; Klinck, Holger

    2013-05-01

    Remote autonomous ecological acoustic recorders (EARs) were deployed in deep waters at five locations around the island of Kauai and one in waters off Ni'ihau in the main Hawaiian island chain. The EARs were moored to the bottom at depths between 400 and 800 m. The data acquisition sampling rate was 80 kHz, and acoustic signals were recorded for 30 s every 5 min to conserve battery power and disk space. The acoustic data were analyzed with the M3R (Marine Mammal Monitoring on Navy Ranges) software, an energy-ratio-mapping algorithm developed at Oregon State University, and custom MATLAB programs. A variety of deep-diving odontocetes, including pilot whales, Risso's dolphins, sperm whales, spinner and pan-tropical spotted dolphins, and beaked whales, were detected at all sites. Foraging activity typically began to increase after dusk, peaked in the middle of the night, and began to decrease toward dawn. Between 70% and 84% of biosonar clicks were detected at night. At present it is not clear why some of the known deep-diving species, such as sperm whales and beaked whales, concentrate their foraging efforts at night.

  19. Demonstrating High-Accuracy Orbital Access Using Open-Source Tools

    NASA Technical Reports Server (NTRS)

    Gilbertson, Christian; Welch, Bryan

    2017-01-01

    Orbit propagation is fundamental to almost every space-based analysis. Currently, many system analysts use commercial software to predict the future positions of orbiting satellites. This is one of many capabilities that can be replicated, with great accuracy, without using expensive, proprietary software. NASA's SCaN (Space Communication and Navigation) Center for Engineering, Networks, Integration, and Communications (SCENIC) project plans to provide its analysis capabilities using a combination of internal and open-source software, allowing for a much greater measure of customization and flexibility while reducing recurring software license costs. MATLAB and the open-source Orbit Determination Toolbox created by Goddard Space Flight Center (GSFC) were utilized to develop tools with the capability to propagate orbits, perform line-of-sight (LOS) availability analyses, and visualize the results. The developed programs are modular and can be applied for mission planning and viability analysis in a variety of Solar System applications. The tools can perform two-body and N-body orbit propagation, find inter-satellite and satellite-to-ground-station LOS access (accounting for intermediate oblate-spheroid body blocking, geometric restrictions of the antenna field-of-view (FOV), and relativistic corrections), and create animations of planetary movement, satellite orbits, and LOS accesses. The code is the basis for SCENIC's broad analysis capabilities, including dynamic link analysis, dilution-of-precision navigation analysis, and orbital availability calculations.
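
    As a minimal illustration of the propagation step (not the SCENIC or Orbit Determination Toolbox code), the sketch below integrates the two-body equations of motion with base MATLAB's ode45; the initial state is a placeholder.

      % Two-body propagation sketch. State y = [r; v] in km and km/s.
      mu = 398600.4418;                                  % Earth's gravitational parameter, km^3/s^2
      twoBody = @(t, y) [y(4:6); -mu * y(1:3) / norm(y(1:3))^3];

      r0 = [7000; 0; 0];                                 % km (placeholder orbit)
      v0 = [0; 7.546; 1.0];                              % km/s
      opts = odeset('RelTol', 1e-9, 'AbsTol', 1e-9);
      [t, y] = ode45(twoBody, [0, 6*3600], [r0; v0], opts);

      plot3(y(:,1), y(:,2), y(:,3)); axis equal; grid on % visualize the propagated orbit
      xlabel('x, km'); ylabel('y, km'); zlabel('z, km');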

  20. Assessing the blood pressure waveform of the carotid artery using an ultrasound image processing method

    PubMed Central

    Fatouraee, Nasser; Saberi, Hazhir

    2017-01-01

    Purpose The aim of this study was to introduce and implement a noninvasive method to derive the carotid artery pressure waveform directly by processing diagnostic sonograms of the carotid artery. Methods Ultrasound image sequences of 20 healthy male subjects (age, 36±9 years) were recorded during three cardiac cycles. The internal diameter and blood velocity waveforms were extracted from consecutive sonograms over the cardiac cycles by using custom analysis programs written in MATLAB. Finally, the application of a mathematical equation resulted in time changes of the arterial pressure. The resulting pressures were calibrated using the mean and the diastolic pressure of the radial artery. Results A good correlation was found between the mean carotid blood pressure obtained from the ultrasound image processing and the mean radial blood pressure obtained using a standard digital sphygmomanometer (R=0.91). The mean absolute difference between the carotid calibrated pulse pressures and those measured clinically was -1.333±6.548 mm Hg. Conclusion The results of this study suggest that consecutive sonograms of the carotid artery can be used for estimating a blood pressure waveform. We believe that our results promote a noninvasive technique for clinical applications that overcomes the reproducibility problems of common carotid artery tonometry with technical and anatomical causes. PMID:27776401
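
    The calibration step described above can be illustrated with a simple linear rescaling, shown below as a sketch under stated assumptions rather than the study's exact procedure: the uncalibrated waveform is scaled and shifted so that its minimum equals the radial diastolic pressure and its mean equals the radial mean pressure.

      % Illustrative calibration only; p_u is an uncalibrated carotid waveform, DBP and
      % MAP are radial diastolic and mean pressures in mm Hg (assumed inputs).
      calibrate = @(p_u, DBP, MAP) DBP + (MAP - DBP) * (p_u - min(p_u)) / (mean(p_u) - min(p_u));

      % Example with synthetic data:
      t   = linspace(0, 1, 500);
      p_u = 1 + 0.3*sin(2*pi*t).^2;                 % arbitrary-units waveform
      p   = calibrate(p_u, 75, 93);                 % calibrated waveform, mm Hg
      fprintf('min %.1f, mean %.1f mm Hg\n', min(p), mean(p));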

  1. Using the Gurobi Solvers on the Peregrine System | High-Performance

    Science.gov Websites

    Peregrine System. Gurobi Optimizer is a suite of solvers for mathematical programming. It is licensed for ... ('GRB_MATLAB_PATH') >> path(path,grb) ... Gurobi and GAMS: GAMS is a high-level modeling system for mathematical ...

  2. More-Realistic Digital Modeling of a Human Body

    NASA Technical Reports Server (NTRS)

    Rogge, Renee

    2010-01-01

    A MATLAB computer program has been written to enable improved (relative to an older program) modeling of a human body for purposes of designing space suits and other hardware with which an astronaut must interact. The older program implements a kinematic model based on traditional anthropometric measurements that do provide important volume and surface information. The present program generates a three-dimensional (3D) whole-body model from 3D body-scan data. The program utilizes thin-plate spline theory to reposition the model without need for additional scans.

  3. Energy essays: a focus on utility communication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Selnow, G.W.; Crano, W.D.; Ludwig, S.

    The following papers are included: (1) technology, customers, and the feedback loop, (2) utility communications: a need for understanding the American character, (3) utility programs and grass roots communication, (4) reading the tea leaves of public opinion, (5) the need for public opinion surveys in utility communication programs, (6) the role of assessment in effective utility communication programs, (7) utility customer communication; perspectives on current public policy and law, (8) customer communications - a notion in motion, (9) communication when your customer is your owner, (10) radio advertising, (11) television advertising, (12) newspaper advertising, and (13) magazine advertising. (MOW)

  4. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image-analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image-analysis methods, for the skin area of a human foot and of a face. The full source code of the developed application is provided as an attachment, and a figure shows the main window of the program during dynamic analysis of the foot thermal image.

  5. PSYCHOACOUSTICS: a comprehensive MATLAB toolbox for auditory testing.

    PubMed

    Soranzo, Alessandro; Grassi, Massimo

    2014-01-01

    PSYCHOACOUSTICS is a new MATLAB toolbox which implements three classic adaptive procedures for auditory threshold estimation. The first includes those of the Staircase family (method of limits, simple up-down and transformed up-down); the second is the Parameter Estimation by Sequential Testing (PEST); and the third is the Maximum Likelihood Procedure (MLP). The toolbox comes with more than twenty built-in experiments each provided with the recommended (default) parameters. However, if desired, these parameters can be modified through an intuitive and user friendly graphical interface and stored for future use (no programming skills are required). Finally, PSYCHOACOUSTICS is very flexible as it comes with several signal generators and can be easily extended for any experiment.

  6. PSYCHOACOUSTICS: a comprehensive MATLAB toolbox for auditory testing

    PubMed Central

    Soranzo, Alessandro; Grassi, Massimo

    2014-01-01

    PSYCHOACOUSTICS is a new MATLAB toolbox which implements three classic adaptive procedures for auditory threshold estimation. The first includes those of the Staircase family (method of limits, simple up-down and transformed up-down); the second is the Parameter Estimation by Sequential Testing (PEST); and the third is the Maximum Likelihood Procedure (MLP). The toolbox comes with more than twenty built-in experiments each provided with the recommended (default) parameters. However, if desired, these parameters can be modified through an intuitive and user friendly graphical interface and stored for future use (no programming skills are required). Finally, PSYCHOACOUSTICS is very flexible as it comes with several signal generators and can be easily extended for any experiment. PMID:25101013
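
    One of the adaptive procedures the toolbox implements, the transformed up-down staircase, can be sketched compactly. The code below is not PSYCHOACOUSTICS code: trialCorrect is a hypothetical function handle that runs one trial at the given signal level and reports whether the listener responded correctly, and the starting level, step size, and stopping rule are arbitrary.

      % Minimal 2-down/1-up transformed up-down staircase sketch.
      function [levels, reversals] = staircase_sketch(trialCorrect)
          lvl = 60; step = 4;                         % starting level (dB) and step size
          nDown = 2; maxReversals = 12; maxTrials = 400;
          correctRun = 0; lastDir = 0;
          levels = []; reversals = [];
          while numel(reversals) < maxReversals && numel(levels) < maxTrials
              levels(end+1) = lvl;                    %#ok<AGROW>
              stepDir = 0;
              if trialCorrect(lvl)
                  correctRun = correctRun + 1;
                  if correctRun == nDown              % two correct in a row -> harder
                      lvl = lvl - step; correctRun = 0; stepDir = -1;
                  end
              else                                    % one wrong -> easier
                  lvl = lvl + step; correctRun = 0; stepDir = +1;
              end
              if stepDir ~= 0
                  if lastDir ~= 0 && stepDir ~= lastDir
                      reversals(end+1) = lvl;         %#ok<AGROW> track reversal levels
                  end
                  lastDir = stepDir;
              end
          end
          % A threshold estimate is the mean of the last few reversal levels;
          % the 2-down/1-up rule converges near the 70.7%-correct point.
      end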

  7. 31 CFR 103.123 - Customer identification programs for futures commission merchants and introducing brokers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... TRANSACTIONS Anti-Money Laundering Programs Anti-Money Laundering Programs § 103.123 Customer identification... each futures commission merchant's and introducing broker's anti-money laundering compliance program... money laundering activities, Federal law requires all financial institutions to obtain, verify, and...

  8. A Method for Harmonic Sources Detection based on Harmonic Distortion Power Rate

    NASA Astrophysics Data System (ADS)

    Lin, Ruixing; Xu, Lin; Zheng, Xian

    2018-03-01

    Harmonic source detection at the point of common coupling is an essential step for determining harmonic contributions and for harmonic mitigation. A harmonic distortion power rate index is proposed in this paper for harmonic source location, based on IEEE Std 1459-2010. A method based only on harmonic distortion power is not suitable when the background harmonic level is large. To solve this problem, a threshold is determined from prior information: when the harmonic distortion power is larger than the threshold, the customer side is considered the main harmonic source; otherwise, the utility side is. A simple model of a public power system was built in MATLAB/Simulink, and field test results for typical harmonic loads verified the effectiveness of the proposed method.
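
    The decision logic described above can be illustrated as follows. The sketch uses harmonic active power summed over harmonic orders as a simple stand-in for the paper's distortion power rate index, so it shows the thresholding idea rather than the exact IEEE Std 1459-2010 quantities; u, i, fs, f1, and threshold are assumed inputs.

      % Illustrative thresholding sketch, not the paper's exact index. u and i are sampled
      % voltage (V) and current (A) at the PCC, fs the sampling rate, f1 the fundamental.
      function src = harmonic_source_sketch(u, i, fs, f1, threshold)
          N  = numel(u);
          U  = fft(u)/N;  I = fft(i)/N;
          f  = (0:N-1)*fs/N;
          PH = 0;
          for h = 2:floor((fs/2)/f1)                       % sum over harmonic orders
              [~, k] = min(abs(f - h*f1));                 % FFT bin nearest to h*f1
              PH = PH + 2*real(U(k)*conj(I(k)));           % harmonic active power contribution, W
          end
          if PH > threshold
              src = 'customer side dominant';
          else
              src = 'utility side dominant';
          end
      end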

  9. Machining Chatter Analysis for High Speed Milling Operations

    NASA Astrophysics Data System (ADS)

    Sekar, M.; Kantharaj, I.; Amit Siddhappa, Savale

    2017-10-01

    Chatter in high-speed milling is characterized by time-delay differential equations (DDEs). Since closed-form solutions exist only for simple cases, the governing non-linear DDEs of chatter problems are solved by various numerical methods. Custom codes to solve DDEs are tedious to build and implement and are not error-free and robust; on the other hand, software packages provide solutions to DDEs but are not straightforward to apply. In this paper an easy way to solve the DDE of chatter in milling is proposed and implemented in MATLAB. A time-domain solution permits the study and modelling of the non-linear effects of chatter vibration with ease. Time-domain results are presented for various stable and unstable cutting conditions and compared with stability lobe diagrams.
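
    MATLAB's built-in DDE solver makes this time-domain approach straightforward to sketch. The example below solves a single-degree-of-freedom regenerative chatter model with dde23; the modal and cutting parameters are placeholders, not those of the paper.

      % Regenerative chatter sketch: m*x'' + c*x' + k*x = -Kc*a*( x(t) - x(t - tau) ).
      m = 0.5; c = 40; k = 2e6;                 % modal mass, damping, stiffness (placeholders)
      Kc = 6e8; a = 1e-3;                       % cutting coefficient (N/m^2), depth of cut (m)
      N = 12000; z = 2; tau = 60/(N*z);         % spindle speed (rpm), teeth -> tooth-passing delay (s)

      rhs  = @(t, y, Z) [ y(2);
                          ( -c*y(2) - k*y(1) - Kc*a*(y(1) - Z(1)) ) / m ];
      hist = @(t) [1e-5; 0];                    % small initial perturbation
      sol  = dde23(rhs, tau, hist, [0 0.2]);

      plot(sol.x, sol.y(1,:));                  % growing oscillation indicates chatter (instability)
      xlabel('time, s'); ylabel('displacement, m');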

  10. 75 FR 6790 - Interagency Guidance on Response Programs for Unauthorized Access to Customer Information and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-10

    ... for Unauthorized Access to Customer Information and Customer Notice AGENCY: Office of Thrift... for Unauthorized Access to Customer Information and Customer Notice. OMB Number: 1550-0110. Form...) Ensure the security and confidentiality of customer records and information; (2) protect against any...

  11. Mathematical model of ambulance resources in Saint-Petersburg

    NASA Astrophysics Data System (ADS)

    Shavidze, G. G.; Balykina, Y. E.; Lejnina, E. A.; Svirkin, M. V.

    2016-06-01

    The emergency medical system is one of the main elements of city infrastructure. The article contains an analysis of the existing system of ambulance resource distribution and considers the idea of using multiperiodicity as a tool to increase the efficiency of the Emergency Medical Services. A program developed in the MATLAB programming environment helps to evaluate changes in the functioning of the emergency medical service system.

  12. Application programs written by using customizing tools of a computer-aided design system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, X.; Huang, R.; Juricic, D.

    1995-12-31

    Customizing tools of computer-aided design systems have been developed to such a degree as to become equivalent to powerful higher-level programming languages that are especially suitable for graphics applications. Two examples of application programs written by using AutoCAD's customizing tools are given in some detail to illustrate their power. One tool uses the AutoLISP list-processing language to develop an application program that produces four views of a given solid model. The other uses the AutoCAD Development System, based on program modules written in C, to produce an application program that renders a freehand sketch from a given CAD drawing.

  13. Interactions between Energy Efficiency Programs funded under the Recovery Act and Utility Customer-Funded Energy Efficiency Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldman, Charles A.; Stuart, Elizabeth; Hoffman, Ian

    2011-02-25

    Since the spring of 2009, billions of federal dollars have been allocated to state and local governments as grants for energy efficiency and renewable energy projects and programs. The scale of this American Reinvestment and Recovery Act (ARRA) funding, focused on 'shovel-ready' projects to create and retain jobs, is unprecedented. Thousands of newly funded players - cities, counties, states, and tribes - and thousands of programs and projects are entering the existing landscape of energy efficiency programs for the first time or expanding their reach. The nation's experience base with energy efficiency is growing enormously, fed by federal dollars and driven by broader objectives than saving energy alone. State and local officials made countless choices in developing portfolios of ARRA-funded energy efficiency programs and deciding how their programs would relate to existing efficiency programs funded by utility customers. Those choices are worth examining as bellwethers of a future world where there may be multiple program administrators and funding sources in many states. What are the opportunities and challenges of this new environment? What short- and long-term impacts will this large infusion of funds have on utility customer-funded programs; for example, on infrastructure for delivering energy efficiency services or on customer willingness to invest in energy efficiency? To what extent has the attribution of energy savings been a critical issue, especially where administrators of utility customer-funded energy efficiency programs have performance or shareholder incentives? Do the new ARRA-funded energy efficiency programs provide insights on roles or activities that are particularly well-suited to state and local program administrators vs. administrators or implementers of utility customer-funded programs? The answers could have important implications for the future of U.S. energy efficiency. This report focuses on a selected set of ARRA-funded energy efficiency programs administered by state energy offices: the State Energy Program (SEP) formula grants, the portion of Energy Efficiency and Conservation Block Grant (EECBG) formula funds administered directly by states, and the State Energy Efficient Appliance Rebate Program (SEEARP). Since these ARRA programs devote significant monies to energy efficiency and serve similar markets as utility customer-funded programs, there are frequent interactions between programs. We exclude the DOE low-income weatherization program and EECBG funding awarded directly to the over 2,200 cities, counties and tribes from our study to keep its scope manageable. We summarize the energy efficiency program design and funding choices made by the 50 state energy offices, 5 territories and the District of Columbia. We then focus on the specific choices made in 12 case study states. These states were selected based on the level of utility customer program funding, diversity of program administrator models, and geographic diversity. Based on interviews with more than 80 energy efficiency actors in those 12 states, we draw observations about states' strategies for use of Recovery Act funds. We examine interactions between ARRA programs and utility customer-funded energy efficiency programs in terms of program planning, program design and implementation, policy issues, and potential long-term impacts.
We consider how the existing regulatory policy framework and energy efficiency programs in these 12 states may have impacted development of these selected ARRA programs. Finally, we summarize key trends and highlight issues that evaluators of these ARRA programs may want to examine in more depth in their process and impact evaluations.

  14. Beyond Customer Satisfaction: Reexamining Customer Loyalty to Evaluate Continuing Education Programs

    ERIC Educational Resources Information Center

    Hoyt, Jeff E.; Howell, Scott L.

    2011-01-01

    This article provides questionnaire items and a theoretical model of factors predictive of customer loyalty for use by administrators to determine ways to increase repeat purchasing in their continuing education programs. Prior studies in the literature are discussed followed by results of applying the model at one institution and a discussion of…

  15. Customized Job Training for Business and Industry. New Directions for Community Colleges, Number 48.

    ERIC Educational Resources Information Center

    Kopecek, Robert J., Ed.; Clarke, Robert G., Ed.

    1984-01-01

    This sourcebook describes and analyzes contracted customized training for business and industry provided by community colleges. First, "Customized Job Training: Should Your Community College Be Involved?" by Robert J. Kopecek identifies issues to be considered in program decision making and suggests an organizational model for program delivery.…

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bird, L.; Kaiser, M.

    In the early 1990s, only a handful of utilities offered their customers a choice of purchasing electricity generated from renewable energy sources. Today, more than 750 utilities--or about 25% of all utilities nationally--provide their customers a "green power" option. Through these programs, more than 70 million customers have the ability to purchase renewable energy to meet some portion or all of their electricity needs--or make contributions to support the development of renewable energy resources. Typically, customers pay a premium above standard electricity rates for this service. This report presents year-end 2006 data on utility green pricing programs, and examines trends in consumer response and program implementation over time. The data in this report, which were obtained via a questionnaire distributed to utility green pricing program managers, can be used by utilities to benchmark the success of their green power programs.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bird, Lori; Kaiser, Marshall

    In the early 1990s, only a handful of utilities offered their customers a choice of purchasing electricity generated from renewable energy sources. Today, more than 750 utilities—or about 25% of all utilities nationally—provide their customers a “green power” option. Through these programs, more than 70 million customers have the ability to purchase renewable energy to meet some portion or all of their electricity needs—or make contributions to support the development of renewable energy resources. Typically, customers pay a premium above standard electricity rates for this service. This report presents year-end 2006 data on utility green pricing programs, and examines trends in consumer response and program implementation over time. The data in this report, which were obtained via a questionnaire distributed to utility green pricing program managers, can be used by utilities to benchmark the success of their green power programs.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bird, Lori; Brown, Elizabeth

    In the early 1990s, only a handful of utilities offered their customers a choice of purchasing electricity generated from renewable energy sources. Today, more than 600 utilities—or about 20% of all utilities nationally—provide their customers a “green power” option. Because some utilities offer programs in conjunction with cooperative associations or other publicly owned power entities, the number of distinct programs totals more than 130. Through these programs, more than 50 million customers have the ability to purchase renewable energy to meet some portion or all of their electricity needs—or make contributions to support the development of renewable energy resources. Typically, customers pay a premium above standard electricity rates for this service. This report presents year-end 2005 data on utility green pricing programs, and examines trends in consumer response and program implementation over time. The data in this report, which were obtained via a questionnaire distributed to utility green pricing program managers, can be used by utilities to benchmark the success of their green power programs.

  19. Nanofiber Nerve Guide for Peripheral Nerve Repair and Regeneration

    DTIC Science & Technology

    2014-01-01

    observing cell migration using live-cell imaging microscopy, and analyzing cell migration with our MATLAB-based programs. Our studies ... are then pipetted into the chamber and their path of migration is observed using a live-cell imaging microscope (Fig. 6d). Utilizing this migration ...

  20. Optical Excitations and Energy Transfer in Nanoparticle Waveguides

    DTIC Science & Technology

    2009-03-01

    All calculations were performed using our own codes given in the Appendix section. The calculations were performed using the Scilab programming package ... January 2007, invited speaker) 12. Scilab is free software compatible with the famous Matlab package. It can be found at their webpage http ...

  1. Numerical simulation of water hammer in low pressurized pipe: comparison of SimHydraulics and Lax-Wendroff method with experiment

    NASA Astrophysics Data System (ADS)

    Himr, D.

    2013-04-01

    The article describes the simulation of unsteady flow during water hammer with two programs that use different numerical approaches to solve the one-dimensional differential equations describing the dynamics of hydraulic elements and pipes. The first is Matlab-Simulink-SimHydraulics, commercial software developed to solve the dynamics of general hydraulic systems, which defines them with block elements. The other program, called HYDRA, is based on the Lax-Wendroff numerical method, which serves as a tool to solve the momentum and continuity equations; it was developed in Matlab by Brno University of Technology. Experimental measurements were performed on a simple test rig consisting of an elastic pipe with strong damping connecting two reservoirs. Water hammer is induced by rapidly closing a valve. Physical properties of the liquid and pipe elasticity parameters were considered in both simulations, which are in very good agreement with each other, and differences from the experimental data are minimal.
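
    For orientation, a two-step (Richtmyer) Lax-Wendroff update for the classical water-hammer equations H_t + (a^2/g) V_x = 0 and V_t + g H_x + f V|V|/(2D) = 0 is sketched below. It is not the HYDRA program: pipe data, the friction factor, and the simplified boundary treatment (constant-head reservoir upstream, instantaneously closed valve downstream) are assumptions.

      % Water-hammer sketch with a two-step Lax-Wendroff scheme (placeholder data).
      a = 1200; g = 9.81; D = 0.05; f = 0.02; L = 50;      % wave speed, gravity, pipe diameter, friction, length
      nx = 201; x = linspace(0, L, nx); dx = x(2)-x(1);
      dt = 0.9*dx/a; nt = 2000;                            % CFL-limited time step

      V = 0.5*ones(1, nx);                                 % initial steady velocity, m/s
      H = 30 - f*x*V(1)^2/(2*g*D);                         % initial head line with friction slope
      for n = 1:nt
          Vavg = 0.5*(V(1:end-1)+V(2:end));  Havg = 0.5*(H(1:end-1)+H(2:end));
          Hm = Havg - dt/(2*dx)*(a^2/g)*(V(2:end)-V(1:end-1));              % half-step predictor
          Vm = Vavg - dt/(2*dx)*g*(H(2:end)-H(1:end-1)) - dt/2*f/(2*D)*Vavg.*abs(Vavg);
          H(2:end-1) = H(2:end-1) - dt/dx*(a^2/g)*(Vm(2:end)-Vm(1:end-1));  % full-step corrector
          V(2:end-1) = V(2:end-1) - dt/dx*g*(Hm(2:end)-Hm(1:end-1)) ...
                       - dt*f/(2*D)*V(2:end-1).*abs(V(2:end-1));
          H(1) = 30;  V(end) = 0;                          % reservoir head; valve closed at t = 0
          V(1) = V(2); H(end) = H(end-1);                  % crude extrapolation at the ends
      end
      plot(x, H); xlabel('x, m'); ylabel('head, m');       % pressure head some time after closure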

  2. Rollout and Installation of Risk Management at the IMINT Directorate, National Reconnaissance Office

    DTIC Science & Technology

    1999-12-01

    strong entry into the 21st century. Programs were forging multiple mission partners and customers into cohesive program delivery systems that could...requires a higher level of cooperation and a meshing of divergent program operations. Customer requirements are more complex and distributed. As a... customers , such as the U.S. Central Intelligence Agency (CIA) and the U.S. Department of Defense (DoD), can warn of potential trouble spots around the

  3. Creating a comprehensive customer service program to help convey critical and acute results of radiology studies.

    PubMed

    Towbin, Alexander J; Hall, Seth; Moskovitz, Jay; Johnson, Neil D; Donnelly, Lane F

    2011-01-01

    Communication of acute or critical results between the radiology department and referring clinicians has been a deficiency of many radiology departments. The failure to perform or document these communications can lead to poor patient care, patient safety issues, medical-legal issues, and complaints from referring clinicians. To mitigate these factors, a communication and documentation tool was created and incorporated into our departmental customer service program. This article will describe the implementation of a comprehensive customer service program in a hospital-based radiology department. A comprehensive customer service program was created in the radiology department. Customer service representatives were hired to answer the telephone calls to the radiology reading rooms and to help convey radiology results. The radiologists, referring clinicians, and customer service representatives were then linked via a novel workflow management system. This workflow management system provided tools to help facilitate the communication needs of each group. The number of studies with results conveyed was recorded from the implementation of the workflow management system. Between the implementation of the workflow management system on August 1, 2005, and June 1, 2009, 116,844 radiology results were conveyed to the referring clinicians and documented in the system. This accounts for more than 14% of the 828,516 radiology cases performed in this time frame. We have been successful in creating a comprehensive customer service program to convey and document communication of radiology results. This program has been widely used by the ordering clinicians as well as radiologists since its inception.

  4. MATLAB implementation of a dynamic clamp with bandwidth >125 KHz capable of generating INa at 37°C

    PubMed Central

    Clausen, Chris; Valiunas, Virginijus; Brink, Peter R.; Cohen, Ira S.

    2012-01-01

    We describe the construction of a dynamic clamp with bandwidth >125 KHz that utilizes a high performance, yet low cost, standard home/office PC interfaced with a high-speed (16 bit) data acquisition module. High bandwidth is achieved by exploiting recently available software advances (code-generation technology, optimized real-time kernel). Dynamic-clamp programs are constructed using Simulink, a visual programming language. Blocks for computation of membrane currents are written in the high-level matlab language; no programming in C is required. The instrument can be used in single- or dual-cell configurations, with the capability to modify programs while experiments are in progress. We describe an algorithm for computing the fast transient Na+ current (INa) in real time, and test its accuracy and stability using rate constants appropriate for 37°C. We then construct a program capable of supplying three currents to a cell preparation: INa, the hyperpolarizing-activated inward pacemaker current (If), and an inward-rectifier K+ current (IK1). The program corrects for the IR drop due to electrode current flow, and also records all voltages and currents. We tested this program on dual patch-clamped HEK293 cells where the dynamic clamp controls a current-clamp amplifier and a voltage-clamp amplifier controls membrane potential, and current-clamped HEK293 cells where the dynamic clamp produces spontaneous pacing behavior exhibiting Na+ spikes in otherwise passive cells. PMID:23224681
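
    The real-time current computation described above can be illustrated with a single explicit update of a Hodgkin-Huxley-style fast Na+ current. The sketch below is not the authors' Simulink block: the rate constants are the classic squid-axon expressions rather than the 37 degree C set used in the paper, and the conductance and reversal potential are placeholders.

      % One dynamic-clamp-style update of I_Na = gNa*m^3*h*(V - ENa); V in mV, dt in ms.
      function [INa, m, h] = ina_step_sketch(V, m, h, dt)
          gNa = 120; ENa = 65;                          % nS and mV, so INa comes out in pA (placeholders)
          if abs(V + 40) < 1e-6, am = 1; else, am = 0.1*(V+40)/(1 - exp(-(V+40)/10)); end
          bm = 4*exp(-(V+65)/18);
          ah = 0.07*exp(-(V+65)/20);
          bh = 1/(1 + exp(-(V+35)/10));
          m  = m + dt*(am*(1-m) - bm*m);                % forward-Euler update of activation gate
          h  = h + dt*(ah*(1-h) - bh*h);                % forward-Euler update of inactivation gate
          INa = gNa * m^3 * h * (V - ENa);              % current injected back into the cell
      end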

  5. An interactive program on digitizing historical seismograms

    NASA Astrophysics Data System (ADS)

    Xu, Yihe; Xu, Tao

    2014-02-01

    Retrieving information from analog seismograms is of great importance since they are considered as the unique sources that provide quantitative information of historical earthquakes. We present an algorithm for automatic digitization of the seismograms as an inversion problem that forms an interactive program using Matlab® GUI. The program integrates automatic digitization with manual digitization and users can easily switch between the two modalities and carry out different combinations for the optimal results. Several examples about applying the interactive program are given to illustrate the merits of the method.

  6. 78 FR 73907 - Self-Regulatory Organizations; NASDAQ OMX PHLX LLC; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-09

    ... Change Relating to the Customer Rebate Program December 3, 2013. Pursuant to Section 19(b)(1) of the... The Exchange proposes to amend the Customer Rebate Program in Section B of the Pricing Schedule. The..., the Proposed Rule Change 1. Purpose The Exchange proposes to increase certain Customer rebates in the...

  7. Marketing. Nourishing News. Volume 3, Issue 7

    ERIC Educational Resources Information Center

    Idaho State Department of Education, 2009

    2009-01-01

    The use of marketing can effectively enhance the growth and image of the Child Nutrition Programs. The customer has changed over the years, and today's customer in the Child Nutrition Programs wants the food served to be high-quality at low prices. As a result, many child nutrition managers are looking at what the customer is requesting. This…

  8. Computational solution of spike overlapping using data-based subtraction algorithms to resolve synchronous sympathetic nerve discharge

    PubMed Central

    Su, Chun-Kuei; Chiang, Chia-Hsun; Lee, Chia-Ming; Fan, Yu-Pei; Ho, Chiu-Ming; Shyu, Liang-Yu

    2013-01-01

    Sympathetic nerves conveying central commands to regulate visceral functions often display activities in synchronous bursts. To understand how individual fibers fire synchronously, we establish “oligofiber recording techniques” to record “several” nerve fiber activities simultaneously, using in vitro splanchnic sympathetic nerve–thoracic spinal cord preparations of neonatal rats as experimental models. While distinct spike potentials were easily recorded from collagenase-dissociated sympathetic fibers, a problem arising from synchronous nerve discharges is a higher incidence of complex waveforms resulting from spike overlapping. Because commercial software does not provide an explicit solution for spike overlapping, a series of custom-made LabVIEW programs incorporating MATLAB scripts was written for spike sorting. Spikes were represented as data points after waveform feature extraction and automatically grouped by k-means clustering, followed by principal component analysis (PCA) to verify their waveform homogeneity. For dissimilar waveforms with excessive Hotelling's T2 distances from the cluster centroids, a unique data-based subtraction algorithm (SA) was used to determine whether they were complex waveforms resulting from superimposing a spike pattern close to the cluster centroid with other signals that could be observed in the original recordings. In comparison with commercial software, higher accuracy was achieved by analyses using our algorithms on synthetic data that contained synchronous spiking and complex waveforms. Moreover, both T2-selected and SA-retrieved spikes were combined as unit activities. Quantitative analyses were performed to evaluate whether unit activities truly originated from single fibers. We conclude that applications of our programs can help to resolve synchronous sympathetic nerve discharges (SND). PMID:24198782
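
    The clustering and outlier-screening step described above can be sketched with standard MATLAB functions (Statistics and Machine Learning Toolbox assumed). In the fragment below, W is a hypothetical matrix of aligned spike waveforms, pca's Hotelling T-squared output is used as a simple stand-in for the per-cluster distance test, the cutoff is an arbitrary assumption, and the subtraction algorithm itself is only indicated in a comment.

      % Illustrative clustering/outlier screen; W is nSpikes-by-nSamples of aligned waveforms.
      k = 3;                                            % assumed number of units
      [~, score, ~, tsq] = pca(W);                      % waveform features + Hotelling T-squared
      feat = score(:, 1:3);                             % keep the first principal components
      idx  = kmeans(feat, k, 'Replicates', 10);         % group spikes into k clusters
      counts = accumarray(idx, 1);                      % spikes assigned to each presumed unit

      cutoff  = chi2inv(0.99, 3);                       % rough T-squared cutoff (assumption)
      suspect = tsq > cutoff;                           % candidates for overlapping spikes:
      % these would be passed to the subtraction algorithm, which removes the waveform of
      % the nearest cluster centroid and re-examines the residual for a second spike.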

  9. Validation of ALFIA: a platform for quantifying near-infrared fluorescent images of lymphatic propulsion in humans

    NASA Astrophysics Data System (ADS)

    Rasmussen, John C.; Bautista, Merrick; Tan, I.-Chih; Adams, Kristen E.; Aldrich, Melissa; Marshall, Milton V.; Fife, Caroline E.; Maus, Erik A.; Smith, Latisha A.; Zhang, Jingdan; Xiang, Xiaoyan; Zhou, Shaohua Kevin; Sevick-Muraca, Eva M.

    2011-02-01

    Recently, we demonstrated near-infrared (NIR) fluorescence imaging for quantifying real-time lymphatic propulsion in humans following intradermal injections of microdose amounts of indocyanine green. However computational methods for image analysis are underdeveloped, hindering the translation and clinical adaptation of NIR fluorescent lymphatic imaging. In our initial work we used ImageJ and custom MatLab programs to manually identify lymphatic vessels and individual propulsion events using the temporal transit of the fluorescent dye. In addition, we extracted the apparent velocities of contractile propagation and time periods between propulsion events. Extensive time and effort were required to analyze the 6-8 gigabytes of NIR fluorescent images obtained for each subject. To alleviate this bottleneck, we commenced development of ALFIA, an integrated software platform which will permit automated, near real-time analysis of lymphatic function using NIR fluorescent imaging. However, prior to automation, the base algorithms calculating the apparent velocity and period must be validated to verify that they produce results consistent with the proof-of-concept programs. To do this, both methods were used to analyze NIR fluorescent images of two subjects and the number of propulsive events identified, the average apparent velocities, and the average periods for each subject were compared. Paired Student's t-tests indicate that the differences between their average results are not significant. With the base algorithms validated, further development and automation of ALFIA can be realized, significantly reducing the amount of user interaction required, and potentially enabling the near real-time, clinical evaluation of NIR fluorescent lymphatic imaging.

  10. Measurement of Civil Engineering Customer Satisfaction in Tactical Air Command: A Prototype Evaluation Program.

    DTIC Science & Technology

    1986-09-01

    customers. The article states that in response to a White House Office of Consumer Affairs study and with the wide use of minicomputers: Companies are ...

  11. Nonlinear Boltzmann equation for the homogeneous isotropic case: Minimal deterministic Matlab program

    NASA Astrophysics Data System (ADS)

    Asinari, Pietro

    2010-10-01

    The homogeneous isotropic Boltzmann equation (HIBE) is a fundamental dynamic model for many applications in thermodynamics, econophysics and sociodynamics. Despite recent hardware improvements, the solution of the Boltzmann equation remains extremely challenging from the computational point of view, in particular by deterministic methods (free of stochastic noise). This work aims to improve a deterministic direct method recently proposed [V.V. Aristov, Kluwer Academic Publishers, 2001] for solving the HIBE with a generic collisional kernel and, in particular, for taking care of the late dynamics of the relaxation towards the equilibrium. Essentially (a) the original problem is reformulated in terms of particle kinetic energy (exact particle number and energy conservation during microscopic collisions) and (b) the computation of the relaxation rates is improved by the DVM-like correction, where DVM stands for Discrete Velocity Model (ensuring that the macroscopic conservation laws are exactly satisfied). Both these corrections make possible to derive very accurate reference solutions for this test case. Moreover this work aims to distribute an open-source program (called HOMISBOLTZ), which can be redistributed and/or modified for dealing with different applications, under the terms of the GNU General Public License. The program has been purposely designed in order to be minimal, not only with regards to the reduced number of lines (less than 1000), but also with regards to the coding style (as simple as possible). Program summaryProgram title: HOMISBOLTZ Catalogue identifier: AEGN_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEGN_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License No. of lines in distributed program, including test data, etc.: 23 340 No. of bytes in distributed program, including test data, etc.: 7 635 236 Distribution format: tar.gz Programming language: Tested with Matlab version ⩽6.5. However, in principle, any recent version of Matlab or Octave should work Computer: All supporting Matlab or Octave Operating system: All supporting Matlab or Octave RAM: 300 MBytes Classification: 23 Nature of problem: The problem consists in integrating the homogeneous Boltzmann equation for a generic collisional kernel in case of isotropic symmetry, by a deterministic direct method. Difficulties arise from the multi-dimensionality of the collisional operator and from satisfying the conservation of particle number and energy (momentum is trivial for this test case) as accurately as possible, in order to preserve the late dynamics. Solution method: The solution is based on the method proposed by Aristov (2001) [1], but with two substantial improvements: (a) the original problem is reformulated in terms of particle kinetic energy (this allows one to ensure exact particle number and energy conservation during microscopic collisions) and (b) a DVM-like correction (where DVM stands for Discrete Velocity Model) is adopted for improving the relaxation rates (this allows one to satisfy exactly the conservation laws at macroscopic level, which is particularly important for describing the late dynamics in the relaxation towards the equilibrium). Both these corrections make possible to derive very accurate reference solutions for this test case. 
Restrictions: The nonlinear Boltzmann equation is extremely challenging from the computational point of view, in particular for deterministic methods, despite the increased computational power of recent hardware. In this work, only the homogeneous isotropic case is considered, for making possible the development of a minimal program (by a simple scripting language) and allowing the user to check the advantages of the proposed improvements beyond Aristov's (2001) method [1]. The initial conditions are supposed parameterized according to a fixed analytical expression, but this can be easily modified. Running time: From minutes to hours (depending on the adopted discretization of the kinetic energy space). For example, on a 64 bit workstation with Intel CoreTM i7-820Q Quad Core CPU at 1.73 GHz and 8 MBytes of RAM, the provided test run (with the corresponding binary data file storing the pre-computed relaxation rates) requires 154 seconds. References:V.V. Aristov, Direct Methods for Solving the Boltzmann Equation and Study of Nonequilibrium Flows, Kluwer Academic Publishers, 2001.

  12. PETSc Users Manual Revision 3.3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balay, S.; Brown, J.; Buschelman, K.

    This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication. PETSc includes an expanding suite of parallel linear and nonlinear equation solvers and time integrators that may be used in application codes written in Fortran, C, C++, Python, and MATLAB (sequential). PETSc provides many of the mechanisms needed within parallel application codes, such as parallel matrix and vector assembly routines. The library is organized hierarchically, enabling users to employ the level of abstraction that is most appropriate for a particular problem. By using techniques of object-oriented programming, PETSc provides enormous flexibility for users. PETSc is a sophisticated set of software tools; as such, for some users it initially has a much steeper learning curve than a simple subroutine library. In particular, for individuals without some computer science background, experience programming in C, C++ or Fortran, and experience using a debugger such as gdb or dbx, it may require a significant amount of time to take full advantage of the features that enable efficient software use. However, the power of the PETSc design and the algorithms it incorporates may make the efficient implementation of many application codes simpler than “rolling them” yourself. For many tasks a package such as MATLAB is often the best tool; PETSc is not intended for the classes of problems for which effective MATLAB code can be written. PETSc also has a MATLAB interface, so portions of your code can be written in MATLAB to “try out” the PETSc solvers. The resulting code will not be scalable, however, because currently MATLAB is inherently not scalable, and PETSc should not be used to attempt to provide a “parallel linear solver” in an otherwise sequential code. Certainly all parts of a previously sequential code need not be parallelized, but the matrix generation portion must be parallelized to expect any kind of reasonable performance. Do not expect to generate your matrix sequentially and then “use PETSc” to solve the linear system in parallel. Since PETSc is under continued development, small changes in usage and calling sequences of routines will occur. PETSc is supported; see the web site http://www.mcs.anl.gov/petsc for information on contacting support. A list of publications and web sites that feature work involving PETSc may be found at http://www.mcs.anl.gov/petsc/publications. We welcome any reports of corrections for this document.

  13. PETSc Users Manual Revision 3.4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balay, S.; Brown, J.; Buschelman, K.

    This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication. PETSc includes an expanding suite of parallel linear and nonlinear equation solvers and time integrators that may be used in application codes written in Fortran, C, C++, Python, and MATLAB (sequential). PETSc provides many of the mechanisms needed within parallel application codes, such as parallel matrix and vector assembly routines. The library is organized hierarchically, enabling users to employ the level of abstraction that is most appropriate for a particular problem. By using techniques of object-oriented programming, PETSc provides enormous flexibility for users. PETSc is a sophisticated set of software tools; as such, for some users it initially has a much steeper learning curve than a simple subroutine library. In particular, for individuals without some computer science background, experience programming in C, C++ or Fortran, and experience using a debugger such as gdb or dbx, it may require a significant amount of time to take full advantage of the features that enable efficient software use. However, the power of the PETSc design and the algorithms it incorporates may make the efficient implementation of many application codes simpler than “rolling them” yourself. For many tasks a package such as MATLAB is often the best tool; PETSc is not intended for the classes of problems for which effective MATLAB code can be written. PETSc also has a MATLAB interface, so portions of your code can be written in MATLAB to “try out” the PETSc solvers. The resulting code will not be scalable, however, because currently MATLAB is inherently not scalable, and PETSc should not be used to attempt to provide a “parallel linear solver” in an otherwise sequential code. Certainly all parts of a previously sequential code need not be parallelized, but the matrix generation portion must be parallelized to expect any kind of reasonable performance. Do not expect to generate your matrix sequentially and then “use PETSc” to solve the linear system in parallel. Since PETSc is under continued development, small changes in usage and calling sequences of routines will occur. PETSc is supported; see the web site http://www.mcs.anl.gov/petsc for information on contacting support. A list of publications and web sites that feature work involving PETSc may be found at http://www.mcs.anl.gov/petsc/publications. We welcome any reports of corrections for this document.

  14. PETSc Users Manual Revision 3.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balay, S.; Abhyankar, S.; Adams, M.

    This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication. PETSc includes an expanding suite of parallel linear and nonlinear equation solvers and time integrators that may be used in application codes written in Fortran, C, C++, Python, and MATLAB (sequential). PETSc provides many of the mechanisms needed within parallel application codes, such as parallel matrix and vector assembly routines. The library is organized hierarchically, enabling users to employ the level of abstraction that is most appropriate for a particular problem. By using techniques of object-oriented programming, PETSc provides enormous flexibility for users. PETSc is a sophisticated set of software tools; as such, for some users it initially has a much steeper learning curve than a simple subroutine library. In particular, for individuals without some computer science background, experience programming in C, C++ or Fortran, and experience using a debugger such as gdb or dbx, it may require a significant amount of time to take full advantage of the features that enable efficient software use. However, the power of the PETSc design and the algorithms it incorporates may make the efficient implementation of many application codes simpler than “rolling them” yourself. For many tasks a package such as MATLAB is often the best tool; PETSc is not intended for the classes of problems for which effective MATLAB code can be written. PETSc also has a MATLAB interface, so portions of your code can be written in MATLAB to “try out” the PETSc solvers. The resulting code will not be scalable, however, because currently MATLAB is inherently not scalable, and PETSc should not be used to attempt to provide a “parallel linear solver” in an otherwise sequential code. Certainly all parts of a previously sequential code need not be parallelized, but the matrix generation portion must be parallelized to expect any kind of reasonable performance. Do not expect to generate your matrix sequentially and then “use PETSc” to solve the linear system in parallel. Since PETSc is under continued development, small changes in usage and calling sequences of routines will occur. PETSc is supported; see the web site http://www.mcs.anl.gov/petsc for information on contacting support. A list of publications and web sites that feature work involving PETSc may be found at http://www.mcs.anl.gov/petsc/publications. We welcome any reports of corrections for this document.

  15. Navigation Constellation Design Using a Multi-Objective Genetic Algorithm

    DTIC Science & Technology

    2015-03-26

    programs. This specific tool not only offers high fidelity simulations, but it also offers the visual aid provided by STK. The ability to...MATLAB and STK. STK is a program that allows users to model, analyze, and visualize space systems. Users can create objects such as satellites and...position dilution of precision (PDOP) and system cost. This thesis utilized Satellite Tool Kit (STK) to calculate PDOP values of navigation

  16. X-ray Intermolecular Structure Factor (XISF): separation of intra- and intermolecular interactions from total X-ray scattering data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mou, Q.; Benmore, C. J.; Yarger, J. L.

    2015-06-01

    XISF is a MATLAB program developed to separate intermolecular structure factors from total X-ray scattering structure factors for molecular liquids and amorphous solids. The program is built on a trust-region-reflective optimization routine with the r.m.s. deviations of atoms physically constrained. XISF has been optimized for performance and can separate intermolecular structure factors of complex molecules.
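
    The abstract above describes a bound-constrained trust-region-reflective fit; the following sketch (ours, with an invented model, synthetic data, and assumed parameter names, not the XISF code) shows what such a fit looks like with MATLAB's Optimization Toolbox:

      % Illustrative bound-constrained trust-region-reflective fit (synthetic).
      q     = linspace(0.5, 20, 200)';                      % momentum-transfer grid (assumed)
      model = @(p, q) sin(p(1)*q)./(p(1)*q) .* exp(-0.5*(p(2)*q).^2);
      data  = model([1.4, 0.25], q) + 0.01*randn(size(q));  % synthetic "measurement"

      resid = @(p) model(p, q) - data;                      % residual vector
      lb    = [0.5, 0.01];  ub = [3.0, 1.0];                % physical bounds (assumed)
      opts  = optimoptions('lsqnonlin', 'Algorithm', 'trust-region-reflective');
      pFit  = lsqnonlin(resid, [1.0, 0.1], lb, ub, opts);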

  17. X-ray Intermolecular Structure Factor (XISF): separation of intra- and intermolecular interactions from total X-ray scattering data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mou, Q.; Benmore, C. J.; Yarger, J. L.

    2015-05-09

    XISF is a MATLAB program developed to separate intermolecular structure factors from total X-ray scattering structure factors for molecular liquids and amorphous solids. The program is built on a trust-region-reflective optimization routine with the r.m.s. deviations of atoms physically constrained. XISF has been optimized for performance and can separate intermolecular structure factors of complex molecules.

  18. 18 CFR 367.9080 - Account 908, Customer assistance expenses.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... electric equipment. (4) Demonstrations, exhibits, lectures, and other programs designed to instruct..., lectures, and other programs. (2) Loss in value on equipment and appliances used for customer assistance...

  19. 18 CFR 367.9080 - Account 908, Customer assistance expenses.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... electric equipment. (4) Demonstrations, exhibits, lectures, and other programs designed to instruct..., lectures, and other programs. (2) Loss in value on equipment and appliances used for customer assistance...

  20. 18 CFR 367.9080 - Account 908, Customer assistance expenses.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... electric equipment. (4) Demonstrations, exhibits, lectures, and other programs designed to instruct..., lectures, and other programs. (2) Loss in value on equipment and appliances used for customer assistance...

  1. 18 CFR 367.9080 - Account 908, Customer assistance expenses.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... electric equipment. (4) Demonstrations, exhibits, lectures, and other programs designed to instruct..., lectures, and other programs. (2) Loss in value on equipment and appliances used for customer assistance...

  2. Theory research of seam recognition and welding torch pose control based on machine vision

    NASA Astrophysics Data System (ADS)

    Long, Qiang; Zhai, Peng; Liu, Miao; He, Kai; Wang, Chunyang

    2017-03-01

    At present, the automation requirements for welding are becoming higher, so a method of extracting welding information with a vision sensor is proposed in this paper and simulated in MATLAB. In addition, in order to improve the quality of robotic automatic welding, an information retrieval method for welding torch pose control by visual sensor is attempted. Considering the demands of welding technology and engineering practice, the relevant coordinate systems and variables are strictly defined, the mathematical model of the welding pose is established, and its feasibility is verified using MATLAB simulation. These works lay a foundation for the development of a high-precision, high-quality welding off-line programming system.
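
    The paper's coordinate-system definitions are not reproduced here; the sketch below (frames, angles, and offsets are all invented) only illustrates the kind of MATLAB pose bookkeeping such a model needs, chaining homogeneous transforms to express the torch frame in the seam frame:

      % Generic torch-pose sketch with invented numbers (not the paper's model).
      Rz = @(a) [cos(a) -sin(a) 0; sin(a) cos(a) 0; 0 0 1];
      Ry = @(a) [cos(a) 0 sin(a); 0 1 0; -sin(a) 0 cos(a)];
      T  = @(R, p) [R, p(:); 0 0 0 1];                   % 4x4 homogeneous transform

      T_base_seam  = T(Rz(pi/6),  [0.40; 0.10; 0]);      % seam frame in robot base frame
      T_base_torch = T(Ry(-pi/4), [0.35; 0.12; 0.05]);   % torch frame in base frame

      T_seam_torch = T_base_seam \ T_base_torch;         % torch pose in the seam frame
      tiltDeg = acosd(T_seam_torch(3,3));                % angle between torch axis and seam z-axis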

  3. Teaching Science and Mathematics Subjects Using the Excel Spreadsheet Package

    ERIC Educational Resources Information Center

    Ibrahim, Dogan

    2009-01-01

    The teaching of scientific subjects usually require laboratories where students can put the theory they have learned into practice. Traditionally, electronic programmable calculators, dedicated software, or expensive software simulation packages, such as MATLAB have been used to simulate scientific experiments. Recently, spreadsheet programs have…

  4. 19 CFR 201.149 - Program accessibility: Discrimination prohibited.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Program accessibility: Discrimination prohibited. 201.149 Section 201.149 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF... Conducted by the U.S. International Trade Commission § 201.149 Program accessibility: Discrimination...

  5. 19 CFR 201.150 - Program accessibility: Existing facilities.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Program accessibility: Existing facilities. 201.150 Section 201.150 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF... Conducted by the U.S. International Trade Commission § 201.150 Program accessibility: Existing facilities...

  6. 19 CFR 201.150 - Program accessibility: Existing facilities.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 3 2012-04-01 2012-04-01 false Program accessibility: Existing facilities. 201.150 Section 201.150 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF... Conducted by the U.S. International Trade Commission § 201.150 Program accessibility: Existing facilities...

  7. 19 CFR 201.150 - Program accessibility: Existing facilities.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 3 2014-04-01 2014-04-01 false Program accessibility: Existing facilities. 201.150 Section 201.150 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF... Conducted by the U.S. International Trade Commission § 201.150 Program accessibility: Existing facilities...

  8. 19 CFR 201.149 - Program accessibility: Discrimination prohibited.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 3 2012-04-01 2012-04-01 false Program accessibility: Discrimination prohibited. 201.149 Section 201.149 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF... Conducted by the U.S. International Trade Commission § 201.149 Program accessibility: Discrimination...

  9. 19 CFR 201.150 - Program accessibility: Existing facilities.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 3 2013-04-01 2013-04-01 false Program accessibility: Existing facilities. 201.150 Section 201.150 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF... Conducted by the U.S. International Trade Commission § 201.150 Program accessibility: Existing facilities...

  10. 19 CFR 201.149 - Program accessibility: Discrimination prohibited.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 3 2014-04-01 2014-04-01 false Program accessibility: Discrimination prohibited. 201.149 Section 201.149 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF... Conducted by the U.S. International Trade Commission § 201.149 Program accessibility: Discrimination...

  11. 19 CFR 201.149 - Program accessibility: Discrimination prohibited.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 3 2011-04-01 2011-04-01 false Program accessibility: Discrimination prohibited. 201.149 Section 201.149 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF... Conducted by the U.S. International Trade Commission § 201.149 Program accessibility: Discrimination...

  12. 19 CFR 201.149 - Program accessibility: Discrimination prohibited.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 3 2013-04-01 2013-04-01 false Program accessibility: Discrimination prohibited. 201.149 Section 201.149 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF... Conducted by the U.S. International Trade Commission § 201.149 Program accessibility: Discrimination...

  13. 19 CFR 201.150 - Program accessibility: Existing facilities.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 3 2011-04-01 2011-04-01 false Program accessibility: Existing facilities. 201.150 Section 201.150 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES OF... Conducted by the U.S. International Trade Commission § 201.150 Program accessibility: Existing facilities...

  14. Department of the Navy Amended FY 1992/FY 1993 Biennial Budget Estimates. R,D,T, and E Descriptive Summaries Submitted to Congress January 1992. Research, Development, Test and Evaluation, Navy

    DTIC Science & Technology

    1992-01-01

    communications between technology producer (Navy RDT&E community) and technology customer (Navy/Marine Corps operating forces). Program technological...Additional programs as required by Fleet customer. 3. (U) FY 1993 Plans: Identify issues and provide link to RDT&E community. Projects will vary according to...fleet customer requirements. 4. (U) Program to Completion: This is a continuing program. D. (U) WORK PERFORMED BY: IN-HOUSE: NSWC Dahlgren, VA; AC

  15. 78 FR 68505 - Enhancing Protections Afforded Customers and Customer Funds Held by Futures Commission Merchants...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-14

    ...The Commodity Futures Trading Commission (``Commission'' or ``CFTC'') is adopting new regulations and amending existing regulations to require enhanced customer protections, risk management programs, internal monitoring and controls, capital and liquidity standards, customer disclosures, and auditing and examination programs for futures commission merchants (``FCMs''). The regulations also address certain related issues concerning derivatives clearing organizations (``DCOs'') and chief compliance officers (``CCOs''). The final rules will afford greater assurances to market participants that: Customer segregated funds, secured amount funds, and cleared swaps funds are protected; customers are provided with appropriate notice of the risks of futures trading and of the FCMs with which they may choose to do business; FCMs are monitoring and managing risks in a robust manner; the capital and liquidity of FCMs are strengthened to safeguard their continued operations; and the auditing and examination programs of the Commission and the self- regulatory organizations (``SROs'') are monitoring the activities of FCMs in a prudent and thorough manner.

  16. Dose response explorer: an integrated open-source tool for exploring and modelling radiotherapy dose volume outcome relationships

    NASA Astrophysics Data System (ADS)

    El Naqa, I.; Suneja, G.; Lindsay, P. E.; Hope, A. J.; Alaly, J. R.; Vicic, M.; Bradley, J. D.; Apte, A.; Deasy, J. O.

    2006-11-01

    Radiotherapy treatment outcome models are a complicated function of treatment, clinical and biological factors. Our objective is to provide clinicians and scientists with an accurate, flexible and user-friendly software tool to explore radiotherapy outcomes data and build statistical tumour control or normal tissue complications models. The software tool, called the dose response explorer system (DREES), is based on Matlab, and uses a named-field structure array data type. DREES/Matlab in combination with another open-source tool (CERR) provides an environment for analysing treatment outcomes. DREES provides many radiotherapy outcome modelling features, including (1) fitting of analytical normal tissue complication probability (NTCP) and tumour control probability (TCP) models, (2) combined modelling of multiple dose-volume variables (e.g., mean dose, max dose, etc) and clinical factors (age, gender, stage, etc) using multi-term regression modelling, (3) manual or automated selection of logistic or actuarial model variables using bootstrap statistical resampling, (4) estimation of uncertainty in model parameters, (5) performance assessment of univariate and multivariate analyses using Spearman's rank correlation and chi-square statistics, boxplots, nomograms, Kaplan-Meier survival plots, and receiver operating characteristics curves, and (6) graphical capabilities to visualize NTCP or TCP prediction versus selected variable models using various plots. DREES provides clinical researchers with a tool customized for radiotherapy outcome modelling. DREES is freely distributed. We expect to continue developing DREES based on user feedback.
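
    DREES's own fitting routines are not shown here; as a hedged sketch of the simplest case it automates (a logistic dose-response fit), the following MATLAB fragment fits an NTCP-style curve to invented data using Statistics Toolbox functions:

      % Logistic NTCP-style fit on synthetic data (not DREES code).
      meanDose     = [20 25 30 35 40 45 50 55 60 65]';   % Gy (synthetic)
      complication = [ 0  0  0  1  0  1  1  1  1  1]';   % binary outcome (synthetic)

      b     = glmfit(meanDose, complication, 'binomial', 'link', 'logit');
      d50   = -b(1)/b(2);                        % dose giving 50% complication risk
      dGrid = linspace(10, 70, 200)';
      ntcp  = glmval(b, dGrid, 'logit');         % fitted probability curve
      plot(dGrid, ntcp, '-', meanDose, complication, 'o');
      xlabel('Mean dose (Gy)'); ylabel('Complication probability');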

  17. American Recovery and Reinvestment Act of 2009. Experiences from the Consumer Behavior Studies on Engaging Customers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cappers, Peter; Scheer, Richard

    2014-09-01

    One of the most important aspects for the successful implementation of customer-facing programs is to better understand how to engage and communicate with consumers. Customer-facing programs include time-based rates, information and feedback, load management, and energy efficiency. This report presents lessons learned by utilities through consumer behavior studies (CBS) conducted as part of the Department of Energy’s (DOE) Smart Grid Investment Grant (SGIG) program. The SGIG CBS effort presents a unique opportunity to advance the understanding of consumer behaviors in terms of customer acceptance and retention, and electricity consumption and peak demand impacts. The effort includes eleven comprehensive studies with the aim of evaluating the response of residential and small commercial customers to time-based rate programs implemented in conjunction with advanced metering infrastructure and customer systems such as in-home displays, programmable communicating thermostats, and web portals. DOE set guidelines and protocols that sought to help the utilities design studies that would rigorously test and more precisely estimate the impact of time-based rates on customers’ energy usage patterns, as well as identify the key drivers that motivate behavioral changes.

  18. Simulating electron energy loss spectroscopy with the MNPBEM toolbox

    NASA Astrophysics Data System (ADS)

    Hohenester, Ulrich

    2014-03-01

    Within the MNPBEM toolbox, we show how to simulate electron energy loss spectroscopy (EELS) of plasmonic nanoparticles using a boundary element method approach. The methodology underlying our approach closely follows the concepts developed by García de Abajo and coworkers (Garcia de Abajo, 2010). We introduce two classes eelsret and eelsstat that allow in combination with our recently developed MNPBEM toolbox for a simple, robust, and efficient computation of EEL spectra and maps. The classes are accompanied by a number of demo programs for EELS simulation of metallic nanospheres, nanodisks, and nanotriangles, and for electron trajectories passing by or penetrating through the metallic nanoparticles. We also discuss how to compute electric fields induced by the electron beam and cathodoluminescence.
    Catalogue identifier: AEKJ_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKJ_v2_0.html
    Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 38886
    No. of bytes in distributed program, including test data, etc.: 1222650
    Distribution format: tar.gz
    Programming language: Matlab 7.11.0 (R2010b)
    Computer: Any which supports Matlab 7.11.0 (R2010b)
    Operating system: Any which supports Matlab 7.11.0 (R2010b)
    RAM: ≥1 GB
    Classification: 18
    Catalogue identifier of previous version: AEKJ_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 370
    External routines: MESH2D available at www.mathworks.com
    Does the new version supersede the previous version?: Yes
    Nature of problem: Simulation of electron energy loss spectroscopy (EELS) for plasmonic nanoparticles.
    Solution method: Boundary element method using electromagnetic potentials.
    Reasons for new version: The new version of the toolbox includes two additional classes for the simulation of electron energy loss spectroscopy (EELS) of plasmonic nanoparticles, and corrects a few minor bugs and inconsistencies.
    Summary of revisions: New classes “eelsstat” and “eelsret” for the simulation of electron energy loss spectroscopy (EELS) of plasmonic nanoparticles have been added. A few minor errors in the implementation of dipole excitation have been corrected.
    Running time: Depending on surface discretization, between seconds and hours.

  19. Using MATLAB Software on the Peregrine System | High-Performance Computing

    Science.gov Websites

    Learn how to run MATLAB software in batch mode on the Peregrine system. Below is an example MATLAB job in batch (non-interactive) mode. To try the example out, create both matlabTest.sub and /$USER. In this example, it is also the directory into which MATLAB will write the output file x.dat
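
    The page's actual submission script is truncated above; as a generic, hedged illustration of batch (non-interactive) MATLAB use, a small script can be run from the command line and write its results to the x.dat file the snippet mentions (the script name and contents here are assumptions, not the Peregrine example):

      % matlabTest.m -- assumed example script for batch execution, e.g. via
      %   matlab -nodisplay -nosplash -r "matlabTest; exit"
      x = linspace(0, 2*pi, 1000)';
      y = sin(x) .* exp(-x/5);
      save('x.dat', 'x', '-ascii');      % write output as plain text
      fprintf('wrote %d samples to x.dat\n', numel(x));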

  20. Reliability analysis of C-130 turboprop engine components using artificial neural network

    NASA Astrophysics Data System (ADS)

    Qattan, Nizar A.

    In this study, we predict the failure rate of the Lockheed C-130 engine turbine. More than thirty years of local operational field data were used for failure rate prediction and validation. The Weibull regression model and artificial neural network models (feed-forward back-propagation, radial basis neural network, and multilayer perceptron) are utilized to perform this study. For this purpose, the thesis is divided into five major parts. The first part deals with the Weibull regression model to predict the turbine general failure rate and the rate of failures that require overhaul maintenance. The second part covers the artificial neural network (ANN) model utilizing the feed-forward back-propagation algorithm as a learning rule. The MATLAB package is used to build and design a code to simulate the given data; the inputs to the neural network are the independent variables, and the outputs are the general failure rate of the turbine and the failures which require overhaul maintenance. In the third part we predict the general failure rate of the turbine and the failures which require overhaul maintenance using the radial basis neural network model in the MATLAB toolbox. In the fourth part we compare the predictions of the feed-forward back-propagation model with those of the Weibull regression model and the radial basis neural network model. The results show that the failure rate predicted by the feed-forward back-propagation artificial neural network model is in closer agreement with the radial basis neural network model and with the actual field data than the failure rate predicted by the Weibull model. By the end of the study, we forecast the general failure rate of the Lockheed C-130 engine turbine, the failures which require overhaul maintenance, and six categorical failures using a multilayer perceptron (MLP) neural network model in the DTREG commercial software. The results also give an insight into the reliability of the engine turbine under actual operating conditions, which can be used by aircraft operators for assessing system and component failures and customizing the maintenance programs recommended by the manufacturer.
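
    The thesis's field data are not available here; the sketch below (synthetic data, assumed hidden-layer size) only illustrates the feed-forward back-propagation setup in MATLAB's neural network tooling that the abstract describes:

      % Feed-forward back-propagation fit to a synthetic failure-rate trend.
      opHours  = 1000:1000:30000;                          % operating hours (synthetic)
      failRate = 0.02 + 1e-5*opHours + 0.005*randn(size(opHours));

      net = feedforwardnet(10);              % one hidden layer of 10 neurons (assumed)
      net = train(net, opHours, failRate);   % Levenberg-Marquardt backprop by default
      predicted = net(opHours);              % network predictions
      mseError  = perform(net, failRate, predicted);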

  1. Assessing and Upgrading Ocean Mixing for the Study of Climate Change

    NASA Astrophysics Data System (ADS)

    Howard, A. M.; Fells, J.; Lindo, F.; Tulsee, V.; Canuto, V.; Cheng, Y.; Dubovikov, M. S.; Leboissetier, A.

    2016-12-01

    Climate is critical. Climate variability affects us all; Climate Change is a burning issue. Droughts, floods, other extreme events, and Global Warming's effects on these and problems such as sea-level rise and ecosystem disruption threaten lives. Citizens must be informed to make decisions concerning climate such as "business as usual" vs. mitigating emissions to keep warming within bounds. Medgar Evers undergraduates aid NASA research while learning climate science and developing computer and math skills. To make useful predictions we must realistically model each component of the climate system, including the ocean, whose critical role includes transporting and storing heat and dissolved CO2. We need physically based parameterizations of key ocean processes that can't be put explicitly in a global climate model, e.g. vertical and lateral mixing. The NASA-GISS turbulence group uses theory to model mixing, including: 1) a comprehensive scheme for small scale vertical mixing, including convection and shear, internal waves and double-diffusion, and bottom tides; 2) a new parameterization for the lateral and vertical mixing by mesoscale eddies. For better understanding we write our own programs. To assess the modelling, MATLAB programs visualize and calculate statistics, including means, standard deviations and correlations, on NASA-GISS OGCM output with different mixing schemes and help us study drift from observations. We also try to upgrade the schemes, e.g. the bottom tidal mixing parameterizations' roughness, calculated from high resolution topographic data using Gaussian weighting functions with cut-offs. We study the effects of their parameters to improve them. A FORTRAN program extracts topography data subsets of manageable size for a MATLAB program, tested on idealized cases, to visualize and calculate roughness on. Students are introduced to modeling a complex system, gain a deeper appreciation of climate science, programming skills and familiarity with MATLAB, while furthering climate science by improving our mixing schemes. We are incorporating climate research into our college curriculum. The PI is both a member of the turbulence group at NASA-GISS and an associate professor at Medgar Evers College of CUNY, an urban minority serving institution in central Brooklyn. Supported by NSF Award AGS-1359293.
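
    As a hedged sketch of the roughness calculation described above (the actual GISS definition, resolution, and cut-off are not reproduced; all numbers here are invented), a Gaussian-weighted local roughness of a topography grid can be computed in MATLAB as follows:

      % Gaussian-weighted local rms roughness of a stand-in topography grid.
      topo  = 500 * peaks(256);                    % placeholder topography (m)
      sigma = 5;                                   % weighting width (grid cells, assumed)
      r     = ceil(3*sigma);                       % cut-off radius
      [gx, gy] = meshgrid(-r:r, -r:r);
      w = exp(-(gx.^2 + gy.^2) / (2*sigma^2));
      w = w / sum(w(:));                           % normalized weights

      meanH  = conv2(topo,    w, 'same');          % weighted local mean height
      meanH2 = conv2(topo.^2, w, 'same');          % weighted local mean square
      rough  = sqrt(max(meanH2 - meanH.^2, 0));    % local rms roughness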

  2. EEGVIS: A MATLAB Toolbox for Browsing, Exploring, and Viewing Large Datasets.

    PubMed

    Robbins, Kay A

    2012-01-01

    Recent advances in data monitoring and sensor technology have accelerated the acquisition of very large data sets. Streaming data sets from instrumentation such as multi-channel EEG recording usually must undergo substantial pre-processing and artifact removal. Even when using automated procedures, most scientists engage in laborious manual examination and processing to assure high quality data and to identify interesting or problematic data segments. Researchers also do not have a convenient method of visually assessing the effects of applying any stage in a processing pipeline. EEGVIS is a MATLAB toolbox that allows users to quickly explore multi-channel EEG and other large array-based data sets using multi-scale drill-down techniques. Customizable summary views reveal potentially interesting sections of data, which users can explore further by clicking to examine using detailed viewing components. The viewer and a companion browser are built on our MoBBED framework, which has a library of modular viewing components that can be mixed and matched to best reveal structure. Users can easily create new viewers for their specific data without any programming during the exploration process. These viewers automatically support pan, zoom, resizing of individual components, and cursor exploration. The toolbox can be used directly in MATLAB at any stage in a processing pipeline, as a plug-in for EEGLAB, or as a standalone precompiled application without MATLAB running. EEGVIS and its supporting packages are freely available under the GNU general public license at http://visual.cs.utsa.edu/eegvis.

  3. A Collection of Nonlinear Aircraft Simulations in MATLAB

    NASA Technical Reports Server (NTRS)

    Garza, Frederico R.; Morelli, Eugene A.

    2003-01-01

    Nonlinear six degree-of-freedom simulations for a variety of aircraft were created using MATLAB. Data for aircraft geometry, aerodynamic characteristics, mass / inertia properties, and engine characteristics were obtained from open literature publications documenting wind tunnel experiments and flight tests. Each nonlinear simulation was implemented within a common framework in MATLAB, and includes an interface with another commercially-available program to read pilot inputs and produce a three-dimensional (3-D) display of the simulated airplane motion. Aircraft simulations include the General Dynamics F-16 Fighting Falcon, Convair F-106B Delta Dart, Grumman F-14 Tomcat, McDonnell Douglas F-4 Phantom, NASA Langley Free-Flying Aircraft for Sub-scale Experimental Research (FASER), NASA HL-20 Lifting Body, NASA / DARPA X-31 Enhanced Fighter Maneuverability Demonstrator, and the Vought A-7 Corsair II. All nonlinear simulations and 3-D displays run in real time in response to pilot inputs, using contemporary desktop personal computer hardware. The simulations can also be run in batch mode. Each nonlinear simulation includes the full nonlinear dynamics of the bare airframe, with a scaled direct connection from pilot inputs to control surface deflections to provide adequate pilot control. Since all the nonlinear simulations are implemented entirely in MATLAB, user-defined control laws can be added in a straightforward fashion, and the simulations are portable across various computing platforms. Routines for trim, linearization, and numerical integration are included. The general nonlinear simulation framework and the specifics for each particular aircraft are documented.
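
    The NASA simulations themselves are not reproduced here; as a hedged, much-reduced sketch of the numerical-integration step they rely on, a toy linear short-period model (illustrative numbers only, not any of the listed aircraft) can be integrated with ode45:

      % Toy longitudinal short-period model with a step elevator input.
      A = [-0.7  0.9;                         % state x = [alpha; q] (invented numbers)
           -4.0 -1.2];
      B = [-0.02; -6.0];
      elev = @(t) deg2rad(1) * (t >= 1);      % 1-degree elevator step at t = 1 s

      xdot   = @(t, x) A*x + B*elev(t);
      [t, x] = ode45(xdot, [0 10], [0; 0]);
      plot(t, rad2deg(x(:,1)), t, rad2deg(x(:,2)));
      legend('\alpha (deg)', 'q (deg/s)'); xlabel('time (s)');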

  4. The Role of Communicative Feedback in Successful Water Conservation Programs

    ERIC Educational Resources Information Center

    Tom, Gail; Tauchus, Gail; Williams, Jared; Tong, Stephanie

    2011-01-01

    The Sacramento County Water Agency has made available 2 water conservation programs to its customers. The Data Logger Program attaches the Meter Master Model 100 EL data logger to the customer's water meter for 1 week and provides a detailed report of water usage from each fixture. The Water Wise House Call Program provides findings and…

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bird, L.; Brown, E.

    In the early 1990s, only a handful of utilities offered their customers a choice of purchasing electricity generated from renewable energy sources. Today, nearly 600 utilities in regulated electricity markets--or almost 20% of all utilities nationally--provide their customers a "green power" option. Because some utilities offer programs in conjunction with cooperative associations or other publicly owned power entities, the number of distinct programs totals about 125. Through these programs, more than 40 million customers spanning 34 states have the ability to purchase renewable energy to meet some portion or all of their electricity needs--or make contributions to support the developmentmore » of renewable energy resources. Typically, customers pay a premium above standard electricity rates for this service. This report presents year-end 2004 data on utility green pricing programs, and examines trends in consumer response and program implementation over time. The data in this report, which were obtained via a questionnaire distributed to utility green pricing program managers, can be used by utilities as benchmarks by which to gauge the success of their green power programs.« less

  6. Buckling analysis of SMA bonded sandwich structure – using FEM

    NASA Astrophysics Data System (ADS)

    Katariya, Pankaj V.; Das, Arijit; Panda, Subrata K.

    2018-03-01

    The thermal buckling strength of a smart sandwich composite structure (bonded with shape memory alloy, SMA) is examined numerically via a higher-order finite element model in association with the marching technique. The excess geometrical distortion of the structure under the elevated thermal environment is modeled through Green's strain function, whereas the material nonlinearity is accounted for with the help of the marching method. The system responses are computed numerically by solving the generalized eigenvalue equations via a customized MATLAB code. The comprehensive behaviour of the current finite element solutions (minimum buckling load parameter) is established by solving an adequate number of numerical examples for the given input parameters. The current numerical model is extended further to check the influence of various structural parameters of the sandwich panel on the buckling temperature, including the SMA effect, and the results are reported in detail.
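
    The paper's higher-order finite element model is not reproduced here; the fragment below (a tiny invented two-degree-of-freedom system) only shows the final generalized eigenvalue step that a customized MATLAB buckling code of this kind performs:

      % Buckling load parameter from K*phi = lambda*Kg*phi (illustrative 2-DOF system).
      K  = [12 -4;                               % elastic stiffness (invented)
            -4  8];
      Kg = [1.0 0.2;                             % geometric stiffness for a unit thermal load (invented)
            0.2 0.6];
      lambda   = eig(K, Kg);                     % generalized eigenvalues
      lambdaCr = min(lambda(lambda > 0));        % smallest positive value = critical load factor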

  7. H-Bridge Inverter Loading Analysis for an Energy Management System

    DTIC Science & Technology

    2013-06-01

    In order to accomplish the stated objectives, a physics-based model of the system was developed in MATLAB/Simulink. The system was also implemented ...functional architecture and then compile the high level design down to VHDL in order to program the designed functions to the FPGA. B. INSULATED

  8. Small Internal Combustion Engine Testing for a Hybrid-Electric Remotely-Piloted Aircraft

    DTIC Science & Technology

    2011-03-01

    differential equations (ODEs) were formed and solved for numerically using various solvers in MATLAB . From these solutions, engine performance...program 5. □ Make sure eddy-current absorber and sprockets are free of debris and that no loose materials are close enough to become entangled

  9. An Elementary Algorithm to Evaluate Trigonometric Functions to High Precision

    ERIC Educational Resources Information Center

    Johansson, B. Tomas

    2018-01-01

    Evaluation of the cosine function is done via a simple Cordic-like algorithm, together with a package for handling arbitrary-precision arithmetic in the computer program Matlab. Approximations to the cosine function having hundreds of correct decimals are presented with a discussion around errors and implementation.
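
    The paper's arbitrary-precision implementation is not reproduced here; the sketch below is a plain double-precision Cordic-style iteration for cos(theta), valid for |theta| up to roughly 1.74 rad, with the understanding that a high-precision version would carry out the same recursion using vpa from the Symbolic Math Toolbox rather than doubles:

      % Double-precision CORDIC-style evaluation of cos(theta).
      theta = 0.7;  n = 40;
      ang = atan(2.^(-(0:n-1)));                       % elementary rotation angles
      K   = prod(1 ./ sqrt(1 + 2.^(-2*(0:n-1))));      % CORDIC gain correction
      x = K;  y = 0;  z = theta;
      for i = 0:n-1
          d = sign(z);  if d == 0, d = 1; end
          xNew = x - d * y * 2^(-i);
          yNew = y + d * x * 2^(-i);
          z    = z - d * ang(i+1);
          x = xNew;  y = yNew;
      end
      cosApprox = x;                                   % compare with cos(theta)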

  10. American Recovery and Reinvestment Act of 2009. Interim Report on Customer Acceptance, Retention, and Response to Time-Based Rates from the Consumer Behavior Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cappers, Peter; Hans, Liesel; Scheer, Richard

    Time-based rate programs, enabled by utility investments in advanced metering infrastructure (AMI), are increasingly being considered by utilities as tools to reduce peak demand and enable customers to better manage consumption and costs. There are several customer systems that are relatively new to the marketplace and have the potential for improving the effectiveness of these programs, including in-home displays (IHDs), programmable communicating thermostats (PCTs), and web portals. Policy and decision makers are interested in more information about customer acceptance, retention, and response before moving forward with expanded deployments of AMI-enabled new rates and technologies. Under the Smart Grid Investment Grant Program (SGIG), the U.S. Department of Energy (DOE) partnered with several utilities to conduct consumer behavior studies (CBS). The goals involved applying randomized and controlled experimental designs for estimating customer responses more precisely and credibly to advance understanding of time-based rates and customer systems, and provide new information for improving program designs, implementation strategies, and evaluations. The intent was to produce more robust and credible analysis of impacts, costs, benefits, and lessons learned and assist utility and regulatory decision makers in evaluating investment opportunities involving time-based rates. To help achieve these goals, DOE developed technical guidelines to help the CBS utilities estimate customer acceptance, retention, and response more precisely.

  11. Benefits of mass customized products: moderating role of product involvement and fashion innovativeness.

    PubMed

    Park, Minjung; Yoo, Jungmin

    2018-02-01

    The objective of this study was to explore impacts and benefits of mass customized products on emotional product attachment, favorable attitudes toward a mass customization program, and the ongoing effect on loyalty intentions. This study further investigated how benefits, attachment, attitudes, and loyalty intentions differed as a function of involvement and fashion innovativeness. 290 female online shoppers in South Korea participated in an online survey. Results of this study revealed that perceived benefits positively influenced emotional product attachment and attitudes toward a mass customization program. In addition, attachment positively influenced attitudes, which in turn affected loyalty intentions. This study also found that benefits, attachment, attitudes, and loyalty intentions were all higher in highly involved consumers (high fashion innovators) than those in less involved consumers (low fashion innovators). This study concludes with theoretical and practical implications for mass customization programs.

  12. 78 FR 27984 - Modification of the National Customs Automation Program Test (NCAP) Regarding Reconciliation for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-13

    ... Customs Automation Program Test (NCAP) Regarding Reconciliation for Filing Certain Post-Importation Claims... Automation Program (NCAP) Reconciliation prototype test to include the filing of post-importation... notices. DATES: The test is modified to allow Reconciliation of post-importation preferential tariff...

  13. 19 CFR 201.151 - Program accessibility: New construction and alterations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Program accessibility: New construction and alterations. 201.151 Section 201.151 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES... Activities Conducted by the U.S. International Trade Commission § 201.151 Program accessibility: New...

  14. 19 CFR 201.151 - Program accessibility: New construction and alterations.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 3 2011-04-01 2011-04-01 false Program accessibility: New construction and alterations. 201.151 Section 201.151 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES... Activities Conducted by the U.S. International Trade Commission § 201.151 Program accessibility: New...

  15. 19 CFR 201.151 - Program accessibility: New construction and alterations.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 3 2014-04-01 2014-04-01 false Program accessibility: New construction and alterations. 201.151 Section 201.151 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES... Activities Conducted by the U.S. International Trade Commission § 201.151 Program accessibility: New...

  16. 19 CFR 201.151 - Program accessibility: New construction and alterations.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 3 2012-04-01 2012-04-01 false Program accessibility: New construction and alterations. 201.151 Section 201.151 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES... Activities Conducted by the U.S. International Trade Commission § 201.151 Program accessibility: New...

  17. 19 CFR 201.151 - Program accessibility: New construction and alterations.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 3 2013-04-01 2013-04-01 false Program accessibility: New construction and alterations. 201.151 Section 201.151 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION GENERAL RULES... Activities Conducted by the U.S. International Trade Commission § 201.151 Program accessibility: New...

  18. 76 FR 34246 - Automated Commercial Environment (ACE); Announcement of National Customs Automation Program Test...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-13

    ... CBP with authority to conduct limited test programs or procedures designed to evaluate planned... aspects of this test, including the design, conduct and implementation of the test, in order to determine... Environment (ACE); Announcement of National Customs Automation Program Test of Automated Procedures for In...

  19. Is your company ready for one-to-one marketing?

    PubMed

    Peppers, D; Rogers, M; Dorf, B

    1999-01-01

    One-to-one marketing, also known as relationship marketing, promises to increase the value of your customer base by establishing a learning relationship with each customer. The customer tells you of some need, and you customize your product or service to meet it. Every interaction and modification improves your ability to fit your product to the particular customer. Eventually, even if a competitor offers the same type of service, your customer won't be able to enjoy the same level of convenience without taking the time to teach your competitor the lessons your company has already learned. Although the theory behind one-to-one marketing is simple, implementation is complex. Too many companies have jumped on the one-to-one band-wagon without proper preparation--mistakenly understanding it as an excuse to badger customers with excessive telemarketing and direct mail campaigns. The authors offer practical advice for implementing a one-to-one marketing program correctly. They describe four key steps: identifying your customers, differentiating among them, interacting with them, and customizing your product or service to meet each customer's needs. And they provide activities and exercises, to be administered to employees and customers, that will help you identify your company's readiness to launch a one-to-one initiative. Although some managers dismiss the possibility of one-to-one marketing as an unattainable goal, even a modest program can produce substantial benefits. This tool kit will help you determine what type of program your company can implement now, what you need to do to position your company for a large-scale initiative, and how to set priorities.

  20. 16 CFR 314.3 - Standards for safeguarding customer information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Standards for safeguarding customer... OF CONGRESS STANDARDS FOR SAFEGUARDING CUSTOMER INFORMATION § 314.3 Standards for safeguarding customer information. (a) Information security program. You shall develop, implement, and maintain a...

  1. 75 FR 74082 - Agency Information Collection Activities: Proposed Collection; Comments Requested

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-30

    ... of information collection under review: customer satisfaction surveys. The Department of Justice (DOJ... collection. (2) Title of the Form/Collection: Customer Satisfaction Surveys. (3) Agency form number, if any... program-specific customer satisfaction surveys to more effectively capture customer perception...

  2. 78 FR 4926 - Self-Regulatory Organizations; NASDAQ OMX PHLX LLC; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-23

    ... on the proposed rule change from interested persons. \\1\\ 15 U.S.C. 78s(b)(1). \\2\\ 17 CFR 240.19b-4. I....'' Specifically, the Exchange proposes to amend the Customer Rebate Program, Select Symbols,\\5\\ Simple and Complex... Category D to the Customer Rebate Program relating to Customer Simple Orders in Select Symbols. The...

  3. Strategic Mobility 21: Modeling, Simulation, and Analysis

    DTIC Science & Technology

    2010-04-14

    using AnyLogic, which is a Java-programmed, multi-method simulation modeling tool developed by XJ Technologies. The last section examines the academic... simulation model from an Arena platform to an AnyLogic-based Web Service. MATLAB is useful for small problems with few nodes, but GAMS/CPLEX is better... Transportation Modeling Studio™. The SCASN modeling and simulation program was designed to be generic in nature to allow for use by both commercial and

  4. Generating Customized Verifiers for Automatically Generated Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2008-01-01

    Program verification using Hoare-style techniques requires many logical annotations. We have previously developed a generic annotation inference algorithm that weaves in all annotations required to certify safety properties for automatically generated code. It uses patterns to capture generator- and property-specific code idioms and property-specific meta-program fragments to construct the annotations. The algorithm is customized by specifying the code patterns and integrating them with the meta-program fragments for annotation construction. However, this is difficult since it involves tedious and error-prone low-level term manipulations. Here, we describe an annotation schema compiler that largely automates this customization task using generative techniques. It takes a collection of high-level declarative annotation schemas tailored towards a specific code generator and safety property, and generates all customized analysis functions and glue code required for interfacing with the generic algorithm core, thus effectively creating a customized annotation inference algorithm. The compiler raises the level of abstraction and simplifies schema development and maintenance. It also takes care of some more routine aspects of formulating patterns and schemas, in particular handling of irrelevant program fragments and irrelevant variance in the program structure, which reduces the size, complexity, and number of different patterns and annotation schemas that are required. The improvements described here make it easier and faster to customize the system to a new safety property or a new generator, and we demonstrate this by customizing it to certify frame safety of space flight navigation code that was automatically generated from Simulink models by MathWorks' Real-Time Workshop.

  5. LC/QTOF-MS fragmentation of N-nitrosodimethylamine precursors in drinking water supplies is predictable and aids their identification.

    PubMed

    Hanigan, David; Ferrer, Imma; Thurman, E Michael; Herckes, Pierre; Westerhoff, Paul

    2017-02-05

    N-Nitrosodimethylamine (NDMA) is carcinogenic in rodents and occurs in chloraminated drinking water and wastewater effluents. NDMA forms via reactions between chloramines and mostly unidentified, N-containing organic matter. We developed a mass spectrometry technique to identify NDMA precursors by analyzing 25 model compounds with LC/QTOF-MS. We searched isolates of 11 drinking water sources and 1 wastewater using a custom MATLAB ® program and extracted ion chromatograms for two fragmentation patterns that were specific to the model compounds. Once a diagnostic fragment was discovered, we conducted MS/MS during a subsequent injection to confirm the precursor ion. Using non-target searches and two diagnostic fragmentation patterns, we discovered 158 potential NDMA precursors. Of these, 16 were identified using accurate mass combined with fragment and retention time matches of analytical standards when available. Five of these sixteen NDMA precursors were previously unidentified in the literature, three of which were metabolites of pharmaceuticals. Except methadone, the newly identified precursors all had NDMA molar yields of less than 5%, indicating that NDMA formation could be additive from multiple compounds, each with low yield. We demonstrate that the method is applicable to other disinfection by-product precursors by predicting and verifying the fragmentation patterns for one nitrosodiethylamine precursor. Copyright © 2016. Published by Elsevier B.V.
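
    The authors' custom MATLAB program is not shown in the abstract; as a hedged sketch of the non-target extraction step it performs, an extracted ion chromatogram for one diagnostic fragment can be pulled from centroided LC/MS data roughly as follows (the input variables rt, mz, inten and the fragment mass are assumptions):

      % Extracted ion chromatogram for one diagnostic fragment (generic sketch).
      % Assumed inputs: rt (1 x nScans), cell arrays mz{k} and inten{k} per scan.
      targetMZ = 58.0651;                 % hypothetical fragment m/z
      tolPPM   = 10;                      % mass tolerance (ppm, assumed)
      tol      = targetMZ * tolPPM * 1e-6;

      nScans = numel(rt);
      eic    = zeros(1, nScans);
      for k = 1:nScans
          hit    = abs(mz{k} - targetMZ) <= tol;   % centroids within tolerance
          eic(k) = sum(inten{k}(hit));             % summed intensity in this scan
      end
      plot(rt, eic); xlabel('retention time (min)'); ylabel('intensity');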

  6. Neurovascular Network Explorer 1.0: a database of 2-photon single-vessel diameter measurements with MATLAB(®) graphical user interface.

    PubMed

    Sridhar, Vishnu B; Tian, Peifang; Dale, Anders M; Devor, Anna; Saisan, Payam A

    2014-01-01

    We present a database client software-Neurovascular Network Explorer 1.0 (NNE 1.0)-that uses a MATLAB(®) based Graphical User Interface (GUI) for interaction with a database of 2-photon single-vessel diameter measurements from our previous publication (Tian et al., 2010). These data are of particular interest for modeling the hemodynamic response. NNE 1.0 is downloaded by the user and then runs either as a MATLAB script or as a standalone program on a Windows platform. The GUI allows browsing the database according to parameters specified by the user, simple manipulation and visualization of the retrieved records (such as averaging and peak-normalization), and export of the results. Further, we provide NNE 1.0 source code. With this source code, the user can database their own experimental results, given the appropriate data structure and naming conventions, and thus share their data in a user-friendly format with other investigators. NNE 1.0 provides an example of a seamless and low-cost solution for sharing of experimental data by a regular-size neuroscience laboratory and may serve as a general template, facilitating dissemination of biological results and accelerating data-driven modeling approaches.

  7. MultiElec: A MATLAB Based Application for MEA Data Analysis.

    PubMed

    Georgiadis, Vassilis; Stephanou, Anastasis; Townsend, Paul A; Jackson, Thomas R

    2015-01-01

    We present MultiElec, an open source MATLAB based application for data analysis of microelectrode array (MEA) recordings. MultiElec displays an extremely user-friendly graphic user interface (GUI) that allows the simultaneous display and analysis of voltage traces for 60 electrodes and includes functions for activation-time determination, the production of activation-time heat maps with activation time and isoline display. Furthermore, local conduction velocities are semi-automatically calculated along with their corresponding vector plots. MultiElec allows ad hoc signal suppression, enabling the user to easily and efficiently handle signal artefacts and for incomplete data sets to be analysed. Voltage traces and heat maps can be simply exported for figure production and presentation. In addition, our platform is able to produce 3D videos of signal progression over all 60 electrodes. Functions are controlled entirely by a single GUI with no need for command line input or any understanding of MATLAB code. MultiElec is open source under the terms of the GNU General Public License as published by the Free Software Foundation, version 3. Both the program and source code are available to download from http://www.cancer.manchester.ac.uk/MultiElec/.
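
    MultiElec's own routines are not reproduced here; a hedged, minimal version of the activation-time step it automates might take the time of steepest negative deflection of one electrode trace (the input names t and v are assumptions):

      % Minimal activation-time estimate for a single electrode trace.
      % Assumed inputs: t (1 x nSamples, s) and v (1 x nSamples, mV).
      dvdt        = diff(v) ./ diff(t);     % discrete derivative of the voltage trace
      [~, idx]    = min(dvdt);              % steepest negative slope
      activationT = t(idx);                 % activation-time estimate (s)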

  8. 19 CFR 351.510 - Indirect taxes and import charges (other than export programs).

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 3 2013-04-01 2013-04-01 false Indirect taxes and import charges (other than export programs). 351.510 Section 351.510 Customs Duties INTERNATIONAL TRADE ADMINISTRATION, DEPARTMENT... Subsidies § 351.510 Indirect taxes and import charges (other than export programs). (a) Benefit—(1...

  9. Customer Service Training for Public Services Staff at Temple University's Central Library System.

    ERIC Educational Resources Information Center

    Arthur, Gwen

    Arguing that good interpersonal interactions between library staff and their patrons is a major determinant of overall patron satisfaction, this paper describes Temple University's customer service training program for its public services staff. Dubbed the "A+ Service" program, the program focuses on six aspects of library service: (1)…

  10. 78 FR 35044 - U.S. Customs and Border Protection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-11

    ... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection Agency Information Collection Activities: Visa Waiver Program Carrier Agreement (CBP Form I-775) AGENCY: U.S. Customs and Border Protection... information collection: 1651-0110. SUMMARY: U.S. Customs and Border Protection (CBP) of the Department of...

  11. SEQassembly: A Practical Tools Program for Coding Sequences Splicing

    NASA Astrophysics Data System (ADS)

    Lee, Hongbin; Yang, Hang; Fu, Lei; Qin, Long; Li, Huili; He, Feng; Wang, Bo; Wu, Xiaoming

    A CDS (coding sequence) is the portion of an mRNA sequence that is composed of a number of exon sequence segments. The construction of the CDS sequence is important for profound genetic analyses such as genotyping. A program in the MATLAB environment is presented, which can process batches of sample sequences into code segments under the guidance of reference exon models and splice the code segments from the same sample source into a CDS according to the exon order in a queue file. This program is useful in transcriptional polymorphism detection and gene function studies.
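
    SEQassembly itself is not reproduced here; the core splicing step it describes amounts to concatenating exon segments in queue order, which a minimal hedged sketch (toy sequences only) shows below:

      % Splice exon segments, already in queue-file order, into one CDS (toy data).
      exons = {'ATGGCT', 'GGATCC', 'TGA'};        % toy exon segments in order
      cds   = strjoin(exons, '');                 % concatenate in order
      assert(mod(length(cds), 3) == 0, 'CDS length is not a multiple of 3');
      fprintf('CDS (%d nt): %s\n', length(cds), cds);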

  12. Development of an improved MATLAB GUI for the prediction of coefficients of restitution, and integration into LMS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baca, Renee Nicole; Congdon, Michael L.; Brake, Matthew Robert

    In 2012, a Matlab GUI for the prediction of the coefficient of restitution was developed in order to enable the formulation of more accurate Finite Element Analysis (FEA) models of components. This report details the development of a new Rebound Dynamics GUI, and how it differs from the previously developed program. The new GUI includes several new features, such as source and citation documentation for the material database, as well as a multiple materials impact modeler for use with LMS Virtual.Lab Motion (LMS VLM), a rigid body dynamics modeling software package. The Rebound Dynamics GUI has been designed to work with LMS VLM to enable straightforward incorporation of velocity-dependent coefficients of restitution in rigid body dynamics simulations.

  13. Spectrum image analysis tool - A flexible MATLAB solution to analyze EEL and CL spectrum images.

    PubMed

    Schmidt, Franz-Philipp; Hofer, Ferdinand; Krenn, Joachim R

    2017-02-01

    Spectrum imaging techniques, simultaneously gaining structural (image) and spectroscopic data, require appropriate and careful processing to extract information from the dataset. In this article we introduce a MATLAB based software that uses three dimensional data (an EEL/CL spectrum image in dm3 format (Gatan Inc.'s DigitalMicrograph®)) as input. A graphical user interface enables fast and easy mapping of spectrally dependent images and position dependent spectra. First, data processing such as background subtraction, deconvolution and denoising; second, multiple display options including an EEL/CL moviemaker; and third, applicability to a large number of data sets with a small workload make this program an interesting tool to visualize otherwise hidden details. Copyright © 2016 Elsevier Ltd. All rights reserved.
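
    The published tool's processing chain is not reproduced here; as a hedged example of the background-subtraction step it offers, the standard EELS pre-edge power-law model A*E^(-r) can be fitted and removed in a few lines (the energy window and the input arrays E and S are assumptions):

      % Power-law background fit and subtraction for one EEL spectrum.
      % Assumed inputs: energy axis E (eV) and spectrum S, both 1 x nChannels.
      fitWin = E >= 420 & E <= 450;             % pre-edge fit window (assumed, eV)
      p      = polyfit(log(E(fitWin)), log(S(fitWin)), 1);   % fit in log-log space
      r      = -p(1);  A = exp(p(2));           % power-law parameters
      bg     = A * E.^(-r);                     % extrapolated background
      Ssub   = S - bg;                          % background-subtracted spectrum
      plot(E, S, E, bg, E, Ssub); xlabel('energy loss (eV)');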

  14. Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB.

    PubMed

    Lee, Leng-Feng; Umberger, Brian R

    2016-01-01

    Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1-2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility.
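
    The musculoskeletal problems in the paper are far larger than what fits here; as a hedged sketch of the direct collocation idea itself (trapezoidal defect constraints enforced as equalities and passed to fmincon), a double-integrator "move from rest to rest" problem can be written as:

      % Trapezoidal direct collocation for a double integrator (toy problem).
      N = 25;  T = 1;  h = T/(N-1);                % nodes, horizon, step
      ix = 1:N;  iv = N+1:2*N;  iu = 2*N+1:3*N;    % z = [x; v; u] index blocks

      obj = @(z) h/2 * sum(z(iu(1:end-1)).^2 + z(iu(2:end)).^2);   % control effort

      % Trapezoidal defects plus boundary conditions (x: 0 -> 1, v: 0 -> 0).
      ceqFun = @(z) [ z(ix(2:end)) - z(ix(1:end-1)) - h/2*(z(iv(1:end-1)) + z(iv(2:end)));
                      z(iv(2:end)) - z(iv(1:end-1)) - h/2*(z(iu(1:end-1)) + z(iu(2:end)));
                      z(ix(1)); z(iv(1)); z(ix(end)) - 1; z(iv(end)) ];
      nonlcon = @(z) deal([], ceqFun(z));

      z0   = zeros(3*N, 1);
      opts = optimoptions('fmincon', 'Display', 'off', 'MaxFunctionEvaluations', 1e5);
      zOpt = fmincon(obj, z0, [], [], [], [], [], [], nonlcon, opts);
      plot(linspace(0, T, N), zOpt(iu)); xlabel('time (s)'); ylabel('control u');

    The same pattern, with states and controls at the nodes as decision variables and the dynamics as defect constraints, is what scales up in the framework the paper describes, with OpenSim supplying the musculoskeletal dynamics.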

  15. Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB

    PubMed Central

    Lee, Leng-Feng

    2016-01-01

    Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1–2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility. PMID:26835184

  16. Building brand equity and customer loyalty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pokorny, G.

    Customer satisfaction and customer loyalty are two different concepts, not merely two different phrases measuring a single consumer attitude. Utilities having identical customer satisfaction ratings based on performance in areas like power reliability, pricing, and quality of service differ dramatically in their levels of customer loyalty. As competitive markets establish themselves, discrepancies in customer loyalty will have profound impacts on each utility's prospects for market retention, profitability, and ultimately, shareholder value. Meeting pre-existing consumer needs, wants and preferences is the foundation of any utility strategy for building customer loyalty and market retention. Utilities meet their underlying customer expectations by performing well in three discrete areas: product, customer service programs, and customer service transactions. Brand equity is an intervening variable standing between performance and the loyalty a utility desires. It is the totality of customer perceptions about the unique extra value the utility provides above and beyond its basic product, customer service programs and customer service transactions; it is the tangible, palpable reality of a branded utility that exists in the minds of consumers. By learning to manage their brand equity as well as they manage their brand performance, utilities gain control over all the major elements in the value-creation process that creates customer loyalty. By integrating brand performance and brand equity, electric utility companies can truly become in their customers' eyes a brand - a unique, very special, value-added energy services provider that can ask for and deserve a premium price in the marketplace.

  17. Department of the Navy Justification of Estimates for Fiscal Years 1988 and 1989 Submitted to Congress January 1987. Operation & Maintenance, Navy. Book 2 of 3. Budget Activity 7: Central Supply and Maintenance

    DTIC Science & Technology

    1987-01-01

    training, customer service, small/uneconomical lot manufacturing, preservation and depreservation, aircraft salvage and recovery, and support of depot...for the commercial modification programs. 1,260 4) Increase in commercial A/C preservation and customer service effort. 69 9. Program Decreases...3,499 Salvage 703 459 712 845 Acceptance/Transfer 2,587 1,645- 2,376 2,453 Customer/Fleet Training 2,838 2,008 3,058 2,997 Customer Services 17,276

  18. Using MATLAB software with Tomcat server and Java platform for remote image analysis in pathology.

    PubMed

    Markiewicz, Tomasz

    2011-03-30

    The Matlab software is one of the most advanced development tools for applications in engineering practice. From our point of view the most important component is the Image Processing Toolbox, which offers many built-in functions, including mathematical morphology, and implementations of many types of artificial neural networks. It is a very popular platform for creating specialized programs for image analysis, including in pathology. Based on the latest version of the Matlab Builder Java toolbox, it is possible to create software that serves as a remote system for image analysis in pathology over an internet connection. The internet platform can be realized with JavaServer Pages (JSP) using a Tomcat server as the servlet container. In the presented software implementation we propose remote image analysis performed by Matlab algorithms. These algorithms can be compiled to an executable jar file with the help of the Matlab Builder Java toolbox. The Matlab function must be declared with a set of input data, an output structure with numerical results, and a Matlab web figure. Any function prepared in that manner can be used as a Java function in JSP. The graphical user interface that provides the input data and displays the results (also in graphical form) must be implemented in JSP. Additionally, data storage to a database can be implemented within the algorithm written in Matlab, with the help of the Matlab Database Toolbox, directly alongside the image processing. The complete JSP page can be run by the Tomcat server. The proposed tool for remote image analysis was tested on the Computerized Analysis of Medical Images (CAMI) software developed by the author. The user provides the image and case information (diagnosis, staining, image parameters, etc.). When the analysis is initialized, the input data and image are sent to the servlet on Tomcat. When the analysis is done, the client obtains the graphical results as an image with the recognized cells marked, along with the quantitative output. Additionally, the results are stored in a server database. The internet platform was tested on a PC Intel Core2 Duo T9600 2.8 GHz, 4 GB RAM server with 768x576 pixel, 1.28 MB tiff-format images referring to a meningioma tumour (x400, Ki-67/MIB-1). The time consumption was as follows: for analysis by CAMI locally on the server, 3.5 seconds; for remote analysis, 26 seconds, of which 22 seconds were used for data transfer over the internet connection. For a jpg-format image (102 KB) the time was reduced to 14 seconds. The results confirmed that the designed remote platform can be useful for pathology image analysis. The time consumption depends mainly on the image size and the speed of the internet connection. The presented implementation can be used for many types of analysis with different staining, tissue, and morphometry approaches, etc. A significant remaining problem is the implementation of the JSP page in multithreaded form so that it can be used in parallel by many users. The presented platform for image analysis in pathology can be especially useful for small laboratories without their own image analysis systems.
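
    As an illustration of the function pattern described above (declared inputs, an output structure of numerical results, and a figure for web display), here is a hypothetical MATLAB skeleton; the function name, file handling, thresholding logic, and field names are placeholders rather than the actual CAMI code, and the Builder-side wrapping of the figure for the web is done in the Java layer and is not shown.

      function [results, fig] = analyzeSlide(imgPath, caseInfo)
      % Hypothetical skeleton of a deployable analysis function: inputs from the
      % JSP front end, a structure of numerical results, and a figure handle that
      % the compiled Java component can expose as a web figure.
      img  = imread(imgPath);                 % image uploaded through the JSP page
      gray = rgb2gray(img);
      bw   = imbinarize(imcomplement(gray));  % crude nuclei mask (placeholder logic)
      bw   = bwareaopen(bw, 50);              % drop small speckles
      cc   = bwconncomp(bw);

      results.caseInfo  = caseInfo;           % echo diagnosis/staining metadata
      results.cellCount = cc.NumObjects;      % quantitative output for the client

      fig = figure('Visible', 'off');         % rendered server-side
      imshow(img); hold on;
      stats = regionprops(cc, 'Centroid');
      if ~isempty(stats)
          pts = cat(1, stats.Centroid);
          plot(pts(:,1), pts(:,2), 'r+');     % mark recognized cells
      end
      title(sprintf('Recognized cells: %d', results.cellCount));
      end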

  19. Using MATLAB software with Tomcat server and Java platform for remote image analysis in pathology

    PubMed Central

    2011-01-01

    Background: The Matlab software is one of the most advanced development tools for applications in engineering practice. From our point of view the most important component is the Image Processing Toolbox, which offers many built-in functions, including mathematical morphology, and implementations of many types of artificial neural networks. It is a very popular platform for creating specialized programs for image analysis, including in pathology. Based on the latest version of the Matlab Builder Java toolbox, it is possible to create software that serves as a remote system for image analysis in pathology over an internet connection. The internet platform can be realized with JavaServer Pages (JSP) using a Tomcat server as the servlet container. Methods: In the presented software implementation we propose remote image analysis performed by Matlab algorithms. These algorithms can be compiled to an executable jar file with the help of the Matlab Builder Java toolbox. The Matlab function must be declared with a set of input data, an output structure with numerical results, and a Matlab web figure. Any function prepared in that manner can be used as a Java function in JSP. The graphical user interface that provides the input data and displays the results (also in graphical form) must be implemented in JSP. Additionally, data storage to a database can be implemented within the algorithm written in Matlab, with the help of the Matlab Database Toolbox, directly alongside the image processing. The complete JSP page can be run by the Tomcat server. Results: The proposed tool for remote image analysis was tested on the Computerized Analysis of Medical Images (CAMI) software developed by the author. The user provides the image and case information (diagnosis, staining, image parameters, etc.). When the analysis is initialized, the input data and image are sent to the servlet on Tomcat. When the analysis is done, the client obtains the graphical results as an image with the recognized cells marked, along with the quantitative output. Additionally, the results are stored in a server database. The internet platform was tested on a PC Intel Core2 Duo T9600 2.8 GHz, 4 GB RAM server with 768x576 pixel, 1.28 MB tiff-format images referring to a meningioma tumour (x400, Ki-67/MIB-1). The time consumption was as follows: for analysis by CAMI locally on the server, 3.5 seconds; for remote analysis, 26 seconds, of which 22 seconds were used for data transfer over the internet connection. For a jpg-format image (102 KB) the time was reduced to 14 seconds. Conclusions: The results confirmed that the designed remote platform can be useful for pathology image analysis. The time consumption depends mainly on the image size and the speed of the internet connection. The presented implementation can be used for many types of analysis with different staining, tissue, and morphometry approaches, etc. A significant remaining problem is the implementation of the JSP page in multithreaded form so that it can be used in parallel by many users. The presented platform for image analysis in pathology can be especially useful for small laboratories without their own image analysis systems. PMID:21489188

  20. Mechanisms and Mitigation of Hearing Loss from Blast Injury

    DTIC Science & Technology

    2012-10-01

    Apple Hill Drive Natick, MA 01760-2098 USA). The matlab program controlled the stimulus presentation and 11 Figure 2: Cochleostomies in scala ...Mechanisms and Mitigation of Hearing Loss from Blast Injury 5b. GRANT NUMBER W81XWH-10-2-0112 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) James R...gauge lm in the shock tube rupture membrane. Lessons learned Tympanic membrane rupture Data were highly variable, with one rupture at 7 PSI, another at

  1. GLS-Finder: An Automated Data-Mining System for Fast Profiling Glucosinolates and its Application in Brassica Vegetables

    USDA-ARS?s Scientific Manuscript database

    A rapid computer-aided program for profiling glucosinolates, "GLS-Finder", was developed. GLS-Finder is a MATLAB script-based expert system capable of qualitative and semi-quantitative analysis of glucosinolates in samples using data generated by ultra-high performance liquid chromatograph...

  2. Automatic Rock Detection and Mapping from HiRISE Imagery

    NASA Technical Reports Server (NTRS)

    Huertas, Andres; Adams, Douglas S.; Cheng, Yang

    2008-01-01

    This system includes a C-code software program and a set of MATLAB software tools for statistical analysis and rock distribution mapping. The major functions include rock detection and rock detection validation. The rock detection code has been evolved into a production tool that can be used by engineers and geologists with minor training.

  3. Analysis of 3D Subharmonic Ultrasound Signals from Patients with Known Breast Masses for Lesion Differentiation

    DTIC Science & Technology

    2014-12-01

    ...Jefferson University. BODY, 5.1 Training Component: The training component of this research has been split into breast imaging and image...exclusively with Matlab (the programming language that will be used for the research component of this project). Additionally, the PI attended

  4. Development of Silica Fibers and Microstructures with Large and Thermodynamically Stable Second Order Nonlinearity

    DTIC Science & Technology

    2011-06-22

    high degree of symmetry directly leads to a symmetry-enforced selection rule that can produce quantum entanglement [21, 22]. This report is organized...page.) Then, using a Matlab program, we converted the microscope image to a binary bitmap, from which we extract fiber radius at any given location

  5. Simulated Analysis of Linear Reversible Enzyme Inhibition with SCILAB

    ERIC Educational Resources Information Center

    Antuch, Manuel; Ramos, Yaquelin; Álvarez, Rubén

    2014-01-01

    SCILAB is a lesser-known program (than MATLAB) for numeric simulations and has the advantage of being free software. A challenging software-based activity to analyze the most common linear reversible inhibition types with SCILAB is described. Students establish typical values for the concentration of enzyme, substrate, and inhibitor to simulate…

  6. Large-Scale Dynamic Observation Planning for Unmanned Surface Vessels

    DTIC Science & Technology

    2007-06-01

    programming language. In addition, the useful development software NetBeans IDE is free and makes the use of Java very user-friendly. 92...3. We implemented the greedy and 3PAA algorithms in Java using the NetBeans IDE version 5.5. 4. The test datasets were generated in MATLAB. 5

  7. Human dynamics of spending: Longitudinal study of a coalition loyalty program

    NASA Astrophysics Data System (ADS)

    Yi, Il Gu; Jeong, Hyang Min; Choi, Woosuk; Jang, Seungkwon; Lee, Heejin; Kim, Beom Jun

    2014-09-01

    Large-scale data from a coalition loyalty program are analyzed in terms of the temporal dynamics of customers' behaviors. We report that the two main activities of a loyalty program, earning and redemption of points, exhibit very different behaviors. It is also found that as customers grow older beyond their early 20s, both male and female customers increase their earning and redemption activities until they reach turning points, beyond which both activities decrease. The positions of the turning points, as well as the maximum earned and redeemed points, are found to differ between males and females. On top of these temporal behaviors, we identify a learning effect: customers learn how to earn and redeem points as their experience accumulates over time.

  8. ISMRM Raw Data Format: A Proposed Standard for MRI Raw Datasets

    PubMed Central

    Inati, Souheil J.; Naegele, Joseph D.; Zwart, Nicholas R.; Roopchansingh, Vinai; Lizak, Martin J.; Hansen, David C.; Liu, Chia-Ying; Atkinson, David; Kellman, Peter; Kozerke, Sebastian; Xue, Hui; Campbell-Washburn, Adrienne E.; Sørensen, Thomas S.; Hansen, Michael S.

    2015-01-01

    Purpose This work proposes the ISMRM Raw Data (ISMRMRD) format as a common MR raw data format, which promotes algorithm and data sharing. Methods A file format consisting of a flexible header and tagged frames of k-space data was designed. Application Programming Interfaces were implemented in C/C++, MATLAB, and Python. Converters for Bruker, General Electric, Philips, and Siemens proprietary file formats were implemented in C++. Raw data were collected using MRI scanners from four vendors, converted to ISMRMRD format, and reconstructed using software implemented in three programming languages (C++, MATLAB, Python). Results Images were obtained by reconstructing the raw data from all vendors. The source code, raw data, and images comprising this work are shared online, serving as an example of an image reconstruction project following a paradigm of reproducible research. Conclusion The proposed raw data format solves a practical problem for the MRI community. It may serve as a foundation for reproducible research and collaborations. The ISMRMRD format is a completely open and community-driven format, and the scientific community is invited (including commercial vendors) to participate either as users or developers. PMID:26822475

  9. Diagnosis of Lung Cancer by Fractal Analysis of Damaged DNA

    PubMed Central

    Namazi, Hamidreza; Kiminezhadmalaie, Mona

    2015-01-01

    Cancer starts when cells in a part of the body start to grow out of control; cells become cancer cells because of DNA damage. A DNA walk of a genome represents how the frequency of each nucleotide of a pairing nucleotide couple changes locally. In this research, in order to study cancer genes, DNA walk plots of the genomes of patients with lung cancer were generated using a program written in the MATLAB language. The data so obtained were checked for fractal properties by computing the fractal dimension using a program written in MATLAB. Also, the correlation of damaged DNA was studied using the Hurst exponent measure. We found that the damaged DNA sequences exhibit a higher degree of fractality and less correlation compared with normal DNA sequences, so we confirmed that this method can be used for early detection of lung cancer. The method introduced in this research is not only useful for the diagnosis of lung cancer but can also be applied to the detection and growth analysis of different types of cancers. PMID:26539245
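
    The general procedure can be sketched in MATLAB as follows, under our own assumptions: a purine/pyrimidine +1/-1 mapping for the DNA walk and a rescaled-range estimate of the Hurst exponent. The random sequence stands in for a real gene and all parameters are arbitrary, so this is an illustration rather than the authors' code.

      bases = 'ACGT';
      seq   = bases(randi(4, 1, 5000));        % stand-in for a real gene sequence
      step  = ones(1, numel(seq));
      step(seq == 'C' | seq == 'T') = -1;      % purine (+1) / pyrimidine (-1) mapping
      walk  = cumsum(step);                    % the DNA walk
      plot(walk); xlabel('position'); ylabel('walk value');

      % Rescaled-range (R/S) estimate of the Hurst exponent
      ns = round(logspace(log10(16), log10(numel(step)/4), 12));   % window sizes
      RS = zeros(size(ns));
      for k = 1:numel(ns)
          n  = ns(k);
          m  = floor(numel(step)/n);
          rs = zeros(1, m);
          for j = 1:m
              seg   = step((j-1)*n+1 : j*n);
              dev   = cumsum(seg - mean(seg));
              rs(j) = (max(dev) - min(dev)) / std(seg);
          end
          RS(k) = mean(rs);
      end
      p = polyfit(log(ns), log(RS), 1);
      H = p(1);                                % Hurst exponent (fractal dim ~ 2 - H)
      fprintf('Estimated Hurst exponent: %.2f\n', H);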

  10. An efficient Matlab script to calculate heterogeneous anisotropically elastic wave propagation in three dimensions

    USGS Publications Warehouse

    Boyd, O.S.

    2006-01-01

    We have created a second-order finite-difference solution to the anisotropic elastic wave equation in three dimensions and implemented the solution as an efficient Matlab script. This program allows the user to generate synthetic seismograms for three-dimensional anisotropic earth structure. The code was written for teleseismic wave propagation in the 1-0.1 Hz frequency range but is of general utility and can be used at all scales of space and time. This program was created to help distinguish among various types of lithospheric structure given the uneven distribution of sources and receivers commonly utilized in passive source seismology. Several successful implementations have resulted in a better appreciation for subduction zone structure, the fate of a transform fault with depth, lithospheric delamination, and the effects of wavefield focusing and defocusing on attenuation. Companion scripts are provided which help the user prepare input to the finite-difference solution. Boundary conditions including specification of the initial wavefield, absorption and two types of reflection are available. © 2005 Elsevier Ltd. All rights reserved.
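
    For readers unfamiliar with the method, the fragment below reduces the same explicit second-order time-stepping idea to a 1-D scalar wave equation with a Ricker source; it is only a didactic stand-in with arbitrary parameters, not the 3-D anisotropic elastic solver described above.

      nx = 400;  dx = 1000;                % grid points and spacing (m)
      c  = 4000;                           % wave speed (m/s)
      dt = 0.4*dx/c;  nt = 800;            % CFL-stable time step, number of steps
      f0 = 0.2;  t0 = 1.5/f0;              % source peak frequency (Hz) and delay (s)
      src = 200;                           % source node
      u = zeros(nx,1);  uold = u;
      for it = 1:nt
          t   = it*dt;
          lap = [0; u(3:end) - 2*u(2:end-1) + u(1:end-2); 0] / dx^2;
          wav = (1 - 2*(pi*f0*(t-t0))^2) * exp(-(pi*f0*(t-t0))^2);   % Ricker wavelet
          unew      = 2*u - uold + (c*dt)^2 * lap;                   % 2nd-order update
          unew(src) = unew(src) + dt^2 * wav;                        % inject source
          uold = u;  u = unew;             % fixed (zero) ends, no absorbing boundaries
      end
      plot((0:nx-1)*dx/1000, u); xlabel('distance (km)'); ylabel('displacement');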

  11. MetaRep, an extended CMAS 3D program to visualize mafic (CMAS, ACF-S, ACF-N) and pelitic (AFM-K, AFM-S, AKF-S) projections

    NASA Astrophysics Data System (ADS)

    France, Lydéric; Nicollet, Christian

    2010-06-01

    MetaRep is a program based on our earlier program CMAS 3D and is developed in MATLAB® script. MetaRep's objectives are to visualize and project major-element compositions of mafic and pelitic rocks and their minerals in the pseudo-quaternary projections of the ACF-S, ACF-N, CMAS, AFM-K, AFM-S and AKF-S systems. These six systems are commonly used to describe metamorphic mineral assemblages and magmatic evolutions. Each system, made of four apices, can be represented as a tetrahedron that can be visualized in three dimensions with MetaRep; the four tetrahedron apices represent oxides or combinations of oxides that define the composition of the projected rock or mineral. The three-dimensional representation gives a better understanding of the topology of the relationships between the rocks and minerals. From these systems, MetaRep can also project data in ternary plots (for example, the ACF, AFM and AKF ternary projections can be generated). A functional interface makes it easy to use and does not require any knowledge of MATLAB® programming. To facilitate its use, MetaRep loads, from the main interface, data compiled in a Microsoft Excel™ spreadsheet. Although useful for scientific research, the program is also a powerful tool for teaching. We propose an application example that, by using two combined systems (ACF-S and ACF-N), provides strong support for the petrological interpretation.
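
    A minimal sketch of the underlying tetrahedral projection follows (our own illustration, not MetaRep code): a composition expressed as four apex components, labelled generically A, C, F and S with made-up values, is treated as barycentric coordinates of a regular tetrahedron and plotted in 3-D.

      apex = [0   0         0;                % A
              1   0         0;                % C
              0.5 sqrt(3)/2 0;                % F
              0.5 sqrt(3)/6 sqrt(6)/3];       % S (regular tetrahedron vertices)

      comp = [0.35 0.25 0.30 0.10;            % hypothetical whole-rock analysis
              0.10 0.55 0.20 0.15];           % hypothetical mineral analysis
      comp = comp ./ sum(comp, 2);            % normalize each row to 1

      xyz = comp * apex;                      % barycentric -> Cartesian coordinates

      % draw the tetrahedron edges and the projected compositions
      edges = nchoosek(1:4, 2);
      figure; hold on;
      for e = edges'
          plot3(apex(e,1), apex(e,2), apex(e,3), 'k-');
      end
      plot3(xyz(:,1), xyz(:,2), xyz(:,3), 'ro', 'MarkerFaceColor', 'r');
      text(apex(:,1), apex(:,2), apex(:,3), {'A','C','F','S'});
      axis equal; view(3); grid on;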

  12. Using STOQS and stoqstoolbox for in situ Measurement Data Access in Matlab

    NASA Astrophysics Data System (ADS)

    López-Castejón, F.; Schlining, B.; McCann, M. P.

    2012-12-01

    This poster presents the stoqstoolbox, an extension to Matlab that simplifies the loading of in situ measurement data directly from STOQS databases. STOQS (Spatial Temporal Oceanographic Query System) is a geospatial database tool designed to provide efficient access to data following the CF-NetCDF Discrete Samples Geometries convention. Data are loaded from CF-NetCDF files into a STOQS database where indexes are created on depth, spatial coordinates and other parameters, e.g. platform type. STOQS provides consistent, simple and efficient methods to query for data. For example, we can request all measurements with a standard_name of sea_water_temperature between two times and from between two depths. Data access is simpler because the data are retrieved by parameter irrespective of platform or mission file names. Access is more efficient because data are retrieved via the index on depth and only the requested data are retrieved from the database and transferred into the Matlab workspace. Applications in the stoqstoolbox query the STOQS database via an HTTP REST application programming interface; they follow the Data Access Object pattern, enabling highly customizable query construction. Data are loaded into Matlab structures that clearly indicate latitude, longitude, depth, measurement data value, and platform name. The stoqstoolbox is designed to be used in concert with other tools, such as nctoolbox, which can load data from any OPeNDAP data source. With these two toolboxes a user can easily work with in situ and other gridded data, such as from numerical models and remote sensing platforms. To demonstrate the capability of stoqstoolbox, we will show an example of model validation using data collected during the May-June 2012 field experiment conducted by the Monterey Bay Aquarium Research Institute (MBARI) in Monterey Bay, California. The data are available from the STOQS server at http://odss.mbari.org/canon/stoqs_may2012/query/. Over 14 million data points of 18 parameters from 6 platforms measured over a 3-week period are available on this server. The model used for comparison is the Regional Ocean Modeling System developed by the Jet Propulsion Laboratory for Monterey Bay. The model output is loaded into Matlab using nctoolbox from the JPL server at http://ourocean.jpl.nasa.gov:8080/thredds/dodsC/MBNowcast. Model validation with in situ measurements can be difficult because of different file formats and because data may be spread across individual data systems for each platform. With stoqstoolbox the researcher must know only the URL of the STOQS server and the OPeNDAP URL of the model output. With selected depth and time constraints a user's Matlab program searches for all in situ measurements available for the same time, depth and variable of the model. STOQS and stoqstoolbox are open source software projects supported by MBARI and the David and Lucile Packard Foundation. For more information please see http://code.google.com/p/stoqs.
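
    In plain MATLAB, the same kind of request can be sketched with webread against the STOQS server's JSON interface, as below; the endpoint path and query-parameter names are illustrative placeholders rather than the documented STOQS REST API, and should be replaced with the actual names from the STOQS documentation.

      base = 'http://odss.mbari.org/canon/stoqs_may2012/api/measuredparameter.json';  % placeholder path
      opts = weboptions('Timeout', 60, 'ContentType', 'json');
      % Query-parameter names below are illustrative placeholders only.
      data = webread(base, ...
          'parameter__standard_name', 'sea_water_temperature', ...
          'measurement__depth__lte', '50', ...
          opts);
      % webread returns the decoded JSON (a struct or cell array, depending on the
      % server's response); inspect it and repack the fields needed into a flat
      % Matlab structure of latitude, longitude, depth, value and platform.
      disp(data);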

  13. Retaining customers in a managed care market. Hospitals must understand the connection between patient satisfaction, loyalty, retention, and revenue.

    PubMed

    Gemme, E M

    1997-01-01

    Traditionally, health care patients have been treated by health care professionals as people with needs rather than as customers with options. Although managed care has restricted patient choice, choice has not been eliminated. The premise of this article is that patients are primary health care consumers. Adopting such a premise and developing an active customer retention program can help health care organizations change their culture for the better, which may lead to higher customer retention levels and increased revenues. Customer retention programs based on service excellence that empower employees to provide excellent care can eventually lead to a larger market share for health care organizations trying to survive this era of intense competition.

  14. 20 CFR 666.420 - Under what circumstances may a sanction be applied to local areas for poor performance?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... performance agreed to under § 666.310 for the core indicators of performance or customer satisfaction... or customer satisfaction indicators for a program for two consecutive program years, the Governor...

  15. 20 CFR 666.100 - What performance indicators must be included in a State's plan?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., respectively and the two customer satisfaction indicators. (1) For the Adult program, these indicators are: (i...) A single customer satisfaction measure for employers and a single customer satisfaction indicator...

  16. 76 FR 9786 - NIOSH Dose Reconstruction Program Ten-Year Review-Phase I Report on Customer Service; Request for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention NIOSH Dose Reconstruction Program Ten-Year Review--Phase I Report on Customer Service; Request for Public Review and Comment... requests public review and comment on the draft publication, ``NIOSH Dose Reconstruction Program Ten-Year...

  17. SIGNUM: A Matlab, TIN-based landscape evolution model

    NASA Astrophysics Data System (ADS)

    Refice, A.; Giachetta, E.; Capolongo, D.

    2012-08-01

    Several numerical landscape evolution models (LEMs) have been developed to date, and many are available as open source codes. Most are written in efficient programming languages such as Fortran or C, but often require additional code efforts to plug in to more user-friendly data analysis and/or visualization tools to ease interpretation and scientific insight. In this paper, we present an effort to port a common core of accepted physical principles governing landscape evolution directly into a high-level language and data analysis environment such as Matlab. SIGNUM (acronym for Simple Integrated Geomorphological Numerical Model) is an independent and self-contained Matlab, TIN-based landscape evolution model, built to simulate topography development at various space and time scales. SIGNUM is presently capable of simulating hillslope processes such as linear and nonlinear diffusion, fluvial incision into bedrock, spatially varying surface uplift (which can be used to simulate changes in base level), thrust and faulting, as well as the effects of climate change. Although based on accepted and well-known processes and algorithms in its present version, it is built with a modular structure, which allows the user to easily modify and upgrade the simulated physical processes to suit virtually any need. The code is conceived as an open-source project, and is thus an ideal tool for both research and didactic purposes, thanks to the high-level nature of the Matlab environment and its popularity among the scientific community. In this paper the simulation code is presented together with some simple examples of surface evolution, and guidelines for development of new modules and algorithms are proposed.
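
    To give a flavour of the physics such a model integrates, the following fragment steps the linear hillslope diffusion equation dz/dt = U + D*del2(z) forward in time on a regular grid; it is our own simplified illustration with arbitrary parameters, whereas SIGNUM itself operates on a TIN and couples several additional processes.

      nx = 100;  ny = 100;  dx = 10;        % grid and spacing (m)
      D  = 0.01;                            % hillslope diffusivity (m^2/yr)
      U  = 1e-4;                            % uniform uplift rate (m/yr)
      dt = 0.2*dx^2/D;                      % explicit, stability-limited step (yr)
      z  = 5*rand(ny, nx);                  % rough initial topography (m)
      for step = 1:2000
          zp  = z([1 1:end end], [1 1:end end]);           % replicate-padded copy
          lap = (zp(1:end-2,2:end-1) + zp(3:end,2:end-1) + ...
                 zp(2:end-1,1:end-2) + zp(2:end-1,3:end) - 4*z) / dx^2;
          z = z + dt*(U + D*lap);
          z(:, [1 end]) = 0;  z([1 end], :) = 0;           % fixed base level at edges
      end
      surf(z, 'EdgeColor', 'none'); title('Uplifting, diffusing surface'); axis tight;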

  18. Portfolio-Scale Optimization of Customer Energy Efficiency Incentive and Marketing: Cooperative Research and Development Final Report, CRADA Number CRD-13-535

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brackney, Larry J.

    North East utility National Grid (NGrid) is developing a portfolio-scale application of OpenStudio designed to optimize incentive and marketing expenditures for their energy efficiency (EE) programs. NGrid wishes to leverage a combination of geographic information systems (GIS), public records, customer data, and content from the Building Component Library (BCL) to form a JavaScript Object Notation (JSON) input file that is consumed by an OpenStudio-based expert system for automated model generation. A baseline model for each customer building will be automatically tuned using electricity and gas consumption data, and a set of energy conservation measures (ECMs) associated with each NGrid incentive program will be applied to the model. The simulated energy performance and return on investment (ROI) will be compared with customer hurdle rates and available incentives to A) optimize the incentive required to overcome the customer hurdle rate and B) determine if marketing activity associated with the specific ECM is warranted for that particular customer. Repeated across their portfolio, this process will enable NGrid to substantially optimize their marketing and incentive expenditures, targeting those customers that will likely adopt and benefit from specific EE programs.

  19. Tools for Integrating Data Access from the IRIS DMC into Research Workflows

    NASA Astrophysics Data System (ADS)

    Reyes, C. G.; Suleiman, Y. Y.; Trabant, C.; Karstens, R.; Weertman, B. R.

    2012-12-01

    Web service interfaces at the IRIS Data Management Center (DMC) provide access to a vast archive of seismological and related geophysical data. These interfaces are designed to easily incorporate data access into data processing workflows. Examples of data that may be accessed include: time series data, related metadata, and earthquake information. The DMC has developed command line scripts, MATLAB® interfaces and a Java library to support a wide variety of data access needs. Users of these interfaces do not need to concern themselves with web service details, networking, or even (in most cases) data conversion. Fetch scripts allow access to the DMC archive and are a comfortable fit for command line users. These scripts are written in Perl and are well suited for automation and integration into existing workflows on most operating systems. For metadata and event information, the Fetch scripts even parse the returned data into simple text summaries. The IRIS Java Web Services Library (IRIS-WS Library) gives Java developers the ability to create programs that access the DMC archives seamlessly. By returning the data and information as native Java objects the Library insulates the developer from data formats, network programming and web service details. The MATLAB interfaces leverage this library to allow users access to the DMC archive directly from within MATLAB (r2009b or newer), returning data into variables for immediate use. Data users and research groups are developing other toolkits that use the DMC's web services. Notably, the ObsPy framework developed at LMU Munich is a Python Toolbox that allows seamless access to data and information via the DMC services. Another example is the MATLAB-based GISMO and Waveform Suite developments that can now access data via web services. In summary, there now exist a host of ways that researchers can bring IRIS DMC data directly into their workflows. MATLAB users can use irisFetch.m, command line users can use the various Fetch scripts, Java users can use the IRIS-WS library, and Python users may request data through ObsPy. To learn more about any of these clients see http://www.iris.edu/ws/wsclients/.
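
    A typical irisFetch call from MATLAB looks roughly like the lines below; the argument order and structure field names are reconstructed from memory of the irisFetch documentation and should be verified against the current release before use.

      % Fetch a few hours of broadband data for one station (values are examples).
      tr = irisFetch.Traces('IU', 'ANMO', '10', 'BHZ', ...
                            '2010-02-27 06:30:00', '2010-02-27 10:30:00');
      % Each element of tr should be one contiguous trace segment with data and
      % metadata; the field names below are assumptions to verify against the docs.
      t = (0:double(tr(1).sampleCount)-1) / tr(1).sampleRate;   % time axis (s)
      plot(t, tr(1).data);
      xlabel('seconds'); ylabel('counts');
      title(sprintf('%s.%s %s', tr(1).network, tr(1).station, tr(1).channel));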

  20. History and Status of the CIS Customs Union

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawson, T.M.; Erickson, S.A.

    1999-08-31

    This report explores the history of the CIS Customs Union and the major obstacles the Union faces in its implementation. Investigation of the Customs Union is necessary because its implementation could affect the Second Line of Defense (SLD) Program. Russian Customs contends that radiation detectors should not be installed along the borders of Customs Union members, as those borders will be dissolved when the Union is implemented.

  1. CUSTOMER/SUPPLIER ACCOUNTABILITY AND PROGRAM IMPLEMENTATION

    EPA Science Inventory

    Quality assurance (QA) and quality control (QC) are the basic components of a QA program, which is a fundamental quality management tool. The quality of outputs and services strongly depends on the caliber of the communications between the "customer" and the "supplier." Clear under...

  2. 78 FR 66039 - Modification of National Customs Automation Program Test Concerning Automated Commercial...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-04

    ... Customs Automation Program Test Concerning Automated Commercial Environment (ACE) Cargo Release (Formerly... Simplified Entry functionality in the Automated Commercial Environment (ACE). Originally, the test was known...) test concerning Automated Commercial Environment (ACE) Simplified Entry (SE test) functionality is...

  3. SMASH - semi-automatic muscle analysis using segmentation of histology: a MATLAB application.

    PubMed

    Smith, Lucas R; Barton, Elisabeth R

    2014-01-01

    Histological assessment of skeletal muscle tissue is commonly applied to many areas of skeletal muscle physiological research. Histological parameters including fiber distribution, fiber type, centrally nucleated fibers, and capillary density are all frequently quantified measures of skeletal muscle. These parameters reflect functional properties of muscle and undergo adaptation in many muscle diseases and injuries. While standard operating procedures have been developed to guide analysis of many of these parameters, the software to freely, efficiently, and consistently analyze them is not readily available. In order to provide this service to the muscle research community we developed an open source MATLAB script to analyze immunofluorescent muscle sections incorporating user controls for muscle histological analysis. The software consists of multiple functions designed to provide tools for the analysis selected. Initial segmentation and fiber filter functions segment the image and remove non-fiber elements based on user-defined parameters to create a fiber mask. Establishing parameters set by the user, the software outputs data on fiber size and type, centrally nucleated fibers, and other structures. These functions were evaluated on stained soleus muscle sections from 1-year-old wild-type and mdx mice, a model of Duchenne muscular dystrophy. In accordance with previously published data, fiber size was not different between groups, but mdx muscles had much higher fiber size variability. The mdx muscle had a significantly greater proportion of type I fibers, but type I fibers did not change in size relative to type II fibers. Centrally nucleated fibers were highly prevalent in mdx muscle and were significantly larger than peripherally nucleated fibers. The MATLAB code described and provided along with this manuscript is designed for image processing of skeletal muscle immunofluorescent histological sections. The program allows for semi-automated fiber detection along with user correction. The output of the code provides data in accordance with established standards of practice. The results of the program have been validated using a small set of wild-type and mdx muscle sections. This program is the first freely available and open source image processing program designed to automate analysis of skeletal muscle histological sections.
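
    The overall flow of such an analysis can be sketched as below (segment fibers from a boundary stain, then measure size and central nucleation); this is our own minimal illustration using Image Processing Toolbox calls, not the SMASH code, and the file names, channel assignments, and size thresholds are placeholders.

      lam  = imread('laminin_channel.tif');          % fiber-boundary stain (placeholder file)
      dapi = imread('nuclei_channel.tif');           % nuclear stain (placeholder file)

      mask = ~imbinarize(mat2gray(lam));             % fiber interiors = low laminin signal
      mask = imclearborder(imfill(mask, 'holes'));
      mask = bwareaopen(mask, 200);                  % user-defined minimum fiber size

      stats = regionprops(mask, 'Area', 'PixelIdxList');
      areas = [stats.Area];                          % fiber size distribution (px^2)

      nuc = imbinarize(mat2gray(dapi));
      central = false(1, numel(stats));
      for k = 1:numel(stats)
          inner = false(size(mask));
          inner(stats(k).PixelIdxList) = true;
          inner = imerode(inner, strel('disk', 5));  % stay away from the fiber edge
          central(k) = any(nuc(inner));              % nucleus well inside the fiber?
      end
      fprintf('Fibers: %d, median area: %.0f px^2, centrally nucleated: %.1f%%\n', ...
              numel(areas), median(areas), 100*mean(central));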

  4. Visualizing how Seismic Waves Propagate Across Seismic Arrays using the IRIS DMS Ground Motion Visualization (GMV) Products and Codes

    NASA Astrophysics Data System (ADS)

    Taber, J.; Bahavar, M.; Bravo, T. K.; Butler, R. F.; Kilb, D. L.; Trabant, C.; Woodward, R.; Ammon, C. J.

    2011-12-01

    Data from dense seismic arrays can be used to visualize the propagation of seismic waves, resulting in animations effective for teaching both general and advanced audiences. One of the first visualizations of this type was developed using Objective C code and EarthScope/USArray data, which was then modified and ported to the Matlab platform and has now been standardized and automated as an IRIS Data Management System (IRIS-DMS) data product. These iterative code developments and improvements were completed by C. Ammon, R. Woodward and M. Bahavar, respectively. Currently, an automated script creates Ground Motion Visualizations (GMVs) for all global earthquakes over magnitude 6 recorded by EarthScope's USArray Transportable Array (USArray TA) network. The USArray TA network is a rolling array of 400 broadband stations deployed on a uniform 70-km grid. These near real-time GMV visualizations are typically available for download within 4 hours or less of their occurrence (see: www.iris.edu/dms/products/usarraygmv/). The IRIS-DMS group has recently added a feature that allows users to highlight key elements within the GMVs, by providing an online tool for creating customized GMVs. This new interface allows users to select the stations, channels, and time window of interest, adjust the mapped areal extent of the view, and specify high and low pass filters. An online tutorial available from the IRIS Education and Public Outreach (IRIS-EPO) website, listed below, steps through a teaching sequence that can be used to explain the basic features of the GMVs. For example, they can be used to demonstrate simple concepts such as relative P, S and surface wave velocities and corresponding wavelengths for middle-school students, or more advanced concepts such as the influence of focal mechanism on waveforms, or how seismic waves converge at an earthquake's antipode. For those who desire a greater level of customization, including the ability to use the GMV framework with data sets not stored within the IRIS-DMS, the Matlab GMV code is now also available from the IRIS-DMS website. These GMV codes have been applied to sac-formatted data from the Quake Catcher Network (QCN). Through a collaboration between NSF-funded programs and projects (e.g., IRIS and QCN) we are striving to make these codes user friendly enough to be routinely incorporated in undergraduate and graduate seismology classes. In this way, we will help provide a research tool for students to explore never-looked-at-before data, similar to actual seismology research. As technology is advancing quickly, we now have more data than seismologists can easily examine. Given this, we anticipate students using our codes can perform a 'citizen scientist' role in that they can help us identify key signals within the unexamined vast data streams we are acquiring.

  5. A portable platform to collect and review behavioral data simultaneously with neurophysiological signals.

    PubMed

    Tianxiao Jiang; Siddiqui, Hasan; Ray, Shruti; Asman, Priscella; Ozturk, Musa; Ince, Nuri F

    2017-07-01

    This paper presents a portable platform to collect and review behavioral data simultaneously with neurophysiological signals. The whole system is comprised of four parts: a sensor data acquisition interface, a socket server for real-time data streaming, a Simulink system for real-time processing and an offline data review and analysis toolbox. A low-cost microcontroller is used to acquire data from external sensors such as accelerometer and hand dynamometer. The micro-controller transfers the data either directly through USB or wirelessly through a bluetooth module to a data server written in C++ for MS Windows OS. The data server also interfaces with the digital glove and captures HD video from webcam. The acquired sensor data are streamed under User Datagram Protocol (UDP) to other applications such as Simulink/Matlab for real-time analysis and recording. Neurophysiological signals such as electroencephalography (EEG), electrocorticography (ECoG) and local field potential (LFP) recordings can be collected simultaneously in Simulink and fused with behavioral data. In addition, we developed a customized Matlab Graphical User Interface (GUI) software to review, annotate and analyze the data offline. The software provides a fast, user-friendly data visualization environment with synchronized video playback feature. The software is also capable of reviewing long-term neural recordings. Other featured functions such as fast preprocessing with multithreaded filters, annotation, montage selection, power-spectral density (PSD) estimate, time-frequency map and spatial spectral map are also implemented.
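
    On the MATLAB side, one low-dependency way to receive such UDP datagrams is through the bundled Java runtime, as sketched below; the port number, buffer size, and the assumption that the payload is a whole number of little-endian single-precision samples are ours, not the paper's actual protocol.

      port   = 25000;                                  % must match the sensor server
      sock   = java.net.DatagramSocket(port);
      sock.setSoTimeout(5000);                         % ms, so receive() cannot block forever
      buf    = zeros(1, 1024, 'int8');
      packet = java.net.DatagramPacket(buf, numel(buf));
      try
          sock.receive(packet);                        % wait for one datagram
          raw = packet.getData();                      % Java byte[] arrives as int8
          len = double(packet.getLength());
          payload = raw(1:4*floor(len/4));             % keep whole 4-byte samples
          samples = typecast(payload, 'single');       % assumed float32 payload
          fprintf('Received %d samples, first = %g\n', numel(samples), samples(1));
      catch err
          warning('No datagram received: %s', err.message);
      end
      sock.close();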

  6. Revising the Representation of Fatty Acid, Glycerolipid, and Glycerophospholipid Metabolism in the Consensus Model of Yeast Metabolism

    PubMed Central

    Aung, Hnin W.; Henry, Susan A.

    2013-01-01

    Abstract Genome-scale metabolic models are built using information from an organism's annotated genome and, correspondingly, information on reactions catalyzed by the set of metabolic enzymes encoded by the genome. These models have been successfully applied to guide metabolic engineering to increase production of metabolites of industrial interest. Congruity between simulated and experimental metabolic behavior is influenced by the accuracy of the representation of the metabolic network in the model. In the interest of applying the consensus model of Saccharomyces cerevisiae metabolism for increased productivity of triglycerides, we manually evaluated the representation of fatty acid, glycerophospholipid, and glycerolipid metabolism in the consensus model (Yeast v6.0). These areas of metabolism were chosen due to their tightly interconnected nature to triglyceride synthesis. Manual curation was facilitated by custom MATLAB functions that return information contained in the model for reactions associated with genes and metabolites within the stated areas of metabolism. Through manual curation, we have identified inconsistencies between information contained in the model and literature knowledge. These inconsistencies include incorrect gene-reaction associations, improper definition of substrates/products in reactions, inappropriate assignments of reaction directionality, nonfunctional β-oxidation pathways, and missing reactions relevant to the synthesis and degradation of triglycerides. Suggestions to amend these inconsistencies in the Yeast v6.0 model can be implemented through a MATLAB script provided in the Supplementary Materials, Supplementary Data S1 (Supplementary Data are available online at www.liebertpub.com/ind). PMID:24678285
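
    The kind of curation helper described, a function that gathers the model's information for all reactions tied to a given metabolite, can be sketched as below for a COBRA-style model structure; the field names (S, mets, rxns, rxnNames, lb, ub, grRules) and the example metabolite identifier are assumptions that may differ in a given release of the consensus model, and this is our illustration rather than the authors' script.

      function info = reactionsForMetabolite(model, metID)
      % List the reactions in a COBRA-style model structure that involve a metabolite.
      metIdx = find(strcmp(model.mets, metID));
      if isempty(metIdx)
          error('Metabolite %s not found in the model.', metID);
      end
      rxnIdx = find(any(model.S(metIdx, :) ~= 0, 1));   % reactions touching it
      info = struct('id',   {model.rxns(rxnIdx)}, ...   % reaction identifiers
                    'name', {model.rxnNames(rxnIdx)}, ...
                    'lb',   model.lb(rxnIdx), ...       % directionality bounds
                    'ub',   model.ub(rxnIdx), ...
                    'gpr',  {model.grRules(rxnIdx)});   % gene-reaction rules
      end

      % Example call (the metabolite identifier is hypothetical):
      %   info = reactionsForMetabolite(model, 's_1234[c]');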

  7. Real-Time Motion Capture Toolbox (RTMocap): an open-source code for recording 3-D motion kinematics to study action-effect anticipations during motor and social interactions.

    PubMed

    Lewkowicz, Daniel; Delevoye-Turrell, Yvonne

    2016-03-01

    We present here a toolbox for the real-time motion capture of biological movements that runs in the cross-platform MATLAB environment (The MathWorks, Inc., Natick, MA). It provides instantaneous processing of the 3-D movement coordinates of up to 20 markers at a single instant. Available functions include (1) the setting of reference positions, areas, and trajectories of interest; (2) recording of the 3-D coordinates for each marker over the trial duration; and (3) the detection of events to use as triggers for external reinforcers (e.g., lights, sounds, or odors). Through fast online communication between the hardware controller and RTMocap, automatic trial selection is possible by means of either a preset or an adaptive criterion. Rapid preprocessing of signals is also provided, which includes artifact rejection, filtering, spline interpolation, and averaging. A key example is detailed, and three typical variations are developed (1) to provide a clear understanding of the importance of real-time control for 3-D motion in cognitive sciences and (2) to present users with simple lines of code that can be used as starting points for customizing experiments using the simple MATLAB syntax. RTMocap is freely available (http://sites.google.com/site/RTMocap/) under the GNU public license for noncommercial use and open-source development, together with sample data and extensive documentation.

  8. Link Analysis in the Mission Planning Lab

    NASA Technical Reports Server (NTRS)

    McCarthy, Jessica A.; Cervantes, Benjamin W.; Daugherty, Sarah C.; Arroyo, Felipe; Mago, Divyang

    2011-01-01

    The legacy communications link analysis software currently used at Wallops Flight Facility involves processes that are different for command destruct, radar, and telemetry. There is a clear advantage to developing an easy-to-use tool that combines all the processes in one application. Link Analysis in the Mission Planning Lab (MPL) uses custom software and algorithms integrated with Analytical Graphics Inc. Satellite Toolkit (AGI STK). The MPL link analysis tool uses pre/post-mission data to conduct a dynamic link analysis between ground assets and the launch vehicle. Just as the legacy methods do, the MPL link analysis tool calculates signal strength and signal- to-noise according to the accepted processes for command destruct, radar, and telemetry assets. Graphs and other custom data are generated rapidly in formats for reports and presentations. STK is used for analysis as well as to depict plume angles and antenna gain patterns in 3D. The MPL has developed two interfaces with the STK software (see figure). The first interface is an HTML utility, which was developed in Visual Basic to enhance analysis for plume modeling and to offer a more user friendly, flexible tool. A graphical user interface (GUI) written in MATLAB (see figure upper right-hand corner) is also used to quickly depict link budget information for multiple ground assets. This new method yields a dramatic decrease in the time it takes to provide launch managers with the required link budgets to make critical pre-mission decisions. The software code used for these two custom utilities is a product of NASA's MPL.
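
    The core of any such link budget is a short chain of gains and losses; the generic calculation below, a textbook free-space computation with placeholder numbers, illustrates the kind of arithmetic behind the signal-strength and signal-to-noise outputs rather than the MPL tool's actual algorithm or Wallops asset parameters.

      f      = 2.2e9;          % carrier frequency (Hz), S-band telemetry
      d      = 350e3;          % slant range to vehicle (m)
      EIRP   = 10;             % vehicle EIRP (dBW)
      Gr     = 35;             % ground antenna gain (dBi)
      Lmisc  = 3;              % polarization/pointing/cable losses (dB)
      Tsys   = 250;            % system noise temperature (K)
      Rb     = 1e6;            % telemetry data rate (bit/s)

      c    = 299792458;
      FSPL = 20*log10(4*pi*d*f/c);                 % free-space path loss (dB)
      Pr   = EIRP + Gr - FSPL - Lmisc;             % received power (dBW)
      N0   = 10*log10(1.380649e-23 * Tsys);        % noise power density (dBW/Hz)
      EbN0 = Pr - N0 - 10*log10(Rb);               % link quality (dB)
      margin = EbN0 - 9.6;                         % vs. an assumed required Eb/N0 of 9.6 dB
      fprintf('FSPL %.1f dB, Pr %.1f dBW, Eb/N0 %.1f dB, margin %.1f dB\n', ...
              FSPL, Pr, EbN0, margin);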

  9. Increasing your HCAHPS scores with Extreme Customer Service.

    PubMed

    Clouarte, Joe

    2016-10-01

    Providing great customer service is extremely critical in the healthcare setting, especially when it comes to HCAHPS (Hospital Consumer Assessment of Healthcare Providers and Systems) scores, the author says. While there are several service training programs within healthcare, they often require six to eight minutes of interaction with patients or guests. This works well for clinical staff, he says, but non-clinical staff, including security officers, often have only fifteen or thirty seconds to create a positive patient or guest experience. In this article he describes Extreme Customer Service©, a program he has developed to fill that customer service gap for non-clinical staff.

  10. 78 FR 26649 - Agency Information Collection Activities: Trusted Traveler Programs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-07

    ... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection Agency Information Collection Activities: Trusted Traveler Programs AGENCY: U.S. Customs and Border Protection (CBP), Department of Homeland Security. ACTION: 60-Day Notice and request for comments; Extension of an existing collection of...

  11. Customer Satisfaction with Training Programs.

    ERIC Educational Resources Information Center

    Mulder, Martin

    2001-01-01

    A model for evaluating customer satisfaction with training programs was tested with training purchasers. The model confirmed two types of projects: training aimed at achieving learning results and at changing job performance. The model did not fit for training intended to support organizational change. (Contains 31 references.) (SK)

  12. Loudspeaker equalization for auditory research.

    PubMed

    MacDonald, Justin A; Tran, Phuong K

    2007-02-01

    The equalization of loudspeaker frequency response is necessary to conduct many types of well-controlled auditory experiments. This article introduces a program that includes functions to measure a loudspeaker's frequency response, design equalization filters, and apply the filters to a set of stimuli to be used in an auditory experiment. The filters can compensate for both magnitude and phase distortions introduced by the loudspeaker. A MATLAB script is included in the Appendix to illustrate the details of the equalization algorithm used in the program.
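
    A minimal version of the equalization idea is sketched below: given a measured loudspeaker impulse response (here a synthesized stand-in), build a regularized frequency-domain inverse that corrects both magnitude and phase, then pre-filter the stimulus with it. This is our own illustration of the general technique with arbitrary parameters, not the article's program.

      fs = 48000;
      n  = (0:511)';
      h  = exp(-n/60) .* sin(2*pi*0.05*n);     % stand-in for a measured impulse response

      Nfft = 2^nextpow2(4*numel(h));
      H    = fft(h, Nfft);
      beta = 1e-3 * max(abs(H))^2;             % regularization floor (avoid dividing by ~0)
      Hinv = conj(H) ./ (abs(H).^2 + beta);    % corrects magnitude and phase

      g = real(ifft(Hinv));
      g = circshift(g, Nfft/2);                % delay so the inverse filter is causal
      w = 0.5*(1 - cos(2*pi*(0:Nfft-1)'/(Nfft-1)));
      g = g .* w;                              % taper to limit wrap-around artifacts

      stimulus  = randn(fs, 1);                % e.g., 1 s of noise to be presented
      equalized = filter(g, 1, stimulus);      % apply before sending to the loudspeaker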

  13. Computer aided design environment for the analysis and design of multi-body flexible structures

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Jayant V.; Singh, Ramen P.

    1989-01-01

    A computer aided design environment consisting of the programs NASTRAN, TREETOPS and MATLAB is presented in this paper. With links for data transfer between these programs, the integrated design of multi-body flexible structures is significantly enhanced. The CAD environment is used to model the Space Shuttle/Pinhole Occulter Facility. A controller is then designed and evaluated in the nonlinear time-history sense. Recent enhancements and ongoing research to add more capabilities are also described.

  14. Afghan Customs: U.S. Programs Have Had Some Successes, but Challenges Will Limit Customs Revenue as a Sustainable Source of Income for Afghanistan

    DTIC Science & Technology

    2014-04-01

    TAFA II programs from November 2009 through August 2013. These programs were followed by the Afghanistan Trade and Revenue (ATAR) program as a...successor program—the Afghanistan Trade and Revenue (ATAR) program, which started in November 2013. CBP has administered the Border Management Task... ATAR contract documents as important anti-corruption measures, SIGAR found that the ATAR contract does not require the implementing partner to meet

  15. The Relationship between Gender and Students' Attitude and Experience of Using a Computer Algebra System

    ERIC Educational Resources Information Center

    Ocak, Mehmet

    2008-01-01

    This correlational study examined the relationship between gender and the students' attitude and prior knowledge of using one of the mathematical software programs (MATLAB). Participants were selected from one community college, one state university and one private college. Students were volunteers from three Calculus I classrooms (one class from…

  16. The NASA Computational Fluid Dynamics (CFD) program - Building technology to solve future challenges

    NASA Technical Reports Server (NTRS)

    Richardson, Pamela F.; Dwoyer, Douglas L.; Kutler, Paul; Povinelli, Louis A.

    1993-01-01

    This paper presents the NASA Computational Fluid Dynamics program in terms of a strategic vision and goals as well as NASA's financial commitment and personnel levels. The paper also identifies the CFD program customers and the support to those customers. In addition, the paper discusses technical emphasis and direction of the program and some recent achievements. NASA's Ames, Langley, and Lewis Research Centers are the research hubs of the CFD program while the NASA Headquarters Office of Aeronautics represents and advocates the program.

  17. 12 CFR Appendix B to Part 364 - Interagency Guidelines Establishing Information Security Standards

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Part 364—Interagency Guidelines Establishing Information Security Standards Table of Contents I... Customer Information A. Information Security Program B. Objectives III. Development and Implementation of Customer Information Security Program A. Involve the Board of Directors B. Assess Risk C. Manage and...

  18. Student Loyalty Assessment with Online Master's Programs

    ERIC Educational Resources Information Center

    Dehghan, Ali

    2012-01-01

    Relationship marketing is attracting, maintaining, and, in multi-service organizations, enhancing customer relationships. Educational programs and services, like those of businesses, depend highly on the repeated purchases of their loyal customers. The purpose of this descriptive research is to investigate the relationships between factors that…

  19. 75 FR 22681 - Supplemental Guidance on Overdraft Protection Programs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-29

    ... recommended as a Best Practice that a savings association train customer service or consumer complaint... association should present the program as a customer service that may cover inadvertent consumer overdrafts.... alternatives. Provide information about alternatives when they are offered--Part III.A.2. Train staff to...

  20. Developing Customized Programs for Steel and Other Heavy Industries.

    ERIC Educational Resources Information Center

    Day, Philip R., Jr.

    1984-01-01

    Describes Dundalk Community College's (DCC's) customized training programs for local industries. Looks at employment problems and outlook in Baltimore County, the development of a training agreement with Bethlehem Steel, the use of the Developing a Curriculum (DACUM) process to develop skill profiles, and future directions. (DMM)

  1. 47 CFR 101.603 - Permissible communications.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., to their customers except that the distribution of video entertainment material to customers is... 6425-6525 MHz, 17,700-18,580 MHz, and on frequencies above 21,200 MHz, licensees may deliver any of... program material to multichannel video programming distributors, except in the frequency bands 6425-6525...

  2. 47 CFR 101.603 - Permissible communications.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., to their customers except that the distribution of video entertainment material to customers is... 6425-6525 MHz, 17,700-18,580 MHz, and on frequencies above 21,200 MHz, licensees may deliver any of... program material to multichannel video programming distributors, except in the frequency bands 6425-6525...

  3. 47 CFR 101.603 - Permissible communications.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., to their customers except that the distribution of video entertainment material to customers is... 6425-6525 MHz, 17,700-18,580 MHz, and on frequencies above 21,200 MHz, licensees may deliver any of... program material to multichannel video programming distributors, except in the frequency bands 6425-6525...

  4. The Role of Logic Modeling in a Collaborative and Iterative Research Process: Lessons from Research and Analysis Conducted with the Federal Voting Assistance Program

    DTIC Science & Technology

    2016-01-01

    outputs, customers, and outcomes (see Figure 2.1). In the Taylor-Powell and Henert simple three-part example, the food would constitute an input, finding... [Figure 2.1 residue: customer, activities, intermediate goals, annual goals, strategic goals, management objectives, operations, mission, external factors] ...Partners are the individuals or organizations that work with programs to conduct activities or enable outputs. • Customers (intermediate and final

  5. 76 FR 6633 - Agency Information Collection Activities: Proposed Collection; Comments Requested

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-07

    ... of Information Collection Under Review: Customer Satisfaction Surveys. The Department of Justice (DOJ...: Extension of a currently approved collection. (2) Title of the Form/Collection: Customer Satisfaction... of Alcohol, Tobacco, Firearms and Explosives distribute program-specific customer satisfaction...

  6. Implementation of Custom Colors in the DECwindows Environment

    DTIC Science & Technology

    1992-01-01

    Implementation of Custom Colors in the DECwindows Environment. Author(s): Stephanie A. Myrick, Maura C... Abstract: This paper describes the implementation of user-defined, or custom, colors in the DECwindows environment. Custom colors can be used to augment the standard color set that is associated with the hardware colormap. The custom color set that is included in this paper

  7. Your loyalty program is betraying you.

    PubMed

    Nunes, Joseph C; Drèze, Xavier

    2006-04-01

    Even as loyalty programs are launched left and right, many are being scuttled. How can that be? These days, everyone knows that an old customer retained is worth more than a new customer won. What is so hard about making a simple loyalty program work? Quite a lot, the authors say. The biggest challenges include clarifying business goals, engineering the reward structure, and creating incentives powerful enough to change buying behavior but not so generous that they erode margins. Additionally, companies have to sort out the puzzles of consumer psychology, which can result, for example, in two rewards of equal economic value inspiring very different levels of purchasing. In their research, the authors have discovered patterns in what the successful loyalty programs get right and in how the others fail. Together, their findings constitute a tool kit for designing something rare indeed: a program that won't do you wrong. To begin with, it's important to know exactly what a loyalty program can do. It can keep customers from defecting, induce them to consolidate certain purchases with one seller (in other words, win a greater share of wallet), prompt customers to make additional purchases, yield insight into their behavior and preferences, and turn a profit. A program can meet these objectives in several ways--for instance, by offering rewards (points, say, or frequent-flier miles) divisible enough to provide many redemption opportunities but not so divisible that they fail to lock in customers. Companies striving to generate customer loyalty should avoid five common mistakes: Don't create a new commodity, which can result in price wars and other tit-for-tat competitive moves; don't cater to the disloyal by making rewards easy for just anyone to reap; don't reward purchasing volume over profitability; don't give away the store; and, finally, don't promise what can't be delivered.

  8. A Low-Cost Method of Ciliary Beat Frequency Measurement Using iPhone and MATLAB: Rabbit Study.

    PubMed

    Chen, Jason J; Lemieux, Bryan T; Wong, Brian J F

    2016-08-01

    (1) To determine ciliary beat frequency (CBF) using a consumer-grade cellphone camera and MATLAB and (2) to evaluate the effectiveness and accuracy of the proposed method. Prospective animal study. Academic otolaryngology department research laboratory. Five ex vivo tracheal samples were extracted from 3 freshly euthanized (<3 hours postmortem) New Zealand white rabbits and incubated for 30 minutes in buffer at 23°C, buffer at 37°C, or 10% formalin at 23°C. Samples were sectioned transversely and observed under a phase-contrast microscope. Cilia movement was recorded through the eyepiece using an iPhone 6 at 240 frames per second (fps). Through MATLAB programming, the video of the 23°C sample was downsampled to 120, 60, and 30 fps, and Fourier analysis was performed on videos of all frame rates and conditions to determine CBF. CBF of the 23°C sample was also calculated manually frame by frame for verification. Recorded at 240 fps, the CBF at 23°C was 5.03 ± 0.4 Hz, and the CBF at 37°C was 9.08 ± 0.49 Hz (P < .001). The sample with 10% formalin did not display any data beyond DC noise. Compared with 240 fps, the means of other frame rates/methods (120, 60, 30 fps; manual counting) at 23°C all showed no statistical difference (P > .05). There is no significant difference between CBF measured via visual inspection and that analyzed by the developed program. Furthermore, all tested acquisition rates are shown to be effective, providing a fast and inexpensive alternative to current CBF measurement protocols. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2016.
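
    The processing chain described, extracting an intensity time series from the slow-motion video and taking the dominant Fourier peak, can be sketched as follows; this reconstruction uses our own assumptions for the file name, region of interest, and frequency band, and is not the authors' script.

      v   = VideoReader('cilia_240fps.mov');      % placeholder file name
      fps = v.FrameRate;
      roiRows = 200:240;  roiCols = 300:340;      % pixels over the beating cilia
      sig = [];
      while hasFrame(v)
          frame = readFrame(v);
          if size(frame, 3) == 3, frame = rgb2gray(frame); end
          sig(end+1) = mean(frame(roiRows, roiCols), 'all');   %#ok<AGROW>
      end
      sig = detrend(double(sig));                 % remove slow illumination drift

      n    = numel(sig);
      S    = abs(fft(sig)).^2;
      f    = (0:n-1) * fps / n;
      band = f >= 2 & f <= 30;                    % plausible CBF range (Hz)
      [~, idx] = max(S .* band);
      fprintf('Estimated ciliary beat frequency: %.2f Hz\n', f(idx));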

  9. The role of complaint management in the service recovery process.

    PubMed

    Bendall-Lyon, D; Powers, T L

    2001-05-01

    Patient satisfaction and retention can be influenced by the development of an effective service recovery program that can identify complaints and remedy failure points in the service system. Patient complaints provide organizations with an opportunity to resolve unsatisfactory situations and to track complaint data for quality improvement purposes. Service recovery is an important and effective customer retention tool. One way an organization can ensure repeat business is by developing a strong customer service program that includes service recovery as an essential component. The concept of service recovery involves the service provider taking responsive action to "recover" lost or dissatisfied customers and convert them into satisfied customers. Service recovery has proven to be cost-effective in other service industries. The complaint management process involves six steps that organizations can use to influence effective service recovery: (1) encourage complaints as a quality improvement tool; (2) establish a team of representatives to handle complaints; (3) resolve customer problems quickly and effectively; (4) develop a complaint database; (5) commit to identifying failure points in the service system; and (6) track trends and use information to improve service processes. Customer retention is enhanced when an organization can reclaim disgruntled patients through the development of effective service recovery programs. Health care organizations can become more customer oriented by taking advantage of the information provided by patient complaints, increasing patient satisfaction and retention in the process.

  10. Web based Health Education, E-learning, for weight management.

    PubMed

    Heetebry, Irene; Hatcher, Myron; Tabriziani, Hossein

    2005-12-01

    Obesity is a major health problem across the United States and is becoming a growing worldwide problem. An overweight person can access the weight management program and develop a personalized weight reduction plan. The customer enters specific data to personalize the program, and in the future an artificial intelligence component could evaluate customer behavior and adjust the plan accordingly. This is an online program with classroom support, offered as backup when desired by the patient.

  11. Advances in Engineering Software for Lift Transportation Systems

    NASA Astrophysics Data System (ADS)

    Kazakoff, Alexander Borisoff

    2012-03-01

    In this paper, an attempt is made at computer modelling of ropeway ski lift systems. These systems carry passengers between two terminals using high-capacity cabins, chairs, gondolas or draw-bars. The computer codes AUTOCAD, MATLAB and Compaq Visual Fortran version 6.6 are used in the modelling, which is organized in two stages. The first stage is preparation of the ground relief profile and design of the lift system as a whole, according to the terrain profile and the climatic and atmospheric conditions. The ground profile is prepared by geodesists and presented as an AUTOCAD view; the lift itself is then designed by programs written in MATLAB. The second stage is performed after the coordinates and the lift profile have been optimized in MATLAB: the coordinates and parameters are passed to a program written in Compaq Visual Fortran version 6.6, which calculates 171 lift parameters organized in 42 tables. The objective of the work presented in this paper is the computer modelling of the design and parameter derivation of ropeway systems and their computational variation and optimization.

  12. CPMC-Lab: A MATLAB package for Constrained Path Monte Carlo calculations

    NASA Astrophysics Data System (ADS)

    Nguyen, Huy; Shi, Hao; Xu, Jie; Zhang, Shiwei

    2014-12-01

    We describe CPMC-Lab, a MATLAB program for the constrained-path and phaseless auxiliary-field Monte Carlo methods. These methods have allowed applications ranging from the study of strongly correlated models, such as the Hubbard model, to ab initio calculations in molecules and solids. The present package implements the full ground-state constrained-path Monte Carlo (CPMC) method in MATLAB with a graphical interface, using the Hubbard model as an example. The package can perform calculations in finite supercells in any dimensions, under periodic or twist boundary conditions. Importance sampling and all other algorithmic details of a total energy calculation are included and illustrated. This open-source tool allows users to experiment with various model and run parameters and visualize the results. It provides a direct and interactive environment to learn the method and study the code with minimal overhead for setup. Furthermore, the package can be easily generalized for auxiliary-field quantum Monte Carlo (AFQMC) calculations in many other models for correlated electron systems, and can serve as a template for developing a production code for AFQMC total energy calculations in real materials. Several illustrative studies are carried out in one- and two-dimensional lattices on total energy, kinetic energy, potential energy, and charge- and spin-gaps.
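    To give a flavor of the kind of model such a package operates on, the sketch below builds the one-band Hubbard hopping (kinetic) matrix for a small 1-D chain with periodic boundary conditions and computes the non-interacting (U = 0) ground-state energy by filling the lowest single-particle orbitals. This is not CPMC-Lab code, and the lattice size, filling, and hopping amplitude are purely illustrative.

      % Minimal sketch: one-band Hubbard kinetic (hopping) matrix on a 1-D chain
      % with periodic boundary conditions, and its U = 0 ground-state energy.
      % Parameters (L, t, nup, ndn) are illustrative, not CPMC-Lab defaults.
      L   = 8;      % number of lattice sites
      t   = 1.0;    % nearest-neighbor hopping amplitude
      nup = 3;      % number of spin-up electrons
      ndn = 3;      % number of spin-down electrons

      K = zeros(L);                     % kinetic-energy matrix
      for i = 1:L
          j = mod(i, L) + 1;            % right neighbor, with periodic wrap
          K(i, j) = -t;
          K(j, i) = -t;
      end

      epsk = sort(eig(K));              % single-particle energies
      E0   = sum(epsk(1:nup)) + sum(epsk(1:ndn));   % fill lowest orbitals per spin
      fprintf('U = 0 ground-state energy: %.4f\n', E0);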

  13. 12 CFR Appendix D-2 to Part 208 - Interagency Guidelines Establishing Information Security Standards

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Relationships Risk Management Principles,” Nov. 1, 2001; FDIC FIL 68-99, Risk Assessment Tools and Practices for.... Definitions II. Standards for Safeguarding Customer Information A. Information Security Program B. Objectives III. Development and Implementation of Customer Information Security Program A. Involve the Board of...

  14. 12 CFR Appendix F to Part 225 - Interagency Guidelines Establishing Information Security Standards

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Relationships Risk Management Principles,” Nov. 1, 2001; FDIC FIL 68-99, Risk Assessment Tools and Practices for.... Standards for Safeguarding Customer Information A. Information Security Program B. Objectives III. Development and Implementation of Customer Information Security Program A. Involve the Board of Directors B...

  15. 17 CFR 150.4 - Aggregation of positions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... knowledge of, gaining access to, or receiving data about the trading or positions of the pool; (ii) The... is part of, or participates in, or receives trading advice from a customer trading program of a... discretionary account or the customer trading program is determined independently of all trading decisions in...

  16. University to Community and Back: Creating a Customer Focused Process.

    ERIC Educational Resources Information Center

    Martin-Milius, Tara

    This paper examines ways in which university extension programs can become more customer-focused in the courses and services that they deliver, focusing on the experiences of the University of California Extension, Santa Cruz. Extension programs can increase their effectiveness by: (1) establishing partnerships with other service organizations,…

  17. Managing Food Service Costs and Satisfying Customers.

    ERIC Educational Resources Information Center

    Reuther, Anne; Otto, Ione

    1987-01-01

    Milwaukee Area Technical College, Wisconsin, has four campuses, each with its own food service operation that, combined, serve nearly 3,000 people daily. Several food service-related programs are part of the curriculum. Cost containment and customer satisfaction are the two overriding goals of the food service programs. (MLF)

  18. Aggregating Case Study Data in Customer Service Training.

    ERIC Educational Resources Information Center

    Barrington, Gail V.

    An evaluation was conducted to determine the outcomes and impacts of participation in the ALBERTA BEST training program in terms of participant attitudes toward service excellence and business profitability. ALBERTA BEST is a customer service program offered by the Alberta (Canada) government. The evaluation involved a series of case studies…

  19. Recruitment. Getting Customers for Employment and Training Programs.

    ERIC Educational Resources Information Center

    Newton, Greg

    This workbook presents the essential principles of successful marketing and applies the proven strategies used by the private sector to attract customers for their products to the recruitment of clients for employment and training programs. It also provides the tools and how-to's to develop recruitment strategies. Informative materials, lists of…

  20. Missouri Customized Training Program. Skills for Tomorrow's Work Force. Brochure #80238.

    ERIC Educational Resources Information Center

    Missouri State Div. of Job Development and Training, Jefferson City.

    This publication provides businesses with information on the Missouri Customized Training Program (MCTP), which provides assistance to Missouri businesses in recruiting, training, and retraining of workers. It describes the two types of MCTP training: Skill Training and On-the-Job Training. Employee recruitment options are also discussed. Four…

  1. American Recovery and Reinvestment Act of 2009: Final Report on Customer Acceptance, Retention, and Response to Time-Based Rates from Consumer Behavior Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cappers, Peter; Scheer, Rich

    Time-based rate programs, enabled by utility investments in advanced metering infrastructure (AMI), are increasingly being considered by utilities as tools to reduce peak demand and enable customers to better manage consumption and costs. Under the Smart Grid Investment Grant Program (SGIG), the U.S. Department of Energy (DOE) partnered with several electric utilities to conduct consumer behavior studies (CBS). The goals involved applying randomized and controlled experimental designs for estimating customer responses more precisely and credibly to advance understanding of time-based rates and customer systems, and provide new information for improving program designs, implementation strategies, and evaluations. The intent was to produce more robust and credible analysis of impacts, costs, benefits, and lessons learned and assist utility and regulatory decision makers in evaluating investment opportunities involving time-based rates.

  2. 75 FR 38725 - Service Performance Measurement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-06

    ... of Customer Satisfaction A. General Considerations B. Rule 3055.91--Consumer Access to Postal Services C. Rule 3055.92--Customer Experience Measurement Surveys D. Rule 3055.93--Mystery Shopper Program... Commission is adopting a final rule on service performance measurement and customer satisfaction. The final...

  3. Outcomes study of a customer relations educational program in dialysis practice.

    PubMed

    Schmidt, Jill

    2006-01-01

    In the fall of 2003, a customer relations staff educational program was instituted for use throughout Renal Care Group. After 1 year and 4 months, an outcomes study was implemented to evaluate and revise the program. The program is taught by the dialysis social worker to the rest of the dialysis health-care team to increase customer relations skills with the patients, their families, and dialysis coworkers. Today, more than ever, patients are seeking better quality health care not only in technical skills but also in compassionate care. Pretest and posttest scores from each training session indicated both the necessity of the training and the knowledge gained. Through a survey of instructors, strengths of the program were identified, as were areas that needed revision. In addition to providing helpful information, some of the main strengths of the program indicated by instructors were team building and problem solving. Revisions consisted primarily of shortening the modules and simplifying the program's use.

  4. Arbitrating Control of Control and Display Units

    NASA Technical Reports Server (NTRS)

    Sugden, Paul C.

    2007-01-01

    The ARINC 739 Switch is a computer program that arbitrates control of two multi-function control and display units (MCDUs) between (1) a commercial flight-management computer (FMC) and (2) NASA software used in research on transport aircraft. (MCDUs are the primary interfaces between pilots and FMCs on many commercial aircraft.) This program was recently redesigned into a software library that can be embedded in research application programs. As part of the redesign, this software was combined with software for creating custom pages of information to be displayed on a CDU. This software commands independent switching of the left (pilot's) and right (copilot's) MCDUs. For example, a custom CDU page can control the left CDU while the FMC controls the right CDU. The software uses menu keys to switch control of the CDU between the FMC and a custom CDU page. The software provides an interface that enables custom CDU pages to insert keystrokes into the FMC's CDU input interface. This feature allows the custom CDU pages to manipulate the FMC as if it were a pilot.

  5. Promoting Healthy Choices in Non-Chain Restaurants: Effects of a Simple Cue to Customers

    PubMed Central

    Nothwehr, Faryle K.; Snetselaar, Linda; Dawson, Jeffrey; Schultz, Ulrike

    2014-01-01

    This study tested a novel intervention to influence restaurant customer ordering behavior, with measurements at baseline and 3, 6, and 12 months postintervention in four owner-operated restaurants in the Midwest. A sample of 141 to 370 customers was surveyed at each time point. The response rate was 70% to 84% with 59% women, 98% White, and a mean age of 53 years. Table signs listed changes customers might consider, for example, asking for meat broiled instead of fried or requesting smaller portions. Customer surveys measured program reach and effectiveness. Owner interviews measured perceptions of program burden and customer response. Order slips were analyzed for evidence of changes in ordering. Window signs were noticed by 40%, 48%, and 45% of customers at each follow-up, respectively. Table signs were noticed by 67%, 71%, and 69% of customers, respectively. Of those, 34% at each time point stated that the signs influenced their order. Examples of how orders were influenced were elicited. Order slip data not only did not show significant changes but was also found to be an inadequate measure for the intervention. Owners reported no concerns or complaints. This intervention resulted in small but positive behavior changes among a portion of customers. Because of its simplicity and acceptability, it has great potential for dissemination. PMID:23048009

  6. Promoting healthy choices in non-chain restaurants: effects of a simple cue to customers.

    PubMed

    Nothwehr, Faryle K; Snetselaar, Linda; Dawson, Jeffrey; Schultz, Ulrike

    2013-01-01

    This study tested a novel intervention to influence restaurant customer ordering behavior, with measurements at baseline and 3, 6, and 12 months postintervention in four owner-operated restaurants in the Midwest. A sample of 141 to 370 customers was surveyed at each time point. The response rate was 70% to 84% with 59% women, 98% White, and a mean age of 53 years. Table signs listed changes customers might consider, for example, asking for meat broiled instead of fried or requesting smaller portions. Customer surveys measured program reach and effectiveness. Owner interviews measured perceptions of program burden and customer response. Order slips were analyzed for evidence of changes in ordering. Window signs were noticed by 40%, 48%, and 45% of customers at each follow-up, respectively. Table signs were noticed by 67%, 71%, and 69% of customers, respectively. Of those, 34% at each time point stated that the signs influenced their order. Examples of how orders were influenced were elicited. Order slip data not only did not show significant changes but was also found to be an inadequate measure for the intervention. Owners reported no concerns or complaints. This intervention resulted in small but positive behavior changes among a portion of customers. Because of its simplicity and acceptability, it has great potential for dissemination.

  7. Introduction to TAFI - A Matlab® toolbox for analysis of flexural isostasy

    NASA Astrophysics Data System (ADS)

    Jha, S.; Harry, D. L.; Schutt, D.

    2016-12-01

    The isostatic response to vertical tectonic loads emplaced on thin elastic plates overlying an inviscid substrate, and the corresponding gravity anomalies, are commonly modeled using well-established theories and methodologies of flexural analysis. However, such analysis requires some mathematical and coding expertise on the part of users. With that in mind, we designed a new interactive Matlab® toolbox called the Toolbox for Analysis of Flexural Isostasy (TAFI). TAFI allows users to create forward models (2-D and 3-D) of flexural deformation of the lithosphere and the resulting gravity anomaly. TAFI computes Green's functions for flexure of the elastic plate subjected to point or line loads, and analytical solutions for harmonic loads. Flexure due to non-impulsive, distributed 2-D or 3-D loads is computed by convolving the appropriate Green's function with a user-supplied, spatially discretized load function. The gravity anomaly associated with each density interface is calculated by taking the Fourier transform of the flexural deflection of these interfaces and estimating the gravity in the wavenumber domain. All models created in TAFI are based on Matlab's intrinsic functions and do not require any specialized toolbox, function or library except those distributed with TAFI. Modeling functions within TAFI can be called from the Matlab workspace, from within user-written programs or from TAFI's graphical user interface (GUI). The GUI enables the user to model the flexural deflection of the lithosphere interactively, allowing real-time comparison of model fit with the observed data constraining the flexural deformation and gravity, and facilitating a rapid search for the best-fitting flexural model. TAFI is a very useful teaching and research tool and has been tested rigorously in graduate-level teaching and basic research environments.
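    As an illustration of the kind of forward model TAFI provides, the sketch below evaluates the standard textbook deflection of a thin elastic plate overlying an inviscid substrate under a 2-D line load, w(x) = (V0*alpha^3/(8*D))*exp(-|x|/alpha)*(cos(|x|/alpha) + sin(|x|/alpha)), with flexural parameter alpha = (4*D/(drho*g))^(1/4). This is a generic analytical solution written directly in core MATLAB, not a call to TAFI's own functions, and all parameter values are illustrative.

      % Minimal sketch: flexure of a thin elastic plate under a 2-D line load
      % (textbook analytical solution; not TAFI's own routines).
      E    = 70e9;          % Young's modulus (Pa)
      nu   = 0.25;          % Poisson's ratio
      Te   = 20e3;          % effective elastic thickness (m)
      g    = 9.81;          % gravitational acceleration (m/s^2)
      drho = 3300 - 2300;   % mantle minus infill density contrast (kg/m^3)
      V0   = 1e12;          % line-load magnitude (N per m along strike)

      D     = E * Te^3 / (12 * (1 - nu^2));    % flexural rigidity (N m)
      alpha = (4 * D / (drho * g))^(1/4);      % flexural parameter (m)

      x = linspace(-500e3, 500e3, 1001);       % horizontal distance (m)
      w = (V0 * alpha^3 / (8 * D)) .* exp(-abs(x)/alpha) ...
              .* (cos(abs(x)/alpha) + sin(abs(x)/alpha));   % deflection (m)

      plot(x/1e3, w, 'LineWidth', 1.5);
      set(gca, 'YDir', 'reverse');             % show deflection as subsidence
      xlabel('Distance (km)'); ylabel('Deflection (m, positive down)');
      title('Line-load flexure of a thin elastic plate');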

  8. Using health and demographic surveillance for the early detection of cholera outbreaks: analysis of community- and hospital-based data from Matlab, Bangladesh.

    PubMed

    Saulnier, Dell D; Persson, Lars-Åke; Streatfield, Peter Kim; Faruque, A S G; Rahman, Anisur

    2016-01-01

    Cholera outbreaks are a continuing problem in Bangladesh, and the timely detection of an outbreak is important for reducing morbidity and mortality. In Matlab, the ongoing Health and Demographic Surveillance System (HDSS) data records symptoms of diarrhea in children under the age of 5 years at the community level. Cholera surveillance in Matlab currently uses hospital-based data. The objective of this study is to determine whether increases in cholera in Matlab can be detected earlier by using HDSS diarrhea symptom data in a syndromic surveillance analysis, when compared to hospital admissions for cholera. HDSS diarrhea symptom data and hospital admissions for cholera in children under 5 years of age over a 2-year period were analyzed with the syndromic surveillance statistical program EARS (Early Aberration Reporting System). Dates when significant increases in either symptoms or cholera cases occurred were compared to one another. The analysis revealed that there were 43 days over 16 months when the cholera cases or diarrhea symptoms increased significantly. There were 8 months when both data sets detected days with significant increases. In 5 of the 8 months, increases in diarrheal symptoms occurred before increases of cholera cases. The increases in symptoms occurred between 1 and 15 days before the increases in cholera cases. The results suggest that the HDSS survey data may be able to detect an increase in cholera before an increase in hospital admissions is seen. However, there was no direct link between diarrheal symptom increases and cholera cases, and this, as well as other methodological weaknesses, should be taken into consideration.

  9. A Series of MATLAB Learning Modules to Enhance Numerical Competency in Applied Marine Sciences

    NASA Astrophysics Data System (ADS)

    Fischer, A. M.; Lucieer, V.; Burke, C.

    2016-12-01

    Enhanced numerical competency for navigating massive data landscapes is a critical skill students need to effectively explore, analyse and visualize complex patterns in high-dimensional data and to address the complexity of many of the world's problems. This is especially the case for interdisciplinary, undergraduate applied marine science programs, where students are required to demonstrate competency in methods and ideas across multiple disciplines. In response to this challenge, we have developed a series of repository-based data exploration, analysis and visualization modules in MATLAB for integration across various on-campus and online classes within the University of Tasmania. The primary focus of these modules is to teach students to collect, aggregate and interpret data from large online marine scientific data repositories in order to 1) gain technical skills in discovering, accessing, managing and visualising large, numerous data sources, 2) interpret, analyse and design approaches to visualise these data, and 3) address, through numerical approaches, complex real-world problems that traditional scientific methods cannot address. All modules, implemented through a MATLAB live script, include a short recorded lecture to introduce the topic, a handout that gives an overview of the activities, an instructor's manual with a detailed methodology and discussion points, a student assessment (quiz and level-specific challenge task), and a survey. The marine science themes addressed through these modules include biodiversity, habitat mapping, algal blooms and sea surface temperature change, and they utilize a series of marine science and oceanographic data portals. Through these modules, students with minimal experience in MATLAB or numerical methods are introduced to array indexing, concatenation, sorting and reshaping, principal component analysis, spectral analysis and unsupervised classification within the context of oceanographic processes, marine geology and marine community ecology.
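    To give a flavor of the style of exercise such modules contain, the sketch below generates a small synthetic multivariate data set (temperature-, salinity- and chlorophyll-like variables), standardizes it, and performs a principal component analysis using only core MATLAB linear algebra, so no toolbox is needed. The variables and values are invented for illustration; they are not drawn from the modules or data portals described above.

      % Minimal sketch of a PCA exercise on synthetic multivariate data
      % (illustrative; not taken from the modules or repositories described above).
      rng(1);
      n   = 200;                                  % number of "stations"
      tmp = 15 + 5*randn(n,1);                    % temperature-like variable
      sal = 35 + 0.5*randn(n,1) - 0.05*(tmp-15);  % salinity-like, weakly coupled
      chl = 2 + 0.3*(tmp-15) + 0.4*randn(n,1);    % chlorophyll-like, coupled to tmp
      X   = [tmp sal chl];

      Z = (X - mean(X)) ./ std(X);                % standardize each column
      [V, D] = eig(cov(Z));                       % eigen-decomposition of covariance
      [lambda, order] = sort(diag(D), 'descend'); % order components by variance
      V = V(:, order);                            % principal axes (loadings)
      scores = Z * V;                             % station scores on each component

      explained = 100 * lambda / sum(lambda);     % variance explained (%)
      disp(table((1:3)', lambda, explained, ...
           'VariableNames', {'PC', 'Eigenvalue', 'PctVariance'}));
      scatter(scores(:,1), scores(:,2), 20, tmp, 'filled');
      xlabel('PC1 score'); ylabel('PC2 score'); colorbar;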

  10. Essays on Mathematical Optimization for Residential Demand Response in the Energy Sector

    NASA Astrophysics Data System (ADS)

    Palaparambil Dinesh, Lakshmi

    In the electric utility industry, it could be challenging to adjust supply to match demand due to large generator ramp up times, high generation costs and insufficient in-house generation capacity. Demand response (DR) is a technique for adjusting the demand for electric power instead of the supply. Direct Load Control (DLC) is one of the ways to implement DR. DLC program participants sign up for power interruption contracts and are given financial incentives for curtailing electricity usage during peak demand time periods. This dissertation studies a DLC program for residential air conditioners using mathematical optimization models. First, we develop a model that determines what contract parameters to use in designing contracts between the provider and residential customers, when to turn which power unit on or off and how much power to cut during peak demand hours. The model uses information on customer preferences for choice of contract parameters such as DLC financial incentives and energy usage curtailment. In numerical experiments, the proposed model leads to projected cost savings of the order of 20%, compared to a current benchmark model used in practice. We also quantify the impact of factors leading to cost savings and study characteristics of customers picked by different contracts. Second, we study a DLC program in a macro economic environment using a Computable General Equilibrium (CGE) model. A CGE model is used to study the impact of external factors such as policy and technology changes on different economic sectors. Here we differentiate customers based on their preference for DLC programs by using different values for price elasticity of demand for electricity commodity. Consequently, DLC program customers could substitute demand for electricity commodity with other commodities such as transportation sector. Price elasticity of demand is calculated using a novel methodology that incorporates customer preferences for DLC contracts from the first model. The calculation of elasticity based on our methodology is useful since the prices of commodities are not only determined by aggregate demand and supply but also by customers' relative preferences for commodities. In addition to this we quantify the indirect substitution and rebound effects on sectoral activity levels, incomes and prices based on customer differences, when DLC is implemented.

  11. 39 CFR 501.18 - Customer information and authorization.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 39 Postal Service 1 2010-07-01 2010-07-01 false Customer information and authorization. 501.18 Section 501.18 Postal Service UNITED STATES POSTAL SERVICE POSTAGE PROGRAMS AUTHORIZATION TO MANUFACTURE AND DISTRIBUTE POSTAGE EVIDENCING SYSTEMS § 501.18 Customer information and authorization. (a...

  12. 39 CFR 501.18 - Customer information and authorization.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 39 Postal Service 1 2011-07-01 2011-07-01 false Customer information and authorization. 501.18 Section 501.18 Postal Service UNITED STATES POSTAL SERVICE POSTAGE PROGRAMS AUTHORIZATION TO MANUFACTURE AND DISTRIBUTE POSTAGE EVIDENCING SYSTEMS § 501.18 Customer information and authorization. (a...

  13. Teaching Computational Geophysics Classes using Active Learning Techniques

    NASA Astrophysics Data System (ADS)

    Keers, H.; Rondenay, S.; Harlap, Y.; Nordmo, I.

    2016-12-01

    We give an overview of our experience in teaching two computational geophysics classes at the undergraduate level. The first class is, for most students, their first programming class and assumes that the students have had an introductory course in geophysics. In this class the students are introduced to basic Matlab skills: use of variables, basic array and matrix definition and manipulation, basic statistics, 1D integration, plotting of lines and surfaces, making .m files and basic debugging techniques. All of these skills are applied to elementary but important concepts in earthquake and exploration geophysics (including epicentre location, computation of travel time curves for simple layered media, plotting of 1D and 2D velocity models, etc.). It is important to integrate the geophysics with the programming concepts: we found that this enhances students' understanding. Moreover, as this is a 3-year Bachelor program and this class is taught in the 2nd semester, there is little time for a class that focusses only on programming. In the second class, which is optional and can be taken in the 4th or 6th semester but is often also taken by Master's students, we extend the Matlab programming to include signal processing and ordinary and partial differential equations, again with emphasis on geophysics (such as ray tracing and solving the acoustic wave equation). This class also contains a project in which the students have to write a brief paper on a topic in computational geophysics, preferably with programming examples. When teaching these classes we found that active learning techniques, in which the students actively participate in the class, either individually, in pairs or in groups, are indispensable. We give a brief overview of the various activities that we have developed when teaching these classes.
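    One of the introductory exercises mentioned, computing travel-time curves for a simple layered medium, can be sketched in a few lines of MATLAB. For a single layer of thickness h and velocity v1 over a half-space of velocity v2, the direct, reflected and head (refracted) waves have the standard travel times t_dir = x/v1, t_refl = sqrt(x^2 + 4h^2)/v1 and t_head = x/v2 + 2h*sqrt(1/v1^2 - 1/v2^2). The sketch below uses illustrative parameter values and is not the course material itself.

      % Minimal sketch: travel-time curves for one layer over a half-space.
      % Layer thickness and velocities are illustrative values.
      h  = 2e3;        % layer thickness (m)
      v1 = 3000;       % layer P-wave velocity (m/s)
      v2 = 5000;       % half-space P-wave velocity (m/s)
      x  = linspace(0, 30e3, 500);                    % source-receiver offsets (m)

      tDirect  = x / v1;                              % direct wave
      tReflect = sqrt(x.^2 + 4*h^2) / v1;             % reflection off the interface
      tHead    = x / v2 + 2*h*sqrt(1/v1^2 - 1/v2^2);  % head (refracted) wave
      xCrit    = 2*h*tan(asin(v1/v2));                % head wave exists beyond this offset
      tHead(x < xCrit) = NaN;                         % mask pre-critical offsets

      plot(x/1e3, tDirect, x/1e3, tReflect, x/1e3, tHead, 'LineWidth', 1.2);
      xlabel('Offset (km)'); ylabel('Travel time (s)');
      legend('Direct', 'Reflected', 'Head wave', 'Location', 'northwest');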

  14. PFMCal : Photonic force microscopy calibration extended for its application in high-frequency microrheology

    NASA Astrophysics Data System (ADS)

    Butykai, A.; Domínguez-García, P.; Mor, F. M.; Gaál, R.; Forró, L.; Jeney, S.

    2017-11-01

    The present document is an update of the previously published MatLab code for the calibration of optical tweezers in the high-resolution detection of the Brownian motion of non-spherical probes [1]. In this instance, an alternative version of the original code, based on the same physical theory [2] but focused on automating the calibration of measurements that use spherical probes, is outlined. The newly added code is useful for high-frequency microrheology studies, where the probe radius is known but the viscosity of the surrounding fluid may not be. This extended calibration methodology is automatic, without the need for a user interface. A code for calibration by means of thermal noise analysis [3] is also included; this is a method that can be applied with viscoelastic fluids if the trap stiffness has previously been estimated [4]. The new code can be executed in MatLab and in GNU Octave. Program Files doi:http://dx.doi.org/10.17632/s59f3gz729.1 Licensing provisions: GPLv3. Programming language: MatLab 2016a (MathWorks Inc.) and GNU Octave 4.0. Operating system: Linux and Windows. Supplementary material: a new document, README.pdf, includes basic running instructions for the new code. Journal reference of previous version: Computer Physics Communications, 196 (2015) 599. Does the new version supersede the previous version?: No; it adds alternative but compatible code while providing similar calibration factors. Nature of problem: The original code uses a MatLab-provided user interface, which is not available in GNU Octave, and cannot be used outside proprietary software such as MatLab. Moreover, the calibration process for spherical probes needs an automatic method when calibrating large amounts of different data for microrheology. Solution method: The new code can be executed in the latest version of MatLab and in GNU Octave, a free and open-source alternative to MatLab. This code provides an automatic calibration process that requires only writing the input data in the main script. Additionally, we include a calibration method based on thermal noise statistics, which can be used with viscoelastic fluids if the trap stiffness has previously been estimated. Reasons for the new version: This version extends the functionality of PFMCal to the particular case of spherical probes and unknown fluid viscosities. The extended code is automatic, works in different operating systems, and is compatible with GNU Octave. Summary of revisions: The original MatLab program in the previous version, which is executed by PFMCal.m, is not changed. Here, we have added two additional main files, PFMCal_auto.m and PFMCal_histo.m, which implement automatic calculation of the calibration process and calibration through Boltzmann statistics, respectively. The process of calibration using this code for spherical beads is described in the README.pdf file provided in the new code submission. Here, we obtain different calibration factors, β (given in μm/V), according to [2], related to two statistical quantities: the mean-squared displacement (MSD), βMSD, and the velocity autocorrelation function (VAF), βVAF. Using that methodology, the trap stiffness, k, and the zero-shear viscosity of the fluid, η, can be calculated if the value of the particle's radius, a, is known beforehand.
For comparison, we include in the extended code the method of calibration using the corner frequency of the power-spectral density (PSD) [5], providing a calibration factor βPSD. In addition, with a prior estimate of the trap stiffness and the known value of the particle's radius, we can use thermal noise statistics to obtain calibration factors, β, according to the quadratic form of the optical potential, βE, and related to the Gaussian distribution of the bead's positions, βσ2. This method has been demonstrated to be applicable to the calibration of optical tweezers when using non-Newtonian viscoelastic polymeric liquids [4]. An example of the results using this calibration process is summarized in Table 1. Using the data provided in the new code submission, for water and acetone, we calculate all the calibration factors using the original PFMCal.m and the new non-GUI code PFMCal_auto.m and PFMCal_histo.m. Regarding the new code, PFMCal_auto.m returns η, k, βMSD, βVAF and βPSD, while PFMCal_histo.m provides βσ2 and βE. Table 1 shows that we obtain the expected viscosity of the two fluids at this temperature and that the different methods show good agreement between trap stiffnesses and calibration factors. Additional comments, including restrictions and unusual features: The original code, PFMCal.m, runs under MatLab using the Statistics Toolbox. The extended code, PFMCal_auto.m and PFMCal_histo.m, can be executed without modification using MatLab or GNU Octave. The code has been tested on Linux and Windows operating systems.
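    The Boltzmann-statistics (equipartition) idea referenced above can be illustrated in a few lines: for a harmonic trap, the variance of the bead position gives the stiffness through k = kB*T/var(x), and if positions are recorded in volts a calibration factor follows from an independently estimated stiffness as beta = sqrt(kB*T/(k*var(V))). The sketch below runs on a simulated detector trace; it is not the PFMCal code, and all numerical values are illustrative.

      % Minimal sketch: equipartition (Boltzmann-statistics) calibration of an
      % optical trap, using a simulated detector signal (not the PFMCal code).
      kB    = 1.380649e-23;          % Boltzmann constant (J/K)
      T     = 295;                   % temperature (K)
      kTrap = 5e-6;                  % assumed trap stiffness (N/m), e.g. from a PSD fit

      % Simulate a voltage trace for a bead whose true conversion factor is
      % betaTrue (m/V); in a real experiment V would be the measured data.
      betaTrue = 0.5e-6;
      xTrue = sqrt(kB*T/kTrap) * randn(1, 2e5);      % Gaussian positions (m)
      V     = xTrue / betaTrue;                      % detector output (V)

      % Calibration factor from the variance of the voltage signal:
      betaSigma2 = sqrt(kB*T / (kTrap * var(V)));    % m/V
      fprintf('beta (equipartition): %.3f um/V (true %.3f um/V)\n', ...
              betaSigma2*1e6, betaTrue*1e6);

      % Consistency check (returns the assumed stiffness by construction):
      xCal = betaSigma2 * V;
      fprintf('Recovered stiffness: %.2e N/m\n', kB*T / var(xCal));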

  15. Software For Computer-Aided Design Of Control Systems

    NASA Technical Reports Server (NTRS)

    Wette, Matthew

    1994-01-01

    Computer Aided Engineering System (CAESY) software developed to provide means to evaluate methods for dealing with users' needs in computer-aided design of control systems. Interpreter program for performing engineering calculations. Incorporates features of both Ada and MATLAB. Designed to be flexible and powerful. Includes internally defined functions, procedures and provides for definition of functions and procedures by user. Written in C language.

  16. Coding in Senior School Mathematics with Live Editing

    ERIC Educational Resources Information Center

    Thompson, Ian

    2017-01-01

    In this paper, an example is offered of a problem-solving task for senior secondary school students which was given in the context of a story. As the story unfolds, the task requires progressively more complex forms of linear programming to be applied. Coding in MATLAB is used throughout the task in such a way that it supports the increasing…

  17. 12 CFR Appendix B to Part 364 - Interagency Guidelines Establishing Information Security Standards

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Relationships Risk Management Principles,” Nov. 1, 2001; FDIC FIL 68-99, Risk Assessment Tools and Practices for... Customer Information A. Information Security Program B. Objectives III. Development and Implementation of Customer Information Security Program A. Involve the Board of Directors B. Assess Risk C. Manage and...

  18. 78 FR 48441 - Office of Urban Indian Health Programs Proposed Single Source Grant With Native American...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-08

    ... outreach and case management, the program has expanded offering to include on-site dental service and... Care: Customer service is the key to quality care. Treating patients well is the first step to improving quality and access. This area also incorporates Best Practices in customer service. Identify...

  19. Developing Customized Programs for Steel and Other Heavy Industries: A Case Study.

    ERIC Educational Resources Information Center

    Day, Philip R., Jr.

    1984-01-01

    This article discusses the successful implementation of a unique customized training program for steel and other industries. A contextual framework for understanding both the process and the product is presented. Traditional labor management problems are examined as well as the DACUM (Developing a Curriculum) procedure of identifying job-related…

  20. Management Education Benchmarking Designing Customized and Flexible MBA Programs

    ERIC Educational Resources Information Center

    Hall, Owen P., Jr.; Young, Terry W.

    2007-01-01

    To meet the challenges of the 21st century B-schools are revising curriculum, delivery and outcome assessment modalities. Today, the proportion of electives and other specialty offerings in many MBA programs now constitutes more than 50% of the total curriculum. However, this focus on customization, integration and flexibility is not without its…

  1. Matpar: Parallel Extensions for MATLAB

    NASA Technical Reports Server (NTRS)

    Springer, P. L.

    1998-01-01

    Matpar is a set of client/server software that allows a MATLAB user to take advantage of a parallel computer for very large problems. The user can replace calls to certain built-in MATLAB functions with calls to Matpar functions.

  2. Fully Automated Sunspot Detection and Classification Using SDO HMI Imagery in MATLAB

    DTIC Science & Technology

    2014-03-27

    Fully Automated Sunspot Detection and Classification Using SDO HMI Imagery in MATLAB. Thesis by Gordon M. Spahr, BS, Second Lieutenant, USAF (AFIT-ENP-14-M-34), presented to the Faculty of the Department of Engineering Physics, Graduate School of Engineering and Management, Air Force Institute of Technology. Distribution unlimited.

  3. Echolocation-Based Foraging by Harbor Porpoises and Sperm Whales, Including Effects on Noise and Acoustic Propagation

    DTIC Science & Technology

    2008-09-01

    Table-of-contents excerpt: Behavioural Point Process Data; Appendix B: Matlab Code, including the Matlab code used in Chapter 2 (porpoise prey capture analysis: click extraction and measurement of click properties, envelope-based click detector) and in Chapter 3 (transmission loss in porpoise habitats: click extraction from data wavefiles, click level determination for the Grand Manan and Danish datasets).

  4. Flexible missile autopilot design studies with PC-MATLAB/386

    NASA Technical Reports Server (NTRS)

    Ruth, Michael J.

    1989-01-01

    Development of a responsive, high-bandwidth missile autopilot for airframes which have structural modes of unusually low frequency presents a challenging design task. Such systems are viable candidates for modern, state-space control design methods. The PC-MATLAB interactive software package provides an environment well-suited to the development of candidate linear control laws for flexible missile autopilots. The strengths of MATLAB include: (1) exceptionally high speed (MATLAB's version for 80386-based PC's offers benchmarks approaching minicomputer and mainframe performance); (2) ability to handle large design models of several hundred degrees of freedom, if necessary; and (3) broad extensibility through user-defined functions. To characterize MATLAB capabilities, a simplified design example is presented. This involves interactive definition of an observer-based state-space compensator for a flexible missile autopilot design task. MATLAB capabilities and limitations, in the context of this design task, are then summarized.
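    A compressed version of such a design can be sketched with today's Control System Toolbox commands: define a state-space plant, place regulator and observer poles, and form the observer-based compensator. The lightly damped second-order plant below is purely illustrative and far simpler than a flexible-missile airframe; the commands are modern equivalents of the routines available in PC-MATLAB at the time, not the paper's actual design.

      % Minimal sketch of an observer-based compensator (requires the Control
      % System Toolbox). The plant is an illustrative lightly damped mode,
      % not a flexible-missile model.
      wn = 10; zeta = 0.02;                     % illustrative mode: 10 rad/s, 2% damping
      A = [0 1; -wn^2 -2*zeta*wn];
      B = [0; wn^2];
      C = [1 0];  D = 0;
      plant = ss(A, B, C, D);

      K = place(A,  B,  [-8+8i, -8-8i]);        % state-feedback (regulator) poles
      L = place(A', C', [-30, -35])';           % observer poles, faster than regulator
      comp = reg(plant, K, L);                  % observer-based compensator

      cl = feedback(plant, comp, +1);           % reg() already contains -K, so close
                                                % the loop with positive feedback
      damp(cl)                                  % inspect closed-loop poles and damping
      step(cl)                                  % closed-loop step response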

  5. Center to Advance Palliative Care palliative care clinical care and customer satisfaction metrics consensus recommendations.

    PubMed

    Weissman, David E; Morrison, R Sean; Meier, Diane E

    2010-02-01

    Data collection and analysis are vital for strategic planning, quality improvement, and demonstration of palliative care program impact to hospital administrators, private funders and policymakers. Since 2000, the Center to Advance Palliative Care (CAPC) has provided technical assistance to hospitals, health systems and hospices working to start, sustain, and grow nonhospice palliative care programs. CAPC convened a consensus panel in 2008 to develop recommendations for specific clinical and customer metrics that programs should track. The panel agreed on four key domains of clinical metrics and two domains of customer metrics. Clinical metrics include: daily assessment of physical/psychological/spiritual symptoms by a symptom assessment tool; establishment of patient-centered goals of care; support to patient/family caregivers; and management of transitions across care sites. For customer metrics, consensus was reached on two domains that should be tracked to assess satisfaction: patient/family satisfaction, and referring clinician satisfaction. In an effort to ensure access to reliably high-quality palliative care data throughout the nation, hospital palliative care programs are encouraged to collect and report outcomes for each of the metric domains described here.

  6. Computing Across the Physics and Astrophysics Curriculum

    NASA Astrophysics Data System (ADS)

    DeGioia Eastwood, Kathy; James, M.; Dolle, E.

    2012-01-01

    Computational skills are essential in today's marketplace. Bachelors entering the STEM workforce report that their undergraduate education does not adequately prepare them to use scientific software and to write programs. Computation can also increase student learning; not only are the students actively engaged, but computational problems allow them to explore physical problems that are more realistic than the few that can be solved analytically. We have received a grant from the NSF CCLI Phase I program to integrate computing into our upper division curriculum. Our language of choice is Matlab; this language had already been chosen for our required sophomore course in Computational Physics because of its prevalence in industry. For two summers we have held faculty workshops to help our professors develop the needed expertise, and we are now in the implementation and evaluation stage. The end product will be a set of learning materials in the form of computational modules that we will make freely available. These modules will include the assignment, pedagogical goals, Matlab code, samples of student work, and instructor comments. At this meeting we present an overview of the project as well as modules written for a course in upper division stellar astrophysics. We acknowledge the support of the NSF through DUE-0837368.

  7. Dynamic optimization case studies in DYNOPT tool

    NASA Astrophysics Data System (ADS)

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    Dynamic programming is typically applied to optimization problems. Because analytical solutions are generally very difficult to obtain, software tools are widely used. These software packages are often third-party products built on top of standard simulation software tools on the market. As typical examples of such tools, TOMLAB and DYNOPT can be applied effectively to dynamic programming problems. DYNOPT is presented in this paper because of its licensing policy (a free product under the GPL) and its simplicity of use. DYNOPT is a set of MATLAB functions for determining an optimal control trajectory given a description of the process and the cost to be minimized, subject to equality and inequality constraints, using the method of orthogonal collocation on finite elements. The actual optimal control problem is solved by complete parameterization of both the control and the state profile vectors. It is assumed that the optimized dynamic model may be described by a set of ordinary differential equations (ODEs) or differential-algebraic equations (DAEs). This collection of functions extends the capability of the MATLAB Optimization Toolbox. The paper introduces the use of DYNOPT for dynamic optimization problems by means of case studies on selected laboratory educational physical models.

  8. 77 FR 7281 - Energy Conservation Program: Energy Conservation Standards for Distribution Transformers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-10

    ... Manufacturing Transformers H. Customer Subgroup Analysis I. Manufacturer Impact Analysis 1. Overview 2... Justification and Energy Savings 1. Economic Impacts on Customers a. Life-Cycle Cost and Payback Period b. Customer Subgroup Analysis c. Rebuttable-Presumption Payback 2. Economic Impact on Manufacturers a...

  9. 19 CFR 122.175 - Exemption from penalties.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 1 2010-04-01 2010-04-01 false Exemption from penalties. 122.175 Section 122.175 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY AIR COMMERCE REGULATIONS Air Carrier Smuggling Prevention Program § 122.175 Exemption from...

  10. Simulation for Wind Turbine Generators -- With FAST and MATLAB-Simulink Modules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, M.; Muljadi, E.; Jonkman, J.

    This report presents the work done to develop generator and gearbox models in the Matrix Laboratory (MATLAB) environment and couple them to the National Renewable Energy Laboratory's Fatigue, Aerodynamics, Structures, and Turbulence (FAST) program. The goal of this project was to interface the superior aerodynamic and mechanical models of FAST to the excellent electrical generator models found in various Simulink libraries and applications. The scope was limited to Type 1, Type 2, and Type 3 generators and fairly basic gear-train models. Future work will include models of Type 4 generators and more-advanced gear-train models with increased degrees of freedom. As described in this study, implementation of the developed drivetrain model enables the software tool to be used in many ways. Several case studies are presented as examples of the many types of studies that can be performed using this tool.

  11. Extraction of RBM frequency of a (10, 0) SWNT using MATLAB

    NASA Astrophysics Data System (ADS)

    Yadav, Monika; Negi, Sunita

    2018-05-01

    In the last few decades, carbon nanotubes (CNTs) have attracted great interest for future applications in the fields of electronics, composite materials and drug delivery. The normal modes associated with a carbon nanotube are therefore important to understand for a better grasp of its behavior. In this paper we report the frequency of the most important "Radial Breathing Mode (RBM)" associated with a single-walled carbon nanotube. This is done with the help of a MATLAB program. We observe an RBM frequency associated with a (10, 0) single-walled carbon nanotube at 340 cm-1. A very slow frequency component is also observed to be associated with the RBM of the carbon nanotube, as reported earlier. The method used for the extraction of the RBM frequency is simple and fast compared with other methods.
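    The spirit of such an extraction can be sketched briefly: sample a radial coordinate of the tube over time, remove the mean, and read the dominant peak of the Fourier spectrum, converting from Hz to cm^-1 through the speed of light. The synthetic damped oscillation below merely stands in for a trajectory-derived radius time series; this is not the authors' program.

      % Minimal sketch: extract a radial-breathing-mode (RBM) frequency from a
      % radius time series by FFT. The series is synthetic (a damped oscillation
      % standing in for an MD trajectory), not the authors' data.
      c  = 2.99792458e10;             % speed of light (cm/s), for Hz -> cm^-1
      dt = 1e-15;                     % sampling interval (s), i.e. 1 fs
      N  = 2^13;                      % number of samples
      t  = (0:N-1) * dt;

      fRBM = 340 * c;                                        % 340 cm^-1 in Hz
      r    = 0.39 + 0.002*cos(2*pi*fRBM*t).*exp(-t/3e-12);   % synthetic radius (nm)

      P    = abs(fft(r - mean(r))).^2;        % power spectrum with DC removed
      f    = (0:N-1) / (N*dt);                % frequency axis (Hz)
      half = 2:floor(N/2);                    % positive frequencies, skip DC bin
      [~, k] = max(P(half));
      fprintf('RBM frequency: %.1f cm^-1\n', f(half(k)) / c);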

  12. GRace: a MATLAB-based application for fitting the discrimination-association model.

    PubMed

    Stefanutti, Luca; Vianello, Michelangelo; Anselmi, Pasquale; Robusto, Egidio

    2014-10-28

    The Implicit Association Test (IAT) is a computerized two-choice discrimination task in which stimuli have to be categorized as belonging to target categories or attribute categories by pressing, as quickly and accurately as possible, one of two response keys. The discrimination association model has been recently proposed for the analysis of reaction time and accuracy of an individual respondent to the IAT. The model disentangles the influences of three qualitatively different components on the responses to the IAT: stimuli discrimination, automatic association, and termination criterion. The article presents General Race (GRace), a MATLAB-based application for fitting the discrimination association model to IAT data. GRace has been developed for Windows as a standalone application. It is user-friendly and does not require any programming experience. The use of GRace is illustrated on the data of a Coca Cola-Pepsi Cola IAT, and the results of the analysis are interpreted and discussed.

  13. Splitting parameter yield (SPY): A program for semiautomatic analysis of shear-wave splitting

    NASA Astrophysics Data System (ADS)

    Zaccarelli, Lucia; Bianco, Francesca; Zaccarelli, Riccardo

    2012-03-01

    SPY is a Matlab algorithm that analyzes seismic waveforms in a semiautomatic way, providing estimates of the two observables of the anisotropy: the shear-wave splitting parameters. We chose to exploit those computational processes that require less intervention by the user, gaining objectivity and reliability as a result. The algorithm combines the covariance-matrix and cross-correlation techniques, and all the computation steps are interspersed with several automatic checks intended to verify the reliability of the yields. The resulting semiautomation offers two new advantages in the field of anisotropy studies: handling a huge amount of data at the same time, and comparing different yields. From this perspective, SPY has been developed in the Matlab environment, which is widespread, versatile, and user-friendly. Our intention is to provide the scientific community with a new monitoring tool for tracking the temporal variations of the crustal stress field.
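    The covariance-matrix idea behind such an analysis can be illustrated with a compact grid search: for each trial fast direction and delay time, rotate the horizontal components into the trial fast/slow frame, advance the slow component by the trial delay, and measure how linear the corrected particle motion is via the smaller eigenvalue of its covariance matrix. The synthetic example below is a generic textbook-style sketch, not the SPY code.

      % Minimal sketch of an eigenvalue-based grid search for shear-wave
      % splitting parameters (fast direction phi, delay time dt) on a
      % synthetic split shear wave. Generic illustration, not the SPY code.
      fs  = 100;                        % sampling rate (Hz)
      t   = (0:1/fs:4)';                % time (s)
      src = exp(-((t-1)/0.15).^2) .* sin(2*pi*3*(t-1));   % source wavelet
      phiTrue = 30;  dtTrue = 0.20;     % true splitting parameters

      % Synthesize split north/east components: slow = delayed copy of fast.
      lag  = round(dtTrue*fs);
      fast = src;
      slow = [zeros(lag,1); src(1:end-lag)];
      M    = [cosd(phiTrue) sind(phiTrue); -sind(phiTrue) cosd(phiTrue)];
      NE   = [fast slow] * M;           % columns: north, east

      % Grid search: undo the splitting for each trial (phi, dt) and keep the
      % pair that makes the corrected particle motion most linear.
      phis = 0:1:179;  dts = 0:0.01:0.5;
      lam2 = zeros(numel(phis), numel(dts));
      for i = 1:numel(phis)
          Mi = [cosd(phis(i)) sind(phis(i)); -sind(phis(i)) cosd(phis(i))];
          FS = NE * Mi';                                % rotate into trial fast/slow
          for j = 1:numel(dts)
              k = round(dts(j)*fs);
              s = [FS(k+1:end,2); zeros(k,1)];          % advance slow by trial dt
              lam2(i,j) = min(eig(cov([FS(:,1) s])));   % residual (smaller) eigenvalue
          end
      end
      [~, idx] = min(lam2(:));
      [iBest, jBest] = ind2sub(size(lam2), idx);
      fprintf('Estimated phi = %d deg (true %d), dt = %.2f s (true %.2f)\n', ...
              phis(iBest), phiTrue, dts(jBest), dtTrue);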

  14. Novel algorithm and MATLAB-based program for automated power law analysis of single particle, time-dependent mean-square displacement

    NASA Astrophysics Data System (ADS)

    Umansky, Moti; Weihs, Daphne

    2012-08-01

    In many physical and biophysical studies, single-particle tracking is utilized to reveal interactions, diffusion coefficients, active modes of driving motion, dynamic local structure, micromechanics, and microrheology. The basic analysis applied to those data is to determine the time-dependent mean-square displacement (MSD) of particle trajectories and perform time- and ensemble-averaging of similar motions. The motion of particles typically exhibits time-dependent power-law scaling, and only trajectories with qualitatively and quantitatively comparable MSD should be ensembled. Ensemble averaging trajectories that arise from different mechanisms, e.g., actively driven and diffusive, is incorrect and can result in inaccurate correlations between structure, mechanics, and activity. We have developed an algorithm to automatically and accurately determine power-law scaling of experimentally measured single-particle MSD. Trajectories can then be categorized and grouped according to user-defined cutoffs of time, amplitudes, scaling exponent values, or combinations. Power-law fits are then provided for each trajectory alongside categorized groups of trajectories, histograms of power laws, and the ensemble-averaged MSD of each group. The codes are designed to be easily incorporated into existing user codes. We expect that this algorithm and program will be invaluable to anyone performing single-particle tracking, be it in physical or biophysical systems. Catalogue identifier: AEMD_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMD_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 25 892. No. of bytes in distributed program, including test data, etc.: 5 572 780. Distribution format: tar.gz. Programming language: MATLAB (MathWorks Inc.) version 7.11 (2010b) or higher; the program should also be backwards compatible. The Symbolic Math Toolbox (5.5) is required. The Curve Fitting Toolbox (3.0) is recommended. Computer: Tested on Windows only, yet should work on any computer running MATLAB. In Windows 7, it should be run as administrator; if the user is not the administrator, the program may not be able to save outputs and temporary outputs to all locations. Operating system: Any supporting MATLAB (MathWorks Inc.) v7.11 / 2010b or higher. Supplementary material: Sample output files (approx. 30 MBytes) are available. Classification: 12. External routines: Several MATLAB subfunctions (m-files), freely available on the web, were used as part of, and included in, this code: count, NaN suite, parseArgs, roundsd, subaxis, wcov, wmean, and the executable pdfTK.exe. Nature of problem: In many physical and biophysical areas employing single-particle tracking, having the time-dependent power laws governing the time-averaged mean-square displacement (MSD) of a single particle is crucial. Those power laws determine the mode of motion and hint at the underlying mechanisms driving motion. Accurate determination of the power laws that describe each trajectory allows categorization into groups for further analysis of single trajectories or ensemble analysis, e.g. ensemble- and time-averaged MSD. Solution method: The algorithm in the provided program automatically analyzes and fits time-dependent power laws to single-particle trajectories, then groups particles according to user-defined cutoffs.
It accepts time-dependent trajectories of several particles; each trajectory is run through the program, its time-averaged MSD is calculated, and power laws are determined in regions where the MSD is linear on a log-log scale. Our algorithm searches for high-curvature points in experimental data, here the time-dependent MSD. Those serve as anchor points for determining the ranges of the power-law fits. Power-law scaling is then accurately determined, and error estimates of the parameters and quality of fit are provided. After all single-trajectory time-averaged MSDs are fit, we obtain cutoffs from the user to categorize and segment the power laws into groups; cutoffs are either in the exponents of the power laws, the time of appearance of the fits, or both together. The trajectories are sorted according to the cutoffs, and the time- and ensemble-averaged MSD of each group is provided, with histograms of the distributions of the exponents in each group. The program then allows the user to generate new trajectory files with trajectories segmented according to the determined groups, for any further required analysis. Additional comments: A README file giving the names and a brief description of all the files that make up the package, and clear instructions on the installation and execution of the program, is included in the distribution package. Running time: On an i5 Windows 7 machine with 4 GB RAM the automated parts of the run (excluding data loading and user input) take less than 45 minutes to analyze and save all stages for an 844-trajectory file, including the optional PDF save. Trajectory length did not affect run time (tested up to 3600 frames/trajectory), which was on average 3.2±0.4 seconds per trajectory.
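    The elementary operation underlying such an analysis, computing a trajectory's time-averaged MSD and fitting a power law MSD(tau) = A*tau^alpha on a log-log scale, can be sketched as follows. The trajectory is a simulated 2-D random walk, and the fit is a plain least-squares line in log-log space rather than the anchor-point segmentation the program itself performs.

      % Minimal sketch: time-averaged MSD of one trajectory and a power-law fit
      % MSD(tau) = A * tau^alpha by linear regression in log-log space.
      % The trajectory is a simulated 2-D random walk (alpha should be ~1).
      rng(0);
      dt = 0.05;                            % frame interval (s)
      N  = 2000;                            % number of frames
      xy = cumsum(0.02*randn(N,2));         % positions (um), simple diffusion

      maxLag = floor(N/4);                  % restrict lags for better statistics
      msd = zeros(maxLag,1);
      for lag = 1:maxLag
          d = xy(1+lag:end,:) - xy(1:end-lag,:);     % displacements at this lag
          msd(lag) = mean(sum(d.^2, 2));             % time-averaged MSD (um^2)
      end
      tau = (1:maxLag)' * dt;                        % lag times (s)

      p = polyfit(log(tau), log(msd), 1);            % slope = alpha, intercept = log(A)
      alpha = p(1);  A = exp(p(2));
      fprintf('MSD ~ %.3g * tau^%.2f\n', A, alpha);

      loglog(tau, msd, '.', tau, A*tau.^alpha, '-');
      xlabel('\tau (s)'); ylabel('MSD (\mum^2)');
      legend('time-averaged MSD', 'power-law fit', 'Location', 'northwest');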

  15. Multi-objective problem of the modified distributed parallel machine and assembly scheduling problem (MDPMASP) with eligibility constraints

    NASA Astrophysics Data System (ADS)

    Amallynda, I.; Santosa, B.

    2017-11-01

    This paper proposes a new generalization of the distributed parallel machine and assembly scheduling problem (DPMASP) with eligibility constraints, referred to as the modified distributed parallel machine and assembly scheduling problem (MDPMASP) with eligibility constraints. Within this generalization, we assume a set of non-identical factories or production lines, each with a set of unrelated parallel machines of different speeds feeding a single assembly machine in series. A set of different products is manufactured through an assembly program from a set of components (jobs) according to the requested demand, and each product requires several kinds of jobs of different sizes. We also consider the multi-objective problem (MOP) of simultaneously minimizing mean flow time and the number of tardy products. The problem is known to be NP-hard and is important in practice, as the two criteria reflect the customer's demand and the manufacturer's perspective. Because this is a realistic and complex problem with a wide range of possible solutions, we propose four simple heuristics and two metaheuristics to solve it. Various parameters of the proposed metaheuristic algorithms are discussed and calibrated by means of the Taguchi technique. All proposed algorithms are implemented and tested in Matlab. Our computational experiments indicate that the proposed algorithms can solve moderately sized instances of the proposed problem, giving efficient solutions that are close to optimal in most cases.

  16. Accelerometry-Derived Physical Activity Correlations Between Parents and Their Fourth-Grade Child Are Specific to Time of Day and Activity Level.

    PubMed

    Strutz, Erin; Browning, Raymond; Smith, Stephanie; Lohse, Barbara; Cunningham-Sabo, Leslie

    2018-06-01

    The purpose of this study was to employ high-frequency accelerometry to explore parent-child physical activity (PA) relationships across a free-living sample. We recorded 7 days of wrist-mounted accelerometry data from 168 dyads of elementary-aged children and their parents. Using a custom MATLAB program (Natick, MA), we summed child and parent accelerations over 1 and 60 seconds, respectively, and applied published cut points to determine the amount of time spent in moderate-vigorous PA (MVPA). Bivariate and partial correlations examined parent-child relationships between percentage of time spent in MVPA. Weak to moderate positive correlations were observed before school (r = .326, P < .001), after school (r = .176, P = .023), during the evening (r = .213, P = .006), and on weekends (r = .231, P = .003). Partial correlations controlling for parent-child MVPA revealed significant relationships during the school day (r = .185, P = .017), before school (r = .315, P < .001), and on weekends (r = .266, P = .001). In addition, parents of more active children were significantly more active than parents of less active children during the evening. These data suggest that there is some association between parent-child PA, especially before school and on weekends. Future interventions aiming to increase PA among adults and children must consider patterns of MVPA specific to children and parents and target them accordingly.
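    The epoch-summation and cut-point step described above can be sketched in MATLAB roughly as follows; the sampling rate, epoch length, and cut-point value here are placeholders, not the study's published values.

      % Minimal sketch of epoch summation and cut-point classification
      % (assumed sampling rate and a hypothetical cut point).
      fs       = 30;                          % accelerometer samples per second (assumed)
      accel    = abs(randn(fs*3600, 1));      % one hour of synthetic acceleration magnitude
      epochLen = 60 * fs;                     % 60-second epochs (as used for the parents)
      nEpochs  = floor(numel(accel) / epochLen);
      epochSum = sum(reshape(accel(1:nEpochs*epochLen), epochLen, nEpochs), 1)';
      cutPoint = 1450;                        % hypothetical MVPA cut point for this epoch length
      pctMVPA  = 100 * mean(epochSum >= cutPoint);
      fprintf('%.1f%% of epochs classified as MVPA\n', pctMVPA);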

  17. An energy management for series hybrid electric vehicle using improved dynamic programming

    NASA Astrophysics Data System (ADS)

    Peng, Hao; Yang, Yaoquan; Liu, Chunyu

    2018-02-01

    With the increasing number of hybrid electric vehicles (HEVs), management of the two energy sources, engine and battery, is more and more important for achieving minimum fuel consumption. This paper first introduces several working modes of the series hybrid electric vehicle (SHEV) and then describes the mathematical model of the main relevant components in a SHEV. On the foundation of this model, dynamic programming is applied in MATLAB to distribute energy between the engine and the battery, achieving lower fuel consumption than a traditional control strategy. In addition, a control rule for recovering energy during braking is added to the dynamic programming, and algorithmic optimization gives the improved dynamic programming a shorter computing time.
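    A minimal backward dynamic-programming sketch over a battery state-of-charge grid is shown below; the demand profile, fuel model, and battery model are all hypothetical stand-ins, not the vehicle model or the improved algorithm used in the paper.

      % Backward dynamic programming over a SOC grid (illustrative models only).
      dt   = 1;                                   % time step [s]
      Pdem = 20 + 10*sin((1:100)/10);             % demanded power profile [kW] (synthetic)
      soc  = linspace(0.4, 0.8, 41)';             % discretized battery state of charge
      Peng = linspace(0, 40, 21);                 % candidate engine power levels [kW]
      Qbat = 5 * 3600;                            % battery capacity [kW s]
      fuel = @(P) 0.08 * P * dt;                  % hypothetical fuel use per step [g]
      J    = zeros(numel(soc), 1);                % terminal cost-to-go
      for k = numel(Pdem):-1:1
          Jnew = inf(numel(soc), 1);
          for i = 1:numel(soc)
              for u = Peng
                  Pbat    = Pdem(k) - u;                    % battery supplies the remainder
                  socNext = soc(i) - Pbat*dt/Qbat;          % simple integrator battery model
                  if socNext < soc(1) || socNext > soc(end), continue; end
                  Jnew(i) = min(Jnew(i), fuel(u) + interp1(soc, J, socNext));
              end
          end
          J = Jnew;
      end
      fprintf('Minimum fuel from mid-range SOC: %.1f g\n', J(ceil(end/2)));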

  18. Strain Library Imaging Protocol for high-throughput, automated single-cell microscopy of large bacterial collections arrayed on multiwell plates.

    PubMed

    Shi, Handuo; Colavin, Alexandre; Lee, Timothy K; Huang, Kerwyn Casey

    2017-02-01

    Single-cell microscopy is a powerful tool for studying gene functions using strain libraries, but it suffers from throughput limitations. Here we describe the Strain Library Imaging Protocol (SLIP), which is a high-throughput, automated microscopy workflow for large strain collections that requires minimal user involvement. SLIP involves transferring arrayed bacterial cultures from multiwell plates onto large agar pads using inexpensive replicator pins and automatically imaging the resulting single cells. The acquired images are subsequently reviewed and analyzed by custom MATLAB scripts that segment single-cell contours and extract quantitative metrics. SLIP yields rich data sets on cell morphology and gene expression that illustrate the function of certain genes and the connections among strains in a library. For a library arrayed on 96-well plates, image acquisition can be completed within 4 min per plate.
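    The segmentation-and-metrics step can be sketched along the following lines in MATLAB (Image Processing Toolbox); the file name and threshold choices are hypothetical, and this is a simplified stand-in for the published SLIP analysis scripts.

      % Segment cells by global thresholding and extract per-cell metrics
      % (illustrative only; assumes a hypothetical image file name).
      img   = im2double(imread('plate_A01.tif'));
      bw    = imbinarize(img, graythresh(img));    % Otsu threshold
      bw    = bwareaopen(bw, 50);                  % discard objects smaller than 50 px
      stats = regionprops(bw, img, 'Area', 'MajorAxisLength', 'MeanIntensity');
      cellLength    = [stats.MajorAxisLength]';    % proxy for cell length [px]
      cellIntensity = [stats.MeanIntensity]';      % proxy for expression level
      fprintf('Segmented %d cells, mean length %.1f px\n', numel(stats), mean(cellLength));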

  19. EEG and MEG data analysis in SPM8.

    PubMed

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is free and open source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and build custom analysis tools using powerful graphical user interface (GUI) and batching tools.

  20. EEG and MEG Data Analysis in SPM8

    PubMed Central

    Litvak, Vladimir; Mattout, Jérémie; Kiebel, Stefan; Phillips, Christophe; Henson, Richard; Kilner, James; Barnes, Gareth; Oostenveld, Robert; Daunizeau, Jean; Flandin, Guillaume; Penny, Will; Friston, Karl

    2011-01-01

    SPM is free and open source software written in MATLAB (The MathWorks, Inc.). In addition to standard M/EEG preprocessing, we presently offer three main analysis tools: (i) statistical analysis of scalp-maps, time-frequency images, and volumetric 3D source reconstruction images based on the general linear model, with correction for multiple comparisons using random field theory; (ii) Bayesian M/EEG source reconstruction, including support for group studies, simultaneous EEG and MEG, and fMRI priors; (iii) dynamic causal modelling (DCM), an approach combining neural modelling with data analysis for which there are several variants dealing with evoked responses, steady state responses (power spectra and cross-spectra), induced responses, and phase coupling. SPM8 is integrated with the FieldTrip toolbox, making it possible for users to combine a variety of standard analysis methods with new schemes implemented in SPM and build custom analysis tools using powerful graphical user interface (GUI) and batching tools. PMID:21437221

  1. The Kepler Science Operations Center Pipeline Framework Extensions

    NASA Technical Reports Server (NTRS)

    Klaus, Todd C.; Cote, Miles T.; McCauliff, Sean; Girouard, Forrest R.; Wohler, Bill; Allen, Christopher; Chandrasekaran, Hema; Bryson, Stephen T.; Middour, Christopher; Caldwell, Douglas A.

    2010-01-01

    The Kepler Science Operations Center (SOC) is responsible for several aspects of the Kepler Mission, including managing targets, generating on-board data compression tables, monitoring photometer health and status, processing the science data, and exporting the pipeline products to the mission archive. We describe how the generic pipeline framework software developed for Kepler is extended to achieve these goals, including pipeline configurations for processing science data and other support roles, and custom unit of work generators that control how the Kepler data are partitioned and distributed across the computing cluster. We describe the interface between the Java software that manages the retrieval and storage of the data for a given unit of work and the MATLAB algorithms that process these data. The data for each unit of work are packaged into a single file that contains everything needed by the science algorithms, allowing these files to be used to debug and evolve the algorithms offline.

  2. BasinVis 1.0: A MATLAB®-based program for sedimentary basin subsidence analysis and visualization

    NASA Astrophysics Data System (ADS)

    Lee, Eun Young; Novotny, Johannes; Wagreich, Michael

    2016-06-01

    Stratigraphic and structural mapping is important to understand the internal structure of sedimentary basins. Subsidence analysis provides significant insights for basin evolution. We designed a new software package to process and visualize stratigraphic setting and subsidence evolution of sedimentary basins from well data. BasinVis 1.0 is implemented in MATLAB®, a multi-paradigm numerical computing environment, and employs two numerical methods: interpolation and subsidence analysis. Five different interpolation methods (linear, natural, cubic spline, Kriging, and thin-plate spline) are provided in this program for surface modeling. The subsidence analysis consists of decompaction and backstripping techniques. BasinVis 1.0 incorporates five main processing steps: (1) setup (study area and stratigraphic units), (2) loading well data, (3) stratigraphic setting visualization, (4) subsidence parameter input, and (5) subsidence analysis and visualization. For in-depth analysis, our software provides cross-section and dip-slip fault backstripping tools. The graphical user interface guides users through the workflow and provides tools to analyze and export the results. Interpolation and subsidence results are cached to minimize redundant computations and improve the interactivity of the program. All 2D and 3D visualizations are created by using MATLAB plotting functions, which enables users to fine-tune the results using the full range of available plot options in MATLAB. We demonstrate all functions in a case study of Miocene sediment in the central Vienna Basin.
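    As a rough illustration of the surface-modeling step (not BasinVis code), the MATLAB sketch below grids a synthetic horizon from scattered well picks; griddata covers the linear, natural-neighbor, and cubic cases, while Kriging and thin-plate splines would require additional routines.

      % Grid a stratigraphic surface from scattered well data (synthetic example).
      wx = 100*rand(30,1);  wy = 100*rand(30,1);        % well locations [km]
      wz = -500 - 3*wx + 2*wy + 20*randn(30,1);         % horizon depth at the wells [m]
      [X, Y] = meshgrid(0:2:100, 0:2:100);              % output grid
      Zlin = griddata(wx, wy, wz, X, Y, 'linear');      % linear interpolation
      Znat = griddata(wx, wy, wz, X, Y, 'natural');     % natural-neighbor interpolation
      surf(X, Y, Znat, 'EdgeColor', 'none');
      title('Natural-neighbor surface from well picks');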

  3. Transverse emittance and phase space program developed for use at the Fermilab A0 Photoinjector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thurman-Keup, R.; Johnson, A.S.; Lumpkin, A.H.

    2011-03-01

    The Fermilab A0 Photoinjector is a 16 MeV high intensity, high brightness electron linac developed for advanced accelerator R&D. One of the key parameters for the electron beam is the transverse beam emittance. Here we report on a newly developed MATLAB-based GUI program used for transverse emittance measurements using the multi-slit technique. This program combines the image acquisition and post-processing tools for determining the transverse phase space parameters with uncertainties. An integral part of accelerator research is a measurement of the beam phase space. Measurements of the transverse phase space can be accomplished by a variety of methods including multiple screens separated by drift spaces, or by sampling phase space via pepper pots or slits. In any case, the measurement of the phase space parameters, in particular the emittance, can be drastically simplified and sped up by automating the measurement in an intuitive fashion utilizing a graphical interface. At the A0 Photoinjector (A0PI), the control system is DOOCS, which originated at DESY. In addition, there is a library for interfacing to MATLAB, a graphically capable numerical analysis package sold by The Mathworks. It is this graphical package which was chosen as the basis for a graphical phase space measurement system due to its combination of analysis and display capabilities.
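    For orientation, the rms-emittance reconstruction at the heart of the multi-slit technique can be sketched as below; the drift length and beamlet measurements are made-up numbers, and the uncertainty propagation performed by the actual program is omitted.

      % Minimal sketch of the rms-emittance calculation from multi-slit data
      % (hypothetical beamlet measurements; error analysis omitted).
      L     = 0.38;                           % slit-to-screen drift [m] (assumed)
      xs    = (-2:0.5:2)' * 1e-3;             % slit positions [m]
      N     = exp(-xs.^2 / (2*(1e-3)^2));     % relative beamlet intensities
      Xc    = 1.05 * xs;                      % beamlet centroids on the screen [m]
      sig   = 0.15e-3 * ones(size(xs));       % beamlet rms widths on the screen [m]
      xp    = (Xc - xs) / L;                  % mean divergence of each beamlet
      sigp  = sig / L;                        % divergence spread within a beamlet
      w     = N / sum(N);
      xbar  = sum(w .* xs);     xpbar = sum(w .* xp);
      x2    = sum(w .* xs.^2)             - xbar^2;
      xp2   = sum(w .* (sigp.^2 + xp.^2)) - xpbar^2;
      xxp   = sum(w .* xs .* xp)          - xbar * xpbar;
      emit  = sqrt(x2 * xp2 - xxp^2);         % geometric rms emittance [m rad]
      fprintf('rms emittance = %.3g mm mrad\n', emit * 1e6);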

  4. ImageJ-MATLAB: a bidirectional framework for scientific image analysis interoperability.

    PubMed

    Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W

    2017-02-15

    ImageJ-MATLAB is a lightweight Java library facilitating bi-directional interoperability between MATLAB and ImageJ. By defining a standard for translation between matrix and image data structures, researchers are empowered to select the best tool for their image-analysis tasks. Freely available extension to ImageJ2 (http://imagej.net/Downloads). Installation and use instructions available at http://imagej.net/MATLAB_Scripting. Tested with ImageJ 2.0.0-rc-54, Java 1.8.0_66 and MATLAB R2015b. eliceiri@wisc.edu. Supplementary data are available at Bioinformatics online.

  5. Developing Tools and Techniques to Increase Communication Effectiveness

    NASA Technical Reports Server (NTRS)

    Hayes, Linda A.; Peterson, Doug

    1997-01-01

    The Public Affairs Office (PAO) of the Johnson Space Center (JSC) is responsible for communicating current JSC Space Program activities as well as goals and objectives to the American public. As part of the 1996 Strategic Communications Plan, a review of PAO's current communication procedures was conducted. The 1996 Summer Faculty Fellow performed research activities to support this effort by reviewing current research concerning NASA/JSC's customers' perceptions and interests, developing communications tools which enable PAO to more effectively inform JSC customers about the Space Program, and proposing a process for developing and using consistent messages throughout PAO. Note that this research does not attempt to change or influence customer perceptions or interests but, instead, incorporates current customer interests into PAO's communication process.

  6. Upgrade and interpersonal skills training at American Airlines

    NASA Technical Reports Server (NTRS)

    Estridge, W. W.; Mansfield, J. L.

    1980-01-01

    Segments of the interpersonal skills training audio-visual program are presented. The program was developed to train customer contact personnel, with specific emphasis on transactional analysis in customer treatment. Concepts of transactional analysis are summarized in terms of the make-up of the personality, described as three ego states: the parent, the adult, and the child. Synopses of four of the tape programs are given.

  7. 77 FR 46528 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-03

    ... Barcode. The second will strengthen the digital relationship with consumers and aid all customers with... proposing to modify a Customer Privacy Act System of Records. These modifications reflect the needs of two new Postal Service programs to assist customers with package and mail tracking. Also, there is an...

  8. 75 FR 52542 - Extension of Agency Information Collection Activity Under OMB Review: Department of Homeland...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-26

    ... Inquiry Program (DHS TRIP). The collection also involves a voluntary customer satisfaction survey to... customer satisfaction survey in accordance with the DHS Office of the Inspector General, Report on... completing customer satisfaction survey will take approximately 10 minutes per respondent. Number of...

  9. Custom Sewing, Modules One, Two, and Three. Instructor Guide.

    ERIC Educational Resources Information Center

    Missouri Univ., Columbia. Instructional Materials Lab.

    This document consists of three modules designed for a custom apparel and garment sewing program teaching students to construct, alter, and prepare garments and home fashions to customer specifications. Each module includes some or all of the following components: performance objectives, lesson plans, suggested activities, information sheets,…

  10. Science and Engineering Education : Who is the Customer?

    DTIC Science & Technology

    2012-05-30

    business relationships are at the heart of the negative consequences of misidentifying the student as customer [8]. Student evaluations of teachers are... Journal of Education Management, 8, 29-36. 7. Scott, S.V. (1999) The academic as service provider: is the customer ‘always right’? Journal of... Author: Michael Courtney

  11. Utility Green-Pricing Programs: What Defines Success? (Topical Issues Brief)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swezey, B.; Bird, L.

    2001-09-13

    ''Green pricing'' is an optional service through which customers can support a greater level of investment by their electric utility in renewable energy technologies. Electric utilities in 29 states are now implementing green-pricing programs. This report examines important elements of green-pricing programs, including the different types of programs offered, the premiums charged, customer response, and additional factors that experience indicates are key to the development of successful programs. The best-performing programs tend to share a number of common attributes related to product design, value creation, product pricing, and program implementation. The report ends with a list of ''best practices'' for utilities to follow when developing and implementing programs.

  12. A Qualitative Study of a Rural Community College Workforce Development Customized Training Program

    ERIC Educational Resources Information Center

    O'Rear, Susan

    2011-01-01

    Across the United States, partnerships have formed between business and industry and rural community college workforce development customized training programs to meet the demands of the 21st century labor market. For many business and industry managers, a partnership has become a necessary means to train the unskilled as well as update skills…

  13. 78 FR 75629 - Self-Regulatory Organizations; Miami International Securities Exchange LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... Effectiveness of a Proposed Rule Change To Amend the MIAX Fee Schedule December 6, 2013. Pursuant to the... Priority Customer Rebate Program (the ``Program'') to (i) lower the volume thresholds of the four highest... thresholds in a month as described below. The volume thresholds are calculated based on the customer average...

  14. First Year Experience: How We Can Better Assist First-Year International Students in Higher Education

    ERIC Educational Resources Information Center

    Yan, Zi; Sendall, Patricia

    2016-01-01

    While many American colleges and universities are providing a First Year Experience (FYE) course or program for their first year students, those programs are not often customized to take into account international students' (IS) unique challenges. Using quantitative and qualitative methods, this study evaluated a FYE course that was customized for…

  15. 77 FR 19391 - Notice of Proposed Intelligent Mail Indicia Performance Criteria With Request for Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-30

    ... products designed to meet new customer needs for access to postage. In addition, changes within the United... opportunities for PES providers to propose new concepts, methods, and processes to enable customers to print pre... support the USPS PES Test and Evaluation Program (the ``Program''). The intent is for the volumes to fully...

  16. 78 FR 22895 - Test To Allow Customs Brokers To Pre-Certify Importers for Participation in the Importer Self...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-17

    ... pre-certify importers for participation in the Importer Self-Assessment (ISA) program. The test will be known as the Customs Broker Importer Self-Assessment Pre- Certification (Broker ISA PC) test. The... Importer Self-Assessment (ISA) program. The Broker Importer Self-Assessment Pre-Certification (ISA PC) test...

  17. Likelihood Ratio Test Polarimetric SAR Ship Detection Application

    DTIC Science & Technology

    2005-12-01

    menu. Under the Matlab menu, the user can export an area of an image to the Matlab™ MAT file format, as well as call RGB image and Pauli... must specify various parameters such as the area of the image to analyze. Export Image Area to Matlab™ (PolGASP & COASP): generates a Matlab™ file... © Her Majesty the Queen, represented by the Minister of National Defence, 2005.

  18. Plasmonics analysis of nanostructures for bioapplications

    NASA Astrophysics Data System (ADS)

    Xie, Qian

    Plasmonics, the science and technology of plasmons, is a rapidly growing field with substantial impact in numerous fields, especially bio-applications such as bio-sensing, bio-photonics, and photothermal therapy. Resonance effects associated with plasmonic behavior, i.e., surface plasmon resonance (SPR) and localized surface plasmon resonance (LSPR), are of particular interest because of their strong sensitivity to the local environment. In this thesis, plasmonic resonance effects are discussed from basic theory to applications, especially photothermal therapy and grating-based bio-sensing. The thesis focuses on modeling different metallic nanostructures, i.e., nanospheres, nanorods, core-shell nanoparticles, nanotori, and hexagonally close-packed nanosphere structures, to determine their LSPR wavelengths for use in various applications. Experiments on photothermal therapy using gold nanorods are described and compared with results obtained from simulations. Lastly, experiments on grating-based plasmon-enhanced bio-sensing are also discussed. In chapter one, the physics of plasmonics is reviewed, including surface plasmon resonance (SPR) and localized surface plasmon resonance (LSPR). In the section on surface plasmon resonance, the physics behind the phenomenon is discussed, along with detection methods and applications in bio-sensing. In the section on localized surface plasmon resonance, the phenomenon is described with respect to sub-wavelength metallic nanoparticles. In chapter two, specific plasmonic bio-applications are discussed, including plasmonic and magneto-plasmonic enhanced photothermal therapy and grating-based SPR bio-sensing. In chapter three, which is the core of the thesis, optical modeling of different gold nanostructures is presented. The modeling tools used are Comsol and custom-developed MATLAB programs. In Comsol, the geometries of different metallic nanostructures are drawn and simulated using finite-element computational electromagnetics, and the power absorption of the nanostructures is plotted as a function of wavelength to identify the LSPR wavelength, i.e., the wavelength of peak absorption. In MATLAB, Mie scattering theory is programmed as semi-analytical equations that predict the power absorption for specific plasmonic geometries, i.e., nanospheres, nanorods, and core-shell particles. These predictions, which are much faster than the Comsol analysis, are validated against the corresponding numerical simulations. In chapter four, experiments involving novel magneto-plasmonic nanoplatforms are described, and experimental data are presented to illustrate the use of the modeling in analyzing these particles. Simulations are performed to determine the influence on laser absorption of magnetic nanospheres in proximity to metallic nanorods, and these results are compared with experimental data. In the last chapter, experiments using a grating-based SPR sensor are described, and modeling results are also presented. In summary, this thesis discusses the physics of plasmonics, electromagnetic analysis for predicting the absorption spectra of metallic nanoparticles, and bio-applications that utilize these effects.
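    As a toy illustration of how an LSPR peak arises (not the Comsol or Mie calculations from the thesis), the MATLAB sketch below evaluates the quasi-static dipole polarizability of a small gold-like nanosphere in water with an assumed Drude-type permittivity; all parameter values are rough, order-of-magnitude assumptions.

      % Quasi-static (dipole-limit) absorption of a small metal nanosphere.
      % Drude parameters below are assumed, gold-like values, not fitted data.
      lambda  = (400:2:900)' * 1e-9;                 % wavelength [m]
      c       = 3e8;  hbar_eV = 6.582e-16;           % speed of light, hbar in eV s
      omega   = 2*pi*c ./ lambda;                    % angular frequency [rad/s]
      eps_inf = 9.8;                                 % assumed background permittivity
      wp      = 9.0  / hbar_eV;                      % assumed plasma frequency [rad/s]
      gam     = 0.07 / hbar_eV;                      % assumed damping rate [rad/s]
      eps     = eps_inf - wp^2 ./ (omega.^2 + 1i*gam*omega);   % Drude permittivity
      eps_m   = 1.77;                                % water-like host medium
      a       = 20e-9;                               % sphere radius [m]
      k       = 2*pi*sqrt(eps_m) ./ lambda;          % wavenumber in the host
      alpha   = 4*pi*a^3 * (eps - eps_m) ./ (eps + 2*eps_m);   % dipole polarizability
      Cabs    = k .* imag(alpha);                    % absorption cross-section [m^2]
      plot(lambda*1e9, Cabs); xlabel('Wavelength (nm)'); ylabel('C_{abs} (m^2)');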

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mundy, D; Tryggestad, E; Beltran, C

    Purpose: To develop daily and monthly quality assurance (QA) programs in support of a new spot-scanning proton treatment facility using a combination of commercial and custom equipment and software. Emphasis was placed on efficiency and evaluation of key quality parameters. Methods: The daily QA program was developed to test output, spot size and position, proton beam energy, and image guidance using the Sun Nuclear Corporation rf-DQA™3 device and Atlas QA software. The program utilizes standard Atlas linear accelerator tests repurposed for proton measurements and a custom jig for indexing the device to the treatment couch. The monthly QA program was designed to test mechanical performance, image quality, radiation quality, isocenter coincidence, and safety features. Many of these tests are similar to linear accelerator QA counterparts, but many require customized test design and equipment. Coincidence of imaging, laser marker, mechanical, and radiation isocenters, for instance, is verified using a custom film-based device devised and manufactured at our facility. Proton spot size and position as a function of energy are verified using a custom spot pattern incident on film and analysis software developed in-house. More details concerning the equipment and software developed for monthly QA are included in the supporting document. Thresholds for daily and monthly tests were established via perturbation analysis, early experience, and/or proton system specifications and associated acceptance test results. Results: The periodic QA program described here has been in effect for approximately 9 months and has proven efficient and sensitive to sub-clinical variations in treatment delivery characteristics. Conclusion: Tools and professional guidelines for periodic proton system QA are not as well developed as their photon and electron counterparts. The program described here efficiently evaluates key quality parameters and, while specific to the needs of our facility, could be readily adapted to other proton centers.

  20. 19 CFR 123.71 - Description of program.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... voluntary cooperation of commercial conveyance entities in Customs effort to prevent the smuggling of... closely with Customs in identifying and reporting suspected smuggling attempts. In exchange for this...

  1. Simulation Concept - How to Exploit Tools for Computing Hybrids

    DTIC Science & Technology

    2010-06-01

    Excerpts from the report's list of figures mention: biomolecular reactions; Overview of MATLAB Implementation; Adenine graphed using MATLAB (left) and OpenGL (right); An overhead view of a thymine and adenine base; Response frequency solution from MATLAB.

  2. Vessel-Mounted ADCP Data Calibration and Correction

    NASA Astrophysics Data System (ADS)

    de Andrade, A. F.; Barreira, L. M.; Violante-Carvalho, N.

    2013-05-01

    A set of scripts for vessel-mounted ADCP (Acoustic Doppler Current Profiler) data processing is presented. The need for corrections in the data measured by a ship-mounted ADCP, and the complexities found during installation, implementation, and identification of the tasks performed by currently available processing systems, were the main motivating factors for developing a system that is more practical to use, open source, and more manageable for the user. The proposed processing system consists of a set of scripts developed in the Matlab™ programming language. The system reads the binary files provided by the data acquisition program VMDAS (Vessel Mounted Data Acquisition System, proprietary to Teledyne RD Instruments), calculates calibration factors, corrects the data, and visualizes them after correction. To use the new system, it is only necessary that the ADCP data collected with the VMDAS program be placed in a processing directory and that Matlab™ be installed on the user's computer. The algorithms were extensively tested with ADCP data obtained during the Oceano Sul III (Southern Ocean III - OSIII) cruise, conducted by the Brazilian Navy aboard the R/V "Antares" from March 26th to May 10th, 2007, in the oceanic region between the states of São Paulo and Rio Grande do Sul. To read the data, the function rdradcp.m, developed by Rich Pawlowicz and available on his website (http://www.eos.ubc.ca/~rich/#RDADCP), was used. To calculate the calibration factors, the alignment error (α) and sensitivity error (β) in Water Tracking and Bottom Tracking modes, equations deduced by Joyce (1998), Pollard & Read (1989), and Trump & Marmorino (1996) were implemented in Matlab. To validate the calibration factors obtained with the new system, the parameters were compared with the factors provided by the CODAS (Common Ocean Data Access System, available at http://currents.soest.hawaii.edu/docs/doc/index.html) post-processing program; for the same data, the factors provided by both systems were similar. The obtained values were then used to correct the data, and the corrected matrices were saved and can be plotted. Values of the volume transport of the Brazil Current (BC) calculated from the data corrected by the two systems proved quite close, confirming the quality of the correction performed by the system.
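    The calibration itself amounts to a small rotation and rescaling of the measured velocities; a minimal MATLAB sketch is shown below, with placeholder values for α and β (the sign convention for the rotation depends on the reference used).

      % Apply alignment (alpha) and sensitivity (beta) calibration factors to a
      % measured velocity, written as a complex number u + i*v (placeholder values).
      alpha  = 1.2 * pi/180;          % assumed misalignment angle [rad]
      beta   = 0.005;                 % assumed sensitivity (scale) error
      u = 0.30;  v = -0.12;           % measured ship-relative velocity [m/s]
      w_meas = u + 1i*v;
      w_corr = (1 + beta) * exp(1i*alpha) * w_meas;   % rotate and rescale
      fprintf('Corrected velocity: u = %.3f m/s, v = %.3f m/s\n', real(w_corr), imag(w_corr));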

  3. Grantee Satisfaction Survey. Final Report, August 2008

    ERIC Educational Resources Information Center

    US Department of Education, 2008

    2008-01-01

    The American Customer Satisfaction Index (ACSI) is the national indicator of customer evaluations of the quality of goods and services available to U.S. residents. Since 1994, it has served as a uniform, cross-industry/government measure of customer satisfaction. A total of 10 groups, composed of eight program offices, EDFacts Coordinators, and…

  4. Customer Service Training. New Paradigm for Effective Workforce Skills. [Employee Guide and Supervisor's Guide.

    ERIC Educational Resources Information Center

    Saint Louis Community Coll., MO. Workplace Literacy Services Center.

    These two documents are part of the customer service training program provided to employees of a large metropolitan hospital. The first manual contains customer service training activities for the hospital's dietary aides, cashiers, patient service representatives, and parking attendants. The activities are organized in three sections as follows:…

  5. A Survey of Quantum Programming Languages: History, Methods, and Tools

    DTIC Science & Technology

    2008-01-01

    and entanglement, to achieve computational solutions to certain problems in less time (fewer computational cycles) than is possible using classical... superposition of quantum bits, entanglement, destructive measurement, and the no-cloning theorem. These differences must be thoroughly understood and even... computers using well-known languages such as C, C++, Java, and rapid prototyping languages such as Maple, Mathematica, and Matlab. A good on-line

  6. Using the Parallel Computing Toolbox with MATLAB on the Peregrine System

    Science.gov Websites

    parallel pool took %g seconds.\n', toc)
    % "single program multiple data"
    spmd
        fprintf('Worker %d says Hello World!\n', labindex)
    end
    delete(gcp); % close the parallel pool
    exit

    To run the script on a compute node, create the file helloWorld.sub:

    #!/bin/bash
    #PBS -l walltime=05:00
    #PBS -l nodes=1
    #PBS -N

  7. Numerical study of fluid motion in bioreactor with two mixers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheleva, I., E-mail: izheleva@uni-ruse.bg; Lecheva, A., E-mail: alecheva@uni-ruse.bg

    2015-10-28

    Numerical study of hydrodynamic laminar behavior of a viscous fluid in bioreactor with multiple mixers is provided in the present paper. The reactor is equipped with two disk impellers. The fluid motion is studied in stream function-vorticity formulation. The calculations are made by a computer program, written in MATLAB. The fluid structure is described and numerical results are graphically presented and commented.
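    One ingredient of such a solver, recovering the stream function from a given vorticity field, can be sketched in MATLAB as below (a Jacobi iteration for laplacian(psi) = -omega on a unit square with psi = 0 on the boundary); this is only an illustration, not the authors' bioreactor code.

      % Recover the stream function psi from a synthetic vorticity field omega
      % by Jacobi iteration on laplacian(psi) = -omega (illustrative only).
      n = 51;  h = 1/(n-1);
      [x, y] = meshgrid(linspace(0, 1, n));
      omega  = sin(pi*x) .* sin(pi*y);            % synthetic vorticity
      psi    = zeros(n);                          % psi = 0 on the boundary
      for it = 1:2000
          psi(2:end-1,2:end-1) = 0.25 * ( psi(1:end-2,2:end-1) + psi(3:end,2:end-1) ...
                                        + psi(2:end-1,1:end-2) + psi(2:end-1,3:end) ...
                                        + h^2 * omega(2:end-1,2:end-1) );
      end
      contourf(x, y, psi); axis equal; title('Stream function from vorticity');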

  8. Air medical referring customer satisfaction: a valuable insight.

    PubMed

    Fultz, J H; Coyle, C B; Reynolds, P W

    1998-01-01

    To remain competitive and survive, air medical programs must have a mechanism for obtaining customer feedback, especially when alternate transport options are available. The goal of this survey was to examine the air medical service's performance as perceived by customers requesting the transport. Surveys were mailed to 400 referring customers who had contact with the flight crew during the transition of patient care. The survey consisted of 16 statements evaluating the service by using a 4-point Likert scale, three demographic questions, one statement evaluating overall satisfaction, and two open-ended questions for comments or suggestions. Two hundred forty-four surveys were returned for a 61% response rate. Results indicated referring customers are satisfied with the service provided. Written comments and suggestions were divided into two categories, positive comments and suggestions for improvement. Three common themes were identified within the suggestions for improvement: crew rapport, communications, and operations. Suggested improvements were evaluated, and selected strategies were incorporated into program operation. Customer feedback furnishes valuable insight into customers' needs and their perception of a service. Comments and suggestions for improvement can promote critical inquiry into service operation and provide a catalyst for improvement.

  9. Minnesota Custom Training: Who Is Being Served and What Role Does Custom Training Play in the Work Environment? Findings from the Minnesota Work Environment Pilot Survey.

    ERIC Educational Resources Information Center

    Minnesota State Technical Coll. System, St. Paul.

    For the past 8 years, Minnesota technical colleges have been offering customized training services to the state's employers. To gather data on what kinds of organizations use custom training (CT) programs, the State Board of Technical Colleges surveyed 600 public and private employers that had used CT services through at least one of the system's…

  10. Consolidation of Customer Orders into Truckloads at a Large Manufacturer

    DTIC Science & Technology

    1997-08-01

    Consolidation of Customer Orders into Truckloads at a Large Manufacturer. G. G. Brown and D. Ronen. The Journal of the Operational Research Society, 1997 (0160-5682/97).

  11. Energy planning and energy efficiency assistance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Markel, L.

    1995-12-31

    Electrotek is an engineering services company specializing in energy-related programs. Clients are mostly utilities, large energy users, and the U.S. Electric Power Research Institute. Electrotek has directed energy projects for the U.S. Agency for International Development and the U.S. Department of Energy in Poland and other countries of Central Europe. The objective is to assist host-country organizations in identifying and implementing appropriate energy efficiency and pollution reduction technologies and to transfer technical and organizational knowledge, so that further implementations are market-driven, without the need for continuing foreign investment. Electrotek has worked with the Silesian Power Distribution Company to design an energy efficiency program for industrial customers that has proven profitable for the company and for its customers. The program has both saved energy and costs and reduced pollution. The program is expanding to include additional customers without needing more funding from the U.S. government.

  12. Application of PSAT to Load Flow Analysis with STATCOM under Load Increase Scenario and Line Contingencies

    NASA Astrophysics Data System (ADS)

    Telang, Aparna S.; Bedekar, P. P.

    2017-09-01

    Load flow analysis is the initial and essential step for any power system computation. It is required for choosing better options for power system expansion to meet ever-increasing load demand. Implementing a Flexible AC Transmission System (FACTS) device such as the STATCOM, which offers fast and very flexible control, in the load flow is one of the important tasks for power system researchers. This paper presents a simple and systematic approach for steady-state power flow calculations with the FACTS controller static synchronous compensator (STATCOM), using the command-line interface of the MATLAB-based Power System Analysis Toolbox (PSAT). The complexity of MATLAB-language programming increases when a STATCOM is incorporated into an existing Newton-Raphson load flow algorithm. Thus, the main contribution of this paper is to show how command-line usage of the user-friendly MATLAB tool PSAT can be used extensively for quicker and wider interpretation of load flow results with a STATCOM. The novelty of this paper lies in the method of applying the load increase pattern, where the active and reactive loads are changed simultaneously at all the load buses under consideration to create stressed conditions for load flow analysis with the STATCOM. The performance has been evaluated on many standard IEEE test systems, and the results for the standard IEEE 30-bus, IEEE 57-bus, and IEEE 118-bus systems are presented.
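    The load-increase pattern itself is straightforward to express in MATLAB; the sketch below scales active and reactive demand at every load bus by a common factor before the load flow is re-run (the bus-data layout and numbers are hypothetical, not PSAT's internal format).

      % Scale P and Q simultaneously at all load buses by a common factor lambda
      % (hypothetical bus data layout: [bus number, Pd in MW, Qd in MVAr]).
      busData = [ 2   21.7  12.7
                  3    2.4   1.2
                  4    7.6   1.6 ];
      lambda = 1.3;                                % 30% simultaneous load increase
      stressed = busData;
      stressed(:, 2:3) = lambda * busData(:, 2:3); % stressed loading condition
      disp(stressed)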

  13. Early participation in a prenatal food supplementation program ameliorates the negative association of food insecurity with quality of maternal-infant interaction.

    PubMed

    Frith, Amy L; Naved, Ruchira T; Persson, Lars Ake; Rasmussen, Kathleen M; Frongillo, Edward A

    2012-06-01

    Food insecurity is detrimental to child development, yet little is known about the combined influence of food insecurity and nutritional interventions on child development in low-income countries. We proposed that women assigned to an early invitation time to start a prenatal food supplementation program could reduce the negative influence of food insecurity on maternal-infant interaction. A cohort of 180 mother-infant dyads were studied (born between May and October 2003) from among 3267 in the randomized controlled trial Maternal Infant Nutritional Interventions Matlab, which was conducted in Matlab, Bangladesh. At 8 wk gestation, women were randomly assigned an invitation time to start receiving food supplements (2.5 MJ/d; 6 d/wk) either early (~9 wk gestation; early-invitation group) or at the usual start time (~20 wk gestation; usual-invitation group) for the government program. Maternal-infant interaction was observed in homes with the use of the Nursing Child Assessment Satellite Training Feeding Scale, and food-insecurity status was obtained from questionnaires completed when infants were 3.4-4.0 mo old. By using a general linear model for maternal-infant interaction, we found a significant interaction (P = 0.012) between invitation time to start a prenatal food supplementation program and food insecurity. Those in the usual-invitation group with higher food insecurity scores (i.e., more food insecure) had a lower quality of maternal-infant interaction, but this relationship was ameliorated among those in the early-invitation group. Food insecurity limits the ability of mothers and infants to interact well, but an early invitation time to start a prenatal food supplementation program can support mother-infant interaction among those who are food insecure.

  14. ORBIT: an integrated environment for user-customized bioinformatics tools.

    PubMed

    Bellgard, M I; Hiew, H L; Hunter, A; Wiebrands, M

    1999-10-01

    There are a large number of computational programs freely available to bioinformaticians via a client/server, web-based environment. However, the client interface to these tools (typically an html form page) cannot be customized from the client side as it is created by the service provider. The form page is usually generic enough to cater for a wide range of users. However, this implies that a user cannot set as 'default' advanced program parameters on the form or even customize the interface to his/her specific requirements or preferences. Currently, there is a lack of end-user interface environments that can be modified by the user when accessing computer programs available on a remote server running on an intranet or over the Internet. We have implemented a client/server system called ORBIT (Online Researcher's Bioinformatics Interface Tools) where individual clients can have interfaces created and customized to command-line-driven, server-side programs. Thus, Internet-based interfaces can be tailored to a user's specific bioinformatic needs. As interfaces are created on the client machine independent of the server, there can be different interfaces to the same server-side program to cater for different parameter settings. The interface customization is relatively quick (between 10 and 60 min) and all client interfaces are integrated into a single modular environment which will run on any computer platform supporting Java. The system has been developed to allow for a number of future enhancements and features. ORBIT represents an important advance in the way researchers gain access to bioinformatics tools on the Internet.

  15. Computer program for analysis of hemodynamic response to head-up tilt test

    NASA Astrophysics Data System (ADS)

    Świątek, Eliza; Cybulski, Gerard; Koźluk, Edward; Piątkowska, Agnieszka; Niewiadomski, Wiktor

    2014-11-01

    The aim of this work was to create a computer program, written in the MATLAB environment, which enables the visualization and analysis of hemodynamic parameters recorded during a passive tilt test using the CNS Task Force Monitor System. The application was created to help in the assessment of the relationship between the values and dynamics of changes of the selected parameters and the risk of orthostatic syncope. The signal analysis included: R-R intervals (RRI), heart rate (HR), systolic blood pressure (sBP), diastolic blood pressure (dBP), mean blood pressure (mBP), stroke volume (SV), stroke index (SI), cardiac output (CO), cardiac index (CI), total peripheral resistance (TPR), total peripheral resistance index (TPRI), left ventricular ejection time (LVET) and thoracic fluid content (TFC). The program enables the user to visualize waveforms for a selected parameter and to perform smoothing with selected moving average parameters. It allows one to construct the graph of means for any range, and the Poincaré plot for a selected time range. The program automatically determines the average value of the parameter before tilt, its minimum and maximum value immediately after the change of position, and the times of their occurrence. It is possible to correct the automatically detected points manually. For the RR interval, it determines the acceleration index (AI) and the brake index (BI). It is possible to save calculated values to an XLS file with a name specified by the user. The application has a user-friendly graphical interface and can run on a computer that has no MATLAB software.
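    The kind of post-tilt processing described, moving-average smoothing, a pre-tilt mean, and post-tilt extrema with their times, can be sketched as follows on a synthetic heart-rate trace; the sampling rate, tilt time, and window length are assumptions, and this is not the application's code.

      % Smooth a synthetic heart-rate trace, then find the pre-tilt mean and the
      % post-tilt minimum and maximum with their times (illustrative only).
      fs   = 2;                                  % samples per second (assumed)
      t    = (0:1/fs:600)';                      % 10 minutes of recording [s]
      tilt = 180;                                % tilt time [s] (assumed)
      hr   = 70 + 15*(t > tilt).*exp(-(t - tilt)/60) + 2*randn(size(t));
      hrS  = movmean(hr, 10*fs);                 % 10-second moving average
      preMean = mean(hrS(t < tilt));
      post    = hrS(t >= tilt & t < tilt + 120); % two minutes after the tilt
      tPost   = t(t >= tilt & t < tilt + 120);
      [hrMax, iMax] = max(post);
      [hrMin, iMin] = min(post);
      fprintf('Pre-tilt mean %.1f bpm; max %.1f at %.0f s; min %.1f at %.0f s\n', ...
              preMean, hrMax, tPost(iMax), hrMin, tPost(iMin));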

  16. Technology Acquisition Reform

    DTIC Science & Technology

    2004-03-01

    technologies until they are ready to be handed over to an established program. This office would also provide a home for disruptive technologies emerging... the development and acquisition of disruptive technologies. Disruptive technologies threaten programs of record but are essential to future Naval... and rarely emerge in response to customer demand. Disruptive technologies have features that a few fringe (and generally new) customers value

  17. Alternative Fuels Data Center

    Science.gov Websites

    of the following measures: Payment of incentives to customers that install EVSE; Time-of-use rates for customers; and Technical assistance programs for government fleets and private organizations. Utilities may

  18. 12 CFR 222.82 - Duties of users regarding address discrepancies.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... consumer's identity in accordance with the requirements of the Customer Identification Program (CIP) rules... of address notifications, other customer account records, or retained CIP documentation; or (C...

  19. 12 CFR 222.82 - Duties of users regarding address discrepancies.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... consumer's identity in accordance with the requirements of the Customer Identification Program (CIP) rules... of address notifications, other customer account records, or retained CIP documentation; or (C...

  20. 12 CFR 222.82 - Duties of users regarding address discrepancies.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... consumer's identity in accordance with the requirements of the Customer Identification Program (CIP) rules... of address notifications, other customer account records, or retained CIP documentation; or (C...

  1. 12 CFR 571.82 - Duties of users regarding address discrepancies.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... consumer's identity in accordance with the requirements of the Customer Identification Program (CIP) rules... of address notifications, other customer account records, or retained CIP documentation; or (C...

  2. 12 CFR 222.82 - Duties of users regarding address discrepancies.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... consumer's identity in accordance with the requirements of the Customer Identification Program (CIP) rules... of address notifications, other customer account records, or retained CIP documentation; or (C...

  3. 12 CFR 571.82 - Duties of users regarding address discrepancies.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... consumer's identity in accordance with the requirements of the Customer Identification Program (CIP) rules... of address notifications, other customer account records, or retained CIP documentation; or (C...

  4. Excitation-scanning hyperspectral imaging as a means to discriminate various tissues types

    NASA Astrophysics Data System (ADS)

    Deal, Joshua; Favreau, Peter F.; Lopez, Carmen; Lall, Malvika; Weber, David S.; Rich, Thomas C.; Leavesley, Silas J.

    2017-02-01

    Little is currently known about the fluorescence excitation spectra of disparate tissues and how these spectra change with pathological state. Current imaging diagnostic techniques have limited capacity to investigate fluorescence excitation spectral characteristics. This study utilized excitation-scanning hyperspectral imaging to perform a comprehensive assessment of fluorescence spectral signatures of various tissues. Immediately following tissue harvest, a custom inverted microscope (TE-2000, Nikon Instruments) with Xe arc lamp and thin-film tunable filter array (VersaChrome, Semrock, Inc.) was used to acquire hyperspectral image data from each sample. Scans utilized excitation wavelengths from 340 nm to 550 nm in 5 nm increments. Hyperspectral images were analyzed with custom Matlab scripts including linear spectral unmixing (LSU), principal component analysis (PCA), and Gaussian mixture modeling (GMM). Spectra were examined for potential characteristic features such as consistent intensity peaks at specific wavelengths or intensity ratios among significant wavelengths. The resultant spectral features were conserved among tissues of similar molecular composition. Additionally, excitation spectra appear to be a mixture of pure endmembers with commonalities across tissues of varied molecular composition, potentially identifiable through GMM. These results suggest the presence of common autofluorescent molecules in most tissues and that excitation-scanning hyperspectral imaging may serve as an approach for characterizing tissue composition as well as pathologic state. Future work will test the feasibility of excitation-scanning hyperspectral imaging as a contrast mode for discriminating normal and pathological tissues.
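    As a small illustration of the linear spectral unmixing step (not the study's scripts), the MATLAB sketch below decomposes one pixel's excitation spectrum into non-negative abundances of a few synthetic endmembers using lsqnonneg.

      % Linear spectral unmixing of one pixel spectrum into non-negative
      % endmember abundances (synthetic endmembers; illustrative only).
      wl  = (340:5:550)';                          % excitation wavelengths [nm]
      E1  = exp(-((wl - 380)/30).^2);              % hypothetical endmember spectra
      E2  = exp(-((wl - 450)/40).^2);
      E3  = exp(-((wl - 510)/25).^2);
      E   = [E1 E2 E3];                            % endmember matrix (bands x endmembers)
      px  = 0.6*E1 + 0.3*E2 + 0.1*E3 + 0.01*randn(size(wl));   % measured pixel spectrum
      ab  = lsqnonneg(E, px);                      % non-negative abundance estimates
      fprintf('Estimated abundances: %.2f  %.2f  %.2f\n', ab);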

  5. A MATLAB-Aided Method for Teaching Calculus-Based Business Mathematics

    ERIC Educational Resources Information Center

    Liang, Jiajuan; Pan, William S. Y.

    2009-01-01

    MATLAB is a powerful package for numerical computation. MATLAB contains a rich pool of mathematical functions and provides flexible plotting functions for illustrating mathematical solutions. The course of calculus-based business mathematics consists of two major topics: 1) derivative and its applications in business; and 2) integration and its…

  6. Using Matlab in a Multivariable Calculus Course.

    ERIC Educational Resources Information Center

    Schlatter, Mark D.

    The benefits of high-level mathematics packages such as Matlab include both a computer algebra system and the ability to provide students with concrete visual examples. This paper discusses how both capabilities of Matlab were used in a multivariate calculus class. Graphical user interfaces which display three-dimensional surfaces, contour plots,…

  7. Sparse Matrices in MATLAB: Design and Implementation

    NASA Technical Reports Server (NTRS)

    Gilbert, John R.; Moler, Cleve; Schreiber, Robert

    1992-01-01

    The matrix computation language and environment MATLAB is extended to include sparse matrix storage and operations. The only change to the outward appearance of the MATLAB language is a pair of commands to create full or sparse matrices. Nearly all the operations of MATLAB now apply equally to full or sparse matrices, without any explicit action by the user. The sparse data structure represents a matrix in space proportional to the number of nonzero entries, and most of the operations compute sparse results in time proportional to the number of arithmetic operations on nonzeros.
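    The pair of conversion commands mentioned above are the familiar sparse and full functions; a tiny example of standard usage (not code from the paper) is:

      % Create a sparse matrix, convert it, and use it in ordinary operations.
      i = [1 2 4];  j = [1 3 4];  v = [5 7 9];
      S = sparse(i, j, v, 4, 4) + speye(4);   % 4x4 sparse matrix (speye keeps it nonsingular)
      A = full(S);                            % convert back to a dense matrix
      x = S \ ones(4, 1);                     % most operations accept sparse inputs directly
      fprintf('nnz(S) = %d, issparse(S) = %d\n', nnz(S), issparse(S));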

  8. Mathematical Formulation used by MATLAB Code to Convert FTIR Interferograms to Calibrated Spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armstrong, Derek Elswick

    This report discusses the mathematical procedures used to convert raw interferograms from Fourier transform infrared (FTIR) sensors to calibrated spectra. The work discussed in this report was completed as part of the Helios project at Los Alamos National Laboratory. MATLAB code was developed to convert the raw interferograms to calibrated spectra. The report summarizes the developed MATLAB scripts and functions, along with a description of the mathematical methods used by the code. The first step in working with raw interferograms is to convert them to uncalibrated spectra by applying an apodization function to the raw data and then by performing a Fourier transform. The developed MATLAB code also addresses phase error correction by applying the Mertz method. This report provides documentation for the MATLAB scripts.
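    A heavily simplified MATLAB sketch of that chain (apodization, Fourier transform, and a Mertz-style phase correction estimated from a short double-sided segment around the zero-path-difference point) is shown below on synthetic data; it is not the Helios code and glosses over details such as ZPD location and phase interpolation.

      % Apodize a synthetic interferogram, transform it, and apply a simplified
      % Mertz-style phase correction (illustrative only).
      N    = 4096;  x = (0:N-1)';
      ifg  = 0.8*cos(2*pi*0.07*x + 0.2) + 0.4*cos(2*pi*0.19*x + 0.2);  % synthetic record
      apod = 1 - x/N;                            % triangular apodization
      S    = fft(ifg .* apod, 2*N);              % zero-padded transform of the full record
      nd   = 256;                                % short double-sided region near ZPD
      phi  = angle(fft(ifg(1:nd) .* (1 - (0:nd-1)'/nd), 2*N));  % low-resolution phase
      Scorr = real(S .* exp(-1i*phi));           % multiplicative phase correction
      plot(Scorr(1:N)); xlabel('Spectral bin'); ylabel('Intensity (arb. units)');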

  9. Quanty4RIXS: a program for crystal field multiplet calculations of RIXS and RIXS-MCD spectra using Quanty.

    PubMed

    Zimmermann, Patric; Green, Robert J; Haverkort, Maurits W; de Groot, Frank M F

    2018-05-01

    Some initial instructions for the Quanty4RIXS program written in MATLAB® are provided. The program assists in the calculation of 1s 2p RIXS and 1s 2p RIXS-MCD spectra using Quanty. Furthermore, 1s XAS and 2p 3d RIXS calculations in different symmetries can also be performed. It includes the Hartree-Fock values for the Slater integrals and spin-orbit interactions for several 3d transition metal ions that are required to create the .lua scripts containing all necessary parameters and quantum mechanical definitions for the calculations. The program can be used free of charge and is designed to allow for further adjustments of the scripts. Open access.

  10. TEP Power Partners Project [Tucson Electric Power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    2014-02-06

    The Arizona Governor’s Office of Energy Policy, in partnership with Tucson Electric Power (TEP), Tendril, and Next Phase Energy (NPE), formed the TEP Power Partners pilot project to demonstrate how residential customers could access their energy usage data and third party applications using data obtained from an Automatic Meter Reading (AMR) network. The project applied for and was awarded a Smart Grid Data Access grant through the U.S. Department of Energy. The project participants’ goal for Phase I is to actively engage 1,700 residential customers to demonstrate sustained participation, reduction in energy usage (kWh) and cost ($), and measure related aspects of customer satisfaction. This Demonstration report presents a summary of the findings, effectiveness, and customer satisfaction with the 15-month TEP Power Partners pilot project. The objective of the program is to provide residential customers with energy consumption data from AMR metering and empower these participants to better manage their electricity use. The pilot recruitment goals included migrating 700 existing customers from the completed Power Partners Demand Response Load Control Project (DRLC), and enrolling 1,000 new participants. Upon conclusion of the project on November 19, 2013: 1,390 Home Area Networks (HANs) were registered; 797 new participants installed a HAN; survey respondents are satisfied with the program and found value in a variety of specific program components; survey respondents report feeling greater control over their energy usage and report taking energy savings actions in their homes after participating in the program; on average, 43% of the participants returned to the web portal monthly and 15% returned weekly; and an impact evaluation was completed by Opinion Dynamics and found average participant savings for the treatment period to be 2.3% of their household use during this period. In total, the program saved 163 MWh in the treatment period of 2013.

  11. 12 CFR Appendix J to Part 571 - Interagency Guidelines on Identity Theft Detection, Prevention, and Mitigation

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... foreseeable risks to customers or to the safety and soundness of the financial institution or creditor from...) Notice from customers, victims of identity theft, law enforcement authorities, or other persons regarding... the Customer Identification Program rules implementing 31 U.S.C. 5318(l) (31 CFR 103.121); and (b...

  12. 12 CFR Appendix J to Part 334 - Interagency Guidelines on Identity Theft Detection, Prevention, and Mitigation

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... control reasonably foreseeable risks to customers or to the safety and soundness of the financial...; and (5) Notice from customers, victims of identity theft, law enforcement authorities, or other... verification set forth in the Customer Identification Program rules implementing 31 U.S.C. 5318(l)(31 CFR 103...

  13. No More "Magic Aprons": Longitudinal Assessment and Continuous Improvement of Customer Service at the University of North Dakota Libraries

    ERIC Educational Resources Information Center

    Clark, Karlene T.; Walker, Stephanie R.

    2017-01-01

    The University of North Dakota (UND) Libraries have developed a multi-award winning Customer Service Program (CSP) involving longitudinal assessment and continuous improvement. The CSP consists of iterative training modules; constant reinforcement of Customer Service Principles with multiple communication strategies and tools, and incentives that…

  14. 20 CFR 666.240 - Under what circumstances may a sanction be applied to a State that fails to achieve negotiated...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... agreed to under § 666.120 for core indicators of performance or customer satisfaction indicators for the... meet the negotiated levels of performance for core indicators of performance or customer satisfaction... indicators of performance and customer satisfaction indicators for that program. (WIA sec. 136(g).) (d) Only...

  15. 10 CFR 905.15 - What are the requirements for the small customer plan alternative?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false What are the requirements for the small customer plan alternative? 905.15 Section 905.15 Energy DEPARTMENT OF ENERGY ENERGY PLANNING AND MANAGEMENT PROGRAM Integrated Resource Planning § 905.15 What are the requirements for the small customer plan alternative? (a...

  16. 10 CFR 905.15 - What are the requirements for the small customer plan alternative?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false What are the requirements for the small customer plan alternative? 905.15 Section 905.15 Energy DEPARTMENT OF ENERGY ENERGY PLANNING AND MANAGEMENT PROGRAM Integrated Resource Planning § 905.15 What are the requirements for the small customer plan alternative? (a...

  17. 10 CFR 905.15 - What are the requirements for the small customer plan alternative?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false What are the requirements for the small customer plan alternative? 905.15 Section 905.15 Energy DEPARTMENT OF ENERGY ENERGY PLANNING AND MANAGEMENT PROGRAM Integrated Resource Planning § 905.15 What are the requirements for the small customer plan alternative? (a...

  18. A Typology Framework of Loyalty Reward Programs

    NASA Astrophysics Data System (ADS)

    Cao, Yuheng; Nsakanda, Aaron Luntala; Mann, Inder Jit Singh

    Loyalty reward programs (LRPs), initially developed as marketing programs to enhance customer retention, have now become an important part of customer-focused business strategy. With the proliferation and increasing economic impact of these programs, their management complexity has also increased. However, despite the widespread adoption of LRPs in business, academic research in the field seems to lag behind their practical application. Even fundamental questions such as what LRPs are and how to classify them have not yet been fully addressed. In this paper, a comprehensive framework for LRP classification is proposed, which provides a foundation for further study of LRP design and planning issues.

  19. Engaging community businesses in human immunodeficiency virus prevention: a feasibility study.

    PubMed

    Rovniak, Liza S; Hovell, Melbourne F; Hofstetter, C Richard; Blumberg, Elaine J; Sipan, Carol L; Batista, Marcia F; Martinez-Donate, Ana P; Mulvihill, Mary M; Ayala, Guadalupe X

    2010-01-01

    To explore the feasibility of engaging community businesses in human immunodeficiency virus (HIV) prevention. Randomly selected business owners/managers were asked to display discreetly wrapped condoms and brochures, both of which were provided free-of-charge for 3 months. Assessments were conducted at baseline, mid-program, and post-program. Customer feedback was obtained through an online survey. Participants were selected from a San Diego, California neighborhood with a high rate of acquired immune deficiency syndrome. Fifty-one business owners/managers who represented 10 retail categories, and 52 customers. Participation rates, descriptive characteristics, number of condoms and brochures distributed, customer feedback, business owners'/managers' program satisfaction, and business owners'/managers' willingness to provide future support for HIV prevention were measured. Kruskal-Wallis, Mann-Whitney U, Fisher's exact, and McNemar's tests were used to analyze data. The 20 business owners/managers (39%) who agreed to distribute condoms and brochures reported fewer years in business and more employees than those who agreed only to distribute brochures (20%) or who refused to participate (41%; p < .05). Bars were the easiest of ten retail categories to recruit. Businesses with more employees and customers distributed more condoms and brochures (p < .05). More than 90% of customers supported distributing condoms and brochures in businesses, and 96% of business owners/managers described their program experience as positive. Businesses are willing to distribute condoms and brochures to prevent HIV. Policies to increase business participation in HIV prevention should be developed and tested.

  20. High-Fidelity Real-Time Trajectory Optimization for Reusable Launch Vehicles

    DTIC Science & Technology

    2006-12-01

    Excerpt from the report’s list of figures: Figure 6.20 Max DR Yawing Moment History; Figure 6.21 Snapshot from MATLAB “Profile...; ...Propagation using “ode45” (Euler Angles); Figure 6.114 Interpolated Elevon Controls using Various MATLAB ...Schemes; Figure 6.115 Interpolated Flap Controls using Various MATLAB Schemes; Figure 6.116 Interpolated...

  1. A Matlab/Simulink-Based Interactive Module for Servo Systems Learning

    ERIC Educational Resources Information Center

    Aliane, N.

    2010-01-01

    This paper presents an interactive module for learning both the fundamental and practical issues of servo systems. This module, developed using Simulink in conjunction with the Matlab graphical user interface (Matlab-GUI) tool, is used to supplement conventional lectures in control engineering and robotics subjects. First, the paper introduces the…
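    As a concrete illustration of the kind of model such a learning module exercises, the short MATLAB sketch below simulates a unity-feedback position servo under proportional control; it assumes the Control System Toolbox and made-up plant and gain values, and is not the module's own code.

      % Hypothetical second-order servo: DC motor plus integrator under P control.
      K = 5;                        % proportional gain (assumed value)
      G = tf(1, [0.5 1 0]);         % plant 1/(s(0.5s+1)), time constant assumed
      T = feedback(K*G, 1);         % unity-feedback closed loop
      step(T); grid on;             % closed-loop step response
      stepinfo(T)                   % rise time, overshoot, settling time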

  2. Customer Credit. Unit 19. Level 1. Instructor Guide. PACE: Program for Acquiring Competence in Entrepreneurship. Third Edition. Research & Development Series No. 301-19.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This instructor guide for a unit on customer credit in the PACE (Program for Acquiring Competence in Entrepreneurship) curriculum includes the full text of the student module and lesson plans, instructional suggestions, and other teacher resources. The competencies that are incorporated into this module are at Level 1 of learning--understanding…

  3. Design and Testing of an Air Force Services Mystery Shopping Program.

    DTIC Science & Technology

    1998-11-01

    Base-level Air Force Services’ lodging and foodservice activities use limited service quality measurement tools to determine customer perceptions of... service quality. These tools, specifically management observation and customer comment cards, do not provide a complete picture of service quality. Other... service quality measurement methods such as mystery shopping are rarely used. Bases do not consider using mystery shopping programs because of the

  4. 31 CFR 1020.220 - Customer identification programs for banks, savings associations, credit unions, and certain non...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... written Customer Identification Program (CIP) appropriate for its size and type of business that, at a.... 5318(h), 12 U.S.C. 1818(s), or 12 U.S.C. 1786(q)(1), then the CIP must be a part of the anti-money... directors. (2) Identity verification procedures. The CIP must include risk-based procedures for verifying...

  5. 31 CFR 1020.220 - Customer identification programs for banks, savings associations, credit unions, and certain non...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... written Customer Identification Program (CIP) appropriate for its size and type of business that, at a.... 5318(h), 12 U.S.C. 1818(s), or 12 U.S.C. 1786(q)(1), then the CIP must be a part of the anti-money... directors. (2) Identity verification procedures. The CIP must include risk-based procedures for verifying...

  6. 76 FR 14793 - Procedures for Monitoring Bank Secrecy Act Compliance and Fair Credit Reporting: Technical...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-18

    ... regulations in 12 CFR 326.8, and specific cross- references to the Customer Identification Program (``CIP''), 31 CFR 103.121, in 12 CFR 326.8, 12 CFR 334.82, and Appendix J to Part 334. The CIP regulation, which... of the Customer Identification Program (CIP) rules implementing 31 U.S.C. 5318(l) (31 CFR 1020.220...

  7. 31 CFR 1020.220 - Customer identification programs for banks, savings associations, credit unions, and certain non...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... written Customer Identification Program (CIP) appropriate for its size and type of business that, at a.... 5318(h), 12 U.S.C. 1818(s), or 12 U.S.C. 1786(q)(1), then the CIP must be a part of the anti-money... directors. (2) Identity verification procedures. The CIP must include risk-based procedures for verifying...

  8. 31 CFR 1020.220 - Customer identification programs for banks, savings associations, credit unions, and certain non...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... written Customer Identification Program (CIP) appropriate for its size and type of business that, at a.... 5318(h), 12 U.S.C. 1818(s), or 12 U.S.C. 1786(q)(1), then the CIP must be a part of the anti-money... directors. (2) Identity verification procedures. The CIP must include risk-based procedures for verifying...

  9. Optimisation of groundwater level monitoring networks using geostatistical modelling based on the Spartan family variogram and a genetic algorithm method

    NASA Astrophysics Data System (ADS)

    Parasyris, Antonios E.; Spanoudaki, Katerina; Kampanis, Nikolaos A.

    2016-04-01

    Groundwater level monitoring networks provide essential information for water resources management, especially in areas with significant groundwater exploitation for agricultural and domestic use. Given the high maintenance costs of these networks, development of tools which regulators can use for efficient network design is essential. In this work, a monitoring network optimisation tool is presented. The tool couples geostatistical modelling based on the Spartan family variogram with a genetic algorithm method and is applied to the Mires basin in Crete, Greece, an area of high socioeconomic and agricultural interest, which suffers from groundwater overexploitation leading to a dramatic decrease of groundwater levels. The purpose of the optimisation tool is to determine which wells to exclude from the monitoring network because they add little or no beneficial information to groundwater level mapping of the area. Unlike previous relevant investigations, the network optimisation tool presented here uses Ordinary Kriging with the recently established non-differentiable Spartan variogram for groundwater level mapping, which, based on a previous geostatistical study in the area, leads to optimal groundwater level mapping. Seventy boreholes operate in the area for groundwater abstraction and water level monitoring. The Spartan variogram gives overall the most accurate groundwater level estimates, followed closely by the power-law model. The geostatistical model is coupled to an integer genetic algorithm programmed in MATLAB 2015a. The algorithm finds the set of wells whose removal leads to the minimum error between the original water level map produced using all the available wells and the map produced using the reduced well network (error is defined as the 2-norm of the difference between the original mapping matrix with 70 wells and the mapping matrix of the reduced well network). The solution to the optimisation problem (the best wells to retain in the monitoring network) depends on the total number of wells removed; this number is a management decision. The water level monitoring network of the Mires basin has been optimised six times, by removing 5, 8, 12, 15, 20 and 25 wells from the original network. In order to reach the optimum solution in the minimum possible computational time, a stall-generations stopping criterion was set for each optimisation scenario. An improvement over the classic genetic algorithm was to vary the mutation and crossover fractions according to the change in the mean fitness value: reproduction becomes more random when the solution is converging, to avoid local minima, and more educated (a higher crossover ratio) when the mean fitness value is still changing substantially. Because the integer genetic algorithm in MATLAB 2015a restricts the use of user-supplied selection and crossover-mutation functions, custom population, crossover-mutation and selection functions were created to set the initial population type to custom and to allow the mutation and crossover probabilities to change with the convergence of the algorithm, achieving higher accuracy. The application of the network optimisation tool to the Mires basin indicates that 25 wells can be removed with a relatively small deterioration of the groundwater level map. The results indicate the robustness of the network optimisation tool: wells were removed from high well-density areas while the spatial pattern of the original groundwater level map was preserved. Reference: Varouchakis, E. A. and D. T. Hristopulos (2013), "Improvement of groundwater level prediction in sparsely gauged basins using physical laws and local geographic features as auxiliary variables," Advances in Water Resources 52: 34-49.
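    To make the optimisation loop concrete, the sketch below reproduces the overall pattern (binary well-selection variables, a map-difference fitness, an integer genetic algorithm) with synthetic data; griddata stands in for the Ordinary Kriging/Spartan-variogram mapping, the penalty weight and GA options are assumptions, and nothing here is the authors' code.

      % Sketch: choose which wells to drop so the reduced-network map stays close
      % to the full-network map (2-norm of the difference), using MATLAB's integer ga.
      % (Save as a script with a local function, R2016b+, or put mapError in its own file.)
      rng(1);
      nWells  = 70;                                    % wells in the original network
      wellXY  = rand(nWells,2)*10;                     % synthetic well coordinates (km)
      levels  = 100 - 2*wellXY(:,1) + randn(nWells,1); % synthetic water levels (m)
      [gx,gy] = meshgrid(linspace(0,10,50));           % mapping grid
      Zfull   = griddata(wellXY(:,1), wellXY(:,2), levels, gx, gy, 'natural');

      nRemove   = 25;                                  % management decision
      keepCount = nWells - nRemove;

      % Fitness = map error + penalty enforcing the target number of retained wells.
      fitness = @(x) mapError(x, wellXY, levels, gx, gy, Zfull) ...
                     + 1e3*abs(sum(x) - keepCount);

      opts  = optimoptions('ga', 'PopulationSize', 100, 'MaxStallGenerations', 30);
      xBest = ga(fitness, nWells, [], [], [], [], zeros(1,nWells), ones(1,nWells), ...
                 [], 1:nWells, opts);                  % binary decision per well
      keptWells = find(xBest > 0.5);

      function e = mapError(x, wellXY, levels, gx, gy, Zfull)
          keep = x(:) > 0.5;
          if nnz(keep) < 4, e = Inf; return; end       % too few points to interpolate
          Zred = griddata(wellXY(keep,1), wellXY(keep,2), levels(keep), gx, gy, 'natural');
          d = Zfull - Zred;
          e = norm(d(~isnan(d)));                      % 2-norm over common grid cells
      end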

  10. Image Analysis Using Quantum Entropy Scale Space and Diffusion Concepts

    DTIC Science & Technology

    2009-11-01

    images using a combination of analytic methods and prototype Matlab and Mathematica programs. We investigated concepts of generalized entropy and...Schmidt strength from quantum logic gate decomposition. This form of entropy gives a measure of the nonlocal content of an entangling logic gate... We recall that the Schmidt number is an indicator of entanglement, but not a measure of entanglement. For instance, let us compare

  11. [Design of hand-held heart rate variability acquisition and analysis system].

    PubMed

    Li, Kaiyuan; Wang, Buqing; Wang, Weidong

    2012-07-01

    A design for a handheld heart rate variability acquisition and analysis system is proposed. The system collects and stores the patient's ECG every five minutes through electrodes touched by both hands, and then uploads the data to a PC through a USB port. The system uses software written in LabVIEW to analyze heart rate variability parameters; the parameter-calculation functions are programmed in MATLAB and generated as components.
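    The abstract does not list which parameters are computed; as an illustration only, the MATLAB fragment below computes three standard time-domain HRV measures (SDNN, RMSSD, pNN50) from a made-up vector of RR intervals.

      % Assumed input: rrMs, RR intervals in milliseconds from a 5-minute ECG segment.
      rrMs  = [812 798 805 820 790 801 815 808];   % example values only
      dRR   = diff(rrMs);
      sdnn  = std(rrMs);                           % SDNN: overall variability (ms)
      rmssd = sqrt(mean(dRR.^2));                  % RMSSD: short-term variability (ms)
      pnn50 = 100 * mean(abs(dRR) > 50);           % pNN50: % of successive diffs > 50 ms
      hr    = 60000 / mean(rrMs);                  % mean heart rate (beats/min)
      fprintf('SDNN %.1f ms, RMSSD %.1f ms, pNN50 %.1f%%, HR %.1f bpm\n', ...
              sdnn, rmssd, pnn50, hr);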

  12. ISMRM Raw data format: A proposed standard for MRI raw datasets.

    PubMed

    Inati, Souheil J; Naegele, Joseph D; Zwart, Nicholas R; Roopchansingh, Vinai; Lizak, Martin J; Hansen, David C; Liu, Chia-Ying; Atkinson, David; Kellman, Peter; Kozerke, Sebastian; Xue, Hui; Campbell-Washburn, Adrienne E; Sørensen, Thomas S; Hansen, Michael S

    2017-01-01

    This work proposes the ISMRM Raw Data format as a common MR raw data format, which promotes algorithm and data sharing. A file format consisting of a flexible header and tagged frames of k-space data was designed. Application Programming Interfaces were implemented in C/C++, MATLAB, and Python. Converters for Bruker, General Electric, Philips, and Siemens proprietary file formats were implemented in C++. Raw data were collected using magnetic resonance imaging scanners from four vendors, converted to ISMRM Raw Data format, and reconstructed using software implemented in three programming languages (C++, MATLAB, Python). Images were obtained by reconstructing the raw data from all vendors. The source code, raw data, and images comprising this work are shared online, serving as an example of an image reconstruction project following a paradigm of reproducible research. The proposed raw data format solves a practical problem for the magnetic resonance imaging community. It may serve as a foundation for reproducible research and collaborations. The ISMRM Raw Data format is a completely open and community-driven format, and the scientific community, including commercial vendors, is invited to participate either as users or developers. Magn Reson Med 77:411-421, 2017. © 2016 Wiley Periodicals, Inc.
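    Because the ISMRM Raw Data format is an HDF5 container, a file can at least be inspected with MATLAB's standard HDF5 functions before turning to the format's own API; the sketch below assumes a file named acquisition.h5 and that the default group layout is in use.

      % Inspect an ISMRM Raw Data (HDF5) file with built-in MATLAB HDF5 tools.
      fname = 'acquisition.h5';                % assumed file name
      info  = h5info(fname);                   % enumerate groups and datasets
      h5disp(fname, '/', 'min');               % compact listing of the file layout
      disp({info.Groups.Name});                % typically a '/dataset' group holding
                                               % the XML header and tagged k-space data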

  13. Monitoring and Acquisition Real-time System (MARS)

    NASA Technical Reports Server (NTRS)

    Holland, Corbin

    2013-01-01

    MARS is a graphical user interface (GUI) written in MATLAB and Java, allowing the user to configure and control the Scalable Parallel Architecture for Real-Time Acquisition and Analysis (SPARTAA) data acquisition system. SPARTAA not only acquires data, but also allows for complex algorithms to be applied to the acquired data in real time. The MARS client allows the user to set up and configure all settings regarding the data channels attached to the system, as well as have complete control over starting and stopping data acquisition. It provides a unique "Test" programming environment, allowing the user to create tests consisting of a series of alarms, each of which contains any number of data channels. Each alarm is configured with a particular algorithm, determining the type of processing that will be applied on each data channel and tested against a defined threshold. Tests can be uploaded to SPARTAA, thereby teaching it how to process the data. The uniqueness of MARS is in its capability to be adaptable easily to many test configurations. MARS sends and receives protocols via TCP/IP, which allows for quick integration into almost any test environment. The use of MATLAB and Java as the programming languages allows for developers to integrate the software across multiple operating platforms.
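    The following fragment is not MARS code; it merely illustrates, with assumed names, the alarm concept described above: a processing algorithm applied to a data channel and compared against a threshold.

      % Hypothetical alarm definition and check against one acquired data channel.
      alarm.name      = 'vibration-rms';            % assumed alarm name
      alarm.algorithm = @(x) sqrt(mean(x.^2));      % processing: RMS of the block
      alarm.threshold = 2.5;                        % assumed threshold

      channelData = 1.2*randn(1, 1000);             % stand-in for acquired samples
      value   = alarm.algorithm(channelData);
      tripped = value > alarm.threshold;
      fprintf('%s: value %.2f, threshold %.2f, tripped = %d\n', ...
              alarm.name, value, alarm.threshold, tripped);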

  14. The pointillism method for creating stimuli suitable for use in computer-based visual contrast sensitivity testing.

    PubMed

    Turner, Travis H

    2005-03-30

    An increasingly large corpus of clinical and experimental neuropsychological research has demonstrated the utility of measuring visual contrast sensitivity. Unfortunately, existing means of measuring contrast sensitivity can be prohibitively expensive, difficult to standardize, or lack reliability. Additionally, most existing tests do not allow full control over important characteristics, such as off-angle rotations, waveform, contrast, and spatial frequency. Ideally, researchers could manipulate characteristics and display stimuli in a computerized task designed to meet experimental needs. Thus far, the 256-level (8-bit) color limitation of standard cathode ray tube (CRT) monitors has been preclusive. To this end, the pointillism method (PM) was developed. Using MATLAB software, stimuli are created based on both mathematical and stochastic components, such that differences in regional luminance values of the gradient field closely approximate the desired contrast. This paper describes the method and examines its performance on sine- and square-wave image sets across a range of contrast values. Results suggest the utility of the method for most experimental applications. Weaknesses in the current version, the need for validation and reliability studies, and considerations regarding applications are discussed. Syntax for the program is provided in an appendix, and a version of the program independent of MATLAB is available from the author.
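    The idea can be illustrated in a few lines of MATLAB (this is not Turner's program, whose syntax is given in the paper's appendix): a low-contrast sine-wave grating is rendered with only two pixel values, with the probability of a light pixel proportional to the desired luminance, so regional mean luminance approximates contrast steps finer than an 8-bit display provides.

      % Stochastic two-level rendering of a low-contrast sine-wave grating.
      imgSize        = 512;          % image size in pixels
      cyclesPerImage = 8;            % spatial frequency (cycles per image)
      contrast       = 0.02;         % target Michelson contrast

      x       = meshgrid(1:imgSize);                                   % horizontal ramp
      grating = 0.5 + 0.5*contrast*sin(2*pi*cyclesPerImage*x/imgSize); % luminance 0..1
      stim    = rand(imgSize) < grating;            % pointillist binary stimulus
      imshow(stim);                  % local averages reproduce the target grating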

  15. Acoustics Research of Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Gao, Ximing; Houston, Janice

    2014-01-01

    The liftoff phase induces high acoustic loading over a broad frequency range for a launch vehicle. These external acoustic environments are used in the prediction of the internal vibration responses of the vehicle and components. Present liftoff vehicle acoustic environment prediction methods utilize stationary data from previously conducted hold-down tests to generate 1/3 octave band Sound Pressure Level (SPL) spectra. In an effort to improve the accuracy and quality of liftoff acoustic loading predictions, non-stationary flight data from the Ares I-X were processed in PC-Signal in two flight phases: simulated hold-down and liftoff. In conjunction, the Prediction of Acoustic Vehicle Environments (PAVE) program was developed in MATLAB to allow for efficient prediction of SPLs as a function of station number along the vehicle using semi-empirical methods. This consisted of generating the Dimensionless Spectrum Function (DSF) and Dimensionless Source Location (DSL) curves from the Ares I-X flight data, which are then used in the MATLAB program to generate the 1/3 octave band SPL spectra. Concluding results show major differences in SPLs between the hold-down test data and the processed Ares I-X flight data, making the Ares I-X flight data more practical for future vehicle acoustic environment predictions.
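    The DSF and DSL curves themselves are specific to the PAVE program, but the final step it describes, reporting levels as 1/3-octave band SPL spectra, follows the standard band summation sketched below with an assumed narrowband pressure spectrum.

      % Collapse an assumed narrowband pressure spectrum into 1/3-octave band SPLs.
      pref = 20e-6;                           % reference pressure (Pa)
      f    = (1:2000)';                       % narrowband frequencies (Hz), assumed
      psd  = 1e-4 ./ (1 + (f/200).^2);        % stand-in pressure PSD (Pa^2/Hz)
      df   = 1;                               % narrowband resolution (Hz)

      fc  = 1000 * 2.^((-17:10)/3);           % 1/3-octave centers, ~20 Hz to 10 kHz
      flo = fc * 2^(-1/6);  fhi = fc * 2^(1/6);

      spl = zeros(size(fc));
      for k = 1:numel(fc)
          inBand = f >= flo(k) & f < fhi(k);
          p2     = sum(psd(inBand)) * df;     % mean-square pressure in the band (Pa^2)
          spl(k) = 10*log10(max(p2, eps) / pref^2);
      end
      semilogx(fc, spl, '-o'); xlabel('1/3-octave center frequency (Hz)'); ylabel('SPL (dB)');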

  16. Increasing the computational efficiency of digital cross correlation by a vectorization method

    NASA Astrophysics Data System (ADS)

    Chang, Ching-Yuan; Ma, Chien-Ching

    2017-08-01

    This study presents a vectorization method for use in MATLAB programming, aimed at increasing the computational efficiency of digital cross correlation in sound and images and resulting in speedups of 6.387 and 36.044 times compared with the performance of the looped expressions. This work bridges the gap between matrix operations and loop iteration, preserving flexibility and efficiency in program testing. This paper uses numerical simulation to verify the speedup of the proposed vectorization method, as well as experiments to measure the quantitative transient displacement response subjected to dynamic impact loading. The experiment involved the use of a high-speed camera as well as a fiber-optic system to measure the transient displacement in a cantilever beam under impact from a steel ball. Experimental measurement data obtained from the two methods are in excellent agreement in both the time and frequency domains, with a discrepancy of only 0.68%. Numerical and experimental results demonstrate the efficacy of the proposed vectorization method with regard to computational speed in signal processing and high precision in the correlation algorithm. We also present the source code with which to build MATLAB-executable functions on Windows as well as Linux platforms, and provide a series of examples to demonstrate the application of the proposed vectorization method.
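    The abstract's source code is distributed by the authors; the fragment below only illustrates the general looped-versus-vectorized pattern for a finite-lag cross correlation, with arbitrary test signals and lag range.

      % Looped versus vectorized cross correlation over lags 0..maxLag.
      rng(0);
      x = randn(1, 5000);  y = randn(1, 5000);
      maxLag = 200;
      M = numel(x) - maxLag;                    % common summation length

      tic;                                      % looped form
      cLoop = zeros(1, maxLag+1);
      for lag = 0:maxLag
          cLoop(lag+1) = sum(x(1:M) .* y(1+lag:M+lag));
      end
      tLoop = toc;

      tic;                                      % vectorized form (implicit expansion, R2016b+)
      idx  = (1:M)' + (0:maxLag);               % M-by-(maxLag+1) lagged index matrix
      cVec = (y(idx)' * x(1:M)')';              % one matrix-vector product
      tVec = toc;

      fprintf('max |difference| = %.3g, loop %.4f s, vectorized %.4f s\n', ...
              max(abs(cLoop - cVec)), tLoop, tVec);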

  17. SPSS and SAS programs for generalizability theory analyses.

    PubMed

    Mushquash, Christopher; O'Connor, Brian P

    2006-08-01

    The identification and reduction of measurement errors is a major challenge in psychological testing. Most investigators rely solely on classical test theory for assessing reliability, whereas most experts have long recommended using generalizability theory instead. One reason for the common neglect of generalizability theory is the absence of analytic facilities for this purpose in popular statistical software packages. This article provides a brief introduction to generalizability theory, describes easy to use SPSS, SAS, and MATLAB programs for conducting the recommended analyses, and provides an illustrative example, using data (N = 329) for the Rosenberg Self-Esteem Scale. Program output includes variance components, relative and absolute errors and generalizability coefficients, coefficients for D studies, and graphs of D study results.
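    For readers without access to those programs, the minimal MATLAB sketch below estimates the variance components and generalizability coefficient for a one-facet (persons x items) design using the usual random-effects ANOVA estimators; the score matrix is invented for illustration.

      % One-facet (p x i) G-study: variance components and E(rho^2).
      X = [4 5 3 4; 3 4 3 3; 5 5 4 5; 2 3 2 2; 4 4 3 4];   % persons-by-items scores (example)
      [np, ni] = size(X);
      grand = mean(X(:));

      SSp  = ni * sum((mean(X,2) - grand).^2);              % persons sum of squares
      SSi  = np * sum((mean(X,1) - grand).^2);              % items sum of squares
      SSpi = sum((X(:) - grand).^2) - SSp - SSi;            % residual (p x i, e)

      MSp  = SSp/(np-1);  MSi = SSi/(ni-1);  MSpi = SSpi/((np-1)*(ni-1));

      varP  = max((MSp - MSpi)/ni, 0);                      % person variance component
      varI  = max((MSi - MSpi)/np, 0);                      % item variance component
      varPI = MSpi;                                         % residual variance component

      relErr = varPI/ni;                                    % relative error (D study, ni items)
      Erho2  = varP / (varP + relErr);                      % generalizability coefficient
      fprintf('var_p %.3f, var_i %.3f, var_pi,e %.3f, Erho2 %.3f\n', varP, varI, varPI, Erho2);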

  18. InterFace: A software package for face image warping, averaging, and principal components analysis.

    PubMed

    Kramer, Robin S S; Jenkins, Rob; Burton, A Mike

    2017-12-01

    We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
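    The package itself requires no programming, but the "face space" idea it implements can be sketched directly in MATLAB; the fragment below assumes a matrix of vectorized grayscale face images (random numbers here stand in for real images) and is not the InterFace code.

      % PCA "face space": centre the images, take the SVD, project and reconstruct.
      faces    = rand(64*64, 20);                 % stand-in: nPixels-by-nImages matrix
      meanFace = mean(faces, 2);
      Xc       = faces - meanFace;                % centred data (implicit expansion)

      [U, ~, ~] = svd(Xc, 'econ');                % columns of U span the face space
      coeffs    = U' * Xc;                        % coordinates of each face in that space

      k     = 10;                                 % reconstruct face 1 from 10 components
      recon = meanFace + U(:,1:k) * coeffs(1:k, 1);
      imshow(reshape(recon, 64, 64), []);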

  19. U.S. Army Corps of Engineers: Building Overhead Costs into Projects and Customers’ Views on Information Provided

    DTIC Science & Technology

    2013-06-01

    U.S. Army Corps of Engineers: Building Overhead Costs into Projects and Customers' Views on Information Provided. Excerpt: Why GAO Did This Study: The Corps spends billions of dollars annually on projects in its Civil Works

  20. Department of Defense’s Need to Become a Responsible Commercial Customer

    DTIC Science & Technology

    2002-01-01

    Excerpts: Of the military services, the Navy has been the sole one... leased. For DoD customers, this means that finding the commercial SATCOM bandwidth they require, as and when they require it, can be difficult
