14 CFR 1215.108 - Defining user service requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space ... Defining user service requirements. 1215.108 Section 1215.108 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION TRACKING AND DATA ..., spacecraft design, operations planning, and other significant mission parameters. When these user evaluations ...
14 CFR 1215.108 - Defining user service requirements.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space ... Defining user service requirements. 1215.108 Section 1215.108 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION TRACKING AND ... services, spacecraft design, operations planning, and other significant mission parameters. When these user ...
2009-04-01
List-of-figures excerpt: contrast signature plots for the simple wireframe model with user-defined thermal boundary conditions and an exhaust plume; with boundary conditions but no exhaust plume; and with no user-defined thermal boundary conditions or exhaust plume.
14 CFR § 1215.108 - Defining user service requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... planning, and other significant mission parameters. It is recommended that potential users contact the NIMO... Flight Center, M/S 450.1, 8800 Greenbelt Road Greenbelt, MD 20771. [77 FR 6953, Feb. 10, 2012] ...
MAPA: an interactive accelerator design code with GUI
NASA Astrophysics Data System (ADS)
Bruhwiler, David L.; Cary, John R.; Shasharina, Svetlana G.
1999-06-01
The MAPA code is an interactive accelerator modeling and design tool with an X/Motif GUI. MAPA has been developed in C++ and makes full use of object-oriented features. We present an overview of its features and describe how users can independently extend the capabilities of the entire application, including the GUI. For example, a user can define a new model for a focusing or accelerating element. If the appropriate form is followed, and the new element is "registered" with a single line in the specified file, then the GUI will fully support this user-defined element type after it has been compiled and then linked to the existing application. In particular, the GUI will bring up windows for modifying any relevant parameters of the new element type. At present, one can use the GUI for phase space tracking, finding fixed points and generating line plots for the Twiss parameters, the dispersion and the accelerator geometry. The user can define new types of simulations which the GUI will automatically support by providing a menu option to execute the simulation and subsequently rendering line plots of the resulting data.
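As a rough illustration of the registration pattern described above (MAPA itself is C++ with an X/Motif GUI; the Python names below are hypothetical), a single registration line can be enough for a generic front end to discover a new element type and expose its parameters:

    ELEMENT_REGISTRY = {}

    def register_element(name):
        """Class decorator: the single 'registration line' for a new element type."""
        def wrap(cls):
            ELEMENT_REGISTRY[name] = cls
            return cls
        return wrap

    @register_element("my_quad")
    class MyQuadrupole:
        # Parameters a GUI could discover and expose for editing.
        parameters = {"length": 1.0, "gradient": 0.0}

    # A generic GUI can now build an edit window for any registered type:
    for name, cls in ELEMENT_REGISTRY.items():
        print(name, cls.parameters)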
Halford, Keith J.
2006-01-01
MODOPTIM is a non-linear ground-water model calibration and management tool that simulates flow with MODFLOW-96 as a subroutine. A weighted sum-of-squares objective function defines optimal solutions for calibration and management problems. Water levels, discharges, water quality, subsidence, and pumping-lift costs are the five direct observation types that can be compared in MODOPTIM. Differences between direct observations of the same type can be compared to fit temporal changes and spatial gradients. Water levels in pumping wells, wellbore storage in the observation wells, and rotational translation of observation wells also can be compared. Negative and positive residuals can be weighted unequally so inequality constraints such as maximum chloride concentrations or minimum water levels can be incorporated in the objective function. Optimization parameters are defined with zones and parameter-weight matrices. Parameter change is estimated iteratively with a quasi-Newton algorithm and is constrained to a user-defined maximum parameter change per iteration. Parameters that are less sensitive than a user-defined threshold are not estimated. MODOPTIM facilitates testing more conceptual models by expediting calibration of each conceptual model. Examples of applying MODOPTIM to aquifer-test analysis, ground-water management, and parameter estimation problems are presented.
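A minimal sketch of the kind of objective function described above, with unequal weighting of positive and negative residuals to emulate an inequality constraint (illustrative Python, not MODOPTIM's actual code):

    import numpy as np

    def objective(simulated, observed, w_pos, w_neg):
        """Weighted sum of squared residuals; w_pos applies where
        simulated > observed, w_neg where simulated < observed."""
        r = simulated - observed
        w = np.where(r >= 0.0, w_pos, w_neg)
        return float(np.sum(w * r**2))

    # Example: a maximum-chloride constraint. Exceeding the 250 mg/L limit
    # is penalized; staying below it costs nothing.
    sim = np.array([240.0, 265.0])
    limit = np.full(2, 250.0)
    print(objective(sim, limit, w_pos=1.0, w_neg=0.0))  # only the 265 exceedance counts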
Potential markets for a satellite-based mobile communications system
NASA Technical Reports Server (NTRS)
Jamieson, W. M.; Peet, C. S.; Bengston, R. J.
1976-01-01
The objective of the study was to define the market needs for improved land mobile communications systems. Within the context of this objective, the following goals were set: (1) characterize the present mobile communications industry; (2) determine the market for an improved system for mobile communications; and (3) define the system requirements as seen from the potential customer's viewpoint. The scope of the study was defined by the following parameters: (1) markets were confined to the U.S. and Canada; (2) range of operation generally exceeded 20 miles, but this was not restrictive; (3) the classes of potential users considered included all private sector users and non-military public sector users; (4) the time span examined was 1975 to 1985; and (5) highly localized users, e.g., taxicabs and local paging, were generally excluded.
Path tracking control of an omni-directional walker considering pressures from a user.
Tan, Renpeng; Wang, Shuoyu; Jiang, Yinlai; Ishida, Kenji; Fujie, Masakatsu G
2013-01-01
An omni-directional walker (ODW) is being developed to support people with walking disabilities in walking rehabilitation. The training paths that the user follows during rehabilitation are defined by physical therapists and stored in the ODW. In order to obtain a good training effect, the defined training paths need to be followed accurately. However, the ODW deviates from the training path in real rehabilitation because the force from the user varies the parameters of the whole system. In this paper, the characteristics of the pressures exerted by a user are measured, an adaptive controller based on these measurements is proposed to deal with the problem, and the approach is validated in an experiment in which a pseudo-handicapped person follows the ODW. The experimental results show that the proposed method can control the ODW to accurately follow the defined path with or without a user.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-04
... end-user defined filters (the "Service"). Spread Crawler, which was developed by MEB, listens to a ... based on a registered end-user input (i.e., custom-set parameters for particular symbols or industry ...-user(s), if any, would be interested in seeing this order. These filtering rules are contained in a ...
The Terrestrial Investigation Model (TIM) user guide has several appendices. This appendix includes an example input file in its preserved format; both the parameters and the comments defining them are included.
Enhanced Elliptic Grid Generation
NASA Technical Reports Server (NTRS)
Kaul, Upender K.
2007-01-01
An enhanced method of elliptic grid generation has been invented. Whereas prior methods require user input of certain grid parameters, this method provides for these parameters to be determined automatically. "Elliptic grid generation" signifies generation of generalized curvilinear coordinate grids through solution of elliptic partial differential equations (PDEs). Usually, such grids are fitted to bounding bodies and used in numerical solution of other PDEs like those of fluid flow, heat flow, and electromagnetics. Such a grid is smooth and has continuous first and second derivatives (and possibly also continuous higher-order derivatives), grid lines are appropriately stretched or clustered, and grid lines are orthogonal or nearly so over most of the grid domain. The source terms in the grid-generating PDEs (hereafter called "defining" PDEs) make it possible for the grid to satisfy requirements for clustering and orthogonality properties in the vicinity of specific surfaces in three dimensions or in the vicinity of specific lines in two dimensions. The grid parameters in question are decay parameters that appear in the source terms of the inhomogeneous defining PDEs. The decay parameters are characteristic lengths in exponential-decay factors that express how the influences of the boundaries decrease with distance from the boundaries. These terms govern the rates at which the distance between adjacent grid lines changes with distance from nearby boundaries. Heretofore, users have arbitrarily specified decay parameters. However, the characteristic lengths are coupled with the strengths of the source terms, such that arbitrary specification could lead to conflicts among parameter values. Moreover, the manual insertion of decay parameters is cumbersome for static grids and infeasible for dynamically changing grids. In the present method, manual insertion and user specification of decay parameters are neither required nor allowed. Instead, the decay parameters are determined automatically as part of the solution of the defining PDEs. Depending on the shape of the boundary segments and the physical nature of the problem to be solved on the grid, the solution of the defining PDEs may provide for rates of decay to vary along and among the boundary segments and may lend itself to interpretation in terms of one or more physical quantities associated with the problem.
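For reference, defining PDEs of this kind are commonly written in the Thompson-type form sketched below; this is a generic statement of the structure described above, not necessarily the exact formulation of the enhanced method:

    \xi_{xx} + \xi_{yy} = P(\xi,\eta), \qquad \eta_{xx} + \eta_{yy} = Q(\xi,\eta),

    P(\xi,\eta) = -\sum_i a_i \,\operatorname{sgn}(\xi - \xi_i)\, e^{-|\xi - \xi_i|/\ell_i},

with an analogous sum for Q. The a_i are source strengths and the \ell_i are the decay parameters: characteristic lengths setting how quickly the attraction toward the lines \xi = \xi_i dies off. It is these \ell_i that the enhanced method determines automatically rather than taking from the user.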
Manned Orbital Transfer Vehicle (MOTV). Volume 4: Supporting analysis
NASA Technical Reports Server (NTRS)
Boyland, R. E.; Sherman, S. W.; Morfin, H. W.
1979-01-01
Generic missions were defined to enable potential users to determine the parameters for suggested user projects. Mission modes were identified for providing operation, interfaces, performance, and cost data for studying payloads. Safety requirements for emergencies during various phases of the mission are considered with emphasis on radiation hazards.
Nonlinear Meshfree Analysis Program (NMAP) Version 1.0 (User’s Manual)
2012-12-01
... divided by the number of time increments used in the analysis. In addition to prescribing total nodal displacements in the neutral file, users are ... conditions, the user must define material properties, initial conditions, and a variety of control parameters for the NMAP analysis. These data are provided ... a script file. Restart: A restart function is provided in the NMAP code, where the user may restart an analysis using a set of restart files. In ...
Differential-Evolution Control Parameter Optimization for Unmanned Aerial Vehicle Path Planning
Kok, Kai Yit; Rajendran, Parvathy
2016-01-01
The differential evolution algorithm has been widely applied to unmanned aerial vehicle (UAV) path planning. At present, four tuning parameters exist for the differential evolution algorithm, namely, population size, differential weight, crossover, and generation number. These tuning parameters are required, together with a user-set weightage between path and computational cost. However, the optimum settings of these tuning parameters vary with the application. Instead of trial and error, this paper presents an optimization method for tuning the differential evolution parameters for UAV path planning. The parameters that this research focuses on are population size, differential weight, crossover, and generation number. The developed algorithm enables the user to simply define the desired weightage between path and computational cost and to converge with the minimum number of generations required. In conclusion, the proposed optimization of tuning parameters in the differential evolution algorithm for UAV path planning expedites convergence and improves the final path and computational cost. PMID:26943630
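A minimal DE/rand/1/bin sketch in Python showing where the four tuning parameters enter, with a weighted path-versus-computation cost of the kind described above; the cost functions and the weightage value are stand-ins, not the paper's:

    import numpy as np

    def de_minimize(cost, bounds, pop_size=20, F=0.8, CR=0.9, gens=100, seed=0):
        """pop_size, F (differential weight), CR (crossover), and gens
        are the four tuning parameters discussed above."""
        rng = np.random.default_rng(seed)
        lo, hi = np.array(bounds, float).T
        pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))
        vals = np.array([cost(x) for x in pop])
        for _ in range(gens):
            for i in range(pop_size):
                idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
                a, b, c = pop[idx]
                trial = np.where(rng.random(len(lo)) < CR,
                                 np.clip(a + F * (b - c), lo, hi), pop[i])
                tc = cost(trial)
                if tc < vals[i]:
                    pop[i], vals[i] = trial, tc
        return pop[np.argmin(vals)]

    w = 0.7  # user-defined weightage between path and computational cost (hypothetical)
    path_length = lambda x: float(np.sum(np.abs(np.diff(x))))  # stand-in cost term
    compute_cost = lambda x: float(np.sum(x**2))               # stand-in cost term
    best = de_minimize(lambda x: w * path_length(x) + (1 - w) * compute_cost(x),
                       [(-5, 5)] * 4)
    print(best)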
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, K.R.; Fisher, J.E.
1997-03-01
ACE/gr is an XY plotting tool for workstations or X-terminals using X. A few of its features are: user-defined scaling, tick marks, labels, symbols, line styles, and colors; batch mode for unattended plotting; reading and writing of parameters used during a session; polynomial regression, splines, running averages, DFT/FFT, and cross-/auto-correlation; and hardcopy support for PostScript, HP-GL, and FrameMaker .mif formats. While ACE/gr has a convenient point-and-click interface, most parameter settings and operations are available through a command-line interface (found in Files/Commands).
NASA Astrophysics Data System (ADS)
Polster, Lisa; Schuemann, Jan; Rinaldi, Ilaria; Burigo, Lucas; McNamara, Aimee L.; Stewart, Robert D.; Attili, Andrea; Carlson, David J.; Sato, Tatsuhiko; Ramos Méndez, José; Faddegon, Bruce; Perl, Joseph; Paganetti, Harald
2015-07-01
The aim of this work is to extend a widely used proton Monte Carlo tool, TOPAS, towards the modeling of relative biological effect (RBE) distributions in experimental arrangements as well as patients. TOPAS provides a software core which users configure by writing parameter files to, for instance, define application-specific geometries and scoring conditions. Expert users may further extend TOPAS scoring capabilities by plugging in their own additional C++ code. This structure was utilized for the implementation of eight biophysical models suited to calculate proton RBE. As far as physics parameters are concerned, four of these models are based on the proton linear energy transfer, while the others are based on DNA double strand break induction and the frequency-mean specific energy, lineal energy, or delta electron generated track structure. The biological input parameters for all models are typically inferred from fits of the models to radiobiological experiments. The model structures have been implemented in a coherent way within the TOPAS architecture. Their performance was validated against measured experimental data on proton RBE in a spread-out Bragg peak using V79 Chinese Hamster cells. This work is an important step in bringing biologically optimized treatment planning for proton therapy closer to the clinical practice as it will allow researchers to refine and compare pre-defined as well as user-defined models.
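As a shape-only illustration (not one of the eight implemented models, whose forms and fitted parameters differ), LET-based RBE models broadly scale physical dose by a factor that grows with linear energy transfer:

    import numpy as np

    def rbe_weighted_dose(dose_gy, let_kev_um, k=0.04):
        """Generic linear LET weighting, RBE = 1 + k*LET; k is a placeholder
        biological parameter that would be fit to radiobiological data."""
        return (1.0 + k * np.asarray(let_kev_um)) * np.asarray(dose_gy)

    # Voxels at the entrance, mid-SOBP, and distal edge (illustrative LET values):
    print(rbe_weighted_dose([1.0, 1.0, 1.0], [2.0, 5.0, 10.0]))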
Bird species of concern were identified to define default parameters (body weight and diet composition) to represent birds in KABAM. KABAM is a simulation model used to predict pesticide concentrations in aquatic regions for use in exposure assessments.
Distributed Energy Resources Customer Adoption Model - Graphical User Interface, Version 2.1.8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ewald, Friedrich; Stadler, Michael; Cardoso, Goncalo F
The DER-CAM Graphical User Interface has been redesigned to consist of a dynamic tree structure on the left side of the application window to allow users to quickly navigate between different data categories and views. Views can either be tables with model parameters and input data, the optimization results, or a graphical interface to draw circuit topology and visualize investment results. The model parameters and input data consist of tables where values are assigned to specific keys. The aggregation of all model parameters and input data amounts to the data required to build a DER-CAM model, and is passed to the GAMS solver when users initiate the DER-CAM optimization process. Passing data to the GAMS solver relies on the use of a Java server that handles DER-CAM requests, queuing, and results delivery. This component of the DER-CAM GUI can be deployed either locally or remotely, and constitutes an intermediate step between the user data input and manipulation, and the execution of a DER-CAM optimization in the GAMS engine. The results view shows the results of the DER-CAM optimization and distinguishes between a single and a multi-objective process. The single optimization runs the DER-CAM optimization once and presents the results as a combination of summary charts and hourly dispatch profiles. The multi-objective optimization process consists of a sequence of runs initiated by the GUI, including: 1) CO2 minimization, 2) cost minimization, 3) a user-defined number of points in-between objectives 1) and 2). The multi-objective results view includes both access to the detailed results of each point generated by the process as well as the generation of a Pareto Frontier graph to illustrate the trade-off between objectives. DER-CAM GUI 2.1.8 also introduces the ability to graphically generate circuit topologies, enabling support for DER-CAM 5.0.0. This feature consists of: 1) the drawing area, where users can manually create nodes and define their properties (e.g., point of common coupling, slack bus, load) and connect them through edges representing either power lines, transformers, or heat pipes, all with user-defined characteristics (e.g., length, ampacity, inductance, or heat loss); 2) the tables, which display the user-defined topology in the final numerical form that will be passed to the DER-CAM optimization. Finally, the DER-CAM GUI is also deployed with a database schema that allows users to provide different energy load profiles, solar irradiance profiles, and tariff data, which can be stored locally and later used in any DER-CAM model. However, no real data will be delivered with this version.
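The multi-objective sequence lends itself to a simple weighted sweep. The sketch below is hypothetical Python (the real optimization runs in GAMS); solve() is a stand-in that returns (cost, CO2) for a weighted objective:

    def solve(w_cost, w_co2):
        # Stand-in for one DER-CAM/GAMS run minimizing w_cost*cost + w_co2*co2;
        # the dummy return values mimic a cost/CO2 trade-off.
        return (60.0 + 100.0 * w_co2, 20.0 + 50.0 * w_cost)

    n_points = 5  # user-defined number of points between the two objectives
    co2_min, cost_min = solve(0.0, 1.0), solve(1.0, 0.0)   # steps 1) and 2)
    frontier = [solve(w, 1.0 - w)                           # step 3)
                for w in (i / (n_points + 1) for i in range(1, n_points + 1))]
    # The (cost, CO2) pairs in `frontier` trace out the Pareto trade-off curve.
    print(co2_min, frontier, cost_min)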
NASA Technical Reports Server (NTRS)
Birch, J. N.
1971-01-01
A compromise optimum design for the low data rate users of the Tracking and Data Relay Satellite System (TDRSS) is presented. Design goals for the TDRSS are employed in this report to arrive at the transponder design. Multipath, R.F.I., antenna pattern anomalies, other user signals, and other definable degrading factors are included as trade-off parameters in the design. Synchronization, emergency voice, user stabilization, polarization diversity and error control coding are also considered and their impact on the transponder design is evaluated.
Image Display And Manipulation System (IDAMS), user's guide
NASA Technical Reports Server (NTRS)
Cecil, R. W.
1972-01-01
A combination operator's guide and user's handbook for the Image Display and Manipulation System (IDAMS) is reported. Information is presented to define how to operate the computer equipment, how to structure a run deck, and how to select parameters necessary for executing a sequence of IDAMS task routines. If more detailed information is needed on any IDAMS program, see the IDAMS program documentation.
The Gamma-Ray Burst ToolSHED is Open for Business
NASA Astrophysics Data System (ADS)
Giblin, Timothy W.; Hakkila, Jon; Haglin, David J.; Roiger, Richard J.
2004-09-01
The GRB ToolSHED, a Gamma-Ray Burst SHell for Expeditions in Data-Mining, is now online and available via a web browser to all in the scientific community. The ToolSHED is an online web utility that contains pre-processed burst attributes of the BATSE catalog and a suite of induction-based machine learning and statistical tools for classification and cluster analysis. Users create their own login account and study burst properties within user-defined multi-dimensional parameter spaces. Although new GRB attributes are periodically added to the database for user selection, the ToolSHED has a feature that allows users to upload their own burst attributes (e.g. spectral parameters, etc.) so that additional parameter spaces can be explored. A data visualization feature using GNUplot and web-based IDL has also been implemented to provide interactive plotting of user-selected session output. In an era in which GRB observations and attributes are becoming increasingly more complex, a utility such as the GRB ToolSHED may play an important role in deciphering GRB classes and understanding intrinsic burst properties.
MMA, A Computer Code for Multi-Model Analysis
Poeter, Eileen P.; Hill, Mary C.
2007-01-01
This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and the system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations. Many applications of MMA will be well served by the default methods provided. To use the default methods, the only required input for MMA is a list of directories where the files for the alternate models are located. Evaluation and development of model-analysis methods are active areas of research. To facilitate exploration and innovation, MMA allows the user broad discretion to define alternatives to the default procedures. For example, MMA allows the user to (a) rank models based on model criteria defined using a wide range of provided and user-defined statistics in addition to the default AIC, AICc, BIC, and KIC criteria, (b) create their own criteria using model measures available from the code, and (c) define how each model criterion is used to calculate related posterior model probabilities. The default model criteria rate models based on model fit to observations, the number of observations and estimated parameters, and, for KIC, the Fisher information matrix. In addition, MMA allows the analysis to include an evaluation of estimated parameter values. This is accomplished by allowing the user to define unreasonable estimated parameter values or relative estimated parameter values. An example of the latter is that it may be expected that one parameter value will be less than another, as might be the case if two parameters represented the hydraulic conductivity of distinct materials such as fine and coarse sand. Models with parameter values that violate the user-defined conditions are excluded from further consideration by MMA.
Ground-water models are used as examples in this report, but MMA can be used to evaluate any set of models for which the required files have been produced. MMA needs to read files from a separate directory for each alternative model considered. The needed files are produced when using the Sensitivity-Analysis or Parameter-Estimation mode of UCODE_2005, or, possibly, the equivalent capability of another program. MMA is constructed using the JUPITER API.
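The conversion from information-criterion values to posterior model probabilities follows the standard exp(-delta/2) weighting (Akaike weights when AIC is used); a short sketch, independent of MMA's implementation:

    import numpy as np

    def posterior_model_probabilities(ic_values):
        """ic_values: AIC, AICc, BIC, or KIC for each alternative model.
        Returns probabilities proportional to exp(-0.5 * (IC - IC_min))."""
        ic = np.asarray(ic_values, float)
        w = np.exp(-0.5 * (ic - ic.min()))
        return w / w.sum()

    # Example: three calibrated alternative models of the same system
    print(posterior_model_probabilities([1250.3, 1252.1, 1260.8]))
    # -> the first model carries most of the posterior probability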
The iCARE R Package allows researchers to quickly build models for absolute risk, and apply them to estimate an individual's risk of developing disease during a specified time interval, based on a set of user-defined input parameters.
XPI: The Xanadu Parameter Interface
NASA Technical Reports Server (NTRS)
White, N.; Barrett, P.; Oneel, B.; Jacobs, P.
1992-01-01
XPI is a table-driven parameter interface which greatly simplifies both command-driven programs, such as BROWSE and XIMAGE, and stand-alone single-task programs. It moves all of the syntax and semantic parsing of commands and parameters out of the user's code into common code and externally defined tables. This allows the programmer to concentrate on writing the code unique to the application rather than reinventing the user interface, and allows external graphical interfaces to connect with no changes to the command-driven program. XPI also includes a compatibility library which allows programs written using the IRAF host interface (Mandel and Roll) to use XPI in place of the IRAF host interface.
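A toy sketch of a table-driven parameter interface (the field layout is hypothetical, not XPI's actual table format): the application asks for values by name, while syntax, typing, and defaults live in the external table rather than in application code:

    PARAMETER_TABLE = [
        # (name,     type,  default, prompt)
        ("infile",   str,   "",      "Input event file"),
        ("binsize",  float, 1.0,     "Bin size (s)"),
        ("clobber",  bool,  False,   "Overwrite existing output?"),
    ]

    def get_parameters(overrides):
        """Resolve parameter values from user overrides plus table defaults."""
        params = {}
        for name, typ, default, _prompt in PARAMETER_TABLE:
            raw = overrides.get(name, default)
            params[name] = raw if isinstance(raw, typ) else typ(raw)
        return params

    print(get_parameters({"binsize": "0.5"}))
    # -> {'infile': '', 'binsize': 0.5, 'clobber': False}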
Digital Microwave System Design Guide.
1984-02-01
... traffic analysis is a continuous effort, setting parameters for subsequent stages of expansion after the system design is finished. 2.1.3 Quality of ... operational structure of the user for whom he is providing service. 2.2.3 Quality of Service. In digital communications, the basic performance parameter ... the basic interpretation of system performance is measured in terms of a single parameter, throughput. Throughput can be defined as the number of ...
Figure of Merit Characteristics Compared to Engineering Parameters
NASA Technical Reports Server (NTRS)
Rickman, D.L.; Schrader, C.M.
2010-01-01
A workshop held in 2005 defined a large number of parameters of interest for users of lunar simulants. The need for formal requirements and standards in the manufacture and use of simulants necessitates certain features of measurements. They must be definable, measurable, useful, and primary rather than derived. There are also certain features that must be avoided. Analysis of the total parameter list led to the realization that almost all of the parameters could be tightly constrained, though not predicted, if only four properties were measured: particle composition, particle size distribution, particle shape distribution, and bulk density. These four are collectively referred to as figures of merit (FoMs). An evaluation of how each of the parameters identified in 2005 is controlled by the four FoMs is given.
NASA Astrophysics Data System (ADS)
Bresnahan, Patricia A.; Pukinskis, Madeleine; Wiggins, Michael
1999-03-01
Image quality assessment systems differ greatly with respect to the number and types of images they need to evaluate, and their overall architectures. Managers of these systems, however, all need to be able to tune and evaluate system performance, requirements often overlooked or under-designed during project planning. Performance tuning tools allow users to define acceptable quality standards for image features and attributes by adjusting parameter settings. Performance analysis tools allow users to evaluate and/or predict how well a system performs in a given parameter state. While image assessment algorithms are becoming quite sophisticated, duplicating or surpassing the human decision-making process in their speed and reliability, they often require a greater investment in 'training' or fine tuning of parameters in order to achieve optimum performance. This process may involve the analysis of hundreds or thousands of images, generating a large database of files and statistics that can be difficult to sort through and interpret. Compounding the difficulty is the fact that personnel charged with tuning and maintaining the production system may not have the statistical or analytical background required for the task. Meanwhile, hardware innovations have greatly increased the volume of images that can be handled in a given time frame, magnifying the consequences of running a production site with an inadequately tuned system. In this paper, some general requirements for a performance evaluation and tuning data visualization system are discussed. A custom-engineered solution to the tuning and evaluation problem is then presented, developed within the context of a high-volume image quality assessment, data entry, OCR, and image archival system. A key factor influencing the design of the system was the context-dependent definition of image quality, as perceived by a human interpreter. This led to the development of a five-level, hierarchical approach to image quality evaluation. Lower-level pass-fail conditions and decision rules were coded into the system. Higher-level image quality states were defined by allowing the users to interactively adjust the system's sensitivity to various image attributes by manipulating graphical controls. Results were presented in easily interpreted bar graphs. These graphs were mouse-sensitive, allowing the user to more fully explore the subsets of data indicated by various color blocks. In order to simplify the performance evaluation and tuning process, users could choose to view the results of (1) the existing system parameter state, (2) any arbitrary parameter values they chose, or (3) a quasi-optimum parameter state, derived by applying a decision rule to a large set of possible parameter states. Giving managers easy-to-use tools for defining the more subjective aspects of quality resulted in a system that responded to contextual cues that are difficult to hard-code. It had the additional advantage of allowing the definition of quality to evolve over time, as users became more knowledgeable as to the strengths and limitations of an automated quality inspection system.
Sinkó, József; Kákonyi, Róbert; Rees, Eric; Metcalf, Daniel; Knight, Alex E.; Kaminski, Clemens F.; Szabó, Gábor; Erdélyi, Miklós
2014-01-01
Localization-based super-resolution microscopy image quality depends on several factors, such as dye choice and labeling strategy, microscope quality, and user-defined parameters such as frame rate and frame count, as well as the image-processing algorithm. Experimental optimization of these parameters can be time-consuming and expensive, so we present TestSTORM, a simulator that can be used to optimize these steps. TestSTORM users can select from among four different structures with specific patterns, dye, and acquisition parameters. Example results are shown and the results of the vesicle pattern are compared with experimental data. Moreover, image stacks can be generated for further evaluation using localization algorithms, offering a tool for further software development. PMID:24688813
The "Consumer Report" version of Earth Science Data Quality description
NASA Astrophysics Data System (ADS)
Vicente, G. A.
2014-12-01
The generation, delivery, and access of Earth Observation (EO) data quality information is a difficult problem because such information is not uniquely defined, is user dependent, is difficult to quantify, is handled differently by different teams, and is perceived differently by data providers and data users. Initiatives such as the International Organization for Standardization (ISO) 19115 and 19157 standards are important steps forward but are difficult to implement, too complex, and out of reach for the majority of data producers and users. This is because most users only want a quick and intelligible way to compare data sets from different providers to find the ones that best fit their interest. Therefore we need to simplify the problem by focusing on a few relevant quality parameters and develop a common framework to deliver them. This work is intended to tap into data producers' and users' knowledge and expertise on data quality for the development and adoption of a "Consumer Report" version of a "Data Quality Matrix". The goal is to find the most efficient and friendly approach to display a selected number of quality parameters rated for each product and targeted to groups of users.
CMS Configuration Editor: GUI based application for user analysis job
NASA Astrophysics Data System (ADS)
de Cosa, A.
2011-12-01
We present the user interface and the software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way, integrated within the CMS framework, which organizes user analysis code in a flexible way. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. It can be a challenging task for users, especially newcomers, to develop analysis jobs while managing the configuration of the many required modules. For this reason a graphical tool has been conceived in order to edit and inspect configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to the specific user requirements. CMS users can adopt this tool, the Config Editor, to create their analyses, visualizing in real time the effects of their actions. They can visualize the structure of their configuration, look at the modules included in the workflow, inspect the dependencies among the modules, and check the data flow. They can see the values to which parameters are set and change them according to what is required by their analysis task. The integration of common tools in the GUI required adopting an object-oriented structure in the Python definition of the PAT tools and defining a layer of abstraction from which all PAT tools inherit.
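For orientation, a CMSSW job configuration is itself a Python script of the kind the Config Editor inspects; the module and parameter names below are illustrative, and the snippet runs only inside a CMSSW environment:

    import FWCore.ParameterSet.Config as cms

    process = cms.Process("USERANA")
    process.source = cms.Source("PoolSource",
        fileNames = cms.untracked.vstring("file:events.root"))
    # A hypothetical PAT-style analysis module with one tunable parameter:
    process.myAnalyzer = cms.EDAnalyzer("MuonAnalyzer",
        ptMin = cms.double(20.0))
    process.p = cms.Path(process.myAnalyzer)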
User manual for SPLASH (Single Panel Lamp and Shroud Helper).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larsen, Marvin Elwood
2006-02-01
The radiant heat test facility develops test sets providing well-characterized thermal environments, often representing fires. Many of the components and procedures have become standardized to such an extent that the development of a specialized design tool to determine optimal configurations for radiant heat experiments was appropriate. SPLASH (Single Panel Lamp and Shroud Helper) is that tool. SPLASH is implemented as a user-friendly, Windows-based program that allows a designer to describe a test setup in terms of parameters such as number of lamps, power, position, and separation distance. This document is a user manual for that software. Any incidental descriptions of theory are only for the purpose of defining the model inputs. The theory for the underlying model is described in SAND2005-2947 (Ref. [1]). SPLASH provides a graphical user interface to define lamp panel and shroud designs parametrically, solves the resulting radiation enclosure problem for up to 2500 surfaces, and provides post-processing to facilitate understanding and documentation of analyzed designs.
The role of price and enforcement in water allocation: insights from Game Theory
NASA Astrophysics Data System (ADS)
Souza Filho, F.; Lall, U.; Porto, R.
2007-12-01
As many countries are moving towards water sector reforms, practical issues of how water management institutions can better effect allocation, regulation and enforcement of water rights have emerged. The uncertainty associated with water that is available at a particular diversion point becomes a parameter that is likely to influence the behavior of water users as to their application for water licenses, as well as their willingness to pay for licensed use. The ability of a water agency to reduce this uncertainty through effective water rights enforcement is related to the fiscal ability of the agency to sustain the enforcement effort. In this paper, this interplay across the users and the agency is explored, considering the hydraulic structure or sequence of water use, and parameters that define the users and the agency's economics. The potential for free rider behavior by the users, as well as their proposals for licensed use are derived conditional on this setting. The analyses presented are developed in the framework of the theory of "Law and Economics", with user interactions modeled as a game theoretic enterprise. The state of Ceara, Brazil is used loosely as an example setting, with parameter values for the experiments indexed to be approximately those relevant for current decisions. The potential for using the ideas in participatory decision making is discussed.
Systematic Propulsion Optimization Tools (SPOT)
NASA Technical Reports Server (NTRS)
Bower, Mark; Celestian, John
1992-01-01
This paper describes a computer program written by senior-level Mechanical Engineering students at the University of Alabama in Huntsville which is capable of optimizing user-defined delivery systems for carrying payloads into orbit. The custom propulsion system is designed by the user through the input of configuration, payload, and orbital parameters. The primary advantages of the software, called Systematic Propulsion Optimization Tools (SPOT), are a user-friendly interface and a modular FORTRAN 77 code designed for ease of modification. The optimization of variables in an orbital delivery system is of critical concern in the propulsion environment. The mass of the overall system must be minimized within the maximum stress, force, and pressure constraints. SPOT utilizes the Design Optimization Tools (DOT) program for the optimization techniques. The SPOT program is divided into a main program and five modules: aerodynamic losses, orbital parameters, liquid engines, solid engines, and nozzles. The program is designed to be upgraded easily and expanded to meet specific user needs. A user's manual and a programmer's manual are currently being developed to facilitate implementation and modification.
Parallel noise barrier prediction procedure : report 2 user's manual revision 1
DOT National Transportation Integrated Search
1987-11-01
This report defines the parameters which are used to input the data required to run Program Barrier and BarrierX on a microcomputer such as an IBM PC or compatible. Directions for setting up and operating a working disk are presented. Examples of inp...
Applications of Microwaves to Remote Sensing of Terrain
NASA Technical Reports Server (NTRS)
Porter, R. A.
1975-01-01
A survey and study were conducted to define the role that microwaves may play in the measurement of a variety of terrain-related parameters. The survey consisted of discussions with many users and researchers in the field of remote sensing. In addition, a survey questionnaire was prepared and replies were solicited from these and other users and researchers. The results of the survey, and the associated bibliography, were studied and conclusions were drawn as to the usefulness of radiometric systems for remote sensing of terrain.
The Stratway Program for Strategic Conflict Resolution: User's Guide
NASA Technical Reports Server (NTRS)
Hagen, George E.; Butler, Ricky W.; Maddalon, Jeffrey M.
2016-01-01
Stratway is a strategic conflict detection and resolution program. It provides both intent-based conflict detection and conflict resolution for a single ownship in the presence of multiple traffic aircraft and weather cells defined by moving polygons. It relies on a set of heuristic search strategies to solve conflicts. These strategies are user configurable through multiple parameters. The program can be called from other programs through an application program interface (API) and can also be executed from a command line.
M-Split: A Graphical User Interface to Analyze Multilayered Anisotropy from Shear Wave Splitting
NASA Astrophysics Data System (ADS)
Abgarmi, Bizhan; Ozacar, A. Arda
2017-04-01
Shear wave splitting analyses are commonly used to infer deep anisotropic structure. For simple cases, the obtained delay times and fast-axis orientations are averaged from reliable results to define anisotropy beneath recording seismic stations. However, splitting parameters show systematic variations with back azimuth in the presence of complex anisotropy and cannot be represented by an average time delay and fast-axis orientation. Previous researchers have identified anisotropic complexities in different tectonic settings and applied various approaches to model them. Most commonly, such complexities are modeled by using multiple anisotropic layers with a priori constraints from geologic data. In this study, a graphical user interface called M-Split was developed to easily process and model multilayered anisotropy, with capabilities to properly address the inherited non-uniqueness. The M-Split program runs user-defined grid searches through the model parameter space for two-layer anisotropy using the formulation of Silver and Savage (1994) and creates sensitivity contour plots to locate local maxima and analyze all possible models with parameter trade-offs. In order to minimize model ambiguity and identify the robust model parameters, various misfit calculation procedures were also developed and embedded in M-Split; these can be used depending on the quality of the observations and their back-azimuthal coverage. Case studies were carried out to evaluate the reliability of the program using real noisy data, and for this purpose stations from two different networks were utilized. The first seismic network is the Kandilli Observatory and Earthquake Research Institute (KOERI) network, which includes long-running permanent stations; the second comprises seismic stations deployed temporarily as part of the "Continental Dynamics-Central Anatolian Tectonics (CD-CAT)" project funded by the NSF. It is also worth noting that M-Split is designed as an open-source program which can be modified by users for additional capabilities or for other applications.
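Schematically, the grid search sweeps the four two-layer parameters and keeps the misfit surface; the forward calculation below is a dummy stand-in for the Silver and Savage (1994) formulation, used only to make the sketch runnable:

    import itertools
    import numpy as np

    def misfit(params):
        # Dummy stand-in: the real forward model predicts apparent splitting
        # (fast axis, delay time) versus back azimuth for a two-layer medium
        # and compares it with the observations.
        true = np.array([30.0, 0.8, -45.0, 1.2])  # phi1, dt1, phi2, dt2
        return float(np.sum((np.asarray(params) - true) ** 2))

    phis = np.arange(-90.0, 90.0, 15.0)   # candidate fast axes (degrees)
    dts = np.arange(0.4, 2.0, 0.4)        # candidate delay times (seconds)
    best = min(itertools.product(phis, dts, phis, dts), key=misfit)
    print(best)  # (phi1, dt1, phi2, dt2) at the global misfit minimum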
Exploring NASA OMI Level 2 Data With Visualization
NASA Technical Reports Server (NTRS)
Wei, Jennifer; Yang, Wenli; Johnson, James; Zhao, Peisheng; Gerasimov, Irina; Pham, Long; Vicente, Gilberto
2014-01-01
Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, such as model inputs from satellites, or extreme events (such as volcanic eruptions, dust storms, etc.). Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. Such obstacles may be avoided by allowing users to visualize satellite data as "images", with accurate pixel-level (Level-2) information, including pixel coverage area delineation and science-team-recommended quality screening for individual geophysical parameters. We present a prototype service from the Goddard Earth Sciences Data and Information Services Center (GES DISC) supporting Aura OMI Level-2 Data with GIS-like capabilities. Functionality includes selecting data sources (e.g., multiple parameters under the same scene, like NO2 and SO2, or the same parameter with different aggregation methods, like NO2 in OMNO2G and OMNO2D products), user-defined area-of-interest and temporal extents, zooming, panning, overlaying, sliding, and data subsetting, reformatting, and reprojection. The system will allow any user-defined portal interface (front-end) to connect to our backend server with OGC standard-compliant Web Mapping Service (WMS) and Web Coverage Service (WCS) calls. This back-end service should greatly enhance its expandability to integrate additional outside data/map sources.
Automated Source Depth Estimation Using Array Processing Techniques
2009-10-14
... station for each depth cell, whose width is a user-defined parameter, n [Bonner et al., 2002; Murphy et al., 1999]. The largest peak in the stack ...
Information relevant to KABAM and explanations of default parameters used to define the 7 trophic levels. KABAM is a simulation model used to predict pesticide concentrations in aquatic regions for use in exposure assessments.
Shuttle/TDRSS modelling and link simulation study
NASA Technical Reports Server (NTRS)
Braun, W. R.; Mckenzie, T. M.; Biederman, L.; Lindsey, W. C.
1979-01-01
A Shuttle/TDRSS S-band and Ku-band link simulation package called LinCsim was developed for the evaluation of link performance for specific Shuttle signal designs. The link models were described in detail and the transmitter distortion parameters or user constraints were carefully defined. The overall link degradation (excluding hardware degradations) relative to an ideal BPSK channel was given for various sets of user constraint values. The performance sensitivity to each individual user constraint was then illustrated. The effect of excessive Spacelab clock jitter on the return link BER performance was also investigated, as was the problem of subcarrier recovery for the Ku-band Shuttle return link signal.
An interactive in-game approach to user adjustment of stereoscopic 3D settings
NASA Astrophysics Data System (ADS)
Tawadrous, Mina; Hogue, Andrew; Kapralos, Bill; Collins, Karen
2013-03-01
Given the popularity of 3D film, content developers have been creating customizable stereoscopic 3D experiences for the user to enjoy at home. Stereoscopic 3D game developers often offer a 'white box' approach whereby far too many controls and settings are exposed to the average consumer, who may have little knowledge of, or interest in, correctly adjusting these settings. Improper settings can lead to users being uncomfortable or unimpressed with their own user-defined stereoscopic 3D experience. We have begun investigating interactive approaches to in-game adjustment of the various stereoscopic 3D parameters to reduce the reliance on the user doing so, thereby creating a more pleasurable stereoscopic 3D experience. In this paper, we describe a preliminary technique for interactively calibrating the various stereoscopic 3D parameters and compare this interface with the typical slider-based control interface game developers use in commercial S3D games. Inspired by standard testing methodologies experienced at an optometrist, we have created a split-screen game with the same stereoscopic 3D game running in both screens, but with different interaxial distances. We expect that the interactive nature of the calibration will impact the final game experience, providing us with an indication of whether in-game, interactive S3D parameter calibration is a mechanism that game developers should adopt.
NASA Technical Reports Server (NTRS)
Kedar, Sharon; Baxter, Sean C.; Parker, Jay W.; Webb, Frank H.; Owen, Susan E.; Sibthorpe, Anthony J.; Dong, Danan
2011-01-01
A geodetic software analysis tool enables the user to analyze 2D crustal strain from geodetic ground motion, and create models of crustal deformation using a graphical interface. Users can use any geodetic measurements of ground motion and derive the 2D crustal strain interactively. This software also provides a forward-modeling tool that calculates a geodetic velocity and strain field for a given fault model, and lets the user compare the modeled strain field with the strain field obtained from the user's data. Users may change parameters on-the-fly and obtain a real-time recalculation of the resulting strain field. Four data products are computed: maximum shear, dilatation, shear angle, and principal components. The current view and data dependencies are processed first. The remaining data products and views are then computed in a round-robin fashion to anticipate view changes. When an analysis or display parameter is changed, the affected data products and views are invalidated and progressively re-displayed as available. This software is designed to facilitate the derivation of the strain fields from the GPS and strain meter data that sample it, to facilitate the understanding of the strengths and weaknesses of the strain field derivation from continuous GPS (CGPS) and other geodetic data from a variety of tectonic settings, to converge on the "best practices" strain derivation strategy for the Solid Earth Science ESDR System (SESES) project given the CGPS station distribution in the western U.S., and to provide SESES users with a scientific and educational tool to explore the strain field on their own with user-defined parameters.
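The four data products follow directly from the symmetric part of the 2D velocity gradient; a self-contained sketch with illustrative numbers:

    import numpy as np

    G = np.array([[ 2.0e-8,  5.0e-9],   # du/dx, du/dy  (illustrative values, 1/yr)
                  [-1.0e-9, -3.0e-8]])  # dv/dx, dv/dy
    E = 0.5 * (G + G.T)                 # infinitesimal strain-rate tensor
    exx, eyy, exy = E[0, 0], E[1, 1], E[0, 1]

    dilatation = exx + eyy
    max_shear = np.hypot(0.5 * (exx - eyy), exy)
    principal = (0.5 * dilatation + max_shear, 0.5 * dilatation - max_shear)
    shear_angle = 0.5 * np.degrees(np.arctan2(2.0 * exy, exx - eyy))
    print(dilatation, max_shear, principal, shear_angle)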
Design and milling manufacture of polyurethane custom contoured cushions for wheelchair users.
da Silva, Fabio Pinto; Beretta, Elisa Marangon; Prestes, Rafael Cavalli; Kindlein Junior, Wilson
2011-01-01
The design of custom contoured cushions manufactured in flexible polyurethane foams is an option to improve positioning and comfort for people with disabilities who spend most of the day seated in the same position. These surfaces increase the contact area between the seat and the user, which helps to minimise the local pressures that can generate problems like decubitus ulcers. The present research aims at establishing development routes for custom cushion production for wheelchair users. This study also contributes to the investigation of Computer Numerical Control (CNC) machining of flexible polyurethane foams. The proposed route to obtaining the customised seat began with acquiring the user's contour in an adequate posture through a plaster cast. To collect the surface geometry, the cast was three-dimensionally scanned and manipulated in CAD/CAM software. CNC milling parameters such as tools, spindle speeds, and feed rates for machining flexible polyurethane foams were tested and analysed with regard to surface quality. The best parameters were then tested on a customised seat. Possible dimensional changes generated during foam cutting were analysed through 3D scanning, and the pressure and temperature distribution of the customised seat was tested. The best parameters found for foams with a density of 50 kg/m³ were high spindle speeds (24,000 rpm) and feed rates between 2400 and 4000 mm/min. These parameters did not generate significant deformities in the machined cushions. The custom contoured cushion satisfactorily increased the contact area between wheelchair and user, as it distributed pressure and heat evenly. Through this study it was possible to define routes for the development and manufacturing of customised seats using direct CNC milling in flexible polyurethane foams. It also showed that custom contoured cushions efficiently distribute pressure and temperature, which is believed to minimise tissue lesions such as pressure ulcers.
Anand, T S; Sujatha, S
2017-08-01
Polycentric knees for transfemoral prostheses have a variety of geometries, but a survey of the literature shows that there are few ways of comparing their performance. Our objective was to present a method for performance comparison of polycentric knee geometries and to design a new geometry. In this work, we define parameters to compare various commercially available prosthetic knees in terms of their stability, toe clearance, maximum flexion, and so on, and optimize the parameters to obtain a new knee design. We use the defined parameters and optimization to design a new knee geometry that provides the greater stability and toe clearance necessary to navigate the uneven terrain typically encountered in developing countries. Several commercial knees were compared based on the defined parameters to determine their suitability for uneven terrain. A new knee was designed based on optimization of these parameters. Preliminary user testing indicates that the new knee is very stable and easy to use. The methodology can be used for better knee selection and design of more customized knee geometries. Clinical relevance: The method provides a tool to aid in the selection and design of polycentric knees for transfemoral prostheses.
Evolution simulation of lightning discharge based on a magnetohydrodynamics method
NASA Astrophysics Data System (ADS)
Fusheng, WANG; Xiangteng, MA; Han, CHEN; Yao, ZHANG
2018-07-01
In order to solve the load problem for aircraft lightning strikes, lightning channel evolution is simulated under the key physical parameters for aircraft lightning current component C. A numerical model of the discharge channel is established based on magnetohydrodynamics (MHD) and implemented in the FLUENT software. With the aid of user-defined functions and a user-defined scalar, the Lorentz force, Joule heating, and material parameters of an air thermal plasma are added. A three-dimensional lightning arc channel is simulated and the arc evolution in space is obtained. The results show that the temperature distribution of the lightning channel is symmetrical and that the hottest region occurs at the center of the lightning channel. The distributions of potential and current density are obtained, showing that the difference in electric potential, or energy, between two points tends to make the arc channel develop downwards. The arc channel expands at the anode surface due to stagnation of the thermal plasma, and impinges on the copper plate when it comes into contact with the anode plate.
Qualitative Description of Obscuration Factors in Central Europe
1980-09-01
... parameters have been made for years, but optical parameters are more difficult to observe and a complete data base is still lacking. The four users of ... visibility also restrict ground-to-air operations to the extent that visual sighting of aircraft or targets cannot be made from the ground. If moderate ... has been made to relate air mass characteristics to E-O systems behavior. An air mass is defined as a mass of air with approximately the same ...
LMSS communication network design
NASA Technical Reports Server (NTRS)
1982-01-01
The architecture of the telecommunication network as the first step in the design of the LMSS system is described. A set of functional requirements including the total number of users to be served by the LMSS are hypothesized. The design parameters are then defined at length and are systematically selected such that the resultant system is capable of serving the hypothesized number of users. The design of the backhaul link is presented. The number of multiple backhaul beams required for communication to the base stations is determined. A conceptual procedure for call-routing and locating a mobile subscriber within the LMSS network is presented. The various steps in placing a call are explained, and the relationship between the two sets of UHF and S-band multiple beams is developed. A summary of the design parameters is presented.
Analysis of counting data: Development of the SATLAS Python package
NASA Astrophysics Data System (ADS)
Gins, W.; de Groote, R. P.; Bissell, M. L.; Granados Buitrago, C.; Ferrer, R.; Lynch, K. M.; Neyens, G.; Sels, S.
2018-01-01
For the analysis of low-statistics counting experiments, a traditional nonlinear least-squares minimization routine may not always provide correct parameter and uncertainty estimates due to the assumptions inherent in the algorithms. In response to this, a user-friendly Python package (SATLAS) was written to provide an easy interface between the data and a variety of minimization algorithms suited for analyzing low- as well as high-statistics data. The advantage of this package is that it allows the user to define their own model function and then compare different minimization routines to determine the optimal parameter values and their respective (correlated) errors. Experimental validation of the different approaches in the package is done through analysis of hyperfine structure data of 203Fr gathered by the CRIS experiment at ISOLDE, CERN.
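The SATLAS interface itself is not shown in the abstract; the sketch below only illustrates the package's underlying point, comparing a naive least-squares fit with a Poisson maximum-likelihood fit on simulated low-count data using scipy. The Lorentzian peak model and all numbers are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = np.linspace(-50, 50, 101)

def model(p, x):
    amp, centre, fwhm, bkg = p
    return amp / (1 + (2 * (x - centre) / fwhm) ** 2) + bkg

true = (5.0, 3.0, 15.0, 0.5)              # low statistics: a few counts per bin
counts = rng.poisson(model(true, x))

def chi2(p):                               # naive least squares
    return np.sum((counts - model(p, x)) ** 2)

def neg_poisson_loglike(p):                # Poisson (Cash-type) statistic
    mu = model(p, x)
    if np.any(mu <= 0):
        return np.inf
    return np.sum(mu - counts * np.log(mu))

p0 = (3.0, 0.0, 10.0, 1.0)
for name, fun in [("least squares", chi2), ("Poisson MLE", neg_poisson_loglike)]:
    res = minimize(fun, p0, method="Nelder-Mead")
    print(name, np.round(res.x, 2))
```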
Anderman, E.R.; Hill, M.C.
2000-01-01
This report documents the Hydrogeologic-Unit Flow (HUF) Package for the groundwater modeling computer program MODFLOW-2000. The HUF Package is an alternative internal flow package that allows the vertical geometry of the system hydrogeology to be defined explicitly within the model using hydrogeologic units that can be different than the definition of the model layers. The HUF Package works with all the processes of MODFLOW-2000. For the Ground-Water Flow Process, the HUF Package calculates effective hydraulic properties for the model layers based on the hydraulic properties of the hydrogeologic units, which are defined by the user using parameters. The hydraulic properties are used to calculate the conductance coefficients and other terms needed to solve the ground-water flow equation. The sensitivity of the model to the parameters defined within the HUF Package input file can be calculated using the Sensitivity Process, using observations defined with the Observation Process. Optimal values of the parameters can be estimated by using the Parameter-Estimation Process. The HUF Package is nearly identical to the Layer-Property Flow (LPF) Package, the major difference being the definition of the vertical geometry of the system hydrogeology. Use of the HUF Package is illustrated in two test cases, which also serve to verify the performance of the package by showing that the Parameter-Estimation Process produces the true parameter values when exact observations are used.
Modeling and simulation of five-axis virtual machine based on NX
NASA Astrophysics Data System (ADS)
Li, Xiaoda; Zhan, Xianghui
2018-04-01
Virtual technology plays a growing role in the machinery manufacturing industry. In this paper, the Siemens NX software is used to model a virtual CNC machine tool, and the parameters of the virtual machine are defined according to the actual parameters of the machine tool so that the virtual simulation can be carried out without loss of simulation accuracy. How to use the machine builder of the CAM module to define the kinematic chain and machine components is described. The virtual machine simulation can warn users of tool collision and overcutting during the process, and can evaluate and forecast the feasibility of the technological process.
NASA Technical Reports Server (NTRS)
Pindera, Marek-Jerzy; Salzar, Robert S.
1996-01-01
A user's guide for the computer program OPTCOMP2 is presented in this report. This program provides a capability to optimize the fabrication or service-induced residual stresses in unidirectional metal matrix composites subjected to combined thermomechanical axisymmetric loading by altering the processing history, as well as through the microstructural design of interfacial fiber coatings. The user specifies the initial architecture of the composite and the load history, with the constituent materials being elastic, plastic, viscoplastic, or as defined by the 'user-defined' constitutive model, in addition to the objective function and constraints, through a user-friendly data input interface. The optimization procedure is based on an efficient solution methodology for the inelastic response of a fiber/interface layer(s)/matrix concentric cylinder model where the interface layers can be either homogeneous or heterogeneous. The response of heterogeneous layers is modeled using Aboudi's three-dimensional method of cells micromechanics model. The commercial optimization package DOT is used for the nonlinear optimization problem. The solution methodology for the arbitrarily layered cylinder is based on the local-global stiffness matrix formulation and Mendelson's iterative technique of successive elastic solutions developed for elastoplastic boundary-value problems. The optimization algorithm employed in DOT is based on the method of feasible directions.
Space station needs, attributes and architectural options. Volume 3, task 1: Mission requirements
NASA Technical Reports Server (NTRS)
1983-01-01
The mission requirements of the space station program are investigated. Mission parameters are divided into user support from private industry, scientific experimentation, U.S. national security, and space operations away from the space station. These categories define the design and use of the space station. An analysis of cost estimates is included.
Radiometric recalibration procedure for Landsat-5 Thematic Mapper data
Chander, G.; Micijevic, E.; Hayes, R.W.; Barsi, J.A.
2008-01-01
The Landsat-5 (L5) satellite was launched on March 1, 1984, with a design life of three years. Remarkably, the L5 Thematic Mapper (TM) has collected data for 23 years. Over this time the detectors have aged, and the instrument's radiometric characteristics have changed since launch. The calibration procedures and parameters have also changed with time. Revised radiometric calibrations have improved the radiometric accuracy of recently processed data; however, users whose data were processed prior to the calibration update do not benefit from the revisions. A procedure has been developed to give users the ability to recalibrate their existing Level 1 (L1) products without having to purchase reprocessed data from the U.S. Geological Survey (USGS). The accuracy of the recalibration depends on knowledge of the prior calibration applied to the data. The "Work Order" file, included with standard National Land Archive Production System (NLAPS) data products, gives the parameters that define the applied calibration: either the Internal Calibrator (IC) calibration parameters or the default prelaunch calibration, if there were problems with the IC calibration. This paper details the recalibration procedure for data processed using IC, for users who have the Work Order file.
From the desktop to the grid: scalable bioinformatics via workflow conversion.
de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver
2016-03-12
Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers the immediate benefit of identifying bottlenecks and pinpointing sections that could benefit from parallelization, among others. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address the problems of a specific community, so each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free, an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of the parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with a substantial user community: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce the time spent designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We strongly believe that our efforts not only decrease the turnaround time to obtain scientific results but also have a positive impact on reproducibility, thus elevating the quality of obtained scientific results.
Computational tools for fitting the Hill equation to dose-response curves.
Gadagkar, Sudhindra R; Call, Gerald B
2015-01-01
Many biological response curves commonly assume a sigmoidal shape that can be approximated well by the 4-parameter nonlinear logistic equation, also called the Hill equation. However, estimation of the Hill equation parameters requires access to commercial software or the ability to write computer code. Here we present two user-friendly and freely available computer programs to fit the Hill equation: a Solver-based Microsoft Excel template and a stand-alone GUI-based "point and click" program, called HEPB. Both programs use an iterative method to estimate two of the Hill equation parameters (EC50 and the Hill slope), while constraining the values of the other two parameters (the minimum and maximum asymptotes of the response variable) to fit the Hill equation to the data. In addition, HEPB draws the prediction band at a user-defined confidence level, and determines the EC50 value for each of the limits of this band to give boundary values that help objectively delineate sensitive, normal and resistant responses to the drug being tested. Both programs were tested by analyzing twelve datasets that varied widely in data values, sample size and slope, and were found to yield estimates of the Hill equation parameters essentially identical to those provided by standard software such as GraphPad Prism and nls, the statistical function in the programming language R. The Excel template provides a means to estimate the parameters of the Hill equation and plot the regression line in a familiar Microsoft Office environment, while HEPB additionally computes the prediction band at a user-defined level of confidence and determines objective cut-off values to distinguish among response types (sensitive, normal and resistant). Furthermore, HEPB can simulate 500 response values based on the range of the dose variable in the original data and the fit of the Hill equation to that data. Copyright © 2014. Published by Elsevier Inc.
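For reference, the fitting task both tools perform can be reproduced with scipy in a few lines: hold the two asymptotes fixed and estimate EC50 and the Hill slope. The dose-response points below are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

bottom, top = 0.0, 100.0                  # constrained asymptotes

def hill(dose, ec50, slope):
    # 4-parameter logistic with the asymptotes held fixed
    return bottom + (top - bottom) / (1.0 + (ec50 / dose) ** slope)

dose = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
resp = np.array([2.1, 8.0, 24.5, 51.0, 76.2, 91.3, 97.9])

(ec50, slope), cov = curve_fit(hill, dose, resp, p0=(0.3, 1.0))
err = np.sqrt(np.diag(cov))               # 1-sigma parameter uncertainties
print(f"EC50 = {ec50:.3f} +/- {err[0]:.3f}, Hill slope = {slope:.2f} +/- {err[1]:.2f}")
```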
NASA Technical Reports Server (NTRS)
Schiess, J. R.
1994-01-01
Scientific data often contain random errors that make plotting and curve-fitting difficult. The Rational-Spline Approximation with Automatic Tension Adjustment algorithm leads to a flexible, smooth representation of experimental data. The user sets the conditions for each consecutive pair of knots (knots are user-defined divisions in the data set): to apply no tension, to apply fixed tension, or to determine tension with a tension adjustment algorithm. The user also selects the number of knots, the knot abscissas, and the allowed maximum deviations from line segments. The selection of these quantities depends on the actual data and on the requirements of a particular application. This program differs from the usual spline under tension in that it allows the user to specify different tension values between each adjacent pair of knots rather than a constant tension over the entire data range. The subroutines use an automatic adjustment scheme that varies the tension parameter for each interval until the maximum deviation of the spline from the line joining the knots is less than or equal to a user-specified amount. This procedure frees the user from the drudgery of adjusting individual tension parameters while still giving control over the local behavior of the spline. The Rational Spline program was written completely in FORTRAN for implementation on a CYBER 850 operating under NOS. It has a central memory requirement of approximately 1500 words. The program was released in 1988.
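The FORTRAN source is not shown here. As a deliberately simplified stand-in for the automatic tension adjustment (not the actual rational-spline formulation), the sketch below blends a cubic spline toward the straight chord in each knot interval until the user-specified deviation bound is met.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def tensioned_eval(xk, yk, x, max_dev):
    """Per-interval tension adjustment: blend the spline toward the chord
    until its maximum deviation from the chord is <= max_dev."""
    spline = CubicSpline(xk, yk)
    chord = np.interp(x, xk, yk)          # piecewise-linear "infinite tension" limit
    y = spline(x)
    for i in range(len(xk) - 1):
        sel = (x >= xk[i]) & (x <= xk[i + 1])
        w = 0.0                           # tension weight for this interval
        while (np.max(np.abs((1 - w) * (y[sel] - chord[sel]))) > max_dev
               and w < 1.0):
            w += 0.05                     # tighten until within tolerance
        y[sel] = (1 - w) * y[sel] + w * chord[sel]
    return y

xk = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
yk = np.array([0.0, 2.0, -1.0, 1.5, 0.0])
x = np.linspace(0, 4, 201)
y = tensioned_eval(xk, yk, x, max_dev=0.5)
print("max deviation from chords:", np.abs(y - np.interp(x, xk, yk)).max().round(3))
```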
Estimating the Geocenter from GNSS Observations
NASA Astrophysics Data System (ADS)
Dach, Rolf; Michael, Meindl; Beutler, Gerhard; Schaer, Stefan; Lutz, Simon; Jäggi, Adrian
2014-05-01
The satellites of the Global Navigation Satellite Systems (GNSS) orbit the Earth according to the laws of celestial mechanics. As a consequence, the satellites are sensitive to the coordinates of the center of mass of the Earth, whereas the coordinates of the (ground) tracking stations refer to the center of figure as the conventional origin of the reference frame. The difference between the center of mass and the center of figure is the instantaneous geocenter, and by this definition global GNSS solutions are sensitive to the geocenter. Several studies demonstrated strong correlations of the GNSS-derived geocenter coordinates with parameters intended to absorb radiation pressure effects acting on the GNSS satellites, and with GNSS satellite clock parameters. One should thus ask to what extent these satellite-related parameters absorb (or hide) the geocenter information. A clean simulation study has been performed to answer this question. The simulation environment allows, in particular, introducing user-defined shifts of the geocenter (systematic inconsistencies between the satellites' and stations' reference frames). These geocenter shifts may be recovered by the mentioned parameters, provided they were set up in the analysis. If the geocenter coordinates are not estimated, one may find out which other parameters absorb the user-defined shifts of the geocenter and to what extent. Furthermore, the simulation environment allows extracting the correlation matrix from the a posteriori covariance matrix to study the correlations between different parameter types of the GNSS analysis system. Our results show high degrees of correlation between geocenter coordinates, orbit-related parameters, and satellite clock parameters. These correlations are of the same order of magnitude as the correlations between station heights, troposphere, and receiver clock parameters in any regional or global GNSS network analysis. If such correlations are accepted when estimating station coordinates, geocenter coordinates must be considered mathematically estimable in a global GNSS analysis. The geophysical interpretation may of course become difficult, e.g., if insufficient orbit models are used.
User's guide: Programs for processing altimeter data over inland seas
NASA Technical Reports Server (NTRS)
Au, A. Y.; Brown, R. D.; Welker, J. E.
1989-01-01
The programs described were developed to process GEODYN-formatted satellite altimeter data and to apply the processed results to predict geoid undulations and gravity anomalies of inland sea areas. These programs are written in standard FORTRAN 77 and are designed to run on the NSESCC IBM 3081 (MVS) computer. Because of the experimental nature of these programs, they are tailored to the geographical area analyzed. The attached program listings are customized for processing the altimeter data over the Black Sea. Users interested in the Caspian Sea data are expected to modify each program, although the required modifications are generally minor. Program control parameters are defined in the programs via PARAMETER statements and/or DATA statements. Other auxiliary parameters, such as labels, are hard-wired into the programs. Large data files are read in or written out through different input or output units. The program listings are accompanied by sample IBM job control language (JCL) images. Familiarity with IBM JCL and the TEMPLATE graphics package is assumed.
NASA Astrophysics Data System (ADS)
Zinke, Stephan
2017-02-01
Memory-sensitive applications for remote sensing data require memory-optimized data types in remote sensing products. Hierarchical Data Format version 5 (HDF5) offers user-defined floating-point numbers and integers, and the n-bit filter, to create data types optimized for memory consumption. The European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) applies a compaction scheme to the disseminated products of the Day and Night Band (DNB) data of the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument on the Suomi National Polar-orbiting Partnership (S-NPP) satellite through the EUMETSAT Advanced Retransmission Service, converting the original 32-bit floating-point numbers of the radiance dataset to user-defined floating-point numbers in combination with the n-bit filter. The radiance dataset requires a floating-point representation because of the high dynamic range of the DNB. A compression factor of 1.96 is reached by using an automatically determined exponent size and an 8-bit trailing significand, thus reducing the bandwidth requirements for dissemination. It is shown how the parameters needed for the user-defined floating-point numbers are derived or determined automatically based on the data present in a product.
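The EUMETSAT code is not shown in the abstract; the following sketch illustrates one plausible way an exponent size could be derived automatically from a product's dynamic range, keeping the 8-bit trailing significand mentioned above. The reserved-code bookkeeping and all numbers are assumptions.

```python
import numpy as np

def exponent_bits(data):
    """Smallest exponent field that spans the data's base-2 exponent range."""
    finite = data[np.isfinite(data) & (data != 0)]
    emin = int(np.floor(np.log2(np.abs(finite).min())))
    emax = int(np.floor(np.log2(np.abs(finite).max())))
    # biased exponents 1..(emax-emin+1) plus reserved all-0/all-1 codes (assumed)
    n_codes = (emax - emin + 1) + 2
    return max(2, int(np.ceil(np.log2(n_codes))))

# synthetic wide-dynamic-range "radiances" standing in for DNB data
radiance = np.random.default_rng(1).lognormal(mean=-18, sigma=3, size=10000)
bits = exponent_bits(radiance)
total = 1 + bits + 8                      # sign + exponent + 8-bit significand
print(f"exponent bits: {bits}, total bits per value: {total} (vs 32 original)")
```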
Evaluating benefits and costs of changes in water quality.
Jessica Koteen; Susan J. Alexander; John B. Loomis
2002-01-01
Water quality affects a variety of uses, such as municipal water consumption and recreation. Changes in water quality can influence the benefits water users receive. The problem is how to define water quality for specific uses. It is not possible to come up with one formal definition of water quality that fits all water uses. There are many parameters that influence...
Anderson, Jeffrey R; Barrett, Steven F
2009-01-01
Image segmentation is the process of isolating distinct objects within an image. Computer algorithms have been developed to aid in the process of object segmentation, but a completely autonomous segmentation algorithm has yet to be developed [1]. This is because computers do not have the capability to understand images and recognize complex objects within the image. However, computer segmentation methods [2], requiring user input, have been developed to quickly segment objects in serial sectioned images, such as magnetic resonance images (MRI) and confocal laser scanning microscope (CLSM) images. In these cases, the segmentation process becomes a powerful tool in visualizing the 3D nature of an object. The user input is an important part of improving the performance of many segmentation methods. A double threshold segmentation method has been investigated [3] to separate objects in gray scaled images, where the gray level of the object is among the gray levels of the background. In order to best determine the threshold values for this segmentation method the image must be manipulated for optimal contrast. The same is true of other segmentation and edge detection methods as well. Typically, the better the image contrast, the better the segmentation results. This paper describes a graphical user interface (GUI) that allows the user to easily change image contrast parameters that will optimize the performance of subsequent object segmentation. This approach makes use of the fact that the human brain is extremely effective in object recognition and understanding. The GUI provides the user with the ability to define the gray scale range of the object of interest. These lower and upper bounds of this range are used in a histogram stretching process to improve image contrast. Also, the user can interactively modify the gamma correction factor that provides a non-linear distribution of gray scale values, while observing the corresponding changes to the image. This interactive approach gives the user the power to make optimal choices in the contrast enhancement parameters.
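Setting the GUI itself aside, the two operations it exposes reduce to a few lines of array math. The sketch below is a plain-numpy version of the histogram stretch and gamma correction described above; the slider values simply become function arguments, and the test image is synthetic.

```python
import numpy as np

def stretch_and_gamma(img, lo, hi, gamma=1.0):
    """Stretch the user-chosen gray range [lo, hi] to full scale, then apply
    a non-linear gamma remap, as the interactive tool does with sliders."""
    img = img.astype(float)
    out = np.clip((img - lo) / float(hi - lo), 0.0, 1.0)   # histogram stretch
    out = out ** (1.0 / gamma)                             # gamma correction
    return (out * 255).astype(np.uint8)

rng = np.random.default_rng(2)
image = rng.integers(90, 140, size=(64, 64))   # low-contrast test image
enhanced = stretch_and_gamma(image, lo=90, hi=140, gamma=1.5)
print("input range:", image.min(), image.max(),
      "-> output range:", enhanced.min(), enhanced.max())
```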
Automatic Clustering Using FSDE-Forced Strategy Differential Evolution
NASA Astrophysics Data System (ADS)
Yasid, A.
2018-01-01
Clustering analysis is important in data mining for unsupervised data, because no adequate prior knowledge is available. One of the important tasks is defining the number of clusters without user involvement, which is known as automatic clustering. This study aims to acquire the cluster number automatically utilizing forced strategy differential evolution (AC-FSDE). Two mutation parameters, namely a constant parameter and a variable parameter, are employed to boost differential evolution performance. Four well-known benchmark datasets were used to evaluate the algorithm, and the results were compared with other state-of-the-art automatic clustering methods. The experimental results show that AC-FSDE is better than or competitive with other existing automatic clustering algorithms.
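The abstract does not spell out the forced-strategy operator itself, so the sketch below shows only a common baseline that such methods build on: an activation-threshold genome that lets the cluster count emerge from the search, plus the classic DE/rand/1 mutation. Everything here (K_MAX, F, the data) is an illustrative assumption, not the paper's algorithm.

```python
import numpy as np

K_MAX, DIM, POP, F = 5, 2, 20, 0.6
rng = np.random.default_rng(3)
# genome layout: [activation thresholds (K_MAX) | centres (K_MAX * DIM)]
pop = rng.random((POP, K_MAX + K_MAX * DIM))

def decode(genome, data):
    """Centres whose activation exceeds 0.5 are 'on'; the cluster count
    therefore emerges from the search rather than from the user."""
    act = genome[:K_MAX] > 0.5
    if act.sum() < 2:                      # force at least two clusters
        act[:2] = True
    centres = genome[K_MAX:].reshape(K_MAX, DIM)[act]
    labels = np.argmin(((data[:, None] - centres) ** 2).sum(-1), axis=1)
    return centres, labels

def mutate(pop, i):
    r1, r2, r3 = rng.choice([j for j in range(POP) if j != i], 3, replace=False)
    return pop[r1] + F * (pop[r2] - pop[r3])   # DE/rand/1 mutation

data = rng.random((100, DIM))
centres, labels = decode(pop[0], data)
print("active clusters:", len(centres), "| first mutant:", mutate(pop, 0)[:3].round(2))
```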
Parasitology tutoring system: a hypermedia computer-based application.
Theodoropoulos, G; Loumos, V
1994-02-14
The teaching of parasitology is a basic course in all life sciences curricula, and up to now no computer-assisted tutoring system has been developed for this purpose. By using Knowledge Pro, an object-oriented software development tool, a hypermedia tutoring system for teaching parasitology to college students was developed. Generally, a tutoring system contains a domain expert, a student model, a pedagogical expert and the user interface. In this project, particular emphasis was given to the user interface design and the expert knowledge representation. The system allows access to the educational material through hypermedia and indexing at the pace of the student. The hypermedia access is facilitated through key words defined as hypertext and objects in pictures defined as hyper-areas. The indexing access is based on a list of parameters that refers to various characteristics of the parasites, e.g. taxonomy, host, organ, etc. In addition, this indexing access can be used for testing the student's level of understanding. The advantages of this system are its user-friendliness, graphical interface and ability to incorporate new educational material in the area of parasitology.
Computer-Aided System Engineering and Analysis (CASE/A) Programmer's Manual, Version 5.0
NASA Technical Reports Server (NTRS)
Knox, J. C.
1996-01-01
The Computer Aided System Engineering and Analysis (CASE/A) Version 5.0 Programmer's Manual provides the programmer and user with information regarding the internal structure of the CASE/A 5.0 software system. CASE/A 5.0 is a trade study tool that provides modeling/simulation capabilities for analyzing environmental control and life support systems and active thermal control systems. CASE/A has been successfully used in studies such as the evaluation of carbon dioxide removal in the space station. CASE/A modeling provides a graphical and command-driven interface for the user. This interface allows the user to construct a model by placing equipment components in a graphical layout of the system hardware, then connect the components via flow streams and define their operating parameters. Once the equipment is placed, the simulation time and other control parameters can be set to run the simulation based on the model constructed. After completion of the simulation, graphical plots or text files can be obtained for evaluation of the simulation results over time. Additionally, users can control the simulation and extract information at various times in the simulation (e.g., control equipment operating parameters over the simulation time or extract plot data) by using "User Operations (OPS) Code." This OPS code is written in FORTRAN with a canned set of utility subroutines for performing common tasks. CASE/A version 5.0 software runs under the VAX VMS(Trademark) environment. It utilizes the Tektronix 4014(Trademark) graphics display system and the VT100(Trademark) text manipulation/display system.
Role of price and enforcement in water allocation: Insights from Game Theory
NASA Astrophysics Data System (ADS)
Souza Filho, Francisco Assis; Lall, Upmanu; Porto, Rubem La Laina
2008-12-01
As many countries are moving toward water sector reforms, practical issues of how water management institutions can better effect allocation, regulation, and enforcement of water rights have emerged. The problem of nonavailability of water to tailenders on an irrigation system in developing countries, due to unlicensed upstream diversions is well documented. The reliability of access or equivalently the uncertainty associated with water availability at their diversion point becomes a parameter that is likely to influence the application by users for water licenses, as well as their willingness to pay for licensed use. The ability of a water agency to reduce this uncertainty through effective water rights enforcement is related to the fiscal ability of the agency to monitor and enforce licensed use. In this paper, this interplay across the users and the agency is explored, considering the hydraulic structure or sequence of water use and parameters that define the users and the agency's economics. The potential for free rider behavior by the users, as well as their proposals for licensed use are derived conditional on this setting. The analyses presented are developed in the framework of the theory of "Law and Economics," with user interactions modeled as a game theoretic enterprise. The state of Ceara, Brazil, is used loosely as an example setting, with parameter values for the experiments indexed to be approximately those relevant for current decisions. The potential for using the ideas in participatory decision making is discussed. This paper is an initial attempt to develop a conceptual framework for analyzing such situations but with a focus on the reservoir-canal system water rights enforcement.
TREXMO: A Translation Tool to Support the Use of Regulatory Occupational Exposure Models.
Savic, Nenad; Racordon, Dimitri; Buchs, Didier; Gasic, Bojan; Vernez, David
2016-10-01
Occupational exposure models vary significantly in their complexity, purpose, and the level of expertise required from the user. Different parameters in the same model may lead to different exposure estimates for the same exposure situation. This paper presents a tool developed to deal with this concern: TREXMO, or TRanslation of EXposure MOdels. TREXMO integrates six commonly used occupational exposure models, namely ART v.1.5, STOFFENMANAGER(®) v.5.1, ECETOC TRA v.3, MEASE v.1.02.01, EMKG-EXPO-TOOL, and EASE v.2.0. By enabling a semi-automatic translation between the parameters of these six models, TREXMO facilitates their simultaneous use. For a given exposure situation, defined by a set of parameters in one of the models, TREXMO provides the user with the most appropriate parameters to use in the other exposure models. Results showed that, once an exposure situation and parameters were set in ART, TREXMO reduced the number of possible outcomes in the other models by 1-4 orders of magnitude. The tool should reduce uncertain entry or selection of parameters in the six models, improve between-user reliability, and reduce the time required for running several models for a given exposure situation. In addition to these advantages, registrants of chemicals and authorities should benefit from more reliable exposure estimates for the risk characterization of dangerous chemicals under the Registration, Evaluation, Authorisation and restriction of CHemicals (REACH) regulation. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
Hill, Mary C.; Banta, E.R.; Harbaugh, A.W.; Anderman, E.R.
2000-01-01
This report documents the Observation, Sensitivity, and Parameter-Estimation Processes of the ground-water modeling computer program MODFLOW-2000. The Observation Process generates model-calculated values for comparison with measured, or observed, quantities. A variety of statistics is calculated to quantify this comparison, including a weighted least-squares objective function. In addition, a number of files are produced that can be used to compare the values graphically. The Sensitivity Process calculates the sensitivity of hydraulic heads throughout the model with respect to specified parameters using the accurate sensitivity-equation method. These are called grid sensitivities. If the Observation Process is active, it uses the grid sensitivities to calculate sensitivities for the simulated values associated with the observations. These are called observation sensitivities. Observation sensitivities are used to calculate a number of statistics that can be used (1) to diagnose inadequate data, (2) to identify parameters that probably cannot be estimated by regression using the available observations, and (3) to evaluate the utility of proposed new data. The Parameter-Estimation Process uses a modified Gauss-Newton method to adjust values of user-selected input parameters in an iterative procedure to minimize the value of the weighted least-squares objective function. Statistics produced by the Parameter-Estimation Process can be used to evaluate estimated parameter values; statistics produced by the Observation Process and post-processing program RESAN-2000 can be used to evaluate how accurately the model represents the actual processes; statistics produced by post-processing program YCINT-2000 can be used to quantify the uncertainty of model simulated values. Parameters are defined in the Ground-Water Flow Process input files and can be used to calculate most model inputs, such as: for explicitly defined model layers, horizontal hydraulic conductivity, horizontal anisotropy, vertical hydraulic conductivity or vertical anisotropy, specific storage, and specific yield; and, for implicitly represented layers, vertical hydraulic conductivity. In addition, parameters can be defined to calculate the hydraulic conductance of the River, General-Head Boundary, and Drain Packages; areal recharge rates of the Recharge Package; maximum evapotranspiration of the Evapotranspiration Package; pumpage or the rate of flow at defined-flux boundaries of the Well Package; and the hydraulic head at constant-head boundaries. The spatial variation of model inputs produced using defined parameters is very flexible, including interpolated distributions that require the summation of contributions from different parameters. Observations can include measured hydraulic heads or temporal changes in hydraulic heads, measured gains and losses along head-dependent boundaries (such as streams), flows through constant-head boundaries, and advective transport through the system, which generally would be inferred from measured concentrations. MODFLOW-2000 is intended for use on any computer operating system. The program consists of algorithms programmed in Fortran 90, which efficiently performs numerical calculations and is fully compatible with the newer Fortran 95. The code is easily modified to be compatible with FORTRAN 77. Coordination for multiple processors is accommodated using Message Passing Interface (MPI) commands. 
The program is designed in a modular fashion that is intended to support inclusion of new capabilities.
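As a compact illustration of the Gauss-Newton idea the Parameter-Estimation Process is built on (stripped of MODFLOW-2000's damping, scaling and convergence logic), here is a minimal damped update for a weighted least-squares objective S(b) = rᵀWr. The exponential test problem is invented for the demonstration.

```python
import numpy as np

def gauss_newton(residual, jacobian, b, W, n_iter=10, damp=1e-8):
    """Iterate b <- b + (J^T W J + damp*I)^-1 J^T W r for weighted residuals."""
    for _ in range(n_iter):
        r = residual(b)                 # observed-minus-simulated values
        J = jacobian(b)                 # sensitivities d(simulated)/d(parameter)
        A = J.T @ W @ J + damp * np.eye(len(b))
        b = b + np.linalg.solve(A, J.T @ W @ r)
    return b

# Tiny synthetic problem: fit y = b0 * exp(-b1 * t) to weighted observations.
t = np.linspace(0, 4, 9)
obs = 3.0 * np.exp(-0.7 * t)
W = np.diag(np.full(t.size, 4.0))       # weights ~ 1/variance (assumed)

res = lambda b: obs - b[0] * np.exp(-b[1] * t)
jac = lambda b: np.column_stack([np.exp(-b[1] * t),
                                 -b[0] * t * np.exp(-b[1] * t)])

print(gauss_newton(res, jac, np.array([1.0, 0.1]), W).round(3))  # ~ [3.0, 0.7]
```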
A sectionwise defined model for the material description of 100Cr6 in the thixotropic state
NASA Astrophysics Data System (ADS)
Behrens, B.-A.; Chugreev, A.; Hootak, M.
2018-05-01
A sectionwise defined material model has been developed for the numerical description of thixoforming processes. It consists of two sections: the first describes the material behaviour below the solidus temperature and comprises an approach from structural mechanics, whereas the second describes the thixotropic behaviour above the solidus temperature based on the Ostwald-de Waele power law. The material model has been implemented in the commercial FE software Simufact Forming by means of user-defined subroutines. Numerical and experimental investigations of special upsetting tests have been designed and carried out with Armco iron-coated specimens. Finally, the model parameters were fitted by reverse engineering.
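The subroutine itself is not shown in the abstract; the sketch below captures only the sectionwise branching logic described above, with a trivial softening law below the solidus and the Ostwald-de Waele power law above it. All constants are invented placeholders, not the fitted 100Cr6 parameters.

```python
T_SOLIDUS = 1620.0        # K, assumed solidus temperature
K_CONSIST = 2.0e4         # Pa*s^m, assumed power-law consistency factor
M_EXP = 0.3               # assumed power-law exponent
YIELD0 = 350.0e6          # Pa, assumed cold yield stress

def flow_stress(temp_K, strain_rate):
    if temp_K < T_SOLIDUS:
        # structural-mechanics branch: yield stress softening with temperature
        return YIELD0 * max(0.05, 1.0 - temp_K / T_SOLIDUS)
    # thixotropic branch: Ostwald-de Waele law sigma = K * rate^m
    return K_CONSIST * strain_rate ** M_EXP

for T, rate in [(1200.0, 1.0), (1650.0, 1.0), (1650.0, 10.0)]:
    print(f"T={T:6.0f} K, rate={rate:4.1f} 1/s -> {flow_stress(T, rate):.3e} Pa")
```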
Preference Mining Using Neighborhood Rough Set Model on Two Universes.
Zeng, Kai
2016-01-01
Preference mining plays an important role in e-commerce and video websites for enhancing user satisfaction and loyalty. Some classical methods are not applicable to the cold-start problem, in which the user or the item is new. In this paper, we propose a new model, called parametric neighborhood rough set on two universes (NRSTU), to describe the user and item data structures. Furthermore, the neighborhood lower approximation operator is used to define the preference rules, and these rules then provide the means for recommending items to users. Finally, we give an experimental example to show the details of NRSTU-based preference mining for the cold-start problem. The parameters of the model are also discussed. The experimental results show that the proposed method presents an effective solution for preference mining. In particular, NRSTU improves the recommendation accuracy by about 19% compared to the traditional method.
Waste treatability guidance program. User's guide. Revision 0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toth, C.
1995-12-21
DOE sites across the country generate and manage radioactive, hazardous, mixed, and sanitary wastes. It is necessary for each site to find the technologies and associated capacities required to manage its waste. One role of the DOE HQ Office of Environmental Restoration and Waste Management is to facilitate the integration of the site-specific plans into coherent national plans. DOE has developed a standard methodology for defining and categorizing waste streams into treatability groups based on characteristic parameters that influence waste management technology needs. This Waste Treatability Guidance Program automates the Guidance Document for the categorization of waste information into treatability groups; this application provides a consistent implementation of the methodology across the National TRU Program. This User's Guide provides instructions on how to use the program, including installation instructions and program operation. This document satisfies the requirements of the Software Quality Assurance Plan.
Recommended Parameter Values for GENII Modeling of Radionuclides in Routine Air and Water Releases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snyder, Sandra F.; Arimescu, Carmen; Napier, Bruce A.
The GENII v2 code is used to estimate dose to individuals or populations from the release of radioactive materials into air or water. Numerous parameter values are required as input to this code. User-defined parameters cover the spectrum from chemical data, meteorological data, agricultural data, and behavioral data. This document is a summary of parameter values that reflect conditions in the United States. Reasonable regional and age-dependent data are summarized, although data availability and quality vary. The set of parameters described addresses scenarios for chronic air emissions or chronic releases to public waterways. Considerations for the special tritium and carbon-14 models are briefly addressed. GENII v2.10.0 is the current software version that this document supports.
Experimental setup for evaluating an adaptive user interface for teleoperation control
NASA Astrophysics Data System (ADS)
Wijayasinghe, Indika B.; Peetha, Srikanth; Abubakar, Shamsudeen; Saadatzi, Mohammad Nasser; Cremer, Sven; Popa, Dan O.
2017-05-01
A vital part of human interaction with a machine is the control interface, which single-handedly can define user satisfaction and the efficiency of performing a task. This paper describes the implementation of an experimental setup for studying an adaptive algorithm that can help the user better tele-operate a robot. The formulation of the adaptive interface and the associated learning algorithms is general enough to apply when the mapping between the user controls and the robot actuators is complex and/or ambiguous. The method uses a genetic algorithm to find the optimal parameters that produce the input-output mapping for teleoperation control. We describe the experimental setup and the associated results used to validate the adaptive interface on a differential drive robot with two different input devices: a joystick and a Myo gesture control armband. Results show that after the learning phase, the interface converges to an intuitive mapping that can help even inexperienced users drive the system to a goal location.
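The paper's GA, robot and interfaces are not reproduced here; as a toy version of the idea, the sketch below evolves two mapping parameters, a gain and a deadzone, so that a 1-D user input is converted into velocity commands that match recorded ideal commands. All data and GA settings are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
inputs = rng.uniform(-1, 1, 200)                            # recorded stick deflections
ideal = np.where(np.abs(inputs) < 0.1, 0.0, 2.0 * inputs)   # desired commands

def apply_map(p, u):
    gain, dead = p
    return np.where(np.abs(u) < dead, 0.0, gain * u)

def fitness(p):
    # negative mean squared tracking error: higher is better
    return -np.mean((apply_map(p, inputs) - ideal) ** 2)

pop = rng.uniform([0.0, 0.0], [4.0, 0.5], size=(30, 2))
for _ in range(40):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-10:]]                 # truncation selection
    children = parents[rng.integers(0, 10, 30)] + rng.normal(0, 0.05, (30, 2))
    children[:, 1] = np.clip(children[:, 1], 0.0, 0.5)      # keep deadzone valid
    pop = children

best = pop[np.argmax([fitness(p) for p in pop])]
print("learned gain=%.2f deadzone=%.2f (target 2.0, 0.1)" % tuple(best))
```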
Raguin, Olivier; Gruaz-Guyon, Anne; Barbet, Jacques
2002-11-01
An add-in to Microsoft Excel was developed to simulate multiple binding equilibria. A partition function, readily written even when the equilibrium is complex, describes the experimental system. It involves the concentrations of the different free molecular species and of the different complexes present in the experiment. As a result, the software is not restricted to a series of predefined experimental setups but can handle a large variety of problems involving up to nine independent molecular species. Binding parameters are estimated by nonlinear least-squares fitting of experimental measurements supplied by the user. The fitting process allows user-defined weighting of the experimental data. The flexibility of the software, and the way it may be used to describe common experimental situations and to deal with usual problems such as tracer reactivity or nonspecific binding, is demonstrated by a few examples. The software is available free of charge upon request.
Future heavy duty trucking engine requirements
NASA Technical Reports Server (NTRS)
Strawhorn, L. W.; Suski, V. A.
1985-01-01
Developers of advanced heavy-duty diesel engines are probing the opportunities presented by new materials and techniques. This process is technology driven, but there is no assurance that the eventual users of the engines so developed will be comfortable with them, nor, indeed, that those consumers will continue to exist in either the same form or numbers as they do today. To ensure maximum payoff of research dollars, the equipment development process must consider user needs. This study defines motor carrier concerns, cost tolerances, and the engine parameters that match projected future industry needs. The approach taken is explained and the results are presented. The material comes from a survey of motor carrier fleets and provides indications of the role of heavy-duty vehicles in the 1998 period and their desired maintenance and engine performance parameters.
Hands-on parameter search for neural simulations by a MIDI-controller.
Eichner, Hubert; Borst, Alexander
2011-01-01
Computational neuroscientists frequently encounter the challenge of parameter fitting: exploring a usually high-dimensional variable space to find a parameter set that reproduces an experimental data set. One common approach is using automated search algorithms such as gradient descent or genetic algorithms. However, these approaches suffer from several shortcomings related to their lack of understanding of the underlying question, such as defining a suitable error function or getting stuck in local minima. Another widespread approach is manual parameter fitting using a keyboard or a mouse, evaluating different parameter sets following the user's intuition. However, this process is often cumbersome and time-intensive. Here, we present a new method for manual parameter fitting. A MIDI controller provides input to the simulation software, where model parameters are then tuned according to the knob and slider positions on the device. The model is immediately updated on every parameter change, continuously plotting the latest results. Given reasonably short simulation times of less than one second, we find this method to be highly efficient in quickly determining good parameter sets. Our approach bears a close resemblance to tuning the sound of an analog synthesizer, giving the user a very good intuition of the problem at hand, such as immediate feedback on whether and how results are affected by specific parameter changes. In addition to its use in research, our approach should be an ideal teaching tool, allowing students to interactively explore complex models such as Hodgkin-Huxley or dynamical systems.
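The authors' software is not reproduced here; the sketch below shows the core pattern using the mido library (assumed installed, with a controller attached): knob positions arriving as MIDI control-change messages are mapped onto parameter ranges, and the model is re-run on every change. The knob numbers, parameter names and the stub simulation are placeholders.

```python
import mido  # assumes the mido library and a connected MIDI controller

# knob number -> (parameter name, low, high); assignments are assumptions
PARAM_MAP = {
    21: ("g_Na", 0.0, 120.0),   # knob 21 -> sodium conductance
    22: ("g_K",  0.0,  36.0),   # knob 22 -> potassium conductance
}
params = {"g_Na": 120.0, "g_K": 36.0}

def run_simulation(p):
    # stand-in for the actual neural model and plotting
    print("re-simulating with", {k: round(v, 1) for k, v in p.items()})

with mido.open_input() as port:          # default MIDI input port
    for msg in port:                     # blocks, yielding incoming messages
        if msg.type == "control_change" and msg.control in PARAM_MAP:
            name, lo, hi = PARAM_MAP[msg.control]
            params[name] = lo + (hi - lo) * msg.value / 127.0  # 0..127 -> range
            run_simulation(params)       # immediate feedback on every change
```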
General Tool for Evaluating High-Contrast Coronagraphic Telescope Performance Error Budgets
NASA Technical Reports Server (NTRS)
Marchen, Luis F.
2011-01-01
The Coronagraph Performance Error Budget (CPEB) tool automates many of the key steps required to evaluate the scattered starlight contrast in the dark hole of a space-based coronagraph. The tool uses a Code V prescription of the optical train, and uses MATLAB programs to call ray-trace code that generates linear beam-walk and aberration sensitivity matrices for motions of the optical elements and line-of-sight pointing, with and without controlled fine-steering mirrors (FSMs). The sensitivity matrices are imported by macros into Excel 2007, where the error budget is evaluated. The user specifies the particular optics of interest, and chooses the quality of each optic from a predefined set of PSDs. The spreadsheet creates a nominal set of thermal and jitter motions, and combines that with the sensitivity matrices to generate an error budget for the system. CPEB also contains a combination of form and ActiveX controls with Visual Basic for Applications code to allow for user interaction in which the user can perform trade studies such as changing engineering requirements, and identifying and isolating stringent requirements. It contains summary tables and graphics that can be instantly used for reporting results in view graphs. The entire process to obtain a coronagraphic telescope performance error budget has been automated into three stages: conversion of optical prescription from Zemax or Code V to MACOS (in-house optical modeling and analysis tool), a linear models process, and an error budget tool process. The first process was improved by developing a MATLAB package based on the Class Constructor Method with a number of user-defined functions that allow the user to modify the MACOS optical prescription. The second process was modified by creating a MATLAB package that contains user-defined functions that automate the process. The user interfaces with the process by utilizing an initialization file where the user defines the parameters of the linear model computations. Other than this, the process is fully automated. The third process was developed based on the Terrestrial Planet Finder coronagraph Error Budget Tool, but was fully automated by using VBA code, form, and ActiveX controls.
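CPEB's spreadsheet machinery aside, the roll-up it automates can be caricatured in a few lines: multiply sensitivities by assumed motions and combine the terms. The matrices below are random placeholders, and the root-sum-square combination shown is one common budgeting convention, not necessarily CPEB's exact rule.

```python
import numpy as np

rng = np.random.default_rng(5)
n_optics, n_modes = 6, 3
# sensitivity[i, j]: contrast change per unit motion of optic i in mode j (assumed)
sensitivity = np.abs(rng.normal(0, 1e-10, (n_optics, n_modes)))
motions = np.abs(rng.normal(0, 0.5, (n_optics, n_modes)))   # allocated motions (assumed)

contrib = (sensitivity * motions) ** 2          # per-optic, per-mode terms
total_contrast = np.sqrt(contrib.sum())         # root-sum-square roll-up
worst = np.unravel_index(contrib.argmax(), contrib.shape)
print(f"total contrast ~ {total_contrast:.2e}; "
      f"driving term: optic {worst[0]}, mode {worst[1]}")
```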
Parametric analysis and temperature effect of deployable hinged shells using shape memory polymers
NASA Astrophysics Data System (ADS)
Tao, Ran; Yang, Qing-Sheng; He, Xiao-Qiao; Liew, Kim-Meow
2016-11-01
Shape memory polymers (SMPs) are a class of intelligent materials defined by their capacity to store a temporary shape and recover an original shape. In this work, the shape memory effect of an SMP deployable hinged shell is simulated using a compiled user-defined material (UMAT) subroutine in ABAQUS. Variations of the bending moment and strain energy of the hinged shells with different temperatures and structural parameters during loading are given, and the effects of these parameters and of temperature on the nonlinear deformation process are emphasized. The entire thermodynamic cycle of the SMP deployable hinged shell includes loading at high temperature, load carrying with cooling, unloading at low temperature, and recovering the original shape with heating. The results show that the complicated thermo-mechanical deformation and shape memory effect of the SMP deployable hinge are influenced by the structural parameters and temperature. The prospects for designing SMP smart hinged structures for practical applications are discussed.
Case studies on optimization problems in MATLAB and COMSOL multiphysics by means of the livelink
NASA Astrophysics Data System (ADS)
Ozana, Stepan; Pies, Martin; Docekal, Tomas
2016-06-01
LiveLink for COMSOL is a tool that integrates COMSOL Multiphysics with MATLAB to extend one's modeling with script programming in the MATLAB environment. It allows the user to utilize the full power of MATLAB and its toolboxes in preprocessing, model manipulation, and post-processing. First, a head script launches COMSOL with MATLAB, defines the initial values of all parameters, refers to the objective function J, and creates and runs the defined optimization task. Once the task is launched, the COMSOL model is called in the iteration loop (from the MATLAB environment through the API interface), changing the defined optimization parameters so that the objective function is minimized; the fmincon function is used to find a local or global minimum of a constrained linear or nonlinear multivariable function. Once the minimum is found, the task returns an exit flag, terminates the optimization, and returns the optimized values of the parameters. The cooperation with MATLAB via LiveLink combines a powerful computational environment with complex multiphysics simulations. The paper introduces the use of LiveLink for COMSOL in chosen case studies in the fields of technical cybernetics and bioengineering.
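The head-script pattern described above is MATLAB-specific; as a hedged analog, the sketch below reproduces the same loop structure in Python, with scipy's bounded minimizer standing in for fmincon and a quadratic stub standing in for the COMSOL model call.

```python
import numpy as np
from scipy.optimize import minimize

def run_model(params):
    # stand-in for invoking the FE model through an API and extracting
    # a scalar quantity from the solution
    x, y = params
    return (x - 1.2) ** 2 + 3.0 * (y + 0.4) ** 2

def objective(params):
    return run_model(params)         # objective J evaluated per model run

x0 = np.array([0.0, 0.0])            # initial parameter values
bounds = [(-5, 5), (-5, 5)]          # constrained search, as with fmincon
result = minimize(objective, x0, method="L-BFGS-B", bounds=bounds)
print("optimized parameters:", result.x.round(3), "| converged:", result.success)
```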
User interfaces in space science instrumentation
NASA Astrophysics Data System (ADS)
McCalden, Alec John
This thesis examines user interaction with instrumentation in the specific context of space science. It gathers together existing practice in machine interfaces with a look at potential future usage and recommends a new approach to space science projects with the intention of maximising their science return. It first takes a historical perspective on user interfaces and ways of defining and measuring the science return of a space instrument. Choices of research methodology are considered. Implementation details such as the concepts of usability, mental models, affordance and presentation of information are described, and examples of existing interfaces in space science are given. A set of parameters for use in analysing and synthesizing a user interface is derived by using a set of case studies of diverse failures and from previous work. A general space science user analysis is made by looking at typical practice, and an interview plus persona technique is used to group users with interface designs. An examination is made of designs in the field of astronomical instrumentation interfaces, showing the evolution of current concepts and including ideas capable of sustaining progress in the future. The parameters developed earlier are then tested against several established interfaces in the space science context to give a degree of confidence in their use. The concept of a simulator that is used to guide the development of an instrument over the whole lifecycle is described, and the idea is proposed that better instrumentation would result from more efficient use of the resources available. The previous ideas in this thesis are then brought together to describe a proposed new approach to a typical development programme, with an emphasis on user interaction. The conclusion shows that there is significant room for improvement in the science return from space instrumentation by attention to the user interface.
NEMAR plotting computer program
NASA Technical Reports Server (NTRS)
Myler, T. R.
1981-01-01
A FORTRAN coded computer program which generates CalComp plots of trajectory parameters is examined. The trajectory parameters are calculated and placed on a data file by the Near Earth Mission Analysis Routine computer program. The plot program accesses the data file and generates the plots as defined by inputs to the plot program. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included. Although this plot program utilizes a random access data file, a data file of the same type and formatted in 102 numbers per record could be generated by any computer program and used by this plot program.
The use of Graphic User Interface for development of a user-friendly CRS-Stack software
NASA Astrophysics Data System (ADS)
Sule, Rachmat; Prayudhatama, Dythia; Perkasa, Muhammad D.; Hendriyana, Andri; Fatkhan; Sardjito; Adriansyah
2017-04-01
The development of a user-friendly Common Reflection Surface (CRS) Stack software built with a Graphical User Interface (GUI) is described in this paper. The original CRS-Stack software developed by the WIT Consortium is compiled in the unix/linux environment and is not user-friendly: the user must write commands and parameters manually in a script file. Because of this limitation the CRS-Stack has not become a popular method, although applying it is a promising way to obtain better seismic sections with better reflector continuity and S/N ratio. After obtaining successful results, tested with several seismic data sets belonging to oil companies in Indonesia, the idea arose to develop a user-friendly software in our own laboratory. A Graphical User Interface (GUI) is a type of user interface that allows people to interact with computer programs in a better way: rather than typing commands and module parameters, a GUI lets users operate computer programs much more simply and easily by transforming the text-based interface into graphical icons and visual indicators. The use of complicated seismic unix shell scripts can thus be avoided. The Java Swing GUI library is used to develop this CRS-Stack GUI, and every shell script that represents a seismic processing step is invoked from the Java environment. Besides providing an interactive GUI to perform CRS-Stack processing, the software is designed to help geophysicists manage projects with complex seismic processing procedures. The CRS-Stack GUI software is composed of an input directory, operators, and an output directory, which together define a seismic data processing workflow. The CRS-Stack processing workflow involves four steps: automatic CMP stack, initial CRS-Stack, optimized CRS-Stack, and CRS-Stack Supergather. These operations are visualized in an informative flowchart with a self-explanatory system that guides the user in entering the parameter values for each operation. The knowledge of the CRS-Stack processing procedure is preserved in the software, which makes it easy and efficient to learn. The software will continue to be developed, and new innovative seismic processing workflows will be added to this GUI software.
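Stripped of the Java Swing layer, the execution side of such a GUI amounts to chaining the four workflow stages; a minimal sketch follows, in which the script names and the shared parameter file are invented stand-ins, not the project's actual files.

```python
import subprocess

STAGES = [
    "01_automatic_cmp_stack.sh",
    "02_initial_crs_stack.sh",
    "03_optimized_crs_stack.sh",
    "04_crs_supergather.sh",
]

for stage in STAGES:
    print(f"running {stage} ...")
    try:
        # each stage reads its parameters from a shared config the GUI prepares
        result = subprocess.run(["sh", stage, "params.cfg"],
                                capture_output=True, text=True)
    except FileNotFoundError:
        print(f"{stage} not found; aborting workflow")
        break
    if result.returncode != 0:
        print(f"{stage} failed:\n{result.stderr}")
        break
```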
Carey, Ryan M.; Wachowiak, Matt
2009-01-01
Sniffing has long been thought to play a critical role in shaping neural responses to odorants at multiple levels of the nervous system. However, it has been difficult to systematically examine how particular parameters of sniffing behavior shape odorant-evoked activity, in large part because of the complexity of sniffing behavior and the difficulty in reproducing this behavior in an anesthetized or reduced preparation. Here we present a method for generating naturalistic sniffing patterns in such preparations. The method involves a nasal ventilator whose movement is controlled by an analog command voltage. The command signal may consist of intranasal pressure transients recorded from awake rats and mice or user-defined waveforms. This “sniff playback” device generates intranasal pressure and airflow transients in anesthetized animals that approximate those recorded from the awake animal and are reproducible across trials and across preparations. The device accurately reproduces command waveforms over an amplitude range of approximately 1 log unit and up to frequencies of approximately 12 Hz. Further, odorant-evoked neural activity imaged during sniff playback appears similar to that seen in awake animals. This method should prove useful in investigating how the parameters of odorant sampling shape neural responses in a variety of experimental settings. PMID:18791186
Predicting online ratings based on the opinion spreading process
NASA Astrophysics Data System (ADS)
He, Xing-Sheng; Zhou, Ming-Yang; Zhuo, Zhao; Fu, Zhong-Qian; Liu, Jian-Guo
2015-10-01
Predicting users' online ratings is a challenging issue that has drawn much attention. In this paper, we present a rating prediction method that combines the user opinion spreading process with the collaborative filtering algorithm, where user similarity is defined by measuring the amount of opinion a user transfers to another based on the primitive user-item rating matrix. The proposed method produces a more precise rating prediction for each unrated user-item pair. In addition, we introduce a tunable parameter λ to regulate the preferential diffusion with respect to the degree of both the opinion sender and receiver. The numerical results for the Movielens and Netflix data sets show that this algorithm achieves better accuracy than the standard user-based collaborative filtering algorithm using Cosine and Pearson correlation, without increasing computational complexity. By tuning λ, our method can further boost the prediction accuracy measured by Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). In the optimal cases, on the Movielens and Netflix data sets, the corresponding accuracies (MAE and RMSE) are improved by 11.26% and 8.84%, and by 13.49% and 10.52%, respectively, compared to the item average method.
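The paper's exact transfer rule is not given in the abstract; the sketch below shows one plausible reading of the idea: opinion mass diffusing between users through co-rated items, damped by user degree via a tunable λ, feeding a standard user-based CF prediction. The rating matrix is a toy example.

```python
import numpy as np

R = np.array([[5, 3, 0, 1],          # toy user-item rating matrix (0 = unrated)
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
A = (R > 0).astype(float)            # adjacency: who rated what
lam = 0.5                            # tunable degree-damping parameter

k_user = A.sum(1)                    # user degrees
k_item = A.sum(0)                    # item degrees
# opinion user v receives from u: spread over each co-rated item's degree,
# damped by the degrees of sender and receiver via lambda (assumed form)
S = (A / k_item) @ A.T / (np.outer(k_user, k_user) ** lam)
np.fill_diagonal(S, 0.0)

def predict(u, i):
    raters = A[:, i] > 0             # users who rated item i
    w = S[u, raters]
    return np.sum(w * R[raters, i]) / (w.sum() + 1e-12)

print("predicted rating of user 0 for item 2: %.2f" % predict(0, 2))
```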
GDF v2.0, an enhanced version of GDF
NASA Astrophysics Data System (ADS)
Tsoulos, Ioannis G.; Gavrilis, Dimitris; Dermatas, Evangelos
2007-12-01
An improved version of the function estimation program GDF is presented. The main enhancements of the new version include: multi-output function estimation, the capability of defining custom functions in the grammar, and selection of the error function. The new version has been evaluated on a series of classification and regression datasets that are widely used for the evaluation of such methods. It is compared to two known neural networks and outperforms them on 5 (out of 10) datasets. Program summary Title of program: GDF v2.0 Catalogue identifier: ADXC_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXC_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 98 147 No. of bytes in distributed program, including test data, etc.: 2 040 684 Distribution format: tar.gz Programming language: GNU C++ Computer: The program is designed to be portable to all systems running the GNU C++ compiler Operating system: Linux, Solaris, FreeBSD RAM: 200000 bytes Classification: 4.9 Does the new version supersede the previous version?: Yes Nature of problem: The technique of function estimation tries to discover, from a series of input data, a functional form that best describes them. This can be performed with the use of parametric models, whose parameters can adapt according to the input data. Solution method: Functional forms are created by genetic programming as approximations for the symbolic regression problem. Reasons for new version: The GDF package was extended in order to be more flexible and user-customizable than the old package. The user can extend the package by defining his own error functions, and he can extend the grammar of the package by adding new functions to the function repertoire. Also, the new version can perform function estimation of multi-output functions and can be used for classification problems. Summary of revisions: The following features have been added to the GDF package: Multi-output function approximation, so that the package can now approximate any function f: R^n → R^m; this feature also gives the package the capability of performing classification and not only regression. User-defined functions can be added to the repertoire of the grammar, extending the regression capabilities of the package; this feature is limited to 3 functions, but this number can easily be increased. Capability of selecting the error function: apart from the mean square error, the package now offers other error functions, such as the mean absolute error and the maximum square error, and user-defined error functions can be added to the set of error functions. More verbose output: the main program displays more information to the user as well as the default values of the parameters, and the package gives the user the capability to define an output file where the output of the gdf program for the testing set is stored after the process terminates. Additional comments: A technical report describing the revisions, experiments and test runs is packaged with the source code. Running time: Depends on the training data.
Milewski, Marek C; Kamel, Karol; Kurzynska-Kokorniak, Anna; Chmielewski, Marcin K; Figlerowicz, Marek
2017-10-01
Experimental methods based on DNA and RNA hybridization, such as multiplex polymerase chain reaction, multiplex ligation-dependent probe amplification, or microarray analysis, require the use of mixtures of multiple oligonucleotides (primers or probes) in a single test tube. To provide an optimal reaction environment, minimal self- and cross-hybridization must be achieved among these oligonucleotides. To address this problem, we developed EvOligo, which is a software package that provides the means to design and group DNA and RNA molecules with defined lengths. EvOligo combines two modules. The first module performs oligonucleotide design, and the second module performs oligonucleotide grouping. The software applies a nearest-neighbor model of nucleic acid interactions coupled with a parallel evolutionary algorithm to construct individual oligonucleotides, and to group the molecules that are characterized by the weakest possible cross-interactions. To provide optimal solutions, the evolutionary algorithm sorts oligonucleotides into sets, preserves preselected parts of the oligonucleotides, and shapes their remaining parts. In addition, the oligonucleotide sets can be designed and grouped based on their melting temperatures. For the user's convenience, EvOligo is provided with a user-friendly graphical interface. EvOligo was used to design individual oligonucleotides, oligonucleotide pairs, and groups of oligonucleotide pairs that are characterized by the following parameters: (1) weaker cross-interactions between the non-complementary oligonucleotides and (2) more uniform ranges of the oligonucleotide pair melting temperatures than other available software products. In addition, in contrast to other grouping algorithms, EvOligo offers time-efficient sorting of paired and unpaired oligonucleotides based on various parameters defined by the user.
Wake Vortex Inverse Model User's Guide
NASA Technical Reports Server (NTRS)
Lai, David; Delisi, Donald
2008-01-01
NorthWest Research Associates (NWRA) has developed an inverse model for inverting landing aircraft vortex data. The data used for the inversion are the time evolution of the lateral transport position and vertical position of both the port and starboard vortices. The inverse model performs iterative forward model runs using various estimates of vortex parameters, vertical crosswind profiles, and vortex circulation as a function of wake age. Forward model predictions of lateral transport and altitude are then compared with the observed data. Differences between the data and model predictions guide the choice of vortex parameter values, crosswind profile and circulation evolution in the next iteration. Iterations are performed until a user-defined criterion is satisfied. Currently, the inverse model is set to stop when the improvement in the rms deviation between the data and model predictions is less than 1 percent for two consecutive iterations. The forward model used in this inverse model is a modified version of the Shear-APA model. A detailed description of this forward model, the inverse model, and its validation are presented in a different report (Lai, Mellman, Robins, and Delisi, 2007). This document is a User's Guide for the Wake Vortex Inverse Model. Section 2 presents an overview of the inverse model program. Execution of the inverse model is described in Section 3. When executing the inverse model, a user is requested to provide the name of an input file which contains the inverse model parameters, the various datasets, and directories needed for the inversion. A detailed description of the list of parameters in the inversion input file is presented in Section 4. A user has an option to save the inversion results of each lidar track in a mat-file (a condensed data file in Matlab format). These saved mat-files can be used for post-inversion analysis. A description of the contents of the saved files is given in Section 5. An example of an inversion input file, with preferred parameter values, is given in Appendix A. An example of the plot generated at a normal completion of the inversion is shown in Appendix B.
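The stopping rule described above is easy to state schematically; the following Python sketch is illustrative only, with the forward model and the parameter-update rule as placeholders rather than NWRA's implementation:

```python
def invert(data, params, forward_model, update_params, tol=0.01):
    """Iterate forward-model runs until the rms improvement is below
    tol (1 percent) for two consecutive iterations."""
    rms_prev = None
    small_improvements = 0
    while small_improvements < 2:
        prediction = forward_model(params)          # lateral transport + altitude
        rms = ((prediction - data) ** 2).mean() ** 0.5
        improvement = 1.0 if rms_prev is None else (rms_prev - rms) / rms_prev
        small_improvements = small_improvements + 1 if improvement < tol else 0
        params = update_params(params, prediction, data)
        rms_prev = rms
    return params
```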
ISOT_Calc: A versatile tool for parameter estimation in sorption isotherms
NASA Astrophysics Data System (ADS)
Beltrán, José L.; Pignatello, Joseph J.; Teixidó, Marc
2016-09-01
Geochemists and soil chemists commonly use parametrized sorption data to assess the transport and impact of pollutants in the environment. However, this evaluation is often hampered by a lack of detailed sorption data analysis, which leads to inaccurate transport modeling. To this end, we present a novel software tool to precisely analyze and interpret sorption isotherm data. Our tool, coded in Visual Basic for Applications (VBA), operates embedded within the Microsoft Excel™ environment. It consists of a user-defined function named ISOT_Calc, complemented by an optimization Excel macro (Ref_GN_LM). The ISOT_Calc function estimates the solute equilibrium concentration in the aqueous and solid phases (Ce and q, respectively). Hence, it offers a very flexible way to optimize the sorption isotherm parameters, as the optimization can be carried out over the residuals of q, Ce, or both simultaneously (i.e., orthogonal distance regression). The function includes the most common sorption isotherm models as predefined equations, as well as the possibility to easily introduce custom-defined ones. The Ref_GN_LM macro performs the parameter optimization using a Levenberg-Marquardt modified Gauss-Newton iterative procedure. In order to evaluate the performance of the presented tool, both the function and the optimization macro have been applied to different sorption data examples described in the literature. Results showed that the optimization of the isotherm parameters was successfully achieved in all cases, indicating the robustness and reliability of the developed tool. Thus, the presented software tool, available to researchers and students for free, has proven to be a user-friendly and interesting alternative to conventional fitting tools used in sorption data analysis.
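For illustration, the same kind of fit can be reproduced outside Excel. A minimal sketch using SciPy's least-squares solver on a Langmuir isotherm, fitting over the residuals of q (the data values below are made up):

```python
import numpy as np
from scipy.optimize import least_squares

def langmuir(Ce, qmax, KL):
    """Langmuir isotherm: q = qmax * KL * Ce / (1 + KL * Ce)."""
    return qmax * KL * Ce / (1.0 + KL * Ce)

# hypothetical equilibrium data (Ce in mg/L, q in mg/g)
Ce = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
q = np.array([4.8, 8.1, 12.4, 18.0, 21.5, 23.9])

# fit over the residuals of q (ISOT_Calc can also use Ce or both)
res = least_squares(lambda p: langmuir(Ce, *p) - q, x0=[25.0, 0.2])
qmax_fit, KL_fit = res.x
print(f"qmax = {qmax_fit:.2f} mg/g, KL = {KL_fit:.3f} L/mg")
```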
NASA Astrophysics Data System (ADS)
Roy, S. G.; Koons, P. O.; Gerbi, C. C.; Capps, D. K.; Tucker, G. E.; Rogers, Z. A.
2014-12-01
Sophisticated numerical tools exist for modeling geomorphic processes and linking them to tectonic and climatic systems, but they are often seen as inaccessible for users with an exploratory level of interest. We have improved the accessibility of landscape evolution models by producing a simple graphical user interface (GUI) that takes advantage of the Channel-Hillslope Integrated Landscape Development (CHILD) model. Model access is flexible: the user can edit values for basic geomorphic, tectonic, and climate parameters, or obtain greater control by defining the spatiotemporal distributions of those parameters. Users can make educated predictions by choosing their own parametric values for the governing equations and interpreting the results immediately through model graphics. This method of modeling allows users to iteratively build their understanding through experimentation. Use of this GUI is intended for inquiry and discovery-based learning activities. We discuss a number of examples of how the GUI can be used at the upper high school, introductory university, and advanced university levels. Effective teaching modules initially focus on an inquiry-based example guided by the instructor. As students become familiar with the GUI and the CHILD model, the class can shift to more student-centered exploration and experimentation. To make model interpretations more robust, digital elevation models can be imported and direct comparisons can be made between CHILD model results and natural topography. The GUI is available online through the University of Maine's Earth and Climate Sciences website, through the Community Surface Dynamics Modeling System (CSDMS) model repository, or by contacting the corresponding author.
Moessfit. A free Mössbauer fitting program
NASA Astrophysics Data System (ADS)
Kamusella, Sirko; Klauss, Hans-Henning
2016-12-01
A free data analysis program for Mössbauer spectroscopy was developed to solve commonly faced problems such as the simultaneous fitting of multiple data sets, the Maximum Entropy Method, and proper error estimation. The program is written in C++ using the Qt application framework and the GNU Scientific Library. Moessfit makes use of multithreading to exploit the multi-core CPU capacity of modern PCs. The whole fit is specified in a text input file, which simplifies the workflow for the user and provides a simple entry into Mössbauer data analysis for beginners. At the same time, the possibility to define arbitrary parameter dependencies and distributions as well as relaxation spectra makes Moessfit interesting for advanced users as well.
Software for real-time control of a tidal liquid ventilator.
Heckman, J L; Hoffman, J; Shaffer, T H; Wolfson, M R
1999-01-01
The purpose of this project was to develop and test computer software and control algorithms designed to operate a tidal liquid ventilator. The tests were executed on a 90-MHz Pentium PC with 16 MB RAM and a prototype liquid ventilator. The software was designed using Microsoft Visual C++ (Ver. 5.0) and the Microsoft Foundation Classes. It uses a graphic user interface, is multithreaded, runs in real time, and has a built-in simulator that facilitates user education in liquid-ventilation principles. The operator can use the software to specify ventilation parameters such as the frequency of ventilation, the tidal volume, and the inspiratory-expiratory time ratio. Commands are implemented via control of the pump speed and by setting the position of two two-way solenoid-controlled valves. Data for use in monitoring and control are gathered by analog-to-digital conversion. Control strategies are implemented to maintain lung volumes and airway pressures within desired ranges, according to limits set by the operator. Also, the software allows the operator to define the shape of the flow pulse during inspiration and expiration, and to optimize perfluorochemical liquid transfer while minimizing airway pressures and maintaining the desired tidal volume. The operator can stop flow during inspiration and expiration to measure alveolar pressures. At the end of expiration, the software stores all user commands and 30 ventilation parameters into an Excel spreadsheet for later review and analysis. Use of these software and control algorithms affords user-friendly operation of a tidal liquid ventilator while providing precise control of ventilation parameters.
Automated Design Space Exploration with Aspen
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spafford, Kyle L.; Vetter, Jeffrey S.
Architects and applications scientists often use performance models to explore a multidimensional design space of architectural characteristics, algorithm designs, and application parameters. With traditional performance modeling tools, these explorations forced users to first develop a performance model and then repeatedly evaluate and analyze the model manually. These manual investigations proved laborious and error prone. More importantly, the complexity of this traditional process often forced users to simplify their investigations. To address this challenge of design space exploration, we extend our Aspen (Abstract Scalable Performance Engineering Notation) language with three new language constructs: user-defined resources, parameter ranges, and a collection of costs in the abstract machine model. Then, we use these constructs to enable automated design space exploration via a nonlinear optimization solver. We show how four interesting classes of design space exploration scenarios can be derived from Aspen models and formulated as pure nonlinear programs. The analysis tools are demonstrated using examples based on Aspen models for a three-dimensional Fast Fourier Transform, the CoMD molecular dynamics proxy application, and the DARPA Streaming Sensor Challenge Problem. Our results show that this approach can compose and solve arbitrary performance modeling questions quickly and rigorously when compared to the traditional manual approach.
IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.
Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M
2016-04-01
Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.
Novel Virtual User Models of Mild Cognitive Impairment for Simulating Dementia
Segkouli, Sofia; Tzovaras, Dimitrios; Tsakiris, Thanos; Tsolaki, Magda; Karagiannidis, Charalampos
2015-01-01
Virtual user modeling research has attempted to address critical issues of human-computer interaction (HCI), such as usability and utility, through a large number of analytic, usability-oriented approaches such as cognitive models, in order to provide users with experiences fitting their specific needs. However, there is demand for more specific modules, embodied in cognitive architectures, that will detect abnormal cognitive decline across new synthetic task environments. Also, accessibility evaluation of graphical user interfaces (GUIs) requires considerable effort to enhance the accessibility of ICT products for older adults. The main aim of this study is to develop and test virtual user models (VUM) simulating mild cognitive impairment (MCI) through novel specific modules, embodied in cognitive models and defined by estimations of cognitive parameters. Well-established MCI detection tests assessed users' cognition, elaborated their ability to perform multiple tasks, and monitored the performance of infotainment-related tasks, in order to provide more accurate simulation results on existing conceptual frameworks and enhanced predictive validity in interface design, supported by increased task complexity to capture a more detailed profile of users' capabilities and limitations. The final outcome is a more robust cognitive prediction model, accurately fitted to human data, to be used for more reliable interface evaluation through simulation on the basis of virtual models of MCI users. PMID:26339282
A Generalized DBMS to Support Diversified Data.
1987-07-21
...interest on bonds). Hence, they require a definition of subtraction which yields 30 days as the answer to the above computation. Only a user-defined... [STON85]. Alternately, one can follow the standard scheduler model [BERN81] in which a module is callable by code in the access methods when a... direction for evolution. These could include when to cease investigating alternate plans, and the ability to specify one's own optimizer parameters
User's Manual for Aerofcn: a FORTRAN Program to Compute Aerodynamic Parameters
NASA Technical Reports Server (NTRS)
Conley, Joseph L.
1992-01-01
The computer program AeroFcn is discussed. AeroFcn is a utility program that computes the following aerodynamic parameters: geopotential altitude, Mach number, true velocity, dynamic pressure, calibrated airspeed, equivalent airspeed, impact pressure, total pressure, total temperature, Reynolds number, speed of sound, static density, static pressure, static temperature, coefficient of dynamic viscosity, kinematic viscosity, geometric altitude, and specific energy for a standard- or a modified standard-day atmosphere using compressible flow and normal shock relations. Any two parameters that define a unique flight condition are selected, and their values are entered interactively. The remaining parameters are computed, and the solutions are stored in an output file. Multiple cases can be run, and the multiple case solutions can be stored in another output file for plotting. Parameter units, the output format, and primary constants in the atmospheric and aerodynamic equations can also be changed.
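A small sketch of the kind of computation AeroFcn performs (standard-day troposphere only, and not the program's FORTRAN source): given geopotential altitude and Mach number as the two parameters defining the flight condition, several of the listed quantities follow from the ISA model and compressible-flow relations.

```python
import math

# ISA sea-level constants (SI units)
T0, P0, L, g, R, GAMMA = 288.15, 101325.0, 0.0065, 9.80665, 287.053, 1.4

def standard_day(h_m, mach):
    """Troposphere (h < 11 km) standard-day parameters for one flight condition."""
    T = T0 - L * h_m                              # static temperature, K
    p = P0 * (T / T0) ** (g / (L * R))            # static pressure, Pa
    rho = p / (R * T)                             # static density, kg/m^3
    a = math.sqrt(GAMMA * R * T)                  # speed of sound, m/s
    V = mach * a                                  # true velocity, m/s
    qbar = 0.5 * GAMMA * p * mach ** 2            # dynamic pressure, Pa
    Tt = T * (1 + 0.5 * (GAMMA - 1) * mach ** 2)  # total temperature, K
    # isentropic total pressure; valid subsonic, else normal-shock relations apply
    pt = p * (1 + 0.5 * (GAMMA - 1) * mach ** 2) ** (GAMMA / (GAMMA - 1))
    return dict(T=T, p=p, rho=rho, a=a, V=V, qbar=qbar, Tt=Tt, pt=pt)

print(standard_day(5000.0, 0.8))
```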
Realistic simplified gaugino-higgsino models in the MSSM
NASA Astrophysics Data System (ADS)
Fuks, Benjamin; Klasen, Michael; Schmiemann, Saskia; Sunder, Marthijn
2018-03-01
We present simplified MSSM models for light neutralinos and charginos with realistic mass spectra and realistic gaugino-higgsino mixing that can be used in experimental searches at the LHC. The formerly used naive approach of defining mass spectra and mixing matrix elements manually and independently of each other does not yield genuine MSSM benchmarks. We suggest the use of less simplified, but realistic, MSSM models whose mass spectra and mixing matrix elements are the result of a proper matrix diagonalisation. We propose a novel strategy targeting the design of such benchmark scenarios, accounting for user-defined constraints in terms of masses and particle mixing. We apply it to the higgsino case and implement a scan in the four relevant underlying parameters {μ, tan β, M1, M2} for a given set of light neutralino and chargino masses. We define a measure for the quality of the obtained benchmarks that also includes criteria to assess the higgsino content of the resulting charginos and neutralinos. We finally discuss the distribution of the resulting models in the MSSM parameter space as well as their implications for supersymmetric dark matter phenomenology.
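As an illustration of the "proper matrix diagonalisation" step, the following Python sketch builds the tree-level neutralino mass matrix from {μ, tan β, M1, M2} in one common convention (e.g. that of Martin's SUSY primer) and extracts the mass spectrum; sign and basis conventions vary between references, so treat this as schematic rather than as the authors' code.

```python
import numpy as np

MZ, SW2 = 91.1876, 0.2312          # Z mass (GeV) and sin^2(theta_W)
SW, CW = np.sqrt(SW2), np.sqrt(1 - SW2)

def neutralino_masses(M1, M2, mu, tan_beta):
    """Diagonalise the tree-level neutralino mass matrix (one common convention)."""
    b = np.arctan(tan_beta)
    sb, cb = np.sin(b), np.cos(b)
    M = np.array([
        [M1,            0.0,           -cb * SW * MZ,  sb * SW * MZ],
        [0.0,           M2,             cb * CW * MZ, -sb * CW * MZ],
        [-cb * SW * MZ, cb * CW * MZ,   0.0,           -mu],
        [ sb * SW * MZ, -sb * CW * MZ, -mu,             0.0],
    ])
    # physical masses are the absolute eigenvalues of this symmetric matrix;
    # a full treatment uses a Takagi decomposition to keep the mixing matrix
    vals, vecs = np.linalg.eigh(M)
    order = np.argsort(np.abs(vals))
    return np.abs(vals[order]), vecs[:, order]

masses, mixing = neutralino_masses(M1=500.0, M2=600.0, mu=150.0, tan_beta=10.0)
print(masses)   # mostly-higgsino light states near |mu|
```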
A users manual for the method of moments Aircraft Modeling Code (AMC), version 2
NASA Technical Reports Server (NTRS)
Peters, M. E.; Newman, E. H.
1994-01-01
This report serves as a user's manual for Version 2 of the 'Aircraft Modeling Code' or AMC. AMC is a user-oriented computer code, based on the method of moments (MM), for the analysis of the radiation and/or scattering from geometries consisting of a main body or fuselage shape with attached wings and fins. The shape of the main body is described by defining its cross section at several stations along its length. Wings, fins, rotor blades, and radiating monopoles can then be attached to the main body. Although AMC was specifically designed for aircraft or helicopter shapes, it can also be applied to missiles, ships, submarines, jet inlets, automobiles, spacecraft, etc. The problem geometry and run control parameters are specified via a two character command language input format. This report describes the input command language and also includes several examples which illustrate typical code inputs and outputs.
A user's manual for the method of moments Aircraft Modeling Code (AMC)
NASA Technical Reports Server (NTRS)
Peters, M. E.; Newman, E. H.
1989-01-01
This report serves as a user's manual for the Aircraft Modeling Code or AMC. AMC is a user-oriented computer code, based on the method of moments (MM), for the analysis of the radiation and/or scattering from geometries consisting of a main body or fuselage shape with attached wings and fins. The shape of the main body is described by defining its cross section at several stations along its length. Wings, fins, rotor blades, and radiating monopoles can then be attached to the main body. Although AMC was specifically designed for aircraft or helicopter shapes, it can also be applied to missiles, ships, submarines, jet inlets, automobiles, spacecraft, etc. The problem geometry and run control parameters are specified via a two character command language input format. The input command language is described and several examples which illustrate typical code inputs and outputs are also included.
Open Source GIS Connectors to the NASA GES DISC Satellite Data
NASA Astrophysics Data System (ADS)
Pham, L.; Kempler, S. J.; Yang, W.
2014-12-01
The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) houses a suite of satellite-derived GIS data including high spatiotemporal resolution precipitation, air quality, and modeled land surface parameter data. The data are extremely useful for various GIS research and applications at regional, continental, and global scales, as evidenced by growing GIS user requests for the data. On the other hand, we have also found that some GIS users, especially those from the ArcGIS community, have difficulties in obtaining, importing, and using our data, primarily due to their unfamiliarity with our products and GIS software's limited capabilities in handling the predominantly raster-form data in various, sometimes very complicated, formats. In this presentation, we introduce a set of open source ArcGIS data connectors that significantly simplify the access and use of our data in ArcGIS. With the connectors, users do not need to know the data access URLs, the access protocols or syntaxes, or the data formats. Nor do they need to browse through a long list of variables that are often embedded in a single science data file and whose names may sometimes be confusing to those not familiar with the file (such as variable CH4_VMR_D for "CH4 volume mixing ratio from the descending orbit" and variable EVPsfc for "total evapotranspiration"). The connectors expose most GIS-related variables to the users with easy-to-understand names. Users can simply define the spatiotemporal range of their study, select the parameter(s) of interest, and have the needed data downloaded, imported, and displayed in ArcGIS. The connectors are Python text files and there is no installation process. They can be placed in any user directory and started by simply clicking on them. In the presentation, we also demonstrate how to use the tools to load GES DISC time-series air quality data with a few clicks and how such data depict the spatial and temporal patterns of air quality in different parts of the world during the past decade.
Gawande, Nitin A; Reinhart, Debra R; Yeh, Gour-Tsyh
2010-02-01
Biodegradation process modeling of municipal solid waste (MSW) bioreactor landfills requires the knowledge of various process reactions and corresponding kinetic parameters. Mechanistic models available to date are able to simulate biodegradation processes with the help of pre-defined species and reactions. Some of these models consider the effect of critical parameters such as moisture content, pH, and temperature. Biomass concentration is a vital parameter for any biomass growth model and often not compared with field and laboratory results. A more complex biodegradation model includes a large number of chemical and microbiological species. Increasing the number of species and user defined process reactions in the simulation requires a robust numerical tool. A generalized microbiological and chemical model, BIOKEMOD-3P, was developed to simulate biodegradation processes in three-phases (Gawande et al. 2009). This paper presents the application of this model to simulate laboratory-scale MSW bioreactors under anaerobic conditions. BIOKEMOD-3P was able to closely simulate the experimental data. The results from this study may help in application of this model to full-scale landfill operation.
Measuring the Interestingness of Articles in a Limited User Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pon, R; Cardenas, A; Buttler, David
Search engines, such as Google, assign scores to news articles based on their relevance to a query. However, not all relevant articles for the query may be interesting to a user. For example, if the article is old or yields little new information, the article would be uninteresting. Relevance scores do not take into account what makes an article interesting, which varies from user to user. Although methods such as collaborative filtering have been shown to be effective in recommendation systems, in a limited user environment there are not enough users to make collaborative filtering effective. A general framework, called iScore, is presented for defining and measuring the "interestingness" of articles, incorporating user feedback. iScore addresses the various aspects of what makes an article interesting, such as topic relevance, uniqueness, freshness, source reputation, and writing style. It employs various methods, such as multiple topic tracking, online parameter selection, language models, clustering, sentiment analysis, and phrase extraction, to measure these features. Because users hold varying reasons for why an article is interesting, an online feature selection method in naïve Bayes is also used to improve recommendation results. iScore can outperform traditional IR techniques by as much as 50.7%. iScore and its components are evaluated in the news recommendation task using three datasets from Yahoo! News, actual users, and Digg.
In-Space Radiator Shape Optimization using Genetic Algorithms
NASA Technical Reports Server (NTRS)
Hull, Patrick V.; Kittredge, Ken; Tinker, Michael; SanSoucie, Michael
2006-01-01
Future space exploration missions will require the development of more advanced in-space radiators. These radiators should be highly efficient, lightweight, deployable heat rejection systems. Typical radiators for in-space heat mitigation commonly comprise a substantial portion of the total vehicle mass. A small mass savings of even 5-10% can greatly improve vehicle performance. The objective of this paper is to present the development of detailed tools for the analysis and design of in-space radiators using evolutionary computation techniques. The optimality criterion is defined as a two-dimensional radiator with a shape demonstrating the smallest mass for the greatest overall heat transfer; thus the end result is a set of highly functional radiator designs. This cross-disciplinary work combines topology optimization and thermal analysis design by means of a genetic algorithm. The proposed design tool consists of the following steps: design parameterization based on the exterior boundary of the radiator, objective function definition (mass minimization and heat loss maximization), objective function evaluation via finite element analysis (thermal radiation analysis), and optimization based on evolutionary algorithms. The radiator design problem is defined as follows: the input force is a driving temperature and the output reaction is heat loss. Appropriate modeling of the space environment is added to capture its effect on the radiator. The design parameters chosen for this radiator shape optimization problem fall into two classes: variable height along the width of the radiator, and a spline curve defining the material boundary of the radiator. The implementation of multiple design parameter schemes allows the user to have more confidence in the radiator optimization tool upon demonstration of convergence between the two design parameter schemes. This tool easily allows the user to manipulate the driving temperature regions, thus permitting detailed design of in-space radiators for unique situations. Preliminary results indicate an optimized shape following that of the temperature distribution regions in the "cooler" portions of the radiator. The results closely follow the expected radiator shape.
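A toy illustration of the evolutionary loop described above (not the authors' tool): a simple genetic algorithm evolves a radiator height profile to trade radiated power against mass, with a made-up analytic fitness standing in for the finite element thermal analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
N_SEG, POP, GENS = 16, 40, 200          # profile segments, population, generations

def fitness(heights):
    """Stand-in for the FE thermal analysis: reward radiating area, penalise mass."""
    area = heights.sum()                 # proxy for radiating surface
    mass = 1.2 * heights.sum()           # proxy for structural mass
    heat_loss = area ** 0.8              # diminishing returns, made up
    return heat_loss - 0.5 * mass

pop = rng.uniform(0.1, 1.0, size=(POP, N_SEG))   # initial random profiles
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]          # truncation selection
    cut = rng.integers(1, N_SEG, size=POP // 2)            # one-point crossover
    kids = np.array([np.concatenate((a[:c], b[c:]))
                     for a, b, c in zip(parents, np.roll(parents, 1, 0), cut)])
    kids += rng.normal(0.0, 0.02, kids.shape)              # Gaussian mutation
    pop = np.clip(np.vstack([parents, kids]), 0.05, 1.0)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(best.round(2))
```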
NASA Astrophysics Data System (ADS)
Rivas-Medina, A.; Gutierrez, V.; Gaspar-Escribano, J. M.; Benito, B.
2009-04-01
Results of a seismic risk assessment study are often applied and interpreted by users unspecialised in the topic or lacking a scientific background. In this context, the availability of tools that help translate essentially scientific contents for broader audiences (such as decision makers or civil defence officials), as well as represent and manage results in a user-friendly fashion, is of indubitable value. One such tool is the visualization tool VISOR-RISNA, a web tool developed within the RISNA project (financed by the Emergency Agency of Navarre, Spain) for regional seismic risk assessment of Navarre and the subsequent development of emergency plans. The RISNA study included seismic hazard evaluation, geotechnical characterization of soils, incorporation of site effects into expected ground motions, vulnerability distribution assessment, and estimation of expected damage distributions for a 10% probability of exceedance in 50 years. The main goal of RISNA was the identification of higher-risk areas on which to focus detailed, local-scale risk studies and the corresponding urban emergency plans in the future. A geographic information system was used to combine different information layers, generate tables of results, and represent maps with partial and final results. The visualization tool VISOR-RISNA is intended to facilitate the interpretation and representation of the collection of results, with the ultimate purpose of defining actuation plans. A number of criteria for defining actuation priorities are proposed in this work. They are based on combinations of risk parameters resulting from the risk study (such as expected ground motion, damage, and exposed population), as determined by risk assessment specialists. Although the values that these parameters take are a result of the risk study, their distribution into several classes depends on the intervals defined by decision takers or civil defence officials. These criteria provide a ranking of municipalities according to the expected actuation level and, eventually, to alert levels. In this regard, the visualization tool constitutes an intuitive and useful tool that the end-user of the risk study may use to optimize and guide its application to emergency planning. The use of this type of tool can be adapted to other scenarios with different boundary conditions (seismicity level, vulnerability distribution) and user profiles (policy makers, stakeholders, students, general public) while maintaining the same final goal: to improve the adaptation of the results of a scientific-technical work to the needs of other users with different backgrounds.
Interactive model evaluation tool based on IPython notebook
NASA Astrophysics Data System (ADS)
Balemans, Sophie; Van Hoey, Stijn; Nopens, Ingmar; Seuntjes, Piet
2015-04-01
In hydrological modelling, some kind of parameter optimization is usually performed. This can be the selection of a single best parameter set, a split into behavioural and non-behavioural parameter sets based on a selected threshold, or a posterior parameter distribution derived with a formal Bayesian approach. The selection of the criterion to measure the goodness of fit (likelihood or any objective function) is an essential step in all of these methodologies and will affect the finally selected parameter subset. Moreover, the discriminative power of the objective function also depends on the time period used. In practice, the optimization process is an iterative procedure, so in the course of the modelling process an increasing number of simulations is performed. However, the information carried by these simulation outputs is not always fully exploited. In this respect, we developed and present an interactive environment that enables the user to intuitively evaluate model performance. The aim is to explore the parameter space graphically and to visualize the impact of the selected objective function on model behaviour. First, a set of model simulation results is loaded along with the corresponding parameter sets and a data set of the same variable as the model outcome (mostly discharge). The ranges of the loaded parameter sets define the parameter space. The user selects the two parameters to be visualised, along with an objective function and a time period of interest. Based on this information, a two-dimensional parameter response surface is created, which is essentially a scatter plot of the parameter combinations with a color scale corresponding to the goodness of fit of each combination. Finally, a slider is available to change the color mapping of the points: the slider provides a threshold to exclude non-behavioural parameter sets, and the color scale is attributed only to the remaining parameter sets. By interactively changing the settings and interpreting the graph, the user gains insight into the model's structural behaviour, and a more deliberate choice of objective function and of periods of high information content can be made. The environment is written in an IPython notebook and uses the interactive functions provided by the IPython community. As such, the power of the IPython notebook as a development environment for scientific computing is illustrated (Shen, 2014).
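The flavour of such an environment can be sketched in a few lines of notebook code. This is illustrative only: it assumes ipywidgets and matplotlib, and uses synthetic parameter samples and a fake objective in place of loaded model runs.

```python
import numpy as np
import matplotlib.pyplot as plt
from ipywidgets import interact, FloatSlider

rng = np.random.default_rng(1)
p1, p2 = rng.uniform(0, 1, 500), rng.uniform(0, 1, 500)   # two sampled parameters
rmse = 0.2 + (p1 - 0.6) ** 2 + (p2 - 0.3) ** 2 + rng.normal(0, 0.02, 500)  # fake objective

def response_surface(threshold=0.5):
    """Scatter the parameter combinations, colour only the behavioural ones."""
    keep = rmse <= threshold
    plt.scatter(p1[~keep], p2[~keep], c="lightgrey", s=10)
    plt.scatter(p1[keep], p2[keep], c=rmse[keep], cmap="viridis_r", s=15)
    plt.colorbar(label="RMSE (behavioural sets only)")
    plt.xlabel("parameter 1"); plt.ylabel("parameter 2")
    plt.show()

interact(response_surface,
         threshold=FloatSlider(min=0.2, max=1.0, step=0.01, value=0.5))
```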
Accuracy Dimensions in Remote Sensing
NASA Astrophysics Data System (ADS)
Barsi, Á.; Kugler, Zs.; László, I.; Szabó, Gy.; Abdulmutalib, H. M.
2018-04-01
The technological developments in remote sensing (RS) during the past decade have contributed to a significant increase in the size of the data user community. For this reason, data quality issues in remote sensing face a significant increase in importance, particularly in the era of Big Earth data. Dozens of available sensors, hundreds of sophisticated data processing techniques, and countless software tools assist the processing of RS data and contribute to a major increase in applications and users. In the past decades, the scientific and technological community of the spatial data environment focused on the evaluation of data quality elements computed for point, line, and area geometry of vector and raster data. Stakeholders of data production commonly use standardised parameters to characterise the quality of their datasets. Yet their efforts to estimate the quality have not reached the general end-user community, which runs heterogeneous applications and assumes that its spatial data are error-free and best fitted to the specification standards. The non-specialist, general user group has very limited knowledge of how spatial data meet their needs. These parameters, forming the external quality dimensions, imply that the same data system can be of different quality to different users. The large collection of observed information is uncertain to a degree that can degrade the reliability of the applications. Based on a prior paper by the authors (in cooperation within the Remote Sensing Data Quality working group of ISPRS), which established a taxonomy of the dimensions of data quality in GIS and remote sensing domains, this paper focuses on measures of uncertainty in the remote sensing data lifecycle, concentrating on land cover mapping issues. In the paper we introduce how the quality of various combinations of data and procedures can be summarized and how services fit the users' needs. The paper gives a theoretical overview of the issue; in addition, selected practice-oriented approaches are evaluated, and finally widely used metrics like the Root Mean Squared Error (RMSE) or the confusion matrix are discussed. The authors present data quality features of well-defined and poorly defined objects. The central part of the study is land cover mapping: its accuracy management model is described, and relevance and uncertainty measures of its influencing quality dimensions are presented. The theory is supported by a case study in which remote sensing technology is used to support the area-based agricultural subsidies of the European Union in the Hungarian administration.
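The two metrics named above are simple to state; a minimal sketch for a land cover accuracy check, with made-up labels:

```python
import numpy as np

def rmse(predicted, reference):
    """Root Mean Squared Error for a continuous quality check (e.g. elevation)."""
    return float(np.sqrt(np.mean((np.asarray(predicted) - np.asarray(reference)) ** 2)))

def confusion_matrix(predicted, reference, n_classes):
    """Rows = reference class, columns = mapped class."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for r, p in zip(reference, predicted):
        cm[r, p] += 1
    return cm

ref = np.array([0, 0, 1, 1, 2, 2, 2, 1])    # made-up land cover labels
pred = np.array([0, 1, 1, 1, 2, 0, 2, 1])
cm = confusion_matrix(pred, ref, 3)
overall_accuracy = np.trace(cm) / cm.sum()
print(cm, f"overall accuracy = {overall_accuracy:.2f}", sep="\n")
```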
Guidelines and Parameter Selection for the Simulation of Progressive Delamination
NASA Technical Reports Server (NTRS)
Song, Kyongchan; Davila, Carlos G.; Rose, Cheryl A.
2008-01-01
Turon's methodology for determining optimal analysis parameters for the simulation of progressive delamination is reviewed. Recommended procedures for determining analysis parameters for efficient delamination growth predictions using the Abaqus/Standard cohesive element and relatively coarse meshes are provided for single- and mixed-mode loading. The Abaqus cohesive element, COH3D8, and a user-defined cohesive element are used to develop finite element models of the double cantilever beam specimen, the end-notched flexure specimen, and the mixed-mode bending specimen to simulate progressive delamination growth in Mode I, Mode II, and mixed-mode fracture, respectively. The predicted responses are compared with their analytical solutions. The results show that for single-mode fracture, the predicted responses obtained with the Abaqus cohesive element correlate well with the analytical solutions. For mixed-mode fracture, it was found that the response predicted using COH3D8 elements depends on the damage evolution criterion that is used. The energy-based criterion overpredicts the peak loads and load-deflection response. The results predicted using a tabulated form of the BK criterion correlate well with the analytical solution and with the results predicted with the user-written element.
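For reference, the Benzeggagh-Kenane (BK) criterion mentioned above expresses the mixed-mode critical energy release rate as an interpolation between the pure-mode toughnesses (generic notation, not the paper's):

```latex
G_c = G_{Ic} + \left(G_{IIc} - G_{Ic}\right)\left(\frac{G_{shear}}{G_T}\right)^{\eta},
\qquad G_T = G_I + G_{shear}
```

where G_I and G_shear are the mode I and shear components of the energy release rate, G_T is their total, and η is a material parameter fitted to mixed-mode test data; the "tabulated form" mentioned above presumably supplies this relation pointwise rather than through the closed-form exponent.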
Exploiting Aura OMI Level 2 Data with High Resolution Visualization
NASA Astrophysics Data System (ADS)
Wei, J. C.; Yang, W.; Johnson, J. E.; Zhao, P.; Gerasimov, I. V.; Pham, L.; Vicente, G. A.; Shen, S.
2014-12-01
Satellite data products are important for a wide variety of applications that can bring far-reaching benefits to the science community and the broader society. These benefits can best be achieved if the satellite data are well utilized and interpreted, whether as model inputs or in the interpretation of extreme events such as volcanic eruptions and dust storms. Unfortunately, this is not always the case, despite the abundance and relative maturity of numerous satellite data products provided by NASA and other organizations. One way to help users better understand satellite data is to provide the data along with images, including accurate pixel-level (Level 2) information, pixel coverage area delineation, and science-team-recommended quality screening for individual geophysical parameters. The Goddard Earth Sciences Data and Information Services Center (GES DISC) always strives to best support the user community for NASA Earth Science data, for example through Software-as-a-Service (SaaS). Here we present a new visualization tool that helps users exploit Aura Ozone Monitoring Instrument (OMI) Level 2 data. This visualization service utilizes Open Geospatial Consortium (OGC) standard-compliant Web Mapping Service (WMS) and Web Coverage Service (WCS) calls in its backend infrastructure. The service allows users to select data sources (e.g., multiple parameters under the same measurement, like NO2 and SO2 from OMI Level 2, or the same parameter with different methods of aggregation, like NO2 in the OMNO2G and OMNO2D products), define areas of interest and temporal extents, zoom, pan, overlay, slide, and subset and reformat data. The interface will also be able to connect to other OGC WMS and WCS servers, which will greatly enhance its expandability to integrate additional outside data and map sources, such as the Global Imagery Browse Services (GIBS).
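To illustrate how a client consumes such OGC endpoints, here is a minimal sketch using the third-party OWSLib package; the endpoint URL and layer name are placeholders, not the actual GES DISC service.

```python
from owslib.wms import WebMapService

# endpoint URL is a placeholder, not the actual service address
wms = WebMapService("https://example.gov/ogc/wms", version="1.1.1")

# request a map of one Level-2-derived layer over a user-defined bounding box;
# the layer name and bbox are illustrative values
img = wms.getmap(
    layers=["OMI_NO2_column"],
    srs="EPSG:4326",
    bbox=(-125.0, 25.0, -65.0, 50.0),   # (minx, miny, maxx, maxy)
    size=(1024, 512),
    format="image/png",
    transparent=True,
)
with open("omi_no2.png", "wb") as f:
    f.write(img.read())
```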
Computational Software for Fitting Seismic Data to Epidemic-Type Aftershock Sequence Models
NASA Astrophysics Data System (ADS)
Chu, A.
2014-12-01
Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work introduces software to implement two of the ETAS models described in Ogata (1998). To find the Maximum-Likelihood Estimates (MLEs), my software provides estimates of the homogeneous background rate parameter and of the temporal and spatial parameters that govern triggering effects, by applying the Expectation-Maximization (EM) algorithm introduced in Veen and Schoenberg (2008). Although other computer programs exist for similar data modeling purposes, using the EM algorithm has the benefits of stability and robustness (Veen and Schoenberg, 2008). Spatial shapes that are very long and narrow cause difficulties in optimization convergence, and flat or multi-modal log-likelihood functions raise similar issues. My program uses a robust method of presetting a parameter to overcome this non-convergence issue. In addition to model fitting, the software is equipped with useful tools for examining model fitting results, for example, visualization of the estimated conditional intensity and estimation of the expected number of triggered aftershocks. A simulation generator is also provided, with flexible spatial shapes that may be defined by the user. This open-source software has a very simple user interface. The user may execute it on a local computer, and the program also has the potential to be hosted online. The Java language is used for the software's core computing, and an optional interface to the statistical package R is provided.
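The temporal part of the conditional intensity that such models fit has a compact form. A sketch in Python with the modified Omori kernel (parameter values are placeholders, and the magnitude productivity term of the full ETAS model is omitted for brevity):

```python
import numpy as np

def etas_intensity(t, history_times, mu=0.05, K=0.3, c=0.01, p=1.2):
    """Temporal ETAS conditional intensity:
    lambda(t) = mu + sum over past events t_i < t of K / (t - t_i + c)^p
    (magnitude-dependent productivity omitted for brevity)."""
    past = history_times[history_times < t]
    return mu + np.sum(K / (t - past + c) ** p)

events = np.array([1.0, 1.5, 1.6, 4.2])   # made-up event times (days)
print(etas_intensity(5.0, events))
```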
Automated Glacier Surface Velocity using Multi-Image/Multi-Chip (MIMC) Feature Tracking
NASA Astrophysics Data System (ADS)
Ahn, Y.; Howat, I. M.
2009-12-01
Remote sensing from space has enabled effective monitoring of remote and inhospitable polar regions. Glacier velocity, and its variation in time, is one of the most important parameters needed to understand glacier dynamics, glacier mass balance, and the contribution to sea level rise. Regular measurements of ice velocity are possible from large and accessible satellite data archives, such as ASTER and LANDSAT-7. Among satellite imagery, optical imagery (i.e., passive, visible to near-infrared band sensors) provides abundant data with optimal spatial resolution and repeat intervals for tracking glacier motion at high temporal resolution. Because of the massive amounts of data, computation of ice velocity from feature tracking requires 1) a user-friendly interface, 2) minimal local/user parameter inputs, and 3) results that need minimal editing. We focus on robust feature tracking applicable to all currently available optical satellite imagery, such as ASTER, SPOT, and LANDSAT. We introduce the MIMC (multiple images/multiple chip sizes) matching approach, which does not involve any user-defined local/empirical parameters except the approximate average glacier speed. We also introduce a method for extracting velocity from LANDSAT-7 SLC-off data, which has 22 percent of the scene data missing in slanted strips due to the failure of the scan line corrector. We apply our approach to major outlet glaciers in west and east Greenland and assess our MIMC feature tracking technique by comparison with conventional correlation matching and other methods (e.g., InSAR).
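Feature tracking of this kind is built on cross-correlation of image chips between acquisitions. A bare-bones sketch of a normalized cross-correlation displacement estimate in pure NumPy (the MIMC multi-image/multi-chip logic itself is not reproduced here):

```python
import numpy as np

def ncc_displacement(chip, search, ref_origin):
    """Slide a reference chip over a larger search window and return the
    displacement (dy, dx) of the best normalized cross-correlation peak."""
    ch, cw = chip.shape
    c = (chip - chip.mean()) / (chip.std() + 1e-12)
    best, best_off = -2.0, (0, 0)
    for dy in range(search.shape[0] - ch + 1):
        for dx in range(search.shape[1] - cw + 1):
            win = search[dy:dy + ch, dx:dx + cw]
            w = (win - win.mean()) / (win.std() + 1e-12)
            score = np.mean(c * w)                 # NCC in [-1, 1]
            if score > best:
                best, best_off = score, (dy, dx)
    return best_off[0] - ref_origin[0], best_off[1] - ref_origin[1], best

# toy example: a feature shifted by (2, 3) pixels between acquisitions
img1 = np.random.default_rng(2).random((64, 64))
img2 = np.roll(np.roll(img1, 2, axis=0), 3, axis=1)
chip = img1[20:36, 20:36]
search = img2[12:52, 12:52]                        # search window around the chip
print(ncc_displacement(chip, search, ref_origin=(8, 8)))  # expect (2, 3, ~1.0)
```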
Montenegro, Cristian R
2018-04-24
Although the organisation of mental health service users and ex-users in Latin America is a recent and under-researched phenomenon, global calls for their involvement in policy have penetrated national agendas, shaping definitions and expectations about their role in mental health systems. In this context, how such groups react to these expectations and define their own goals, strategies and partnerships can reveal the specificity of the "user movement" in Chile and Latin America. This study draws on Jacques Rancière's theorisation of "police order" and "politics" to understand the emergence of users' collective identity and activism, highlighting the role of practices of disengagement and rejection. It is based on interviews and participant observation with a collective of users, ex-users and professionals in Chile. The findings show how the group's aims and self-understandings evolved through hesitations and reflexive engagements with the legal system, the mental health system, and wider society. The notion of a "politics of incommensurability" is proposed to thread together a reflexive rejection of external expectations and definitions and the development of a sense of being "outside" of the intelligibility of the mental health system and its frameworks of observation and proximity. This incommensurability problematises a technical definition of users' presence and influence and the generalisation of abstract parameters of engagement, calling for approaches that address how these groups constitute themselves meaningfully in specific situations.
Web Services and Other Enhancements at the Northern California Earthquake Data Center
NASA Astrophysics Data System (ADS)
Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.
2012-12-01
The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) Station inventory and channel response information delivered in StationXML format, (2) Channel response information delivered in RESP format, (3) Time series availability delivered in text and XML formats, (4) Single channel and bulk data requests delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first-motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. These data can be accessed through the above web services and through special NCEDC web pages.
Piehowski, Paul D; Petyuk, Vladislav A; Sandoval, John D; Burnum, Kristin E; Kiebel, Gary R; Monroe, Matthew E; Anderson, Gordon A; Camp, David G; Smith, Richard D
2013-03-01
For bottom-up proteomics, there is a wide variety of database-searching algorithms in use for matching peptide sequences to tandem MS spectra. Likewise, there are numerous strategies being employed to produce a confident list of peptide identifications from the different search algorithm outputs. Here we introduce a grid-search approach for determining optimal database filtering criteria in shotgun proteomics data analyses that is easily adaptable to any search. Systematic Trial and Error Parameter Selection--referred to as STEPS--utilizes user-defined parameter ranges to test a wide array of parameter combinations to arrive at an optimal "parameter set" for data filtering, thus maximizing confident identifications. The benefits of this approach in terms of the number of true-positive identifications are demonstrated using datasets derived from immunoaffinity-depleted blood serum and a bacterial cell lysate, two common proteomics sample types. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
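The grid-search idea is straightforward to sketch. The following is schematic, with illustrative filter fields, thresholds, and scoring (not the STEPS defaults, whose actual criteria and FDR handling may differ):

```python
import itertools

# candidate thresholds for each filter criterion (illustrative values)
grid = {
    "min_score": [1.5, 2.0, 2.5, 3.0],      # e.g. search engine match score
    "max_ppm":   [5.0, 10.0, 20.0],          # mass measurement error tolerance
}

def confident_ids(psms, params, fdr_limit=0.01):
    """Count peptide-spectrum matches passing the filters, subject to an FDR cap.
    `psms` is a list of dicts with score/ppm/decoy fields (placeholder schema)."""
    kept = [p for p in psms
            if p["score"] >= params["min_score"] and abs(p["ppm"]) <= params["max_ppm"]]
    if not kept:
        return 0
    fdr = sum(p["decoy"] for p in kept) / len(kept)     # decoy-based FDR estimate
    return len(kept) if fdr <= fdr_limit else 0

def grid_search(psms):
    """Exhaustively evaluate every threshold combination; keep the best set."""
    names = list(grid)
    return max((dict(zip(names, combo)) for combo in itertools.product(*grid.values())),
               key=lambda params: confident_ids(psms, params))
```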
Sheet metals characterization using the virtual fields method
NASA Astrophysics Data System (ADS)
Marek, Aleksander; Davis, Frances M.; Pierron, Fabrice
2018-05-01
In this work, a characterisation method involving a deep-notched specimen subjected to tensile loading is introduced. This specimen leads to heterogeneous states of stress and strain, the latter being measured using a stereo DIC system (MatchID). This heterogeneity enables the identification of multiple material parameters in a single test. In order to identify material parameters from the DIC data, an inverse method called the Virtual Fields Method is employed. The method, combined with recently developed sensitivity-based virtual fields, makes it possible to optimally locate the areas of the test where information about each material parameter is encoded, improving the accuracy of the identification over traditional user-defined virtual fields. It is shown that a single test performed at 45° to the rolling direction is sufficient to obtain all anisotropic plastic parameters, thus reducing the experimental effort involved in characterisation. The paper presents the methodology and some numerical validation.
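For context, the Virtual Fields Method rests on the principle of virtual work. Neglecting body and inertial forces, for any kinematically admissible virtual displacement field u* (generic notation, not the paper's):

```latex
-\int_{V} \boldsymbol{\sigma} : \boldsymbol{\varepsilon}^{*} \, \mathrm{d}V
+ \int_{\partial V} \mathbf{T} \cdot \mathbf{u}^{*} \, \mathrm{d}S = 0
```

Since the stress σ is written in terms of the measured strain fields through the constitutive model, each independent choice of u* yields one scalar equation in the unknown parameters; sensitivity-based virtual fields, as used above, choose u* so that each equation is dominated by the information on one parameter.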
Recommendation Systems for Geoscience Data Portals Built by Analyzing Usage Patterns
NASA Astrophysics Data System (ADS)
Crosby, C.; Nandigam, V.; Baru, C.
2009-04-01
Since its launch five years ago, the National Science Foundation-funded GEON Project (www.geongrid.org) has been providing access to a variety of geoscience data sets, such as geologic maps and other geographic information system (GIS)-oriented data, paleontologic databases, gravity and magnetics data, and LiDAR topography, via its online portal interface. In addition to data, the GEON Portal also provides web-based tools and other resources that enable users to process and interact with data. Examples of these tools include functions to dynamically map and integrate GIS data, compute synthetic seismograms, and produce custom digital elevation models (DEMs) with user-defined parameters such as resolution. The GEON Portal, built on the Gridsphere portal framework, allows us to capture user interaction with the system. In addition to the site access statistics captured by tools like Google Analytics, which record hits per unit time, search keywords, operating systems, browsers, and referring sites, we also record additional statistics such as which data sets are being downloaded and in what formats, processing parameters, and navigation pathways through the portal. With over four years of data now available from the GEON Portal, this record of usage is a rich resource for exploring how earth scientists discover and utilize online data sets. Furthermore, we propose that these data could ultimately be harnessed to optimize the way users interact with the data portal, to design intelligent processing and data management systems, and to make recommendations on algorithm settings and other available relevant data. The paradigm of integrating popular and commonly used patterns to make recommendations to a user is well established in the world of e-commerce, where users receive suggestions on books, music, and other products that they may find interesting based on their website browsing and purchasing history, as well as the patterns of fellow users who have made similar selections. However, this paradigm has not yet been explored for geoscience data portals. In this presentation we present an initial analysis of user interaction and access statistics for the GEON OpenTopography LiDAR data distribution and processing system to illustrate what they reveal about users' spatial and temporal data access patterns, data processing parameter selections, and pathways through the data portal. We also demonstrate what these usage statistics can illustrate about aspects of the data sets that are of greatest interest. Finally, we explore how these usage statistics could be used to improve the user's experience in the data portal and to optimize how data access interfaces and tools are designed and implemented.
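A minimal sketch of the e-commerce-style idea applied to data sets: item-to-item co-access counts derived from portal logs drive a "users who accessed this also accessed" recommendation. The session/log schema and dataset IDs below are hypothetical.

```python
from collections import defaultdict
from itertools import combinations

# hypothetical access log: one set of dataset IDs per user session
sessions = [
    {"lidar_SoCal", "geologic_map_CA", "gravity_west"},
    {"lidar_SoCal", "gravity_west"},
    {"paleo_db", "geologic_map_CA"},
    {"lidar_SoCal", "geologic_map_CA"},
]

co_access = defaultdict(int)
for s in sessions:
    for a, b in combinations(sorted(s), 2):
        co_access[(a, b)] += 1

def recommend(dataset, top_n=3):
    """Rank other data sets by how often they co-occur with `dataset`."""
    scores = {(pair[0] if pair[1] == dataset else pair[1]): n
              for pair, n in co_access.items() if dataset in pair}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("lidar_SoCal"))   # e.g. ['geologic_map_CA', 'gravity_west']
```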
Interactive-cut: Real-time feedback segmentation for translational research.
Egger, Jan; Lüddemann, Tobias; Schwarzenberg, Robert; Freisleben, Bernd; Nimsky, Christopher
2014-06-01
In this contribution, a scale-invariant image segmentation algorithm is introduced that "wraps" the algorithm's parameters for the user through its interactive behavior, avoiding the definition of "arbitrary" numbers that the user cannot really understand. To this end, we designed a specific graph-based segmentation method that requires only a single seed point inside the target structure from the user and is thus particularly suitable for immediate processing and interactive, real-time adjustments by the user. In addition, the color or gray-value information needed by the approach can be extracted automatically around the user-defined seed point. Furthermore, the graph is constructed in such a way that a polynomial-time min-cut computation can provide the segmentation result within a second on an up-to-date computer. The algorithm presented here has been evaluated with fixed seed points on 2D and 3D medical image data, such as brain tumors, cerebral aneurysms and vertebral bodies. Direct comparison of the obtained automatic segmentation results with costlier manual slice-by-slice segmentations performed by trained physicians suggests a strong medical relevance of this interactive approach. Copyright © 2014 Elsevier Ltd. All rights reserved.
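As an illustration of the kind of seed-driven min-cut computation described above, here is a minimal sketch in Python (not the authors' implementation; the contrast weighting and the background assumption are our own simplifications):

    # Seed-based graph-cut segmentation sketch: pixels are nodes, neighbor edges
    # get capacities that fall with intensity contrast, and a polynomial-time
    # min-cut separates the user's seed from an assumed-background border.
    import networkx as nx
    import numpy as np

    def segment(image, seed, sigma=0.1):
        """image: 2-D float array; seed: (row, col) chosen by the user (interior)."""
        h, w = image.shape
        nid = lambda r, c: r * w + c
        G = nx.DiGraph()
        for r in range(h):
            for c in range(w):
                for dr, dc in ((0, 1), (1, 0)):
                    r2, c2 = r + dr, c + dc
                    if r2 < h and c2 < w:
                        cap = float(np.exp(-(image[r, c] - image[r2, c2]) ** 2 / (2 * sigma ** 2)))
                        G.add_edge(nid(r, c), nid(r2, c2), capacity=cap)
                        G.add_edge(nid(r2, c2), nid(r, c), capacity=cap)
        G.add_edge("SRC", nid(*seed), capacity=float("inf"))   # hard foreground seed
        for c in range(w):                                     # top border assumed background
            G.add_edge(nid(0, c), "SNK", capacity=float("inf"))
        _, (fg, _) = nx.minimum_cut(G, "SRC", "SNK")
        mask = np.zeros((h, w), bool)
        for n in fg:
            if n != "SRC":
                mask[n // w, n % w] = True
        return mask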
NASA Technical Reports Server (NTRS)
Sulyma, P. R.
1980-01-01
Fundamental equations and similarity definition and application are described as well as the computational steps of a computer program developed to design model nozzles for wind tunnel tests conducted to define power-on aerodynamic characteristics of the space shuttle over a range of ascent trajectory conditions. The computer code capabilities, a user's guide for the model nozzle design program, and the output format are examined. A program listing is included.
SMASH - semi-automatic muscle analysis using segmentation of histology: a MATLAB application.
Smith, Lucas R; Barton, Elisabeth R
2014-01-01
Histological assessment of skeletal muscle tissue is commonly applied to many areas of skeletal muscle physiological research. Histological parameters including fiber distribution, fiber type, centrally nucleated fibers, and capillary density are all frequently quantified measures of skeletal muscle. These parameters reflect functional properties of muscle and undergo adaptation in many muscle diseases and injuries. While standard operating procedures have been developed to guide analysis of many of these parameters, software to freely, efficiently, and consistently analyze them is not readily available. In order to provide this service to the muscle research community, we developed an open-source MATLAB script to analyze immunofluorescent muscle sections, incorporating user controls for muscle histological analysis. The software consists of multiple functions designed to provide tools for the analysis selected. Initial segmentation and fiber-filter functions segment the image and remove non-fiber elements based on user-defined parameters to create a fiber mask. Using parameters set by the user, the software outputs data on fiber size and type, centrally nucleated fibers, and other structures. These functions were evaluated on stained soleus muscle sections from 1-year-old wild-type and mdx mice, a model of Duchenne muscular dystrophy. In accordance with previously published data, fiber size was not different between groups, but mdx muscles had much higher fiber size variability. The mdx muscle had a significantly greater proportion of type I fibers, but type I fibers did not change in size relative to type II fibers. Centrally nucleated fibers were highly prevalent in mdx muscle and were significantly larger than peripherally nucleated fibers. The MATLAB code described and provided along with this manuscript is designed for image processing of skeletal muscle immunofluorescent histological sections. The program allows for semi-automated fiber detection along with user correction. The output of the code provides data in accordance with established standards of practice. The results of the program have been validated using a small set of wild-type and mdx muscle sections. This program is the first freely available and open-source image processing program designed to automate analysis of skeletal muscle histological sections.
Tree Cover Mapping Tool—Documentation and user manual
Cotillon, Suzanne E.; Mathis, Melissa L.
2016-06-02
The Tree Cover Mapping (TCM) tool was developed by scientists at the U.S. Geological Survey Earth Resources Observation and Science Center to allow a user to quickly map tree cover density over large areas using visual interpretation of high resolution imagery within a geographic information system interface. The TCM tool uses a systematic sample grid to produce maps of tree cover. The TCM tool allows the user to define sampling parameters to estimate tree cover within each sample unit. This mapping method generated the first on-farm tree cover maps of vast regions of Niger and Burkina Faso. The approach contributes to implementing integrated landscape management to scale up re-greening and restore degraded land in the drylands of Africa. The TCM tool is easy to operate, practical, and can be adapted to many other applications such as crop mapping, settlements mapping, or other features. This user manual provides step-by-step instructions for installing and using the tool, and creating tree cover maps. Familiarity with ArcMap tools and concepts is helpful for using the tool.
On the Study of Cognitive Bidirectional Relaying with Asymmetric Traffic Demands
NASA Astrophysics Data System (ADS)
Ji, Xiaodong
2015-05-01
In this paper, we consider a cognitive radio network scenario, where two primary users want to exchange information with each other and meanwhile, one secondary node wishes to send messages to a cognitive base station. To meet the target quality of service (QoS) of the primary users and raise the communication opportunity of the secondary nodes, a cognitive bidirectional relaying (BDR) scheme is examined. First, system outage probabilities of conventional direct transmission and BDR schemes are presented. Next, a new system parameter called operating region is defined and calculated, which indicates in which position a secondary node can be a cognitive relay to assist the primary users. Then, a cognitive BDR scheme is proposed, giving a transmission protocol along with a time-slot splitting algorithm between the primary and secondary transmissions. Information-theoretic metric of ergodic capacity is studied for the cognitive BDR scheme to evaluate its performance. Simulation results show that with the proposed scheme, the target QoS of the primary users can be guaranteed, while increasing the communication opportunity for the secondary nodes.
NASA Technical Reports Server (NTRS)
Pindera, Marek-Jerzy; Salzar, Robert S.; Williams, Todd O.
1994-01-01
A user's guide for the computer program OPTCOMP is presented in this report. This program provides a capability to optimize the fabrication or service-induced residual stresses in uni-directional metal matrix composites subjected to combined thermo-mechanical axisymmetric loading using compensating or compliant layers at the fiber/matrix interface. The user specifies the architecture and the initial material parameters of the interfacial region, which can be either elastic or elastoplastic, and defines the design variables, together with the objective function, the associated constraints and the loading history through a user-friendly data input interface. The optimization procedure is based on an efficient solution methodology for the elastoplastic response of an arbitrarily layered multiple concentric cylinder model that is coupled to the commercial optimization package DOT. The solution methodology for the arbitrarily layered cylinder is based on the local-global stiffness matrix formulation and Mendelson's iterative technique of successive elastic solutions developed for elastoplastic boundary-value problems. The optimization algorithm employed in DOT is based on the method of feasible directions.
Eye-gaze determination of user intent at the computer interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldberg, J.H.; Schryver, J.C.
1993-12-31
Determination of user intent at the computer interface through eye-gaze monitoring can significantly aid applications for the disabled, as well as telerobotics and process control interfaces. Whereas current eye-gaze control applications are limited to object selection and x/y gazepoint tracking, a methodology was developed here to discriminate a more abstract interface operation: zooming in or out. This methodology first collects samples of eye-gaze location looking at controlled stimuli, at 30 Hz, just prior to a user's decision to zoom. The sample is broken into data frames, or temporal snapshots. Within a data frame, all spatial samples are connected into a minimum spanning tree, then clustered, according to user-defined parameters. Each cluster is mapped to one in the prior data frame, and statistics are computed from each cluster. These characteristics include cluster size, position, and pupil size. A multiple discriminant analysis uses these statistics both within and between data frames to formulate optimal rules for assigning the observations into zoom-in, zoom-out, or no-zoom conditions. The statistical procedure effectively generates heuristics for future assignments, based upon these variables. Future work will enhance the accuracy and precision of the modeling technique, and will empirically test users in controlled experiments.
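The frame-wise clustering step lends itself to a compact sketch (Python; illustrative only, with a single cut threshold standing in for the paper's user-defined parameters):

    # Join the gaze samples of one data frame into a minimum spanning tree,
    # then sever edges longer than a threshold to obtain clusters.
    import numpy as np
    from scipy.sparse.csgraph import connected_components, minimum_spanning_tree
    from scipy.spatial.distance import pdist, squareform

    def cluster_gaze(points, max_edge=30.0):
        """points: (n, 2) gaze samples in pixels; max_edge: cut threshold in pixels."""
        mst = minimum_spanning_tree(squareform(pdist(points))).toarray()
        mst[mst > max_edge] = 0.0                       # remove long MST edges
        n_clusters, labels = connected_components(mst, directed=False)
        return n_clusters, labels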
Lighting Simulation and Design Program (LSDP)
NASA Astrophysics Data System (ADS)
Smith, D. A.
This computer program simulates a user-defined lighting configuration. It has been developed as a tool to aid in the design of exterior lighting systems. Although this program is used primarily for perimeter security lighting design, it has potential use for any application where the light can be approximated by a point source. A data base of luminaire photometric information is maintained for use with this program. The user defines the surface area to be illuminated with a rectangular grid and specifies luminaire positions. Illumination values are calculated for regularly spaced points in that area and isolux contour plots are generated. The numerical and graphical output for a particular site model are then available for analysis. The amount of time spent on point-to-point illumination computation with this program is much less than that required for tedious hand calculations. The ease with which various parameters can be interactively modified with the program also reduces the time and labor expended. Consequently, the feasibility of design ideas can be examined, modified, and retested more thoroughly, and overall design costs can be substantially lessened by using this program as an adjunct to the design process.
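The point-by-point computation reduces to the inverse-square cosine law for each luminaire; a minimal sketch (Python, assuming isotropic point sources rather than the program's photometric data base):

    # Horizontal illuminance on a grid: E = I * cos(theta) / d^2, summed over sources.
    import numpy as np

    def illuminance_grid(luminaires, xs, ys):
        """luminaires: iterable of (x, y, mounting_height_m, intensity_cd)."""
        X, Y = np.meshgrid(xs, ys)
        E = np.zeros_like(X, dtype=float)
        for lx, ly, h, intensity in luminaires:
            d2 = (X - lx) ** 2 + (Y - ly) ** 2 + h ** 2   # squared distance to source
            cos_theta = h / np.sqrt(d2)                   # tilt from the vertical
            E += intensity * cos_theta / d2               # lux
        return E                                          # contour E for isolux plots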
NASA Astrophysics Data System (ADS)
Lee, Dong Hyuk; Kim, JongHyo; Kim, Hee C.; Lee, Yong W.; Min, Byong Goo
1997-04-01
There have been a number of studies on the quantitative evaluation of diffuse liver disease by using texture analysis techniques. However, the previous studies have focused on the classification between only normal and abnormal patterns based on textural properties, resulting in a lack of clinically useful information about the progressive status of liver disease. Considering our collaborative research experience with clinical experts, we judged that not only texture information but also several shape properties are necessary in order to successfully classify between various states of disease with liver ultrasonograms. Nine image parameters were selected experimentally. One of these was a texture parameter and the others were shape parameters measured as length, area and curvature. We have developed a neural-net algorithm that classifies liver ultrasonograms into 9 categories of liver disease: 3 main categories and 3 sub-steps for each. The nine parameters were collected semi-automatically from the user by using a graphical user interface tool, and then processed to give a grade for each parameter. The classifying algorithm consists of two steps. In the first step, each parameter is graded into pre-defined levels using a neural network. In the next step, a neural network classifier determines the disease status using the graded nine parameters. We implemented a PC-based computer-assisted diagnosis workstation and installed it in the radiology department of Seoul National University Hospital. Using this workstation we collected 662 cases during 6 months. Some of these were used for training and others were used for evaluating the accuracy of the developed algorithm. In conclusion, a liver ultrasonogram classifying algorithm was developed using both texture and shape parameters and a neural network classifier. Preliminary results indicate that the proposed algorithm is useful for evaluation of diffuse liver disease.
FIT-MART: Quantum Magnetism with a Gentle Learning Curve
NASA Astrophysics Data System (ADS)
Engelhardt, Larry; Garland, Scott C.; Rainey, Cameron; Freeman, Ray A.
We present a new open-source software package, FIT-MART, that allows non-experts to quickly get started simulating quantum magnetism. FIT-MART can be downloaded as a platform-independent executable Java (JAR) file. It allows the user to define (Heisenberg) Hamiltonians by electronically drawing pictures that represent quantum spins and operators. Sliders are automatically generated to control the values of the parameters in the model, and when the values change, several plots are updated in real time to display both the resulting energy spectra and the equilibrium magnetic properties. Several experimental data sets for real magnetic molecules are included in FIT-MART to allow easy comparison between simulated and experimental data, and FIT-MART users can also import their own data for analysis and compare the goodness of fit for different models.
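For orientation, the (isotropic) Heisenberg Hamiltonians that such a tool lets the user draw are conventionally of the form (sign conventions vary between references; this is not quoted from the paper)

    H = -\sum_{i<j} J_{ij} \, \hat{\mathbf{S}}_i \cdot \hat{\mathbf{S}}_j + g \mu_B B \sum_i \hat{S}_i^{z} ,

where the J_{ij} are the exchange couplings represented by the drawn operators and the second (Zeeman) term couples the spins to an applied field B.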
A multi-objective approach to improve SWAT model calibration in alpine catchments
NASA Astrophysics Data System (ADS)
Tuo, Ye; Marcolini, Giorgia; Disse, Markus; Chiogna, Gabriele
2018-04-01
Multi-objective hydrological model calibration can represent a valuable solution to reduce model equifinality and parameter uncertainty. The Soil and Water Assessment Tool (SWAT) model is widely applied to investigate water quality and water management issues in alpine catchments. However, the model calibration is generally based on discharge records only, and most of the previous studies have defined a unique set of snow parameters for an entire basin. Only a few studies have considered snow observations to validate model results or have taken into account the possible variability of snow parameters for different subbasins. This work presents and compares three possible calibration approaches. The first two procedures are single-objective calibration procedures, for which all parameters of the SWAT model were calibrated according to river discharge alone. Procedures I and II differ from each other by the assumption used to define snow parameters: The first approach assigned a unique set of snow parameters to the entire basin, whereas the second approach assigned different subbasin-specific sets of snow parameters to each subbasin. The third procedure is a multi-objective calibration, in which we considered snow water equivalent (SWE) information at two different spatial scales (i.e. subbasin and elevation band), in addition to discharge measurements. We tested these approaches in the Upper Adige river basin where a dense network of snow depth measurement stations is available. Only the set of parameters obtained with this multi-objective procedure provided an acceptable prediction of both river discharge and SWE. These findings offer the large community of SWAT users a strategy to improve SWAT modeling in alpine catchments.
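The kind of aggregated objective such a multi-objective calibration can minimize is easy to sketch (Python; the use of Nash-Sutcliffe efficiency and equal weights here is our illustrative assumption, not necessarily the paper's exact formulation):

    # Combine goodness-of-fit for discharge and snow water equivalent (SWE).
    import numpy as np

    def nse(sim, obs):
        """Nash-Sutcliffe efficiency; 1 is a perfect fit."""
        sim, obs = np.asarray(sim, float), np.asarray(obs, float)
        return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    def combined_objective(q_sim, q_obs, swe_sim, swe_obs, w_q=0.5, w_swe=0.5):
        # negated weighted sum so a generic minimizer can be applied
        return -(w_q * nse(q_sim, q_obs) + w_swe * nse(swe_sim, swe_obs))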
A Procedure for High Resolution Satellite Imagery Quality Assessment
Crespi, Mattia; De Vendictis, Laura
2009-01-01
Data products generated from High Resolution Satellite Imagery (HRSI) are routinely evaluated during the so-called in-orbit test period, in order to verify whether their quality meets the desired specifications and, if necessary, to obtain the image correction parameters to be used at the ground processing center. Nevertheless, it is often useful to have tools to evaluate image quality also at the final-user level. Image quality is defined by several parameters, such as the radiometric resolution and its accuracy, represented by the noise level, and the geometric resolution and sharpness, described by the Modulation Transfer Function (MTF). This paper proposes a procedure to evaluate these image quality parameters; the procedure was implemented in software and tested on high resolution imagery acquired by the QuickBird, WorldView-1 and Cartosat-1 satellites. PMID:22412312
DD3MAT - a code for yield criteria anisotropy parameters identification.
NASA Astrophysics Data System (ADS)
Barros, P. D.; Carvalho, P. D.; Alves, J. L.; Oliveira, M. C.; Menezes, L. F.
2016-08-01
This work presents the main strategies and algorithms adopted in the DD3MAT in-house code, specifically developed for identifying anisotropy parameters. The algorithm adopted is based on the minimization of an error function, using a downhill simplex method. The set of experimental values can include yield stresses and r-values obtained from in-plane tension for different angles with the rolling direction (RD), the yield stress and r-value obtained for a biaxial stress state, and yield stresses from shear tests, performed also for different angles to the RD. All these values can be defined for a specific value of plastic work. Moreover, it can also include the yield stresses obtained from in-plane compression tests. The anisotropy parameters are identified for an AA2090-T3 aluminium alloy, highlighting the importance of user intervention to improve the numerical fit.
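The identification loop is a standard least-squares fit driven by a downhill simplex; a hedged sketch (Python with SciPy's Nelder-Mead; the predict() callback standing in for the yield criterion is hypothetical):

    # Minimize relative residuals between measured and predicted anisotropy data.
    import numpy as np
    from scipy.optimize import minimize

    def identify(params0, angles, sigma_exp, r_exp, predict):
        """predict(params, angle) -> (yield_stress, r_value) for the chosen criterion."""
        def error(params):
            err = 0.0
            for a, s, r in zip(angles, sigma_exp, r_exp):
                s_mod, r_mod = predict(params, a)
                err += ((s_mod - s) / s) ** 2 + ((r_mod - r) / r) ** 2
            return err
        return minimize(error, params0, method="Nelder-Mead")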
Application-Defined Decentralized Access Control
Xu, Yuanzhong; Dunn, Alan M.; Hofmann, Owen S.; Lee, Michael Z.; Mehdi, Syed Akbar; Witchel, Emmett
2014-01-01
DCAC is a practical OS-level access control system that supports application-defined principals. It allows normal users to perform administrative operations within their privilege, enabling isolation and privilege separation for applications. It does not require centralized policy specification or management, giving applications freedom to manage their principals while the policies are still enforced by the OS. DCAC uses hierarchically-named attributes as a generic framework for user-defined policies such as groups defined by normal users. For both local and networked file systems, its execution time overhead is between 0%–9% on file system microbenchmarks, and under 1% on applications. This paper shows the design and implementation of DCAC, as well as several real-world use cases, including sandboxing applications, enforcing server applications’ security policies, supporting NFS, and authenticating user-defined sub-principals in SSH, all with minimal code changes. PMID:25426493
Nieke, Jens; Reusen, Ils
2007-01-01
User-driven requirements for remote sensing data are difficult to define, especially details on geometric, spectral and radiometric parameters. Even more difficult is a decent assessment of the required degrees of processing and the corresponding data quality. It is therefore a real challenge to appropriately assess data costs and the services to be provided. In 2006, the HYRESSA project was initiated within the Framework 6 Programme of the European Commission to analyze the user needs of the hyperspectral remote sensing community. Special focus was given to finding an answer to the key question, “What are the individual user requirements for hyperspectral imagery and its related data products?”. A Value-Benefit Analysis (VBA) was performed to retrieve user needs and address open items accordingly. The VBA is an established tool for systematic problem solving by supporting the possibility of comparing competing projects or solutions. It enables evaluation on the basis of a multidimensional objective model and can be augmented with experts' preferences. After the VBA, a scaling method (e.g., the Law of Comparative Judgment) was applied to achieve the desired ranking judgments. The result, which is the relative value of projects with respect to a well-defined main objective, can therefore be produced analytically using a VBA. A multidimensional objective model adhering to VBA methodology was established. Thereafter, end users and experts were requested to fill out a Questionnaire of User Needs (QUN) at the highest level of detail, the value indicator level. Each end user was additionally requested to report personal preferences for his or her particular research field. In the end, results from the experts' evaluation and results from a sensor data survey can be compared in order to understand user needs and the drawbacks of existing data products. The investigation, focusing on the needs of the hyperspectral user community in Europe, showed that a VBA is a suitable method for analyzing the needs of hyperspectral data users and supporting the sensor/data specification-building process. The VBA has the advantage of being easy to handle, resulting in a comprehensive evaluation. The primary disadvantage is the large effort involved in realizing such an analysis, because the level of detail is extremely high.
SU-F-T-301: Planar Dose Pass Rate Inflation Due to the MapCHECK Measurement Uncertainty Function
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, D; Spaans, J; Kumaraswamy, L
Purpose: To quantify the effect of the Measurement Uncertainty function on planar dosimetry pass rates, as analyzed with Sun Nuclear Corporation analytic software (“MapCHECK” or “SNC Patient”). This optional function is toggled on by default upon software installation, and automatically increases the user-defined dose percent difference (%Diff) tolerance for each planar dose comparison. Methods: Dose planes from 109 IMRT fields and 40 VMAT arcs were measured with the MapCHECK 2 diode array, and compared to calculated planes from a commercial treatment planning system. Pass rates were calculated within the SNC analytic software using varying calculation parameters, including Measurement Uncertainty on and off. By varying the %Diff criterion for each dose comparison performed with Measurement Uncertainty turned off, an effective %Diff criterion was defined for each field/arc corresponding to the pass rate achieved with MapCHECK Uncertainty turned on. Results: For 3%/3 mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.8–1.1% on average, depending on plan type and calculation technique, for an average pass rate increase of 1.0–3.5% (maximum +8.7%). For 2%/2 mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.7–1.2% on average, for an average pass rate increase of 3.5–8.1% (maximum +14.2%). The largest increases in pass rate are generally seen with poorly-matched planar dose comparisons; the MapCHECK Uncertainty effect is markedly smaller as pass rates approach 100%. Conclusion: The Measurement Uncertainty function may substantially inflate planar dose comparison pass rates for typical IMRT and VMAT planes. The types of uncertainties incorporated into the function (and their associated quantitative estimates) as described in the software user’s manual may not accurately estimate realistic measurement uncertainty for the user’s measurement conditions. Pass rates listed in published reports or otherwise compared to the results of other users or vendors should clearly indicate whether the Measurement Uncertainty function is used.
NASA Technical Reports Server (NTRS)
Owre, Sam; Shankar, Natarajan; Butler, Ricky W. (Technical Monitor)
2001-01-01
The purpose of this task was to provide a mechanism for theory interpretations in a prototype verification system (PVS) so that it is possible to demonstrate the consistency of a theory by exhibiting an interpretation that validates the axioms. The mechanization makes it possible to show that one collection of theories is correctly interpreted by another collection of theories under a user-specified interpretation for the uninterpreted types and constants. A theory instance is generated and imported, while the axiom instances are generated as proof obligations to ensure that the interpretation is valid. Interpretations can be used to show that an implementation is a correct refinement of a specification, that an axiomatically defined specification is consistent, or that an axiomatically defined specification captures its intended models. In addition, the theory parameter mechanism has been extended with a notion of theory as parameter so that a theory instance can be given as an actual parameter to an imported theory. Theory interpretations can thus be used to refine an abstract specification or to demonstrate the consistency of an axiomatic theory. In this report we describe the mechanism in detail. This extension is a part of PVS version 3.0, which will be publicly released in mid-2001.
Fast simulation tool for ultraviolet radiation at the earth's surface
NASA Astrophysics Data System (ADS)
Engelsen, Ola; Kylling, Arve
2005-04-01
FastRT is a fast, yet accurate, UV simulation tool that computes downward surface UV doses, UV indices, and irradiances in the spectral range 290 to 400 nm with a resolution as small as 0.05 nm. It computes a full UV spectrum within a few milliseconds on a standard PC, and enables the user to convolve the spectrum with user-defined and built-in spectral response functions including the International Commission on Illumination (CIE) erythemal response function used for UV index calculations. The program accounts for the main radiative input parameters, i.e., instrumental characteristics, solar zenith angle, ozone column, aerosol loading, clouds, surface albedo, and surface altitude. FastRT is based on look-up tables of carefully selected entries of atmospheric transmittances and spherical albedos, and exploits the smoothness of these quantities with respect to atmospheric, surface, geometrical, and spectral parameters. An interactive site, http://nadir.nilu.no/~olaeng/fastrt/fastrt.html, enables the public to run the FastRT program with most input options. This page also contains updated information about FastRT and links to freely downloadable source codes and binaries.
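The convolution with the built-in CIE erythemal response can be sketched directly, since that action spectrum is standardized (Python; the wavelength grid and irradiance input are illustrative):

    # Weight spectral irradiance (W m^-2 nm^-1) by the CIE erythemal action
    # spectrum and scale by 40 m^2 W^-1 to obtain the UV index.
    import numpy as np

    def cie_erythemal(wl):
        w = np.ones_like(wl, dtype=float)                      # weight 1 up to 298 nm
        w = np.where((wl > 298) & (wl <= 328), 10 ** (0.094 * (298 - wl)), w)
        w = np.where((wl > 328) & (wl <= 400), 10 ** (0.015 * (139 - wl)), w)
        return w

    def uv_index(wl_nm, irradiance):
        return 40.0 * np.trapz(irradiance * cie_erythemal(wl_nm), wl_nm)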
Analyzing Spacecraft Telecommunication Systems
NASA Technical Reports Server (NTRS)
Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric
2004-01-01
Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
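The core of any such link analysis is the decibel link budget; a minimal sketch (Python; this is the generic textbook form, not MMTAT's internal model, and the example numbers are invented):

    # Received power = EIRP - free-space path loss + receive gain - other losses.
    import math

    def free_space_path_loss_db(distance_m, freq_hz):
        wavelength = 299_792_458.0 / freq_hz
        return 20.0 * math.log10(4.0 * math.pi * distance_m / wavelength)

    def received_power_dbw(eirp_dbw, distance_m, freq_hz, rx_gain_db, misc_loss_db=0.0):
        return eirp_dbw - free_space_path_loss_db(distance_m, freq_hz) + rx_gain_db - misc_loss_db

    # Illustrative X-band downlink: 50 dBW EIRP, 2e9 m range, 68 dBi ground antenna
    print(received_power_dbw(50.0, 2e9, 8.4e9, 68.0, misc_loss_db=2.0))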
GEsture: an online hand-drawing tool for gene expression pattern search.
Wang, Chunyan; Xu, Yiqing; Wang, Xuelin; Zhang, Li; Wei, Suyun; Ye, Qiaolin; Zhu, Youxiang; Yin, Hengfu; Nainwal, Manoj; Tanon-Reyes, Luis; Cheng, Feng; Yin, Tongming; Ye, Ning
2018-01-01
Gene expression profiling data provide useful information for the investigation of biological function and processes. However, identifying a specific expression pattern from extensive time series gene expression data is not an easy task. Clustering, a popular method, is often used to classify genes with similar expression; however, genes with a 'desirable' or 'user-defined' pattern cannot be efficiently detected by clustering methods. To address these limitations, we developed an online tool called GEsture. Users can draw, or graph, a curve using a mouse instead of inputting abstract parameters of clustering methods. GEsture explores genes showing similar, opposite and time-delayed expression patterns, taking a gene expression curve as input, from time series datasets. We present three examples that illustrate the capacity of GEsture for gene hunting while following users' requirements. GEsture also provides visualization tools (such as expression pattern figures, heat maps and correlation networks) to display the search results. The outputs may provide useful information for researchers to understand the targets, functions and biological processes of the genes involved.
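One simple way to realize "similar, opposite and time-delayed" matching is lagged Pearson correlation; a hedged sketch (Python; not the GEsture implementation, names illustrative):

    # Rank profiles by |correlation| with the drawn curve over a few lags:
    # high positive r = similar, high negative r = opposite, best lag > 0 = delayed.
    import numpy as np

    def match_patterns(query, profiles, max_lag=2, top=10):
        """query: (t,) drawn curve; profiles: dict name -> (t,) expression series."""
        q = (query - query.mean()) / query.std()
        hits = []
        for name, series in profiles.items():
            s = (series - series.mean()) / series.std()
            for lag in range(max_lag + 1):
                n = len(q) - lag
                r = float(np.corrcoef(q[:n], s[lag:lag + n])[0, 1])
                hits.append((abs(r), r, lag, name))
        hits.sort(reverse=True)
        return hits[:top]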
Takashima, Ryoichi; Takiguchi, Tetsuya; Ariki, Yasuo
2013-02-01
This paper presents a method for discriminating the location of the sound source (talker) using only a single microphone. In a previous work, the single-channel approach for discriminating the location of the sound source was discussed, where the acoustic transfer function from a user's position is estimated by using a hidden Markov model of clean speech in the cepstral domain. In this paper, each cepstral dimension of the acoustic transfer function is newly weighted, in order to obtain the cepstral dimensions having information that is useful for classifying the user's position. Then, this paper proposes a feature-weighting method for the cepstral parameter using multiple kernel learning, defining the base kernels for each cepstral dimension of the acoustic transfer function. The user's position is trained and classified by support vector machine. The effectiveness of this method has been confirmed by sound source (talker) localization experiments performed in different room environments.
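The weighting idea, one base kernel per cepstral dimension combined into a single Gram matrix for an SVM, can be sketched as follows (Python with scikit-learn; the weights are placeholders here, whereas the paper learns them via multiple kernel learning):

    # Combined kernel K = sum_d beta_d * K_d, one RBF base kernel per dimension.
    import numpy as np
    from sklearn.svm import SVC

    def combined_kernel(X, Y, betas, gamma=1.0):
        """X: (n, D), Y: (m, D) cepstral features; betas: (D,) per-dimension weights."""
        K = np.zeros((X.shape[0], Y.shape[0]))
        for d, beta in enumerate(betas):
            diff = X[:, d:d + 1] - Y[:, d:d + 1].T      # pairwise 1-D differences
            K += beta * np.exp(-gamma * diff ** 2)
        return K

    # clf = SVC(kernel="precomputed").fit(combined_kernel(Xtr, Xtr, betas), ytr)
    # pred = clf.predict(combined_kernel(Xte, Xtr, betas))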
On the selection of user-defined parameters in data-driven stochastic subspace identification
NASA Astrophysics Data System (ADS)
Priori, C.; De Angelis, M.; Betti, R.
2018-02-01
The paper focuses on the time domain output-only technique called Data-Driven Stochastic Subspace Identification (DD-SSI); in order to identify modal models (frequencies, damping ratios and mode shapes), the role of its user-defined parameters is studied, and rules to determine their minimum values are proposed. Such investigation is carried out using, first, the time histories of structural responses to stationary excitations, with a large number of samples, satisfying the hypothesis on the input imposed by DD-SSI. Then, the case of non-stationary seismic excitations with a reduced number of samples is considered. In this paper, partitions of the data matrix different from the one proposed in the SSI literature are investigated, together with the influence of different choices of the weighting matrices. The study is carried out considering two different applications: (1) data obtained from vibration tests on a scaled structure and (2) in-situ tests on a reinforced concrete building. Referring to the former, the identification of a steel frame structure tested on a shaking table is performed using its responses in terms of absolute accelerations to a stationary (white noise) base excitation and to non-stationary seismic excitations of low intensity. Black-box and modal models are identified in both cases and the results are compared with those from an input-output subspace technique. With regards to the latter, the identification of a complex hospital building is conducted using data obtained from ambient vibration tests.
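The data matrix in question is the output block Hankel matrix, whose number of block rows is one of the user-defined parameters studied; a minimal construction sketch (Python; the 1/sqrt(j) scaling is the convention commonly used in the SSI literature):

    # Build the 2i-block-row output Hankel matrix from l channels and N samples.
    import numpy as np

    def block_hankel(y, i):
        """y: (l, N) output data; i: number of block rows in the past/future partitions."""
        l, N = y.shape
        j = N - 2 * i + 1                        # number of columns
        H = np.empty((2 * i * l, j))
        for k in range(2 * i):
            H[k * l:(k + 1) * l, :] = y[:, k:k + j]
        return H / np.sqrt(j)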
al3c: high-performance software for parameter inference using Approximate Bayesian Computation.
Stram, Alexander H; Marjoram, Paul; Chen, Gary K
2015-11-01
The development of Approximate Bayesian Computation (ABC) algorithms for parameter inference which are both computationally efficient and scalable in parallel computing environments is an important area of research. Monte Carlo rejection sampling, a fundamental component of ABC algorithms, is trivial to distribute over multiple processors but is inherently inefficient. While development of algorithms such as ABC Sequential Monte Carlo (ABC-SMC) help address the inherent inefficiencies of rejection sampling, such approaches are not as easily scaled on multiple processors. As a result, current Bayesian inference software offerings that use ABC-SMC lack the ability to scale in parallel computing environments. We present al3c, a C++ framework for implementing ABC-SMC in parallel. By requiring only that users define essential functions such as the simulation model and prior distribution function, al3c abstracts the user from both the complexities of parallel programming and the details of the ABC-SMC algorithm. By using the al3c framework, the user is able to scale the ABC-SMC algorithm in parallel computing environments for his or her specific application, with minimal programming overhead. al3c is offered as a static binary for Linux and OS-X computing environments. The user completes an XML configuration file and C++ plug-in template for the specific application, which are used by al3c to obtain the desired results. Users can download the static binaries, source code, reference documentation and examples (including those in this article) by visiting https://github.com/ahstram/al3c. astram@usc.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
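The rejection-sampling core that such frameworks parallelize fits in a few lines (Python; a serial didactic sketch, not al3c's C++ API):

    # Draw from the prior, simulate, and keep parameters whose summary
    # statistics land within epsilon of the observed data.
    import numpy as np

    def abc_rejection(prior_draw, simulate, distance, observed, eps, n_accept):
        accepted = []
        while len(accepted) < n_accept:
            theta = prior_draw()                          # user-defined prior sampler
            if distance(simulate(theta), observed) <= eps:
                accepted.append(theta)
        return np.array(accepted)

    # Example (illustrative): posterior for a normal mean with known sd = 1
    # post = abc_rejection(lambda: np.random.uniform(-5, 5),
    #                      lambda m: np.random.normal(m, 1, 100).mean(),
    #                      lambda s, o: abs(s - o), observed=0.3, eps=0.05, n_accept=500)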
User-Friendly Interface Developed for a Web-Based Service for SpaceCAL Emulations
NASA Technical Reports Server (NTRS)
Liszka, Kathy J.; Holtz, Allen P.
2004-01-01
A team at the NASA Glenn Research Center is developing a Space Communications Architecture Laboratory (SpaceCAL) for protocol development activities for coordinated satellite missions. SpaceCAL will provide a multiuser, distributed system to emulate space-based Internet architectures, backbone networks, formation clusters, and constellations. As part of a new effort in 2003, building blocks are being defined for an open distributed system to make the satellite emulation test bed accessible through an Internet connection. The first step in creating a Web-based service to control the emulation remotely is providing a user-friendly interface for encoding the data into a well-formed and complete Extensible Markup Language (XML) document. XML provides coding that allows data to be transferred between dissimilar systems. Scenario specifications include control parameters, network routes, interface bandwidths, delay, and bit error rate. Specifications for all satellites, instruments, and ground stations in a given scenario are also included in the XML document. For the SpaceCAL emulation, the XML document can be created using XForms, a Web-based forms language for data collection. Unlike older forms technology, the interactive user interface makes the science, not the data representation, prevalent. Required versus optional input fields, default values, automatic calculations, data validation, and reuse will help researchers quickly and accurately define missions. XForms can apply any XML schema defined for the test mission to validate data before forwarding it to the emulation facility. New instrument definitions, facilities, and mission types can be added to the existing schema. The first prototype user interface incorporates components for interactive input and form processing. Internet address, data rate, and the location of the facility are implemented with basic form controls, with default values provided for convenience and efficiency using basic XForms operations. Because different emulation scenarios will vary widely in their component structure, more complex operations are used to add and delete facilities.
The varieties of ecstatic experience: an exploration of the subjective experiences of ecstasy.
Sumnall, Harry R; Cole, Jon C; Jerome, Lisa
2006-09-01
Previous investigations of the subjective effects of MDMA (material sold as ecstasy) have conducted interviews and surveys of various groups of ecstasy users within particular sub-populations. This study examined subjective drug effects reported by different sub-populations of ecstasy users and explored whether the function or purpose served by using ecstasy influenced the nature of the drug experience. Drawing on previous measures of alterations in consciousness, psychedelic drugs and cannabis, and informal interviews with ecstasy users and MDMA researchers, a 130-item survey assessing subjective effects of ecstasy/MDMA was developed. Principal components analysis of responses of ecstasy users revealed six components: perceptual alterations, entactogenic effects, prosocial effects, aesthetic effects, negative effects and sexual effects. The derived scale was used to predict ecstasy use behaviours, and the functions and experiences of use. A variety of component scores were related to ecstasy use parameters; in particular, heavier users expected fewer negative, perceptual and aesthetic effects from taking the drug. The reasons given for using ecstasy (use function) also influenced reported drug effects. Abstainers expected greater negative, perceptual, aesthetic and sexual effects than users. These data indicate that the subjective ecstasy experience is influenced by a variety of extra-psychopharmacological factors. Drug intervention strategies may be made more effective by targeting particular user groups defined by the reasons given for substance use, as it is likely that their experiences of ecstasy effects will differ. Future research into ecstasy may be improved by recognizing user diversity.
Woldegebriel, Michael; Zomer, Paul; Mol, Hans G J; Vivó-Truyols, Gabriel
2016-08-02
In this work, we introduce an automated, efficient, and elegant model to combine all pieces of evidence (e.g., expected retention times, peak shapes, isotope distributions, fragment-to-parent ratio) obtained from liquid chromatography-tandem mass spectrometry (LC-MS/MS) data for screening purposes. Combining all these pieces of evidence requires a careful assessment of the uncertainties in the analytical system as well as of all possible outcomes. To date, the majority of the existing algorithms are highly dependent on user input parameters. Additionally, the screening process is tackled as a deterministic problem. In this work we present a Bayesian framework to deal with the combination of all these pieces of evidence. Contrary to conventional algorithms, the information is treated in a probabilistic way, and a final probability assessment of the presence/absence of a compound feature is computed. Additionally, all the necessary parameters except the chromatographic band broadening are learned from the data in the training and learning phase of the algorithm, avoiding the introduction of a large number of user-defined parameters. The proposed method was validated with a large data set and has shown improved sensitivity and specificity in comparison to a threshold-based commercial software package.
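Under a conditional-independence assumption, the probabilistic combination step reduces to Bayes' rule over per-evidence likelihoods; a hedged sketch (Python; the numbers are purely illustrative, and the paper's actual likelihood models are richer):

    # Posterior that the compound is present, given independent pieces of evidence.
    import numpy as np

    def posterior_presence(prior, lik_present, lik_absent):
        """lik_present[i] = P(evidence_i | present); lik_absent[i] = P(evidence_i | absent)."""
        num = prior * np.prod(lik_present)
        den = num + (1.0 - prior) * np.prod(lik_absent)
        return num / den

    print(posterior_presence(0.5, [0.9, 0.8, 0.7], [0.2, 0.3, 0.4]))  # ~0.95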
NASA Astrophysics Data System (ADS)
Abedini, M. J.; Nasseri, M.; Burn, D. H.
2012-04-01
In any geostatistical study, an important consideration is the choice of an appropriate, repeatable, and objective search strategy that controls the nearby samples to be included in the location-specific estimation procedure. Almost all geostatistical software available in the market puts the onus on the user to supply search strategy parameters in a heuristic manner. These parameters are solely controlled by geographical coordinates that are defined for the entire area under study, and the user has no guidance as to how to choose these parameters. The main thesis of the current study is that the selection of search strategy parameters has to be driven by data—both the spatial coordinates and the sample values—and cannot be chosen beforehand. For this purpose, a genetic-algorithm-based ordinary kriging with moving neighborhood technique is proposed. The search capability of a genetic algorithm is exploited to search the feature space for appropriate, either local or global, search strategy parameters. Radius of circle/sphere and/or radii of standard or rotated ellipse/ellipsoid are considered as the decision variables to be optimized by GA. The superiority of GA-based ordinary kriging is demonstrated through application to the Wolfcamp Aquifer piezometric head data. Assessment of numerical results showed that definition of search strategy parameters based on both geographical coordinates and sample values improves cross-validation statistics when compared with that based on geographical coordinates alone. In the case of a variable search neighborhood for each estimation point, optimization of local search strategy parameters for an elliptical support domain—the orientation of which is dictated by anisotropic axes—via GA was able to capture the dynamics of piezometric head in west Texas/New Mexico in an efficient way.
Angular filter refractometry analysis using simulated annealing.
Angland, P; Haberberger, D; Ivancic, S T; Froula, D H
2017-10-01
Angular filter refractometry (AFR) is a novel technique used to characterize the density profiles of laser-produced, long-scale-length plasmas [Haberberger et al., Phys. Plasmas 21, 056304 (2014)]. A new method of analysis for AFR images was developed using an annealing algorithm to iteratively converge upon a solution. A synthetic AFR image is constructed from a user-defined density profile described by eight parameters, and the algorithm systematically alters the parameters until the comparison with the measured image is optimized. The optimization and statistical uncertainty calculation are based on the minimization of the χ² test statistic. The algorithm was successfully applied to experimental data of plasma expanding from a flat, laser-irradiated target, resulting in an average uncertainty in the density profile of 5%-20% in the region of interest.
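The annealing loop itself is generic; a minimal sketch (Python; the chi2() callback comparing synthetic and measured AFR images, and all schedule constants, are placeholders):

    # Perturb the eight profile parameters and accept uphill moves with a
    # probability that shrinks as the temperature cools.
    import numpy as np

    def anneal(params, chi2, steps=10_000, t0=1.0, cooling=0.999, scale=0.01, seed=0):
        rng = np.random.default_rng(seed)
        cur = np.asarray(params, float)
        best, best_cost = cur.copy(), chi2(cur)
        cur_cost, T = best_cost, t0
        for _ in range(steps):
            cand = cur + scale * rng.standard_normal(cur.size)
            cost = chi2(cand)
            if cost < cur_cost or rng.random() < np.exp((cur_cost - cost) / T):
                cur, cur_cost = cand, cost
                if cost < best_cost:
                    best, best_cost = cand.copy(), cost
            T *= cooling
        return best, best_cost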
Dual ant colony operational modal analysis parameter estimation method
NASA Astrophysics Data System (ADS)
Sitarz, Piotr; Powałka, Bartosz
2018-01-01
Operational Modal Analysis (OMA) is a common technique used to examine the dynamic properties of a system. Contrary to experimental modal analysis, the input signal is generated in the object's ambient environment. Operational modal analysis mainly aims at determining the number of pole pairs and at estimating modal parameters. Many methods are used for parameter identification. Some methods operate in the time domain while others operate in the frequency domain; the former use correlation functions, the latter spectral density functions. However, while some methods require the user to select poles from a stabilisation diagram, others try to automate the selection process. The dual ant colony operational modal analysis parameter estimation method (DAC-OMA) presents a new approach to the problem, avoiding the issues involved in the stabilisation diagram. The presented algorithm is fully automated. It uses deterministic methods to define the intervals of the estimated parameters, thus reducing the problem to an optimisation task which is conducted with dedicated software based on an ant colony optimisation algorithm. The combination of deterministic methods restricting parameter intervals and artificial intelligence yields very good results, also for closely spaced modes and significantly varied mode shapes within one measurement point.
NASA Technical Reports Server (NTRS)
Jordan, J.
1985-01-01
This document is intended for users of the Local Area Network Extensible Simulator, version I. This simulator models the performance of a Fiber Optic network under a variety of loading conditions and network characteristics. The options available to the user for defining the network conditions are described in this document. Computer hardware and software requirements are also defined.
MARTe: A Multiplatform Real-Time Framework
NASA Astrophysics Data System (ADS)
Neto, André C.; Sartori, Filippo; Piccolo, Fabio; Vitelli, Riccardo; De Tommasi, Gianmaria; Zabeo, Luca; Barbalace, Antonio; Fernandes, Horacio; Valcarcel, Daniel F.; Batista, Antonio J. N.
2010-04-01
Development of real-time applications is usually associated with nonportable code targeted at specific real-time operating systems. The boundary between hardware drivers, system services, and user code is commonly not well defined, making development on the target host significantly difficult. The Multithreaded Application Real-Time executor (MARTe) is a framework built over a multiplatform library that allows the execution of the same code in different operating systems. The framework provides the high-level interfaces with hardware, external configuration programs, and user interfaces, assuring at the same time hard real-time performance. End-users of the framework are required to define and implement algorithms inside a well-defined block of software, named Generic Application Module (GAM), that is executed by the real-time scheduler. Each GAM is reconfigurable with a set of predefined configuration meta-parameters and interchanges information using a set of data pipes that are provided as inputs and required as outputs. Using these connections, different GAMs can be chained either in series or in parallel. GAMs can be developed and debugged in a non-real-time system and, only once the robustness of the code and the correctness of the algorithm are verified, deployed to the real-time system. The software also supplies a large set of utilities that greatly ease the interaction with and debugging of a running system. Among the most useful are a highly efficient real-time logger, HTTP introspection of real-time objects, and HTTP remote configuration. MARTe is currently being used to successfully drive the plasma vertical stabilization controller on the largest magnetic confinement fusion device in the world, with a control loop cycle of 50 μs and a jitter under 1 μs. In this particular project, MARTe is used with the Real-Time Application Interface (RTAI)/Linux operating system, exploiting the new x86 multicore processor technology.
Parkhurst, David L.; Appelo, C.A.J.
1999-01-01
PHREEQC version 2 is a computer program written in the C programming language that is designed to perform a wide variety of low-temperature aqueous geochemical calculations. PHREEQC is based on an ion-association aqueous model and has capabilities for (1) speciation and saturation-index calculations; (2) batch-reaction and one-dimensional (1D) transport calculations involving reversible reactions, which include aqueous, mineral, gas, solid-solution, surface-complexation, and ion-exchange equilibria, and irreversible reactions, which include specified mole transfers of reactants, kinetically controlled reactions, mixing of solutions, and temperature changes; and (3) inverse modeling, which finds sets of mineral and gas mole transfers that account for differences in composition between waters, within specified compositional uncertainty limits.New features in PHREEQC version 2 relative to version 1 include capabilities to simulate dispersion (or diffusion) and stagnant zones in 1D-transport calculations, to model kinetic reactions with user-defined rate expressions, to model the formation or dissolution of ideal, multicomponent or nonideal, binary solid solutions, to model fixed-volume gas phases in addition to fixed-pressure gas phases, to allow the number of surface or exchange sites to vary with the dissolution or precipitation of minerals or kinetic reactants, to include isotope mole balances in inverse modeling calculations, to automatically use multiple sets of convergence parameters, to print user-defined quantities to the primary output file and (or) to a file suitable for importation into a spreadsheet, and to define solution compositions in a format more compatible with spreadsheet programs. This report presents the equations that are the basis for chemical equilibrium, kinetic, transport, and inverse-modeling calculations in PHREEQC; describes the input for the program; and presents examples that demonstrate most of the program's capabilities.
NASA Technical Reports Server (NTRS)
Chevalier, C. T.; Herrmann, K. A.; Kory, C. L.; Wilson, J. D.; Cross, A. W.; Williams, W. D. (Technical Monitor)
2001-01-01
Previously, it was shown that MAFIA (solutions of Maxwell's equations by the Finite Integration Algorithm), a three-dimensional simulation code, can be used to produce accurate cold-test characteristics including frequency-phase dispersion, interaction impedance, and attenuation for traveling-wave tube (TWT) slow-wave structures. In an effort to improve user-friendliness and simulation time, a model was developed to compute the cold-test parameters using the electromagnetic field simulation software package CST MICROWAVE STUDIO (MWS). Cold-test parameters were calculated for several slow-wave circuits including a ferruled coupled-cavity, a folded waveguide, and a novel finned-ladder circuit using both MWS and MAFIA. Comparisons indicate that MWS provides more accurate cold-test data with significantly reduced simulation times. Both MAFIA and MWS are based on the finite integration (FI) method; however, MWS has several advantages over MAFIA. First, it has a Windows based interface for PC operation, making it very user-friendly, whereas MAFIA is UNIX based. MWS uses a new Perfect Boundary Approximation (PBA), which increases the accuracy of the simulations by avoiding stair step approximations associated with MAFIA's representation of structures. Finally, MWS includes a Visual Basic for Applications (VBA) compatible macro language that enables the simulation process to be automated and allows for the optimization of user-defined goal functions, such as interaction impedance.
NASA Technical Reports Server (NTRS)
1993-01-01
The information required by a programmer using the Minimum Hamiltonian AScent Trajectory Evaluation (MASTRE) Program is provided. This document enables the programmer to either modify the program or convert it to computers other than the VAX computer. Documentation for each subroutine or function, based on providing the definitions of the variables and a source listing, is included. Questions concerning the equations, techniques, or input requirements should be answered by either the Engineering or User's manuals. Three appendices are also included, which provide a listing of the Root-Sum-Square (RSS) program, a listing of the subroutine names and definitions used in the MASTRE User Friendly Interface (UFI) Program, and a listing of the subroutine names and definitions used in the Mass Properties Program. The RSS program is used to aid in the performance of dispersion analyses: it reads a file generated by the MASTRE Program, calculates dispersion parameters, and generates output tables and output plot files. The UFI Program provides a screen user interface to aid the user in providing input to the model. The Mass Properties Program defines the mass properties data for the MASTRE program through the use of user interface software.
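For reference, the root-sum-square combination underlying such dispersion analyses is the standard

    \sigma_{RSS} = \sqrt{ \sum_i \sigma_i^{2} } ,

where the \sigma_i are the individual dispersion contributions computed from the trajectory file (a textbook formula, not quoted from the manual).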
Scalable Computing of the Mesh Size Effect on Modeling Damage Mechanics in Woven Armor Composites
2008-12-01
manner of a user-defined material subroutine to provide overall stress increments to the parallel LS-DYNA3D, a Lagrangian explicit code used in ... finite element code, as a user-defined material subroutine. The ability of this subroutine to model the effect of the progressions of a select number ... is added as a user-defined material subroutine to parallel LS-DYNA3D. The computations of the global mesh are handled by LS-DYNA3D and are spread
User-defined Material Model for Thermo-mechanical Progressive Failure Analysis
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.
2008-01-01
Previously a user-defined material model for orthotropic bimodulus materials was developed for linear and nonlinear stress analysis of composite structures using either shell or solid finite elements within a nonlinear finite element analysis tool. Extensions of this user-defined material model to thermo-mechanical progressive failure analysis are described, and the required input data are documented. The extensions include providing for temperature-dependent material properties, archival of the elastic strains, and a thermal strain calculation for materials exhibiting a stress-free temperature.
A Q-GERT Model for Determining the Maintenance Crew Size for the SAC command Post Upgrade
1983-12-01
... time that an equipment fails.
DAY3: A real variable corresponding to the day that an LRU is removed from the equipment.
DAY4: A real variable ... variable corresponding to the time that an LRU is repaired.
TIM5: A real variable corresponding to the time that an equipment returns to service.
TNOW: The current time.
UF(IFN): User function IFN.
UN(I): A sample from the uniform distribution defined by parameter set I.
YIlN1: A real variable ...
Malaria Risk Assessment for the Republic of Korea Based on Models of Mosquito Distribution
2008-06-01
Figure 1. Illustration of the concept of the mal-area.
... the percentage of the sampled area that these parameters cover. The value for VPH could be used as a simplified index of malaria risk to compare ... combinations of the VPH variables. These statistics will consist of the percentage of cells that contain a certain value for the user-defined area
Kamauu, Aaron W C; DuVall, Scott L; Wiggins, Richard H; Avrin, David E
2008-09-01
In the creation of interesting radiological cases in a digital teaching file, it is necessary to adjust the window and level settings of an image to effectively display the educational focus. The web-based applet described in this paper presents an effective solution for real-time window and level adjustments without leaving the picture archiving and communications system workstation. Optimized images are created, as user-defined parameters are passed between the applet and a servlet on the Health Insurance Portability and Accountability Act-compliant teaching file server.
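The window/level transform itself is standard; a minimal sketch of the mapping such an applet drives server-side (Python; parameter names are illustrative, not the applet's API):

    # Map raw pixel values through window/level settings to 8-bit display values.
    import numpy as np

    def window_level(pixels, window, level):
        lo = level - window / 2.0
        out = (pixels.astype(float) - lo) / window      # 0..1 across the window
        return (np.clip(out, 0.0, 1.0) * 255).astype(np.uint8)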
The NASTRAN user's manual (level 17.0)
NASA Technical Reports Server (NTRS)
1979-01-01
NASTRAN embodies a lumped element approach, wherein the distributed physical properties of a structure are represented by a model consisting of a finite number of idealized substructures or elements that are interconnected at a finite number of grid points, to which loads are applied. All input and output data pertain to the idealized structural model. The general procedures for defining structural models are described and instructions are given for each of the bulk data cards and case control cards. Additional information on the case control cards and the use of parameters is included for each rigid format.
Rational-spline approximation with automatic tension adjustment
NASA Technical Reports Server (NTRS)
Schiess, J. R.; Kerr, P. A.
1984-01-01
An algorithm for weighted least-squares approximation with rational splines is presented. A rational spline is a cubic function containing a distinct tension parameter for each interval defined by two consecutive knots. For zero tension, the rational spline is identical to a cubic spline; for very large tension, the rational spline is a linear function. The approximation algorithm incorporates an algorithm which automatically adjusts the tension on each interval to fulfill a user-specified criterion. Finally, an example is presented comparing results of the rational spline with those of the cubic spline.
NASA Technical Reports Server (NTRS)
Hua, Chongyu; Volakis, John L.
1990-01-01
AUTOMESH-2D is a computer program specifically designed as a preprocessor for the scattering analysis of two dimensional bodies by the finite element method. This program was developed due to a need for reducing the effort required to define and check the geometry data, element topology, and material properties. There are six modules in the program: (1) Parameter Specification; (2) Data Input; (3) Node Generation; (4) Element Generation; (5) Mesh Smoothing; and (6) Data File Generation.
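A common choice for such a mesh-smoothing module is Laplacian smoothing, sketched below (Python; the program's actual scheme is not specified in the abstract, so this is illustrative):

    # Move each interior node toward the centroid of its neighbors; keep the
    # boundary fixed so the mesh outline is preserved.
    import numpy as np

    def laplacian_smooth(nodes, neighbors, boundary, iters=10, relax=0.5):
        """nodes: (n, 2) coordinates; neighbors: list of index lists; boundary: set of fixed nodes."""
        nodes = nodes.copy()
        for _ in range(iters):
            new = nodes.copy()
            for i, nbrs in enumerate(neighbors):
                if i in boundary or not nbrs:
                    continue
                new[i] = nodes[i] + relax * (nodes[nbrs].mean(axis=0) - nodes[i])
            nodes = new
        return nodes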
Superfluid Helium Tanker (SFHT) study
NASA Technical Reports Server (NTRS)
1988-01-01
The accomplishments and recommendations of the two-phase Superfluid Helium Tanker (SFHT) study are presented. During the first phase of the study, the emphasis was on defining a comprehensive set of user requirements, establishing SFHT interface parameters and design requirements, and selecting a fluid subsystem design concept. During the second phase, an overall system design concept was constructed based on appropriate analyses and more detailed definition of requirements. Modifications needed to extend the baseline design for use with cryogens other than superfluid helium have been determined, and technology development needs related to the recommended design have been assessed.
Collaboration in a Wireless Grid Innovation Testbed by Virtual Consortium
NASA Astrophysics Data System (ADS)
Treglia, Joseph; Ramnarine-Rieks, Angela; McKnight, Lee
This paper describes the formation of the Wireless Grid Innovation Testbed (WGiT) coordinated by a virtual consortium involving academic and non-academic entities. Syracuse University and Virginia Tech are primary university partners with several other academic, government, and corporate partners. Objectives include: 1) coordinating knowledge sharing, 2) defining key parameters for wireless grids network applications, 3) dynamically connecting wired and wireless devices, content and users, 4) linking to VT-CORNET, the Virginia Tech Cognitive Radio Network Testbed, 5) forming ad hoc networks or grids of mobile and fixed devices without a dedicated server, 6) deepening understanding of wireless grid application, device, network, user and market behavior through academic, trade and popular publications including online media, 7) identifying policy that may enable evaluated innovations to enter US and international markets, and 8) implementing and evaluating the international virtual collaborative process.
Establishing a group of endpoints in a parallel computer
Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.; Xue, Hanhong
2016-02-02
A parallel computer executes a number of tasks; each task includes a number of endpoints, and the endpoints are configured to support collective operations. In such a parallel computer, establishing a group of endpoints includes receiving a user specification of a set of endpoints included in a global collection of endpoints, where the user specification defines the set in accordance with a predefined virtual representation of the endpoints, the predefined virtual representation is a data structure setting forth an organization of tasks and endpoints included in the global collection, and the user specification defines the set of endpoints without a user specification of a particular endpoint; and defining a group of endpoints in dependence upon the predefined virtual representation of the endpoints and the user specification.
NASA Astrophysics Data System (ADS)
Volk, J. M.; Turner, M. A.; Huntington, J. L.; Gardner, M.; Tyler, S.; Sheneman, L.
2016-12-01
Many distributed models that simulate watershed hydrologic processes require a collection of multi-dimensional parameters as input, some of which need to be calibrated before the model can be applied. The Precipitation Runoff Modeling System (PRMS) is a physically based and spatially distributed hydrologic model that contains a considerable number of parameters that often need to be calibrated. Modelers can also benefit from uncertainty analysis of these parameters. To meet these needs, we developed a modular framework in Python to conduct PRMS parameter optimization, uncertainty analysis, interactive visual inspection of parameters and outputs, and other common modeling tasks. Here we present results for multi-step calibration of sensitive parameters controlling solar radiation, potential evapotranspiration, and streamflow in a PRMS model that we applied to the snow-dominated Dry Creek watershed in Idaho. We also demonstrate how our modular approach enables the user to use a variety of parameter optimization and uncertainty methods or easily define their own, such as Monte Carlo random sampling, uniform sampling, or even optimization methods such as the downhill simplex method or its commonly used, more robust counterpart, shuffled complex evolution.
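The pluggable design described above can be sketched in a few lines of Python; the function and registry names here are hypothetical stand-ins for the authors' actual interfaces. The point of the pattern is that a user-written method only has to match the registered call signature.

    import numpy as np

    def monte_carlo_sampler(objective, bounds, n=1000, seed=None):
        """Uniform random sampling within bounds; returns the best parameter set."""
        rng = np.random.default_rng(seed)
        lo, hi = np.asarray(bounds, dtype=float).T
        samples = rng.uniform(lo, hi, size=(n, len(lo)))
        scores = np.array([objective(s) for s in samples])
        return samples[scores.argmin()], scores.min()

    def downhill_simplex(objective, bounds):
        """Nelder-Mead started from the midpoint of the bounds (via SciPy)."""
        from scipy.optimize import minimize
        res = minimize(objective, np.asarray(bounds, dtype=float).mean(axis=1),
                       method="Nelder-Mead")
        return res.x, res.fun

    METHODS = {"monte_carlo": monte_carlo_sampler, "simplex": downhill_simplex}

    def calibrate(objective, bounds, method="monte_carlo", **kwargs):
        """Dispatch to any registered method; user-defined methods register the same way."""
        return METHODS[method](objective, bounds, **kwargs)

    # toy objective standing in for a PRMS misfit function
    best, score = calibrate(lambda p: float(((p - [2.0, 0.5]) ** 2).sum()),
                            bounds=[(0, 5), (0, 1)], method="monte_carlo", seed=0)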
Online, On Demand Access to Coastal Digital Elevation Models
NASA Astrophysics Data System (ADS)
Long, J.; Bristol, S.; Long, D.; Thompson, S.
2014-12-01
Process-based numerical models for coastal waves, water levels, and sediment transport are initialized with digital elevation models (DEM) constructed by interpolating and merging bathymetric and topographic elevation data. These gridded surfaces must seamlessly span the land-water interface and may cover large regions where the individual raw data sources are collected at widely different spatial and temporal resolutions. In addition, the datasets are collected from different instrument platforms with varying accuracy and may or may not overlap in coverage. The lack of available tools and difficulties in constructing these DEMs lead scientists to 1) rely on previously merged, outdated, or over-smoothed DEMs; 2) discard more recent data that covers only a portion of the DEM domain; and 3) use inconsistent methodologies to generate DEMs. The objective of this work is to address the immediate need of integrating land and water-based elevation data sources and streamline the generation of a seamless data surface that spans the terrestrial-marine boundary. To achieve this, the U.S. Geological Survey (USGS) is developing a web processing service to format and initialize geoprocessing tasks designed to create coastal DEMs. The web processing service is maintained within the USGS ScienceBase data management system and has an associated user interface. Through the map-based interface, users define a geographic region that identifies the bounds of the desired DEM and a time period of interest. This initiates a query for elevation datasets within federal science agency data repositories. A geoprocessing service is then triggered to interpolate, merge, and smooth the data sources creating a DEM based on user-defined configuration parameters. Uncertainty and error estimates for the DEM are also returned by the geoprocessing service. Upon completion, the information management platform provides access to the final gridded data derivative and saves the configuration parameters for future reference. The resulting products and tools developed here could be adapted to future data sources and projects beyond the coastal environment.
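As a rough illustration of the interpolate-and-merge step, the sketch below grids scattered topographic (positive) and bathymetric (negative) samples with inverse-distance weighting. It is a toy stand-in for the geoprocessing service described above, and the power argument plays the role of one user-defined configuration parameter.

    import numpy as np

    def idw_merge(points, values, grid_x, grid_y, power=2.0, eps=1e-9):
        """Inverse-distance-weighted interpolation of scattered elevations onto
        a regular grid, treating land and seafloor samples uniformly so the
        surface crosses the land-water interface without a seam."""
        gx, gy = np.meshgrid(grid_x, grid_y)
        dem = np.empty(gx.shape)
        for idx in np.ndindex(gx.shape):
            d = np.hypot(points[:, 0] - gx[idx], points[:, 1] - gy[idx]) + eps
            w = 1.0 / d ** power
            dem[idx] = np.sum(w * values) / np.sum(w)
        return dem

    pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    z = np.array([5.0, 2.0, -3.0, -8.0])   # topo (+) and bathy (-) in metres
    dem = idw_merge(pts, z, np.linspace(0, 1, 5), np.linspace(0, 1, 5))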
Modified weighted fair queuing for packet scheduling in mobile WiMAX networks
NASA Astrophysics Data System (ADS)
Satrya, Gandeva B.; Brotoharsono, Tri
2013-03-01
The increase of user mobility and the need for data access anytime also increase the interest in broadband wireless access (BWA). The best available quality of experience for mobile data service users is assured for IEEE 802.16e-based users. The main problem in assuring a high QoS value is how to allocate available resources among users in order to meet the QoS requirements for criteria such as delay, throughput, packet loss, and fairness. There is no specific standard scheduling mechanism stated by the IEEE standards, which leaves it open for implementer differentiation. There are five QoS service classes defined by IEEE 802.16: Unsolicited Grant Service (UGS), Extended Real-Time Polling Service (ertPS), Real-Time Polling Service (rtPS), Non-Real-Time Polling Service (nrtPS), and Best Effort Service (BE). Each class has different QoS parameter requirements for throughput and delay/jitter constraints. This paper proposes a Modified Weighted Fair Queuing (MWFQ) scheduling scenario based on Weighted Round Robin (WRR) and Weighted Fair Queuing (WFQ). The performance of MWFQ was assessed using the five QoS criteria above. The simulation shows that using the concept of total packet size calculation improves the network's performance.
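The WFQ building block that MWFQ modifies can be sketched as follows. The virtual-clock handling is deliberately simplified (the packet sequence number stands in for the clock), so this is a schematic of the fairness mechanism rather than the paper's MWFQ algorithm.

    import heapq

    def wfq_order(packets, weights):
        """packets: (arrival_seq, flow_id, size_bytes) tuples. Each packet gets
        a virtual finish time F = max(clock, F_last[flow]) + size / weight and
        is served in increasing F, so heavy flows get proportionally more
        bandwidth while light flows cannot be starved."""
        last_finish = {flow: 0.0 for flow in weights}
        heap = []
        for seq, flow, size in packets:
            start = max(float(seq), last_finish[flow])
            last_finish[flow] = start + size / weights[flow]
            heapq.heappush(heap, (last_finish[flow], seq, flow, size))
        while heap:
            yield heapq.heappop(heap)

    pkts = [(0, "rtPS", 500), (0, "BE", 1500), (1, "rtPS", 500), (1, "BE", 1500)]
    for finish, seq, flow, size in wfq_order(pkts, {"rtPS": 4.0, "BE": 1.0}):
        print(f"virtual finish {finish:7.1f}  {flow:4s} {size}B")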
Technical parameters for specifying imagery requirements
NASA Technical Reports Server (NTRS)
Coan, Paul P.; Dunnette, Sheri J.
1994-01-01
Providing visual information acquired from remote events to various operators, researchers, and practitioners has become progressively more important as the application of special skills in alien or hazardous situations increases. To provide an understanding of the technical parameters required to specify imagery, we have identified, defined, and discussed seven salient characteristics of images: spatial resolution, linearity, luminance resolution, spectral discrimination, temporal discrimination, edge definition, and signal-to-noise ratio. We then describe a generalized imaging system and identify how various parts of the system affect the image data. To emphasize the different applications of imagery, we have contrasted the common television system with the significant parameters of a televisual imaging system for technical applications. Finally, we have established a method by which the required visual information can be specified by describing certain technical parameters that are directly related to the information content of the imagery. This method requires the user to complete a form listing all pertinent data requirements for the imagery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salajegheh, Nima; Abedrabbo, Nader; Pourboghrat, Farhang
An efficient integration algorithm for continuum damage based elastoplastic constitutive equations is implemented in LS-DYNA. The isotropic damage parameter is defined as the ratio of the damaged surface area over the total cross-section area of the representative volume element. This parameter is incorporated into the integration algorithm as an internal variable. The developed damage model is then implemented in the FEM code LS-DYNA as a user material subroutine (UMAT). Pure stretch experiments with a hemispherical punch are carried out for copper sheets, and the results are compared against the predictions of the implemented damage model. Evaluation of damage parameters is carried out, and the optimized values that correctly predicted the failure in the sheet are reported. Prediction of failure in the numerical analysis is performed through element deletion using the critical damage value. The set of failure parameters which accurately predicts the failure behavior in copper sheets compared to experimental data is reported as well.
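The damage bookkeeping described above can be caricatured in a few lines. The evolution law below (damage growing linearly with accumulated strain) is a toy assumption, not the paper's constitutive model, but it shows how the damaged-area ratio is carried as an internal variable and how element deletion is triggered at a critical value.

    def update_damage(strain_inc, state, d_crit=0.3, growth=2.0):
        """Schematic isotropic-damage update: D is the damaged-area fraction of
        the representative volume element. Here D grows with accumulated strain
        (toy law); the load-bearing stress scales with the intact area (1 - D),
        and the element is flagged for deletion once D reaches d_crit."""
        state["eps_acc"] += abs(strain_inc)
        state["D"] = min(1.0, growth * state["eps_acc"])
        state["stress_scale"] = 1.0 - state["D"]
        state["delete_element"] = state["D"] >= d_crit
        return state

    state = {"eps_acc": 0.0, "D": 0.0}
    for inc in [0.02] * 10:
        state = update_damage(inc, state)
    print(state["D"], state["delete_element"])   # 0.4, True -> element deleted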
Li, Guipeng; Li, Ming; Zhang, Yiwei; Wang, Dong; Li, Rong; Guimerà, Roger; Gao, Juntao Tony; Zhang, Michael Q
2014-01-01
Rapidly increasing amounts of (physical and genetic) protein-protein interaction (PPI) data are produced by various high-throughput techniques, and interpretation of these data remains a major challenge. In order to gain insight into the organization and structure of the resultant large complex networks formed by interacting molecules, we developed ModuleRole, a user-friendly web server tool based on simulated annealing and node connectivity, which finds modules in a PPI network, defines the role of every node, and produces files for visualization in Cytoscape and Pajek. For given proteins, it analyzes the PPI network from the BioGRID database, finds and visualizes the modules these proteins form, and then defines the role every node plays in this network, based on two topological parameters, the Participation Coefficient and the Z-score. This is the first program which provides an interactive and very friendly interface for biologists to find and visualize modules and the roles of proteins in a PPI network. It can be tested online at http://www.bioinfo.org/modulerole/index.php, which is free and open to all users with no login requirement; demo data are provided under "User Guide" in the Help menu. A non-server version of this program should be considered for high-throughput data with more than 200 nodes or for users' own interaction datasets. Users are able to bookmark the web link to the result page and access it at a later time. As an interactive and highly customizable application, ModuleRole requires no expert knowledge in graph theory on the user side and can be used on both Linux and Windows systems, making it a very useful tool for biologists to analyze and visualize PPI networks from databases such as BioGRID. ModuleRole is implemented in Java and C, and is freely available at http://www.bioinfo.org/modulerole/index.php. Supplementary information (user guide, demo data) is also available at this website. The API for ModuleRole can be obtained upon request.
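The two topological parameters have standard definitions in the module-role literature (Guimerà and Amaral): the within-module degree Z-score, and the participation coefficient P_i = 1 - sum_s (k_is / k_i)^2, where k_is is node i's degree into module s and k_i its total degree. A minimal NumPy sketch, assuming an unweighted adjacency matrix and a module label per node:

    import numpy as np

    def participation_and_z(adj, modules):
        """adj: symmetric 0/1 adjacency matrix; modules: module label per node.
        Returns the participation coefficient P and within-module z-score z."""
        k = adj.sum(axis=1)                                   # total degree
        labels = np.unique(modules)
        # k_is[i, j]: degree of node i into module labels[j]
        k_is = np.array([adj[:, modules == s].sum(axis=1) for s in labels]).T
        with np.errstate(divide="ignore", invalid="ignore"):
            P = 1.0 - np.nansum((k_is / k[:, None]) ** 2, axis=1)
        z = np.zeros(len(k))
        for j, s in enumerate(labels):
            idx = modules == s
            kin = k_is[idx, j]                                # within-module degree
            z[idx] = (kin - kin.mean()) / (kin.std() if kin.std() > 0 else 1.0)
        return P, z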
Pathway collages: personalized multi-pathway diagrams.
Paley, Suzanne; O'Maille, Paul E; Weaver, Daniel; Karp, Peter D
2016-12-13
Metabolic pathway diagrams are a classical way of visualizing a linked cascade of biochemical reactions. However, to understand some biochemical situations, viewing a single pathway is insufficient, whereas viewing the entire metabolic network results in information overload. How do we enable scientists to rapidly construct personalized multi-pathway diagrams that depict a desired collection of interacting pathways that emphasize particular pathway interactions? We define software for constructing personalized multi-pathway diagrams called pathway-collages using a combination of manual and automatic layouts. The user specifies a set of pathways of interest for the collage from a Pathway/Genome Database. Layouts for the individual pathways are generated by the Pathway Tools software, and are sent to a Javascript Pathway Collage application implemented using Cytoscape.js. That application allows the user to re-position pathways; define connections between pathways; change visual style parameters; and paint metabolomics, gene expression, and reaction flux data onto the collage to obtain a desired multi-pathway diagram. We demonstrate the use of pathway collages in two application areas: a metabolomics study of pathogen drug response, and an Escherichia coli metabolic model. Pathway collages enable facile construction of personalized multi-pathway diagrams.
5G: rethink mobile communications for 2020+.
Chih-Lin, I; Han, Shuangfeng; Xu, Zhikun; Sun, Qi; Pan, Zhengang
2016-03-06
The 5G network is anticipated to meet the challenging requirements of mobile traffic in the 2020s, which are characterized by super high data rate, low latency, high mobility, high energy efficiency and high traffic density. This paper provides an overview of China Mobile's 5G vision and potential solutions. Three key characteristics of 5G are analysed, i.e. super fast, soft and green. The main 5G R&D themes are further elaborated, which include five fundamental rethinkings of the traditional design methodologies. The 5G network design considerations are also discussed, with cloud radio access network, ultra-dense network, software defined network and network function virtualization examined as key potential solutions towards a green and soft 5G network. The paradigm shift to user-centric network operation from the traditional cell-centric operation is also investigated, where the decoupled downlink and uplink, control and data, and adaptive multiple connections provide sufficient means to achieve a user-centric 5G network with 'no more cells'. The software defined air interface is investigated under a uniform framework and can adaptively adjust its parameters to satisfy the various requirements of different 5G scenarios.
User modeling techniques for enhanced usability of OPSMODEL operations simulation software
NASA Technical Reports Server (NTRS)
Davis, William T.
1991-01-01
The PC-based OPSMODEL operations software for modeling and simulation of space station crew activities supports engineering and cost analyses and operations planning. Using top-down modeling, the level of detail required in the data base can be limited to a level commensurate with the results required of any particular analysis. To perform a simulation, a resource environment consisting of locations, crew definition, equipment, and consumables is first defined. Activities to be simulated are then defined as operations and scheduled as desired. These operations are defined within a 1000-level priority structure. The simulation on OPSMODEL, then, consists of user-defined, user-scheduled operations executing within an environment of user-defined resource and priority constraints. Techniques for prioritizing operations to realistically model a representative daily scenario of on-orbit space station crew activities are discussed. The large number of priority levels allows priorities to be assigned commensurate with the detail necessary for a given simulation. Several techniques for realistic modeling of day-to-day work carryover are also addressed.
NASA Astrophysics Data System (ADS)
Idehara, H.; Carbon, D. F.
2004-12-01
We present two new, publicly available tools to support the examination and interpretation of spectra. SCAMP is a specialized graphical user interface for MATLAB. It allows researchers to rapidly intercompare sets of observational, theoretical, and/or laboratory spectra. Users have extensive control over the colors and placement of individual spectra, and over spectrum normalization from one spectral region to another. Spectra can be interactively assigned to user-defined groups and the groupings recalled at a later time. The user can measure/record positions and intensities of spectral features, interactively spline-fit spectra, and normalize spectra by fitted splines. User-defined wavelengths can be automatically highlighted in SCAMP plots. The user can save/print annotated graphical output suitable for a scientific notebook depicting the work at any point. The ASP is a WWW portal that provides interactive access to two spectrum data sets: a library of synthetic stellar spectra and a library of laboratory PAH spectra. The synthetic stellar spectra in the ASP are appropriate to the giant branch with an assortment of compositions. Each spectrum spans the full range from 2 to 600 microns at a variety of resolutions. The ASP is designed to allow users to quickly identify individual features at any resolution that arise from any of the included isotopic species. The user may also retrieve the depth of formation of individual features at any resolution. PAH spectra accessible through the ASP are drawn from the extensive library of spectra measured by the NASA Ames Astrochemistry Laboratory. The user may interactively choose any subset of PAHs in the data set, combine them with user-defined weights and temperatures, and view/download the resultant spectrum at any user-defined resolution. This work was funded by the NASA Advanced Supercomputing Division, NASA Ames Research Center.
NASA Astrophysics Data System (ADS)
Caton, R. G.; Colman, J. J.; Parris, R. T.; Nickish, L.; Bullock, G.
2017-12-01
The Air Force Research Laboratory, in collaboration with NorthWest Research Associates, is developing advanced software capabilities for high fidelity simulations of high frequency (HF) sky wave propagation and performance analysis of HF systems. Based on the HiCIRF (High-frequency Channel Impulse Response Function) platform [Nickisch et al., doi:10.1029/2011RS004928], the new Air Force Coverage Analysis Program (AFCAP) provides the modular capabilities necessary for a comprehensive sensitivity study of the large number of variables which define simulations of HF propagation modes. In this paper, we report on an initial exercise of AFCAP to analyze the sensitivities of the tool to various environmental and geophysical parameters. Through examination of the channel scattering function and amplitude-range-Doppler output on two-way propagation paths with injected target signals, we compare simulated returns over a range of geophysical conditions as well as varying definitions for environmental noise, meteor clutter, and sea state models for Bragg backscatter. We also investigate the impacts of including clutter effects due to field-aligned backscatter from small-scale ionization structures at varied levels of severity as defined by the climatological WideBand Model (WBMOD). In the absence of additional user-provided information, AFCAP relies on the International Reference Ionosphere (IRI) model to define the ionospheric state for use in 2D ray tracing algorithms. Because the AFCAP architecture includes the option for insertion of a user-defined gridded ionospheric representation, we compare output from the tool using the IRI and ionospheric definitions from assimilative models such as GPSII (GPS Ionospheric Inversion).
The DREO Elint Browser Utility (DEBU) reference manual
NASA Astrophysics Data System (ADS)
Ford, Barbara; Jones, David
1992-04-01
An electronic intelligence (ELINT) database browsing tool called DEBU has been developed that allows databases such as ELP, Kilting, EWIR, and AFEWC to be reviewed and analyzed from a user-friendly environment on a personal computer. DEBU's basic function is to allow users to examine the contents of user-selected subfiles of user-selected emitters of user-selected databases. DEBU augments this functionality with support for selecting (filtering) and combining subsets of emitters by user-selected attributes such as name, parameter type, or parameter value. DEBU provides facilities for examining histograms and x-y plots of selected parameters, for doing ambiguity analysis and mode level analysis, and for generating and printing a variety of reports. A manual is provided for users of DEBU, including descriptions and illustrations of menus and windows.
Vibrotactile display for mobile applications based on dielectric elastomer stack actuators
NASA Astrophysics Data System (ADS)
Matysek, Marc; Lotz, Peter; Flittner, Klaus; Schlaak, Helmut F.
2010-04-01
Dielectric elastomer stack actuators (DESA) offer the possibility to build actuator arrays at very high density. The driving voltage can be defined by the film thickness, ranging from 80 μm down to 5 μm at a driving field strength of 30 V/μm. In this paper we present the development of a vibrotactile display based on multilayer technology. The display is used to present several operating conditions of a machine in the form of haptic information to a human finger. As an example, the design of an mp3-player interface is introduced. To build an intuitive and user-friendly interface, several aspects of human haptic perception have to be considered. Using the results of preliminary user tests, the interface is designed and an appropriate actuator layout is derived. Controlling these actuators is important because there are many possibilities to present different information, e.g. by varying the driving parameters. A demonstrator was built to verify the concept: a high recognition rate of more than 90% validates it. A characterization of mechanical and electrical parameters proves the suitability of dielectric elastomer stack actuators for use in mobile applications.
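Since the drive voltage is just the stated field strength times the film thickness, the quoted thickness range fixes the voltage range directly:

    field = 30.0                  # V/um, driving field strength quoted above
    for thickness_um in (5, 20, 80):
        print(f"{thickness_um:3d} um film -> {field * thickness_um:5.0f} V")
    # 5 um layers need about 150 V; the 80 um end of the range needs about 2400 V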
Viger, R.J.
2008-01-01
The GIS Weasel is a freely available, open-source software package built on top of ArcInfo Workstation [ESRI, Inc., 2001, ArcInfo Workstation (8.1 ed.), Redlands, CA] for creating maps and parameters of geographic features used in environmental simulation models. The software has been designed to minimize the need for GIS expertise and automate the preparation of the geographic information as much as possible. Although many kinds of data can be exploited with the GIS Weasel, the only information required is a raster dataset of elevation for the user's area of interest (AOI). The user-defined AOI serves as a starting point from which to create maps of many different types of geographic features, including sub-watersheds, streams, elevation bands, land cover patches, land parcels, or anything else that can be discerned from the available data. The GIS Weasel has a library of over 200 routines that can be applied to any raster map of geographic features to generate information about shape, area, or topological association with other features of the same or different maps. In addition, a wide variety of parameters can be derived using ancillary data layers such as soil and vegetation maps.
YAMM - YET ANOTHER MENU MANAGER
NASA Technical Reports Server (NTRS)
Mazer, A. S.
1994-01-01
One of the most time-consuming yet necessary tasks of writing any piece of interactive software is the development of a user interface. Yet Another Menu Manager, YAMM, is an application-independent menuing package designed to remove much of the difficulty and save much of the time inherent in the implementation of the front ends for large packages. Written in C for UNIX-based operating systems, YAMM provides a complete menuing front end for a wide variety of applications, with provisions for terminal independence, user-specific configurations, and dynamic creation of menu trees. Applications running under the menu package consist of two parts: a description of the menu configuration and the body of application code. The menu configuration is used at runtime to define the menu structure and any non-standard keyboard mappings and terminal capabilities. Menu definitions define specific menus within the menu tree. The names used in a definition may be either a reference to an application function or the name of another menu defined within the menu configuration. Application parameters are entered using data entry screens which allow for required and optional parameters, tables, and legal-value lists. Both automatic and application-specific error checking are available. Help is available for both menu operation and specific applications. YAMM was written in C for execution on a Sun Microsystems workstation running SunOS, based on the Berkeley (4.2bsd) version of UNIX. During development, YAMM has been used on both 68020 and SPARC architectures, running SunOS versions 3.5 and 4.0. YAMM should be portable to most other UNIX-based systems. It has a central memory requirement of approximately 232K bytes. The standard distribution medium for this program is one .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. YAMM was developed in 1988 and last updated in 1990. YAMM is a copyrighted work with all copyright vested in NASA.
CHARMM-GUI ligand reader and modeler for CHARMM force field generation of small molecules.
Kim, Seonghoon; Lee, Jumin; Jo, Sunhwan; Brooks, Charles L; Lee, Hui Sun; Im, Wonpil
2017-06-05
Reading ligand structures into any simulation program is often nontrivial and time consuming, especially when the force field parameters and/or structure files of the corresponding molecules are not available. To address this problem, we have developed Ligand Reader & Modeler in CHARMM-GUI. Users can upload ligand structure information in various forms (using PDB ID, ligand ID, SMILES, MOL/MOL2/SDF file, or PDB/mmCIF file), and the uploaded structure is displayed on a sketchpad for verification and further modification. Based on the displayed structure, Ligand Reader & Modeler generates the ligand force field parameters and necessary structure files by searching for the ligand in the CHARMM force field library or using the CHARMM general force field (CGenFF). In addition, users can define chemical substitution sites and draw substituents in each site on the sketchpad to generate a set of combinatorial structure files and corresponding force field parameters for high-throughput or alchemical free energy simulations. Finally, the output from Ligand Reader & Modeler can be used in other CHARMM-GUI modules to build a protein-ligand simulation system for all supported simulation programs, such as CHARMM, NAMD, GROMACS, AMBER, GENESIS, LAMMPS, Desmond, OpenMM, and CHARMM/OpenMM. Ligand Reader & Modeler is available as a functional module of CHARMM-GUI at http://www.charmm-gui.org/input/ligandrm. © 2017 Wiley Periodicals, Inc.
Predictive minimum description length principle approach to inferring gene regulatory networks.
Chaitankar, Vijender; Zhang, Chaoyang; Ghosh, Preetam; Gong, Ping; Perkins, Edward J; Deng, Youping
2011-01-01
Reverse engineering of gene regulatory networks using information theory models has received much attention due to its simplicity, low computational cost, and capability of inferring large networks. One of the major problems with information theory models is to determine the threshold that defines the regulatory relationships between genes. The minimum description length (MDL) principle has been implemented to overcome this problem. The description length of the MDL principle is the sum of the model length and the data encoding length. A user-specified fine-tuning parameter is used as a control mechanism between model and data encoding, but it is difficult to find the optimal parameter. In this work, we propose a new inference algorithm that incorporates mutual information (MI), conditional mutual information (CMI), and the predictive minimum description length (PMDL) principle to infer gene regulatory networks from DNA microarray data. In this algorithm, the information theoretic quantities MI and CMI determine the regulatory relationships between genes, and the PMDL principle method attempts to determine the best MI threshold without the need for a user-specified fine-tuning parameter. The performance of the proposed algorithm is evaluated using both synthetic time series data sets and a biological time series data set (Saccharomyces cerevisiae). The results show that the proposed algorithm produced fewer false edges and significantly improved the precision when compared to the existing MDL algorithm.
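The MDL trade-off that replaces the hand-tuned threshold can be sketched as below. The encoding is a toy stand-in (a fixed bit cost per retained edge, with discarded mutual information charged to the data term); the paper's PMDL encoding is more refined, but the shape of the trade-off is the same.

    import numpy as np

    def mdl_threshold(mi, bits_per_edge=8.0):
        """Scan candidate MI thresholds and return the one minimising
        description length = model length + data encoding length."""
        upper = mi[np.triu_indices_from(mi, k=1)]           # unique gene pairs
        best_t, best_len = None, np.inf
        for t in np.unique(upper):
            model_len = bits_per_edge * (upper > t).sum()   # cost of kept edges
            data_len = upper[upper <= t].sum()              # cost of unexplained MI
            if model_len + data_len < best_len:
                best_len, best_t = model_len + data_len, t
        return best_t

    rng = np.random.default_rng(1)
    mi = np.abs(np.corrcoef(rng.normal(size=(6, 50))))  # stand-in for an MI matrix
    print(mdl_threshold(mi))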
Enabling User Preferences Through Data Exchange
DOT National Transportation Integrated Search
1997-08-01
This paper describes a process, via user- air traffic management (ATM) data : exchange, for enabling user preferences in an ATM-based system. User : preferences may be defined in terms of a four-dimensional (4D) user-preferred : trajectory, or a seri...
Single crystal to polycrystal neutron transmission simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dessieux, Luc Lucius; Stoica, Alexandru Dan; Bingham, Philip R.
A collection of routines for calculation of the total cross section that determines the attenuation of neutrons by crystalline solids is presented. The total cross section is calculated semi-empirically as a function of crystal structure, neutron energy, temperature, and crystal orientation. The semi-empirical formula includes the contribution of parasitic Bragg scattering to the total cross section using both the crystal's mosaic spread value and its orientation with respect to the neutron beam direction as parameters. These routines allow users to enter a distribution of crystal orientations for calculation of total cross sections of user-defined powder or pseudo-powder distributions, which enables simulation of non-uniformities such as texture and strain. In conclusion, the spectra for neutron transmission simulations in the neutron thermal energy range (2 meV-100 meV) are presented for single crystal and polycrystal samples and compared to measurements.
A self-adaptive-grid method with application to airfoil flow
NASA Technical Reports Server (NTRS)
Nakahashi, K.; Deiwert, G. S.
1985-01-01
A self-adaptive-grid method is described that is suitable for multidimensional steady and unsteady computations. Based on variational principles, a spring analogy is used to redistribute grid points in an optimal sense to reduce the overall solution error. User-specified parameters, denoting both maximum and minimum permissible grid spacings, are used to define the all-important constants, thereby minimizing the empiricism and making the method self-adaptive. Operator splitting and one-sided controls for orthogonality and smoothness are used to make the method practical, robust, and efficient. Examples are included for both steady and unsteady viscous flow computations about airfoils in two dimensions, as well as for a steady inviscid flow computation and a one-dimensional case. These examples illustrate the precise control the user has with the self-adaptive method and demonstrate a significant improvement in accuracy and quality of the solutions.
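In one dimension the spring analogy reduces to a weighted averaging sweep, which the sketch below illustrates; the clamping against dx_min and dx_max is what replaces the empirical constants, as described above. This is a schematic of the idea, not the paper's multidimensional operator-split scheme.

    import numpy as np

    def spring_redistribute(x, weight, dx_min, dx_max, sweeps=200):
        """Treat each interval as a spring whose stiffness is the local error
        weight, so nodes drift toward high-error regions; user-specified
        dx_min/dx_max bound the resulting spacing."""
        x = x.copy()
        for _ in range(sweeps):
            k = 0.5 * (weight[:-1] + weight[1:])    # stiffness per interval
            for i in range(1, len(x) - 1):
                # equilibrium of the two springs attached to node i
                x[i] = (k[i - 1] * x[i - 1] + k[i] * x[i + 1]) / (k[i - 1] + k[i])
                x[i] = min(max(x[i], x[i - 1] + dx_min), x[i - 1] + dx_max)
        return x

    xs = np.linspace(0.0, 1.0, 21)
    err = np.exp(-((xs - 0.3) ** 2) / 0.01) + 0.1   # mock error indicator
    print(spring_redistribute(xs, err, 1e-3, 0.2))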
NASA Astrophysics Data System (ADS)
Chęciński, Jakub; Frankowski, Marek
2016-10-01
We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows for fast, error-proof and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions complementing it with magnetoresistance and spin-transfer-torque calculations, as well as local magnetization data selection for output. Our software allows for the creation of advanced simulation conditions like simultaneous parameter sweeps and synchronous excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing for automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.
DEAN: A program for dynamic engine analysis
NASA Technical Reports Server (NTRS)
Sadler, G. G.; Melcher, K. J.
1985-01-01
The Dynamic Engine Analysis program, DEAN, is a FORTRAN code implemented on the IBM/370 mainframe at NASA Lewis Research Center for digital simulation of turbofan engine dynamics. DEAN is an interactive program which allows the user to simulate engine subsystems as well as full engine systems with relative ease. The nonlinear first-order ordinary differential equations which define the engine model may be solved by one of four integration schemes: a second-order Runge-Kutta, a fourth-order Runge-Kutta, an Adams predictor-corrector, or Gear's method for stiff systems. The numerical data generated by the model equations are displayed at specified intervals, between which the user may choose to modify various parameters affecting the model equations and transient execution. Following the transient run, versatile graphics capabilities allow close examination of the data. DEAN's modeling procedure and capabilities are demonstrated by generating a model of a simple compressor rig.
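Of the four schemes listed, the fourth-order Runge-Kutta is the easiest to show compactly; the toy right-hand side below is an assumption standing in for an engine subsystem model (Gear's method for stiff systems would need an implicit solver instead).

    def rk4_step(f, t, y, h):
        """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    # toy first-order lag, e.g. a spool responding to a throttle step
    tau, u = 0.5, 1.0
    t, y, h = 0.0, 0.0, 0.01
    for _ in range(100):
        y = rk4_step(lambda t, y: (u - y) / tau, t, y, h)
        t += h
    print(t, y)   # y has risen most of the way toward u after two time constants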
Design document for the MOODS Data Management System (MDMS), version 1.0
NASA Technical Reports Server (NTRS)
1994-01-01
The MOODS Data Management System (MDMS) provides access to the Master Oceanographic Observation Data Set (MOODS) which is maintained by the Naval Oceanographic Office (NAVOCEANO). The MDMS incorporates database technology in providing seamless access to parameter (temperature, salinity, soundspeed) vs. depth observational profile data. The MDMS is an interactive software application with a graphical user interface (GUI) that supports user control of MDMS functional capabilities. The purpose of this document is to define and describe the structural framework and logical design of the software components/units which are integrated into the major computer software configuration item (CSCI) identified as MDMS, Version 1.0. The preliminary design is based on functional specifications and requirements identified in the governing Statement of Work prepared by the Naval Oceanographic Office (NAVOCEANO) and distributed as a request for proposal by the National Aeronautics and Space Administration (NASA).
Kristoffersen, Agnete Egilsdatter; Fønnebø, Vinjar; Norheim, Arne Johan
2008-10-01
Self-reported use of complementary and alternative medicine (CAM) among patients varies widely between studies, possibly because the definition of a CAM user is not comparable. This makes it difficult to compare studies. The aim of this study is to present a six-level model for classifying patients' reported exposure to CAM. Prayer, physical exercise, special diets, over-the-counter products/CAM techniques, and personal visits to a CAM practitioner are successively removed from the model in a reductive fashion. By applying the model to responses given by Norwegian patients with cancer, we found that 72% use CAM if the user was defined to include all types of CAM. This proportion was reduced successively to only 11% in the same patient group when a CAM user was defined as a user visiting a CAM practitioner four or more times. When considering a sample of 10 recently published studies of CAM use among patients with breast cancer, we found 98% use when the CAM user was defined to include all sorts of CAM. This proportion was reduced successively to only 20% when a CAM user was defined as a user of a CAM practitioner. We recommend future surveys of CAM use to report at more than one level and to clarify which intensity level of CAM use the report is based on.
HITRAN2016 : new and improved data and tools towards studies of planetary atmospheres
NASA Astrophysics Data System (ADS)
Gordon, Iouli; Rothman, Laurence S.; Wilzewski, Jonas S.; Kochanov, Roman V.; Hill, Christian; Tan, Yan; Wcislo, Piotr
2016-10-01
The HITRAN2016 molecular spectroscopic database is scheduled to be released this year. It will replace the current edition, HITRAN2012 [1], which has been in use, along with some intermediate updates, since 2012. We have added, revised, and improved many transitions and bands of molecular species and their isotopologues. The number of parameters has also been significantly increased, now incorporating, for instance, broadening by He, H2 and CO2, which are dominant in different planetary atmospheres [2]; non-Voigt line profiles [3]; and other phenomena. This poster will provide a summary of the updates, emphasizing some of the most important improvements and additions. To allow flexible incorporation of the new parameters and improve the efficiency of database usage, the whole database has been reorganized into a relational database structure and presented to the user by means of a very powerful, easy-to-use internet program called HITRANonline [4] accessible at
NASA Technical Reports Server (NTRS)
Walowit, Jed A.
1994-01-01
A viewgraph presentation is made showing the capabilities of the computer code SPIRALI. Overall capabilities of SPIRALI include: computes rotor dynamic coefficients, flow, and power loss for cylindrical and face seals; treats turbulent, laminar, Couette, and Poiseuille dominated flows; fluid inertia effects are included; rotor dynamic coefficients in three (face) or four (cylindrical) degrees of freedom; includes effects of spiral grooves; user definable transverse film geometry including circular steps and grooves; independent user definable friction factor models for rotor and stator; and user definable loss coefficients for sudden expansions and contractions.
Wald, Lisa A.; Wald, David J.; Schwarz, Stan; Presgrave, Bruce; Earle, Paul S.; Martinez, Eric; Oppenheimer, David
2008-01-01
At the beginning of 2006, the U.S. Geological Survey (USGS) Earthquake Hazards Program (EHP) introduced a new automated Earthquake Notification Service (ENS) to take the place of the National Earthquake Information Center (NEIC) "Bigquake" system and the various other individual EHP e-mail list-servers for separate regions in the United States. These included northern California, southern California, and the central and eastern United States. ENS is a "one-stop shopping" system that allows Internet users to subscribe to flexible and customizable notifications for earthquakes anywhere in the world. The customization capability allows users to define the what (magnitude threshold), the when (day and night thresholds), and the where (specific regions) for their notifications. Customization is achieved by employing a per-user based request profile, allowing the notifications to be tailored for each individual's requirements. Such earthquake-parameter-specific custom delivery was not possible with simple e-mail list-servers. Now that event and user profiles are in a structured query language (SQL) database, additional flexibility is possible. At the time of this writing, ENS had more than 114,000 subscribers, with more than 200,000 separate user profiles. On a typical day, more than 188,000 messages get sent to a variety of widely distributed users for a wide range of earthquake locations and magnitudes. The purpose of this article is to describe how ENS works, highlight the features it offers, and summarize plans for future developments.
NASA Technical Reports Server (NTRS)
Shaykhian, Gholam Ali
2007-01-01
C++ Programming Language: The C++ seminar covers the fundamentals of the C++ programming language. The fundamentals are grouped into three parts, where each part includes both concepts and programming examples aimed at hands-on practice. The first part covers the functional aspects of C++ with emphasis on function parameters and efficient memory utilization. The second part covers the essential framework of the language, its object-oriented aspects. Information necessary to evaluate various features of object-oriented programming, including encapsulation, polymorphism, and inheritance, will be discussed. The last part of the seminar covers templates and generic programming. Examples include both user-defined and standard templates.
Ultrasonic fatigue of SiC particle reinforced aluminum in the VHCF-regime
NASA Astrophysics Data System (ADS)
Wolf, M.; Wagner, G.; Eifler, D.
At the WKK, ultrasonic testing facilities (UTF) are used to perform fatigue experiments in the VHCF regime at a frequency of 20 kHz. These systems allow an online characterization of the actual fatigue state through changes of different process parameters such as generator power, displacement, temperature, or frequency-response characteristic. Moreover, the experiments can be interrupted at user-defined events in order to investigate variations of the surface microstructure or changes in the electrical resistance of the specimens. The fatigue tests were realized as load increase tests as well as constant amplitude tests.
Flexible Method for Inter-object Communication in C++
NASA Technical Reports Server (NTRS)
Curlett, Brian P.; Gould, Jack J.
1994-01-01
A method has been developed for organizing and sharing large amounts of information between objects in C++ code. This method uses a set of object classes to define variables and group them into tables. The variable tables presented here provide a convenient way of defining and cataloging data, as well as a user-friendly input/output system, a standardized set of access functions, mechanisms for ensuring data integrity, methods for interprocessor data transfer, and an interpretive language for programming relationships between parameters. The object-oriented nature of these variable tables enables the use of multiple data types, each with unique attributes and behavior. Because each variable provides its own access methods, redundant table lookup functions can be bypassed, thus decreasing access times while maintaining data integrity. In addition, a method for automatic reference counting was developed to manage memory safely.
Modeling long-term human activeness using recurrent neural networks for biometric data.
Kim, Zae Myung; Oh, Hyungrai; Kim, Han-Gyu; Lim, Chae-Gyun; Oh, Kyo-Joong; Choi, Ho-Jin
2017-05-18
With the invention of fitness trackers, it has been possible to continuously monitor a user's biometric data such as heart rate, number of footsteps taken, and amount of calories burned. This paper names the time series of these three types of biometric data the user's "activeness", and investigates the feasibility of modeling and predicting the long-term activeness of the user. The dataset used in this study consisted of several months of biometric time-series data gathered by seven users independently. Four recurrent neural network (RNN) architectures, as well as a deep neural network and a simple regression model, were proposed to investigate the performance on predicting the activeness of the user under various length-related hyper-parameter settings. In addition, the learned model was tested to predict the time period when the user's activeness falls below a certain threshold. A preliminary experimental result shows that each type of activeness data exhibited a short-term autocorrelation; and among the three types of data, the consumed calories and the number of footsteps were positively correlated, while the heart rate data showed almost no correlation with either of them. It is probably due to this characteristic of the dataset that although the RNN models produced the best results on modeling the user's activeness, the difference was marginal, and other baseline models, especially the linear regression model, performed quite admirably as well. Further experimental results show that it is feasible to predict a user's future activeness with precision; for example, a trained RNN model could predict, with a precision of 84%, when the user would be less active within the next hour given the latest 15 min of activeness data. This paper defines and investigates the notion of a user's "activeness", and shows that forecasting the long-term activeness of the user is indeed possible. Such information can be utilized by a health-related application to proactively recommend suitable events or services to the user.
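Since the paper notes that a plain regression baseline performed quite well, the sliding-window framing is easy to reproduce. The sketch below is such a baseline on synthetic AR(1) "step count" data (all numbers are illustrative, not the paper's dataset): predict the next hour's mean from the last 15 minutes and measure the precision of "less active" alarms.

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.empty(5000)                  # synthetic per-minute step counts
    x[0] = 60.0
    for t in range(1, len(x)):          # AR(1): short-term autocorrelation
        x[t] = max(0.0, 0.9 * x[t - 1] + rng.normal(6.0, 10.0))

    win, hor, thresh = 15, 60, 40.0     # 15-min history, 1-hour horizon
    T = range(win, len(x) - hor)
    X = np.array([x[t - win:t] for t in T])
    target = np.array([x[t:t + hor].mean() for t in T])

    split = len(X) // 2                 # train on the first half only
    w, *_ = np.linalg.lstsq(X[:split], target[:split], rcond=None)
    pred = X[split:] @ w

    alarm, truth = pred < thresh, target[split:] < thresh
    precision = (alarm & truth).sum() / max(alarm.sum(), 1)
    print(f"precision of 'less active next hour' alarms: {precision:.2f}")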
Lightweight application for generating clinical research information systems: MAGIC.
Leskošek, Brane; Pajntar, Marjan
2015-12-01
Our purpose was to build and test a lightweight solution for generating clinical research information systems (CRIS) that would allow non-IT professionals with basic knowledge of computer usage to quickly define and build a ready-to-use, safe and secure web-based clinical research system for data management. We use the acronym MAGIC (Medical Application Generator InteraCtive) for the system. The generated CRIS should be very easy to build and use, so a common LAMP (Linux Apache MySQL Perl) platform was used, which also enables short development cycles. The application was built and tested using eXtreme Programming (XP) principles by a small development team consisting of one informatics specialist, one physician and one graphical designer/programmer. The parameter and graphical user interface (GUI) definitions for the CRIS can be made by non-IT professionals using an intuitive English-language-like formalism called application definition language (ADL). From these definitions, the MAGIC builds an end-user CRIS that can be used on a wide variety of platforms (from standard workstations to hand-held devices). A working example of a national health-care-quality assessment program is presented to illustrate this process. The lightweight application for generating CRIS (MAGIC) has proven to be useful for both clinical and analytical users in real working environment. To achieve better performance and interoperability, we are planning to recompile the application using XML schemas (XSD) in HL7 CDA or openEHR archetypes formats used for parameters definition and for data interchange between different information systems.
MrGrid: A Portable Grid Based Molecular Replacement Pipeline
Reboul, Cyril F.; Androulakis, Steve G.; Phan, Jennifer M. N.; Whisstock, James C.; Goscinski, Wojtek J.; Abramson, David; Buckle, Ashley M.
2010-01-01
Background: The crystallographic determination of protein structures can be computationally demanding, and for difficult cases can benefit from user-friendly interfaces to high-performance computing resources. Molecular replacement (MR) is a popular protein crystallographic technique that exploits the structural similarity between proteins that share some sequence similarity. But the need to trial permutations of search models, space group symmetries, and other parameters makes MR time- and labour-intensive. However, MR calculations are embarrassingly parallel and thus ideally suited to distributed computing. In order to address this problem we have developed MrGrid, web-based software that allows multiple MR calculations to be executed across a grid of networked computers, allowing high-throughput MR. Methodology/Principal Findings: MrGrid is a portable web-based application written in Java/JSP and Ruby, taking advantage of Apple Xgrid technology. Designed to interface with a user-defined Xgrid resource, the package manages the distribution of multiple MR runs to the available nodes on the Xgrid. We evaluated MrGrid using 10 different protein test cases on a network of 13 computers, and achieved an average speed-up factor of 5.69. Conclusions: MrGrid enables the user to retrieve and manage the results of tens to hundreds of MR calculations quickly and via a single web interface, as well as broadening the range of strategies that can be attempted. This high-throughput approach allows parameter sweeps to be performed in parallel, improving the chances of MR success. PMID:20386612
Generation of Requirements for Simulant Measurements
NASA Technical Reports Server (NTRS)
Rickman, D. L.; Schrader, C. M.; Edmunson, J. E.
2010-01-01
This TM presents a formal, logical explanation of the parameters selected for the figure of merit (FoM) algorithm. The FoM algorithm is used to evaluate lunar regolith simulant. The objectives, requirements, assumptions, and analysis behind the parameters are provided. A requirement is derived to verify and validate simulant performance versus lunar regolith from NASA s objectives for lunar simulants. This requirement leads to a specification that comparative measurements be taken the same way on the regolith and the simulant. In turn, this leads to a set of nine criteria with which to evaluate comparative measurements. Many of the potential measurements of interest are not defensible under these criteria. For example, many geotechnical properties of interest were not explicitly measured during Apollo and they can only be measured in situ on the Moon. A 2005 workshop identified 32 properties of major interest to users. Virtually all of the properties are tightly constrained, though not predictable, if just four parameters are controlled. Three parameters (composition, size, and shape) are recognized as being definable at the particle level. The fourth parameter (density) is a bulk property. In recent work, a fifth parameter (spectroscopy) has been identified, which will need to be added to future releases of the FoM.
ShinyKGode: an interactive application for ODE parameter inference using gradient matching.
Wandy, Joe; Niu, Mu; Giurghita, Diana; Daly, Rónán; Rogers, Simon; Husmeier, Dirk
2018-07-01
Mathematical modelling based on ordinary differential equations (ODEs) is widely used to describe the dynamics of biological systems, particularly in systems and pathway biology. Often the kinetic parameters of these ODE systems are unknown and have to be inferred from the data. Approximate parameter inference methods based on gradient matching (which do not require performing computationally expensive numerical integration of the ODEs) have been getting popular in recent years, but many implementations are difficult to run without expert knowledge. Here, we introduce ShinyKGode, an interactive web application to perform fast parameter inference on ODEs using gradient matching. ShinyKGode can be used to infer ODE parameters on simulated and observed data using gradient matching. Users can easily load their own models in Systems Biology Markup Language format, and a set of pre-defined ODE benchmark models are provided in the application. Inferred parameters are visualized alongside diagnostic plots to assess convergence. The R package for ShinyKGode can be installed through the Comprehensive R Archive Network (CRAN). Installation instructions, as well as tutorial videos and source code are available at https://joewandy.github.io/shinyKGode. Supplementary data are available at Bioinformatics online.
Broadband spectral fitting of blazars using XSPEC
NASA Astrophysics Data System (ADS)
Sahayanathan, Sunder; Sinha, Atreyee; Misra, Ranjeev
2018-03-01
The broadband spectral energy distribution (SED) of blazars is generally interpreted as radiation arising from synchrotron and inverse Compton mechanisms. Traditionally, the underlying source parameters responsible for these emission processes, like particle energy density, magnetic field, etc., are obtained through simple visual reproduction of the observed fluxes. However, this procedure is incapable of providing confidence ranges for the estimated parameters. In this work, we propose an efficient algorithm to perform a statistical fit of the observed broadband spectrum of blazars using different emission models. Moreover, we use the observable quantities as the fit parameters, rather than the direct source parameters which govern the resultant SED. This significantly improves the convergence time and eliminates the uncertainty regarding initial guess parameters. This approach also has an added advantage of identifying the degenerate parameters, which can be removed by including more observable information and/or additional constraints. A computer code developed based on this algorithm is implemented as a user-defined routine in the standard X-ray spectral fitting package, XSPEC. Further, we demonstrate the efficacy of the algorithm by fitting the well sampled SED of blazar 3C 279 during its gamma ray flare in 2014.
Review of Soil Models and Their Implementation in Multibody System Algorithms
2012-02-01
models for use with ABAQUS. The constitutive models of the user-defined materials can be programmed in the user subroutine UMAT. Many user-defined...mechanical characteristics of mildly or moderately expansive unsaturated soils. As originally proposed by Alonso, utilizing a critical state framework...review of some of these programs is presented. ABAQUS is a popular FE analysis program that contains a wide variety of material models and
Adding and Removing Web Area Users, and Changing User Roles
Webmasters can add users to a web area, and assign or change roles, which define the actions a user is able to take in the web area. Non-webmasters must use a request form to add users and change roles.
Toward a Behavioral Approach to Privacy for Online Social Networks
NASA Astrophysics Data System (ADS)
Banks, Lerone D.; Wu, S. Felix
We examine the correlation between user interactions and self reported information revelation preferences for users of the popular Online Social Network (OSN), Facebook. Our primary goal is to explore the use of indicators of tie strength to inform localized, per-user privacy preferences for users and their ties within OSNs. We examine the limitations of such an approach and discuss future plans to incorporate this approach into the development of an automated system for helping users define privacy policy. As part of future work, we discuss how to define/expand policy to the entire social network. We also present additional collected data similar to other studies such as perceived tie strength and information revelation preferences for OSN users.
Multi-Mission Automated Task Invocation Subsystem
NASA Technical Reports Server (NTRS)
Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.
2009-01-01
Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned from prior instrument-data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because the requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains the pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation, its rule-based, plug-in architecture is general enough to support other automated data-processing workflows.
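The rule pattern described above, running a program when a given event arrives and its condition holds, can be sketched in a few lines. Everything below, names included, is a hypothetical illustration rather than the actual MATIS interface:

```python
# Minimal sketch of an event-driven rule engine of the kind MATIS
# describes (hypothetical names; not the actual MATIS interface).
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    event_type: str                      # event that triggers the rule
    condition: Callable[[dict], bool]    # guard evaluated on the event payload
    action: Callable[[dict], None]       # program/step to run when it fires

rules = [
    Rule("file_arrived",
         lambda e: e["instrument"] == "spectrometer",
         lambda e: print(f"launch calibration pipeline on {e['path']}")),
    Rule("pipeline_done",
         lambda e: e["status"] == "ok",
         lambda e: print(f"archive products for {e['product_id']}")),
]

def dispatch(event_type: str, payload: dict) -> None:
    """Run every rule whose event type and condition match."""
    for rule in rules:
        if rule.event_type == event_type and rule.condition(payload):
            rule.action(payload)
            # a fail-safe system would persist payload/state here for restart

dispatch("file_arrived", {"instrument": "spectrometer", "path": "/data/raw/001.dat"})
dispatch("pipeline_done", {"status": "ok", "product_id": "L1A-001"})
```

A production version would add the fail-safe checkpointing the abstract describes, persisting each event and step so a restarted server can resume mid-pipeline.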
García-Betances, Rebeca I.; Cabrera-Umpiérrez, María Fernanda; Ottaviano, Manuel; Pastorino, Matteo; Arredondo, María T.
2016-01-01
Despite the rapid evolution of Information and Computer Technology (ICT), and the growing recognition of the importance of universal design in all domains of daily living, mainstream ICT-based product designers and developers still work without any truly structured tools, guidance or support to effectively adapt their products and services to users' real needs. This paper presents the approach used to define and evaluate parametric cognitive models that describe the interaction with and usage of ICT by people with aging- and disability-derived functional impairments. A multisensorial training platform was used to train, based on real user measurements in real conditions, the virtual parameterized user models that act as test-bed subjects during all stages of the simulated design of disability-friendly ICT-based products. An analytical study was carried out to identify the relevant cognitive functions involved, together with their corresponding parameters, as related to aging- and disability-derived functional impairments. Evaluation of the final cognitive virtual user models in a real application confirmed that the use of these models produces concrete, valuable benefits for the design and testing of accessible ICT-based applications and services. Parameterization of cognitive virtual user models allows cognitive and perceptual aspects to be incorporated during the design process. PMID:26907296
NASA Astrophysics Data System (ADS)
Lasbleis, M.; Day, E. A.; Waszek, L.
2017-12-01
The complex nature of inner core structure has been well established from seismic studies, with heterogeneities at various length scales, both radially and laterally. Despite this, no geodynamic model has successfully explained all of the observed seismic features. To facilitate comparisons between seismic observations and geodynamic models of inner core growth we have developed a new, open-access Python tool, GrowYourIC, that allows users to compare models of inner core structure. The code allows users to simulate different evolution models of the inner core, with user-defined rates of inner core growth, translation and rotation. Once users have "grown" an inner core with their preferred parameters, they can then explore the effect of "their" inner core's evolution on the relative age and growth rate in different regions of the inner core. The code will convert these parameters into seismic properties using either built-in mineral physics models, or user-supplied ones that calculate these seismic properties with users' own preferred mineralogical models. The 3D model of isotropic inner core properties can then be used to calculate the predicted seismic travel-time anomalies for a random, or user-specified, set of seismic ray paths through the inner core. A real dataset of inner core body-wave differential travel times is included for the purpose of comparing user-generated models of inner core growth to actual observed travel-time anomalies in the top 100 km of the inner core. Here, we explore some of the possibilities of our code. We investigate the effect of the limited illumination of the inner core by seismic waves on the robustness of kinematic model interpretation. We test the impact on seismic differential travel-time observations of several kinematic models of inner core growth: fast lateral translation; slow differential growth; and inner core super-rotation. We find that a model of inner core evolution incorporating both differential growth and slow super-rotation is able to recreate some of the more intricate details of the seismic observations. Specifically, we are able to "grow" an inner core that has an asymmetric shift in isotropic hemisphere boundaries with increasing depth in the inner core.
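As a rough illustration of the kinematic bookkeeping such a tool performs (this is not GrowYourIC's actual API), one can solve for the time at which material now at a given position first crossed the growing inner-core boundary under steady translation; the sqrt-growth law, rates and one-dimensional geometry below are assumptions chosen for brevity:

```python
# Schematic kinematic model of inner-core material age under growth plus
# translation (illustrative only; not the GrowYourIC API).  Assumes a
# sqrt(t) growth law and steady translation at speed V_TRANS along x.
import numpy as np
from scipy.optimize import brentq

R_IC = 1221.0     # present inner-core radius, km
T_IC = 1.0        # age of the inner core in model time units
V_TRANS = 200.0   # translation distance per unit model time, km

def radius_at(t):
    """Inner-core radius at time t for a sqrt(t) growth history."""
    return R_IC * np.sqrt(t / T_IC)

def crystallization_time(x_now):
    """Time at which material now at x_now (on the translation axis)
    first crossed the growing inner-core boundary."""
    def f(t):
        x_then = x_now - V_TRANS * (T_IC - t)   # undo the translation
        return abs(x_then) - radius_at(t)
    return brentq(f, 1e-9, T_IC)

for x in (-1000.0, 0.0, 1000.0):
    tc = crystallization_time(x)
    print(f"x = {x:+7.1f} km : crystallized at t = {tc:.3f}, age = {T_IC - tc:.3f}")
```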
Support vector machines-based modelling of seismic liquefaction potential
NASA Astrophysics Data System (ADS)
Pal, Mahesh
2006-08-01
This paper investigates the potential of a support vector machine (SVM)-based classification approach to assess liquefaction potential from actual standard penetration test (SPT) and cone penetration test (CPT) field data. SVMs are based on statistical learning theory and have been found to work well in comparison to neural networks in several other applications. Both CPT and SPT field data sets are used with SVMs to predict the occurrence and non-occurrence of liquefaction based on different input parameter combinations. With the SPT and CPT data sets, the highest accuracies achieved with SVMs were 96% and 97%, respectively. This suggests that SVMs can effectively model the complex relationship between different soil parameters and liquefaction potential. Several other combinations of input variables were used to assess the influence of different input parameters on liquefaction potential. The proposed approach suggests that neither the normalized cone resistance value with CPT data nor the calculation of the standardized SPT value is required. Further, SVMs require few user-defined parameters and provide better performance in comparison to the neural network approach.
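A minimal sketch of this classification set-up, using scikit-learn on synthetic stand-ins for SPT-type features (the paper's actual field data and feature definitions are not reproduced here):

```python
# Sketch of an SVM liquefaction classifier on synthetic stand-ins for
# SPT-type features (not the paper's actual field data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n = 400
# hypothetical inputs: blow count N, cyclic stress ratio CSR, fines content
X = np.column_stack([rng.uniform(2, 40, n),
                     rng.uniform(0.05, 0.5, n),
                     rng.uniform(0, 35, n)])
# synthetic labelling rule: low blow count + high CSR tends to liquefy
y = (X[:, 1] * 60 - X[:, 0] + rng.normal(0, 2, n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(C=10.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

The parsimony noted in the abstract is visible here: an RBF-kernel SVM exposes essentially two user-defined parameters, the penalty C and the kernel width gamma.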
Edla, Shwetha; Kovvali, Narayan; Papandreou-Suppappola, Antonia
2012-01-01
Constructing statistical models of electrocardiogram (ECG) signals, whose parameters can be used for automated disease classification, is of great importance in precluding manual annotation and providing prompt diagnosis of cardiac diseases. ECG signals consist of several segments with different morphologies (namely the P wave, QRS complex and the T wave) in a single heart beat, which can vary across individuals and diseases. Existing statistical ECG models also rely on obtaining a priori information from the ECG data, using preprocessing algorithms to initialize the filter parameters or to define the user-specified model parameters. In this paper, we propose an ECG modeling technique using the sequential Markov chain Monte Carlo (SMCMC) filter that can perform simultaneous model selection by adaptively choosing from different representations depending upon the nature of the data. Our results demonstrate the ability of the algorithm to track various types of ECG morphologies, including intermittently occurring ECG beats. In addition, we use the estimated model parameters as the feature set to classify between ECG signals with normal sinus rhythm and four different types of arrhythmia.
Development and parameter identification of a visco-hyperelastic model for the periodontal ligament.
Huang, Huixiang; Tang, Wencheng; Tan, Qiyan; Yan, Bin
2017-04-01
The present study developed and implemented a new visco-hyperelastic model that is capable of predicting the time-dependent biomechanical behavior of the periodontal ligament (PDL). The constitutive model was implemented in the finite element package ABAQUS by means of a user-defined material subroutine (UMAT). The stress response is decomposed into two constitutive parts in parallel: a hyperelastic and a time-dependent viscoelastic stress response. In order to identify the model parameters, an indentation equation based on the V-W hyperelastic model and an indentation creep model are developed, and the parameters are then determined by fitting them to the corresponding nanoindentation experimental data for the PDL. The nanoindentation experiment was simulated by finite element analysis to validate the visco-hyperelastic model. The simulated results are in good agreement with the experimental data, which demonstrates that the developed visco-hyperelastic model is able to accurately predict the time-dependent mechanical behavior of the PDL.
NASA Astrophysics Data System (ADS)
Khalili, Ashkan; Jha, Ratneshwar; Samaratunga, Dulip
2016-11-01
Wave propagation analysis in 2-D composite structures is performed efficiently and accurately through the formulation of a User-Defined Element (UEL) based on the wavelet spectral finite element (WSFE) method. The WSFE method is based on first-order shear deformation theory, which yields accurate results for wave motion at high frequencies. The 2-D WSFE model is highly efficient computationally and provides a direct relationship between system input and output in the frequency domain. The UEL is formulated and implemented in Abaqus (commercial finite element software) for wave propagation analysis in 2-D composite structures with complexities. The frequency-domain formulation of WSFE leads to complex-valued parameters, which are decoupled into real and imaginary parts and presented to Abaqus as real values. The final solution is obtained by forming a complex value from the real-number solutions given by Abaqus. Five numerical examples are presented in this article, namely an undamaged plate, an impacted plate, a plate with a ply drop, a folded plate and a plate with a stiffener. Wave motions predicted by the developed UEL correlate very well with Abaqus simulations. The results also show that the UEL largely retains the computational efficiency of the WSFE method and extends its ability to model complex features.
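The real/imaginary decoupling mentioned above is the standard trick for pushing a complex linear system through a real-valued solver by doubling the dimension; the sketch below illustrates the idea only, not the UEL's internals:

```python
# Solve a complex system K z = f with a real-valued solver by doubling
# the dimension (illustrative of the decoupling idea, not the Abaqus UEL).
import numpy as np

rng = np.random.default_rng(2)
n = 4
K = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
f = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# Real block form:  [Re K  -Im K] [Re z]   [Re f]
#                   [Im K   Re K] [Im z] = [Im f]
K_real = np.block([[K.real, -K.imag],
                   [K.imag,  K.real]])
f_real = np.concatenate([f.real, f.imag])

z_real = np.linalg.solve(K_real, f_real)   # the real solver sees only reals
z = z_real[:n] + 1j * z_real[n:]           # recombine afterwards

print(np.allclose(K @ z, f))               # True
```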
NASA Astrophysics Data System (ADS)
Zhao, Runchen; Ientilucci, Emmett J.
2017-05-01
Hyperspectral remote sensing systems provide spectral data composed of hundreds of narrow spectral bands, and can be used, for example, to identify targets without physical interaction. Often it is of interest to characterize the spectral variability of targets or objects. The purpose of this paper is to identify and characterize the LWIR spectral variability of targets based on an improved earth-observing statistical performance model, known as the Forecasting and Analysis of Spectroradiometric System Performance (FASSP) model. FASSP contains three basic modules: a scene model, a sensor model and a processing model. Instead of using only the mean surface reflectance as input to the model, FASSP transfers user-defined statistical characteristics of a scene through the image chain (i.e., from source to sensor). The radiative transfer model MODTRAN is used to simulate the radiative transfer based on user-defined atmospheric parameters. To retrieve class emissivity and temperature statistics, or temperature/emissivity separation (TES), a LWIR atmospheric compensation method is necessary. The FASSP model has a method to transform statistics in the visible (i.e., ELM) but currently does not have a LWIR TES algorithm in place. This paper addresses the implementation of such a TES algorithm and its associated transformation of statistics.
AESOP- INTERACTIVE DESIGN OF LINEAR QUADRATIC REGULATORS AND KALMAN FILTERS
NASA Technical Reports Server (NTRS)
Lehtinen, B.
1994-01-01
AESOP was developed to solve a number of problems associated with the design of controls and state estimators for linear time-invariant systems. The systems considered are modeled in state-variable form by a set of linear differential and algebraic equations with constant coefficients. Two key problems solved by AESOP are the linear quadratic regulator (LQR) design problem and the steady-state Kalman filter design problem. AESOP is designed to be used in an interactive manner. The user can solve design problems and analyze the solutions in a single interactive session. Both numerical and graphical information are available to the user during the session. The AESOP program is structured around a list of predefined functions. Each function performs a single computation associated with control, estimation, or system response determination. AESOP contains over sixty functions and permits the easy inclusion of user-defined functions. The user accesses these functions either by inputting a list of desired functions in the order they are to be performed, or by specifying a single function to be performed. The latter case is used when the choice of function and function order depends on the results of previous functions. The available AESOP functions are divided into several general areas including: 1) program control, 2) matrix input and revision, 3) matrix formation, 4) open-loop system analysis, 5) frequency response, 6) transient response, 7) transient function zeros, 8) LQR and Kalman filter design, 9) eigenvalues and eigenvectors, 10) covariances, and 11) user-defined functions. The most important functions are those that design linear quadratic regulators and Kalman filters. The user interacts with AESOP when using these functions by inputting design weighting parameters and by viewing displays of designed system response. Support functions obtain system transient and frequency responses, transfer functions, and covariance matrices. AESOP can also provide the user with open-loop system information including stability, controllability, and observability. The AESOP program is written in FORTRAN IV for interactive execution and has been implemented on an IBM 3033 computer using TSS 370. As currently configured, AESOP has a central memory requirement of approximately 2 megabytes. Memory requirements can be reduced by redimensioning arrays in the AESOP program. Graphical output requires adaptation of the AESOP plot routines to whatever device is available. The AESOP program was developed in 1984.
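The LQR design step at the heart of AESOP can be reproduced today in a few lines. The sketch below uses SciPy's Riccati solver and a double-integrator plant as a stand-in for AESOP's Fortran routines:

```python
# Modern sketch of the LQR design step AESOP performs: minimize the
# quadratic cost integral of x'Qx + u'Ru for dx/dt = Ax + Bu.
import numpy as np
from scipy.linalg import solve_continuous_are

# double-integrator plant as a stand-in example
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])     # design weighting parameters the user supplies
R = np.array([[1.0]])

P = solve_continuous_are(A, B, Q, R)     # algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)          # optimal gain, u = -Kx

closed_loop = A - B @ K
print("gain K:", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(closed_loop))
```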
The FTS atomic spectrum tool (FAST) for rapid analysis of line spectra
NASA Astrophysics Data System (ADS)
Ruffoni, M. P.
2013-07-01
The FTS Atomic Spectrum Tool (FAST) is an interactive graphical program designed to simplify the analysis of atomic emission line spectra obtained from Fourier transform spectrometers. Calculated, predicted and/or known experimental line parameters are loaded alongside experimentally observed spectral line profiles for easy comparison between new experimental data and existing results. Many such line profiles, which may span numerous spectra, can be viewed simultaneously to help the user detect problems from line blending or self-absorption. Once the user has determined that their experimental line profile fits are good, a key feature of FAST is the ability to calculate atomic branching fractions, transition probabilities, and oscillator strengths, together with their uncertainties, a capability not provided by existing analysis packages.
Program Summary
Program title: FAST: The FTS Atomic Spectrum Tool
Catalogue identifier: AEOW_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOW_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public License version 3
No. of lines in distributed program, including test data, etc.: 293058
No. of bytes in distributed program, including test data, etc.: 13809509
Distribution format: tar.gz
Programming language: C++
Computer: Intel x86-based systems
Operating system: Linux/Unix/Windows
RAM: 8 MB minimum; about 50-200 MB for a typical analysis
Classification: 2.2, 2.3, 21.2
Nature of problem: Visualisation of atomic line spectra, including the comparison of theoretical line parameters with experimental atomic line profiles; accurate intensity calibration of experimental spectra; and determination of the observed relative line intensities needed for calculating atomic branching fractions and oscillator strengths.
Solution method: FAST is centred around a graphical interface in which a user may view sets of experimental line profiles and compare them to calculated data (such as from the Kurucz database [1]), predicted line parameters, and/or previously known experimental results. With additional information on the spectral response of the spectrometer, obtained from a calibrated standard light source, FT spectra may be intensity calibrated. In turn, this permits the user to calculate atomic branching fractions and oscillator strengths, and their respective uncertainties.
Running time: Open ended; defined by the user.
References: [1] R.L. Kurucz (2007). URL http://kurucz.harvard.edu/atoms/.
Defining Information Needs of Computer Users: A Human Communication Problem.
ERIC Educational Resources Information Center
Kimbrough, Kenneth L.
This exploratory investigation of the process of defining the information needs of computer users and the impact of that process on information retrieval focuses on communication problems. Six sites were visited that used computers to process data or to provide information, including the California Department of Transportation, the California…
Application of new type of distributed multimedia databases to networked electronic museum
NASA Astrophysics Data System (ADS)
Kuroda, Kazuhide; Komatsu, Naohisa; Komiya, Kazumi; Ikeda, Hiroaki
1999-01-01
Recently, various kinds of multimedia application systems have been actively developed, building on advances in high-speed communication networks, computer processing technologies, and digital content-handling technologies. Against this background, this paper proposes a new distributed multimedia database system which can effectively perform cooperative retrieval among distributed databases. The proposed system introduces a new concept of a 'retrieval manager', which functions as an intelligent controller so that the user can recognize a set of distributed databases as one logical database. The logical database dynamically generates and executes a preferred combination of retrieval parameters on the basis of both directory data and the system environment. Moreover, a concept of 'domain' is defined in the system as a managing unit of retrieval; retrieval can be performed effectively by cooperative processing among multiple domains. A communication language and protocols are also defined in the system and are used in every communication action within it. A language interpreter on each machine translates the communication language into that machine's internal language. Using the language interpreter, internal processing modules, such as the DBMS and user-interface modules, can be freely selected. A concept of a 'content-set' is also introduced: a content-set is defined as a package of mutually related contents, and the system handles a content-set as one object. The user terminal can effectively control the display of retrieved contents by referring to data indicating the relations among the contents in the content-set. In order to verify the function of the proposed system, a networked electronic museum was built experimentally. The results of this experiment indicate that the proposed system can effectively retrieve the desired contents under the control of a number of distributed domains, and that the system works effectively even as it grows large.
BGFit: management and automated fitting of biological growth curves.
Veríssimo, André; Paixão, Laura; Neves, Ana Rute; Vinga, Susana
2013-09-25
Existing tools to model cell growth curves do not offer a flexible, integrative approach to manage large datasets and automatically estimate parameters. Due to the increase of experimental time-series data from microbiology and oncology, software that allows researchers to easily organize experimental data and simultaneously extract relevant parameters in an efficient way is crucial. BGFit provides a web-based unified platform where a rich set of dynamic models can be fitted to experimental time-series data, further allowing the results to be managed efficiently in a structured and hierarchical way. The data-managing system allows users to organize projects, experiments and measurement data, and also to define teams with different editing and viewing permissions. Several dynamic and algebraic models are already implemented, such as polynomial regression, Gompertz, Baranyi, Logistic and Live Cell Fraction models, and users can easily add new models, thus expanding the current set. BGFit allows users to easily manage their data and models in an integrated way, even if they are not familiar with databases or existing computational tools for parameter estimation. BGFit is designed with a flexible architecture that focuses on extensibility and leverages free software with existing tools and methods, allowing users to compare and evaluate different data-modeling techniques. The application is described in the context of fitting bacterial and tumor cell growth data, but it is also applicable to any type of two-dimensional data, e.g. physical chemistry and macroeconomic time series, and is fully scalable to large numbers of projects, data volumes and model complexity.
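For a single time series, the fitting step such a platform automates reduces to non-linear least squares. A sketch with a Gompertz model (Zwietering form) on synthetic data, not BGFit's actual code:

```python
# Sketch of the curve-fitting step BGFit automates: fitting a Gompertz
# growth model to a time series (illustrative; BGFit itself is web-based).
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, A, mu, lam):
    """Modified Gompertz: A = asymptote, mu = max growth rate, lam = lag."""
    return A * np.exp(-np.exp(mu * np.e / A * (lam - t) + 1.0))

rng = np.random.default_rng(3)
t = np.linspace(0, 24, 30)                         # hours
y = gompertz(t, 1.2, 0.25, 4.0) + rng.normal(0, 0.02, t.size)

popt, pcov = curve_fit(gompertz, t, y, p0=(1.0, 0.2, 3.0))
perr = np.sqrt(np.diag(pcov))                      # 1-sigma uncertainties
for name, val, err in zip(("A", "mu", "lambda"), popt, perr):
    print(f"{name} = {val:.3f} +/- {err:.3f}")
```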
Performance evaluation of power control algorithms in wireless cellular networks
NASA Astrophysics Data System (ADS)
Temaneh-Nyah, C.; Iita, V.
2014-10-01
Power control in a mobile communication network aims to set transmission power levels so that the required quality of service (QoS) for the users is guaranteed with the lowest possible transmission powers. Most studies of power control algorithms in the literature are based on simplifying assumptions that compromise the validity of the results when applied in a real environment. In this paper, a CDMA network was simulated. The real environment was accounted for by defining the analysis area, by defining the base stations and mobile stations by their geographical coordinates, and by accounting for the mobility of the mobile stations. The simulation also allowed a number of network parameters, including the network traffic and the wireless channel models, to be modified. Finally, we present simulation results of a convergence-speed-based comparative analysis of three uplink power control algorithms.
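For context, a classic distributed uplink power-control baseline is the Foschini-Miljanic SIR-balancing iteration, in which each user scales its power by the ratio of target to achieved SIR. The sketch below is schematic, with random link gains, and is not necessarily one of the three algorithms compared in the paper:

```python
# Foschini-Miljanic SIR-balancing iteration on random link gains
# (schematic baseline, not the paper's specific algorithms).
import numpy as np

rng = np.random.default_rng(4)
n_users = 5
G = rng.uniform(0.01, 0.1, (n_users, n_users))       # cross-link gains G[i, j]
np.fill_diagonal(G, rng.uniform(1.0, 2.0, n_users))  # own-link gains
noise = 1e-3
sir_target = 2.0                                     # QoS requirement per user

p = np.full(n_users, 1e-2)                           # initial transmit powers
for _ in range(200):
    interference = G @ p - np.diag(G) * p + noise    # interference plus noise
    sir = np.diag(G) * p / interference
    p = p * sir_target / sir                         # each user updates locally

print("converged powers:", np.round(p, 5))
print("achieved SIR:", np.round(np.diag(G) * p / (G @ p - np.diag(G) * p + noise), 3))
```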
Method of Simulating Flow-Through Area of a Pressure Regulator
NASA Technical Reports Server (NTRS)
Hass, Neal E. (Inventor); Schallhorn, Paul A. (Inventor)
2011-01-01
The flow-through area of a pressure regulator positioned in a branch of a simulated fluid flow network is generated. A target pressure is defined downstream of the pressure regulator. A projected flow-through area is generated as a non-linear function of (i) target pressure, (ii) flow-through area of the pressure regulator for a current time step and a previous time step, and (iii) pressure at the downstream location for the current time step and previous time step. A simulated flow-through area for the next time step is generated as a sum of (i) flow-through area for the current time step, and (ii) a difference between the projected flow-through area and the flow-through area for the current time step multiplied by a user-defined rate control parameter. These steps are repeated for a sequence of time steps until the pressure at the downstream location is approximately equal to the target pressure.
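The update law quoted above transcribes almost directly into code. In the sketch below the projection function is a hypothetical secant-style stand-in (the patent defines it only as a non-linear function of the listed quantities), and the plant is a toy linear pressure response:

```python
# Direct transcription of the flow-area update law described above.  The
# projection step is a hypothetical secant-style stand-in; the patent
# defines it as a non-linear function of the two most recent states.
def next_flow_area(a_curr, a_prev, p_curr, p_prev, p_target, rate=0.5):
    """One time step of the regulator flow-area iteration."""
    dp = p_curr - p_prev
    da = a_curr - a_prev
    # hypothetical secant-style projection toward the target pressure
    a_projected = a_curr + (p_target - p_curr) * (da / dp if dp else 0.0)
    # patented form: new area = current + rate * (projected - current)
    return a_curr + rate * (a_projected - a_curr)

# toy usage: pretend downstream pressure responds linearly to area
pressure_of = lambda a: 50.0 * a          # stand-in plant model
area_prev, area = 1.0, 1.1
p_target = 80.0
for step in range(20):
    p_prev, p_curr = pressure_of(area_prev), pressure_of(area)
    area_prev, area = area, next_flow_area(area, area_prev, p_curr, p_prev, p_target)

print(f"final area = {area:.4f}, pressure = {pressure_of(area):.2f}")  # ~80
```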
Observation Planning Made Simple with Science Opportunity Analyzer (SOA)
NASA Technical Reports Server (NTRS)
Streiffert, Barbara A.; Polanskey, Carol A.
2004-01-01
As NASA undertakes the exploration of the Moon and Mars as well as the rest of the Solar System, while continuing to investigate Earth's oceans, winds, atmosphere, weather, etc., the long-standing need to allow operations users to easily define their observations continues to grow. Operations teams need to be able to determine the best time to perform an observation, as well as its duration and other parameters such as the observation target. In addition, they need to be able to check the observation for validity against objectives and intent, as well as spacecraft constraints such as turn rates and acceleration or pointing exclusion zones. Science Opportunity Analyzer (SOA), in development for the last six years, is a multi-mission toolset built to meet those needs. An operations team member can follow six simple steps to define an observation without having to know the complexities of orbital mechanics, coordinate transformations, or the spacecraft itself.
A Driving Behaviour Model of Electrical Wheelchair Users
Hamam, Y.; Djouani, K.; Daachi, B.; Steyn, N.
2016-01-01
In spite of the availability of powered wheelchairs, some users still experience steering challenges and manoeuvring difficulties that limit their capacity to navigate effectively. For such users, steering support and assistive systems may be necessary. For the assistance to be appreciated, the assistive control must be adaptable to the user's steering behaviour. This paper contributes to wheelchair steering improvement by modelling the steering behaviour of powered wheelchair users for integration into the control system. More precisely, the modelling is based on the improved Directed Potential Field (DPF) method for trajectory planning. The method has facilitated the formulation of a simple behaviour model that is also linear in its parameters. To obtain the steering data for parameter identification, seven individuals participated in driving the wheelchair in different virtual worlds on the augmented platform. The obtained data facilitated the estimation of user parameters using the ordinary least-squares method, with satisfactory regression analysis results. PMID:27148362
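Because the model is linear in its parameters, the identification step reduces to ordinary least squares. The following sketch uses synthetic regressors rather than the study's actual DPF features:

```python
# Generic sketch of the identification step: a steering model linear in
# its parameters estimated by ordinary least squares (synthetic
# regressors, not the study's DPF features).
import numpy as np

rng = np.random.default_rng(5)
n = 300
Phi = np.column_stack([rng.normal(size=n),        # e.g. goal-attraction term
                       rng.normal(size=n),        # e.g. obstacle-repulsion term
                       np.ones(n)])               # bias
theta_true = np.array([1.5, -0.7, 0.2])
y = Phi @ theta_true + rng.normal(0, 0.1, n)      # observed steering command

theta_hat, residuals, rank, _ = np.linalg.lstsq(Phi, y, rcond=None)
print("estimated parameters:", np.round(theta_hat, 3))
```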
A Literature Review: Website Design and User Engagement.
Garett, Renee; Chiu, Jason; Zhang, Ly; Young, Sean D
2016-07-01
Proper design has become a critical element needed to engage website and mobile application users. However, little research has been conducted to define the specific elements used in effective website and mobile application design. We attempt to review and consolidate research on effective design and to define a short list of elements frequently used in research. The design elements mentioned most frequently in the reviewed literature were navigation, graphical representation, organization, content utility, purpose, simplicity, and readability. We discuss how previous studies define and evaluate these seven elements. This review and the resulting short list of design elements may be used to help designers and researchers to operationalize best practices for facilitating and predicting user engagement.
Effect of the initial configuration for user-object reputation systems
NASA Astrophysics Data System (ADS)
Wu, Ying-Ying; Guo, Qiang; Liu, Jian-Guo; Zhang, Yi-Cheng
2018-07-01
Accurately identifying user reputation is significant for online social systems. We investigate the effect of the initial configuration of the beta-probability-distribution method for ranking online user reputation (RBPD) by varying its parameters α and β for different values of the fair-rating parameter q. Experimental results for the Netflix and MovieLens data sets show that when the parameter q equals 0.8 and 0.9, the accuracy (AUC) increases by about 4.5% and 3.5% respectively for the Netflix data set, while the AUC increases by about 1.5% for the MovieLens data set when q is 0.9. Furthermore, we investigate the evolution of the AUC for different α and β, and find that as rating records accumulate, the AUC increases by about 0.2 and 0.16 for the Netflix and MovieLens data sets respectively, indicating that online users' reputations can be identified more accurately as they rate more and more objects.
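A beta-distribution reputation score with prior parameters α and β can be sketched as follows, counting a rating as fair when its deviation from the object's mean falls below the q-quantile of all deviations; this illustrates the general RBPD idea rather than reproducing the paper's exact implementation:

```python
# Schematic beta-distribution reputation score with priors alpha, beta
# (illustrative of the RBPD idea, not the paper's exact method).
import numpy as np

def reputations(ratings, q=0.9, alpha=1.0, beta=1.0):
    """ratings: matrix (users x objects), NaN where unrated."""
    obj_mean = np.nanmean(ratings, axis=0)
    err = np.abs(ratings - obj_mean)
    thresh = np.nanquantile(err, q)            # fair-rating parameter q
    fair = np.nansum(err <= thresh, axis=1)    # ratings close to consensus
    unfair = np.nansum(err > thresh, axis=1)   # outlier ratings
    return (fair + alpha) / (fair + unfair + alpha + beta)

rng = np.random.default_rng(6)
truth = rng.uniform(1, 5, 50)                          # object quality
honest = truth + rng.normal(0, 0.3, (40, 50))          # 40 honest users
spammers = rng.uniform(1, 5, (10, 50))                 # 10 random raters
R = np.vstack([honest, spammers])

rep = reputations(R)
print("mean honest rep:", rep[:40].mean().round(3),
      "mean spammer rep:", rep[40:].mean().round(3))
```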
AITRAC: Augmented Interactive Transient Radiation Analysis by Computer. User's information manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1977-10-01
AITRAC is a program designed for on-line, interactive, DC, and transient analysis of electronic circuits. The program solves the linear and nonlinear simultaneous equations which characterize the mathematical models used to predict circuit response. The program features 100-external-node, 200-branch capability; a conversational, free-format input language; built-in junction, FET, MOS, and switch models; a sparse matrix algorithm with extended-precision H matrix and T vector calculations, for fast and accurate execution; linear transconductances: beta, GM, MU, ZM; accurate and fast radiation-effects analysis; a special interface for user-defined equations; selective control of multiple outputs; graphical outputs in wide and narrow formats; and on-line parameter modification capability. The user describes the problem by entering the circuit topology and part parameters. The program then automatically generates and solves the circuit equations, providing the user with printed or plotted output. The circuit topology and/or part values may then be changed by the user, and a new analysis requested. Circuit descriptions may be saved on disk files for storage and later use. The program contains built-in standard models for resistors, voltage and current sources, capacitors, inductors including mutual couplings, switches, junction diodes and transistors, FETs, and MOS devices. Nonstandard models may be constructed from standard models or by using the special equations interface. Time functions may be described by straight-line segments or by sine, damped sine, and exponential functions. 42 figures, 1 table. (RWR)
Thieler, E. Robert; Himmelstoss, Emily A.; Zichichi, Jessica L.; Ergul, Ayhan
2009-01-01
The Digital Shoreline Analysis System (DSAS) version 4.0 is a software extension to ESRI ArcGIS v.9.2 and above that enables a user to calculate shoreline rate-of-change statistics from multiple historic shoreline positions. A user-friendly interface of simple buttons and menus guides the user through the major steps of shoreline change analysis. Components of the extension and user guide include (1) instruction on the proper way to define a reference baseline for measurements, (2) automated and manual generation of measurement transects and metadata based on user-specified parameters, and (3) output of calculated rates of shoreline change and other statistical information. DSAS computes shoreline rates of change using four different methods: (1) endpoint rate, (2) simple linear regression, (3) weighted linear regression, and (4) least median of squares. The standard error, correlation coefficient, and confidence interval are also computed for the simple and weighted linear-regression methods. The results of all rate calculations are output to a table that can be linked to the transect file by a common attribute field. DSAS is intended to facilitate the shoreline change-calculation process and to provide rate-of-change information and the statistical data necessary to establish the reliability of the calculated results. The software is also suitable for any generic application that calculates positional change over time, such as assessing rates of change of glacier limits in sequential aerial photos, river edge boundaries, land-cover changes, and so on.
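Two of the four rate-of-change methods are simple enough to sketch for a single transect. The shoreline positions below are synthetic, and the weighted-regression and least-median-of-squares variants are omitted:

```python
# Two of the four DSAS rate-of-change methods, sketched for one transect:
# end-point rate and simple linear regression (synthetic positions).
import numpy as np

years = np.array([1970.0, 1985.0, 1998.0, 2009.0])
dist = np.array([12.0, 18.5, 22.0, 30.5])   # shoreline distance along transect, m

# end-point rate: oldest vs. most recent shoreline only
epr = (dist[-1] - dist[0]) / (years[-1] - years[0])

# simple linear regression through all shoreline positions
slope, intercept = np.polyfit(years, dist, 1)
resid = dist - (slope * years + intercept)
stderr = np.sqrt(resid @ resid / (len(years) - 2)
                 / np.sum((years - years.mean()) ** 2))

print(f"end-point rate: {epr:.3f} m/yr")
print(f"regression rate: {slope:.3f} +/- {stderr:.3f} m/yr")
```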
NASA Astrophysics Data System (ADS)
Dyer, T.; Brodie, K. L.; Spore, N.
2016-02-01
Modern LIDAR systems, while capable of providing highly accurate and dense datasets, introduce significant challenges in data processing and end-user accessibility. At the United States Army Corps of Engineers Field Research Facility in Duck, North Carolina, we have developed a stationary LIDAR tower for the continuous monitoring of the ocean, beach, and foredune, as well as an automated workflow capable of providing scientific data products from the LIDAR scanner in near real-time through an online data portal. The LIDAR performs hourly scans, taking approximately 50 minutes to complete and producing datasets on the order of 1GB. Processing of the LIDAR data includes coordinate transformations, data rectification and coregistration, filtering to remove noise and unwanted objects, gridding, and time-series analysis to generate products for use by end-users. Examples of these products include water levels and significant wave heights, virtual wave gauge time-series and FFTs, wave runup, foreshore elevations and slopes, and bare earth DEMs. Immediately after processing, data products are combined with ISO compliant metadata and stored using the NetCDF-4 file format, making them easily discoverable through a web portal which provides an interactive map that allows users to explore datasets both spatially and temporally. End-users can download datasets in user-defined time intervals, which can be used, for example, as forcing or validation parameters in numerical models. Funded by the USACE Coastal Ocean Data Systems Program.
Optimization techniques applied to spectrum management for communications satellites
NASA Astrophysics Data System (ADS)
Ottey, H. R.; Sullivan, T. M.; Zusman, F. S.
This paper describes user requirements, algorithms and software design features for the application of optimization techniques to the management of the geostationary orbit/spectrum resource. Relevant problems include parameter sensitivity analyses, frequency and orbit position assignment coordination, and orbit position allotment planning. It is shown how integer and nonlinear programming as well as heuristic search techniques can be used to solve these problems. Formalized mathematical objective functions that define the problems are presented. Constraint functions that impart the necessary solution bounds are described. A versatile program structure is outlined, which would allow problems to be solved in stages while varying the problem space, solution resolution, objective function and constraints.
Fels, S S; Hinton, G E
1997-01-01
Glove-Talk II is a system which translates hand gestures to speech through an adaptive interface. Hand gestures are mapped continuously to ten control parameters of a parallel formant speech synthesizer. The mapping allows the hand to act as an artificial vocal tract that produces speech in real time. This gives an unlimited vocabulary in addition to direct control of fundamental frequency and volume. Currently, the best version of Glove-Talk II uses several input devices, a parallel formant speech synthesizer, and three neural networks. The gesture-to-speech task is divided into vowel and consonant production by using a gating network to weight the outputs of a vowel and a consonant neural network. The gating network and the consonant network are trained with examples from the user. The vowel network implements a fixed user-defined relationship between hand position and vowel sound and does not require any training examples from the user. Volume, fundamental frequency, and stop consonants are produced with a fixed mapping from the input devices. With Glove-Talk II, the subject can speak slowly but with far more natural sounding pitch variations than a text-to-speech synthesizer.
CONSOLE: A CAD tandem for optimization-based design interacting with user-supplied simulators
NASA Technical Reports Server (NTRS)
Fan, Michael K. H.; Wang, Li-Sheng; Koninckx, Jan; Tits, Andre L.
1989-01-01
CONSOLE employs a recently developed design methodology (International Journal of Control 43:1693-1721) which provides the designer with a congenial environment in which to express a problem as a multiple-objective constrained optimization problem, and allows the characterization of optimality to be refined as a suboptimal design is approached. To this end, in CONSOLE, the designer formulates the design problem using a high-level language and performs design tasks and explores tradeoffs through a few short and clearly defined commands. The range of problems that can be solved efficiently using a CAD tool depends very much on the ability of this tool to be interfaced with user-supplied simulators. For instance, when designing a control system one makes use of the characteristics of the plant, and therefore a model of the plant under study has to be made available to the CAD tool. CONSOLE allows for easy interfacing with almost any simulator the user has available. To date, CONSOLE has been used successfully in many applications, including the design of controllers for a flexible arm and for a robotic manipulator, and the solution of a parameter selection problem for a neural network.
SMART (Sandia's Modular Architecture for Robotics and Teleoperation) Ver. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Robert
"SMART Ver. 0.8 Beta" provides a system developer with software tools to create a telerobotic control system, i.e., a system whereby an end-user can interact with mechatronic equipment. It consists of three main components: the SMART Editor (tsmed), the SMART Real-time kernel (rtos), and the SMART Supervisor (gui). The SMART Editor is a graphical icon-based code generation tool for creating end-user systems, given descriptions of SMART modules. The SMART real-time kernel implements behaviors that combine modules representing input devices, sensors, constraints, filters, and robotic devices. Included with this software release is a number of core modules, which can be combinedmore » with additional project and device specific modules to create a telerobotic controller. The SMART Supervisor is a graphical front-end for running a SMART system. It is an optional component of the SMART Environment and utilizes the TeVTk windowing and scripting environment. Although the code contained within this release is complete, and can be utilized for defining, running, and interfacing to a sample end-user SMART system, most systems will include additional project and hardware specific modules developed either by the system developer or obtained independently from a SMART module developer. SMART is a software system designed to integrate the different robots, input devices, sensors and dynamic elements required for advanced modes of telerobotic control. "SMART Ver. 0.8 Beta" defines and implements a telerobotic controller. A telerobotic system consists of combinations of modules that implement behaviors. Each real-time module represents an input device, robot device, sensor, constraint, connection or filter. The underlying theory utilizes non-linear discretized multidimensional network elements to model each individual module, and guarantees that upon a valid connection, the resulting system will perform in a stable fashion. Different combinations of modules implement different behaviors. Each module must have at a minimum an initialization routine, a parameter adjustment routine, and an update routine. The SMART runtime kernel runs continuously within a real-time embedded system. Each module is first set-up by the kernel, initialized, and then updated at a fixed rate whenever it is in context. The kernel responds to operator directed commands by changing the state of the system, changing parameters on individual modules, and switching behavioral modes. The SMART Editor is a tool used to define, verify, configure and generate source code for a SMART control system. It uses icon representations of the modules, code patches from valid configurations of the modules, and configuration files describing how a module can be connected into a system to lead the end-user in through the steps needed to create a final system. The SMART Supervisor serves as an interface to a SMART run-time system. It provides an interface on a host computer that connects to the embedded system via TCPIIP ASCII commands. It utilizes a scripting language (Tel) and a graphics windowing environment (Tk). This system can either be customized to fit an end-user's needs or completely replaced as needed.« less
NASA Technical Reports Server (NTRS)
Choi, H. J.; Su, Y. T.
1986-01-01
The User Constraint Measurement System (UCMS) is a hardware/software package developed by NASA Goddard to measure the signal parameter constraints of the user transponder in the TDRSS environment by means of an all-digital signal sampling technique. An account is presently given of the features of UCMS design and of its performance capabilities and applications; attention is given to such important aspects of the system as RF interface parameter definitions, hardware minimization, the emphasis on offline software signal processing, and end-to-end link performance. Applications to the measurement of other signal parameters are also discussed.
MASTOS: Mammography Simulation Tool for design Optimization Studies.
Spyrou, G; Panayiotakis, G; Tzanakos, G
2000-01-01
Mammography is a high-quality imaging technique for the detection of breast lesions, which requires dedicated equipment and optimum operation. The design parameters of a mammography unit have to be decided and evaluated before the construction of such high-cost apparatus, and the optimum operational parameters must likewise be defined well before real breast examination. MASTOS is a software package, based on Monte Carlo methods, designed to be used as a simulation tool in mammography. The input consists of the parameters that have to be specified when using a mammography unit, together with parameters specifying the shape and composition of the breast phantom; in addition, the input may specify parameters needed in the design of a new mammographic apparatus. The main output of the simulation is a mammographic image, along with calculations of various factors that describe image quality. The Monte Carlo simulation code is PC-based and is driven by an outer shell with a graphical user interface. The entire software package is a simulation tool for mammography and can be applied in basic research and/or in training in the fields of medical physics and biomedical engineering, as well as in the performance evaluation of new designs of mammography units and in the determination of optimum standards for the operational parameters of a mammography unit.
Karpievitch, Yuliya V; Almeida, Jonas S
2006-01-01
Background: Matlab, a powerful and productive language that allows for rapid prototyping, modeling and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G and others allow for the remote (and possibly parallel) execution of Matlab commands with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke.
Results: mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e. the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open-source environment. Apart from the programming language itself, all other components are also open-source, freely available tools: light-weight PHP scripts and the Apache web server.
Conclusion: Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, light-weight PHP scripts and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows for it to be easily extensible over the Internet. PMID:16539707
Angland, P.; Haberberger, D.; Ivancic, S. T.; ...
2017-10-30
Here, a new method of analysis for angular filter refractometry images was developed to characterize laser-produced, long-scale-length plasmas, using an annealing algorithm to iteratively converge upon a solution. Angular filter refractometry (AFR) is a novel technique used to characterize the density profiles of laser-produced, long-scale-length plasmas. A synthetic AFR image is constructed from a user-defined density profile described by eight parameters, and the algorithm systematically alters the parameters until the comparison is optimized. The optimization and statistical uncertainty calculation are based on a minimization of the χ² test statistic. The algorithm was successfully applied to experimental data of plasma expanding from a flat, laser-irradiated target, resulting in an average uncertainty in the density profile of 5-10% in the region of interest.
Decision support for operations and maintenance (DSOM) system
Jarrell, Donald B [Kennewick, WA; Meador, Richard J [Richland, WA; Sisk, Daniel R [Richland, WA; Hatley, Darrel D [Kennewick, WA; Brown, Daryl R [Richland, WA; Keibel, Gary R [Richland, WA; Gowri, Krishnan [Richland, WA; Reyes-Spindola, Jorge F [Richland, WA; Adams, Kevin J [San Bruno, CA; Yates, Kenneth R [Lake Oswego, OR; Eschbach, Elizabeth J [Fort Collins, CO; Stratton, Rex C [Richland, WA
2006-03-21
A method for minimizing the life-cycle cost of processes such as heating a building. The method utilizes sensors to monitor various pieces of equipment used in the process, for example boilers, turbines, and the like. The method then performs the steps of identifying a set of optimal operating conditions for the process; identifying and measuring the parameters necessary to characterize the actual operating condition of the process; validating the data generated by measuring those parameters; characterizing the actual condition of the process; identifying an optimal condition corresponding to the actual condition; comparing said optimal condition with the actual condition and identifying variances between the two; drawing from a set of pre-defined algorithms, created using best engineering practices, an explanation of at least one likely source and at least one recommended remedial action for selected variances; and providing said explanation as an output to at least one user.
Earth Survey Applications Division. [a bibliography
NASA Technical Reports Server (NTRS)
Carpenter, L. (Editor)
1981-01-01
Accomplishments of research and data analysis conducted to study physical parameters and processes inside the Earth and on the Earth's surface, to define techniques and systems for remotely sensing the processes and measuring the parameters of scientific and applications interest, and the transfer of promising operational applications techniques to the user community of Earth resources monitors, managers, and decision makers are described. Research areas covered include: geobotany, magnetic field modeling, crustal studies, crustal dynamics, sea surface topography, land resources, remote sensing of vegetation and soils, and hydrological sciences. Major accomplishments include: production of global maps of magnetic anomalies using Magsat data; computation of the global mean sea surface using GEOS-3 and Seasat altimetry data; delineation of the effects of topography on the interpretation of remotely-sensed data; application of snowmelt runoff models to water resources management; and mapping of snow depth over wheat growing areas using Nimbus microwave data.
Sampling ARG of multiple populations under complex configurations of subdivision and admixture.
Carrieri, Anna Paola; Utro, Filippo; Parida, Laxmi
2016-04-01
Simulating complex evolution scenarios of multiple populations is an important task for answering many basic questions relating to population genomics. Apart from the population samples, the underlying Ancestral Recombination Graph (ARG) is an additional important means in hypothesis checking and reconstruction studies. Furthermore, complex simulations require a plethora of interdependent parameters, making even the scenario specification highly non-trivial. We present an algorithm, SimRA, that simulates a generic multiple-population evolution model with admixture. It is based on random graphs and dramatically improves on the time and space requirements of the classical algorithm for single populations. Using the underlying random-graphs model, we also derive closed forms of the expected values of the ARG characteristics, i.e., height of the graph, number of recombinations, number of mutations and population diversity, in terms of its defining parameters. This is crucial in aiding the user to specify meaningful parameters for complex scenario simulations, not through trial and error based on raw compute power but through intelligent parameter estimation. To the best of our knowledge this is the first time closed-form expressions have been computed for the ARG properties. We show through simulations that the expected values closely match the empirical values. Finally, we demonstrate that SimRA produces the ARG in compact forms without compromising accuracy, and we demonstrate the compactness and accuracy through extensive experiments. SimRA (Simulation based on Random graph Algorithms) source, executable, user manual and sample input-output sets are available for download at https://github.com/ComputationalGenomics/SimRA. Contact: parida@us.ibm.com. Supplementary data are available at Bioinformatics online.
User's guide to HYPOINVERSE-2000, a Fortran program to solve for earthquake locations and magnitudes
Klein, Fred W.
2002-01-01
Hypoinverse is a computer program that processes files of seismic station data for an earthquake (like P-wave arrival times and seismogram amplitudes and durations) into earthquake locations and magnitudes. It is one of a long line of similar USGS programs including HYPOLAYR (Eaton, 1969), HYPO71 (Lee and Lahr, 1972), and HYPOELLIPSE (Lahr, 1980). If you are new to Hypoinverse, you may want to start by glancing at the section “SOME SIMPLE COMMAND SEQUENCES” to get a feel for some simpler sessions. This document is essentially an advanced user’s guide, and reading it sequentially will probably plow the reader into more detail than he/she needs. Every user must have a crust model, station list and phase data input files, and glancing at these sections is a good place to begin. The program has many options because it has grown over the years to meet the needs of one of the largest seismic networks in the world, but small networks with just a few stations do use the program and can ignore most of the options and commands. History and availability. Hypoinverse was originally written for the Eclipse minicomputer in 1978 (Klein, 1978). A revised version for VAX and Pro-350 computers (Klein, 1985) was later expanded to include multiple crustal models and other capabilities (Klein, 1989). This current report documents the expanded Y2000 version and supersedes the earlier documents. It serves as a detailed user's guide to the current version running on Unix and VAX-Alpha computers, and to the version supplied with the Earthworm earthquake digitizing system. Fortran-77 source code (Sun and VAX compatible) and copies of this documentation are available via anonymous ftp from computers in Menlo Park. At present, the computer is swave.wr.usgs.gov and the directory is /ftp/pub/outgoing/klein/hyp2000. If you are running Hypoinverse on one of the Menlo Park EHZ or NCSN Unix computers, the executable currently is ~klein/hyp2000/hyp2000. New features. The Y2000 version of Hypoinverse includes all of the previous capabilities, but adds Y2000 formats to those defined earlier. In most cases, the new formats add 2 digits to the year field to accommodate the century. Other fields are sometimes rearranged or expanded to accommodate a better field order. The Y2000 formats are invoked with the “200” command. When the Y2000 flag is turned on, all files are read and written in the new format and there is no mixing of format types in a single run. Some formats without a date field, like station files, have not changed. A separate program called 2000CONV has been written to convert old formats to new. Other new features, like expanded station names, calculating amplitude magnitudes from a variety of digital seismometers, station history files, interactive earthquake processing, and locations from CUSP (Caltech USGS Seismic Processing) binary files have been added. General features. Hypoinverse will locate any number of events in an input file, which can be in one of several different formats. Any or all of printout, summary or archive output may be produced. Hypoinverse is driven by user commands. The various commands define input and output files, set adjustable parameters, and solve for locations of a file of earthquake data using the parameters and files currently set. It is both interactive and "batch" in that commands may be executed either from the keyboard or from a file. You execute the commands in a file by typing @filename at the Hypoinverse prompt.
Users may either supply parameters on the command line or omit them and be prompted interactively. The current parameter values are displayed and may be taken as defaults by pressing just the RETURN key after the prompt. This makes the program very easy to use, provided you can remember the names of the commands. Combining commands with and without their required parameters into a command file permits a variety of customized procedures, such as automatic input of crustal model and station data but prompting for a different phase file each time. All commands are 3 letters long and most require one or more parameters or file names. If they appear on a line with a command, character strings such as filenames must be enclosed in apostrophes (single quotes). Appendix 1 gives this and other free-format rules for supplying parameters, which are parsed in Fortran. When several parameters are required following a command, any of them may be omitted by replacing them with null fields (see Appendix 1). A null field leaves that parameter unchanged from its current or default value. When you start Hypoinverse, default values are in effect for all parameters except file names. Hypoinverse is a complicated program with many features and options. Many of these "advanced" or seldom-used features are documented here, but in more detail than a typical user needs when first starting with the program. I have put some of this material in smaller type so that a first-time user can concentrate on the more important information.
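To make these conventions concrete, a small command file might look like the sketch below. Only the "200" command, the single-quote rule, and the @filename invocation are taken from the description above; the remaining command names, parameters, and file names are assumptions for illustration rather than verified Hypoinverse syntax.

    200 T 2000 0
    STA 'stations.dat'
    CRH 1 'crust1.crh'
    PHS 'phase.arc'
    LOC

If this were saved as locate.hyp, typing @locate.hyp at the Hypoinverse prompt would execute the commands in order; any command given without its parameters would instead prompt interactively, with RETURN accepting the displayed default.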
NASA Technical Reports Server (NTRS)
Johnson, Charles S.
1986-01-01
It is nearly axiomatic that to take the greatest advantage of the useful features available in a development system, and to avoid the negative interactions of those features, requires the exercise of a design methodology which constrains their use. A major design support feature of the Ada language is abstraction: for data, functions, processes, resources, and system elements in general. Atomic abstract types can be created in packages defining those private types and all of the overloaded operators, functions, and hidden data required for their use in an application. Generically structured abstract types can be created in generic packages defining those structured private types as buildups from the user-defined data types which are input as parameters. A study is made of the design constraints required for software incorporating either atomic or generically structured abstract types, if the integration of software components based on them is to be subsequently performed. The impact of these techniques on the reusability of software and the creation of project-specific software support environments is also discussed.
Nurse Scheduling by Cooperative GA with Effective Mutation Operator
NASA Astrophysics Data System (ADS)
Ohki, Makoto
In this paper, we propose an effective mutation operator for a Cooperative Genetic Algorithm (CGA) applied to a practical Nurse Scheduling Problem (NSP). Nurse scheduling is a very difficult task, because the NSP is a complex combinatorial optimization problem in which many requirements must be considered. In real hospitals, the schedule changes frequently. Changes to the shift schedule cause various problems, for example a fall in the nursing level. We describe a technique for reoptimizing the nurse schedule in response to such a change. The conventional CGA has good local search ability by means of its crossover operator, but often stagnates in an unfavorable situation because its global search ability is weak. When the optimization stagnates for many generation cycles, the searching point, the population in this case, is likely caught in a wide local-minimum area. To escape such a local-minimum area, a small change in the population is required. Based on this consideration, we propose a mutation operator activated depending on the optimization speed. When the optimization stagnates, in other words when the optimization speed decreases, the mutation yields small changes in the population. The population is then able to escape from a local-minimum area by means of the mutation. However, this mutation operator requires two well-chosen parameters, which means that the user has to consider their values carefully. To solve this problem, we propose a periodic mutation operator which has only one parameter to define itself. This simplified mutation operator is effective over a wide range of the parameter value.
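As a rough illustration of the stagnation-triggered idea (a sketch, not the authors' implementation), the operator below fires only when the best cost has stopped improving over a recent window; the window length, threshold, mutation rate, and shift alphabet are hypothetical parameters.

    import random

    SHIFTS = ("day", "evening", "night", "off")  # assumed shift alphabet

    def mutate_if_stagnant(population, history, window=50, eps=1e-6, rate=0.02):
        """Perturb the population when the optimization speed drops.

        population: list of schedules, each a list of shift assignments
        history:    best cost per generation (smaller is better)
        """
        if len(history) < window:
            return
        # Optimization speed ~ improvement of the best cost over the window.
        speed = history[-window] - history[-1]
        if speed > eps:
            return  # still improving; leave the population alone
        # Stagnation: small random changes help escape a wide local minimum.
        for schedule in population:
            for i in range(len(schedule)):
                if random.random() < rate:
                    schedule[i] = random.choice(SHIFTS)

The periodic variant described above would instead fire every K generations, leaving the period K as the operator's single user-defined parameter.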
A parallel coordinates style interface for exploratory volume visualization.
Tory, Melanie; Potts, Simeon; Möller, Torsten
2005-01-01
We present a user interface, based on parallel coordinates, that facilitates exploration of volume data. By explicitly representing the visualization parameter space, the interface provides an overview of rendering options and enables users to easily explore different parameters. Rendered images are stored in an integrated history bar that facilitates backtracking to previous visualization options. Initial usability testing showed clear agreement between users and experts of various backgrounds (usability, graphic design, volume visualization, and medical physics) that the proposed user interface is a valuable data exploration tool.
van den Berg, Sanne W; Peters, Esmee J; Kraaijeveld, J Frank; Gielissen, Marieke F M; Prins, Judith B
2013-08-19
Generic fully automated Web-based self-management interventions are emerging, for example, for the growing number of breast cancer survivors. It is hypothesized that the use of these interventions is more individualized and that users apply a large amount of self-tailoring. However, technical usage evaluations of these types of interventions are scarce and practical guidelines are lacking. To gain insight into meaningful usage parameters to evaluate the use of generic fully automated Web-based interventions by assessing how breast cancer survivors use a generic self-management website. The final aim is to propose practical recommendations for researchers and information and communication technology (ICT) professionals who aim to design and evaluate the use of similar Web-based interventions. The BREAst cancer ehealTH (BREATH) intervention is a generic unguided fully automated website with stepwise weekly access and a fixed 4-month structure containing 104 intervention ingredients (ie, texts, tasks, tests, videos). By monitoring https-server requests, technical usage statistics were recorded for the intervention group of the randomized controlled trial. Observed usage was analyzed by measures of frequency, duration, and activity. Intervention adherence was defined as continuous usage, or the proportion of participants who started using the intervention and continued to log in during all four phases. By comparing observed to minimal intended usage (frequency and activity), different user groups were defined. Usage statistics for 4 months were collected from 70 breast cancer survivors (mean age 50.9 years). Frequency of logins per person ranged from 0 to 45, total duration per person from 0 to 2324 minutes (38.7 hours), and activity from opening none to opening all intervention ingredients. 31 participants continued logging in to all four phases, resulting in an intervention adherence rate of 44.3% (95% CI 33.2-55.9). Nine nonusers (13%), 30 low users (43%), and 31 high users (44%) were defined. Low and high users differed significantly on frequency (P<.001), total duration (P<.001), session duration (P=.009), and activity (P<.001). High users logged in an average of 21 times, had a mean session duration of 33 minutes, and opened on average 91% of all ingredients. Signing the self-help contract (P<.001), reporting usefulness of ingredients (P=.003), overall satisfaction (P=.028), and user friendliness evaluation (P=.003) were higher in high users. User groups did not differ on age, education, and baseline distress. By reporting the usage of a self-management website for breast cancer survivors, the present study provides a first insight into the design of usage evaluations of generic fully automated Web-based interventions. It is recommended to (1) incorporate usage statistics that reflect the amount of self-tailoring applied by users, (2) combine technical usage statistics with self-reported usefulness, and (3) use qualitative measures. Also, (4) a pilot usage evaluation should be a fixed step in the development process of novel Web-based interventions, and (5) it is essential for researchers to gain insight into the rationale of recorded and nonrecorded usage statistics. Netherlands Trial Register (NTR): 2935; http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=2935 (Archived by WebCite at http://www.webcitation.org/6IkX1ADEV).
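The adherence and user-group definitions reported above reduce to simple aggregations over the server log; a minimal sketch follows, with field names and thresholds as hypothetical stand-ins for the study's minimal-intended-usage criteria.

    def classify_user(logins, opened_fraction, phases_visited,
                      min_logins=4, min_opened=0.1):
        """Classify one participant from recorded usage statistics.

        logins:          total number of logins
        opened_fraction: fraction of the 104 ingredients opened (0..1)
        phases_visited:  set of intervention phases (1..4) with >= 1 login
        """
        adherent = phases_visited == {1, 2, 3, 4}  # continuous usage
        if logins == 0:
            group = "nonuser"
        elif logins < min_logins or opened_fraction < min_opened:
            group = "low user"
        else:
            group = "high user"
        return group, adherent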
Automatic programming of simulation models
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.
1990-01-01
The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on applying an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Once the problem specification has been defined, an automatic code generator is used to write the simulation code. The following two domains were selected for evaluating the concepts of software engineering for discrete event simulation: a manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS); (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.
Major Design Drivers for LEO Space Surveillance in Europe and Solution Concepts
NASA Astrophysics Data System (ADS)
Krag, Holger; Flohrer, Tim; Klinkrad, Heiner
Europe is preparing for the development of an autonomous system for space situational awareness. One important segment of this new system will be dedicated to surveillance and tracking of space objects in Earth orbits. First concept and capability analysis studies have led to a draft system proposal. This proposal foresees, in a first deployment step, a ground-based system consisting of radar sensors and a network of optical telescopes. These sensors will be designed to have the capability of building up and maintaining a catalogue of space objects. A number of related services will be provided, including collision avoidance and the prediction of uncontrolled reentry events. Currently, the user requirements are being consolidated, defining the different services and the related accuracy and timeliness of the derived products. In this consolidation process, parameters like the lower diameter limit above which catalogue coverage is to be achieved, the degree of population coverage in various orbital regions, and the accuracy of the orbit data maintained in the catalogue are important design drivers for the selection of the number and location of the sensors and the definition of the required sensor performance. Further, the required minimum time for the detection of a manoeuvre, a newly launched object, or a fragmentation event significantly determines the required surveillance performance. In the requirement consolidation process, the performance to be specified has to be based on a careful analysis which takes into account the accuracy constraints of the services to be provided, technical feasibility, complexity, and costs. User requirements can thus not be defined without understanding the consequences they would pose on the system design. This paper will outline the design definition process for the surveillance and tracking segment of the European space situational awareness system. The paper will focus on low-Earth orbits (LEO). It will present the core user requirements and the definition of the derived services. The desired performance parameters will be explained together with their rationale and justification. This will be followed by an identification of the resulting major design drivers. The influence of these drivers on the system design will be analysed, including limiting object diameter, population coverage, orbit maintenance accuracy, and the minimum time to detect events like manoeuvres or breakups. The underlying simulation and verification concept will be explained. Finally, a first compilation of performance parameters for the surveillance and tracking segment will be presented and discussed.
Slater, Helen; Briggs, Andrew; Stinson, Jennifer; Campbell, Jared M
2017-08-01
The objective of this review is to systematically identify, review and synthesize relevant qualitative research on end user and implementer experiences of mobile health (mHealth) technologies developed for noncommunicable chronic disease management in young adults. "End users" are defined as young people aged 15-24 years, and "implementers" are defined as health service providers, clinicians, policy makers and administrators.The two key questions we wish to systematically explore from identified relevant qualitative studies or studies with qualitative components are.
User Modeling in Adaptive Hypermedia Educational Systems
ERIC Educational Resources Information Center
Martins, Antonio Constantino; Faria, Luiz; Vaz de Carvalho, Carlos; Carrapatoso, Eurico
2008-01-01
This document is a survey of the research area of User Modeling (UM) for the specific field of Adaptive Learning. The aims of this document are: to define what a User Model is; to present existing and well-known User Models; to analyze the existing standards related to UM; to compare existing systems. In the scientific area of User Modeling…
Interaction Junk: User Interaction-Based Evaluation of Visual Analytic Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander; North, Chris
2012-10-14
With the growing need for visualization to aid users in understanding large, complex datasets, the ability for users to interact with and explore these datasets is critical. As visual analytic systems have advanced to leverage powerful computational models and data analytics capabilities, the modes by which users engage and interact with the information remain limited. Often, users are taxed with directly manipulating parameters of these models through traditional GUIs (e.g., using sliders to directly manipulate the value of a parameter). However, the purpose of user interaction in visual analytic systems is to enable visual data exploration, where users can focus on their task, as opposed to the tool or system. As a result, users can engage freely in data exploration and decision-making, for the purpose of gaining insight. In this position paper, we discuss how evaluating visual analytic systems can be approached through user interaction analysis, where the goal is to minimize the cognitive translation between the visual metaphor and the mode of interaction (i.e., reducing the "interaction junk"). We motivate this concept through a discussion of traditional GUIs used in visual analytics for direct manipulation of model parameters, and the importance of designing interactions that support visual data exploration.
Novel 3D Approach to Flare Modeling via Interactive IDL Widget Tools
NASA Astrophysics Data System (ADS)
Nita, G. M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A.; Kontar, E. P.
2011-12-01
Currently available, and soon-to-be available, sophisticated 3D models of particle acceleration and transport in solar flares require a new level of user-friendly visualization and analysis tools allowing quick and easy adjustment of the model parameters and computation of realistic radiation patterns (images, spectra, polarization, etc.). We report the current state of these tools, which are under development and have already proved highly efficient for direct flare modeling. We present an interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and X-ray spectra. The object-based architecture of this application provides full interaction with imported 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. Various tools allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating at user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/nonthermal particle distribution models. By default, the application integrates IDL-callable DLL and shared libraries containing fast GS emission codes developed in Fortran and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows interchanging these default libraries with any user-defined IDL or external callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. To illustrate the tool's capacity and generality, we present a step-by-step real-time computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data obtained by the NORH and RHESSI instruments. We discuss further anticipated developments of the tools needed to accommodate temporal evolution of the magnetic field structure and/or the fast electron population implied by electron acceleration and transport. This work was supported in part by NSF grants AGS-0961867, AST-0908344, and NASA grants NNX10AF27G and NNX11AB49G to New Jersey Institute of Technology, by a UK STFC rolling grant, STFC/PPARC Advanced Fellowship, and the Leverhulme Trust, UK. Financial support by the European Commission through the SOLAIRE and HESPE Networks is gratefully acknowledged.
Accidental Discovery of Information on the User-Defined Social Web: A Mixed-Method Study
ERIC Educational Resources Information Center
Lu, Chi-Jung
2012-01-01
Frequently interacting with other people or working in an information-rich environment can foster the "accidental discovery of information" (ADI) (Erdelez, 2000; McCay-Peet & Toms, 2010). With the increasing adoption of social web technologies, online user-participation communities and user-generated content have provided users the…
ERIC Educational Resources Information Center
Read, Aaron
2013-01-01
The rise of stakeholder centered software development has led to organizations engaging users early in the development process to help define system requirements. To facilitate user involvement in the requirements elicitation process, companies can use Group Support Systems (GSS) to conduct requirements elicitation workshops. The effectiveness of…
2007-03-01
[Abstract garbled in extraction; recoverable figure captions: Figure 2. User-defined stressor interface. Figure 3. Stressor levels in IMPRINT. Figure 4. Accuracy stressor definition.]
Cross-Layer Service Discovery Mechanism for OLSRv2 Mobile Ad Hoc Networks.
Vara, M Isabel; Campo, Celeste
2015-07-20
Service discovery plays an important role in mobile ad hoc networks (MANETs). The lack of central infrastructure, limited resources, and high mobility make service discovery a challenging issue for this kind of network. This article proposes a new service discovery mechanism for discovering and advertising services integrated into the Optimized Link State Routing Protocol Version 2 (OLSRv2). In previous studies, we demonstrated the validity of a similar service discovery mechanism integrated into the previous version of OLSR (OLSRv1). In order to advertise services, we have added a new type-length-value structure (TLV) to the OLSRv2 protocol, called service discovery message (SDM), according to the Generalized MANET Packet/Message Format defined in Request For Comments (RFC) 5444. Each node in the ad hoc network only advertises its own services. The advertisement frequency is a user-configurable parameter, so that it can be modified depending on the user requirements. Each node maintains two service tables, one to store information about its own services and another one to store information about the services it discovers in the network. We present simulation results that compare our service discovery integrated into OLSRv2 with the one defined for OLSRv1 and with the integration of service discovery into the Ad hoc On-demand Distance Vector (AODV) protocol, in terms of service discovery ratio, service latency, and network overhead.
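Schematically, an SDM carries just enough information for other nodes to learn and age out a service; the sketch below uses invented field names, since RFC 5444 defines the actual type-length-value encoding used inside OLSRv2 messages.

    from dataclasses import dataclass

    @dataclass
    class ServiceDiscoveryMessage:
        """Illustrative SDM payload; field names are hypothetical."""
        originator: str    # address of the node advertising the service
        service_id: int    # identifier of the advertised service
        description: str   # description of the service offered
        validity_sec: int  # how long the advertisement remains valid

    # Each node keeps two tables: its own services and those it learns.
    local_services: list = []
    discovered_services: dict = {}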
Lazinski, David W; Camilli, Andrew
2013-01-01
The amplification of DNA fragments cloned between user-defined 5' and 3' end sequences is a prerequisite step in many current applications, including massively parallel sequencing (MPS). Here we describe an improved method, called homopolymer tail-mediated ligation PCR (HTML-PCR), that requires very little starting template and minimal hands-on effort, is cost-effective, and is suited for use in high-throughput and robotic methodologies. HTML-PCR starts with the addition of homopolymer tails of controlled lengths to the 3' termini of a double-stranded genomic template. The homopolymer tails enable the annealing-assisted ligation of a hybrid oligonucleotide to the template's recessed 5' ends. The hybrid oligonucleotide has a user-defined sequence at its 5' end. This primer, together with a second primer composed of a longer region complementary to the homopolymer tail fused to a second 5' user-defined sequence, is used in a PCR reaction to generate the final product. The user-defined sequences can be varied to enable compatibility with a wide variety of downstream applications. We demonstrate our new method by constructing MPS libraries starting from nanogram and sub-nanogram quantities of Vibrio cholerae and Streptococcus pneumoniae genomic DNA.
AXAF user interfaces for heterogeneous analysis environments
NASA Technical Reports Server (NTRS)
Mandel, Eric; Roll, John; Ackerman, Mark S.
1992-01-01
The AXAF Science Center (ASC) will develop software to support all facets of data center activities and user research for the AXAF X-ray Observatory, scheduled for launch in 1999. The goal is to provide astronomers with the ability to utilize heterogeneous data analysis packages, that is, to allow astronomers to pick the best packages for doing their scientific analysis. For example, ASC software will be based on IRAF, but non-IRAF programs will be incorporated into the data system where appropriate. Additionally, it is desired to allow AXAF users to mix ASC software with their own local software. The need to support heterogeneous analysis environments is not special to the AXAF project, and therefore finding mechanisms for coordinating heterogeneous programs is an important problem for astronomical software today. The approach to solving this problem has been to develop two interfaces that allow the scientific user to run heterogeneous programs together. The first is an IRAF-compatible parameter interface that provides non-IRAF programs with IRAF's parameter handling capabilities. Included in the interface is an application programming interface to manipulate parameters from within programs, and also a set of host programs to manipulate parameters at the command line or from within scripts. The parameter interface has been implemented to support parameter storage formats other than IRAF parameter files, allowing one, for example, to access parameters that are stored in data bases. An X Windows graphical user interface called 'agcl' has been developed, layered on top of the IRAF-compatible parameter interface, that provides a standard graphical mechanism for interacting with IRAF and non-IRAF programs. Users can edit parameters and run programs for both non-IRAF programs and IRAF tasks. The agcl interface allows one to communicate with any command line environment in a transparent manner and without any changes to the original environment. For example, the authors routinely layer the GUI on top of IRAF, ksh, SMongo, and IDL. The agcl, based on the facilities of a system called Answer Garden, also has sophisticated support for examining documentation and help files, asking questions of experts, and developing a knowledge base of frequently required information. Thus, the GUI becomes a total environment for running programs, accessing information, examining documents, and finding human assistance. Because the agcl can communicate with any command-line environment, most projects can make use of it easily. New applications are continually being found for these interfaces. It is the authors' intention to evolve the GUI and its underlying parameter interface in response to these needs - from users as well as developers - throughout the astronomy community. This presentation describes the capabilities and technology of the above user interface mechanisms and tools. It also discusses the design philosophies guiding the work, as well as hopes for the future.
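In outline, a program written against such a parameter interface reads and writes named parameters through an API rather than parsing parameter files itself; the sketch below is a toy stand-in with invented names, not the ASC interface.

    class ParameterSet:
        """Toy stand-in for an IRAF-compatible parameter interface.

        A real implementation would persist parameters to IRAF parameter
        files or another backend such as a database; this one is in-memory.
        """
        def __init__(self, task):
            self.task = task
            self._params = {}

        def get(self, name, default=None):
            return self._params.get(name, default)

        def set(self, name, value):
            self._params[name] = value

    pars = ParameterSet("imstat")       # hypothetical task name
    pars.set("image", "obs001.fits")    # hypothetical parameter
    print(pars.get("image"))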
Generation of Requirements for Simulant Measurements. Revised, May 30, 2010
NASA Technical Reports Server (NTRS)
Rickman, Doug; Edmunson, Jennifer
2010-01-01
This document provides a formal, logical explanation of the parameters selected for the Figure of Merit algorithm used to evaluate lunar regolith simulants. The objectives, requirements, assumptions, and analysis behind the parameters are provided. From NASA's objectives for lunar simulants, a requirement is derived to verify and validate simulant performance versus lunar regolith. This requirement leads to a specification that comparative measurements be taken the same way on the regolith and the simulant. In turn, this leads to a set of nine criteria with which to evaluate comparative measurements. Many of the potential measurements of interest are not defensible under these criteria; for example, many geotechnical properties of interest were not explicitly measured during Apollo and can only be measured in situ on the Moon. A 2005 workshop identified 32 properties of major interest to users (Sibille, Carpenter, Schlagheck, and French, 2006). Virtually all of these properties are tightly constrained, though not predictable, if just four parameters are controlled. Three of them, composition, size, and shape, are recognized as being definable at the particle level. The fourth, density, is a bulk property. In recent work a fifth parameter has been identified, which will need to be added to future releases of the Figure of Merit: spectroscopy.
Utility of coupling nonlinear optimization methods with numerical modeling software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, M.J.
1996-08-05
Results of using GLO (Global Local Optimizer), a general-purpose nonlinear optimization software package for investigating multi-parameter problems in science and engineering, are discussed. The package consists of the modular optimization control system (GLO), a graphical user interface (GLO-GUI), a pre-processor (GLO-PUT), a post-processor (GLO-GET), and the nonlinear optimization software modules GLOBAL and LOCAL. GLO is designed for controlling, and coupling easily to, any scientific software application. GLO runs the optimization module and the scientific software application in an iterative loop. At each iteration, the optimization module defines new values for the set of parameters being optimized. GLO-PUT inserts the new parameter values into the input file of the scientific application. GLO runs the application with the new parameter values. GLO-GET determines the value of the objective function by extracting the results of the analysis and comparing them to the desired result. GLO continues to run the scientific application over and over until it finds the "best" set of parameters by minimizing (or maximizing) the objective function. An example problem showing the optimization of a material model (Taylor cylinder impact test) is presented.
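The iterative loop described above, in which the optimizer proposes parameters, the pre-processor writes them into the application's input file, and the post-processor extracts and scores the result, can be written generically; all names in this sketch are hypothetical.

    def glo_style_loop(propose, write_input, run_app, read_objective,
                       n_iter=100):
        """Generic optimize-around-a-black-box loop in the style of GLO.

        propose(best):       returns a new parameter set to try
        write_input(params): inserts parameters into the input file (GLO-PUT)
        run_app():           runs the scientific application
        read_objective():    extracts and scores the result (GLO-GET)
        """
        best_params, best_obj = None, float("inf")
        for _ in range(n_iter):
            params = propose(best_params)  # propose must handle None at start
            write_input(params)
            run_app()
            obj = read_objective()
            if obj < best_obj:  # minimizing the objective function
                best_params, best_obj = params, obj
        return best_params, best_obj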
The GIS weasel - An interface for the development of spatial information in modeling
Viger, R.J.; Markstrom, S.M.; Leavesley, G.H.; ,
2005-01-01
The GIS Weasel is a map- and Graphical User Interface (GUI)-driven tool developed as an aid to modelers in the delineation and characterization of geographic features and their parameterization for use in distributed or lumped-parameter physical process models. The interface does not require user expertise in geographic information systems (GIS); the user does, however, need to know how the model will use the output from the GIS Weasel. The GIS Weasel uses Workstation ArcInfo and its Grid extension, and will run on any platform that Workstation ArcInfo runs on (i.e., numerous flavors of Unix and Microsoft Windows). The GIS Weasel requires an input ArcInfo grid giving some topographic description of the Area of Interest (AOI). This is normally a digital elevation model, but it can be the surface of a ground-water table or any other data from which flow direction can be resolved. The user may define the AOI as a custom drainage area based on an interactively specified watershed outlet point, or use a previously created map. The user is then able to use any combination of the GIS Weasel's tool set to create one or more maps depicting different kinds of geographic features. Once the spatial feature maps have been prepared, the GIS Weasel's many parameterization routines can be used to create descriptions of each element in each of the user's created maps. Over 200 parameterization routines currently exist, generating information about shape, area, and topological association with other features of the same or different maps, as well as many types of information based on ancillary data layers such as soil and vegetation properties. These tools easily integrate other similarly formatted data sets.
Sequential addition of short DNA oligos in DNA-polymerase-based synthesis reactions
Gardner, Shea N [San Leandro, CA; Mariella, Jr., Raymond P.; Christian, Allen T [Tracy, CA; Young, Jennifer A [Berkeley, CA; Clague, David S [Livermore, CA
2011-01-18
A method of fabricating a DNA molecule of user-defined sequence. The method comprises the steps of preselecting a multiplicity of DNA sequence segments that will comprise the DNA molecule of user-defined sequence, separating the DNA sequence segments temporally, and combining the multiplicity of DNA sequence segments with at least one polymerase enzyme wherein the multiplicity of DNA sequence segments join to produce the DNA molecule of user-defined sequence. Sequence segments may be of length n, where n is an even or odd integer. In one embodiment the length of desired hybridizing overlap is specified by the user and the sequences and the protocol for combining them are guided by computational (bioinformatics) predictions. In one embodiment sequence segments are combined from multiple reading frames to span the same region of a sequence, so that multiple desired hybridizations may occur with different overlap lengths. In one embodiment starting sequence fragments are of different lengths, n, n+1, n+2, etc.
AOIPS data base management systems support for GARP data sets
NASA Technical Reports Server (NTRS)
Gary, J. P.
1977-01-01
A data base management system developed to provide flexible access to data sets produced by GARP during its data systems tests is identified. The content and coverage of the data base are defined, and a computer-aided, interactive information storage and retrieval system, implemented to facilitate access to user-specified data subsets, is described. The computer programs developed to provide this capability were implemented on the highly interactive, minicomputer-based AOIPS and are referred to as the data retrieval system (DRS). Implemented as a user-interactive but menu-guided system, the DRS permits users to inventory the data tape library and create duplicate or subset data sets based on a user-selected window defined by time and latitude/longitude boundaries. The DRS permits users to select, display, or produce formatted hard copy of individual data items contained within the data records.
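The windowed subsetting the DRS performs is, at bottom, a filter over data records; a minimal sketch (record fields assumed) is:

    def in_window(rec, t0, t1, lat_min, lat_max, lon_min, lon_max):
        """True if a record falls inside the user-selected window.

        rec is assumed to carry time, lat, and lon attributes.
        """
        return (t0 <= rec.time <= t1
                and lat_min <= rec.lat <= lat_max
                and lon_min <= rec.lon <= lon_max)

    def subset(records, **window):
        """Create a subset data set from a time/lat/lon window."""
        return [r for r in records if in_window(r, **window)]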
Charliecloud: Unprivileged containers for user-defined software stacks in HPC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Priedhorsky, Reid; Randles, Timothy C.
Supercomputing centers are seeing increasing demand for user-defined software stacks (UDSS), instead of or in addition to the stack provided by the center. These UDSS support user needs such as complex dependencies or build requirements, externally required configurations, portability, and consistency. The challenge for centers is to provide these services in a usable manner while minimizing the risks: security, support burden, missing functionality, and performance. We present Charliecloud, which uses the Linux user and mount namespaces to run industry-standard Docker containers with no privileged operations or daemons on center resources. Our simple approach avoids most security risks while maintaining access to the performance and functionality already on offer, doing so in less than 500 lines of code. Charliecloud promises to bring an industry-standard UDSS user workflow to existing, minimally altered HPC resources.
Modelling of Local Necking and Fracture in Aluminium Alloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Achani, D.; Eriksson, M.; Hopperstad, O. S.
2007-05-17
Non-linear finite element simulations are extensively used in forming and crashworthiness studies of automotive components and structures in which fracture needs to be controlled. For thin-walled ductile materials, the fracture-related phenomena that must be properly represented are thinning instability, ductile fracture, and through-thickness shear instability. Proper representation of the fracture process relies on the accuracy of the constitutive and fracture models and their parameters, which need to be calibrated through well-defined experiments. The present study focuses on local necking and fracture, which is of high industrial importance, and uses a phenomenological criterion for modelling fracture in aluminium alloys. As an accurate description of plastic anisotropy is important, advanced phenomenological constitutive equations based on the yield criterion YLD2000/YLD2003 are used. Uniaxial tensile tests and disc compression tests are performed for identification of the constitutive model parameters. Ductile fracture is described by the Cockcroft-Latham fracture criterion, and an in-plane shear test is performed to identify the fracture parameter. The reason is that in a well-designed in-plane shear test no thinning instability should occur, so the test gives more direct information about the phenomenon of ductile fracture. Numerical simulations have been performed using a user-defined material model implemented in the general-purpose non-linear FE code LS-DYNA. The applicability of the model is demonstrated by correlating the predicted and experimental response in the in-plane shear tests and additional plane strain tension tests.
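The Cockcroft-Latham criterion accumulates the largest principal stress over the plastic strain path and predicts fracture when W = integral of max(sigma_1, 0) d(eps_p) reaches a critical value Wc, the parameter identified from the in-plane shear test. A minimal per-integration-point update (a sketch, not the authors' LS-DYNA user subroutine) is:

    def cockcroft_latham_update(W, sigma1, d_eps_p, Wc):
        """Advance the Cockcroft-Latham damage integral by one increment.

        W:        accumulated damage integral so far
        sigma1:   current major principal stress
        d_eps_p:  increment of equivalent plastic strain
        Wc:       critical value calibrated from the shear test
        Returns the updated integral and a fracture flag.
        """
        W += max(sigma1, 0.0) * d_eps_p  # only tensile sigma1 contributes
        return W, W >= Wc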
GEMPAK5 user's guide, version 5.0
NASA Technical Reports Server (NTRS)
Desjardins, Mary L.; Brill, Keith F.; Schotz, Steven S.
1991-01-01
GEMPAK is a general meteorological software package used to analyze and display conventional meteorological data as well as satellite derived parameters. The User's Guide describes the GEMPAK5 programs and input parameters and details the algorithms used for the meteorological computations.
Semantic solutions to Heliophysics data access
NASA Astrophysics Data System (ADS)
Narock, T. W.; Vandegriff, J. D.; Weigel, R. S.
2011-12-01
Within the domain of Heliophysics, data discovery is being actively addressed. However, data diversity in the returned results has proven to be a significant barrier to integrated multi-mission analysis. Software is being actively developed (e.g., Vandegriff and Brown, 2010) that is data-format and measurement-type agnostic. However, such approaches rely on an a priori definition of common baseline parameters, units, and coordinate systems onto which all data will be mapped. In this work, we describe our efforts at utilizing a task ontology (Guarino, 1998) to model the steps involved in data transformation within Heliophysics. Thus, given Heliophysics logic and heterogeneous input data, we can develop software that infers the set of steps required to compute user-specified parameters. Such a framework offers flexibility by allowing users to define their own preferred sets of parameters, units, and coordinate systems for their analysis. In addition, storing this information as ontology instances means it is external to source code and is easily shareable and extensible. The additional inclusion of a provenance ontology allows us to capture the historical record of each data analysis session for future review. We describe our use of existing task and provenance ontologies and provide example use cases as well as potential future applications. References: J. Vandegriff and L. Brown (2010), A framework for reading and unifying heliophysics time series data, Earth Science Informatics, Volume 3, Numbers 1-2, Pages 75-86. N. Guarino (1998), Formal Ontology in Information Systems, Proceedings of FOIS'98, Trento, Italy, 6-8 June 1998. Amsterdam, IOS Press, pp. 3-15.
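Operationally, inferring the steps needed to reach a user-requested parameter amounts to a search over known transformations; the toy sketch below uses an invented transformation table, standing in for what the task ontology would supply.

    from collections import deque

    # Hypothetical directed transformations between (quantity, unit) states.
    TRANSFORMS = {
        ("B_gse", "nT"): [("B_gsm", "nT")],     # coordinate rotation
        ("B_gsm", "nT"): [("B_gsm", "gauss")],  # unit conversion
    }

    def plan(start, goal):
        """Breadth-first search for a chain of transformation steps."""
        queue, seen = deque([(start, [])]), {start}
        while queue:
            state, steps = queue.popleft()
            if state == goal:
                return steps
            for nxt in TRANSFORMS.get(state, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, steps + [(state, nxt)]))
        return None  # no known chain of steps

    print(plan(("B_gse", "nT"), ("B_gsm", "gauss")))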
IVS Working Group 2 for Product Specification and Observing Programs
NASA Astrophysics Data System (ADS)
Schuh, H.; Charlot, P.; Hase, H.; Himwich, E.; Kingham, K.; Klatt, C.; Ma, C.; Malkin, Z.; Niell, A.; Nothnagel, A.; Schluter, W.; Takashima, K.; Vandenberg, N.
2002-03-01
After the scientific rationale is given in the introduction, the Terms of Reference and the proceedings of IVS Working Group 2 are presented. Then the present status and future goals of all international activities within IVS are described. In particular, the current products of IVS are described in terms of accuracy, reliability, frequency of observing sessions, temporal resolution of the parameters estimated by VLBI data analysis, time delay from observing to product (i.e., the time between the end of the last session included in the VLBI solution and the availability of the final products), and frequency of solution (in the case of "global solutions", when all existing sessions, or a large number of VLBI sessions, are used to determine so-called global parameters). All IVS products and their potential users are covered in the report. This includes the Earth orientation parameters (EOP), the reference frames (TRF and CRF), geodynamical and geophysical parameters, and physical parameters. Measures which should be taken within IVS to meet the goals defined in the first steps are presented. As most of these measures are related to the observing programs, the observing programs are the main focus for improving the current status of IVS products. The report shows that, due to the various requirements of the different users of IVS products, the following must be accomplished: significant improvement of the accuracy of VLBI products; shorter time delay from observation to availability of results; almost continuous temporal coverage by VLBI sessions. A first scenario of the IVS observing program for 2002 and 2003 considers an increase of observing time by about 30%-40% and includes sessions carried out with S2 and K4 technology. The midterm observing program for the next 4-5 years seems rather ambitious. However, it appears feasible if all efforts are concentrated and the necessary resources are made available.
NASA Technical Reports Server (NTRS)
Franks, Shannon; Masek, Jeffrey G.; Headley, Rachel M.; Gasch, John; Arvidson, Terry
2009-01-01
The Global Land Survey (GLS) 2005 is a cloud-free, orthorectified collection of Landsat imagery acquired during the 2004-2007 epoch intended to support global land-cover and ecological monitoring. Due to the numerous complexities in selecting imagery for the GLS2005, NASA and the U.S. Geological Survey (USGS) sponsored the development of an automated scene selection tool, the Large Area Scene Selection Interface (LASSI), to aid in the selection of imagery for this data set. This innovative approach to scene selection applied a user-defined weighting system to various scene parameters: image cloud cover, image vegetation greenness, choice of sensor, and the ability of the Landsat 7 Scan Line Corrector (SLC)-off pair to completely fill image gaps, among others. The parameters considered in scene selection were weighted according to their relative importance to the data set, along with the algorithm's sensitivity to that weight. This paper describes the methodology and analysis that established the parameter weighting strategy, as well as the post-screening processes used in selecting the optimal data set for GLS2005.
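A user-defined weighting of scene parameters of the kind described reduces to a weighted sum of normalized metrics; the metric names and weights below are hypothetical.

    def scene_score(metrics, weights):
        """Combine normalized scene metrics (0 = worst, 1 = best)."""
        return sum(weights[k] * metrics[k] for k in weights)

    weights = {"cloud_cover": 0.4, "greenness": 0.3,
               "sensor_preference": 0.1, "gap_fill": 0.2}
    candidates = [
        {"cloud_cover": 0.9, "greenness": 0.7,
         "sensor_preference": 1.0, "gap_fill": 0.8},
        {"cloud_cover": 0.6, "greenness": 0.9,
         "sensor_preference": 0.5, "gap_fill": 1.0},
    ]
    best = max(candidates, key=lambda m: scene_score(m, weights))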
Identifying Bearing Rotodynamic Coefficients Using an Extended Kalman Filter
NASA Technical Reports Server (NTRS)
Miller, Brad A.; Howard, Samuel A.
2008-01-01
An Extended Kalman Filter is developed to estimate the linearized direct and indirect stiffness and damping force coefficients for bearings in rotor dynamic applications from noisy measurements of the shaft displacement in response to imbalance and impact excitation. The bearing properties are modeled as stochastic random variables using a Gauss-Markov model. Noise terms are introduced into the system model to account for all of the estimation error, including modeling errors and uncertainties and the propagation of measurement errors into the parameter estimates. The system model contains two user-defined parameters that can be tuned to improve the filter's performance; these parameters correspond to the covariance of the system and measurement noise variables. The filter is also strongly influenced by the initial values of the states and the error covariance matrix. The filter is demonstrated using numerically simulated data for a rotor bearing system with two identical bearings, which reduces the number of unknown linear dynamic coefficients to eight. The filter estimates for the direct damping coefficients and all four stiffness coefficients correlated well with actual values, whereas the estimates for the cross-coupled damping coefficients were the least accurate.
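In outline, the filter augments the state vector with the unknown coefficients and applies the standard predict/update recursion; in the generic NumPy sketch below, the model functions and Jacobians would come from the rotor-bearing equations, and Q and R are the two user-tuned covariances mentioned above.

    import numpy as np

    def ekf_step(x, P, z, f, F, h, H, Q, R):
        """One EKF step for joint state/parameter estimation.

        x, P : state estimate (including bearing coefficients), covariance
        z    : noisy shaft-displacement measurement
        f, F : process model and its Jacobian (Gauss-Markov parameters)
        h, H : measurement model and its Jacobian
        Q, R : process/measurement noise covariances (tuning parameters)
        """
        # Predict
        x_pred = f(x)
        F_k = F(x)
        P_pred = F_k @ P @ F_k.T + Q
        # Update
        H_k = H(x_pred)
        S = H_k @ P_pred @ H_k.T + R
        K = P_pred @ H_k.T @ np.linalg.inv(S)
        x_new = x_pred + K @ (z - h(x_pred))
        P_new = (np.eye(len(x)) - K @ H_k) @ P_pred
        return x_new, P_new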
FAST: Fitting and Assessment of Synthetic Templates
NASA Astrophysics Data System (ADS)
Kriek, Mariska; van Dokkum, Pieter G.; Labbé, Ivo; Franx, Marijn; Illingworth, Garth D.; Marchesini, Danilo; Quadri, Ryan F.; Aird, James; Coil, Alison L.; Georgakakis, Antonis
2018-03-01
FAST (Fitting and Assessment of Synthetic Templates) fits stellar population synthesis templates to broadband photometry and/or spectra. FAST is compatible with the photometric redshift code EAzY (ascl:1010.052) when fitting broadband photometry; it uses the photometric redshifts derived by EAzY, and the input files (for example, the photometric catalog and master filter file) are the same. FAST fits spectra in combination with broadband photometric data points, or simultaneously fits two components, allowing for an AGN contribution in addition to the host galaxy light. Depending on the input parameters, FAST outputs the best-fit redshift, age, dust content, star formation timescale, metallicity, stellar mass, star formation rate (SFR), and their confidence intervals. Though some of FAST's functions overlap with those of HYPERZ (ascl:1108.010), it differs by fitting fluxes instead of magnitudes, by allowing the user to completely define the grid of input stellar population parameters and to easily input photometric redshifts and their confidence intervals, and by calculating calibrated confidence intervals for all parameters. Note that FAST is not a photometric redshift code, though it can be used as one.
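Grid-based template fitting of this kind reduces to minimizing chi-square over a user-defined parameter grid in flux space; the stripped-down sketch below is illustrative, not the FAST code itself.

    import numpy as np

    def fit_grid(flux, err, templates):
        """Pick the best-fit template by chi-square in flux space.

        flux, err : observed broadband fluxes and uncertainties
        templates : dict mapping a parameter tuple (age, dust, tau, ...)
                    to a model flux array on the same bands
        """
        best, best_chi2 = None, np.inf
        for params, model in templates.items():
            # The optimal normalization (mass scaling) has a closed form.
            scale = np.sum(flux * model / err**2) / np.sum(model**2 / err**2)
            chi2 = np.sum(((flux - scale * model) / err) ** 2)
            if chi2 < best_chi2:
                best, best_chi2 = (params, scale), chi2
        return best, best_chi2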
NASA Astrophysics Data System (ADS)
Manipon, G. J. M.; Hua, H.; Owen, S. E.; Sacco, G. F.; Agram, P. S.; Moore, A. W.; Yun, S. H.; Fielding, E. J.; Lundgren, P.; Rosen, P. A.; Webb, F.; Liu, Z.; Smith, A. T.; Wilson, B. D.; Simons, M.; Poland, M. P.; Cervelli, P. F.
2014-12-01
The Hybrid Science Data System (HySDS) scalably powers the ingestion, metadata extraction, cataloging, high-volume data processing, and publication of the geodetic data products for the Advanced Rapid Imaging & Analysis for Monitoring Hazard (ARIA-MH) project at JPL. HySDS uses a heterogeneous set of worker nodes from private and public clouds as well as virtual and bare-metal machines to perform every aspect of the traditional science data system. For our science data users, the forefront of HySDS is the facet search interface, FacetView, which allows them to browse, filter, and access the published products. Users are able to explore the collection of product metadata and apply multiple filters to constrain the result set down to their particular interests, and to download the faceted products for further analysis and generation of derived products. We have also employed a novel approach in which faceting is used to apply constraints for custom monitoring of products and system resources and to define triggers for automated data processing. The power of the facet search interface is well documented across various domains, and its usefulness is rooted in the metadata that currently exists. However, user needs usually extend beyond what is present in the data system. A user interested in synthetic aperture radar (SAR) data over Kilauea will download them from FacetView but may also want email notification of future incoming scenes. The user may even want that data pushed to a remote workstation for automated processing. Better still, these future products could trigger HySDS to run the user's analysis on its array of worker nodes, on behalf of the user, and ingest the resulting derived products. We will present our findings on integrating an ancillary, user-defined, system-driven processing capability for HySDS that allows users to define faceted rules based on facet constraints and trigger actions when new SAR data products arrive that match the constraints. We will discuss use cases where users have defined rules for the automated generation of InSAR-derived products: interferograms for California and Kilauea, time-series analyses, and damage proxy maps. These findings are relevant for science data system development of the proposed NASA-ISRO SAR mission.
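A faceted rule of the kind described pairs a set of facet constraints with an action to fire when a newly ingested product matches; the field and action names in this sketch are hypothetical.

    RULES = [
        {"constraints": {"platform": "SAR", "region": "Kilauea"},
         "action": "generate_interferogram"},  # hypothetical action name
    ]

    def dispatch(action, metadata):
        """Queue the action to a worker node (stubbed out here)."""
        print(f"queue {action} for product {metadata.get('id', '?')}")

    def on_ingest(product_metadata):
        """Trigger user-defined actions for each matching faceted rule."""
        for rule in RULES:
            if all(product_metadata.get(k) == v
                   for k, v in rule["constraints"].items()):
                dispatch(rule["action"], product_metadata)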
A Measurement Plane for Optical Networks to Manage Emergency Events
NASA Astrophysics Data System (ADS)
Tego, E.; Carciofi, C.; Grazioso, P.; Petrini, V.; Pompei, S.; Matera, F.; Attanasio, V.; Nastri, E.; Restuccia, E.
2017-11-01
In this work, we show a wide geographical area optical network test bed adopting the mPlane measurement plane to monitor its performance and to manage software-defined network approaches, with specific tests and procedures dedicated to responding to disaster events and supporting emergency networks. The test bed includes FTTX accesses and is currently implemented to support future 5G wireless services with slicing procedures based on Carrier Ethernet. The characteristics of this platform have been experimentally tested in the case of a disaster-induced link failure and traffic congestion, showing fast reaction to these events and allowing the user to restore the initial QoS parameters.
Xanthos – A Global Hydrologic Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Xinya; Vernon, Chris R.; Hejazi, Mohamad I.
Xanthos is an open-source hydrologic model, written in Python, designed to quantify and analyse global water availability. Xanthos simulates historical and future global water availability on a monthly time step at a spatial resolution of 0.5 geographic degrees. Xanthos was designed to be extensible and used by scientists that study global water supply and work with the Global Change Assessment Model (GCAM). Xanthos uses a user-defined configuration file to specify model inputs, outputs and parameters. Xanthos has been tested using actual global data sets and the model is able to provide historical observations and future estimates of renewable freshwater resources in the form of total runoff.
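Reading such a user-defined configuration is straightforward with Python's standard configparser; the section and key names below are invented for illustration and are not Xanthos's actual schema.

    import configparser

    config = configparser.ConfigParser()
    config.read("xanthos_run.ini")  # hypothetical file name

    input_dir = config.get("Project", "InputFolder")
    output_dir = config.get("Project", "OutputFolder")
    start_year = config.getint("Project", "StartYear")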
The evolution of phase holographic imaging from a research idea to publicly traded company
NASA Astrophysics Data System (ADS)
Egelberg, Peter
2018-02-01
Recognizing the value of and unmet need for label-free kinetic cell analysis, Phase Holographic Imaging defines its market segment as automated, easy-to-use, and affordable time-lapse cytometry. The process of developing new technology, meeting customer expectations, sources of corporate funding, and R&D adjustments prompted by field experience will be reviewed. Additionally, it is discussed how relevant biological information can be extracted from a sequence of quantitative phase images, with negligible user assistance and parameter tweaking, to simultaneously provide cell culture characteristics such as cell growth rate, viability, division rate, mitosis duration, phagocytosis rate, migration, motility, and cell-cell adherence without requiring any artificial cell manipulation.
A novel Bayesian change-point algorithm for genome-wide analysis of diverse ChIPseq data types.
Xing, Haipeng; Liao, Willey; Mo, Yifan; Zhang, Michael Q
2012-12-10
ChIPseq is a widely used technique for investigating protein-DNA interactions. Read density profiles are generated by next-generation sequencing of protein-bound DNA and aligning the short reads to a reference genome. Enriched regions are revealed as peaks, which often differ dramatically in shape, depending on the target protein(1). For example, transcription factors often bind in a site- and sequence-specific manner and tend to produce punctate peaks, while histone modifications are more pervasive and are characterized by broad, diffuse islands of enrichment(2). Reliably identifying these regions was the focus of our work. Algorithms for analyzing ChIPseq data have employed various methodologies, from heuristics(3-5) to more rigorous statistical models, e.g., Hidden Markov Models (HMMs)(6-8). We sought a solution that minimized the necessity for difficult-to-define, ad hoc parameters that often compromise resolution and lessen the intuitive usability of the tool. With respect to HMM-based methods, we aimed to curtail the parameter estimation procedures and simple, finite-state classifications that are often utilized. Additionally, conventional ChIPseq data analysis involves categorization of the expected read density profiles as either punctate or diffuse, followed by application of the appropriate tool. We further aimed to replace the need for these two distinct models with a single, more versatile model which can capably address the entire spectrum of data types. To meet these objectives, we first constructed a statistical framework that naturally modeled ChIPseq data structures using a cutting-edge advance in HMMs(9), one that utilizes only explicit formulas, an innovation crucial to its performance advantages. More sophisticated than heuristic models, our HMM accommodates infinite hidden states through a Bayesian model. We applied it to identifying reasonable change points in read density, which further define segments of enrichment. Our analysis revealed that our Bayesian Change Point (BCP) algorithm has reduced computational complexity, evidenced by an abridged run time and memory footprint. The BCP algorithm was successfully applied to both punctate peak and diffuse island identification with robust accuracy and limited user-defined parameters, illustrating both its versatility and ease of use. Consequently, we believe it can be implemented readily across broad ranges of data types and end users in a manner that is easily compared and contrasted, making it a great tool for ChIPseq data analysis that can aid in collaboration and corroboration between research groups. Here, we demonstrate the application of BCP to existing transcription factor(10,11) and epigenetic data(12) to illustrate its usefulness.
User's manual for the HYPGEN hyperbolic grid generator and the HGUI graphical user interface
NASA Technical Reports Server (NTRS)
Chan, William M.; Chiu, Ing-Tsau; Buning, Pieter G.
1993-01-01
The HYPGEN program is used to generate a 3-D volume grid over a user-supplied single-block surface grid. This is accomplished by solving the 3-D hyperbolic grid generation equations, consisting of two orthogonality relations and one cell volume constraint. In this user manual, the required input files and parameters and the output files are described. Guidelines on how to select the input parameters are given. Illustrated examples are provided showing a variety of topologies and geometries that can be treated. HYPGEN can be used in stand-alone mode as a batch program, or it can be called from within a graphical user interface, HGUI, that runs on Silicon Graphics workstations. This user manual provides a description of the menus, buttons, sliders, and type-in fields in HGUI that users employ to enter the parameters needed to run HYPGEN. Instructions are given on how to configure the interface to allow HYPGEN to run either locally or on a faster remote machine through the use of shell scripts on Unix operating systems. The volume grid generated is copied back to the local machine for visualization using a built-in hook to PLOT3D.
Maeda, Yukihide; Sugaya, Akiko; Nagayasu, Rie; Nakagawa, Atsuko; Nishizaki, Kazunori
2016-09-01
Audiological parameters alone do not determine the choice to use hearing aids (HA). Subjective hearing-related QoL is a major factor that determines whether or not an older person will continue to wear HA. This study aimed to identify which audiological parameters and quality-of-life (QoL) measures determine whether or not older persons will continue wearing HA. Charts of 157 patients aged ≥65 years who attended the HA service unit at the Otolaryngology Department were retrospectively reviewed. After HA fitting and a trial, the patients were divided into groups, depending upon whether or not they wanted to continue wearing the HA (users, 58.2%; non-users, 41.8%) and then audiological parameters were compared between them. At least 4 months after the HA fitting, the self-reported QoL questionnaire, Hearing Handicap Inventory for the Elderly (HHIE), was mailed to all 157 patients and HHIE scores were compared between HA users and non-users. Speech discrimination score and dynamic range did not significantly differ between HA users and non-users. A difference in the average hearing threshold was marginally significant. The response rate to the HHIE was 65.2%. Total HHIE and emotional scores were higher (more impaired) among HA users than non-users.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Childs, K.W.
1993-02-01
HEATING is a general-purpose conduction heat transfer program written in Fortran 77. HEATING can solve steady-state and/or transient heat conduction problems in one-, two-, or three-dimensional Cartesian, cylindrical, or spherical coordinates. A model may include multiple materials, and the thermal conductivity, density, and specific heat of each material may be both time- and temperature-dependent. The thermal conductivity may also be anisotropic. Materials may undergo change of phase. Thermal properties of materials may be input or may be extracted from a material properties library. Heat-generation rates may be dependent on time, temperature, and position, and boundary temperatures may be time- and position-dependent. The boundary conditions, which may be surface-to-environment or surface-to-surface, may be specified temperatures or any combination of prescribed heat flux, forced convection, natural convection, and radiation. The boundary condition parameters may be time- and/or temperature-dependent. General gray-body radiation problems may be modeled with user-defined factors for radiant exchange. The mesh spacing may be variable along each axis. HEATING uses a runtime memory allocation scheme to avoid having to recompile to match memory requirements for each specific problem. HEATING utilizes free-form input. Three steady-state solution techniques are available: point-successive-overrelaxation iterative method with extrapolation, direct-solution, and conjugate gradient. Transient problems may be solved using any one of several finite-difference schemes: Crank-Nicolson implicit, Classical Implicit Procedure (CIP), Classical Explicit Procedure (CEP), or Levy explicit method. The solution of the system of equations arising from the implicit techniques is accomplished by point-successive-overrelaxation iteration and includes procedures to estimate the optimum acceleration parameter.
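To give a flavor of the implicit schemes listed above, here is a minimal Crank-Nicolson sketch for one-dimensional transient conduction with fixed-temperature boundaries. It is a Python illustration, not HEATING itself; the diffusivity, grid, and boundary values are invented.

    # Crank-Nicolson finite differences for 1-D transient conduction (illustrative).
    import numpy as np

    nx, nt = 51, 200                    # nodes, time steps (assumed)
    alpha, dx, dt = 1e-4, 0.01, 0.5     # diffusivity, spacing, time step (assumed)
    r = alpha * dt / dx ** 2

    T = np.zeros(nx)
    T[0], T[-1] = 100.0, 0.0            # fixed boundary temperatures

    # Tridiagonal Crank-Nicolson system: A T^{n+1} = B T^n
    A = (np.diag((1 + r) * np.ones(nx)) + np.diag(-r / 2 * np.ones(nx - 1), 1)
         + np.diag(-r / 2 * np.ones(nx - 1), -1))
    B = (np.diag((1 - r) * np.ones(nx)) + np.diag(r / 2 * np.ones(nx - 1), 1)
         + np.diag(r / 2 * np.ones(nx - 1), -1))
    for M in (A, B):                    # pin the Dirichlet boundary rows
        M[0, :] = 0; M[-1, :] = 0
        M[0, 0] = M[-1, -1] = 1.0

    for _ in range(nt):
        T = np.linalg.solve(A, B @ T)
    print(T[::10])                      # temperature profile after nt steps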
Socially Relevant Knowledge Based Telemedicine
2010-10-01
potential to change behavior and/or attitude at different situations and different circumstances. Fogg mentions that there are many reasons that...finding appropriate way to persuade users to perform various activities. Fogg [8] defines persuasive technologies as "interactive computing systems...persuasive technology tools (defined by Fogg), which we are using in our system is explained below: Tunneling It is a process in which users are
Nanoscale Transport Optimization
2008-12-04
could be argued that the advantage of using ABAQUS for this modeling construct has more to do with its ability to impose a user-defined subroutine that...finite element analysis. This is accomplished by employing a user-defined subroutine for fluid properties at the interface within the finite element...package ABAQUS. Model Components: As noted above, the governing equation for the material system is given as ...
Updated Results for the Wake Vortex Inverse Model
NASA Technical Reports Server (NTRS)
Robins, Robert E.; Lai, David Y.; Delisi, Donald P.; Mellman, George R.
2008-01-01
NorthWest Research Associates (NWRA) has developed an Inverse Model for inverting aircraft wake vortex data. The objective of the inverse modeling is to obtain estimates of the vortex circulation decay and crosswind vertical profiles, using time history measurements of the lateral and vertical position of aircraft vortices. The Inverse Model performs iterative forward model runs using estimates of vortex parameters, vertical crosswind profiles, and vortex circulation as a function of wake age. Iterations are performed until a user-defined criterion is satisfied. Outputs from an Inverse Model run are the best estimates of the time history of the vortex circulation derived from the observed data, the vertical crosswind profile, and several vortex parameters. The forward model, named SHRAPA, used in this inverse modeling is a modified version of the Shear-APA model, and it is described in Section 2 of this document. Details of the Inverse Model are presented in Section 3. The Inverse Model was applied to lidar-observed vortex data at three airports: FAA acquired data from San Francisco International Airport (SFO) and Denver International Airport (DEN), and NASA acquired data from Memphis International Airport (MEM). The results are compared with observed data. This Inverse Model validation is documented in Section 4. A summary is given in Section 5. A user's guide for the inverse wake vortex model is presented in a separate NorthWest Research Associates technical report (Lai and Delisi, 2007a).
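The iterative structure described can be sketched compactly: run a forward model repeatedly, adjusting parameters until a user-defined convergence criterion is met. The forward model below is a stand-in exponential circulation-decay curve, not SHRAPA, and all numbers are invented.

    # Inverse-model loop: fit forward-model parameters to observed vortex data.
    import numpy as np
    from scipy.optimize import least_squares

    def forward_model(params, t):
        # Stand-in forward model: circulation gamma0 * exp(-t / tau).
        gamma0, tau = params
        return gamma0 * np.exp(-t / tau)

    t_obs = np.linspace(0, 60, 30)      # wake age (s), synthetic observations
    rng = np.random.default_rng(1)
    obs = forward_model([400.0, 45.0], t_obs) + rng.normal(0, 5, t_obs.size)

    fit = least_squares(lambda p: forward_model(p, t_obs) - obs,
                        x0=[300.0, 30.0],   # initial parameter estimates
                        xtol=1e-8)          # the "user-defined criterion"
    print(fit.x)                            # best-estimate circulation parameters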
On the Role of Imitation on Adolescence Methamphetamine Abuse Dynamics.
Mushanyu, J; Nyabadza, F; Muchatibaya, G; Stewart, A G R
2017-03-01
Adolescent methamphetamine use is an issue of considerable concern due to its correlation with later delinquency, divorce, unemployment and health problems. Understanding how adolescents initiate methamphetamine abuse is important in developing effective prevention programs. We formulate a mathematical model for the spread of methamphetamine abuse using nonlinear ordinary differential equations. It is assumed that susceptibles are recruited into methamphetamine use through imitation. An epidemic threshold value, [Formula: see text], termed the abuse reproduction number, is proposed and defined herein in the drug-using context. The model is shown to exhibit the phenomenon of backward bifurcation. This means that methamphetamine problems may persist in the population even if [Formula: see text] is less than one. Sensitivity analysis of [Formula: see text] was performed to determine the relative importance of different parameters in methamphetamine abuse initiation. The model is then fitted to data on methamphetamine users less than 20 years old reporting methamphetamine as their primary substance of abuse in the treatment centres of Cape Town, and parameter values that give the best fit are chosen. Results show that the proportion of methamphetamine users less than 20 years old reporting methamphetamine as their primary substance of abuse will continue to decrease in Cape Town, South Africa. The results suggest that intervention programs targeted at reducing adolescent methamphetamine abuse are positively impacting methamphetamine abuse.
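A minimal sketch of an imitation-driven initiation term follows; the functional forms and rates are hypothetical and far simpler than the paper's model, but they show how recruitment proportional to contact between susceptibles and users enters the ODE system.

    # Toy imitation-driven ODE model (hypothetical rates, not the paper's system).
    import numpy as np
    from scipy.integrate import solve_ivp

    beta, delta, mu = 0.4, 0.15, 0.02       # imitation, quit, turnover rates

    def rhs(t, y):
        S, U = y
        N = S + U
        new_users = beta * S * U / N        # initiation through imitation
        return [mu * N - new_users - mu * S + delta * U,
                new_users - (mu + delta) * U]

    sol = solve_ivp(rhs, (0, 100), [0.95, 0.05])
    print(sol.y[:, -1])                     # long-run susceptible/user proportions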
Web-based Toolkit for Dynamic Generation of Data Processors
NASA Astrophysics Data System (ADS)
Patel, J.; Dascalu, S.; Harris, F. C.; Benedict, K. K.; Gollberg, G.; Sheneman, L.
2011-12-01
All computation-intensive scientific research uses structured datasets, including hydrology and all other types of climate-related research. When it comes to testing their hypotheses, researchers might use the same dataset differently, and modify, transform, or convert it to meet their research needs. Currently, many researchers spend a good amount of time performing data processing and building tools to speed up this process. They might routinely repeat the same process activities for new research projects, spending precious time that otherwise could be dedicated to analyzing and interpreting the data. Numerous tools are available to run tests on prepared datasets and many of them work with datasets in different formats. However, there is still a significant need for applications that can comprehensively handle data transformation and conversion activities and help prepare the various processed datasets required by the researchers. We propose a web-based application (a software toolkit) that dynamically generates data processors capable of performing data conversions, transformations, and customizations based on user-defined mappings and selections. As a first step, the proposed solution allows the users to define various data structures and, in a subsequent step, to select various file formats and data conversions for their datasets of interest. In a simple scenario, the core of the proposed web-based toolkit allows the users to define direct mappings between input and output data structures. The toolkit will also support defining complex mappings involving the use of pre-defined sets of mathematical, statistical, date/time, and text manipulation functions. Furthermore, the users will be allowed to define logical cases for input data filtering and sampling. At the end of the process, the toolkit is designed to generate reusable source code and executable binary files for download and use by the scientists. The application is also designed to store all data structures and mappings defined by a user (an author), and allow the original author to modify them using standard authoring techniques. The users can change or define new mappings to create new data processors for download and use. In essence, when executed, the generated data processor binary file can take an input data file in a given format and output this data, possibly transformed, in a different file format. If they so desire, the users will be able to modify the source code directly in order to define more complex mappings and transformations that are not currently supported by the toolkit. Initially aimed at supporting research in hydrology, the toolkit's functions and features can be either directly used or easily extended to other areas of climate-related research. The proposed web-based data processing toolkit will be able to generate various custom software processors for data conversion and transformation in a matter of seconds or minutes, saving a significant amount of researchers' time and allowing them to focus on core research issues.
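The core mapping idea can be sketched in a few lines: a user-defined mapping from output fields to functions of the input record yields a data processor. This Python sketch is only illustrative (the toolkit is described as generating downloadable source code and binaries, not in-memory closures), and the field names, units, and functions are invented.

    # Generate a simple data processor from user-defined field mappings.
    import csv, io, math

    def make_processor(mapping):
        # mapping: output column -> function of one input record (a dict).
        def process(reader, writer):
            out = csv.DictWriter(writer, fieldnames=list(mapping))
            out.writeheader()
            for rec in csv.DictReader(reader):
                out.writerow({col: fn(rec) for col, fn in mapping.items()})
        return process

    processor = make_processor({
        "site":     lambda r: r["station_id"],
        "flow_m3s": lambda r: float(r["flow_cfs"]) * 0.0283168,  # unit conversion
        "log_flow": lambda r: math.log(float(r["flow_cfs"])),    # transformation
    })
    src = io.StringIO("station_id,flow_cfs\nA1,120\nA2,95\n")
    dst = io.StringIO()
    processor(src, dst)
    print(dst.getvalue())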
VIP: A knowledge-based design aid for the engineering of space systems
NASA Technical Reports Server (NTRS)
Lewis, Steven M.; Bellman, Kirstie L.
1990-01-01
The Vehicles Implementation Project (VIP), a knowledge-based design aid for the engineering of space systems, is described. VIP combines qualitative knowledge in the form of rules, quantitative knowledge in the form of equations, and other mathematical modeling tools. The system allows users to rapidly develop and experiment with models of spacecraft system designs. As information becomes available to the system, appropriate equations are solved symbolically and the results are displayed. Users may browse through the system, observing dependencies and the effects of altering specific parameters. The system can also suggest approaches to the derivation of specific parameter values. In addition to providing a tool for the development of specific designs, VIP aims at increasing the user's understanding of the design process. Users may rapidly examine the sensitivity of a given parameter to others in the system and perform tradeoffs or optimizations of specific parameters. A second major goal of VIP is to integrate the existing corporate knowledge base of models and rules into a central, symbolic form.
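The propagation behavior described (solving each equation as its inputs become known) can be sketched with a symbolic algebra library. The equations and values below are invented stand-ins, not VIP's spacecraft models; the sketch assumes sympy is available.

    # Solve whatever equations become solvable as values are supplied.
    import sympy as sp

    m, F, a, v, t = sp.symbols("m F a v t", positive=True)
    equations = [sp.Eq(F, m * a), sp.Eq(v, a * t)]   # toy "model" of a burn
    known = {F: 900.0, m: 450.0, t: 120.0}           # user-supplied values

    for _ in range(len(equations)):                  # propagate until stable
        for eq in equations:
            residual = eq.subs(known)
            unknowns = residual.free_symbols
            if len(unknowns) == 1:                   # exactly one unknown left
                (sym,) = unknowns
                known[sym] = float(sp.solve(residual, sym)[0])
    print(known)                                     # a -> 2.0, v -> 240.0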
Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator.
Garcia Castro, Alexander; Thoraval, Samuel; Garcia, Leyla J; Ragan, Mark A
2005-04-07
Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphical User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capabilities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous analytical tools.
Fels, S S; Hinton, G E
1998-01-01
Glove-TalkII is a system which translates hand gestures to speech through an adaptive interface. Hand gestures are mapped continuously to ten control parameters of a parallel formant speech synthesizer. The mapping allows the hand to act as an artificial vocal tract that produces speech in real time. This gives an unlimited vocabulary in addition to direct control of fundamental frequency and volume. Currently, the best version of Glove-TalkII uses several input devices (including a Cyberglove, a ContactGlove, a three-space tracker, and a foot pedal), a parallel formant speech synthesizer, and three neural networks. The gesture-to-speech task is divided into vowel and consonant production by using a gating network to weight the outputs of a vowel and a consonant neural network. The gating network and the consonant network are trained with examples from the user. The vowel network implements a fixed user-defined relationship between hand position and vowel sound and does not require any training examples from the user. Volume, fundamental frequency, and stop consonants are produced with a fixed mapping from the input devices. One subject has trained to speak intelligibly with Glove-TalkII. He speaks slowly but with far more natural sounding pitch variations than a text-to-speech synthesizer.
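The gating arrangement can be sketched directly: a gating network produces a weight that blends the vowel and consonant networks' outputs into one vector of synthesizer controls. The shapes, weights, and activations below are invented for illustration; they are not the trained Glove-TalkII networks.

    # Gating network blending two subnetworks into synthesizer controls.
    import numpy as np

    rng = np.random.default_rng(0)
    W_gate = rng.normal(size=(1, 16))   # gating network weights (random stand-ins)
    W_vow = rng.normal(size=(10, 16))   # vowel network weights
    W_con = rng.normal(size=(10, 16))   # consonant network weights

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def glove_to_controls(hand_features):
        g = sigmoid(W_gate @ hand_features)[0]       # gating weight ("vowelness")
        vowel = np.tanh(W_vow @ hand_features)       # vowel network output
        consonant = np.tanh(W_con @ hand_features)   # consonant network output
        return g * vowel + (1 - g) * consonant       # ten synthesizer controls

    print(glove_to_controls(rng.normal(size=16)).round(2))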
Multi-Topic Tracking Model for dynamic social network
NASA Astrophysics Data System (ADS)
Li, Yuhua; Liu, Changzheng; Zhao, Ming; Li, Ruixuan; Xiao, Hailing; Wang, Kai; Zhang, Jun
2016-07-01
The topic tracking problem has attracted much attention in recent decades. However, existing approaches rarely consider network structures and textual topics together. In this paper, we propose a novel statistical model based on a dynamic Bayesian network, namely the Multi-Topic Tracking Model for Dynamic Social Network (MTTD). It takes the influence phenomenon, the selection phenomenon, the document generative process and the evolution of textual topics into account. Specifically, in our MTTD model, a Gibbs Random Field is defined to model the influence of the historical status of users in the network and the interdependency between them, in order to capture the influence phenomenon. To address the selection phenomenon, a stochastic block model is used to model the link generation process based on the users' interest in topics. Probabilistic Latent Semantic Analysis (PLSA) is used to describe the document generative process according to the users' interests. Finally, the dependence on the historical topic status is also considered to ensure the continuity of the topic itself in the topic evolution model. An Expectation Maximization (EM) algorithm is utilized to estimate parameters in the proposed MTTD model. Empirical experiments on real datasets show that the MTTD model performs better than Popular Event Tracking (PET) and the Dynamic Topic Model (DTM) in generalization, topic interpretability, and the modeling of topic content and popularity evolution.
Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M
2000-01-01
Hospital information systems have to support quality improvement objectives. The design issues of health care information systems can be classified into three categories: 1) time-oriented and event-labelled storage of patient data; 2) contextual support of decision-making; 3) capabilities for modular upgrading. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualize clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the field of blood transfusion. An object-oriented data model of a process has been defined in order to identify its main components: activity, sub-process, resources, constraints, guidelines, parameters and indicators. Although some aspects of activity, such as "where", "what else", and "why" are poorly represented by the data model alone, this method of requirement elicitation fits the dynamics of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for this approach to be generalised within the organisation, for the processes to be interrelated, and for their characteristics to be shared.
Novel methods for parameter-based analysis of myocardial tissue in MR images
NASA Astrophysics Data System (ADS)
Hennemuth, A.; Behrens, S.; Kuehnel, C.; Oeltze, S.; Konrad, O.; Peitgen, H.-O.
2007-03-01
The analysis of myocardial tissue with contrast-enhanced MR yields multiple parameters, which can be used to classify the examined tissue. Perfusion images are often distorted by motion, while late enhancement images are acquired with a different size and resolution. Therefore, it is common to reduce the analysis to a visual inspection, or to the examination of parameters related to the 17-segment model proposed by the American Heart Association (AHA). As this simplification entails a considerable loss of information, our purpose is to provide methods for a more accurate analysis regarding topological and functional tissue features. In order to achieve this, we implemented registration methods for the motion correction of the perfusion sequence and the matching of the late enhancement information onto the perfusion image and vice versa. For the motion-corrected perfusion sequence, vector images containing the semi-quantitative parameters of the voxel enhancement curves are derived. The resulting vector images are combined with the late enhancement information and form the basis for the tissue examination. For the exploration of data we propose different modes: the inspection of the enhancement curves and parameter distribution in areas automatically segmented using the late enhancement information, the inspection of regions segmented in parameter space by user-defined threshold intervals, and the topological comparison of regions segmented with different settings. Results showed a more accurate detection of distorted regions in comparison to the AHA-model-based evaluation.
Method and apparatus for assessing weld quality
Smartt, Herschel B.; Kenney, Kevin L.; Johnson, John A.; Carlson, Nancy M.; Clark, Denis E.; Taylor, Paul L.; Reutzel, Edward W.
2001-01-01
Apparatus for determining a quality of a weld produced by a welding device according to the present invention includes a sensor operatively associated with the welding device. The sensor is responsive to at least one welding process parameter during a welding process and produces a welding process parameter signal that relates to the at least one welding process parameter. A computer connected to the sensor is responsive to the welding process parameter signal produced by the sensor. A user interface operatively associated with the computer allows a user to select a desired welding process. The computer processes the welding process parameter signal produced by the sensor in accordance with one of a constant voltage algorithm, a short duration weld algorithm or a pulsed current analysis module depending on the desired welding process selected by the user. The computer produces output data indicative of the quality of the weld.
Structural Tailoring of Advanced Turboprops (STAT). Theoretical manual
NASA Technical Reports Server (NTRS)
Brown, K. W.
1992-01-01
This manual describes the theories in the Structural Tailoring of Advanced Turboprops (STAT) computer program, which was developed to perform numerical optimizations on highly swept propfan blades. The optimization procedure seeks to minimize an objective function, defined as either direct operating cost or aeroelastic differences between a blade and its scaled model, by tuning internal and external geometry variables that must satisfy realistic blade design constraints. The STAT analyses include an aerodynamic efficiency evaluation, a finite element stress and vibration analysis, an acoustic analysis, a flutter analysis, and a once-per-revolution (1-p) forced response life prediction capability. The STAT constraints include blade stresses, blade resonances, flutter, tip displacements, and a 1-P forced response life fraction. The STAT variables include all blade internal and external geometry parameters needed to define a composite material blade. The STAT objective function is dependent upon a blade baseline definition which the user supplies to describe a current blade design for cost optimization or for the tailoring of an aeroelastic scale model.
ODEion--a software module for structural identification of ordinary differential equations.
Gennemark, Peter; Wedelin, Dag
2014-02-01
In the systems biology field, algorithms for structural identification of ordinary differential equations (ODEs) have mainly focused on fixed model spaces like S-systems and/or on methods that require sufficiently good data so that derivatives can be accurately estimated. There is therefore a lack of methods and software that can handle more general models and realistic data. We present ODEion, a software module for structural identification of ODEs. Main characteristic features of the software are:
• The model space is defined by arbitrary user-defined functions that can be nonlinear in both variables and parameters, such as for example chemical rate reactions.
• ODEion implements computationally efficient algorithms that have been shown to efficiently handle sparse and noisy data. It can run a range of realistic problems that previously required a supercomputer.
• ODEion is easy to use and provides SBML output.
We describe the mathematical problem, the ODEion system itself, and provide several examples of how the system can be used. Available at: http://www.odeidentification.org.
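The search such a tool automates can be caricatured in a few lines: propose candidate right-hand sides drawn from user-defined rate functions, fit each candidate's parameters to the data, and rank by fit error plus a complexity penalty. The candidate grammar and scoring below are invented, not ODEion's algorithms.

    # Rank candidate ODE structures by penalized fit to data (illustrative).
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import minimize

    t_obs = np.linspace(0, 5, 25)
    x_obs = 2.0 * np.exp(-0.8 * t_obs)          # synthetic data from dx/dt = -k*x

    candidates = {                              # user-defined model space (invented)
        "linear decay": lambda x, k: -k * x,
        "second order": lambda x, k: -k * x ** 2,
    }

    def score(rhs, penalty=1.0):
        # Sum-of-squares fit error plus a penalty per fitted parameter.
        def sse(p):
            sol = solve_ivp(lambda t, x: rhs(x, p[0]), (0, 5), [2.0], t_eval=t_obs)
            return float(np.sum((sol.y[0] - x_obs) ** 2))
        fit = minimize(sse, x0=[0.5], method="Nelder-Mead")
        return fit.fun + penalty * 1            # one rate constant per candidate

    for name, rhs in candidates.items():
        print(name, round(score(rhs), 3))       # lower score = preferred structure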
How stimulation speed affects Event-Related Potentials and BCI performance.
Höhne, Johannes; Tangermann, Michael
2012-01-01
In most paradigms for Brain-Computer Interfaces (BCIs) that are based on Event-Related Potentials (ERPs), stimuli are presented at a pre-defined, constant speed. In order to boost BCI performance by optimizing the parameters of stimulation, this offline study investigates the impact of the stimulus onset asynchrony (SOA) on ERPs and the resulting classification accuracy. The SOA is defined as the time between the onsets of two consecutive stimuli, a measure of stimulation speed. A simple auditory oddball paradigm was tested in 14 SOA conditions with an SOA between 50 ms and 1000 ms. Based on an offline ERP analysis, the BCI performance (quantified by the Information Transfer Rate, ITR in bits/min) was simulated. A great variability in the simulated BCI performance was observed within subjects (N=11). This indicates a potential increase in BCI performance (≥ 1.6 bits/min) for ERP-based paradigms, if the stimulation speed is specified for each user individually.
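For reference, a common definition of the ITR (following Wolpaw and colleagues) is sketched below; the study's exact simulation may differ. The class counts, accuracies, and trial durations are invented, but they illustrate the trade-off the SOA creates between trial duration and accuracy.

    # Wolpaw-style information transfer rate in bits per minute.
    import math

    def itr_bits_per_min(n_classes, accuracy, trial_seconds):
        p, n = accuracy, n_classes
        bits = (math.log2(n) + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
        return bits * 60.0 / trial_seconds

    # Faster stimulation shortens trials but typically lowers accuracy:
    print(itr_bits_per_min(6, 0.90, trial_seconds=4.0))   # slow but accurate
    print(itr_bits_per_min(6, 0.70, trial_seconds=1.5))   # fast but noisier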
Viger, Roland J.
2008-01-01
This fact sheet provides a high-level description of the GIS Weasel, a software system designed to aid users in preparing spatial information as input to lumped and distributed parameter environmental simulation models (ESMs). The GIS Weasel provides geographic information system (GIS) tools to help create maps of geographic features relevant to the application of a user's ESM and to generate parameters from those maps. The operation of the GIS Weasel does not require a user to be a GIS expert, only that a user has an understanding of the spatial information requirements of the model. The GIS Weasel software system provides a GIS-based graphical user interface (GUI), C programming language executables, and general utility scripts. The software will run on any computing platform where ArcInfo Workstation (version 8.1 or later) and the GRID extension are accessible. The user controls the GIS Weasel by interacting with menus, maps, and tables.
Hill, Mary Catherine
1992-01-01
This report documents a new version of the U.S. Geological Survey modular, three-dimensional, finite-difference, ground-water flow model (MODFLOW) which, with the new Parameter-Estimation Package that also is documented in this report, can be used to estimate parameters by nonlinear regression. The new version of MODFLOW is called MODFLOWP (pronounced MOD-FLOW*P), and functions nearly identically to MODFLOW when the Parameter-Estimation Package is not used. Parameters are estimated by minimizing a weighted least-squares objective function by the modified Gauss-Newton method or by a conjugate-direction method. Parameters used to calculate the following MODFLOW model inputs can be estimated: Transmissivity and storage coefficient of confined layers; hydraulic conductivity and specific yield of unconfined layers; vertical leakance; vertical anisotropy (used to calculate vertical leakance); horizontal anisotropy; hydraulic conductance of the River, Streamflow-Routing, General-Head Boundary, and Drain Packages; areal recharge rates; maximum evapotranspiration; pumpage rates; and the hydraulic head at constant-head boundaries. Any spatial variation in parameters can be defined by the user. Data used to estimate parameters can include existing independent estimates of parameter values, observed hydraulic heads or temporal changes in hydraulic heads, and observed gains and losses along head-dependent boundaries (such as streams). Model output includes statistics for analyzing the parameter estimates and the model; these statistics can be used to quantify the reliability of the resulting model, to suggest changes in model construction, and to compare results of models constructed in different ways.
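The Gauss-Newton iteration at the heart of such a regression can be sketched generically: linearize the weighted residuals around the current parameters and solve the normal equations for a damped step. The forward model below is a toy exponential, not a ground-water simulation, and the damping bound is an invented stand-in for a maximum parameter change per iteration.

    # Damped Gauss-Newton for a weighted least-squares objective (illustrative).
    import numpy as np

    def sim(p, x):                          # stand-in forward model
        return p[0] * np.exp(-p[1] * x)

    x = np.linspace(0, 4, 12)
    obs = sim([5.0, 0.7], x)                # synthetic observations
    w = np.ones_like(x)                     # observation weights
    p = np.array([1.0, 0.1])                # starting parameter values

    for _ in range(20):                     # Gauss-Newton iterations
        r = obs - sim(p, x)                 # residuals
        # Jacobian of sim with respect to the parameters, by finite differences
        J = np.column_stack([(sim(p + dp, x) - sim(p, x)) / 1e-6
                             for dp in np.eye(2) * 1e-6])
        WJ = w[:, None] * J
        step = np.linalg.solve(WJ.T @ J, WJ.T @ r)
        p = p + np.clip(step, -1.0, 1.0)    # damp the step size
    print(p)                                # approaches [5.0, 0.7]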
Conceptual Modeling via Logic Programming
1990-01-01
Define User Interface and Query Language ... 4. Define Procedures for Specifying Output 5. Select Logic Programming Language 6. Develop ... baselines change model ... sessions and baselines. It was changed considerably with the advent of the window ... 6. Develop Methodology for C3I Users. ... Model Development: Implications for Conceptual Modeling Via Logic Programming. Marina del Rey, Calif. ... Communications of a Cognitive Analysis of
Application of MCT Failure Criterion using EFM
2010-03-26
because HELIUS:MCT™ does not facilitate this. Attempts have been made to use the ABAQUS native thermal expansion model combined with Helius-MCT... ABAQUS using a user-defined element subroutine EFM. Comparisons have been made between the analysis results using the EFM-MCT code and the HELIUS:MCT™ code...using the Element-Failure Method (EFM) in ABAQUS. The EFM-MCT has been implemented in ABAQUS using a user-defined element subroutine EFM. Comparisons
Lazinski, David W.; Camilli, Andrew
2013-01-01
The amplification of DNA fragments, cloned between user-defined 5′ and 3′ end sequences, is a prerequisite step in the use of many current applications including massively parallel sequencing (MPS). Here we describe an improved method, called homopolymer tail-mediated ligation PCR (HTML-PCR), that requires very little starting template, minimal hands-on effort, is cost-effective, and is suited for use in high-throughput and robotic methodologies. HTML-PCR starts with the addition of homopolymer tails of controlled lengths to the 3′ termini of a double-stranded genomic template. The homopolymer tails enable the annealing-assisted ligation of a hybrid oligonucleotide to the template's recessed 5′ ends. The hybrid oligonucleotide has a user-defined sequence at its 5′ end. This primer, together with a second primer composed of a longer region complementary to the homopolymer tail and fused to a second 5′ user-defined sequence, are used in a PCR reaction to generate the final product. The user-defined sequences can be varied to enable compatibility with a wide variety of downstream applications. We demonstrate our new method by constructing MPS libraries starting from nanogram and sub-nanogram quantities of Vibrio cholerae and Streptococcus pneumoniae genomic DNA. PMID:23311318
NASA Technical Reports Server (NTRS)
Kanning, G.
1975-01-01
A digital computer program written in FORTRAN is presented that implements the system identification theory for deterministic systems using input-output measurements. The user supplies programs simulating the mathematical model of the physical plant whose parameters are to be identified. The user may choose any one of three options. The first option allows for a complete model simulation for fixed input forcing functions. The second option identifies up to 36 parameters of the model from wind tunnel or flight measurements. The third option performs a sensitivity analysis for up to 36 parameters. The use of each option is illustrated with an example using input-output measurements for a helicopter rotor tested in a wind tunnel.
Traffic Generator (TrafficGen) Version 1.4.2: Users Guide
2016-06-01
events, the user has to enter them manually. We will research and implement a way to better define and organize the multicast addresses so they can be...the network with Transmission Control Protocol and User Datagram Protocol Internet Protocol traffic. Each node generating network traffic in an...TrafficGen Graphical User Interface (GUI) ... 3.1 Anatomy of the User Interface ... 3.2 Scenario Configuration and MGEN Files ... 4. Working with
Issues central to a useful image understanding environment
NASA Astrophysics Data System (ADS)
Beveridge, J. Ross; Draper, Bruce A.; Hanson, Allen R.; Riseman, Edward M.
1992-04-01
A recent DARPA initiative has sparked interest in software environments for computer vision. The goal is a single environment to support both basic research and technology transfer. This paper lays out six fundamental attributes such a system must possess: (1) support for both C and Lisp, (2) extensibility, (3) data sharing, (4) data query facilities tailored to vision, (5) graphics, and (6) code sharing. The first three attributes fundamentally constrain the system design. Support for both C and Lisp demands some form of database or data-store for passing data between languages. Extensibility demands that system support facilities, such as spatial retrieval of data, be readily extended to new user-defined datatypes. Finally, data sharing demands that data saved by one user, including data of a user-defined type, must be readable by another user.
NASA Astrophysics Data System (ADS)
von Boetticher, Albrecht; Rickenmann, Dieter; McArdell, Brian; Kirchner, James W.
2017-04-01
Debris flows are dense flowing mixtures of water, clay, silt, sand and coarser particles. They are a common natural hazard in mountain regions and frequently cause severe damage. Modeling debris flows to design protection measures is still challenging due to the complex interactions within the inhomogeneous material mixture, and the sensitivity of the flow process to the channel geometry. The open-source, OpenFOAM-based finite-volume debris flow model debrisInterMixing (von Boetticher et al., 2016) defines rheology parameters based on the material properties of the debris flow mixture to reduce the number of free model parameters. As a simplification in this first model version, gravel was treated as a Coulomb-viscoplastic fluid, neglecting grain-to-grain collisions and the coupling between the coarser gravel grains and the interstitial fluid. Here we present an extension of that solver, accounting for particle-to-particle and particle-to-boundary contacts with a Lagrangian particle simulation composed of spherical grains and a user-defined grain size distribution. The grain collisions of the Lagrangian particles add granular flow behavior to the finite-volume simulation of the continuous phases. The two-way coupling exchanges momentum between the phase-averaged flow in a finite-volume cell and all individual particles contained in that cell, allowing the user to choose from a number of different drag models. The momentum exchange is implemented in the momentum equation and in the pressure equation (ensuring continuity) of the so-called PISO loop, resulting in a stable 4-way coupling (particle-to-particle, particle-to-boundary, particle-to-fluid and fluid-to-particle) that represents the granular and viscous flow behavior of debris flow material. We will present simulations that illustrate the relative benefits and drawbacks of explicitly representing grain collisions, compared to the original debrisInterMixing solver.
Knowledge system and method for simulating chemical controlled release device performance
Cowan, Christina E.; Van Voris, Peter; Streile, Gary P.; Cataldo, Dominic A.; Burton, Frederick G.
1991-01-01
A knowledge system for simulating the performance of a controlled release device is provided. The system includes an input device through which the user selectively inputs one or more data parameters. The data parameters comprise first parameters including device parameters, media parameters, active chemical parameters and device release rate; and second parameters including the minimum effective inhibition zone of the device and the effective lifetime of the device. The system also includes a judgemental knowledge base which includes logic for 1) determining at least one of the second parameters from the release rate and the first parameters and 2) determining at least one of the first parameters from the other of the first parameters and the second parameters. The system further includes a device for displaying the results of the determinations to the user.
Impulsivity in men with prescription of benzodiazepines and methadone in prison.
Moreno-Ramos, Luis; Fernández-Serrano, María José; Pérez-García, Miguel; Verdejo-García, Antonio
2016-06-14
Benzodiazepine and methadone use has been associated with various neuropsychological impairments. However, to the best of our knowledge, no studies have been carried out on the effect of these substances (either separately or combined) on impulsive personality, including studies in prisoners. The aim of this study is to examine the impulsive personality of a sample of 134 male prisoners using the Sensitivity to Punishment and Sensitivity to Reward Questionnaire (Torrubia, Avila, Molto, & Caseras, 2001) and the UPPS-P Scale (Cyders et al., 2007). Some of these were methadone users, methadone and benzodiazepine users, polydrug users in abstinence, and non-dependent drug users. The results showed that drug users have greater sensitivity to reward, positive urgency, negative urgency and sensation seeking than non-dependent users. Methadone users showed more sensitivity to punishment and lack of perseverance with respect to other users. No differences were found between methadone+benzodiazepine users and other groups. The secondary aim is to examine which impulsive personality dimensions are related to the two motivational systems proposed by Gray (BIS-BAS) using exploratory factor analysis. Results showed two different components. One component was defined by the subscales sensitivity to reinforcement, positive urgency, negative urgency and sensation seeking. The second component was defined by the subscales sensitivity to punishment, lack of perseverance and lack of premeditation.
Using component technology to facilitate external software reuse in ground-based planning systems
NASA Technical Reports Server (NTRS)
Chase, A.
2003-01-01
APGEN (Activity Plan GENerator - 314), a multi-mission planning tool, must interface with external software to best serve its users. APGEN's original method for incorporating external software, the User-Defined library mechanism, has been very successful in allowing APGEN users access to external software functionality.
Managing End User Computing in the Federal Government.
ERIC Educational Resources Information Center
General Services Administration, Washington, DC.
This report presents an initial approach developed by the General Services Administration for the management of end user computing in federal government agencies. Defined as technology used directly by individuals in need of information products, end user computing represents a new field encompassing such technologies as word processing, personal…
User's Manual for the Object User Interface (OUI): An Environmental Resource Modeling Framework
Markstrom, Steven L.; Koczot, Kathryn M.
2008-01-01
The Object User Interface is a computer application that provides a framework for coupling environmental-resource models and for managing associated temporal and spatial data. The Object User Interface is designed to be easily extensible to incorporate models and data interfaces defined by the user. Additionally, the Object User Interface is highly configurable through the use of a user-modifiable, text-based control file that is written in the eXtensible Markup Language. The Object User Interface user's manual provides (1) installation instructions, (2) an overview of the graphical user interface, (3) a description of the software tools, (4) a project example, and (5) specifications for user configuration and extension.
Competitive Cyber-Insurance and Internet Security
NASA Astrophysics Data System (ADS)
Shetty, Nikhil; Schwartz, Galina; Felegyhazi, Mark; Walrand, Jean
This paper investigates how competitive cyber-insurers affect network security and the welfare of the networked society. In our model, a user's probability of incurring damage (from being attacked) depends on both his security and the network security, with the latter taken by individual users as given. First, we consider cyber-insurers who cannot observe (and thus, affect) individual user security. This asymmetric information causes moral hazard. Then, for most parameters, no equilibrium exists: the insurance market is missing. Even if an equilibrium exists, the insurance contract covers only a minor fraction of the damage; network security worsens relative to the no-insurance equilibrium. Second, we consider insurers with perfect information about their users' security. Here, user security is perfectly enforceable (zero cost); each insurance contract stipulates the required user security. The unique equilibrium contract covers the entire user damage. Still, for most parameters, network security worsens relative to the no-insurance equilibrium. Although cyber-insurance improves user welfare, in general, competitive cyber-insurers fail to improve network security.
9th Annual CMMI Technology Conference and User Group-Tuesday
2009-11-19
evaluating and quantifying risk likelihood and severity of risks. Step 4: Project defines thresholds for each risk category. Step 5: Project defines bounds on the...defines consistent criteria for evaluating and quantifying risk likelihood and severity of risks in the Risk Management Plan. Step 4: Project defines
Mes-Krowinkel, Miranda G; Louwers, Yvonne V; Mulders, Annemarie G M G J; de Jong, Frank H; Fauser, Bart C J M; Laven, Joop S E
2014-06-01
To evaluate the influence of oral contraceptive pills (OCPs) on anthropometric, endocrine, and metabolic parameters in women with polycystic ovary syndrome (PCOS). Retrospective cross-sectional cohort study for the period 1993-2011. Tertiary university hospital. PCOS patients, who never, ever, or at time of screening were using OCPs were included. A total of 1,297 patients, of whom 827 were white, were included. All PCOS patients diagnosed according to the Rotterdam 2003 consensus criteria were divided into three groups: current users (n = 76; 6% of total), ever users (n = 1,018; 78%), and never users (n = 203; 16%). Ever users were subdivided based on the OCP-free interval. None. Anthropometric (blood pressure, cycle duration) and ultrasound (follicle count, mean ovarian volume) parameters, endocrine (SHBG, testosterone, free androgen index, antimüllerian hormone [AMH]) and lipid profiles. Current users and ever users were compared with never users. In current users, SHBG was increased and androgen levels decreased. Patients with an OCP-free interval of <1 year had a higher mean follicle count, higher AMH level, and increased serum androgen level compared with never users. SHBG levels remained increased until 5-10 years after cessation of OCP use. OCP use causes a milder phenotypic presentation of PCOS regarding hyperandrogenism. However, it does not alter parameters associated with increased health risks. Copyright © 2014 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
QUEST Hanford Site Computer Users - What do they do?
DOE Office of Scientific and Technical Information (OSTI.GOV)
WITHERSPOON, T.T.
2000-03-02
The Fluor Hanford Chief Information Office requested that a computer-user survey be conducted to determine the users' dependence on the computer and its importance to their ability to accomplish their work. Daily use trends and future needs of Hanford Site personal computer (PC) users were also to be defined. A primary objective was to use the data to determine how budgets should be focused toward providing those services that are truly needed by the users.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Richen; Guo, Hanqi; Yuan, Xiaoru
Most of the existing approaches to visualize vector field ensembles reveal the uncertainty of individual variables, for example, statistics, variability, etc. However, a user-defined derived feature like a vortex or an air mass is also quite significant, since such features make more sense to domain scientists. In this paper, we present a new framework to extract user-defined derived features from different simulation runs. Specifically, we use a detail-to-overview searching scheme to help extract vortices with a user-defined shape. We further compute geometry information, including the size and the geo-spatial location of the extracted vortices. We also design some linked views to compare them between different runs. Finally, the temporal information, such as the occurrence time of the feature, is further estimated and compared. Results show that our method is capable of extracting the features across different runs and comparing them spatially and temporally.
Origin of Disagreements in Tandem Mass Spectra Interpretation by Search Engines.
Tessier, Dominique; Lollier, Virginie; Larré, Colette; Rogniaux, Hélène
2016-10-07
Several proteomic database search engines that interpret LC-MS/MS data do not identify the same set of peptides. These disagreements occur even when the scores of the peptide-to-spectrum matches suggest good confidence in the interpretation. Our study shows that these disagreements observed for the interpretations of a given spectrum are almost exclusively due to the variation of what we call the "peptide space", i.e., the set of peptides that are actually compared to the experimental spectra. We discuss the potential difficulties of precisely defining the "peptide space." Indeed, although several parameters that are generally reported in publications can easily be set to the same values, many additional parameters (with much less straightforward user access) might impact the "peptide space" used by each program. Moreover, in a configuration where each search engine identifies the same candidates for each spectrum, the inference of the proteins may remain quite different depending on the false discovery rate selected.
Hierarchical Scaffolding With Bambus
Pop, Mihai; Kosack, Daniel S.; Salzberg, Steven L.
2004-01-01
The output of a genome assembler generally comprises a collection of contiguous DNA sequences (contigs) whose relative placement along the genome is not defined. A procedure called scaffolding is commonly used to order and orient these contigs using paired read information. This ordering of contigs is an essential step when finishing and analyzing the data from a whole-genome shotgun project. Most recent assemblers include a scaffolding module; however, users have little control over the scaffolding algorithm or the information produced. We thus developed a general-purpose scaffolder, called Bambus, which affords users significant flexibility in controlling the scaffolding parameters. Bambus was used recently to scaffold the low-coverage draft dog genome data. Most significantly, Bambus enables the use of linking data other than that inferred from mate-pair information. For example, the sequence of a completed genome can be used to guide the scaffolding of a related organism. We present several applications of Bambus: support for finishing, comparative genomics, analysis of the haplotype structure of genomes, and scaffolding of a mammalian genome at low coverage. Bambus is available as an open-source package from our Web site. PMID:14707177
NASA Technical Reports Server (NTRS)
1975-01-01
The trajectory simulation mode (SIMSEP) requires the namelist SIMSEP to follow TRAJ. The SIMSEP namelist contains parameters which describe the scope of the simulation, expected dynamic errors, and cumulative statistics from previous SIMSEP runs. Following SIMSEP is a set of GUID namelists, one for each guidance correction maneuver. Each GUID namelist describes the strategy, knowledge or estimation uncertainties, and cumulative statistics for that particular maneuver. The trajectory display mode (REFSEP) requires only the namelist TRAJ followed by scheduling cards, similar to those used in GODSEP. The fixed-field schedule cards define: types of data displayed, span of interest, and frequency of printout. For those users who can vary the amount of blank common storage in their runs, a guideline to estimate the total MAPSEP core requirements is given. Blank common length is related directly to the dimension of the dynamic state (NDIM) used in transition matrix (STM) computation and the total augmented (knowledge) state (NAUG). The sizes of the program and blank common must be added to compute the total decimal core for a CDC 6500. Other operating systems must scale these requirements appropriately.
GOSSIP, a New VO Compliant Tool for SED Fitting
NASA Astrophysics Data System (ADS)
Franzetti, P.; Scodeggio, M.; Garilli, B.; Fumana, M.; Paioro, L.
2008-08-01
We present GOSSIP (Galaxy Observed-Simulated SED Interactive Program), a new tool developed to perform SED fitting in a simple, user-friendly and efficient way. GOSSIP automatically builds up the observed SED of an object (or a large sample of objects) by combining magnitudes in different bands and, optionally, a spectrum; then it performs a χ^2 minimization fitting procedure against a set of synthetic models. The fitting results are used to estimate a number of physical parameters like the Star Formation History, absolute magnitudes, stellar mass, and their Probability Distribution Functions. User-defined models can be used, but GOSSIP is also able to load models produced by the most commonly used population synthesis codes. GOSSIP can be used interactively with other visualization tools using the PLASTIC protocol for communications. Moreover, since it has been developed with large-data-set applications in mind, it will be extended to operate within the Virtual Observatory framework. GOSSIP is distributed to the astronomical community from the PANDORA group web site (http://cosmos.iasf-milano.inaf.it/pandora/gossip.html).
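The χ^2 step can be sketched directly: compute the weighted squared difference between the observed magnitudes and each synthetic model, and keep the minimum. The bands, magnitudes, and templates below are invented, and a real fit adds dimensions (redshift, normalization, extinction) this sketch omits.

    # Chi-squared template comparison for an observed SED (illustrative values).
    import numpy as np

    observed = np.array([22.1, 21.4, 20.9, 20.5, 20.2])   # U,B,V,R,I magnitudes
    errors = np.array([0.15, 0.10, 0.08, 0.08, 0.10])

    models = {                                  # synthetic template SEDs (invented)
        "young burst": np.array([21.8, 21.3, 21.0, 20.8, 20.7]),
        "old passive": np.array([23.0, 21.9, 21.0, 20.4, 20.0]),
    }

    def chi2(template):
        return float(np.sum(((observed - template) / errors) ** 2))

    best_name, best_chi2 = min(((n, chi2(m)) for n, m in models.items()),
                               key=lambda pair: pair[1])
    print(best_name, round(best_chi2, 2))       # best-fitting template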
Operations analysis (study 2.1): Program manual and users guide for the LOVES computer code
NASA Technical Reports Server (NTRS)
Wray, S. T., Jr.
1975-01-01
The information necessary to use the LOVES Computer Program in its existing state, or to modify the program to include studies not properly handled by the basic model, is provided. The Users Guide defines the basic elements assembled together to form the model for servicing satellites in orbit. Since the program is a simulation, the method of attack is to disassemble the problem into a sequence of events, each occurring instantaneously and each creating one or more other events in the future. The main driving force of the simulation is the deterministic launch schedule of satellites and the subsequent failure of the various modules which make up the satellites. The LOVES Computer Program uses a random number generator to simulate the failure of module elements and therefore operates over a long span of time, typically 10 to 15 years. The sequence of events is varied by making several runs in succession with different random numbers, resulting in a Monte Carlo technique to determine statistical parameters of minimum value, average value, and maximum value.
NASA Technical Reports Server (NTRS)
Vanderplaats, G. N.; Chen, Xiang; Zhang, Ning-Tian
1988-01-01
The use of formal numerical optimization methods for the design of gears is investigated. To achieve this, computer codes were developed for the analysis of spur gears and spiral bevel gears. These codes calculate the life, dynamic load, bending strength, surface durability, gear weight and size, and various geometric parameters. It is necessary to calculate all such important responses because they all represent competing requirements in the design process. The codes developed here were written in subroutine form and coupled to the COPES/ADS general-purpose optimization program. This code allows the user to define the optimization problem at the time of program execution. Typical design variables include face width, number of teeth and diametral pitch. The user is free to choose any calculated response as the design objective to minimize or maximize and may impose lower and upper bounds on any calculated responses. Typical examples include life maximization with limits on dynamic load, stress, weight, etc. or minimization of weight subject to limits on life, dynamic load, etc. The research codes were written in modular form for easy expansion and so that they could be combined to create a multiple-reduction optimization capability in the future.
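The setup described maps naturally onto a constrained minimizer. The sketch below minimizes a toy weight surrogate subject to a lower bound on a toy life surrogate, over face width and diametral pitch; the objective and constraint functions are invented stand-ins for the gear analyses, and the code assumes scipy.

    # Minimize weight subject to a minimum-life constraint (toy surrogates).
    from scipy.optimize import minimize

    def weight(v):                  # invented surrogate, not the gear-weight code
        face_width, pitch = v
        return face_width / pitch

    def life(v):                    # invented surrogate for gear life
        face_width, pitch = v
        return 1e3 * face_width / (1.0 + pitch)

    res = minimize(weight, x0=[2.0, 8.0], method="SLSQP",
                   bounds=[(0.5, 4.0), (4.0, 16.0)],
                   constraints=[{"type": "ineq",
                                 "fun": lambda v: life(v) - 150.0}])
    print(res.x, weight(res.x))     # lightest design meeting the life limit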
SPLICER - A GENETIC ALGORITHM TOOL FOR SEARCH AND OPTIMIZATION, VERSION 1.0 (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Wang, L.
1994-01-01
SPLICER is a genetic algorithm tool which can be used to solve search and optimization problems. Genetic algorithms are adaptive search procedures (i.e. problem solving methods) based loosely on the processes of natural selection and Darwinian "survival of the fittest." SPLICER provides the underlying framework and structure for building a genetic algorithm application. These algorithms apply genetically-inspired operators to populations of potential solutions in an iterative fashion, creating new populations while searching for an optimal or near-optimal solution to the problem at hand. SPLICER 1.0 was created using a modular architecture that includes a Genetic Algorithm Kernel, interchangeable Representation Libraries, Fitness Modules and User Interface Libraries, and well-defined interfaces between these components. The architecture supports portability, flexibility, and extensibility. SPLICER comes with all source code and several examples. For instance, a "traveling salesperson" example searches for the minimum distance through a number of cities visiting each city only once. Stand-alone SPLICER applications can be used without any programming knowledge. However, to fully utilize SPLICER within new problem domains, familiarity with C language programming is essential. SPLICER's genetic algorithm (GA) kernel was developed independent of representation (i.e. problem encoding), fitness function or user interface type. The GA kernel comprises all functions necessary for the manipulation of populations. These functions include the creation of populations and population members, the iterative population model, fitness scaling, parent selection and sampling, and the generation of population statistics. In addition, miscellaneous functions are included in the kernel (e.g., random number generators). Different problem-encoding schemes and functions are defined and stored in interchangeable representation libraries. This allows the GA kernel to be used with any representation scheme. The SPLICER tool provides representation libraries for binary strings and for permutations. These libraries contain functions for the definition, creation, and decoding of genetic strings, as well as multiple crossover and mutation operators. Furthermore, the SPLICER tool defines the appropriate interfaces to allow users to create new representation libraries. Fitness modules are the only component of the SPLICER system a user will normally need to create or alter to solve a particular problem. Fitness functions are defined and stored in interchangeable fitness modules which must be created using C language. Within a fitness module, a user can create a fitness (or scoring) function, set the initial values for various SPLICER control parameters (e.g., population size), create a function which graphically displays the best solutions as they are found, and provide descriptive information about the problem. The tool comes with several example fitness modules, while the process of developing a fitness module is fully discussed in the accompanying documentation. The user interface is event-driven and provides graphic output in windows. SPLICER is written in Think C for Apple Macintosh computers running System 6.0.3 or later and Sun series workstations running SunOS. The UNIX version is easily ported to other UNIX platforms and requires MIT's X Window System, Version 11 Revision 4 or 5, MIT's Athena Widget Set, and the Xw Widget Set. Example executables and source code are included for each machine version. 
The standard distribution medium for the Macintosh version is a set of three 3.5-inch diskettes in Macintosh format. The standard distribution medium for the UNIX version is a .25-inch streaming magnetic tape cartridge in UNIX tar format; for the UNIX version, alternate distribution media and formats are available upon request. SPLICER was developed in 1991.
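SPLICER itself is distributed as C source, but the kernel/representation/fitness separation described above is easy to illustrate. The following Python sketch shows the same architectural idea under assumed names; it is not SPLICER's API.

```python
# Illustrative sketch of a GA kernel that is independent of representation
# and fitness (all names are hypothetical; this is not SPLICER's C API).
import random

def one_point_crossover(a, b):              # representation-library operator
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.01):                # representation-library operator
    return [b ^ (random.random() < rate) for b in bits]

def ga_kernel(fitness, length=20, pop_size=50, generations=100):
    """Iterative population model: selection, crossover, mutation."""
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        parents = sorted(pop, key=fitness, reverse=True)[:pop_size // 2]
        children = [mutate(one_point_crossover(random.choice(parents),
                                               random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

# The "fitness module" is the only piece a user normally writes:
best = ga_kernel(fitness=sum)    # one-max: maximize the number of 1-bits
print(sum(best), "ones in best string")
```

In SPLICER's terms, one_point_crossover and mutate would live in a representation library, ga_kernel plays the role of the representation-agnostic kernel, and the fitness argument stands in for a fitness module.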
Between-User Reliability of Tier 1 Exposure Assessment Tools Used Under REACH.
Lamb, Judith; Galea, Karen S; Miller, Brian G; Hesse, Susanne; Van Tongeren, Martie
2017-10-01
When applying simple screening (Tier 1) tools to estimate exposure to chemicals in a given exposure situation under the Registration, Evaluation, Authorisation and restriction of CHemicals Regulation 2006 (REACH), users must select from several possible input parameters. Previous studies have suggested that results from exposure assessments using expert judgement and from the use of modelling tools can vary considerably between assessors. This study aimed to investigate the between-user reliability of Tier 1 tools. A remote-completion exercise and an in-person workshop were used to identify and evaluate tool parameters and factors, such as user demographics, that may be associated with between-user variability. Participants (N = 146) generated dermal and inhalation exposure estimates (N = 4066) from specified workplace descriptions ('exposure situations') and Tier 1 tool combinations (N = 20). Interactions between users, tools, and situations were investigated and described. Systematic variation associated with individual users was minor compared with random between-user variation. Although variation was observed between choices made for the majority of input parameters, differing choices of Process Category ('PROC') code/activity descriptor and dustiness level had the greatest impact on the resulting exposure estimates. Exposure estimates ranging over several orders of magnitude were generated for the same exposure situation by different tool users. Such unpredictable between-user variation will reduce consistency within REACH processes and could result in underestimation or overestimation of exposure, risking worker ill-health or the implementation of unnecessary risk controls, respectively. Implementation of additional support and quality control systems for all tool users is needed to reduce between-assessor variation and so ensure both the protection of worker health and the avoidance of unnecessary business risk-management expenditure. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
User interface user's guide for HYPGEN
NASA Technical Reports Server (NTRS)
Chiu, Ing-Tsau
1992-01-01
The user interface (UI) of HYPGEN was developed using the Panel Library to shorten the learning curve for new users and to provide easier ways to run HYPGEN for casual as well as advanced users. Menus, buttons, sliders, and type-in fields are used extensively in the UI to allow users to point and click with a mouse to choose among the available options or to change parameter values. On-line help is provided to give users information on using the UI without consulting the manual. Default values are set for most parameters, and boundary conditions are determined by the UI, further reducing the effort needed to run HYPGEN; however, users are free to make any changes and save them in a file for later use. A hook to PLOT3D is built in to allow graphics manipulation. The viewpoint and min/max box for PLOT3D windows are computed by the UI and saved in a PLOT3D journal file. For large grids, which take a long time to generate on workstations, the grid generator (HYPGEN) can be run on faster computers such as Crays, while the UI stays on the workstation.
FLIP for FLAG model visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wooten, Hasani Omar
A graphical user interface has been developed for FLAG users. FLIP (FLAG Input deck Parser) provides users with an organized view of FLAG models and a means for efficiently and easily navigating and editing nodes, parameters, and variables.
Sketching Uncertainty into Simulations.
Ribicic, H; Waser, J; Gurbat, R; Sadransky, B; Groller, M E
2012-12-01
In a variety of application areas, the use of simulation steering in decision making is limited at best. Research focusing on this problem suggests that most user interfaces are too complex for the end user. Our goal is to let users create and investigate multiple, alternative scenarios without the need for special simulation expertise. To simplify the specification of parameters, we move from a traditional manipulation of numbers to a sketch-based input approach. Users steer both numeric parameters and parameters with a spatial correspondence by sketching a change onto the rendering. Special visualizations provide immediate visual feedback on how the sketches are transformed into boundary conditions of the simulation models. Since uncertainty with respect to many intertwined parameters plays an important role in planning, we also allow the user to intuitively set up complete value ranges, which are then automatically transformed into ensemble simulations. The interface and the underlying system were developed in collaboration with experts in the field of flood management. The real-world data they have provided has allowed us to construct scenarios used to evaluate the system. These were presented to a variety of flood response personnel, and their feedback is discussed in detail in the paper. The interface was found to be intuitive and relevant, although a certain amount of training might be necessary.
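In its simplest form, the range-to-ensemble step described above could expand each sketched value range into a grid of parameter sets, one per simulation run. A minimal sketch with hypothetical parameter names:

```python
# Minimal sketch: expanding user-specified value ranges into an ensemble of
# simulation parameter sets (parameter names and sampling are hypothetical).
import itertools
import numpy as np

ranges = {                        # e.g. set up by sketching on the rendering
    "breach_width_m": np.linspace(10.0, 50.0, 5),
    "inflow_m3_per_s": np.linspace(100.0, 400.0, 4),
}

ensemble = [dict(zip(ranges, values))
            for values in itertools.product(*ranges.values())]
# Each member parameterizes one run of the flood simulation model.
print(len(ensemble), "members, e.g.", ensemble[0])
```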
Recommendations for a service framework to access astronomical archives
NASA Technical Reports Server (NTRS)
Travisano, J. J.; Pollizzi, J.
1992-01-01
There are a large number of astronomical archives and catalogs on-line for network access, with many different user interfaces and features. Some systems are moving towards distributed access, supplying users with client software for their home sites which connects to servers at the archive site. Many of the issues involved in defining a standard framework of services that archive/catalog suppliers can use to achieve a basic level of interoperability are described. Such a framework would simplify the development of client and server programs to access the wide variety of astronomical archive systems. The primary services that are supplied by current systems include: catalog browsing, dataset retrieval, name resolution, and data analysis. The following issues (and probably more) need to be considered in establishing a standard set of client/server interfaces and protocols: Archive Access - dataset retrieval, delivery, file formats, data browsing, analysis, etc.; Catalog Access - database management systems, query languages, data formats, synchronous/asynchronous mode of operation, etc.; Interoperability - transaction/message protocols, distributed processing mechanisms (DCE, ONC/SunRPC, etc), networking protocols, etc.; Security - user registration, authorization/authentication mechanisms, etc.; Service Directory - service registration, lookup, port/task mapping, parameters, etc.; Software - public vs proprietary, client/server software, standard interfaces to client/server functions, software distribution, operating system portability, data portability, etc. Several archive/catalog groups, notably the Astrophysics Data System (ADS), are already working in many of these areas. In the process of developing StarView, which is the user interface to the Space Telescope Data Archive and Distribution Service (ST-DADS), these issues and the work of others were analyzed. A framework of standard interfaces for accessing services on any archive system which would benefit archive user and supplier alike is proposed.
A new dynamic tactile display for reconfigurable braille: implementation and tests.
Motto Ros, Paolo; Dante, Vittorio; Mesin, Luca; Petetti, Erminio; Del Giudice, Paolo; Pasero, Eros
2014-01-01
Different tactile interfaces have been proposed to represent either text (braille) or, in a few cases, tactile large-area screens as replacements for visual displays. None of the implementations so far can be customized to match users' preferences, perceptual differences, and skills. Optimal choices in these respects are still debated; we approach a solution by designing a flexible device allowing the user to choose key parameters of tactile transduction. We present here a new dynamic tactile display, an 8 × 8 matrix of plastic pins based on well-established and reliable piezoelectric technology, offering high resolution (pin gap 0.7 mm), tunable strength of pin displacement, and a refresh rate of up to 50 s⁻¹. It can reproduce arbitrary patterns, allowing it to serve the dual purpose of providing, depending on contingent user needs, tactile rendering of non-character information and reconfigurable braille rendering. Given the relevance of the latter functionality for the expected average user, we considered testing braille encoding by volunteers a benchmark of primary importance. Tests were performed to assess acceptance and usability with minimal training, and to check whether the offered flexibility was indeed perceived by the subject as an added value compared to conventional braille devices. Different mappings between braille dots and actual tactile pins were implemented to match user needs. Performance of eight experienced braille readers was defined as the fraction of correct identifications of rendered content. Different information contents were tested (median performance on identification of random strings, words, and sentences was about 75%, 85%, and 98%, respectively, a significant increase, p < 0.01), with statistically significant improvements in performance during the tests (p < 0.05). Experimental results, together with qualitative ratings provided by the subjects, show good acceptance and the effectiveness of the proposed solution.
Alizade, Elnur; Avci, Anil; Tabakcı, Mehmet Mustafa; Toprak, Cuneyt; Zehir, Regayip; Acar, Goksel; Kargin, Ramazan; Emiroğlu, Mehmet Yunas; Akçakoyun, Mustafa; Pala, Selçuk
2016-08-01
Right ventricular (RV) effects of long-term use of anabolic-androgenic steroids (AAS) are not clearly known. The aim of this study was to assess RV systolic functions by two-dimensional speckle tracking echocardiography (2DSTE) in AAS user and nonuser bodybuilders. A total of 33 competitive male bodybuilders (15 AAS users, 18 AAS nonusers) were assessed. To assess RV systolic functions, all participants underwent standard two-dimensional and Doppler echocardiography, and 2DSTE. Interventricular septal thickness, left ventricle posterior wall thickness, relative wall thickness, and left ventricle mass index were significantly higher in AAS users than nonusers. While standard diastolic parameters were not statistically different between the groups, tissue Doppler parameters including RV E' and E'/A' were lower in AAS users than nonusers (10.1 ± 2.0 vs. 12.7 ± 2.1; P = 0.001, 1.1 ± 0.1 vs. 1.5 ± 0.4; P = 0.009, respectively). Tricuspid annular plane systolic excursion, RV fractional area change, and RV S' were in normal ranges. However, RV S' was found to be lower in users than nonusers (12.2 ± 2.2 vs. 14.6 ± 2.8, P = 0.011). RV free wall longitudinal strain and strain rate were decreased in AAS users in comparison with nonusers (-20.2 ± 3.1 vs. -23.3 ± 3.5; P = 0.012, -3.2 ± 0.1 vs. -3.4 ± 0.1; P = 0.022, respectively). In addition, there were good correlations between 2DSTE parameters and RV S', E', and E'/A'. Despite normal standard systolic echo parameters, peak systolic RV free wall strain and strain rate were reduced in AAS user bodybuilders in comparison with nonusers. Strain and strain rate by 2DSTE may be useful for early determination of subclinical RV dysfunction in AAS user bodybuilders. © 2016, Wiley Periodicals, Inc.
DOT National Transportation Integrated Search
2009-10-01
This report reviews technology options for a mileage-based user fee system in the state of Texas. The report was compiled based on input from a diverse range of sources, including a literature review of existing mileage-based user fee technical w...
A medical digital library to support scenario and user-tailored information retrieval.
Chu, W W; Johnson, D B; Kangarloo, H
2000-06-01
Current large-scale information sources are designed to support general queries and lack the ability to support scenario-specific information navigation, gathering, and presentation. As a result, users are often unable to obtain desired specific information within a well-defined subject area. Today's information systems do not provide efficient content navigation, incremental appropriate matching, or content correlation. We are developing the following innovative technologies to remedy these problems: 1) scenario-based proxies, enabling the gathering and filtering of information customized for users within a pre-defined domain; 2) context-sensitive navigation and matching, providing approximate matching and similarity links when an exact match to a user's request is unavailable; 3) content correlation of documents, creating semantic links between documents and information sources; and 4) user models for customizing retrieved information and result presentation. A digital medical library is currently being constructed using these technologies to provide customized information for the user. The technologies are general in nature and can provide custom and scenario-specific information in many other domains (e.g., crisis management).
Implementing virtual reality interfaces for the geosciences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, W.; Jacobsen, J.; Austin, A.
1996-06-01
For the past few years, a multidisciplinary team of computer and earth scientists at Lawrence Berkeley National Laboratory has been exploring the use of advanced user interfaces, commonly called "Virtual Reality" (VR), coupled with visualization and scientific computing software. Working closely with industry, these efforts have resulted in an environment in which VR technology is coupled with existing visualization and computational tools. VR technology may be thought of as a user interface. It is useful to think of a spectrum running the gamut from command-line interfaces to completely immersive environments. In the former, one uses the keyboard to enter three- or six-dimensional parameters; rich, extensible, and often complex languages are a vehicle whereby the user controls parameters to manipulate object position and location in a virtual world, but the keyboard is an obstacle, in that typing is cumbersome, error-prone, and typically slow. In the latter, three- or six-dimensional information is provided by trackers contained either in hand-held devices or attached to the user in some fashion, e.g., attached to a head-mounted display, and the user can interact with these parameters by means of highly developed motor skills. Two specific geoscience application areas are highlighted. In the first, we have used VR technology to manipulate three-dimensional input parameters, such as the spatial location of injection or production wells in a reservoir simulator. In the second, we demonstrate how VR technology has been used to manipulate visualization tools, such as a tool for computing streamlines via manipulation of a "rake." The rake is presented to the user in the form of a "virtual well" icon, and provides parameters used by the streamlines algorithm.
Evaluation and integration of existing methods for computational prediction of allergens.
Wang, Jing; Yu, Yabin; Zhao, Yunan; Zhang, Dabing; Li, Jing
2013-01-01
Allergy involves a series of complex reactions and factors that contribute to the development of the disease and the triggering of symptoms, including rhinitis, asthma, atopic eczema, skin sensitivity, and even acute and fatal anaphylactic shock. Prediction and evaluation of potential allergenicity are important for the safety evaluation of foods and other environmental factors. Although several computational approaches for assessing the potential allergenicity of proteins have been developed, their performance and relative merits and shortcomings have not been compared systematically. To evaluate and improve the existing methods for allergen prediction, we collected an up-to-date definitive dataset consisting of 989 known allergens and massive putative non-allergens. The three most widely used computational allergen prediction approaches, sequence-, motif-, and SVM-based (Support Vector Machine) methods, were systematically compared using the defined parameters, and we found that the SVM-based method outperformed the other two with higher accuracy and specificity. The sequence-based method with the criteria defined by FAO/WHO (FAO: Food and Agriculture Organization of the United Nations; WHO: World Health Organization) has a sensitivity of over 98%, but low specificity. The advantage of the motif-based method is its ability to visualize the key motif within the allergen. Notably, the performance of the FAO/WHO sequence-based method and the motif-eliciting strategy could be improved by optimization of parameters. To facilitate allergen prediction, we integrated these three methods in a web-based application, proAP, which provides global search of the known allergens and a powerful tool for allergen prediction. Flexible parameter setting and batch prediction were also implemented. proAP can be accessed at http://gmobl.sjtu.edu.cn/proAP/main.html. This study comprehensively evaluated sequence-, motif-, and SVM-based computational prediction approaches for allergens and optimized their parameters to obtain better performance. These findings may provide helpful guidance for researchers in allergen prediction. Furthermore, we integrated these methods into the web application proAP, greatly facilitating customizable allergen search and prediction.
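For concreteness, the FAO/WHO sequence-based criterion referenced above flags a protein as potentially allergenic when any 80-amino-acid window shares more than 35% identity with a known allergen, or when a stretch of at least six contiguous residues matches exactly. A naive, ungapped sketch of that rule follows (the actual criterion uses alignment software, and this is not the proAP implementation):

```python
# Naive, ungapped sketch of the FAO/WHO sequence-screening rule (not proAP):
# flag a query if any 80-residue window shares > 35% identity with a known
# allergen, or if any 6 contiguous residues match exactly.

def identity(a: str, b: str) -> float:
    return sum(x == y for x, y in zip(a, b)) / len(a)

def fao_who_flag(query: str, allergen: str) -> bool:
    kmers = {allergen[i:i + 6] for i in range(len(allergen) - 5)}
    if any(query[i:i + 6] in kmers for i in range(len(query) - 5)):
        return True                     # 6-contiguous-residue exact match
    for i in range(len(query) - 79):    # sliding 80-residue windows
        for j in range(len(allergen) - 79):
            if identity(query[i:i + 80], allergen[j:j + 80]) > 0.35:
                return True
    return False

print(fao_who_flag("MKTAYIAKQR" * 10, "GSHMKTAYIA" * 10))  # True (6-mer hit)
```

The permissiveness of the short exact-match rule is one reason the criterion achieves the high sensitivity but low specificity noted above.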
UAH/NASA Workshop on Space Science Platform
NASA Technical Reports Server (NTRS)
Wu, S. T. (Editor); Morgan, S. (Editor)
1978-01-01
The scientific user requirements for a space science platform were defined. The potential user benefits, technological implications and cost of space platforms were examined. Cost effectiveness of the platforms' capabilities were also examined.
User productivity as a function of AutoCAD interface design.
Mitta, D A; Flores, P L
1995-12-01
Increased operator productivity is a desired outcome of user-CAD interaction scenarios. Two objectives of this research were to (1) define a measure of operator productivity and (2) empirically investigate the potential effects of CAD interface design on operator productivity, where productivity is defined as the percentage of a drawing session correctly completed per unit time. Here, AutoCAD provides the CAD environment of interest. Productivity with respect to two AutoCAD interface designs (menu, template) and three task types (draw, dimension, display) was investigated. Analysis of user productivity data revealed significantly higher productivity under the menu interface condition than under the template interface condition. A significant effect of task type was also discovered, where user productivity under display tasks was higher than productivity under the draw and dimension tasks. Implications of these results are presented.
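Under the stated definition (our notation, not the authors'), productivity is simply the fraction of the drawing correctly completed divided by the session time:

```latex
P = \frac{C}{T}, \qquad \text{e.g. } C = 80\%\ \text{correctly completed in } T = 20\ \text{min} \;\Rightarrow\; P = 4\%/\text{min}.
```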
Plug nozzles: The ultimate customer driven propulsion system
NASA Technical Reports Server (NTRS)
Aukerman, Carl A.
1991-01-01
This paper presents the results of a study applying the plug cluster nozzle concept to the propulsion system for a typical lunar excursion vehicle. Primary attention for the design criteria is given to user defined factors such as reliability, low volume, and ease of propulsion system development. Total thrust and specific impulse are held constant in the study while other parameters are explored to minimize the design chamber pressure. A brief history of the plug nozzle concept is included to point out the advanced level of technology of the concept and the feasibility of exploiting the variables considered in this study. The plug cluster concept looks very promising as a candidate for consideration for the ultimate customer driven propulsion system.
Automated optimization of an aspheric light-emitting diode lens for uniform illumination.
Luo, Xiaoxia; Liu, Hua; Lu, Zhenwu; Wang, Yao
2011-07-10
In this paper, an automated optimization method in the sequential mode of ZEMAX is proposed for the design of an aspheric lens providing uniform illuminance from an LED source. A feedback modification is introduced in the design for the LED extended source. The user-defined merit function is written using ZEMAX Programming Language (ZPL) macros and, as an example, optimum parameters of an aspheric lens are obtained by running an optimization. The optical simulation results show that the illumination efficiency and uniformity can reach 83% and 90%, respectively, on a target surface 40 mm in diameter located 60 mm away, for a 1 × 1 mm LED source. © 2011 Optical Society of America
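The merit function itself is a ZPL macro; as a language-neutral illustration of the underlying idea, a uniformity-oriented merit value over sampled target-plane illuminance might be computed as in the sketch below (function and variable names are hypothetical, not the authors' macro):

```python
# Hypothetical sketch of a uniformity merit function: penalize deviation of
# sampled target-plane illuminance from its mean (not the ZPL macro itself).
import numpy as np

def uniformity_merit(illuminance: np.ndarray) -> float:
    """Smaller is better: RMS fractional deviation from mean illuminance."""
    mean = illuminance.mean()
    return float(np.sqrt(np.mean((illuminance - mean) ** 2)) / mean)

samples = np.array([0.95, 1.02, 1.00, 0.98, 1.05])  # relative illuminance
print(uniformity_merit(samples))
```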
Fracture and Viscoelastic Characteristics of the Human Cervical Spine,
1986-01-01
to which the three hydraulic actuators are attached. Parameters ES and TS are input by the user to define the reference point T. [The remainder of this excerpt is garbled figure/table residue; the legible fragments indicate a rotation plot (psi) and a table of angle (deg) versus maximum vertical and horizontal ranges (mm).]
The Envoy® Totally Implantable Hearing System, St. Croix Medical
Kroll, Kai; Grant, Iain L.; Javel, Eric
2002-01-01
The Totally Implantable Envoy® System is currently undergoing clinical trials in both the United States and Europe. The fully implantable hearing device is intended for use in patients with sensorineural hearing loss. The device employs piezoelectric transducers to sense ossicle motion and drive the stapes. Programmable signal processing parameters include amplification, compression, and variable frequency response. The fully implantable attribute allows users to take advantage of normal external ear resonances and head-related transfer functions, while avoiding undesirable earmold effects. The high sensitivity, low power consumption, and high fidelity attributes of piezoelectric transducers minimize acoustic feedback and maximize battery life (Gyo, 1996; Yanagihara, 1987, 2001). The surgical procedure to install the device has been accurately defined and implantation is reversible. PMID:25425915
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petiteau, Antoine; Auger, Gerard; Halloin, Hubert
A new LISA simulator (LISACode) is presented. Its ambition is to achieve a new degree of sophistication, allowing it to map, as closely as possible, the impact of the different subsystems on the measurements. LISACode is not a detailed simulator at the engineering level but rather a tool whose purpose is to bridge the gap between the basic principles of LISA and a future, sophisticated end-to-end simulator. This is achieved by introducing, in a realistic manner, most of the ingredients that will influence LISA's sensitivity, as well as the application of TDI combinations. Many user-defined parameters allow the code to study different configurations of LISA, thus helping to finalize the definition of the detector. Another important use of LISACode is in generating time series for data-analysis developments.
Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T
2015-01-01
Intelligent Patent Analysis Tool (IPAT) is an online data retrieval tool based on a text-mining algorithm that extracts specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and to generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages such as Google Patents and simultaneously studies technology trends based on user-defined parameters. In other words, IPAT, combined with manual categorization, can act as an excellent technology assessment tool in competitive intelligence and due diligence for forecasting future R&D.
NASA Astrophysics Data System (ADS)
Odijk, Dennis; Zhang, Baocheng; Khodabandeh, Amir; Odolinski, Robert; Teunissen, Peter J. G.
2016-01-01
The concept of integer ambiguity resolution-enabled Precise Point Positioning (PPP-RTK) relies on appropriate network information for the parameters that are common between the single-receiver user that applies this information and the network that provides it. Most of the current methods for PPP-RTK are based on forming the ionosphere-free combination using dual-frequency Global Navigation Satellite System (GNSS) observations. These methods are therefore restrictive in light of the development of new multi-frequency GNSS constellations, and also because the PPP-RTK user requires ionospheric corrections to obtain integer ambiguity resolution results based on short observation time spans. The method for PPP-RTK presented in this article does not have the above limitations, as it is based on the undifferenced, uncombined GNSS observation equations, thereby keeping all parameters in the model. Working with the undifferenced observation equations implies that the models are rank-deficient; not all parameters are unbiasedly estimable, but only combinations of them. By application of S-system theory the model is made of full rank by constraining a minimum set of parameters, the S-basis. The choice of this S-basis determines the estimability and the interpretation of the parameters that are transmitted to the PPP-RTK users. As this choice is not unique, one has to be very careful when comparing network solutions in different S-systems; in that case the S-transformation, which is provided by the S-system method, should be used to make the comparison. Knowing the estimability and interpretation of the parameters estimated by the network is shown to be crucial for a correct interpretation of the estimable PPP-RTK user parameters, among them the essential ambiguity parameters, whose integer property follows clearly from the interpretation of the satellite phase biases provided by the network. The flexibility of the S-system method is furthermore demonstrated by the fact that all models in this article are derived in multi-epoch mode, allowing dynamic model constraints to be incorporated on all or subsets of parameters.
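For reference, undifferenced, uncombined code (p) and phase (φ) observation equations are conventionally written along the following lines (generic textbook notation; the article's exact parameterization may differ):

```latex
% Receiver r, satellite s, frequency j:
E(p_{r,j}^{s})       = \rho_{r}^{s} + c\,\mathrm{d}t_{r} - c\,\mathrm{d}t^{s}
                       + \mu_{j}\,\iota_{r}^{s} + d_{r,j} - d_{j}^{s}
E(\varphi_{r,j}^{s}) = \rho_{r}^{s} + c\,\mathrm{d}t_{r} - c\,\mathrm{d}t^{s}
                       - \mu_{j}\,\iota_{r}^{s} + \delta_{r,j} - \delta_{j}^{s}
                       + \lambda_{j}\,N_{r,j}^{s}
```

Here ρ is the receiver-satellite range (including tropospheric delay), dt are the clocks, μ_j ι is the slant ionospheric delay scaled to frequency j, d and δ are code and phase biases, and λ_j N is the ambiguity term. Clocks, biases, and ionospheric delays appear only in fixed combinations, which is precisely the rank deficiency that fixing an S-basis resolves.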
Development of the FITS tools package for multiple software environments
NASA Technical Reports Server (NTRS)
Pence, W. D.; Blackburn, J. K.
1992-01-01
The HEASARC is developing a package of general purpose software for analyzing data files in FITS format. This paper describes the design philosophy which makes the software both machine-independent (it runs on VAXs, Suns, and DEC-stations) and software environment-independent. Currently the software can be compiled and linked to produce IRAF tasks, or alternatively, the same source code can be used to generate stand-alone tasks using one of two implementations of a user-parameter interface library. The machine independence of the software is achieved by writing the source code in ANSI standard Fortran or C, using the machine-independent FITSIO subroutine interface for all data file I/O, and using a standard user-parameter subroutine interface for all user I/O. The latter interface is based on the Fortran IRAF Parameter File interface developed at STScI. The IRAF tasks are built by linking to the IRAF implementation of this parameter interface library. Two other implementations of this parameter interface library, which have no IRAF dependencies, are now available which can be used to generate stand-alone executable tasks. These stand-alone tasks can simply be executed from the machine operating system prompt either by supplying all the task parameters on the command line or by entering the task name after which the user will be prompted for any required parameters. A first release of this FTOOLS package is now publicly available. The currently available tasks are described, along with instructions on how to obtain a copy of the software.
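The stand-alone tasks described above take parameters from the command line when supplied and prompt for them otherwise; this generic Python sketch mimics that behavior (it is not the FTOOLS parameter library, and all names are assumed):

```python
# Generic sketch of the described stand-alone behavior (not the FTOOLS
# parameter library): take parameters given as name=value on the command
# line, and prompt the user for any that are missing.
import sys

def get_parameters(required, argv=None):
    argv = sys.argv[1:] if argv is None else argv
    given = dict(arg.split("=", 1) for arg in argv if "=" in arg)
    return {name: given.get(name) or input(f"{name}: ") for name in required}

# Both parameters supplied here, so nothing is prompted for:
params = get_parameters(["infile", "outfile"],
                        argv=["infile=test.fits", "outfile=out.fits"])
print(params)
```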
Kanneganti, Praveen; Huestis, Marilyn A.; Kolbrich, Erin A.; Goodwin, Robert; Ziegelstein, Roy C.; Gorelick, David A.
2008-01-01
Objectives: 3,4-Methylenedioxymethamphetamine (MDMA, ecstasy) use has been associated with cardiac arrhythmias. Markers of ventricular late potentials (VLP), which may be a precursor to malignant ventricular arrhythmias, can be detected by signal-averaged electrocardiography (SA-ECG), but not by standard ECG. Methods: We evaluated SA-ECG parameters in 21 physically healthy, recently abstinent MDMA users who also used cannabis (11 males, mean [SD] age 23.3 [4.6] years, 2.8 [2.0] years of use), 18 physically healthy cannabis users (8 males, mean [SD] age 26.6 [7.1] years, 11.2 [5.4] years of use) and 54 non-drug-using controls (21 males, mean [SD] age 28.4 [7.8] years). We analyzed three SA-ECG parameters considered markers of VLPs: duration of the filtered QRS complex (fQRS), duration of low-amplitude potentials during the terminal 40 ms of the QRS complex (LAS40), and root mean square voltage during the terminal 40 ms of the QRS complex (RMS40). Results: MDMA users, cannabis users, and non-drug-using controls did not differ significantly from each other in fQRS, LAS40, or RMS40 values or in the proportion of subjects with abnormal SA-ECG parameters. There were significant gender differences among controls, but not among MDMA users. Conclusion: These findings suggest that chronic MDMA use is neither quantitatively nor qualitatively associated with a high prevalence of abnormal SA-ECG parameters indicative of VLP markers. PMID:18855243
IoT-Based User-Driven Service Modeling Environment for a Smart Space Management System
Choi, Hoan-Suk; Rhee, Woo-Seop
2014-01-01
The existing Internet environment has been extended to the Internet of Things (IoT) as an emerging new paradigm. The IoT connects various physical entities. These entities have communication capability and deploy the observed information to various service areas such as building management, energy-saving systems, surveillance services, and smart homes. These services are designed and developed by professional service providers. Moreover, users' needs have become more complicated and personalized with the spread of user-participation services such as social media and blogging. Therefore, some active users want to create their own services to satisfy their needs, but the existing IoT service-creation environment is difficult for the non-technical user because it requires a programming capability to create a service. To solve this problem, we propose the IoT-based user-driven service modeling environment to provide an easy way to create IoT services. Also, the proposed environment deploys the defined service to another user. Through the personalization and customization of the defined service, the value and dissemination of the service is increased. This environment also provides the ontology-based context-information processing that produces and describes the context information for the IoT-based user-driven service. PMID:25420153
Large-area landslide susceptibility with optimized slope-units
NASA Astrophysics Data System (ADS)
Alvioli, Massimiliano; Marchesini, Ivan; Reichenbach, Paola; Rossi, Mauro; Ardizzone, Francesca; Fiorucci, Federica; Guzzetti, Fausto
2017-04-01
A Slope-Unit (SU) is a type of morphological terrain unit bounded by drainage and divide lines that maximize the within-unit homogeneity and the between-unit heterogeneity across distinct physical and geographical boundaries [1]. Compared to other terrain subdivisions, SU are morphological terrain units well related to the natural (i.e., geological, geomorphological, hydrological) processes that shape and characterize natural slopes. This makes SU easily recognizable in the field or in topographic base maps, and well suited for environmental and geomorphological analysis, in particular for landslide susceptibility (LS) modelling. An optimal subdivision of an area into a set of SU depends on multiple factors: size and complexity of the study area, quality and resolution of the available terrain elevation data, purpose of the terrain subdivision, and scale and resolution of the phenomena for which SU are delineated. We use the recently developed r.slopeunits software [2,3] for the automatic, parametric delineation of SU within the open-source GRASS GIS, based on terrain elevation data and a small number of user-defined parameters. The software provides subdivisions consisting of SU with different shapes and sizes, as a function of the input parameters. In this work, we describe a procedure for the optimal selection of the user parameters through the production of a large number of realizations of the LS model. We tested the software and the optimization procedure in a 2,000 km2 area in Umbria, Central Italy. For LS zonation we adopt a logistic regression model (LRM) implemented in well-known software [4,5], using about 50 independent variables. To select the optimal SU partition for LS zonation, we define a metric that quantifies simultaneously: (i) slope-unit internal homogeneity, (ii) slope-unit external heterogeneity, and (iii) landslide susceptibility model performance. To this end, we define a comprehensive objective function S as the product of three normalized objective functions addressing points (i)-(iii) independently: an intra-segment variance function V, Moran's autocorrelation index I, and the AUCROC function R arising from the application of the logistic regression model. Maximization of the objective function S = f(I,V,R) as a function of the r.slopeunits input parameters provides an objective and reproducible way to select the optimal parameter combination for a proper SU subdivision for LS modelling. We further analyze the statistical significance of the LS models as a function of the r.slopeunits input parameters, focusing on the degree of coarseness of each subdivision. We find that the LRM, when applied to subdivisions with large average SU size, has very poor statistical significance, with only a few (5%, typically lithological) variables being used in the regression due to the large heterogeneity of all variables within each unit, while up to 35% of the variables are used when SU are very small. This behavior was largely expected and provides further evidence that an objective method to select SU size is highly desirable.
[1] Guzzetti, F. et al., Geomorphology 31 (1999), 181-216
[2] Alvioli, M. et al., Geoscientific Model Development 9 (2016), 3975-3991
[3] http://geomorphology.irpi.cnr.it/tools/slope-units
[4] Rossi, M. et al., Geomorphology 114 (2010), 129-142
[5] Rossi, M. and Reichenbach, P., Geoscientific Model Development 9 (2016), 3533-3543
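Schematically, the combined objective S = f(I, V, R) described above might be evaluated as in the following sketch; the normalization choices here are illustrative assumptions, not the choices of [2,3], and the candidate parameter values are hypothetical:

```python
# Schematic of the combined objective S = f(I, V, R) described above; the
# normalizations are illustrative assumptions, not those of [2,3].
def normalize(x, lo, hi):
    return (x - lo) / (hi - lo)

def combined_objective(moran_I, intra_variance, auc_roc, v_max=1.0):
    """Higher is better: homogeneous units (low V), heterogeneous neighbors
    (low spatial autocorrelation I), well-performing LS model (high R)."""
    i_term = 1.0 - normalize(moran_I, -1.0, 1.0)
    v_term = 1.0 - normalize(intra_variance, 0.0, v_max)
    return i_term * v_term * auc_roc

# Each candidate r.slopeunits parameter set yields an (I, V, R) triple;
# pick the set maximizing S (hypothetical values):
candidates = {"coarse": (0.35, 0.55, 0.74), "fine": (0.20, 0.40, 0.78)}
best = max(candidates, key=lambda p: combined_objective(*candidates[p]))
print("best subdivision:", best)
```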
Ruefli, Terry; Rogers, Susan J
2004-01-01
Background Harm reduction is a relatively new and controversial model for treating drug users, with little formal research on its operation and effectiveness. In order to advance the study of harm reduction programs and our understanding of how drug users define their progress, qualitative research was conducted to develop outcomes of harm reduction programming that are culturally relevant, incremental, (i.e., capable of measuring change), and hierarchical (i.e., capable of showing how clients improve over time). Methods The study used nominal group technique (NGT) to develop the outcomes (phase 1) and focus group interviews to help validate the findings (phase 2). Study participants were recruited from a large harm-reduction program in New York City and involved approximately 120 clients in 10 groups in phase 1 and 120 clients in 10 focus groups in phase 2. Results Outcomes of 10 life areas important to drug users were developed that included between 10 to 15 incremental measures per outcome. The outcomes included ways of 1) making money; 2) getting something good to eat; 3) being housed/homeless; 4) relating to families; 5) getting needed programs/benefits/services; 6) handling health problems; 7) handling negative emotions; 8) handling legal problems; 9) improving oneself; and 10) handling drug-use problems. Findings also provided insights into drug users' lives and values, as well as a window into understanding how this population envisions a better quality of life. Results challenged traditional ways of measuring drug users based solely on quantity used and frequency of use. They suggest that more appropriate measures are based on the extent to which drug users organize their lives around drug use and how much drug use is integrated into their lives and negatively impacts other aspects of their lives. Conclusions Harm reduction and other programs serving active drug users and other marginalized people should not rely on institutionalized, provider-defined solutions to problems in living faced by their clients. PMID:15333130
2010-11-01
...subsections discuss the design of the simulations. 3.12.1 Lanchester5D Simulation: A Lanchester simulation was developed to conduct performance benchmarks using the WarpIV Kernel and HyperWarpSpeed. The Lanchester simulation contains a user-definable number of grid cells in which blue and red forces engage in battle using Lanchester equations. Having a user-definable number of grid cells enables the simulation to be stressed with high entity...
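Lanchester's aimed-fire ("square-law") equations are dB/dt = -r·R and dR/dt = -b·B for blue and red force strengths B and R. A minimal per-cell Euler update might look like the following sketch (attrition coefficients and step size are hypothetical):

```python
# Minimal sketch of square-law Lanchester attrition for one grid cell
# (attrition coefficients and time step are hypothetical).
def lanchester_step(blue, red, b_eff=0.02, r_eff=0.015, dt=1.0):
    """One Euler step of dB/dt = -r_eff * R, dR/dt = -b_eff * B."""
    new_blue = max(0.0, blue - r_eff * red * dt)
    new_red = max(0.0, red - b_eff * blue * dt)
    return new_blue, new_red

blue, red = 1000.0, 800.0
for _ in range(100):
    blue, red = lanchester_step(blue, red)
print(round(blue), round(red))   # blue's numerical edge compounds over time
```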
Autopilot for frequency-modulation atomic force microscopy.
Kuchuk, Kfir; Schlesinger, Itai; Sivan, Uri
2015-10-01
One of the most challenging aspects of operating an atomic force microscope (AFM) is finding optimal feedback parameters. This statement applies particularly to frequency-modulation AFM (FM-AFM), which utilizes three feedback loops to control the cantilever excitation amplitude, cantilever excitation frequency, and z-piezo extension. These loops are regulated by a set of feedback parameters, tuned by the user to optimize stability, sensitivity, and noise in the imaging process. Optimization of these parameters is difficult due to the coupling between the frequency and z-piezo feedback loops by the non-linear tip-sample interaction. Four proportional-integral (PI) parameters and two lock-in parameters regulating these loops require simultaneous optimization in the presence of a varying unknown tip-sample coupling. Presently, this optimization is done manually in a tedious process of trial and error. Here, we report on the development and implementation of an algorithm that computes the control parameters automatically. The algorithm reads the unperturbed cantilever resonance frequency, its quality factor, and the z-piezo driving signal power spectral density. It analyzes the poles and zeros of the total closed loop transfer function, extracts the unknown tip-sample transfer function, and finds four PI parameters and two lock-in parameters for the frequency and z-piezo control loops that optimize the bandwidth and step response of the total system. Implementation of the algorithm in a home-built AFM shows that the calculated parameters are consistently excellent and rarely require further tweaking by the user. The new algorithm saves the precious time of experienced users, facilitates utilization of FM-AFM by casual users, and removes the main hurdle on the way to fully automated FM-AFM.
Configuration control of seven-degree-of-freedom arms
NASA Technical Reports Server (NTRS)
Seraji, Homayoun (Inventor); Long, Mark K. (Inventor); Lee, Thomas S. (Inventor)
1992-01-01
A seven-degree-of-freedom robot arm with a six-degree-of-freedom end effector is controlled by a processor employing a 6-by-7 Jacobian matrix for defining location and orientation of the end effector in terms of the rotation angles of the joints, and a 1 (or more)-by-7 Jacobian matrix for defining 1 (or more) user-specified kinematic functions constraining location or movement of selected portions of the arm in terms of the joint angles. The processor combines the two Jacobian matrices to produce an augmented 7 (or more)-by-7 Jacobian matrix, and effects control by computing, in accordance with forward kinematics from the augmented 7-by-7 Jacobian matrix and from the seven joint angles of the arm, a set of seven desired joint angles for transmittal to the joint servo loops of the arm. One of the kinematic functions constrains the orientation of the elbow plane of the arm. Another one of the kinematic functions minimizes a sum of gravitational torques on the joints. Still another kinematic function constrains the location of the arm to perform collision avoidance. Generically, one kinematic function minimizes a sum of selected mechanical parameters of at least some of the joints, associated with weighting coefficients that may be changed during arm movement. The mechanical parameters may be velocity errors or gravity torques associated with individual joints.
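Numerically, the augmentation described here amounts to stacking the 6-by-7 end-effector Jacobian with the 1-by-7 gradient of the user-defined kinematic function and solving the resulting square system for the seven joint rates. A sketch with placeholder matrices (not a model of a real arm):

```python
# Numerical sketch of configuration control via an augmented Jacobian
# (matrices here are random placeholders, not a model of a real arm).
import numpy as np

rng = np.random.default_rng(0)
J_ee = rng.standard_normal((6, 7))   # 6x7 end-effector Jacobian
J_c = rng.standard_normal((1, 7))    # 1x7 gradient of user kinematic function
J_aug = np.vstack([J_ee, J_c])       # augmented 7x7 Jacobian

xdot = np.zeros(6)
xdot[0] = 0.05                       # desired end-effector rates (m/s, rad/s)
cdot = np.array([0.0])               # hold the kinematic function constant

qdot = np.linalg.solve(J_aug, np.concatenate([xdot, cdot]))
print(qdot.round(4))                 # seven joint rates for the servo loops
```

The extra row is what makes the otherwise redundant 7-joint system square, so the additional constraint (elbow orientation, torque minimization, collision avoidance) is satisfied simultaneously with the end-effector task.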
Configuration control of seven degree of freedom arms
NASA Technical Reports Server (NTRS)
Seraji, Homayoun (Inventor)
1995-01-01
A seven-degree-of-freedom robot arm with a six-degree-of-freedom end effector is controlled by a processor employing a 6-by-7 Jacobian matrix for defining location and orientation of the end effector in terms of the rotation angles of the joints, and a 1 (or more)-by-7 Jacobian matrix for defining 1 (or more) user-specified kinematic functions constraining location or movement of selected portions of the arm in terms of the joint angles. The processor combines the two Jacobian matrices to produce an augmented 7 (or more)-by-7 Jacobian matrix, and effects control by computing, in accordance with forward kinematics from the augmented 7-by-7 Jacobian matrix and from the seven joint angles of the arm, a set of seven desired joint angles for transmittal to the joint servo loops of the arm. One of the kinematic functions constrains the orientation of the elbow plane of the arm. Another one of the kinematic functions minimizes a sum of gravitational torques on the joints. Still another one of the kinematic functions constrains the location of the arm to perform collision avoidance. Generically, one of the kinematic functions minimizes a sum of selected mechanical parameters of at least some of the joints, associated with weighting coefficients that may be changed during arm movement. The mechanical parameters may be velocity errors or position errors or gravity torques associated with individual joints.
Automatic user customization for improving the performance of a self-paced brain interface system.
Fatourechi, Mehrdad; Bashashati, Ali; Birch, Gary E; Ward, Rabab K
2006-12-01
Customizing the parameter values of brain interface (BI) systems by a human expert has the advantage of being fast and computationally efficient. However, as the number of users and EEG channels grows, this process becomes increasingly time consuming and exhausting. Manual customization also introduces inaccuracies in the estimation of the parameter values. In this paper, the performance of a self-paced BI system whose design parameter values were automatically user customized using a genetic algorithm (GA) is studied. The GA automatically estimates the shapes of movement-related potentials (MRPs), whose features are then extracted to drive the BI. Offline analysis of the data of eight subjects revealed that automatic user customization improved the true positive (TP) rate of the system by an average of 6.68% over that whose customization was carried out by a human expert, i.e., by visually inspecting the MRP templates. On average, the best improvement in the TP rate (an average of 9.82%) was achieved for four individuals with spinal cord injury. In this case, the visual estimation of the parameter values of the MRP templates was very difficult because of the highly noisy nature of the EEG signals. For four able-bodied subjects, for which the MRP templates were less noisy, the automatic user customization led to an average improvement of 3.58% in the TP rate. The results also show that the inter-subject variability of the TP rate is also reduced compared to the case when user customization is carried out by a human expert. These findings provide some primary evidence that automatic user customization leads to beneficial results in the design of a self-paced BI for individuals with spinal cord injury.
Identifying Bearing Rotordynamic Coefficients using an Extended Kalman Filter
NASA Technical Reports Server (NTRS)
Miller, Brad A.; Howard, Samuel A.
2008-01-01
An Extended Kalman Filter is developed to estimate the linearized direct and indirect stiffness and damping force coefficients for bearings in rotor-dynamic applications from noisy measurements of the shaft displacement in response to imbalance and impact excitation. The bearing properties are modeled as stochastic random variables using a Gauss-Markov model. Noise terms are introduced into the system model to account for all of the estimation error, including modeling errors and uncertainties and the propagation of measurement errors into the parameter estimates. The system model contains two user-defined parameters that can be tuned to improve the filter's performance; these parameters correspond to the covariance of the system and measurement noise variables. The filter is also strongly influenced by the initial values of the states and the error covariance matrix. The filter is demonstrated using numerically simulated data for a rotor-bearing system with two identical bearings, which reduces the number of unknown linear dynamic coefficients to eight. The filter estimates for the direct damping coefficients and all four stiffness coefficients correlated well with actual values, whereas the estimates for the cross-coupled damping coefficients were the least accurate.
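As a schematic of the approach, a single bearing coefficient can be carried as a Gauss-Markov state in a scalar Kalman recursion, with the two user-tuned covariances Q and R playing exactly the tuning role described above. All values below are illustrative assumptions, not taken from the paper:

```python
# Schematic scalar Kalman recursion for one bearing coefficient modeled as a
# Gauss-Markov state. Q and R are the two user-tuned covariances discussed
# above; every number here is an illustrative assumption, not from the paper.
import numpy as np

rng = np.random.default_rng(1)
k_true = 2.0e6                            # "true" stiffness to be recovered

def measurements(n=500):
    for _ in range(n):
        h = rng.uniform(0.5, 1.5) * 1e-6  # sensitivity dz/dk (placeholder)
        yield h * k_true + rng.normal(0.0, 1e-4), h

dt, tau = 1e-4, 0.5                       # step size; correlation time
phi = float(np.exp(-dt / tau))            # Gauss-Markov propagation factor
Q, R = 1e6, 1e-8                          # tunable process/measurement noise

k_est, P = 1.0e6, 1e12                    # initial estimate and covariance
for z, h in measurements():
    k_est, P = phi * k_est, phi * P * phi + Q      # propagate
    K = P * h / (h * P * h + R)                    # Kalman gain
    k_est += K * (z - h * k_est)                   # measurement update
    P = (1.0 - K * h) * P
print(f"estimated stiffness ~ {k_est:.3e}")
```

Raising Q makes the filter track parameter drift more aggressively; raising R makes it trust the displacement measurements less, which is the trade-off the user tunes.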
Fall, Kelsey A.; Harris, Courtney K.; Friedrichs, Carl T.; Rinehimer, J. Paul; Sherwood, Christopher R.
2014-01-01
The Community Sediment Transport Modeling System (CSTMS) cohesive bed sub-model that accounts for erosion, deposition, consolidation, and swelling was implemented in a three-dimensional domain to represent the York River estuary, Virginia. The objectives of this paper are to (1) describe the application of the three-dimensional hydrodynamic York Cohesive Bed Model, (2) compare calculations to observations, and (3) investigate sensitivities of the cohesive bed sub-model to user-defined parameters. Model results for summer 2007 showed good agreement with tidal-phase averaged estimates of sediment concentration, bed stress, and current velocity derived from Acoustic Doppler Velocimeter (ADV) field measurements. An important step in implementing the cohesive bed model was specification of both the initial and equilibrium critical shear stress profiles, in addition to choosing other parameters like the consolidation and swelling timescales. This model promises to be a useful tool for investigating the fundamental controls on bed erodibility and settling velocity in the York River, a classical muddy estuary, provided that appropriate data exists to inform the choice of model parameters.
Filter parameter tuning analysis for operational orbit determination support
NASA Technical Reports Server (NTRS)
Dunham, J.; Cox, C.; Niklewski, D.; Mistretta, G.; Hart, R.
1994-01-01
The use of an extended Kalman filter (EKF) for operational orbit determination support is being considered by the Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD). To support that investigation, analysis was performed to determine how an EKF can be tuned for operational support of a set of earth-orbiting spacecraft. The objectives of this analysis were to design and test a general purpose scheme for filter tuning, evaluate the solution accuracies, and develop practical methods to test the consistency of the EKF solutions in an operational environment. The filter was found to be easily tuned to produce estimates that were consistent, agreed with results from batch estimation, and compared well among the common parameters estimated for several spacecraft. The analysis indicates that there is not a sharply defined 'best' tunable parameter set, especially when considering only the position estimates over the data arc. The comparison of the EKF estimates for the user spacecraft showed that the filter is capable of high-accuracy results and can easily meet the current accuracy requirements for the spacecraft included in the investigation. The conclusion is that the EKF is a viable option for FDD operational support.
Extension of the PC version of VEPFIT with input and output routines running under Windows
NASA Astrophysics Data System (ADS)
Schut, H.; van Veen, A.
1995-01-01
The fitting program VEPFIT has been extended with applications running under the Microsoft Windows environment, facilitating the input and output of the VEPFIT fitting module. We have exploited the Microsoft Windows graphical user interface by making use of dialog windows, scrollbars, command buttons, etc. The user communicates with the program simply by clicking and dragging with the mouse pointing device. Keyboard actions are limited to a minimum. Upon changing one or more input parameters, the results of the modeling of the S-parameter and Ps fractions versus positron implantation energy are updated and displayed. This action can be considered as the first step in the fitting procedure, upon which the user can decide to further adapt the input parameters or to forward these parameters as initial values to the fitting routine. The modeling step has proven to be helpful for designing positron beam experiments.
iMOSFLM: a new graphical interface for diffraction-image processing with MOSFLM
Battye, T. Geoff G.; Kontogiannis, Luke; Johnson, Owen; Powell, Harold R.; Leslie, Andrew G. W.
2011-01-01
iMOSFLM is a graphical user interface to the diffraction data-integration program MOSFLM. It is designed to simplify data processing by dividing the process into a series of steps, which are normally carried out sequentially. Each step has its own display pane, allowing control over parameters that influence that step and providing graphical feedback to the user. Suitable values for integration parameters are set automatically, but additional menus provide a detailed level of control for experienced users. The image display and the interfaces to the different tasks (indexing, strategy calculation, cell refinement, integration and history) are described. The most important parameters for each step and the best way of assessing success or failure are discussed. PMID:21460445
User's manual for MMLE3, a general FORTRAN program for maximum likelihood parameter estimation
NASA Technical Reports Server (NTRS)
Maine, R. E.; Iliff, K. W.
1980-01-01
A user's manual for the FORTRAN IV computer program MMLE3 is described. It is a maximum likelihood parameter estimation program capable of handling general bilinear dynamic equations of arbitrary order with measurement noise and/or state noise (process noise). The theory and use of the program are described. The basic MMLE3 program is quite general and, therefore, applicable to a wide variety of problems. The basic program can interact with a set of user-written, problem-specific routines to simplify the use of the program on specific systems. A set of user routines for the aircraft stability and control derivative estimation problem is provided with the program.
Dairy Analytics and Nutrient Analysis (DANA) Prototype System User Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sam Alessi; Dennis Keiser
2012-10-01
This document is a user manual for the Dairy Analytics and Nutrient Analysis (DANA) model. DANA provides an analysis of dairy anaerobic digestion technology and allows users to calculate biogas production, co-product valuation, capital costs, expenses, revenue, and financial metrics for user-customizable scenarios, dairy types, and digester types. The model provides results for three anaerobic digester types (covered lagoon, modified plug flow, and complete mix) and three main energy production technologies: electricity generation, renewable natural gas generation, and compressed natural gas generation. Additional options include different dairy types, bedding types, and backend treatment types, as well as numerous production and economic parameters. DANA's goal is to extend the National Market Value of Anaerobic Digester Products analysis (Informa Economics, 2012; Innovation Center, 2011) to include a greater and more flexible set of regional digester scenarios and to provide a modular framework for creation of a tool to support farmer and investor needs. Users can set up scenarios from combinations of existing parameters or add new parameters, run the model, and view a variety of reports, charts, and tables that are automatically produced and delivered over the web interface. DANA is based on the INL's analysis architecture entitled Generalized Environment for Modeling Systems (GEMS), which offers extensive collaboration, analysis, and integration opportunities and greatly speeds the ability to construct highly scalable, web-delivered, user-oriented decision tools. DANA uses server-based data processing and web-based user interfaces rather than a client-based spreadsheet approach, which offers a number of benefits. Server processing and storage can scale up to handle a very large number of scenarios, so that analysis at the county or even field level across the whole U.S. can be performed. Server-based databases allow dairy and digester parameters to be held and managed in a single managed data repository, while allowing users to customize standard values and perform individual analyses. Server-based calculations can be easily extended, versions and upgrades can be managed, and any changes are immediately available to all users. This user manual describes how to use and/or modify input database tables, run DANA, and view and modify reports.
The HITRAN2016 molecular spectroscopic database
NASA Astrophysics Data System (ADS)
Gordon, I. E.; Rothman, L. S.; Hill, C.; Kochanov, R. V.; Tan, Y.; Bernath, P. F.; Birk, M.; Boudon, V.; Campargue, A.; Chance, K. V.; Drouin, B. J.; Flaud, J.-M.; Gamache, R. R.; Hodges, J. T.; Jacquemart, D.; Perevalov, V. I.; Perrin, A.; Shine, K. P.; Smith, M.-A. H.; Tennyson, J.; Toon, G. C.; Tran, H.; Tyuterev, V. G.; Barbe, A.; Császár, A. G.; Devi, V. M.; Furtenbacher, T.; Harrison, J. J.; Hartmann, J.-M.; Jolly, A.; Johnson, T. J.; Karman, T.; Kleiner, I.; Kyuberis, A. A.; Loos, J.; Lyulin, O. M.; Massie, S. T.; Mikhailenko, S. N.; Moazzen-Ahmadi, N.; Müller, H. S. P.; Naumenko, O. V.; Nikitin, A. V.; Polyansky, O. L.; Rey, M.; Rotger, M.; Sharpe, S. W.; Sung, K.; Starikova, E.; Tashkun, S. A.; Auwera, J. Vander; Wagner, G.; Wilzewski, J.; Wcisło, P.; Yu, S.; Zak, E. J.
2017-12-01
This paper describes the contents of the 2016 edition of the HITRAN molecular spectroscopic compilation. The new edition replaces the previous HITRAN edition of 2012 and its updates during the intervening years. The HITRAN molecular absorption compilation is composed of five major components: the traditional line-by-line spectroscopic parameters required for high-resolution radiative-transfer codes, infrared absorption cross-sections for molecules not yet amenable to representation in a line-by-line form, collision-induced absorption data, aerosol indices of refraction, and general tables such as partition sums that apply globally to the data. The new HITRAN is greatly extended in terms of accuracy, spectral coverage, additional absorption phenomena, added line-shape formalisms, and validity. Moreover, molecules, isotopologues, and perturbing gases have been added that address the issues of atmospheres beyond the Earth. Of considerable note, experimental IR cross-sections for almost 300 additional molecules important in different areas of atmospheric science have been added to the database. The compilation can be accessed through www.hitran.org. Most of the HITRAN data have now been cast into an underlying relational database structure that offers many advantages over the long-standing sequential text-based structure. The new structure empowers the user in many ways. It enables the incorporation of an extended set of fundamental parameters per transition, sophisticated line-shape formalisms, easy user-defined output formats, and very convenient searching, filtering, and plotting of data. A powerful application programming interface making use of structured query language (SQL) features for higher-level applications of HITRAN is also provided.
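For readers who want to pull line-by-line data programmatically, the database's Python interface (HAPI) supports queries along these lines; the molecule/isotopologue IDs, wavenumber window, and folder name below are illustrative, and the call pattern should be checked against the HAPI documentation.

```python
# Hedged sketch of fetching HITRAN water-vapour lines with HAPI.
from hapi import db_begin, fetch

db_begin('hitran_data')        # local folder where downloaded tables live
# Molecule 1 (H2O), isotopologue 1, lines between 3400 and 4100 cm^-1.
fetch('H2O', 1, 1, 3400, 4100)
```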
Perceived constraints by non-traditional users on the Mt. Baker-Snoqualmie National Forest
Elizabeth A. Covelli; Robert C. Burns; Alan Graefe
2007-01-01
The purpose of this study was to investigate the constraints that non-traditional users face, along with the negotiation strategies that are employed in order to start, continue, or increase participation in recreation on a national forest. Non-traditional users were defined as respondents who were not Caucasian. Additionally, both constraints and negotiation...
Perceived Security Determinants in E-Commerce among Turkish University Students
ERIC Educational Resources Information Center
Yenisey, M.M.; Ozok, A.A.; Salvendy, G.
2005-01-01
Perceived security is defined as the level of security that users feel while they are shopping on e-commerce sites. The aims of this study were to determine items that positively influence this feeling of security by users during shopping, and to develop guidelines for perceived security in e-commerce. An experiment allowed users with different…
Proba-V Mission Exploitation Platform
NASA Astrophysics Data System (ADS)
Goor, E.
2017-12-01
VITO and partners developed the Proba-V Mission Exploitation Platform (MEP) as an end-to-end solution to drastically improve the exploitation of the EO-data archive of Proba-V (an EC Copernicus contributing mission), the past mission SPOT-VEGETATION, and derived vegetation parameters by researchers, service providers (e.g. the EC Copernicus Global Land Service), and end-users. The analysis of time series of data (PB range) is addressed, as well as the large-scale on-demand processing of near-real-time data on a powerful and scalable processing environment. New features are still being developed, but the platform has been fully operational since November 2016 and offers:
- A time series viewer (browser web client and API), showing the evolution of Proba-V bands and derived vegetation parameters for any country, region, pixel, or polygon defined by the user.
- Full-resolution viewing services for the complete data archive.
- On-demand processing chains on a powerful Hadoop/Spark backend.
- Virtual Machines, which can be requested by users with access to the complete data archive mentioned above and pre-configured tools to work with the data, e.g. various toolboxes and support for R and Python. This allows users to immediately work with the data without having to install tools or download data, as well as to design, debug, and test applications on the platform.
- Jupyter Notebooks, with some example Python and R projects worked out to show the potential of the data.
Today the platform is already used by several international third-party projects to perform R&D activities on the data and to develop/host data analysis toolboxes. From the Proba-V MEP, access to other data sources such as Sentinel-2 and Landsat data is also addressed. Selected components of the MEP are also deployed on public cloud infrastructures in various R&D projects. Users can make use of powerful web-based tools and can self-manage virtual machines to perform their work on the infrastructure at VITO with access to the complete data archive. To realise this, private cloud technology (OpenStack) is used and a distributed processing environment is built on Hadoop. The Hadoop ecosystem offers many technologies (Spark, Yarn, Accumulo) which we integrate with several open-source components (e.g. GeoTrellis).
Some implications of an event-based definition of exposure to the risk of road accident.
Elvik, Rune
2015-03-01
This paper proposes a new definition of exposure to the risk of road accident as any event, limited in space and time, representing a potential for an accident to occur by bringing road users close to each other in time or space, or by requiring a road user to take action to avoid leaving the roadway. A typology of events representing a potential for an accident is proposed. Each event can be interpreted as a trial as defined in probability theory. Risk is the proportion of events that result in an accident. Defining exposure as events demanding the attention of road users implies that road users will learn from repeated exposure to these events, which in turn implies that there will normally be a negative relationship between exposure and risk. Four hypotheses regarding the relationship between exposure and risk are proposed. Preliminary tests support these hypotheses. Advantages and disadvantages of defining exposure as specific events are discussed. It is argued that developments in vehicle technology are likely to make events both observable and countable, thus ensuring that exposure is an operational concept.
Research Challenges in Managing and Using Service Level Agreements
NASA Astrophysics Data System (ADS)
Rana, Omer; Ziegler, Wolfgang
A Service Level Agreement (SLA) represents an agreement between a service user and a provider in the context of a particular service provision. SLAs contain Quality of Service properties that must be maintained by a provider, as agreed between the provider and a user/client. These are generally defined as a set of Service Level Objectives (SLOs). These properties need to be measurable and must be monitored during the provision of the service that has been agreed in the SLA. The SLA must also contain a set of penalty clauses specifying what happens when service providers fail to deliver the pre-agreed quality. Hence, an SLA may be used by both a user and a provider: from a user's perspective, an SLA defines what is required, often in terms of non-functional attributes of service provision; from a provider's perspective, an SLA may be used to support capacity planning, especially if a provider is making its capability available to multiple users. An SLA may also be used by a client and provider to manage their behaviour over time, for instance to optimise long-running revenue (cost) or QoS attributes such as execution time. The lifecycle of an SLA is outlined, along with various uses of SLAs to support infrastructure management. A discussion of WS-Agreement, the emerging standard for specifying SLAs, is also provided.
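As a rough illustration of the structure described above (not the WS-Agreement schema), an SLA can be modeled as a set of measurable SLOs plus a penalty clause; everything here, including the field names, is an assumption made for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class SLO:
    metric: str        # measurable QoS property, e.g. "response_time_ms"
    target: float      # agreed threshold
    comparator: str    # "<=" or ">="

@dataclass
class SLA:
    provider: str
    consumer: str
    objectives: list = field(default_factory=list)
    penalty: str = ""  # clause applied when an objective is violated

    def violated(self, measurements):
        """Return the SLOs that the monitored measurements fail to meet."""
        ops = {"<=": lambda a, b: a <= b, ">=": lambda a, b: a >= b}
        return [o for o in self.objectives
                if not ops[o.comparator](measurements[o.metric], o.target)]

sla = SLA("providerA", "clientB", [SLO("response_time_ms", 200.0, "<=")])
print(sla.violated({"response_time_ms": 350.0}))   # -> the breached SLO
```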
Measuring the Interestingness of Articles in a Limited User Environment Prospectus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pon, Raymond K.
2007-04-18
Search engines, such as Google, assign scores to news articles based on their relevancy to a query. However, not all relevant articles for the query may be interesting to a user. For example, if the article is old or yields little new information, the article would be uninteresting. Relevancy scores do not take into account what makes an article interesting, which varies from user to user. Although methods such as collaborative filtering have been shown to be effective in recommendation systems, in a limited user environment there are not enough users to make collaborative filtering effective. I present a general framework for defining and measuring the "interestingness" of articles, called iScore, incorporating user feedback, including tracking multiple topics of interest as well as finding interesting entities or phrases in a complex relationship network. I propose and have shown the validity of the following: 1. Filtering based on only topic relevancy is insufficient for identifying interesting articles. 2. No single feature can characterize the interestingness of an article for a user. It is the combination of multiple features that yields higher quality results. For each user, these features have different degrees of usefulness for predicting interestingness. 3. Through user feedback, a classifier can combine features to predict interestingness for the user. 4. Current evaluation corpora, such as TREC, do not capture all aspects of personalized news filtering systems necessary for system evaluation. 5. Focusing on only specific evolving user interests instead of all topics allows for more efficient resource utilization while yielding high-quality recommendation results. 6. Multiple profile vectors yield significantly better results than traditional methods, such as the Rocchio algorithm, for identifying interesting articles. Additionally, the addition of tracking multiple topics as a new feature in iScore can improve iScore's classification performance. 7. Multiple topic tracking yields better results than the best results from the last TREC adaptive filtering run. As future work, I will address the following hypothesis: entities and the relationships among these entities, using current information extraction technology, can be utilized to identify entities and relationships of interest, using a scheme such as PageRank. I will also address one of the following two hypotheses: 1. By addressing the multiple reading roles that a single user may have, classification results can be improved. 2. By tailoring the operating parameters of MTT, better classification results can be achieved.
The Factor Finder CD-ROM is a user-friendly, searchable tool used to locate exposure factors and sociodemographic data for user-defined populations. Factor Finder improves exposure assessors' and risk assessors' ability to efficiently locate exposure-related informatio...
Computer systems and methods for the query and visualization of multidimensional databases
Stolte, Chris; Tang, Diane L; Hanrahan, Patrick
2015-03-03
A computer displays a graphical user interface on its display. The graphical user interface includes a schema information region and a data visualization region. The schema information region includes multiple operand names, each operand corresponding to one or more fields of a multi-dimensional database that includes at least one data hierarchy. The data visualization region includes a columns shelf and a rows shelf. The computer detects user actions to associate one or more first operands with the columns shelf and to associate one or more second operands with the rows shelf. The computer generates a visual table in the data visualization region in accordance with the user actions. The visual table includes one or more panes. Each pane has an x-axis defined based on data for the one or more first operands, and each pane has a y-axis defined based on data for the one or more second operands.
Computer systems and methods for the query and visualization of multidimensional databases
Stolte, Chris; Tang, Diane L.; Hanrahan, Patrick
2015-11-10
A computer displays a graphical user interface on its display. The graphical user interface includes a schema information region and a data visualization region. The schema information region includes a plurality of fields of a multi-dimensional database that includes at least one data hierarchy. The data visualization region includes a columns shelf and a rows shelf. The computer detects user actions to associate one or more first fields with the columns shelf and to associate one or more second fields with the rows shelf. The computer generates a visual table in the data visualization region in accordance with the user actions. The visual table includes one or more panes. Each pane has an x-axis defined based on data for the one or more first fields, and each pane has a y-axis defined based on data for the one or more second fields.
NASA Technical Reports Server (NTRS)
1991-01-01
The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) code, including fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
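The response/resistance comparison at the heart of component reliability can be illustrated with a few lines of Monte Carlo sampling; the distributions and numbers below are invented for the sketch and are not NESSUS models.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Illustrative choices: a normal response (e.g. stress) against a
# lognormal resistance (e.g. strength); NESSUS supports user-defined models.
response = rng.normal(loc=450.0, scale=40.0, size=n)
resistance = rng.lognormal(mean=np.log(600.0), sigma=0.08, size=n)

p_failure = np.mean(resistance < response)   # P(resistance < response)
print(f"component reliability ~ {1.0 - p_failure:.5f}")
```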
Implementation and evaluation of LMS mobile application: scele mobile based on user-centered design
NASA Astrophysics Data System (ADS)
Banimahendra, R. D.; Santoso, H. B.
2018-03-01
The development of mobile technology is advancing rapidly, and there is growing demand that all activities, including learning, be supported on mobile devices. This indicates that implementing a mobile application as a learning medium is needed. This study describes the process of developing and evaluating SCeLE Mobile, a mobile application for the Moodle-based Learning Management System (LMS) Student Centered e-Learning Environment (SCeLE). It discusses the process of defining features, implementing those features in the application, and evaluating the application. We defined the features using user research and a literature study, implemented the application on a user-centered design basis, and in the last phase evaluated the application using usability testing and the System Usability Scale (SUS). The purpose of this study is to determine the extent to which this application can help users do their tasks and to provide recommendations for future research and development.
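For reference, the SUS computation the evaluation relies on is a short formula; this sketch assumes the conventional ten-item, 1-5 Likert scoring, and the sample responses are made up.

```python
def sus_score(responses):
    """System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items contribute (response - 1); even-numbered items
    contribute (5 - response); the sum is scaled by 2.5 onto 0-100.
    """
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```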
Jayashree, B; Rajgopal, S; Hoisington, D; Prasanth, V P; Chandra, S
2008-09-24
Structure is a widely used software tool to investigate population genetic structure with multi-locus genotyping data. The software uses an iterative algorithm to group individuals into "K" clusters, representing possibly K genetically distinct subpopulations. The serial implementation of this programme is processor-intensive even with small datasets. We describe an implementation of the program within a parallel framework. Speedup was achieved by running different replicates and values of K on each node of the cluster. A web-based, user-oriented GUI has been implemented in PHP, through which the user can specify input parameters for the programme. The number of processors to be used can be specified in the background command. A web-based visualization tool, "Visualstruct", written in PHP (with HTML and JavaScript embedded), allows for the graphical display of the population clusters output by Structure, where each individual may be visualized as a line segment with K colors defining its possible genomic composition with respect to the K genetic subpopulations. The advantage over available programs is the increased number of individuals that can be visualized. The analyses of real datasets indicate a speedup of up to four when comparing the speed of execution on clusters of eight processors with the speed of execution on one desktop. The software package is freely available to interested users upon request.
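The parallelization strategy (one replicate of one K value per worker) maps naturally onto a process pool; the sketch below is a local-machine analogue of the cluster scheme, and the structure command-line flags and paths are assumptions to be adapted to the actual installation.

```python
import itertools
import subprocess
from multiprocessing import Pool

def run_structure(job):
    k, rep = job
    out = f"results/K{k}_rep{rep}"
    # Hypothetical invocation; real runs would also point at mainparams files.
    subprocess.run(["structure", "-K", str(k), "-o", out], check=True)
    return out

if __name__ == "__main__":
    jobs = list(itertools.product(range(1, 11), range(1, 6)))  # K=1..10, 5 reps
    with Pool(processes=8) as pool:                 # one replicate per worker
        for finished in pool.imap_unordered(run_structure, jobs):
            print("done:", finished)
```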
Biometric template transformation: a security analysis
NASA Astrophysics Data System (ADS)
Nagar, Abhishek; Nandakumar, Karthik; Jain, Anil K.
2010-01-01
One of the critical steps in designing a secure biometric system is protecting the templates of the users that are stored either in a central database or on smart cards. If a biometric template is compromised, it leads to serious security and privacy threats because, unlike passwords, it is not possible for a legitimate user to revoke his biometric identifiers and switch to another set of uncompromised identifiers. One methodology for biometric template protection is the template transformation approach, where the template, consisting of the features extracted from the biometric trait, is transformed using parameters derived from a user-specific password or key. Only the transformed template is stored, and matching is performed directly in the transformed domain. In this paper, we formally investigate the security strength of template transformation techniques and define six metrics that facilitate a holistic security evaluation. Furthermore, we analyze the security of two well-known template transformation techniques, namely Biohashing and cancelable fingerprint templates, based on the proposed metrics. Our analysis indicates that both these schemes are vulnerable to intrusion and linkage attacks because it is relatively easy to obtain either a close approximation of the original template (Biohashing) or a pre-image of the transformed template (cancelable fingerprints). We argue that the security strength of template transformation techniques must also consider the computational complexity of obtaining a complete pre-image of the transformed template, in addition to the complexity of recovering the original biometric template.
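To ground the discussion, a toy version of the Biohashing transform (random projection keyed by a user-specific seed, followed by binarization) can be written in a few lines; the dimensions and the orthonormalization step are assumptions of this sketch, not the exact scheme analyzed in the paper.

```python
import numpy as np

def biohash(features, user_key, n_bits=64):
    """Toy Biohashing: project features onto key-seeded random orthonormal
    directions, then binarize. Assumes features.size >= n_bits."""
    rng = np.random.default_rng(user_key)          # key-derived randomness
    R = rng.standard_normal((features.size, n_bits))
    Q, _ = np.linalg.qr(R)                         # orthonormal columns
    projected = Q.T @ features                     # n_bits projections
    return (projected > 0).astype(np.uint8)        # stored bit-string template
```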
The complex network of musical tastes
NASA Astrophysics Data System (ADS)
Buldú, Javier M.; Cano, P.; Koppenberger, M.; Almendral, Juan A.; Boccaletti, S.
2007-06-01
We present an empirical study of the evolution of a social network constructed under the influence of musical tastes. The network is obtained thanks to the selfless effort of a broad community of users who share playlists of their favourite songs with other users. When two songs co-occur in a playlist, a link is created between them, leading to a complex network where songs are the fundamental nodes. In this representation, songs in the same playlist could belong to different musical genres, but they are prone to be linked by a certain musical taste (e.g. if songs A and B co-occur in several playlists, a user who likes A will probably also like B). Indeed, playlist collections such as the one under study are the basic material that feeds some commercial music recommendation engines. Since playlists have an input date, we are able to evaluate the topology of this particular complex network from scratch, observing how its characteristic parameters evolve in time. We compare our results with those obtained from an artificial network defined by means of a null model. This comparison yields some insight into the evolution and structure of such a network, which could be used as ground data for the development of proper models. Finally, we gather information that can be useful for the development of music recommendation engines and give some hints about how top hits appear.
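A minimal sketch of the network construction described above (songs as nodes, an edge whenever two songs co-occur in a playlist) using networkx; weighting edges by co-occurrence count is an assumption of the sketch.

```python
import itertools
import networkx as nx

def song_network(playlists):
    """Build a song co-occurrence graph from a list of playlists."""
    g = nx.Graph()
    for playlist in playlists:
        for a, b in itertools.combinations(set(playlist), 2):
            w = g[a][b]["weight"] + 1 if g.has_edge(a, b) else 1
            g.add_edge(a, b, weight=w)             # link co-occurring songs
    return g

g = song_network([["songA", "songB", "songC"], ["songA", "songB"]])
print(g["songA"]["songB"]["weight"])               # -> 2
```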
VizieR Online Data Catalog: Analytical model for irradiated atmospheres (Parmentier+, 2015)
NASA Astrophysics Data System (ADS)
Parmentier, V.; Guillot, T.; Fortney, J.; Marley, M.
2014-11-01
The model has six parameters to describe the opacities:
- {kappa}(N) is the Rosseland mean opacity at each level of the atmosphere; it does not have to be constant with depth.
- Gp is the ratio of the thermal Planck mean opacity to the thermal Rosseland mean opacity.
- Beta is the width ratio of the two thermal bands in frequency space.
- Gv1 is the ratio of the visible opacity in the first visible band to the thermal Rosseland mean opacity.
- Gv2 is the ratio of the visible opacity in the second visible band to the thermal Rosseland mean opacity.
- Gv3 is the ratio of the visible opacity in the third visible band to the thermal Rosseland mean opacity.
Each visible band has a fixed width of 1/3. Additional parameters describe the physical setting:
- Teq0 is the equilibrium temperature of the planet for zero albedo and full redistribution of energy.
- mu is the angle between the vertical direction and the stellar direction; for average profiles set mu=1/sqrt(3).
- f is a parameter equal to 0.5 to compute a dayside-average profile and 0.25 for a planet-average profile.
- Tint is the internal temperature, given by the internal luminosity.
- grav is the gravity of the planet.
- Ab is the Bond albedo of the planet.
- P(i) are the pressure levels where the temperature is computed.
- N is the number of atmospheric levels.
Several options are available in order to use the coefficients derived in Parmentier et al. (2014A&A...562A.133P, Cat. J/A+A/562/A133):
ROSS can take the values:
- "USER" for a Rosseland mean opacity {kappa}(nlevels) set by the user through the atmosphere.
- "AUTO" to use {kappa}(P,T), the functional form of the Rosseland mean opacities provided by Valencia et al. (2013ApJ...775...10V) and based on the opacities calculated by Freedman et al. (2008ApJS..174..504F). The value of {kappa} is then recalculated, and the initial value set by the user is NOT taken into account.
COEFF can take the values:
- "USER" for coefficients set by the user.
- "AUTO" to use the fit of the coefficients provided in Parmentier et al. (2014A&A...562A.133P, Cat. J/A+A/562/A133). In that case all the coefficients set by the user are NOT taken into account (apart from the Rosseland mean opacities).
COMP can take the values (valid only if COEFF="AUTO"):
- "SOLAR" to use the fit of the coefficients for a solar-composition atmosphere.
- "NOTIO" to use the fit of the coefficients without TiO.
STAR can take the value (valid only if COEFF="AUTO"):
- "SUN" to use the fit of the coefficients for a Sun-like stellar irradiation.
ALBEDO can take the values:
- "USER" for a user-defined albedo.
- "AUTO" to use the fit of the albedos for solar-composition, clear-sky atmospheres.
CONV can be either:
- "NO" for a pure radiative solution.
- "YES" for a radiative/convective solution (without taking into account detached convective layers).
The code and all the outputs use SI units. Installation and use: to install the code, use the command "make". To test, use "make test". The test should be done with the downloaded version of the code, without any changes. To execute the code once it has been compiled, type ./NonGrey in the same directory. This will output a file PTprofile.csv with the temperature structure in CSV format and a file PTprofile.dat in dat format. The input parameters must be changed inside the file paper2.f90; it is necessary to compile the code again each time. The subroutine tprofile2e.f90 can be directly implemented into one's code. (5 data files)
Adaptable Constrained Genetic Programming: Extensions and Applications
NASA Technical Reports Server (NTRS)
Janikow, Cezary Z.
2005-01-01
An evolutionary algorithm applies evolution-based principles to problem solving. To solve a problem, the user defines the space of potential solutions, the representation space. Sample solutions are encoded in a chromosome-like structure. The algorithm maintains a population of such samples, which undergo simulated evolution by means of mutation, crossover, and survival of the fittest principles. Genetic Programming (GP) uses tree-like chromosomes, providing very rich representation suitable for many problems of interest. GP has been successfully applied to a number of practical problems such as learning Boolean functions and designing hardware circuits. To apply GP to a problem, the user needs to define the actual representation space, by defining the atomic functions and terminals labeling the actual trees. The sufficiency principle requires that the label set be sufficient to build the desired solution trees. The closure principle allows the labels to mix in any arity-consistent manner. To satisfy both principles, the user is often forced to provide a large label set, with ad hoc interpretations or penalties to deal with undesired local contexts. This unfortunately enlarges the actual representation space, and thus usually slows down the search. In the past few years, three different methodologies have been proposed to allow the user to alleviate the closure principle by providing means to define, and to process, constraints on mixing the labels in the trees. Last summer we proposed a new methodology to further alleviate the problem by discovering local heuristics for building quality solution trees. A pilot system was implemented last summer and tested throughout the year. This summer we have implemented a new revision, and produced a User's Manual so that the pilot system can be made available to other practitioners and researchers. We have also designed, and partly implemented, a larger system capable of dealing with much more powerful heuristics.
MetaNET--a web-accessible interactive platform for biological metabolic network analysis.
Narang, Pankaj; Khan, Shawez; Hemrom, Anmol Jaywant; Lynn, Andrew Michael
2014-01-01
Metabolic reactions have been extensively studied and compiled over the last century. These have provided a theoretical base to implement models, simulations of which are used to identify drug targets and optimize metabolic throughput at a systemic level. While tools for the perturbation of metabolic networks are available, their applications are limited and restricted as they require varied dependencies and often a commercial platform for full functionality. We have developed MetaNET, an open-source, user-friendly, platform-independent and web-accessible resource consisting of several pre-defined workflows for metabolic network analysis. MetaNET is a web-accessible platform that incorporates a range of functions which can be combined to produce different simulations related to metabolic networks. These include (i) optimization of an objective function for the wild-type strain and gene/catalyst/reaction knock-out/knock-down analysis using flux balance analysis, (ii) flux variability analysis, (iii) chemical species participation, (iv) identification of cycles and extreme paths, and (v) choke-point reaction analysis to facilitate identification of potential drug targets. The platform is built using custom scripts along with the open-source Galaxy workflow engine and the Systems Biology Research Tool as components. Pre-defined workflows are available for common processes, and an exhaustive list of over 50 functions is provided for user-defined workflows. MetaNET, available at http://metanet.osdd.net, provides a user-friendly, rich interface allowing the analysis of genome-scale metabolic networks under various genetic and environmental conditions. The framework permits the storage of previous results, the ability to repeat analyses and share results with other users over the internet, and the ability to run different tools simultaneously using pre-defined workflows and user-created custom workflows.
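The flux balance analysis step listed under (i) reduces to a linear program: maximize an objective flux subject to steady-state mass balance S v = 0 and flux bounds. The three-reaction toy network below is invented for illustration and is unrelated to MetaNET's bundled models.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: r0 makes A, r1 converts A -> B, r2 drains B (the "objective").
S = np.array([[1, -1,  0],    # metabolite A balance
              [0,  1, -1]])   # metabolite B balance
bounds = [(0, 10)] * 3        # flux bounds for r0, r1, r2
c = np.array([0, 0, -1])      # maximize v2 == minimize -v2

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal objective flux:", -res.fun, "fluxes:", res.x)
```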
Capolongo, S; Bellini, E; Nachiero, D; Rebecchi, A; Buffoli, M
2014-01-01
The design of hospital environments is determined by functional requirements and technical regulations, as well as numerous protocols, which define the structure and system characteristics that such environments need to achieve. In order to improve people's well-being and the quality of their experience within public hospitals, design elements (soft qualities) are added to those 'necessary' features. The aim of this research has been to experiment with a new design process and to create health care spaces with high environmental quality that are capable of meeting users' emotional and perceptual needs. Such needs were investigated with the help of qualitative research tools, and the design criteria for one of these soft qualities - colour - were subsequently defined on the basis of the findings. The colour scheme design for the new San Paolo Hospital Emergency Department in Milan was used as a case study. Focus groups were fundamental in defining the project's goals and criteria. The issues raised have led to the belief that the proper procedure is not the mere consultation of users in order to define the goals: users should rather be involved in the whole design process and become co-agents of the choices that determine the environment's characteristics, so as to meet the quality requirements identified by the users themselves. The case study has shown the possibility of developing a design methodology made of three steps (or operational tools) in which user groups are involved in the choices, leading to environments in which compliance with expectations is already implied and verified by means of the process itself. Thus, the method leads to the creation of soft qualities in healthcare.
Beyond Control Panels: Direct Manipulation for Visual Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander; Bradel, Lauren; North, Chris
2013-07-19
Information Visualization strives to provide visual representations through which users can think about and gain insight into information. By leveraging the visual and cognitive systems of humans, complex relationships and phenomena occurring within datasets can be uncovered by exploring information visually. Interaction metaphors for such visualizations are designed to give users direct control over the filters, queries, and other parameters controlling how the data is visually represented. Through the evolution of information visualization, more complex mathematical and data-analytic models are being used to visualize relationships and patterns in data - creating the field of Visual Analytics. However, the expectations for how users interact with these visualizations have remained largely unchanged - focused primarily on the direct manipulation of parameters of the underlying mathematical models. In this article we present an opportunity to evolve the methodology for user interaction from the direct manipulation of parameters through visual control panels to interactions designed specifically for visual analytic systems. Instead of focusing on traditional direct manipulation of mathematical parameters, the evolution of the field can be realized through direct manipulation within the visual representation - where users can not only gain insight, but also interact. This article describes future directions and research challenges that fundamentally change the meaning of direct manipulation with regard to visual analytics, advancing the Science of Interaction.
NASA Technical Reports Server (NTRS)
1973-01-01
The HD 220 program was created as part of the space shuttle solid rocket booster recovery system definition. The model was generated to investigate the damage to SRB components under water impact loads. The random nature of environmental parameters, such as ocean waves and wind conditions, necessitates estimation of the relative frequency of occurrence for these parameters. The nondeterministic nature of component strengths also lends itself to probabilistic simulation. The Monte Carlo technique allows the simultaneous perturbation of multiple independent parameters and provides outputs describing the probability distribution functions of the dependent parameters. This allows the user to determine the required statistics for each output parameter.
Transport Modeling of Hydrogen in Metals for Application to Hydrogen Assisted Cracking of Metals.
1995-04-04
Consists of a Fortran "user element" subroutine for use with the ABAQUS finite element program. Documentation of the 1-D user element subroutine is ... trapping theory. The use of the ABAQUS finite element "User Element" subroutines for solving 1-D problems is then outlined in full detail. This is followed ... reflect the new ordering given by Eq. (57). ABAQUS User Element Subroutines: ABAQUS executes a Fortran subroutine named UEL for each "user defined" finite element.
Leszczuk, Mikołaj; Dudek, Łukasz; Witkowski, Marcin
The VQiPS (Video Quality in Public Safety) Working Group, supported by the U.S. Department of Homeland Security, has been developing a user guide for public safety video applications. According to VQiPS, five parameters have particular importance in influencing the ability to achieve a recognition task: usage time-frame, discrimination level, target size, lighting level, and level of motion. These parameters form what are referred to as Generalized Use Classes (GUCs). The aim of our research was to develop algorithms that automatically assist classification of input sequences into one of the GUCs. The target size and lighting level parameters were addressed. The experiment described reveals the experts' ambiguity and hesitation during the manual target size determination process. However, the automatic methods developed for target size classification make it possible to determine GUC parameters with 70% compliance to the end-users' opinion. Lighting levels of the entire sequence can be classified with an efficiency reaching 93%. To make the algorithms available for use, a test application has been developed. It is able to process video files and display classification results; the user interface is very simple and requires only minimal user interaction.
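A crude form of the lighting-level classification can be expressed as a threshold on average frame luminance; the thresholds and class labels below are assumptions made for illustration, not the values used in the study.

```python
import numpy as np

def lighting_class(frames, dark=60, bright=150):
    """Bucket a sequence by mean 8-bit luma; 'frames' is an iterable of
    2-D grayscale arrays. Thresholds are illustrative assumptions."""
    mean_luma = float(np.mean([f.mean() for f in frames]))
    if mean_luma < dark:
        return "low light"
    if mean_luma > bright:
        return "bright"
    return "moderate"
```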
An unsupervised method for summarizing egocentric sport videos
NASA Astrophysics Data System (ADS)
Habibi Aghdam, Hamed; Jahani Heravi, Elnaz; Puig, Domenec
2015-12-01
People are getting more interested in recording their sport activities using head-worn or hand-held cameras. This type of video, called egocentric sport video, has different motion and appearance patterns compared with life-logging videos. While a life-logging video can be defined in terms of well-defined human-object interactions, it is not trivial to describe egocentric sport videos using well-defined activities. For this reason, summarizing egocentric sport videos based on human-object interaction might fail to produce meaningful results. In this paper, we propose an unsupervised method for summarizing egocentric videos by identifying the key-frames of the video. Our method utilizes both appearance and motion information, and it automatically finds the number of key-frames. Our blind user study on a new dataset collected from YouTube shows that in 93.5% of cases the users choose the proposed method as their first video summary choice. In addition, our method is within the top two choices of the users in 99% of studies.
Action Information Management System (AIMS): a User's View
NASA Technical Reports Server (NTRS)
Wiskerchen, M.
1984-01-01
The initial approach used in establishing a user-defined information system to fulfill the needs of users at NASA Headquarters was unsuccessful in bringing this pilot endeavor to full project status. The persistence of several users and the full involvement of the Ames Research Center were the ingredients needed to make the AIMS project a success. The lesson learned from this effort is that NASA should always work from its organizational strengths as a Headquarters-Center partnership.
Proceedings of the Workshop on Government Oil Spill Modeling
NASA Technical Reports Server (NTRS)
Bishop, J. M. (Compiler)
1980-01-01
Oil spill model users and modelers were brought together for the purpose of fostering joint communication and increasing understanding of mutual problems. The workshop concentrated on defining user needs, presentations on ongoing modeling programs, and discussions of supporting research for these modeling efforts. Specific user recommendations include the development of an oil spill model user library which identifies and describes available models. The development of models for the long-term fate and effect of spilled oil was examined.
An interactive program to display user-generated or file-based maps on a personal computer monitor
Langer, W.H.; Stephens, R.W.
1987-01-01
PC MAP-MAKER is an Advanced BASIC program written to provide users of IBM XT, IBM AT, and compatible computers with a straightforward, flexible method to display geographical data on a color or monochrome PC (personal computer) monitor. Data can be political boundaries such as State and county boundaries; natural curvilinear features such as rivers, drainage areas, and geological contacts; and points such as well locations and mineral localities. Essentially any point defined by a latitude and longitude, and any line defined by a series of latitude and longitude values, can be displayed using the program. PC MAP-MAKER allows users to view tabular data from U.S. Geological Survey files such as WATSTORE (National Water Data Storage and Retrieval System) in a map format in much less time than required by sending the data to a line plotter. The screen image can be saved to disk for recall at a later date, and hard copies can be printed with a dot matrix printer. The program is user-friendly, using menus or prompts to guide user input. It is fully documented and structured to allow users to tailor the program to their specific needs. The documentation includes a tutorial designed to introduce users to the capabilities of the program, using the State of Colorado as a demonstration map area. (Author's abstract)
Suicide ideation of individuals in online social networks.
Masuda, Naoki; Kurahashi, Issei; Onari, Hiroko
2013-01-01
Suicide accounts for the largest number of deaths among Japanese young people in their twenties and thirties. Suicide is also a major cause of death for adolescents in many other countries. Although social isolation has been implicated in the tendency toward suicidal behavior, the impact of social isolation on suicide in the context of explicit social networks of individuals is scarcely explored. To address this question, we examined a large data set obtained from a social networking service dominant in Japan. The social network is composed of a set of friendship ties between pairs of users created by mutual endorsement. We carried out logistic regression to identify users' characteristics, both related and unrelated to social networks, which contribute to suicide ideation. We defined suicide ideation of a user as membership in at least one active user-defined community related to suicide. We found that the number of communities to which a user belongs, the intransitivity (i.e., paucity of triangles including the user), and the fraction of suicidal neighbors in the social network contributed the most to suicide ideation, in this order. Other characteristics, including age and gender, contributed little to suicide ideation. We also found qualitatively the same results for depressive symptoms.
Elevation Difference and Bouguer Anomaly Analysis Tool (EDBAAT) User's Guide
Smittle, Aaron M.; Shoberg, Thomas G.
2017-06-16
This report describes a software tool that imports gravity anomaly point data from the Gravity Database of the United States (GDUS) of the National Geospatial-Intelligence Agency and University of Texas at El Paso along with elevation data from The National Map (TNM) of the U.S. Geological Survey that lie within a user-specified geographic area of interest. Further, the tool integrates these two sets of data spatially and analyzes the consistency of the elevation of each gravity station from the GDUS with TNM elevation data; it also evaluates the consistency of gravity anomaly data within the GDUS data repository. The tool bins the GDUS data based on user-defined criteria of elevation misfit between the GDUS and TNM elevation data. It also provides users with a list of points from the GDUS data, which have Bouguer anomaly values that are considered outliers (two standard deviations or greater) with respect to other nearby GDUS anomaly data. “Nearby” can be defined by the user at time of execution. These outputs should allow users to quickly and efficiently choose which points from the GDUS would be most useful in reconnaissance studies or in augmenting and extending the range of individual gravity studies.
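The report's outlier rule (flag anomalies two or more standard deviations from nearby values) is easy to state in code; this sketch assumes the neighborhood of stations has already been selected and is not the tool's actual implementation.

```python
import numpy as np

def flag_outliers(anomalies, threshold=2.0):
    """Boolean mask of Bouguer anomaly values at least `threshold` sample
    standard deviations from the mean of a user-defined neighborhood."""
    a = np.asarray(anomalies, dtype=float)
    z = np.abs(a - a.mean()) / a.std(ddof=1)
    return z >= threshold
```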
Space Transportation System (STS) propellant scavenging system study. Volume 1: Technical report
NASA Technical Reports Server (NTRS)
1985-01-01
The objectives are to define the most efficient and cost effective methods for scavenging cryogenic and storable propellants and then define the requirements for these scavenging systems. For cryogenic propellants, scavenging is the transfer of propellants from the Shuttle orbiter external tank (ET) and/or main propulsion subsystems (MPS) propellant lines into storage tanks located in the orbiter payload bay for delivery to the user station by a space based transfer stage or the Space Transportation System (STS) by direct insertion. For storable propellants, scavenging is the direct transfer from the orbital maneuvering subsystem (OMS) and/or tankage in the payload bay to users in LEO as well as users in the vicinity of the Space Station.
Neuschulz, J; Schaefer, I; Scheer, M; Christ, H; Braumann, B
2013-07-01
In order to visualize and quantify the direction and extent of morphological upper-jaw changes in infants with unilateral cleft lip and palate (UCLP) during early orthodontic treatment, a three-dimensional method of cast analysis for routine application was developed. In the present investigation, this method was used to identify reaction patterns associated with specific cleft forms. The study included a cast series reflecting the upper-jaw situations of 46 infants with complete (n=27) or incomplete (n=19) UCLP during week 1 and months 3, 6, and 12 of life. Three-dimensional datasets were acquired and visualized with scanning software (DigiModel®; OrthoProof, The Netherlands). Following interactive identification of landmarks on the digitized surface relief, a defined set of representative linear parameters was measured three-dimensionally. At the same time, the three-dimensional surfaces of one patient series were superimposed based on a defined reference plane. Morphometric differences were statistically analyzed. Thanks to the user-friendly software, all landmarks could be identified quickly and reproducibly, thus allowing for simultaneous three-dimensional measurement of all defined parameters. The measured values revealed that significant morphometric differences were present in all three planes of space between the two patient groups. Patients with complete UCLP underwent significantly larger reductions in cleft width (p<0.001), and sagittal growth in the complete UCLP group exceeded that in the incomplete UCLP group by almost 50% within the first year of life. Based on patients with incomplete versus complete UCLP, different reaction patterns were identified that depended not on the apparent severity of the malformation but on the cleft form.
User-Centered Indexing for Adaptive Information Access
NASA Technical Reports Server (NTRS)
Chen, James R.; Mathe, Nathalie
1996-01-01
We are focusing on information access tasks characterized by a large volume of hypermedia-connected technical documents, a need for rapid and effective access to familiar information, and long-term interaction with evolving information. The problem for technical users is to build and maintain a personalized, task-oriented model of the information to quickly access relevant information. We propose a solution which provides user-centered adaptive information retrieval and navigation. This solution supports users in customizing information access over time. It is complementary to information discovery methods which provide access to new information, since it lets users customize future access to previously found information. It relies on a technique, called the Adaptive Relevance Network, which creates and maintains a complex indexing structure to represent a personal user's information access maps organized by concepts. This technique is integrated within the Adaptive HyperMan system, which helps NASA Space Shuttle flight controllers organize and access large amounts of information. It allows users to select and mark any part of a document as interesting, and to index that part with user-defined concepts. Users can then do subsequent retrieval of marked portions of documents. This functionality allows users to define and access personal collections of information, which are dynamically computed. The system also supports collaborative review by letting users share group access maps. The adaptive relevance network provides long-term adaptation based both on usage and on explicit user input. The indexing structure is dynamic and evolves over time. Learning and generalization support flexible retrieval of information under similar concepts. The network is geared towards more recent information access, and automatically manages its size in order to maintain rapid access when scaling up to a large hypermedia space. We present results of simulated learning experiments.
Factors influencing the postoperative use of analgesics in dogs and cats by Canadian veterinarians.
Dohoo, S E; Dohoo, I R
1996-09-01
Four hundred and seventeen Canadian veterinarians were surveyed to determine their postoperative use of analgesics in dogs and cats following 6 categories of surgeries, and their opinion toward pain perception and perceived complications associated with the postoperative use of potent opioid analgesics. Three hundred and seventeen (76%) returned the questionnaire. An analgesic user was defined as a veterinarian who administers analgesics to at least 50% of dogs or 50% of cats following abdominal surgery, excluding ovariohysterectomy. The veterinarians responding exhibited a bimodal distribution of analgesic use, with 49.5% being defined as analgesic users. These veterinarians tended to use analgesics in 100% of animals following abdominal surgery. Veterinarians defined as analgesic nonusers rarely used postoperative analgesics following any abdominal surgery. Pain perception was defined as the average of pain rankings (on a scale of 1 to 10) following abdominal surgery, or the value for dogs or cats if the veterinarian worked with only 1 of the 2 species. Maximum concern about the risks associated with the postoperative use of potent opioid agonists was defined as the highest ranking assigned to any of the 7 risks evaluated in either dogs or cats. Logistic regression analysis identified the pain perception score and the maximum concern regarding the use of potent opioid agonists in the postoperative period as the 2 factors that distinguished analgesic users from analgesic nonusers. This model correctly classified 68% of veterinarians as analgesic users or nonusers. Linear regression analysis identified gender and the presence of an animal health technologist in the practice as the 2 factors that influenced pain perception by veterinarians. Linear regression analysis identified working with an animal health technologist, graduation within the past 10 years, and attendance at continuing education as factors that influenced maximum concern about the postoperative use of opioid agonists.
Emergency Department Frequent Users for Acute Alcohol Intoxication.
Klein, Lauren R; Martel, Marc L; Driver, Brian E; Reing, Mackenzie; Cole, Jon B
2018-03-01
A subset of frequent users of emergency services are those who use the emergency department (ED) for acute alcohol intoxication. This population and their ED encounters have not been previously described. This was a retrospective, observational, cohort study of patients presenting to the ED for acute alcohol intoxication between 2012 and 2016. We collected all data from the electronic medical record. Frequent users for alcohol intoxication were defined as those with greater than 20 visits for acute intoxication without additional medical chief complaints in the previous 12 months. We used descriptive statistics to evaluate characteristics of frequent users for alcohol intoxication, as well as their ED encounters. We identified 32,121 patient encounters. Of those, 325 patients were defined as frequent users for alcohol intoxication, comprising 11,370 of the encounters during the study period. The median maximum number of encounters per person for alcohol intoxication in a one-year period was 47 encounters (range 20 to 169). Frequent users were older (47 years vs. 39 years), and more commonly male (86% vs. 71%). Frequent users for alcohol intoxication had higher rates of medical and psychiatric comorbidities including liver disease, chronic kidney disease, ischemic vascular disease, dementia, chronic obstructive pulmonary disease, history of traumatic brain injury, schizophrenia, and bipolar disorder. In this study, we identified a group of ED frequent users who use the ED for acute alcohol intoxication. This population had higher rates of medical and psychiatric comorbidities compared to non-frequent users.
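The frequent-user criterion is a simple rolling count; a minimal pandas sketch (column names hypothetical) of how such patients could be flagged:

    import pandas as pd

    # One row per ED encounter for acute intoxication (column names hypothetical).
    enc = pd.DataFrame({
        "patient_id": [1, 1, 2],
        "visit_time": pd.to_datetime(["2014-01-05", "2014-06-01", "2015-03-10"]),
    })

    def frequent_users(enc, threshold=20):
        """Flag patients with more than `threshold` intoxication visits in any
        trailing 12-month window, per the study's definition."""
        enc = enc.sort_values("visit_time")
        counts = (enc.set_index("visit_time")
                     .groupby("patient_id")["patient_id"]
                     .rolling("365D").count())
        peak = counts.groupby("patient_id").max()
        return peak[peak > threshold].index.tolist()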
DIRT: The Dust InfraRed Toolbox
NASA Astrophysics Data System (ADS)
Pound, M. W.; Wolfire, M. G.; Mundy, L. G.; Teuben, P. J.; Lord, S.
We present DIRT, a Java applet geared toward modeling a variety of processes in envelopes of young and evolved stars. Users can automatically and efficiently search grids of pre-calculated models to fit their data. A large set of physical parameters and dust types are included in the model database, which contains over 500,000 models. The computing cluster for the database is described in the accompanying paper by Teuben et al. (2000). A typical user query will return about 50-100 models, which the user can then interactively filter as a function of 8 model parameters (e.g., extinction, size, flux, luminosity). A flexible, multi-dimensional plotter (Figure 1) allows users to view the models, rotate them, tag specific parameters with color or symbol size, and probe individual model points. For any given model, auxiliary plots such as dust grain properties, radial intensity profiles, and the flux as a function of wavelength and beamsize can be viewed. The user can fit observed data to several models simultaneously and see the results of the fit; the best fit is automatically selected for plotting. The URL for this project is http://dustem.astro.umd.edu.
Tracking Multiple Topics for Finding Interesting Articles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pon, R K; Cardenas, A F; Buttler, D J
We introduce multiple topic tracking (MTT) for iScore to better recommend news articles for users with multiple interests and to address changes in user interests over time. As an extension of the basic Rocchio algorithm, traditional topic detection and tracking, and single-pass clustering, MTT maintains multiple interest profiles to identify interesting articles for a specific user given user feedback. Focusing on only interesting topics enables iScore to discard useless profiles to address changes in user interests and to achieve a balance between resource consumption and classification accuracy. iScore is able to achieve higher quality results than traditional methods such as the Rocchio algorithm. We identify several operating parameters that work well for MTT. Using the same parameters, we show that MTT alone yields high quality results for recommending interesting articles from several corpora. The inclusion of MTT improves iScore's performance by 25% in recommending news articles from the Yahoo! News RSS feeds and the TREC11 adaptive filter article collection. Through a small user study, we show that iScore can still perform well when only provided with little user feedback.
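iScore's actual update rules are not given here; as an illustrative sketch of the multiple-profile idea, the following Python toy keeps one Rocchio-style centroid per interest, reinforces the nearest profile on positive feedback, opens a new profile for unfamiliar topics, and discards idle profiles (all thresholds invented):

    import numpy as np

    class MTT:
        """Toy multiple-topic tracker: one Rocchio-style centroid per interest."""
        def __init__(self, new_topic_sim=0.3, alpha=0.9, beta=0.1, max_idle=50):
            self.profiles = []                      # [centroid, updates since last hit]
            self.tau, self.alpha, self.beta, self.max_idle = new_topic_sim, alpha, beta, max_idle

        @staticmethod
        def _cos(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

        def score(self, doc):
            # A document is interesting if it is close to any tracked topic.
            return max((self._cos(p[0], doc) for p in self.profiles), default=0.0)

        def feedback(self, doc, interesting):
            for p in self.profiles:
                p[1] += 1                           # age every profile
            if interesting:
                sims = [self._cos(p[0], doc) for p in self.profiles]
                if sims and max(sims) >= self.tau:  # reinforce the nearest topic
                    p = self.profiles[int(np.argmax(sims))]
                    p[0] = self.alpha * p[0] + self.beta * doc
                    p[1] = 0
                else:                               # feedback opened a new topic
                    self.profiles.append([doc.astype(float).copy(), 0])
            # Discarding idle profiles balances memory use against accuracy.
            self.profiles = [p for p in self.profiles if p[1] <= self.max_idle]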
FLASH Interface; a GUI for managing runtime parameters in FLASH simulations
NASA Astrophysics Data System (ADS)
Walker, Christopher; Tzeferacos, Petros; Weide, Klaus; Lamb, Donald; Flocke, Norbert; Feister, Scott
2017-10-01
We present FLASH Interface, a novel graphical user interface (GUI) for managing runtime parameters in simulations performed with the FLASH code. FLASH Interface supports full text search of available parameters; provides descriptions of each parameter's role and function; allows for the filtering of parameters based on categories; performs input validation; and maintains all comments and non-parameter information already present in existing parameter files. The GUI can be used to edit existing parameter files or generate new ones. FLASH Interface is open source and was implemented with the Electron framework, making it available on Mac OSX, Windows, and Linux operating systems. The new interface lowers the entry barrier for new FLASH users and provides an easy-to-use tool for experienced FLASH simulators. U.S. Department of Energy (DOE), NNSA ASC/Alliances Center for Astrophysical Thermonuclear Flashes, U.S. DOE NNSA ASC through the Argonne Institute for Computing in Science, U.S. National Science Foundation.
Evaluation methodology for query-based scene understanding systems
NASA Astrophysics Data System (ADS)
Huster, Todd P.; Ross, Timothy D.; Culbertson, Jared L.
2015-05-01
In this paper, we are proposing a method for the principled evaluation of scene understanding systems in a query-based framework. We can think of a query-based scene understanding system as a generalization of typical sensor exploitation systems where instead of performing a narrowly defined task (e.g., detect, track, classify, etc.), the system can perform general user-defined tasks specified in a query language. Examples of this type of system have been developed as part of DARPA's Mathematics of Sensing, Exploitation, and Execution (MSEE) program. There is a body of literature on the evaluation of typical sensor exploitation systems, but the open-ended nature of the query interface introduces new aspects to the evaluation problem that have not been widely considered before. In this paper, we state the evaluation problem and propose an approach to efficiently learn about the quality of the system under test. We consider the objective of the evaluation to be to build a performance model of the system under test, and we rely on the principles of Bayesian experiment design to help construct and select optimal queries for learning about the parameters of that model.
A workload model and measures for computer performance evaluation
NASA Technical Reports Server (NTRS)
Kerner, H.; Kuemmerle, K.
1972-01-01
A generalized workload definition is presented which constructs measurable workloads of unit size from workload elements, called elementary processes. An elementary process makes almost exclusive use of one of the processors (CPU, I/O processor, etc.) and is measured by the cost of its execution. Various kinds of user programs can be simulated by quantitative composition of elementary processes into a type. The character of the type is defined by the weights of its elementary processes, and its structure by the number and sequence of transitions between its elementary processes. A set of types is batched into a mix. Mixes of identical cost are considered equivalent amounts of workload. These formalized descriptions of workloads allow investigators to compare the results of different studies quantitatively. Since workloads of different composition are assigned a unit of cost, these descriptions enable determination of the cost effectiveness of different workloads on a machine. Subsequently, performance parameters such as throughput rate, gain factor, and internal and external delay factors are defined and used to demonstrate the effects of various workload attributes on the performance of a selected large scale computer system.
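A minimal Python sketch of the bookkeeping this defines -- types as weighted bags of elementary processes, and mixes normalized so that equal cost means an equivalent amount of workload (unit costs and weights invented):

    # Elementary processes, each measured by the cost of one execution (values invented).
    unit_cost = {"cpu": 2.0, "io": 5.0}

    def type_cost(weights):
        """Cost of one instance of a type, defined by its elementary-process weights."""
        return sum(n * unit_cost[p] for p, n in weights.items())

    def mix_scale(types, target_cost=1000.0):
        """Batch types into a mix scaled to a fixed total cost, so that mixes of
        identical cost represent equivalent amounts of workload."""
        raw = sum(type_cost(w) for w in types)
        return target_cost / raw        # replication factor for each type in the mix

    compute_bound = {"cpu": 40, "io": 2}    # character set by the weights
    io_bound = {"cpu": 5, "io": 30}
    print(mix_scale([compute_bound, io_bound]))   # 4.0 for these numbers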
Swarm formation control utilizing elliptical surfaces and limiting functions.
Barnes, Laura E; Fields, Mary Anne; Valavanis, Kimon P
2009-12-01
In this paper, we present a strategy for organizing swarms of unmanned vehicles into a formation by utilizing artificial potential fields that were generated from normal and sigmoid functions. These functions construct the surface on which swarm members travel, controlling the overall swarm geometry and the individual member spacing. Nonlinear limiting functions are defined to provide tighter swarm control by modifying and adjusting a set of control variables that force the swarm to behave according to set constraints, formation, and member spacing. The artificial potential functions and limiting functions are combined to control swarm formation, orientation, and swarm movement as a whole. Parameters are chosen based on desired formation and user-defined constraints. This approach is computationally efficient and scales well to different swarm sizes, to heterogeneous systems, and to both centralized and decentralized swarm models. Simulation results are presented for a swarm of 10 and 40 robots that follow circle, ellipse, and wedge formations. Experimental results are included to demonstrate the applicability of the approach on a swarm of four custom-built unmanned ground vehicles (UGVs).
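The paper's exact functions are not reproduced in the abstract; as a hedged sketch, a sigmoid limiting function can bound the spacing control between two swarm members like this (constants invented):

    import numpy as np

    def sigmoid_limit(d, d_des, gain=1.0, steep=4.0):
        """Bounded spacing law: repulsive (negative) inside the desired
        separation d_des, attractive (positive) outside, saturating at +/-gain."""
        return gain * (2.0 / (1.0 + np.exp(-steep * (d - d_des))) - 1.0)

    def spacing_velocity(p_self, p_neighbor, d_des=2.0):
        """Velocity command nudging one member toward its desired spacing."""
        diff = p_neighbor - p_self
        d = np.linalg.norm(diff)
        return sigmoid_limit(d, d_des) * diff / (d + 1e-9)

    v = spacing_velocity(np.zeros(2), np.array([1.0, 0.0]))  # too close: backs away

Because the output saturates at the gain, member commands stay bounded regardless of how badly the formation is perturbed, which is the point of the limiting functions.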
Selecting Summary Statistics in Approximate Bayesian Computation for Calibrating Stochastic Models
Burr, Tom
2013-01-01
Approximate Bayesian computation (ABC) is an approach for using measurement data to calibrate stochastic computer models, which are common in biology applications. ABC is becoming the “go-to” option when the data and/or parameter dimension is large because it relies on user-chosen summary statistics rather than the full data and is therefore computationally feasible. One technical challenge with ABC is that the quality of the approximation to the posterior distribution of model parameters depends on the user-chosen summary statistics. In this paper, the user requirement to choose effective summary statistics in order to accurately estimate the posterior distribution of model parameters is investigated and illustrated by example, using a model and corresponding real data of mitochondrial DNA population dynamics. We show that for some choices of summary statistics, the posterior distribution of model parameters is closely approximated and for other choices of summary statistics, the posterior distribution is not closely approximated. A strategy to choose effective summary statistics is suggested in cases where the stochastic computer model can be run at many trial parameter settings, as in the example. PMID:24288668
Selecting summary statistics in approximate Bayesian computation for calibrating stochastic models.
Burr, Tom; Skurikhin, Alexei
2013-01-01
Approximate Bayesian computation (ABC) is an approach for using measurement data to calibrate stochastic computer models, which are common in biology applications. ABC is becoming the "go-to" option when the data and/or parameter dimension is large because it relies on user-chosen summary statistics rather than the full data and is therefore computationally feasible. One technical challenge with ABC is that the quality of the approximation to the posterior distribution of model parameters depends on the user-chosen summary statistics. In this paper, the user requirement to choose effective summary statistics in order to accurately estimate the posterior distribution of model parameters is investigated and illustrated by example, using a model and corresponding real data of mitochondrial DNA population dynamics. We show that for some choices of summary statistics, the posterior distribution of model parameters is closely approximated and for other choices of summary statistics, the posterior distribution is not closely approximated. A strategy to choose effective summary statistics is suggested in cases where the stochastic computer model can be run at many trial parameter settings, as in the example.
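As a generic illustration of the ABC rejection scheme such calibration builds on (not the authors' code), with the model, prior, and the user-chosen summary statistics as interchangeable pieces:

    import numpy as np

    def abc_rejection(observed, simulate, prior_draw, summaries,
                      n_trials=20_000, quantile=0.001):
        """Keep parameter draws whose simulated summary statistics fall closest
        to the observed ones; the kept draws approximate the posterior."""
        s_obs = np.array([s(observed) for s in summaries])
        thetas, dists = [], []
        for _ in range(n_trials):
            theta = prior_draw()
            s_sim = np.array([s(simulate(theta)) for s in summaries])
            thetas.append(theta)
            dists.append(np.linalg.norm(s_sim - s_obs))
        cutoff = np.quantile(dists, quantile)
        return [t for t, d in zip(thetas, dists) if d <= cutoff]

    # Example: calibrate the mean of a noisy process.
    rng = np.random.default_rng(0)
    obs = rng.normal(3.0, 1.0, size=200)
    post = abc_rejection(obs, lambda th: rng.normal(th, 1.0, size=200),
                         lambda: rng.uniform(0, 10), summaries=[np.mean, np.std])

Replacing np.mean in `summaries` with a statistic uninformative for the parameter visibly degrades the posterior approximation, which is the effect the paper investigates.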
Design and development of line type modulators for high impedance electron gun
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dixit, Kavita P.; Tillu, Abhijit; Chavan, Ramchandra
Conventional line type modulators are routinely used for powering pulsed-power microwave devices such as magnetrons and klystrons used for radar, medical, and scientific applications. The load impedance (operating point) is fairly well defined in these cases, which makes the design of the discharging circuit of the modulator straightforward. This paper describes the line type modulators that have been developed and are routinely used for powering the triode electron gun of industrial electron linacs. The beam parameters of such guns are user defined, and the pulse current varies from a few mA to 800 mA (typical). The required beam energy varies from 40 keV to 80 keV. Hence the impedance offered by the electron gun to the power source (modulator) is not well defined. The load capacitance, which is inclusive of the various stray capacitances along with the intrinsic gun capacitance, is ∼200-400 pF. This capacitance, which depends on the configuration, shunts the load and makes the effective load highly capacitive, with the resistive part varying over a wide range. The paper describes the design and development of conventional line type modulators for powering an electron gun load of the type described above.
Space Segment (SS) and the Navigation User Segment (US) Interface Control Document (ICD)
DOT National Transportation Integrated Search
1993-10-10
This Interface Control Document (ICD) defines the requirements related to the interface between the Space Segment (SS) of the Global Positioning System (GPS) and the Navigation Users Segment of the GPS. 2880k, 154p.
Impact of Truck Loading on Design and Analysis of Asphaltic Pavement Structures : Phase II
DOT National Transportation Integrated Search
2011-02-01
In this study, Schapery's nonlinear viscoelastic constitutive model is implemented into the commercial finite element (FE) software ABAQUS via a user-defined subroutine (user material, or UMAT) to analyze asphalt pavement subjected to heavy truck loa...
An Inter-Personal Information Sharing Model Based on Personalized Recommendations
NASA Astrophysics Data System (ADS)
Kamei, Koji; Funakoshi, Kaname; Akahani, Jun-Ichi; Satoh, Tetsuji
In this paper, we propose an inter-personal information sharing model among individuals based on personalized recommendations. In the proposed model, we define an information resource as shared between people when both of them consider it important --- not merely when they both possess it. In other words, the model defines the importance of information resources based on personalized recommendations from identifiable acquaintances. The proposed method is based on a collaborative filtering system that focuses on evaluations from identifiable acquaintances. It utilizes both user evaluations for documents and their contents. In other words, each user profile is represented as a matrix of credibilities assigned to the other users' evaluations on each domain of interest. We extended the content-based collaborative filtering method to distinguish the other users to whom the documents should be recommended. We also applied a concept-based vector space model to represent the domains of interest instead of the previous method, which represented them by a term-based vector space model. We introduce a personalized concept-base compiled from each user's information repository to improve information retrieval in the user's environment. Furthermore, the concept-spaces change from user to user since they reflect the personalities of the users. Because of different concept-spaces, the similarity between a document and a user's interest varies for each user. As a result, a user receives recommendations from other users who have different viewpoints, achieving inter-personal information sharing based on personalized recommendations. This paper also describes an experimental simulation of our information sharing model. In our laboratory, five participants accumulated a personal repository of e-mails and web pages from which they built their own concept-bases. Then we estimated the user profiles according to the personalized concept-bases and the sets of documents which others evaluated. We simulated inter-personal recommendation based on the user profiles and evaluated the performance of the recommendation method by comparing the recommended documents to the results of content-based collaborative filtering.
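A compressed Python sketch of the scoring idea -- an acquaintance's evaluation counts in proportion to the credibility this user assigns them on the document's domain of interest, with the document placed in the user's own concept space (all structures hypothetical):

    import numpy as np

    def importance(doc_vec, evaluations, credibility, domain_vecs):
        """Score one document for one user.
        doc_vec      : document in this user's personalized concept space
        evaluations  : {acquaintance: rating in [0, 1]} for this document
        credibility  : {(acquaintance, domain): weight} learned from past agreement
        domain_vecs  : {domain: prototype vector} in the same concept space"""
        # Pick the domain of interest the document is most similar to.
        dom = max(domain_vecs, key=lambda d: float(doc_vec @ domain_vecs[d]))
        # Importance = credibility-weighted average of acquaintances' evaluations.
        num = sum(credibility.get((a, dom), 0.0) * r for a, r in evaluations.items())
        den = sum(credibility.get((a, dom), 0.0) for a in evaluations) or 1.0
        return num / den

    score = importance(np.array([1.0, 0.0]),
                       {"alice": 0.9, "bob": 0.4},
                       {("alice", "space"): 0.8, ("bob", "space"): 0.2},
                       {"space": np.array([1.0, 0.1]), "sports": np.array([0.0, 1.0])})

Because doc_vec lives in a per-user concept space, the same document can score differently for different users, which is how the model personalizes sharing.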
Involving service users in trials: developing a standard operating procedure
2013-01-01
Background Many funding bodies require researchers to actively involve service users in research to improve relevance, accountability and quality. Current guidance to researchers mainly discusses general principles. Formal guidance about how to involve service users operationally in the conduct of trials is lacking. We aimed to develop a standard operating procedure (SOP) to support researchers to involve service users in trials and rigorous studies. Methods Researchers with experience of involving service users and service users who were contributing to trials collaborated with the West Wales Organisation for Rigorous Trials in Health, a registered clinical trials unit, to develop the SOP. Drafts were prepared in a Task and Finish Group, reviewed by all co-authors and amendments made. Results We articulated core principles, which defined equality of service users with all other research team members and collaborative processes underpinning the SOP, plus guidance on how to achieve these. We developed a framework for involving service users in research that defined minimum levels of collaboration plus additional consultation and decision-making opportunities. We recommended service users be involved throughout the life of a trial, including planning and development, data collection, analysis and dissemination, and listed tasks for collaboration. We listed people responsible for involving service users in studies and promoting an inclusive culture. We advocate actively involving service users as early as possible in the research process, with a minimum of two on all formal trial groups and committees. We propose that researchers protect at least 1% of their total research budget as a minimum resource to involve service users and allow enough time to facilitate active involvement. Conclusions This SOP provides guidance to researchers to involve service users successfully in developing and conducting clinical trials and creating a culture of actively involving service users in research at all stages. The UK Clinical Research Collaboration should encourage clinical trials units actively to involve service users and research funders should provide sufficient funds and time for this in research grants. PMID:23866730
NASA Technical Reports Server (NTRS)
Devito, D. M.
1981-01-01
A low-cost GPS civil-user mobile terminal whose purchase cost is roughly an order of magnitude less than estimates for its military counterpart is considered, with focus on ground station requirements for position monitoring of civil users requiring this capability, and on civil-user navigation and location-monitoring requirements. Existing survey literature was examined to ascertain the potential users of a low-cost NAVSTAR receiver and to estimate their number, function, and accuracy requirements. System concepts are defined for low-cost user equipment for in-situ navigation and the retransmission of low-data-rate positioning data via a geostationary satellite to a central computing facility.
User-guided segmentation for volumetric retinal optical coherence tomography images
Yin, Xin; Chao, Jennifer R.; Wang, Ruikang K.
2014-01-01
Despite the existence of automatic segmentation techniques, trained graders still rely on manual segmentation to provide retinal layers and features from clinical optical coherence tomography (OCT) images for accurate measurements. To bridge the gap between this time-consuming need of manual segmentation and currently available automatic segmentation techniques, this paper proposes a user-guided segmentation method to perform the segmentation of retinal layers and features in OCT images. With this method, by interactively navigating three-dimensional (3-D) OCT images, the user first manually defines user-defined (or sketched) lines at regions where the retinal layers appear very irregular for which the automatic segmentation method often fails to provide satisfactory results. The algorithm is then guided by these sketched lines to trace the entire 3-D retinal layer and anatomical features by the use of novel layer and edge detectors that are based on robust likelihood estimation. The layer and edge boundaries are finally obtained to achieve segmentation. Segmentation of retinal layers in mouse and human OCT images demonstrates the reliability and efficiency of the proposed user-guided segmentation method. PMID:25147962
User-guided segmentation for volumetric retinal optical coherence tomography images.
Yin, Xin; Chao, Jennifer R; Wang, Ruikang K
2014-08-01
Despite the existence of automatic segmentation techniques, trained graders still rely on manual segmentation to provide retinal layers and features from clinical optical coherence tomography (OCT) images for accurate measurements. To bridge the gap between this time-consuming need of manual segmentation and currently available automatic segmentation techniques, this paper proposes a user-guided segmentation method to perform the segmentation of retinal layers and features in OCT images. With this method, by interactively navigating three-dimensional (3-D) OCT images, the user first manually defines user-defined (or sketched) lines at regions where the retinal layers appear very irregular for which the automatic segmentation method often fails to provide satisfactory results. The algorithm is then guided by these sketched lines to trace the entire 3-D retinal layer and anatomical features by the use of novel layer and edge detectors that are based on robust likelihood estimation. The layer and edge boundaries are finally obtained to achieve segmentation. Segmentation of retinal layers in mouse and human OCT images demonstrates the reliability and efficiency of the proposed user-guided segmentation method.
Remote Sensing/gis Integration for Site Planning and Resource Management
NASA Technical Reports Server (NTRS)
Fellows, J. D.
1982-01-01
The development of an interactive/batch gridded information system (an array of cells georeferenced to USGS quad sheets) and interfacing application programs (e.g., hydrologic models) is discussed. This system allows non-programmer users to request any data set(s) stored in the data base by inputting any random polygon's (watershed, political zone) boundary points. The data base information contained within this polygon can be used to produce maps and statistics and to define model parameters for the area. Present and proposed conditions for the area may be compared by inputting future usage (land cover, soils, slope, etc.). This system, known as the Hydrologic Analysis Program (HAP), is especially effective in the real-time analysis of the effect of proposed land cover changes on runoff hydrographs and in graphics/statistics resource inventories of random study areas/watersheds.
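A sketch of the core HAP-style operation -- masking a georeferenced grid with an arbitrary user polygon and summarizing the cells inside; matplotlib's point-in-polygon test stands in for whatever routine the system actually used:

    import numpy as np
    from matplotlib.path import Path

    def cells_in_polygon(grid, x0, y0, dx, dy, polygon):
        """Return grid values whose cell centers fall inside `polygon`.
        grid    : 2-D array of cell values on a regular georeferenced grid
        x0, y0  : coordinates of the lower-left cell center; dx, dy: cell size
        polygon : list of (x, y) boundary points (watershed, political zone)"""
        ny, nx = grid.shape
        xs = x0 + dx * np.arange(nx)
        ys = y0 + dy * np.arange(ny)
        xx, yy = np.meshgrid(xs, ys)
        pts = np.column_stack([xx.ravel(), yy.ravel()])
        inside = Path(polygon).contains_points(pts)
        return grid.ravel()[inside]

    # Statistics or model parameters for the study area come from the masked cells.
    vals = cells_in_polygon(np.random.rand(100, 100), 0, 0, 30, 30,
                            [(0, 0), (1500, 300), (2400, 2100), (300, 1800)])
    print(vals.mean(), vals.size)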
Modeling experimental plasma diagnostics in the FLASH code: Thomson scattering
NASA Astrophysics Data System (ADS)
Weide, Klaus; Flocke, Norbert; Feister, Scott; Tzeferacos, Petros; Lamb, Donald
2017-10-01
Spectral analysis of the Thomson scattering of laser light sent into a plasma provides an experimental method to quantify plasma properties in laser-driven plasma experiments. We have implemented such a synthetic Thomson scattering diagnostic unit in the FLASH code, to emulate the probe-laser propagation, scattering and spectral detection. User-defined laser rays propagate into the FLASH simulation region and experience scattering (change in direction and frequency) based on plasma parameters. After scattering, the rays propagate out of the interaction region and are spectrally characterized. The diagnostic unit can be used either during a physics simulation or in post-processing of simulation results. FLASH is publicly available at flash.uchicago.edu. U.S. DOE NNSA, U.S. DOE NNSA ASC, U.S. DOE Office of Science and NSF.
NASA Astrophysics Data System (ADS)
Mántaras, Daniel A.; Luque, Pablo
2012-10-01
A virtual test rig is presented using a three-dimensional model of the elasto-kinematic behaviour of a vehicle. A general approach is put forward to determine the three-dimensional position of the body and the main parameters which influence the handling of the vehicle. For the design process, the variable input data are the longitudinal and lateral acceleration and the curve radius, which are defined by the user as a design goal. For the optimisation process, once the vehicle has been built, the variable input data are the travel of the four struts and the steering wheel angle, which are obtained by monitoring the vehicle. The virtual test rig has been applied to a standard vehicle and the validity of the results has been proven.
Numerical model for learning concepts of streamflow simulation
DeLong, L.L.; ,
1993-01-01
Numerical models are useful for demonstrating principles of open-channel flow. Such models can allow experimentation with cause-and-effect relations, testing concepts of physics and numerical techniques. Four PT is a numerical model written primarily as a teaching supplement for a course in one-dimensional stream-flow modeling. Four PT options particularly useful in training include selection of governing equations, boundary-value perturbation, and user-programmable constraint equations. The model can simulate non-trivial concepts such as flow in complex interconnected channel networks, meandering channels with variable effective flow lengths, hydraulic structures defined by unique three-parameter relations, and density-driven flow. The model is coded in FORTRAN 77, and data encapsulation is used extensively to simplify maintenance and modification and to enhance the use of Four PT modules by other programs and programmers.
NASA Technical Reports Server (NTRS)
Metcalf, David
1995-01-01
Multimedia Information eXchange (MIX) is a multimedia information system that accommodates multiple data types and provides consistency across platforms. Information from all over the world can be accessed quickly and efficiently with the Internet-based system. I-NET's MIX uses the World Wide Web and Mosaic graphical user interface. Mosaic is available on all platforms used at I-NET's Kennedy Space Center (KSC) facilities. Key information system design concepts and benefits are reviewed. The MIX system also defines specific configuration and helper application parameters to ensure consistent operations across the entire organization. Guidelines and procedures for other areas of importance in information systems design are also addressed. Areas include: code of ethics, content, copyright, security, system administration, and support.
Identifying the perceptive users for online social systems
Liu, Xiao-Lu; Guo, Qiang; Han, Jing-Ti
2017-01-01
In this paper, the perceptive user, who can identify high-quality objects early in their lifespan, is presented. By tracking the ratings given to the rewarded objects, we present a method to identify user perceptibility, which is defined as the capability of a user to identify these objects early in their lifespan. Moreover, we investigate the behavior patterns of the perceptive users from three dimensions: user activity, correlation characteristics of user rating series, and user reputation. The experimental results for the empirical networks indicate that high-perceptibility users show significantly different behavior patterns from the others: having larger degree, stronger correlation of rating series, and higher reputation. Furthermore, in view of the hysteresis in finding the rewarded objects, we present a general framework for identifying high-perceptibility users based on user behavior patterns. The experimental results show that this work is helpful for deeply understanding the collective behavior patterns of online users. PMID:28704382
Identifying the perceptive users for online social systems.
Liu, Jian-Guo; Liu, Xiao-Lu; Guo, Qiang; Han, Jing-Ti
2017-01-01
In this paper, the perceptive user, who can identify high-quality objects early in their lifespan, is presented. By tracking the ratings given to the rewarded objects, we present a method to identify user perceptibility, which is defined as the capability of a user to identify these objects early in their lifespan. Moreover, we investigate the behavior patterns of the perceptive users from three dimensions: user activity, correlation characteristics of user rating series, and user reputation. The experimental results for the empirical networks indicate that high-perceptibility users show significantly different behavior patterns from the others: having larger degree, stronger correlation of rating series, and higher reputation. Furthermore, in view of the hysteresis in finding the rewarded objects, we present a general framework for identifying high-perceptibility users based on user behavior patterns. The experimental results show that this work is helpful for deeply understanding the collective behavior patterns of online users.
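The paper's estimator is not reproduced in the abstract; one hedged reading of user perceptibility as a computable score is the share of a user's ratings on eventually rewarded objects that fall within those objects' early lifespan:

    def perceptibility(user_ratings, rewarded, early_window=30):
        """user_ratings : [(object_id, days_since_object_release)] for one user
        rewarded      : set of object ids later identified as high quality
        Returns the share of this user's ratings on rewarded objects made
        within the objects' first `early_window` days (window invented)."""
        on_rewarded = [(o, t) for o, t in user_ratings if o in rewarded]
        if not on_rewarded:
            return 0.0
        early = sum(1 for _, t in on_rewarded if t <= early_window)
        return early / len(on_rewarded)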
Morgenstern, M; Moriarty, T F; Kuehl, R; Richards, R G; McNally, M A; Verhofstad, M H J; Borens, O; Zalavras, C; Raschke, M; Kates, S L; Metsemakers, W J
2018-03-01
Fracture-related infection (FRI) is one of the most challenging musculoskeletal complications in orthopaedic-trauma surgery. Although the orthopaedic community has developed and adopted a consensus definition of prosthetic joint infections (PJI), it still remains unclear how the trauma surgery community defines FRI in daily clinical practice or in performing clinical research studies. The central aim of this study was to survey the opinions of a global network of trauma surgeons on the definitions and criteria they routinely use, and their opinion on the need for a unified definition of FRI. The secondary aims were to survey their opinion on the utility of currently used definitions that may be at least partially applicable for FRI, and finally their opinion on the important clinical parameters that should be considered as diagnostic criteria for FRI. An 11-item questionnaire was developed to cover the above-mentioned aims. The questionnaire was administered by SurveyMonkey and was sent via blast email to all registered users of AO Trauma (Davos, Switzerland). Out of the 26,563 recipients who opened the email, 2,327 (8.8%) completed the questionnaire. Nearly 90% of respondents agreed that a consensus-derived definition for FRI is required and 66% of the surgeons also agreed that PJI and FRI are not equal with respect to diagnosis, treatment and outcome. Furthermore, "positive cultures from microbiology testing", "elevation of CRP", "purulent drainage" and "local clinical signs of infection" were voted the most important diagnostic parameters for FRI. This international survey infers the need for a consensus definition of FRI and provides insight into the clinical parameters seen by an international community of trauma surgeons as being critical for defining FRI.
NASA Technical Reports Server (NTRS)
Engler, N. A.; Nash, J. F.; Strange, J. D.
1980-01-01
A description of each of the satellites is given and a brief summary of each user experiment is presented. A Cross Index of User Experiments sorted by various parameters and a listing of keywords versus Experiment Number are presented.
Playing Games with Optimal Competitive Scheduling
NASA Technical Reports Server (NTRS)
Frank, Jeremy; Crawford, James; Khatib, Lina; Brafman, Ronen
2005-01-01
This paper is concerned with the problem of allocating a unit capacity resource to multiple users within a pre-defined time period. The resource is indivisible, so that at most one user can use it at each time instance. However, different users may use it at different times. The users have independent, selfish preferences for when and for how long they are allocated this resource. Thus, they value different resource access durations differently, and they value different time slots differently. We seek an optimal allocation schedule for this resource.
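When each user submits a single desired interval with a value, the optimal allocation reduces to weighted interval scheduling, solvable exactly by dynamic programming; the paper's game-theoretic setting with selfish preferences is richer than this, but the sketch shows the baseline optimization:

    from bisect import bisect_right

    def allocate(requests):
        """requests: [(start, end, value)] -- at most one user may hold the
        resource at any instant; maximize total value of granted requests."""
        reqs = sorted(requests, key=lambda r: r[1])       # by finish time
        ends = [r[1] for r in reqs]
        best = [0.0] * (len(reqs) + 1)
        take = [False] * len(reqs)
        for i, (s, e, v) in enumerate(reqs):
            j = bisect_right(ends, s, 0, i)               # last request ending <= s
            skip, grant = best[i], best[j] + v
            best[i + 1] = max(skip, grant)
            take[i] = grant > skip
        granted, i = [], len(reqs)                        # backtrack the granted set
        while i > 0:
            if take[i - 1]:
                granted.append(reqs[i - 1])
                i = bisect_right(ends, reqs[i - 1][0], 0, i - 1)
            else:
                i -= 1
        return best[-1], granted[::-1]

    print(allocate([(0, 3, 5.0), (2, 5, 6.0), (4, 7, 4.5)]))  # (9.5, two grants)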
2013-12-01
The IACP defines social media as a category of Internet-based resources that integrate user-generated content and user participation, including Facebook and MySpace. Whether the use of social media produces a reduction in crime is unresolved, warranting future study in this area.
NASA Technical Reports Server (NTRS)
1974-01-01
User models, defined as any explicit process or procedure used to transform information extracted from remotely sensed data into a form useful as a resource-management information input, are discussed. The role of user models as information, technological, and operations interfaces between the TERSSE and the resource managers is emphasized. It is recommended that guidelines and management strategies be developed for a systems approach to user model development.
a Standardized Approach to Topographic Data Processing and Workflow Management
NASA Astrophysics Data System (ADS)
Wheaton, J. M.; Bailey, P.; Glenn, N. F.; Hensleigh, J.; Hudak, A. T.; Shrestha, R.; Spaete, L.
2013-12-01
An ever-increasing list of options exists for collecting high resolution topographic data, including airborne LIDAR, terrestrial laser scanners, bathymetric SONAR and structure-from-motion. An equally rich, arguably overwhelming, variety of tools exists with which to organize, quality control, filter, analyze and summarize these data. However, scientists are often left to cobble together their analysis as a series of ad hoc steps, often using custom scripts and one-time processes that are poorly documented and rarely shared with the community. Even when literature-cited software tools are used, the input and output parameters differ from tool to tool. These parameters are rarely archived and the steps performed are lost, making the analysis virtually impossible to replicate precisely. What is missing is a coherent, robust framework for combining reliable, well-documented topographic data-processing steps into a workflow that can be repeated and even shared with others. We have taken several popular topographic data processing tools - including point cloud filtering and decimation as well as DEM differencing - and defined a common protocol for passing inputs and outputs between them. This presentation describes a free, public online portal that enables scientists to create custom workflows for processing topographic data using a number of popular topographic processing tools. Users provide the inputs required for each tool and the sequence in which they want to combine them. This information is then stored for future reuse (and optionally sharing with others) before the user downloads a single package that contains all the input and output specifications together with the software tools themselves. The user then launches the included batch file that executes the workflow on their local computer against their topographic data. This ZCloudTools architecture helps standardize, automate and archive topographic data processing. It also represents a forum for discovering and sharing effective topographic processing workflows.
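A minimal sketch of the underlying concept -- a declarative workflow whose tools and parameters are archived alongside the run so the exact analysis can be replayed or shared; the tool names and archive format are invented:

    import json, subprocess, datetime

    workflow = [
        {"tool": "decimate_points", "params": {"spacing": 0.5}},
        {"tool": "filter_ground",   "params": {"slope": 0.15}},
        {"tool": "dem_difference",  "params": {"epoch_a": "2012.tif", "epoch_b": "2013.tif"}},
    ]

    def run(workflow, archive="workflow_run.json"):
        log = {"started": datetime.datetime.now().isoformat(), "steps": []}
        for step in workflow:
            # Assume each tool is a command-line program with a common flag convention.
            args = [step["tool"]] + [f"--{k}={v}" for k, v in step["params"].items()]
            result = subprocess.run(args, capture_output=True, text=True)
            log["steps"].append({**step, "returncode": result.returncode})
        # Archive the full specification and outcomes so the run can be replayed.
        with open(archive, "w") as f:
            json.dump(log, f, indent=2)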
An Earthquake Shake Map Routine with Low Cost Accelerometers: Preliminary Results
NASA Astrophysics Data System (ADS)
Alcik, H. A.; Tanircan, G.; Kaya, Y.
2015-12-01
Vast amounts of high quality strong motion data are indispensable inputs for analyses in the fields of geotechnical and earthquake engineering; however, the high cost of installing strong motion systems constitutes the biggest obstacle to worldwide dissemination. In recent years, MEMS-based (micro-electro-mechanical systems) accelerometers have been used in seismological research-oriented studies as well as earthquake engineering oriented projects, basically due to the precision obtained in downsized instruments. In this research, our primary goal is to ensure the usability of these low-cost instruments in the creation of shake maps immediately after a strong earthquake. The second goal is to develop software that will automatically process the real-time data coming from the rapid response network and create the shake map. For those purposes, four MEMS sensors have been set up to deliver real-time data. Data transmission is done through 3G modems. A subroutine was coded in assembler language and embedded into the operating system of each instrument to create MiniSEED files with 1-second packets instead of 512-byte packets. The Matlab-based software calculates the strong motion (SM) parameters every second, and they are compared with the user-defined thresholds. A voting system embedded in the software captures the event if the total vote exceeds the threshold. The user interface of the software enables users to monitor the calculated SM parameters either in a table or in a graph (Figure 1). A small-scale and affordable rapid response network was created using four MEMS sensors, and the functionality of the software has been tested and validated using shake table tests. The entire system was tested together with a reference sensor under real strong ground motion recordings as well as series of sine waves with varying amplitude and frequency. The successful realization of this software allowed us to set up a test network at Tekirdağ Province, the closest coastal point to the moderate-size earthquake activity in the Marmara Sea, Turkey.
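A stripped-down sketch of the trigger logic described -- per-second strong-motion parameters compared against user-defined thresholds, with a weighted station vote deciding whether an event is captured (thresholds and weights invented):

    import numpy as np

    def station_votes(accel_1s, pga_thresh=0.05):
        """accel_1s: {station: latest 1-second acceleration window, in g}.
        A station votes when its peak ground acceleration exceeds the threshold."""
        return {s: int(np.abs(a).max() > pga_thresh) for s, a in accel_1s.items()}

    def event_declared(votes, weights=None, total_thresh=3):
        """Weighted vote across the network; the event is captured when the
        total exceeds the user-defined threshold (here: 3 of 4 equal-weight sensors)."""
        weights = weights or {s: 1.0 for s in votes}
        return sum(weights[s] * v for s, v in votes.items()) >= total_thresh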
Private Graphs - Access Rights on Graphs for Seamless Navigation
NASA Astrophysics Data System (ADS)
Dorner, W.; Hau, F.; Pagany, R.
2016-06-01
After the success of GNSS (Global Navigation Satellite Systems) and navigation services for public streets, indoor seems to be the next big development in navigational services, relying on RTLS (Real Time Locating Services, e.g., Wi-Fi) and allowing seamless navigation. In contrast to navigation and routing services on public streets, seamless navigation poses an additional challenge: how can routing data be made accessible to defined users, or access rights be restricted to defined areas or to parts of the graph for a defined user group? The paper presents case studies and data from the literature in which seamless and especially indoor navigation solutions are described (hospitals, industrial complexes, building sites), but where the problem of restricted access rights was touched on only from a real-world, not a technical, perspective. The analysis of the case studies shows that the objective of navigation and the different target groups for navigation solutions demand well-defined access rights and require solutions for making only parts of a graph available to a user or application to solve a navigational task. The paper therefore introduces the concept of private graphs, defined as graphs for navigational purposes covering the street, road or floor network of an area behind a public street, and suggests different approaches for making graph data for navigational purposes available while considering access rights as well as data protection, privacy and security issues.
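A sketch of the private-graph idea: each edge carries the user groups allowed to traverse it, and routing runs only on the subgraph visible to the requesting user (all structures hypothetical):

    from collections import deque

    # Edge list: (node_a, node_b, groups allowed to traverse this segment).
    edges = [
        ("street", "gate",  {"public", "staff"}),
        ("gate",   "lobby", {"public", "staff"}),
        ("gate",   "yard",  {"staff"}),          # area behind the public street
        ("yard",   "hall3", {"staff"}),
    ]

    def route(edges, start, goal, user_groups):
        """Breadth-first route over only the edges the user may traverse."""
        adj = {}
        for a, b, groups in edges:
            if groups & user_groups:              # access-right filter
                adj.setdefault(a, []).append(b)
                adj.setdefault(b, []).append(a)
        prev, queue = {start: None}, deque([start])
        while queue:
            n = queue.popleft()
            if n == goal:
                path = []
                while n is not None:
                    path.append(n)
                    n = prev[n]
                return path[::-1]
            for m in adj.get(n, []):
                if m not in prev:
                    prev[m] = n
                    queue.append(m)
        return None                               # unreachable with these rights

    print(route(edges, "street", "hall3", {"public"}))   # None
    print(route(edges, "street", "hall3", {"staff"}))    # full path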
Mass-storage management for distributed image/video archives
NASA Astrophysics Data System (ADS)
Franchi, Santina; Guarda, Roberto; Prampolini, Franco
1993-04-01
The realization of an image/video database requires a specific design for both database structures and mass storage management. This issue was addressed in the digital image/video database system designed at the IBM SEMEA Scientific & Technical Solution Center. Proper database structures have been defined to catalog image/video coding techniques with their related parameters, and to describe image/video contents. User workstations and servers are distributed along a local area network. Image/video files are not managed directly by the DBMS server. Because of their large size, they are stored outside the database on network devices. The database contains the pointers to the image/video files and the description of the storage devices. The system can use different kinds of storage media, organized in a hierarchical structure. Three levels of functions are available to manage the storage resources. The functions of the lower level provide media management: they allow cataloging devices and modifying device status and device network location. The medium level manages image/video files on a physical basis, handling file migration between high-capacity media and low-access-time media. The functions of the upper level work on image/video files on a logical basis, as they archive, move and copy image/video data selected by user-defined queries. These functions are used to support the implementation of a storage management strategy. The database information about the characteristics of both storage devices and coding techniques is used by the third-level functions to fit delivery/visualization requirements and to reduce archiving costs.
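A sketch of the medium-level migration policy described -- files move between low-access-time (fast) and high-capacity (slow) media based on recent access; the thresholds are invented:

    import time

    def migrate(catalog, hot_days=7, now=None):
        """catalog: {file_id: {"tier": "fast" | "slow", "last_access": epoch secs}}.
        Promote recently used image/video files to fast media; demote cold ones.
        Returns the moves; the physical copy is left to the lower-level functions."""
        now = now or time.time()
        moves = []
        for fid, meta in catalog.items():
            age_days = (now - meta["last_access"]) / 86400
            want = "fast" if age_days <= hot_days else "slow"
            if meta["tier"] != want:
                moves.append((fid, meta["tier"], want))
                meta["tier"] = want
        return moves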
ERIC Educational Resources Information Center
Garcia-Barriocanal, Elena; Sicilia, Miguel-Angel; Sanchez-Alonso, Salvador; Lytras, Miltiadis
2011-01-01
Web 2.0 technologies can be considered a loosely defined set of Web application styles that foster a kind of media consumer more engaged, and usually active in creating and maintaining Internet contents. Thus, Web 2.0 applications have resulted in increased user participation and massive user-generated (or user-published) open multimedia content,…
Identity management and privacy languages technologies: Improving user control of data privacy
NASA Astrophysics Data System (ADS)
García, José Enrique López; García, Carlos Alberto Gil; Pacheco, Álvaro Armenteros; Organero, Pedro Luis Muñoz
Identity management solutions have the capability to bring confidence to Internet services, but this confidence could be improved if the user had more control over the privacy policies governing his or her attributes. Privacy languages could help with this task because of their capability to define privacy policies for data in a very flexible way. An integration problem therefore arises: making identity management and privacy languages work together. Although several proposals for accomplishing this have already been defined, this paper suggests some topics and improvements that could be considered.
SMOG 2: A Versatile Software Package for Generating Structure-Based Models.
Noel, Jeffrey K; Levi, Mariana; Raghunathan, Mohit; Lammert, Heiko; Hayes, Ryan L; Onuchic, José N; Whitford, Paul C
2016-03-01
Molecular dynamics simulations with coarse-grained or simplified Hamiltonians have proven to be an effective means of capturing the functionally important long-time and large-length scale motions of proteins and RNAs. Originally developed in the context of protein folding, structure-based models (SBMs) have since been extended to probe a diverse range of biomolecular processes, spanning from protein and RNA folding to functional transitions in molecular machines. The hallmark feature of a structure-based model is that part, or all, of the potential energy function is defined by a known structure. Within this general class of models, there exist many possible variations in resolution and energetic composition. SMOG 2 is a downloadable software package that reads user-designated structural information and user-defined energy definitions, in order to produce the files necessary to use SBMs with high performance molecular dynamics packages: GROMACS and NAMD. SMOG 2 is bundled with XML-formatted template files that define commonly used SBMs, and it can process template files that are altered according to the needs of each user. This computational infrastructure also allows for experimental or bioinformatics-derived restraints or novel structural features to be included, e.g. novel ligands, prosthetic groups and post-translational/transcriptional modifications. The code and user guide can be downloaded at http://smog-server.org/smog2.
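SMOG 2 itself emits GROMACS/NAMD input files from templates; purely to illustrate what "the potential energy function is defined by a known structure" means, here is a toy Gō-style 12-10 native-contact term in Python -- not SMOG's actual functional form or parameters:

    import numpy as np

    def contact_energy(r, r_native, eps=1.0):
        """Toy 12-10 native-contact term: minimized (at -eps) when the pair
        sits at its distance in the known reference structure."""
        x = r_native / r
        return eps * (5.0 * x**12 - 6.0 * x**10)

    def sbm_energy(coords, native_pairs):
        """coords: (N, 3) array; native_pairs: [(i, j, r_native)] taken from the
        reference structure -- the structure itself defines the energy minima."""
        e = 0.0
        for i, j, r0 in native_pairs:
            r = np.linalg.norm(coords[i] - coords[j])
            e += contact_energy(r, r0)
        return e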
Fast online generalized multiscale finite element method using constraint energy minimization
NASA Astrophysics Data System (ADS)
Chung, Eric T.; Efendiev, Yalchin; Leung, Wing Tat
2018-02-01
Local multiscale methods often construct multiscale basis functions in the offline stage without taking into account input parameters, such as source terms, boundary conditions, and so on. These basis functions are then used in the online stage with a specific input parameter to solve the global problem at a reduced computational cost. Recently, online approaches have been introduced, where multiscale basis functions are adaptively constructed in some regions to reduce the error significantly. In multiscale methods, it is desirable to need only 1-2 iterations to reduce the error to a desired threshold. Using the Generalized Multiscale Finite Element Framework [10], it was shown that by choosing a sufficient number of offline basis functions, the error reduction can be made independent of physical parameters, such as scales and contrast. In this paper, our goal is to improve this. Using our recently proposed approach [4] and a special online basis construction in oversampled regions, we show that the error reduction can be made sufficiently large by appropriately selecting oversampling regions. Our numerical results show that one can achieve a three-order-of-magnitude error reduction, which is better than our previous methods. We also develop an adaptive algorithm that enriches selected regions with large residuals. In our adaptive method, we show that the convergence rate can be determined by a user-defined parameter, and we confirm this by numerical simulations. The analysis of the method is presented.
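A schematic of the adaptive loop described (not the paper's algorithm): estimate residual indicators per coarse region, enrich the regions carrying the largest share of the residual, and repeat; the marking fraction plays the role of the user-defined parameter steering the convergence rate:

    import numpy as np

    def adaptive_enrich(solve, residuals, enrich, theta=0.7, tol=1e-8, max_iter=10):
        """Generic residual-driven online enrichment (Dörfler-style marking).
        solve()      -> current multiscale solution
        residuals(u) -> array of residual indicators, one per coarse region
        enrich(ids)  -> construct extra online basis functions in those regions
        theta        -> user-defined marking fraction (placeholder name)"""
        u = solve()
        for _ in range(max_iter):
            eta = residuals(u)
            if eta.sum() < tol:
                break
            # Mark the smallest set of regions holding a theta-fraction of the residual.
            order = np.argsort(eta)[::-1]
            k = int(np.searchsorted(np.cumsum(eta[order]), theta * eta.sum())) + 1
            enrich(order[:k])
            u = solve()
        return u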
NASA Astrophysics Data System (ADS)
Becker, Roland; Vexler, Boris
2005-06-01
We consider the calibration of parameters in physical models described by partial differential equations. This task is formulated as a constrained optimization problem with a cost functional of least squares type, using information obtained from measurements. An important issue in the numerical solution of this type of problem is the control of the errors introduced, first, by discretization of the equations describing the physical model, and second, by measurement errors or other perturbations. Our strategy is as follows: we suppose that the user defines an interest functional I, which might depend on both the state variable and the parameters and which represents the goal of the computation. First, we propose an a posteriori error estimator which measures the error with respect to this functional. This error estimator is used in an adaptive algorithm to construct economical meshes by local mesh refinement. The proposed estimator requires the solution of an auxiliary linear equation. Second, we address the question of sensitivity. Applying similar techniques as before, we derive quantities which describe the influence of small changes in the measurements on the value of the interest functional. These numbers, which we call relative condition numbers, give additional information on the problem under consideration. They can be computed by means of the solution of the auxiliary problem determined before. Finally, we demonstrate our approach on a parameter calibration problem for a model flow problem.
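In symbols, and only as a sketch consistent with this description (the notation is assumed, not taken from the paper): with state u, parameters q, observation operator C and measurements \bar{C}, the calibration problem and the quantity the estimator controls read:

    % Constrained least-squares calibration (notation assumed):
    \min_{u,q}\; J(u,q) = \tfrac{1}{2}\,\lVert C(u) - \bar{C} \rVert^2
    \quad\text{subject to}\quad A(u,q) = f .
    % The a posteriori estimator bounds the discretization error measured in the
    % user-defined interest functional I and drives local mesh refinement:
    \lvert I(u,q) - I(u_h,q_h) \rvert \;\lesssim\; \sum_{K} \eta_K .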
PRIMO: An Interactive Homology Modeling Pipeline.
Hatherley, Rowan; Brown, David K; Glenister, Michael; Tastan Bishop, Özlem
2016-01-01
The development of automated servers to predict the three-dimensional structure of proteins has seen much progress over the years. These servers make calculations simpler, but largely exclude users from the process. In this study, we present the PRotein Interactive MOdeling (PRIMO) pipeline for homology modeling of protein monomers. The pipeline eases the multi-step modeling process, and reduces the workload required by the user, while still allowing engagement from the user during every step. Default parameters are given for each step, which can either be modified or supplemented with additional external input. PRIMO has been designed for users of varying levels of experience with homology modeling. The pipeline incorporates a user-friendly interface that makes it easy to alter parameters used during modeling. During each stage of the modeling process, the site provides suggestions for novice users to improve the quality of their models. PRIMO provides functionality that allows users to also model ligands and ions in complex with their protein targets. Herein, we assess the accuracy of the fully automated capabilities of the server, including a comparative analysis of the available alignment programs, as well as of the refinement levels used during modeling. The tests presented here demonstrate the reliability of the PRIMO server when producing a large number of protein models. While PRIMO does focus on user involvement in the homology modeling process, the results indicate that in the presence of suitable templates, good quality models can be produced even without user intervention. This gives an idea of the base level accuracy of PRIMO, which users can improve upon by adjusting parameters in their modeling runs. The accuracy of PRIMO's automated scripts is being continuously evaluated by the CAMEO (Continuous Automated Model EvaluatiOn) project. The PRIMO site is free for non-commercial use and can be accessed at https://primo.rubi.ru.ac.za/.
PRIMO: An Interactive Homology Modeling Pipeline
Glenister, Michael
2016-01-01
The development of automated servers to predict the three-dimensional structure of proteins has seen much progress over the years. These servers make calculations simpler, but largely exclude users from the process. In this study, we present the PRotein Interactive MOdeling (PRIMO) pipeline for homology modeling of protein monomers. The pipeline eases the multi-step modeling process, and reduces the workload required by the user, while still allowing engagement from the user during every step. Default parameters are given for each step, which can either be modified or supplemented with additional external input. PRIMO has been designed for users of varying levels of experience with homology modeling. The pipeline incorporates a user-friendly interface that makes it easy to alter parameters used during modeling. During each stage of the modeling process, the site provides suggestions for novice users to improve the quality of their models. PRIMO provides functionality that allows users to also model ligands and ions in complex with their protein targets. Herein, we assess the accuracy of the fully automated capabilities of the server, including a comparative analysis of the available alignment programs, as well as of the refinement levels used during modeling. The tests presented here demonstrate the reliability of the PRIMO server when producing a large number of protein models. While PRIMO does focus on user involvement in the homology modeling process, the results indicate that in the presence of suitable templates, good quality models can be produced even without user intervention. This gives an idea of the base level accuracy of PRIMO, which users can improve upon by adjusting parameters in their modeling runs. The accuracy of PRIMO’s automated scripts is being continuously evaluated by the CAMEO (Continuous Automated Model EvaluatiOn) project. The PRIMO site is free for non-commercial use and can be accessed at https://primo.rubi.ru.ac.za/. PMID:27855192
Provides detailed guidance to the user on how to select input parameters for running the Terrestrial Investigation Model (TIM) and recommendations for default values that can be used when no chemical-specific or species-specific information is available.
The use and misuse of V(c,max) in Earth System Models.
Rogers, Alistair
2014-02-01
Earth System Models (ESMs) aim to project global change. Central to this aim is the need to accurately model global carbon fluxes. Photosynthetic carbon dioxide assimilation by the terrestrial biosphere is the largest of these fluxes, and in many ESMs is represented by the Farquhar, von Caemmerer and Berry (FvCB) model of photosynthesis. The maximum rate of carboxylation by the enzyme Rubisco, commonly termed Vc,max, is a key parameter in the FvCB model. This study investigated the derivation of the values of Vc,max used to represent different plant functional types (PFTs) in ESMs. Four methods for estimating Vc,max were identified; (1) an empirical or (2) mechanistic relationship was used to relate Vc,max to leaf N content, (3) Vc,max was estimated using an approach based on the optimization of photosynthesis and respiration, or (4) calibration of a user-defined Vc,max was used to obtain a target model output. Despite representing the same PFTs, the land model components of ESMs were parameterized with a wide range of values for Vc,max (-46 to +77% of the PFT mean). In many cases, parameterization was based on limited data sets and poorly defined coefficients that were used to adjust model parameters and set PFT-specific values for Vc,max. Examination of the models that linked leaf N mechanistically to Vc,max identified potential changes to fixed parameters that collectively would decrease Vc,max by 31% in C3 plants and 11% in C4 plants. Plant trait data bases are now available that offer an excellent opportunity for models to update PFT-specific parameters used to estimate Vc,max. However, data for parameterizing some PFTs, particularly those in the Tropics and the Arctic, are either highly variable or largely absent.
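Method (2), the mechanistic route from area-based leaf N to Vc,max via Rubisco, has the general form sketched below; the coefficient values shown are illustrative stand-ins, not those of any particular ESM:

    def vcmax25_from_leafN(n_area, f_lnr, f_nr=7.16, a_r25=60.0):
        """Mechanistic estimate of Vc,max at 25 C (umol CO2 m-2 s-1).
        n_area : area-based leaf nitrogen (g N m-2 leaf)
        f_lnr  : fraction of leaf N in Rubisco (the key PFT-specific coefficient)
        f_nr   : g Rubisco per g N in Rubisco (illustrative value)
        a_r25  : specific activity of Rubisco at 25 C (illustrative value)"""
        return n_area * f_lnr * f_nr * a_r25

    # Revising the fixed coefficients downward propagates directly into Vc,max,
    # which is how the collective 31% (C3) / 11% (C4) decreases would arise.
    print(vcmax25_from_leafN(n_area=2.0, f_lnr=0.1))   # ~86 umol m-2 s-1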
SpaceWire Protocol ID: What Does It Mean To You?
NASA Technical Reports Server (NTRS)
Rakow, Glenn; Schnurr, Richard; Gilley, Daniel; Parks, Steve
2006-01-01
SpaceWire is becoming a popular solution for satellite high-speed data buses because it is a simple standard that provides great flexibility for a wide range of system requirements. It is simple in packet format and protocol, allowing users to easily tailor their implementation for their specific application. Some of the attractive aspects of SpaceWire that make it easy to implement also make it hard to reuse. Protocol reuse is difficult because SpaceWire does not have a defined mechanism to communicate with the higher layers of the protocol stack. This has forced users of SpaceWire to define unique packet formats and to define how these packets are to be processed. Each mission writes its own Interface Control Document (ICD) and tailors SpaceWire for its specific requirements, making reuse difficult. Part of the reason for this habit may be that engineers typically optimize designs for their own requirements in the absence of a standard. This is an inefficient use of project resources and costs more to develop missions. A new packet format for SpaceWire has been defined as a solution for this problem. This new packet format is a complement to the SpaceWire standard that will support protocol development upon SpaceWire. The new packet definition does not replace the current packet structure, i.e., does not make the standard obsolete, but merely extends the standard for those who want to develop protocols over SpaceWire. The SpaceWire packet is defined with the first part being the Destination Address, which may be one or more bytes. This is followed by the packet cargo, which is user defined. The cargo is terminated with an End-Of-Packet (EOP) marker. This packet structure offers low packet overhead and allows the user to define how the contents are to be formatted. It also provides for many different addressing schemes, which provide flexibility in the system. This packet flexibility is typically an attractive part of SpaceWire. The new extended packet format adds one new field to the packet that greatly enhances the capability of SpaceWire. This new field, called the Protocol Identifier (ID), is used to identify the packet contents and the associated processing for the packet. This feature, along with the restriction in the packet format that uses the Protocol ID, allows a deterministic method of decoding packets that was not possible before. The first part of the packet is still the Destination Address, which still conforms to the original standard but with one restriction: the first byte seen at the destination by the user needs to be a logical address, independent of the addressing scheme used. The second field is defined as the Protocol ID, which is usually one byte in length. The packet cargo (user defined) follows the Protocol ID. After the packet cargo is the EOP, which defines the end of packet. The value of the Protocol ID is assigned by the SpaceWire working group and the protocol description published for others to use. The development of protocols for SpaceWire is currently the area of greatest activity by the SpaceWire working group. The first protocol definition by the working group has been completed and is now in the process of formal standardization. There are many other protocols in development for missions that have not yet received formal Protocol ID assignment, but even if these protocols have not been formally assigned a value, this effort will provide synergy for future developments.
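A Python sketch of the extended packet layout (the path address and cargo here are placeholders; Protocol ID values are assigned by the SpaceWire working group, with 0x01 designating RMAP):

    def build_packet(dest_path, logical_addr, protocol_id, cargo):
        """Extended SpaceWire packet: [path address bytes...][logical address]
        [protocol ID][user-defined cargo], terminated by EOP at the link level.
        The key restriction: the first byte seen at the destination must be a
        logical address, so the Protocol ID always follows deterministically."""
        assert 32 <= logical_addr <= 255, "logical addresses occupy 32..255"
        return bytes(dest_path) + bytes([logical_addr, protocol_id]) + bytes(cargo)
        # The EOP marker is appended by the link layer, not as a cargo byte.

    pkt = build_packet(dest_path=[1, 4],          # path routing through two routers
                       logical_addr=0xFE,
                       protocol_id=0x01,          # 0x01: RMAP (placeholder choice)
                       cargo=b"\x00\x01\x02")
    print(pkt.hex())                              # 0104fe01000102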
Use of NARCCAP results for extremes: British Columbia case studies
NASA Astrophysics Data System (ADS)
Murdock, T. Q.; Eckstrand, H.; Buerger, G.; Hiebert, J.
2011-12-01
Demand for projections of extremes has arisen out of local infrastructure vulnerability assessments and adaptation planning. Four preliminary analyses of extremes have been undertaken in British Columbia in the past two years in collaboration with users: BC Ministry of Transportation and Infrastructure, Engineers Canada, the City of Castlegar, and the Columbia Basin Trust. Projects have included analysis of extremes for stormwater management, highways, and community adaptation in different areas of the province. This need for projections of extremes has been met using an ensemble of Regional Climate Model (RCM) results from NARCCAP, in some cases supplemented by and compared to statistical downscaling. Before assessing indices of extremes, each RCM simulation in the NARCCAP ensemble driven by reanalysis (NCEP) was compared to historical observations to assess RCM skill. Next, the anomalies according to each RCM future projection were compared to those of their driving GCM to determine the "value added" by the RCMs. Selected results will be shown for several indices of extremes, including the Climdex set of indices that has been widely used elsewhere (e.g., Stardex) and specific parameters of interest defined by users. Finally, the need for threshold scaling of some indices and the use of as large an ensemble as possible will be illustrated.
Analyzing Aeroelastic Stability of a Tilt-Rotor Aircraft
NASA Technical Reports Server (NTRS)
Kvaternik, Raymond G.
2006-01-01
Proprotor Aeroelastic Stability Analysis, now at version 4.5 (PASTA 4.5), is a FORTRAN computer program for analyzing the aeroelastic stability of a tiltrotor aircraft in the airplane mode of flight. The program employs a 10-degree-of-freedom (DOF), discrete-coordinate, linear mathematical model of a rotor with three or more blades and its drive system coupled to a 10-DOF modal model of an airframe. The user can select which DOFs are included in the analysis. Quasi-steady strip-theory aerodynamics is employed for the aerodynamic loads on the blades, a quasi-steady representation is employed for the aerodynamic loads acting on the vibrational modes of the airframe, and a stability-derivative approach is used for the aerodynamics associated with the rigid-body DOFs of the airframe. Blade parameters that vary with the blade collective pitch can be obtained by interpolation from a user-defined table. Stability is determined by examining the eigenvalues that are obtained by solving the coupled equations of motion as a matrix eigenvalue problem. Notwithstanding the relative simplicity of its mathematical foundation, PASTA 4.5 and its predecessors have played key roles in a number of engineering investigations over the years.
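The eigenvalue test described above can be sketched generically: cast a damped linear system M q'' + C q' + K q = 0 into first-order form and check that all eigenvalues have negative real parts. This is a minimal illustration of the approach, not PASTA's actual rotor-airframe model.

```python
import numpy as np

def stability_eigenvalues(M, C, K):
    """Eigenvalues of the first-order form x' = A x of M q'' + C q' + K q = 0."""
    n = M.shape[0]
    Minv = np.linalg.inv(M)
    A = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-Minv @ K,        -Minv @ C]])
    return np.linalg.eigvals(A)

# Illustrative 2-DOF system with light damping.
M = np.eye(2)
C = 0.05 * np.eye(2)
K = np.array([[2.0, -1.0], [-1.0, 2.0]])
eig = stability_eigenvalues(M, C, K)
print("stable:", bool(np.all(eig.real < 0)))
```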
Unified Software Solution for Efficient SPR Data Analysis in Drug Research
Dahl, Göran; Steigele, Stephan; Hillertz, Per; Tigerström, Anna; Egnéus, Anders; Mehrle, Alexander; Ginkel, Martin; Edfeldt, Fredrik; Holdgate, Geoff; O’Connell, Nichole; Kappler, Bernd; Brodte, Annette; Rawlins, Philip B.; Davies, Gareth; Westberg, Eva-Lotta; Folmer, Rutger H. A.; Heyse, Stephan
2016-01-01
Surface plasmon resonance (SPR) is a powerful method for obtaining detailed molecular interaction parameters. Modern instrumentation with its increased throughput has enabled routine screening by SPR in hit-to-lead and lead optimization programs, and SPR has become a mainstream drug discovery technology. However, the processing and reporting of SPR data in drug discovery are typically performed manually, which is both time-consuming and tedious. Here, we present the workflow concept, design, and experiences with a software module relying on a single, browser-based software platform for the processing, analysis, and reporting of SPR data. The efficiency of this concept lies in the immediate availability of end results: data are processed and analyzed upon loading the raw data file, allowing the user to quality-control the results immediately. Once completed, the user can automatically report those results to data repositories for corporate access and quickly generate printed reports or documents. The software module has resulted in a very efficient and effective workflow through saved time and improved quality control. We discuss these benefits and show how this process defines a new benchmark in the drug discovery industry for the handling, interpretation, visualization, and sharing of SPR data. PMID:27789754
Rodríguez, J; Premier, G C; Dinsdale, R; Guwy, A J
2009-01-01
Mathematical modelling in environmental biotechnology has traditionally been a difficult resource to access for researchers and students without programming expertise. The great degree of flexibility required of model implementation platforms to be suitable for research applications restricts their use to expert programmers. More user-friendly software packages, however, do not normally incorporate the flexibility needed for most research applications. This work presents a methodology based on Excel and Matlab-Simulink for both flexible and accessible implementation of mathematical models by researchers with and without programming expertise. The models are almost fully defined in an Excel file in which the names and values of the state variables and parameters are easily created. This information is automatically processed in Matlab to create the model structure, and model simulation is possible almost immediately, after only a minimal Matlab code definition. The framework proposed also provides expert programmers with a highly flexible and modifiable platform on which to base more complex model implementations. The method takes advantage of structural generalities in most mathematical models of environmental bioprocesses while enabling the integration of advanced elements (e.g. heuristic functions, correlations). The methodology has already been used successfully in a number of research studies.
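A minimal Python analogue of this spreadsheet-driven workflow is sketched below, assuming the Excel definitions are exported to a CSV-like table; the file layout, the first-order uptake kinetics, and the parameter names are illustrative assumptions, not the paper's actual templates.

```python
import csv
import io
from scipy.integrate import solve_ivp

# State variables and parameters declared in a spreadsheet-like table.
SPREADSHEET = """name,kind,value
S,state,10.0
X,state,1.0
k,parameter,0.4
Y,parameter,0.5
"""

rows = list(csv.DictReader(io.StringIO(SPREADSHEET)))
states = [r["name"] for r in rows if r["kind"] == "state"]
y0 = [float(r["value"]) for r in rows if r["kind"] == "state"]
p = {r["name"]: float(r["value"]) for r in rows if r["kind"] == "parameter"}

def rhs(t, y):
    S, X = y                    # substrate, biomass
    r = p["k"] * S * X          # first-order uptake (illustrative kinetics)
    return [-r, p["Y"] * r]     # dS/dt, dX/dt with yield coefficient Y

sol = solve_ivp(rhs, (0.0, 10.0), y0)
print(dict(zip(states, sol.y[:, -1])))
```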
NASA Astrophysics Data System (ADS)
Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi
2017-01-01
Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow hides the low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflows is important in order to support large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To examine the semantic correctness of user-defined workflows, in this paper we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameter matching error validation.
Code Parallelization with CAPO: A User Manual
NASA Technical Reports Server (NTRS)
Jin, Hao-Qiang; Frumkin, Michael; Yan, Jerry; Biegel, Bryan (Technical Monitor)
2001-01-01
A software tool has been developed to assist the parallelization of scientific codes. This tool, CAPO, extends an existing parallelization toolkit, CAPTools developed at the University of Greenwich, to generate OpenMP parallel codes for shared memory architectures. This is an interactive toolkit to transform a serial Fortran application code to an equivalent parallel version of the software - in a small fraction of the time normally required for a manual parallelization. We first discuss the way in which loop types are categorized and how efficient OpenMP directives can be defined and inserted into the existing code using the in-depth interprocedural analysis. The use of the toolkit on a number of application codes, ranging from benchmarks to real-world applications, is presented. This demonstrates the great potential of using the toolkit to quickly parallelize serial programs as well as the good performance achievable on a large number of processors. The second part of the document gives references to the parameters and the graphical user interface implemented in the toolkit. Finally, a set of tutorials is included for hands-on experience with this toolkit.
Coarsening of three-dimensional structured and unstructured grids for subsurface flow
NASA Astrophysics Data System (ADS)
Aarnes, Jørg Espen; Hauge, Vera Louise; Efendiev, Yalchin
2007-11-01
We present a generic, semi-automated algorithm for generating non-uniform coarse grids for modeling subsurface flow. The method is applicable to arbitrary grids and does not impose smoothness constraints on the coarse grid. One therefore avoids conventional smoothing procedures that are commonly used to ensure that the grids obtained with standard coarsening procedures are not too rough. The coarsening algorithm is very simple and essentially involves only two parameters that specify the level of coarsening. Consequently the algorithm allows the user to specify the simulation grid dynamically to fit available computer resources, and, e.g., use the original geomodel as input for flow simulations. This is of great importance since coarse grid-generation is normally the most time-consuming part of an upscaling phase, and therefore the main obstacle that has prevented simulation workflows with user-defined resolution. We apply the coarsening algorithm to a series of two-phase flow problems on both structured (Cartesian) and unstructured grids. The numerical results demonstrate that one consistently obtains significantly more accurate results using the proposed non-uniform coarsening strategy than with corresponding uniform coarse grids with roughly the same number of cells.
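The abstract does not spell out the algorithm, but the two-parameter idea can be read roughly as follows: sweep the fine cells and close a coarse block whenever either an accumulated flow-indicator bound or a cell-count bound is reached. The 1-D sketch below is an illustrative interpretation, not the authors' exact procedure.

```python
import numpy as np

def coarsen(flow_indicator, max_flow, max_cells):
    """Amalgamate fine cells into blocks bounded by two user parameters."""
    blocks, current, acc = [], [], 0.0
    for i, f in enumerate(flow_indicator):
        current.append(i)
        acc += f
        if acc >= max_flow or len(current) >= max_cells:
            blocks.append(current)
            current, acc = [], 0.0
    if current:
        blocks.append(current)
    return blocks

rng = np.random.default_rng(0)
flow = rng.lognormal(sigma=1.5, size=40)          # heterogeneous flow field
blocks = coarsen(flow, max_flow=flow.sum() / 8, max_cells=10)
print(len(blocks), [len(b) for b in blocks])      # fewer cells per block where flow is high
```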
Sequential addition of short DNA oligos in DNA-polymerase-based synthesis reactions
Gardner, Shea N; Mariella, Jr., Raymond P; Christian, Allen T; Young, Jennifer A; Clague, David S
2013-06-25
A method of preselecting a multiplicity of DNA sequence segments that will comprise the DNA molecule of user-defined sequence, separating the DNA sequence segments temporally, and combining the multiplicity of DNA sequence segments with at least one polymerase enzyme wherein the multiplicity of DNA sequence segments join to produce the DNA molecule of user-defined sequence. Sequence segments may be of length n, where n is an odd integer. In one embodiment the length of desired hybridizing overlap is specified by the user and the sequences and the protocol for combining them are guided by computational (bioinformatics) predictions. In one embodiment sequence segments are combined from multiple reading frames to span the same region of a sequence, so that multiple desired hybridizations may occur with different overlap lengths.
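A toy version of the segment-preselection step is sketched below: a user-defined target sequence is split into short overlapping pieces with a user-chosen hybridizing-overlap length, keeping the segment length odd as in the claim. The sequence and sizes are illustrative, and the final piece may come out shorter than the rest.

```python
def split_segments(seq: str, seg_len: int, overlap: int):
    """Split seq into segments of odd length seg_len sharing `overlap` bases."""
    assert seg_len % 2 == 1, "segments are of odd length n"
    step = seg_len - overlap
    return [seq[i:i + seg_len] for i in range(0, len(seq) - overlap, step)]

target = "ATGGCGTACGTTAGCCGTAAC"
for segment in split_segments(target, seg_len=9, overlap=4):
    print(segment)
```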
Designing Class Methods from Dataflow Diagrams
NASA Astrophysics Data System (ADS)
Shoval, Peretz; Kabeli-Shani, Judith
A method for designing the class methods of an information system is described. The method is part of FOOM - Functional and Object-Oriented Methodology. In the analysis phase of FOOM, two models defining the users' requirements are created: a conceptual data model - an initial class diagram; and a functional model - hierarchical OO-DFDs (object-oriented dataflow diagrams). Based on these models, a well-defined process of methods design is applied. First, the OO-DFDs are converted into transactions, i.e., system processes that support user tasks. The components and the process logic of each transaction are described in detail, using pseudocode. Then, each transaction is decomposed, according to well-defined rules, into class methods of various types: basic methods, application-specific methods and main transaction (control) methods. Each method is attached to a proper class; messages between methods express the process logic of each transaction. The methods are defined using pseudocode or message charts.
A strategy to determine operating parameters in tissue engineering hollow fiber bioreactors
Shipley, RJ; Davidson, AJ; Chan, K; Chaudhuri, JB; Waters, SL; Ellis, MJ
2011-01-01
The development of tissue engineering hollow fiber bioreactors (HFB) requires the optimal design of the geometry and operating parameters of the system. This article provides a strategy for specifying operating conditions for the system based on mathematical models of oxygen delivery to the cell population. Analytical and numerical solutions of these models are developed based on Michaelis–Menten kinetics. Depending on the minimum oxygen concentration required to culture a functional cell population, together with the oxygen uptake kinetics, the strategy dictates the model needed to describe mass transport so that the operating conditions can be defined. If cmin ≫ Km, we capture oxygen uptake using zero-order kinetics and proceed analytically. This enables operating equations to be developed that allow the user to choose the medium flow rate, lumen length, and ECS depth to provide a prescribed value of cmin. When cmin ~ Km, we use numerical techniques to solve the full Michaelis–Menten kinetics and present operating data for the bioreactor. The strategy presented utilizes both analytical and numerical approaches and can be applied to any cell type with known oxygen transport properties and uptake kinetics. PMID:21370228
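The two kinetic regimes can be illustrated with a toy, well-mixed uptake model (the paper's model resolves transport through the bioreactor, which this sketch does not attempt); Vmax, Km, and c0 below are illustrative values.

```python
import numpy as np
from scipy.integrate import solve_ivp

Vmax, Km, c0 = 1.0, 0.1, 1.0

mm = lambda t, c: [-Vmax * c[0] / (Km + c[0])]   # full Michaelis-Menten uptake
t = np.linspace(0.0, 0.5, 6)
c_mm = solve_ivp(mm, (0.0, 0.5), [c0], t_eval=t).y[0]
c_zero = c0 - Vmax * t                           # zero-order limit, valid for c >> Km

# Agreement is close while c stays well above Km and degrades as c -> Km.
print(np.max(np.abs(c_mm - c_zero)))
```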
NASA Astrophysics Data System (ADS)
Hamim, Salah Uddin Ahmed
Nanoindentation involves probing a hard diamond tip into a material, where the load and the displacement experienced by the tip are recorded continuously. This load-displacement data is a direct function of the material's innate stress-strain behavior. Thus, it is theoretically possible to extract the mechanical properties of a material through nanoindentation. However, due to the various nonlinearities associated with nanoindentation, the process of interpreting load-displacement data into material properties is difficult. Although simple elastic behavior can be characterized easily, a method to characterize complicated material behavior such as nonlinear viscoelasticity is still lacking. In this study, a nanoindentation-based material characterization technique is developed to characterize soft materials exhibiting nonlinear viscoelasticity. The nanoindentation experiment was modeled in finite element analysis software (ABAQUS), where nonlinear viscoelastic behavior was incorporated using a user-defined subroutine (UMAT). The model parameters were calibrated using a process called inverse analysis. In this study, a surrogate model-based approach was used for the inverse analysis. The different factors affecting surrogate model performance are analyzed in order to optimize the performance with respect to computational cost.
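The surrogate-based inverse analysis loop can be sketched in a few lines: sample the expensive simulation at a handful of parameter values, fit a cheap surrogate, and minimize the misfit to the observation on the surrogate. Below, a closed-form stand-in replaces the ABAQUS model, and all numbers are illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def expensive_model(k):
    """Stand-in for a finite element nanoindentation run."""
    return 2.0 * np.sqrt(k)                  # e.g., peak load vs. parameter k

target = expensive_model(1.7)                # pseudo-experimental observation

samples = np.linspace(0.5, 3.0, 7)           # design of experiments
responses = np.array([expensive_model(k) for k in samples])
surrogate = np.poly1d(np.polyfit(samples, responses, 2))

misfit = lambda k: (surrogate(k) - target) ** 2
k_hat = minimize_scalar(misfit, bounds=(0.5, 3.0), method="bounded").x
print(round(float(k_hat), 3))                # recovers a value close to 1.7
```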
Guidelines to electrode positioning for human and animal electrical impedance myography research
NASA Astrophysics Data System (ADS)
Sanchez, Benjamin; Pacheck, Adam; Rutkove, Seward B.
2016-09-01
The positioning of electrodes in electrical impedance myography (EIM) is critical for accurately assessing disease progression and the effectiveness of treatment. In human and animal trials for neuromuscular disorders, inconsistent electrode positioning adds error to the muscle impedance. Despite its importance, how the reproducibility of resistance and reactance, the two parameters that define EIM, is affected by changes in electrode positioning remains unknown. In this paper, we present a novel approach founded on biophysical principles to study the reproducibility of resistance and reactance under electrode misplacement. The analytical framework presented allows the user to quantify a priori the effect on the muscle resistance and reactance using only one parameter: the uncertainty in placing the electrodes. We also provide quantitative data on the precision needed to position the electrodes and the minimum muscle length needed to achieve a pre-specified EIM reproducibility. The results reported here are confirmed with finite element model simulations and measurements on five healthy subjects. Ultimately, our data can serve as normative values to enhance the reliability of EIM as a biomarker and facilitate comparability of future human and animal studies.
CalFitter: a web server for analysis of protein thermal denaturation data.
Mazurenko, Stanislav; Stourac, Jan; Kunka, Antonin; Nedeljkovic, Sava; Bednar, David; Prokop, Zbynek; Damborsky, Jiri
2018-05-14
Despite significant advances in the understanding of protein structure-function relationships, revealing protein folding pathways still poses a challenge due to a limited number of relevant experimental tools. Widely used experimental techniques, such as calorimetry or spectroscopy, critically depend on proper data analysis. Currently, only separate data analysis tools are available for each type of experiment, with a limited selection of models. To address this problem, we have developed the CalFitter web server to be a unified platform for comprehensive data fitting and analysis of protein thermal denaturation data. The server allows simultaneous global data fitting using any combination of input data types and offers 12 protein unfolding pathway models for selection, including irreversible transitions often missing from other tools. The data fitting produces optimal parameter values, their confidence intervals, and statistical information to define unfolding pathways. The server provides an interactive and easy-to-use interface that allows users to directly analyse input datasets and simulate modelled output based on the model parameters. The CalFitter web server is available free of charge at https://loschmidt.chemi.muni.cz/calfitter/.
NCTM workshop splinter session, IR thermal measurement instruments
NASA Astrophysics Data System (ADS)
Kaplan, Herbert
1989-06-01
The splinter session dealing with commercial industrial thermal measurement state-of-the-hardware had a total attendance of 15. Two papers were presented in the splinter session as follows: (1) Development of an Infrared Imaging System for the Surface Tension Driven Convection Experiment, Alexander D. Pline, NASA LeRC; (2) A Space-qualified PtSi Thermal Imaging System, Robert W. Astheimer, Barnes Engineering Div., EDO Corp. In addition, a brief description of SPRITE detector technology was presented by Richard F. Leftwich of Magnavox. As anticipated, the discussions were concerned mainly with thermal imaging figures of merit rather than those for point measurement instruments. The need for uniform guidelines whereby infrared thermal imaging instruments could be specified and evaluated was identified as most important, particularly where temperature measurements are required. Presently there are differences in the way different manufacturers present significant performance parameters in their instrument data sheets. Furthermore, the prospective user has difficulty relating these parameters to actual measurement needs, and procedures by which performance can be verified are poorly defined. The current availability of powerful thermal imaging diagnostic software was discussed.
Software-defined Quantum Networking Ecosystem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humble, Travis S.; Sadlier, Ronald
The software enables a user to perform modeling and simulation of software-defined quantum networks. The software addresses the problem of how to synchronize transmission of quantum and classical signals through multi-node networks and to demonstrate quantum information protocols such as quantum teleportation. The software approaches this problem by generating a graphical model of the underlying network and attributing properties to each node and link in the graph. The graphical model is then simulated using a combination of discrete-event simulators to calculate the expected state of each node and link in the graph at a future time. A user interacts with the software by providing an initial network model and instantiating methods for the nodes to transmit information with each other. This includes writing application scripts in python that make use of the software library interfaces. A user then initiates the application scripts, which invokes the software simulation. The user then uses the built-in diagnostic tools to query the state of the simulation and to collect statistics on synchronization.
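A minimal discrete-event sketch of the approach is shown below: a graph of nodes and links with attributed delays, advanced by a time-ordered event queue. Quantum and classical signal details are abstracted into plain messages, and the topology and delays are invented for illustration.

```python
import heapq

links = {("A", "B"): 1.5, ("B", "C"): 2.0}      # link -> propagation delay

# (arrival_time, node, message): both signals leave node A at t = 0.
events = [(links[("A", "B")], "B", "classical sync"),
          (links[("A", "B")], "B", "qubit half")]
heapq.heapify(events)

while events:
    t, node, msg = heapq.heappop(events)
    print(f"t={t:4.1f}  arrive {node}: {msg}")
    if node == "B":                              # forward along the path A-B-C
        heapq.heappush(events, (t + links[("B", "C")], "C", msg))
```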
System and method for creating expert systems
NASA Technical Reports Server (NTRS)
Hughes, Peter M. (Inventor); Luczak, Edward C. (Inventor)
1998-01-01
A system and method provides for the creation of a highly graphical expert system without the need for programming in code. An expert system is created by initially building a data interface, defining appropriate Mission, User-Defined, Inferred, and externally-generated GenSAA (EGG) data variables whose data values will be updated and input into the expert system. Next, rules of the expert system are created by building appropriate conditions of the rules which must be satisfied and then by building appropriate actions of rules which are to be executed upon corresponding conditions being satisfied. Finally, an appropriate user interface is built which can be highly graphical in nature and which can include appropriate message display and/or modification of display characteristics of a graphical display object, to visually alert a user of the expert system of varying data values, upon conditions of a created rule being satisfied. The data interface building, rule building, and user interface building are done in an efficient manner and can be created without the need for programming in code.
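The condition/action structure of the rules can be sketched generically as below; this is a sketch in the spirit of the description, not the GenSAA implementation, and the telemetry variable and threshold are invented.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    condition: Callable[[dict], bool]    # predicate over current data values
    action: Callable[[dict], None]       # e.g., display message, recolor icon

rules = [Rule(lambda d: d["battery_v"] < 24.0,
              lambda d: print("ALERT: battery voltage low - icon turns red"))]

def update(variables: dict, name: str, value) -> None:
    """Ingest a new data value, then fire the actions of satisfied rules."""
    variables[name] = value
    for rule in rules:
        if rule.condition(variables):
            rule.action(variables)

telemetry = {"battery_v": 28.0}
update(telemetry, "battery_v", 23.1)     # condition satisfied: action fires
```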
FROST - FREEDOM OPERATIONS SIMULATION TEST VERSION 1.0
NASA Technical Reports Server (NTRS)
Deshpande, G. K.
1994-01-01
The Space Station Freedom Information System processes and transmits data between the space station and the station controllers and payload operators on the ground. Components of the system include flight hardware, communications satellites, software, and ground facilities. FROST simulates operation of the SSF Information System, tracking every data packet from generation to destination for both uplinks and downlinks. This program collects various statistics concerning SSF Information System operation and provides reports of these at user-specified intervals. Additionally, FROST has graphical display capability to enhance interpretation of these statistics. FROST models each component of the SSF Information System as an object by which packets are generated, received, processed, transmitted, and/or dumped. The user must provide the information system design with specified parameters and interconnections among objects. To aid this process, FROST supplies an example SSF Information System for simulation, but this example must be copied before it is changed and used for further simulation. Once specified, the system architecture and parameters are put into the input file, named the Test Configuration Definition (TCD) file. Alternative system designs can then be simulated simply by editing the TCD file. Within this file the user can define new objects, alter object parameters, redefine paths, redefine generation rates and windows, and redefine object interconnections. At present, FROST does not model every feature of the SSF Information System, but it is capable of simulating many of the system's important functions. To generate data messages, which can come from any object, FROST defines "windows" to specify when, what kind, and how much data is generated. All messages are classified by priority as either (1) emergency, (2) quick look, (3) telemetry, or (4) payload data. These messages are processed by all objects according to priority. That is, all priority 1 (emergency) messages are processed and transmitted before priority 2 messages, and so forth. FROST also allows for the specification of "pipeline" or "direct" links. Pipeline links are used to broadcast at constant intervals, while direct links transmit messages only when packets are ready for transmission. FROST allows the user substantial flexibility to customize output for a simulation. Output consists of tables and graphs, as specified in the TCD file, to be generated at the specified interval. These tables may be generated at short intervals during the run to produce snapshots as the simulation proceeds, or generated after the run to give a summary of the entire run. FROST is written in SIMSCRIPT II.5 (developed by CACI) for DEC VAX series computers running VMS. FROST was developed on a VAX 8700 and is intended to be run on large VAXes with at least 32 MB of memory. The main memory requirement for FROST depends on the number of processors used in the simulation and the event time. The standard distribution medium for this package is a 9-track 1600 BPI DEC VAX BACKUP format magnetic tape. An executable is included on the tape in addition to the source code. FROST was developed in 1990 and is a copyrighted work with all copyright vested in NASA. DEC, VAX, and VMS are registered trademarks of Digital Equipment Corporation. IBM PC is a trademark of International Business Machines. SIMSCRIPT II.5 is a trademark of CACI.
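The strict priority discipline described above can be sketched with a heap, where a tie-breaking sequence number preserves first-in, first-out order within a priority level; this is an illustrative stand-in, not FROST's SIMSCRIPT implementation.

```python
import heapq
import itertools

counter = itertools.count()
queue = []

def enqueue(priority: int, packet: str) -> None:
    heapq.heappush(queue, (priority, next(counter), packet))

for prio, pkt in [(4, "payload frame"), (1, "emergency msg"),
                  (3, "telemetry block"), (2, "quick-look data")]:
    enqueue(prio, pkt)

while queue:
    priority, _, packet = heapq.heappop(queue)
    print(priority, packet)   # emerges in order 1, 2, 3, 4 regardless of arrival
```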
NASA Astrophysics Data System (ADS)
Hauffe, Corina; Schwarze, Robert; Röhm, Patric; Müller, Ruben; Dröge, Werner; Gurova, Anastasia; Winkler, Peter; Baldy, Agnes
2016-04-01
Changes in weather and climate lead to increasing discussion about their causes and possible future impacts on the hydrological cycle. The question of a changed distribution of water also concerns the federal state of Saxony in the eastern part of Germany, especially in view of the different and increased requirements of water authorities, the water economy, and the public. To define and prepare for these future requirements, estimates of the future development of the natural water resources are necessary. Therefore data, information, and forecasts concerning the development of the several components of the water balance are needed, and to make the obtained information easily available to experts and the public, tools like the internet have to be used. Under these frame conditions the water balance portal Saxony (www.wasserhaushaltsportal.sachsen.de) was developed within the project KliWES. The overall approach of the project was divided into the so-called "3 pillars". The first pillar focused on the evaluation of the status quo water balance from 1951-2005 using a complex area-wide analysis of measured data. It also included the generation of a database and the development of a physically based parameter model. Furthermore, an extensive model evaluation was conducted with a number of objective assessment criteria to select an appropriate model for the project. The second pillar included the calibration of the water balance model and the impact study of climate and land use change (1961-2100) on the water balance of Saxonian catchments. In this context 13 climate scenarios and three land use scenarios were simulated. The web presence of these two pillars represents a classical information service, which provides finalized results at the spatial resolution of sub-catchments using GIS-based webpages. The third pillar focused on the development of an interactive expert system. It allows the user (public, officials and consulting engineers) to simulate the water balance with user-defined catchment parameters for catchments in Saxony under recent climatic and climate change conditions.
NASA Astrophysics Data System (ADS)
Kromp, Florian; Taschner-Mandl, Sabine; Schwarz, Magdalena; Blaha, Johanna; Weiss, Tamara; Ambros, Peter F.; Reiter, Michael
2015-02-01
We propose a user-driven method for the segmentation of neuroblastoma nuclei in microscopic fluorescence images involving the gradient energy tensor. Multispectral fluorescence images contain intensity and spatial information about antigen expression, fluorescence in situ hybridization (FISH) signals, and nucleus morphology. The latter serves as the basis for the detection of single cells and the calculation of shape features, which are used to validate the segmentation and to reject false detections. Accurate segmentation is difficult due to varying staining intensities and aggregated cells. It requires several (meta-)parameters, which have a strong influence on the segmentation results and have to be selected carefully for each sample (or group of similar samples) by user interaction. Because our method is designed for clinicians and biologists, who may have only limited image processing background, an interactive parameter selection step allows the implicit tuning of parameter values. With this simple but intuitive method, segmentation results with high precision for a large number of cells can be achieved by minimal user interaction. The strategy was validated on hand-segmented datasets of three neuroblastoma cell lines.
Systematic Assessment of the Impact of User Roles on Network Flow Patterns
2017-09-01
...and evaluating users based on roles provide the best approach for defining normal digital behaviors? People are individuals, with different interests... activities on the network. We evaluate the assumption that users sharing similar roles exhibit similar network behaviors, and contrast the level of similarity
Øymoen, Anita; Pottegård, Anton; Almarsdóttir, Anna Birna
2015-06-01
The objectives of this study were to (1) identify and characterize heavy users of prescription drugs among persons aged 60 years and above; (2) investigate the association of demographic, socioeconomic, and health-related variables with being a heavy drug user; and (3) study the most frequently used drugs among heavy drug users and the development in their use over time. This is a descriptive study. Heavy drug users were defined as the accumulated top 1 percentile who accounted for the largest share of prescription drug use measured in number of dispensed defined daily doses (DDDs). The nationwide Danish registers were used to obtain data. Multivariable binary logistic regression was used to determine which factors were associated with being a heavy drug user. Heavy drug users among persons aged 60 years and above accounted for 6.8, 6.0, and 5.5% of prescription drug use in 2002, 2007, and 2012, respectively. Male gender, being aged 60-69 years, being divorced, shorter education, low annual income, and recent hospitalization were all significantly associated with being in the top 1 percentile group of drug users (p < 0.05). The ten most frequently used drug classes among heavy drug users accounted for 75.4% of their use in 2012, and five of these were cardiovascular drugs. The development over time for the ten most used drug classes followed the same pattern among heavy drug users and in the general population. There is a skewed utilization of prescription drugs. Contrary to earlier findings, being male was associated with heavy prescription drug use both with respect to number of drugs used and drug expenditure.
Development of Data Acquisition Set-up for Steady-state Experiments
NASA Astrophysics Data System (ADS)
Srivastava, Amit K.; Gupta, Arnab D.; Sunil, S.; Khan, Ziauddin
2017-04-01
For short duration experiments, digitized data is generally transferred for processing and storage after the experiment, whereas in a steady-state experiment the data is acquired, processed, displayed, and stored continuously in a pipelined manner. This requires acquiring data through special techniques for storage and on-the-go viewing of data to display current trends for various physical parameters. A small data acquisition set-up was developed for continuously acquiring signals for various physical parameters at different sampling rates for long duration experiments. This includes the hardware set-up for signal digitization, a Field Programmable Gate Array (FPGA) based timing system for clock synchronization and event/trigger distribution, time slicing of data streams for storage of data chunks to enable viewing of data during acquisition, and channel profile display through downsampling. In order to store a long data stream of indefinite duration, the data stream is divided into data slices/chunks of user-defined time duration. Data chunks avoid the problem of server data being inaccessible until the channel data file is closed at the end of a long duration experiment. A graphical user interface has been developed in the LabVIEW application development environment for configuring the data acquisition hardware and storing data chunks on the local machine as well as at a remote data server through Python for further data access. The data plotting and analysis utilities have been developed with Python software, which provides tools for further data processing. This paper describes the development and implementation of data acquisition for a steady-state experiment.
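The time-slicing idea can be sketched as follows: break a continuous (timestamp, sample) stream into chunks of user-defined duration so that each chunk closes soon after its slice ends and becomes readable while acquisition continues. The sample source and chunk length are illustrative.

```python
def chunked(samples, t0, chunk_seconds):
    """Yield lists of (t, value) pairs, one list per time slice."""
    chunk, boundary = [], t0 + chunk_seconds
    for t, value in samples:
        if t >= boundary:
            yield chunk                      # slice complete: hand it to storage
            chunk, boundary = [], boundary + chunk_seconds
        chunk.append((t, value))
    if chunk:
        yield chunk                          # final, possibly partial slice

stream = ((i * 0.25, i) for i in range(20))  # stand-in for a 4 Hz channel
for n, chunk in enumerate(chunked(stream, 0.0, 1.0)):
    print(f"chunk_{n:03d}: {len(chunk)} samples")
```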
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, T.
2000-07-01
The Write One, Run Many (WORM) site (worm.csirc.net) is the on-line home of the WORM language and is hosted by the Criticality Safety Information Resource Center (CSIRC) (www.csirc.net). The purpose of this web site is to create an on-line community for WORM users to gather, share, and archive WORM-related information. WORM is an embedded, functional, programming language designed to facilitate the creation of input decks for computer codes that take standard ASCII text files as input. A functional programming language is one that emphasizes the evaluation of expressions, rather than execution of commands. The simplest and perhaps most common example of a functional language is a spreadsheet such as Microsoft Excel. The spreadsheet user specifies expressions to be evaluated, while the spreadsheet itself determines the commands to execute, as well as the order of execution/evaluation. WORM functions in a similar fashion and, as a result, is very simple to use and easy to learn. WORM improves the efficiency of today's criticality safety analyst by allowing: (1) input decks for parameter studies to be created quickly and easily; (2) calculations and variables to be embedded into any input deck, thus allowing for meaningful parameter specifications; (3) problems to be specified using any combination of units; and (4) complex mathematically defined models to be created. WORM is completely written in Perl. Running on all variants of UNIX, Windows, MS-DOS, MacOS, and many other operating systems, Perl is one of the most portable programming languages available. As such, WORM works on practically any computer platform.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Childs, K.W.
1991-07-01
HEATING is a FORTRAN program designed to solve steady-state and/or transient heat conduction problems in one-, two-, or three-dimensional Cartesian, cylindrical, or spherical coordinates. A model may include multiple materials, and the thermal conductivity, density, and specific heat of each material may be both time- and temperature-dependent. The thermal conductivity may be anisotropic. Materials may undergo change of phase. Thermal properties of materials may be input or may be extracted from a material properties library. Heat generation rates may be dependent on time, temperature, and position, and boundary temperatures may be time- and position-dependent. The boundary conditions, which may be surface-to-boundary or surface-to-surface, may be specified temperatures or any combination of prescribed heat flux, forced convection, natural convection, and radiation. The boundary condition parameters may be time- and/or temperature-dependent. General graybody radiation problems may be modeled with user-defined factors for radiant exchange. The mesh spacing may be variable along each axis. HEATING is variably dimensioned and utilizes free-form input. Three steady-state solution techniques are available: the point-successive-overrelaxation iterative method with extrapolation, direct solution (for one-dimensional or two-dimensional problems), and conjugate gradient. Transient problems may be solved using one of several finite-difference schemes: Crank-Nicolson implicit, Classical Implicit Procedure (CIP), Classical Explicit Procedure (CEP), or the Levy explicit method (which for some circumstances allows a time step greater than the CEP stability criterion). The solution of the system of equations arising from the implicit techniques is accomplished by point-successive-overrelaxation iteration and includes procedures to estimate the optimum acceleration parameter.
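As a pocket-sized illustration of the Crank-Nicolson scheme named above (far simpler than HEATING's general capability), the sketch below marches a 1-D, constant-property rod with fixed end temperatures; r is the grid Fourier number alpha*dt/dx**2.

```python
import numpy as np

n, r = 21, 0.5
T = np.zeros(n)
T[0] = 100.0                                   # hot left boundary

main = (1 + r) * np.ones(n)
main[0] = main[-1] = 1.0                       # identity rows fix the boundaries
upper = -(r / 2) * np.ones(n - 1); upper[0] = 0.0
lower = -(r / 2) * np.ones(n - 1); lower[-1] = 0.0
A = np.diag(main) + np.diag(upper, 1) + np.diag(lower, -1)

for _ in range(200):                           # march 200 time steps
    b = T.copy()
    b[1:-1] = (1 - r) * T[1:-1] + (r / 2) * (T[:-2] + T[2:])
    T = np.linalg.solve(A, b)
print(T.round(1))                              # heat diffusing in from the hot end
```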
The HITRAN2016 Molecular Spectroscopic Database
Gordon, I. E.; Rothman, L. S.; Hill, C.; ...
2017-07-05
This article describes the contents of the 2016 edition of the HITRAN molecular spectroscopic compilation. The new edition replaces the previous HITRAN edition of 2012 and its updates during the intervening years. The HITRAN molecular absorption compilation is composed of five major components: the traditional line-by-line spectroscopic parameters required for high-resolution radiative-transfer codes, infrared absorption cross-sections for molecules not yet amenable to representation in a line-by-line form, collision-induced absorption data, aerosol indices of refraction, and general tables such as partition sums that apply globally to the data. The new HITRAN is greatly extended in terms of accuracy, spectral coverage, additional absorption phenomena, added line-shape formalisms, and validity. Moreover, molecules, isotopologues, and perturbing gases have been added that address the issues of atmospheres beyond the Earth. Of considerable note, experimental IR cross-sections for almost 300 additional molecules important in different areas of atmospheric science have been added to the database. The compilation can be accessed through www.hitran.org. Most of the HITRAN data have now been cast into an underlying relational database structure that offers many advantages over the long-standing sequential text-based structure. The new structure empowers the user in many ways. It enables the incorporation of an extended set of fundamental parameters per transition, sophisticated line-shape formalisms, easy user-defined output formats, and very convenient searching, filtering, and plotting of data. Finally, a powerful application programming interface making use of structured query language (SQL) features for higher-level applications of HITRAN is also provided.
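The interface mentioned in the closing sentence is distributed as HAPI, the HITRAN Application Programming Interface (`pip install hitran-api`). The sketch below follows HAPI's published examples, but the function names and arguments should be checked against the current release, and the `fetch` call needs network access to hitran.org.

```python
from hapi import db_begin, fetch, absorptionCoefficient_Lorentz

# Open a local line-list database, then download H2O (molecule 1,
# isotopologue 1) transitions between 3400 and 4100 cm-1.
db_begin('data')
fetch('H2O', 1, 1, 3400, 4100)

# Compute a Lorentz-profile absorption coefficient from the fetched table.
nu, coef = absorptionCoefficient_Lorentz(SourceTables='H2O',
                                         Diluent={'air': 1.0})
print(nu[:3], coef[:3])
```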
toxoMine: an integrated omics data warehouse for Toxoplasma gondii systems biology research
Rhee, David B.; Croken, Matthew McKnight; Shieh, Kevin R.; Sullivan, Julie; Micklem, Gos; Kim, Kami; Golden, Aaron
2015-01-01
Toxoplasma gondii (T. gondii) is an obligate intracellular parasite that must monitor for changes in the host environment and respond accordingly; however, it is still not fully known which genetic or epigenetic factors are involved in regulating virulence traits of T. gondii. There are ongoing efforts to elucidate the mechanisms regulating the stage transition process via the application of high-throughput epigenomics, genomics and proteomics techniques. Given the range of experimental conditions and the typical yield from such high-throughput techniques, a new challenge arises: how to effectively collect, organize and disseminate the generated data for subsequent data analysis. Here, we describe toxoMine, which provides a powerful interface to support sophisticated integrative exploration of high-throughput experimental data and metadata, providing researchers with a more tractable means of understanding how genetic and/or epigenetic factors play a coordinated role in determining the pathogenicity of T. gondii. As a data warehouse, toxoMine allows integration of high-throughput data sets with public T. gondii data. toxoMine is also able to execute complex queries involving multiple data sets with straightforward user interaction. Furthermore, toxoMine allows users to define their own parameters during the search process, which gives users near-limitless search and query capabilities. The interoperability feature also allows users to query and examine data available in other InterMine systems, which would effectively augment the search scope beyond what is available to toxoMine. toxoMine complements the major community database ToxoDB by providing a data warehouse that enables more extensive integrative studies for T. gondii. Given all these factors, we believe it will become an indispensable resource to the greater infectious disease research community. Database URL: http://toxomine.org PMID:26130662
Rapid Airplane Parametric Input Design (RAPID)
NASA Technical Reports Server (NTRS)
Smith, Robert E.
1995-01-01
RAPID is a methodology and software system to define a class of airplane configurations and directly evaluate surface grids, volume grids, and grid sensitivity on and about the configurations. A distinguishing characteristic that separates RAPID from other airplane surface modellers is that the output grids and grid sensitivity are directly applicable in CFD analysis. A small set of design parameters and grid control parameters governs the process, which is incorporated into interactive software for 'real time' visual analysis and into batch software for the application of optimization technology. The computed surface grids and volume grids are suitable for a wide range of Computational Fluid Dynamics (CFD) simulations. The general airplane configuration has wing, fuselage, horizontal tail, and vertical tail components. The double-delta wing and tail components are generated by solving a fourth-order partial differential equation (PDE) subject to Dirichlet and Neumann boundary conditions. The design parameters are incorporated into the boundary conditions and therefore govern the shapes of the surfaces. The PDE solution yields a smooth transition between boundaries. Surface grids suitable for CFD calculation are created by establishing an H-type topology about the configuration and incorporating grid spacing functions in the PDE equation for the lifting components and the fuselage definition equations. User-specified grid parameters govern the location and degree of grid concentration. A two-block volume grid about a configuration is calculated using the Control Point Form (CPF) technique. The interactive software, which runs on Silicon Graphics IRIS workstations, allows design parameters to be continuously varied and the resulting surface grid to be observed in real time. The batch software computes both the surface and volume grids and also computes the sensitivity of the output grid with respect to the input design parameters by applying the precompiler tool ADIFOR to the grid generation program. The output of ADIFOR is a new source code containing the old code plus expressions for derivatives of specified dependent variables (grid coordinates) with respect to specified independent variables (design parameters). The RAPID methodology and software provide a means of rapidly defining numerical prototypes, grids, and grid sensitivity for a class of airplane configurations. This technology and software are highly useful for CFD research and for preliminary design and optimization processes.
Lei, Yang; Yu, Dai; Bin, Zhang; Yang, Yang
2017-01-01
Clustering algorithms are a basic tool of data analysis and are widely used in analysis systems. However, with high-dimensional data, a clustering algorithm may overlook the business relations between dimensions, especially in medical fields. As a result, the clustering result often does not meet the business goals of the users. If the clustering process can incorporate the knowledge of the users, that is, the doctor's knowledge or the analysis intent, the clustering result can be more satisfactory. In this paper, we propose an interactive K-means clustering method to improve the user's satisfaction with the result. The core of this method is to use the user's feedback on the clustering result to optimize it. A particle swarm optimization algorithm is then used to optimize the parameters, especially the weight settings in the clustering algorithm, to make it reflect the user's business preferences as closely as possible. Based on this parameter optimization and adjustment, the clustering result can be brought closer to the user's requirements. Finally, we use a breast cancer example to test our method. The experiments show the improved performance of our algorithm.
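The core idea (user-adjustable feature weights steering the clustering) can be sketched as weighted K-means; here a manual weight change stands in for the PSO step that the paper uses to tune the weights, and the toy data are chosen so clusters stay non-empty.

```python
import numpy as np

def weighted_kmeans(X, k, w, iters=20, seed=0):
    """K-means with per-feature weights w applied in the distance."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None]) ** 2 * w).sum(axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

X = np.array([[1.0, 10.0], [1.2, 0.0], [5.0, 10.2], [5.1, 0.1]])
print(weighted_kmeans(X, 2, np.array([1.0, 0.0])))  # groups by feature 0
print(weighted_kmeans(X, 2, np.array([0.0, 1.0])))  # feedback shifts the grouping
```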
Tracking Multiple Topics for Finding Interesting Articles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pon, R K; Cardenas, A F; Buttler, D J
We introduce multiple topic tracking (MTT) for iScore to better recommend news articles for users with multiple interests and to address changes in user interests over time. As an extension of the basic Rocchio algorithm, traditional topic detection and tracking, and single-pass clustering, MTT maintains multiple interest profiles to identify interesting articles for a specific user given user feedback. Focusing on only interesting topics enables iScore to discard useless profiles, addressing changes in user interests and achieving a balance between resource consumption and classification accuracy. Also, by relating a topic's interestingness to an article's interestingness, iScore is able to achieve higher quality results than traditional methods such as the Rocchio algorithm. We identify several operating parameters that work well for MTT. Using the same parameters, we show that MTT alone yields high quality results for recommending interesting articles from several corpora. The inclusion of MTT improves iScore's performance by 9% to 14% in recommending news articles from the Yahoo! News RSS feeds and the TREC11 adaptive filter article collection. And through a small user study, we show that iScore can still perform well when only provided with little user feedback.
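A sketch of the multiple-profile idea is below: keep one vector per tracked interest, score an incoming article against its best-matching profile, and apply a Rocchio-style update on positive feedback, starting a new profile when nothing matches. The thresholds and rates are illustrative, and this is not iScore's actual code.

```python
import numpy as np

SIM_NEW_TOPIC, ALPHA, BETA = 0.3, 0.8, 0.2
profiles = []

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def feedback(article: np.ndarray, interesting: bool) -> None:
    if not interesting:
        return
    sims = [cos(article, p) for p in profiles]
    if not profiles or max(sims) < SIM_NEW_TOPIC:
        profiles.append(article.copy())      # start tracking a new topic
    else:
        i = int(np.argmax(sims))             # reinforce the closest topic
        profiles[i] = ALPHA * profiles[i] + BETA * article

feedback(np.array([1.0, 0.0, 0.0]), True)    # topic A
feedback(np.array([0.0, 1.0, 0.0]), True)    # unrelated article: new topic B
feedback(np.array([0.9, 0.1, 0.0]), True)    # reinforces topic A
print(len(profiles))                          # two tracked topics
```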
A thermal vacuum test optimization procedure
NASA Technical Reports Server (NTRS)
Kruger, R.; Norris, H. P.
1979-01-01
An analytical model was developed that can be used to establish certain parameters of a thermal vacuum environmental test program based on an optimization of program costs. This model takes the form of a computer program that interacts with the user for the input of certain parameters. The program provides the user with a list of pertinent information regarding an optimized test program and graphs of some of the parameters. The model is a first attempt in this area and includes numerous simplifications. It appears useful as a general guide and provides a way of extrapolating past performance to future missions.
Modeling erosion under future climates with the WEPP model
Timothy Bayley; William Elliot; Mark A. Nearing; D. Phillip Guertin; Thomas Johnson; David Goodrich; Dennis Flanagan
2010-01-01
The Water Erosion Prediction Project Climate Assessment Tool (WEPPCAT) was developed to be an easy-to-use, web-based erosion model that allows users to adjust climate inputs for user-specified climate scenarios. WEPPCAT allows the user to modify monthly mean climate parameters, including maximum and minimum temperatures, number of wet days, precipitation, and...
Section 4. The GIS Weasel User's Manual
Viger, Roland J.; Leavesley, George H.
2007-01-01
INTRODUCTION The GIS Weasel was designed to aid in the preparation of spatial information for input to lumped and distributed parameter hydrologic or other environmental models. The GIS Weasel provides geographic information system (GIS) tools to help create maps of geographic features relevant to a user's model and to generate parameters from those maps. The operation of the GIS Weasel does not require the user to be a GIS expert, only that the user have an understanding of the spatial information requirements of the environmental simulation model being used. The GIS Weasel software system uses a GIS-based graphical user interface (GUI), the C programming language, and external scripting languages. The software will run on any computing platform where ArcInfo Workstation (version 8.0.2 or later) and the GRID extension are accessible. The user controls the processing of the GIS Weasel by interacting with menus, maps, and tables. The purpose of this document is to describe the operation of the software. This document is not intended to describe the usage of this software in support of any particular environmental simulation model. Such guides are published separately.
Wang, Yu; Koenig, Steven C; Slaughter, Mark S; Giridharan, Guruprasad A
2015-01-01
The risk of left ventricular (LV) suction during left ventricular assist device (LVAD) support has been a clinical concern. Current development efforts suggest LVAD suction prevention and physiologic control algorithms may require chronic implantation of pressure or flow sensors, which can be unreliable because of baseline drift and short lifespan. To overcome this limitation, we designed a sensorless suction prevention and physiologic control (eSPPC) algorithm that requires only LVAD intrinsic parameters (pump speed and power). Two gain-scheduled, proportional-integral controllers maintain a differential pump speed (ΔRPM) above a user-defined threshold to prevent LV suction while maintaining an average reference differential pressure (ΔP) between the LV and aorta. ΔRPM is calculated from noisy pump speed measurements that are low-pass filtered, and ΔP is estimated using an extended Kalman filter. Efficacy and robustness of the eSPPC algorithm were evaluated in silico during simulated rest and exercise test conditions for (1) an excessive ΔP setpoint (ES); (2) a rapid eightfold increase in pulmonary vascular resistance (PVR); and (3) ES and PVR combined. Simulated hemodynamic waveforms (LV pressure and volume; aortic pressure and flow) using only intrinsic pump parameters showed the feasibility of our proposed eSPPC algorithm in preventing LV suction for all test conditions.
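The control idea can be caricatured in a few lines: a PI loop drives an estimated ΔP toward its reference while a suction guard overrides the command whenever ΔRPM would fall below the user-defined threshold. The linear "plant" below is a stand-in, not a hemodynamic model, and all gains and constants are illustrative.

```python
KP, KI, DT = 0.5, 0.1, 0.01            # PI gains and time step
DP_REF, DRPM_MIN = 80.0, 200.0         # reference dP and suction threshold

speed, integ = 2400.0, 0.0
for _ in range(500):
    dP = 0.04 * speed - 20.0           # stand-in pump pressure characteristic
    dRPM = speed - 2000.0              # stand-in differential speed estimate
    err = DP_REF - dP
    integ += err * DT
    speed += KP * err + KI * integ     # PI update of the commanded speed
    if speed - 2000.0 < DRPM_MIN:      # suction guard: clamp speed upward
        speed = 2000.0 + DRPM_MIN
print(round(dP, 1), round(dRPM, 1))    # dP near its reference, dRPM above floor
```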
Detecting microsatellites within genomes: significant variation among algorithms.
Leclercq, Sébastien; Rivals, Eric; Jarne, Philippe
2007-04-18
Microsatellites are short, tandemly-repeated DNA sequences which are widely distributed among genomes. Their structure, role and evolution can be analyzed based on exhaustive extraction from sequenced genomes. Several dedicated algorithms have been developed for this purpose. Here, we compared the detection efficiency of five of them (TRF, Mreps, Sputnik, STAR, and RepeatMasker). Our analysis was first conducted on the human X chromosome, and microsatellite distributions were characterized by microsatellite number, length, and divergence from a pure motif. The algorithms work with user-defined parameters, and we demonstrate that the parameter values chosen can strongly influence microsatellite distributions. The five algorithms were then compared by fixing parameter settings, and the analysis was extended to three other genomes (Saccharomyces cerevisiae, Neurospora crassa and Drosophila melanogaster) spanning a wide range of size and structure. Significant differences for all characteristics of microsatellites were observed among algorithms, but not among genomes, for both perfect and imperfect microsatellites. Striking differences were detected for short microsatellites (below 20 bp), regardless of motif. Since the algorithm used strongly influences empirical distributions, studies analyzing microsatellite evolution based on a comparison between empirical and theoretical size distributions should therefore be considered with caution. We also discuss why a typological definition of microsatellites limits our capacity to capture their genomic distributions.
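A minimal detector for perfect microsatellites shows concretely how user-set parameters shape the reported distributions; the regex below finds motifs of 1-6 bp repeated at least `min_repeats` times, reports overlapping runs without deduplication, and ignores the imperfect repeats that tools like TRF or Sputnik also handle.

```python
import re

def find_microsatellites(seq: str, min_repeats: int = 3):
    """Return (start, motif, run_length) for perfect tandem repeats."""
    pattern = re.compile(r"(?=(([ACGT]{1,6}?)\2{%d,}))" % (min_repeats - 1))
    return [(m.start(), m.group(2), len(m.group(1)))
            for m in pattern.finditer(seq)]

seq = "TTACACACACGGAGAGAGT"
for start, motif, length in find_microsatellites(seq):
    print(start, motif, length)   # raising min_repeats thins the list
```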
NASA Astrophysics Data System (ADS)
Shahariar, G. M. H.; Wardana, M. K. A.; Lim, O. T.
2018-04-01
The post-impingement effects of urea-water solution (UWS) spray on the heated wall of automotive SCR systems were numerically investigated in a constant volume chamber using the STAR-CCM+ CFD code. Turbulent flow was modelled with the realizable k-ε two-layer model together with the standard wall function, and all-y+ treatment was applied along with the two-layer approach. The Eulerian-Lagrangian approach was used to model the multiphase flow. UWS was injected onto the heated wall at wall temperatures of 338, 413, 473, 503, and 573 K. Spray development after impinging on the heated wall was visualized and measured. Droplet size distributions and droplet evaporation rates were also measured; these are vital parameters for system performance but remain under-researched. Specially developed user-defined functions (UDF) were implemented to simulate the desired conditions and parameters. The investigation reveals that wall temperature has a great impact on spray development after impingement, droplet size distribution, and evaporation. Increasing the wall temperature leads to a longer spray front projection length, smaller droplets, and faster droplet evaporation, which are preconditions for reducing urea crystallization. The numerical model and parameters were validated against experimental data.
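The paper's multiphase model is far richer, but the classical d²-law gives a back-of-envelope feel for why the smaller droplets produced at hotter walls evaporate faster. The rate constant below is a placeholder, not a value from the study.

```python
# Back-of-envelope sketch only: the classical d^2-law for droplet
# evaporation, d(t)^2 = d0^2 - K*t, so lifetime scales with d0^2.
def droplet_lifetime(d0_m, K_m2_per_s):
    """Time for a droplet of initial diameter d0 to fully evaporate."""
    return d0_m**2 / K_m2_per_s

K = 1.0e-7  # evaporation rate constant, m^2/s (hypothetical placeholder)
for d0_um in (100, 50, 25):
    t = droplet_lifetime(d0_um * 1e-6, K)
    print(f"d0 = {d0_um:3d} um -> lifetime ~ {t*1e3:.1f} ms")
```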
The Profile-Query Relationship.
ERIC Educational Resources Information Center
Shepherd, Michael A.; Phillips, W. J.
1986-01-01
Defines relationship between user profile and user query in terms of relationship between clusters of documents retrieved by each, and explores the expression of cluster similarity and cluster overlap as linear functions of similarity existing between original pairs of profiles and queries, given the desired retrieval threshold. (23 references)…
DOT National Transportation Integrated Search
1998-04-01
Human factors can be defined as "designing to match the capabilities and limitations of the human user." The objectives of this human-centered design process are to maximize the effectiveness and efficiency of system performance, ensure a high level ...
Orbital service module systems analysis study documentation. Volume 2: Technical report
NASA Technical Reports Server (NTRS)
1978-01-01
Near-term, cost-effective concepts were defined to augment the power and duration capability offered to shuttle payload users. Feasible concept options that could evolve to provide free-flying power and other services to users in the 1984 time frame were also examined.
Effects of hormonal contraceptives on mental rotation and verbal fluency.
Griksiene, Ramune; Ruksenas, Osvaldas
2011-09-01
Cognitive abilities such as verbal fluency and mental rotation are among those most sensitive to changes in sex steroids, but they are poorly studied in the context of hormonal contraceptive use. We therefore investigated mental rotation task (MRT) and verbal fluency performance in young (21.5±1.8 years), healthy oral contraceptive (OC) users (23 women) and non-users (20 women) during the follicular, ovulatory, and luteal phases of the menstrual cycle. Salivary 17β-estradiol, progesterone, and testosterone levels were assayed to evaluate hormonal differences between groups and across the phases of the menstrual cycle. To assess the effects of progestins with androgenic or anti-androgenic properties, OC users were subdivided into third-generation and new-generation OC users. In addition, positive and negative affect were evaluated as factors possibly influencing cognitive performance. Salivary 17β-estradiol and progesterone levels were significantly lower in hormonal contraception users. Salivary testosterone was slightly lower in the OC user group, with a significant difference only during the ovulatory phase. Naturally cycling women performed better on the verbal fluency task than OC users. Subjects who used third-generation (androgenic) OCs generated significantly fewer words than new-generation (anti-androgenic) OC users and non-users. Third-generation OC users also demonstrated significantly longer reaction times (RT) on the MRT than non-users. MRT, verbal fluency, and mood parameters did not depend on the phase of the menstrual cycle, and the Positive and Negative Affect Schedule (PANAS) scores did not differ between OC users and non-users. Our findings show that hormonal contraception has an impact on verbal and spatial abilities. The different performance of users of OCs with androgenic versus anti-androgenic properties suggests an essential role for the progestins contained in OCs in cognitive performance.
The Value of Data and Metadata Standardization for Interoperability in Giovanni
NASA Astrophysics Data System (ADS)
Smit, C.; Hegde, M.; Strub, R. F.; Bryant, K.; Li, A.; Petrenko, M.
2017-12-01
Giovanni (https://giovanni.gsfc.nasa.gov/giovanni/) is a data exploration and visualization tool at the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC). It has been around in one form or another for more than 15 years. Giovanni calculates simple statistics and produces 22 different visualizations for more than 1600 geophysical parameters from more than 90 satellite and model products. Giovanni relies on external data format standards to ensure interoperability, including the NetCDF CF Metadata Conventions. Unfortunately, these standards were insufficient to make Giovanni's internal data representation truly simple to use: finding and working with dimensions can be convoluted under the CF Conventions, and the CF Conventions are silent on machine-friendly descriptive metadata such as a parameter's source product and product version. In order to simplify analyzing disparate earth science data parameters in a unified way, we developed Giovanni's internal standard. First, the format standardizes parameter dimensions and variables so they can be found easily. Second, the format adds all the machine-friendly metadata Giovanni needs to present our parameters to users in a consistent and clear manner. At a glance, users can grasp all the pertinent information about a parameter, both during parameter selection and after visualization. This poster gives examples of how our metadata and data standards, both external and internal, have simplified our code base and improved our users' experience.
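A hypothetical sketch of this kind of internal standardization, using xarray: coerce a parameter's dimensions to predictable names, then attach machine-friendly provenance attributes. The canonical names and attribute keys here are invented for illustration and are not Giovanni's actual schema.

```python
# Sketch: standardize dimension names and add provenance metadata that
# the CF Conventions do not mandate (attribute names are hypothetical).
import xarray as xr

def standardize(ds: xr.Dataset, var: str,
                product: str, version: str) -> xr.Dataset:
    rename = {}
    for dim in ds[var].dims:
        if dim not in ds.coords:
            continue
        units = str(ds[dim].attrs.get("units", ""))
        if "degrees_north" in units:
            rename[dim] = "lat"
        elif "degrees_east" in units:
            rename[dim] = "lon"
        elif " since " in units:   # CF time units, e.g. "days since ..."
            rename[dim] = "time"
    ds = ds.rename(rename)
    # Descriptive, machine-friendly metadata for downstream code.
    ds[var].attrs.update({"source_product": product,
                          "source_product_version": version})
    return ds
```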
CosmoCalc: An Excel add-in for cosmogenic nuclide calculations
NASA Astrophysics Data System (ADS)
Vermeesch, Pieter
2007-08-01
As dating methods using Terrestrial Cosmogenic Nuclides (TCN) become more popular, the need arises for general-purpose, easy-to-use data reduction software. The CosmoCalc Excel add-in calculates TCN production rate scaling factors (using the Lal, Stone, Dunai, and Desilets methods); topographic, snow, and self-shielding factors; and exposure ages, erosion rates, and burial ages, and it visualizes the results on banana-style plots. It uses an internally consistent TCN production equation based on the quadruple-exponential approach of Granger and Smith (2000). CosmoCalc was designed to be as user-friendly as possible: although the user interface is extremely simple, the program is very flexible, and nearly all default parameter values can be changed. To facilitate the comparison of different scaling factors, a set of converter tools is provided, allowing the user to easily convert cut-off rigidities to magnetic inclinations, elevations to atmospheric depths, and so forth. Because it is important to use a consistent set of scaling factors for the sample measurements and the production rate calibration sites, CosmoCalc defines the production rates implicitly, as a function of the original TCN concentrations of the calibration site. The program is best suited for 10Be, 26Al, 3He, and 21Ne calculations, although basic functionality for 36Cl and 14C is also provided. CosmoCalc can be downloaded along with a set of test data from http://cosmocalc.googlepages.com.
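In the spirit of the quadruple-exponential formulation cited above, the sketch below evaluates a depth-dependent production profile and a zero-erosion surface exposure age. The coefficients and attenuation lengths are placeholders, not CosmoCalc's calibrated values.

```python
# Sketch of a quadruple-exponential production profile and a simple
# zero-erosion exposure age; a_i and L_i below are assumed, not fitted.
import math

A = (0.9, 0.05, 0.03, 0.02)         # relative production terms (assumed)
L = (160.0, 740.0, 1460.0, 4320.0)  # attenuation lengths, g/cm^2 (assumed)

def production(depth_gcm2, P0):
    """Total production rate at depth as a sum of four exponentials."""
    return P0 * sum(a * math.exp(-depth_gcm2 / l) for a, l in zip(A, L))

def exposure_age(N, P0, lam):
    """Zero-erosion age from N = (P0/lam) * (1 - exp(-lam * t))."""
    return -math.log(1.0 - N * lam / P0) / lam

lam_be10 = math.log(2) / 1.387e6    # 10Be decay constant, 1/yr
# N in atoms/g, P0 in atoms/g/yr -> age ~ 1.0e4 yr for these inputs.
print(exposure_age(N=5.0e4, P0=5.0, lam=lam_be10))
```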
NASA Astrophysics Data System (ADS)
Reilly, B. T.; Stoner, J. S.; Wiest, J.; Abbott, M. B.; Francus, P.; Lapointe, F.
2015-12-01
Computed tomography (CT) of sediment cores allows for high-resolution images, three-dimensional volumes, and down-core profiles, generated through the attenuation of X-rays as a function of density and atomic number. When using a medical CT scanner, these quantitative data are stored in pixels using the Hounsfield scale, which is relative to the attenuation of X-rays in water and air at standard temperature and pressure. Here we present MATLAB-based software specifically designed for sedimentary applications, with a user-friendly graphical interface to process DICOM files and stitch overlapping CT scans. For visualization, the software allows easy generation of core slice images in grayscale or false color relative to a user-defined Hounsfield number range. For comparison with other high-resolution non-destructive methods, down-core Hounsfield number profiles are extracted using a method robust to coring imperfections such as deformation, bowing, gaps, and gas expansion. We demonstrate the usefulness of this technique with lacustrine sediment cores from the western United States and the Canadian High Arctic, including Fish Lake, Oregon, and Sawtooth Lake, Ellesmere Island. These sites represent two different depositional environments and provide examples of a variety of common coring defects and lithologies. The Hounsfield profiles and images can be used in combination with other high-resolution data sets, including sediment magnetic parameters, XRF core scans, and many other types of data, to provide unique insights into how lithology influences paleoenvironmental and paleomagnetic records and their interpretations.
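The two rendering steps described (mapping stored DICOM pixel values to Hounsfield units via the standard rescale tags, then windowing to a user-defined range) amount to the following. The published software is MATLAB-based; this Python sketch uses a placeholder file name and window bounds.

```python
# Sketch: DICOM stored values -> Hounsfield units -> windowed grayscale.
import numpy as np
import pydicom

def slice_to_grayscale(path, hu_min=-100.0, hu_max=2000.0):
    dcm = pydicom.dcmread(path)
    # RescaleSlope/RescaleIntercept map stored values to Hounsfield units.
    hu = (dcm.pixel_array.astype(np.float32) * float(dcm.RescaleSlope)
          + float(dcm.RescaleIntercept))
    # Clip to the user-defined HU range and scale to 8-bit grayscale.
    hu = np.clip(hu, hu_min, hu_max)
    return ((hu - hu_min) / (hu_max - hu_min) * 255.0).astype(np.uint8)

img = slice_to_grayscale("core_scan_0001.dcm")  # hypothetical file name
```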
A CZT-based blood counter for quantitative molecular imaging.
Espagnet, Romain; Frezza, Andrea; Martin, Jean-Pierre; Hamel, Louis-André; Lechippey, Laëtitia; Beauregard, Jean-Mathieu; Després, Philippe
2017-12-01
Robust quantitative analysis in positron emission tomography (PET) and single-photon emission computed tomography (SPECT) typically requires a time-activity curve as an input function for pharmacokinetic modeling of tracer uptake. For this purpose, a new automated tool for determining blood activity as a function of time is presented. The device, compact enough to be used on the patient bed, relies on a peristaltic pump for continuous blood withdrawal at user-defined rates. Gamma detection is based on a 20 × 20 × 15 mm³ cadmium zinc telluride (CZT) detector, read out by custom-made electronics and a field-programmable gate array-based signal processing unit. A graphical user interface (GUI) allows users to select parameters and easily perform acquisitions. This paper presents the overall design of the device as well as results on detector performance in terms of stability, sensitivity, and energy resolution. Results from a patient study are also reported. The device achieved a sensitivity of 7.1 cps/(kBq/mL) and a minimum detectable activity of 2.5 kBq/mL for 18F. The gamma counter also demonstrated excellent stability, with a deviation in count rates below 0.05% over 6 h. An energy resolution of 8% was achieved at 662 keV. The patient study was conclusive and demonstrated that the compact blood counter has the sensitivity and stability required to conduct quantitative molecular imaging studies in PET and SPECT.
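A worked example using the reported sensitivity: converting a measured count rate to a blood activity concentration, with decay correction back to injection time for 18F. The count rate and elapsed time below are invented for illustration.

```python
# Count rate -> activity concentration, using the paper's sensitivity.
import math

SENSITIVITY = 7.1               # cps per (kBq/mL), from the paper
F18_HALF_LIFE_S = 109.77 * 60.0 # 18F half-life, s

def activity_conc(count_rate_cps, elapsed_s):
    measured = count_rate_cps / SENSITIVITY   # kBq/mL at measurement time
    decay = math.exp(math.log(2.0) * elapsed_s / F18_HALF_LIFE_S)
    return measured, measured * decay         # now, and corrected to t=0

now, at_injection = activity_conc(count_rate_cps=85.0, elapsed_s=1800.0)
print(f"{now:.1f} kBq/mL measured; {at_injection:.1f} kBq/mL corrected")
```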
Interactive knowledge networks for interdisciplinary course navigation within Moodle.
Scherl, Andre; Dethleffsen, Kathrin; Meyer, Michael
2012-12-01
Web-based hypermedia learning environments are widely used in modern education and seem particularly well suited for interdisciplinary learning. Previous work has identified guidance through these complex environments as a crucial challenge for their acceptance and efficiency. We reasoned that map-based navigation might provide straightforward and effortless orientation. To achieve this, we developed a clickable, user-oriented, concept map-based navigation plugin. This tool is implemented as an extension of Moodle, a widely used learning management system. It visualizes intra- and interdisciplinary relations between learning objects and is generated dynamically depending on user-set parameters and interactions. The plugin leaves the choice of navigation type to the user and supports direct guidance. Previously developed and evaluated face-to-face interdisciplinary learning materials bridging the physiology and physics courses of a medical curriculum were integrated as learning objects, whose relations were defined by metadata. Learning objects included text pages, self-assessments, videos, animations, and simulations. In a field study, we analyzed the effects of this learning environment on the physiology and physics knowledge, as well as the transfer ability, of third-term medical students. Data were generated from pre- and posttest questionnaires and from tracking student navigation. Use of the hypermedia environment resulted in a significant increase in knowledge and transfer capability, and the efficiency of learning was enhanced. We conclude that hypermedia environments based on Moodle and enriched by concept map-based navigation tools can significantly support interdisciplinary learning. Implementing adaptivity may further strengthen this approach.
Wang, Chunliang; Ritter, Felix; Smedby, Orjan
2010-07-01
To enhance the functional expandability of a picture archiving and communication system (PACS) workstation and to facilitate the integration of third-party image-processing modules, we propose a browser-server style method. In the proposed solution, the PACS workstation shows the front-end user interface defined in an XML file while the image-processing software runs in the background as a server. Inter-process communication (IPC) techniques allow an efficient exchange of image data, parameters, and user input between the PACS workstation and the stand-alone image-processing software. Using a predefined communication protocol, the PACS workstation developer and the image-processing software developer do not need detailed information about the other system, yet can still achieve seamless integration between the two, and the IPC procedure is totally transparent to the final user. A browser-server style solution was built between OsiriX (PACS workstation software) and MeVisLab (image-processing software). Ten example image-processing modules were easily added to OsiriX by converting existing MeVisLab image-processing networks. Image data transfer using shared memory added <10 ms of processing time, while the other IPC methods cost 1-5 s in our experiments. Browser-server style communication based on IPC techniques is an appealing method that allows PACS workstation developers and image-processing software developers to cooperate while focusing on different interests.
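The OsiriX/MeVisLab bridge itself is native code, but the shared-memory idea behind the <10 ms transfer is easy to illustrate: only a block name and array shape cross the slow message channel, while pixel data are mapped directly by both sides. The names and sizes below are hypothetical.

```python
# Sketch of zero-copy image exchange via named shared memory.
import numpy as np
from multiprocessing import shared_memory

# Producer side: place a slice into a named shared-memory block.
slice_data = np.random.randint(0, 4096, size=(512, 512), dtype=np.uint16)
shm = shared_memory.SharedMemory(create=True, size=slice_data.nbytes,
                                 name="pacs_slice_0")
view = np.ndarray(slice_data.shape, dtype=slice_data.dtype, buffer=shm.buf)
view[:] = slice_data                    # single copy into the shared block

# Consumer side (normally another process): attach by name, no data copy.
shm2 = shared_memory.SharedMemory(name="pacs_slice_0")
img = np.ndarray((512, 512), dtype=np.uint16, buffer=shm2.buf)
print(img.mean())

shm2.close(); shm.close(); shm.unlink()
```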
The economics of time shared computing: Congestion, user costs and capacity
NASA Technical Reports Server (NTRS)
Agnew, C. E.
1982-01-01
Time-shared systems permit the fixed costs of computing resources to be spread over large numbers of users. However, bottleneck results from the theory of closed queueing networks can be used to show that this economy of scale is offset by the increased congestion that results as more users are added to the system. If one considers the total costs, including the congestion cost, there is an optimal number of users for a system, and it equals the saturation value usually used to define system capacity.
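The bottleneck argument can be made concrete with exact mean value analysis (MVA) for a single shared computer serving N thinking users: beyond the saturation point N* = (D + Z)/D, response time grows roughly linearly with N. All numbers below are illustrative.

```python
# Exact MVA for a closed model: one queueing center (total service
# demand D) plus a think-time delay station Z, iterated over population.
def mva_response_time(n_users, D=0.2, Z=10.0):
    """Mean response time R(N) at the shared computer."""
    q = 0.0                       # mean queue length at the server
    for n in range(1, n_users + 1):
        r = D * (1.0 + q)         # arriving user sees q jobs ahead
        x = n / (r + Z)           # system throughput (Little's law)
        q = x * r                 # server queue length (Little's law)
    return r

n_star = (0.2 + 10.0) / 0.2       # saturation point: 51 users here
for n in (10, 40, 51, 80, 120):
    print(f"N={n:3d}  R={mva_response_time(n):6.2f} s")
```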
A software for managing after-hours activities in research user facilities
Camino, F. E.
2017-05-01
Here, we present an after-hours activity management program for shared facilities, which handles the processes required for after-hours access (request, approval, extension, etc.). It implements the concept of permitted after-hours activities: a list of well-defined activities that each user may perform after hours. The program gives users an easy and unambiguous way to know which activities they are allowed to perform after hours. In addition, it can enhance its safety efficacy by interacting with the lab and instrument access control systems commonly present in user facilities.
A database system to support image algorithm evaluation
NASA Technical Reports Server (NTRS)
Lien, Y. E.
1977-01-01
The design of an interactive image database system, IMDB, is given; it allows the user to create, retrieve, store, display, and manipulate images through a high-level, interactive image query (IQ) language. The IQ language permits the user to define false-color functions, pixel-value transformations, overlay functions, zoom functions, and windows, and to manipulate images through generic functions. The user can direct images to display devices for visual and qualitative analysis; image histograms and pixel-value distributions can also be computed for quantitative analysis of images.
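A toy rendering of the "generic function" idea behind IQ, with windows, pixel-value transformations, false color, and histograms expressed as composable array operations. The function names are invented for illustration, not IMDB's.

```python
# Sketch: generic image functions as composable array operations.
import numpy as np

def window(img, x0, y0, w, h):
    """Extract a rectangular window from an image."""
    return img[y0:y0 + h, x0:x0 + w]

def transform(img, fn):
    """Apply a pixel-value transformation."""
    return fn(img.astype(np.float32))

def false_color(img, thresholds=(64, 128, 192)):
    """Map pixel-value ranges to color indices 0..3 for display."""
    return np.digitize(img, thresholds).astype(np.uint8)

img = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
roi = window(img, 32, 32, 64, 64)
stretched = transform(roi, lambda p: np.clip((p - p.min()) * 2.0, 0, 255))
colored = false_color(stretched)
hist, _ = np.histogram(img, bins=16, range=(0, 256))  # pixel distribution
```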
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-22
... "devices" and/or "UserIDs." OPRA defines a "Subscriber," in general, as an entity or person that... Exchanges on an Unlisted Trading Privilege Basis" ("Nasdaq/UTP Plan"). CTA and the Nasdaq/UTP Plan define... capacity. Second, because the definition of the term "associated person" is defined differently in the...
A Transionospheric Communication Channel Model
1977-07-01
Hessing, Anne R.; Hatfield, V. Elaine (Contract F30602-75-C-0236)
...ables from a user-selected set of ionospheric state parameters. Mode II of IONSCNT extends the Mode-I results to second-order statistics for cases... describes only representative conditions for the set of input parameters selected by the user. Night-to-night departures from the calculated mean...
Display system for imaging scientific telemetric information
NASA Technical Reports Server (NTRS)
Zabiyakin, G. I.; Rykovanov, S. N.
1979-01-01
A system for imaging scientific telemetric information, based on the M-6000 minicomputer and the SIGD graphic display, is described. It provides two-dimensional graphic display of telemetric information and interaction with the computer for the analysis and processing of the telemetric parameters displayed on the screen. The running-parameter information output method is presented. User capabilities in the analysis and processing of telemetric information imaged on the display screen, as well as the user language, are discussed and illustrated.
Data vs. information: A system paradigm
NASA Technical Reports Server (NTRS)
Billingsley, F. C.
1982-01-01
The data system designer requires data parameters and depends on the user to convert information needs into these data parameters. This conversion will be done with more or less accuracy, beginning a chain of inaccuracies that propagate through the system and, in the end, may prevent the user from converting the data received into the information required. The concept pursued here is that errors occur in various parts of the system and, having occurred, propagate to the end. Modeling the system may allow an estimation of the effects at any point and of the final accumulated effect, and may provide a method of allocating an error budget among the system components. The selection of the various technical parameters that a data system must meet must be done in relation to the ability of the user to turn the cold, impersonal data into a live, personal decision or piece of information.
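The error-budget idea can be stated in one line: if independent stages contribute relative errors e_i, the accumulated end-to-end error is roughly the root-sum-square sqrt(Σ e_i²), which is what makes per-stage allocation possible. A sketch with invented stage names and numbers:

```python
# Root-sum-square accumulation of independent per-stage relative errors;
# all stage names and values below are illustrative, not from the paper.
import math

stages = {"requirements translation": 0.05,
          "sensor calibration": 0.03,
          "quantization/encoding": 0.01,
          "processing/resampling": 0.04}

total = math.sqrt(sum(e * e for e in stages.values()))
print(f"accumulated relative error ~ {total:.3f}")  # ~0.071 here
```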
Mohr, David C; Schueller, Stephen M; Montague, Enid; Burns, Michelle Nicole; Rashidi, Parisa
2014-06-05
A growing number of investigators have commented on the lack of models to inform the design of behavioral intervention technologies (BITs). BITs, which include a subset of mHealth and eHealth interventions, employ a broad range of technologies, such as mobile phones, the Web, and sensors, to support users in changing behaviors and cognitions related to health, mental health, and wellness. We propose a model that conceptually defines BITs, from the clinical aim to the technological delivery framework. The BIT model defines both the conceptual and technological architecture of a BIT. Conceptually, a BIT model should answer the questions why, what, how (conceptual and technical), and when. While BITs generally have a larger treatment goal, such goals typically consist of smaller intervention aims (the "why") such as promotion or reduction of specific behaviors, and behavior change strategies (the conceptual "how"), such as education, goal setting, and monitoring. Behavior change strategies are instantiated with specific intervention components or "elements" (the "what"). The characteristics of intervention elements may be further defined or modified (the technical "how") to meet the needs, capabilities, and preferences of a user. Finally, many BITs require specification of a workflow that defines when an intervention component will be delivered. The BIT model includes a technological framework (BIT-Tech) that can integrate and implement the intervention elements, characteristics, and workflow to deliver the entire BIT to users over time. This implementation may either be predefined or include adaptive systems that can tailor the intervention based on data from the user and the user's environment. The BIT model provides a step towards formalizing the translation of developer aims into intervention components, larger treatments, and methods of delivery in a manner that supports research and communication between investigators on how to design, develop, and deploy BITs.
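As a schematic encoding of the why/what/how/when decomposition described above (our illustration using the paper's vocabulary, not the authors' software):

```python
# Sketch: the BIT model's layers as simple data types.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Element:                  # the "what": a concrete component
    name: str                   # e.g., "step-count prompt" (hypothetical)
    characteristics: dict = field(default_factory=dict)  # technical "how"

@dataclass
class Strategy:                 # the conceptual "how"
    name: str                   # e.g., "monitoring", "goal setting"
    elements: List[Element] = field(default_factory=list)

@dataclass
class BIT:
    aim: str                    # the "why": e.g., "reduce sedentary time"
    strategies: List[Strategy] = field(default_factory=list)
    workflow: Callable[[dict], bool] = lambda ctx: True  # the "when"

bit = BIT(aim="increase physical activity",
          strategies=[Strategy("monitoring",
                               [Element("step-count prompt",
                                        {"modality": "push notification"})])],
          workflow=lambda ctx: ctx.get("hour", 0) in range(9, 21))
```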