Computer-Assisted Traffic Engineering Using Assignment, Optimal Signal Setting, and Modal Split
DOT National Transportation Integrated Search
1978-05-01
Methods of traffic assignment, traffic signal setting, and modal split analysis are combined in a set of computer-assisted traffic engineering programs. The system optimization and user optimization traffic assignments are described. Travel time func...
NASA Astrophysics Data System (ADS)
Osei, Richard
There are many problems associated with operating a data center, including data security, system performance, increasing infrastructure complexity, growing storage utilization, keeping up with data growth, and rising energy costs. Energy cost differs by location and, at most locations, fluctuates over time. The rising cost of energy makes it harder for data centers to function properly and provide a good quality of service. With reduced energy cost, data centers gain longer-lasting servers and equipment, higher availability of resources, better quality of service, a greener environment, and reduced service and software costs for consumers. Approaches that data centers have tried in order to reduce energy costs include dynamically switching servers on and off based on the number of users and predefined conditions, the use of environmental monitoring sensors, and dynamic voltage and frequency scaling (DVFS), which lets processors run at different combinations of frequency and voltage. This thesis presents another method by which energy cost at data centers can be reduced: applying Ant Colony Optimization (ACO) to a Quadratic Assignment Problem (QAP) when assigning user requests to servers in geo-distributed data centers. The front portals that handle users' requests act as ants searching for cost-effective ways to assign those requests to servers in heterogeneous geo-distributed data centers. Simulation results indicate that the ACO for Optimal Server Activation and Task Placement algorithm reduces energy cost for both small and large numbers of user requests in a geo-distributed data center, and its performance improves as the input data grows.
In a simulation with 3 geo-distributed data centers and user resource requests ranging from 25,000 to 25,000,000, the ACO algorithm reduced energy cost by an average of $0.70 per second. The ACO for Optimal Server Activation and Task Placement algorithm has proven to work as an alternative or improvement for reducing energy cost in geo-distributed data centers.
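The abstract describes the ACO-on-QAP approach only at a high level. A minimal sketch of an ant-colony assignment of requests to servers might look as follows; the cost matrix, parameter values, and the function name `aco_assign` are invented for illustration and are not taken from the thesis:

```python
import random

def aco_assign(costs, n_ants=20, n_iters=50, evap=0.5, seed=1):
    """Assign each request (row) to a server (column) to minimize total
    energy cost, via a simple ant colony sketch. costs[i][j] is the
    (hypothetical) energy cost of serving request i at data center j."""
    rng = random.Random(seed)
    n_req, n_srv = len(costs), len(costs[0])
    pher = [[1.0] * n_srv for _ in range(n_req)]   # pheromone trails
    best, best_cost = None, float("inf")
    for _ in range(n_iters):
        for _ in range(n_ants):
            assign = []
            for i in range(n_req):
                # choice probability ~ pheromone * heuristic (1/cost)
                w = [pher[i][j] / (1e-9 + costs[i][j]) for j in range(n_srv)]
                assign.append(rng.choices(range(n_srv), weights=w)[0])
            c = sum(costs[i][j] for i, j in enumerate(assign))
            if c < best_cost:
                best, best_cost = assign, c
        # evaporate all trails, then reinforce the best-so-far assignment
        for i in range(n_req):
            for j in range(n_srv):
                pher[i][j] *= (1 - evap)
            pher[i][best[i]] += 1.0 / best_cost
    return best, best_cost
```

A real deployment would add capacity constraints per server and location-dependent electricity prices; this sketch only shows the pheromone/heuristic mechanics.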
ERIC Educational Resources Information Center
Gale, David; And Others
Four units make up the contents of this document. The first examines applications of finite mathematics to business and economics. The user is expected to learn the method of optimization in optimal assignment problems. The second module presents applications of difference equations to economics and social sciences, and shows how to: 1) interpret…
Distributed Combinatorial Optimization Using Privacy on Mobile Phones
NASA Astrophysics Data System (ADS)
Ono, Satoshi; Katayama, Kimihiro; Nakayama, Shigeru
This paper proposes a method for distributed combinatorial optimization that uses mobile phones as computers. In the proposed method, an ordinary computer generates solution candidates and mobile phones evaluate them by referring to private information and preferences. Users therefore do not have to send their private data to any other computer and do not have to refrain from inputting their preferences, so they can obtain satisfactory solutions. Experimental results showed that the proposed method solved room assignment problems without sending users' private information to a server.
A Genetic Algorithm for the Bi-Level Topological Design of Local Area Networks
Camacho-Vallejo, José-Fernando; Mar-Ortiz, Julio; López-Ramos, Francisco; Rodríguez, Ricardo Pedraza
2015-01-01
Local access networks (LAN) are commonly used as communication infrastructures which meet the demand of a set of users in the local environment. Usually these networks consist of several LAN segments connected by bridges. The topological LAN design bi-level problem consists of assigning users to clusters and connecting the clusters by bridges in order to obtain a network with minimum response time and minimum connection cost. The leader makes the decision of optimally assigning users to clusters, and the follower makes the decision of connecting all the clusters while forming a spanning tree. In this paper, we propose a genetic algorithm for solving the bi-level topological design of a local access network. Our solution method considers the Stackelberg equilibrium to solve the bi-level problem. The Stackelberg-Genetic algorithm procedure deals with the fact that the follower's problem cannot be optimally solved in a straightforward manner. The computational results obtained from two different sets of instances show that the performance of the developed algorithm is efficient and that it is more suitable for solving the bi-level problem than a previous Nash-Genetic approach. PMID:26102502
Development of Watch Schedule Using Rules Approach
NASA Astrophysics Data System (ADS)
Jurkevicius, Darius; Vasilecas, Olegas
The software for schedule creation and optimization solves a difficult, important, and practical problem. The proposed solution is an online employee portal where administrator users can create and manage watch schedules and employee requests. Each employee can log in with his/her own account to see assignments, manage requests, etc.; employees designated as administrators can perform the scheduling online. This scheduling software allows users not only to see the initial and optimized watch schedule in a simple and understandable form, but also to create special rules and criteria and input their business rules. Using these rules, the system automatically generates the watch schedule.
Optimizing Search and Ranking in Folksonomy Systems by Exploiting Context Information
NASA Astrophysics Data System (ADS)
Abel, Fabian; Henze, Nicola; Krause, Daniel
Tagging systems enable users to annotate resources with freely chosen keywords. The evolving set of tag assignments is called a folksonomy, and there already exist approaches that exploit folksonomies to improve resource retrieval. In this paper, we analyze and compare graph-based ranking algorithms: FolkRank and SocialPageRank. We enhance these algorithms by exploiting the context of tags and evaluate the results on the GroupMe! dataset. In GroupMe!, users can organize and maintain arbitrary Web resources in self-defined groups. When users annotate resources in GroupMe!, this can be interpreted in the context of a certain group. The grouping activity itself is easy for users to perform, yet it delivers valuable semantic information about resources and their context. We present GRank, which uses this context information to improve and optimize the detection of relevant search results, and compare different strategies for ranking result lists in folksonomy systems.
Castillo, Andrés M; Bernal, Andrés; Patiny, Luc; Wist, Julien
2015-08-01
We present a method for the automatic assignment of small molecules' NMR spectra. The method includes an automatic and novel self-consistent peak-picking routine that validates NMR peaks in each spectrum against peaks in the same or other spectra that are due to the same resonances. The auto-assignment routine used is based on branch-and-bound optimization and relies predominantly on integration and correlation data; chemical shift information may be included when available to speed up the search and shorten the list of viable assignments, but in most cases tested, it is not required in order to find the correct assignment. This automatic assignment method is implemented as a web-based tool that runs without any user input other than the acquired spectra. Copyright © 2015 John Wiley & Sons, Ltd.
An initial approach towards quality of service based Spectrum Trading
NASA Astrophysics Data System (ADS)
Bastidas, Carlos E. Caicedo; Vanhoy, Garret; Volos, Haris I.; Bose, Tamal
Spectrum scarcity has become an important issue as demands for higher data rates increase in diverse wireless applications and aerospace communication scenarios. To address this problem, it becomes necessary to manage radio spectrum assignment in a way that optimizes the distribution of spectrum resources among several users while taking into account the quality of service (QoS) characteristics desired by the users of spectrum. In this paper, a novel approach to managing spectrum assignment based on Spectrum Trading (ST) is presented. Market-based spectrum assignment mechanisms such as spectrum trading are of growing interest to many spectrum management agencies, which plan to increase the use of these mechanisms for spectrum management and reduce their emphasis on command-and-control methods. This paper presents some of our initial work on incorporating quality of service information into the mechanisms that determine how spectrum should be traded when using a spectrum exchange. Through simulations and a testbed implementation of a QoS-aware spectrum exchange, our results show the viability of QoS-based mechanisms in spectrum trading and in the enhancement of dynamic spectrum assignment systems.
Steps Toward Optimal Competitive Scheduling
NASA Technical Reports Server (NTRS)
Frank, Jeremy; Crawford, James; Khatib, Lina; Brafman, Ronen
2006-01-01
This paper is concerned with the problem of allocating a unit capacity resource to multiple users within a pre-defined time period. The resource is indivisible, so that at most one user can use it at each time instance. However, different users may use it at different times. The users have independent, selfish preferences for when and for how long they are allocated this resource. Thus, they value different resource access durations differently, and they value different time slots differently. We seek an optimal allocation schedule for this resource. This problem arises in many institutional settings where, e.g., different departments, agencies, or personnel compete for a single resource. We are particularly motivated by the problem of scheduling NASA's Deep Space Network (DSN) among different users within NASA. Access to DSN is needed for transmitting data from various space missions to Earth. Each mission has different needs for DSN time, depending on satellite and planetary orbits. Typically, the DSN is over-subscribed, in that not all missions will be allocated as much time as they want. This leads to various inefficiencies: missions spend much time and resource lobbying for their time, often exaggerating their needs. NASA, on the other hand, would like to make optimal use of this resource, ensuring that the good for NASA is maximized. This raises the thorny problem of how to measure the utility to NASA of each allocation. In the typical case, it is difficult for the central agency, NASA in our case, to assess the value of each interval to each user; this is really only known to the users who understand their needs. Thus, our problem is more precisely formulated as follows: find an allocation schedule for the resource that maximizes the sum of users' preferences, when the preference values are private information of the users. We bypass this problem by assuming that one can assign money to customers. This assumption is reasonable; a committee is usually in charge of deciding the priority of each mission competing for access to the DSN within a time period while scheduling. Instead, we can assume that the committee assigns a budget to each mission.
Cell transmission model of dynamic assignment for urban rail transit networks.
Xu, Guangming; Zhao, Shuo; Shi, Feng; Zhang, Feilian
2017-01-01
For urban rail transit networks, the space-time flow distribution can play an important role in evaluating and optimizing space-time resource allocation. To obtain the space-time flow distribution without the restriction of schedules, a dynamic assignment problem is proposed based on the concept of continuous transmission. To solve the dynamic assignment problem, a cell transmission model is built for urban rail transit networks. The priority principle, queuing process, capacity constraints, and congestion effects are considered in the cell transmission mechanism. An efficient method is then designed to solve the shortest path for an urban rail network, which decreases the computing cost of solving the cell transmission model. The instantaneous dynamic user-optimal state can be reached using the method of successive averages. Many evaluation indexes of passenger flow can be generated to provide effective support for the optimization of train schedules and the capacity evaluation of urban rail transit networks. Finally, the model and its potential application are demonstrated via two numerical experiments using a small-scale network and the Beijing Metro network.
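The abstract mentions reaching the instantaneous dynamic user-optimal state with the method of successive averages (MSA). On a toy static two-route network, the MSA iteration — all-or-nothing loading of the currently cheapest route, then averaging with step 1/k — can be sketched as follows; the network and cost functions are made up, and the paper applies the idea to a much richer cell transmission model:

```python
def msa_equilibrium(demand, cost_fns, n_iters=200):
    """Method of Successive Averages on a toy route-choice problem.
    cost_fns[r](flow) gives route r's travel time at the given flow.
    Each iteration loads all demand onto the cheapest route
    (all-or-nothing), then averages flows with step size 1/k."""
    flows = [demand / len(cost_fns)] * len(cost_fns)
    for k in range(1, n_iters + 1):
        costs = [f(x) for f, x in zip(cost_fns, flows)]
        target = [0.0] * len(flows)
        target[costs.index(min(costs))] = demand   # all-or-nothing load
        step = 1.0 / k
        flows = [(1 - step) * x + step * y for x, y in zip(flows, target)]
    return flows
```

For two routes with costs 10 + x and 15 + 0.5x and demand 10, the user-equilibrium split is (20/3, 10/3), where both routes have equal travel time; the iterates oscillate around this point with shrinking amplitude.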
Lee, Chaewoo
2014-01-01
The advancement in wideband wireless networks supports real-time services such as IPTV and live video streaming. However, because of the shared nature of the wireless medium, efficient resource allocation has been studied to achieve a high level of acceptability and proliferation of wireless multimedia. Scalable video coding (SVC) with adaptive modulation and coding (AMC) provides an excellent solution for wireless video streaming. By assigning different modulation and coding schemes (MCSs) to video layers, SVC can provide good video quality to users in good channel conditions and basic video quality to users in bad channel conditions. For optimal resource allocation, a key issue in applying SVC to wireless multicast service is how to assign MCSs and time resources to each SVC layer under heterogeneous channel conditions. We formulate this problem with integer linear programming (ILP) and provide numerical results to show the performance in an 802.16m environment. The results show that our methodology enhances the overall system throughput compared to an existing algorithm. PMID:25276862
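The paper formulates the MCS-and-time assignment as an ILP. For intuition, the same decision can be brute-forced on a tiny instance as sketched below; the layer rates, spectral efficiencies, and the layers-decoded utility measure are illustrative assumptions, not the paper's exact model:

```python
from itertools import product

def best_mcs_assignment(layer_rates, eff, user_max_mcs, time_budget):
    """Exhaustively pick one MCS per SVC layer.
    eff[m] is MCS m's spectral efficiency, so sending rate r takes
    r / eff[m] time units. A user with capability umax decodes layers
    from the base layer up, stopping at the first layer whose MCS it
    cannot receive. Utility = total layers decoded across all users."""
    best = (None, -1)
    for choice in product(range(len(eff)), repeat=len(layer_rates)):
        t = sum(r / eff[m] for r, m in zip(layer_rates, choice))
        if t > time_budget:          # violates the time-resource budget
            continue
        util = 0
        for umax in user_max_mcs:
            for m in choice:
                if m > umax:
                    break            # cannot decode this or higher layers
                util += 1
        if util > best[1]:
            best = (choice, util)
    return best
```

With two layers, a slow-but-robust MCS 0 (eff 1) and a fast MCS 1 (eff 2), and one weak user among three, protecting the base layer with MCS 0 wins: the weak user keeps basic quality while the others still get both layers.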
Eye-gaze determination of user intent at the computer interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldberg, J.H.; Schryver, J.C.
1993-12-31
Determination of user intent at the computer interface through eye-gaze monitoring can significantly aid applications for the disabled, as well as telerobotics and process control interfaces. Whereas current eye-gaze control applications are limited to object selection and x/y gazepoint tracking, a methodology was developed here to discriminate a more abstract interface operation: zooming in or out. This methodology first collects samples of eye-gaze location looking at controlled stimuli, at 30 Hz, just prior to a user's decision to zoom. The sample is broken into data frames, or temporal snapshots. Within a data frame, all spatial samples are connected into a minimum spanning tree, then clustered according to user-defined parameters. Each cluster is mapped to one in the prior data frame, and statistics are computed from each cluster. These characteristics include cluster size, position, and pupil size. A multiple discriminant analysis uses these statistics both within and between data frames to formulate optimal rules for assigning the observations into zoom-in, zoom-out, or no-zoom conditions. The statistical procedure effectively generates heuristics for future assignments, based upon these variables. Future work will enhance the accuracy and precision of the modeling technique, and will empirically test users in controlled experiments.
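The clustering step described above — connect the gaze samples into a minimum spanning tree, then split it into clusters — can be sketched as follows. Cutting edges longer than a threshold is one common MST-clustering rule; the abstract does not give the exact parameters, so the threshold and construction here are assumptions:

```python
import math

def mst_clusters(points, cut):
    """Cluster 2-D gaze samples: build a minimum spanning tree with
    Prim's algorithm, drop edges longer than `cut`, and label the
    resulting connected components (fixation clusters)."""
    n = len(points)
    dist = lambda a, b: math.dist(points[a], points[b])
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        # cheapest edge from the tree to a node outside it
        a, b = min(((i, j) for i in in_tree for j in range(n)
                    if j not in in_tree), key=lambda e: dist(*e))
        in_tree.add(b)
        edges.append((a, b))
    # union-find over the short edges only
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for a, b in edges:
        if dist(a, b) <= cut:
            parent[find(a)] = find(b)
    return [find(i) for i in range(n)]
```

Two tight groups of samples separated by a long MST edge come out as two clusters, mirroring how fixations separated by a saccade would be split.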
DeRobertis, Christopher V.; Lu, Yantian T.
2010-02-23
A method, system, and program storage device for creating a new user account or user group with a unique identification number in a computing environment having multiple user registries is provided. In response to receiving a command to create a new user account or user group, an operating system of a clustered computing environment automatically checks multiple registries configured for the operating system to determine whether a candidate identification number for the new user account or user group has been assigned already to one or more existing user accounts or groups, respectively. The operating system automatically assigns the candidate identification number to the new user account or user group created in a target user registry if the checking indicates that the candidate identification number has not been assigned already to any of the existing user accounts or user groups, respectively.
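A simplified sketch of the cross-registry uniqueness check described above might look like this; falling back to the next free number when the candidate is taken is my own assumption for illustration, since the patent abstract only specifies assigning the candidate when it is unused:

```python
def assign_unique_id(registries, candidate, max_id=65535):
    """Return the first id >= candidate that is not already assigned
    in ANY registry. Each registry is modeled as a set of assigned
    identification numbers; uniqueness must hold across all of them."""
    used = set().union(*registries)
    uid = candidate
    while uid in used:
        uid += 1                      # assumed fallback, not in the patent
        if uid > max_id:
            raise RuntimeError("no free id available")
    return uid
```

If the candidate is free in every configured registry it is assigned directly, matching the behavior the abstract describes.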
Rate Adaptive Based Resource Allocation with Proportional Fairness Constraints in OFDMA Systems
Yin, Zhendong; Zhuang, Shufeng; Wu, Zhilu; Ma, Bo
2015-01-01
Orthogonal frequency division multiple access (OFDMA), which is widely used in wireless sensor networks, allows different users to obtain different subcarriers according to their subchannel gains. Therefore, how to assign subcarriers and power to different users to achieve a high system sum rate is an important research area in OFDMA systems. In this paper, the focus of study is rate adaptive (RA) based resource allocation with proportional fairness constraints. Since the resource allocation is an NP-hard and non-convex optimization problem, a new efficient resource allocation algorithm, ACO-SPA, is proposed, which combines ant colony optimization (ACO) and suboptimal power allocation (SPA). To reduce the computational complexity, the optimization problem of resource allocation in OFDMA systems is separated into two steps. In the first, the ant colony optimization algorithm is performed to solve the subcarrier allocation. Then, the suboptimal power allocation algorithm is developed with strict proportional fairness, based on the principle that the sum of the power and the reciprocal of the channel-to-noise ratio is equal across each user's subchannels. Extensive simulation results are presented in support. In contrast with root-finding and linear methods, the proposed method provides better performance in solving the proportional resource allocation problem in OFDMA systems. PMID:26426016
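The SPA principle — within each user, the allocated power plus the reciprocal of the channel-to-noise ratio is equal across subchannels — is a water-filling condition. A minimal sketch under that reading (the function name and the clip-and-redistribute loop are my own, not the paper's exact procedure):

```python
def waterfill_user(cnr, total_power):
    """Allocate total_power across one user's subcarriers so that
    p_n + 1/cnr_n sits at a common 'water level' on every subcarrier
    that receives power; negative allocations are clipped to zero and
    the power redistributed over the remaining subcarriers."""
    active = list(range(len(cnr)))
    while active:
        level = (total_power + sum(1.0 / cnr[n] for n in active)) / len(active)
        p = {n: level - 1.0 / cnr[n] for n in active}
        neg = [n for n in active if p[n] < 0]
        if not neg:
            alloc = [0.0] * len(cnr)
            for n in active:
                alloc[n] = p[n]
            return alloc
        active = [n for n in active if n not in neg]  # drop weak subcarriers
    return [0.0] * len(cnr)
```

For two subcarriers with CNRs 1.0 and 0.5 and unit power, the whole budget goes to the strong subcarrier and both satisfy p + 1/CNR = 2.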
Providing Effective Access to Shared Resources: A COIN Approach
NASA Technical Reports Server (NTRS)
Airiau, Stephane; Wolpert, David H.
2004-01-01
Managers of systems of shared resources typically have many separate goals. Examples are efficient utilization of the resources among its users and ensuring that no user's satisfaction in the system falls below a preset minimal level. Since such goals will usually conflict with one another, either implicitly or explicitly the manager must determine the relative importance of the goals, encapsulating that into an overall utility function rating the possible behaviors of the entire system. Here we demonstrate a distributed, robust, and adaptive way to optimize that overall function. Our approach is to interpose adaptive agents between each user and the system, where each such agent is working to maximize its own private utility function. In turn, each such agent's function should be both relatively easy for the agent to learn to optimize, and "aligned" with the overall utility function of the system manager - an overall function that is based on but in general different from the satisfaction functions of the individual users. To ensure this we enhance the Collective INtelligence (COIN) framework to incorporate user satisfaction functions in the overall utility function of the system manager and accordingly in the associated private utility functions assigned to the users' agents. We present experimental evaluations of different COIN-based private utility functions and demonstrate that those COIN-based functions outperform some natural alternatives.
Providing Effective Access to Shared Resources: A COIN Approach
NASA Technical Reports Server (NTRS)
Airiau, Stephane; Wolpert, David H.; Sen, Sandip; Tumer, Kagan
2003-01-01
Managers of systems of shared resources typically have many separate goals. Examples are efficient utilization of the resources among its users and ensuring that no user's satisfaction in the system falls below a preset minimal level. Since such goals will usually conflict with one another, either implicitly or explicitly the manager must determine the relative importance of the goals, encapsulating that into an overall utility function rating the possible behaviors of the entire system. Here we demonstrate a distributed, robust, and adaptive way to optimize that overall function. Our approach is to interpose adaptive agents between each user and the system, where each such agent is working to maximize its own private utility function. In turn, each such agent's function should be both relatively easy for the agent to learn to optimize, and 'aligned' with the overall utility function of the system manager - an overall function that is based on but in general different from the satisfaction functions of the individual users. To ensure this we enhance the COllective INtelligence (COIN) framework to incorporate user satisfaction functions in the overall utility function of the system manager and accordingly in the associated private utility functions assigned to the users' agents. We present experimental evaluations of different COIN-based private utility functions and demonstrate that those COIN-based functions outperform some natural alternatives.
2008-01-01
designing cost-effective CIRF networks or readily comparing alternative potential network designs. The RAND Corporation was asked to develop such an ... optimization model that allows users to select the best mix of land- and sea-based FSLs for a given set of operational scenarios, thereby reducing costs while ...
Printed wiring board system programmer's manual
NASA Technical Reports Server (NTRS)
Brinkerhoff, C. D.
1973-01-01
The printed wiring board system provides automated techniques for the design of printed circuit boards and hybrid circuit boards. The system consists of four programs: (1) the preprocessor program combines user supplied data and pre-defined library data to produce the detailed circuit description data; (2) the placement program assigns circuit components to specific areas of the board in a manner that optimizes the total interconnection length of the circuit; (3) the organizer program assigns pin interconnections to specific board levels and determines the optimal order in which the router program should attempt to layout the paths connecting the pins; and (4) the router program determines the wire paths which are to be used to connect each input pin pair on the circuit board. This document is intended to serve as a programmer's reference manual for the printed wiring board system. A detailed description of the internal logic and flow of the printed wiring board programs is included.
Distributed Energy Resources Customer Adoption Model - Graphical User Interface, Version 2.1.8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ewald, Friedrich; Stadler, Michael; Cardoso, Goncalo F
The DER-CAM Graphical User Interface has been redesigned to consist of a dynamic tree structure on the left side of the application window to allow users to quickly navigate between different data categories and views. Views can either be tables with model parameters and input data, the optimization results, or a graphical interface to draw circuit topology and visualize investment results. The model parameters and input data consist of tables where values are assigned to specific keys. The aggregation of all model parameters and input data amounts to the data required to build a DER-CAM model, and is passed to the GAMS solver when users initiate the DER-CAM optimization process. Passing data to the GAMS solver relies on the use of a Java server that handles DER-CAM requests, queuing, and results delivery. This component of the DER-CAM GUI can be deployed either locally or remotely, and constitutes an intermediate step between the user data input and manipulation, and the execution of a DER-CAM optimization in the GAMS engine. The results view shows the results of the DER-CAM optimization and distinguishes between a single and a multi-objective process. The single optimization runs the DER-CAM optimization once and presents the results as a combination of summary charts and hourly dispatch profiles. The multi-objective optimization process consists of a sequence of runs initiated by the GUI, including: 1) CO2 minimization, 2) cost minimization, 3) a user-defined number of points in between objectives 1) and 2). The multi-objective results view includes both access to the detailed results of each point generated by the process as well as the generation of a Pareto frontier graph to illustrate the trade-off between objectives. DER-CAM GUI 2.1.8 also introduces the ability to graphically generate circuit topologies, enabling support for DER-CAM 5.0.0.
This feature consists of: 1) The drawing area, where users can manually create nodes and define their properties (e.g. point of common coupling, slack bus, load) and connect them through edges representing either power lines, transformers, or heat pipes, all with user-defined characteristics (e.g., length, ampacity, inductance, or heat loss); 2) The tables, which display the user-defined topology in the final numerical form that will be passed to the DER-CAM optimization. Finally, the DER-CAM GUI is also deployed with a database schema that allows users to provide different energy load profiles, solar irradiance profiles, and tariff data, which can be stored locally and later used in any DER-CAM model. However, no real data will be delivered with this version.
NASA Astrophysics Data System (ADS)
Shi, Xiaoyu; Shang, Ming-Sheng; Luo, Xin; Khushnood, Abbas; Li, Jian
2017-02-01
With the explosive growth of the Internet economy, recommender systems have become an important technology for addressing information overload. However, recommenders are not one-size-fits-all: different recommenders have different virtues, making them suitable for different users. In this paper, we propose a novel personalized recommender framework based on user preferences, which allows multiple recommenders to coexist in an E-commerce system. We find that the output a given user receives differs considerably across recommenders, and that recommendation accuracy can be significantly improved if each user is assigned his/her optimal personalized recommender. Furthermore, unlike previous works focusing on short-term effects, we also evaluate the long-term effect of the proposed method by modeling the evolution of mutual feedback between users and the online system. Finally, compared with a single recommender running on the online system, the proposed method improves recommendation accuracy significantly and achieves better trade-offs between short- and long-term recommendation performance.
An Optimization of the Basic School Military Occupational Skill Assignment Process
2003-06-01
Corps Intranet (NMCI)23 supports it. We evaluated the use of Microsoft's SQL Server, but dismissed this after learning that TBS did not possess a SQL Server license or a qualified SQL Server administrator.24 SQL Server would have provided for additional security measures not available in MS ... administrator. Although not as powerful as SQL Server, MS Access can handle the multi-user environment necessary for this system.25 The training
Optimization techniques applied to spectrum management for communications satellites
NASA Astrophysics Data System (ADS)
Ottey, H. R.; Sullivan, T. M.; Zusman, F. S.
This paper describes user requirements, algorithms and software design features for the application of optimization techniques to the management of the geostationary orbit/spectrum resource. Relevant problems include parameter sensitivity analyses, frequency and orbit position assignment coordination, and orbit position allotment planning. It is shown how integer and nonlinear programming as well as heuristic search techniques can be used to solve these problems. Formalized mathematical objective functions that define the problems are presented. Constraint functions that impart the necessary solution bounds are described. A versatile program structure is outlined, which would allow problems to be solved in stages while varying the problem space, solution resolution, objective function and constraints.
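Among the heuristic search techniques the paper mentions for frequency and orbit-position assignment, a common baseline is greedy coloring of an interference graph. A hedged sketch (the data layout, the alphabetical processing order, and the function name are assumptions for illustration, not the paper's algorithm):

```python
def assign_frequencies(interference, n_channels):
    """Greedy channel assignment: process satellites in sorted order,
    giving each the lowest-numbered channel not already used by any
    interfering neighbor. interference maps each satellite to the list
    of satellites it would interfere with on a shared channel."""
    assignment = {}
    for sat in sorted(interference):
        taken = {assignment[n] for n in interference[sat] if n in assignment}
        # raises StopIteration if the channel budget is exhausted
        assignment[sat] = next(c for c in range(n_channels) if c not in taken)
    return assignment
```

On a chain A-B-C where only adjacent satellites interfere, two channels suffice: the endpoints reuse channel 0 while the middle satellite takes channel 1.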
Optimizing Mars Airplane Trajectory with the Application Navigation System
NASA Technical Reports Server (NTRS)
Frumkin, Michael; Riley, Derek
2004-01-01
Planning complex missions requires a number of programs to be executed in concert. The Application Navigation System (ANS), developed in the NAS Division, can execute many interdependent programs in a distributed environment. We show that the ANS simplifies user effort and reduces time in optimizing the trajectory of a Martian airplane. We use a software package, Cart3D, to evaluate trajectories and a shortest path algorithm to determine the optimal trajectory. ANS employs the GridScape to represent the dynamic state of the available computer resources. ANS then uses a scheduler to dynamically assign ready tasks to machine resources, and the GridScape to track available resources and forecast completion times of running tasks. We demonstrate the system's capability to schedule and run the trajectory optimization application with efficiency exceeding 60% on 64 processors.
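The shortest-path step over candidate trajectories can be illustrated with a standard Dijkstra search on a waypoint graph. In the paper the edge costs would come from Cart3D aerodynamic evaluations; the graph and costs below are made-up numbers:

```python
import heapq

def shortest_path(graph, src, dst):
    """Dijkstra over a directed waypoint graph.
    graph[u] = [(v, cost), ...]; returns the optimal path and its cost."""
    dist, prev = {src: 0.0}, {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [], dst                  # walk predecessors back to src
    while node != src:
        path.append(node)
        node = prev[node]
    path.append(src)
    return path[::-1], dist[dst]
```

On a small graph the detour A-B-C-D with total cost 3 beats the direct but expensive edges, which is exactly the kind of trade-off a trajectory optimizer faces between flight segments.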
Cai, Lile; Tay, Wei-Liang; Nguyen, Binh P; Chui, Chee-Kong; Ong, Sim-Heng
2013-01-01
Transfer functions play a key role in volume rendering of medical data, but transfer function manipulation is unintuitive and can be time-consuming; achieving an optimal visualization of patient anatomy or pathology is difficult. To overcome this problem, we present a system for automatic transfer function design based on visibility distribution and projective color mapping. Instead of assigning opacity directly based on voxel intensity and gradient magnitude, the opacity transfer function is automatically derived by matching the observed visibility distribution to a target visibility distribution. An automatic color assignment scheme based on projective mapping is proposed to assign colors that allow for the visual discrimination of different structures, while also reflecting the degree of similarity between them. When our method was tested on several medical volumetric datasets, the key structures within the volume were clearly visualized with minimal user intervention. Copyright © 2013 Elsevier Ltd. All rights reserved.
Contrast research of CDMA and GSM network optimization
NASA Astrophysics Data System (ADS)
Wu, Yanwen; Liu, Zehong; Zhou, Guangyue
2004-03-01
With the development of mobile telecommunication networks, CDMA users have raised their expectations of network service quality, while operators have shifted their network management objective from signal coverage to performance improvement. Consequently, the rational layout and optimization of the mobile telecommunication network, the rational configuration of network resources, the improvement of service quality, and the strengthening of the enterprise's core competitiveness have all become concerns of the operating companies. This paper first reviews the workflow of CDMA network optimization. It then discusses key steps in CDMA network optimization, such as PN code assignment and soft-handover calculation. Because GSM is a cellular mobile telecommunication system similar to CDMA, the paper also presents a detailed comparison of CDMA and GSM network optimization, covering both their similarities and differences. In conclusion, network optimization is a long-term task that runs through the whole process of network construction. By adjusting network hardware (such as BTS equipment and RF systems) and network software (such as parameter, configuration, and capacity optimization), network optimization can improve the performance and service quality of the network.
An Active RBSE Framework to Generate Optimal Stimulus Sequences in a BCI for Spelling
NASA Astrophysics Data System (ADS)
Moghadamfalahi, Mohammad; Akcakaya, Murat; Nezamfar, Hooman; Sourati, Jamshid; Erdogmus, Deniz
2017-10-01
A class of brain computer interfaces (BCIs) employs noninvasive recordings of electroencephalography (EEG) signals to enable users with severe speech and motor impairments to interact with their environment and social network. For example, EEG-based BCIs for typing popularly utilize event related potentials (ERPs) for inference. Presentation paradigms in current ERP-based letter-by-letter typing BCIs typically query the user with an arbitrary subset of characters. However, typing accuracy and typing speed can potentially be enhanced with more informed subset selection and flash assignment. In this manuscript, we introduce the active recursive Bayesian state estimation (active-RBSE) framework for inference and sequence optimization. Prior to presentation in each iteration, rather than showing a subset of randomly selected characters, the developed framework optimally selects a subset based on a query function. Selected queries are adaptively specialized for the user during each intent detection. Through a simulation-based study, we assess the effect of active-RBSE on the performance of a language-model-assisted typing BCI in terms of typing speed and accuracy. To provide a baseline for comparison, we also utilize standard presentation paradigms, namely the row-and-column matrix presentation paradigm and random rapid serial visual presentation paradigms. The results show that utilization of active-RBSE can enhance the online performance of the system, both in terms of typing accuracy and speed.
Evaluating a Web-Based Interface for Internet Telemedicine
NASA Technical Reports Server (NTRS)
Lathan, Corinna E.; Newman, Dava J.; Sebrechts, Marc M.; Doarn, Charles R.
1997-01-01
The objective is to introduce the usability engineering methodology, heuristic evaluation, to the design and development of a web-based telemedicine system. Using a set of usability criteria, or heuristics, one evaluator examined the Spacebridge to Russia website for usability problems. Thirty-four usability problems were found in this preliminary study, and all were assigned a severity rating. Heuristic analysis is valuable in the iterative design of a system because the problems can be fixed before deployment, and they are of a different nature than those found by actual users of the system. It was therefore determined that heuristic evaluation paired with user testing has potential value as a strategy for designing systems with optimal performance.
The Power of Ground User in Recommender Systems
Zhou, Yanbo; Lü, Linyuan; Liu, Weiping; Zhang, Jianlin
2013-01-01
Accuracy and diversity are two important aspects of evaluating the performance of recommender systems. Two diffusion-based methods were proposed, inspired respectively by the mass diffusion (MD) and heat conduction (HC) processes on networks. It has been pointed out that MD has high recommendation accuracy yet low diversity, while HC succeeds in seeking out novel or niche items but with relatively low accuracy. The accuracy-diversity dilemma is a long-term challenge in recommender systems. To solve this problem, we introduced a background temperature by adding a ground user who connects to all the items in the user-item bipartite network. Performing the HC algorithm on the network with the ground user (GHC), we show that accuracy can be largely improved while preserving diversity. Furthermore, we proposed a weighted form of the ground user (WGHC) by assigning weights to the newly added links between the ground user and the items. By treating the weight as a free parameter, an optimal value yielding the highest accuracy is obtained. Experimental results on three benchmark data sets show that WGHC outperforms the state-of-the-art method MD in both accuracy and diversity. PMID:23936380
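As a rough illustration of the GHC/WGHC idea described above, the sketch below performs one two-step heat-conduction pass on a small user-item matrix with an appended ground user. The matrix shape, the weight value, and the function name are illustrative assumptions, not taken from the paper, and the sketch assumes every user and item has at least one link.

```python
import numpy as np

def wghc_scores(A, target, w=0.2):
    """Heat-conduction recommendation with a weighted ground user (sketch).

    A: (n_users, n_items) binary user-item adjacency matrix.
    target: index of the user to recommend for.
    w: weight of the links from the hypothetical ground user to every item.
    Returns a score per item (higher = recommend first).
    """
    n_users, n_items = A.shape
    # Append the ground user, connected to all items with weight w.
    Aw = np.vstack([A.astype(float), np.full((1, n_items), w)])
    k_items = Aw.sum(axis=0)            # weighted item degrees
    k_users = Aw.sum(axis=1)            # weighted user degrees
    f0 = A[target].astype(float)        # initial "temperature": collected items
    # Step 1 (items -> users): each user averages its items' temperatures.
    user_temp = (Aw * f0).sum(axis=1) / k_users
    # Step 2 (users -> items): each item averages its users' temperatures.
    scores = (Aw * user_temp[:, None]).sum(axis=0) / k_items
    scores[A[target] == 1] = -np.inf    # never re-recommend collected items
    return scores
```

Setting `w = 0` recovers plain HC on the original bipartite network; tuning `w` is the free parameter the abstract refers to.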
A Vision and Roadmap for Increasing User Autonomy in Flight Operations in the National Airspace
NASA Technical Reports Server (NTRS)
Cotton, William B.; Hilb, Robert; Koczo, Stefan; Wing, David
2016-01-01
The purpose of Air Transportation is to move people and cargo safely, efficiently and swiftly to their destinations. The companies and individuals who use aircraft for this purpose, the airspace users, desire to operate their aircraft according to a dynamically optimized business trajectory for their specific mission and operational business model. In current operations, the dynamic optimization of business trajectories is limited by constraints built into operations in the National Airspace System (NAS) for reasons of safety and operational needs of the air navigation service providers. NASA has been developing and testing means to overcome many of these constraints and permit operations to be conducted closer to the airspace user's changing business trajectory as conditions unfold before and during the flight. A roadmap of logical steps progressing toward increased user autonomy is proposed, beginning with NASA's Traffic Aware Strategic Aircrew Requests (TASAR) concept that enables flight crews to make informed, deconflicted flight-optimization requests to air traffic control. These steps include the use of data communications for route change requests and approvals, integration with time-based arrival flow management processes under development by the Federal Aviation Administration (FAA), increased user authority for defining and modifying downstream, strategic portions of the trajectory, and ultimately application of self-separation. This progression takes advantage of existing FAA NextGen programs and RTCA standards development, and it is designed to minimize the number of hardware upgrades required of airspace users to take advantage of these advanced capabilities to achieve dynamically optimized business trajectories in NAS operations. The roadmap is designed to provide operational benefits to first adopters so that investment decisions do not depend upon a large segment of the user community becoming equipped before benefits can be realized. 
The issues of equipment certification and operational approval of new procedures are addressed in a way that minimizes their impact on the transition by deferring any change in the assignment of separation responsibility until a large body of operational data is available to support the safety case for that change in the last roadmap step. This paper relates the roadmap steps to ongoing activities in order to clarify the economics-based transition to these technologies for operational use.
A model of cloud application assignments in software-defined storages
NASA Astrophysics Data System (ADS)
Bolodurina, Irina P.; Parfenov, Denis I.; Polezhaev, Petr N.; E Shukhman, Alexander
2017-01-01
The aim of this study is to analyze the structure and interaction mechanisms of typical cloud applications and to suggest approaches to optimizing their placement in storage systems. In this paper, we describe a generalized model of cloud applications comprising three basic layers: a model of the application, a model of the service, and a model of the resource. The distinctive feature of the suggested model is that it analyzes cloud resources both from the user's point of view and from the point of view of the software-defined infrastructure of the virtual data center (DC). The innovative character of this model lies in describing, at the same time, the application data placements and the state of the virtual environment, taking the network topology into account. The model of software-defined storage has been developed as a submodel within the resource model. This model allows implementing an algorithm for controlling cloud application assignments in software-defined storages. Experiments show that this algorithm decreases cloud application response times and improves performance in processing user requests. The use of software-defined data storage also allows a reduction in the number of physical storage devices, which demonstrates the efficiency of our algorithm.
NASA Astrophysics Data System (ADS)
Lee, Junghyun; Kim, Heewon; Chung, Hyun; Kim, Haedong; Choi, Sujin; Jung, Okchul; Chung, Daewon; Ko, Kwanghee
2018-04-01
In this paper, we propose a method that uses a genetic algorithm for the dynamic schedule optimization of imaging missions for multiple satellites and ground systems. In particular, visibility conflicts between communication and mission operations are resolved in sequence, with satellite resources (electric power and onboard memory) integrated into the schedule; resource consumption and restoration are considered in the optimization process. Image acquisition is an essential part of satellite missions and is performed via a series of subtasks such as command uplink, image capturing, image storing, and image downlink. An objective function for optimization is designed to maximize usability by considering the following components: user-assigned priority, resource consumption, and image-acquisition time. For the simulation, a series of hypothetical imaging missions are allocated to a multi-satellite control system comprising five satellites and three ground stations with S- and X-band antennas. To demonstrate the performance of the proposed method, simulations are performed for three operation modes: general, commercial, and tactical.
Competitive game theoretic optimal routing in optical networks
NASA Astrophysics Data System (ADS)
Yassine, Abdulsalam; Kabranov, Ognian; Makrakis, Dimitrios
2002-09-01
Optical transport service providers need control and optimization strategies for wavelength management, network provisioning, restoration, and protection, allowing them to define and deploy new services and maintain competitiveness. In this paper, we investigate a game-theoretic model for wavelength and flow assignment in multi-wavelength optical networks consisting of several backbone long-haul optical network transport service providers (TSPs) who offer their services, in terms of bandwidth, to Internet service providers (ISPs). The ISPs act as brokers or agents between the TSPs and end users: the agent (ISP) buys services (bandwidth) from the TSP, while the TSPs compete among themselves to sell their services and maintain profitability. We present a case study demonstrating the impact of different bandwidth-broker demands on the supplier's profit and the price paid by the network broker.
Specdata: Automated Analysis Software for Broadband Spectra
NASA Astrophysics Data System (ADS)
Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.
2017-06-01
With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, with the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open source, interactive tool which is designed to simplify and greatly accelerate the spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving him/her considerable flexibility to assign, manipulate, interpret and export their analysis. The automated - and key - component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features, and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, list of frequencies...), or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, the control is then handed over to the user who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines are removed are among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
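The core database-query step described above, assigning observed spectral features to the nearest catalogued transition within a frequency tolerance, can be sketched as follows. The tuple-based catalog format, the function name, and the tolerance value are simplifying assumptions, not SPECdata's actual data model.

```python
import bisect

def assign_peaks(peaks_mhz, catalog, tol_mhz=0.1):
    """Assign each experimental peak to the nearest catalog line within tol.

    catalog: list of (frequency_mhz, species) tuples, a stand-in for the
    in-house .cat database mentioned above. Returns a list of
    (peak, species-or-None) pairs; None marks an unassigned feature.
    """
    catalog = sorted(catalog)
    freqs = [f for f, _ in catalog]
    out = []
    for p in peaks_mhz:
        i = bisect.bisect_left(freqs, p)
        best = None
        # Only the catalog lines bracketing p can be nearest.
        for j in (i - 1, i):
            if 0 <= j < len(freqs) and abs(freqs[j] - p) <= tol_mhz:
                if best is None or abs(freqs[j] - p) < abs(freqs[best] - p):
                    best = j
        out.append((p, catalog[best][1] if best is not None else None))
    return out
```

In a workflow like SPECdata's, the unassigned (None) features are the candidates for new chemical species that get handed over to the user for manual review.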
QoS-aware health monitoring system using cloud-based WBANs.
Almashaqbeh, Ghada; Hayajneh, Thaier; Vasilakos, Athanasios V; Mohd, Bassam J
2014-10-01
Wireless Body Area Networks (WBANs) are among the best options for remote health monitoring. However, as standalone systems, WBANs have many limitations due to the large amount of processed data, the mobility of monitored users, and the network coverage area. Integrating WBANs with cloud computing provides effective solutions to these problems and improves the performance of WBAN-based systems. Accordingly, in this paper we propose a cloud-based real-time remote health monitoring system for tracking the health status of non-hospitalized patients as they practice their daily activities. In contrast with existing cloud-based WBAN frameworks, we divide the cloud into a local cloud, which includes the monitored users and local medical staff, and a global cloud that includes the outside world. The performance of the proposed framework is optimized by reducing congestion, interference, and data delivery delay while supporting user mobility. Several novel techniques and algorithms are proposed to accomplish our objective. First, the concept of data classification and aggregation is utilized to avoid clogging the network with unnecessary data traffic. Second, a dynamic channel assignment policy is developed to distribute the WBANs associated with the users over the available frequency channels to manage interference. Third, a delay-aware routing metric is proposed for use by the local cloud in its multi-hop communication to speed up the reporting of health-related data. Fourth, the delay-aware metric is further utilized by the association protocols used by the WBANs to connect with the local cloud. Finally, the system with all the proposed techniques and algorithms is evaluated using extensive ns-2 simulations. The simulation results show the superior performance of the proposed architecture in optimizing end-to-end delay, handling increased interference levels, maximizing network capacity, and tracking user mobility.
Optimal processor assignment for pipeline computations
NASA Technical Reports Server (NTRS)
Nicol, David M.; Simha, Rahul; Choudhury, Alok N.; Narahari, Bhagirath
1991-01-01
The availability of large-scale multitasked parallel architectures introduces the following processor assignment problem for pipelined computations. Given a set of tasks and their precedence constraints, along with their experimentally determined individual response times for different processor sizes, find an assignment of processors to tasks. Two objectives are of interest: minimal response time given a throughput requirement, and maximal throughput given a response time requirement. These assignment problems differ considerably from the classical mapping problem, in which several tasks share a processor; here it is assumed that a large number of processors are to be assigned to a relatively small number of tasks. Efficient assignment algorithms were developed for different classes of task structures. For a p-processor system and a series-parallel precedence graph with n constituent tasks, an O(np^2) algorithm is provided that finds the optimal assignment for the response time optimization problem; an O(np^2 log p) algorithm finds the assignment optimizing throughput under a response time constraint. Special cases of linear, independent, and tree graphs are also considered.
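The paper's algorithms handle series-parallel precedence graphs; as a much simpler hedged sketch of the same trade-off, the greedy routine below allocates processors to independent pipeline stages so as to shrink the slowest stage, which maximizes pipeline throughput. The data layout and the greedy rule are illustrative and do not reproduce the O(np^2) dynamic program.

```python
def assign_processors(times, p):
    """Greedy allocation of p processors to a linear pipeline of stages.

    times[i][k-1] = measured response time of stage i on k processors
    (assumed non-increasing in k). Repeatedly gives one more processor
    to the current bottleneck stage. Returns the per-stage allocation.
    """
    n = len(times)
    assert p >= n, "need at least one processor per stage"
    alloc = [1] * n
    for _ in range(p - n):
        # Find the bottleneck: the stage with the largest current time.
        i = max(range(n), key=lambda j: times[j][alloc[j] - 1])
        if alloc[i] == len(times[i]):
            break  # bottleneck is saturated; extra processors cannot help
        alloc[i] += 1
    return alloc
```

For the pipeline case, throughput is the reciprocal of the slowest stage time, which is why shrinking the bottleneck (rather than the sum of stage times) is the right greedy target here.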
Evaluation of a Modified User Guide for Hearing Aid Management.
Caposecco, Andrea; Hickson, Louise; Meyer, Carly; Khan, Asaduzzaman
2016-01-01
This study investigated whether a hearing aid user guide modified using best practice principles for health literacy resulted in superior ability to perform hearing aid management tasks, compared with the user guide in its original form. This research utilized a two-arm study design to compare the original manufacturer's user guide with a modified user guide for the same hearing aid--an Oticon Acto behind-the-ear aid with an open dome. The modified user guide had a lower reading grade level (4.2 versus 10.5), used a larger font size, included more graphics, and had less technical information. Eighty-nine adults aged 55 years and over were included in the study; none had experience with hearing aid use or management. Participants were randomly assigned either the modified guide (n = 47) or the original guide (n = 42). All participants were administered the Hearing Aid Management test, designed for this study, which assessed their ability to perform seven management tasks (e.g., change battery) with their assigned user guide. The regression analysis indicated that the type of user guide was significantly associated with performance on the Hearing Aid Management test, adjusting for 11 potential covariates. In addition, participants assigned the modified guide required significantly fewer prompts to perform tasks and were significantly more likely to perform four of the seven tasks without the need for prompts. The median time taken by those assigned the modified guide was also significantly shorter for three of the tasks. Other variables associated with performance on the Hearing Aid Management test were health literacy level, finger dexterity, and age. Findings indicate the need to design hearing aid user guides in line with best practice principles of health literacy as a means of facilitating improved hearing aid management in older adults.
Knob manager (KM) operators guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1993-10-08
KM, the Knob Manager, is a tool that enables the user to use the SUNDIALS knob box to adjust settings of the control system. Some features of KM include: dynamic knob assignment with a user-friendly interface; user-defined gain for each knob; graphical displays of the operating range and status of each assigned process variable; backup and restore of one or multiple process variables; and saving the current settings to a file so they can be recalled later.
Adding and Removing Web Area Users, and Changing User Roles
Webmasters can add users to a web area, and assign or change roles, which define the actions a user is able to take in the web area. Non-webmasters must use a request form to add users and change roles.
A Two-Phase Model for Trade Matching and Price Setting in Double Auction Water Markets
NASA Astrophysics Data System (ADS)
Xu, Tingting; Zheng, Hang; Zhao, Jianshi; Liu, Yicheng; Tang, Pingzhong; Yang, Y. C. Ethan; Wang, Zhongjing
2018-04-01
Delivery in water markets is generally operated by agencies through channel systems, which imposes physical and institutional market constraints. Many water markets allow water users to post selling and buying requests on a board. However, water users may not be able to choose efficiently when the information (including the constraints) becomes complex. This study proposes an innovative two-phase model to address this problem based on practical experience in China. The first phase seeks and determines the optimal assignment that maximizes the incremental improvement of the system's social welfare according to the bids and asks in the water market. The second phase sets appropriate prices under constraints. Applying this model to China's Xiying Irrigation District shows that it can improve social welfare more than the current "pool exchange" method can. Within the second phase, we evaluate three objective functions (minimum variance, threshold-based balance, and two-sided balance), which represent different managerial goals. The threshold-based balance function should be preferred by most users, while the two-sided balance should be preferred by players who post extreme prices.
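The first phase's welfare-maximizing matching can be illustrated, in a drastically simplified single-unit setting that ignores the channel and institutional constraints of the real model, by pairing the highest bids with the lowest asks while each trade still yields a surplus. The trader names and prices below are made up.

```python
def match_trades(bids, asks):
    """Phase-1 sketch: welfare-maximizing matching in a simple double auction.

    bids: list of (buyer, price) each buyer is willing to pay per unit.
    asks: list of (seller, price) each seller is willing to accept per unit.
    Returns the matched (buyer, seller) pairs and the total welfare gain.
    """
    bids = sorted(bids, key=lambda b: -b[1])   # most eager buyers first
    asks = sorted(asks, key=lambda a: a[1])    # cheapest sellers first
    pairs, welfare = [], 0.0
    for (buyer, b), (seller, a) in zip(bids, asks):
        if b < a:
            break  # no further mutually beneficial trade exists
        pairs.append((buyer, seller))
        welfare += b - a
    return pairs, welfare
```

The second phase (price setting) then distributes each matched pair's surplus `b - a` between buyer and seller according to the chosen objective, e.g. the threshold-based balance function the study recommends.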
Global Optimization of Emergency Evacuation Assignments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, Lee; Yuan, Fang; Chin, Shih-Miao
2006-01-01
Conventional emergency evacuation plans often assign evacuees to fixed routes or destinations based mainly on geographic proximity. Such approaches can be inefficient if the roads are congested, blocked, or otherwise dangerous because of the emergency. By not constraining evacuees to prespecified destinations, a one-destination evacuation approach provides flexibility in the optimization process. We present a framework for the simultaneous optimization of evacuation-traffic distribution and assignment. Based on the one-destination evacuation concept, we can obtain the optimal destination and route assignment by solving a one-destination traffic-assignment problem on a modified network representation. In a county-wide, large-scale evacuation case study, the one-destination model yields substantial improvement over the conventional approach, with the overall evacuation time reduced by more than 60 percent. More importantly, emergency planners can easily implement this framework by instructing evacuees to go to destinations that the one-destination optimization process selects.
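One standard way to realize a "modified network representation" for one-destination assignment is to add a virtual super-destination fed by zero-cost links from every real exit, so that a single reverse shortest-path search assigns each location its best exit. The sketch below assumes this construction (the paper's exact representation may differ) and ignores congestion effects.

```python
import heapq

def one_destination_routes(graph, exits):
    """Assign every node its best exit via a virtual super-destination.

    graph: {u: [(v, travel_time), ...]} directed road network.
    exits: iterable of exit nodes.
    Returns {node: (cost_to_best_exit, next_hop_toward_that_exit)}.
    """
    # Reverse the graph so one Dijkstra run from the super-sink computes
    # shortest paths from every node to its nearest exit.
    rev = {}
    for u, edges in graph.items():
        for v, w in edges:
            rev.setdefault(v, []).append((u, w))
    SINK = object()
    rev.setdefault(SINK, [])
    for e in exits:
        rev[SINK].append((e, 0.0))        # zero-cost virtual links
    dist, nxt = {SINK: 0.0}, {}
    pq = [(0.0, id(SINK), SINK)]          # id() breaks ties between nodes
    while pq:
        d, _, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in rev.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                nxt[v] = u                # next hop toward the chosen exit
                heapq.heappush(pq, (nd, id(v), v))
    return {u: (dist[u], nxt[u]) for u in graph if u in dist and u not in exits}
```

This is the flexibility the abstract describes: no node is bound to a prespecified exit; each simply follows its computed next hop, and the optimization picks the exit.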
Burton, Wayne N; Chen, Chin-Yu; Schultz, Alyssa B; Edington, Dee W
2010-02-01
Statin medications are recommended for patients who have not achieved low-density lipoprotein cholesterol (LDL-C) goals through lifestyle modifications. The objective of this retrospective observational study was to examine statin medication usage patterns and the relationship with LDL-C goal levels (according to Adult Treatment Panel III guidelines) among a cohort of employees of a major financial services corporation. From 1995 to 2004, a total of 1607 executives participated in a periodic health examination program. An index date was assigned for each study participant (date of their exam) and statin medication usage was determined from the pharmacy claims database for 365 days before the index date. Patients were identified as adherent to statins if the medication possession ratio was > or =80%. In all, 150 (9.3%) executives filled at least 1 statin prescription in the 365 days prior to their exam. A total of 102 statin users (68%) were adherent to statin medication. Among all executives who received statin treatment, 70% (odds ratio [OR] = 2.30, 95% confidence interval [CI] = 1.82, 2.90) achieved near-optimal (<130 mg/dL) and 30% (OR = 1.78, 95% CI = 1.15, 2.76) achieved optimal (<100 mg/dL) LDL-C goals, which is significantly higher than the rates among statin nonusers (55% and 21%). Adherent statin users were more likely to achieve recommended near-optimal LDL-C goals compared to statin nonusers (overall P = 0.002; adherent: OR = 2.75, 95% CI = 1.662, 4.550), while nonadherent statin users were more likely to achieve the optimal goal compared to statin nonusers (OR = 2.223; CI = 1.145, 4.313). Statin usage was associated with improvements in LDL-C goal attainment among executives who participated in a periodic health examination. Appropriate statin medication adherence should be encouraged in working populations in order to achieve LDL-C goals.
NASA Technical Reports Server (NTRS)
Bishop, Matt
1990-01-01
Password selection has long been a difficult issue; traditionally, passwords are either assigned by the computer or chosen by the user. When the computer does the assignment, the passwords are often hard to remember; when the user makes the selection, the passwords are often easy to guess. This paper describes a technique, and a mechanism, to allow users to select passwords which to them are easy to remember but to others would be very difficult to guess. The technique is site, user, and group compatible, and allows rapid changing of constraints imposed upon the password. Although experience with this technique is limited, it appears to have much promise.
Method for routing events from key strokes in a multi-processing computer systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rhodes, D.A.; Rustici, E.; Carter, K.H.
1990-01-23
The patent describes a method of routing user input in a computer system that concurrently runs a plurality of processes. It comprises: generating keycodes representative of keys typed by a user; distinguishing generated keycodes by looking up each keycode in a routing table that assigns each possible keycode to an individual process, one of which is a supervisory process; sending each keycode to its assigned process until a keycode assigned to the supervisory process is received; sending keycodes received subsequent to the keycode assigned to the supervisory process to a buffer; providing additional keycodes to the supervisory process from the buffer until the supervisory process has completed operation; and sending keycodes stored in the buffer to their assigned processes after the supervisory process has completed operation.
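The claimed routing steps can be paraphrased in code roughly as follows. The process names and the fixed count standing in for "until the supervisory process has completed operation" are illustrative assumptions, not details from the patent.

```python
from collections import deque

def route_keycodes(keycodes, table, supervisor, supervisor_needs):
    """Sketch of the patented routing method.

    table maps each keycode to its assigned process. When a keycode
    assigned to the supervisory process arrives, subsequent keycodes are
    buffered; the supervisor consumes `supervisor_needs` of them (a
    stand-in for running until completion), and the rest are then
    routed normally. Returns the (process, keycode) delivery log.
    """
    delivered = []
    stream = deque(keycodes)
    while stream:
        kc = stream.popleft()
        if table[kc] != supervisor:
            delivered.append((table[kc], kc))   # normal table-driven routing
            continue
        delivered.append((supervisor, kc))      # supervisor triggered
        buf = deque(stream)                     # buffer everything typed after
        stream.clear()
        for _ in range(supervisor_needs):
            if buf:
                delivered.append((supervisor, buf.popleft()))
        stream.extend(buf)                      # replay remainder via the table
    return delivered
```

The buffering step is the crux of the claim: keystrokes typed while the supervisor runs are neither lost nor misrouted, but held and replayed once it finishes.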
Quid pro quo: a mechanism for fair collaboration in networked systems.
Santos, Agustín; Fernández Anta, Antonio; López Fernández, Luis
2013-01-01
Collaboration may be understood as the execution of coordinated tasks (in the most general sense) by groups of users who cooperate to achieve a common goal. Collaboration is a fundamental assumption and requirement for the correct operation of many communication systems. The main challenge when creating collaborative systems in a decentralized manner is dealing with the fact that users may behave in selfish ways, trying to obtain the benefits of the tasks without participating in their execution. In this context, Game Theory has been instrumental in modeling collaborative systems and the task allocation problem, and in designing mechanisms for optimal allocation of tasks. In this paper, we revise the classical assumptions of these models and propose a new approach to this problem. First, we establish a system model based on heterogeneous nodes (users, players), and propose a basic distributed mechanism so that, when a new task appears, it is assigned to the most suitable node. The classical technique for compensating a node that executes a task is the use of payments (which in most networks are hard or impossible to implement). Instead, we propose a distributed mechanism for the optimal allocation of tasks without payments. We prove this mechanism to be robust even in the presence of independent selfish or rationally limited players. Additionally, our model is based on very weak assumptions, which makes the proposed mechanisms amenable to implementation in networked systems (e.g., the Internet).
Minimal-delay traffic grooming for WDM star networks
NASA Astrophysics Data System (ADS)
Choi, Hongsik; Garg, Nikhil; Choi, Hyeong-Ah
2003-10-01
All-optical networks face the challenge of reducing slower opto-electronic conversions by managing the assignment of traffic streams to wavelengths in an intelligent manner, while at the same time utilizing bandwidth resources to the maximum. This challenge becomes harder in networks closer to the end users, which have insufficient data to saturate single wavelengths as well as traffic streams outnumbering the usable wavelengths, so that traffic grooming requires costly traffic analysis at access nodes. We study the problem of traffic grooming that reduces the need to analyze traffic for the class of network architecture most used by Metropolitan Area Networks: the star network. Since the problem is NP-complete, we provide an efficient greedy heuristic with a factor-of-two optimality bound that can be used to intelligently groom traffic at the LANs and reduce latency at the access nodes. Simulation results show that our greedy heuristic achieves a near-optimal solution.
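A plausible shape for such a grooming heuristic, though not necessarily the paper's, is best-fit-decreasing packing of traffic streams onto wavelengths of fixed capacity; the stream names and bandwidth units below are invented.

```python
def groom_streams(demands, capacity, n_wavelengths):
    """Best-fit-decreasing sketch of grooming streams onto wavelengths.

    demands: {stream_id: bandwidth}. Each wavelength has the same fixed
    capacity. Returns {wavelength_index: [stream ids]} or None if the
    demands do not all fit.
    """
    load = [0.0] * n_wavelengths
    assign = {i: [] for i in range(n_wavelengths)}
    for sid, bw in sorted(demands.items(), key=lambda kv: -kv[1]):
        fits = [i for i in range(n_wavelengths) if load[i] + bw <= capacity]
        if not fits:
            return None
        # Best fit: the wavelength with the least remaining capacity that
        # still accommodates the stream, keeping other wavelengths freer.
        i = min(fits, key=lambda j: capacity - load[j])
        load[i] += bw
        assign[i].append(sid)
    return assign
```

Packing sub-wavelength streams tightly in this way is what lets traffic bypass opto-electronic conversion at intermediate nodes, which is the latency saving the abstract targets.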
Automated sequence-specific protein NMR assignment using the memetic algorithm MATCH.
Volk, Jochen; Herrmann, Torsten; Wüthrich, Kurt
2008-07-01
MATCH (Memetic Algorithm and Combinatorial Optimization Heuristics) is a new memetic algorithm for automated sequence-specific polypeptide backbone NMR assignment of proteins. MATCH employs local optimization for tracing partial sequence-specific assignments within a global, population-based search environment, where the simultaneous application of local and global optimization heuristics guarantees high efficiency and robustness. MATCH thus makes combined use of the two predominant concepts in use for automated NMR assignment of proteins. Dynamic transition and inherent mutation are new techniques that enable automatic adaptation to variable quality of the experimental input data. The concept of dynamic transition is incorporated in all major building blocks of the algorithm, where it enables switching between local and global optimization heuristics at any time during the assignment process. Inherent mutation restricts the intrinsically required randomness of the evolutionary algorithm to those regions of the conformation space that are compatible with the experimental input data. Using intact and artificially deteriorated APSY-NMR input data of proteins, MATCH performed sequence-specific resonance assignment with high efficiency and robustness.
Dong, Yuwen; Deshpande, Sunil; Rivera, Daniel E; Downs, Danielle S; Savage, Jennifer S
2014-06-01
Control engineering offers a systematic and efficient method to optimize the effectiveness of individually tailored treatment and prevention policies known as adaptive or "just-in-time" behavioral interventions. The nature of these interventions requires assigning dosages at categorical levels, which has been addressed in prior work using Mixed Logical Dynamical (MLD)-based hybrid model predictive control (HMPC) schemes. However, certain requirements of adaptive behavioral interventions that involve sequential decision making have not been comprehensively explored in the literature. This paper presents an extension of the traditional MLD framework for HMPC by representing the requirements of sequential decision policies as mixed-integer linear constraints. This is accomplished with user-specified dosage sequence tables, manipulation of one input at a time, and a switching time strategy for assigning dosages at time intervals less frequent than the measurement sampling interval. A model developed for a gestational weight gain (GWG) intervention is used to illustrate the generation of these sequential decision policies and their effectiveness for implementing adaptive behavioral interventions involving multiple components.
Interactive multi-objective path planning through a palette-based user interface
NASA Astrophysics Data System (ADS)
Shaikh, Meher T.; Goodrich, Michael A.; Yi, Daqing; Hoehne, Joseph
2016-05-01
In a problem where a human uses supervisory control to manage robot path planning, there are times when the human does the path planning and, if satisfied, commits those paths to be executed by the robot. In planning a path, the robot often uses an optimization algorithm that maximizes or minimizes an objective. When a human is assigned the task of path planning for a robot, the human may care about multiple objectives. This work proposes a graphical user interface (GUI) designed for interactive robot path planning when an operator may prefer one objective over others or care about how multiple objectives are traded off. The GUI represents multiple objectives using the metaphor of an artist's palette. A distinct color represents each objective, and tradeoffs among objectives are balanced in the way an artist mixes colors to obtain a desired shade; human intent is thus analogous to the artist's shade of color. We call the GUI an "Adverb Palette", where "Adverb" denotes a specific type of objective for the path, such as the adverbs "quickly" and "safely" in the commands "travel the path quickly" and "make the journey safely". The interactive interface lets the user evaluate alternatives that trade off between different objectives by visualizing the instantaneous outcomes of her actions on the interface. In addition to assisting analysis of the solutions given by an optimization algorithm, the palette has the additional feature of allowing the user to define and visualize her own paths by means of waypoints (guiding locations), broadening the variety of plans considered. The goal of the Adverb Palette is thus to provide a way for the user and robot to find an acceptable solution even though they use very different representations of the problem.
Subjective evaluations suggest that even non-experts in robotics can carry out the planning tasks with a great deal of flexibility using the adverb palette.
Glocker, Ben; Paragios, Nikos; Komodakis, Nikos; Tziritas, Georgios; Navab, Nassir
2007-01-01
In this paper we propose a novel non-rigid volume registration based on discrete labeling and linear programming. The proposed framework reformulates registration as a minimal path extraction in a weighted graph. The space of solutions is represented using a set of labels which are assigned to predefined displacements. The graph topology corresponds to a regular grid superimposed onto the volume. Links between neighborhood control points introduce smoothness, while links between the graph nodes and the labels (end-nodes) measure the cost induced to the objective function through the selection of a particular deformation for a given control point once projected to the entire volume domain. Higher-order polynomials are used to express the volume deformation from the deformations of the control points. Efficient linear programming that can guarantee the optimal solution up to a (user-defined) bound is used to recover the optimal registration parameters. The method is therefore gradient-free, can encode various similarity metrics (through simple changes to the graph construction), can guarantee a globally sub-optimal solution, and is computationally tractable. Experimental validation using simulated data with known deformation, as well as manually segmented data, demonstrates the strong potential of our approach.
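The discrete-labeling idea, each control point choosing one displacement label so that data cost plus pairwise smoothness is minimized, can be illustrated on a 1-D chain of control points, where exact dynamic programming suffices (the paper's 3-D grid needs the linear-programming machinery described above; all inputs below are toy values):

```python
def label_chain(data_cost, smooth_cost):
    """Exact DP for discrete displacement labeling on a 1-D chain.

    data_cost[i][l] is the data term for giving control point i label l;
    smooth_cost(a, b) is the pairwise smoothness between neighbour labels.
    Returns (total_cost, list_of_chosen_labels).
    """
    n, L = len(data_cost), len(data_cost[0])
    best = [data_cost[0][:]]   # best[i][l]: cheapest cost ending at (i, l)
    back = []                  # back-pointers for recovering the labeling
    for i in range(1, n):
        row, bp = [], []
        for l in range(L):
            cands = [best[-1][k] + smooth_cost(k, l) for k in range(L)]
            k = min(range(L), key=lambda j: cands[j])
            row.append(cands[k] + data_cost[i][l])
            bp.append(k)
        best.append(row)
        back.append(bp)
    l = min(range(L), key=lambda j: best[-1][j])
    labels = [l]
    for bp in reversed(back):
        l = bp[l]
        labels.append(l)
    return min(best[-1]), labels[::-1]
```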
AUTOBA: automation of backbone assignment from HN(C)N suite of experiments.
Borkar, Aditi; Kumar, Dinesh; Hosur, Ramakrishna V
2011-07-01
Development of efficient strategies and automation represent important milestones of progress in rapid structure determination efforts in proteomics research. In this context, we present here an efficient algorithm named AUTOBA (Automatic Backbone Assignment), designed to automate the assignment protocol based on the HN(C)N suite of experiments. Depending upon the spectral dispersion, the user can record 2D or 3D versions of the experiments for assignment. The algorithm uses as inputs: (i) the protein primary sequence and (ii) peak-lists from user-defined HN(C)N suite of experiments. In the end, one gets H(N), (15)N, C(α) and C' assignments (in common BMRB format) for the individual residues along the polypeptide chain. The success of the algorithm has been demonstrated, not only with experimental spectra recorded on two small globular proteins: ubiquitin (76 aa) and M-crystallin (85 aa), but also with simulated spectra of 27 other proteins using assignment data from the BMRB.
Absolute Points for Multiple Assignment Problems
ERIC Educational Resources Information Center
Adlakha, V.; Kowalski, K.
2006-01-01
An algorithm is presented to solve multiple assignment problems in which a cost is incurred only when an assignment is made at a given cell. The proposed method recursively searches for single/group absolute points to identify cells that must be loaded in any optimal solution. Unlike other methods, the first solution is the optimal solution. The…
NASA Astrophysics Data System (ADS)
Colantonio, Alessandro; di Pietro, Roberto; Ocello, Alberto; Verde, Nino Vincenzo
In this paper we address the problem of generating a candidate role-set for an RBAC configuration that enjoys two key features: it minimizes the administration cost, and it is a stable candidate role-set. To achieve these goals, we implement a three-step methodology: first, we associate a weight with each role; second, we identify and remove the user-permission assignments that cannot belong to any role whose weight exceeds a given threshold; third, we restrict the problem of finding a candidate role-set for the given system configuration to the user-permission assignments that were not removed in the second step, that is, those that belong to roles with a weight exceeding the given threshold. We formally show, with proofs rooted in graph theory, that this methodology achieves the intended goals. Finally, we discuss practical applications of our approach to the role mining problem.
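A minimal sketch of the second step under an assumed weight function (|users| x |permissions| here; the paper leaves the weight assignment configurable, so this choice is purely illustrative):

```python
def prune_assignments(roles, threshold):
    """Keep only user-permission pairs covered by a sufficiently heavy role.

    `roles` maps role name -> (set_of_users, set_of_permissions).  A role's
    weight is taken here as |users| * |permissions| (an illustrative choice).
    Returns the user-permission pairs covered by at least one role whose
    weight meets the threshold; the rest would be removed before role
    mining proper.
    """
    kept = set()
    for users, perms in roles.values():
        if len(users) * len(perms) >= threshold:
            kept |= {(u, p) for u in users for p in perms}
    return kept
```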
Harshberger, Cara A.; Harper, Abigail J.; Carro, George W.; Spath, Wayne E.; Hui, Wendy C.; Lawton, Jessica M.; Brockstein, Bruce E.
2011-01-01
Purpose: Computerized physician order entry (CPOE) in electronic health records (EHR) has been recognized as an important tool in optimal health care provision that can reduce errors and improve safety. The objective of this study is to describe documentation completeness and user satisfaction of medical charts before and after implementation of an outpatient oncology EHR/CPOE system in a hospital-based outpatient cancer center within three treatment sites. Methods: This study is a retrospective chart review of 90 patients who received one of the following regimens between 1999 and 2006: FOLFOX, AC, carboplatin + paclitaxel, ABVD, cisplatin + etoposide, R-CHOP, and clinical trials. Documentation completeness scores were assigned to each chart based on the number of documented data points found out of the total data points assessed. EHR/CPOE documentation completeness was compared with the completeness of paper chart orders for the same regimens. A user satisfaction survey of the paper chart and EHR/CPOE system was conducted among the physicians, nurses, and pharmacists who worked with both systems. Results: The mean percentage of identified data points successfully found in the EHR/CPOE charts was 93% versus 67% in the paper charts (P < .001). Regimen complexity did not alter the number of data points found. The survey response rate was 64%, and the results showed that satisfaction was statistically significantly in favor of the EHR/CPOE system. Conclusion: Using EHR/CPOE systems improves the completeness of medical record and chemotherapy order documentation and improves user satisfaction with the medical record system. EHR/CPOE requires constant vigilance and maintenance to optimize patient safety. PMID:22043187
Quantum Optimal Multiple Assignment Scheme for Realizing General Access Structure of Secret Sharing
NASA Astrophysics Data System (ADS)
Matsumoto, Ryutaroh
The multiple assignment scheme assigns one or more shares to a single participant so that any kind of access structure can be realized by classical secret sharing schemes. We propose its quantum version, including ramp secret sharing schemes, and then propose an integer optimization approach to minimize the average share size.
Davids, Mogamat Razeen; Chikte, Usuf M E; Halperin, Mitchell L
2013-09-01
Optimizing the usability of e-learning materials is necessary to maximize their potential educational impact, but this is often neglected when time and other resources are limited, leading to the release of materials that cannot deliver the desired learning outcomes. As clinician-teachers in a resource-constrained environment, we investigated whether heuristic evaluation of our multimedia e-learning resource by a panel of experts would be an effective and efficient alternative to testing with end users. We engaged six inspectors, whose expertise included usability, e-learning, instructional design, medical informatics, and the content area of nephrology. They applied a set of commonly used heuristics to identify usability problems, assigning severity scores to each problem. The identification of serious problems was compared with problems previously found by user testing. The panel completed their evaluations within 1 wk and identified a total of 22 distinct usability problems, 11 of which were considered serious. The problems violated the heuristics of visibility of system status, user control and freedom, match with the real world, intuitive visual layout, consistency and conformity to standards, aesthetic and minimalist design, error prevention and tolerance, and help and documentation. Compared with user testing, heuristic evaluation found most, but not all, of the serious problems. Combining heuristic evaluation and user testing, with each involving a small number of participants, may be an effective and efficient way of improving the usability of e-learning materials. Heuristic evaluation should ideally be used first to identify the most obvious problems and, once these are fixed, should be followed by testing with typical end users.
MAGIC Computer Simulation. Volume 1: User Manual
1970-07-01
…vulnerability and MAGIC programs. A three-digit code is assigned to each component of the target, such as armor or gun tube, and a two-digit code is assigned to… A review of the subject MAGIC Computer Simulation User and Analyst Manuals has been conducted based upon a request received from the US Army…
DNATCO: assignment of DNA conformers at dnatco.org.
Černý, Jiří; Božíková, Paulína; Schneider, Bohdan
2016-07-08
The web service DNATCO (dnatco.org) classifies local conformations of DNA molecules beyond their traditional sorting into A, B and Z DNA forms. DNATCO provides an interface to robust algorithms assigning conformational classes, called NtC, to dinucleotides extracted from DNA-containing structures uploaded in PDB format version 3.1 or above. The assigned dinucleotide NtC classes are further grouped into the DNA structural alphabet NtA, to the best of our knowledge the first DNA structural alphabet. The results are presented at two levels: as a user-friendly visualization and analysis of the assignment, and as a downloadable, more detailed table for further analysis offline. The website is free and open to all users and there is no login requirement. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Randomized Controlled Ethanol Cookstove Intervention and Blood Pressure in Pregnant Nigerian Women.
Alexander, Donee; Northcross, Amanda; Wilson, Nathaniel; Dutta, Anindita; Pandya, Rishi; Ibigbami, Tope; Adu, Damilola; Olamijulo, John; Morhason-Bello, Oludare; Karrison, Theodore; Ojengbede, Oladosu; Olopade, Christopher O
2017-06-15
Hypertension during pregnancy is a leading cause of maternal mortality. Exposure to household air pollution elevates blood pressure (BP). To investigate the ability of a clean cookstove intervention to lower BP during pregnancy. We conducted a randomized controlled trial in Nigeria. Pregnant women cooking with kerosene or firewood were randomly assigned to an ethanol arm (n = 162) or a control arm (n = 162). BP measurements were taken during six antenatal visits. In the primary analysis, we compared ethanol users with control subjects. In subgroup analyses, we compared baseline kerosene users assigned to the intervention with kerosene control subjects and compared baseline firewood users assigned to ethanol with firewood control subjects. The change in diastolic blood pressure (DBP) over time was significantly different between ethanol users and control subjects (P = 0.040); systolic blood pressure (SBP) did not differ (P = 0.86). In subgroup analyses, there was no significant intervention effect for SBP; a significant difference for DBP (P = 0.031) existed among preintervention kerosene users. At the last visit, mean DBP was 2.8 mm Hg higher in control subjects than in ethanol users (3.6 mm Hg greater in control subjects than in ethanol users among preintervention kerosene users), and 6.4% of control subjects were hypertensive (SBP ≥140 and/or DBP ≥90 mm Hg) versus 1.9% of ethanol users (P = 0.051). Among preintervention kerosene users, 8.8% of control subjects were hypertensive compared with 1.8% of ethanol users (P = 0.029). To our knowledge, this is the first cookstove randomized controlled trial examining prenatal BP. Ethanol cookstoves have potential to reduce DBP and hypertension during pregnancy. Accordingly, clean cooking fuels may reduce adverse health impacts associated with household air pollution. Clinical trial registered with www.clinicaltrials.gov (NCT02394574).
Harmonisation of seven common enzyme results through EQA.
Weykamp, Cas; Franck, Paul; Gunnewiek, Jacqueline Klein; de Jonge, Robert; Kuypers, Aldy; van Loon, Douwe; Steigstra, Herman; Cobbaert, Christa
2014-11-01
Equivalent results between different laboratories enable optimal patient care and can be achieved with harmonisation. We report on EQA-initiated national harmonisation of seven enzymes using commutable samples. EQA samples were prepared from human serum spiked with human recombinant enzymes. Target values were assigned with the IFCC Reference Measurement Procedures. The same samples were included on four occasions in the EQA programmes of 2012 and 2013. Laboratories were encouraged to report IFCC traceable results. A parallel study was done to confirm commutability of the samples. Of the 223 participating laboratories, 95% reported IFCC traceable results, ranging from 98% (ASAT) to 87% (amylase). Users of Roche and Siemens (97%) more frequently reported IFCC traceable results than users of Abbott (91%), Beckman (90%), and Olympus (87%). The success of harmonisation, expressed as the recovery of assigned values and the inter-laboratory CV, was: ALAT (recovery 100%; inter-lab CV 4%), ASAT (102%; 4%), LD (98%; 3%), CK (101%; 5%), GGT (98%; 4%), AP (96%; 6%), amylase (99%; 4%). There were no significant differences between the manufacturers. Commutability was demonstrated in the parallel study. Equal results in the same sample in the 2012 and 2013 EQA programmes demonstrated stability of the samples. The EQA-initiated national harmonisation of seven enzymes, using stable, commutable human serum samples, spiked with human recombinant enzymes, and targeted with the IFCC Reference Measurement Procedures, was successful in terms of implementation of IFCC traceable results (95%), recovery of the target (99%), and inter-laboratory CV (4%).
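The two summary statistics used above, recovery of the assigned (IFCC-targeted) value and the inter-laboratory coefficient of variation, can be computed directly:

```python
from statistics import mean, stdev

def recovery_and_cv(results, assigned_value):
    """Mean recovery of the assigned value (%) and inter-laboratory CV (%)
    for a list of per-laboratory results on one commutable sample."""
    recovery = 100.0 * mean(results) / assigned_value
    cv = 100.0 * stdev(results) / mean(results)
    return recovery, cv
```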
a Context-Aware Tourism Recommender System Based on a Spreading Activation Method
NASA Astrophysics Data System (ADS)
Bahramian, Z.; Abbaspour, R. Ali; Claramunt, C.
2017-09-01
Users planning a trip to a given destination often search for the most appropriate points of interest, a non-straightforward task because the range of information available is very large and not well structured. The research presented in this paper introduces a context-aware tourism recommender system that overcomes the information overload problem by providing personalized recommendations based on the user's preferences. It also incorporates contextual information to improve the recommendation process. As previous context-aware tourism recommender systems suffer from a lack of formal definitions for representing contextual information and user preferences, the proposed system is enhanced with an ontology approach. We also apply a spreading activation technique to contextualize user preferences and learn the user profile dynamically according to the user's feedback. The proposed method gives greater effect in the spreading process to nodes whose preference values are assigned directly by the user. The results demonstrate the overall performance of the proposed context-aware tourism recommender system through an experimental application to the city of Tehran.
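A minimal spreading-activation sketch consistent with the description above; the graph shape, decay factor, iteration count, and the boost given to directly rated nodes are illustrative choices, not the paper's parameters:

```python
def spread_activation(graph, seeds, decay=0.5, iterations=3, boost=2.0):
    """Spread preference values over an ontology-like concept graph.

    `graph` maps node -> list of neighbours; `seeds` maps node -> initial
    preference value.  Nodes rated directly by the user (the seeds) get
    extra influence via `boost`, echoing the rule that directly assigned
    preferences have more effect in the spreading process.
    """
    activation = {n: 0.0 for n in graph}
    for n, v in seeds.items():
        activation[n] = v * boost
    for _ in range(iterations):
        nxt = dict(activation)
        for node, neighbours in graph.items():
            share = decay * activation[node] / max(len(neighbours), 1)
            for nb in neighbours:
                nxt[nb] += share
        activation = nxt
    return activation
```

Ranking candidate points of interest by their final activation then yields the contextualized recommendation list.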
Rapid-Prototyping of Application Specific Signal Processors (RASSP) Education and Facilitation
2000-12-01
Digest 4. User/passwd/deskey authentication required: single-point RASSP contractor release; phone-call authentication required; user/passwd/deskey assigned over phone. 5. WWW user/passwd used to access release for component datasheets: a single file for each model.
ERIC Educational Resources Information Center
Vos, Hans J.
An approach to simultaneous optimization of assignments of subjects to treatments followed by an end-of-mastery test is presented using the framework of Bayesian decision theory. Focus is on demonstrating how rules for the simultaneous optimization of sequences of decisions can be found. The main advantages of the simultaneous approach, compared…
Improved configuration control for redundant robots
NASA Technical Reports Server (NTRS)
Seraji, H.; Colbaugh, R.
1990-01-01
This article presents a singularity-robust task-prioritized reformulation of the configuration control scheme for redundant robot manipulators. This reformulation suppresses large joint velocities near singularities, at the expense of small task trajectory errors. This is achieved by optimally reducing the joint velocities to induce minimal errors in the task performance by modifying the task trajectories. Furthermore, the same framework provides a means for assignment of priorities between the basic task of end-effector motion and the user-defined additional task for utilizing redundancy. This allows automatic relaxation of the additional task constraints in favor of the desired end-effector motion, when both cannot be achieved exactly. The improved configuration control scheme is illustrated for a variety of additional tasks, and extensive simulation results are presented.
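The article does not spell out its damping law here, but singularity-robust schemes of this kind are commonly illustrated with damped least squares; the sketch below assumes that formulation for a 2-joint planar arm and omits the task-prioritized structure:

```python
def damped_joint_velocities(J, dx, damping=0.1):
    """Damped least-squares velocity sketch for a 2-joint planar arm:
    dq = J^T (J J^T + lambda^2 I)^-1 dx.  Near a singularity the damping
    term bounds the joint velocities at the cost of a small task-space
    error, the trade-off described above."""
    (a, b), (c, d) = J
    l2 = damping * damping
    # M = J J^T + lambda^2 I  (2x2, symmetric)
    m11 = a * a + b * b + l2
    m12 = a * c + b * d
    m22 = c * c + d * d + l2
    det = m11 * m22 - m12 * m12
    # y = M^-1 dx  (closed-form 2x2 inverse)
    y1 = (m22 * dx[0] - m12 * dx[1]) / det
    y2 = (-m12 * dx[0] + m11 * dx[1]) / det
    # dq = J^T y
    return [a * y1 + c * y2, b * y1 + d * y2]
```

With `damping = 0` this reduces to the exact pseudo-inverse solution, which blows up at singular Jacobians; a nonzero lambda keeps `det` bounded away from zero.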
New optimization model for routing and spectrum assignment with nodes insecurity
NASA Astrophysics Data System (ADS)
Xuan, Hejun; Wang, Yuping; Xu, Zhanqi; Hao, Shanshan; Wang, Xiaoli
2017-04-01
By adopting orthogonal frequency division multiplexing technology, elastic optical networks can provide flexible, variable bandwidth allocation to each connection request and achieve higher spectrum utilization. The routing and spectrum assignment problem in elastic optical networks is a well-known NP-hard problem. In addition, information security has received worldwide attention. We combine these two concerns to investigate the routing and spectrum assignment problem with guaranteed security in elastic optical networks, and establish a new optimization model that minimizes the maximum index of the used frequency slots, which is used to determine an optimal routing and spectrum assignment scheme. To solve the model effectively, a hybrid genetic algorithm framework integrating a heuristic algorithm into a genetic algorithm is proposed. The heuristic algorithm is first used to sort the connection requests, and the genetic algorithm is then designed to search for an optimal routing and spectrum assignment scheme. In the genetic algorithm, tailor-made crossover, mutation and local search operators are designed. Moreover, simulation experiments are conducted with three heuristic strategies, and the experimental results indicate the effectiveness of the proposed model and algorithm framework.
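The heuristic half of such a hybrid framework can be sketched as sort-then-first-fit spectrum assignment; the largest-demand-first ordering is one illustrative strategy (the paper compares three), and the genetic-algorithm layer that searches over orderings and routes is omitted:

```python
def assign_spectrum(requests, n_slots=64):
    """First-fit contiguous spectrum assignment on fixed routes.

    `requests` is a list of (route_links, n_slots_needed).  Requests are
    sorted largest-first, then each receives the lowest contiguous block
    of frequency slots free on every link of its route.  Returns the
    objective from the model above: the maximum index of any used slot.
    """
    used = {}  # link -> set of occupied slot indices
    highest = -1
    for route, need in sorted(requests, key=lambda r: -r[1]):
        for start in range(n_slots - need + 1):
            block = range(start, start + need)
            if all(s not in used.get(link, set())
                   for link in route for s in block):
                for link in route:
                    used.setdefault(link, set()).update(block)
                highest = max(highest, start + need - 1)
                break
        else:
            raise ValueError("no spectrum available for request")
    return highest
```

The first-fit rule enforces the contiguity constraint of elastic optical networks; continuity across all links of a route is checked by the `all(...)` test.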
Zeng, Jianyang; Zhou, Pei; Donald, Bruce Randall
2011-01-01
One bottleneck in NMR structure determination lies in the laborious and time-consuming process of side-chain resonance and NOE assignments. Compared to the well-studied backbone resonance assignment problem, automated side-chain resonance and NOE assignments are relatively less explored. Most NOE assignment algorithms require nearly complete side-chain resonance assignments from a series of through-bond experiments such as HCCH-TOCSY or HCCCONH. Unfortunately, these TOCSY experiments perform poorly on large proteins. To overcome this deficiency, we present a novel algorithm, called NASCA (NOE Assignment and Side-Chain Assignment), to automate both side-chain resonance and NOE assignments and to perform high-resolution protein structure determination in the absence of any explicit through-bond experiment to facilitate side-chain resonance assignment, such as HCCH-TOCSY. After casting the assignment problem into a Markov Random Field (MRF), NASCA extends and applies combinatorial protein design algorithms to compute optimal assignments that best interpret the NMR data. The MRF captures the contact map information of the protein derived from NOESY spectra, exploits the backbone structural information determined by RDCs, and considers all possible side-chain rotamers. The complexity of the combinatorial search is reduced by using a dead-end elimination (DEE) algorithm, which prunes side-chain resonance assignments that are provably not part of the optimal solution. Then an A* search algorithm is employed to find a set of optimal side-chain resonance assignments that best fit the NMR data. These side-chain resonance assignments are then used to resolve the NOE assignment ambiguity and compute high-resolution protein structures. Tests on five proteins show that NASCA assigns resonances for more than 90% of side-chain protons, and achieves about 80% correct assignments. 
The final structures computed using the NOE distance restraints assigned by NASCA have backbone RMSD 0.8 – 1.5 Å from the reference structures determined by traditional NMR approaches. PMID:21706248
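The dead-end elimination pruning NASCA relies on can be sketched generically (a Goldstein-style criterion; the scoring functions below are placeholders, not NASCA's NMR data terms, and lower scores are taken as better):

```python
def dee_prune(positions, E1, E2):
    """Dead-end elimination over a discrete assignment problem.

    `positions` maps position -> list of candidate options, E1(i, r) is
    the self score of option r at position i, and E2(i, r, j, s) the
    pairwise score with option s at position j.  Option r is pruned when
    some rival t at the same position beats it even when r gets its best
    case and t its worst case, so r provably cannot be optimal.
    """
    pruned = {i: set() for i in positions}
    for i, options in positions.items():
        others = [j for j in positions if j != i]
        for r in options:
            best_r = E1(i, r) + sum(min(E2(i, r, j, s) for s in positions[j])
                                    for j in others)
            for t in options:
                if t == r:
                    continue
                worst_t = E1(i, t) + sum(max(E2(i, t, j, s) for s in positions[j])
                                         for j in others)
                if worst_t < best_r:
                    pruned[i].add(r)
                    break
    return pruned
```

In NASCA the surviving options feed an A* search over the remaining assignment space.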
NASA Astrophysics Data System (ADS)
Seo, Junyeong; Sung, Youngchul
2018-06-01
In this paper, an efficient transmit beam design and user scheduling method is proposed for multi-user (MU) multiple-input single-output (MISO) non-orthogonal multiple access (NOMA) downlink, based on Pareto-optimality. The proposed beam design and user scheduling method groups simultaneously served users into multiple clusters with a practical two users per cluster, and then applies spatial zero-forcing (ZF) across clusters to control inter-cluster interference (ICI) and Pareto-optimal beam design with successive interference cancellation (SIC) within each cluster to remove interference to the strong user and improve the signal-to-interference-plus-noise ratio (SINR) of the interference-experiencing weak user. The proposed method has the flexibility to control the rates of strong and weak users, and numerical results show that it yields good performance.
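Inter-cluster zero-forcing can be illustrated in the simplest real-valued two-antenna case; the paper's design uses complex MISO channels and adds the Pareto-optimal intra-cluster step with SIC, which is omitted here:

```python
def zf_beams(h1, h2):
    """Inter-cluster zero-forcing for two clusters with 2-antenna,
    real-valued representative channels h1 and h2.

    Each returned unit-norm beam is orthogonal to the other cluster's
    channel, so cluster 1's transmission causes no interference at
    cluster 2 and vice versa.
    """
    def orth(v):            # 2-D vector orthogonal to v
        return (-v[1], v[0])

    def normalize(v):
        n = (v[0] ** 2 + v[1] ** 2) ** 0.5
        return (v[0] / n, v[1] / n)

    w1 = normalize(orth(h2))   # beam for cluster 1: null toward cluster 2
    w2 = normalize(orth(h1))   # beam for cluster 2: null toward cluster 1
    return w1, w2
```

In higher dimensions the same nulling is done by projecting onto the null space of the other clusters' stacked channels (e.g., via a pseudo-inverse), which is what "spatial ZF across clusters" refers to.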
Game theory and traffic assignment.
DOT National Transportation Integrated Search
2013-09-01
Traffic assignment is used to determine the number of users on roadway links in a network. While this problem has been widely studied in transportation literature, its use of the concept of equilibrium has attracted considerable interest in the f...
Factors contributing to young moped rider accidents in Denmark.
Møller, Mette; Haustein, Sonja
2016-02-01
Young road users still constitute a high-risk group with regard to road traffic accidents. The crash rate of a moped is four times greater than that of a motorcycle, and the likelihood of being injured in a road traffic accident is 10-20 times higher among moped riders than among car drivers. Nevertheless, research on the behaviour and accident involvement of young moped riders remains sparse. Based on analysis of 128 accident protocols, the purpose of this study was to increase knowledge about moped accidents. The study was performed in Denmark and involved riders aged 16 or 17. A distinction was made between accident factors related to (1) the road and its surroundings, (2) the vehicle, and (3) the reported behaviour and condition of the road user. Thirteen accident factors were identified, the majority concerning the reported behaviour and condition of the road user. The average number of accident factors assigned per accident was 2.7. Riding speed was assigned in 45% of the accidents, making it the most frequently assigned factor on the part of the moped rider, followed by attention errors (42%), a tuned-up moped (29%) and position on the road (14%). For the other parties involved, attention error (52%) was the most frequently assigned accident factor. The majority (78%) of the accidents involved road rule breaches on the part of the moped rider. The results indicate that preventive measures should aim to eliminate violations and increase anticipatory skills among moped riders and awareness of mopeds among other road users. Due to their young age, the effect of such measures could be enhanced by infrastructural measures facilitating safe interaction between mopeds and other road users. Copyright © 2015 Elsevier Ltd. All rights reserved.
FRANOPP: Framework for analysis and optimization problems user's guide
NASA Technical Reports Server (NTRS)
Riley, K. M.
1981-01-01
Framework for analysis and optimization problems (FRANOPP) is a software aid for the study and solution of design (optimization) problems which provides the driving program and plotting capability for a user-generated programming system. In addition to FRANOPP, the programming system also contains the optimization code CONMIN and two user-supplied codes, one for analysis and one for output. With FRANOPP the user is provided with five options for studying a design problem. Three of the options utilize the plot capability and present an in-depth study of the design problem. The study can be focused on a history of the optimization process or on the interaction of variables within the design problem.
HURON (HUman and Robotic Optimization Network) Multi-Agent Temporal Activity Planner/Scheduler
NASA Technical Reports Server (NTRS)
Hua, Hook; Mrozinski, Joseph J.; Elfes, Alberto; Adumitroaie, Virgil; Shelton, Kacie E.; Smith, Jeffrey H.; Lincoln, William P.; Weisbin, Charles R.
2012-01-01
HURON solves the problem of how to optimize a plan and schedule for assigning multiple agents to a temporal sequence of actions (e.g., science tasks). Developed as a generic planning and scheduling tool, HURON has been used to optimize space mission surface operations. The tool has also been used to analyze lunar architectures for a variety of surface operational scenarios in order to maximize return on investment and productivity. These scenarios include numerous science activities performed by a diverse set of agents: humans, teleoperated rovers, and autonomous rovers. Once given a set of agents, activities, resources, resource constraints, temporal constraints, and dependencies, HURON computes an optimal schedule that meets a specified goal (e.g., maximum productivity or minimum time), subject to the constraints. HURON performs planning and scheduling optimization as a graph search in state-space with forward progression. Each node in the graph contains a state instance. Starting with the initial node, a graph is automatically constructed with new successive nodes of each new state to explore. The optimization uses a set of pre-conditions and post-conditions to create the child states. The Python language was adopted not only to enable more agile development but also to allow the domain experts to easily define their optimization models. A graphical user interface was also developed to facilitate real-time search information feedback and interaction by the operator in the search optimization process. The HURON package has many potential uses in the fields of Operations Research and Management Science, where this technology applies to many commercial domains requiring optimization to reduce costs. For example, optimizing a fleet of transportation truck routes, aircraft flight scheduling, and other route-planning scenarios involving multiple-agent task optimization would all benefit from using HURON.
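The described forward state-space search can be sketched as a uniform-cost (best-first) search over hashable states with precondition/postcondition actions; HURON's resource and temporal bookkeeping, and its richer goal criteria, are omitted from this sketch:

```python
import heapq

def plan(initial_state, goal, actions):
    """Forward state-space search returning a minimum-cost action sequence.

    `actions` is a list of (name, precondition, postcondition, cost),
    where precondition(state) -> bool gates applicability and
    postcondition(state) -> new_state creates the child state.
    Returns (total_cost, [action names]) or None if no plan exists.
    """
    frontier = [(0, 0, initial_state, [])]   # (cost, tiebreak, state, plan)
    seen = set()
    tie = 0
    while frontier:
        cost, _, state, path = heapq.heappop(frontier)
        if goal(state):
            return cost, path
        if state in seen:
            continue
        seen.add(state)
        for name, pre, post, c in actions:
            if pre(state):               # pre-condition gates expansion
                tie += 1
                heapq.heappush(frontier,
                               (cost + c, tie, post(state), path + [name]))
    return None
```

Swapping the priority for cost-plus-heuristic turns this into A*; maximizing productivity instead of minimizing time only changes the cost model.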
Incentive Mechanism for P2P Content Sharing over Heterogenous Access Networks
NASA Astrophysics Data System (ADS)
Sato, Kenichiro; Hashimoto, Ryo; Yoshino, Makoto; Shinkuma, Ryoichi; Takahashi, Tatsuro
In peer-to-peer (P2P) content sharing, users can share their content by contributing their own resources to one another. However, since there is no incentive for contributing content or resources to others, users may attempt to obtain content without any contribution. To motivate users to contribute their resources to the service, incentive-rewarding mechanisms have been proposed. On the other hand, emerging wireless technologies, such as IEEE 802.11 wireless local area networks, beyond third generation (B3G) cellular networks, and mobile WiMAX, provide high-speed Internet access for wireless users. Using this high-speed wireless access, wireless users can use P2P services and share their content with other wireless users and with fixed users. However, this diversification of access networks makes it difficult to assign rewards to each user appropriately according to their contributions, because the cost necessary for contribution differs between access networks. In this paper, we propose a novel incentive-rewarding mechanism called EMOTIVER that can assign rewards to users appropriately. The proposed mechanism uses an external evaluator and interactive learning agents. We also investigate a way of appropriately controlling rewards based on the system service's quality and management policy.
A Unified Framework for Creating Domain Dependent Polarity Lexicons from User Generated Reviews
Asghar, Muhammad Zubair; Khan, Aurangzeb; Ahmad, Shakeel; Khan, Imran Ali; Kundi, Fazal Masud
2015-01-01
The explosive growth of Web-based user-generated reviews has resulted in the emergence of Opinion Mining (OM) applications for analyzing users’ opinions toward products, services, and policies. Polarity lexicons often play a pivotal role in OM, indicating the positivity or negativity of a term along with a numeric score. However, the commonly available domain-independent lexicons are not an optimal choice for every domain within OM applications, because the polarity of a term changes from one domain to another and such lexicons do not contain the correct polarity of a term for every domain. In this work, we focus on the problem of adapting a domain-dependent polarity lexicon from a set of labeled user reviews and a domain-independent lexicon, and propose a unified learning framework based on information-theoretic concepts that assigns terms correct polarity (positive, negative) scores. Benchmarking on three datasets (car, hotel, and drug reviews) shows that our approach improves polarity-classification performance by achieving higher accuracy. Moreover, using the derived domain-dependent lexicon changed the polarity of terms, and the experimental results show that our approach is more effective than the baseline methods. PMID:26466101
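One common information-theoretic way to derive domain-dependent polarity scores from labeled reviews is a smoothed log-ratio (PMI-style) of a term's frequency in positive versus negative reviews. The sketch below illustrates that general idea only; it is not the paper's exact framework, and the toy car-domain reviews are invented.

```python
import math
from collections import Counter

def polarity_scores(pos_reviews, neg_reviews):
    """Score each term by a smoothed log-ratio of its frequency in the
    positive vs. negative class: score > 0 -> positive polarity in this domain."""
    pos = Counter(w for r in pos_reviews for w in r.split())
    neg = Counter(w for r in neg_reviews for w in r.split())
    n_pos, n_neg = sum(pos.values()), sum(neg.values())
    scores = {}
    for w in set(pos) | set(neg):
        p = (pos[w] + 1) / (n_pos + 1)   # add-one smoothing for unseen terms
        n = (neg[w] + 1) / (n_neg + 1)
        scores[w] = math.log2(p / n)
    return scores

s = polarity_scores(["smooth ride great mileage", "great engine"],
                    ["rough ride bad mileage"])
```

A term like "great" ends up positive and "bad" negative, even though a general-purpose lexicon might score domain-specific terms like "smooth" neutrally.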
Hierarchical video summarization based on context clustering
NASA Astrophysics Data System (ADS)
Tseng, Belle L.; Smith, John R.
2003-11-01
A personalized video summary is dynamically generated in our video personalization and summarization system based on user preference and usage environment. The three-tier personalization system adopts a server-middleware-client architecture in order to maintain, select, adapt, and deliver rich media content to the user. The server stores the content sources along with their corresponding MPEG-7 metadata descriptions. In this paper, the metadata includes visual semantic annotations and automatic speech transcriptions. Our personalization and summarization engine in the middleware selects the optimal set of desired video segments by matching shot annotations and sentence transcripts with user preferences. Beyond finding the desired content, the objective is to present a coherent summary. There are diverse methods for creating summaries, and we focus on the challenges of generating a hierarchical video summary based on context information. In our summarization algorithm, three inputs are used to generate the hierarchical video summary output: (1) MPEG-7 metadata descriptions of the contents in the server, (2) user preference and usage environment declarations from the user client, and (3) context information, including the MPEG-7 controlled term list and classification scheme. In a video sequence, descriptions and relevance scores are assigned to each shot. Based on these shot descriptions, context clustering is performed to collect consecutive similar shots into hierarchical scene representations. The context clustering is based on the available context information and may be derived from domain knowledge or rules engines. Finally, the selection of structured video segments to generate the hierarchical summary balances scene representation against shot selection.
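The context-clustering step, grouping consecutive similar shots into scenes, can be sketched as a single threshold pass over shot feature vectors. The cosine-similarity features and threshold below are illustrative assumptions, not the system's actual MPEG-7-based descriptors.

```python
import math

def cluster_shots(shots, threshold=0.5):
    """Group consecutive shots into scenes: start a new scene whenever the
    cosine similarity to the previous shot drops below `threshold`.
    shots: list of per-shot feature vectors. Returns lists of shot indices."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0

    scenes = [[0]]
    for i in range(1, len(shots)):
        if cos(shots[i - 1], shots[i]) >= threshold:
            scenes[-1].append(i)   # same scene as previous shot
        else:
            scenes.append([i])     # similarity dropped: new scene
    return scenes

scenes = cluster_shots([(1, 0), (0.9, 0.1), (0, 1), (0.1, 1)], threshold=0.8)
```

Here the four toy shots fall into two scenes, because the similarity drops sharply between shots 1 and 2.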
Incorporating User Preferences Within an Optimal Traffic Flow Management Framework
NASA Technical Reports Server (NTRS)
Rios, Joseph Lucio; Sheth, Kapil S.; Guiterrez-Nolasco, Sebastian Armardo
2010-01-01
The effectiveness of future decision support tools for Traffic Flow Management in the National Airspace System will depend on two major factors: computational burden and collaboration. Previous research has focused on these two aspects separately, without considering their interaction. In this paper, their explicit combination is examined. It is shown that when user preferences are incorporated into an optimal approach to scheduling, runtime is not adversely affected. A benefit-cost ratio is used to measure the influence of user preferences on an optimal solution. This metric shows that user preferences can be accommodated without inordinately affecting the overall system delay. Specifically, incorporating user preferences increases delay in proportion to increased user satisfaction.
Fleet Assignment Using Collective Intelligence
NASA Technical Reports Server (NTRS)
Antoine, Nicolas E.; Bieniawski, Stefan R.; Kroo, Ilan M.; Wolpert, David H.
2004-01-01
Airline fleet assignment involves the allocation of aircraft to a set of flight legs in order to meet passenger demand while satisfying a variety of constraints. Over the course of the day, the routing of each aircraft is determined in order to minimize the number of required flights for a given fleet. The associated flow continuity and aircraft count constraints have led researchers to focus on obtaining quasi-optimal solutions, especially at larger scales. In this paper, the authors propose the application of an agent-based integer optimization algorithm to a "cold start" fleet assignment problem. Results show that the optimizer can successfully solve such highly constrained problems (129 variables, 184 constraints).
NASA Astrophysics Data System (ADS)
Supian, Sudradjat; Wahyuni, Sri; Nahar, Julita; Subiyanto
2018-01-01
In this paper, the traveling time of workers from the Bandung central post office in delivering packages to destination locations was optimized using the Hungarian method. A sensitivity analysis against possible data changes was also conducted. The sampled data in this study are 10 workers to be assigned to deliver mail packages to 10 post office delivery centers in Bandung, namely Cikutra, Padalarang, Ujung Berung, Dayeuh Kolot, Asia-Africa, Soreang, Situ Saeur, Cimahi, Cipedes, and Cikeruh. The result of this research is the optimal traveling time for the 10 workers across the 10 destination locations: 387 minutes in total. Based on this result, the manager of the Bandung central post office can make optimal decisions when assigning tasks to workers.
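The Hungarian method finds the assignment of workers to destinations that minimizes total travel time. For small instances the same optimum can be found by brute force, as in this sketch; the 4x4 travel-time matrix is invented, not the paper's data, and for larger problems a polynomial-time routine such as `scipy.optimize.linear_sum_assignment` would be used instead.

```python
from itertools import permutations

def optimal_assignment(cost):
    """Exhaustively find the worker->destination assignment minimizing total
    travel time (the same optimum the Hungarian method returns, for small n)."""
    n = len(cost)
    best_total, best_perm = float("inf"), None
    for perm in permutations(range(n)):
        total = sum(cost[w][perm[w]] for w in range(n))
        if total < best_total:
            best_total, best_perm = total, perm
    return best_total, best_perm

# Hypothetical travel-time matrix (minutes): rows = workers, cols = destinations
times = [[14,  5,  8,  7],
         [ 2, 12,  6,  5],
         [ 7,  8,  3,  9],
         [ 2,  4,  6, 10]]
total, perm = optimal_assignment(times)
```

For this toy matrix the optimum sends worker 0 to destination 1, worker 1 to destination 3, worker 2 to destination 2, and worker 3 to destination 0, for 15 minutes in total.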
Encoder-Decoder Optimization for Brain-Computer Interfaces
Merel, Josh; Pianto, Donald M.; Cunningham, John P.; Paninski, Liam
2015-01-01
Neuroprosthetic brain-computer interfaces are systems that decode neural activity into useful control signals for effectors, such as a cursor on a computer screen. It has long been recognized that both the user and the decoding system can adapt to increase the accuracy of the end effector. Co-adaptation is the process whereby a user learns to control the system in conjunction with the decoder adapting to learn the user's neural patterns. We provide a mathematical framework for co-adaptation and relate co-adaptation to the joint optimization of the user's control scheme ("encoding model") and the decoding algorithm's parameters. When the assumptions of that framework are respected, co-adaptation cannot yield better performance than that obtainable by an optimal initial choice of fixed decoder, coupled with optimal user learning. For a specific case, we provide numerical methods to obtain such an optimized decoder. We demonstrate our approach in a model brain-computer interface system using an online prosthesis simulator, a simple human-in-the-loop psychophysics setup that provides a non-invasive simulation of the BCI setting. These experiments support two claims: that users can learn encoders matched to fixed, optimal decoders and that, once learned, our approach yields expected performance advantages. PMID:26029919
Design Optimization Toolkit: Users' Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aguilo Valentin, Miguel Alejandro
The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited for gradient/nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk barely intrusive to other engineering software packages. As part of this inherent flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.
On Maximizing the Lifetime of Wireless Sensor Networks by Optimally Assigning Energy Supplies
Asorey-Cacheda, Rafael; García-Sánchez, Antonio Javier; García-Sánchez, Felipe; García-Haro, Joan; Gonzalez-Castaño, Francisco Javier
2013-01-01
The extension of the network lifetime of Wireless Sensor Networks (WSN) is an important issue that has not yet been appropriately solved. This paper addresses this concern and proposes techniques to plan an arbitrary WSN. To this end, we suggest a hierarchical network architecture, similar to realistic scenarios, where nodes with renewable energy sources (denoted primary nodes) carry out most message delivery tasks, and nodes equipped with conventional chemical batteries (denoted secondary nodes) are those with lower communication demands. The key design issue of this network architecture is the development of a new optimization framework to calculate the optimal assignment of renewable energy supplies (primary node assignment) to maximize network lifetime, obtaining the minimum number of energy supplies and their node assignment. We also conduct a second optimization step to additionally minimize the number of packet hops between source and sink. In this work, we present an algorithm that approaches the results of the optimization framework but with much faster execution, making it a good alternative for large-scale WSNs. Finally, the network model, the optimization process, and the designed algorithm are evaluated and validated by means of computer simulation under realistic conditions, and the results obtained are discussed comparatively. PMID:23939582
Classifying short genomic fragments from novel lineages using composition and homology
2011-01-01
Background The assignment of taxonomic attributions to DNA fragments recovered directly from the environment is a vital step in metagenomic data analysis. Assignments can be made using rank-specific classifiers, which assign reads to taxonomic labels from a predetermined level such as named species or strain, or rank-flexible classifiers, which choose an appropriate taxonomic rank for each sequence in a data set. The choice of rank typically depends on the optimal model for a given sequence and on the breadth of taxonomic groups seen in a set of close-to-optimal models. Homology-based (e.g., LCA) and composition-based (e.g., PhyloPythia, TACOA) rank-flexible classifiers have been proposed, but there is at present no hybrid approach that utilizes both homology and composition. Results We first develop a hybrid, rank-specific classifier based on BLAST and Naïve Bayes (NB) that has comparable accuracy and a faster running time than the current best approach, PhymmBL. By substituting LCA for BLAST or allowing the inclusion of suboptimal NB models, we obtain a rank-flexible classifier. This hybrid classifier outperforms established rank-flexible approaches on simulated metagenomic fragments of length 200 bp to 1000 bp and is able to assign taxonomic attributions to a subset of sequences with few misclassifications. We then demonstrate the performance of different classifiers on an enhanced biological phosphorous removal metagenome, illustrating the advantages of rank-flexible classifiers when representative genomes are absent from the set of reference genomes. Application to a glacier ice metagenome demonstrates that similar taxonomic profiles are obtained across a set of classifiers which are increasingly conservative in their classification. Conclusions Our NB-based classification scheme is faster than the current best composition-based algorithm, Phymm, while providing equally accurate predictions. 
The rank-flexible variant of NB, which we term ε-NB, is complementary to LCA and can be combined with it to yield conservative prediction sets of very high confidence. The simple parameterization of LCA and ε-NB allows for tuning of the balance between more predictions and increased precision, allowing the user to account for the sensitivity of downstream analyses to misclassified or unclassified sequences. PMID:21827705
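The composition-based NB component can be illustrated with a toy classifier over k-mer frequencies: train a smoothed multinomial model per taxon, then assign a fragment to the taxon with the highest log-likelihood. The reference "genomes" and parameters below are artificial stand-ins; the published method trains on real reference genomes and combines NB with homology (LCA) evidence.

```python
import math
from collections import Counter

def kmers(seq, k=3):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def train(genomes, k=3):
    """genomes: {taxon: reference sequence}. Returns per-taxon tables of
    add-one-smoothed log-probabilities of k-mer composition."""
    models = {}
    for taxon, seq in genomes.items():
        counts = Counter(kmers(seq, k))
        total, vocab = sum(counts.values()), 4 ** k
        models[taxon] = {km: math.log((c + 1) / (total + vocab))
                         for km, c in counts.items()}
        models[taxon]["__default__"] = math.log(1 / (total + vocab))
    return models

def classify(fragment, models, k=3):
    """Assign the fragment to the taxon with the highest NB log-likelihood."""
    def score(m):
        return sum(m.get(km, m["__default__"]) for km in kmers(fragment, k))
    return max(models, key=lambda t: score(models[t]))

models = train({"gc_rich": "GCGCGGCCGCGGGCCG" * 4,
                "at_rich": "ATATTAATTAAATTAT" * 4})
label = classify("GCGGCCGCGG", models)
```

A rank-flexible variant would additionally inspect the gap between the best and near-best scores and back off to a higher rank when several models are close to optimal.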
Gender Assignment in Contemporary Standard Russian: A Comprehensive Analysis in Optimality Theory
ERIC Educational Resources Information Center
Galbreath, Blake Lee Everett
2010-01-01
The purpose of this dissertation is to provide a comprehensive analysis of gender assignment in Contemporary Standard Russian within the framework of Optimality Theory (Prince and Smolensky 1993). The result of the dissertation is the establishment of the phonological, morphological, semantic, and faithfulness constraints necessary to assign…
A real-time expert system for self-repairing flight control
NASA Technical Reports Server (NTRS)
Gaither, S. A.; Agarwal, A. K.; Shah, S. C.; Duke, E. L.
1989-01-01
An integrated environment for specifying, prototyping, and implementing a self-repairing flight-control (SRFC) strategy is described. At an interactive workstation, the user can select paradigms such as rule-based expert systems, state-transition diagrams, and signal-flow graphs and hierarchically nest them, assign timing and priority attributes, establish blackboard-type communication, and specify concurrent execution on single or multiple processors. High-fidelity nonlinear simulations of aircraft and SRFC systems can be performed off-line, with the possibility of changing SRFC rules, inference strategies, and other heuristics to correct for control deficiencies. Finally, the off-line-generated SRFC can be transformed into highly optimized application-specific real-time C-language code. An application of this environment to the design of aircraft fault detection, isolation, and accommodation algorithms is presented in detail.
NASA Technical Reports Server (NTRS)
Moerder, Daniel D.
2014-01-01
MADS (Minimization Assistant for Dynamical Systems) is a trajectory optimization code in which a user-specified performance measure is directly minimized, subject to constraints placed on a low-order discretization of user-supplied plant ordinary differential equations. This document describes the mathematical formulation of the set of trajectory optimization problems for which MADS is suitable, and describes the user interface. Usage examples are provided.
Systematic Propulsion Optimization Tools (SPOT)
NASA Technical Reports Server (NTRS)
Bower, Mark; Celestian, John
1992-01-01
This paper describes a computer program written by senior-level Mechanical Engineering students at the University of Alabama in Huntsville which is capable of optimizing user-defined delivery systems for carrying payloads into orbit. The custom propulsion system is designed by the user through the input of configuration, payload, and orbital parameters. The primary advantages of the software, called Systematic Propulsion Optimization Tools (SPOT), are a user-friendly interface and a modular FORTRAN 77 code designed for ease of modification. The optimization of variables in an orbital delivery system is of critical concern in the propulsion environment. The mass of the overall system must be minimized within the maximum stress, force, and pressure constraints. SPOT utilizes the Design Optimization Tools (DOT) program for the optimization techniques. The SPOT program is divided into a main program and five modules: aerodynamic losses, orbital parameters, liquid engines, solid engines, and nozzles. The program is designed to be upgraded easily and expanded to meet specific user needs. A user's manual and a programmer's manual are currently being developed to facilitate implementation and modification.
Visual gene developer: a fully programmable bioinformatics software for synthetic gene optimization.
Jung, Sang-Kyu; McDonald, Karen
2011-08-16
Direct gene synthesis is becoming more popular owing to decreases in gene synthesis pricing. Compared with using natural genes, gene synthesis provides a good opportunity to optimize gene sequence for specific applications. In order to facilitate gene optimization, we have developed a stand-alone software called Visual Gene Developer. The software not only provides general functions for gene analysis and optimization along with an interactive user-friendly interface, but also includes unique features such as programming capability, dedicated mRNA secondary structure prediction, artificial neural network modeling, network & multi-threaded computing, and user-accessible programming modules. The software allows a user to analyze and optimize a sequence using main menu functions or specialized module windows. Alternatively, gene optimization can be initiated by designing a gene construct and configuring an optimization strategy. A user can choose several predefined or user-defined algorithms to design a complicated strategy. The software provides expandable functionality as platform software supporting module development using popular script languages such as VBScript and JScript in the software programming environment. Visual Gene Developer is useful for both researchers who want to quickly analyze and optimize genes, and those who are interested in developing and testing new algorithms in bioinformatics. The software is available for free download at http://www.visualgenedeveloper.net.
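One representative optimization step such a tool supports is codon optimization: back-translating a protein by choosing, for each residue, the most frequent codon in a codon-usage table. The sketch below uses a toy, non-organism-specific usage table; Visual Gene Developer's own algorithms (and real usage tables) are considerably more elaborate.

```python
# Toy codon-usage table: fractions are illustrative, not organism-accurate
USAGE = {
    "M": {"ATG": 1.0},
    "K": {"AAA": 0.74, "AAG": 0.26},
    "F": {"TTT": 0.58, "TTC": 0.42},
    "*": {"TAA": 0.61, "TGA": 0.30, "TAG": 0.09},
}

def codon_optimize(protein, usage=USAGE):
    """Back-translate a protein by picking each residue's most frequent codon."""
    return "".join(max(usage[aa], key=usage[aa].get) for aa in protein)

dna = codon_optimize("MKF*")
```

Real tools also score the result against constraints such as GC content, restriction sites, and mRNA secondary structure rather than greedily picking the top codon.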
A Competitive and Experiential Assignment in Search Engine Optimization Strategy
ERIC Educational Resources Information Center
Clarke, Theresa B.; Clarke, Irvine, III
2014-01-01
Despite an increase in ad spending and demand for employees with expertise in search engine optimization (SEO), methods for teaching this important marketing strategy have received little coverage in the literature. Using Bloom's cognitive goals hierarchy as a framework, this experiential assignment provides a process for educators who may be new…
Optimal Weight Assignment for a Chinese Signature File.
ERIC Educational Resources Information Center
Liang, Tyne; And Others
1996-01-01
Investigates the performance of a character-based Chinese text retrieval scheme in which monogram keys and bigram keys are encoded into document signatures. Tests and verifies the theoretical predictions of the optimal weight assignments and the minimal false hit rate in experiments using a real Chinese corpus for disyllabic queries of different…
Self-Coexistence among IEEE 802.22 Networks: Distributed Allocation of Power and Channel
Sakin, Sayef Azad; Alamri, Atif; Tran, Nguyen H.
2017-01-01
Ensuring self-coexistence among IEEE 802.22 networks is a challenging problem owing to opportunistic access of incumbent-free radio resources by users in co-located networks. In this study, we propose a fully-distributed non-cooperative approach to ensure self-coexistence in downlink channels of IEEE 802.22 networks. We formulate the self-coexistence problem as a mixed-integer non-linear optimization problem for maximizing the network data rate, which is NP-hard. This work explores a sub-optimal solution by dividing the optimization problem into downlink channel allocation and power assignment sub-problems. Considering fairness, quality of service, and minimum interference for customer-premises equipment, we also develop a greedy algorithm for channel allocation and a non-cooperative game-theoretic framework for near-optimal power allocation. The base stations of the networks are treated as players in a game, in which they try to increase spectrum utilization by controlling power and reaching a Nash equilibrium point. We further develop a utility function for the game to increase the data rate by minimizing the transmission power and, subsequently, the interference from neighboring networks. A theoretical proof of the uniqueness and existence of the Nash equilibrium is presented. Performance improvements in terms of data rate, with a degree of fairness, compared to a cooperative branch-and-bound-based algorithm and a non-cooperative greedy approach are shown through simulation studies. PMID:29215591
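The greedy channel-allocation step can be sketched as follows: visit base stations in order and give each one the downlink channel least used by its already-assigned interference neighbors. The topology and channel count below are invented, and the actual algorithm additionally weighs fairness, QoS, and the subsequent game-theoretic power assignment.

```python
def greedy_channels(neighbors, n_channels):
    """Assign each base station (in a fixed order) the channel least used
    among its already-assigned neighbors, approximating minimum co-channel
    interference. neighbors: {bs: set of interfering base stations}."""
    assignment = {}
    for bs in sorted(neighbors):
        used = [assignment[n] for n in neighbors[bs] if n in assignment]
        # pick the channel carried by the fewest interfering neighbors
        assignment[bs] = min(range(n_channels), key=lambda c: used.count(c))
    return assignment

# Hypothetical interference topology: stations 0-2 mutually interfere; 3 only with 2
topology = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}, 3: {2}}
alloc = greedy_channels(topology, n_channels=3)
```

With three channels available, the mutually interfering trio gets three distinct channels, and station 3 safely reuses channel 0.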
Robust stochastic optimization for reservoir operation
NASA Astrophysics Data System (ADS)
Pan, Limeng; Housh, Mashor; Liu, Pan; Cai, Ximing; Chen, Xin
2015-01-01
Optimal reservoir operation under uncertainty is a challenging engineering problem. Application of classic stochastic optimization methods to large-scale problems is limited by computational difficulty. Moreover, classic stochastic methods assume that the estimated distribution function or the sampled inflow data accurately represents the true probability distribution; when this assumption does not hold, the performance of the algorithms may be undermined. In this study, we introduce a robust optimization (RO) approach, the Iterative Linear Decision Rule (ILDR), to provide a tractable approximation for a multiperiod hydropower generation problem. The proposed approach extends the existing LDR method by accommodating nonlinear objective functions. It also gives users the flexibility of choosing the accuracy of the ILDR approximation by assigning a desired number of piecewise-linear segments to each uncertainty. The performance of the ILDR is compared with benchmark policies, including the sampling stochastic dynamic programming (SSDP) policy derived from historical data. The ILDR solves both single- and multireservoir systems efficiently. The single-reservoir case study results show that the RO method is as good as SSDP when implemented on the original historical inflows and outperforms the SSDP policy when tested on generated inflows with the same mean and covariance matrix as the historical record. For the multireservoir case study, which considers water supply in addition to power generation, numerical results show that the proposed approach performs as well as in the single-reservoir case study in terms of optimal value and distributional robustness.
NASA Astrophysics Data System (ADS)
Chuan, Ngam Min; Thiruchelvam, Sivadass; Nasharuddin Mustapha, Kamal; Che Muda, Zakaria; Mat Husin, Norhayati; Yong, Lee Choon; Ghazali, Azrul; Ezanee Rusli, Mohd; Itam, Zarina Binti; Beddu, Salmia; Liyana Mohd Kamal, Nur
2016-03-01
This paper examines the current state of the procurement system in Malaysia, specifically supplier selection in the construction industry. It proposes a comprehensive study of supplier-selection metrics for infrastructure building, weighting the importance of each metric and identifying the relationships between metrics among initiators, decision makers, buyers, and users. With a hierarchy of criteria importance, a supplier-selection process can be defined, repeated, and audited with fewer complications or difficulties. This will help the procurement field improve, as the research can develop and redefine the policies and procedures governing supplier selection. Developing this systematic process enables optimization of supplier selection, increasing value for every stakeholder as the selection process is greatly simplified. With redefined policies and procedures, a company not only increases its effectiveness and profit, but also positions itself to reach greater heights in the advancement of procurement in Malaysia.
Löck, Steffen; Roth, Klaus; Skripcak, Tomas; Worbs, Mario; Helmbrecht, Stephan; Jakobi, Annika; Just, Uwe; Krause, Mechthild; Baumann, Michael; Enghardt, Wolfgang; Lühr, Armin
2015-09-01
To guarantee equal access to optimal radiotherapy, a concept of patient assignment to photon or particle radiotherapy using remote treatment-plan exchange and comparison - ReCompare - was proposed. We demonstrate the implementation of this concept and present its clinical applicability. The ReCompare concept was implemented using a client-server software solution, and a clinical workflow for remote treatment-plan exchange and comparison was defined. The steps required of the user and performed by the software for a complete plan transfer were described, and an additional module for dose-response modeling was added. The ReCompare software was successfully tested in cooperation with three external partner clinics and met all required specifications. It was compatible with several standard treatment planning systems, ensured patient data protection, and integrated into the clinical workflow. The ReCompare software can support non-particle radiotherapy institutions in making the patient-specific decision on the optimal irradiation modality through remote treatment-plan exchange and comparison. Copyright © 2015. Published by Elsevier GmbH.
A Concept for Flexible Operations and Optimized Traffic into Metroplex Regions
NASA Technical Reports Server (NTRS)
DeLaurentis, Daniel; Landry, Steve; Sun, Dengfeng; Wieland, Fred; Tyagi, Ankit
2011-01-01
A "Flexible Flight Operations" concept for airport metroplexes was studied. A flexible flight is one whose destination airport is not assigned until a threshold is reached near the arrival area, at which time the runway that reduces overall delay is assigned. The concept seeks to increase throughput by exploiting flexibility. The quantification of best-case benefits from the concept was pursued to establish whether concept research is warranted. Findings indicate that the concept indeed has potential for significant reductions in delay (and the cost due to delay) in the N90 (NY/NJ) and SCT (Southern California) metroplexes. Delay reductions of nearly 26% are possible in N90 when 30% of the commercial airline flights are flexible (smartly selected by their low probability of connecting passengers); nearly 40% delay reduction is found when 50% of the flights are flexible. In the SCT metroplex, delay reduction estimates are greater. Greater reductions result at SCT because it is currently less constrained than N90, providing "more room" to take advantage of flexibility. Applying the flexible operations concept to on-demand/air taxi and General Aviation flights was also found to be beneficial at NY/NJ, indicating the concept may be useful to a wide variety of users.
NASA Astrophysics Data System (ADS)
Xu, Shuo; Ji, Ze; Truong Pham, Duc; Yu, Fan
2011-11-01
The simultaneous mission assignment and home allocation for hospital service robots studied here is a Multidimensional Assignment Problem (MAP) with multiple objectives and multiple constraints. A population-based metaheuristic, the Binary Bees Algorithm (BBA), is proposed to optimize this NP-hard problem. Inspired by the foraging mechanism of honeybees, the BBA's most important feature is an explicit functional partitioning between global search and local search, for exploration and exploitation respectively. Its key parts consist of adaptive global search, three-step elitism selection (constraint handling, non-dominated solution selection, and diversity preservation), and elites-centred local search within a Hamming neighbourhood. Two comparative experiments were conducted to investigate in detail its single-objective optimization, its optimization effectiveness (indexed by the S-metric and C-metric), and its optimization efficiency (indexed by computational burden and CPU time). The BBA outperformed its competitors in almost all the quantitative indices. Hence the overall scheme, and particularly the search-history-adapted global search strategy, was validated.
High capacity low delay packet broadcasting multiaccess schemes for satellite repeater systems
NASA Astrophysics Data System (ADS)
Bose, S. K.
1980-12-01
Demand assigned packet radio schemes using satellite repeaters can achieve high capacities but often exhibit relatively large delays under low traffic conditions when compared to random access. Several schemes which improve delay performance at low traffic while retaining high capacity are presented and analyzed. These schemes allow random access attempts by users who are waiting for channel assignments. Their performance is considered in the context of a multiple-point communication system carrying fixed-length messages between geographically distributed (ground) user terminals which are linked via a satellite repeater. Channel assignments are made, following a BCC queueing discipline, by a (ground) central controller on the basis of requests correctly received over a collision-type access channel. In TBACR Scheme A, some of the forward message channels are set aside for random access transmissions; the rest are used in a demand assigned mode. Schemes B and C operate all their forward message channels in a demand assignment mode but, by means of appropriate algorithms for trailer channel selection, allow random access attempts on unassigned channels. The latter scheme also introduces framing and slotting of the time axis to implement a more efficient algorithm for trailer channel selection than the former.
14 CFR 1215.104 - Apportionment and assignment of services.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Apportionment and assignment of services. 1215.104 Section 1215.104 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION TRACKING AND DATA RELAY SATELLITE SYSTEM (TDRSS) Use and Reimbursement Policy for Non-U.S. Government Users...
14 CFR 1215.104 - Apportionment and assignment of services.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 5 2011-01-01 2010-01-01 true Apportionment and assignment of services. 1215.104 Section 1215.104 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION TRACKING AND DATA RELAY SATELLITE SYSTEM (TDRSS) Use and Reimbursement Policy for Non-U.S. Government Users...
NASA Astrophysics Data System (ADS)
Guo, Peng; Cheng, Wenming; Wang, Yi
2014-10-01
The quay crane scheduling problem (QCSP) determines the handling sequence of tasks at ship bays by a set of cranes assigned to a container vessel such that the vessel's service time is minimized. A number of heuristics or meta-heuristics have been proposed to obtain near-optimal solutions and overcome the NP-hardness of the problem. In this article, the idea of generalized extremal optimization (GEO) is adapted to solve the QCSP with respect to various interference constraints; the resulting algorithm is termed the modified GEO. A randomized searching method that generates neighbouring task-to-QC assignments from an incumbent task-to-QC assignment is developed for executing the modified GEO. In addition, a unidirectional search decoding scheme is employed to transform a task-to-QC assignment into an active quay crane schedule. The effectiveness of the developed GEO is tested on a suite of benchmark problems introduced by K.H. Kim and Y.M. Park in 2004 (European Journal of Operational Research, Vol. 156, No. 3). Compared with other well-known existing approaches, the experimental results show that the proposed modified GEO is capable of obtaining the optimal or near-optimal solution in a reasonable time, especially for large-sized problems.
47 CFR 80.231 - Technical Requirements for Class B Automatic Identification System (AIS) equipment.
Code of Federal Regulations, 2010 CFR
2010-10-01
... data in the device shall also be included in the user's manual for the device. The entry of static data... Communications Commission to input an MMSI that has not been properly assigned to the end user, or to otherwise... shall the entry of static data into a Class B AIS device be performed by the user of the device or the...
Fleet Assignment Using Collective Intelligence
NASA Technical Reports Server (NTRS)
Antoine, Nicolas E.; Bieniawski, Stefan R.; Kroo, Ilan M.; Wolpert, David H.
2004-01-01
Product distribution theory is a new collective intelligence-based framework for analyzing and controlling distributed systems. Its usefulness in distributed stochastic optimization is illustrated here through an airline fleet assignment problem. This problem involves the allocation of aircraft to a set of flight legs in order to meet passenger demand, while satisfying a variety of linear and non-linear constraints. Over the course of the day, the routing of each aircraft is determined in order to minimize the number of required flights for a given fleet. The associated flow continuity and aircraft count constraints have led researchers to focus on obtaining quasi-optimal solutions, especially at larger scales. In this paper, the authors propose the application of this new stochastic optimization algorithm to a non-linear objective cold start fleet assignment problem. Results show that the optimizer can successfully solve such highly-constrained problems (130 variables, 184 constraints).
Optimal assignment of workers to supporting services in a hospital
NASA Astrophysics Data System (ADS)
Sawik, Bartosz; Mikulik, Jerzy
2008-01-01
Supporting services play an important role in health care institutions such as hospitals. This paper presents an application of an operations research model for the optimal allocation of workers among supporting services in a public hospital. The services include logistics, inventory management, financial management, operations management, medical analysis, etc. The optimality criterion is to minimize the operating costs of the supporting services subject to specific constraints, which represent the conditions for resource allocation in a hospital. The overall problem is formulated as an integer program, known in the literature as the assignment problem, in which the decision variables represent the assignment of people to various jobs. The results of computational experiments modeled on real data from a selected Polish hospital are reported.
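The model described in this abstract is the classical linear assignment problem. A minimal sketch, with an invented cost matrix rather than the hospital data, using the Hungarian-style solver in SciPy:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical cost matrix: entry [i, j] is the operating cost of
# assigning worker i to supporting service j (illustrative values only).
cost = np.array([
    [9, 2, 7, 8],
    [6, 4, 3, 7],
    [5, 8, 1, 8],
    [7, 6, 9, 4],
])

# linear_sum_assignment solves the assignment problem exactly:
# one worker per service, minimum total cost.
rows, cols = linear_sum_assignment(cost)
total_cost = int(cost[rows, cols].sum())
```

Here the optimum assigns workers 0 through 3 to services 1, 0, 2, 3 for a total cost of 13; a real hospital model layers its specific resource-allocation constraints on top of this core.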
NASA Astrophysics Data System (ADS)
Milani, G.; Milani, F.
A GUI software (GURU) for experimental data fitting of rheometer curves in Natural Rubber (NR) vulcanized with sulphur at different curing temperatures is presented. Experimental data are automatically loaded in GURU from an Excel spreadsheet coming from the output of the experimental machine (moving die rheometer). To fit the experimental data, the general reaction scheme proposed by Han and co-workers for NR vulcanized with sulphur is considered. From the simplified kinetic scheme adopted, a closed form solution can be found for the crosslink density, with the only limitation that the induction period is excluded from computations. Three kinetic constants must be determined in such a way to minimize the absolute error between normalized experimental data and numerical prediction. Usually, this result is achieved by means of standard least-squares data fitting. On the contrary, GURU works interactively by means of a Graphical User Interface (GUI) to minimize the error and allows an interactive calibration of the kinetic constants by means of sliders. A simple mouse click on the sliders allows the assignment of a value for each kinetic constant and a visual comparison between numerical and experimental curves. Users will thus find optimal values of the constants by means of a classic trial and error strategy. An experimental case of technical relevance is shown as benchmark.
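The least-squares calibration step that GURU replaces with interactive sliders can be sketched as follows. The first-order cure law below is a hedged stand-in for the closed-form solution of Han's scheme, and the data are synthetic, not rheometer output:

```python
import numpy as np
from scipy.optimize import curve_fit

def cure(t, x_inf, k):
    # Stand-in cure law: crosslink density rising to a plateau x_inf.
    return x_inf * (1.0 - np.exp(-k * t))

# Noise-free synthetic "rheometer" curve generated with known constants.
t = np.linspace(0.0, 30.0, 60)
data = cure(t, 1.0, 0.25)

# Standard least-squares fit of the kinetic constants, the step GURU
# lets the user perform interactively with sliders instead.
params, _ = curve_fit(cure, t, data, p0=[0.5, 0.1])
x_inf_fit, k_fit = params
```

On clean data the fit recovers the generating constants; with real curves, the interactive slider approach lets the user trade off fit quality by eye across the whole curve.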
Program Aids Analysis And Optimization Of Design
NASA Technical Reports Server (NTRS)
Rogers, James L., Jr.; Lamarsh, William J., II
1994-01-01
NETS/PROSSS (NETS Coupled With Programming System for Structural Synthesis) computer program developed to provide system for combining NETS (MSC-21588), neural-network application program, and CONMIN (Constrained Function Minimization, ARC-10836), optimization program. Enables user to reach nearly optimal design. Design then used as starting point in normal optimization process, possibly enabling user to converge to optimal solution in significantly fewer iterations. NETS/PROSSS written in C language and FORTRAN 77.
Leveraging search and content exploration by exploiting context in folksonomy systems
NASA Astrophysics Data System (ADS)
Abel, Fabian; Baldoni, Matteo; Baroglio, Cristina; Henze, Nicola; Kawase, Ricardo; Krause, Daniel; Patti, Viviana
2010-04-01
With the advent of Web 2.0, tagging became a popular feature in social media systems. People tag diverse kinds of content, e.g. products at Amazon, music at Last.fm, images at Flickr, etc. In recent years several researchers have analyzed the impact of tags on information retrieval. Most works focused on tags only and ignored context information. In this article we present context-aware approaches for learning semantics and improving personalized information retrieval in tagging systems. We investigate how explorative search, initiated by clicking on tags, can be enhanced with automatically produced context information so that search results better fit the actual information needs of the users. We introduce the SocialHITS algorithm and present an experiment where we compare different algorithms for ranking users, tags, and resources in a contextualized way. We showcase our approaches in the domain of images and present the TagMe! system, which enables users to explore and tag Flickr pictures. In TagMe! we further demonstrate how advanced context information can easily be generated: TagMe! allows users to attach tag assignments to a specific area within an image and to categorize tag assignments. In our corresponding evaluation we show that these additional facets of tag assignments gain valuable semantics, which can be applied to improve existing search and ranking algorithms significantly.
Meaningless comparisons lead to false optimism in medical machine learning
Kording, Konrad; Recht, Benjamin
2017-01-01
A new trend in medicine is the use of algorithms to analyze big datasets, e.g. using everything your phone measures about you for diagnostics or monitoring. However, these algorithms are commonly compared against weak baselines, which may contribute to excessive optimism. To assess how well an algorithm works, scientists typically ask how well its output correlates with medically assigned scores. Here we perform a meta-analysis to quantify how the literature evaluates its algorithms for monitoring mental wellbeing. We find that the bulk of the literature (∼77%) uses meaningless comparisons that ignore patient baseline state. For example, an algorithm that uses phone data to diagnose mood disorders would be useful. However, it is possible to explain over 80% of the variance of some mood measures in the population by simply guessing that each patient has their own average mood, the patient-specific baseline. Thus, an algorithm that just predicts that our mood is as it usually is can explain the majority of the variance but is, obviously, entirely useless. Comparing to the wrong (population) baseline has a massive effect on the perceived quality of algorithms and produces baseless optimism in the field. To solve this problem we propose “user lift,” which reduces these systematic errors in the evaluation of personalized medical monitoring. PMID:28949964
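The patient-specific-baseline effect is easy to reproduce numerically. A small sketch with synthetic data, where the spread and noise levels are invented, chosen only to mimic stable per-user moods:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic mood data: 50 users x 100 days; each user has a stable
# personal mean plus small day-to-day noise (illustrative parameters).
user_means = rng.normal(0.0, 2.0, size=(50, 1))
moods = user_means + rng.normal(0.0, 0.5, size=(50, 100))

def r2(pred, y):
    # Fraction of variance explained relative to the population mean.
    return 1.0 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()

# Population baseline: guess the grand mean for everyone (R^2 = 0).
r2_population = r2(np.full_like(moods, moods.mean()), moods)
# Patient-specific baseline: guess each user's own average mood.
r2_personal = r2(np.broadcast_to(moods.mean(axis=1, keepdims=True),
                                 moods.shape), moods)
```

With between-user spread much larger than daily noise, the per-user average alone explains most of the variance, which is exactly the weak-baseline trap the paper describes.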
Software and Systems Producibility Collaboration and Experimentation Environment (SPRUCE)
2014-04-01
represent course materials and assignments from Vanderbilt University’s Dr. Gokhale’s courses. 3.2.4. Communities of Interest Current list of...blogging platforms of Twitter, Facebook and LinkedIn today, these user interactions represent low-effort means for users to start getting involved. A
Lei, Yang; Yu, Dai; Bin, Zhang; Yang, Yang
2017-01-01
Clustering algorithms, as a basis of data analysis, are widely used in analysis systems. However, with high-dimensional data, a clustering algorithm may overlook the business relationships between dimensions, especially in medical fields. As a result, the clustering result often fails to meet the users' business goals. If the clustering process can incorporate the users' knowledge (that is, the doctor's expertise or analysis intent), the clustering result can be more satisfactory. In this paper, we propose an interactive K-means clustering method to improve user satisfaction with the result. The core of this method is to use the user's feedback on the clustering result to optimize it. A particle swarm optimization algorithm is then used to optimize the parameters, especially the weight settings in the clustering algorithm, so that it reflects the user's business preference as closely as possible. Based on this parameter optimization and adjustment, the clustering result comes closer to the user's requirements. Finally, we use a breast cancer example to test our method. The experiments show the better performance of our algorithm.
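One way to make cluster assignment reflect user preference, as described above, is to weight each dimension in the distance computation. A minimal sketch of the weighted assignment step, with invented points, centres, and weights:

```python
import numpy as np

def assign_weighted(points, centers, weights):
    """K-means assignment step with per-dimension weights; the weights
    stand in for user feedback about which dimensions matter."""
    diff = points[:, None, :] - centers[None, :, :]
    dist = (weights * diff ** 2).sum(axis=2)   # weighted squared distance
    return dist.argmin(axis=1)

points = np.array([[4.0, 9.0], [1.0, 0.5]])
centers = np.array([[0.0, 10.0], [5.0, 0.0]])

# Equal weights use both dimensions; down-weighting dimension 1 tells the
# algorithm that only dimension 0 reflects the user's business interest.
labels_default = assign_weighted(points, centers, np.array([1.0, 1.0]))
labels_pref = assign_weighted(points, centers, np.array([1.0, 0.001]))
```

Under equal weights each point joins the centre closest in both dimensions; once dimension 1 is down-weighted, both points switch clusters, mimicking how PSO-tuned weights could steer the result toward the doctor's intent.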
Big Data on a Smaller Scale: A Social Media Analytics Assignment
ERIC Educational Resources Information Center
Fischbach, Sarah; Zarzosa, Jennifer
2018-01-01
It is truly important for students to understand how to monitor online marketing buzz. This assignment, social media analytics, utilizes the content analysis research method to build students' in-depth understanding of how to evaluate and interpret user-generated content (UGC) to create social media campaigns. The authors adapted Resnik and…
Students' Experiences with an Automated Essay Scorer
ERIC Educational Resources Information Center
Scharber, Cassandra; Dexter, Sara; Riedel, Eric
2008-01-01
The purpose of this research is to analyze preservice teachers' use of and reactions to an automated essay scorer used within an online, case-based learning environment called ETIPS. Data analyzed include post-assignment surveys, a user log of students' actions within the cases, instructor-assigned scores on final essays, and interviews with four…
Kuhn, Stefan; Schlörer, Nils E
2015-08-01
With its laboratory information management system, nmrshiftdb2 supports the integration of electronic lab administration and management into academic NMR facilities. It also offers the setup of a local database while granting full access to nmrshiftdb2's World Wide Web database. This freely available system allows, on the one hand, the submission of orders for measurement, transfers recorded data automatically or manually, and enables download of spectra via a web interface, as well as integrated access to the prediction, search, and assignment tools of the NMR database for lab users. On the other hand, for staff and lab administration, the flow of all orders can be supervised; administrative tools also include user and hardware management, a statistics function for accounting purposes, and a 'QuickCheck' function for assignment control, to facilitate quality control of assignments submitted to the (local) database. The laboratory information management system and database are based on a web interface as front end and are therefore independent of the operating system in use. Copyright © 2015 John Wiley & Sons, Ltd.
Multi-objective optimization for model predictive control.
Wojsznis, Willy; Mehta, Ashish; Wojsznis, Peter; Thiele, Dirk; Blevins, Terry
2007-06-01
This paper presents a technique of multi-objective optimization for Model Predictive Control (MPC) in which the optimization has three levels of objective function, in order of priority: handling constraints, maximizing economics, and maintaining control. The greatest weights are assigned dynamically to control or constraint variables that are predicted to be out of their limits. The weights assigned for economics must outweigh those assigned for control objectives. Control variables (CVs) can be controlled at fixed targets or within one- or two-sided ranges around the targets. Manipulated variables (MVs) can have assigned targets too, which may be predefined values or current actual values. This MV functionality is extremely useful when economic objectives are not defined for some or all of the MVs. To achieve this complex operation, handle process outputs predicted to go out of limits, and guarantee a solution for any condition, the technique makes use of the priority structure, penalties on slack variables, and redefinition of the constraint and control model. An engineering implementation of this approach is shown in the MPC embedded in an industrial control system. The optimization and control of a distillation column, the standard Shell heavy oil fractionator (HOF) problem, is adequately achieved with this MPC.
Optimal Periodic Cooperative Spectrum Sensing Based on Weight Fusion in Cognitive Radio Networks
Liu, Xin; Jia, Min; Gu, Xuemai; Tan, Xuezhi
2013-01-01
The performance of cooperative spectrum sensing in cognitive radio (CR) networks depends on the sensing mode, the sensing time, and the number of cooperative users. To improve the sensing performance and reduce the interference to the primary user (PU), a periodic cooperative spectrum sensing model based on weight fusion is proposed in this paper, and the sensing period, the sensing time, and the searching time are each optimized. First, the sensing period is optimized to improve spectrum utilization and reduce interference; then a joint optimization algorithm for the local sensing time and the number of cooperative users is proposed to obtain the optimal sensing time that improves the throughput of the cognitive radio user (CRU) during each period; finally, the water-filling principle is applied to optimize the searching time so that the CRU finds an idle channel within the shortest time. The simulation results show that, compared with previous algorithms, the optimal sensing period can improve the spectrum utilization of the CRU and decrease the interference to the PU significantly, the optimal sensing time gives the CRU the largest throughput, and the optimal searching time allows the CRU to find an idle channel in the least time. PMID:23604027
NASA Astrophysics Data System (ADS)
Yeh, Cheng-Ta; Lin, Yi-Kuei; Yang, Jo-Yun
2018-07-01
Network reliability is an important performance index for many real-life systems, such as electric power systems, computer systems and transportation systems. These systems can be modelled as stochastic-flow networks (SFNs) composed of arcs and nodes. Most system supervisors pursue network reliability maximization by finding the optimal multi-state resource assignment, in which one resource is assigned to each arc. However, a disaster may cause correlated failures among the assigned resources, affecting the network reliability. This article focuses on determining the optimal resource assignment with maximal network reliability for SFNs. To solve the problem, this study proposes a hybrid algorithm integrating the genetic algorithm and tabu search, called the hybrid GA-TS algorithm (HGTA), to determine the optimal assignment, and integrates minimal paths, the recursive sum of disjoint products, and the correlated binomial distribution to calculate network reliability. Several practical numerical experiments demonstrate that HGTA has better computational quality than several popular soft computing algorithms.
Diversifying customer review rankings.
Krestel, Ralf; Dokoohaki, Nima
2015-06-01
E-commerce Web sites owe much of their popularity to consumer reviews accompanying product descriptions. On-line customers spend hours going through heaps of textual reviews to decide which products to buy. At the same time, each popular product has thousands of user-generated reviews, making it impossible for a buyer to read everything. Current approaches to displaying reviews to users or recommending an individual review for a product are based on the recency or helpfulness of each review. In this paper, we present a framework to rank product reviews by optimizing the coverage of the ranking with respect to sentiment or aspects, or by summarizing all reviews with the top-K reviews in the ranking. To accomplish this, we make use of the assigned star rating for a product as an indicator of a review's sentiment polarity and compare bag-of-words (language model) with topic models (latent Dirichlet allocation) as a means of representing aspects. Our evaluation on manually annotated review data from a commercial review Web site demonstrates the effectiveness of our approach, outperforming plain recency ranking by 30% and obtaining best results by combining language and topic model representations. Copyright © 2015 Elsevier Ltd. All rights reserved.
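The coverage objective can be illustrated with a greedy top-K selection over aspect sets. The reviews and aspects below are invented, and the paper derives aspects from language/topic models rather than hand labels:

```python
# Invented reviews, each tagged with the product aspects it covers.
reviews = {
    "r1": {"battery", "screen"},
    "r2": {"battery"},
    "r3": {"price", "shipping"},
    "r4": {"screen", "price", "camera"},
}

def greedy_top_k(reviews, k):
    """Greedily rank k reviews to maximize the number of covered aspects
    (the standard greedy approximation for maximum coverage)."""
    remaining = dict(reviews)
    covered, ranking = set(), []
    for _ in range(min(k, len(remaining))):
        # Pick the review contributing the most not-yet-covered aspects.
        best = max(remaining, key=lambda r: len(remaining[r] - covered))
        ranking.append(best)
        covered |= remaining.pop(best)
    return ranking, covered

ranking, covered = greedy_top_k(reviews, k=2)
```

The first pick is the review covering the most aspects, and subsequent picks reward novelty rather than recency or raw helpfulness, which is the diversification idea in the abstract.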
Computer program documentation: CYBER to Univac binary conversion user's guide
NASA Technical Reports Server (NTRS)
Martin, E. W.
1980-01-01
A user's guide for a computer program which will convert SINDA temperature history data from CDC (Cyber) binary format to UNIVAC 1100 binary format is presented. The various options available, the required input, the optional output, file assignments, and the restrictions of the program are discussed.
Sweet, Kevin; Sturm, Amy C; Rettig, Amy; McElroy, Joseph; Agnese, Doreen
2015-06-01
A descriptive retrospective study was performed using two separate user cohorts to determine the effectiveness of Family HealthLink as a clinical triage tool. Cohort 1 consisted of 2,502 users who accessed the public website. Cohort 2 consisted of 194 new patients in a Comprehensive Breast Center setting. For patient users, we assessed documentation of family history and genetics referral. For all users seen in a genetics clinic, the Family HealthLink assessment was compared with that performed by genetic counselors, along with genetic testing outcomes. For general public users, the percentages meeting high-risk criteria were: for cancer only, 22.2%; for coronary heart disease only, 24.3%; and for both diseases, 10.4%. These risk stratification percentages were similar for the patient users. For the patient users, family history of certain cancer types was often documented by oncology professionals, but age of onset and coronary heart disease family history were less complete. Of 142 high-risk assignments reviewed in a genetics clinic, 130 (91.5%) were corroborated. Forty-two users underwent genetic testing and 17 (40.5%) had new molecular diagnoses established. A significant percentage of individuals are at high familial risk and may require more intensive screening and referral. Interactive family history triage tools can aid this process. Genet Med 17(6), 493-500.
Models based on "out-of Kilter" algorithm
NASA Astrophysics Data System (ADS)
Adler, M. J.; Drobot, R.
2012-04-01
In the case of many water users along a river stretch, it is very important during low flows and drought periods to develop an optimization model for water allocation that covers all needs under predefined constraints, depending on the Contingency Plan for drought management. Such a program was developed during the implementation of the WATMAN Project in Romania (WATMAN Project, 2005-2006, USTDA) for the Arges-Dambovita-Ialomita basin water transfers. This good practice was proposed for the WATER CoRe Project Good Practice Handbook for Drought Management (Interreg IVC, 2011), to be applied in the European regions. Two types of simulation-optimization models based on an improved version of the out-of-kilter algorithm as the optimization technique have been developed and used in Romania: • models for founding the short-term operation of a WMS, • models generically named SIMOPT that analyze long-term WMS operation and have as their main results the statistical WMS functional parameters. A real WMS is modeled by an arcs-nodes network, so the real WMS operation problem becomes a problem of flows in networks. The nodes and oriented arcs, as well as their characteristics such as lower and upper limits and associated costs, are the direct analog of the physical and operational WMS characteristics. Arcs represent both physical and conventional elements of a WMS such as river branches, channels or pipes, water user demands or other water management requirements, tranches of water reservoir volumes, and water levels in channels or rivers; nodes are junctions of at least two arcs and stand for locations of lakes or water reservoirs and/or confluences of river branches, water withdrawal or wastewater discharge points, etc. Quantitative features of water resources, water users, and water reservoirs or other water works are expressed as constraints of not violating the lower and upper limits assigned on arcs. Options of WMS functioning, i.e.
water retention/discharge in/from the reservoirs or diversion of water from one part of the WMS to another in order to meet water demands, as well as the water users' economic benefit or loss related to the degree of water demand satisfaction, are the defining elements of the objective function and are conventionally expressed by means of costs attached to the arcs. The problem of optimizing the WMS operation is formulated as a network-flow problem as follows: find the flow that minimizes the cost over the whole network while meeting the continuity constraints at nodes and not exceeding the lower and upper flow limits on arcs. Conversion of the WMS into the arcs-nodes network and the adequate choice of costs and limits on arcs are steps of a unitary process and depend on the goal of the respective model.
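The arcs-nodes formulation above is a minimum-cost network-flow problem. A minimal sketch using networkx's min-cost-flow solver as a stand-in for the out-of-kilter algorithm; the network and all numbers are invented, not the Romanian system:

```python
import networkx as nx

# Toy water-allocation network: a reservoir must deliver 4 units to a
# user over two routes; 'weight' plays the role of the cost on each arc.
G = nx.DiGraph()
G.add_node("reservoir", demand=-4)   # supplies 4 units
G.add_node("user", demand=4)         # requires 4 units
# A cheap river branch with limited capacity and a costlier diversion
# channel (illustrative capacities and costs).
G.add_edge("reservoir", "river", capacity=2, weight=1)
G.add_edge("reservoir", "channel", capacity=3, weight=3)
G.add_edge("river", "user", capacity=2, weight=0)
G.add_edge("channel", "user", capacity=3, weight=0)

flow = nx.min_cost_flow(G)
total_cost = nx.cost_of_flow(G, flow)
```

The solver fills the cheap river branch first and routes the remainder through the channel, which is the same trade-off the cost-on-arcs encoding expresses for real reservoirs and diversions.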
Wang, Zhaocai; Pu, Jun; Cao, Liling; Tan, Jian
2015-10-23
The unbalanced assignment problem (UAP) is to optimally resolve the problem of assigning n jobs to m individuals (m &lt; n), such that minimum cost or maximum profit is obtained. It is a vitally important Non-deterministic Polynomial (NP)-complete problem in operations management and applied mathematics, with numerous real-life applications. In this paper, we present a new parallel DNA algorithm for solving the unbalanced assignment problem using DNA molecular operations. We reasonably design flexible-length DNA strands representing different jobs and individuals, take appropriate steps, and obtain solutions of the UAP in the proper length range and O(mn) time. We extend the application of DNA molecular operations and use their inherent parallelism to simplify the complexity of the computation.
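For orientation, the UAP can also be solved classically by replicating each individual's cost row and running the Hungarian method. This sketch, with invented costs, shows that reduction, not the paper's DNA procedure:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def solve_uap(cost, copies):
    """Solve a small UAP (m individuals, n > m jobs) by replicating each
    individual's cost row `copies` times and running the Hungarian
    method, so each individual may take up to `copies` jobs."""
    big = np.repeat(cost, copies, axis=0)
    rows, cols = linear_sum_assignment(big)
    assignment = [(r // copies, c) for r, c in zip(rows, cols)]
    return assignment, int(big[rows, cols].sum())

# 2 individuals, 4 jobs; entry [i, j] = cost of individual i on job j.
cost = np.array([[4, 1, 3, 2],
                 [2, 0, 5, 3]])
assignment, total = solve_uap(cost, copies=2)
```

Here each individual ends up with two jobs and the total cost matches the sum of the per-job minima, so the replication trick loses no optimality on this instance.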
Optimal joint management of a coastal aquifer and a substitute resource
NASA Astrophysics Data System (ADS)
Moreaux, M.; Reynaud, A.
2004-06-01
This article characterizes the optimal joint management of a coastal aquifer and a costly water substitute. For this purpose we use a mathematical representation of the aquifer that incorporates the displacement of the interface between the seawater and the freshwater of the aquifer. We identify the spatial cost externalities created by users on each other and show that the optimal water supply depends on the location of users. Users located in the coastal zone exclusively use the costly substitute. Those located in the more upstream area are supplied from the aquifer; at the optimum, their withdrawals must take into account the cost externalities they generate on users located downstream. Lastly, users located in a median zone use the aquifer with a surface transportation cost. We show that the optimum can be implemented in a decentralized economy through a very simple Pigouvian tax. Finally, the optimal and decentralized extraction policies are simulated on a very simple example.
NASA Technical Reports Server (NTRS)
Pindera, Marek-Jerzy; Salzar, Robert S.
1996-01-01
A user's guide for the computer program OPTCOMP2 is presented in this report. This program provides a capability to optimize the fabrication or service-induced residual stresses in unidirectional metal matrix composites subjected to combined thermomechanical axisymmetric loading by altering the processing history, as well as through the microstructural design of interfacial fiber coatings. The user specifies the initial architecture of the composite and the load history, with the constituent materials being elastic, plastic, viscoplastic, or as defined by the 'user-defined' constitutive model, in addition to the objective function and constraints, through a user-friendly data input interface. The optimization procedure is based on an efficient solution methodology for the inelastic response of a fiber/interface layer(s)/matrix concentric cylinder model where the interface layers can be either homogeneous or heterogeneous. The response of heterogeneous layers is modeled using Aboudi's three-dimensional method of cells micromechanics model. The commercial optimization package DOT is used for the nonlinear optimization problem. The solution methodology for the arbitrarily layered cylinder is based on the local-global stiffness matrix formulation and Mendelson's iterative technique of successive elastic solutions developed for elastoplastic boundary-value problems. The optimization algorithm employed in DOT is based on the method of feasible directions.
Leveraging technology and staffing in developing a new liaison program.
Williams, Jeff; McCrillis, Aileen; McGowan, Richard; Nicholson, Joey; Surkis, Alisa; Thompson, Holly; Vieira, Dorice
2014-01-01
With nearly all library resources and services delivered digitally, librarians working for the New York University Health Sciences Library struggled with maintaining awareness of changing user needs, understanding barriers faced in using library resources and services, and determining knowledge management challenges across the organization. A liaison program was created to provide opportunities for librarians to meaningfully engage with users. The program was directed toward a subset of high-priority user groups to provide focused engagement with these users. Responsibility for providing routine reference service was reduced for liaison librarians to provide maximum time to engage with their assigned user communities.
A CT and MRI scan to MCNP input conversion program.
Van Riper, Kenneth A
2005-01-01
We describe a new program to read a sequence of tomographic scans and prepare the geometry and material sections of an MCNP input file. Image processing techniques include contrast controls and mapping of grey scales to colour. The user interface provides several tools with which the user can associate a range of image intensities to an MCNP material. Materials are loaded from a library. A separate material assignment can be made to a pixel intensity or range of intensities when that intensity dominates the image boundaries; this material is assigned to all pixels with that intensity contiguous with the boundary. Material fractions are computed in a user-specified voxel grid overlaying the scans. New materials are defined by mixing the library materials using the fractions. The geometry can be written as an MCNP lattice or as individual cells. A combination algorithm can be used to join neighbouring cells with the same material.
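The intensity-to-material mapping and voxel-fraction computation described above can be sketched in a few lines. The intensity ranges, material names, and voxel size below are illustrative assumptions, not values from the actual program:

```python
# Hypothetical intensity ranges (inclusive lower, exclusive upper) -> material.
MATERIAL_RANGES = {"air": (0, 30), "soft_tissue": (30, 120), "bone": (120, 256)}

def assign_material(intensity):
    """Map a pixel intensity to a material index via the range table."""
    for idx, (lo, hi) in enumerate(MATERIAL_RANGES.values()):
        if lo <= intensity < hi:
            return idx
    raise ValueError(f"intensity {intensity} outside known ranges")

def voxel_fractions(scan, vh=2, vw=2):
    """Material volume fractions in each vh-by-vw voxel overlaying the scan."""
    n_mat = len(MATERIAL_RANGES)
    rows, cols = len(scan), len(scan[0])
    fracs = {}
    for i in range(rows // vh):
        for j in range(cols // vw):
            counts = [0] * n_mat
            for r in range(i * vh, (i + 1) * vh):
                for c in range(j * vw, (j + 1) * vw):
                    counts[assign_material(scan[r][c])] += 1
            fracs[(i, j)] = [k / (vh * vw) for k in counts]
    return fracs

scan = [[10, 10, 200, 200],
        [10, 50, 200, 200]]
fracs = voxel_fractions(scan)
```

Mixing the library materials with these per-voxel fractions then yields the new materials written to the MCNP lattice, as the abstract describes.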
NASA Astrophysics Data System (ADS)
Yang, Peng; Peng, Yongfei; Ye, Bin; Miao, Lixin
2017-09-01
This article explores the integrated optimization problem of location assignment and sequencing in multi-shuttle automated storage/retrieval systems under the modified 2n-command cycle pattern. The decisions of storage and retrieval (S/R) location assignment and S/R request sequencing are considered jointly. An integer quadratic programming model is formulated to describe this integrated optimization problem. Solving the model yields the optimal travel cycles for multi-shuttle S/R machines to process the S/R requests in the storage and retrieval order lists. Small instances are solved optimally using CPLEX. For large instances, two tabu search algorithms are proposed, in which first come, first served and nearest neighbour rules are used to generate initial solutions. Various numerical experiments examine the heuristics' performance and the sensitivity of the algorithm parameters. Furthermore, the experimental results are analysed from the viewpoint of practical application, and a parameter list for applying the proposed heuristics is recommended for different real-life scenarios.
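One of the two seeding rules named above, nearest neighbour, can be sketched as follows. The rack coordinates and the Chebyshev travel metric (an S/R machine moving horizontally and vertically at the same time) are assumptions for illustration:

```python
def travel(a, b):
    """Chebyshev distance: horizontal and vertical moves happen simultaneously."""
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def nearest_neighbour_sequence(requests, start=(0, 0)):
    """Build an initial request sequence by always visiting the closest
    remaining location; tabu search would then improve this seed."""
    remaining = list(requests)
    seq, pos = [], start
    while remaining:
        nxt = min(remaining, key=lambda loc: travel(pos, loc))
        remaining.remove(nxt)
        seq.append(nxt)
        pos = nxt
    return seq

reqs = [(5, 1), (1, 1), (4, 4)]  # hypothetical (aisle, tier) locations
seq = nearest_neighbour_sequence(reqs)
```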
Stockburger, D W
1999-05-01
Active server pages permit a software developer to customize the Web experience for users by inserting server-side script and database access into Web pages. This paper describes applications of these techniques and provides a primer on the use of these methods. Applications include a system that generates and grades individualized homework assignments and tests for statistics students. The student accesses the system as a Web page, prints out the assignment, does the assignment, and enters the answers on the Web page. The server, running on NT Server 4.0, grades the assignment, updates the grade book (on a database), and returns the answer key to the student.
Analysis of labor employment assessment on production machine to minimize time production
NASA Astrophysics Data System (ADS)
Hernawati, Tri; Suliawati; Sari Gumay, Vita
2018-03-01
Every company, whether in services or manufacturing, tries to improve the efficiency of its resource use. One resource with an important role is labor, and individual workers have different efficiency levels for different jobs. Problems concerning the optimal allocation of workers with differing efficiency levels to different jobs are called assignment problems, a special case of linear programming. In this research, an analysis of labor assignment to production machines to minimize production time at PT PDM is carried out using the Hungarian algorithm. The aim of the research is to obtain an optimal assignment of workers to production machines that minimizes production time. The results show that the existing assignment is not optimal, because its completion time is longer than that obtained with the Hungarian algorithm; applying the Hungarian algorithm yields a time saving of 16%.
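The assignment model above can be sketched on a toy instance. The worker/machine times are hypothetical, and exhaustive search stands in for the Hungarian algorithm (both find the same optimum; the Hungarian algorithm does so in O(n^3) rather than O(n!)):

```python
from itertools import permutations

# times[w][m]: hypothetical completion time of worker w on machine m.
times = [
    [14, 5, 8],
    [2, 12, 6],
    [7, 8, 3],
]

def best_assignment(cost):
    """Exhaustive search over all one-to-one worker -> machine assignments."""
    n = len(cost)
    best = min(permutations(range(n)),
               key=lambda p: sum(cost[w][p[w]] for w in range(n)))
    return list(best), sum(cost[w][best[w]] for w in range(n))

assignment, total = best_assignment(times)
```

Here worker 0 goes to machine 1, worker 1 to machine 0, and worker 2 to machine 2, which is the kind of reassignment that produced the 16% saving reported above.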
Variational Trajectory Optimization Tool Set: Technical description and user's manual
NASA Technical Reports Server (NTRS)
Bless, Robert R.; Queen, Eric M.; Cavanaugh, Michael D.; Wetzel, Todd A.; Moerder, Daniel D.
1993-01-01
The algorithms that comprise the Variational Trajectory Optimization Tool Set (VTOTS) package are briefly described. The VTOTS is a software package for solving nonlinear constrained optimal control problems from a wide range of engineering and scientific disciplines. The VTOTS package was specifically designed to minimize the amount of user programming; in fact, for problems that may be expressed in terms of analytical functions, the user needs only to define the problem in terms of symbolic variables. This version of the VTOTS does not support tabular data; thus, problems must be expressed in terms of analytical functions. The VTOTS package consists of two methods for solving nonlinear optimal control problems: a time-domain finite-element algorithm and a multiple shooting algorithm. These two algorithms, under the VTOTS package, may be run independently or jointly. The finite-element algorithm generates approximate solutions, whereas the shooting algorithm provides a more accurate solution to the optimization problem. A user's manual, some examples with results, and a brief description of the individual subroutines are included.
Speechlinks: Robust Cross-Lingual Tactical Communication Aids
2008-06-01
domain, the ontology based translation has proven to be challenging to build in this domain, however recent developments show promising results...assignments, and the effect of domain knowledge on those requirements. • Improving the front end of the speech recognizer remains one of the most challenging ...users by being very selective. 4.2.3.2 Analysis of the Normal user type inference result Figure 4.11 shows one of the most challenging users to
The emergence of Zipf's law - Spontaneous encoding optimization by users of a command language
NASA Technical Reports Server (NTRS)
Ellis, S. R.; Hitchcock, R. J.
1986-01-01
The distribution of commands issued by experienced users of a computer operating system allowing command customization tends to conform to Zipf's law. This result documents the emergence of a statistical property of natural language as users master an artificial language. Analysis of Zipf's law by Mandelbrot and Cherry shows that its emergence in the computer interaction of experienced users may be interpreted as evidence that these users optimize their encoding of commands. Accordingly, the extent to which users of a command language exhibit Zipf's law can provide a metric of the naturalness and efficiency with which that language is used.
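The metric suggested above can be sketched directly: rank commands by frequency and fit the slope of log-frequency against log-rank; a slope near -1 indicates Zipf-like usage. The command log below is a made-up illustration:

```python
import math
from collections import Counter

def zipf_slope(tokens):
    """Least-squares slope of log(frequency) vs log(rank) for a token stream."""
    freqs = sorted(Counter(tokens).values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
            sum((x - mx) ** 2 for x in xs))

# Hypothetical command log: 'ls' used 8 times, 'cd' 4, 'cp' 2, 'mv' once.
log = ["ls"] * 8 + ["cd"] * 4 + ["cp"] * 2 + ["mv"]
slope = zipf_slope(log)
```

On this tiny log the slope comes out between -1 and -2, i.e. roughly Zipfian; real command histories would of course need far more data for the fit to be meaningful.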
Content-aware photo collage using circle packing.
Yu, Zongqiao; Lu, Lin; Guo, Yanwen; Fan, Rongfei; Liu, Mingming; Wang, Wenping
2014-02-01
In this paper, we present a novel approach for automatically creating photo collages that assemble the interest regions of a given group of images naturally. Previous methods for photo collage are generally built upon a well-defined optimization framework, which computes all the geometric parameters and layer indices for the input photos on the given canvas by optimizing a unified objective function. The complex nonlinear form of the optimization function limits their scalability and efficiency. From a geometric point of view, we recast the generation of a collage as a region partition problem such that each image is displayed in its corresponding region partitioned from the canvas. The core of this is an efficient power-diagram-based circle packing algorithm that arranges a series of circles assigned to the input photos compactly in the given canvas. To favor important photos, the circles are associated with image importances determined by an image ranking process. A heuristic search process is developed to ensure that the salient information of each photo is displayed in the polygonal area resulting from circle packing. With our new formulation, each factor influencing the state of a photo is optimized in an independent stage, and the computation of the optimal states for neighboring photos is completely decoupled. This improves the scalability of collage results and ensures their diversity. We also devise a saliency-based image fusion scheme to generate seamless composite collages. Our approach can generate collages on nonrectangular canvases and supports interactive collage, allowing the user to refine collage results according to his/her personal preferences. We conduct extensive experiments and show the superiority of our algorithm by comparing against previous methods.
Effects of Talker Variability on Vowel Recognition in Cochlear Implants
ERIC Educational Resources Information Center
Chang, Yi-ping; Fu, Qian-Jie
2006-01-01
Purpose: To investigate the effects of talker variability on vowel recognition by cochlear implant (CI) users and by normal-hearing (NH) participants listening to 4-channel acoustic CI simulations. Method: CI users were tested with their clinically assigned speech processors. For NH participants, 3 CI processors were simulated, using different…
Frequency Allocation; The Radio Spectrum.
ERIC Educational Resources Information Center
Federal Communications Commission, Washington, DC.
The Federal Communications Commission (FCC) assigns segments of the radio spectrum to categories of users, and specific frequencies within each segment to individual users. Since demand for channel space exceeds supply, the process is complex. The radio spectrum can be compared to a long ruler: the portion from 10-540 kiloHertz has been set aside…
Evaluation of new multimedia formats for cancer communications.
Bader, Judith L; Strickman-Stein, Nancy
2003-01-01
Providing quality, current cancer information to cancer patients and their families is a key function of the National Cancer Institute (NCI) Web site. This information is now provided in a predominantly text format, but could be provided in formats using multimedia, including animation and sound. Since users have many choices about where to get their information, it is important to provide the information in a format that is helpful and that they prefer. To pilot and evaluate multimedia strategies for future cancer-information program formats for lay users, the National Cancer Institute created new multimedia versions of existing text programs. We sought to evaluate user performance and preference on these 3 new formats and on the 2 existing text formats. The National Cancer Institute's "What You Need to Know About Lung Cancer" program was the test vehicle. There were 5 testing sessions, 1 dedicated to each format. Each session lasted about 1 hour, with 9 participants per session and 45 users overall. Users were exposed to the assigned cancer program from beginning to end in 1 of 5 formats: text paperback booklet, paperback booklet formatted in HTML on the Web, spoken audio alone, spoken audio synchronized with a text Web page, and Flash multimedia (animation, spoken audio, and text). Immediately thereafter, the features and design of the 4 alternative formats were demonstrated in detail. A multiple-choice pre-test and post-test quiz on the cancer content was used to assess user learning (performance) before and after experiencing the assigned program. The quiz was administered using an Authorware software interface that wrote to an Access database. Users were asked to rank from 1 to 5 their preference for the 5 program formats, and to provide structured and open-ended comments about the usability of the 5 formats. Significant improvement in scores from pre-test to post-test was seen for the total study population. 
Average scores for users in each of the 5 format groups improved significantly. Increments in improvement, however, were not statistically different between any of the format groups. Significant improvements in quiz scores were seen irrespective of age group or education level. Of the users, 71.1% ranked the Flash program first among the 5 formats, and 84.4% rated Flash as their first or second choice. Audio was the least-preferred format, ranking fifth among 46.7% of users and first among none. Flash was ranked first among users regardless of education level, age group, or format group to which the user was assigned. Under the pilot study conditions, users overwhelmingly preferred the Flash format to the other 4 formats. Learning occurred equally in all formats. Use of multimedia should be considered as communication strategies are developed for updating cancer content and attracting new users.
SpaceWire Protocol ID: What Does It Mean To You?
NASA Technical Reports Server (NTRS)
Rakow, Glenn; Schnurr, Richard; Gilley, Daniel; Parks, Steve
2006-01-01
SpaceWire is becoming a popular solution for satellite high-speed data buses because it is a simple standard that provides great flexibility for a wide range of system requirements. It is simple in packet format and protocol, allowing users to easily tailor their implementation to their specific application. Some of the attractive aspects of SpaceWire that make it easy to implement, however, also make future reuse hard. Protocol reuse is difficult because SpaceWire does not define a mechanism for communicating with the higher layers of the protocol stack. This has forced users of SpaceWire to define unique packet formats and to define how these packets are to be processed. Each mission writes its own Interface Control Document (ICD) and tailors SpaceWire to its specific requirements, making reuse difficult. Part of the reason for this habit may be that engineers typically optimize designs for their own requirements in the absence of a standard. This is an inefficient use of project resources and increases mission development cost. A new packet format for SpaceWire has been defined as a solution to this problem. The new format complements the SpaceWire standard and supports protocol development on top of SpaceWire. The new packet definition does not replace the current packet structure, i.e., it does not make the standard obsolete, but merely extends the standard for those who want to develop protocols over SpaceWire. The SpaceWire packet is defined with the first part being the Destination Address, which may be one or more bytes. This is followed by the packet cargo, which is user defined. The cargo is terminated with an End-Of-Packet (EOP) marker. This packet structure has low overhead and allows the user to define how the contents are formatted. It also provides for many different addressing schemes, which gives flexibility in the system. This packet flexibility is typically an attractive part of SpaceWire. 
The new extended packet format adds one field to the packet that greatly enhances the capability of SpaceWire. This new field, called the Protocol Identifier (ID), identifies the packet contents and the associated processing for the packet. This feature, along with the packet-format restriction that accompanies the Protocol ID, allows a deterministic method of decoding packets that was not previously possible. The first part of the packet is still the Destination Address, which conforms to the original standard with one restriction: the first byte seen at the destination by the user must be a logical address, independent of the addressing scheme used. The second field is the Protocol ID, which is usually one byte in length. The packet cargo (user defined) follows the Protocol ID, and after the cargo is the EOP, which marks the end of the packet. The value of the Protocol ID is assigned by the SpaceWire working group, and the protocol description is published for others to use. The development of protocols for SpaceWire is currently the area of greatest activity in the SpaceWire working group. The first protocol definition by the working group has been completed and is now in the process of formal standardization. Many other protocols are in development for missions that have not yet received formal Protocol ID assignment, but even where protocols are not formally assigned a value, this effort will provide synergy for future developments.
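The layout just described (logical address, Protocol ID, cargo, EOP) can be sketched as follows. Python lists stand in for the link-layer byte stream, and the address and Protocol ID values are illustrative, not assignments from the working group:

```python
EOP = object()  # stand-in for the end-of-packet marker delivered by the link layer

def build_packet(logical_address, protocol_id, cargo):
    """Extended-format packet: one-byte logical address, one-byte
    Protocol ID, user-defined cargo, EOP marker."""
    return [logical_address, protocol_id, *cargo, EOP]

def parse_packet(packet):
    """Deterministic decode enabled by the fixed field order."""
    assert packet[-1] is EOP, "packet must be terminated by an EOP marker"
    logical_address, protocol_id, *cargo = packet[:-1]
    return {"address": logical_address,
            "protocol_id": protocol_id,
            "cargo": bytes(cargo)}

pkt = build_packet(0x42, 0x01, b"\xde\xad\xbe\xef")
fields = parse_packet(pkt)
```

Because the Protocol ID always sits immediately after the logical address, a receiver can dispatch the cargo to the right protocol handler without any mission-specific convention, which is exactly the reuse benefit the abstract argues for.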
MacKinnon, Neil; Somashekar, Bagganahalli S; Tripathi, Pratima; Ge, Wencheng; Rajendiran, Thekkelnaycke M; Chinnaiyan, Arul M; Ramamoorthy, Ayyalusamy
2013-01-01
Nuclear magnetic resonance based measurements of small-molecule mixtures continue to be confronted with the challenge of spectral assignment. While multi-dimensional experiments are capable of addressing this challenge, the imposed time constraint becomes prohibitive, particularly with the large sample sets commonly encountered in metabolomic studies. Thus, one-dimensional spectral assignment is routinely performed, guided by two-dimensional experiments on a selected sample subset; however, a publicly available graphical interface for aiding in this process has been lacking. We have collected spectral information for 360 unique compounds from publicly available databases, including chemical shift lists and authentic full-resolution spectra, supplemented with spectral information for 25 compounds collected in-house at a proton NMR frequency of 900 MHz. This library serves as the basis for MetaboID, a Matlab-based user interface designed to aid the one-dimensional spectral assignment process. The tools of MetaboID guide resonance assignment in order of increasing confidence, from cursory compound searches based on chemical shift positions to analysis of authentic spike experiments. Together, these tools streamline the often repetitive task of spectral assignment. The overarching goal of the integrated MetaboID toolbox is to centralize the one-dimensional spectral assignment process, from providing access to large chemical shift libraries to providing a straightforward, intuitive means of spectral comparison. Such a toolbox should be attractive to both experienced and new metabolomic researchers as well as to general complex-mixture analysts. Copyright © 2012 Elsevier Inc. All rights reserved.
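The cursory chemical-shift search that such a workflow starts from can be sketched as follows. The library entries, peak list, and tolerance are illustrative stand-ins, not MetaboID's actual data:

```python
# Hypothetical 1H chemical shift library (ppm) for three compounds.
LIBRARY = {
    "lactate": [1.33, 4.11],
    "alanine": [1.48, 3.78],
    "glucose": [3.24, 3.41, 3.53, 4.64, 5.23],
}

def match_compounds(observed, tol=0.03):
    """Score each compound by the fraction of its library shifts that
    lie within `tol` ppm of some observed peak."""
    scores = {}
    for name, shifts in LIBRARY.items():
        hits = sum(any(abs(s - p) <= tol for p in observed) for s in shifts)
        scores[name] = hits / len(shifts)
    return scores

peaks = [1.32, 4.12, 3.25, 5.24]   # hypothetical observed 1D peaks
scores = match_compounds(peaks)
```

A high score only nominates a candidate; the abstract's point is that confidence then increases through spectral overlay and, ultimately, authentic spike experiments.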
The Optimal Well Locator ( OWL) program was designed and developed by USEPA to be a screening tool to evaluate and optimize the placement of wells in long term monitoring networks at small sites. The first objective of the OWL program is to allow the user to visualize the change ...
Pricing Resources in LTE Networks through Multiobjective Optimization
Lai, Yung-Liang; Jiang, Jehn-Ruey
2014-01-01
The LTE technology offers versatile mobile services that use different numbers of resources. This enables operators to provide subscribers or users with differential quality of service (QoS) to boost their satisfaction. On one hand, LTE operators need to price the resources high for maximizing their profits. On the other hand, pricing also needs to consider user satisfaction with allocated resources and prices to avoid “user churn,” which means subscribers will unsubscribe services due to dissatisfaction with allocated resources or prices. In this paper, we study the pricing resources with profits and satisfaction optimization (PRPSO) problem in the LTE networks, considering the operator profit and subscribers' satisfaction at the same time. The problem is modelled as nonlinear multiobjective optimization with two optimal objectives: (1) maximizing operator profit and (2) maximizing user satisfaction. We propose to solve the problem based on the framework of the NSGA-II. Simulations are conducted for evaluating the proposed solution. PMID:24526889
Playing Games with Optimal Competitive Scheduling
NASA Technical Reports Server (NTRS)
Frank, Jeremy; Crawford, James; Khatib, Lina; Brafman, Ronen
2005-01-01
This paper is concerned with the problem of allocating a unit capacity resource to multiple users within a pre-defined time period. The resource is indivisible, so that at most one user can use it at each time instance. However, different users may use it at different times. The users have independent, selfish preferences for when and for how long they are allocated this resource. Thus, they value different resource access durations differently, and they value different time slots differently. We seek an optimal allocation schedule for this resource.
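One natural formalization of this allocation problem, under assumptions the abstract does not fix (each user requests a single interval with a private value, and the schedule maximizes total value), is weighted interval scheduling. A minimal sketch:

```python
from bisect import bisect_right

def max_value_schedule(requests):
    """requests: list of (start, end, value), end exclusive.
    Returns the maximum total value of a non-overlapping subset."""
    reqs = sorted(requests, key=lambda r: r[1])
    ends = [r[1] for r in reqs]
    best = [0] * (len(reqs) + 1)   # best[i]: optimum over the first i requests
    for i, (s, e, v) in enumerate(reqs, 1):
        # latest earlier request ending no later than this one's start
        j = bisect_right(ends, s, 0, i - 1)
        best[i] = max(best[i - 1], best[j] + v)
    return best[-1]

# Hypothetical user requests for the shared resource's timeline.
reqs = [(0, 3, 5), (2, 5, 6), (4, 7, 5), (6, 9, 3)]
value = max_value_schedule(reqs)
```

Here the optimum grants the slots [0, 3) and [4, 7) for a total value of 10; the paper's setting is harder because the users' valuations are private and selfish, which is where the game-theoretic treatment comes in.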
Transmit Designs for the MIMO Broadcast Channel With Statistical CSI
NASA Astrophysics Data System (ADS)
Wu, Yongpeng; Jin, Shi; Gao, Xiqi; McKay, Matthew R.; Xiao, Chengshan
2014-09-01
We investigate the multiple-input multiple-output broadcast channel with statistical channel state information available at the transmitter. The so-called linear assignment operation is employed, and necessary conditions are derived for the optimal transmit design under general fading conditions. Based on this, we introduce an iterative algorithm that maximizes the linear assignment weighted sum-rate by applying a gradient descent method. To reduce complexity, we derive an upper bound on the linear assignment achievable rate of each receiver, from which a simplified closed-form expression for a near-optimal linear assignment matrix is derived. This reveals an interesting construction analogous to that of dirty-paper coding. In light of this, a low-complexity transmission scheme is provided. Numerical examples illustrate the strong performance of the proposed low-complexity scheme.
Wang, Zhaocai; Pu, Jun; Cao, Liling; Tan, Jian
2015-01-01
The unbalanced assignment problem (UAP) is to optimally assign n jobs to m individuals (m < n) such that a minimum cost or maximum profit is obtained. It is a vitally important NP-complete problem in operations management and applied mathematics, with numerous real-life applications. In this paper, we present a new parallel DNA algorithm for solving the unbalanced assignment problem using DNA molecular operations. We design flexible-length DNA strands representing the different jobs and individuals, take appropriate steps, and obtain the solutions of the UAP in the proper length range in O(mn) time. We extend the application of DNA molecular operations and exploit their parallelism to reduce the complexity of the computation. PMID:26512650
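The classical reduction underlying the UAP can be sketched as follows: with more jobs than individuals, pad the cost matrix with dummy individuals of zero cost and solve the resulting square assignment problem. The costs are hypothetical, and brute force stands in for the paper's DNA-based solver (or the Hungarian algorithm):

```python
from itertools import permutations

def solve_uap(cost):
    """cost[i][j]: cost of individual i doing job j, with m rows < n columns.
    Pads to a square matrix with zero-cost dummy individuals, then solves
    the balanced problem exhaustively (fine for toy instances)."""
    m, n = len(cost), len(cost[0])
    padded = cost + [[0] * n for _ in range(n - m)]
    best = min(permutations(range(n)),
               key=lambda p: sum(padded[i][p[i]] for i in range(n)))
    real = {i: best[i] for i in range(m)}        # drop the dummy rows
    total = sum(cost[i][j] for i, j in real.items())
    return real, total

cost = [[4, 2, 8],     # 2 individuals, 3 jobs; one job stays unassigned
        [4, 3, 7]]
uap_assignment, uap_total = solve_uap(cost)
```

Jobs matched to dummy rows are simply the jobs left undone, which is what "unbalanced" means in this formulation.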
Scheduling Jobs and a Variable Maintenance on a Single Machine with Common Due-Date Assignment
Wan, Long
2014-01-01
We investigate a common due-date assignment scheduling problem with a variable maintenance on a single machine. The goal is to minimize the total earliness, tardiness, and due-date cost. We derive some properties on an optimal solution for our problem. For a special case with identical jobs we propose an optimal polynomial time algorithm followed by a numerical example. PMID:25147861
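The objective above can be sketched for a tiny instance without the variable maintenance. The unit earliness/tardiness weights and the linear due-date cost below are assumptions, since the abstract does not state the paper's exact cost coefficients:

```python
from itertools import permutations

def cost(proc_times, d, alpha=1.0, beta=1.0, gamma=0.5):
    """Total earliness + tardiness + due-date cost for a job sequence
    sharing the common due date d (alpha, beta, gamma are assumed weights)."""
    t, total = 0, gamma * d * len(proc_times)
    for p in proc_times:
        t += p                                   # completion time of this job
        total += alpha * max(0, d - t) + beta * max(0, t - d)
    return total

def best_schedule(proc_times):
    """Tiny exhaustive search; the common due date is known to be optimal
    at some job's completion time, so only those candidates are tried."""
    best = None
    for seq in permutations(proc_times):
        comps, t = [], 0
        for p in seq:
            t += p
            comps.append(t)
        for d in comps + [0]:
            c = cost(list(seq), d)
            if best is None or c < best[0]:
                best = (c, list(seq), d)
    return best

c, seq, d = best_schedule([1, 2, 3])
```

Restricting d to job completion times is a standard structural property of common due-date problems; the paper's contribution is handling the variable maintenance alongside this structure.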
Ubiquitous Green Computing Techniques for High Demand Applications in Smart Environments
Zapater, Marina; Sanchez, Cesar; Ayala, Jose L.; Moya, Jose M.; Risco-Martín, José L.
2012-01-01
Ubiquitous sensor network deployments, such as those found in Smart city and Ambient intelligence applications, face constantly increasing computational demands in order to process data and offer services to users. The nature of these applications implies the use of data centers. Research has paid much attention to the energy consumption of the sensor nodes in WSN infrastructures. However, supercomputing facilities are the ones with the higher economic and environmental impact, due to their very high power consumption; this problem has been disregarded in the field of smart environment services. This paper proposes an energy-minimization workload assignment technique, based on heterogeneity and application-awareness, that redistributes low-demand computational tasks from high-performance facilities to idle nodes with low and medium resources in the WSN infrastructure. These non-optimal allocation policies reduce the energy consumed by the whole infrastructure and the total execution time. PMID:23112621
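A greedy sketch of the heterogeneity-aware idea: route each task to the feasible node with the lowest energy cost per unit of work, spilling over to the high-performance facility only when the cheap nodes are full. The node table and cost model are hypothetical, not the paper's actual policy:

```python
def assign_tasks(tasks, nodes):
    """tasks: list of demands; nodes: {name: (energy_per_unit, capacity)}.
    Greedily place each task on the cheapest node with spare capacity."""
    placement, load = {}, {name: 0 for name in nodes}
    for t, demand in enumerate(tasks):
        feasible = [n for n, (e, cap) in nodes.items()
                    if load[n] + demand <= cap]
        target = min(feasible, key=lambda n: nodes[n][0])
        placement[t] = target
        load[target] += demand
    return placement

# Hypothetical infrastructure: one power-hungry facility, two idle WSN nodes.
nodes = {"hpc": (10.0, 100), "edge1": (1.0, 3), "edge2": (1.5, 3)}
placement = assign_tasks([2, 2, 2], nodes)
```

The first two low-demand tasks land on the edge nodes and only the third falls back to the data center, mirroring the redistribution the abstract describes.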
Essential issues in multiprocessor systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gajski, D.D.; Peir, J.K.
1985-06-01
During the past several years, a great number of proposals have been made with the objective of increasing supercomputer performance by an order of magnitude through the utilization of new computer architectures. The present paper is concerned with a suitable classification scheme for comparing these architectures. It is pointed out that there are basically four schools of thought as to the most important factor for an enhancement of computer performance. According to one school, the development of faster circuits will make it possible to retain present architectures, except, possibly, for a mechanism providing synchronization of parallel processes. A second school assigns priority to the optimization and vectorization of compilers, which will detect parallelism and help users write better parallel programs. A third school believes in the predominant importance of new parallel algorithms, while the fourth school supports new models of computation. The merits of the four approaches are critically evaluated. 50 references.
Rise, Marit By; Solbjør, Marit; Lara, Mariela C; Westerlund, Heidi; Grimstad, Hilde; Steinsbekk, Aslak
2013-09-01
Patient and public involvement in health care is important, but existing definitions of the concept do not integrate the stakeholders' own perceptions. The aim of this study was to investigate and compare service users' and service providers' own definitions of patient and public involvement and their implications. This was a qualitative study, based mainly on individual in-depth semi-structured interviews conducted between June 2007 and June 2009; data were analysed using a grounded theory approach. A total of 20 patients, 13 public representatives and 44 health service providers/managers in both somatic and mental health care were interviewed. A common definition of patient and public involvement emerged: it is founded on mutual respect and carried out through dialogue aiming at shared decision making. Nevertheless, users and providers assigned different values to its core aspects: respect was imperative for service users but merely implied for providers; dialogue was a way to gain respect for service users and a way to achieve good outcomes for providers; and both parties worried that the other wanted to make decisions alone. Users and providers need to consider that although they share a common definition of involvement in health care, they assign different values to its aspects. Increasing and improving patient and public involvement therefore requires knowledge of, and dialogue between the parties about, these differences. © 2011 John Wiley & Sons Ltd.
Intelligent viewing control for robotic and automation systems
NASA Astrophysics Data System (ADS)
Schenker, Paul S.; Peters, Stephen F.; Paljug, Eric D.; Kim, Won S.
1994-10-01
We present a new system for supervisory automated control of multiple remote cameras. Our primary purpose in developing this system has been to provide a capability for knowledge-based, 'hands-off' viewing during execution of teleoperation/telerobotic tasks. The reported technology has broader applicability to remote surveillance, telescience observation, automated manufacturing workcells, etc. We refer to this new capability as 'Intelligent Viewing Control (IVC),' distinguishing it from simple programmed camera motion control. In the IVC system, camera viewing assignment, sequencing, positioning, panning, and parameter adjustment (zoom, focus, aperture, etc.) are invoked and interactively executed in real time by a knowledge-based controller, drawing on a priori known task models and constraints, including operator preferences. This multi-camera control is integrated with a real-time, high-fidelity 3D graphics simulation, which is correctly calibrated in perspective to the actual cameras and their platform kinematics (translation/pan-tilt). Such a merged graphics-with-video design allows the system user to preview and modify the planned ('choreographed') viewing sequences. Further, during actual task execution, the system operator has available both the resulting optimized video sequence and supplementary graphics views from arbitrary perspectives. IVC, including operator-interactive designation of robot task actions, is presented to the user as a well-integrated, single-screen video-graphic user interface allowing easy access to all relevant telerobot communication/command/control resources. We describe and show pictorial results of a preliminary IVC system implementation for telerobotic servicing of a satellite.
Link failure detection in a parallel computer
Archer, Charles J.; Blocksome, Michael A.; Megerian, Mark G.; Smith, Brian E.
2010-11-09
Methods, apparatus, and products are disclosed for link failure detection in a parallel computer including compute nodes connected in a rectangular mesh network, each pair of adjacent compute nodes in the rectangular mesh network connected together using a pair of links, that includes: assigning each compute node to either a first group or a second group such that adjacent compute nodes in the rectangular mesh network are assigned to different groups; sending, by each of the compute nodes assigned to the first group, a first test message to each adjacent compute node assigned to the second group; determining, by each of the compute nodes assigned to the second group, whether the first test message was received from each adjacent compute node assigned to the first group; and notifying a user, by each of the compute nodes assigned to the second group, whether the first test message was received.
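The two-group scheme in the abstract can be sketched in a few lines. The following is a hypothetical simulation for illustration only (the node coordinates, link representation, and function names are invented; the patented method runs on the machine's actual mesh network): nodes are 2-colored by coordinate parity so that adjacent nodes always land in different groups, and a group-1 node reports any group-0 neighbor whose test message never arrives.

```python
def color(node):
    """Assign a node to group 0 or 1 by coordinate parity; adjacent nodes
    in a rectangular mesh always fall in different groups."""
    x, y = node
    return (x + y) % 2

def detect_failed_links(width, height, working_links):
    """Group-0 nodes send a test message over each link to their group-1
    neighbors; a missing message marks the link as failed (here simulated
    by checking membership in the set of working links)."""
    failed = []
    for x in range(width):
        for y in range(height):
            node = (x, y)
            if color(node) != 0:
                continue  # only group-0 nodes send in this round
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nbr = (x + dx, y + dy)
                if 0 <= nbr[0] < width and 0 <= nbr[1] < height:
                    # frozenset models an undirected link between the pair
                    if frozenset((node, nbr)) not in working_links:
                        failed.append((node, nbr))  # message never arrived
    return failed

# 2x2 mesh with the link between (0, 0) and (1, 0) broken
links = {frozenset(p) for p in [((0, 0), (0, 1)),
                                ((0, 1), (1, 1)),
                                ((1, 0), (1, 1))]}
print(detect_failed_links(2, 2, links))  # -> [((0, 0), (1, 0))]
```

The parity coloring is what lets every link be exercised in a single round without two nodes sending to each other simultaneously.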
User's manual for the BNW-I optimization code for dry-cooled power plants. Volume III. [PLCIRI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Braun, D.J.; Daniel, D.J.; De Mier, W.V.
1977-01-01
This appendix to the User's Manual for the BNW-I Optimization Code for Dry-Cooled Power Plants provides a listing of the BNW-I optimization code for determining, for a particular size of power plant, the optimum dry cooling tower design using a plastic-tube cooling surface and a circular tower arrangement of the tube bundles. (LCL)
A Language for Specifying Compiler Optimizations for Generic Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willcock, Jeremiah J.
2007-01-01
Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization because they are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.
GeMS: an advanced software package for designing synthetic genes.
Jayaraj, Sebastian; Reid, Ralph; Santi, Daniel V
2005-01-01
A user-friendly, advanced software package for gene design is described. The software comprises an integrated suite of programs, also provided as stand-alone tools, that automatically perform the following tasks in gene design: restriction site prediction, codon optimization for any expression host, restriction site inclusion and exclusion, separation of long sequences into synthesizable fragments, T(m) and stem-loop determinations, optimal oligonucleotide component design, and design verification/error-checking. The output is a complete design report and a list of optimized oligonucleotides to be prepared for subsequent gene synthesis. The user interface accommodates both inexperienced and experienced users. For inexperienced users, explanatory notes are provided so that detailed instructions are not necessary; for experienced users, a streamlined interface is provided without such notes. The software has been extensively tested in the design and successful synthesis of over 400 kb of genes, many of which exceeded 5 kb in length.
Lim, Meng-Hui; Teoh, Andrew Beng Jin; Toh, Kar-Ann
2013-06-01
Biometric discretization is a key component in biometric cryptographic key generation. It converts an extracted biometric feature vector into a binary string via typical steps such as segmentation of each feature element into a number of labeled intervals, mapping of each interval-captured feature element onto a binary space, and concatenation of the resulting binary output of all feature elements into a binary string. Currently, the detection rate optimized bit allocation (DROBA) scheme is one of the most effective biometric discretization schemes in terms of its capability to assign binary bits dynamically to user-specific features with respect to their discriminability. However, we find that DROBA suffers from potential discriminative feature misdetection and underdiscretization in its bit allocation process. This paper highlights these drawbacks and improves upon DROBA with a novel two-stage algorithm: 1) a dynamic search method to efficiently recapture misdetected features and to optimize the bit allocation of underdiscretized features, and 2) a genuine interval concealment technique to alleviate the crucial information leakage resulting from the dynamic search. Improvements in classification accuracy on two popular face data sets vindicate the feasibility of our approach compared with DROBA.
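DROBA's actual bit allocation is computed by dynamic programming over measured per-feature detection rates; the sketch below captures only the core idea of discriminability-driven allocation, using a greedy rule and a hypothetical detection-rate table (all names and numbers invented for illustration):

```python
import heapq

def allocate_bits(detection_rates, total_bits, max_bits_per_feature=4):
    """Greedy sketch: repeatedly grant the next bit to the feature whose
    detection rate stays highest after refining its quantization.
    detection_rates[i][b] = detection rate of feature i when discretized
    with b bits (hypothetical precomputed table; index 0 = no bits)."""
    bits = [0] * len(detection_rates)
    # max-heap (via negation) keyed by the rate after adding one more bit
    heap = [(-rates[1], i) for i, rates in enumerate(detection_rates)]
    heapq.heapify(heap)
    budget = total_bits
    while budget > 0 and heap:
        neg_rate, i = heapq.heappop(heap)
        bits[i] += 1
        budget -= 1
        nb = bits[i] + 1
        if nb <= max_bits_per_feature and nb < len(detection_rates[i]):
            heapq.heappush(heap, (-detection_rates[i][nb], i))
    return bits

# feature 0 stays discriminative at 1 bit; feature 1 degrades faster
rates = [[1.0, 0.9, 0.5], [1.0, 0.7, 0.6]]
print(allocate_bits(rates, 2))  # -> [1, 1]
```

A greedy rule like this is not guaranteed to match the DP-optimal allocation, which is exactly the kind of gap that motivates the paper's dynamic search stage.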
A multimedia retrieval framework based on semi-supervised ranking and relevance feedback.
Yang, Yi; Nie, Feiping; Xu, Dong; Luo, Jiebo; Zhuang, Yueting; Pan, Yunhe
2012-04-01
We present a new framework for multimedia content analysis and retrieval that consists of two independent algorithms. First, we propose a new semi-supervised algorithm called ranking with Local Regression and Global Alignment (LRGA) to learn a robust Laplacian matrix for data ranking. In LRGA, for each data point, a local linear regression model is used to predict the ranking scores of its neighboring points. A unified objective function is then proposed to globally align the local models from all the data points so that an optimal ranking score can be assigned to each data point. Second, we propose a semi-supervised long-term Relevance Feedback (RF) algorithm to refine the multimedia data representation. The proposed long-term RF algorithm utilizes both the multimedia data distribution in the multimedia feature space and the historical RF information provided by users. A trace ratio optimization problem is then formulated and solved by an efficient algorithm. The algorithms have been applied to several content-based multimedia retrieval applications, including cross-media retrieval, image retrieval, and 3D motion/pose data retrieval. Comprehensive experiments on four data sets have demonstrated the framework's advantages in precision, robustness, scalability, and computational efficiency.
NASA Astrophysics Data System (ADS)
Weinmann, Martin; Jutzi, Boris; Hinz, Stefan; Mallet, Clément
2015-07-01
3D scene analysis, in the sense of automatically assigning each 3D point a semantic label, has become a topic of great importance in photogrammetry, remote sensing, computer vision and robotics. In this paper, we address the issue of how to increase the distinctiveness of geometric features and select the most relevant ones for 3D scene analysis. We present a new, fully automated and versatile framework composed of four components: (i) neighborhood selection, (ii) feature extraction, (iii) feature selection and (iv) classification. For each component, we consider a variety of approaches chosen for simplicity, efficiency and reproducibility, so that end-users can easily apply the different components without expert knowledge in the respective domains. In a detailed evaluation involving 7 neighborhood definitions, 21 geometric features, 7 approaches for feature selection, 10 classifiers and 2 benchmark datasets, we demonstrate that the selection of optimal neighborhoods for individual 3D points significantly improves the results of 3D scene analysis. Additionally, we show that the selection of adequate feature subsets may further increase the quality of the derived results while significantly reducing both processing time and memory consumption.
Gouge, Brian; Dowlatabadi, Hadi; Ries, Francis J
2013-04-16
In contrast to capital control strategies (i.e., investments in new technology), the potential of operational control strategies (e.g., vehicle scheduling optimization) to reduce the health and climate impacts of emissions from public transportation bus fleets has not been widely considered. This case study demonstrates that heterogeneity in the emission levels of different bus technologies and in the exposure potential of bus routes can be exploited through optimization (e.g., of how vehicles are assigned to routes) to minimize these impacts as well as operating costs. The magnitude of the benefits of the optimization depends on the specific transit system and region. Health impacts were found to be particularly sensitive to different vehicle assignments and ranged from worst- to best-case assignment by more than a factor of 2, suggesting there is significant potential to reduce health impacts. Trade-offs between climate, health, and cost objectives were also found. Transit agencies that do not consider these objectives in an integrated framework and, for example, optimize for costs and/or climate impacts alone risk inadvertently increasing health impacts by as much as 49%. Cost-benefit analysis was used to evaluate trade-offs between objectives, but large uncertainties make identifying an optimal solution challenging.
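The vehicle-to-route matching described here is a classic assignment problem. As a minimal sketch (the 2x2 impact matrix and its units are invented; the study's actual objective combines modeled health, climate, and cost terms), exhaustive search over permutations suffices for small fleets:

```python
from itertools import permutations

def best_assignment(impact):
    """Exhaustive search over vehicle-to-route assignments (fine for small
    fleets): impact[v][r] = modeled combined impact of running vehicle v
    on route r (hypothetical units)."""
    n = len(impact)
    best = min(permutations(range(n)),
               key=lambda perm: sum(impact[v][r] for v, r in enumerate(perm)))
    return list(best), sum(impact[v][r] for v, r in enumerate(best))

# Two buses (an old diesel and a hybrid) and two routes (route 0 has high
# population exposure): sending the cleaner bus to the exposed route wins.
impact = [[8.0, 3.0],   # diesel: costly on the dense route
          [2.0, 2.5]]   # hybrid: cheap everywhere
print(best_assignment(impact))  # -> ([1, 0], 5.0)
```

For fleets of realistic size, a polynomial-time solver such as the Hungarian algorithm would replace the factorial search, but the objective structure is the same.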
ERIC Educational Resources Information Center
Caputi, Peter; Chan, Amy; Jayasuriya, Rohan
2011-01-01
This paper examined the impact of training strategies on the types of errors that novice users make when learning a commonly used spreadsheet application. Fifty participants were assigned to a counterfactual thinking training (CFT) strategy, an error management training strategy, or a combination of both strategies, and completed an easy task…
Mining Social Tagging Data for Enhanced Subject Access for Readers and Researchers
ERIC Educational Resources Information Center
Lawson, Karen G.
2009-01-01
Social tagging enables librarians to partner with users to provide enhanced subject access. This paper quantifies and compares LC subject headings from each of 31 different subject divisions with user tags from Amazon.com and LibraryThing assigned to the same titles. The intersection and integration of these schemas is described and evaluated.…
Corsi, Karen F; Lehman, Wayne E; Min, Sung-Joon; Lance, Shannon P; Speer, Nicole; Booth, Robert E; Shoptaw, Steve
2012-06-04
This paper reports on a feasibility study that examined contingency management among out-of-treatment, heterosexual methamphetamine users and the reduction of drug use and HIV risk. Fifty-eight meth users were recruited through street outreach in Denver from November 2006 through March 2007. The low sample size reflects that this was a pilot study to see if CM is feasible in an out-of-treatment, street-recruited population of meth users. Secondary aims were to examine whether reductions in drug use and risk behavior could be found. Subjects were randomly assigned to contingency management (CM) or CM plus strengths-based case management (CM/SBCM), with follow-up at 4 and 8 months. Participants were primarily White (90%), 52% male, and averaged 38 years of age. Eighty-three percent attended at least one CM session, with 29% attending at least fifteen. All participants reduced meth use significantly at follow-up. Those who attended more sessions submitted more stimulant-free urines than those who attended fewer sessions. Participants assigned to CM/SBCM attended more sessions and earned more vouchers than clients in CM. Similarly, participants reported reduced needle-sharing and sex risk. Findings demonstrate that CM and SBCM may help meth users reduce drug use and HIV risk.
NASA Technical Reports Server (NTRS)
Pindera, Marek-Jerzy; Salzar, Robert S.; Williams, Todd O.
1994-01-01
A user's guide for the computer program OPTCOMP is presented in this report. This program provides a capability to optimize the fabrication or service-induced residual stresses in uni-directional metal matrix composites subjected to combined thermo-mechanical axisymmetric loading using compensating or compliant layers at the fiber/matrix interface. The user specifies the architecture and the initial material parameters of the interfacial region, which can be either elastic or elastoplastic, and defines the design variables, together with the objective function, the associated constraints and the loading history through a user-friendly data input interface. The optimization procedure is based on an efficient solution methodology for the elastoplastic response of an arbitrarily layered multiple concentric cylinder model that is coupled to the commercial optimization package DOT. The solution methodology for the arbitrarily layered cylinder is based on the local-global stiffness matrix formulation and Mendelson's iterative technique of successive elastic solutions developed for elastoplastic boundary-value problems. The optimization algorithm employed in DOT is based on the method of feasible directions.
Fog computing job scheduling optimization based on bees swarm
NASA Astrophysics Data System (ADS)
Bitam, Salim; Zeadally, Sherali; Mellouk, Abdelhamid
2018-04-01
Fog computing is a new computing architecture composed of a set of near-user edge devices called fog nodes, which collaborate to perform computational services such as running applications, storing large amounts of data, and transmitting messages. Fog computing extends cloud computing by deploying digital resources on the premises of mobile users. In this new paradigm, management and operating functions such as job scheduling aim at providing high-performance, cost-effective services that are requested by mobile users and executed by fog nodes. We propose a new bio-inspired optimization approach called the Bees Life Algorithm (BLA) to address the job scheduling problem in the fog computing environment. Our proposed approach is based on the optimized distribution of a set of tasks among all the fog computing nodes. The objective is to find an optimal tradeoff between the CPU execution time and the allocated memory required by fog computing services established by mobile users. Our empirical performance evaluation results demonstrate that the proposed approach outperforms traditional particle swarm optimization and genetic algorithms in terms of CPU execution time and allocated memory.
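The full Bees Life Algorithm has distinct reproduction and food-foraging phases; the sketch below keeps only its flavor, random scouting plus local refinement of the best assignment found so far, with a toy objective and hypothetical task/node parameters (all names and numbers are invented for illustration):

```python
import random

def schedule(tasks, nodes, iters=200, bees=10, seed=1):
    """Bee-style random search (a simplification, not the full BLA):
    start from a random task-to-node assignment, then worker bees mutate
    one task's placement at a time, keeping strict improvements.
    tasks[t] = (cpu_demand, mem_demand); nodes[n] = (cpu_speed, mem_weight)."""
    rng = random.Random(seed)

    def cost(assign):
        # tradeoff objective: CPU execution time plus weighted allocated memory
        return sum(tasks[t][0] / nodes[n][0] + tasks[t][1] * nodes[n][1]
                   for t, n in enumerate(assign))

    best = [rng.randrange(len(nodes)) for _ in tasks]
    for _ in range(iters):
        for _ in range(bees):
            cand = best[:]
            cand[rng.randrange(len(tasks))] = rng.randrange(len(nodes))
            if cost(cand) < cost(best):
                best = cand
    return best, cost(best)

tasks = [(4.0, 1.0), (2.0, 1.0)]   # (cpu demand, memory demand)
nodes = [(4.0, 0.1), (1.0, 0.5)]   # (cpu speed, memory cost weight)
assignment, total = schedule(tasks, nodes)
print(assignment)  # the fast, memory-cheap node 0 wins both tasks
```

The same skeleton extends to many tasks and nodes; only the cost function and the mutation neighborhood need to change to match the paper's actual objective.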
A Human-Centered Smart Home System with Wearable-Sensor Behavior Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ji, Jianting; Liu, Ting; Shen, Chao
Smart home has recently attracted much research interest owing to its potential for improving the quality of human life. Obtaining the user's demand is the most important and challenging task for optimal appliance scheduling in a smart home, since demand is highly related to the user's unpredictable behavior. In this paper, a human-centered smart home system is proposed to identify user behavior, predict user demand, and schedule the household appliances. First, the sensor data from the user's wearable devices are monitored to profile the user's full-day behavior. Then, an appliance-demand matrix is constructed to predict the user's demand on the home environment; this matrix is extracted from the history of appliance load data and user behavior. Two simulations are designed to demonstrate user behavior identification, appliance-demand matrix construction, and the generation of an optimal appliance scheduling strategy.
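The abstract does not publish the matrix construction itself; the sketch below shows one plausible (assumed) form, estimating how often each appliance is on during each wearable-inferred behavior and scheduling appliances whose estimated demand crosses a threshold. All data and names are invented for illustration:

```python
from collections import Counter

def build_demand_matrix(history):
    """Sketch of an appliance-demand matrix (assumed structure): estimate
    P(appliance on | user behavior) from joint logs of wearable-inferred
    behavior and appliance state. history = [(behavior, appliance, on), ...]."""
    on = Counter()
    total = Counter()
    for behavior, appliance, is_on in history:
        total[(behavior, appliance)] += 1
        on[(behavior, appliance)] += int(is_on)
    return {key: on[key] / total[key] for key in total}

def predict(matrix, behavior, threshold=0.5):
    """Schedule the appliances whose demand probability for the current
    behavior exceeds the threshold."""
    return sorted(a for (b, a), p in matrix.items()
                  if b == behavior and p >= threshold)

log = [("cooking", "oven", True), ("cooking", "oven", True),
       ("cooking", "tv", False), ("sleeping", "oven", False),
       ("sleeping", "heater", True)]
m = build_demand_matrix(log)
print(predict(m, "cooking"))  # -> ['oven']
```

In the paper's setting, the behavior labels would come from the wearable-sensor classifier rather than a hand-written log.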
Assigning Resources to Health Care Use for Health Services Research: Options and Consequences
Fishman, Paul A.; Hornbrook, Mark C.
2013-01-01
Aims: Our goals are threefold: 1) to review the leading options for assigning resource coefficients to health services utilization; 2) to discuss the relative advantages of each option; and 3) to provide examples where the research question had marked implications for the choice of resource measure. Methods: Three approaches have been used to establish relative resource weights in health services research: a) direct estimation of production costs through micro-costing or step-down allocation methods; b) macro-costing/regression analysis; and c) standardized resource assignment. We describe each of these methods and provide examples of how the study question drove the choice of resource use measure. Findings: All empirical resource-intensity weighting systems contain distortions that limit their universal application. Hence, users must select the weighting system that matches the needs of their specific analysis. All systems require significant data resources and data processing. However, inattention to the distortions contained in a complex resource weighting system may undermine the validity and generalizability of an economic evaluation. Conclusions: Direct estimation of production costs is useful for empirical analyses, but it contains distortions that undermine optimal resource allocation decisions. Researchers must ensure that the data being used meet both the study design and the question being addressed. They should also ensure that the choice of resource measure is the best fit for the analysis. Implications for Research and Policy: Researchers should consider which of the available measures is the most appropriate for the question being addressed rather than take 'cost' or utilization as a variable over which they have no control. PMID:19536002
Zilcha-Mano, Sigal; Keefe, John R; Chui, Harold; Rubin, Avinadav; Barrett, Marna S; Barber, Jacques P
2016-12-01
Premature discontinuation of therapy is a widespread problem that hampers the delivery of mental health treatment. A high degree of variability has been found among rates of premature treatment discontinuation, suggesting that rates may differ depending on potential moderators. In the current study, our aim was to identify demographic and interpersonal variables that moderate the association between treatment assignment and dropout. Data from a randomized controlled trial conducted from November 2001 through June 2007 (N = 156) comparing supportive-expressive therapy, antidepressant medication, and placebo for the treatment of depression (based on DSM-IV criteria) were used. Twenty prerandomization variables were chosen based on previous literature. These variables were subjected to exploratory bootstrapped variable selection and included in the logistic regression models if they passed variable selection. Three variables were found to moderate the association between treatment assignment and dropout: age, pretreatment therapeutic alliance expectations, and the presence of vindictive tendencies in interpersonal relationships. When patients were divided into those randomly assigned to their optimal treatment and those assigned to their least optimal treatment, dropout rates in the optimal treatment group (24.4%) were significantly lower than those in the least optimal treatment group (47.4%; P = .03). Present findings suggest that a patient's age and pretreatment interpersonal characteristics predict the association between common depression treatments and dropout rate. If validated by further studies, these characteristics can assist in reducing dropout through targeted treatment assignment. Secondary analysis of data from ClinicalTrials.gov identifier: NCT00043550. © Copyright 2016 Physicians Postgraduate Press, Inc.
Trajectory Optimization: OTIS 4
NASA Technical Reports Server (NTRS)
Riehl, John P.; Sjauw, Waldy K.; Falck, Robert D.; Paris, Stephen W.
2010-01-01
The latest release of the Optimal Trajectories by Implicit Simulation program (OTIS4) allows users to simulate and optimize aerospace vehicle trajectories. With OTIS4, one can seamlessly generate optimal trajectories and parametric vehicle designs simultaneously. New features also allow OTIS4 to solve non-aerospace continuous-time optimal control problems. The inputs and outputs of OTIS4 have been updated extensively from previous versions. Inputs now make use of object-oriented constructs, including one called a metastring. Metastrings use a greatly improved calculator and common nomenclature to reduce the user's workload. They allow for more flexibility in specifying vehicle physical models, boundary conditions, and path constraints. The OTIS4 calculator supports common mathematical functions, Boolean operations, and conditional statements. This allows users to define their own variables for use as outputs, constraints, or objective functions. The user-defined outputs can directly interface with other programs, such as spreadsheets, plotting packages, and visualization programs. Internally, OTIS4 has more explicit and implicit integration procedures, including high-order collocation methods, the pseudo-spectral method, and several variations of multiple shooting. Users may switch easily between the various methods. Several unique numerical techniques, such as automated variable scaling and implicit integration grid refinement, support the integration methods. OTIS4 is also significantly more user-friendly than previous versions. The installation process is nearly identical on various platforms, including Microsoft Windows, Apple OS X, and Linux operating systems. Cross-platform scripts also help make the execution of OTIS and post-processing of data easier. OTIS4 is supplied free by NASA and is subject to ITAR (International Traffic in Arms Regulations) restrictions. Users must have a Fortran compiler, and a Python interpreter is highly recommended.
Optimizing Marine Security Guard Assignments
2011-06-01
[Search snippet; only fragments of the thesis are recoverable:] The model assigns Marines to billets using fit criteria such as rank and gender, and minimizes the amount of movement when filling the billets. Stated constraints include: female MSGs are not assigned to embassies that are not configured for females, and only DC-qualified MSGs are assigned to DC billets. (A table of detachment locations and regions, e.g., Bangkok, Thailand; Fort Lauderdale, Florida; Frankfurt, Germany, is omitted.)
NASA Astrophysics Data System (ADS)
Foronda, Augusto; Ohta, Chikara; Tamaki, Hisashi
Dirty paper coding (DPC) is a strategy for achieving the capacity region of multiple-input multiple-output (MIMO) downlink channels, and a DPC scheduler is throughput-optimal if users are selected according to their queue states and current rates. However, DPC is difficult to implement in practical systems. One solution, the zero-forcing beamforming (ZFBF) strategy, has been proposed to achieve the same asymptotic sum-rate capacity as DPC with an exhaustive search over the entire user set. Some suboptimal user group selection schedulers with reduced complexity based on the ZFBF strategy (ZFBF-SUS) and the proportional fair (PF) scheduling algorithm (PF-ZFBF) have also been proposed to enhance throughput and fairness among users, respectively. However, they are not throughput-optimal, and fairness and throughput decrease if user queue lengths differ due to differences in the users' channel quality. Therefore, we propose two different scheduling algorithms: a throughput-optimal scheduling algorithm (ZFBF-TO) and a reduced-complexity scheduling algorithm (ZFBF-RC). Both are based on the ZFBF strategy and, at every time slot, must select some users based on user channel quality, user queue length and orthogonality among users. Moreover, the proposed algorithms produce the rate allocation and power allocation for the selected users based on a modified water-filling method. We analyze the schedulers' complexity, and numerical results show that ZFBF-RC provides throughput and fairness improvements compared to the ZFBF-SUS and PF-ZFBF scheduling algorithms.
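The paper's modified water-filling method is not specified in the abstract; below is a sketch of the textbook water-filling allocation it builds on, which assigns power p_i = max(0, mu - 1/g_i) over channel gains g_i so that the allocations sum to the power budget (the gains and budget in the example are invented):

```python
def water_filling(gains, total_power):
    """Classic water-filling sketch: find the water level mu by dropping
    channels that would receive negative power, then fill the rest.
    Assumes a nonempty gain list and a positive power budget."""
    inv = sorted(1.0 / g for g in gains)  # channel "floor" heights 1/g_i
    k = len(inv)
    while k > 0:
        mu = (total_power + sum(inv[:k])) / k
        if mu > inv[k - 1]:  # weakest kept channel still gets positive power
            break
        k -= 1               # otherwise drop it and recompute the level
    return [max(0.0, mu - 1.0 / g) for g in gains]

# two strong channels and one weak one under a small power budget:
# the weak channel is shut off entirely
print(water_filling([1.0, 0.5, 0.1], 1.0))  # -> [1.0, 0.0, 0.0]
```

The scheduler in the paper additionally folds queue lengths into the allocation, which is what the "modified" qualifier refers to.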
Optimal Assignment Methods in Three-Form Planned Missing Data Designs for Longitudinal Panel Studies
ERIC Educational Resources Information Center
Jorgensen, Terrence D.; Rhemtulla, Mijke; Schoemann, Alexander; McPherson, Brent; Wu, Wei; Little, Todd D.
2014-01-01
Planned missing designs are becoming increasingly popular, but because there is no consensus on how to implement them in longitudinal research, we simulated longitudinal data to distinguish between strategies of assigning items to forms and of assigning forms to participants across measurement occasions. Using relative efficiency as the criterion,…
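For readers unfamiliar with the design, the classic three-form layout splits the item pool into a common block X plus item sets A, B, and C, with each form omitting exactly one set; the strategies the article compares then vary how forms are assigned to participants across measurement occasions. A minimal sketch (function names, form labels, and the simple rotation rule are invented for illustration):

```python
def three_form_design(common, a, b, c):
    """Classic three-form planned-missing layout: every form gets the
    common X block plus two of the three item sets A, B, C, so each
    participant skips exactly one set."""
    return {"form1": common + a + b,
            "form2": common + a + c,
            "form3": common + b + c}

def assign_forms(participant_ids):
    """Simple rotation of participants across forms; the article compares
    richer strategies, e.g. re-randomizing forms at each occasion."""
    forms = ["form1", "form2", "form3"]
    return {pid: forms[i % 3] for i, pid in enumerate(participant_ids)}

design = three_form_design(["x1", "x2"], ["a1"], ["b1"], ["c1"])
print(design["form3"])  # -> ['x1', 'x2', 'b1', 'c1']
```

Because every pair of item sets co-occurs on some form, all pairwise covariances remain estimable, which is what makes the planned missingness recoverable under standard missing-data estimators.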
Takanokura, Masato
2010-03-22
A four-wheeled walker is a valuable tool for assisting elderly persons with walking. The handgrip height is one of the most important factors determining the usefulness of the walker. However, the optimal handgrip height for elderly users has not been considered from a biomechanical viewpoint. In this study, the handgrip height was optimized by a two-dimensional mechanical model to reduce muscular loads in the lower body as well as in the upper body under various road conditions during steady walking. A critical handgrip height existed at 48% of the user's body height, regardless of gender and body dimensions. A lower handgrip relieved muscular load for stooping users with a lower standing height. The stooping user pushed the handgrip strongly in the perpendicular direction by leaning the upper body on the walker. However, upright users with a higher standing height should use a four-wheeled walker with a higher handgrip to maintain an upright posture. For downhill movement, the optimal handgrip height depended on the slope angle and the friction coefficient between the road and the wheels of the walker. On a low-friction downhill surface such as asphalt with a steeper slope angle, the user was required to maintain an erect trunk with a higher handgrip and to press the handgrip strongly in the perpendicular direction. Movement on a low-friction road was easier for users on flat and uphill roads, but it demanded distinct effort from users when moving downhill. Copyright (c) 2009 Elsevier Ltd. All rights reserved.
The Trip Itinerary Optimization Platform: A Framework for Personalized Travel Information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kwasnik, Ted; Carmichael, Scott P.; Arent, Douglas J
The New Concepts Incubator team at the National Renewable Energy Laboratory (NREL) developed a three-stage online platform for travel diary collection, personal travel plan optimization and travel itinerary visualization. In the first stage, users provide a travel diary for the previous day through an interactive map and calendar interface and survey for travel attitudes and behaviors. One or more days later, users are invited via email to engage in a second stage where they view a personal mobility dashboard displaying recommended travel itineraries generated from a novel framework that optimizes travel outcomes over a sequence of interrelated trips. A weekmore » or more after viewing these recommended travel itineraries on the dashboard, users are emailed again to engage in a third stage where they complete a final survey about travel attitudes and behaviors. A usability study of the platform conducted online showed that, in general, users found the system valuable for informing their travel decisions. A total of 274 individuals were recruited through Amazon Mechanical Turk, an online survey platform, to participate in a transportation study using this platform. On average, the platform distilled 65 feasible travel plans per individual into two recommended itineraries, each optimal according to one or more outcomes and dependent on the fixed times and locations from the travel diary. For 45 percent of users, the trip recommendation algorithm returned only a single, typically automobile-centric, itinerary because there were no other viable alternative transportation modes available. Platform users generally agreed that the dashboard was enjoyable and easy to use, and that it would be a helpful tool in adopting new travel behaviors. Users generally agreed most that the time, cost and user preferred recommendations 'made sense' to them, and were most willing to implement these itineraries. 
Platform users typically expressed low willingness to try the carbon- and calorie-optimized itineraries. Of the platform users who viewed the dashboard, 13 percent reported changing their travel behavior, most adopting the time-, calorie-, or carbon-optimized itineraries. While the algorithm incorporates a wealth of travel data obtained from online APIs pertaining to a traveler's route, such as historical traffic conditions, public transit timetables, and bike path routes, open-ended responses from users expressed an interest in the integration of even more fine-grained traffic data and the ability to dynamically model the effect of changes in travel times. Users also commonly expressed concerns over the safety of walking and biking recommendations. Responses indicate that more information about the amenities available to cyclists and pedestrians (sidewalks, shade from trees, access to food) and routes that avoid areas of perceived elevated danger would reduce barriers to implementing these recommendations. More accurate representations of personal vehicle trips (based on vehicle make and model, and the implications of parking) and the identification of routes that optimize caloric intensity (seeking out elevation changes or longer walks to public transit) are promising avenues for future research.
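The "sequence of interrelated trips" framing above can be pictured as choosing one mode per trip in a chained plan so that a scalarized outcome is minimized. The sketch below is not NREL's algorithm (which optimizes several outcomes); it is a minimal brute-force illustration with invented durations, costs, and weights.

```python
# Sketch (hypothetical data): pick one mode per trip in a day's chain of
# interrelated trips, minimizing a single weighted outcome (time + cost).
from itertools import product

# (minutes, dollars) per mode for each trip; values invented for illustration
trips = [
    {"drive": (20, 4.0), "bus": (35, 2.5), "bike": (40, 0.0)},
    {"drive": (15, 3.0), "bus": (25, 2.5), "bike": (30, 0.0)},
]

def plan_cost(modes, w_time=1.0, w_cost=2.0):
    """Weighted sum of travel time and monetary cost over the trip chain."""
    total = 0.0
    for trip, mode in zip(trips, modes):
        t, c = trip[mode]
        total += w_time * t + w_cost * c
    return total

# enumerate every feasible mode combination (2 trips x 3 modes = 9 plans)
best = min(product(*[t.keys() for t in trips]), key=plan_cost)
print(best, plan_cost(best))  # → ('drive', 'drive') 49.0
```

With these invented weights the cheapest plan is automobile-centric, echoing the abstract's observation that many users received a single car-based itinerary.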
21 CFR 803.33 - If I am a user facility, what must I include when I submit an annual report?
Code of Federal Regulations, 2011 CFR
2011-04-01
..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES MEDICAL DEVICE REPORTING User Facility... medical device reports, or the number assigned by us for reporting purposes in accordance with § 803.3; (2...) Date of the annual report and report numbers identifying the range of medical device reports that you...
21 CFR 803.33 - If I am a user facility, what must I include when I submit an annual report?
Code of Federal Regulations, 2010 CFR
2010-04-01
..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES MEDICAL DEVICE REPORTING User Facility... medical device reports, or the number assigned by us for reporting purposes in accordance with § 803.3; (2...) Date of the annual report and report numbers identifying the range of medical device reports that you...
1986-01-01
Research Methodology. This chapter describes the methodology and the experimental design used for this research. (Table-of-contents residue from the source: Experimental Design; Task/Treatment; Task Design; Figure 3.3 Interface Experiment Elements; Figure 3.4 Experimental Design; Figure 3.5 Subject Assignment.)
ERIC Educational Resources Information Center
Burkholder-Juhasz, Rose A.; Levi, Susannah V.; Dillon, Caitlin M.; Pisoni, David B.
2007-01-01
Nonword repetition skills were examined in 24 pediatric cochlear implant (CI) users and 18 normal-hearing (NH) adult listeners listening through a CI simulator. Two separate groups of NH adult listeners assigned accuracy ratings to the nonword responses of the pediatric CI users and the NH adult speakers. Overall, the nonword repetitions of…
Words That Fascinate the Listener: Predicting Affective Ratings of On-Line Lectures
ERIC Educational Resources Information Center
Weninger, Felix; Staudt, Pascal; Schuller, Björn
2013-01-01
In a large scale study on 843 transcripts of Technology, Entertainment and Design (TED) talks, the authors address the relation between word usage and categorical affective ratings of lectures by a large group of internet users. Users rated the lectures by assigning one or more predefined tags which relate to the affective state evoked in the…
A new algorithm for reliable and general NMR resonance assignment.
Schmidt, Elena; Güntert, Peter
2012-08-01
The new FLYA automated resonance assignment algorithm determines NMR chemical shift assignments on the basis of peak lists from any combination of multidimensional through-bond or through-space NMR experiments for proteins. Backbone and side-chain assignments can be determined. All experimental data are used simultaneously, thereby exploiting optimally the redundancy present in the input peak lists and circumventing potential pitfalls of assignment strategies in which results obtained in a given step remain fixed input data for subsequent steps. Instead of prescribing a specific assignment strategy, the FLYA resonance assignment algorithm requires only experimental peak lists and the primary structure of the protein, from which the peaks expected in a given spectrum can be generated by applying a set of rules, defined in a straightforward way by specifying through-bond or through-space magnetization transfer pathways. The algorithm determines the resonance assignment by finding an optimal mapping between the set of expected peaks that are assigned by definition but have unknown positions and the set of measured peaks in the input peak lists that are initially unassigned but have a known position in the spectrum. Using peak lists obtained by purely automated peak picking from the experimental spectra of three proteins, FLYA assigned correctly 96-99% of the backbone and 90-91% of all resonances that could be assigned manually. Systematic studies quantified the impact of various factors on the assignment accuracy, namely the extent of missing real peaks and the amount of additional artifact peaks in the input peak lists, as well as the accuracy of the peak positions. Comparing the resonance assignments from FLYA with those obtained from two other existing algorithms showed that using identical experimental input data these other algorithms yielded significantly (40-142%) more erroneous assignments than FLYA. 
The FLYA resonance assignment algorithm thus has the reliability and flexibility to replace most manual and semi-automatic assignment procedures for NMR studies of proteins.
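The core of the mapping problem described above (expected peaks with predicted but uncertain positions, measured peaks with exact positions) can be illustrated in miniature. This is not the FLYA algorithm itself, which uses evolutionary optimization over large peak sets; it is a brute-force toy with invented chemical shifts.

```python
# Toy sketch of the core idea in FLYA-style assignment: map expected peaks
# (with statistically predicted positions) onto measured peaks (with exact
# positions) so that the total position deviation is minimal.  Real FLYA
# optimizes far larger peak sets; brute force only works for tiny examples.
from itertools import permutations

predicted = {"HN_Ala2": 8.21, "HN_Gly3": 8.55, "HN_Ser4": 8.02}  # invented shifts (ppm)
measured = [8.04, 8.19, 8.58]                                    # picked peak positions

names = list(predicted)
best_map, best_err = None, float("inf")
for perm in permutations(measured):
    err = sum(abs(predicted[n] - pos) for n, pos in zip(names, perm))
    if err < best_err:
        best_map, best_err = dict(zip(names, perm)), err

print(best_map)  # each expected peak paired with its nearest consistent measured peak
```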
A framework supporting the development of a Grid portal for analysis based on ROI.
Ichikawa, K; Date, S; Kaishima, T; Shimojo, S
2005-01-01
In our research on brain function analysis, users require two different simultaneous types of processing: interactive processing of a specific part of the data and high-performance batch processing of an entire dataset. The difference between these two types of processing lies in whether or not the analysis targets data in the region of interest (ROI). In this study, we propose a Grid portal with a mechanism to freely assign computing resources to users in a Grid environment according to these two different processing requirements. We constructed a Grid portal that integrates interactive processing and batch processing through two mechanisms. First, a job steering mechanism controls job execution based on user-tagged priority among organizations with heterogeneous computing resources; interactive jobs are processed in preference to batch jobs by this mechanism. Second, a priority-based result delivery mechanism administers a ranking of data significance. The portal ensures a short turnaround time for interactive processing through the priority-based job control mechanism, and provides users with quality of service (QoS) for interactive processing. Users can access the analysis results of interactive jobs in preference to those of batch jobs. The Grid portal has also achieved high-performance computation for MEG analysis with batch processing in the Grid environment. The priority-based job control mechanism makes it possible to assign computing resources flexibly according to users' requirements. Furthermore, the achievement of high-performance computation contributes greatly to the overall progress of brain science. The portal has thus made it possible for users to flexibly apply large computational power to whatever they want to analyze.
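The job-steering behaviour described above, interactive jobs always dispatched ahead of batch jobs, is essentially a two-class priority queue. A minimal sketch (job names and priority values are invented, not from the portal):

```python
# Minimal sketch of the portal's job-steering idea: interactive (ROI) jobs
# are always dispatched before batch jobs, regardless of arrival order.
import heapq
import itertools

INTERACTIVE, BATCH = 0, 1          # lower value = higher priority
counter = itertools.count()        # FIFO tie-break within a priority class
queue = []

def submit(name, priority):
    heapq.heappush(queue, (priority, next(counter), name))

submit("batch_full_dataset", BATCH)
submit("roi_visual_probe", INTERACTIVE)
submit("batch_rerun", BATCH)

order = [heapq.heappop(queue)[2] for _ in range(len(queue))]
print(order)  # the interactive job jumps ahead of earlier-submitted batch jobs
```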
Space Station engineering and technology development
NASA Technical Reports Server (NTRS)
1985-01-01
Historical background, costs, organizational assignments, technology development, user requirements, mission evolution, systems analyses and design, systems engineering and integration, contracting, and policies of the space station are discussed.
Gasparini, Roberto; Bonanni, Paolo; Icardi, Giancarlo; Amicizia, Daniela; Arata, Lucia; Carozzo, Stefano; Signori, Alessio; Bechini, Angela; Boccalini, Sara
2016-01-01
Background The recently launched Pneumo Rischio eHealth project, which consists of an app, a website, and social networking activity, is aimed at increasing public awareness of invasive pneumococcal disease (IPD). The launch of this project was prompted by the inadequate awareness of IPD among both laypeople and health care workers, the heavy socioeconomic burden of IPD, and the far from optimal vaccination coverage in Italy, despite the availability of safe and effective vaccines. Objective The objectives of our study were to analyze trends in Pneumo Rischio usage before and after a promotional campaign, to characterize its end users, and to assess its user-rated quality. Methods At 7 months after launching Pneumo Rischio, we established a 4-month marketing campaign to promote the project. This intervention used various approaches and channels, including both traditional and digital marketing strategies. To highlight usage trends, we used different techniques of time series analysis and modeling, including a modified Mann-Kendall test, change-point detection, and segmented negative binomial regression of interrupted time series. Users were characterized in terms of demographics and IPD risk categories. Customer-rated quality was evaluated by means of a standardized tool in a sample of app users. Results Over 1 year, the app was accessed by 9295 users and the website was accessed by 143,993 users, while the project’s Facebook page had 1216 fans. The promotional intervention was highly effective in increasing the daily number of users. In particular, the Mann-Kendall trend test revealed a significant (P ≤.01) increasing trend in both app and website users, while change-point detection analysis showed that the first significant change corresponded to the start of the promotional campaign. 
Regression analysis showed a significant immediate effect of the intervention, with a mean increase in daily numbers of users of 1562% (95% CI 456%-4870%) for the app and 620% (95% CI 176%-1777%) for the website. Similarly, the postintervention daily trend in the number of users was positive, with a relative increase of 0.9% (95% CI 0.0%-1.8%) for the app and 1.4% (95% CI 0.7%-2.1%) for the website. Demographics differed between app and website users and Facebook fans. A total of 69.15% (10,793/15,608) of users could be defined as being at risk of IPD, while 4729 users expressed intentions to ask their doctor for further information on IPD. The mean app quality score assigned by end users was approximately 79.5% (397/500). Conclusions Despite its specific topic, Pneumo Rischio was accessed by a considerable number of users, who ranked it as a high-quality project. In order to reach their target populations, however, such projects should be promoted. PMID:27913372
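The Mann-Kendall trend test mentioned above reduces, at its core, to the S statistic: the sum of signs of all pairwise differences in the series, with S > 0 indicating an increasing trend. A sketch with invented daily-user counts (the significance test on S is omitted):

```python
# Mann-Kendall S statistic: sum of signs of all pairwise differences.
def mann_kendall_s(series):
    s = 0
    for i in range(len(series) - 1):
        for j in range(i + 1, len(series)):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)   # sign of the difference
    return s

daily_users = [12, 15, 14, 20, 22, 25, 31]  # hypothetical app accesses per day
print(mann_kendall_s(daily_users))  # → 19 (positive: increasing trend)
```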
Villarubia, Gabriel; De Paz, Juan F.; Bajo, Javier
2017-01-01
The use of electric bikes (e-bikes) has grown in popularity, especially in large cities where overcrowding and traffic congestion are common. This paper proposes an intelligent engine management system for e-bikes which uses the information collected from sensors to optimize battery energy and time. The intelligent engine management system consists of a built-in network of sensors in the e-bike, which is used for multi-sensor data fusion; the collected data is analysed and fused and on the basis of this information the system can provide the user with optimal and personalized assistance. The user is given recommendations related to battery consumption, sensors, and other parameters associated with the route travelled, such as duration, speed, or variation in altitude. To provide a user with these recommendations, artificial neural networks are used to estimate speed and consumption for each of the segments of a route. These estimates are incorporated into evolutionary algorithms in order to make the optimizations. A comparative analysis of the results obtained has been conducted for when routes were travelled with and without the optimization system. From the experiments, it is evident that the use of an engine management system results in significant energy and time savings. Moreover, user satisfaction increases as the level of assistance adapts to user behavior and the characteristics of the route. PMID:29088087
De La Iglesia, Daniel H; Villarrubia, Gabriel; De Paz, Juan F; Bajo, Javier
2017-10-31
The use of electric bikes (e-bikes) has grown in popularity, especially in large cities where overcrowding and traffic congestion are common. This paper proposes an intelligent engine management system for e-bikes which uses the information collected from sensors to optimize battery energy and time. The intelligent engine management system consists of a built-in network of sensors in the e-bike, which is used for multi-sensor data fusion; the collected data is analysed and fused and on the basis of this information the system can provide the user with optimal and personalized assistance. The user is given recommendations related to battery consumption, sensors, and other parameters associated with the route travelled, such as duration, speed, or variation in altitude. To provide a user with these recommendations, artificial neural networks are used to estimate speed and consumption for each of the segments of a route. These estimates are incorporated into evolutionary algorithms in order to make the optimizations. A comparative analysis of the results obtained has been conducted for when routes were travelled with and without the optimization system. From the experiments, it is evident that the use of an engine management system results in significant energy and time savings. Moreover, user satisfaction increases as the level of assistance adapts to user behavior and the characteristics of the route.
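The optimization step described above, evolutionary algorithms acting on per-segment estimates, can be sketched with a tiny (1+1) evolution strategy. The consumption and speed models below are invented stand-ins for the paper's neural-network estimates, and all constants are illustrative only.

```python
# (1+1) evolutionary sketch: choose a motor-assistance level per route
# segment so battery use is low while the trip finishes within a time
# budget.  Models and constants are invented stand-ins, not the paper's.
import random

random.seed(1)
SEGMENT_KM = [2.0, 3.0, 1.5]         # hypothetical route segments
TIME_BUDGET_H = 0.45

def trip(assist):                    # assist[i] in [0, 1]
    energy = sum(a * km * 10.0 for a, km in zip(assist, SEGMENT_KM))         # Wh
    time = sum(km / (12.0 + 10.0 * a) for a, km in zip(assist, SEGMENT_KM))  # h
    penalty = 1e3 * max(0.0, time - TIME_BUDGET_H)   # soft time constraint
    return energy + penalty

best = [0.5] * len(SEGMENT_KM)
for _ in range(2000):                # mutate one candidate, keep if not worse
    cand = [min(1.0, max(0.0, a + random.gauss(0, 0.1))) for a in best]
    if trip(cand) <= trip(best):
        best = cand

print([round(a, 2) for a in best], round(trip(best), 1))
```

The solution settles on partial assistance: full assistance wastes battery, while no assistance violates the time budget.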
NASA Astrophysics Data System (ADS)
Arévalo, Germán. V.; Hincapié, Roberto C.; Sierra, Javier E.
2015-09-01
UDWDM PON is a leading technology for providing ultra-high bandwidth to end users while fully exploiting the capacity of the physical channels. One of the main drawbacks of the UDWDM technique is that nonlinear effects, such as FWM, become stronger owing to the close spectral proximity of the channels. This work proposes a model for the optimal deployment of such networks, taking into account the fiber-length limitations imposed by physical restrictions on the fiber's data transmission as well as the asymmetric distribution of users in a given region. The proposed model treats the transmission-related effects in UDWDM PON as constraints in the optimization problem, and also considers the asymmetric clustering of users and the subdivision of the user region through a Voronoi geometric partition technique. The Voronoi dual graph, that is, the Delaunay triangulation, is used as the planar graph for solving the minimum-weight problem for the fiber links.
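The geometric idea behind the Voronoi partition above can be shown in a few lines: membership in a node's Voronoi cell is the same as being nearest to that node. The coordinates below are invented, and real deployments would add the fiber-length constraints the paper describes.

```python
# Nearest-node assignment = membership in that node's Voronoi cell; the
# Delaunay triangulation (the Voronoi dual) connects neighbouring nodes.
# Node and user coordinates are invented for illustration.
from math import dist

nodes = {"olt_a": (0.0, 0.0), "olt_b": (10.0, 0.0), "olt_c": (5.0, 8.0)}
users = [(1.0, 1.0), (9.0, 1.0), (5.5, 7.0), (4.0, 3.0)]

cells = {name: [] for name in nodes}
for u in users:
    nearest = min(nodes, key=lambda n: dist(nodes[n], u))
    cells[nearest].append(u)

print(cells)
```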
Measuring diagnoses: ICD code accuracy.
O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M
2005-10-01
To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Main error sources along the "patient trajectory" include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the "paper trail" include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways.
NASA Astrophysics Data System (ADS)
Lali, Mehdi
2009-03-01
A comprehensive computer program is designed in MATLAB to analyze, design and optimize the propulsion, dynamics, thermodynamics, and kinematics of any serial multi-staging rocket for a set of given data. The program is quite user-friendly. It comprises two main sections: "analysis and design" and "optimization." Each section has a GUI (Graphical User Interface) in which the rocket's data are entered by the user and by which the program is run. The first section analyzes the performance of the rocket that is previously devised by the user. Numerous plots and subplots are provided to display the performance of the rocket. The second section of the program finds the "optimum trajectory" via billions of iterations and computations which are done through sophisticated algorithms using numerical methods and incremental integrations. Innovative techniques are applied to calculate the optimal parameters for the engine and designing the "optimal pitch program." This computer program is stand-alone in such a way that it calculates almost every design parameter in regards to rocket propulsion and dynamics. It is meant to be used for actual launch operations as well as educational and research purposes.
CEASAW: A User-Friendly Computer Environment Analysis for the Sawmill Owner
Guillermo Mendoza; William Sprouse; Philip A. Araman; William G. Luppold
1991-01-01
Improved spreadsheet software capabilities have brought optimization to users with little or no background in mathematical programming. Better interface capabilities of spreadsheet models now make it possible to combine optimization models with a spreadsheet system. Sawmill production and inventory systems possess many features that make them suitable application...
Middleton, John; Vaks, Jeffrey E
2007-04-01
Errors of calibrator-assigned values lead to errors in the testing of patient samples. The ability to estimate the uncertainties of calibrator-assigned values and other variables minimizes errors in testing processes. International Organization of Standardization guidelines provide simple equations for the estimation of calibrator uncertainty with simple value-assignment processes, but other methods are needed to estimate uncertainty in complex processes. We estimated the assigned-value uncertainty with a Monte Carlo computer simulation of a complex value-assignment process, based on a formalized description of the process, with measurement parameters estimated experimentally. This method was applied to study uncertainty of a multilevel calibrator value assignment for a prealbumin immunoassay. The simulation results showed that the component of the uncertainty added by the process of value transfer from the reference material CRM470 to the calibrator is smaller than that of the reference material itself (<0.8% vs 3.7%). Varying the process parameters in the simulation model allowed for optimizing the process, while keeping the added uncertainty small. The patient result uncertainty caused by the calibrator uncertainty was also found to be small. This method of estimating uncertainty is a powerful tool that allows for estimation of calibrator uncertainty for optimization of various value assignment processes, with a reduced number of measurements and reagent costs, while satisfying the requirements to uncertainty. The new method expands and augments existing methods to allow estimation of uncertainty in complex processes.
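The Monte Carlo idea above, simulating the value-transfer step many times with each input perturbed by its own error distribution, can be sketched briefly. The 3.7% reference-material CV comes from the abstract; the transfer noise and reference value are invented stand-ins for the measured process parameters.

```python
# Monte Carlo propagation sketch: perturb each input by its own error
# distribution and read the assigned value's uncertainty off the spread.
import random
import statistics

random.seed(0)
REF_VALUE = 100.0    # arbitrary units (invented)
REF_CV = 0.037       # 3.7% relative uncertainty of the reference material
TRANSFER_CV = 0.008  # assumed <0.8% added by the value-transfer process

assigned = []
for _ in range(100_000):
    ref = random.gauss(REF_VALUE, REF_VALUE * REF_CV)
    assigned.append(ref * random.gauss(1.0, TRANSFER_CV))

cv = statistics.stdev(assigned) / statistics.fmean(assigned)
print(f"combined CV ≈ {cv:.3%}")   # close to sqrt(0.037² + 0.008²) ≈ 3.79%
```

As the abstract notes, the added transfer component barely widens the combined uncertainty relative to the reference material itself.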
Kononowicz, Andrzej A; Berman, Anne H; Stathakarou, Natalia; McGrath, Cormac; Bartyński, Tomasz; Nowakowski, Piotr; Malawski, Maciej; Zary, Nabil
2015-09-10
Massive open online courses (MOOCs) have been criticized for focusing on presentation of short video clip lectures and asking theoretical multiple-choice questions. A potential way of vitalizing these educational activities in the health sciences is to introduce virtual patients. Experiences from such extensions in MOOCs have not previously been reported in the literature. This study analyzes technical challenges and solutions for offering virtual patients in health-related MOOCs and describes patterns of virtual patient use in one such course. Our aims are to reduce the technical uncertainty related to these extensions, point to aspects that could be optimized for a better learner experience, and raise prospective research questions by describing indicators of virtual patient use on a massive scale. The Behavioral Medicine MOOC was offered by Karolinska Institutet, a medical university, on the EdX platform in the autumn of 2014. Course content was enhanced by two virtual patient scenarios presented in the OpenLabyrinth system and hosted on the VPH-Share cloud infrastructure. We analyzed web server and session logs and a participant satisfaction survey. Navigation pathways were summarized using a visual analytics tool developed for the purpose of this study. The number of course enrollments reached 19,236. At the official closing date, 2317 participants (12.1% of total enrollment) had declared completing the first virtual patient assignment and 1640 (8.5%) participants confirmed completion of the second virtual patient assignment. Peak activity involved 359 user sessions per day. The OpenLabyrinth system, deployed on four virtual servers, coped well with the workload. Participant survey respondents (n=479) regarded the activity as a helpful exercise in the course (83.1%). Technical challenges reported involved poor or restricted access to videos in certain areas of the world and occasional problems with lost sessions. 
The visual analyses of user pathways display the parts of virtual patient scenarios that elicited less interest and may have been perceived as nonchallenging options. Analyzing the user navigation pathways allowed us to detect indications of both surface and deep approaches to the content material among the MOOC participants. This study reported on first inclusion of virtual patients in a MOOC. It adds to the body of knowledge by demonstrating how a biomedical cloud provider service can ensure technical capacity and flexible design of a virtual patient platform on a massive scale. The study also presents a new way of analyzing the use of branched virtual patients by visualization of user navigation pathways. Suggestions are offered on improvements to the design of virtual patients in MOOCs.
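The pathway analysis described above starts from something simple: collapsing per-session navigation logs into counts of node-to-node transitions. A sketch with invented session data (the real study visualized OpenLabyrinth scenario nodes):

```python
# Collapse navigation logs into transition counts, the raw material for a
# user-pathway visualization.  Session paths are invented for illustration.
from collections import Counter

sessions = [
    ["intro", "history", "exam", "diagnosis"],
    ["intro", "exam", "diagnosis"],
    ["intro", "history", "exam", "treatment"],
]

transitions = Counter(
    (a, b) for path in sessions for a, b in zip(path, path[1:])
)
print(transitions.most_common(2))
```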
Berman, Anne H; Stathakarou, Natalia; McGrath, Cormac; Bartyński, Tomasz; Nowakowski, Piotr; Malawski, Maciej; Zary, Nabil
2015-01-01
Background Massive open online courses (MOOCs) have been criticized for focusing on presentation of short video clip lectures and asking theoretical multiple-choice questions. A potential way of vitalizing these educational activities in the health sciences is to introduce virtual patients. Experiences from such extensions in MOOCs have not previously been reported in the literature. Objective This study analyzes technical challenges and solutions for offering virtual patients in health-related MOOCs and describes patterns of virtual patient use in one such course. Our aims are to reduce the technical uncertainty related to these extensions, point to aspects that could be optimized for a better learner experience, and raise prospective research questions by describing indicators of virtual patient use on a massive scale. Methods The Behavioral Medicine MOOC was offered by Karolinska Institutet, a medical university, on the EdX platform in the autumn of 2014. Course content was enhanced by two virtual patient scenarios presented in the OpenLabyrinth system and hosted on the VPH-Share cloud infrastructure. We analyzed web server and session logs and a participant satisfaction survey. Navigation pathways were summarized using a visual analytics tool developed for the purpose of this study. Results The number of course enrollments reached 19,236. At the official closing date, 2317 participants (12.1% of total enrollment) had declared completing the first virtual patient assignment and 1640 (8.5%) participants confirmed completion of the second virtual patient assignment. Peak activity involved 359 user sessions per day. The OpenLabyrinth system, deployed on four virtual servers, coped well with the workload. Participant survey respondents (n=479) regarded the activity as a helpful exercise in the course (83.1%). Technical challenges reported involved poor or restricted access to videos in certain areas of the world and occasional problems with lost sessions. 
The visual analyses of user pathways display the parts of virtual patient scenarios that elicited less interest and may have been perceived as nonchallenging options. Analyzing the user navigation pathways allowed us to detect indications of both surface and deep approaches to the content material among the MOOC participants. Conclusions This study reported on first inclusion of virtual patients in a MOOC. It adds to the body of knowledge by demonstrating how a biomedical cloud provider service can ensure technical capacity and flexible design of a virtual patient platform on a massive scale. The study also presents a new way of analyzing the use of branched virtual patients by visualization of user navigation pathways. Suggestions are offered on improvements to the design of virtual patients in MOOCs. PMID:27731844
NASA Astrophysics Data System (ADS)
Ding, Zhongan; Gao, Chen; Yan, Shengteng; Yang, Canrong
2017-10-01
The power user electric energy data acquisition system (PUEEDAS) is an important part of the smart grid. This paper builds a multi-objective optimization model for the performance of the PUEEDAS that combines comprehensive benefits and cost. The Chebyshev decomposition approach is used to decompose the multi-objective optimization problem, and a MOEA/D evolutionary algorithm is designed to solve it. By analyzing the Pareto-optimal solution set of the multi-objective optimization problem and comparing it with the monitored values, the direction for optimizing the performance of the PUEEDAS can be identified. Finally, an example is designed for specific analysis.
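The Chebyshev (Tchebycheff) decomposition used by MOEA/D scores a multi-objective point by the worst weighted distance of its objectives from the ideal point, so minimizing the scalar value pushes all objectives down at once. A sketch with invented objective values and weights:

```python
# Chebyshev scalarization: g(f) = max_i w_i * |f_i - z_i|, where z is the
# ideal point.  Candidate objective vectors and weights are invented.
def chebyshev(objs, weights, ideal):
    return max(w * abs(f - z) for f, w, z in zip(objs, weights, ideal))

ideal = (0.0, 0.0)                       # best value seen for each objective
candidates = [(3.0, 1.0), (1.5, 2.0), (2.2, 1.8)]
weights = (0.5, 0.5)

best = min(candidates, key=lambda f: chebyshev(f, weights, ideal))
print(best)  # → (1.5, 2.0)
```

In MOEA/D, each subproblem gets its own weight vector, and neighbouring subproblems share solutions during evolution.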
Fulfillment of HTTP Authentication Based on Alcatel OmniSwitch 9700
NASA Astrophysics Data System (ADS)
Liu, Hefu
This paper presents a method of HTTP authentication on the Alcatel OmniSwitch 9700. Authenticated VLANs control user access to network resources based on VLAN assignment and user authentication. The user can be authenticated through the switch via any standard Web browser; the browser client displays the username and password prompts. HTML forms can then be used to pass the HTTP authentication data when they are submitted. A RADIUS server provides a database of user information that the switch checks whenever a user tries to authenticate through the switch. Before or after authentication, the client can obtain an address from a DHCP server.
ERIC Educational Resources Information Center
College and Univ. Computer Users Association, Columbia, SC.
The 36 papers contained in this collection from the College and University Computer Users Conference (CUMREC '93) are grouped under six topic areas. The main subject areas and examples of the topics covered are: (1) computer-based student support systems, including telecounseling and recruiting, a student advising system, the assignment of…
Code of Federal Regulations, 2010 CFR
2010-07-01
... characteristics; i.e., levels of biochemical oxygen demand, suspended solids, etc. Each class is then assigned its... works. Factors such as strength, volume, and delivery flow rate characteristics shall be considered and... user charges can be developed on a volume basis in accordance with the model below: Cu = (CT/VT) × Vu ...
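The regulation's volume-based user-charge model, Cu = (CT/VT) × Vu, says each user pays treatment costs in proportion to their share of total flow. A sketch with invented figures:

```python
# Cu = (CT / VT) * Vu: a user's charge is the total cost spread over total
# volume, scaled by that user's volume.  All numbers are invented.
def user_charge(total_cost, total_volume, user_volume):
    return total_cost / total_volume * user_volume

CT = 1_000_000.0   # total operation and maintenance cost ($/yr)
VT = 500_000.0     # total volume treated (volume units/yr)
print(user_charge(CT, VT, 2_500.0))  # → 5000.0
```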
Kent, Angela D.; Smith, Dan J.; Benson, Barbara J.; Triplett, Eric W.
2003-01-01
Culture-independent DNA fingerprints are commonly used to assess the diversity of a microbial community. However, relating species composition to community profiles produced by community fingerprint methods is not straightforward. Terminal restriction fragment length polymorphism (T-RFLP) is a community fingerprint method in which phylogenetic assignments may be inferred from the terminal restriction fragment (T-RF) sizes through the use of web-based resources that predict T-RF sizes for known bacteria. The process quickly becomes computationally intensive due to the need to analyze profiles produced by multiple restriction digests and the complexity of profiles generated by natural microbial communities. A web-based tool is described here that rapidly generates phylogenetic assignments from submitted community T-RFLP profiles based on a database of fragments produced by known 16S rRNA gene sequences. Users have the option of submitting a customized database generated from unpublished sequences or from a gene other than the 16S rRNA gene. This phylogenetic assignment tool allows users to employ T-RFLP to simultaneously analyze microbial community diversity and species composition. An analysis of the variability of bacterial species composition throughout the water column in a humic lake was carried out to demonstrate the functionality of the phylogenetic assignment tool. This method was validated by comparing the results generated by this program with results from a 16S rRNA gene clone library. PMID:14602639
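The T-RF size prediction underlying such a tool can be illustrated in miniature: the terminal restriction fragment is the sequence up to and including the enzyme's first cut site. HhaI recognizes GCGC and cuts GCG^C; the short sequences below are invented stand-ins for 16S rRNA genes, not the tool's actual database.

```python
# Toy T-RF prediction: length of the 5'-terminal fragment up to the first
# restriction cut site, matched against an observed community-profile T-RF.
def trf_length(sequence, site="GCGC", cut_offset=3):
    """Terminal fragment length, or full length if no site is found."""
    pos = sequence.find(site)
    return len(sequence) if pos == -1 else pos + cut_offset

database = {                     # invented stand-ins for 16S sequences
    "taxon_A": "ATGCGCTTAGGACCT",
    "taxon_B": "ATTTAGGGCGCAACT",
}
profile_trf = 10                 # fragment size observed in the profile

matches = [name for name, seq in database.items() if trf_length(seq) == profile_trf]
print(matches)  # → ['taxon_B']
```

The real tool reconciles multiple digests per sample to narrow down such candidate lists.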
Re-Engineering the United States Marine Corps’ Enlisted Assignment Model (EAM)
1998-06-01
and Execution button opens the switchboard in Figure 12. This form accesses all of the VBA code that is associated with this form and the code that...the prototype to prompt the user and to inform him of what he is about to do. Each of the buttons that are on these forms is connected to an SQL ...into the field for building a rule are the values 2;3;4;5;6;7. The user can enter an SQL statement that would determine the values or the user could
Lim, Wee Loon; Wibowo, Antoni; Desa, Mohammad Ishak; Haron, Habibollah
2016-01-01
The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of migration strategy of species to derive algorithm for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy. However, this process will often ruin the quality of solutions in QAP. In this paper, we propose a hybrid technique to overcome the weakness of classical BBO algorithm to solve QAP, by replacing the mutation operator with a tabu search procedure. Our experiments using the benchmark instances from QAPLIB show that the proposed hybrid method is able to find good solutions for them within reasonable computational times. Out of 61 benchmark instances tested, the proposed method is able to obtain the best known solutions for 57 of them. PMID:26819585
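The objective the hybrid BBO/tabu-search method minimizes is the standard QAP cost: sum over facility pairs of flow times the distance between their assigned locations. The 3-facility instance below is invented (real QAPLIB instances are far larger), and exhaustive search stands in for the paper's metaheuristic.

```python
# QAP objective: cost(p) = sum_{i,j} flow[i][j] * dist[p[i]][p[j]], where
# p[i] is the location assigned to facility i.  Tiny invented instance.
from itertools import permutations

flow = [[0, 3, 1],
        [3, 0, 2],
        [1, 2, 0]]
dist = [[0, 1, 4],
        [1, 0, 2],
        [4, 2, 0]]

def qap_cost(p):
    n = len(p)
    return sum(flow[i][j] * dist[p[i]][p[j]] for i in range(n) for j in range(n))

# exhaustive search stands in for BBO + tabu search on this toy instance
best = min(permutations(range(3)), key=qap_cost)
print(best, qap_cost(best))  # → (0, 1, 2) 22
```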
Three-Dimensional Online Visualization and Engagement Tools for the Geosciences
NASA Astrophysics Data System (ADS)
Cockett, R.; Moran, T.; Pidlisecky, A.
2013-12-01
Educational tools often sacrifice interactivity in favour of scalability so they can reach more users. This compromise leads to tools that may be viewed as second tier compared with more engaging activities performed in a laboratory; however, delivering scalable laboratory exercises is often impractical. Geoscience education is well situated to benefit from interactive online learning tools that allow users to work in a 3D environment. Visible Geology (http://3ptscience.com/visiblegeology) is an innovative web-based application designed to enable visualization of geologic structures and processes through the use of interactive 3D models. The platform allows users to conceptualize difficult yet important geologic principles in a scientifically accurate manner by developing unique geologic models. The environment allows students to interactively practice their visualization and interpretation skills by creating and interacting with their own models and terrains. Visible Geology has been designed from a user-centric perspective, resulting in a simple and intuitive interface. The platform directs students to build their own geologic models by adding beds and creating geologic events such as tilting, folding, or faulting. This level of ownership and interactivity encourages engagement, leading learners to discover geologic relationships on their own in the context of guided assignments. In January 2013, an interactive geologic history assignment was developed for a 700-student introductory geology class at The University of British Columbia. The assignment required students to determine the relative ages of geologic events in order to construct a geologic history. Traditionally this type of exercise has been taught through simple geologic cross-sections showing crosscutting relationships, from which students infer the relative age of geologic events.
In contrast, the Visible Geology assignment offers students a unique experience: they first create their own geologic events, allowing them to see directly how the timing of a geologic event manifests in the model and the resulting cross-sections. By creating each geologic event themselves, the students gain a deeper understanding of the processes and the relative order of events. The resulting models can be shared among students and provide instructors with a basis for guiding inquiry to address misconceptions. The ease of use of the assignment, including automatic assessment, made this tool practical for deployment in a 700-person class. The outcome of this type of large-scale deployment is that students who would normally not experience a lab exercise gain exposure to interactive 3D thinking. Engaging tools and software that put users in control of their learning experience are critical for moving to scalable, yet engaging, online learning environments.
ERIC Educational Resources Information Center
Schmidt, Aaron
2010-01-01
User experience (UX) is about arranging the elements of a product or service to optimize how people will interact with it. In this article, the author talks about the importance of user experience and discusses the design of user experiences in libraries. He first looks at what UX is. Then he describes three kinds of user experience design: (1)…
Dynamics of backlight luminance for using smartphone in dark environment
NASA Astrophysics Data System (ADS)
Na, Nooree; Jang, Jiho; Suk, Hyeon-Jeong
2014-02-01
This study developed dynamic backlight luminance, which gradually changes over time, for comfortable use of a smartphone display in a dark environment. The study was carried out in two stages. In the first stage, a user test was conducted to identify the optimal luminance by assessing facial squint level, subjective glare evaluation, eye blink frequency, and users' subjective preferences. Based on the results of the user test, the dynamics of backlight luminance were designed with two levels: an optimal level for initial viewing, to avoid sudden glare or fatigue to users' eyes, and an optimal level for constant viewing, which is comfortable yet bright enough for continuous reading of the displayed material. The luminance for initial viewing starts at 10 cd/m2 and gradually increases to 40 cd/m2 over 20 seconds for users' visual comfort at constant viewing. In the second stage, a validation test was conducted, involving users' subjective preferences, eye blink frequency, and brainwave analysis using electroencephalography (EEG), to confirm that the proposed dynamic backlighting enhances users' visual comfort and visual cognition, particularly when using smartphones in a dark environment.
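The described two-level dynamics can be sketched as a simple ramp function. The linear shape is an assumption; the abstract only states a gradual increase from 10 to 40 cd/m2 over 20 seconds.

```python
def backlight_luminance(t_seconds, start=10.0, end=40.0, ramp=20.0):
    """Backlight luminance (cd/m^2) at time t after the display wakes.

    Starts at the initial-viewing level, ramps to the constant-viewing
    level over `ramp` seconds, then holds. The linear interpolation is
    an assumption; the paper specifies only a gradual increase."""
    if t_seconds <= 0:
        return start
    if t_seconds >= ramp:
        return end
    return start + (end - start) * (t_seconds / ramp)
```

A display driver would sample this curve on each brightness update tick after the screen turns on in a dark environment.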
Systematic Sensor Selection Strategy (S4) User Guide
NASA Technical Reports Server (NTRS)
Sowers, T. Shane
2012-01-01
This paper describes a User Guide for the Systematic Sensor Selection Strategy (S4). S4 was developed to optimally select a sensor suite from a larger pool of candidate sensors based on their performance in a diagnostic system. For aerospace systems, selecting the proper sensors is important for ensuring adequate measurement coverage to satisfy operational, maintenance, performance, and system diagnostic criteria. S4 optimizes the selection of sensors based on the system fault diagnostic approach while taking conflicting objectives such as cost, weight, and reliability into consideration. S4 can be described as a general architecture structured to accommodate application-specific components and requirements. It performs combinatorial optimization with a user-defined merit or cost function to identify optimum or near-optimum sensor suite solutions. The S4 User Guide describes the sensor selection procedure and presents an example problem using an open-source turbofan engine simulation to demonstrate its application.
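A toy illustration of merit-function-based sensor selection: score every candidate suite of a given size and keep the best. S4 itself uses a more sophisticated combinatorial search; the coverage/cost numbers and the form of `merit` here are hypothetical.

```python
import itertools

def select_sensor_suite(sensors, merit, suite_size):
    """Exhaustively score all candidate suites of `suite_size` sensors
    and return the best one. `merit` maps a sensor subset to a score
    that trades off diagnostic coverage against cost/weight."""
    best_suite, best_score = None, float("-inf")
    for suite in itertools.combinations(sensors, suite_size):
        score = merit(suite)
        if score > best_score:
            best_suite, best_score = suite, score
    return best_suite, best_score
```

Exhaustive enumeration is only viable for small pools; for realistic candidate sets a heuristic or genetic search over the same merit function would replace the loop.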
On-demand high-capacity ride-sharing via dynamic trip-vehicle assignment
Alonso-Mora, Javier; Samaranayake, Samitha; Wallar, Alex; Frazzoli, Emilio; Rus, Daniela
2017-01-01
Ride-sharing services are transforming urban mobility by providing timely and convenient transportation to anybody, anywhere, and anytime. These services present enormous potential for positive societal impacts with respect to pollution, energy consumption, congestion, etc. Current mathematical models, however, do not fully address the potential of ride-sharing. Recently, a large-scale study highlighted some of the benefits of car pooling but was limited to static routes with two riders per vehicle (optimally) or three (with heuristics). We present a more general mathematical model for real-time high-capacity ride-sharing that (i) scales to large numbers of passengers and trips and (ii) dynamically generates optimal routes with respect to online demand and vehicle locations. The algorithm starts from a greedy assignment and improves it through a constrained optimization, quickly returning solutions of good quality and converging to the optimal assignment over time. We quantify experimentally the tradeoff between fleet size, capacity, waiting time, travel delay, and operational costs for low- to medium-capacity vehicles, such as taxis and van shuttles. The algorithm is validated with ∼3 million rides extracted from the New York City taxicab public dataset. Our experimental study considers ride-sharing with rider capacity of up to 10 simultaneous passengers per vehicle. The algorithm applies to fleets of autonomous vehicles and also incorporates rebalancing of idling vehicles to areas of high demand. This framework is general and can be used for many real-time multivehicle, multitask assignment problems. PMID:28049820
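The greedy starting assignment the algorithm then improves can be sketched roughly as follows. Vehicle capacities, the cost function, and all names are illustrative assumptions, not the paper's implementation.

```python
def greedy_assign(trips, capacities, cost):
    """Greedy starting assignment: each trip goes to the feasible
    vehicle (spare capacity remaining) with the lowest added cost,
    e.g. extra travel delay. A constrained optimization would then
    refine this solution, as in the paper's pipeline."""
    assignment = {}
    load = {v: 0 for v in capacities}
    for trip in trips:
        feasible = [v for v in capacities if load[v] < capacities[v]]
        if not feasible:
            break  # remaining trips unassigned; rebalancing would handle them
        best = min(feasible, key=lambda v: cost(trip, v))
        assignment[trip] = best
        load[best] += 1
    return assignment
```

Starting from a feasible greedy solution is what lets the method return a usable answer quickly while converging toward the optimal assignment as more computation time is allowed.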
How to Develop a User Interface That Your Real Users Will Love
ERIC Educational Resources Information Center
Phillips, Donald
2012-01-01
A "user interface" is the part of an interactive system that bridges the user and the underlying functionality of the system. But people sometimes forget that the best interfaces will provide a platform to optimize the users' interactions so that they support and extend the users' activities in effective, useful, and usable ways. To look at it…
Optimization Model for Web Based Multimodal Interactive Simulations.
Halic, Tansel; Ahn, Woojin; De, Suvranu
2015-07-15
This paper presents a technique for optimizing the performance of web based multimodal interactive simulations. For such applications where visual quality and the performance of simulations directly influence user experience, overloading of hardware resources may result in unsatisfactory reduction in the quality of the simulation and user satisfaction. However, optimization of simulation performance on individual hardware platforms is not practical. Hence, we present a mixed integer programming model to optimize the performance of graphical rendering and simulation performance while satisfying application specific constraints. Our approach includes three distinct phases: identification, optimization, and update. In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. This data is utilized in conjunction with user specified design requirements in the optimization phase to ensure best possible computational resource allocation. The optimum solution is used for rendering (e.g. texture size, canvas resolution) and simulation parameters (e.g. simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach.
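The optimization phase can be illustrated in miniature: pick rendering settings that maximize a quality proxy subject to a per-frame cost budget measured on the client. The paper formulates this as a mixed integer program; for a tiny discrete space, exhaustive search reaches the same optimum. The quality proxy and cost model below are invented for illustration.

```python
import itertools

def choose_settings(texture_sizes, resolutions, frame_cost, budget):
    """Pick the rendering settings that maximize a visual-quality proxy
    while the estimated per-frame cost stays within the device budget
    (the budget would come from the identification phase's proxy code)."""
    best, best_quality = None, -1
    for tex, res in itertools.product(texture_sizes, resolutions):
        cost = frame_cost(tex, res)
        quality = tex * res  # hypothetical quality proxy
        if cost <= budget and quality > best_quality:
            best, best_quality = (tex, res), quality
    return best
```

In the update phase, the chosen tuple would be applied to the renderer (texture size, canvas resolution) before the simulation starts.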
Optimal Coordination of Building Loads and Energy Storage for Power Grid and End User Services
Hao, He; Wu, Di; Lian, Jianming; ...
2017-01-18
Demand response and energy storage play a profound role in the smart grid. The focus of this study is to evaluate the benefits of coordinating flexible loads and energy storage to provide power grid and end user services. We present a Generalized Battery Model (GBM) to describe the flexibility of building loads and energy storage. An optimization-based approach is proposed to characterize the parameters (power and energy limits) of the GBM for flexible building loads. We then develop optimal coordination algorithms to provide power grid and end user services such as energy arbitrage, frequency regulation, spinning reserve, and energy cost and demand charge reduction. Several case studies have been performed to demonstrate the efficacy of the GBM and coordination algorithms, and to evaluate the benefits of using their flexibility for power grid and end user services. We show that optimal coordination yields significant cost savings and revenue. Moreover, the best option for power grid services is to provide energy arbitrage and frequency regulation. Finally, when coordinating flexible loads with energy storage to provide end user services, it is recommended to consider demand charge in addition to time-of-use price in order to flatten the aggregate power profile.
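A Generalized Battery Model reduces a flexible resource to power and energy limits. One simulation step might look like the following sketch; the variable names and the clipping order are assumptions, not the paper's formulation.

```python
def gbm_step(energy, power, dt, p_min, p_max, e_min, e_max):
    """One time step of a Generalized Battery Model: clip the requested
    (dis)charge power to the power limits, then to whatever keeps the
    stored energy inside its bounds. Returns the new energy state and
    the power actually delivered."""
    p = max(p_min, min(p_max, power))
    e_next = energy + p * dt
    e_next = max(e_min, min(e_max, e_next))
    p = (e_next - energy) / dt  # power actually delivered after clipping
    return e_next, p
```

A coordinator would call this per resource and per interval, dispatching only within each resource's characterized limits so that grid services never violate end-user comfort constraints.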
Neerincx, Pieter BT; Casel, Pierrot; Prickett, Dennis; Nie, Haisheng; Watson, Michael; Leunissen, Jack AM; Groenen, Martien AM; Klopp, Christophe
2009-01-01
Background: Reliable annotation linking oligonucleotide probes to target genes is essential for functional biological analysis of microarray experiments. We used the IMAD, OligoRAP and sigReannot pipelines to update the annotation for the ARK-Genomics Chicken 20 K array as part of a joint EADGENE/SABRE workshop. In this manuscript we compare their annotation strategies and results. Furthermore, we analyse the effect of differences in updated annotation on functional analysis for an experiment involving Eimeria-infected chickens, and finally we propose guidelines for optimal annotation strategies. Results: IMAD, OligoRAP and sigReannot update both annotation and estimated target specificity. The three pipelines can assign oligos to target specificity categories, although with varying degrees of resolution. Target specificity is judged based on the amount and type of oligo versus target-gene alignments (hits), which are determined by filter thresholds that users can adjust based on their experimental conditions. Linking oligos to annotation, on the other hand, is based on rigid rules, which differ between pipelines. For 52.7% of the oligos from a subset selected for in-depth comparison, all pipelines linked to one or more Ensembl genes, with consensus on 44.0%. In 31.0% of the cases none of the pipelines could assign an Ensembl gene to an oligo, and for the remaining 16.3% the coverage differed between pipelines. Differences in updated annotation were mainly due to different thresholds for hybridisation potential filtering of oligo versus target-gene alignments and different policies for expanding annotation using indirect links. The differences in updated annotation packages had a significant effect on GO term enrichment analysis, with consensus on only 67.2% of the enriched terms. Conclusion: In addition to flexible thresholds to determine target specificity, annotation tools should provide metadata describing the relationships between oligos and the annotation assigned to them.
These relationships can then be used to judge the varying degrees of reliability allowing users to fine-tune the balance between reliability and coverage. This is important as it can have a significant effect on functional microarray analysis as exemplified by the lack of consensus on almost one third of the terms found with GO term enrichment analysis based on updated IMAD, OligoRAP or sigReannot annotation. PMID:19615109
Equivalence between entanglement and the optimal fidelity of continuous variable teleportation.
Adesso, Gerardo; Illuminati, Fabrizio
2005-10-07
We devise the optimal form of Gaussian resource states enabling continuous-variable teleportation with maximal fidelity. We show that a nonclassical optimal fidelity of N-user teleportation networks is necessary and sufficient for N-party entangled Gaussian resources, yielding an estimator of multipartite entanglement. The entanglement of teleportation is equivalent to the entanglement of formation in a two-user protocol, and to the localizable entanglement in a multiuser one. Finally, we show that the continuous-variable tangle, quantifying entanglement sharing in three-mode Gaussian states, is defined operationally in terms of the optimal fidelity of a tripartite teleportation network.
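For context, a standard result in continuous-variable teleportation (not stated in the abstract): for coherent-state inputs in the two-user protocol, any strategy without shared entanglement is bounded by the classical benchmark fidelity

```latex
% Classical benchmark for teleportation of coherent states:
F \le F_{\mathrm{cl}} = \tfrac{1}{2},
```

so a "nonclassical" optimal fidelity in the sense used here is one exceeding $F_{\mathrm{cl}}$, which certifies entanglement in the shared Gaussian resource.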
DNASynth: a software application to optimization of artificial gene synthesis
NASA Astrophysics Data System (ADS)
Muczyński, Jan; Nowak, Robert M.
2017-08-01
DNASynth is a client-server software application in which the client runs in a web browser. The aim of this program is to support and optimize the process of artificial gene synthesis using the Ligase Chain Reaction (LCR). Thanks to LCR it is possible to obtain a DNA strand coding a user-defined peptide. The DNA sequence is computed by an optimization algorithm that considers optimal codon usage, minimal energy of secondary structures, and a minimal number of required LCRs. Additionally, the absence of sequences characteristic of a user-defined set of restriction enzymes is guaranteed. The presented software was tested on synthetic and real data.
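The restriction-site guarantee can be checked, for example, by scanning both strands of a candidate sequence for the user's enzyme recognition sites. This is a simplified illustration; DNASynth's actual procedure is not described at this level of detail in the abstract.

```python
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    return seq.translate(COMPLEMENT)[::-1]

def avoids_restriction_sites(seq, enzyme_sites):
    """True if neither strand of the candidate coding sequence contains
    any recognition site from the user's enzyme set (site strings are
    illustrative; e.g. "GAATTC" is the EcoRI recognition site)."""
    seq = seq.upper()
    rc = revcomp(seq)
    return not any(s.upper() in seq or s.upper() in rc for s in enzyme_sites)
```

An optimizer could use such a predicate as a hard constraint, rejecting candidate codon choices that introduce a forbidden site.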
Comparative risk assessment and cessation information seeking among smokeless tobacco users.
Jun, Jungmi; Nan, Xiaoli
2018-05-01
This research examined (1) smokeless tobacco users' comparative optimism in assessing the health and addiction risks of their own product in comparison with cigarettes, and (2) the effects of comparative optimism on cessation information-seeking. A nationally representative sample from the 2015 Health Information National Trends Survey (HINTS)-FDA was employed. The analyses revealed the presence of comparative optimism in assessing both health and addiction risks among smokeless tobacco users. Comparative optimism was negatively correlated with most cessation information-seeking variables. Health bias (the gap in health risk ratings between the subject's own tobacco product and cigarettes) was associated with decreased intent to use cessation support. However, health bias and addiction bias (the corresponding gap in addiction risk ratings) were not consistent predictors of all cessation information-seeking variables when covariates of socio-demographics and tobacco use status were included. In addition, positive correlations between health bias and past/recent cessation-information searches were observed. Optimistic biases may negatively influence cessation behaviors not only directly but also indirectly by influencing an important moderator, cessation information-seeking. Future interventions should prioritize dispelling the comparative optimism in perceiving risks of smokeless tobacco use, as well as provide more reliable cessation information specific to smokeless tobacco users. Copyright © 2018 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Ku, David Tawei; Chang, Chia-Chi
2014-01-01
By conducting usability testing on a multilanguage Web site, this study analyzed the cultural differences between Taiwanese and American users in the performance of assigned tasks. To provide feasible insight into cross-cultural Web site design, Microsoft Office Online (MOO) that supports both traditional Chinese and English and contains an almost…
Code of Federal Regulations, 2014 CFR
2014-07-01
... characteristics; i.e., levels of biochemical oxygen demand, suspended solids, etc. Each class is then assigned its... all users per unit of time. Bc = O&M cost for treatment of a unit of biochemical oxygen demand (BOD... only in cases where the water charge is based on a constant cost per unit of consumption. [39 FR 5270...
Code of Federal Regulations, 2012 CFR
2012-07-01
... characteristics; i.e., levels of biochemical oxygen demand, suspended solids, etc. Each class is then assigned its... all users per unit of time. Bc = O&M cost for treatment of a unit of biochemical oxygen demand (BOD... only in cases where the water charge is based on a constant cost per unit of consumption. [39 FR 5270...
Code of Federal Regulations, 2013 CFR
2013-07-01
... characteristics; i.e., levels of biochemical oxygen demand, suspended solids, etc. Each class is then assigned its... all users per unit of time. Bc = O&M cost for treatment of a unit of biochemical oxygen demand (BOD... only in cases where the water charge is based on a constant cost per unit of consumption. [39 FR 5270...
NASA Technical Reports Server (NTRS)
Callac, Christopher; Lunsford, Michelle
2005-01-01
The NASA Records Database, comprising a Web-based application program and a database, is used to administer an archive of paper records at Stennis Space Center. The system begins with an electronic form, into which a user enters information about records that the user is sending to the archive. The form is smart: it provides instructions for entering information correctly and prompts the user to enter all required information. Once complete, the form is digitally signed and submitted to the database. The system determines which storage locations are not in use, assigns the user's boxes of records to some of them, and enters these assignments in the database. Thereafter, the software tracks the boxes and can be used to locate them. By use of the search capabilities of the software, specific records can be sought by box storage locations, accession numbers, record dates, submitting organizations, or details of the records themselves. Boxes can be marked with such statuses as checked out, lost, transferred, and destroyed. The system can generate reports showing boxes awaiting destruction or transfer. When boxes are transferred to the National Archives and Records Administration (NARA), the system can automatically fill out NARA records-transfer forms. Currently, several other NASA Centers are considering deploying the NASA Records Database to help automate their records archives.
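The location-assignment step can be pictured with a few lines of code. The data structures here are invented for illustration; the actual system performs this logic inside a database application.

```python
def assign_boxes(boxes, locations, occupied):
    """Assign each incoming box to a storage location not currently in
    use, recording the assignment and marking the locations occupied
    (illustrative of the archive's assignment logic)."""
    free = [loc for loc in locations if loc not in occupied]
    if len(free) < len(boxes):
        raise ValueError("not enough free storage locations")
    assignment = dict(zip(boxes, free))
    occupied.update(assignment.values())
    return assignment
```

In the real system the resulting assignments are written to the database, which is what later enables lookup by storage location, accession number, or status.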
Data always getting bigger -- A scalable DOI architecture for big and expanding scientific data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prakash, Giri; Shrestha, Biva; Younkin, Katarina
The Atmospheric Radiation Measurement (ARM) Data Archive established a data citation strategy based on Digital Object Identifiers (DOIs) for the ARM datasets in order to facilitate citing continuous and diverse ARM datasets in articles and other papers. This strategy eases the tracking of data provided as supplements to articles and papers. Additionally, it allows future data users and the ARM Climate Research Facility to easily locate the exact data used in various articles. Traditionally, DOIs are assigned to individual digital objects (a report or a data table), but for ARM datasets, these DOIs are assigned to an ARM data product. This eliminates the need for creating DOIs for numerous components of the ARM data product, in turn making it easier for users to manage and cite the ARM data with fewer DOIs. In addition, the ARM data infrastructure team, with input from scientific users, developed a citation format and an online data citation generation tool for continuous data streams. As a result, this citation format includes DOIs along with additional details such as spatial and temporal information.
Optimizing the LSST Dither Pattern for Survey Uniformity
NASA Astrophysics Data System (ADS)
Awan, Humna; Gawiser, Eric J.; Kurczynski, Peter; Carroll, Christopher M.; LSST Dark Energy Science Collaboration
2015-01-01
The Large Synoptic Survey Telescope (LSST) will gather detailed data of the southern sky, enabling unprecedented study of Baryonic Acoustic Oscillations, which are an important probe of dark energy. These studies require a survey with highly uniform depth, and we aim to find an observation strategy that optimizes this uniformity. We have shown that in the absence of dithering (large telescope-pointing offsets), the LSST survey will vary significantly in depth. Hence, we implemented various dithering strategies, including random and repulsive random pointing offsets and spiral patterns with the spiral reaching completion in either a few months or the entire ten-year run. We employed three different implementations of dithering strategies: a single offset assigned to all fields observed on each night, offsets assigned to each field independently whenever the field is observed, and offsets assigned to each field only when the field is observed on a new night. Our analysis reveals that large dithers are crucial to guarantee survey uniformity and that assigning dithers to each field independently whenever the field is observed significantly increases this uniformity. These results suggest paths towards an optimal observation strategy that will enable LSST to achieve its science goals. We gratefully acknowledge support from the National Science Foundation REU program at Rutgers, PHY-1263280, and the Department of Energy, DE-SC0011636.
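One of the named strategies, random pointing offsets, can be sketched as drawing an offset uniformly within a disc of some maximum dither radius. The sampling scheme and radius used below are assumptions for illustration.

```python
import math
import random

def random_dither(max_offset_deg, rng=random):
    """Draw a random telescope-pointing offset (dx, dy) in degrees,
    uniform over a disc of radius max_offset_deg. The sqrt on the
    radial draw makes the distribution uniform in area, not radius."""
    r = max_offset_deg * math.sqrt(rng.random())
    theta = 2 * math.pi * rng.random()
    return r * math.cos(theta), r * math.sin(theta)
```

In the per-field-per-visit implementation the abstract found most uniform, a fresh offset would be drawn every time any field is observed, rather than once per night for all fields.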
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sreepathi, Sarat; D'Azevedo, Eduardo; Philip, Bobby
On large supercomputers, job scheduling systems may assign a non-contiguous node allocation to user applications depending on available resources. With parallel applications using MPI (Message Passing Interface), the default process ordering does not take into account the actual physical node layout available to the application. This contributes to non-locality in terms of physical network topology and impacts the communication performance of the application. To mitigate such performance penalties, this work describes techniques to identify a suitable task mapping that takes the layout of the allocated nodes as well as the application's communication behavior into account. During the first phase of this research, we instrumented and collected performance data to characterize the communication behavior of critical US DOE (United States Department of Energy) applications using an augmented version of the mpiP tool. Subsequently, we developed several reordering methods (spectral bisection, neighbor join tree, etc.) to combine node layout and application communication data for optimized task placement. We developed a tool called mpiAproxy to facilitate detailed evaluation of the various reordering algorithms without requiring full application executions. This work presents a comprehensive performance evaluation (14,000 experiments) of the various task mapping techniques in lowering communication costs on Titan, the leadership-class supercomputer at Oak Ridge National Laboratory.
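The quantity these reorderings aim to lower can be written as a simple mapping cost: the traffic between each pair of ranks weighted by the network distance between their assigned nodes. This is a standard formulation; the matrices below are toy data, not measurements from the study.

```python
def mapping_cost(comm_volume, hop_distance, mapping):
    """Total communication cost of placing MPI ranks on nodes:
    comm_volume[a][b] is the traffic between ranks a and b, and
    hop_distance[x][y] is the network distance between nodes x and y.
    mapping[a] gives the node assigned to rank a."""
    cost = 0
    n = len(mapping)
    for a in range(n):
        for b in range(n):
            cost += comm_volume[a][b] * hop_distance[mapping[a]][mapping[b]]
    return cost
```

A reordering method such as spectral bisection searches for a `mapping` that keeps heavily communicating ranks on nearby nodes, i.e. one with a lower value of this cost.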
A hybrid symbolic/finite-element algorithm for solving nonlinear optimal control problems
NASA Technical Reports Server (NTRS)
Bless, Robert R.; Hodges, Dewey H.
1991-01-01
The general code described is capable of solving difficult nonlinear optimal control problems by using finite elements and a symbolic manipulator. Quick and accurate solutions are obtained with a minimum of user interaction. Since no user programming is required for most problems, there are tremendous savings to be gained in terms of time and money.
Stan: A Probabilistic Programming Language for Bayesian Inference and Optimization
ERIC Educational Resources Information Center
Gelman, Andrew; Lee, Daniel; Guo, Jiqiang
2015-01-01
Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models. It can be called from the command line, R, Python, Matlab, or Julia, and has great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from users' and developers'…
A Data-Driven Solution for Performance Improvement
NASA Technical Reports Server (NTRS)
2002-01-01
Marketed as the "Software of the Future," Optimal Engineering Systems P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user-friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.
Frequency Assignment for Joint Aerial Layer Network High-Capacity Backbone
2017-08-11
...performance of the proposed approach. Keywords: Frequency Assignment, JALN, Resource Allocation, Network Optimization, Performance Evaluation. Peng Wang.
Measuring Diagnoses: ICD Code Accuracy
O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M
2005-01-01
Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999
Tasking and sharing sensing assets using controlled natural language
NASA Astrophysics Data System (ADS)
Preece, Alun; Pizzocaro, Diego; Braines, David; Mott, David
2012-06-01
We introduce an approach to representing intelligence, surveillance, and reconnaissance (ISR) tasks at a relatively high level in controlled natural language. We demonstrate that this facilitates both human interpretation and machine processing of tasks. More specifically, it allows the automatic assignment of sensing assets to tasks, and the informed sharing of tasks between collaborating users in a coalition environment. To enable automatic matching of sensor types to tasks, we created a machine-processable knowledge representation based on the Military Missions and Means Framework (MMF), and implemented a semantic reasoner to match task types to sensor types. We combined this mechanism with a sensor-task assignment procedure based on a well-known distributed protocol for resource allocation. In this paper, we re-formulate the MMF ontology in Controlled English (CE), a type of controlled natural language designed to be readable by a native English speaker whilst representing information in a structured, unambiguous form to facilitate machine processing. We show how CE can be used to describe both ISR tasks (for example, detection, localization, or identification of particular kinds of object) and sensing assets (for example, acoustic, visual, or seismic sensors, mounted on motes or unmanned vehicles). We show how these representations enable an automatic sensor-task assignment process. Where a group of users are cooperating in a coalition, we show how CE task summaries give users in the field a high-level picture of ISR coverage of an area of interest. This allows them to make efficient use of sensing resources by sharing tasks.
1975-09-01
[OCR fragment of a GPSS simulation program listing (PLAN/ASSIGN/ADVANCE/TEST/TRANSFER blocks); the original code is not recoverable.]
A new implementation of the programming system for structural synthesis (PROSSS-2)
NASA Technical Reports Server (NTRS)
Rogers, James L., Jr.
1984-01-01
This new implementation of the PROgramming System for Structural Synthesis (PROSSS-2) combines a general-purpose finite element computer program for structural analysis, a state-of-the-art optimization program, and several user-supplied, problem-dependent computer programs. The result is flexibility of the optimization procedure and organization, and versatility in the formulation of constraints and design variables. The analysis-optimization process results in a minimized objective function, typically the mass. The analysis and optimization programs are executed repeatedly by looping through the system until the process is stopped by a user-defined termination criterion. However, some of the analysis, such as model definition, need only be done one time, and the results are saved for future use. The user must write some small, simple FORTRAN programs to interface between the analysis and optimization programs. One of these programs, the front processor, converts the design variables output from the optimizer into the suitable format for input into the analyzer. Another, the end processor, retrieves the behavior variables and, optionally, their gradients from the analysis program and evaluates the objective function and constraints and, optionally, their gradients. These quantities are output in a format suitable for input into the optimizer. These user-supplied programs are problem-dependent because they depend primarily upon which finite elements are being used in the model. PROSSS-2 differs from the original PROSSS in that the optimizer and front and end processors have been integrated into the finite element computer program. This was done to reduce the complexity and increase portability of the system, and to take advantage of the data handling features found in the finite element program.
NASA Technical Reports Server (NTRS)
Rash, James
2014-01-01
NASA's space data-communications infrastructure (the Space Network and the Ground Network) provides scheduled (as well as some limited types of unscheduled) data-communications services to user spacecraft. The Space Network operates several orbiting geostationary platforms (the Tracking and Data Relay Satellite System (TDRSS)), each with its own service-delivery antennas onboard. The Ground Network operates service-delivery antennas at ground stations located around the world. Together, these networks enable data transfer between user spacecraft and their mission control centers on Earth. Scheduling data-communications events for spacecraft that use the NASA communications infrastructure (the relay satellites and the ground stations) can be accomplished today with software having an operational heritage dating from the 1980s or earlier. An implementation of the scheduling methods and algorithms disclosed and formally specified herein will produce globally optimized schedules with not only optimized service delivery by the space data-communications infrastructure but also optimized satisfaction of all user requirements and prescribed constraints, including radio frequency interference (RFI) constraints. Evolutionary algorithms, a class of probabilistic strategies for searching large solution spaces, are the essential technology invoked and exploited in this disclosure. Also disclosed are secondary methods and algorithms for optimizing the execution efficiency of the schedule-generation algorithms themselves. The scheduling methods and algorithms as presented are adaptable to accommodate the complexity of scheduling the civilian and/or military data-communications infrastructure within the expected range of future users and space- or ground-based service-delivery assets. Finally, the problem itself, and the methods and algorithms, are generalized and specified formally.
The generalized methods and algorithms are applicable to a very broad class of combinatorial-optimization problems that encompasses, among many others, the problem of generating optimal space-data communications schedules.
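A toy (1+1) evolutionary algorithm illustrates the class of search strategy named above. The request/slot/antenna model, fitness function, and all parameters here are illustrative assumptions, far simpler than the disclosed scheduling problem.

```python
import random

def evolve_schedule(requests, n_slots, n_antennas, gens=500, seed=0):
    """(1+1) evolutionary search for a conflict-free contact schedule.

    requests: list of (user, preferred_slot). A genome assigns each
    request a (slot, antenna); fitness counts requests that get their
    preferred slot without double-booking an antenna in that slot.
    """
    rng = random.Random(seed)

    def rand_gene():
        return (rng.randrange(n_slots), rng.randrange(n_antennas))

    def fitness(genome):
        used, score = set(), 0
        for (_user, want), (slot, ant) in zip(requests, genome):
            if (slot, ant) in used:
                continue  # conflict: antenna already busy in this slot
            used.add((slot, ant))
            if slot == want:
                score += 1
        return score

    best = [rand_gene() for _ in requests]
    for _ in range(gens):
        # mutate each gene with probability 0.3; keep child if no worse
        child = [g if rng.random() > 0.3 else rand_gene() for g in best]
        if fitness(child) >= fitness(best):
            best = child
    return best, fitness(best)
```

With three users all requesting slot 0 and only two antennas, at most two requests can be fully satisfied, and the search reliably finds a schedule satisfying at least one.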
Optimization of USMC Hornet Inventory
2016-06-01
maintenance activities while adhering to the required number of aircraft for operational use. He introduced an optimization based on an ILP... operational requirements across the entire planning process. In dealing with tail assignment as an optimization problem instead of a feasibility...aircraft and the goal is to minimize the penalties associated with failing to meet operational requirements. This research focuses on the optimal
User's manual for the Macintosh version of PASCO
NASA Technical Reports Server (NTRS)
Lucas, S. H.; Davis, Randall C.
1991-01-01
A user's manual for Macintosh PASCO is presented. Macintosh PASCO is an Apple Macintosh version of PASCO, an existing computer code for structural analysis and optimization of longitudinally stiffened composite panels. PASCO combines a rigorous buckling analysis program with a nonlinear mathematical optimization routine to minimize panel mass. Macintosh PASCO accepts the same input as mainframe versions of PASCO. As output, Macintosh PASCO produces a text file and mode shape plots in the form of Apple Macintosh PICT files. Only the user interface for Macintosh is discussed here.
Code Optimization and Parallelization on the Origins: Looking from Users' Perspective
NASA Technical Reports Server (NTRS)
Chang, Yan-Tyng Sherry; Thigpen, William W. (Technical Monitor)
2002-01-01
Parallel machines are becoming the main compute engines for high performance computing. Despite their increasing popularity, it is still a challenge for most users to learn the basic techniques to optimize/parallelize their codes on such platforms. In this paper, we present some experiences on learning these techniques for the Origin systems at the NASA Advanced Supercomputing Division. Emphasis of this paper will be on a few essential issues (with examples) that general users should master when they work with the Origins as well as other parallel systems.
Eye-gaze control of the computer interface: Discrimination of zoom intent
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldberg, J.H.; Schryver, J.C.
1993-10-01
An analysis methodology and associated experiment were developed to assess whether definable and repeatable signatures of eye-gaze characteristics are evident, preceding a decision to zoom-in, zoom-out, or not to zoom at a computer interface. This user intent discrimination procedure can have broad application in disability aids and telerobotic control. Eye-gaze was collected from 10 subjects in a controlled experiment, requiring zoom decisions. The eye-gaze data were clustered, then fed into a multiple discriminant analysis (MDA) for optimal definition of heuristics separating the zoom-in, zoom-out, and no-zoom conditions. Confusion matrix analyses showed that a number of variable combinations classified at a statistically significant level, but practical significance was more difficult to establish. Composite contour plots demonstrated the regions in parameter space consistently assigned by the MDA to unique zoom conditions. Peak classification occurred at about 1200--1600 msec. Improvements in the methodology to achieve practical real-time zoom control are considered.
A User’s Guide to BISAM (BIvariate SAMple): The Bivariate Data Modeling Program.
1983-08-01
method for the null case specified and is then used to form the bivariate density-quantile function as described in section 4. If D(U) in stage...employed assigns average ranks for tied observations. Other methods for assigning ranks to tied observations are often employed but are not attempted...observations will weaken the results obtained since underlying continuous distributions are assumed. One should avoid such situations if possible. Two methods
Panatto, Donatella; Domnich, Alexander; Gasparini, Roberto; Bonanni, Paolo; Icardi, Giancarlo; Amicizia, Daniela; Arata, Lucia; Carozzo, Stefano; Signori, Alessio; Bechini, Angela; Boccalini, Sara
2016-12-02
The recently launched Pneumo Rischio eHealth project, which consists of an app, a website, and social networking activity, is aimed at increasing public awareness of invasive pneumococcal disease (IPD). The launch of this project was prompted by the inadequate awareness of IPD among both laypeople and health care workers, the heavy socioeconomic burden of IPD, and the far from optimal vaccination coverage in Italy, despite the availability of safe and effective vaccines. The objectives of our study were to analyze trends in Pneumo Rischio usage before and after a promotional campaign, to characterize its end users, and to assess its user-rated quality. At 7 months after launching Pneumo Rischio, we established a 4-month marketing campaign to promote the project. This intervention used various approaches and channels, including both traditional and digital marketing strategies. To highlight usage trends, we used different techniques of time series analysis and modeling, including a modified Mann-Kendall test, change-point detection, and segmented negative binomial regression of interrupted time series. Users were characterized in terms of demographics and IPD risk categories. Customer-rated quality was evaluated by means of a standardized tool in a sample of app users. Over 1 year, the app was accessed by 9295 users and the website was accessed by 143,993 users, while the project's Facebook page had 1216 fans. The promotional intervention was highly effective in increasing the daily number of users. In particular, the Mann-Kendall trend test revealed a significant (P ≤.01) increasing trend in both app and website users, while change-point detection analysis showed that the first significant change corresponded to the start of the promotional campaign. Regression analysis showed a significant immediate effect of the intervention, with a mean increase in daily numbers of users of 1562% (95% CI 456%-4870%) for the app and 620% (95% CI 176%-1777%) for the website. 
Similarly, the postintervention daily trend in the number of users was positive, with a relative increase of 0.9% (95% CI 0.0%-1.8%) for the app and 1.4% (95% CI 0.7%-2.1%) for the website. Demographics differed between app and website users and Facebook fans. A total of 69.15% (10,793/15,608) of users could be defined as being at risk of IPD, while 4729 users expressed intentions to ask their doctor for further information on IPD. The mean app quality score assigned by end users was approximately 79.5% (397/500). Despite its specific topic, Pneumo Rischio was accessed by a considerable number of users, who ranked it as a high-quality project. In order to reach their target populations, however, such projects should be promoted. ©Donatella Panatto, Alexander Domnich, Roberto Gasparini, Paolo Bonanni, Giancarlo Icardi, Daniela Amicizia, Lucia Arata, Stefano Carozzo, Alessio Signori, Angela Bechini, Sara Boccalini. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 02.12.2016.
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Naghipour Ghezeljeh, Paria; Bednarcyk, Brett A.
2018-01-01
This document describes a recently developed analysis tool that enhances the resident capabilities of the Micromechanics Analysis Code with the Generalized Method of Cells (MAC/GMC) and its application. MAC/GMC is a composite material and laminate analysis software package developed at NASA Glenn Research Center. The primary focus of the current effort is to provide a graphical user interface (GUI) capability that helps users optimize highly nonlinear viscoplastic constitutive law parameters by fitting experimentally observed/measured stress-strain responses under various thermo-mechanical conditions for braided composites. The tool has been developed utilizing the MATrix LABoratory (MATLAB) (The Mathworks, Inc., Natick, MA) programming language. Illustrative examples shown are for a specific braided composite system wherein the matrix viscoplastic behavior is represented by a constitutive law described by seven parameters. The tool is general enough to fit any number of experimentally observed stress-strain responses of the material. The number of parameters to be optimized, as well as the importance given to each stress-strain response, are the user's choice. Three different optimization algorithms are included: (1) optimization based on the gradient method, (2) genetic algorithm (GA) based optimization, and (3) Particle Swarm Optimization (PSO). The user can mix and match the three algorithms. For example, one can start optimization with either approach (2) or (3) and then use the optimized solution to further fine-tune with approach (1). The secondary focus of this paper is to demonstrate the application of this tool to optimize/calibrate parameters for a nonlinear viscoplastic matrix to predict stress-strain curves (for constituent and composite levels) at different rates, temperatures and/or loading conditions utilizing the Generalized Method of Cells.
After preliminary validation of the tool through comparison with experimental results, a detailed virtual parametric study is presented wherein the combined effects of temperature and loading rate on the predicted response of a braided composite is investigated.
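As an illustration of the PSO option above, the following minimal sketch calibrates a hypothetical two-parameter saturating stress-strain law (not the seven-parameter viscoplastic model of the paper) against observed data; all parameter values and the model form are assumptions.

```python
import math
import random

def pso_fit(data, bounds, n_particles=20, iters=200, seed=0):
    """Particle Swarm Optimization of material-law parameters.

    Fits sigma(eps) = smax * (1 - exp(-E * eps / smax)) -- a hypothetical
    2-parameter law -- to observed (eps, sigma) pairs by minimizing the
    sum of squared errors. bounds: [(lo, hi), ...] per parameter.
    """
    rng = random.Random(seed)

    def loss(p):
        E, smax = p
        return sum((smax * (1 - math.exp(-E * e / smax)) - s) ** 2
                   for e, s in data)

    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # each particle's best position
    gbest = min(pbest, key=loss)         # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # inertia + cognitive pull (pbest) + social pull (gbest)
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            if loss(pos[i]) < loss(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest + [gbest], key=loss)
    return gbest, loss(gbest)
```

Because the global best only improves, running more iterations from the same seed can never worsen the fit; fitting synthetic data generated from known parameters shows this directly.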
PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.
Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter
2016-04-01
Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of drug discovery in vivo studies. Several realistic experimental design case studies were collected and many preclinical experimental teams were consulted to determine the design goal of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure a responsive user-software interaction through a rich graphical user interface, and at the same time, achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was on average, over 14 test problems, 30 times faster in PopED lite compared to an already existing optimal design software tool. PopED lite is now used in real drug discovery projects and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss proposed design, test another design, etc). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
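The design evaluation described above can be sketched with a D-optimality criterion (the determinant of the Fisher Information Matrix) for a hypothetical one-parameter-pair exponential model C(t) = A·exp(-k·t); the model, parameter values, and exhaustive search are assumptions for illustration, not PopED lite's actual machinery.

```python
import math
from itertools import combinations

def fim_det(times, A=10.0, k=0.5):
    """det of the 2x2 Fisher Information Matrix for C(t) = A * exp(-k*t).

    Sensitivities: dC/dA = exp(-k t), dC/dk = -A t exp(-k t).
    A larger det(FIM) means a more informative design (D-optimality).
    Parameter values are illustrative.
    """
    f11 = f12 = f22 = 0.0
    for t in times:
        gA = math.exp(-k * t)
        gk = -A * t * math.exp(-k * t)
        f11 += gA * gA
        f12 += gA * gk
        f22 += gk * gk
    return f11 * f22 - f12 * f12

def best_design(candidate_times, n_samples):
    """Exhaustively pick the n_samples sampling times maximizing det(FIM)."""
    return max(combinations(candidate_times, n_samples), key=fim_det)
```

For two samples the determinant reduces to 100·e^(-(t1+t2))·(t2-t1)², so the optimum trades off early sampling against spread; on the candidate grid below that optimum is (0.5, 2).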
Automatic Assignment of Methyl-NMR Spectra of Supramolecular Machines Using Graph Theory.
Pritišanac, Iva; Degiacomi, Matteo T; Alderson, T Reid; Carneiro, Marta G; Ab, Eiso; Siegal, Gregg; Baldwin, Andrew J
2017-07-19
Methyl groups are powerful probes for the analysis of structure, dynamics and function of supramolecular assemblies, using both solution- and solid-state NMR. Widespread application of the methodology has been limited due to the challenges associated with assigning spectral resonances to specific locations within a biomolecule. Here, we present Methyl Assignment by Graph Matching (MAGMA), for the automatic assignment of methyl resonances. A graph matching protocol examines all possibilities for each resonance in order to determine an exact assignment that includes a complete description of any ambiguity. MAGMA gives 100% accuracy in confident assignments when tested against both synthetic data, and 9 cross-validated examples using both solution- and solid-state NMR data. We show that this remarkable accuracy enables a user to distinguish between alternative protein structures. In a drug discovery application on HSP90, we show the method can rapidly and efficiently distinguish between possible ligand binding modes. By providing an exact and robust solution to methyl resonance assignment, MAGMA can facilitate significantly accelerated studies of supramolecular machines using methyl-based NMR spectroscopy.
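The core idea of assignment by graph matching can be sketched as brute-force backtracking: map each observed resonance to a structure position so that every observed connectivity (e.g. an NOE between two methyl resonances) corresponds to a methyl-methyl contact in the structure. This is only a conceptual sketch; MAGMA's actual graph-matching algorithm is far more efficient and also characterizes ambiguity.

```python
def graph_assignments(data_adj, struct_adj):
    """Yield all assignments of resonances to structure positions such
    that every observed data edge maps onto a structural contact.

    data_adj, struct_adj: dict mapping a node to its set of neighbours.
    Yields dicts: resonance -> structure position.
    """
    resonances = list(data_adj)

    def extend(mapping, used):
        if len(mapping) == len(resonances):
            yield dict(mapping)
            return
        r = resonances[len(mapping)]
        for pos in struct_adj:
            if pos in used:
                continue
            # every already-assigned data neighbour must be a contact
            if all(mapping[q] in struct_adj[pos]
                   for q in data_adj[r] if q in mapping):
                mapping[r] = pos
                yield from extend(mapping, used | {pos})
                del mapping[r]

    yield from extend({}, set())
```

Matching a three-resonance chain a-b-c onto a three-position chain 1-2-3 yields exactly the two symmetric assignments, i.e. the remaining ambiguity is reported in full.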
Designing for User Cognition and Affect in Software Instructions
ERIC Educational Resources Information Center
van der Meij, Hans
2008-01-01
In this paper we examine how to design software instructions for user cognition and affect. A basic and co-user manual are compared. The first provides fundamental support for both; the latter includes a buddy to further optimize support for user affect. The basic manual was faster and judged as easier to process than the co-user manual. In…
Schneider, Francine; van Osch, Liesbeth; de Vries, Hein
2012-02-14
The Internet has become a popular medium for offering tailored and targeted health promotion programs to the general public. However, suboptimal levels of program use in the target population limit the public health impact of these programs. Optimizing program development is considered as one of the main processes to increase usage rates. To distinguish factors potentially related to optimal development of health-related websites by involving both experts and potential users. By considering and incorporating the opinions of experts and potential users in the development process, involvement in the program is expected to increase, consequently resulting in increased appreciation, lower levels of attrition, and higher levels of sustained use. We conducted a systematic three-round Delphi study through the Internet. Both national and international experts (from the fields of health promotion, health psychology, e-communication, and technical Web design) and potential users were invited via email to participate. During this study an extensive list of factors potentially related to optimal development of health-related websites was identified, by focusing on factors related to layout, general and risk information provision, questionnaire use, additional services, and ease of use. Furthermore, we assessed the extent to which experts and potential users agreed on the importance of these factors. Differences as well as similarities among experts and potential users were deduced. In total, 20 of 62 contacted experts participated in the first round (32% response rate); 60 of 200 contacted experts (30% response rate) and 210 potential users (95% response rate) completed the second-round questionnaire, and 32 of 60 contacted experts completed the third round (53% response rate).
Results revealed important factors consented upon by experts and potential users (eg, ease of use, clear structure, and detailed health information provision), as well as differences regarding important factors consented upon by experts (eg, visual aids, self-monitoring tool, and iterative health feedback) or by potential users only (eg, bread crumb navigation and prevention of receiving spam). This study is an important first step in determining the agreed-upon factors that should be taken into account when developing online health promotion programs. The public health impact of these programs will be improved by optimizing the development process in line with these factors.
Hybrid protection algorithms based on game theory in multi-domain optical networks
NASA Astrophysics Data System (ADS)
Guo, Lei; Wu, Jingjing; Hou, Weigang; Liu, Yejun; Zhang, Lincong; Li, Hongming
2011-12-01
With the network size increasing, the optical backbone is divided into multiple domains, and each domain has its own network operator and management policy. At the same time, failures in an optical network may lead to a huge data loss since each wavelength carries a lot of traffic. Therefore, survivability in multi-domain optical networks is very important. However, existing survivability algorithms achieve only a unilateral optimization of the profit of either users or network operators; they cannot find the double-win optimal solution that considers economic factors for both users and network operators. Thus, in this paper we develop a multi-domain network model involving multiple Quality of Service (QoS) parameters. After presenting the link evaluation approach based on fuzzy mathematics, we propose the game model to find the optimal solution to maximize the user's utility, the network operator's utility, and the joint utility of user and network operator. Since the problem of finding the double-win optimal solution is NP-complete, we propose two new hybrid protection algorithms, the Intra-domain Sub-path Protection (ISP) algorithm and the Inter-domain End-to-end Protection (IEP) algorithm. In ISP and IEP, hybrid protection means that an intelligent algorithm based on Bacterial Colony Optimization (BCO) and a heuristic algorithm are used to solve survivability in intra-domain routing and inter-domain routing, respectively. Simulation results show that ISP and IEP have similar comprehensive utility. In addition, ISP has better resource utilization efficiency, lower blocking probability, and higher network operator's utility, while IEP has better user's utility.
Evangelista, Daniela; Zuccaro, Antonio; Lančinskas, Algirdas; Žilinskas, Julius; Guarracino, Mario R
2016-02-17
The cost per patient of next generation sequencing for detection of rare mutations may be significantly reduced using pooled experiments. Recently, some techniques have been proposed for the planning of pooled experiments and for the optimal allocation of patients into pools. However, the lack of a user-friendly resource for planning the design of pooled experiments forces scientists to do frequent, complex and long computations. OPENDoRM is a powerful collection of novel mathematical algorithms usable via an intuitive graphical user interface. It enables researchers to speed up the planning of their routine experiments, as well as to support scientists without specific bioinformatics expertise. Users can automatically carry out analysis in terms of costs associated with the optimal allocation of patients in pools. They are also able to choose between three distinct mathematical pooling methods, each of which also suggests the optimal configuration for the submitted experiment. Importantly, in order to keep track of the performed experiments, users can save and export the results of their experiments in standard tabular and chart formats. OPENDoRM is a freely available web-oriented application for the planning of pooled NGS experiments, available at: http://www-labgtp.na.icar.cnr.it/OPENDoRM. Its easy and intuitive graphical user interface enables researchers to plan their experiments using novel algorithms, and to interactively visualize the results.
Folksonomies and clustering in the collaborative system CiteULike
NASA Astrophysics Data System (ADS)
Capocci, Andrea; Caldarelli, Guido
2008-06-01
We analyze CiteULike, an online collaborative tagging system where users bookmark and annotate scientific papers. Such a system can be naturally represented as a tri-partite graph whose nodes represent papers, users and tags connected by individual tag assignments. The semantics of tags is studied here, in order to uncover the hidden relationships between tags. We find that the clustering coefficient can be used to analyze the semantic patterns among tags.
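The tag-level analysis described above can be sketched by projecting (user, paper, tag) triples onto a tag co-occurrence graph and computing each tag's local clustering coefficient; the input format and linking rule are illustrative assumptions about how such a projection might be built.

```python
from itertools import combinations
from collections import defaultdict

def tag_clustering(assignments):
    """From (user, paper, tag) triples, link two tags whenever some user
    applied both to the same paper, then return each tag's local
    clustering coefficient in the resulting co-occurrence graph.
    """
    by_post = defaultdict(set)
    for user, paper, tag in assignments:
        by_post[(user, paper)].add(tag)
    nbrs = defaultdict(set)
    for tags in by_post.values():
        for t1, t2 in combinations(sorted(tags), 2):
            nbrs[t1].add(t2)
            nbrs[t2].add(t1)
    cc = {}
    for t, ns in nbrs.items():
        k = len(ns)
        if k < 2:
            cc[t] = 0.0
            continue
        # fraction of neighbour pairs that are themselves linked
        links = sum(1 for u, v in combinations(sorted(ns), 2) if v in nbrs[u])
        cc[t] = 2.0 * links / (k * (k - 1))
    return cc
```

A tag whose neighbours all co-occur with each other (a tight semantic cluster) gets coefficient 1.0, while a tag bridging unrelated topics gets a lower value.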
2012-09-01
scheduler to adapt its uplink and downlink assignments to channel conditions. Sleep mode is used by the MS to minimize power drain and radio...is addressed in one resource unit, while for multi-user (MU) schemes, multiple users can be scheduled in one resource unit. Open-loop techniques...
NASA Astrophysics Data System (ADS)
Flinders, Bryn; Beasley, Emma; Verlaan, Ricky M.; Cuypers, Eva; Francese, Simona; Bassindale, Tom; Clench, Malcolm R.; Heeren, Ron M. A.
2017-08-01
Matrix-assisted laser desorption/ionization-mass spectrometry imaging (MALDI-MSI) has been employed to rapidly screen longitudinally sectioned drug user hair samples for cocaine and its metabolites using continuous raster imaging. Optimization of the spatial resolution and raster speed were performed on intact cocaine contaminated hair samples. The optimized settings (100 × 150 μm at 0.24 mm/s) were subsequently used to examine longitudinally sectioned drug user hair samples. The MALDI-MS/MS images showed the distribution of the most abundant cocaine product ion at m/z 182. Using the optimized settings, multiple hair samples obtained from two users were analyzed in approximately 3 h: six times faster than the standard spot-to-spot acquisition method. Quantitation was achieved using longitudinally sectioned control hair samples sprayed with a cocaine dilution series. A multiple reaction monitoring (MRM) experiment was also performed using the 'dynamic pixel' imaging method to screen for cocaine and a range of its metabolites, in order to differentiate between contaminated hairs and drug users. Cocaine, benzoylecgonine, and cocaethylene were detectable, in agreement with analyses carried out using the standard LC-MS/MS method.
Performance tradeoffs in static and dynamic load balancing strategies
NASA Technical Reports Server (NTRS)
Iqbal, M. A.; Saltz, J. H.; Bokhari, S. H.
1986-01-01
The problem of uniformly distributing the load of a parallel program over a multiprocessor system was considered. A program was analyzed whose structure permits the computation of the optimal static solution. Then four strategies for load balancing were described and their performance compared. The strategies are: (1) the optimal static assignment algorithm which is guaranteed to yield the best static solution, (2) the static binary dissection method which is very fast but sub-optimal, (3) the greedy algorithm, a static fully polynomial time approximation scheme, which estimates the optimal solution to arbitrary accuracy, and (4) the predictive dynamic load balancing heuristic which uses information on the precedence relationships within the program and outperforms any of the static methods. It is also shown that the overhead incurred by the dynamic heuristic is reduced considerably if it is started off with a static assignment provided by either of the other three strategies.
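The greedy strategy described above, which assigns each task to the currently least-loaded processor, can be sketched as follows. The task weights, processor count, and largest-first ordering below are invented for illustration and are not the paper's exact algorithm.

```python
# Minimal sketch of a greedy static load-balancing heuristic: sort tasks by
# weight (largest first) and always assign to the least-loaded processor.
import heapq

def greedy_assign(task_weights, n_procs):
    """Return a task -> processor map and the resulting makespan."""
    loads = [(0.0, p) for p in range(n_procs)]
    heapq.heapify(loads)
    assignment = {}
    for task, w in sorted(enumerate(task_weights), key=lambda t: -t[1]):
        load, p = heapq.heappop(loads)   # least-loaded processor
        assignment[task] = p
        heapq.heappush(loads, (load + w, p))
    makespan = max(load for load, _ in loads)
    return assignment, makespan

assignment, makespan = greedy_assign([7, 5, 4, 3, 3, 2], 3)
print(makespan)  # 9.0
```

A static assignment like this can also seed a dynamic heuristic, as the abstract notes, reducing the dynamic scheme's overhead.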
75 FR 64245 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-19
... Federal and non-Federal users. The Web-based system provides a means for non-Federal applicants to rapidly... replaced the manual RF assignment process used by the Federal Communications Commission and NTIA. The...
Network connectivity enhancement by exploiting all optical multicast in semiconductor ring laser
NASA Astrophysics Data System (ADS)
Siraj, M.; Memon, M. I.; Shoaib, M.; Alshebeili, S.
2015-03-01
Smart phone and tablet applications can provide troops with the means to execute, control, and analyze sophisticated operations, with commanders delivering crucial documents directly to troops wherever and whenever needed. Wireless mesh networks (WMNs) are a cutting-edge networking technology capable of supporting the Joint Tactical Radio System (JTRS). WMNs can provide the bandwidth needed for applications such as handheld radios and communication for airborne and ground vehicles. Routing management tasks can be handled efficiently through WMNs via a central command-and-control center. As the spectrum space is congested, cognitive radios are a welcome technology that provides much-needed bandwidth. They can self-configure, adapt to user requirements, provide dynamic spectrum access to minimize interference, and deliver optimal power output. Indoor environments can suffer from poor signal quality and reduced coverage. In this paper, a solution utilizing cognitive-radio WMNs (CR WMNs) over an optical network is presented by creating picocells (PCs) inside the indoor environment. The phenomenon of four-wave mixing (FWM) is exploited to generate all-optical multicast using a semiconductor ring laser (SRL), so that the same signal is transmitted at different wavelengths. Every PC is assigned a unique wavelength. Using CR technology in conjunction with PCs will not only solve the network coverage issue but also provide good bandwidth to secondary users.
Moore, David J.; Montoya, Jessica L.; Blackstone, Kaitlin; Depp, Colin A.; Atkinson, J. Hampton; TMARC Group, The
2013-01-01
The feasibility, use, and acceptability of text messages to track methamphetamine use and promote antiretroviral treatment (ART) adherence among HIV-infected methamphetamine users was examined. From an ongoing randomized controlled trial, 30-day text response rates of participants assigned to the intervention (individualized texting for adherence building (iTAB), n = 20) were compared to those in the active comparison condition (n = 9). Both groups received daily texts assessing methamphetamine use, and the iTAB group additionally received personalized daily ART adherence reminder texts. Response rate for methamphetamine use texts was 72.9% with methamphetamine use endorsed 14.7% of the time. Text-derived methamphetamine use data was correlated with data from a structured substance use interview covering the same time period (P < 0.05). The iTAB group responded to 69.0% of adherence reminder texts; among those responses, 81.8% endorsed taking ART medication. Standardized feedback questionnaire responses indicated little difficulty with the texts, satisfaction with the study, and beliefs that future text-based interventions would be helpful. Moreover, most participants believed the intervention reduced methamphetamine use and improved adherence. Qualitative feedback regarding the intervention was positive. Future studies will refine and improve iTAB for optimal acceptability and efficacy. This trial is registered with ClinicalTrials.gov NCT01317277. PMID:24078868
Order-Constrained Solutions in K-Means Clustering: Even Better than Being Globally Optimal
ERIC Educational Resources Information Center
Steinley, Douglas; Hubert, Lawrence
2008-01-01
This paper proposes an order-constrained K-means cluster analysis strategy, and implements that strategy through an auxiliary quadratic assignment optimization heuristic that identifies an initial object order. A subsequent dynamic programming recursion is applied to optimally subdivide the object set subject to the order constraint. We show that…
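The dynamic-programming recursion mentioned above can be sketched for the core subproblem: given an object order, optimally split the sequence into K contiguous clusters minimizing within-cluster sum of squared deviations. The data and K below are invented; the paper's auxiliary quadratic-assignment ordering step is not reproduced.

```python
# Sketch of order-constrained clustering by dynamic programming.
# dp[c][j] = best cost of splitting the first j values into c contiguous clusters.
def order_constrained_kmeans(values, k):
    n = len(values)
    pre, pre2 = [0.0], [0.0]           # prefix sums of values and squares
    for v in values:
        pre.append(pre[-1] + v)
        pre2.append(pre2[-1] + v * v)

    def sse(i, j):                     # within-cluster SSE of values[i:j]
        s, s2, m = pre[j] - pre[i], pre2[j] - pre2[i], j - i
        return s2 - s * s / m

    INF = float("inf")
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    cut = [[0] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0.0
    for c in range(1, k + 1):
        for j in range(c, n + 1):
            for i in range(c - 1, j):
                cand = dp[c - 1][i] + sse(i, j)
                if cand < dp[c][j]:
                    dp[c][j], cut[c][j] = cand, i

    bounds, j = [], n                  # recover cluster boundaries
    for c in range(k, 0, -1):
        i = cut[c][j]
        bounds.append((i, j))
        j = i
    return dp[k][n], bounds[::-1]

cost, clusters = order_constrained_kmeans([1, 2, 10, 11, 12, 30], 3)
print(clusters)  # [(0, 2), (2, 5), (5, 6)]
```

Because clusters must respect the given order, the recursion is exact for that order, which is what makes the overall strategy competitive with globally optimal unconstrained K-means.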
Specialty Task Force: A Strategic Component to Electronic Health Record (EHR) Optimization.
Romero, Mary Rachel; Staub, Allison
2016-01-01
The post-implementation stage follows an electronic health record (EHR) deployment. Analysts and end users confront the reality that some of the concepts and designs initially planned and created may not complement the workflow, creating anxiety, dissatisfaction, and failures in early adoption of the system. Problems encountered during deployment are numerous and vary from simple to complex. Redundant ticket submission creates a backlog for information technology personnel, resulting in delays in resolving concerns with the EHR system. The optimization process allows for evaluation of the system and reassessment of users' needs. A solid, well-executed optimization infrastructure can help minimize unexpected end-user disruptions and tailor the system to meet regulatory agency goals and practice standards. A well-devised plan to resolve problems during post-implementation is necessary for cost containment and streamlined communication. Creating a specialty-specific collaborative task force is efficacious and expedites resolution of users' concerns through a more structured process.
Constraint programming based biomarker optimization.
Zhou, Manli; Luo, Youxi; Sun, Guoquan; Mai, Guoqin; Zhou, Fengfeng
2015-01-01
Efficient and intuitive characterization of biological big data is becoming a major challenge for modern bio-OMIC based scientists, and interactive visualization and exploration of big data have proven to be successful solutions. Most existing feature selection algorithms do not allow interactive input from users during the optimization process. This study investigates the question by fixing a few user-input features in the finally selected feature subset, formulating these user-input features as constraints in a programming model. The proposed algorithm, fsCoP (feature selection based on constrained programming), performs similarly to or much better than existing feature selection algorithms, even with constraints drawn from both the literature and existing algorithms. An fsCoP biomarker may be intriguing for further wet-lab validation, since it satisfies both the classification optimization function and biomedical knowledge. fsCoP may also be used for interactive exploration of bio-OMIC big data by interactively adding user-defined constraints during modeling.
Ant colony optimization for solving university facility layout problem
NASA Astrophysics Data System (ADS)
Mohd Jani, Nurul Hafiza; Mohd Radzi, Nor Haizan; Ngadiman, Mohd Salihin
2013-04-01
The Quadratic Assignment Problem (QAP) is classified as an NP-hard problem. It has been used to model many problems in areas such as operational research, combinatorial data analysis, and parallel and distributed computing, as well as optimization problems such as graph partitioning and the Traveling Salesman Problem (TSP). In the literature, researchers use exact algorithms, heuristic algorithms, and metaheuristic approaches to solve the QAP. The QAP is widely applied to the facility layout problem (FLP). In this paper we use the QAP to model a university facility layout problem in which 8 facilities must be assigned to 8 locations. We therefore model a QAP instance with n ≤ 10 and develop an Ant Colony Optimization (ACO) algorithm to solve it. The objective is to assign n facilities to n locations such that the total product of flows and distances is minimized, where flow is the movement from one facility to another and distance is the distance between facility locations; here, this amounts to minimizing the total walking of lecturers between destinations.
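A minimal ACO-for-QAP sketch in the spirit of the approach above is shown below. The flow and distance matrices, parameter values, and seed are invented for the example; the paper's actual 8-facility instance and tuning are not reproduced, and ACO remains a heuristic that may return a suboptimal layout.

```python
# Illustrative ACO for a small QAP: ants build facility->location assignments
# guided by a pheromone matrix, which is evaporated and reinforced each round.
import random

def qap_cost(perm, flow, dist):
    """Total flow x distance when facility i is placed at location perm[i]."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def aco_qap(flow, dist, n_ants=20, iters=100, rho=0.1, seed=1):
    rng = random.Random(seed)
    n = len(flow)
    tau = [[1.0] * n for _ in range(n)]       # pheromone: facility -> location
    best_perm, best_cost = None, float("inf")
    for _ in range(iters):
        for _ in range(n_ants):
            free, perm = list(range(n)), []
            for fac in range(n):              # build one assignment per ant
                w = [tau[fac][loc] for loc in free]
                loc = rng.choices(free, weights=w)[0]
                perm.append(loc)
                free.remove(loc)
            c = qap_cost(perm, flow, dist)
            if c < best_cost:
                best_perm, best_cost = perm, c
        for row in tau:                       # evaporation
            for loc in range(n):
                row[loc] *= (1 - rho)
        for fac, loc in enumerate(best_perm): # reinforce best-so-far
            tau[fac][loc] += 1.0 / best_cost
    return best_perm, best_cost

flow = [[0, 3, 0, 2], [3, 0, 0, 1], [0, 0, 0, 4], [2, 1, 4, 0]]
dist = [[0, 22, 53, 53], [22, 0, 40, 62], [53, 40, 0, 55], [53, 62, 55, 0]]
perm, cost = aco_qap(flow, dist)
print(sorted(perm))  # [0, 1, 2, 3]
```

For n = 4 the result can be checked against exhaustive search over all 24 permutations; for the paper's n = 8 that check is still feasible, which is why small campus layouts are a convenient QAP testbed.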
Many-to-Many Multicast Routing Schemes under a Fixed Topology
Ding, Wei; Wang, Hongfa; Wei, Xuerui
2013-01-01
Many-to-many multicast routing can be extensively applied in computer or communication networks supporting various continuous multimedia applications. The paper focuses on the case where all users share a common communication channel while each user is both a sender and a receiver of messages in multicasting, as well as an end user. In this case, the multicast tree appears as a terminal Steiner tree (TeST), and the problem of finding a TeST with a quality-of-service (QoS) optimization is frequently NP-hard. However, a many-to-many multicast tree with QoS optimization can be found efficiently under a fixed topology. In this paper, we are concerned with three QoS optimization objectives for the multicast tree: minimum cost, minimum diameter, and maximum reliability. All three optimization problems are divided into two versions, centralized and decentralized. This paper uses dynamic programming to devise an exact algorithm for the centralized and decentralized versions of each optimization problem. PMID:23589706
NASA Astrophysics Data System (ADS)
Helbing, Dirk; Schönhof, Martin; Kern, Daniel
2002-06-01
The coordinated and efficient distribution of limited resources by individual decisions is a fundamental, unsolved problem. When individuals compete for road capacities, time, space, money, goods, etc., they normally make decisions based on aggregate rather than complete information, such as TV news or stock market indices. In related experiments, we have observed volatile decision dynamics and far-from-optimal payoff distributions. We have also identified methods of information presentation that can considerably improve the overall performance of the system. In order to determine optimal strategies of decision guidance by means of user-specific recommendations, a stochastic behavioural description is developed. These strategies manage to increase the adaptability to changing conditions and to reduce the deviation from the time-dependent user equilibrium, thereby enhancing the average and individual payoffs. Hence, our guidance strategies can increase the performance of all users by reducing overreaction and stabilizing the decision dynamics. These results are highly significant for predicting decision behaviour, for reaching optimal behavioural distributions by decision support systems, and for information service providers. One of the promising fields of application is traffic optimization.
Anderson, Jeffrey R; Barrett, Steven F
2009-01-01
Image segmentation is the process of isolating distinct objects within an image. Computer algorithms have been developed to aid in the process of object segmentation, but a completely autonomous segmentation algorithm has yet to be developed [1], because computers cannot understand images and recognize complex objects within them. However, computer segmentation methods [2] requiring user input have been developed to quickly segment objects in serially sectioned images, such as magnetic resonance images (MRI) and confocal laser scanning microscope (CLSM) images. In these cases, segmentation becomes a powerful tool for visualizing the 3D nature of an object. User input is an important part of improving the performance of many segmentation methods. A double-threshold segmentation method has been investigated [3] to separate objects in gray-scale images where the gray level of the object lies among the gray levels of the background. To best determine the threshold values for this segmentation method, the image must be manipulated for optimal contrast; the same is true of other segmentation and edge detection methods as well. Typically, the better the image contrast, the better the segmentation results. This paper describes a graphical user interface (GUI) that allows the user to easily change image contrast parameters to optimize the performance of subsequent object segmentation. This approach exploits the fact that the human brain is extremely effective at object recognition and understanding. The GUI provides the user with the ability to define the gray-scale range of the object of interest. The lower and upper bounds of this range are used in a histogram-stretching process to improve image contrast. The user can also interactively modify the gamma correction factor, which provides a non-linear distribution of gray-scale values, while observing the corresponding changes to the image.
This interactive approach gives the user the power to make optimal choices in the contrast enhancement parameters.
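The two contrast operations described above can be sketched as follows: linear histogram stretching between user-chosen gray-level bounds, followed by gamma correction. The bounds and gamma value are example inputs, not values from the paper's GUI.

```python
# Sketch of histogram stretching plus gamma correction on 8-bit gray values.
def stretch_and_gamma(pixels, lo, hi, gamma):
    """Map [lo, hi] -> [0, 1], clip outside, apply gamma, return 0-255 values."""
    out = []
    for p in pixels:
        x = (p - lo) / (hi - lo)         # linear stretch of the user's range
        x = min(1.0, max(0.0, x))        # clip values outside the range
        x = x ** (1.0 / gamma)           # gamma > 1 brightens mid-tones
        out.append(round(255 * x))
    return out

print(stretch_and_gamma([10, 60, 110, 160], lo=60, hi=160, gamma=2.0))
# [0, 0, 180, 255]
```

In the interactive setting, the user adjusts lo, hi, and gamma while watching the image, then hands the enhanced result to the double-threshold segmentation step.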
Software-supported USER cloning strategies for site-directed mutagenesis and DNA assembly.
Genee, Hans Jasper; Bonde, Mads Tvillinggaard; Bagger, Frederik Otzen; Jespersen, Jakob Berg; Sommer, Morten O A; Wernersson, Rasmus; Olsen, Lars Rønn
2015-03-20
USER cloning is a fast and versatile method for engineering of plasmid DNA. We have developed a user-friendly Web server tool that automates the design of optimal PCR primers for several distinct USER cloning-based applications. Our Web server, named AMUSER (Automated DNA Modifications with USER cloning), facilitates DNA assembly and introduction of virtually any type of site-directed mutagenesis by designing optimal PCR primers for the desired genetic changes. To demonstrate its utility, we designed primers for a simultaneous two-position site-directed mutagenesis of green fluorescent protein (GFP) to yellow fluorescent protein (YFP), which in a single-step reaction resulted in a 94% cloning efficiency. AMUSER also supports degenerate nucleotide primers, single-insert combinatorial assembly, and flexible parameters for PCR amplification. AMUSER is freely available online at http://www.cbs.dtu.dk/services/AMUSER/.
Analyzing the multiple-target-multiple-agent scenario using optimal assignment algorithms
NASA Astrophysics Data System (ADS)
Kwok, Kwan S.; Driessen, Brian J.; Phillips, Cynthia A.; Tovey, Craig A.
1997-09-01
This work considers the problem of maximum utilization of a set of mobile robots with limited sensor-range capabilities and limited travel distances. The robots are initially in random positions. A set of robots properly guards or covers a region if every point within the region is within the effective sensor range of at least one vehicle. We wish to move the vehicles into surveillance positions so as to guard or cover a region, while minimizing the maximum distance traveled by any vehicle. This problem can be formulated as an assignment problem, in which we must optimally decide which robot to assign to which slot of a desired matrix of grid points. The cost function is the maximum distance traveled by any robot. Assignment problems can be solved very efficiently: solution times for one hundred robots took only seconds on a Silicon Graphics Crimson workstation. The initial positions of all the robots can be sampled by a central base station and their newly assigned positions communicated back to the robots. Alternatively, the robots can establish their own coordinate system with the origin fixed at one of the robots and orientation determined by the compass bearing of another robot relative to this robot. This paper presents example solutions to the multiple-target-multiple-agent scenario using a matching algorithm. Two separate cases with one hundred agents each were analyzed using this method. We have found these mobile robot problems to be a very interesting application of network optimization methods, and we expect this to be a fruitful area for future research.
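The min-max ("bottleneck") assignment idea above can be sketched by binary-searching the allowed travel distance and testing feasibility with a bipartite matching. The robot and slot coordinates below are invented, and the paper's actual matching algorithm may differ.

```python
# Bottleneck assignment sketch: find the smallest distance threshold D such
# that every robot can be matched to a distinct slot within distance D.
def perfect_matching_exists(adj, n):
    """Kuhn's augmenting-path algorithm; adj[r] lists slots reachable by robot r."""
    match = [-1] * n                      # slot -> robot
    def try_assign(r, seen):
        for s in adj[r]:
            if s not in seen:
                seen.add(s)
                if match[s] == -1 or try_assign(match[s], seen):
                    match[s] = r
                    return True
        return False
    return all(try_assign(r, set()) for r in range(n))

def bottleneck_assignment(robots, slots):
    n = len(robots)
    d = [[((rx - sx) ** 2 + (ry - sy) ** 2) ** 0.5
          for (sx, sy) in slots] for (rx, ry) in robots]
    candidates = sorted({d[i][j] for i in range(n) for j in range(n)})
    lo, hi = 0, len(candidates) - 1
    while lo < hi:                        # feasibility is monotone in the threshold
        mid = (lo + hi) // 2
        adj = [[j for j in range(n) if d[i][j] <= candidates[mid]]
               for i in range(n)]
        if perfect_matching_exists(adj, n):
            hi = mid
        else:
            lo = mid + 1
    return candidates[lo]

print(bottleneck_assignment([(0, 0), (4, 0), (0, 4)],
                            [(1, 0), (4, 1), (1, 4)]))  # 1.0
```

Because matching feasibility only grows as the threshold grows, the binary search over candidate distances returns the exact min-max travel distance.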
Optimal Processor Assignment for Pipeline Computations
1991-10-01
the use of ratios: initially each task is assigned a processor; the remaining processors are distributed in proportion to the quantities f_i(1), 1 ≤ i...algorithms. IEEE Trans. on Parallel and Distributed Systems, 1(4):470-499, October 1990. [26] P. M. Kogge. The Architecture of Pipelined Computers
Multiantenna Relay Beamforming Design for QoS Discrimination in Two-Way Relay Networks
Xiong, Ke; Zhang, Yu; Li, Dandan; Zhong, Zhangdui
2013-01-01
This paper investigates relay beamforming design for quality-of-service (QoS) discrimination in two-way relay networks. The purpose is to let legitimate two-way relay users exchange their information via a helping multiantenna relay with a QoS guarantee, while preventing the exchanged information from being overheard by an unauthorized receiver. To this end, we propose a physical-layer method in which the relay beamforming is jointly designed with artificial noise (AN), which is used to interfere with the unauthorized user's reception. We formulate the joint beamforming and AN (BFA) design as an optimization problem such that the received signal-to-interference-plus-noise ratio (SINR) at the two legitimate users is above a predefined QoS threshold, while the received SINR at the unauthorized user is limited to below a certain security threshold. The objective of the optimization problem is to seek the optimal AN and beamforming vectors that minimize the total power consumed by the relay node. Since the optimization problem is nonconvex, we solve it using semidefinite program (SDP) relaxation. For comparison, we also study optimal relay beamforming without AN (BFO) under the same QoS discrimination constraints. Simulation results show that both the proposed BFA and BFO can achieve QoS discrimination of the two-way transmission; however, the proposed BFA yields significant power savings and lower infeasibility rates compared with the BFO method. PMID:24391459
Cooperation stimulation strategies for peer-to-peer wireless live video-sharing social networks.
Lin, W Sabrina; Zhao, H Vicky; Liu, K J Ray
2010-07-01
Human behavior analysis in video-sharing social networks is an emerging research area that analyzes the behavior of users who share multimedia content and investigates the impact of human dynamics on video-sharing systems. Users watching live streaming in the same wireless network share the same limited bandwidth of the backbone connection to the Internet; thus, they might want to cooperate with each other to obtain better video quality. These users form a wireless live-streaming social network. Every user wishes to watch video at high quality while paying as little cost as possible to help others. This paper focuses on providing incentives for user cooperation. We propose a game-theoretic framework to model user behavior and to analyze the optimal strategies for user cooperation stimulation in wireless live streaming. We first analyze the Pareto optimality and the time-sensitive bargaining equilibrium of the two-person game. We then extend the solution to the multiuser scenario. We also consider potential selfish users' cheating behavior and malicious users' attacking behavior, and analyze the performance of the proposed strategies in the presence of cheating users and malicious attackers. Both our analytical and simulation results show that the proposed strategies can effectively stimulate user cooperation, achieve cheat-freeness and attack resistance, and help provide reliable services for wireless live-streaming applications.
Deterministic Design Optimization of Structures in OpenMDAO Framework
NASA Technical Reports Server (NTRS)
Coroneos, Rula M.; Pai, Shantaram S.
2012-01-01
Nonlinear programming algorithms play an important role in structural design optimization. Several such algorithms have been implemented in the OpenMDAO framework developed at NASA Glenn Research Center (GRC). OpenMDAO is an open source engineering analysis framework, written in Python, for analyzing and solving Multi-Disciplinary Analysis and Optimization (MDAO) problems. It provides a number of solvers and optimizers, referred to as components and drivers, which users can leverage to build new tools and processes quickly and efficiently. Users may download, use, modify, and distribute the OpenMDAO software at no cost. This paper summarizes the process involved in analyzing and optimizing structural components by utilizing the framework's structural solvers and several gradient-based optimizers along with a multi-objective genetic algorithm. For comparison purposes, the same structural components were analyzed and optimized using CometBoards, a code developed at NASA GRC. The reliability and efficiency of the OpenMDAO framework were compared and are reported here.
AutoFACT: An Automatic Functional Annotation and Classification Tool
Koski, Liisa B; Gray, Michael W; Lang, B Franz; Burger, Gertraud
2005-01-01
Background Assignment of function to new molecular sequence data is an essential step in genomics projects. The usual process involves similarity searches of a given sequence against one or more databases, an arduous process for large datasets. Results We present AutoFACT, a fully automated and customizable annotation tool that assigns biologically informative functions to a sequence. Key features of this tool are that it (1) analyzes nucleotide and protein sequence data; (2) determines the most informative functional description by combining multiple BLAST reports from several user-selected databases; (3) assigns putative metabolic pathways, functional classes, enzyme classes, GeneOntology terms and locus names; and (4) generates output in HTML, text and GFF formats for the user's convenience. We have compared AutoFACT to four well-established annotation pipelines. The error rate of functional annotation is estimated to be only between 1–2%. Comparison of AutoFACT to the traditional top-BLAST-hit annotation method shows that our procedure increases the number of functionally informative annotations by approximately 50%. Conclusion AutoFACT will serve as a useful annotation tool for smaller sequencing groups lacking dedicated bioinformatics staff. It is implemented in PERL and runs on LINUX/UNIX platforms. AutoFACT is available at . PMID:15960857
Propagation of Disturbances in Traffic Flow
DOT National Transportation Integrated Search
1977-09-01
The system-optimized static traffic-assignment problem in a freeway corridor network is the problem of choosing a distribution of vehicles in the network to minimize average travel time. It is of interest to know how sensitive the optimal steady-stat...
Engineering calculations for communications systems planning
NASA Technical Reports Server (NTRS)
Levis, C. A.; Martin, C. H.; Wang, C. W.; Gonsalvez, D.
1982-01-01
The single-entry interference problem is treated for frequency sharing between the broadcasting satellite and intersatellite services near 23 GHz. It is recommended that very long (more than 120° longitude difference) intersatellite hops be relegated to the unshared portion of the band. When this is done, it is found that suitable orbit assignments can be determined easily with the aid of a set of universal curves. An attempt to develop synthesis procedures for optimally assigning frequencies and orbital slots for the broadcasting satellite service in Region 2 was initiated. Several discrete programming and continuous optimization techniques are discussed.
Shortreed, Susan M.; Moodie, Erica E. M.
2012-01-01
Summary Treatment of schizophrenia is notoriously difficult and typically requires personalized adaption of treatment due to lack of efficacy of treatment, poor adherence, or intolerable side effects. The Clinical Antipsychotic Trials in Intervention Effectiveness (CATIE) Schizophrenia Study is a sequential multiple assignment randomized trial comparing the typical antipsychotic medication, perphenazine, to several newer atypical antipsychotics. This paper describes the marginal structural modeling method for estimating optimal dynamic treatment regimes and applies the approach to the CATIE Schizophrenia Study. Missing data and valid estimation of confidence intervals are also addressed. PMID:23087488
Optimizing real-time Web-based user interfaces for observatories
NASA Astrophysics Data System (ADS)
Gibson, J. Duane; Pickering, Timothy E.; Porter, Dallan; Schaller, Skip
2008-08-01
In using common HTML/Ajax approaches for web-based data presentation and telescope control user interfaces at the MMT Observatory (MMTO), we were rapidly confronted with web browser performance issues. Much of the operational data at the MMTO is highly dynamic and constantly changing during normal operations. Status of telescope subsystems must be displayed with minimal latency to telescope operators and other users. A major motivation for migrating toward web-based applications at the MMTO is to provide easy access to current and past observatory subsystem data for a wide variety of users, on their favorite operating system, through a familiar interface: their web browser. Performance issues, especially for user interfaces that control telescope subsystems, led to investigations of more efficient use of HTML/Ajax and web server technologies, as well as other web-based technologies such as Java and Flash/Flex. The results presented here focus on techniques for optimizing HTML/Ajax web applications with near real-time data display. This study indicates that direct modification of the contents or "nodeValue" attribute of text nodes is the most efficient method of updating data values displayed on a web page. Other optimization techniques are discussed for web-based applications that display highly dynamic data.
NASA Astrophysics Data System (ADS)
Taiwo, Ambali; Alnassar, Ghusoon; Bakar, M. H. Abu; Khir, M. F. Abdul; Mahdi, Mohd Adzir; Mokhtar, M.
2018-05-01
A one-weight authentication code for multi-user quantum key distribution (QKD) is proposed. The code is developed for an Optical Code Division Multiplexing (OCDMA) based QKD network. A unique address assigned to each individual user, coupled with a degrading probability of predicting the source of the qubit transmitted in the channel, offers an excellent security mechanism against any form of channel attack on an OCDMA-based QKD network. Flexibility in design, as well as ease of modifying the number of users, are equally exceptional qualities of the code in contrast to the Optical Orthogonal Code (OOC) earlier implemented for the same purpose. The code was successfully applied to eight simultaneous users at an effective key rate of 32 bps over a 27 km transmission distance.
DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS
Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...
FERRET adjustment code: status/use
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmittroth, F.A.
1982-03-01
The least-squares data analysis code FERRET is reviewed. Recent enhancements are discussed along with illustrative applications. Particular features noted include the use of differential as well as integral data, and additional user options for assigning and storing covariance matrices.
Code of Federal Regulations, 2010 CFR
2010-01-01
... contracts, grants, agreements, understandings and other arrangements (including Cooperative Research and Development Agreements [CRADAs], Work for Others and User Facility agreements, which includes research, development, or demonstration work, and includes any assignment or substitution of the parties, entered into...
Interleaved Training and Training-Based Transmission Design for Hybrid Massive Antenna Downlink
NASA Astrophysics Data System (ADS)
Zhang, Cheng; Jing, Yindi; Huang, Yongming; Yang, Luxi
2018-06-01
In this paper, we study the beam-based training design jointly with the transmission design for hybrid massive-antenna single-user (SU) and multi-user (MU) systems, where outage probability is adopted as the performance measure. For SU systems, we propose an interleaved training design to concatenate the feedback and training procedures, thus making the training length adaptive to the channel realization. Exact analytical expressions are derived for the average training length and the outage probability of the proposed interleaved training. For MU systems, we propose a joint design for the beam-based interleaved training, beam assignment, and MU data transmissions. Two solutions for the beam assignment are provided with different complexity-performance tradeoffs. Analytical results and simulations show that for both SU and MU systems, the proposed joint training and transmission designs achieve the same outage performance as the traditional full-training scheme but with significant savings in training overhead.
Near-optimal strategies for sub-decimeter satellite tracking with GPS
NASA Technical Reports Server (NTRS)
Yunck, Thomas P.; Wu, Sien-Chong; Wu, Jiun-Tsong
1986-01-01
Decimeter tracking of low Earth orbiters using differential Global Positioning System (GPS) techniques is discussed. A precisely known global network of GPS ground receivers and a receiver aboard the user satellite are needed, and all techniques simultaneously estimate the user and GPS satellite orbits. Strategies include a purely geometric, a fully dynamic, and a hybrid strategy. The last combines dynamic GPS solutions with a geometric user solution. Two powerful extensions of the hybrid strategy show the most promise. The first uses an optimized synthesis of dynamics and geometry in the user solution, while the second uses a gravity adjustment method to exploit data from repeat ground tracks. These techniques promise to deliver subdecimeter accuracy down to the lowest satellite altitudes.
Optimal Energy Efficiency Fairness of Nodes in Wireless Powered Communication Networks.
Zhang, Jing; Zhou, Qingjie; Ng, Derrick Wing Kwan; Jo, Minho
2017-09-15
In wireless powered communication networks (WPCNs), it is essential to research energy efficiency fairness in order to evaluate the balance of nodes for receiving information and harvesting energy. In this paper, we propose an efficient iterative algorithm for optimal energy efficiency proportional fairness in WPCN. The main idea is to use stochastic geometry to derive the mean proportionally fairness utility function with respect to user association probability and receive threshold. Subsequently, we prove that the relaxed proportionally fairness utility function is a concave function for user association probability and receive threshold, respectively. At the same time, a sub-optimal algorithm by exploiting alternating optimization approach is proposed. Through numerical simulations, we demonstrate that our sub-optimal algorithm can obtain a result close to optimal energy efficiency proportional fairness with significant reduction of computational complexity.
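The alternating (coordinate-wise) optimization idea above can be illustrated on a toy concave proportional-fairness objective. The utility model, starting point, and grid resolution below are invented purely for illustration; they are not the paper's WPCN formulation, whose utilities depend on the stochastic-geometry derivation.

```python
# Generic alternating optimization of a proportional-fairness (log-sum)
# objective over two scalar variables, one coordinate at a time.
import math

def utility(p, t):
    # Toy concave surrogate in an "association probability" p and a
    # "receive threshold" t; both terms and offsets are invented.
    return math.log(p * (1 - t) + 0.1) + math.log((1 - p) * t + 0.1)

def alternating_opt(p=0.9, t=0.2, steps=50, grid=1001):
    """Maximize one coordinate at a time on a grid; the objective never decreases."""
    pts = [i / (grid - 1) for i in range(grid)]
    for _ in range(steps):
        p = max(pts, key=lambda x: utility(x, t))   # optimize p with t fixed
        t = max(pts, key=lambda x: utility(p, x))   # optimize t with p fixed
    return p, t, utility(p, t)

p, t, val = alternating_opt()
print(val >= utility(0.9, 0.2))  # True
```

Since each coordinate subproblem is concave, every sweep is a monotone improvement, which is the property the paper exploits to justify its low-complexity sub-optimal algorithm.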
Optimal Energy Efficiency Fairness of Nodes in Wireless Powered Communication Networks
Zhou, Qingjie; Ng, Derrick Wing Kwan; Jo, Minho
2017-01-01
In wireless powered communication networks (WPCNs), it is essential to research energy efficiency fairness in order to evaluate the balance of nodes for receiving information and harvesting energy. In this paper, we propose an efficient iterative algorithm for optimal energy efficiency proportional fairness in WPCNs. The main idea is to use stochastic geometry to derive the mean proportional fairness utility function with respect to user association probability and receive threshold. Subsequently, we prove that the relaxed proportional fairness utility function is concave in the user association probability and in the receive threshold, respectively. At the same time, a sub-optimal algorithm exploiting an alternating optimization approach is proposed. Through numerical simulations, we demonstrate that our sub-optimal algorithm can obtain a result close to optimal energy efficiency proportional fairness with a significant reduction in computational complexity. PMID:28914818
User Access | Energy Systems Integration Facility | NREL
The ESIF houses an unparalleled collection of state-of-the-art capabilities. Through its user access program, the ESIF allows researchers access to its premier laboratories in support of research and development that aims to optimize our entire energy system at full power. Requests for access
NASA Astrophysics Data System (ADS)
Kotulla, Ralf; Gopu, Arvind; Hayashi, Soichi
2016-08-01
Processing astronomical data to science readiness was and remains a challenge, in particular in the case of multi-detector instruments such as wide-field imagers. One such instrument, the WIYN One Degree Imager, is available to the astronomical community at large and, in order to be scientifically useful to its varied user community on a short timescale, provides its users fully calibrated data in addition to the underlying raw data. However, time-efficient re-processing of the often large datasets with improved calibration data and/or software requires more than just a large number of CPU cores and disk space. This is particularly relevant if all computing resources are general purpose and shared with a large number of users in a typical university setup. Our approach to address this challenge is a flexible framework combining the best of both high-performance (large number of nodes, internal communication) and high-throughput (flexible/variable number of nodes, no dedicated hardware) computing. Based on the Advanced Message Queuing Protocol, we developed a Server-Manager-Worker framework. In addition to the server directing the work flow and the workers executing the actual work, the manager maintains a list of available workers, adds and/or removes individual workers from the worker pool, and re-assigns workers to different tasks. This provides the flexibility of optimizing the worker pool for the current task and workload, improves load balancing, and makes the most efficient use of the available resources. We present performance benchmarks and scaling tests showing that, today and using existing commodity shared-use hardware, we can process data with throughputs (including data reduction and calibration) approaching those expected in the early 2020s for future observatories such as the Large Synoptic Survey Telescope.
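The Server-Manager-Worker pattern described above can be miniaturized with standard-library threads and queues. In this sketch a shared in-process queue stands in for the AMQP broker, and the "calibration" step is a placeholder string operation; it is an illustrative stand-in, not the ODI pipeline's actual framework.

```python
import queue
import threading

def run_pool(tasks, n_workers=4):
    """Minimal server/worker sketch: the server fills a task queue,
    workers drain it concurrently and publish results. A manager layer
    could grow or shrink the `workers` list at runtime, as the abstract
    describes. (Illustrative only.)"""
    todo = queue.Queue()
    results = queue.Queue()
    for t in tasks:
        todo.put(t)

    def worker():
        while True:
            try:
                frame = todo.get_nowait()
            except queue.Empty:
                return  # pool drains and the thread exits
            results.put((frame, f"calibrated-{frame}"))  # placeholder "reduction"
            todo.task_done()

    workers = [threading.Thread(target=worker) for _ in range(n_workers)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    return dict(results.queue)

out = run_pool(["frame1", "frame2", "frame3"])
```

The appeal of the pattern is that workers are interchangeable: adding a node is just starting another consumer on the same queue, which is what gives the hybrid high-performance/high-throughput flexibility the abstract emphasizes.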
Facilitating NCAR Data Discovery by Connecting Related Resources
NASA Astrophysics Data System (ADS)
Rosati, A.
2012-12-01
Linking datasets, creators, and users by employing the proper standards helps to increase the impact of funded research. In order for users to find a dataset, it must first be named. Data citations play the important role of giving datasets a persistent presence by assigning a formal "name" and location. This project focuses on the next step of the "name-find-use" sequence: enhancing discoverability of NCAR data by connecting related resources on the web. By examining metadata schemas that document datasets, I investigated how Semantic Web approaches can help to ensure the widest possible range of data users. The focus was to move from search engine optimization (SEO) to information connectivity. Two main markup types are very visible in the Semantic Web and applicable to scientific dataset discovery: the Open Archives Initiative-Object Reuse and Exchange (OAI-ORE - www.openarchives.org) and Microdata (HTML5 and www.schema.org). My project creates pilot aggregations of related resources using both markup types for three case studies: the North American Regional Climate Change Assessment Program (NARCCAP) dataset and related publications, the Palmer Drought Severity Index (PDSI) animation and image files from NCAR's Visualization Lab (VisLab), and the multidisciplinary data types and formats from the Advanced Cooperative Arctic Data and Information Service (ACADIS). This project documents the differences between these markups and how each creates connectedness on the web. My recommendations point toward the most efficient and effective markup schema for aggregating resources within the three case studies based on the following assessment criteria: ease of use, current state of support and adoption of technology, integration with typical web tools, available vocabularies and geoinformatic standards, interoperability with current repositories and access portals (e.g. ESG, Java), and relation to data citation tools and methods.
Network-optimized congestion pricing : a parable, model and algorithm
DOT National Transportation Integrated Search
1995-05-31
This paper recites a parable, formulates a model and devises an algorithm for optimizing tolls on a road network. Such tolls induce an equilibrium traffic flow that is at once system-optimal and user-optimal. The parable introduces the network-wide c...
Emotional Mining: Tagging Emoticons to Online News
NASA Astrophysics Data System (ADS)
Kasinathan, Vinothini; Mustapha, Aida; Zhi Yong, Lee; Aida Zamnah, Z. A.
2017-08-01
This paper presents an emotion mining system, which classifies newspaper articles into pre-defined emotion categories based on the underlying emotion in the news and assigns emoticons accordingly. Next, the system makes a recommendation to the reader by tagging the news headline with the respective emoticons. Users can then decide whether to read the news based on the emoticons provided. The system also provides a filter for users to choose the category of news to read, following the emoticons.
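A bare-bones version of the tagging step might count lexicon hits per emotion category and attach the emoticon of the winning category. The lexicon, categories, and emoticon mapping below are invented; the paper's actual classifier is surely richer than keyword counting.

```python
def tag_headline(headline, lexicon):
    """Assign an emoticon by counting lexicon hits per emotion category
    and picking the category with the most matches (tiny stand-in for
    the paper's emotion-mining pipeline)."""
    words = headline.lower().split()
    scores = {emo: sum(w in kws for w in words) for emo, kws in lexicon.items()}
    emo = max(scores, key=scores.get)
    icons = {"joy": ":-)", "sadness": ":-(", "anger": ">:-("}
    return icons[emo]

# Hypothetical emotion lexicon
lexicon = {"joy": {"wins", "celebrates", "record"},
           "sadness": {"dies", "loss", "tragedy"},
           "anger": {"outrage", "protest", "scandal"}}
icon = tag_headline("Local team wins championship, city celebrates", lexicon)
```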
A Crowdsensing Based Analytical Framework for Perceptional Degradation of OTT Web Browsing.
Li, Ke; Wang, Hai; Xu, Xiaolong; Du, Yu; Liu, Yuansheng; Ahmad, M Omair
2018-05-15
Service perception analysis is crucial for understanding both user experiences and network quality as well as for maintaining and optimizing of mobile networks. Given the rapid development of mobile Internet and over-the-top (OTT) services, the conventional network-centric mode of network operation and maintenance is no longer effective. Therefore, developing an approach to evaluate and optimizing users' service perceptions has become increasingly important. Meanwhile, the development of a new sensing paradigm, mobile crowdsensing (MCS), makes it possible to evaluate and analyze the user's OTT service perception from end-user's point of view other than from the network side. In this paper, the key factors that impact users' end-to-end OTT web browsing service perception are analyzed by monitoring crowdsourced user perceptions. The intrinsic relationships among the key factors and the interactions between key quality indicators (KQI) are evaluated from several perspectives. Moreover, an analytical framework of perceptional degradation and a detailed algorithm are proposed whose goal is to identify the major factors that impact the perceptional degradation of web browsing service as well as their significance of contribution. Finally, a case study is presented to show the effectiveness of the proposed method using a dataset crowdsensed from a large number of smartphone users in a real mobile network. The proposed analytical framework forms a valuable solution for mobile network maintenance and optimization and can help improve web browsing service perception and network quality.
Machine learning techniques for energy optimization in mobile embedded systems
NASA Astrophysics Data System (ADS)
Donohoo, Brad Kyoshi
Mobile smartphones and other portable battery operated embedded systems (PDAs, tablets) are pervasive computing devices that have emerged in recent years as essential instruments for communication, business, and social interactions. While performance, capabilities, and design are all important considerations when purchasing a mobile device, a long battery lifetime is one of the most desirable attributes. Battery technology and capacity have improved over the years, but they still cannot keep pace with the power consumption demands of today's mobile devices. This key limiter has led to a strong research emphasis on extending battery lifetime by minimizing energy consumption, primarily using software optimizations. This thesis presents two strategies that attempt to optimize mobile device energy consumption with negligible impact on user perception and quality of service (QoS). The first strategy proposes an application and user interaction aware middleware framework that takes advantage of user idle time between interaction events of the foreground application to optimize CPU and screen backlight energy consumption. The framework dynamically classifies mobile device applications based on their received interaction patterns, then invokes a number of different power management algorithms to adjust processor frequency and screen backlight levels accordingly. The second strategy proposes the usage of machine learning techniques to learn a user's mobile device usage pattern pertaining to spatiotemporal and device contexts, and then predict energy-optimal data and location interface configurations. By learning where and when a mobile device user uses certain power-hungry interfaces (3G, WiFi, and GPS), the techniques, which include variants of linear discriminant analysis, linear logistic regression, non-linear logistic regression, and k-nearest neighbor, are able to dynamically turn off unnecessary interfaces at runtime in order to save energy.
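One of the listed techniques, k-nearest neighbor, is easy to sketch for the interface-prediction task: vote among the most similar past contexts. The usage log, features (hour of day, at-home flag), and labels below are fabricated for illustration, not the thesis's data.

```python
from collections import Counter

def knn_predict(train, x, k=3):
    """Plain k-nearest-neighbour vote; `train` is a list of
    (feature_tuple, label) pairs, distance is squared Euclidean."""
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    nearest = sorted(train, key=lambda fx: dist(fx[0], x))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Hypothetical usage log: (hour_of_day, at_home) -> keep WiFi on or off
log = [((8, 1), "wifi_on"), ((9, 1), "wifi_on"), ((13, 0), "wifi_off"),
       ((14, 0), "wifi_off"), ((20, 1), "wifi_on"), ((12, 0), "wifi_off")]
decision = knn_predict(log, (10, 1))   # mid-morning, at home
```

In the thesis's setting the predicted label would drive the runtime policy: if the model predicts the interface is unnecessary in the current context, it is switched off to save energy.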
Estimation and detection information trade-off for x-ray system optimization
NASA Astrophysics Data System (ADS)
Cushing, Johnathan B.; Clarkson, Eric W.; Mandava, Sagar; Bilgin, Ali
2016-05-01
X-ray Computed Tomography (CT) systems perform complex imaging tasks to detect and estimate system parameters; for example, a baggage imaging system performs threat detection and generates reconstructions. This leads to a desire to optimize both the detection and estimation performance of a system, but most metrics focus on only one of these aspects. When making design choices there is a need for a concise metric which considers both detection and estimation information parameters, and then provides the user with the collection of possible optimal outcomes. In this paper a graphical analysis of the Estimation and Detection Information Trade-off (EDIT) will be explored. EDIT produces curves which allow a decision to be made for system optimization based on design constraints and the costs associated with estimation and detection. EDIT analyzes the system in the estimation-information and detection-information space, where the user is free to pick their own method of calculating these measures. The user of EDIT can choose any desired figure of merit for detection information and estimation information; the EDIT curves will then provide the collection of optimal outcomes. The paper first looks at two methods of creating EDIT curves. The curves can be calculated over a wide variety of systems by finding the optimal system that maximizes a figure of merit, or found as an upper bound on the information from a collection of systems. These two methods allow the user to choose a method of calculation which best fits the constraints of their actual system.
Fractional Programming for Communication Systems—Part II: Uplink Scheduling via Matching
NASA Astrophysics Data System (ADS)
Shen, Kaiming; Yu, Wei
2018-05-01
This two-part paper develops novel methodologies for using fractional programming (FP) techniques to design and optimize communication systems. Part I of this paper proposes a new quadratic transform for FP and treats its application for continuous optimization problems. In this Part II of the paper, we study discrete problems, such as those involving user scheduling, which are considerably more difficult to solve. Unlike the continuous problems, discrete or mixed discrete-continuous problems normally cannot be recast as convex problems. In contrast to the common heuristic of relaxing the discrete variables, this work reformulates the original problem in an FP form amenable to distributed combinatorial optimization. The paper illustrates this methodology by tackling the important and challenging problem of uplink coordinated multi-cell user scheduling in wireless cellular systems. Uplink scheduling is more challenging than downlink scheduling, because uplink user scheduling decisions significantly affect the interference pattern in nearby cells. Further, the discrete scheduling variable needs to be optimized jointly with continuous variables such as transmit power levels and beamformers. The main idea of the proposed FP approach is to decouple the interaction among the interfering links, thereby permitting a distributed and joint optimization of the discrete and continuous variables with provable convergence. The paper shows that the well-known weighted minimum mean-square-error (WMMSE) algorithm can also be derived from a particular use of FP, but our proposed FP-based method significantly outperforms WMMSE when discrete user scheduling variables are involved, both in terms of run-time efficiency and optimization results.
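The paper's quadratic transform is not reproduced here, but the flavor of fractional programming can be shown with the classical Dinkelbach iteration for a single ratio, a standard FP tool: repeatedly solve a subtractive subproblem and update the ratio parameter. The channel gain, circuit power, and candidate power grid below are made up for illustration.

```python
import math

def dinkelbach(N, D, candidates, tol=1e-9, iters=100):
    """Classical Dinkelbach iteration for max N(x)/D(x) with D(x) > 0:
    solve max_x N(x) - lam * D(x) over a finite candidate set, then set
    lam to the achieved ratio; stop when the subtractive optimum hits 0.
    (A standard FP tool; the paper's quadratic transform is a different,
    related reformulation.)"""
    lam = 0.0
    for _ in range(iters):
        x = max(candidates, key=lambda c: N(c) - lam * D(c))
        if abs(N(x) - lam * D(x)) < tol:
            return x, lam
        lam = N(x) / D(x)
    return x, lam

# Toy energy-efficiency problem: rate/power over a discrete power grid
N = lambda p: math.log2(1 + 4.0 * p)   # rate; channel gain 4 is made up
D = lambda p: p + 0.1                  # transmit power + circuit power
best_p, best_ratio = dinkelbach(N, D, [0.1 * k for k in range(1, 21)])
```

The subtractive subproblem is what makes FP attractive for scheduling: for a fixed `lam` it is often separable across links, which is the decoupling idea the abstract exploits.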
AUCTION MECHANISMS FOR IMPLEMENTING TRADABLE NETWORK PERMIT MARKETS
NASA Astrophysics Data System (ADS)
Wada, Kentaro; Akamatsu, Takashi
This paper proposes a new auction mechanism for implementing tradable network permit markets. Assuming that each user makes a trip from an origin to a destination along a path in a specific time period, we design an auction mechanism that enables each user to purchase a bundle of permits corresponding to a set of links in the user's preferred path. The objective of the proposed mechanism is to achieve a socially optimal state with minimal revelation of users' private information. In order to achieve this, the mechanism employs an evolutionary approach that has an auction phase and a path capacity adjustment phase, which are repeated on a day-to-day basis. We prove that the proposed mechanism has the following desirable properties: (1) truthful bidding is the dominant strategy for each user and (2) the proposed mechanism converges to an approximate socially optimal state in the sense that the achieved value of the social surplus reaches its maximum value when the number of users is large.
Searching for cancer information on the internet: analyzing natural language search queries.
Bader, Judith L; Theofanos, Mary Frances
2003-12-11
Searching for health information is one of the most-common tasks performed by Internet users. Many users begin searching on popular search engines rather than on prominent health information sites. We know that many visitors to our (National Cancer Institute) Web site, cancer.gov, arrive via links in search engine results. To learn more about the specific needs of our general-public users, we wanted to understand what lay users really wanted to know about cancer, how they phrased their questions, and how much detail they used. The National Cancer Institute partnered with AskJeeves, Inc to develop a methodology to capture, sample, and analyze 3 months of cancer-related queries on the Ask.com Web site, a prominent United States consumer search engine, which receives over 35 million queries per week. Using a benchmark set of 500 terms and word roots supplied by the National Cancer Institute, AskJeeves identified a test sample of cancer queries for 1 week in August 2001. From these 500 terms, only 37 appeared ≥ 5 times/day over the trial test week in 17208 queries. Using these 37 terms, 204165 instances of cancer queries were found in the Ask.com query logs for the actual test period of June-August 2001. Of these, 7500 individual user questions were randomly selected for detailed analysis and assigned to appropriate categories. The exact language of sample queries is presented. Considering multiples of the same questions, the sample of 7500 individual user queries represented 76077 queries (37% of the total 3-month pool). Overall 78.37% of sampled Cancer queries asked about 14 specific cancer types. Within each cancer type, queries were sorted into appropriate subcategories including at least the following: General Information, Symptoms, Diagnosis and Testing, Treatment, Statistics, Definition, and Cause/Risk/Link. 
The most-common specific cancer types mentioned in queries were Digestive/Gastrointestinal/Bowel (15.0%), Breast (11.7%), Skin (11.3%), and Genitourinary (10.5%). Additional subcategories of queries about specific cancer types varied, depending on user input. Queries that were not specific to a cancer type were also tracked and categorized. Natural-language searching affords users the opportunity to fully express their information needs and can aid users naïve to the content and vocabulary. The specific queries analyzed for this study reflect news and research studies reported during the study dates and would surely change with different study dates. Analyzing queries from search engines represents one way of knowing what kinds of content to provide to users of a given Web site. Users ask questions using whole sentences and keywords, often misspelling words. Providing the option for natural-language searching does not obviate the need for good information architecture, usability engineering, and user testing in order to optimize user experience.
Searching for Cancer Information on the Internet: Analyzing Natural Language Search Queries
Theofanos, Mary Frances
2003-01-01
Background Searching for health information is one of the most-common tasks performed by Internet users. Many users begin searching on popular search engines rather than on prominent health information sites. We know that many visitors to our (National Cancer Institute) Web site, cancer.gov, arrive via links in search engine results. Objective To learn more about the specific needs of our general-public users, we wanted to understand what lay users really wanted to know about cancer, how they phrased their questions, and how much detail they used. Methods The National Cancer Institute partnered with AskJeeves, Inc to develop a methodology to capture, sample, and analyze 3 months of cancer-related queries on the Ask.com Web site, a prominent United States consumer search engine, which receives over 35 million queries per week. Using a benchmark set of 500 terms and word roots supplied by the National Cancer Institute, AskJeeves identified a test sample of cancer queries for 1 week in August 2001. From these 500 terms, only 37 appeared ≥ 5 times/day over the trial test week in 17208 queries. Using these 37 terms, 204165 instances of cancer queries were found in the Ask.com query logs for the actual test period of June-August 2001. Of these, 7500 individual user questions were randomly selected for detailed analysis and assigned to appropriate categories. The exact language of sample queries is presented. Results Considering multiples of the same questions, the sample of 7500 individual user queries represented 76077 queries (37% of the total 3-month pool). Overall 78.37% of sampled Cancer queries asked about 14 specific cancer types. Within each cancer type, queries were sorted into appropriate subcategories including at least the following: General Information, Symptoms, Diagnosis and Testing, Treatment, Statistics, Definition, and Cause/Risk/Link. 
The most-common specific cancer types mentioned in queries were Digestive/Gastrointestinal/Bowel (15.0%), Breast (11.7%), Skin (11.3%), and Genitourinary (10.5%). Additional subcategories of queries about specific cancer types varied, depending on user input. Queries that were not specific to a cancer type were also tracked and categorized. Conclusions Natural-language searching affords users the opportunity to fully express their information needs and can aid users naïve to the content and vocabulary. The specific queries analyzed for this study reflect news and research studies reported during the study dates and would surely change with different study dates. Analyzing queries from search engines represents one way of knowing what kinds of content to provide to users of a given Web site. Users ask questions using whole sentences and keywords, often misspelling words. Providing the option for natural-language searching does not obviate the need for good information architecture, usability engineering, and user testing in order to optimize user experience. PMID:14713659
Optimization of locations of diffusion spots in indoor optical wireless local area networks
NASA Astrophysics Data System (ADS)
Eltokhey, Mahmoud W.; Mahmoud, K. R.; Ghassemlooy, Zabih; Obayya, Salah S. A.
2018-03-01
In this paper, we present a novel optimization of the locations of the diffusion spots in indoor optical wireless local area networks, based on the central force optimization (CFO) scheme. Uniformity of the users' performance is addressed by using the CFO algorithm and adopting different objective function configurations, while considering maximization of the signal-to-noise ratio and minimization of the delay spread, respectively. We also investigate the effect of varying the objective function weights on the system and the users' performance as part of the adaptation process. The results show that the proposed objective-function-configuration-based optimization procedure offers an improvement of 65% in the standard deviation of individual receivers' performance.
Optimel: Software for selecting the optimal method
NASA Astrophysics Data System (ADS)
Popova, Olga; Popov, Boris; Romanov, Dmitry; Evseeva, Marina
Optimel: software for selecting the optimal method automates the process of selecting a solution method from the optimization methods domain. Optimel offers practical novelty: it saves time and money in exploratory studies whose objective is to select the most appropriate method for solving an optimization problem. It also offers theoretical novelty, because a new method of knowledge structuring was used to obtain the domain. The Optimel domain covers an extended set of methods and their properties, which allows identifying the level of scientific studies, enhancing the user's expertise, expanding the prospects the user faces, and opening up new research objectives. Optimel can be used both in scientific research institutes and in educational institutions.
SEEK: A FORTRAN optimization program using a feasible directions gradient search
NASA Technical Reports Server (NTRS)
Savage, M.
1995-01-01
This report describes the use of computer program 'SEEK' which works in conjunction with two user-written subroutines and an input data file to perform an optimization procedure on a user's problem. The optimization method uses a modified feasible directions gradient technique. SEEK is written in ANSI standard Fortran 77, has an object size of about 46K bytes, and can be used on a personal computer running DOS. This report describes the use of the program and discusses the optimizing method. The program use is illustrated with four example problems: a bushing design, a helical coil spring design, a gear mesh design, and a two-parameter Weibull life-reliability curve fit.
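A drastically simplified cousin of SEEK's approach can be sketched in a few lines: take a gradient descent step, then project back into simple box bounds so every iterate stays feasible. SEEK's actual modified feasible-directions technique is more sophisticated than this, and the toy objective below is invented for illustration.

```python
def feasible_gradient_min(f, grad, x0, lo, hi, step=0.1, iters=200):
    """Bare-bones feasible-direction idea: gradient step, then clip each
    coordinate back into [lo[i], hi[i]] so the iterate never leaves the
    feasible box. (Far simpler than SEEK's method.)"""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [min(hi[i], max(lo[i], x[i] - step * g[i])) for i in range(len(x))]
    return x

# Toy constrained design problem: minimize (x-3)^2 + (y-1)^2, 0 <= x, y <= 2
f = lambda v: (v[0] - 3) ** 2 + (v[1] - 1) ** 2
grad = lambda v: [2 * (v[0] - 3), 2 * (v[1] - 1)]
sol = feasible_gradient_min(f, grad, [0.0, 0.0], [0.0, 0.0], [2.0, 2.0])
```

The unconstrained minimum (3, 1) lies outside the box, so the method settles on the boundary point (2, 1), which is the constrained optimum here.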
NASA Technical Reports Server (NTRS)
Stahara, S. S.
1984-01-01
An investigation was carried out to complete the preliminary development of a combined perturbation/optimization procedure and associated computational code for designing optimized blade-to-blade profiles of turbomachinery blades. The overall purpose of the procedures developed is to provide demonstration of a rapid nonlinear perturbation method for minimizing the computational requirements associated with parametric design studies of turbomachinery flows. The method combines the multiple parameter nonlinear perturbation method, successfully developed in previous phases of this study, with the NASA TSONIC blade-to-blade turbomachinery flow solver, and the COPES-CONMIN optimization procedure into a user's code for designing optimized blade-to-blade surface profiles of turbomachinery blades. Results of several design applications and a documented version of the code together with a user's manual are provided.
Assignment Of Finite Elements To Parallel Processors
NASA Technical Reports Server (NTRS)
Salama, Moktar A.; Flower, Jon W.; Otto, Steve W.
1990-01-01
Elements assigned approximately optimally to subdomains. Mapping algorithm based on simulated-annealing concept used to minimize approximate time required to perform finite-element computation on hypercube computer or other network of parallel data processors. Mapping algorithm needed when shape of domain complicated or otherwise not obvious what allocation of elements to subdomains minimizes cost of computation.
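The simulated-annealing mapping idea can be sketched on a toy cost function, here pure load imbalance, ignoring the communication term a real hypercube mapper would also need; the parameters and cost are invented for illustration.

```python
import math
import random

def anneal_assignment(n_elems, n_procs, cost, T0=5.0, cooling=0.995, steps=4000):
    """Toy simulated-annealing mapper in the spirit of the brief: move one
    element to a random processor, always accept improvements, accept
    uphill moves with Boltzmann probability exp(-delta/T), cool T slowly.
    `cost` scores a full assignment (list: element index -> processor)."""
    rng = random.Random(42)  # fixed seed for reproducibility
    assign = [rng.randrange(n_procs) for _ in range(n_elems)]
    c = cost(assign)
    best, best_c, T = list(assign), c, T0
    for _ in range(steps):
        i, p = rng.randrange(n_elems), rng.randrange(n_procs)
        old = assign[i]
        assign[i] = p
        nc = cost(assign)
        if nc <= c or rng.random() < math.exp((c - nc) / T):
            c = nc
            if c < best_c:
                best, best_c = list(assign), c
        else:
            assign[i] = old  # reject the move
        T *= cooling
    return best, best_c

# Cost: load imbalance across processors (communication term omitted)
def imbalance(assign, n_procs=4):
    loads = [assign.count(p) for p in range(n_procs)]
    return max(loads) - min(loads)

best, best_c = anneal_assignment(16, 4, imbalance)
```

A production mapper would add a term penalizing communication between elements placed on distant processors, which is exactly what makes the problem hard enough to need annealing.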
[Work and health status of workers of shoe manufacturing industries].
Mironov, A I; Kirillov, V F; Bul'bulian, M A; Golubeva, A P; Kraeva, G K; Kuznetsova, A I; Nikolaeva, G M
2001-01-01
According to work conditions, severity, and intensity, the main shoe-making occupations are assigned to class III, hazard grades I-II. Where new technology is applied, the work is assigned to hazard classes I-II, i.e., optimal or allowable. Increased mortality from liver cancer and lympholeukosis was revealed among workers in contact with chloroprene.
Goal-recognition-based adaptive brain-computer interface for navigating immersive robotic systems.
Abu-Alqumsan, Mohammad; Ebert, Felix; Peer, Angelika
2017-06-01
This work proposes principled strategies for self-adaptations in EEG-based Brain-computer interfaces (BCIs) as a way out of the bandwidth bottleneck resulting from the considerable mismatch between the low-bandwidth interface and the bandwidth-hungry application, and a way to enable fluent and intuitive interaction in embodiment systems. The main focus is on inferring the hidden target goals of users while navigating in a remote environment as a basis for possible adaptations. To reason about possible user goals, a general user-agnostic Bayesian update rule is devised to be recursively applied upon the arrival of evidence, i.e. user input and user gaze. Experiments were conducted with healthy subjects within robotic embodiment settings to evaluate the proposed method. These experiments varied along three factors: the type of the robot/environment (simulated and physical), the type of the interface (keyboard or BCI), and the way goal recognition (GR) is used to guide a simple shared control (SC) driving scheme. Our results show that the proposed GR algorithm is able to track and infer the hidden user goals with relatively high precision and recall. Further, the realized SC driving scheme benefits from the output of the GR system and is able to reduce the user effort needed to accomplish the assigned tasks. Despite the fact that the BCI requires higher effort compared to the keyboard conditions, most subjects were able to complete the assigned tasks, and the proposed GR system is additionally shown to be able to handle the uncertainty in user input during SSVEP-based interaction. The SC application of the belief vector indicates that the benefits of the GR module are more pronounced for BCIs, compared to the keyboard interface. Being based on intuitive heuristics that model the behavior of the general population during the execution of navigation tasks, the proposed GR method can be used without prior tuning for the individual users. 
The proposed methods can be easily integrated in devising more advanced SC schemes and/or strategies for automatic BCI self-adaptations.
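The recursive Bayesian update at the heart of such a goal-recognition scheme has a simple generic form: posterior proportional to prior times likelihood, applied once per arriving piece of evidence. The candidate goals and likelihood tables below are invented, not the paper's model; "user-agnostic" here means the likelihoods are fixed heuristics rather than fitted per user.

```python
def bayes_update(prior, likelihoods):
    """One recursive Bayesian update over candidate goals:
    posterior(g) = prior(g) * P(evidence | g) / normalizer.
    (Generic sketch, not the paper's exact rule.)"""
    post = {g: prior[g] * likelihoods[g] for g in prior}
    z = sum(post.values())
    return {g: p / z for g, p in post.items()}

# Three hypothetical navigation goals, uniform initial belief
belief = {"door": 1 / 3, "desk": 1 / 3, "window": 1 / 3}
# Two arriving evidence items (e.g. a steering input, then a gaze fixation),
# each summarized as made-up per-goal likelihoods
for obs_lik in [{"door": 0.7, "desk": 0.2, "window": 0.1},
                {"door": 0.6, "desk": 0.3, "window": 0.1}]:
    belief = bayes_update(belief, obs_lik)
```

After two consistent observations the belief concentrates on one goal, which is what lets a shared-control layer start assisting toward it.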
Goal-recognition-based adaptive brain-computer interface for navigating immersive robotic systems
NASA Astrophysics Data System (ADS)
Abu-Alqumsan, Mohammad; Ebert, Felix; Peer, Angelika
2017-06-01
Objective. This work proposes principled strategies for self-adaptations in EEG-based Brain-computer interfaces (BCIs) as a way out of the bandwidth bottleneck resulting from the considerable mismatch between the low-bandwidth interface and the bandwidth-hungry application, and a way to enable fluent and intuitive interaction in embodiment systems. The main focus is on inferring the hidden target goals of users while navigating in a remote environment as a basis for possible adaptations. Approach. To reason about possible user goals, a general user-agnostic Bayesian update rule is devised to be recursively applied upon the arrival of evidence, i.e. user input and user gaze. Experiments were conducted with healthy subjects within robotic embodiment settings to evaluate the proposed method. These experiments varied along three factors: the type of the robot/environment (simulated and physical), the type of the interface (keyboard or BCI), and the way goal recognition (GR) is used to guide a simple shared control (SC) driving scheme. Main results. Our results show that the proposed GR algorithm is able to track and infer the hidden user goals with relatively high precision and recall. Further, the realized SC driving scheme benefits from the output of the GR system and is able to reduce the user effort needed to accomplish the assigned tasks. Despite the fact that the BCI requires higher effort compared to the keyboard conditions, most subjects were able to complete the assigned tasks, and the proposed GR system is additionally shown to be able to handle the uncertainty in user input during SSVEP-based interaction. The SC application of the belief vector indicates that the benefits of the GR module are more pronounced for BCIs, compared to the keyboard interface. Significance. 
Being based on intuitive heuristics that model the behavior of the general population during the execution of navigation tasks, the proposed GR method can be used without prior tuning for the individual users. The proposed methods can be easily integrated in devising more advanced SC schemes and/or strategies for automatic BCI self-adaptations.
Saving Time, Saving Money: The Economics of Unclogging America's Worst Bottlenecks
DOT National Transportation Integrated Search
2000-01-01
A 1999 study by the American Highway Users Alliance entitled "Unclogging America's Arteries: Prescriptions for Healthier Highways" identified the 166 worst bottlenecks in the country and evaluated the benefits of removing them. By assigning monetary ...
Carcelén, María; Abascal, Estefanía; Herranz, Marta; Santantón, Sheila; Zenteno, Roberto; Ruiz Serrano, María Jesús; Bouza, Emilio
2017-01-01
The assignation of lineages in Mycobacterium tuberculosis (MTB) provides valuable information for evolutionary and phylogeographic studies and makes for more accurate knowledge of the distribution of this pathogen worldwide. Differences in virulence have also been found for certain lineages. MTB isolates were initially assigned to lineages based on data obtained from genotyping techniques, such as spoligotyping or MIRU-VNTR analysis, some of which are more suitable for molecular epidemiology studies. However, since these methods are subject to a certain degree of homoplasy, other criteria have been chosen to assign lineages. These are based on targeting robust and specific SNPs for each lineage. Here, we propose two newly designed multiplex targeting methods—both of which are single-tube tests—to optimize the assignation of the six main lineages in MTB. The first method is based on ASO-PCR and offers an inexpensive and easy-to-implement assay for laboratories with limited resources. The other, which is based on SNaPshot, enables more refined standardized assignation of lineages for laboratories with better resources. Both methods performed well when assigning lineages from cultured isolates from a control panel, a test panel, and a problem panel from an unrelated population, Mexico, which included isolates in which standard genotyping was not able to classify lineages. Both tests were also able to assign lineages from stored isolates, without the need for subculture or purification of DNA, and even directly from clinical specimens with a medium-high bacilli burden. Our assays could broaden the contexts where information on lineages can be acquired, thus enabling us to quickly update data from retrospective collections and to merge data with those obtained at the time of diagnosis of a new TB case. PMID:29091913
Joint optimization of regional water-power systems
NASA Astrophysics Data System (ADS)
Pereira-Cardenal, Silvio J.; Mo, Birger; Gjelsvik, Anders; Riegels, Niels D.; Arnbjerg-Nielsen, Karsten; Bauer-Gottwein, Peter
2016-06-01
Energy and water resources systems are tightly coupled; energy is needed to deliver water and water is needed to extract or produce energy. Growing pressure on these resources has raised concerns about their long-term management and highlights the need to develop integrated solutions. A method for joint optimization of water and electric power systems was developed in order to identify methodologies to assess the broader interactions between water and energy systems. The proposed method is to include water users and power producers into an economic optimization problem that minimizes the cost of power production and maximizes the benefits of water allocation, subject to constraints from the power and hydrological systems. The method was tested on the Iberian Peninsula using simplified models of the seven major river basins and the power market. The optimization problem was successfully solved using stochastic dual dynamic programming. The results showed that current water allocation to hydropower producers in basins with high irrigation productivity, and to irrigation users in basins with high hydropower productivity was sub-optimal. Optimal allocation was achieved by managing reservoirs in very distinct ways, according to the local inflow, storage capacity, hydropower productivity, and irrigation demand and productivity. This highlights the importance of appropriately representing the water users' spatial distribution and marginal benefits and costs when allocating water resources optimally. The method can handle further spatial disaggregation and can be extended to include other aspects of the water-energy nexus.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernal, Andrés; Patiny, Luc; Castillo, Andrés M.
2015-02-21
Nuclear magnetic resonance (NMR) assignment of small molecules is presented as a typical example of a combinatorial optimization problem in chemical physics. Three strategies that help improve the efficiency of solution search by the branch and bound method are presented: 1. reduction of the size of the solution space by resort to a condensed structure formula, wherein symmetric nuclei are grouped together; 2. partitioning of the solution space based on symmetry, which becomes the basis for an efficient branching procedure; and 3. a criterion for selecting input restrictions that leads to increased gaps between branches and thus faster pruning of non-viable solutions. Although the examples chosen to illustrate this work focus on small-molecule NMR assignment, the results are generic and might help solve other combinatorial optimization problems.
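The branch-and-bound pattern the abstract describes can be illustrated with a generic minimum-cost assignment search: a partial assignment is extended one item at a time, and a branch is pruned as soon as its cost plus an optimistic lower bound cannot beat the best complete solution found so far. This is a minimal sketch of the pruning idea only; the paper's symmetry-based condensation and restriction-selection strategies are not modeled.

```python
def branch_and_bound(cost, n):
    """Assign n items to n slots minimizing total cost, pruning branches whose
    partial cost plus an optimistic bound cannot beat the incumbent."""
    best = [float("inf"), None]

    # Optimistic bound: each unassigned item takes its cheapest remaining slot.
    def lower_bound(item, used):
        return sum(min(cost[i][j] for j in range(n) if j not in used)
                   for i in range(item, n))

    def recurse(item, used, partial, assignment):
        if item == n:
            if partial < best[0]:
                best[0], best[1] = partial, assignment[:]
            return
        if partial + lower_bound(item, used) >= best[0]:
            return  # prune: this branch cannot improve on the incumbent
        for j in range(n):
            if j not in used:
                assignment.append(j)
                recurse(item + 1, used | {j}, partial + cost[item][j], assignment)
                assignment.pop()

    recurse(0, frozenset(), 0, [])
    return best[0], best[1]

# Toy 3x3 cost matrix: the optimal assignment is the diagonal.
cost = [[1, 9, 9],
        [9, 1, 9],
        [9, 9, 1]]
print(branch_and_bound(cost, 3))  # -> (3, [0, 1, 2])
```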
NASA Technical Reports Server (NTRS)
Generazio, Edward R. (Inventor)
2012-01-01
A method of validating a probability of detection (POD) testing system using directed design of experiments (DOE) includes recording an input data set of observed hit and miss or analog data for sample components as a function of size of a flaw in the components. The method also includes processing the input data set to generate an output data set having an optimal class width, assigning a case number to the output data set, and generating validation instructions based on the assigned case number. An apparatus includes a host machine for receiving the input data set from the testing system and an algorithm for executing DOE to validate the test system. The algorithm applies DOE to the input data set to determine a data set having an optimal class width, assigns a case number to that data set, and generates validation instructions based on the case number.
A thermal vacuum test optimization procedure
NASA Technical Reports Server (NTRS)
Kruger, R.; Norris, H. P.
1979-01-01
An analytical model was developed that can be used to establish certain parameters of a thermal vacuum environmental test program based on an optimization of program costs. This model takes the form of a computer program that interacts with the user to obtain certain input parameters. The program provides the user a list of pertinent information regarding an optimized test program and graphs of some of the parameters. The model is a first attempt in this area and includes numerous simplifications. It nevertheless appears useful as a general guide and provides a way to extrapolate past performance to future missions.
SynGenics Optimization System (SynOptSys)
NASA Technical Reports Server (NTRS)
Ventresca, Carol; McMilan, Michelle L.; Globus, Stephanie
2013-01-01
The SynGenics Optimization System (SynOptSys) software application optimizes a product with respect to multiple, competing criteria using statistical Design of Experiments, Response-Surface Methodology, and the Desirability Optimization Methodology. The user is not required to be skilled in the underlying math; thus, SynOptSys can help designers and product developers overcome the barriers that prevent them from using powerful techniques to develop better products in a less costly manner. SynOptSys is applicable to the design of any product or process with multiple criteria to meet, and at least two factors that influence achievement of those criteria. The user begins with a selected solution principle or system concept and a set of criteria that needs to be satisfied. The criteria may be expressed in terms of documented desirements or defined responses that the future system needs to achieve. Documented desirements can be imported into SynOptSys or created and documented directly within SynOptSys. Subsequent steps include identifying factors, specifying model order for each response, designing the experiment, running the experiment and gathering the data, analyzing the results, and determining the specifications for the optimized system. The user may also enter textual information as the project progresses. Data is easily edited within SynOptSys, and the software design enables full traceability within any step in the process and facilitates reporting as needed. SynOptSys is unique in the way responses are defined and in the nuances of the goodness associated with changes in response values for each of the responses of interest. The Desirability Optimization Methodology provides the basis of this novel feature. Moreover, this is a complete, guided design and optimization process tool with embedded math that can remain invisible to the user. It is not a standalone statistical program; it is a design and optimization system.
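The Desirability Optimization Methodology the abstract refers to is commonly traced to Derringer and Suich: each response is mapped onto a 0-to-1 desirability scale, and the individual desirabilities are combined by a geometric mean, so that any fully unacceptable response vetoes the whole design. A minimal sketch, with hypothetical thresholds and responses:

```python
def desirability_larger_is_better(y, low, high, weight=1.0):
    """One-sided (larger-is-better) desirability: 0 at or below `low`,
    1 at or above `high`, a power ramp in between."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return ((y - low) / (high - low)) ** weight

def overall_desirability(ds):
    """Overall desirability D: geometric mean of the individual d_i, so a
    single d_i = 0 drives D to 0 (the design is rejected outright)."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Two hypothetical responses: strength (acceptable above 40, ideal at 60)
# and a cost score (acceptable above 0.5, ideal at 1.0).
d1 = desirability_larger_is_better(50, 40, 60)    # 0.5
d2 = desirability_larger_is_better(0.8, 0.5, 1.0) # 0.6
print(round(overall_desirability([d1, d2]), 3))   # -> 0.548
```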
Storage assignment optimization in a multi-tier shuttle warehousing system
NASA Astrophysics Data System (ADS)
Wang, Yanyan; Mou, Shandong; Wu, Yaohua
2016-03-01
The current mathematical models for the storage assignment problem are generally established based on the traveling salesman problem (TSP), which has been widely applied in the conventional automated storage and retrieval system (AS/RS). However, these models do not match multi-tier shuttle warehousing systems (MSWS), because the characteristics of parallel retrieval in multiple tiers and progressive vertical movement invalidate the foundation of the TSP formulation. In this study, a two-stage open queuing network model, in which shuttles and a lift are regarded as servers at different stages, is proposed to analyze system performance in terms of shuttle waiting period (SWP) and lift idle period (LIP) during the transaction cycle time. A mean arrival time difference matrix for pairwise stock keeping units (SKUs) is presented to determine the mean waiting time and queue length needed to optimize the storage assignment problem on the basis of SKU correlation. The decomposition method is applied to analyze the interactions among outbound task time, SWP, and LIP. An ant colony clustering algorithm is designed to determine storage partitions by clustering items. In addition, goods are assigned for storage according to the rearranging permutation and the combination of storage partitions in a 2D plane; this combination is derived from the analysis results of the queuing network model and three basic principles. The storage assignment method and its entire optimization algorithm as applied in an MSWS are verified through a practical engineering project in the tobacco industry. The results show that the total SWP and LIP can be reduced effectively, improving the utilization rates of all devices and increasing the throughput of the distribution center.
A Markov Random Field Framework for Protein Side-Chain Resonance Assignment
NASA Astrophysics Data System (ADS)
Zeng, Jianyang; Zhou, Pei; Donald, Bruce Randall
Nuclear magnetic resonance (NMR) spectroscopy plays a critical role in structural genomics, and serves as a primary tool for determining protein structures, dynamics and interactions in physiologically-relevant solution conditions. The current speed of protein structure determination via NMR is limited by the lengthy time required in resonance assignment, which maps spectral peaks to specific atoms and residues in the primary sequence. Although numerous algorithms have been developed to address the backbone resonance assignment problem [68,2,10,37,14,64,1,31,60], little work has been done to automate side-chain resonance assignment [43, 48, 5]. Most previous attempts in assigning side-chain resonances depend on a set of NMR experiments that record through-bond interactions with side-chain protons for each residue. Unfortunately, these NMR experiments have low sensitivity and limited performance on large proteins, which makes it difficult to obtain enough side-chain resonance assignments. On the other hand, it is essential to obtain almost all of the side-chain resonance assignments as a prerequisite for high-resolution structure determination. To overcome this deficiency, we present a novel side-chain resonance assignment algorithm based on alternative NMR experiments measuring through-space interactions between protons in the protein, which also provide crucial distance restraints and are normally required in high-resolution structure determination. We cast the side-chain resonance assignment problem into a Markov Random Field (MRF) framework, and extend and apply combinatorial protein design algorithms to compute the optimal solution that best interprets the NMR data. Our MRF framework captures the contact map information of the protein derived from NMR spectra, and exploits the structural information available from the backbone conformations determined by orientational restraints and a set of discretized side-chain conformations (i.e., rotamers). 
A Hausdorff-based computation is employed in the scoring function to evaluate the probability of side-chain resonance assignments to generate the observed NMR spectra. The complexity of the assignment problem is first reduced by using a dead-end elimination (DEE) algorithm, which prunes side-chain resonance assignments that are provably not part of the optimal solution. Then an A* search algorithm is used to find a set of optimal side-chain resonance assignments that best fit the NMR data. We have tested our algorithm on NMR data for five proteins, including the FF Domain 2 of human transcription elongation factor CA150 (FF2), the B1 domain of Protein G (GB1), human ubiquitin, the ubiquitin-binding zinc finger domain of the human Y-family DNA polymerase Eta (pol η UBZ), and the human Set2-Rpb1 interacting domain (hSRI). Our algorithm assigns resonances for more than 90% of the protons in the proteins, and achieves about 80% correct side-chain resonance assignments. The final structures computed using distance restraints resulting from the set of assigned side-chain resonances have backbone RMSD 0.5 - 1.4 Å and all-heavy-atom RMSD 1.0 - 2.2 Å from the reference structures that were determined by X-ray crystallography or traditional NMR approaches. These results demonstrate that our algorithm can be successfully applied to automate side-chain resonance assignment and high-quality protein structure determination. Since our algorithm does not require any specific NMR experiments for measuring the through-bond interactions with side-chain protons, it can save a significant amount of both experimental cost and spectrometer time, and hence accelerate the NMR structure determination process.
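The dead-end elimination step mentioned above can be sketched generically. Under the Goldstein DEE criterion, a choice r at position i is provably not part of the optimal solution if some competitor t is better in every context. The toy energies and data layout below are hypothetical, not the paper's actual Hausdorff-based scoring function:

```python
def dee_prune(self_energy, pair_energy, n_pos, n_rot):
    """Goldstein dead-end elimination: rotamer r at position i is pruned when
    some competitor t satisfies
        E(i_r) - E(i_t) + sum_j min_s [E(i_r, j_s) - E(i_t, j_s)] > 0,
    i.e. t beats r no matter which rotamers the other positions take.
    pair_energy[i][r][j][s] holds pairwise interaction energies (a toy layout)."""
    pruned = set()
    for i in range(n_pos):
        for r in range(n_rot):
            for t in range(n_rot):
                if t == r:
                    continue
                gap = self_energy[i][r] - self_energy[i][t]
                for j in range(n_pos):
                    if j != i:
                        gap += min(pair_energy[i][r][j][s] - pair_energy[i][t][j][s]
                                   for s in range(n_rot))
                if gap > 0:  # t dominates r in every context: r is a dead end
                    pruned.add((i, r))
                    break
    return pruned

# Two positions, two rotamers each; rotamer 1 at position 0 has strictly
# higher self-energy and identical interactions, so it is eliminated.
self_e = [[0.0, 5.0], [1.0, 1.0]]
pair_e = [[[[0.0] * 2 for _ in range(2)] for _ in range(2)] for _ in range(2)]
print(dee_prune(self_e, pair_e, 2, 2))  # -> {(0, 1)}
```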
Challenge: How Effective is Routing for Wireless Networking
2017-03-03
sage (called a "hello") to all of its neighbors. If a series of hello messages are exchanged between two users, a link is considered to exist between...these schemes. A brief description of ETX is as follows. For a given window of time, the number of hello packets that a user receives from a neighbor is...counted. A cost is then assigned to the link based on how many hello messages were heard; a link that has fewer hellos successfully transmitted across
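The ETX (expected transmission count) metric described in this snippet, due to De Couto et al., can be stated concretely: from the hello counts over a measurement window, estimate the forward and reverse delivery ratios and take ETX = 1/(df * dr), the expected number of transmissions needed for a packet and its acknowledgement to cross the link. A minimal sketch:

```python
def etx(hellos_heard_by_us, hellos_heard_by_neighbor, window_size):
    """Expected transmission count (ETX) for a link, estimated from hello
    counts over a measurement window.

    dr (reverse delivery ratio): fraction of the neighbor's hellos we heard.
    df (forward delivery ratio): fraction of our hellos the neighbor heard.
    ETX = 1 / (df * dr): expected transmissions, including retries, for a
    packet and its link-layer acknowledgement to cross the link."""
    dr = hellos_heard_by_us / window_size
    df = hellos_heard_by_neighbor / window_size
    if df == 0 or dr == 0:
        return float("inf")  # hellos lost entirely in one direction
    return 1.0 / (df * dr)

# 8 of 10 hellos heard in each direction: ETX = 1 / (0.8 * 0.8) = 1.5625
print(round(etx(8, 8, 10), 2))  # -> 1.56
```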
NASA Technical Reports Server (NTRS)
Munoz, Abraham
1988-01-01
Though imagined since the beginning of time, living in space is no longer a dream but a very near reality. The concept of a Space Station is not a new one, but a redefined one. Many investigations into the kinds of experiments and work assignments the Space Station will need to accommodate have been completed, but NASA specialists are constantly talking with potential users of the Station to learn more about the work they, the users, want to do in space. Present configurations are examined along with possible new ones.
2010-12-01
The Journal of Consumer Marketing, Vol. 12, Iss 3, 4 -22. Yang, D. (2007) What Happens If Facebook Thinks You’re Not Real? Retrieved on January 10...48 Appendix C – Instrument 3 -List of Semi-Structured Interview questions .......................49 Appendix D – Instrument 4 - Instructions for...Internet users have watched a 4 video online. This same report showed that 85% of young broadband users have watched an online video, and 62
Gao, Yuan; Zhou, Weigui; Ao, Hong; Chu, Jian; Zhou, Quan; Zhou, Bo; Wang, Kang; Li, Yi; Xue, Peng
2016-01-01
With increasing demands for better transmission speed and robust quality of service (QoS), the capacity-constrained backhaul gradually becomes a bottleneck in cooperative wireless networks, e.g., in the Internet of Things (IoT) scenario in the joint processing mode of LTE-Advanced Pro. This paper focuses on resource allocation within capacity-constrained backhaul in uplink cooperative wireless networks, where two base stations (BSs), each equipped with a single antenna, serve multiple single-antenna users via a multi-carrier transmission mode. In this work, we propose a novel cooperative transmission scheme based on compress-and-forward with user pairing to solve the joint mixed integer programming problem. To maximize the system capacity under the limited backhaul, we formulate the joint optimization problem of user sorting, subcarrier mapping, and backhaul resource sharing among different pairs (subcarriers for users). A novel, robust, and efficient centralized algorithm based on an alternating optimization strategy and perfect mapping is proposed. Simulations show that our method can improve the system capacity significantly under the backhaul resource constraint compared with the blind alternatives. PMID:27077865
NASA Astrophysics Data System (ADS)
Wang, Fu; Liu, Bo; Zhang, Lijia; Xin, Xiangjun; Tian, Qinghua; Zhang, Qi; Rao, Lan; Tian, Feng; Luo, Biao; Liu, Yingjun; Tang, Bao
2016-10-01
Elastic optical networks are considered a promising technology for future high-speed networks. In this paper, we propose a routing and spectrum assignment (RSA) algorithm based on ant colony optimization of minimum consecutiveness loss (ACO-MCL). Based on the effect of spectrum consecutiveness loss on the pheromone in the ant colony optimization, the path and spectrum with the smallest impact on the network are selected for the service request. When an ant arrives at the destination node from the source node along a path, we assume that this path is selected for the request. We calculate the consecutiveness loss of candidate-neighbor link pairs along this path after the routing and spectrum assignment. The network then updates the pheromone according to the value of the consecutiveness loss, and the path with the smallest value is saved. After multiple iterations of the ant colony optimization, the finally selected path is assigned to the request. The algorithms are simulated in different networks. The results show that the ACO-MCL algorithm performs better in blocking probability and spectrum efficiency than the other algorithms. Moreover, the ACO-MCL algorithm can effectively decrease spectrum fragmentation and enhance available spectrum consecutiveness. Compared with the other algorithms, ACO-MCL can reduce the blocking rate by at least 5.9% under heavy load.
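The ant colony mechanics underlying ACO-MCL can be illustrated on a plain shortest-path problem: ants walk the graph probabilistically, edges on good paths receive pheromone, and pheromone evaporates each iteration. This sketch replaces the paper's consecutiveness-loss term with simple path length, so it shows only the generic ACO loop:

```python
import random

def aco_shortest_path(graph, src, dst, n_ants=20, n_iter=30, rho=0.5, seed=1):
    """Minimal ant colony optimization for path selection. graph maps
    node -> {neighbor: length}; edge choice probability is proportional to
    pheromone divided by edge length."""
    random.seed(seed)
    tau = {(u, v): 1.0 for u in graph for v in graph[u]}  # pheromone per edge
    best_path, best_len = None, float("inf")
    for _ in range(n_iter):
        paths = []
        for _ in range(n_ants):
            node, path, visited = src, [src], {src}
            while node != dst:
                choices = [v for v in graph[node] if v not in visited]
                if not choices:
                    path = None  # ant got stuck; discard its walk
                    break
                weights = [tau[(node, v)] / graph[node][v] for v in choices]
                node = random.choices(choices, weights)[0]
                path.append(node)
                visited.add(node)
            if path:
                length = sum(graph[u][v] for u, v in zip(path, path[1:]))
                paths.append((length, path))
                if length < best_len:
                    best_len, best_path = length, path
        # Evaporate, then let each ant deposit pheromone inversely to length.
        for e in tau:
            tau[e] *= (1 - rho)
        for length, path in paths:
            for e in zip(path, path[1:]):
                tau[e] += 1.0 / length
    return best_path, best_len

# Tiny graph: s-a-d has length 2, s-b-d has length 5.
g = {"s": {"a": 1, "b": 4}, "a": {"d": 1}, "b": {"d": 1}, "d": {}}
print(aco_shortest_path(g, "s", "d"))  # -> (['s', 'a', 'd'], 2)
```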
Mechanism Design for Incentivizing Social Media Contributions
NASA Astrophysics Data System (ADS)
Singh, Vivek K.; Jain, Ramesh; Kankanhalli, Mohan
Despite recent advancements in user-driven social media platforms, tools for studying user behavior patterns and motivations remain primitive. We highlight the voluntary nature of user contributions and that users can choose when (and when not) to contribute to the common media pool. A Game theoretic framework is proposed to study the dynamics of social media networks where contribution costs are individual but gains are common. We model users as rational selfish agents, and consider domain attributes like voluntary participation, virtual reward structure, network effect, and public-sharing to model the dynamics of this interaction. The created model describes the most appropriate contribution strategy from each user's perspective and also highlights issues like 'free-rider' problem and individual rationality leading to irrational (i.e. sub-optimal) group behavior. We also consider the perspective of the system designer who is interested in finding the best incentive mechanisms to influence the selfish end-users so that the overall system utility is maximized. We propose and compare multiple mechanisms (based on optimal bonus payment, social incentive leveraging, and second price auction) to study how a system designer can exploit the selfishness of its users, to design incentive mechanisms which improve the overall task-completion probability and system performance, while possibly still benefiting the individual users.
Users Manual AERCOARE Version 1.0
The work completed under this work assignment is a continuation of an evaluation report prepared by ENVIRON entitled “Evaluation of the COARE–AERMOD Alternative Modeling Approach for Simulation of Shell Exploratory Drilling Sources in the Beaufort and Chukchi Seas” dated Decembe...
Microcomputer-Based Programs for Pharmacokinetic Simulations.
ERIC Educational Resources Information Center
Li, Ronald C.; And Others
1995-01-01
Microcomputer software that simulates drug-concentration time profiles based on user-assigned pharmacokinetic parameters such as central volume of distribution, elimination rate constant, absorption rate constant, dosing regimens, and compartmental transfer rate constants is described. The software is recommended for use in undergraduate…
Integration of Grid and Local Batch Resources at DESY
NASA Astrophysics Data System (ADS)
Beyer, Christoph; Finnern, Thomas; Gellrich, Andreas; Hartmann, Thomas; Kemp, Yves; Lewendel, Birgit
2017-10-01
As one of the largest resource centres, DESY has to support the differing work flows of users from various scientific backgrounds. Users range from HEP experiments in WLCG or Belle II and local HEP users to physicists from other fields such as photon science or accelerator development. By abandoning specific worker node setups in favour of generic flat nodes with middleware resources provided via CVMFS, we gain the flexibility to subsume different use cases in a homogeneous environment. Grid jobs and the local batch system are managed in an HTCondor-based setup accepting pilot, user, and containerized jobs. The unified setup allows dynamic re-assignment of resources between the different use cases. Monitoring is implemented on global batch system metrics as well as on a per-job level utilizing the corresponding cgroup information.
Sampling large landscapes with small-scale stratification-User's Manual
Bart, Jonathan
2011-01-01
This manual explains procedures for partitioning a large landscape into plots, assigning the plots to strata, and selecting plots in each stratum to be surveyed. These steps are referred to as the "sampling large landscapes (SLL) process." We assume that users of the manual have a moderate knowledge of ArcGIS and Microsoft® Excel. The manual is written for a single user, but in many cases some steps will be carried out by a biologist designing the survey and some by a quantitative assistant; thus, the manual essentially may be passed back and forth between these users. The SLL process has primarily been used to survey birds, and we refer to birds as the subjects of the counts. The process, however, could be used to count any objects.
NASA Astrophysics Data System (ADS)
Eyono Obono, S. D.; Basak, Sujit Kumar
2011-12-01
The general formulation of the assignment problem consists in the optimal allocation of a given set of tasks to a workforce. This problem is covered by existing literature for different domains such as distributed databases, distributed systems, transportation, packet radio networks, IT outsourcing, and teaching allocation. This paper presents a new version of the assignment problem for the allocation of academic tasks to staff members in departments with long leave opportunities. It describes a workload allocation scheme and its algorithm for allocating an equitable number of tasks in academic departments where long leaves are necessary.
Particle swarm optimization algorithm for optimizing assignment of blood in blood banking system.
Olusanya, Micheal O; Arasomwan, Martins A; Adewumi, Aderemi O
2015-01-01
This paper reports the performance of particle swarm optimization (PSO) for the assignment of blood to meet patients' transfusion requests. While the drive for blood donation lingers, there is a need for effective and efficient management of available blood in blood banking systems. Moreover, the inherent danger of transfusing wrong blood types to patients, unnecessary importation of blood units from external sources, and wastage of blood products due to non-usage necessitate the development of mathematical models and techniques for effective handling of blood distribution among available blood types, in order to minimize wastage and importation from external sources. This gives rise to the blood assignment problem (BAP), introduced recently in the literature. We propose queue and multiple-knapsack models with a PSO-based solution to address this challenge. Simulation is based on sets of randomly generated data that mimic real-world population distributions of blood types. Results obtained show the efficiency of the proposed algorithm for BAP, with no blood units wasted and very low importation, where necessary, from outside the blood bank. The result therefore can serve as a benchmark and basis for decision support tools for real-life deployment.
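The core PSO update used in such work is standard: each particle moves under inertia plus stochastic attraction toward its personal best and the swarm's global best. The sketch below shows only this mechanism on a toy continuous objective; the paper's queue and knapsack models for blood assignment are not reproduced:

```python
import random

def pso(cost, dim, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best particle swarm optimization minimizing `cost` over
    R^dim, with inertia w and cognitive/social coefficients c1, c2."""
    random.seed(seed)
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

# Toy objective: squared distance to (1, 2); the swarm converges near it.
best, best_cost = pso(lambda p: (p[0] - 1) ** 2 + (p[1] - 2) ** 2, dim=2)
print([round(x, 2) for x in best], round(best_cost, 4))
```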
Optimization of territory control of the mail carrier by using Hungarian methods
NASA Astrophysics Data System (ADS)
Supian, S.; Wahyuni, S.; Nahar, J.; Subiyanto
2018-03-01
In this paper, the territory assignment of mail carriers from the central post office Bandung in delivering packages to destination locations was optimized by using the Hungarian method. A sensitivity analysis against data changes that may occur was also conducted. The sampled data in this study are the territories of 10 mail carriers who will be assigned to deliver mail packages to 10 post office delivery centers in Bandung. The result of this research is the optimal territory assignment for the 10 mail carriers: mail carrier 1 to Cikutra, mail carrier 2 to Ujung Berung, mail carrier 3 to Dayeuh Kolot, mail carrier 4 to Padalarang, mail carrier 5 to Situ Saeur, mail carrier 6 to Cipedes, mail carrier 7 to Cimahi, mail carrier 8 to Soreang, mail carrier 9 to Asia-Afrika, and mail carrier 10 to Cikeruh. Based on this result, the manager of the central post office Bandung can make optimal decisions in assigning tasks to the mail carriers.
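The underlying problem is the classical linear assignment problem. The Hungarian method solves it in polynomial time; for toy sizes, exhaustive search over permutations finds the same optimum and makes the objective explicit (the cost matrix below is hypothetical):

```python
from itertools import permutations

def optimal_assignment(cost):
    """Exhaustive search for the minimum-cost one-to-one assignment of n
    carriers (rows) to n areas (columns). The Hungarian method finds the same
    optimum in O(n^3); brute force suffices to illustrate the objective."""
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best_cost:
            best_cost, best_perm = total, perm
    return best_perm, best_cost

# Hypothetical travel costs: 3 carriers (rows) x 3 delivery areas (columns).
cost = [[4, 2, 8],
        [4, 3, 7],
        [3, 1, 6]]
print(optimal_assignment(cost))  # -> ((0, 2, 1), 12)
```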
Wang, Yong; Ma, Xiaolei; Liu, Yong; Gong, Ke; Henricakson, Kristian C.; Xu, Maozeng; Wang, Yinhai
2016-01-01
This paper proposes a two-stage algorithm to simultaneously estimate origin-destination (OD) matrix, link choice proportion, and dispersion parameter using partial traffic counts in a congested network. A non-linear optimization model is developed which incorporates a dynamic dispersion parameter, followed by a two-stage algorithm in which Generalized Least Squares (GLS) estimation and a Stochastic User Equilibrium (SUE) assignment model are iteratively applied until the convergence is reached. To evaluate the performance of the algorithm, the proposed approach is implemented in a hypothetical network using input data with high error, and tested under a range of variation coefficients. The root mean squared error (RMSE) of the estimated OD demand and link flows are used to evaluate the model estimation results. The results indicate that the estimated dispersion parameter theta is insensitive to the choice of variation coefficients. The proposed approach is shown to outperform two established OD estimation methods and produce parameter estimates that are close to the ground truth. In addition, the proposed approach is applied to an empirical network in Seattle, WA to validate the robustness and practicality of this methodology. In summary, this study proposes and evaluates an innovative computational approach to accurately estimate OD matrices using link-level traffic flow data, and provides useful insight for optimal parameter selection in modeling travelers’ route choice behavior. PMID:26761209
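The role of the dispersion parameter theta in the SUE step can be illustrated with a logit route-choice model on parallel routes: flow proportions follow p_k proportional to exp(-theta * c_k), and a fixed-point iteration with congestion-dependent travel times (here the common BPR function) settles to a stochastic user equilibrium. This is a toy stand-in; the paper's method additionally re-estimates the OD matrix by GLS at each iteration:

```python
import math

def logit_split(costs, theta):
    """Logit route-choice proportions p_k proportional to exp(-theta * c_k);
    a larger dispersion parameter theta concentrates flow on cheaper routes."""
    weights = [math.exp(-theta * c) for c in costs]
    total = sum(weights)
    return [w / total for w in weights]

def sue_assignment(demand, free_flow, capacity, theta, n_iter=100):
    """Tiny stochastic-user-equilibrium fixed point on parallel routes with
    BPR travel times t = t0 * (1 + 0.15 * (flow / cap)**4)."""
    flows = [demand / len(free_flow)] * len(free_flow)
    for _ in range(n_iter):
        costs = [t0 * (1 + 0.15 * (f / c) ** 4)
                 for t0, f, c in zip(free_flow, flows, capacity)]
        target = [demand * p for p in logit_split(costs, theta)]
        # Damped averaging keeps the fixed-point iteration from oscillating.
        flows = [f + 0.5 * (t - f) for f, t in zip(flows, target)]
    return flows

# Two parallel routes (free-flow times 10 and 12, equal capacity): the faster
# route attracts more flow until congestion balances the split.
print([round(f, 1) for f in sue_assignment(1000, [10, 12], [600, 600], theta=0.5)])
```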
Heuristic Scheduling in Grid Environments: Reducing the Operational Energy Demand
NASA Astrophysics Data System (ADS)
Bodenstein, Christian
In a world where more and more businesses seem to trade in an online market, the supply of online services could quickly reach its capacity limits under ever-growing demand. Online service providers may find themselves maxed out at peak operation levels during high-traffic timeslots yet facing too little demand during low-traffic timeslots, although the latter is becoming less frequent. At this point, deciding which user is allocated what level of service becomes essential. The concept of Grid computing could offer a meaningful alternative to conventional super-computing centres. Not only can Grids reach the same computing speeds as some of the fastest supercomputers, but distributed computing also harbors a great energy-saving potential. When scheduling projects in such a Grid environment, however, simply assigning one process to a system becomes so computationally complex that schedules are often produced too late to execute, rendering their optimizations useless. Current schedulers attempt to maximize utility given some sort of constraint, often reverting to heuristics. This optimization often comes at the cost of environmental impact, in this case CO2 emissions. This work proposes an alternative model of energy-efficient scheduling while keeping a respectable amount of economic incentives untouched. Using this model, it is possible to reduce the total energy consumed by a Grid environment using 'just-in-time' flowtime management, paired with ranking nodes by efficiency.
Measuring homework completion in behavioral activation.
Busch, Andrew M; Uebelacker, Lisa A; Kalibatseva, Zornitsa; Miller, Ivan W
2010-07-01
The aim of this study was to develop and validate an observer-based coding system for the characterization and completion of homework assignments during Behavioral Activation (BA). Existing measures of homework completion are generally unsophisticated, and there is no current measure of homework completion designed to capture the particularities of BA. The tested scale sought to capture the type of assignment, realm of functioning targeted, extent of completion, and assignment difficulty. Homework assignments were drawn from 12 clients (mean age = 48, 83% female) in two trials of a 10-session BA manual targeting treatment-resistant depression in primary care. The two coders demonstrated acceptable or better reliability on most codes, and unreliable codes were dropped from the proposed scale. In addition, correlations between homework completion and outcome were strong, providing some support for construct validity. Ultimately, this line of research aims to develop a user-friendly, reliable measure of BA homework completion that can be completed by a therapist during session.
NASA Astrophysics Data System (ADS)
Abdillah, T.; Dai, R.; Setiawan, E.
2018-02-01
This study aims to develop an application of Web Services technology with the RESTful protocol to optimize the presentation of information on mining potential. The study used a User Interface Design approach for information accuracy and relevance, as well as Web Services for reliability in presenting the information. The results show that the information accuracy and relevance regarding mining potential can be seen from the achievement of the User Interface implementation in the application, which is based on the following rules: consideration of appropriate colours and objects, ease of using the navigation, and users' interaction with the application through symbols and languages understood by the users. Information accuracy and relevance are further supported by presenting information in charts with Tool Tip Text to help users understand the provided chart/figure. The reliability of the information presentation is evidenced by the results of the Web Services testing in Figure 4.5.6. This study finds that the User Interface Design and Web Services approaches (for access from different platform apps) are able to optimize the presentation. The results of this study can be used as a reference for software developers and the Provincial Government of Gorontalo.
Distributed Method to Optimal Profile Descent
NASA Astrophysics Data System (ADS)
Kim, Geun I.
Current ground automation tools for Optimal Profile Descent (OPD) procedures utilize path stretching and speed profile changes to maintain proper merging and spacing requirements in high-traffic terminal areas. However, low predictability of an aircraft's vertical profile and path deviation during descent add uncertainty to computing the estimated time of arrival, key information that enables the ground control center to manage airspace traffic effectively. This paper uses an OPD procedure that is based on a constant flight path angle to increase the predictability of the vertical profile, and defines an OPD optimization problem that uses both path stretching and speed profile changes while largely maintaining the original OPD procedure. This problem minimizes the cumulative cost of performing OPD procedures for a group of aircraft by assigning a time cost function to each aircraft and a separation cost function to each pair of aircraft. The OPD optimization problem is then solved in a decentralized manner using dual decomposition techniques over an inter-aircraft ADS-B mechanism. This method divides the optimization problem into more manageable sub-problems, which are then distributed to the group of aircraft. Each aircraft solves its assigned sub-problem and communicates the solutions to the other aircraft in an iterative process until an optimal solution is achieved, thus decentralizing the computation of the optimization problem.
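Dual decomposition of the kind described can be sketched on a toy coupled problem: each aircraft minimizes its own quadratic deviation cost plus a price on a shared resource, and a subgradient update on the price enforces the coupling constraint. All costs and numbers below are hypothetical, not the paper's actual cost functions:

```python
def dual_decomposition(prefs, weights, total, step=0.2, n_iter=200):
    """Dual decomposition for a toy coupled scheduling problem: each agent i
    independently minimizes w_i * (x_i - d_i)**2 + lam * x_i (its preferred
    slot d_i plus the price lam on shared capacity), which has the closed-form
    solution x_i = d_i - lam / (2 * w_i), while a subgradient update on lam
    enforces the coupling constraint sum(x_i) = total."""
    lam = 0.0
    xs = list(prefs)
    for _ in range(n_iter):
        # Each agent solves its own subproblem locally, given the price lam.
        xs = [d - lam / (2 * w) for d, w in zip(prefs, weights)]
        # Price update: raise lam if the shared resource is over-demanded.
        lam += step * (sum(xs) - total)
    return xs, lam

# Two aircraft prefer arrival times 10 and 14, but the slots must sum to 22;
# with equal weights, each gives up one unit and the price settles at 2.
xs, lam = dual_decomposition([10.0, 14.0], [1.0, 1.0], 22.0)
print([round(x, 2) for x in xs])  # -> [9.0, 13.0]
```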
A hybrid nonlinear programming method for design optimization
NASA Technical Reports Server (NTRS)
Rajan, S. D.
1986-01-01
Solutions to engineering design problems formulated as nonlinear programming (NLP) problems usually require the use of more than one optimization technique. Moreover, the interaction between the user (analysis/synthesis) program and the NLP system can lead to interface, scaling, or convergence problems. An NLP solution system is presented that seeks to solve these problems by providing a programming system to ease the user-system interface. A simple set of rules is used to select an optimization technique or to switch from one technique to another in an attempt to detect, diagnose, and solve some potential problems. Numerical examples involving finite element based optimal design of space trusses and rotor bearing systems are used to illustrate the applicability of the proposed methodology.
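A toy sketch of rule-based switching between optimization techniques follows. The rule and both methods are illustrative assumptions, not the rule set of the system described: take backtracking gradient steps while they make progress, and fall back to a derivative-free compass search when they stall.

```python
def hybrid_minimize(f, grad, x0, iters=100):
    """Illustrative hybrid NLP driver (assumed rules): prefer gradient
    steps with backtracking; on stall, switch to a compass search whose
    step size shrinks on failure."""
    x, step = list(x0), 1.0
    for _ in range(iters):
        g, fx = grad(x), f(x)
        t = 1.0
        while t > 1e-10:
            cand = [xi - t * gi for xi, gi in zip(x, g)]
            if f(cand) < fx - 1e-12:     # sufficient decrease: accept
                x = cand
                break
            t *= 0.5
        else:
            # rule-based switch: gradient step stalled, try compass moves
            improved = False
            for i in range(len(x)):
                for s in (step, -step):
                    cand = list(x)
                    cand[i] += s
                    if f(cand) < f(x):
                        x, improved = cand, True
            if not improved:
                step *= 0.5
    return x
```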
Autopilot for frequency-modulation atomic force microscopy.
Kuchuk, Kfir; Schlesinger, Itai; Sivan, Uri
2015-10-01
One of the most challenging aspects of operating an atomic force microscope (AFM) is finding optimal feedback parameters. This statement applies particularly to frequency-modulation AFM (FM-AFM), which utilizes three feedback loops to control the cantilever excitation amplitude, cantilever excitation frequency, and z-piezo extension. These loops are regulated by a set of feedback parameters, tuned by the user to optimize stability, sensitivity, and noise in the imaging process. Optimization of these parameters is difficult due to the coupling between the frequency and z-piezo feedback loops by the non-linear tip-sample interaction. Four proportional-integral (PI) parameters and two lock-in parameters regulating these loops require simultaneous optimization in the presence of a varying unknown tip-sample coupling. Presently, this optimization is done manually in a tedious process of trial and error. Here, we report on the development and implementation of an algorithm that computes the control parameters automatically. The algorithm reads the unperturbed cantilever resonance frequency, its quality factor, and the z-piezo driving signal power spectral density. It analyzes the poles and zeros of the total closed loop transfer function, extracts the unknown tip-sample transfer function, and finds four PI parameters and two lock-in parameters for the frequency and z-piezo control loops that optimize the bandwidth and step response of the total system. Implementation of the algorithm in a home-built AFM shows that the calculated parameters are consistently excellent and rarely require further tweaking by the user. The new algorithm saves the precious time of experienced users, facilitates utilization of FM-AFM by casual users, and removes the main hurdle on the way to fully automated FM-AFM.
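As a loose illustration of deriving PI gains from a plant's pole/zero structure, the sketch below is a heavily simplified stand-in for the algorithm described: it assumes a first-order amplitude-response model P(s) = g/(τs + 1) with τ = Q/(πf0), and places the PI zero on the plant pole so that the crossover frequency sets the closed-loop bandwidth.

```python
import math

def pi_gains(f0_hz, q_factor, plant_gain, bandwidth_hz):
    """Tune a PI controller against an assumed first-order plant
    P(s) = g / (tau*s + 1), tau = Q / (pi * f0): put the PI zero on the
    plant pole (ki/kp = 1/tau) and set crossover at the requested
    bandwidth.  Illustrative only, not the paper's algorithm."""
    tau = q_factor / (math.pi * f0_hz)   # cantilever amplitude time constant
    wc = 2.0 * math.pi * bandwidth_hz    # desired crossover (rad/s)
    kp = wc * tau / plant_gain           # proportional gain
    ki = kp / tau                        # integral gain cancels the plant pole
    return kp, ki
```

With f0 = 300 kHz, Q = 30 000 and unit plant gain, a 100 Hz bandwidth request gives kp = 20 and ki = 200π s⁻¹ under this model.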
Autopilot for frequency-modulation atomic force microscopy
NASA Astrophysics Data System (ADS)
Kuchuk, Kfir; Schlesinger, Itai; Sivan, Uri
2015-10-01
Autopilot for frequency-modulation atomic force microscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuchuk, Kfir; Schlesinger, Itai; Sivan, Uri, E-mail: phsivan@tx.technion.ac.il
2015-10-15
Air Weather Service Master Station Catalog: USAFETAC Climatic Database Users Handbook No. 6
1993-03-01
FIELD NO. DESCRIPTION OF FIELD AND COMMENTS. 01 STN NUM: a 6-digit number with the first 5 digits assigned to a particular weather reporting location IAW WMO rules, plus a sixth digit as follows: 0 = the first five digits are the actual block/station number (WMO number) assigned to... it is considered inactive for that hour. A digit (1-9) tells how many months it has been since a report was received from the station for that hour
Upgrade and Extension of the Data Acquisition System for Propulsion and Gas Dynamic Laboratories
1992-06-01
[Listing and figure fragments from the report's front matter: TPL programs TURBO4 (p. 165), SCAN_TEMP (Figure D7, p. 169) and TURBO-MENU (Figure D8, p. 170); Figure 31, the HP9000 initial CRT screen; and an HP BASIC excerpt that builds disc-file specifiers and ASSIGNs data paths to them.]
SimPhospho: a software tool enabling confident phosphosite assignment.
Suni, Veronika; Suomi, Tomi; Tsubosaka, Tomoya; Imanishi, Susumu Y; Elo, Laura L; Corthals, Garry L
2018-03-27
Mass spectrometry combined with enrichment strategies for phosphorylated peptides has been successfully employed for two decades to identify sites of phosphorylation. However, unambiguous phosphosite assignment is considered challenging. Given that site-specific phosphorylation events function as different molecular switches, validation of phosphorylation sites is of utmost importance. In our earlier study we developed a method based on simulated phosphopeptide spectral libraries, which enables highly sensitive and accurate phosphosite assignments. To promote more widespread use of this method, we here introduce a software implementation with improved usability and performance. We present SimPhospho, a fast and user-friendly tool for accurate simulation of phosphopeptide tandem mass spectra. Simulated phosphopeptide spectral libraries are used to validate and supplement database search results, with the goal of improving reliable phosphoproteome identification and reporting. The program can easily be used together with the Trans-Proteomic Pipeline and integrated in a phosphoproteomics data analysis workflow. SimPhospho is available for Windows, Linux and Mac operating systems at https://sourceforge.net/projects/simphospho/. It is open source and implemented in C++. A user's manual with a detailed description of data analysis using SimPhospho, as well as test data, can be found as supplementary material of this article. Supplementary data are available at https://www.btk.fi/research/computational-biomedicine/software/.
System Operations Studies : Feeder System Model. User's Manual.
DOT National Transportation Integrated Search
1982-11-01
The Feeder System Model (FSM) is one of the analytic models included in the System Operations Studies (SOS) software package developed for urban transit systems analysis. The objective of the model is to assign a proportion of the zone-to-zone travel...
45 CFR 164.312 - Technical safeguards.
Code of Federal Regulations, 2012 CFR
2012-10-01
... REQUIREMENTS SECURITY AND PRIVACY Security Standards for the Protection of Electronic Protected Health... that maintain electronic protected health information to allow access only to those persons or software... specifications: (i) Unique user identification (Required). Assign a unique name and/or number for identifying and...
Integrating prediction, provenance, and optimization into high energy workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schram, M.; Bansal, V.; Friese, R. D.
We propose a novel approach for efficient execution of workflows on distributed resources. The key components of this framework include: performance modeling to quantitatively predict workflow component behavior; optimization-based scheduling, such as choosing an optimal subset of resources to meet demand and assigning tasks to resources; distributed I/O optimizations such as prefetching; and provenance methods for collecting performance data. In preliminary results, these techniques improve throughput on a small Belle II workflow by 20%.
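One of the simplest optimization-based assignment policies the abstract alludes to can be sketched as a least-loaded greedy. This is an illustrative stand-in, not the framework's actual scheduler:

```python
import heapq

def greedy_assign(task_costs, n_resources):
    """Longest-processing-time greedy: visit tasks from most to least
    costly and assign each to the currently least-loaded resource.
    A classical makespan heuristic, used here only as a sketch."""
    loads = [(0.0, r) for r in range(n_resources)]
    heapq.heapify(loads)                       # min-heap keyed on load
    assignment = {}
    for task, cost in sorted(task_costs.items(), key=lambda kv: -kv[1]):
        load, r = heapq.heappop(loads)         # least-loaded resource
        assignment[task] = r
        heapq.heappush(loads, (load + cost, r))
    return assignment
```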
Grigoletti, Laura; Amaddeo, Francesco; Grassi, Aldrigo; Boldrini, Massimo; Chiappelli, Marco; Percudani, Mauro; Catapano, Francesco; Fiorillo, Andrea; Perris, Francesco; Bacigalupi, Maurizio; Albanese, Paolo; Simonetti, Simona; De Agostini, Paola; Tansella, Michele
2010-01-01
To develop predictive models to allocate patients into frequent and low service user groups within the Italian Community-based Mental Health Services (CMHSs), to allocate frequent users to different packages of care, and to identify the costs of these packages. Socio-demographic and clinical data and GAF scores at baseline were collected for 1250 users attending five CMHSs. All psychiatric contacts made by these patients during six months were recorded. A logistic regression identified variables predictive of frequent service use. Multinomial logistic regression identified variables able to predict the most appropriate package of care. A cost function was used to estimate costs. Frequent service users made up 49% of the sample and accounted for nearly 90% of all contacts. The model correctly classified 80% of users into the frequent and low user groups. Three packages of care were identified: Basic Community Treatment (4,133 Euro per six months), Intensive Community Treatment (6,180 Euro) and Rehabilitative Community Treatment (11,984 Euro), for 83%, 6% and 11% of frequent service users respectively. The model was found to be accurate for 85% of users. It is possible to develop predictive models to identify frequent service users, to assign them to pre-defined packages of care, and to use these models to inform the funding of psychiatric care.
Nonlinear Multidimensional Assignment Problems Efficient Conic Optimization Methods and Applications
2015-06-24
Arizona State University, School of Mathematical & Statistical Sciences, 901 S... The major goals of this project were completed: the exact solution of previously unsolved challenging combinatorial optimization... A combinatorial optimization problem, the Directional Sensor Problem, was solved in two ways: first, heuristically in an engineering fashion, and second, exactly
Phunchongharn, Phond; Hossain, Ekram; Camorlinga, Sergio
2011-11-01
We study the multiple access problem for e-Health applications (referred to as secondary users) coexisting with medical devices (referred to as primary or protected users) in a hospital environment. In particular, we focus on transmission scheduling and power control of secondary users in multiple spatial reuse time-division multiple access (STDMA) networks. The objective is to maximize the spectrum utilization of secondary users and minimize their power consumption subject to the electromagnetic interference (EMI) constraints for active and passive medical devices and minimum throughput guarantee for secondary users. The multiple access problem is formulated as a dual objective optimization problem which is shown to be NP-complete. We propose a joint scheduling and power control algorithm based on a greedy approach to solve the problem with much lower computational complexity. To this end, an enhanced greedy algorithm is proposed to improve the performance of the greedy algorithm by finding the optimal sequence of secondary users for scheduling. Using extensive simulations, the tradeoff in performance in terms of spectrum utilization, energy consumption, and computational complexity is evaluated for both the algorithms.
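A minimal sketch of greedy slot packing under an EMI cap follows. The ordering rule (throughput per unit interference) and the single aggregate cap are assumptions for illustration; the paper's algorithm additionally handles power control and per-user throughput guarantees.

```python
def greedy_emi_schedule(users, emi_limit):
    """Greedy STDMA slot packing under an aggregate EMI cap
    (illustrative sketch, not the paper's algorithm).  `users` maps a
    user id to (throughput, interference_at_medical_device)."""
    # serve users with the best throughput per unit interference first
    order = sorted(users, key=lambda u: users[u][0] / users[u][1], reverse=True)
    slots, pending = [], order
    while pending:
        slot, budget, rest = [], emi_limit, []
        for u in pending:
            tput, interf = users[u]
            if interf <= budget:      # admit into the current slot
                slot.append(u)
                budget -= interf
            else:                     # spill into a later slot
                rest.append(u)
        if not slot:
            # a lone user exceeds the EMI cap and can never be scheduled
            raise ValueError("EMI limit too tight for users: %r" % rest)
        slots.append(slot)
        pending = rest
    return slots
```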
Carter, Allison; Roth, Eric Abella; Ding, Erin; Milloy, M-J; Kestler, Mary; Jabbari, Shahab; Webster, Kath; de Pokomandy, Alexandra; Loutfy, Mona; Kaida, Angela
2018-03-01
We used latent class analysis to identify substance use patterns for 1363 women living with HIV in Canada and assessed associations with socio-economic marginalization, violence, and sub-optimal adherence to combination antiretroviral therapy (cART). A six-class model was identified consisting of: abstainers (26.3%), Tobacco Users (8.81%), Alcohol Users (31.9%), 'Socially Acceptable' Poly-substance Users (13.9%), Illicit Poly-substance Users (9.81%) and Illicit Poly-substance Users of All Types (9.27%). Multinomial logistic regression showed that women experiencing recent violence had significantly higher odds of membership in all substance use latent classes, relative to Abstainers, while those reporting sub-optimal cART adherence had higher odds of being members of the poly-substance use classes only. Factors significantly associated with Illicit Poly-substance Users of All Types were sexual minority status, lower income, and lower resiliency. Findings underline a need for increased social and structural supports for women who use substances to support them in leading safe and healthy lives with HIV.
Optimal Time Allocation in Backscatter Assisted Wireless Powered Communication Networks.
Lyu, Bin; Yang, Zhen; Gui, Guan; Sari, Hikmet
2017-06-01
This paper proposes a wireless powered communication network (WPCN) assisted by backscatter communication (BackCom). This model consists of a power station, an information receiver and multiple users that can work in either BackCom mode or harvest-then-transmit (HTT) mode. The time block is mainly divided into two parts corresponding to the data backscattering and transmission periods, respectively. The users first backscatter data to the information receiver in time division multiple access (TDMA) during the data backscattering period. When one user works in the BackCom mode, the other users harvest energy from the power station. During the data transmission period, two schemes, i.e., non-orthogonal multiple access (NOMA) and TDMA, are considered. To maximize the system throughput, the optimal time allocation policies are obtained. Simulation results demonstrate the superiority of the proposed model.
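The time-allocation idea can be illustrated with a brute-force sketch for two harvest-then-transmit users. The rate model t_i·log2(1 + h_i·t0/t_i) and the grid search are assumptions for illustration, not the paper's optimal policies.

```python
import math

def best_two_user_split(h1, h2, grid=200):
    """Brute-force block-time allocation for two HTT users (assumed
    model): a fraction t0 of the block charges the users, the remainder
    is split into TDMA slots t1 and t2, and user i attains an assumed
    rate of t_i * log2(1 + h_i * t0 / t_i)."""
    best_rate, best_split = -1.0, None
    for i in range(1, grid):
        for j in range(1, grid - i):
            t0, t1 = i / grid, j / grid
            t2 = 1.0 - t0 - t1
            rate = (t1 * math.log2(1.0 + h1 * t0 / t1) +
                    t2 * math.log2(1.0 + h2 * t0 / t2))
            if rate > best_rate:
                best_rate, best_split = rate, (t0, t1, t2)
    return best_rate, best_split
```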
Optimal Time Allocation in Backscatter Assisted Wireless Powered Communication Networks
Lyu, Bin; Yang, Zhen; Gui, Guan; Sari, Hikmet
2017-01-01
PMID:28587171
Kherfi, Mohammed Lamine; Ziou, Djemel
2006-04-01
In content-based image retrieval, understanding the user's needs is a challenging task that requires integrating him in the process of retrieval. Relevance feedback (RF) has proven to be an effective tool for taking the user's judgement into account. In this paper, we present a new RF framework based on a feature selection algorithm that nicely combines the advantages of a probabilistic formulation with those of using both the positive example (PE) and the negative example (NE). Through interaction with the user, our algorithm learns the importance he assigns to image features, and then applies the results obtained to define similarity measures that correspond better to his judgement. The use of the NE allows images undesired by the user to be discarded, thereby improving retrieval accuracy. As for the probabilistic formulation of the problem, it presents a multitude of advantages and opens the door to more modeling possibilities that achieve a good feature selection. It makes it possible to cluster the query data into classes, choose the probability law that best models each class, model missing data, and support queries with multiple PE and/or NE classes. The basic principle of our algorithm is to assign more importance to features with a high likelihood and those which distinguish well between PE classes and NE classes. The proposed algorithm was validated separately and in image retrieval context, and the experiments show that it performs a good feature selection and contributes to improving retrieval effectiveness.
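A crude Fisher-ratio stand-in for the feature weighting described can make the idea concrete (illustrative only; the paper's formulation is probabilistic and supports multiple PE/NE classes): features whose positive and negative examples are well separated relative to their spread receive more weight.

```python
def feature_weights(positive, negative):
    """Assumed Fisher-style weighting sketch: for each feature, weight
    is proportional to the squared separation of the positive- and
    negative-example means divided by their combined variance, then
    normalised to sum to one."""
    def mean(v):
        return sum(v) / len(v)
    def var(v):
        m = mean(v)
        return sum((x - m) ** 2 for x in v) / len(v)
    n_feat = len(positive[0])
    raw = []
    for f in range(n_feat):
        pos = [x[f] for x in positive]
        neg = [x[f] for x in negative]
        sep = (mean(pos) - mean(neg)) ** 2
        spread = var(pos) + var(neg) + 1e-12   # avoid division by zero
        raw.append(sep / spread)
    total = sum(raw)
    return [r / total for r in raw]
```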
Voltage scheduling for low power/energy
NASA Astrophysics Data System (ADS)
Manzak, Ali
2001-07-01
Power considerations have become an increasingly dominant factor in the design of both portable and desk-top systems. An effective way to reduce power consumption is to lower the supply voltage, since power is quadratically related to voltage. This dissertation considers the problem of lowering the supply voltage at (i) the system level and (ii) the behavioral level. At the system level, the voltage of a variable-voltage processor is dynamically changed with the workload. Processors with limited-size buffers as well as those with very large buffers are considered. Given the task arrival times, deadlines, execution times, periods and switching activities, task scheduling algorithms that minimize energy or peak power are developed for processors equipped with very large buffers. A relation between the operating voltages of the tasks for minimum energy/power is determined using the Lagrange multiplier method, and an iterative algorithm that utilizes this relation is developed. Experimental results show that the voltage assignment obtained by the proposed algorithm is very close to the optimal energy assignment (0.1% error) and the optimal peak-power assignment (1% error). Next, on-line and off-line minimum-energy task scheduling algorithms are developed for processors with limited-size buffers. These algorithms have polynomial time complexity and produce optimal (off-line) and close-to-optimal (on-line) solutions. A procedure to calculate the minimum buffer size, given information about task size (maximum, minimum), execution time (best case, worst case) and deadlines, is also presented. At the behavioral level, resources operating at multiple voltages are used to minimize power while maintaining throughput.
Such a scheme has the advantage of allowing modules on critical paths to be assigned the highest voltage levels (thus meeting the required timing constraints) while allowing modules on non-critical paths to be assigned lower voltage levels (thus reducing power consumption). A polynomial-time resource- and latency-constrained scheduling algorithm is developed to distribute the available slack among the nodes such that power consumption is minimized. The algorithm is iterative and utilizes the slack based on the Lagrange multiplier method.
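The Lagrange-multiplier flavour of these results can be shown on a textbook special case, an assumption-laden simplification rather than the dissertation's algorithms: with per-task energy modelled as c·s² at speed s, execution time c/s, and a common deadline, the KKT conditions give every task the same optimal speed.

```python
def min_energy_speed(cycles, deadline):
    """Assumed convention: task i takes c_i / s_i seconds and consumes
    c_i * s_i**2 energy at speed s_i.  Minimizing total energy subject
    to sum_i c_i / s_i <= deadline yields (via the Lagrange conditions)
    a single uniform speed that meets the deadline exactly."""
    s = sum(cycles) / deadline
    energy = sum(c * s * s for c in cycles)
    return s, energy
```

Running two tasks of 2 and 3 cycle-units against a deadline of 5 gives the uniform speed 1 with energy 5; any feasible non-uniform speed pair costs more.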
TaDb: A time-aware diffusion-based recommender algorithm
NASA Astrophysics Data System (ADS)
Li, Wen-Jun; Xu, Yuan-Yuan; Dong, Qiang; Zhou, Jun-Lin; Fu, Yan
2015-02-01
Traditional recommender algorithms usually employ the early and recent records indiscriminately, which overlooks the change of user interests over time. In this paper, we show that the interests of a user remain stable in a short-term interval and drift during a long-term period. Based on this observation, we propose a time-aware diffusion-based (TaDb) recommender algorithm, which assigns different temporal weights to the leading links existing before the target user's collection and the following links appearing after that in the diffusion process. Experiments on four real datasets, Netflix, MovieLens, FriendFeed and Delicious show that TaDb algorithm significantly improves the prediction accuracy compared with the algorithms not considering temporal effects.
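A compact sketch of temporally weighted diffusion follows. It is simplified: the exponential decay weight and the two-step mass-spreading rule are illustrative assumptions, not TaDb's exact weighting of leading and following links.

```python
import math

def time_weighted_diffusion(ratings, target, now, tau=30.0):
    """ProbS-style two-step diffusion with assumed temporal weights:
    each link (user, item, timestamp) carries exp(-(now - t)/tau).
    Resource spreads from the target user's items to their other
    collectors, then back to those users' items."""
    w = {(u, i): math.exp(-(now - t) / tau) for u, i, t in ratings}
    items_of, users_of = {}, {}
    for u, i, t in ratings:
        items_of.setdefault(u, []).append(i)
        users_of.setdefault(i, []).append(u)
    # step 1: spread unit resource from the target user's items to users
    res_u = {}
    for i in items_of.get(target, []):
        share = w[(target, i)] / len(users_of[i])
        for u in users_of[i]:
            res_u[u] = res_u.get(u, 0.0) + share
    # step 2: spread back to the items those users collected
    scores = {}
    for u, r in res_u.items():
        for i in items_of[u]:
            scores[i] = scores.get(i, 0.0) + r * w[(u, i)] / len(items_of[u])
    return scores
```

Ageing a link lowers the score of the item it points to, so stale tastes contribute less to the recommendation.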
Eisenberg, Daniel; Downs, Marilyn F; Golberstein, Ezra
2012-09-01
Mental illness stigma refers to negative stereotypes and prejudices about people with mental illness, and is a widespread phenomenon with damaging social, psychological, and economic consequences. Despite considerable policy attention, mental illness stigma does not appear to have declined significantly in recent years. Interpersonal contact with persons with mental illness has been identified as a promising approach to reducing mental illness stigma. This study investigates the effect of contact with mental health treatment users on stigma using an observational research design that is free of self-selection bias. The research design is based on the quasi-experiment in which university students are assigned to live together as roommates. Survey data were collected from first-year undergraduates at two large universities in the United States (N = 1605). Multivariable regressions were used to estimate the effect of assignment to a roommate with a history of mental health treatment on a brief measure of stigmatizing attitudes. Contact with a treatment user caused a modest increase in stigma (standardized effect size = 0.15, p = 0.03). This effect was present among students without a prior treatment history of their own, but not among those with a prior history. The findings indicate that naturalistic contact alone does not necessarily yield a reduction in mental illness stigma. This may help explain why stigma has not declined in societies such as the United States even as treatment use has risen substantially. The findings also highlight the importance of isolating the specific components, beyond contact per se, that are necessary to reduce stigma in contact-based interventions. Copyright © 2012 Elsevier Ltd. All rights reserved.
Optimal Decision Making in a Class of Uncertain Systems Based on Uncertain Variables
NASA Astrophysics Data System (ADS)
Bubnicki, Z.
2006-06-01
The paper is concerned with a class of uncertain systems described by relational knowledge representations with unknown parameters which are assumed to be values of uncertain variables characterized by a user in the form of certainty distributions. The first part presents the basic optimization problem consisting in finding the decision maximizing the certainty index that the requirement given by a user is satisfied. The main part is devoted to the description of the optimization problem with the given certainty threshold. It is shown how the approach presented in the paper may be applied to some problems for anticipatory systems.
Linear triangular optimization technique and pricing scheme in residential energy management systems
NASA Astrophysics Data System (ADS)
Anees, Amir; Hussain, Iqtadar; AlKhaldi, Ali Hussain; Aslam, Muhammad
2018-06-01
This paper presents a new linear optimization algorithm for power scheduling of electric appliances. The proposed system is applied in a smart-home community in which the community controller acts as a virtual distribution company for the end consumers. We also present a pricing scheme between the community controller and its residential users based on real-time pricing and inclining block rates. The results of the proposed optimization algorithm demonstrate that, by applying the anticipated technique, end users can not only minimise their consumption cost but also reduce the peak-to-average power ratio, which is beneficial for the utilities as well.
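A greedy sketch of the scheduling idea follows (illustrative; not the paper's linear triangular technique): fill each appliance's required hours cheapest-first, with a per-hour capacity cap standing in for peak-to-average control.

```python
def schedule_appliances(appliances, prices, cap_kw):
    """Assumed greedy sketch: each appliance is (name, kw, hours_needed);
    `prices` gives the tariff per hour slot.  Appliances are placed
    largest-load-first into the cheapest hours whose remaining capacity
    (cap_kw minus load already scheduled) can take them."""
    load = [0.0] * len(prices)
    plan = {}
    for name, kw, hours in sorted(appliances, key=lambda a: -a[1]):
        order = sorted(range(len(prices)), key=lambda h: prices[h])
        chosen = [h for h in order if load[h] + kw <= cap_kw][:hours]
        if len(chosen) < hours:
            raise ValueError("capacity cap too tight for " + name)
        for h in chosen:
            load[h] += kw
        plan[name] = sorted(chosen)
    return plan, load
```

Lowering `cap_kw` trades a higher bill for a flatter load profile, which is the peak-to-average tension the abstract describes.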
Punn, Rajesh; Hanisch, Debra; Motonaga, Kara S; Rosenthal, David N; Ceresnak, Scott R; Dubin, Anne M
2016-02-01
Cardiac resynchronization therapy (CRT) indications and management are well described in adults. Echocardiography (ECHO) has been used to optimize mechanical synchrony in these patients; however, there are issues with reproducibility and time intensity. Pediatric patients add challenges, with diverse substrates and limited capacity for cooperation. Electrocardiographic (ECG) methods to assess electrical synchrony are expeditious but have not been extensively studied in children. We sought to compare ECHO and ECG CRT optimization in children in a prospective, pediatric, single-center cross-over trial. Patients were assigned to undergo either ECHO or ECG optimization, followed for 6 months, and crossed over to the other assignment for another 6 months. ECHO pulsed-wave tissue Doppler and 12-lead ECG were obtained for 5 VV delays. ECG optimization was defined as the shortest QRS duration and ECHO optimization as the lowest dyssynchrony index. ECHOs/ECGs were interpreted by readers blinded to optimization technique. After each 6-month period, the following data were collected: ejection fraction, velocimetry-derived cardiac index, quality of life, ECHO-derived stroke distance, M-mode dyssynchrony, study cost, and time. Outcomes for each optimization method were compared. From June 2012 to December 2013, 19 patients enrolled. Mean age was 9.1 ± 4.3 years; 14 (74%) had structural heart disease. The mean time for optimization was shorter using ECG than ECHO (9 ± 1 min vs. 68 ± 13 min, P < 0.01). Mean charges were $4,400 ± 700 lower for ECG. No other outcome differed between groups. ECHO optimization of synchrony was not superior to ECG optimization in this pilot study. ECG optimization required less time and cost than ECHO optimization. © 2015 Wiley Periodicals, Inc.
An optimal user-interface for EPIMS database conversions and SSQ 25002 EEE parts screening
NASA Technical Reports Server (NTRS)
Watson, John C.
1996-01-01
The Electrical, Electronic, and Electromechanical (EEE) Parts Information Management System (EPIMS) database was selected by the International Space Station Parts Control Board for providing parts information to NASA managers and contractors. Parts data are transferred to the EPIMS database by converting parts list data to the EPIMS Data Exchange File Format. In general, parts list information received from contractors and suppliers does not convert directly into the EPIMS Data Exchange File Format. Often parts lists use different variable and record field assignments, and many of the EPIMS variables are not defined in the parts lists received. The objective of this work was to develop an automated system for translating parts lists into the EPIMS Data Exchange File Format for upload into the EPIMS database. Once EEE parts information has been transferred to the EPIMS database, it is necessary to screen parts data in accordance with the provisions of the SSQ 25002 Supplemental List of Qualified Electrical, Electronic, and Electromechanical Parts, Manufacturers, and Laboratories (QEPM&L). The SSQ 25002 standards are used to identify parts which satisfy the requirements for spacecraft applications. An additional objective of this work was to develop an automated system to screen EEE parts information against SSQ 25002 to inform managers of the qualification status of parts used in spacecraft applications. The EPIMS Database Conversion and SSQ 25002 User Interfaces are designed to operate through the World-Wide-Web (WWW)/Internet to provide accessibility to NASA managers and contractors.
HOMER: The Micropower Optimization Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2004-03-01
HOMER, the micropower optimization model, helps users design micropower systems for off-grid and grid-connected power applications. HOMER models micropower systems with one or more power sources, including wind turbines, photovoltaics, biomass power, hydropower, cogeneration, diesel engines, batteries, fuel cells, and electrolyzers. Users can explore a range of design questions, such as which technologies are most effective, what size components should be, how project economics are affected by changes in loads or costs, and whether the renewable resource is adequate.
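The kind of design question HOMER answers can be caricatured in a few lines. All numbers and the feasibility rule below are made-up placeholders, not HOMER's simulation models:

```python
def size_system(load_kwh_per_year, candidates):
    """Toy design-space search in the spirit of HOMER (not its actual
    models): enumerate candidate (pv_kw, battery_kwh, cost) designs and
    keep the cheapest one whose assumed annual PV output covers the
    load and whose battery covers an average day's consumption."""
    PV_YIELD_KWH_PER_KW = 1500.0   # assumed annual yield per kW of PV
    best = None
    for pv_kw, batt_kwh, cost in candidates:
        covers_load = pv_kw * PV_YIELD_KWH_PER_KW >= load_kwh_per_year
        covers_day = batt_kwh >= load_kwh_per_year / 365.0
        if covers_load and covers_day and (best is None or cost < best[2]):
            best = (pv_kw, batt_kwh, cost)
    return best
```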
Interactive model evaluation tool based on IPython notebook
NASA Astrophysics Data System (ADS)
Balemans, Sophie; Van Hoey, Stijn; Nopens, Ingmar; Seuntjes, Piet
2015-04-01
In hydrological modelling, some kind of parameter optimization is usually performed. This can be the selection of a single best parameter set, a split into behavioural and non-behavioural parameter sets based on a selected threshold, or a posterior parameter distribution derived with a formal Bayesian approach. The selection of the criterion used to measure goodness of fit (a likelihood or any objective function) is an essential step in all of these methodologies and will affect the finally selected parameter subset. Moreover, the discriminative power of the objective function also depends on the time period used. In practice, the optimization process is an iterative procedure, so in the course of the modelling process an increasing number of simulations is performed. However, the information carried by these simulation outputs is not always fully exploited. In this respect, we developed and present an interactive environment that enables the user to intuitively evaluate model performance. The aim is to explore the parameter space graphically and to visualize the impact of the selected objective function on model behaviour. First, a set of model simulation results is loaded, along with the corresponding parameter sets and a data set of the same variable as the model outcome (mostly discharge). The ranges of the loaded parameter sets define the parameter space, and the user selects the two parameters to visualise. Furthermore, an objective function and a time period of interest need to be selected. Based on this information, a two-dimensional parameter response surface is created: a scatter plot of the parameter combinations with a colour scale corresponding to the goodness of fit of each combination. Finally, a slider is available to change the colour mapping of the points.
Actually, the slider provides a threshold to exclude non behaviour parameter sets and the color scale is only attributed to the remaining parameter sets. As such, by interactively changing the settings and interpreting the graph, the user gains insight in the model structural behaviour. Moreover, a more deliberate choice of objective function and periods of high information content can be identified. The environment is written in an IPython notebook and uses the available interactive functions provided by the IPython community. As such, the power of the IPython notebook as a development environment for scientific computing is illustrated (Shen, 2014).
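The threshold-and-colour workflow described above can be sketched outside the notebook widgets as two small helper functions. This is a minimal illustration only, assuming RMSE as the objective function; the function names are ours, not the tool's actual API.

```python
import numpy as np

def rmse(simulated, observed):
    # Goodness-of-fit criterion: root-mean-square error of a simulated series.
    simulated = np.asarray(simulated, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return float(np.sqrt(np.mean((simulated - observed) ** 2)))

def behavioural_split(param_sets, objective_values, threshold):
    # Keep only parameter sets whose objective value passes the slider threshold;
    # the remaining sets would be coloured by their objective value.
    objective_values = np.asarray(objective_values, dtype=float)
    keep = objective_values <= threshold
    return np.asarray(param_sets)[keep], objective_values[keep]
```

In a notebook, one would wrap a scatter plot of the two chosen parameters in `ipywidgets.interact`, with `threshold` bound to a slider, so that moving the slider re-runs `behavioural_split` and re-colours the surviving points.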
CLUSTERnGO: a user-defined modelling platform for two-stage clustering of time-series data.
Fidaner, Işık Barış; Cankorur-Cetinkaya, Ayca; Dikicioglu, Duygu; Kirdar, Betul; Cemgil, Ali Taylan; Oliver, Stephen G
2016-02-01
Simple bioinformatic tools are frequently used to analyse time-series datasets regardless of their ability to deal with transient phenomena, limiting the meaningful information that may be extracted from them. This situation requires the development and exploitation of tailor-made, easy-to-use and flexible tools designed specifically for the analysis of time-series datasets. We present a novel statistical application called CLUSTERnGO, which uses a model-based clustering algorithm that fulfils this need. This algorithm involves two components of operation. Component 1 constructs a Bayesian non-parametric model (Infinite Mixture of Piecewise Linear Sequences) and Component 2, which applies a novel clustering methodology (Two-Stage Clustering). The software can also assign biological meaning to the identified clusters using an appropriate ontology. It applies multiple hypothesis testing to report the significance of these enrichments. The algorithm has a four-phase pipeline. The application can be executed using either command-line tools or a user-friendly Graphical User Interface. The latter has been developed to address the needs of both specialist and non-specialist users. We use three diverse test cases to demonstrate the flexibility of the proposed strategy. In all cases, CLUSTERnGO not only outperformed existing algorithms in assigning unique GO term enrichments to the identified clusters, but also revealed novel insights regarding the biological systems examined, which were not uncovered in the original publications. The C++ and QT source codes, the GUI applications for Windows, OS X and Linux operating systems and user manual are freely available for download under the GNU GPL v3 license at http://www.cmpe.boun.edu.tr/content/CnG. sgo24@cam.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
Program to Optimize Simulated Trajectories (POST). Volume 2: Utilization manual
NASA Technical Reports Server (NTRS)
Bauer, G. L.; Cornick, D. E.; Habeger, A. R.; Petersen, F. M.; Stevenson, R.
1975-01-01
Information pertinent to users of the Program to Optimize Simulated Trajectories (POST) is presented. The input required and the output available are described for each of the trajectory and targeting/optimization options. A sample input listing and the resulting output are given.
ERIC Educational Resources Information Center
Biddle, Christopher J.
2013-01-01
The purpose of this qualitative holistic multiple-case study was to identify the optimal theoretical approach for a Counter-Terrorism Reality-Based Training (CTRBT) model to train post-9/11 police officers to perform effectively in their counter-terrorism assignments. Post-9/11 police officers assigned to counter-terrorism duties are not trained…
2007-06-01
Introduces ASC-U's approach for solving the dynamic UAV allocation problem. (Front-matter figure list: Assignments Dynamics Example; ASC-U Dynamic Cueing.) ...decisions in order to respond to the dynamic environment they face. Thus, to succeed, the Army's transformation cannot rely
Data Sciences Summer Institute Topology Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watts, Seth
DSSI_TOPOPT is a 2D topology optimization code that designs stiff structures made of a single linear elastic material and void space. The code generates a finite element mesh of a rectangular design domain on which the user specifies displacement and load boundary conditions. The code iteratively designs a structure that minimizes the compliance (maximizes the stiffness) of the structure under the given loading, subject to an upper bound on the amount of material used. Depending on user options, the code can evaluate the performance of a user-designed structure, or create a design from scratch. Output includes the finite element mesh, design, and visualizations of the design.
The nurse scheduling problem: a goal programming and nonlinear optimization approaches
NASA Astrophysics Data System (ADS)
Hakim, L.; Bakhtiar, T.; Jaharuddin
2017-01-01
Nurse scheduling is the activity of allocating nurses to a set of tasks in certain rooms of a hospital or health centre within a given period. One of the obstacles in nurse scheduling is a lack of resources to fulfil the needs of the hospital. Scheduling undertaken manually risks violating some of the nursing rules set by the hospital. Therefore, this study aimed to develop scheduling models that satisfy all the specific rules set by the management of Bogor State Hospital. We have developed three models to meet the scheduling needs. Model 1 is designed to schedule nurses who are solely assigned to a certain inpatient unit, and Model 2 is constructed to manage nurses who are assigned to an inpatient room as well as to the polyclinic room as conjunct nurses. As the assignment of nurses on each shift is uneven, we propose Model 3 to minimize the variance of the workload in order to achieve an equitable assignment on every shift. The first two models are formulated in a goal programming framework, while the last model is in nonlinear optimization form.
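As a toy illustration of Model 3's idea only (not the paper's actual goal-programming or nonlinear formulation), a brute-force search can find the assignment that minimizes the variance of the per-shift workload for a small instance:

```python
from itertools import product
from statistics import pvariance

def min_variance_schedule(nurses, shifts):
    # Assign each nurse to exactly one shift so that every shift is covered
    # and the variance of the per-shift head count is minimal.
    best, best_var = None, float("inf")
    for assign in product(range(len(shifts)), repeat=len(nurses)):
        counts = [assign.count(s) for s in range(len(shifts))]
        if 0 in counts:  # hard rule: every shift needs at least one nurse
            continue
        var = pvariance(counts)
        if var < best_var:
            best, best_var = assign, var
    return {n: shifts[s] for n, s in zip(nurses, best)}, best_var
```

Enumeration is only feasible for tiny instances; the paper's models would instead be solved with goal programming or a nonlinear solver.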
Mixed-Strategy Chance Constrained Optimal Control
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Kuwata, Yoshiaki; Balaram, J.
2013-01-01
This paper presents a novel chance constrained optimal control (CCOC) algorithm that chooses a control action probabilistically. A CCOC problem is to find a control input that minimizes the expected cost while guaranteeing that the probability of violating a set of constraints is below a user-specified threshold. We show that a probabilistic control approach, which we refer to as a mixed control strategy, enables us to obtain a cost that is better than what deterministic control strategies can achieve when the CCOC problem is nonconvex. The resulting mixed-strategy CCOC problem turns out to be a convexification of the original nonconvex CCOC problem. Furthermore, we also show that a mixed control strategy only needs to "mix" up to two deterministic control actions in order to achieve optimality. Building upon an iterative dual optimization, the proposed algorithm quickly converges to the optimal mixed control strategy with a user-specified tolerance.
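The paper's key observation, that mixing at most two deterministic actions suffices, can be illustrated with a hypothetical two-action example. The names and the closed-form tight-constraint step below are ours, not the paper's iterative dual algorithm.

```python
def best_two_action_mix(J1, q1, J2, q2, delta):
    # Cheapest mixture of two deterministic actions: play action 1 with
    # probability p, action 2 with probability 1 - p, subject to the chance
    # constraint p*q1 + (1-p)*q2 <= delta on the violation probability.
    # Assumes J1 <= J2 (action 1 is cheaper) and q2 <= delta (action 2 feasible).
    if q1 <= delta:  # the cheap action is already feasible on its own
        return 1.0, J1
    # Otherwise the chance constraint is tight at the optimum.
    p = (delta - q2) / (q1 - q2)
    return p, p * J1 + (1 - p) * J2
```

The mixture attains an expected cost between the two deterministic costs, which is exactly why randomization can beat any single deterministic action in the nonconvex case.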
Multidisciplinary Aerospace Systems Optimization: Computational AeroSciences (CAS) Project
NASA Technical Reports Server (NTRS)
Kodiyalam, S.; Sobieski, Jaroslaw S. (Technical Monitor)
2001-01-01
The report describes a method for optimizing a system whose analysis is so expensive that it is impractical to let the optimization code invoke it directly, because excessive computational cost and elapsed time might result. In such a situation it is imperative to let the user control the number of times the analysis is invoked. The reported method achieves that with two techniques in the Design of Experiments category: a uniform dispersal of the trial design points over an n-dimensional hypersphere combined with response surface fitting, and the technique of kriging. Analyses of all the trial designs, whose number may be set by the user, are performed before activation of the optimization code, and the results are stored as a database. That code is then executed and referred to the above database. Two applications, one to an airborne laser system and one to aircraft optimization, illustrate the method.
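The first Design-of-Experiments technique, uniform dispersal of trial points over an n-dimensional hypersphere, can be sketched with the standard construction from normalised Gaussian samples; the function name and interface are illustrative, not the report's code.

```python
import numpy as np

def hypersphere_points(n_points, dim, radius=1.0, seed=0):
    # Normalised Gaussian samples are uniformly distributed on the surface
    # of an n-dimensional hypersphere, giving an even dispersal of trial designs.
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n_points, dim))
    return radius * x / np.linalg.norm(x, axis=1, keepdims=True)
```

The expensive analysis is then run once per trial point, the results are stored in a database, and the optimizer queries a response surface (or kriging model) fitted to that database instead of invoking the analysis itself.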
Interplanetary Program to Optimize Simulated Trajectories (IPOST). Volume 1: User's guide
NASA Technical Reports Server (NTRS)
Hong, P. E.; Kent, P. D.; Olson, D. W.; Vallado, C. A.
1992-01-01
IPOST is intended to support many analysis phases, from early interplanetary feasibility studies through spacecraft development and operations. The IPOST output provides information for sizing and understanding mission impacts related to propulsion, guidance, communications, sensor/actuators, payload, and other dynamic and geometric environments. IPOST models three-degree-of-freedom trajectory events, such as launch/ascent, orbital coast, propulsive maneuvering (impulsive and finite burn), gravity assist, and atmospheric entry. Trajectory propagation is performed using a choice of Cowell, Encke, Multiconic, Onestep, or Conic methods. The user identifies a desired sequence of trajectory events, and selects which parameters are independent (controls) and which are dependent (targets), as well as other constraints and the cost function. Targeting and optimization are performed using the Stanford NPSOL algorithm. The IPOST structure allows sub-problems within a master optimization problem to aid in the general constrained parameter optimization solution. An alternate optimization method uses implicit simulation and collocation techniques.
NASA Astrophysics Data System (ADS)
Xuan, Hejun; Wang, Yuping; Xu, Zhanqi; Hao, Shanshan; Wang, Xiaoli
2017-11-01
Virtualization technology can greatly improve the efficiency of networks by allowing virtual optical networks to share the resources of the physical networks. However, it faces some challenges, such as finding efficient strategies for virtual node mapping, virtual link mapping, and spectrum assignment. The problem is even more complex and challenging when the physical elastic optical networks use multi-core fibers. To tackle these challenges, we establish a constrained optimization model to determine the optimal schemes of optical network mapping, core allocation, and spectrum assignment. To solve the model efficiently, a tailor-made encoding scheme and crossover and mutation operators are designed. Based on these, an efficient genetic algorithm is proposed to obtain the optimal schemes of virtual node mapping, virtual link mapping, and core allocation. The simulation experiments are conducted on three widely used networks, and the experimental results show the effectiveness of the proposed model and algorithm.
Relabeling exchange method (REM) for learning in neural networks
NASA Astrophysics Data System (ADS)
Wu, Wen; Mammone, Richard J.
1994-02-01
The supervised training of neural networks requires the use of output labels, which are usually assigned arbitrarily. In this paper it is shown that there is a significant difference in the rms error of learning when `optimal' label assignment schemes are used. We investigated two efficient random search algorithms to solve the relabeling problem: simulated annealing and the genetic algorithm. However, we found them to be computationally expensive. Therefore we introduce a new heuristic algorithm called the Relabeling Exchange Method (REM), which is computationally more attractive and produces optimal performance. REM has been used to organize the optimal structure for multi-layered perceptrons and neural tree networks. The method is a general one and can be implemented as a modification to standard training algorithms. The motivation for the new relabeling strategy is based on the present interpretation of dyslexia as an encoding problem.
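A plausible sketch of the exchange idea (the abstract does not specify the exact REM procedure, so this is our reconstruction) is a pairwise swap of label codes that is accepted whenever it lowers the total squared error between each class's mean network output and its assigned code:

```python
import numpy as np

def relabel_by_exchange(class_outputs, label_codes):
    # Start from the identity assignment of label codes to classes, then swap
    # two codes whenever the swap lowers the total squared error; repeat until
    # no pairwise exchange improves the assignment (a local optimum).
    n = len(class_outputs)
    perm = list(range(n))

    def err(p):
        return sum(float(np.sum((class_outputs[c] - label_codes[p[c]]) ** 2))
                   for c in range(n))

    improved = True
    while improved:
        improved = False
        for i in range(n):
            for j in range(i + 1, n):
                trial = perm[:]
                trial[i], trial[j] = trial[j], trial[i]
                if err(trial) < err(perm):
                    perm, improved = trial, True
    return perm, err(perm)
```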
Where can I get help with science questions?
Atmospheric Science Data Center
2014-12-08
... (ASDC's) User Services. They will forward them to the volunteer scientists assigned to the ERBE data sets. Since the ERBE project has ... is very limited and depends on the availability of our volunteer scientists. It may take a few days to receive your answers. ...
User Authentication from Web Browsing Behavior
2013-05-01
ourselves with a cognitive personal fingerprint. Attribution is broadly defined as the assignment of an effect to a cause. We differentiate between...reusable patterns of behavior. We encode the semantic and stylistic content. (Figure 4 caption: Burstiness profile below 1 min, aggregated across all sessions.)
45 CFR 164.312 - Technical safeguards.
Code of Federal Regulations, 2013 CFR
2013-10-01
... REQUIREMENTS SECURITY AND PRIVACY Security Standards for the Protection of Electronic Protected Health... persons or software programs that have been granted access rights as specified in § 164.308(a)(4). (2) Implementation specifications: (i) Unique user identification (Required). Assign a unique name and/or number for...
Guthrie, Kate M; Rosen, Rochelle K; Vargas, Sara E; Guillen, Melissa; Steger, Arielle L; Getz, Melissa L; Smith, Kelley A; Ramirez, Jaime J; Kojic, Erna M
2017-10-01
The development of HIV-preventive topical vaginal microbicides has been challenged by a lack of sufficient adherence in later stage clinical trials to confidently evaluate effectiveness. This dilemma has highlighted the need to integrate translational research earlier in the drug development process, essentially applying behavioral science to facilitate the advances of basic science with respect to the uptake and use of biomedical prevention technologies. In the last several years, there has been an increasing recognition that the user experience, specifically the sensory experience, as well as the role of meaning-making elicited by those sensations, may play a more substantive role than previously thought. Importantly, the role of the user-their sensory perceptions, their judgements of those experiences, and their willingness to use a product-is critical in product uptake and consistent use post-marketing, ultimately realizing gains in global public health. Specifically, a successful prevention product requires an efficacious drug, an efficient drug delivery system, and an effective user. We present an integrated iterative drug development and user experience evaluation method to illustrate how user-centered formulation design can be iterated from the early stages of preclinical development to leverage the user experience. Integrating the user and their product experiences into the formulation design process may help optimize both the efficiency of drug delivery and the effectiveness of the user.
Adams, Samantha A; de Bont, Antoinette A
2007-06-01
Hyperlinked web trust marks have been a popular topic of discussion during the past 10 years. However, the discussion has focused mostly on what these trust marks are not doing in terms of helping patients (or other lay end users) find reliable medical information on the web. In this paper, we discuss how this focus on patients and their actions with respect to trust marks, has overshadowed, if not rendered invisible, what trust marks are doing to educate medical site/information providers. We draw on data from ethnographic research conducted at the Health on the Net Foundation in 2002 and 2003 in order to explore an alternate definition of what it means to be a 'user' of a trust mark and the importance of the review process in educating site providers. We argue that understanding the work involved in the process of assigning a seal is crucial to understanding the role that the seal plays as part of the medical internet.
Loft: An Automated Mesh Generator for Stiffened Shell Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Eldred, Lloyd B.
2011-01-01
Loft is an automated mesh generation code that is designed for aerospace vehicle structures. From user input, Loft generates meshes for wings, noses, tanks, fuselage sections, thrust structures, and so on. As a mesh is generated, each element is assigned properties to mark the part of the vehicle with which it is associated. This property assignment is an extremely powerful feature that enables detailed analysis tasks, such as load application and structural sizing. This report is presented in two parts. The first part is an overview of the code and its applications. The modeling approach that was used to create the finite element meshes is described. Several applications of the code are demonstrated, including a Next Generation Launch Technology (NGLT) wing-sizing study, a lunar lander stage study, a launch vehicle shroud shape study, and a two-stage-to-orbit (TSTO) orbiter. Part two of the report is the program user manual. The manual includes in-depth tutorials and a complete command reference.
Combined Simulated Annealing and Genetic Algorithm Approach to Bus Network Design
NASA Astrophysics Data System (ADS)
Liu, Li; Olszewski, Piotr; Goh, Pong-Chai
A new method - combined simulated annealing (SA) and genetic algorithm (GA) approach is proposed to solve the problem of bus route design and frequency setting for a given road network with fixed bus stop locations and fixed travel demand. The method involves two steps: a set of candidate routes is generated first and then the best subset of these routes is selected by the combined SA and GA procedure. SA is the main process to search for a better solution to minimize the total system cost, comprising user and operator costs. GA is used as a sub-process to generate new solutions. Bus demand assignment on two alternative paths is performed at the solution evaluation stage. The method was implemented on four theoretical grid networks of different size and a benchmark network. Several GA operators (crossover and mutation) were utilized and tested for their effectiveness. The results show that the proposed method can efficiently converge to the optimal solution on a small network but computation time increases significantly with network size. The method can also be used for other transport operation management problems.
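The SA main process with a GA-style operator as the neighbour generator can be sketched generically. This is a minimal skeleton with an illustrative interface, not the authors' implementation; `cost` would be the total system cost (user plus operator) and `mutate` a GA crossover/mutation operator over route subsets.

```python
import math
import random

def anneal(initial, cost, mutate, t0=1.0, cooling=0.95, steps=500, seed=42):
    # Simulated-annealing skeleton: `mutate` generates a neighbouring solution
    # (here it would be a GA operator producing a new route subset); worse
    # solutions are accepted with probability exp(-dE/T), and T is cooled
    # geometrically. The best solution ever seen is returned.
    rng = random.Random(seed)
    current, best = initial, initial
    t = t0
    for _ in range(steps):
        candidate = mutate(current, rng)
        d = cost(candidate) - cost(current)
        if d < 0 or rng.random() < math.exp(-d / max(t, 1e-12)):
            current = candidate
            if cost(current) < cost(best):
                best = current
        t *= cooling
    return best
```

For example, on a toy 1-D problem with `cost = lambda x: (x - 3) ** 2` and `mutate = lambda x, r: x + r.choice([-1, 1])`, the loop drifts toward the minimizer as the temperature cools.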
Orthos, an alarm system for the ALICE DAQ operations
NASA Astrophysics Data System (ADS)
Chapeland, Sylvain; Carena, Franco; Carena, Wisla; Chibante Barroso, Vasco; Costa, Filippo; Denes, Ervin; Divia, Roberto; Fuchs, Ulrich; Grigore, Alexandru; Simonetti, Giuseppe; Soos, Csaba; Telesca, Adriana; Vande Vyvre, Pierre; von Haller, Barthelemy
2012-12-01
ALICE (A Large Ion Collider Experiment) is the heavy-ion detector studying the physics of strongly interacting matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). The DAQ (Data Acquisition System) facilities handle the data flow from the detectors electronics up to the mass storage. The DAQ system is based on a large farm of commodity hardware consisting of more than 600 devices (Linux PCs, storage, network switches), and controls hundreds of distributed hardware and software components interacting together. This paper presents Orthos, the alarm system used to detect, log, report, and follow-up abnormal situations on the DAQ machines at the experimental area. The main objective of this package is to integrate alarm detection and notification mechanisms with a full-featured issues tracker, in order to prioritize, assign, and fix system failures optimally. This tool relies on a database repository with a logic engine, SQL interfaces to inject or query metrics, and dynamic web pages for user interaction. We describe the system architecture, the technologies used for the implementation, and the integration with existing monitoring tools.
Millimeter wave satellite concepts, volume 1
NASA Technical Reports Server (NTRS)
Hilsen, N. B.; Holland, L. D.; Thomas, R. E.; Wallace, R. W.; Gallagher, J. G.
1977-01-01
Technologies necessary for the development of millimeter-spectrum communication satellites were identified and examined from a system point of view. The goals of the program were to develop a methodology, based on the technical requirements of potential services that might be assigned to millimeter wave bands, for identifying viable and appropriate technologies for future NASA millimeter research and development programs, and to test this methodology with selected user applications and services. The entire communications network, both ground and space subsystems, was studied. Cost, weight, and performance models for the subsystems, conceptual designs for point-to-point and broadcast communications satellites, and analytic relationships between subsystem parameters and overall link performance are discussed, along with baseline conceptual systems, sensitivity studies, model adjustment analyses, identification of critical technologies and their risks, and brief research and development program scenarios for the technologies judged to carry moderate or extensive risk. The identification of technologies for millimeter satellite communication systems, and the assessment of their relative risks, was accomplished through subsystem modeling and link optimization for both point-to-point and broadcast applications.
Image categorization for marketing purposes
NASA Astrophysics Data System (ADS)
Almishari, Mishari I.; Lee, Haengju; Gnanasambandam, Nathan
2011-03-01
Images meant for marketing and promotional purposes (i.e. coupons) are a basic component in incentivizing customers to visit shopping outlets and purchase discounted commodities. They also help department stores attract more customers and, potentially, speed up their cash flow. While coupons are available from various sources (print, web, etc.), categorizing these monetary instruments is a benefit to users. We are interested in an automatic categorizer system that aggregates coupons from different sources (web, digital coupons, paper coupons, etc.) and assigns a type to each of them in an efficient manner. While there are several dimensions to this problem, in this paper we study the problem of accurately categorizing/classifying the coupons. We propose and evaluate four different techniques for categorizing the coupons, namely a word-based model, an n-gram-based model, an externally weighing model, and a weight-decaying model, which take advantage of known machine learning algorithms. We evaluate these techniques and they achieve high accuracies in the range of 73.1% to 93.2%. We provide various examples of accuracy optimizations that can be performed and show a progressive increase in categorization accuracy for our test dataset.
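Of the four techniques, the n-gram-based model can be illustrated with a minimal character-trigram profile classifier. This is our own sketch; the abstract does not specify the paper's actual features or learning algorithms.

```python
from collections import Counter

def ngrams(text, n=3):
    # Character n-gram counts of a (lower-cased) coupon text.
    text = text.lower()
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def train(labelled_docs, n=3):
    # Build one aggregate n-gram profile per coupon category.
    profiles = {}
    for text, label in labelled_docs:
        profiles.setdefault(label, Counter()).update(ngrams(text, n))
    return profiles

def categorize(text, profiles, n=3):
    # Assign the category whose profile shares the most n-gram mass
    # with the query coupon.
    doc = ngrams(text, n)
    def score(prof):
        return sum(min(count, prof[gram]) for gram, count in doc.items())
    return max(profiles, key=lambda label: score(profiles[label]))
```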
Ma, Jianmin; Eisenhaber, Frank; Maurer-Stroh, Sebastian
2013-12-01
Beta lactams comprise the largest and still most effective group of antibiotics, but bacteria can gain resistance through different beta lactamases that can degrade these antibiotics. We developed a user friendly tree building web server that allows users to assign beta lactamase sequences to their respective molecular classes and subclasses. Further clinically relevant information includes if the gene is typically chromosomal or transferable through plasmids as well as listing the antibiotics which the most closely related reference sequences are known to target and cause resistance against. This web server can automatically build three phylogenetic trees: the first tree with closely related sequences from a Tachyon search against the NCBI nr database, the second tree with curated reference beta lactamase sequences, and the third tree built specifically from substrate binding pocket residues of the curated reference beta lactamase sequences. We show that the latter is better suited to recover antibiotic substrate assignments through nearest neighbor annotation transfer. The users can also choose to build a structural model for the query sequence and view the binding pocket residues of their query relative to other beta lactamases in the sequence alignment as well as in the 3D structure relative to bound antibiotics. This web server is freely available at http://blac.bii.a-star.edu.sg/.
CometBoards Users Manual Release 1.0
NASA Technical Reports Server (NTRS)
Guptill, James D.; Coroneos, Rula M.; Patnaik, Surya N.; Hopkins, Dale A.; Berke, Lazlo
1996-01-01
Several nonlinear mathematical programming algorithms for structural design applications are available at present. These include the sequence of unconstrained minimizations technique, the method of feasible directions, and the sequential quadratic programming technique. The optimality criteria technique and the fully utilized design concept are two other structural design methods. A project was undertaken to bring all these design methods under a common computer environment so that a designer can select any one of these tools that may be suitable for his/her application. To facilitate selection of a design algorithm, to validate and check out the computer code, and to ascertain the relative merits of the design tools, modest finite element structural analysis programs based on the concepts of stiffness and integrated force methods have been coupled to each design method. The code that contains both these design and analysis tools, by reading input information from analysis and design data files, can cast the design of a structure as a minimum-weight optimization problem. The code can then solve it with a user-specified optimization technique and a user-specified analysis method. This design code is called CometBoards, which is an acronym for Comparative Evaluation Test Bed of Optimization and Analysis Routines for the Design of Structures. This manual describes for the user a step-by-step procedure for setting up the input data files and executing CometBoards to solve a structural design problem. The manual includes the organization of CometBoards; instructions for preparing input data files; the procedure for submitting a problem; illustrative examples; and several demonstration problems. A set of 29 structural design problems has been solved by using all the optimization methods available in CometBoards. A summary of the optimum results obtained for these problems is appended to this user's manual.
CometBoards, at present, is available for Posix-based Cray and Convex computers, Iris and Sun workstations, and the VM/CMS system.
DyNAVacS: an integrative tool for optimized DNA vaccine design.
Harish, Nagarajan; Gupta, Rekha; Agarwal, Parul; Scaria, Vinod; Pillai, Beena
2006-07-01
DNA vaccines have slowly emerged as keystones in preventive immunology due to their versatility in inducing both cell-mediated as well as humoral immune responses. The design of an efficient DNA vaccine, involves choice of a suitable expression vector, ensuring optimal expression by codon optimization, engineering CpG motifs for enhancing immune responses and providing additional sequence signals for efficient translation. DyNAVacS is a web-based tool created for rapid and easy design of DNA vaccines. It follows a step-wise design flow, which guides the user through the various sequential steps in the design of the vaccine. Further, it allows restriction enzyme mapping, design of primers spanning user specified sequences and provides information regarding the vectors currently used for generation of DNA vaccines. The web version uses Apache HTTP server. The interface was written in HTML and utilizes the Common Gateway Interface scripts written in PERL for functionality. DyNAVacS is an integrated tool consisting of user-friendly programs, which require minimal information from the user. The software is available free of cost, as a web based application at URL: http://miracle.igib.res.in/dynavac/.
Consumer-identified barriers and strategies for optimizing technology use in the workplace.
De Jonge, Desleigh M; Rodger, Sylvia A
2006-01-01
This article explores the experiences of 26 assistive technology (AT) users having a range of physical impairments as they optimized their use of technology in the workplace. A qualitative research design was employed using in-depth, open-ended interviews and observations of AT users in the workplace. Participants identified many factors that limited their use of technology such as discomfort and pain, limited knowledge of the technology's features, and the complexity of the technology. The amount of time required for training, limited work time available for mastery, cost of training and limitations of the training provided, resulted in an over-reliance on trial and error and informal support networks and a sense of isolation. AT users enhanced their use of technology by addressing the ergonomics of the workstation and customizing the technology to address individual needs and strategies. Other key strategies included tailored training and learning support as well as opportunities to practice using the technology and explore its features away from work demands. This research identified structures important for effective AT use in the workplace which need to be put in place to ensure that AT users are able to master and optimize their use of technology.
Robust Rate Maximization for Heterogeneous Wireless Networks under Channel Uncertainties
Xu, Yongjun; Hu, Yuan; Li, Guoquan
2018-01-01
Heterogeneous wireless networks are a promising technology in next-generation wireless communication networks and have been shown to efficiently reduce the blind area of mobile communication and improve network coverage compared with traditional wireless communication networks. In this paper, a robust power allocation problem for a two-tier heterogeneous wireless network is formulated based on orthogonal frequency-division multiplexing technology. Under imperfect channel state information (CSI), the robust sum-rate maximization problem is built while avoiding severe cross-tier interference to the macrocell user and maintaining the minimum rate requirement of each femtocell user. To be practical, both the channel estimation errors from the femtocells to the macrocell and the link uncertainties of each femtocell user are simultaneously considered in terms of users' outage probabilities. The optimization problem is analyzed under no CSI feedback with a cumulative distribution function and under partial CSI with a Gaussian distribution of the channel estimation error. The robust optimization problem is converted into a convex optimization problem, which is solved using Lagrange dual theory and a subgradient algorithm. Simulation results demonstrate the effectiveness of the proposed algorithm and the impact of channel uncertainties on system performance. PMID:29466315
NASA Astrophysics Data System (ADS)
Thomas, V. I.; Yu, E.; Acharya, P.; Jaramillo, J.; Chowdhury, F.
2015-12-01
Maintaining and archiving accurate site metadata is critical for seismic network operations. The Advanced National Seismic System (ANSS) Station Information System (SIS) is a repository of seismic network field equipment, equipment response, and other site information. Currently, there are 187 different sensor models and 114 data-logger models in SIS. SIS has a web-based user interface that allows network operators to enter information about seismic equipment and assign response parameters to it. It allows users to log entries for sites, equipment, and data streams. Users can also track when equipment is installed, updated, and/or removed from sites. When seismic equipment configurations change for a site, SIS computes the overall gain of a data channel by combining the response parameters of the underlying hardware components. Users can then distribute this metadata in standardized formats such as FDSN StationXML or dataless SEED. One powerful advantage of SIS is that existing data in the repository can be leveraged: e.g., new instruments can be assigned response parameters from the Incorporated Research Institutions for Seismology (IRIS) Nominal Response Library (NRL), or from a similar instrument already in the inventory, thereby reducing the amount of time needed to determine parameters when new equipment (or models) is introduced into a network. SIS is also useful for managing field equipment that does not produce seismic data (e.g., power systems, telemetry devices, or GPS receivers) and gives the network operator a comprehensive view of site field work. SIS allows users to generate field logs to document activities and inventory at sites. Thus, operators can also use SIS reporting capabilities to improve planning and maintenance of the network. Queries such as how many sensors of a certain model are installed or what pieces of equipment have active problem reports are just a few examples of the type of information that is available to SIS users.
N-Screen Aware Multicriteria Hybrid Recommender System Using Weight Based Subspace Clustering
Ullah, Farman; Lee, Sungchang
2014-01-01
This paper presents a recommender system for N-screen services in which users have multiple devices with different capabilities. In N-screen services, a user can use various devices in different locations and at different times and can change devices while the service is running. N-screen aware recommendation seeks to improve the user experience with recommended content by considering N-screen device attributes such as screen resolution, media codec, remaining battery time, and access network, as well as the user's temporal usage pattern information, which is not considered in existing recommender systems. For N-screen aware recommendation support, this work introduces a user device profile collaboration agent, manager, and N-screen control server to acquire and manage the user's N-screen device profiles. Furthermore, a multicriteria hybrid framework is suggested that incorporates the N-screen device information with user preferences and demographics. In addition, we propose an individual feature and subspace weight based clustering (IFSWC) to assign different weights to each subspace and to each feature within a subspace in the hybrid framework. The proposed system improves accuracy, precision, and scalability, and mitigates the sparsity and cold-start issues. The simulation results demonstrate the effectiveness of the proposed system and support these claims. PMID:25152921
Jing, Xia; Cimino, James J; Del Fiol, Guilherme
2015-11-30
The Librarian Infobutton Tailoring Environment (LITE) is a Web-based knowledge capture, management, and configuration tool with which users can build profiles used by OpenInfobutton, an open source infobutton manager, to provide electronic health record users with context-relevant links to online knowledge resources. We conducted a multipart evaluation study to explore users' attitudes and acceptance of LITE and to guide future development. The evaluation consisted of an initial online survey to all LITE users, followed by an observational study of a subset of users in which evaluators' sessions were recorded while they conducted assigned tasks. The observational study was followed by administration of a modified System Usability Scale (SUS) survey. Fourteen users responded to the survey and indicated good acceptance of LITE with feedback that was mostly positive. Six users participated in the observational study, demonstrating average task completion time of less than 6 minutes and an average SUS score of 72, which is considered good compared with other SUS scores. LITE can be used to fulfill its designated tasks quickly and successfully. Evaluators proposed suggestions for improvements in LITE functionality and user interface.
Development of a User Interface for a Regression Analysis Software Tool
NASA Technical Reports Server (NTRS)
Ulbrich, Norbert Manfred; Volden, Thomas R.
2010-01-01
An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.
Optimization-Based Selection of Influential Agents in a Rural Afghan Social Network
2010-06-01
nonlethal targeting model, a nonlinear programming (NLP) optimization formulation that identifies the k US agent assignment strategy producing the greatest...leader social network, and 3) the nonlethal targeting model, a nonlinear programming (NLP) optimization formulation that identifies the k US agent...NATO Coalition in Afghanistan...for Afghanistan ([54], [31], [48], [55], [30]). While Arab tribes tend to be more hierarchical, Pashtun tribes are
Neural Meta-Memes Framework for Combinatorial Optimization
NASA Astrophysics Data System (ADS)
Song, Li Qin; Lim, Meng Hiot; Ong, Yew Soon
In this paper, we present a Neural Meta-Memes Framework (NMMF) for combinatorial optimization. NMMF is a framework which models basic optimization algorithms as memes and manages them dynamically when solving combinatorial problems. NMMF encompasses neural networks which serve as the overall planner/coordinator to balance the workload between memes. We show the efficacy of the proposed NMMF through empirical study on a class of combinatorial problem, the quadratic assignment problem (QAP).
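For readers unfamiliar with the quadratic assignment problem (QAP) mentioned above: a permutation assigns facilities to locations, and the objective sums flow times distance over all facility pairs. A minimal sketch with toy matrices (invented for illustration, not taken from the paper):

```python
from itertools import permutations

# QAP objective: permutation p assigns facility i to location p[i];
# total cost sums flow[i][j] * dist[p[i]][p[j]] over all facility pairs.
def qap_cost(flow, dist, p):
    n = len(p)
    return sum(flow[i][j] * dist[p[i]][p[j]]
               for i in range(n) for j in range(n))

flow = [[0, 3, 1],   # symmetric flow between facilities
        [3, 0, 2],
        [1, 2, 0]]
dist = [[0, 1, 4],   # symmetric distance between locations
        [1, 0, 2],
        [4, 2, 0]]

# Brute force works only for tiny instances; QAP is NP-hard in general,
# which is why metaheuristic frameworks such as the one above are used.
best = min(permutations(range(3)), key=lambda p: qap_cost(flow, dist, p))
```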
2015-06-19
reduces the flight hours over the positioning legs of the missions, resulting in a reduction of spending, an increase in flexibility, and a more effective...dedicated to the DV transport mission, then the optimal basing ratio would apply to nine aircraft and would result in utilizing the additional asset from...Scott AFB, where it is currently assigned. Operating the 2014 mission set optimally would have resulted in 299 fewer flight hours flown, realizing
Tolić, Nikola; Liu, Yina; Liyu, Andrey; Shen, Yufeng; Tfaily, Malak M; Kujawinski, Elizabeth B; Longnecker, Krista; Kuo, Li-Jung; Robinson, Errol W; Paša-Tolić, Ljiljana; Hess, Nancy J
2017-12-05
Ultrahigh resolution mass spectrometry, such as Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS), can resolve thousands of molecular ions in complex organic matrices. A Compound Identification Algorithm (CIA) was previously developed for automated elemental formula assignment for natural organic matter (NOM). In this work, we describe Formularity, a software tool with a user-friendly interface for the CIA function and a newly developed search function, the Isotopic Pattern Algorithm (IPA). While CIA assigns elemental formulas for compounds containing C, H, O, N, S, and P, IPA is capable of assigning formulas for compounds containing other elements. We used halogenated organic compounds (HOC), a chemical class that is ubiquitous in nature as well as in anthropogenic systems, as an example to demonstrate the capability of Formularity with IPA. A HOC standard mix was used to evaluate the identification confidence of IPA. Tap water and a HOC spike in Suwannee River NOM were used to assess HOC identification in complex environmental samples. Strategies for reconciliation of CIA and IPA assignments are discussed. Software and sample databases with documentation are freely available.
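As a rough illustration of automated formula assignment of the kind CIA performs (this is not the Formularity code; the element set, atom limits, and 5 ppm tolerance are assumptions): enumerate candidate CHO compositions and keep those whose exact monoisotopic mass matches the measured neutral mass within the tolerance.

```python
# Illustrative elemental-formula assignment (not the actual CIA/IPA code):
# enumerate small CHO compositions and keep those whose exact monoisotopic
# mass matches a measured neutral mass within an assumed 5 ppm tolerance.
MONOISOTOPIC = {"C": 12.0, "H": 1.007825, "O": 15.994915}

def assign_formulas(measured_mass, ppm=5.0, max_atoms=30):
    hits = []
    for c in range(1, max_atoms):
        for h in range(max_atoms):
            for o in range(10):
                mass = (c * MONOISOTOPIC["C"] + h * MONOISOTOPIC["H"]
                        + o * MONOISOTOPIC["O"])
                if abs(mass - measured_mass) / measured_mass * 1e6 <= ppm:
                    hits.append((c, h, o))   # (C, H, O) atom counts
    return hits

# Glucose (C6H12O6) has a neutral monoisotopic mass of ~180.06339 Da
matches = assign_formulas(180.06339)
```

At higher masses many compositions fall within the same tolerance window, which is why additional constraints (isotopic patterns, as in IPA, or chemical rules) are needed to pick the correct formula.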
Zhang, Xuejun; Lei, Jiaxing
2015-01-01
To reduce airspace congestion and flight delay simultaneously, this paper formulates the airway network flow assignment (ANFA) problem as a multiobjective optimization model and presents a new multiobjective optimization framework to solve it. Firstly, an effective multi-island parallel evolution algorithm with multiple evolution populations is employed to improve the optimization capability. Secondly, the nondominated sorting genetic algorithm II is applied for each population. In addition, a cooperative coevolution algorithm is adapted to divide the ANFA problem into several low-dimensional biobjective optimization problems which are easier to deal with. Finally, in order to maintain the diversity of solutions and to avoid prematurity, a dynamic adjustment operator based on solution congestion degree is specifically designed for the ANFA problem. Simulation results using real traffic data from the China air route network and daily flight plans demonstrate that the proposed approach can improve the solution quality effectively, showing superiority to existing approaches such as the multiobjective genetic algorithm, the well-known multiobjective evolutionary algorithm based on decomposition, and a cooperative coevolution multiobjective algorithm, as well as other parallel evolution algorithms with different migration topology. PMID:26180840
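The nondominated sorting at the heart of NSGA-II reduces to a Pareto-dominance test. A minimal sketch follows; the objective pairs are invented for illustration (think congestion vs. delay), not taken from the paper.

```python
# Minimal Pareto-dominance sketch (the core operation of NSGA-II's
# nondominated sorting); both objectives are minimized.
def dominates(a, b):
    """True if a is at least as good as b everywhere and better somewhere."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def first_front(points):
    """Points not dominated by any other point (the first Pareto front)."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# (congestion, delay) pairs for four candidate flow assignments
front = first_front([(1, 5), (2, 2), (3, 1), (4, 4)])  # (4, 4) is dominated
```

NSGA-II repeats this peeling to rank the whole population into fronts, then breaks ties within a front by crowding distance to preserve diversity.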
Simulation-based planning for theater air warfare
NASA Astrophysics Data System (ADS)
Popken, Douglas A.; Cox, Louis A., Jr.
2004-08-01
Planning for Theatre Air Warfare can be represented as a hierarchy of decisions. At the top level, surviving airframes must be assigned to roles (e.g., Air Defense, Counter Air, Close Air Support, and AAF Suppression) in each time period in response to changing enemy air defense capabilities, remaining targets, and roles of opposing aircraft. At the middle level, aircraft are allocated to specific targets to support their assigned roles. At the lowest level, routing and engagement decisions are made for individual missions. The decisions at each level form a set of time-sequenced Courses of Action taken by opposing forces. This paper introduces a set of simulation-based optimization heuristics operating within this planning hierarchy to optimize allocations of aircraft. The algorithms estimate distributions for stochastic outcomes of the pairs of Red/Blue decisions. Rather than using traditional stochastic dynamic programming to determine optimal strategies, we use an innovative combination of heuristics, simulation-optimization, and mathematical programming. Blue decisions are guided by a stochastic hill-climbing search algorithm while Red decisions are found by optimizing over a continuous representation of the decision space. Stochastic outcomes are then provided by fast, Lanchester-type attrition simulations. This paper summarizes preliminary results from top and middle level models.
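A stochastic hill-climbing search of the kind described for the Blue-side decisions can be sketched as follows. The payoff function, its noise model, and the airframe counts are invented for illustration; they stand in for the Lanchester-type attrition simulations, not for the paper's actual models.

```python
import random

# Stochastic hill-climbing sketch: perturb an allocation of 12 airframes
# across 4 roles and keep a neighbor when a noisy payoff estimate does not
# get worse. Payoff and noise are illustrative stand-ins for a simulation.
def hill_climb(payoff, alloc, steps=500, seed=0):
    rng = random.Random(seed)
    best = list(alloc)
    for _ in range(steps):
        i, j = rng.sample(range(len(best)), 2)
        if best[i] > 0:                  # move one airframe from role i to j
            neighbor = list(best)
            neighbor[i] -= 1
            neighbor[j] += 1
            if payoff(neighbor) >= payoff(best):
                best = neighbor
    return best

def payoff(alloc):
    # Diminishing returns per role, plus small simulation-like noise that is
    # deterministic per allocation so the search landscape is reproducible.
    noise_seed = sum(v * 31 ** k for k, v in enumerate(alloc))
    noise = random.Random(noise_seed).uniform(0.0, 0.01)
    return sum(v ** 0.5 for v in alloc) + noise

best = hill_climb(payoff, [12, 0, 0, 0])
```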
Using the structure-function linkage database to characterize functional domains in enzymes.
Brown, Shoshana; Babbitt, Patricia
2014-12-12
The Structure-Function Linkage Database (SFLD; http://sfld.rbvi.ucsf.edu/) is a Web-accessible database designed to link enzyme sequence, structure, and functional information. This unit describes the protocols by which a user may query the database to predict the function of uncharacterized enzymes and to correct misannotated functional assignments. The information in this unit is especially useful in helping a user discriminate functional capabilities of a sequence that is only distantly related to characterized sequences in publicly available databases. Copyright © 2014 John Wiley & Sons, Inc.
2012-12-01
trajectories in space, and are therefore very highly similar, and a cosine of 0 indicates that the two vectors are unrelated. The vector of a good summary...topic. The effectiveness of the AGS’s ability to automatically grade student assignments is completely dependent on a good match between this corpus...students to summarise “User Documents” that focused on fishing, then a good corpus would contain documents about the various types of fishing
Characterizing Deficiencies of Path-Based Routing for Wireless Multi-Hop Networks
2017-05-01
called a “hello”) to all of its neighbors. If a series of hello messages is exchanged between two users, a link is considered to exist between them...A brief description of ETX is as follows. For a given window of time, the number of hello packets that a user receives from a neighbor is counted. A...cost is then assigned to the link based on how many hello messages were heard; a link that has fewer hellos successfully transmitted across it will be
Optimizing the NASA Technical Report Server
NASA Technical Reports Server (NTRS)
Nelson, Michael L.; Maa, Ming-Hokng
1996-01-01
The NASA Technical Report Server (NTRS), a World Wide Web-based distribution service for NASA technical publications, is modified for performance enhancement, greater protocol support, and human interface optimization. Results include: parallel database queries, significantly decreasing user access times by an average factor of 2.3; access from clients behind firewalls and/or proxies which truncate excessively long Uniform Resource Locators (URLs); access to non-Wide Area Information Server (WAIS) databases and compatibility with the Z39.50 protocol; and a streamlined user interface.
TeleProbe: design and development of an efficient system for telepathology
NASA Astrophysics Data System (ADS)
Ahmed, Wamiq M.; Robinson, J. Paul; Ghafoor, Arif
2005-10-01
This paper describes an internet-based system for telepathology. This system provides support for multiple users and exploits the opportunities for optimization that arise in multi-user environment. Techniques for increasing system responsiveness by improving resource utilization and lowering network traffic are explored. Some of the proposed optimizations include an auto-focus module, client and server side caching, and request reordering. These systems can be an economic solution not only for remote pathology consultation but also for pathology and biology education.
NASA Astrophysics Data System (ADS)
Chaves-González, José M.; Vega-Rodríguez, Miguel A.; Gómez-Pulido, Juan A.; Sánchez-Pérez, Juan M.
2011-08-01
This article analyses the use of a novel parallel evolutionary strategy to solve complex optimization problems. The work developed here has been focused on a relevant real-world problem from the telecommunication domain to verify the effectiveness of the approach. The problem, known as the frequency assignment problem (FAP), basically consists of assigning a very small number of frequencies to a very large set of transceivers used in a cellular phone network. Real-data FAP instances are very difficult to solve due to the NP-hard nature of the problem; therefore, an efficient parallel approach which makes the most of different evolutionary strategies is a good way to obtain high-quality solutions in short periods of time. Specifically, a parallel hyper-heuristic based on several meta-heuristics has been developed. After a complete experimental evaluation, results prove that the proposed approach obtains very high-quality solutions for the FAP and surpasses all previously published results.
ROCOPT: A user friendly interactive code to optimize rocket structural components
NASA Technical Reports Server (NTRS)
Rule, William K.
1989-01-01
ROCOPT is a user-friendly, graphically interfaced, microcomputer-based computer program (IBM compatible) that optimizes rocket components by minimizing the structural weight. The rocket components considered are ring-stiffened truncated cones and cylinders. The applied loading is static and can consist of any combination of internal or external pressure, axial force, bending moment, and torque. Stress margins are calculated by means of simple closed-form strength-of-materials equations. Stability margins are determined by approximate, closed-form orthotropic-shell equations. A modified form of Powell's method, in conjunction with a modified form of the external penalty method, is used to determine the minimum weight of the structure subject to stress and stability margin constraints, as well as user-input constraints on the structural dimensions. The graphical interface guides the user through the required data prompts, explains program options, and graphically displays results for easy interpretation.
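The external (exterior) penalty method that ROCOPT pairs with Powell's method can be illustrated on a one-dimensional toy problem. This sketch is not ROCOPT itself; the objective and constraint are assumed: minimize a "weight" f(x) = x^2 subject to a margin constraint x >= 1, by minimizing f plus an increasing penalty on constraint violation.

```python
# Exterior-penalty sketch: the penalized objective adds r * violation**2,
# so minimizers are pushed toward feasibility as the weight r grows.
def penalized(x, r):
    violation = max(0.0, 1.0 - x)      # amount by which x >= 1 is violated
    return x * x + r * violation ** 2

def minimize_1d(f, lo=-5.0, hi=5.0, iters=200):
    # Ternary search; the penalized objective is convex, hence unimodal.
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if f(m1) < f(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2.0

x = 0.0
for r in (1.0, 10.0, 100.0, 1000.0):
    x = minimize_1d(lambda t: penalized(t, r))
# x approaches the constrained optimum x = 1 as r grows (here x = r / (1 + r))
```

ROCOPT works the same way in many dimensions: the unconstrained subproblems are solved by Powell's derivative-free method instead of the 1-D search used here.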
Carey, Michael P.; Senn, Theresa E.; Coury-Doniger, Patricia; Urban, Marguerite A.; Vanable, Peter A.; Carey, Kate B.
2013-01-01
Randomized controlled trials (RCTs) remain the gold standard for evaluating intervention efficacy but are often costly. To optimize their scientific yield, RCTs can be designed to investigate multiple research questions. This paper describes an RCT that used a modified Solomon four-group design to simultaneously evaluate two, theoretically-guided, health promotion interventions as well as assessment reactivity. Recruited participants (N = 1010; 56% male; 69% African American) were randomly assigned to one of four conditions formed by crossing two intervention conditions (i.e., general health promotion vs. sexual risk reduction intervention) with two assessment conditions (i.e., general health vs. sexual health survey). After completing their assigned baseline assessment, participants received the assigned intervention, and returned for follow-ups at 3, 6, 9, and 12 months. In this report, we summarize baseline data, which show high levels of sexual risk behavior; alcohol, marijuana, and tobacco use; and fast food consumption. Sexual risk behaviors and substance use were correlated. Participants reported high satisfaction with both interventions but ratings for the sexual risk reduction intervention were higher. Planned follow-up sessions, and subsequent analyses, will assess changes in health behaviors including sexual risk behaviors. This study design demonstrates one way to optimize the scientific yield of an RCT. PMID:23816489
Laffy, Patrick W.; Wood-Charlson, Elisha M.; Turaev, Dmitrij; Weynberg, Karen D.; Botté, Emmanuelle S.; van Oppen, Madeleine J. H.; Webster, Nicole S.; Rattei, Thomas
2016-01-01
Abundant bioinformatics resources are available for the study of complex microbial metagenomes; however, their utility in viral metagenomics is limited. HoloVir is a robust and flexible data analysis pipeline that provides an optimized and validated workflow for taxonomic and functional characterization of viral metagenomes derived from invertebrate holobionts. Simulated viral metagenomes comprising varying levels of viral diversity and abundance were used to determine the optimal assembly and gene prediction strategy, and multiple sequence assembly methods and gene prediction tools were tested in order to optimize our analysis workflow. HoloVir performs pairwise comparisons of single read and predicted gene datasets against the viral RefSeq database to assign taxonomy, and additional comparison to phage-specific and cellular markers is undertaken to support the taxonomic assignments and identify potential cellular contamination. Broad functional classification of the predicted genes is provided by assignment of COG microbial functional category classifications using EggNOG, and higher resolution functional analysis is achieved by searching for enrichment of specific Swiss-Prot keywords within the viral metagenome. Application of HoloVir to viral metagenomes from the coral Pocillopora damicornis and the sponge Rhopaloeides odorabile demonstrated that HoloVir provides a valuable tool to characterize holobiont viral communities across species, environments, or experiments. PMID:27375564
MOOville: The Writing Project's Own "Private Idaho".
ERIC Educational Resources Information Center
Conlon, Michael
1997-01-01
Describes how a computerized environment supplemented traditional undergraduate courses in English literature and composition at the University of Florida, and was developed with a grant from IBM. Highlights include the use of MOO (multi-user, object-oriented) space; student assignments; the client-server setting; and student and teacher…
Experimental Optimization of Exposure Index and Quality of Service in Wlan Networks.
Plets, David; Vermeeren, Günter; De Poorter, Eli; Moerman, Ingrid; Goudos, Sotirios K; Martens, Luc; Joseph, Wout
2017-07-01
This paper presents the first real-life optimization of the Exposure Index (EI). A genetic optimization algorithm is developed and applied to three real-life Wireless Local Area Network scenarios in an experimental testbed. The optimization accounts for downlink, uplink and uplink of other users, for realistic duty cycles, and ensures a sufficient Quality of Service to all users. EI reductions up to 97.5% compared to a reference configuration can be achieved in a downlink-only scenario, in combination with an improved Quality of Service. Due to the dominance of uplink exposure and the lack of WiFi power control, no optimizations are possible in scenarios that also consider uplink traffic. However, future deployments that do implement WiFi power control can be successfully optimized, with EI reductions up to 86% compared to a reference configuration and an EI that is 278 times lower than optimized configurations under the absence of power control.
NASA Technical Reports Server (NTRS)
Rash, James L.
2010-01-01
NASA's space data-communications infrastructure, the Space Network and the Ground Network, provide scheduled (as well as some limited types of unscheduled) data-communications services to user spacecraft via orbiting relay satellites and ground stations. An implementation of the methods and algorithms disclosed herein will be a system that produces globally optimized schedules with not only optimized service delivery by the space data-communications infrastructure but also optimized satisfaction of all user requirements and prescribed constraints, including radio frequency interference (RFI) constraints. Evolutionary search, a class of probabilistic strategies for searching large solution spaces, constitutes the essential technology in this disclosure. Also disclosed are methods and algorithms for optimizing the execution efficiency of the schedule-generation algorithm itself. The scheduling methods and algorithms as presented are adaptable to accommodate the complexity of scheduling the civilian and/or military data-communications infrastructure. Finally, the problem itself, and the methods and algorithms, are generalized and specified formally, with applicability to a very broad class of combinatorial optimization problems.
NASA Astrophysics Data System (ADS)
Shirazi, Abolfazl
2016-10-01
This article introduces a new method to optimize finite-burn orbital manoeuvres based on a modified evolutionary algorithm. Optimization is carried out based on conversion of the orbital manoeuvre into a parameter optimization problem by assigning inverse tangential functions to the changes in direction angles of the thrust vector. The problem is analysed using boundary delimitation in a common optimization algorithm. A method is introduced to achieve acceptable values for optimization variables using nonlinear simulation, which results in an enlarged convergence domain. The presented algorithm achieves high-quality solutions with fast convergence. A numerical example of a three-dimensional optimal orbital transfer is presented and the accuracy of the proposed algorithm is shown.
An A Priori Multiobjective Optimization Model of a Search and Rescue Network
1992-03-01
sequences. Classical sensitivity analysis and tolerance analysis were used to analyze the frequency assignments generated by the different weight...function for excess coverage of a frequency. Sensitivity analysis is used to investigate the robustness of the frequency assignments produced by the...interest. The linear program solution is used to produce classical sensitivity analysis for the weight ranges.
ERIC Educational Resources Information Center
Carrell, Scott E.; Sacerdote, Bruce I.; West, James E.
2011-01-01
We take cohorts of entering freshmen at the United States Air Force Academy and assign half to peer groups with the goal of maximizing the academic performance of the lowest ability students. Our assignment algorithm uses peer effects estimates from the observational data. We find a negative and significant treatment effect for the students we…
2006-12-01
APPROACH As mentioned previously, ASCU does not use simulation in the traditional manner. Instead, it uses simulation to transition and capture the state...0 otherwise (by a heuristic discussed below). • Let cja = The reward for a UAV with sensor package j being assigned to mission area a from the
Optimal block cosine transform image coding for noisy channels
NASA Technical Reports Server (NTRS)
Vaishampayan, V.; Farvardin, N.
1986-01-01
The two-dimensional block transform coding scheme based on the discrete cosine transform was studied extensively for image coding applications. While this scheme has proven to be efficient in the absence of channel errors, its performance degrades rapidly over noisy channels. A method is presented for the joint source-channel coding optimization of a scheme based on the 2-D block cosine transform when the output of the encoder is to be transmitted over a memoryless noisy channel. The algorithm involves the design of the quantizers used for encoding the transform coefficients, and produces a set of locally optimum quantizers and the corresponding binary code assignment for the assumed transform coefficient statistics. To determine the optimum bit assignment among the transform coefficients, an algorithm was used based on the steepest descent method, which, under certain convexity conditions on the performance of the channel-optimized quantizers, yields the optimal bit allocation. Comprehensive simulation results for the performance of this locally optimum system over noisy channels were obtained, and appropriate comparisons against a reference system designed for no channel errors were made.
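A common way to realize bit assignment among transform coefficients is a greedy marginal-return allocation, shown below as an illustrative stand-in for the steepest-descent algorithm of the paper (the distortion model var * 2^(-2b) and the coefficient variances are assumptions, not the paper's data).

```python
# Greedy marginal-return bit allocation: each added bit roughly quarters a
# coefficient's quantization distortion, D_i = var_i * 2**(-2 * b_i), so we
# repeatedly give the next bit to the coefficient with the largest drop.
def allocate_bits(variances, total_bits):
    bits = [0] * len(variances)
    for _ in range(total_bits):
        # distortion reduction if coefficient i receives one more bit
        gain = [v * (2.0 ** (-2 * b) - 2.0 ** (-2 * (b + 1)))
                for v, b in zip(variances, bits)]
        i = max(range(len(gain)), key=gain.__getitem__)
        bits[i] += 1
    return bits

# Hypothetical coefficient variances; high-variance (low-frequency)
# coefficients end up with more bits, as expected for DCT image coding.
allocation = allocate_bits([16.0, 4.0, 1.0, 1.0], 6)
```

Under the convexity conditions mentioned above, this greedy rule and the steepest-descent formulation reach the same allocation.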
Agreement Technologies for Energy Optimization at Home.
González-Briones, Alfonso; Chamoso, Pablo; De La Prieta, Fernando; Demazeau, Yves; Corchado, Juan M
2018-05-19
Nowadays, it is becoming increasingly common to deploy sensors in public buildings or homes with the aim of obtaining data from the environment and taking decisions that help to save energy. Many of the current state-of-the-art systems make decisions considering solely the environmental factors that cause the consumption of energy. These systems are successful at optimizing energy consumption; however, they do not adapt to the preferences of users and their comfort. Any system that is to be used by end-users should consider factors that affect their wellbeing. Thus, this article proposes an energy-saving system, which apart from considering the environmental conditions also adapts to the preferences of inhabitants. The architecture is based on a Multi-Agent System (MAS), its agents use Agreement Technologies (AT) to perform a negotiation process between the comfort preferences of the users and the degree of optimization that the system can achieve according to these preferences. A case study was conducted in an office building, showing that the proposed system achieved average energy savings of 17.15%.
Differential-Evolution Control Parameter Optimization for Unmanned Aerial Vehicle Path Planning
Kok, Kai Yit; Rajendran, Parvathy
2016-01-01
The differential evolution algorithm has been widely applied to unmanned aerial vehicle (UAV) path planning. The algorithm has four tuning parameters, namely, population size, differential weight, crossover rate, and generation number. These tuning parameters must be set, together with a user-defined weighting between path quality and computational cost. However, the optimum settings of these tuning parameters vary with the application. Instead of relying on trial and error, this paper presents a method for optimizing the tuning parameters of the differential evolution algorithm for UAV path planning. The parameters that this research focuses on are population size, differential weight, crossover rate, and generation number. The developed algorithm enables the user to simply define the desired weighting between path quality and computational cost and converges within the minimum number of generations required. In conclusion, the proposed tuning-parameter optimization of the differential evolution algorithm for UAV path planning expedites convergence and improves the final path and computational cost. PMID:26943630
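For concreteness, a minimal DE/rand/1/bin sketch showing the four tuning parameters discussed above (population size NP, differential weight F, crossover rate CR, and generation number). The sphere test function and parameter values are illustrative, not the paper's settings.

```python
import random

# Minimal DE/rand/1/bin: mutate with a scaled difference of two random
# members, binomially cross over with the current member, keep the better.
def differential_evolution(f, dim, bounds, NP=20, F=0.6, CR=0.9,
                           generations=200, seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(NP)]
    for _ in range(generations):
        for i in range(NP):
            a, b, c = rng.sample([k for k in range(NP) if k != i], 3)
            j_rand = rng.randrange(dim)          # guarantee one mutated gene
            trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                     if (rng.random() < CR or d == j_rand) else pop[i][d]
                     for d in range(dim)]
            if f(trial) <= f(pop[i]):            # greedy selection
                pop[i] = trial
    return min(pop, key=f)

sphere = lambda x: sum(v * v for v in x)
best = differential_evolution(sphere, dim=3, bounds=(-5.0, 5.0))
```

Changing NP, F, CR, or the generation count directly trades solution quality against function evaluations, which is exactly the trade-off the paper's tuning method automates.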
NASA Astrophysics Data System (ADS)
Wisittipanit, Nuttachat; Wisittipanich, Warisa
2018-07-01
Demand response (DR) refers to changes in the electricity use patterns of end-users in response to incentive payment designed to prompt lower electricity use during peak periods. Typically, there are three players in the DR system: an electric utility operator, a set of aggregators and a set of end-users. The DR model used in this study aims to minimize the operator's operational cost and offer rewards to aggregators, while profit-maximizing aggregators compete to sell DR services to the operator and provide compensation to end-users for altering their consumption profiles. This article presents the first application of two metaheuristics in the DR system: particle swarm optimization (PSO) and differential evolution (DE). The objective is to optimize the incentive payments during various periods to satisfy all stakeholders. The results show that DE significantly outperforms PSO, since it can attain better compensation rates, lower operational costs and higher aggregator profits.
A flexible, interactive software tool for fitting the parameters of neuronal models.
Friedrich, Péter; Vella, Michael; Gulyás, Attila I; Freund, Tamás F; Káli, Szabolcs
2014-01-01
The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problems of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential integrate-and-fire) neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers equally good or better performance than any other existing neuronal model fitting tool.
Statistical Learning of Origin-Specific Statically Optimal Individualized Treatment Rules
van der Laan, Mark J.; Petersen, Maya L.
2008-01-01
Consider a longitudinal observational or controlled study in which one collects chronological data over time on a random sample of subjects. The time-dependent process one observes on each subject contains time-dependent covariates, time-dependent treatment actions, and an outcome process or single final outcome of interest. A statically optimal individualized treatment rule (as introduced in van der Laan et al. (2005) and Petersen et al. (2007)) is a treatment rule which at any point in time conditions on a user-supplied subset of the past, computes the future static treatment regimen that maximizes a (conditional) mean future outcome of interest, and applies the first treatment action of the latter regimen. In particular, Petersen et al. (2007) clarified that, in order to be statically optimal, an individualized treatment rule should not depend on the observed treatment mechanism. Petersen et al. (2007) further developed estimators of statically optimal individualized treatment rules based on a past capturing all confounding of past treatment history on outcome. In practice, however, one typically wishes to find individualized treatment rules responding to a user-supplied subset of the complete observed history, which may not be sufficient to capture all confounding. The current article provides an important advance on Petersen et al. (2007) by developing locally efficient double robust estimators of statically optimal individualized treatment rules responding to such a user-supplied subset of the past. However, failure to capture all confounding comes at a price; the static optimality of the resulting rules becomes origin-specific. We explain origin-specific static optimality, and discuss the practical importance of the proposed methodology. We further present the results of a data analysis in which we estimate a statically optimal rule for switching antiretroviral therapy among patients infected with drug-resistant HIV. PMID:19122792
A flexible, interactive software tool for fitting the parameters of neuronal models
Friedrich, Péter; Vella, Michael; Gulyás, Attila I.; Freund, Tamás F.; Káli, Szabolcs
2014-01-01
The construction of biologically relevant neuronal models as well as model-based analysis of experimental data often requires the simultaneous fitting of multiple model parameters, so that the behavior of the model in a certain paradigm matches (as closely as possible) the corresponding output of a real neuron according to some predefined criterion. Although the task of model optimization is often computationally hard, and the quality of the results depends heavily on technical issues such as the appropriate choice (and implementation) of cost functions and optimization algorithms, no existing program provides access to the best available methods while also guiding the user through the process effectively. Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. Meanwhile, educated users can extend the capabilities of the program and customize it according to their needs with relatively little effort. Optimizer has been developed in Python, takes advantage of open-source Python modules for nonlinear optimization, and interfaces directly with the NEURON simulator to run the models. Other simulators are supported through an external interface. We have tested the program on several different types of problems of varying complexity, using different model classes. As targets, we used simulated traces from the same or a more complex model class, as well as experimental data. We successfully used Optimizer to determine passive parameters and conductance densities in compartmental models, and to fit simple (adaptive exponential integrate-and-fire) neuronal models to complex biological data. Our detailed comparisons show that Optimizer can handle a wider range of problems, and delivers performance equal to or better than that of any other existing neuronal model fitting tool. PMID:25071540
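The kind of trace-based model fitting described above can be sketched with a toy model and an off-the-shelf global optimizer. The exponential-decay "neuron", the parameter bounds, and the mean-squared-error cost below are illustrative assumptions, not Optimizer's actual internals.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Toy "neuron": an exponentially decaying voltage trace with unknown
# amplitude and time constant (hypothetical stand-ins for model parameters).
t = np.linspace(0.0, 50.0, 200)

def simulate(amp, tau):
    return amp * np.exp(-t / tau)

target = simulate(amp=2.0, tau=10.0)          # synthetic "experimental" trace

def cost(params):
    # Mean squared error between simulated and target traces:
    # the kind of trace-based cost function such tools minimize.
    return np.mean((simulate(*params) - target) ** 2)

# Global search within bounds on (amp, tau), no user-supplied starting point.
result = differential_evolution(cost, bounds=[(0.1, 5.0), (1.0, 50.0)], seed=1)
amp_fit, tau_fit = result.x
```

On this noiseless toy problem the global optimizer recovers the generating parameters without any manual initialization, which is the workflow advantage the abstract highlights.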
Zhang, Dezhi; Li, Shuangyan
2014-01-01
This paper proposes a new model of simultaneous optimization of three-level logistics decisions, for logistics authorities, logistics operators, and logistics users, for a regional logistics network with environmental impact considerations. The proposed model addresses the interaction among the three logistics players in a completely competitive logistics service market with CO2 emission charges. We also explicitly incorporate the impacts of the scale economies of the logistics park and the logistics users' demand elasticity into the model. The logistics authorities aim to maximize the total social welfare of the system, considering the demand for green logistics development through two different methods: optimal location of logistics nodes and charging a CO2 emission tax. Logistics operators are assumed to compete on logistics service fare and frequency, while logistics users minimize their own perceived logistics disutility given logistics operators' service fare and frequency. A heuristic algorithm based on the multinomial logit model is presented for the three-level decision model, and a numerical example is given to illustrate the above optimal model and its algorithm. The proposed model provides a useful tool for modeling competitive logistics services and evaluating logistics policies at the strategic level. PMID:24977209
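The multinomial logit model that underlies the heuristic algorithm can be sketched as follows; the disutility values and the sensitivity parameter are hypothetical illustration data, not figures from the paper.

```python
import numpy as np

def logit_shares(disutility, theta=1.0):
    """Multinomial logit choice probabilities over alternatives:
    lower perceived disutility -> higher share; theta scales sensitivity."""
    u = -theta * np.asarray(disutility, dtype=float)
    e = np.exp(u - u.max())            # subtract max for numerical stability
    return e / e.sum()

# Hypothetical perceived disutilities (fare plus time cost) for three operators.
shares = logit_shares([12.0, 10.0, 15.0], theta=0.5)
```

The operator with the lowest perceived disutility captures the largest share, and the shares always sum to one, which is what lets the logit model drive demand assignment inside the heuristic.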
Zhang, Dezhi; Li, Shuangyan; Qin, Jin
2014-01-01
This paper proposes a new model of simultaneous optimization of three-level logistics decisions, for logistics authorities, logistics operators, and logistics users, for a regional logistics network with environmental impact considerations. The proposed model addresses the interaction among the three logistics players in a completely competitive logistics service market with CO2 emission charges. We also explicitly incorporate the impacts of the scale economies of the logistics park and the logistics users' demand elasticity into the model. The logistics authorities aim to maximize the total social welfare of the system, considering the demand for green logistics development through two different methods: optimal location of logistics nodes and charging a CO2 emission tax. Logistics operators are assumed to compete on logistics service fare and frequency, while logistics users minimize their own perceived logistics disutility given logistics operators' service fare and frequency. A heuristic algorithm based on the multinomial logit model is presented for the three-level decision model, and a numerical example is given to illustrate the above optimal model and its algorithm. The proposed model provides a useful tool for modeling competitive logistics services and evaluating logistics policies at the strategic level.
Practices in NASA's EOSDIS to Promote Open Data and Research Integrity
NASA Astrophysics Data System (ADS)
Behnke, J.; Ramapriyan, H.
2017-12-01
The purpose of this paper is to highlight the key practices adopted by NASA in its Earth Observing System Data and Information System (EOSDIS) to promote and facilitate open data and research integrity. EOSDIS is the system that manages most of NASA's Earth science data from various sources - satellites, aircraft, field campaigns and some research projects. Since its inception in 1990 as a part of the Earth Observing System (EOS) Program, EOSDIS has been following NASA's free and open data and information policy, whereby data are shared with all users on a non-discriminatory basis and are provided at no cost. To ensure that the data are discoverable and accessible to the user community, NASA follows an evolutionary development approach, whereby the latest technologies that can be practically adopted are infused into EOSDIS. This results in continuous improvements in system capabilities such that technologies that users are accustomed to in other environments are brought to bear in their access to NASA's Earth observation data. Mechanisms have existed for ensuring that the data products offered by EOSDIS are vetted by the community before they are released. Information about data products, such as Algorithm Theoretical Basis Documents and quality assessments, is openly available with the products. The EOSDIS Distributed Active Archive Centers (DAACs) work with the science teams responsible for product generation to assist with proper use of metadata. The DAACs have knowledgeable staff to answer users' questions and have access to scientific experts as needed. Citation of data products in scientific papers is facilitated by the assignment of Digital Object Identifiers (DOIs); at present, over 50% of data products in EOSDIS have been assigned DOIs. NASA gathers and publishes citation metrics for the datasets offered by the DAACs.
Through its Software and Services Citations Working Group, NASA is currently investigating broadening DOI assignments to promote greater provenance traceability. NASA has developed Preservation Content Specifications for Earth science data to ensure that provenance and context are captured and preserved for the future and is applying them to data and information from its missions. All these actions promote availability of information to promote integrity in scientific research.
Information Switching Processor (ISP) contention analysis and control
NASA Technical Reports Server (NTRS)
Shyy, D.; Inukai, T.
1993-01-01
Future satellite communications, as a viable means of communications and an alternative to terrestrial networks, demand flexibility and low end-user cost. On-board switching/processing satellites potentially provide these features, allowing flexible interconnection among multiple spot beams, direct-to-user communications services using very small aperture terminals (VSATs), independent uplink and downlink access/transmission system designs optimized to users' traffic requirements, efficient TDM downlink transmission, and better link performance. A flexible switching system on the satellite, in conjunction with low-cost user terminals, will likely benefit future satellite network users.
Dusseldorp, Elise; Doove, Lisa; Mechelen, Iven van
2016-06-01
In the analysis of randomized controlled trials (RCTs), treatment effect heterogeneity often occurs, implying differences across (subgroups of) clients in treatment efficacy. This phenomenon is typically referred to as treatment-subgroup interactions. The identification of subgroups of clients, defined in terms of pretreatment characteristics that are involved in a treatment-subgroup interaction, is a methodologically challenging task, especially when many characteristics are available that may interact with treatment and when no comprehensive a priori hypotheses on relevant subgroups are available. A special type of treatment-subgroup interaction occurs if the ranking of treatment alternatives in terms of efficacy differs across subgroups of clients (e.g., for one subgroup treatment A is better than B and for another subgroup treatment B is better than A). These are called qualitative treatment-subgroup interactions and are most important for optimal treatment assignment. The method QUINT (Qualitative INteraction Trees) was recently proposed to induce subgroups involved in such interactions from RCT data. The result of an analysis with QUINT is a binary tree from which treatment assignment criteria can be derived. The implementation of this method, the R package quint, is the topic of this paper. The analysis process is described step-by-step using data from the Breast Cancer Recovery Project, showing the reader all functions included in the package. The output is explained and given a substantive interpretation. Furthermore, an overview is given of the tuning parameters involved in the analysis, along with possible motivational concerns associated with choice alternatives that are available to the user.
The PMDB Protein Model Database
Castrignanò, Tiziana; De Meo, Paolo D'Onorio; Cozzetto, Domenico; Talamo, Ivano Giuseppe; Tramontano, Anna
2006-01-01
The Protein Model Database (PMDB) is a public resource aimed at storing manually built 3D models of proteins. The database is designed to provide access to models published in the scientific literature, together with validating experimental data. It is a relational database and it currently contains >74 000 models for ∼240 proteins. The system is accessible at and allows predictors to submit models along with related supporting evidence and users to download them through a simple and intuitive interface. Users can navigate in the database and retrieve models referring to the same target protein or to different regions of the same protein. Each model is assigned a unique identifier that allows interested users to directly access the data. PMID:16381873
Leveraging Multiactions to Improve Medical Personalized Ranking for Collaborative Filtering.
Gao, Shan; Guo, Guibing; Li, Runzhi; Wang, Zongmin
2017-01-01
Nowadays, providing high-quality recommendation services to users is an essential component in web applications, including shopping, making friends, and healthcare. This can be regarded either as a problem of estimating users' preferences by exploiting explicit feedback (numerical ratings), or as a problem of collaborative ranking with implicit feedback (e.g., purchases, views, and clicks). Previous works for solving this issue include pointwise regression methods and pairwise ranking methods. The emerging healthcare websites and online medical databases impose a new challenge for medical service recommendation. In this paper, we develop a model, MBPR (Medical Bayesian Personalized Ranking over multiple users' actions), based on the simple observation that users tend to assign higher ranks to certain kinds of healthcare services that are meanwhile preferred in users' other actions. Experimental results on real-world datasets demonstrate that MBPR achieves more accurate recommendations than several state-of-the-art methods and shows its generality and scalability via experiments on the datasets from one mobile shopping app.
Leveraging Multiactions to Improve Medical Personalized Ranking for Collaborative Filtering
2017-01-01
Nowadays, providing high-quality recommendation services to users is an essential component in web applications, including shopping, making friends, and healthcare. This can be regarded either as a problem of estimating users' preferences by exploiting explicit feedback (numerical ratings), or as a problem of collaborative ranking with implicit feedback (e.g., purchases, views, and clicks). Previous works for solving this issue include pointwise regression methods and pairwise ranking methods. The emerging healthcare websites and online medical databases impose a new challenge for medical service recommendation. In this paper, we develop a model, MBPR (Medical Bayesian Personalized Ranking over multiple users' actions), based on the simple observation that users tend to assign higher ranks to certain kinds of healthcare services that are meanwhile preferred in users' other actions. Experimental results on real-world datasets demonstrate that MBPR achieves more accurate recommendations than several state-of-the-art methods and shows its generality and scalability via experiments on the datasets from one mobile shopping app. PMID:29118963
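The pairwise ranking idea behind Bayesian Personalized Ranking, which MBPR extends, can be sketched with a minimal matrix-factorization implementation. The latent dimensions, learning rate, and toy implicit-feedback data are assumptions for illustration; MBPR's multi-action extension is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, k = 5, 8, 4
U = 0.1 * rng.normal(size=(n_users, k))   # user latent factors
V = 0.1 * rng.normal(size=(n_items, k))   # item latent factors

# Toy implicit feedback: each user has one observed (preferred) item.
observed = {u: u % n_items for u in range(n_users)}

lr, reg = 0.05, 0.01
for _ in range(2000):
    u = int(rng.integers(n_users))
    i = observed[u]                        # positive item
    j = int(rng.integers(n_items))         # sampled negative item
    if j == i:
        continue
    x = U[u] @ (V[i] - V[j])               # preference gap score
    g = 1.0 / (1.0 + np.exp(x))            # gradient weight: sigmoid(-x)
    # Stochastic gradient ascent on ln sigmoid(x) with L2 regularization.
    U[u] += lr * (g * (V[i] - V[j]) - reg * U[u])
    V[i] += lr * (g * U[u] - reg * V[i])
    V[j] += lr * (-g * U[u] - reg * V[j])

scores = U @ V.T                            # higher score = ranked higher
```

After training, each user's observed item is scored well above that user's average item, i.e., the model has learned to rank observed (preferred) items above sampled negatives.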
Optimizing diffusion of an online computer tailored lifestyle program: a study protocol.
Schneider, Francine; van Osch, Liesbeth A D M; Kremers, Stef P J; Schulz, Daniela N; van Adrichem, Mathieu J G; de Vries, Hein
2011-06-20
Although the Internet is a promising medium to offer lifestyle interventions to large amounts of people at relatively low costs and effort, actual exposure rates of these interventions fail to meet the high expectations. Since the public health impact of interventions is determined by intervention efficacy and level of exposure to the intervention, it is imperative to put effort into optimal dissemination. The present project attempts to optimize the dissemination process of a new online computer tailored generic lifestyle program by carefully studying the adoption process and developing a strategy to achieve sustained use of the program. A prospective study will be conducted to yield relevant information concerning the adoption process by studying the level of adoption of the program, determinants involved in adoption and characteristics of adopters and non-adopters as well as satisfied and unsatisfied users. Furthermore, a randomized controlled trial will be conducted to test the effectiveness of a proactive strategy using periodic e-mail prompts in optimizing sustained use of the new program. Closely mapping the adoption process will gain insight in characteristics of adopters and non-adopters and satisfied and unsatisfied users. This insight can be used to further optimize the program by making it more suitable for a wider range of users, or to develop adjusted interventions to attract subgroups of users that are not reached or satisfied with the initial intervention. Furthermore, by studying the effect of a proactive strategy using periodic prompts compared to a reactive strategy to stimulate sustained use of the intervention and, possibly, behaviour change, specific recommendations on the use and the application of prompts in online lifestyle interventions can be developed. Dutch Trial Register NTR1786 and Medical Ethics Committee of Maastricht University and the University Hospital Maastricht (NL2723506809/MEC0903016).
ERIC Educational Resources Information Center
Rizvi, Rubina Fatima
2017-01-01
Despite high Electronic Health Record (EHR) system adoption rates by hospital and office-based practices, many users remain highly dissatisfied with the current state of EHRs. Sub-optimal EHR usability resulting from insufficient incorporation of the User-Centered Design (UCD) approach during the System Development Life Cycle (SDLC) process is considered…
Recreational System Optimization to Reduce Conflict on Public Lands
NASA Astrophysics Data System (ADS)
Shilling, Fraser; Boggs, Jennifer; Reed, Sarah
2012-09-01
In response to federal administrative rule, the Tahoe National Forest (TNF), California, USA, engaged in trail-route prioritization for motorized recreation (e.g., off-highway-vehicles) and other recreation types. The prioritization was intended to identify routes that were suitable and ill-suited for maintenance in a transportation system. A recreational user survey was conducted online (n = 813) for user preferences for trail system characteristics, recreational use patterns, and demographics. Motorized trail users and non-motorized users displayed very clear and contrasting preferences for the same system. As has been found by previous investigators, non-motorized users expressed antagonism to motorized use on the same recreational travel system, whereas motorized users either supported multiple-use routes or dismissed non-motorized recreationists' concerns. To help the TNF plan for reduced conflict, a geographic information system (GIS) based modeling approach was used to identify recreational opportunities and potential environmental impacts of all travel routes. This GIS-based approach was based on an expert-derived rule set. The rules addressed particular environmental and recreation concerns in the TNF. Route segments were identified that could be incorporated into minimal-impact networks to support various types of recreation. The combination of potential impacts and user-benefits supported an optimization approach for an appropriate recreational travel network to minimize environmental impacts and user-conflicts in a multi-purpose system.
Recreational system optimization to reduce conflict on public lands.
Shilling, Fraser; Boggs, Jennifer; Reed, Sarah
2012-09-01
In response to federal administrative rule, the Tahoe National Forest (TNF), California, USA engaged in trail-route prioritization for motorized recreation (e.g., off-highway-vehicles) and other recreation types. The prioritization was intended to identify routes that were suitable and ill-suited for maintenance in a transportation system. A recreational user survey was conducted online (n = 813) for user preferences for trail system characteristics, recreational use patterns, and demographics. Motorized trail users and non-motorized users displayed very clear and contrasting preferences for the same system. As has been found by previous investigators, non-motorized users expressed antagonism to motorized use on the same recreational travel system, whereas motorized users either supported multiple-use routes or dismissed non-motorized recreationists' concerns. To help the TNF plan for reduced conflict, a geographic information system (GIS) based modeling approach was used to identify recreational opportunities and potential environmental impacts of all travel routes. This GIS-based approach was based on an expert-derived rule set. The rules addressed particular environmental and recreation concerns in the TNF. Route segments were identified that could be incorporated into minimal-impact networks to support various types of recreation. The combination of potential impacts and user-benefits supported an optimization approach for an appropriate recreational travel network to minimize environmental impacts and user-conflicts in a multi-purpose system.
A Geographic Optimization Approach to Coast Guard Ship Basing
2015-06-01
information found an optimal result for partitioning. Carlsson applies the travelling salesman problem (tries to find the shortest path to visit a list of... This thesis studies the problem of finding efficient ship base locations, area of operations (AO) among bases, and ship assignments... for a coast guard (CG) organization. This problem is faced by many CGs around the world and is motivated by the need to optimize operational outcomes
Autonomous Modelling of X-ray Spectra Using Robust Global Optimization Methods
NASA Astrophysics Data System (ADS)
Rogers, Adam; Safi-Harb, Samar; Fiege, Jason
2015-08-01
The standard approach to model fitting in X-ray astronomy is by means of local optimization methods. However, these local optimizers suffer from a number of problems, such as a tendency for the fit parameters to become trapped in local minima, and can require an involved process of detailed user intervention to guide them through the optimization process. In this work we introduce a general GUI-driven global optimization method for fitting models to X-ray data, written in MATLAB, which searches for optimal models with minimal user interaction. We directly interface with the commonly used XSPEC libraries to access the full complement of pre-existing spectral models that describe a wide range of physics appropriate for modelling astrophysical sources, including supernova remnants and compact objects. Our algorithm is powered by the Ferret genetic algorithm and Locust particle swarm optimizer from the Qubist Global Optimization Toolbox, which are robust at finding families of solutions and identifying degeneracies. This technique will be particularly instrumental for multi-parameter models and high-fidelity data. In this presentation, we provide details of the code and use our techniques to analyze X-ray data obtained from a variety of astrophysical sources.
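A minimal genetic algorithm of the kind such global optimization toolboxes provide can be sketched on a toy spectral model. The power-law-plus-emission-line model, population settings, and mutation scale below are illustrative choices, not the Ferret algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(2)
E = np.linspace(0.5, 10.0, 300)                     # toy energy grid (keV)

def model(norm, line_e):
    """Toy spectrum: power law plus a Gaussian emission line."""
    return norm * E ** -1.5 + 0.5 * np.exp(-((E - line_e) ** 2) / 0.1)

data = model(3.0, 6.4)                              # synthetic "observation"

def fitness(p):
    return -np.sum((model(*p) - data) ** 2)         # higher is better

# Minimal generational GA: tournament selection plus Gaussian mutation.
pop = np.column_stack([rng.uniform(0.1, 10, 40),    # norm candidates
                       rng.uniform(1, 9, 40)])      # line-energy candidates
for _ in range(60):
    scores = np.array([fitness(p) for p in pop])
    winners = [max(rng.integers(40, size=2), key=lambda i: scores[i])
               for _ in range(40)]
    pop = pop[winners] + rng.normal(scale=0.05, size=(40, 2))

best = max(pop, key=fitness)
```

Because the whole population explores the bounded parameter space at once, no starting guess is needed and the search is far less prone to the local-minimum trapping described above; the best individual ends up near the generating parameters.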
Rationalities of Collaboration for Language Learning in a Wiki
ERIC Educational Resources Information Center
Bradley, Linda; Lindstrom, Berner; Rystedt, Hans
2010-01-01
For language learning, online environments allowing for user generated content are becoming increasingly important since they offer possibilities for learners to elaborate on assignments and projects. This study investigates what wikis can do as a means to enhance group interaction, when students are encouraged to participate in constructing text…
Library Instruction Handbook. Revised Edition.
ERIC Educational Resources Information Center
Marsh, Sheila; Goff, Linda
The purpose of this handbook is to familiarize the user with the organization of the California State University-Sacramento (CSUS) library. As part of a course requirement, the handbook is designed to provide the student with information necessary to perform assignments in the library independently of librarians and support staff, and as a…
Effects of Behavioral and Pharmacological Treatment on Smokeless Tobacco Users.
ERIC Educational Resources Information Center
Hatsukami, Dorothy; And Others
1996-01-01
Examined the effects of 2 mg of nicotine polacrilex versus placebo gum and a group behavioral treatment versus minimal contact on cessation of smokeless tobacco use. Participants (n=210) were randomly assigned 1 of the 4 treatment conditions. Withdrawal symptoms were assessed throughout the treatment. Discusses findings. (KW)
78 FR 68813 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-15
... planned radio frequency (RF) bands that are shared on a co-primary basis by Federal and non-Federal users... newly proposed assignment within the shared portions of the radio spectrum; and replaced the manual RF... national security. The Web-based system replaced a manual process where coordination and approval could...
Wikipedia in Promoting Science Literary Skills in Primary Schools
ERIC Educational Resources Information Center
Menon, Sunitha; Alias, Norlidah; DeWitt, Dorothy
2014-01-01
In learning Science, online environments allowing for user generated content are becoming increasingly important since they offer possibilities for learners to elaborate on assignments and projects. This study investigates how Wikipedia can serve as a means for enhancing science literary skills when students are encouraged to participate in…
Modelling Cognitive Style in a Peer Help Network.
ERIC Educational Resources Information Center
Bull, Susan; McCalla, Gord
2002-01-01
Explains I-Help, a computer-based peer help network where students can ask and answer questions about assignments and courses based on the metaphor of a help desk. Highlights include cognitive style; user modeling in I-Help; matching helpers to helpees; and types of questions. (Contains 64 references.) (LRW)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-21
... potential for harmful interference to adjacent Wireless Communications Service (WCS) spectrum users by...) average equivalent isotropically radiated power (EIRP) to facilitate the flexible deployment of SDARS... qualifications of SDARS applicants or licensees to operate a station, transfer or assign a license, and to...
The Use of MERLOT in Biochemistry and Molecular Biology Education
ERIC Educational Resources Information Center
Cooper, Scott
2005-01-01
The referatory, Multimedia Educational Resources for Learning and Online Teaching (MERLOT), contains links to 1300 electronic teaching resources in biology and chemistry. Approximately 20% have been peer reviewed, and most have user comments or assignments attached. In addition to being a source of educational resources, the MERLOT project seeks…
The Effects of Elaboration on Self-Learning Procedures from Text.
ERIC Educational Resources Information Center
Yang, Fu-mei
This study investigated the effects of augmenting and deleting elaborations in an existing self-instructional text for a micro-computer database application, "Microsoft Works User's Manual." A total of 60 undergraduate students were randomly assigned to the original, elaborated, or unelaborated text versions. The elaborated version…
75 FR 8995 - Submission for OMB Review: Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-26
... currently approved collection. Title of Collection: Occupational Code Assignment (OCA). OMB Control Number... (OCA), is provided as a public service for the states as well as for others who use occupational information. The OCA process is designed to help users relate an occupational specialty or a job title or to...
DocML: A Digital Library of University Data.
ERIC Educational Resources Information Center
Papadakis, Ioannis; Karakoidas, Vassileios; Chrissikopoulos, Vassileios
2002-01-01
Describes DocML, a Web-based digital library of university data that is used to build a system capable of preserving and managing student assignments. Topics include requirements for a digital library of university data; metadata and XML; three-tier architecture; user interface; searching; browsing; content delivery; and administrative issues.…
Development of a database for chemical mechanism assignments for volatile organic emissions.
Carter, William P L
2015-10-01
The development of a database for making model species assignments when preparing total organic gas (TOG) emissions input for atmospheric models is described. This database currently has assignments of model species for 12 different gas-phase chemical mechanisms for over 1700 chemical compounds and covers over 3000 chemical categories used in five different anthropogenic TOG profile databases or output by two different biogenic emissions models. This involved developing a unified chemical classification system, assigning compounds to mixtures, assigning model species for the mechanisms to the compounds, and making assignments for unknown, unassigned, and nonvolatile mass. The comprehensiveness of the assignments, the contributions of various types of speciation categories to current profile and total emissions data, inconsistencies with existing undocumented model species assignments, and remaining speciation issues and areas of needed work are also discussed. The use of the system to prepare input for SMOKE, the Speciation Tool, and for biogenic models is described in the supplementary materials. The database, associated programs and files, and a user's manual are available online at http://www.cert.ucr.edu/~carter/emitdb . Assigning air quality model species to the hundreds of emitted chemicals is a necessary link between emissions data and modeling effects of emissions on air quality. This is not easy, and it makes it difficult to implement new and more chemically detailed mechanisms in models. If done incorrectly, the effect is similar to that of errors in the emissions speciation or in the chemical mechanism used. Nevertheless, making such assignments is often an afterthought in chemical mechanism development and emissions processing, and existing assignments are usually undocumented and have errors and inconsistencies. This work is designed to address some of these problems.
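The basic lookup such a database supports (emitted compound category, then mechanism, then model species, with unassigned mass tracked separately) might be sketched as follows. The compound names, mechanism labels, and species codes here are hypothetical placeholders, not the actual database schema or mechanism species.

```python
# Hypothetical assignment table: compound -> {mechanism: model species}.
assignments = {
    "ethene":  {"MECH_A": "ETHE", "MECH_B": "OLE1"},
    "toluene": {"MECH_A": "ARO1", "MECH_B": "TOL"},
    "acetone": {"MECH_A": "ACET", "MECH_B": "KET"},
}

def speciate(profile, mechanism):
    """Convert an emissions profile {compound: mass fraction} into
    model-species totals for one mechanism; also return unassigned mass."""
    totals, unassigned = {}, 0.0
    for compound, frac in profile.items():
        species = assignments.get(compound, {}).get(mechanism)
        if species is None:
            unassigned += frac          # unknown or unassigned mass
        else:
            totals[species] = totals.get(species, 0.0) + frac
    return totals, unassigned

totals, unassigned = speciate(
    {"ethene": 0.4, "toluene": 0.3, "unknown_compound": 0.3}, "MECH_A")
```

Keeping the compound-to-species mapping per mechanism is what lets the same emissions profile feed 12 different chemical mechanisms, and tracking unassigned mass makes gaps in the assignments visible rather than silently dropped.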
Intelligent web image retrieval system
NASA Astrophysics Data System (ADS)
Hong, Sungyong; Lee, Chungwoo; Nah, Yunmook
2001-07-01
Recently, web sites such as e-business sites and shopping mall sites deal with large amounts of image information. To find a specific image among these image sources, we usually use web search engines or image database engines, which rely on keyword-only retrieval or color-based retrieval with limited search capabilities. This paper presents an intelligent web image retrieval system. We propose the system architecture, texture- and color-based image classification and indexing techniques, and representation schemes for user usage patterns. The query can be given by providing keywords, by selecting one or more sample texture patterns, by assigning color values within positional color blocks, or by combining some or all of these factors. The system keeps track of users' preferences by generating user query logs and automatically adds more search information to subsequent user queries. To show the usefulness of the proposed system, experimental results on recall and precision are also presented.
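A color-based retrieval step of the kind described can be sketched with coarse RGB histograms compared by histogram intersection; the bin count and toy images below are illustrative assumptions, not the paper's actual indexing scheme.

```python
import numpy as np

def color_histogram(img, bins=4):
    """Coarse RGB histogram: normalized counts per (r, g, b) bin cube."""
    idx = (img // (256 // bins)).reshape(-1, 3)
    flat = idx[:, 0] * bins * bins + idx[:, 1] * bins + idx[:, 2]
    h = np.bincount(flat, minlength=bins ** 3).astype(float)
    return h / h.sum()

def retrieve(query, database):
    """Return the index of the database image whose histogram best
    matches the query, using histogram intersection as similarity."""
    hq = color_histogram(query)
    sims = [np.minimum(hq, color_histogram(im)).sum() for im in database]
    return int(np.argmax(sims))

rng = np.random.default_rng(0)
red_img = np.zeros((8, 8, 3), dtype=int); red_img[..., 0] = 200
blue_img = np.zeros((8, 8, 3), dtype=int); blue_img[..., 2] = 200
noisy_red = red_img + rng.integers(0, 40, size=red_img.shape)

best = retrieve(noisy_red, [blue_img, red_img])
```

The coarse binning makes the signature robust to small pixel-level noise, so the noisy red query still matches the red database image rather than the blue one.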
Optical authentication based on moiré effect of nonlinear gratings in phase space
NASA Astrophysics Data System (ADS)
Liao, Meihua; He, Wenqi; Wu, Jiachen; Lu, Dajiang; Liu, Xiaoli; Peng, Xiang
2015-12-01
An optical authentication scheme based on the moiré effect of nonlinear gratings in phase space is proposed. According to the phase function relationship of the moiré effect in phase space, an arbitrary authentication image can be encoded into two nonlinear gratings which serve as the authentication lock (AL) and the authentication key (AK). The AL is stored in the authentication system while the AK is assigned to the authorized user. The authentication procedure can be performed using an optoelectronic approach, while the design process is accomplished by a digital approach. Furthermore, this optical authentication scheme can be extended for multiple users with different security levels. The proposed scheme can not only verify the legality of a user identity, but can also discriminate and control the security levels of legal users. Theoretical analysis and simulation experiments are provided to verify the feasibility and effectiveness of the proposed scheme.
ConcreteWorks v3 training/user manual (P1) : ConcreteWorks software (P2).
DOT National Transportation Integrated Search
2017-04-01
ConcreteWorks is designed to be a user-friendly software package that can help concrete : professionals optimize concrete mixture proportioning, perform a concrete thermal analysis, and : increase the chloride diffusion service life. The software pac...
Quality versus intelligibility: studying human preferences for American Sign Language video
NASA Astrophysics Data System (ADS)
Ciaramello, Frank M.; Hemami, Sheila S.
2011-03-01
Real-time videoconferencing using cellular devices provides natural communication to the Deaf community. For this application, compressed American Sign Language (ASL) video must be evaluated in terms of the intelligibility of the conversation and not in terms of the overall aesthetic quality of the video. This work presents a paired comparison experiment to determine the subjective preferences of ASL users in terms of the trade-off between intelligibility and quality when varying the proportion of the bitrate allocated explicitly to the regions of the video containing the signer. A rate-distortion optimization technique, which jointly optimizes a quality criterion and an intelligibility criterion according to a user-specified parameter, generates test video pairs for the subjective experiment. Experimental results suggest that at sufficiently high bitrates, all users prefer videos in which the non-signer regions in the video are encoded with some nominal rate. As the total encoding bitrate decreases, users generally prefer video in which a greater proportion of the rate is allocated to the signer. The specific operating points preferred in the quality-intelligibility trade-off vary with the demographics of the users.
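The quality-intelligibility trade-off can be sketched as a one-parameter bit-allocation toy: a user weight trades signer-region rate against the rest of the frame, and, consistent with the reported trend, the optimal signer fraction grows as the total rate drops. The logarithmic utility and the weight value are assumptions for illustration, not the paper's rate-distortion model.

```python
import numpy as np

def utility(signer_rate, other_rate, w=0.8):
    """Toy objective: w weights intelligibility (signer region) against
    overall quality (rest of frame); log1p models diminishing returns."""
    return w * np.log1p(signer_rate) + (1 - w) * np.log1p(other_rate)

def best_split(total_rate, w=0.8, steps=999):
    """Grid-search the fraction of the total rate given to the signer."""
    fracs = np.linspace(0.001, 0.999, steps)
    scores = [utility(f * total_rate, (1 - f) * total_rate, w) for f in fracs]
    return fracs[int(np.argmax(scores))]

hi = best_split(1000.0)   # generous total bitrate
lo = best_split(50.0)     # constrained total bitrate
```

With diminishing returns, a high total rate leaves room to spend bits on the non-signer regions, while a constrained rate shifts the optimum toward the signer, matching the preference pattern the experiment reports.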
Multidisciplinary Environments: A History of Engineering Framework Development
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; Gillian, Ronnie E.
2006-01-01
This paper traces the history of engineering frameworks and their use by Multidisciplinary Design Optimization (MDO) practitioners. The approach is to reference papers that have been presented at one of the ten previous Multidisciplinary Analysis and Optimization (MA&O) conferences. By limiting the search to MA&O papers, the authors can (1) identify the key ideas that led to general purpose MDO frameworks and (2) uncover roadblocks that delayed the development of these ideas. The authors make no attempt to assign credit for revolutionary ideas or to assign blame for missed opportunities. Rather, the goal is to trace the various threads of computer architecture and software framework research and to observe how these threads contributed to the commercial framework products available today.
Dynamic, stochastic models for congestion pricing and congestion securities.
DOT National Transportation Integrated Search
2010-12-01
This research considers congestion pricing under demand uncertainty. In particular, a robust optimization (RO) approach is applied to optimal congestion pricing problems under user equilibrium. A mathematical model is developed and an analysis perfor...
Electronic neural networks for global optimization
NASA Technical Reports Server (NTRS)
Thakoor, A. P.; Moopenn, A. W.; Eberhardt, S.
1990-01-01
An electronic neural network with feedback architecture, implemented in analog custom VLSI, is described. Its application to problems of global optimization for dynamic assignment is discussed. The convergence properties of the neural network hardware are compared with computer simulation results. The neural network's ability to provide optimal or near optimal solutions within only a few neuron time constants, a speed enhancement of several orders of magnitude over conventional search methods, is demonstrated. The effect of noise on the circuit dynamics and the convergence behavior of the neural network hardware is also examined.
Evaluation of Goal Programming for the Optimal Assignment of Inspectors to Construction Projects
1988-09-01
[Table-of-contents and list-of-figures fragments (AHP hierarchy, pairwise comparison matrix, weights and coefficient values) omitted.] AFIT/GEM/LSM/88S-16 Abstract: The purpose of this study was...
ERIC Educational Resources Information Center
Rajappa, Medha; Bobby, Zachariah; Nandeesha, H.; Suryapriya, R.; Ragul, Anithasri; Yuvaraj, B.; Revathy, G.; Priyadarssini, M.
2016-01-01
Graduate medical students of India are taught Biochemistry by didactic lectures and they hardly get any opportunity to clarify their doubts and reinforce the concepts which they learn in these lectures. We used a combination of teaching-learning (T-L) methods (open book assignment followed by group tutorials) to study their efficacy in improving…
Chen, G; Wu, F Y; Liu, Z C; Yang, K; Cui, F
2015-08-01
Subject-specific finite element (FE) models can be generated from computed tomography (CT) datasets of a bone. A key step is assigning material properties automatically onto finite element models, which remains a great challenge. This paper proposes a node-based assignment approach and also compares it with the element-based approach in the literature. Both approaches were implemented using ABAQUS. The assignment procedure is divided into two steps: generating the data file of the image intensity of a bone in a MATLAB program and reading the data file into ABAQUS via user subroutines. The node-based approach assigns the material properties to each node of the finite element mesh, while the element-based approach assigns the material properties directly to each integration point of an element. Both approaches are independent from the type of elements. A number of FE meshes are tested and both give accurate solutions; comparatively the node-based approach involves less programming effort. The node-based approach is also independent from the type of analyses; it has been tested on the nonlinear analysis of a Sawbone femur. The node-based approach substantially improves the level of automation of the assignment procedure of bone material properties. It is the simplest and most powerful approach that is applicable to many types of analyses and elements. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
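The two-step procedure described above (derive image intensities, then map them to material properties) can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the linear CT calibration constants and the power-law density-modulus exponent below are generic placeholders, not values from this study.

```python
# Minimal sketch of assigning bone material properties from CT intensities.
# Calibration constants (a, b, c, p) are illustrative placeholders.

def hu_to_density(hu, a=0.0, b=0.0008):
    """Linear CT calibration: apparent density (g/cm^3) from Hounsfield units."""
    return a + b * hu

def density_to_modulus(rho, c=6850.0, p=1.49):
    """Power-law density-modulus relationship (E in MPa)."""
    return c * rho ** p

def assign_node_properties(node_hu):
    """Node-based assignment: one Young's modulus per mesh node, which a user
    subroutine would then interpolate at each element's integration points."""
    return {node: density_to_modulus(hu_to_density(hu)) for node, hu in node_hu.items()}

props = assign_node_properties({1: 1200.0, 2: 800.0, 3: 300.0})
```

Denser (higher-intensity) nodes receive a higher modulus, which is the monotone behavior any such calibration must preserve.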
Investigation of goal change to optimize upper-extremity motor performance in a robotic environment.
Brewer, Bambi R; Klatzky, Roberta; Markham, Heather; Matsuoka, Yoky
2009-10-01
Robotic devices for therapy have the potential to enable intensive, fully customized home rehabilitation over extended periods for individuals with stroke and traumatic brain injury, thus empowering them to maximize their functional recovery. For robotic rehabilitation to be most effective, systems must have the capacity to assign performance goals to the user and to increment those goals to encourage performance improvement. Otherwise, individuals may plateau at an artificially low level of function. Frequent goal change is needed to motivate improvements in performance by individuals with brain injury; but because of entrenched habits, these individuals may avoid striving for goals that they perceive as becoming ever more difficult. For this reason, implicit, undetectable goal change (distortion) may be more effective than explicit goal change at optimizing the motor performance of some individuals with brain injury. This paper reviews a body of work that provides a basis for incorporating implicit goal change into a robotic rehabilitation paradigm. This work was conducted with individuals without disability to provide foundational knowledge for using goal change in a robotic environment. In addition, we compare motor performance with goal change to performance with no goal or with a static goal for individuals without brain injury. Our results show that goal change can improve motor performance when participants attend to visual feedback. Building on these preliminary results can lead to more effective robotic paradigms for the rehabilitation of individuals with brain injury, including individuals with cerebral palsy.
Jeannerat, Damien
2017-01-01
The introduction of a universal data format to report the correlation data of 2D NMR spectra such as COSY, HSQC and HMBC spectra will have a large impact on the reliability of structure determination of small organic molecules. These lists of assigned cross peaks will bridge signals found in NMR 1D and 2D spectra and the assigned chemical structure. The record could be very compact, human and computer readable, so that it can be included in the supplementary material of publications and easily transferred into databases of scientific literature and chemical compounds. The records will allow authors, reviewers and future users to test the consistency and, in favorable situations, the uniqueness of the assignment of the correlation data to the associated chemical structures. Ideally, the data format of the correlation data should include direct links to the NMR spectra to make it possible to validate their reliability and allow direct comparison of spectra. To realize their full potential, the correlation data and the NMR spectra should therefore follow any manuscript in the review process and be stored in an open-access database after publication. Keeping all NMR spectra, correlation data and assigned structures together at all times will allow the future development of validation tools, increasing the reliability of past and future NMR data. This will facilitate the development of artificial intelligence analysis of NMR spectra by providing a source of data that can be used efficiently because they have been validated or can be validated by future users. Copyright © 2016 John Wiley & Sons, Ltd.
Pegg, Elise C; Gill, Harinderjit S
2016-09-06
A new software tool to assign the material properties of bone to an ABAQUS finite element mesh was created and compared with Bonemat, a similar tool originally designed to work with Ansys finite element models. Our software tool (py_bonemat_abaqus) was written in Python, which is the chosen scripting language for ABAQUS. The purpose of this study was to compare the software packages in terms of the material assignment calculation and processing speed. Three element types were compared (linear hexahedral (C3D8), linear tetrahedral (C3D4) and quadratic tetrahedral elements (C3D10)), both individually and as part of a mesh. Comparisons were made using a CT scan of a hemi-pelvis as a test case. A small difference, of -0.05kPa on average, was found between Bonemat version 3.1 (the current version) and our Python package. Errors were found in the previous release of Bonemat (version 3.0 downloaded from www.biomedtown.org) during calculation of the quadratic tetrahedron Jacobian, and conversion of the apparent density to modulus when integrating over the Young's modulus field. These issues caused up to 2GPa error in the modulus assignment. For these reasons, we recommend users upgrade to the most recent release of Bonemat. Processing speeds were assessed for the three different element types. Our Python package took significantly longer (110s on average) to perform the calculations compared with the Bonemat software (10s). Nevertheless, the workflow advantages of the package and added functionality makes 'py_bonemat_abaqus' a useful tool for ABAQUS users. Copyright © 2016 Elsevier Ltd. All rights reserved.
Capacity-constrained traffic assignment in networks with residual queues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lam, W.H.K.; Zhang, Y.
2000-04-01
This paper proposes a capacity-constrained traffic assignment model for strategic transport planning in which the steady-state user equilibrium principle is extended for road networks with residual queues. Therefore, the road-exit capacity and the queuing effects can be incorporated into the strategic transport model for traffic forecasting. The proposed model is applicable to congested networks, particularly when the traffic demand exceeds the capacity of the network during the peak period. An efficient solution method is proposed for solving the steady-state traffic assignment problem with residual queues. A simple numerical example is then employed to demonstrate the application of the proposed model and solution method, while an example of a medium-sized arterial highway network in Sioux Falls, South Dakota, is used to test the applicability of the proposed solution to real problems.
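The steady-state user-equilibrium principle the model extends can be illustrated on a toy network. The sketch below ignores the paper's residual-queue and exit-capacity extensions: it simply splits a fixed demand between two parallel links so that their BPR travel times equalize (Wardrop's condition); the free-flow times and capacities are invented.

```python
# Toy user equilibrium on two parallel links, solved by bisection.

def bpr(t0, cap, flow):
    """Standard BPR link travel-time function."""
    return t0 * (1.0 + 0.15 * (flow / cap) ** 4)

def two_link_user_equilibrium(demand, t0a, capa, t0b, capb):
    """Bisect on the flow assigned to link a until both links' times are equal."""
    lo, hi = 0.0, demand
    x = 0.5 * demand
    for _ in range(200):
        x = 0.5 * (lo + hi)
        if bpr(t0a, capa, x) < bpr(t0b, capb, demand - x):
            lo = x  # link a is still cheaper: shift more flow onto it
        else:
            hi = x
    return x

x = two_link_user_equilibrium(1000.0, 10.0, 500.0, 12.0, 600.0)
```

At the returned split no driver can reduce their travel time by switching links, which is exactly the equilibrium condition the full model generalizes to networks with queues.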
Transoptr — A second order beam transport design code with optimization and constraints
NASA Astrophysics Data System (ADS)
Heighway, E. A.; Hutcheon, R. M.
1981-08-01
This code was written initially to design an achromatic and isochronous reflecting magnet and has been extended to compete in capability (for constrained problems) with TRANSPORT. Its advantage is its flexibility in that the user writes a routine to describe his transport system. The routine allows the definition of general variables from which the system parameters can be derived. Further, the user can write any constraints he requires as algebraic equations relating the parameters. All variables may be used in either a first or second order optimization.
Utilization-Based Modeling and Optimization for Cognitive Radio Networks
NASA Astrophysics Data System (ADS)
Liu, Yanbing; Huang, Jun; Liu, Zhangxiong
The cognitive radio technique promises to manage and allocate the scarce radio spectrum in highly varying and disparate modern environments. This paper considers a cognitive radio scenario composed of two queues for the primary (licensed) users and cognitive (unlicensed) users. According to the Markov process, the system state equations are derived and an optimization model for the system is proposed. Next, the system performance is evaluated through calculations that show the rationality of our system model. Furthermore, discussions of different system parameters are presented based on the experimental results.
NASA Astrophysics Data System (ADS)
Zhu, Tingju; Marques, Guilherme Fernandes; Lund, Jay R.
2015-05-01
Efficient reallocation and conjunctive operation of existing water supplies is gaining importance as demands grow, competition among users intensifies, and new supplies become more costly. This paper analyzes the roles and benefits of conjunctive use of surface water and groundwater and market-based water transfers in an integrated regional water system where agricultural and urban water users coordinate supply and demand management based on supply reliability and economic values of water. Agricultural users optimize land and water use for annual and perennial crops to maximize farm income, while urban users choose short-term and long-term water conservation actions to maintain reliability and minimize costs. The temporal order of these decisions is represented in a two-stage optimization that maximizes the net expected benefits of crop production, urban conservation and water management including conjunctive use and water transfers. Long-term decisions are in the first stage and short-term decisions are in a second stage based on probabilities of water availability events. Analytical and numerical analyses are performed. Results show that conjunctive use and water transfers can substantially stabilize farmers' incomes and reduce system costs by reducing expensive urban water conservation or construction. Water transfers can equalize marginal values of water across users, while conjunctive use minimizes water marginal value differences in time. Model results are useful for exploring the integration of different water demands and supplies through water transfers, conjunctive use, and conservation, providing valuable insights for improving system management.
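The two-stage structure described above (long-term decisions fixed first, short-term recourse chosen per water-availability scenario) can be sketched with a toy expected-cost model. All demands, costs, and scenario probabilities below are invented for illustration; this is not the paper's model.

```python
# Toy two-stage decision: a long-term (first-stage) conservation capacity is
# chosen before water availability is known; short-term (second-stage)
# shortage actions cost more per unit.

scenarios = [(0.5, 80.0), (0.5, 40.0)]  # (probability, water available)
demand = 100.0

def expected_cost(capacity, capex=2.0, shortage_cost=5.0):
    """First-stage capital cost plus expected second-stage shortage cost."""
    cost = capex * capacity
    for prob, avail in scenarios:
        shortage = max(0.0, demand - avail - capacity)
        cost += prob * shortage_cost * shortage
    return cost

best = min(range(0, 61, 5), key=expected_cost)
```

In this toy instance the long-term option's unit cost (2.0) is below the probability-weighted short-term cost, so the grid search selects the largest capacity considered; flipping the cost ratio would shift the optimum toward recourse actions.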
Li, Guangxia; An, Kang; Gao, Bin; Zheng, Gan
2017-01-01
This paper proposes novel satellite-based wireless sensor networks (WSNs), which integrate the WSN with the cognitive satellite terrestrial network. Having the ability to provide seamless network access and alleviate spectrum scarcity, cognitive satellite terrestrial networks are considered a promising candidate for future wireless networks with emerging requirements of ubiquitous broadband applications and increasing demand for spectral resources. With the emerging environmental and energy cost concerns in communication systems, explicit concerns about energy-efficient resource allocation in satellite networks have also recently received considerable attention. In this regard, this paper proposes energy-efficient optimal power allocation schemes in the cognitive satellite terrestrial networks for non-real-time and real-time applications, respectively, which maximize the energy efficiency (EE) of the cognitive satellite user while guaranteeing that the interference at the primary terrestrial user remains below an acceptable level. Specifically, an average interference power (AIP) constraint is employed to protect the communication quality of the primary terrestrial user, while an average transmit power (ATP) or peak transmit power (PTP) constraint is adopted to regulate the transmit power of the satellite user. Since the energy-efficient power allocation optimization problem is a nonlinear concave fractional programming problem, we solve it by combining Dinkelbach's method with the Lagrange duality method. Simulation results demonstrate that the fading severity of the terrestrial interference link is favorable to the satellite user, who can achieve an EE gain under the ATP constraint compared to the PTP constraint. PMID:28869546
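Dinkelbach's method named above can be sketched on a scalar toy problem: maximize a rate/power ratio by repeatedly solving the parametric subproblem max_p R(p) − λP(p) and updating λ to the achieved ratio. The channel gain and circuit power below are illustrative, not from the paper, and a grid search stands in for the Lagrange-duality inner solver.

```python
import math

def rate(p, g=2.0):
    """Toy achievable rate (bits/s/Hz) at transmit power p, channel gain g."""
    return math.log2(1.0 + g * p)

def power(p, pc=0.5):
    """Total power: transmit power plus a fixed circuit power pc."""
    return p + pc

def dinkelbach_ee(p_grid, iters=50):
    """Dinkelbach's method for max_p rate(p)/power(p): solve the parametric
    subproblem max_p rate(p) - lam*power(p), then update lam, and repeat."""
    lam = 0.0
    p_star = p_grid[0]
    for _ in range(iters):
        p_star = max(p_grid, key=lambda p: rate(p) - lam * power(p))
        lam = rate(p_star) / power(p_star)
    return p_star, lam

grid = [0.01 * k for k in range(1, 1001)]
p_opt, ee = dinkelbach_ee(grid)
```

On a finite grid the iteration reaches the exact grid optimum after a handful of updates, mirroring the method's fast convergence for concave fractional programs.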
Optimal satisfaction degree in energy harvesting cognitive radio networks
NASA Astrophysics Data System (ADS)
Li, Zan; Liu, Bo-Yang; Si, Jiang-Bo; Zhou, Fu-Hui
2015-12-01
A cognitive radio (CR) network with energy harvesting (EH) is considered to improve both spectrum efficiency and energy efficiency. A hidden Markov model (HMM) is used to characterize the imperfect spectrum sensing process. In order to maximize the whole satisfaction degree (WSD) of the cognitive radio network, a tradeoff between the average throughput of the secondary user (SU) and the interference to the primary user (PU) is analyzed. We formulate the satisfaction degree optimization problem as a mixed integer nonlinear programming (MINLP) problem. The satisfaction degree optimization problem is solved by using a differential evolution (DE) algorithm. The proposed optimization problem allows the network to adaptively achieve the optimal solution based on its required quality of service (QoS). Numerical results are given to verify our analysis. Project supported by the National Natural Science Foundation of China (Grant No. 61301179), the Doctorial Programs Foundation of the Ministry of Education of China (Grant No. 20110203110011), and the 111 Project (Grant No. B08038).
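A minimal differential evolution loop of the kind applied above can be sketched as follows. It minimizes a stand-in objective rather than the WSD problem, and the control parameters (population size, F, CR) are generic defaults, not the paper's settings.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.7, CR=0.9, gens=200, seed=1):
    """Minimal DE/rand/1/bin sketch: mutate with scaled difference vectors,
    crossover per dimension, keep the trial if it is no worse."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([x for j, x in enumerate(pop) if j != i], 3)
            trial = []
            for d in range(dim):
                if rng.random() < CR:
                    v = a[d] + F * (b[d] - c[d])
                else:
                    v = pop[i][d]
                lo, hi = bounds[d]
                trial.append(min(max(v, lo), hi))
            if f(trial) <= f(pop[i]):
                pop[i] = trial
    return min(pop, key=f)

best = differential_evolution(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 3)
```

For the MINLP in the paper, the integer variables would additionally be rounded or encoded inside the objective; the sketch keeps the continuous core of the algorithm.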
Predictive Cache Modeling and Analysis
2011-11-01
...metaheuristic bin-packing algorithm to optimize task placement based on task communication characterization. Our previous work on task allocation showed... Cache Miss Minimization Technology: to efficiently explore combinations and discover nearly-optimal task-assignment algorithms, we extended our... it was possible to use our algorithmic techniques to decrease network bandwidth consumption by ~25%. In this effort, we adapted these existing...
Optimizing Balanced Incomplete Block Designs for Educational Assessments
ERIC Educational Resources Information Center
van der Linden, Wim J.; Veldkamp, Bernard P.; Carlson, James E.
2004-01-01
A popular design in large-scale educational assessments as well as any other type of survey is the balanced incomplete block design. The design is based on an item pool split into a set of blocks of items that are assigned to sets of "assessment booklets." This article shows how the problem of calculating an optimal balanced incomplete block…
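The balance conditions that define such a design can be checked mechanically. The sketch below verifies a classic (7, 3, 1) design (the Fano plane), which is textbook material rather than the article's assessment design: every block holds k distinct items and every item pair co-occurs in exactly λ blocks.

```python
from itertools import combinations
from collections import Counter

def is_bibd(blocks, v, k, lam):
    """True if every block has k distinct items and every pair of the v items
    co-occurs in exactly lam blocks."""
    if any(len(set(b)) != k for b in blocks):
        return False
    pair_counts = Counter(frozenset(p) for b in blocks for p in combinations(b, 2))
    return all(pair_counts.get(frozenset(p), 0) == lam
               for p in combinations(range(v), 2))

fano = [(0, 1, 2), (0, 3, 4), (0, 5, 6), (1, 3, 5), (1, 4, 6), (2, 3, 6), (2, 4, 5)]
```

In the assessment setting, "items" are item blocks and "blocks" are booklets; the same pairwise-balance check applies.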
System Engineering Concept Demonstration, Effort Summary. Volume 1
1992-12-01
...involve only the system software, user frameworks and user tools (user tools, Catalyst, external computer systems)... analysis, synthesis, optimization, conceptual design of Catalyst. The paper discusses the definition, design, test, and evaluation; operational concept... This approach will allow system engineering practitioners to recognize and tailor the model. The conceptual requirements for the Process Model...
Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung
2016-08-31
Gaze tracking is the technology that identifies a region in space that a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Based on the kind of camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous studies implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers might also use ground truth information, but they do not make it public. Therefore, researchers and developers of gaze tracking systems cannot refer to such information when implementing a gaze tracking system. We address this problem by providing an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers might be able to more easily implement an optimal gaze tracking system. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest.
NASA Astrophysics Data System (ADS)
Zhang, Hao; Chen, Minghua; Parekh, Abhay; Ramchandran, Kannan
2011-09-01
We design a distributed multi-channel P2P Video-on-Demand (VoD) system using "plug-and-play" helpers. Helpers are heterogeneous "micro-servers" with limited storage, bandwidth and number of users they can serve simultaneously. Our proposed system has the following salient features: (1) it jointly optimizes over helper-user connection topology, video storage distribution and transmission bandwidth allocation; (2) it minimizes server load, and is adaptable to varying supply and demand patterns across multiple video channels irrespective of video popularity; and (3) it is fully distributed and requires little or no maintenance overhead. The combinatorial nature of the problem and the system demand for distributed algorithms makes the problem uniquely challenging. By utilizing Lagrangian decomposition and Markov chain approximation based arguments, we address this challenge by designing two distributed algorithms running in tandem: a primal-dual storage and bandwidth allocation algorithm and a "soft-worst-neighbor-choking" topology-building algorithm. Our scheme provably converges to a near-optimal solution, and is easy to implement in practice. Packet-level simulation results show that the proposed scheme achieves minimum server load under highly heterogeneous combinations of supply and demand patterns, and is robust to system dynamics of user/helper churn, user/helper asynchrony, and random delays in the network.
PLA realizations for VLSI state machines
NASA Technical Reports Server (NTRS)
Gopalakrishnan, S.; Whitaker, S.; Maki, G.; Liu, K.
1990-01-01
A major problem associated with state assignment procedures for VLSI controllers is obtaining an assignment that produces minimal or near minimal logic. The key item in Programmable Logic Array (PLA) area minimization is the number of unique product terms required by the design equations. This paper presents a state assignment algorithm for minimizing the number of product terms required to implement a finite state machine using a PLA. Partition algebra with predecessor state information is used to derive a near optimal state assignment. A maximum bound on the number of product terms required can be obtained by inspecting the predecessor state information. The state assignment algorithm presented is much simpler than existing procedures and leads to the same number of product terms or fewer. An area-efficient PLA structure implemented in a 1.0 micron CMOS process is presented along with a summary of the performance for a controller implemented using this design procedure.
OPTIMAL WELL LOCATOR (OWL): A SCREENING TOOL FOR EVALUATING LOCATIONS OF MONITORING WELLS
The Optimal Well Locator (OWL) program was designed and developed by USEPA to be a screening tool to evaluate and optimize the placement of wells in long term monitoring networks at small sites. The first objective of the OWL program is to allow the user to visualize the change ...
DAKOTA JAGUAR 3.0 user's manual.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Bauman, Lara E; Chan, Ethan
2013-05-01
JAGUAR (JAva GUi for Applied Research) is a Java software tool providing an advanced text editor and graphical user interface (GUI) to manipulate DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) input specifications. This document focuses on the features necessary to use JAGUAR.
Rajappa, Medha; Bobby, Zachariah; Nandeesha, H; Suryapriya, R; Ragul, Anithasri; Yuvaraj, B; Revathy, G; Priyadarssini, M
2016-07-08
Graduate medical students of India are taught Biochemistry by didactic lectures and they hardly get any opportunity to clarify their doubts and reinforce the concepts which they learn in these lectures. We used a combination of teaching-learning (T-L) methods (open book assignment followed by group tutorials) to study their efficacy in improving the learning outcome. A total of 143 graduate medical students were classified into low (<50%: group 1, n = 23), medium (50-75%: group 2, n = 74), and high (>75%: group 3, n = 46) achievers, based on their internal assessment marks. After the regular teaching module on the topics "Vitamins and Enzymology", all the students attempted an open book assignment without peer consultation. Then all the students participated in group tutorials. The effects on the groups were evaluated by pre- and post-tests at the end of each phase, with the same set of MCQs. Gain from group tutorials and overall gain was significantly higher in the low achievers, compared to other groups. High and medium achievers obtained more gain from the open book assignment than from group tutorials. The overall gain was significantly higher than the gain obtained from the open book assignment or group tutorials alone, in all three groups. All three groups retained the gain even after 1 week of the exercise. Hence, optimal use of novel T-L methods (open book assignment followed by group tutorials) as revision exercises helps in strengthening concepts in Biochemistry in this oft neglected group of low achievers in graduate medical education. © 2016 by The International Union of Biochemistry and Molecular Biology, 44(4):321-325, 2016.
Sinkó, József; Kákonyi, Róbert; Rees, Eric; Metcalf, Daniel; Knight, Alex E.; Kaminski, Clemens F.; Szabó, Gábor; Erdélyi, Miklós
2014-01-01
Localization-based super-resolution microscopy image quality depends on several factors such as dye choice and labeling strategy, microscope quality and user-defined parameters such as frame rate and number as well as the image processing algorithm. Experimental optimization of these parameters can be time-consuming and expensive so we present TestSTORM, a simulator that can be used to optimize these steps. TestSTORM users can select from among four different structures with specific patterns, dye and acquisition parameters. Example results are shown and the results of the vesicle pattern are compared with experimental data. Moreover, image stacks can be generated for further evaluation using localization algorithms, offering a tool for further software developments. PMID:24688813
Optimal Resource Allocation for NOMA-TDMA Scheme with α-Fairness in Industrial Internet of Things.
Sun, Yanjing; Guo, Yiyu; Li, Song; Wu, Dapeng; Wang, Bin
2018-05-15
In this paper, a joint non-orthogonal multiple access and time division multiple access (NOMA-TDMA) scheme is proposed for the Industrial Internet of Things (IIoT), which allows multiple sensors to transmit in the same time-frequency resource block using NOMA. The user scheduling, time slot allocation, and power control are jointly optimized in order to maximize the system α-fair utility under a transmit power constraint and a minimum rate constraint. The optimization problem is nonconvex because of the fractional objective function and the nonconvex constraints. To deal with the original problem, we first convert the objective function into a difference of two convex functions (D.C.) form, and then propose a NOMA-TDMA-DC algorithm to obtain the global optimum. Numerical results show that the NOMA-TDMA scheme significantly outperforms the traditional orthogonal multiple access scheme in terms of both spectral efficiency and user fairness.
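The α-fair utility named above has a standard closed form, sketched below: α = 0 recovers throughput maximization, the α = 1 limit is proportional fairness (sum of log rates), and large α approaches max-min fairness. The example rates are illustrative.

```python
import math

def alpha_fair_utility(rates, alpha):
    """Standard alpha-fair utility of a rate vector."""
    if alpha == 1.0:
        return sum(math.log(r) for r in rates)
    return sum(r ** (1.0 - alpha) / (1.0 - alpha) for r in rates)

# Two allocations with the same total rate: one even, one lopsided.
fair = alpha_fair_utility([2.0, 2.0], 2.0)
unfair = alpha_fair_utility([3.9, 0.1], 2.0)
```

At α = 0 the two allocations score identically (same sum of rates), but at α = 2 the even allocation wins, which is exactly the fairness pressure the parameter introduces into the scheduling problem.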
A tool for efficient, model-independent management optimization under uncertainty
White, Jeremy; Fienen, Michael N.; Barlow, Paul M.; Welter, Dave E.
2018-01-01
To fill a need for risk-based environmental management optimization, we have developed PESTPP-OPT, a model-independent tool for resource management optimization under uncertainty. PESTPP-OPT solves a sequential linear programming (SLP) problem and also implements (optional) efficient, “on-the-fly” (without user intervention) first-order, second-moment (FOSM) uncertainty techniques to estimate model-derived constraint uncertainty. Combined with a user-specified risk value, the constraint uncertainty estimates are used to form chance-constraints for the SLP solution process, so that any optimal solution includes contributions from model input and observation uncertainty. In this way, a “single answer” that includes uncertainty is yielded from the modeling analysis. PESTPP-OPT uses the familiar PEST/PEST++ model interface protocols, which makes it widely applicable to many modeling analyses. The use of PESTPP-OPT is demonstrated with a synthetic, integrated surface-water/groundwater model. The function and implications of chance constraints for this synthetic model are discussed.
NASA Technical Reports Server (NTRS)
Klumpp, A. R.
1994-01-01
The Ada Namelist Package, developed for the Ada programming language, enables a calling program to read and write FORTRAN-style namelist files. A namelist file consists of any number of assignment statements in any order. Features of the Ada Namelist Package are: the handling of any combination of user-defined types; the ability to read vectors, matrices, and slices of vectors and matrices; the handling of mismatches between variables in the namelist file and those in the programmed list of namelist variables; and the ability to avoid searching the entire input file for each variable. The principal user benefits of this software are the following: the ability to write namelist-readable files, the ability to detect most file errors in the initialization phase, a package organization that reduces the number of instantiated units to a few packages rather than to many subprograms, a reduced number of restrictions, and an increased execution speed. The Ada Namelist reads data from an input file into variables declared within a user program. It then writes data from the user program to an output file, printer, or display. The input file contains a sequence of assignment statements in arbitrary order. The output is in namelist-readable form. There is a one-to-one correspondence between namelist I/O statements executed in the user program and variables read or written. Nevertheless, in the input file, mismatches are allowed between assignment statements in the file and the namelist read procedure statements in the user program. The Ada Namelist Package itself is non-generic. However, it has a group of nested generic packages following the nongeneric opening portion. The opening portion declares a variety of user-accessible constants, variables and subprograms. The subprograms are procedures for initializing namelists for reading, and for reading and writing strings. The subprograms are also functions for analyzing the content of the current dataset and diagnosing errors.
Two nested generic packages follow the opening portion. The first generic package contains procedures that read and write objects of scalar type. The second contains subprograms that read and write one and two-dimensional arrays whose components are of scalar type and whose indices are of either of the two discrete types (integer or enumeration). Subprograms in the second package also read and write vector and matrix slices. The Ada Namelist ASCII text files are available on a 360k 5.25" floppy disk written on an IBM PC/AT running under the PC DOS operating system. The largest subprogram in the package requires 150k of memory. The package was developed using VAX Ada v. 1.5 under DEC VMS v. 4.5. It should be portable to any validated Ada compiler. The software was developed in 1989, and is a copyrighted work with all copyright vested in NASA.
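The FORTRAN-style namelist format the package reads can be illustrated with a minimal parser. This is a hypothetical simplification for illustration only, handling a single group with numeric scalars and vectors; it is not a re-implementation of the Ada package's grammar or error handling.

```python
import re

def parse_namelist(text):
    """Parse one FORTRAN-style namelist group of the form
    '&name var = value, vec = v1, v2, ... /' into (group_name, dict).
    Numeric values only; a simplification of the full namelist grammar."""
    m = re.search(r"&(\w+)\s(.*?)/", text, re.S)
    if m is None:
        raise ValueError("no namelist group found")
    group, body = m.group(1), m.group(2)
    # Split the body at each 'name =' marker; parts then alternate name, raw value.
    parts = re.split(r"(\w+)\s*=", body)
    result = {}
    for name, raw in zip(parts[1::2], parts[2::2]):
        tokens = [t for t in re.split(r"[,\s]+", raw.strip()) if t]
        nums = [float(t) for t in tokens]
        result[name] = nums[0] if len(nums) == 1 else nums
    return group, result

group, values = parse_namelist("&design x = 1.5, v = 1.0, 2.0, 3.0 /")
```

The assignment statements may appear in any order, matching the property the abstract emphasizes; a real implementation would also handle strings, repetition counts, and type mismatches.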
Optimal erasure protection for scalably compressed video streams with limited retransmission.
Taubman, David; Thie, Johnson
2005-08-01
This paper shows how the priority encoding transmission (PET) framework may be leveraged to exploit both unequal error protection and limited retransmission for RD-optimized delivery of streaming media. Previous work on scalable media protection with PET has largely ignored the possibility of retransmission. Conversely, the PET framework has not been harnessed by the substantial body of previous work on RD-optimized hybrid forward error correction/automatic repeat request schemes. We limit our attention to sources which can be modeled as independently compressed frames (e.g., video frames), where each element in the scalable representation of each frame can be transmitted in one or both of two transmission slots. An optimization algorithm determines the level of protection which should be assigned to each element in each slot, subject to transmission bandwidth constraints. To balance the protection assigned to elements which are being transmitted for the first time with those which are being retransmitted, the proposed algorithm formulates a collection of hypotheses concerning its own behavior in future transmission slots. We show how the PET framework allows for a decoupled optimization algorithm with only modest complexity. Experimental results obtained with Motion JPEG2000 compressed video demonstrate that substantial performance benefits can be obtained using the proposed framework.
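The core allocation problem, choosing a protection level per element under a parity budget, can be illustrated with a toy brute-force sketch. All element values, parity costs, and survival probabilities below are invented; the paper solves a much larger problem with a decoupled optimization inside the PET framework:

```python
from itertools import product

# Toy unequal-error-protection allocation: each stream element gets a
# protection level; stronger protection costs more parity but raises the
# probability the element survives channel losses.  Numbers are invented.

levels = [  # (parity cost, probability element is decodable)
    (0, 0.60),
    (2, 0.85),
    (4, 0.97),
]
element_value = [10.0, 6.0, 3.0]  # distortion reduction per element
budget = 6                         # total parity units available

def expected_value(choice):
    # Expected distortion reduction for one protection-level assignment.
    return sum(v * levels[l][1] for v, l in zip(element_value, choice))

feasible = (c for c in product(range(len(levels)), repeat=3)
            if sum(levels[l][0] for l in c) <= budget)
best = max(feasible, key=expected_value)
print(best, round(expected_value(best), 2))
```

The exhaustive search over 27 assignments stands in for the paper's decoupled algorithm; note that the budget forces the most valuable element to take the strongest protection while the least valuable one goes unprotected.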
A mixed analog/digital chaotic neuro-computer system for quadratic assignment problems.
Horio, Yoshihiko; Ikeguchi, Tohru; Aihara, Kazuyuki
2005-01-01
We construct a mixed analog/digital chaotic neuro-computer prototype system for quadratic assignment problems (QAPs). The QAP is one of the difficult NP-hard problems and includes several real-world applications. Chaotic neural networks have been used to solve combinatorial optimization problems through chaotic search dynamics, which efficiently search for optimal or near-optimal solutions. However, preliminary experiments showed that, although it obtained good feasible solutions, the Hopfield-type chaotic neuro-computer hardware system could not obtain the optimal solution of the QAP. Therefore, in the present study, we improve the system performance by adopting a solution construction method, which constructs a feasible solution from the analog internal state values of the chaotic neurons at each iteration. To incorporate the construction method into our hardware, we install a multi-channel analog-to-digital conversion system to observe the internal states of the chaotic neurons. We show experimentally that a great improvement in system performance over the original Hopfield-type chaotic neuro-computer is obtained: we obtain the optimal solution for the size-10 QAP in fewer than 1,000 iterations. In addition, we propose a guideline for parameter tuning of the chaotic neuro-computer system based on observation of the internal states of several chaotic neurons in the network.
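For readers unfamiliar with the QAP, the objective being minimized can be sketched as follows. The flow and distance matrices are invented, and the brute-force search is only feasible for tiny instances; the paper's chaotic network searches much larger instances heuristically:

```python
from itertools import permutations

# The QAP: assign n facilities to n locations so as to minimize
# sum_{i,j} flow[i][j] * dist[p[i]][p[j]], where p maps facility -> location.

flow = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]  # inter-facility flows (invented)
dist = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]  # inter-location distances (invented)
n = len(flow)

def qap_cost(p):
    return sum(flow[i][j] * dist[p[i]][p[j]]
               for i in range(n) for j in range(n))

# Exhaustive search over all n! assignments -- only viable for tiny n.
best = min(permutations(range(n)), key=qap_cost)
print(best, qap_cost(best))
```

A size-10 instance like the one solved by the hardware already has 10! = 3,628,800 assignments, which is why heuristic search dynamics are attractive.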
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gong, Zhenhuan; Boyuka, David; Zou, X
The size and scope of cutting-edge scientific simulations are growing much faster than the I/O and storage capabilities of their run-time environments. The growing gap is exacerbated by exploratory, data-intensive analytics, such as querying simulation data with multivariate, spatio-temporal constraints, which induce heterogeneous access patterns that stress the performance of the underlying storage system. Previous work addresses data layout and indexing techniques that improve query performance for a single access pattern, which is not sufficient for complex analytics jobs. We present PARLO, a parallel run-time layout optimization framework that achieves multi-level data layout optimization for scientific applications at run-time, before data is written to storage. The layout schemes optimize for heterogeneous access patterns with user-specified priorities. PARLO is integrated with ADIOS, a high-performance parallel I/O middleware for large-scale HPC applications, to achieve user-transparent, lightweight layout optimization for scientific datasets. It offers simple XML-based configuration, letting users achieve flexible layout optimization without modifying or recompiling application codes. Experiments show that PARLO improves performance by 2 to 26 times for queries with heterogeneous access patterns compared to state-of-the-art scientific database management systems. Compared to traditional post-processing approaches, its run-time layout optimization achieves a 56% savings in processing time and a reduction in storage overhead of up to 50%. PARLO also exhibits a low run-time resource requirement, while limiting the performance impact on running applications to a reasonable level.
Lee, JongHyup; Pak, Dohyun
2016-01-01
For practical deployment of wireless sensor networks (WSNs), WSNs are organized into clusters, where a sensor node communicates with other nodes in its cluster and a cluster head supports connectivity between the sensor nodes and a sink node. In hybrid WSNs, cluster heads have cellular network interfaces for global connectivity. However, when WSNs are active and the load on cellular networks is high, the optimal assignment of cluster heads to base stations becomes critical. Therefore, in this paper, we propose a game-theoretic model to find the optimal assignment of base stations for hybrid WSNs. Since communication and energy costs differ across cellular systems, we devise two game models, for TDMA/FDMA and CDMA systems, employing power prices to adapt to the varying efficiency of recent wireless technologies. The proposed model is defined under the assumption of an ideal sensing field, but our evaluation shows that it is more adaptive and energy efficient than local selections. PMID:27589743
Liu, Ying; ZENG, Donglin; WANG, Yuanjia
2014-01-01
Dynamic treatment regimens (DTRs) are sequential decision rules tailored at each point where a clinical decision is made, based on each patient's time-varying characteristics and intermediate outcomes observed at earlier points in time. The complexity, patient heterogeneity, and chronicity of mental disorders call for learning optimal DTRs to dynamically adapt treatment to an individual's response over time. The Sequential Multiple Assignment Randomized Trial (SMART) design allows for estimating causal effects of DTRs. Modern statistical tools have been developed to optimize DTRs based on personalized variables and intermediate outcomes using rich data collected from SMARTs; these statistical methods can also be used to recommend tailoring variables for designing future SMART studies. This paper introduces DTRs and SMARTs using two examples from mental health studies, discusses two machine learning methods for estimating optimal DTRs from SMART data, and demonstrates the performance of the statistical methods using simulated data. PMID:25642116
Optimization of a Thermodynamic Model Using a Dakota Toolbox Interface
NASA Astrophysics Data System (ADS)
Cyrus, J.; Jafarov, E. E.; Schaefer, K. M.; Wang, K.; Clow, G. D.; Piper, M.; Overeem, I.
2016-12-01
Scientific modeling of the Earth's physical processes is an important driver of modern science. The behavior of these scientific models is governed by a set of input parameters. It is crucial to choose accurate input parameters that also preserve the corresponding physics being simulated in the model. To effectively simulate real-world processes, the model's output must be close to the observed measurements. To achieve this, input parameters are tuned until the objective function, the error between the simulation model outputs and the observed measurements, is minimized. We developed an auxiliary package that serves as a Python interface between the user and DAKOTA. The package makes it easy for the user to conduct parameter space explorations, parameter optimizations, and sensitivity analyses while tracking and storing results in a database. The ability to perform these analyses via a Python library also allows users to combine analysis techniques, for example finding an approximate equilibrium with optimization and then immediately exploring the space around it. We used the interface to calibrate input parameters for a heat flow model commonly used in permafrost science. We performed optimization on the first three layers of the permafrost model, each with two thermal conductivity coefficients as input parameters. Results of parameter space explorations indicate that the objective function does not always have a unique minimum. We found that gradient-based optimization works best for objective functions with one minimum. Otherwise, we employ more advanced Dakota methods, such as genetic optimization and mesh-based convergence, to find the optimal input parameters. We were able to recover six initially unknown thermal conductivity parameters to within 2% of their known values.
Our initial tests indicate that the developed interface for the Dakota toolbox could be used to perform analysis and optimization on a `black box' scientific model more efficiently than using just Dakota.
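The calibration loop that such an interface automates can be sketched as follows. The "model" here is a toy linear stand-in, not the permafrost heat-flow model, and the observations are invented:

```python
# Sketch of a calibration loop: tune a model input parameter until the
# misfit between simulated and observed values is minimized.  The model
# and data are toy stand-ins for the permafrost heat-flow calibration.

observed = [1.0, 2.1, 2.9, 4.2]   # "measured" temperatures (invented)
depths = [0.5, 1.0, 1.5, 2.0]

def model(conductivity):
    # Toy linear stand-in for the heat-flow simulation.
    return [2.0 * conductivity * z for z in depths]

def objective(conductivity):
    # Sum of squared errors between simulation and observations.
    sim = model(conductivity)
    return sum((s - o) ** 2 for s, o in zip(sim, observed))

# Coarse grid search over the parameter space, mirroring "parameter
# space exploration" followed by selection of the best candidate.
best = min((k / 100 for k in range(50, 151)), key=objective)
print(round(best, 2), round(objective(best), 4))
```

A grid search like this reveals whether the objective has one minimum or several, which is exactly the information the abstract uses to choose between gradient-based and genetic/mesh-based Dakota methods.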
Performance Optimization of Priority Assisted CSMA/CA Mechanism of 802.15.6 under Saturation Regime
Shakir, Mustafa; Rehman, Obaid Ur; Rahim, Mudassir; Alrajeh, Nabil; Khan, Zahoor Ali; Khan, Mahmood Ashraf; Niaz, Iftikhar Azim; Javaid, Nadeem
2016-01-01
Due to recent developments in the field of Wireless Sensor Networks (WSNs), Wireless Body Area Networks (WBANs) have become a major area of interest for developers and researchers. The human body exhibits postural mobility, due to which distance variation occurs and the status of connections amongst sensors changes from time to time. One of the major requirements of a WBAN is to prolong network lifetime without compromising other performance measures, i.e., delay, throughput, and bandwidth efficiency. Node prioritization is one possible way to obtain optimum performance in a WBAN. The IEEE 802.15.6 CSMA/CA standard separates nodes with different user priorities based on Contention Window (CW) size: smaller CW sizes are assigned to higher-priority nodes. This standard helps to reduce delay; however, it is not energy efficient. In this paper, we propose a hybrid node prioritization scheme based on IEEE 802.15.6 CSMA/CA to reduce energy consumption and maximize network lifetime. In this scheme, optimum performance is achieved by prioritizing nodes based on CW size as well as power within each user priority. Our proposed scheme reduces the average backoff time for channel access due to CW-based prioritization. Additionally, power-based prioritization within a given user priority helps to minimize the required number of retransmissions. Furthermore, we compare our scheme with the IEEE 802.15.6 CSMA/CA standard (CW-assisted node prioritization) and power-assisted node prioritization under postural mobility in a WBAN. Mathematical expressions are derived to obtain an accurate analytical model for the throughput, delay, bandwidth efficiency, energy consumption, and lifetime of each node prioritization scheme. To validate the analytical model, we performed simulations in the OMNET++/MIXIM framework.
Analytical and simulation results show that our proposed hybrid node prioritization scheme outperforms other node prioritization schemes in terms of average network delay, average throughput, average bandwidth efficiency and network lifetime. PMID:27598167
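The CW-based prioritization idea, smaller contention windows for higher user priorities and hence shorter average backoffs, can be illustrated with the sketch below. The CW bounds are illustrative values, not the exact table from the IEEE 802.15.6 standard:

```python
import random

# Smaller contention windows (CW) for higher user priorities mean
# high-priority nodes draw shorter random backoffs on average.
# The CW bounds here are illustrative, not the standard's table.

cw_bounds = {  # user priority -> (CWmin, CWmax); 7 is the highest priority
    7: (1, 4),
    4: (4, 16),
    0: (16, 64),
}

def draw_backoff(priority, rng):
    cw_min, _ = cw_bounds[priority]
    return rng.randint(1, cw_min)  # initial backoff counter in [1, CWmin]

rng = random.Random(42)
means = {}
for prio in sorted(cw_bounds, reverse=True):
    means[prio] = sum(draw_backoff(prio, rng) for _ in range(10000)) / 10000
    print(f"priority {prio}: mean backoff {means[prio]:.1f} slots")
```

The simulated means fall in priority order, which is the mechanism by which the standard trades energy efficiency for reduced delay at high priorities.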
McCall, Hugh Cameron; Richardson, Chris G; Helgadottir, Fjola Dogg; Chen, Frances S
2018-03-21
Treatment rates for social anxiety, a prevalent and potentially debilitating condition, remain among the lowest of all major mental disorders today. Although computer-delivered interventions are well poised to surmount key barriers to the treatment of social anxiety, most are only marginally effective when delivered as stand-alone treatments. A new, Web-based cognitive behavioral therapy (CBT) intervention called Overcome Social Anxiety was recently created to address the limitations of prior computer-delivered interventions. Users of Overcome Social Anxiety are self-directed through various CBT modules incorporating cognitive restructuring and behavioral experiments. The intervention is personalized to each user's symptoms, and automatic email reminders and time limits are used to encourage adherence. The purpose of this study was to conduct a randomized controlled trial to investigate the effectiveness of Overcome Social Anxiety in reducing social anxiety symptoms in a nonclinical sample of university students. As a secondary aim, we also investigated whether Overcome Social Anxiety would increase life satisfaction in this sample. Following eligibility screening, participants were randomly assigned to a treatment condition or a wait-list control condition. Only those assigned to the treatment condition were given access to Overcome Social Anxiety; they were asked to complete the program within 4 months. The Social Interaction Anxiety Scale (SIAS), the Fear of Negative Evaluation (FNE) scale, and the Quality of Life Enjoyment and Satisfaction Questionnaire-Short Form (Q-LES-Q-SF) were administered to participants from both conditions during baseline and 4-month follow-up lab visits.
Over the course of the study, participants assigned to the treatment condition experienced a significant reduction in social anxiety (SIAS: P<.001, Cohen d=0.72; FNE: P<.001, Cohen d=0.82), whereas those assigned to the control condition did not (SIAS: P=.13, Cohen d=0.26; FNE: P=.40, Cohen d=0.14). Additionally, a direct comparison of the average change in social anxiety in the 2 conditions over the course of the study showed that those assigned to the treatment condition experienced significantly more improvement than those assigned to the control condition (SIAS: P=.03, Cohen d=0.56; FNE: P=.001, Cohen d=0.97). Although participants assigned to the treatment condition experienced a slight increase in life satisfaction, as measured by Q-LES-Q-SF scores, and those assigned to the control condition experienced a slight decrease, these changes were not statistically significant (treatment: P=.35, Cohen d=-0.18; control: P=.30, Cohen d=0.18). Our findings indicate that Overcome Social Anxiety is an effective intervention for treating symptoms of social anxiety and that it may have further utility in serving as a model for the development of new interventions. Additionally, our findings provide evidence that contemporary Web-based interventions can be sophisticated enough to benefit users even when delivered as stand-alone treatments, suggesting that further opportunities likely exist for the development of other Web-based mental health interventions. ClinicalTrials.gov NCT02792127; https://clinicaltrials.gov/ct2/show/record/NCT02792127 (Archived by WebCite at http://www.webcitation.org/6xGSRh7MG). ©Hugh Cameron McCall, Chris G Richardson, Fjola Dogg Helgadottir, Frances S Chen. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 21.03.2018.
Q2Stress: A database for multiple cues to stress assignment in Italian.
Spinelli, Giacomo; Sulpizio, Simone; Burani, Cristina
2017-12-01
In languages where the position of lexical stress within a word is not predictable from print, readers rely on distributional information extracted from the lexicon in order to assign stress. Lexical databases are thus especially important for researchers wishing to address stress assignment in those languages. Here we present Q2Stress, a new database aimed at filling the lack of such a resource for Italian. Q2Stress includes multiple cues readers may use in assigning stress, such as the type and token frequency of stress patterns, as well as their distribution with respect to number of syllables, grammatical category, word beginnings, word endings, and consonant-vowel structures. Furthermore, for the first time, data for both adults and children are available. Q2Stress may help researchers answer empirical as well as theoretical questions about stress assignment and stress-related issues and, more generally, explore the orthography-to-phonology relation in reading. Q2Stress is designed as a user-friendly resource: it comes with scripts allowing researchers to explore and select their own stimuli according to several criteria, as well as summary tables for overall data analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tolić, Nikola; Liu, Yina; Liyu, Andrey
Ultrahigh-resolution mass spectrometry, such as Fourier transform ion-cyclotron resonance mass spectrometry (FT-ICR MS), can resolve thousands of molecular ions in complex organic matrices. A Compound Identification Algorithm (CIA) was previously developed for automated elemental formula assignment for natural organic matter (NOM). In this work we describe a user-friendly interface for CIA, titled Formularity, which adds the ability to search for formulas based on an Isotopic Pattern Algorithm (IPA). While CIA assigns elemental formulas for compounds containing C, H, O, N, S, and P, IPA is capable of assigning formulas for compounds containing other elements. We used halogenated organic compounds (HOC), a chemical class that is ubiquitous in nature as well as in anthropogenic systems, as an example to demonstrate the capability of Formularity with IPA. A HOC standard mix was used to evaluate the identification confidence of IPA. HOC spikes in NOM and tap water were used to assess HOC identification in natural and anthropogenic matrices. Strategies for reconciling CIA and IPA assignments are discussed. Software and sample databases with documentation are freely available from the PNNL OMICS software repository https://omics.pnl.gov/software/formularity.
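The essence of elemental formula assignment can be sketched as a tolerance search over candidate compositions. The element set and atom bounds below are deliberately tiny, and no valence or nitrogen-rule filters are applied; CIA and IPA use far richer chemical rules plus isotopic-pattern checks:

```python
from itertools import product

# Given a measured exact (neutral, monoisotopic) mass, enumerate candidate
# elemental compositions whose mass falls within a ppm tolerance.

MASS = {"C": 12.0, "H": 1.0078250319, "N": 14.0030740052, "O": 15.9949146221}

def assign_formulas(target, ppm=5.0, max_atoms=(20, 40, 5, 10)):
    """Return (C, H, N, O) tuples within `ppm` of `target` mass."""
    tol = target * ppm / 1e6
    hits = []
    for c, h, n, o in product(*(range(m + 1) for m in max_atoms)):
        mass = (c * MASS["C"] + h * MASS["H"]
                + n * MASS["N"] + o * MASS["O"])
        if abs(mass - target) <= tol:
            hits.append((c, h, n, o))
    return hits

# Glucose, C6H12O6, has a monoisotopic mass of ~180.0634 Da.
print(assign_formulas(180.06339))
```

At 5 ppm the nearest CHNO near-isobar (C7H8N4O2 at 180.0647) falls outside the window, so only the glucose composition survives; at lower resolving power the candidate list grows, which is where isotopic-pattern evidence becomes decisive.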
Butler, William E.; Atai, Nadia; Carter, Bob; Hochberg, Fred
2014-01-01
The Richard Floor Biorepository supports collaborative studies of extracellular vesicles (EVs) found in human fluids and tissue specimens. The current emphasis is on biomarkers for central nervous system neoplasms but its structure may serve as a template for collaborative EV translational studies in other fields. The informatic system provides specimen inventory tracking with bar codes assigned to specimens and containers and projects, is hosted on globalized cloud computing resources, and embeds a suite of shared documents, calendars, and video-conferencing features. Clinical data are recorded in relation to molecular EV attributes and may be tagged with terms drawn from a network of externally maintained ontologies thus offering expansion of the system as the field matures. We fashioned the graphical user interface (GUI) around a web-based data visualization package. This system is now in an early stage of deployment, mainly focused on specimen tracking and clinical, laboratory, and imaging data capture in support of studies to optimize detection and analysis of brain tumour–specific mutations. It currently includes 4,392 specimens drawn from 611 subjects, the majority with brain tumours. As EV science evolves, we plan biorepository changes which may reflect multi-institutional collaborations, proteomic interfaces, additional biofluids, changes in operating procedures and kits for specimen handling, novel procedures for detection of tumour-specific EVs, and for RNA extraction and changes in the taxonomy of EVs. We have used an ontology-driven data model and web-based architecture with a graph theory–driven GUI to accommodate and stimulate the semantic web of EV science. PMID:25317275
Radiation therapy planning and simulation with magnetic resonance images
NASA Astrophysics Data System (ADS)
Boettger, Thomas; Nyholm, Tufve; Karlsson, Magnus; Nunna, Chandrasekhar; Celi, Juan Carlos
2008-03-01
We present a system that allows magnetic resonance (MR) images to serve as the sole primary modality in the radiation therapy (RT) workflow, no longer limiting the user to computed tomography data for RT planning, simulation, and patient localization. The individual steps for achieving this goal are explained in detail. For planning, two MR data sets, MR1 and MR2, are acquired sequentially. For MR1, a standardized ultrashort TE (UTE) sequence is used, enhancing bony anatomy. The sequence for MR2 is chosen to obtain optimal contrast for the target and the organs at risk for each individual patient. Both images are naturally in registration, neglecting elastic soft tissue deformations. The planning software first automatically extracts skin and bony anatomy from MR1. The user can semi-automatically delineate target structures and organs at risk based on MR1 or MR2, associate all segmentations with MR1, and create a plan in the coordinate system of MR1. Projections similar to digitally reconstructed radiographs (DRRs), enhancing bony anatomy, are calculated directly from MR1 and can be used for iso-center definition and setup verification. Furthermore, we present a method for creating a Pseudo-CT data set, which assigns electron densities to the voxels of MR1 based on the skin and bone segmentations. The Pseudo-CT is then used for dose calculation. Results from first tests under clinical conditions show the feasibility of a completely MR-based RT workflow for the necessary clinical cases. The extent to which geometrical distortions influence the accuracy of MR-based RT planning remains to be investigated.
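The Pseudo-CT step, assigning a bulk electron density to each voxel according to its segmentation label, can be sketched as a simple lookup. The label set and density values below are illustrative, not the ones used in the system:

```python
# Map each voxel's segmentation label to a bulk electron-density value
# (Hounsfield-style numbers, chosen purely for illustration).

DENSITY = {"air": -1000.0, "soft": 0.0, "bone": 700.0}

def pseudo_ct(label_image):
    """label_image: 2-D list of labels from the skin/bone segmentation."""
    return [[DENSITY[label] for label in row] for row in label_image]

seg = [["air", "soft", "soft"],
       ["soft", "bone", "soft"]]
print(pseudo_ct(seg))  # -> [[-1000.0, 0.0, 0.0], [0.0, 700.0, 0.0]]
```

Bulk assignment of this kind trades voxel-level accuracy for robustness: dose calculation only needs densities that are approximately right per tissue class.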
Carey, Michael P; Senn, Theresa E; Coury-Doniger, Patricia; Urban, Marguerite A; Vanable, Peter A; Carey, Kate B
2013-09-01
Randomized controlled trials (RCTs) remain the gold standard for evaluating intervention efficacy but are often costly. To optimize their scientific yield, RCTs can be designed to investigate multiple research questions. This paper describes an RCT that used a modified Solomon four-group design to simultaneously evaluate two, theoretically-guided, health promotion interventions as well as assessment reactivity. Recruited participants (N = 1010; 56% male; 69% African American) were randomly assigned to one of four conditions formed by crossing two intervention conditions (i.e., general health promotion vs. sexual risk reduction intervention) with two assessment conditions (i.e., general health vs. sexual health survey). After completing their assigned baseline assessment, participants received the assigned intervention, and returned for follow-ups at 3, 6, 9, and 12 months. In this report, we summarize baseline data, which show high levels of sexual risk behavior; alcohol, marijuana, and tobacco use; and fast food consumption. Sexual risk behaviors and substance use were correlated. Participants reported high satisfaction with both interventions but ratings for the sexual risk reduction intervention were higher. Planned follow-up sessions, and subsequent analyses, will assess changes in health behaviors including sexual risk behaviors. This study design demonstrates one way to optimize the scientific yield of an RCT. © 2013 Elsevier Inc. All rights reserved.
Asnoune, M; Abdelmalek, F; Djelloul, A; Mesghouni, K; Addou, A
2016-11-01
In household waste management, the objective is always to design an optimal integrated system, where the terms 'optimal' and 'integrated' refer generally to a combination of the waste and the techniques of treatment, valorization, and elimination, usually aiming at the lowest possible cost. Optimization of household waste management using operational methodologies has not yet been applied in any Algerian district. We propose an optimization of the valorization of household waste in Tiaret city in order to lower the total management cost. The methodology is modelled by non-linear mathematical equations using 28 decision variables and aims to optimally assign the seven components of household waste (i.e. plastic, cardboard/paper, glass, metals, textiles, organic matter and others) among four treatment centres [i.e. waste-to-energy (WTE) or incineration, composting (CM), anaerobic digestion (ANB) or methanization, and landfilling (LF)]. Analysis of the results shows that the variation in total cost is mainly due to the assignment of waste among the treatment centres and that certain treatments cannot be applied to household waste in Tiaret city. On the other hand, certain valorization techniques have been favoured by the optimization. In this work, four scenarios are proposed to optimize the system cost; the modelling shows that the mixed scenario (the three treatment centres CM, ANB, LF) offers the best combination of waste treatment technologies, with an optimal solution for the system (cost and profit). © The Author(s) 2016.
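Stripped of capacities and non-linear terms, the assignment idea reduces to sending each component to its cheapest treatment centre. The sketch below uses invented per-tonne net costs (a prohibitive cost of 99 marks infeasible component–centre pairs); the paper's actual model is non-linear with 28 decision variables:

```python
# Toy component-to-centre assignment: pick, for each waste component,
# the centre with the lowest net cost (treatment cost minus valorization
# profit).  All costs are invented; 99 marks an infeasible pairing.

components = ["plastic", "cardboard/paper", "glass", "metals",
              "textiles", "organic matter", "others"]
centres = ["WTE", "CM", "ANB", "LF"]

cost = {  # net cost per tonne of sending component -> centre
    "plastic":         {"WTE": 20, "CM": 99, "ANB": 99, "LF": 35},
    "cardboard/paper": {"WTE": 25, "CM": 18, "ANB": 30, "LF": 35},
    "glass":           {"WTE": 99, "CM": 99, "ANB": 99, "LF": 30},
    "metals":          {"WTE": 99, "CM": 99, "ANB": 99, "LF": 30},
    "textiles":        {"WTE": 22, "CM": 99, "ANB": 99, "LF": 35},
    "organic matter":  {"WTE": 28, "CM": 12, "ANB": 10, "LF": 35},
    "others":          {"WTE": 30, "CM": 99, "ANB": 99, "LF": 25},
}

assignment = {c: min(centres, key=lambda j: cost[c][j]) for c in components}
total = sum(cost[c][assignment[c]] for c in components)
for c, j in assignment.items():
    print(f"{c:15s} -> {j}")
print("total cost:", total)
```

Once capacity limits or tonnage-dependent (non-linear) costs are added, the components can no longer be assigned independently, which is why the paper resorts to a full mathematical-programming model.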
Ahn, Dohyun; Seo, Youngnam; Kim, Minkyung; Kwon, Joung Huem; Jung, Younbo; Ahn, Jungsun
2014-01-01
This study examined the role of display size and mode in increasing users' sense of being together with, and psychological immersion in, a virtual character. Using a high-resolution three-dimensional virtual character, this study employed a 2×2 (stereoscopic mode vs. monoscopic mode × actual-human-size vs. small-size display) factorial design in an experiment with 144 participants randomly assigned to each condition. Findings showed that stereoscopic mode had a significant effect on both users' sense of being together and psychological immersion. However, display size affected only the sense of being together. Furthermore, display size was not found to moderate the effect of stereoscopic mode. PMID:24606057
Social-aware data dissemination in opportunistic mobile social networks
NASA Astrophysics Data System (ADS)
Yang, Yibo; Zhao, Honglin; Ma, Jinlong; Han, Xiaowei
Opportunistic Mobile Social Networks (OMSNs), formed by mobile users with social relationships and characteristics, enhance spontaneous communication among users who opportunistically encounter each other. Such networks can be exploited to improve the performance of data forwarding. Discovering optimal relay nodes is one of the important issues for efficient data propagation in OMSNs. Although traditional centrality definitions identify the features of nodes in a network, they cannot effectively identify the influential nodes for data dissemination in OMSNs. Existing protocols take advantage of spatial contact frequency and social characteristics to enhance transmission performance. However, they have not fully exploited the relations and interactions between geographical information, social features, and user interests. In this paper, we first evaluate these three characteristics of users and design a routing protocol called Geo-Social-Interest (GSI) to select optimal relay nodes. We compare the performance of GSI using the real INFOCOM06 data sets. The experimental results demonstrate that GSI outperforms the other protocols, with the highest data delivery ratio and low communication overhead.
47 CFR 64.611 - Internet-based TRS registration.
Code of Federal Regulations, 2012 CFR
2012-10-01
... number assigned or issued to any Registered Internet-based TRS User. (e) Toll free numbers. A VRS or IP... effective date of this Order remove from the Internet-based TRS Numbering Directory any toll free number... 47 Telecommunication 3 2012-10-01 2012-10-01 false Internet-based TRS registration. 64.611 Section...
Uses and abuses of multipliers in the stand prognosis model
David A. Hamilton
1994-01-01
Users of the Stand Prognosis Model may have difficulty selecting the proper set of multipliers to simulate a desired effect, or determining the appropriate value to assign to selected multipliers. A series of examples describes the impact of multipliers on simulated stand development. Guidelines for the proper use of multipliers are presented....
Strategic planning for health care management information systems.
Rosenberger, H R; Kaiser, K M
1985-01-01
Using a planning methodology and a structured design technique for analyzing data and data flow, information requirements can be derived to produce a strategic plan for a management information system. Such a long-range plan classifies information groups and assigns them priorities according to the goals of the organization. The approach emphasizes user involvement.
ERIC Educational Resources Information Center
Schonborn, Konrad J.; Bivall, Petter; Tibell, Lena A. E.
2011-01-01
This study explores tertiary students' interaction with a haptic virtual model representing the specific binding of two biomolecules, a core concept in molecular life science education. Twenty students assigned to a "haptics" (experimental) or "no-haptics" (control) condition performed a "docking" task where users sought the most favourable…
Nicotine Replacement: Effects on Postcessation Weight Gain.
ERIC Educational Resources Information Center
Gross, Janet; And Others
1989-01-01
Examined nicotine replacement effects on postcessation weight gain in smoking cessation volunteers. Randomly assigned abstinent subjects to active nicotine or placebo gum conditions for 10 weeks. Analyses revealed strong evidence for gum effect on weight gain, with active gum users gaining mean total of 3.8 pounds compared with 7.8 pounds for…
B2C Mass Customization in the Classroom
ERIC Educational Resources Information Center
Visich, John K.; Gu, Qiannong; Khumawala, Basheer M.
2012-01-01
The purpose of this article is to describe an internet-based mass customization assignment in Operations Management/Supply Chain Management classes where students utilize the Web site of a company that offers a customized product. Students evaluate the user interface, judge the value proposition of the product they demonstrate, and discuss issues…
The Effects of Target Audience on Social Tagging
ERIC Educational Resources Information Center
Alsarhan, Hesham
2013-01-01
Online social bookmarking systems allow users to assign tags (i.e., keywords) to represent the content of resources. Research on the effects of target audience on social tagging suggests that taggers select different tags for themselves, their community (e.g., family, friends, colleagues), and the general public (Panke & Gaiser, 2009; Pu &…
Any Effects of Different Levels of Online User Identity Revelation?
ERIC Educational Resources Information Center
Yu, Fu-Yun
2012-01-01
This study examined the effects of different levels of identity revelation in relation to aspects most relevant to engaged online learning activities. An online learning system supporting question-generation and peer-assessment was adopted. Three 7th grade classes (N=101) were assigned to three identity revelation modes (real-name, nickname and…
ERIC Educational Resources Information Center
Kline, Lanaii
A computer program that produces three reports based on asset inventory data--i.e. facilities and equipment data--is described. Written in FORTRAN IV (Level G), the program was used on the IBM 360 Model 91 at the University of California at Los Angeles (UCLA). The first report is a listing of data sorted by local, user-assigned identification…
Fuzzy-logic based Q-Learning interference management algorithms in two-tier networks
NASA Astrophysics Data System (ADS)
Xu, Qiang; Xu, Zezhong; Li, Li; Zheng, Yan
2017-10-01
Offloading traffic from the macrocell network and enhancing indoor coverage can be achieved by deploying femtocells. However, the performance of the resulting two-tier network can be impaired by co-tier and cross-tier interference. In this paper, a distributed resource allocation scheme is studied in which each femtocell base station is self-governed and resources cannot be assigned centrally through the gateway. A novel Q-Learning interference management scheme is proposed, divided into a cooperative part and an independent part. In the cooperative algorithm, interference information is exchanged between the cell-edge users, which are classified by fuzzy logic within the same cell. Meanwhile, orthogonal subchannels are allocated to high-rate cell-edge users to disperse the interference power once the data-rate requirement is satisfied. In the independent algorithm, resources are assigned directly according to the minimum-power principle. Simulation results demonstrate significant improvements in average data rate, interference power, and energy efficiency over state-of-the-art resource allocation algorithms.
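The core learning step the abstract refers to can be sketched as a standard tabular Q-Learning update, where each femtocell base station acts as an agent choosing a subchannel. This is a minimal, hypothetical illustration; the state/reward model, constants, and function names below are assumptions for demonstration, not taken from the paper.

```python
import random

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount, exploration
N_SUBCHANNELS = 4

def choose_subchannel(q_row):
    """Epsilon-greedy action selection over subchannels."""
    if random.random() < EPSILON:
        return random.randrange(N_SUBCHANNELS)
    return max(range(N_SUBCHANNELS), key=lambda a: q_row[a])

def q_update(q, state, action, reward, next_state):
    """Q-Learning update: Q <- Q + alpha * (r + gamma * max Q' - Q)."""
    best_next = max(q[next_state])
    q[state][action] += ALPHA * (reward + GAMMA * best_next - q[state][action])

# Tiny demo with two interference states (e.g. cell-edge vs. cell-centre user).
q = [[0.0] * N_SUBCHANNELS for _ in range(2)]
q_update(q, state=0, action=1, reward=1.0, next_state=1)
```

In the paper's cooperative variant, the reward would additionally reflect interference information exchanged between cell-edge users; the update rule itself is unchanged.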
Optimizing Earth Data Search Ranking using Deep Learning and Real-time User Behaviour
NASA Astrophysics Data System (ADS)
Jiang, Y.; Yang, C. P.; Armstrong, E. M.; Huang, T.; Moroni, D. F.; McGibbney, L. J.; Greguska, F. R., III
2017-12-01
Finding Earth science data has been a challenging problem given both the quantity of data available and the heterogeneity of the data across a wide variety of domains. Current search engines in most geospatial data portals tend to induce end users to focus on a single data characteristic dimension (e.g., term frequency-inverse document frequency (TF-IDF) score, popularity, release date, etc.). This approach largely fails to account for users' multidimensional preferences for geospatial data, and hence is likely to result in a less-than-optimal user experience when discovering the most applicable dataset among a vast range of available datasets. As users interact with search engines, sufficient information is already hidden in the log files. Compared with explicit feedback data, information that can be derived or extracted from log files is virtually free and substantially more timely. In this dissertation, I propose an online deep learning framework that can quickly update the learning function based on real-time user clickstream data. The contributions of this framework include 1) a log processor that can ingest, process, and create training data from web logs in real time; 2) a query understanding module to better interpret users' search intent using web log processing results and metadata; 3) a feature extractor that identifies ranking features representing users' multidimensional interests in geospatial data; and 4) a deep learning based ranking algorithm that can be trained incrementally using user behavior data. The search ranking results will be evaluated using precision at K and normalized discounted cumulative gain (NDCG).
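The evaluation metric named at the end of the abstract, NDCG@K, can be computed as follows. This is a generic textbook sketch, not code from the dissertation; the relevance grades in the demo are made-up example data.

```python
import math

def dcg_at_k(relevances, k):
    """Discounted cumulative gain at rank K with the standard log2 discount."""
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(relevances, k):
    """Normalize DCG by that of the ideal (relevance-sorted) ranking."""
    ideal = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / ideal if ideal > 0 else 0.0

# A ranking that surfaces the relevance-3 dataset first scores close to 1.0;
# one that buries it scores lower.
score = ndcg_at_k([3, 2, 0, 1], k=4)
```

Because the ideal ranking normalizes the score into [0, 1], NDCG@K lets the ranker be compared across queries with different numbers of relevant datasets.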
NASA Astrophysics Data System (ADS)
He, Hao; Wang, Jun; Zhu, Jiang; Li, Shaoqian
2010-12-01
In this paper, we investigate the cross-layer design of joint channel access and transmission-rate adaptation in cognitive radio (CR) networks with multiple channels, for both centralized and decentralized cases. Our goal is to maximize the throughput of the CR network under a transmission power constraint while taking spectrum sensing errors into account. In the centralized case, this problem is formulated as a special constrained Markov decision process (CMDP), which can be solved by standard linear programming (LP). Because the complexity of finding the optimal policy by LP increases exponentially with the size of the action space and state space, we further apply action-set reduction and state aggregation to reduce the complexity without loss of optimality. For ease of implementation, we also consider pure policy design and analyze its characteristics. In the decentralized case, where only local information is available and there is no coordination among the CR users, we prove the existence of a constrained Nash equilibrium and obtain the optimal decentralized policy. Finally, for the case in which the traffic-load parameters of the licensed users are unknown to the CR users, we propose two methods to estimate the parameters for two different cases. Numerical results validate the theoretical analysis.
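The pure-policy design the abstract mentions can be illustrated with a toy version of the constrained problem: choose a deterministic mapping from sensed channel states to rate levels that maximizes expected throughput subject to an average power budget. The two-state model and all numbers below are illustrative assumptions, not parameters from the paper (whose general randomized-policy case is solved by LP over the CMDP).

```python
import itertools

STATES = ("idle", "busy")            # sensed state of the licensed channel
STATE_PROB = {"idle": 0.7, "busy": 0.3}
ACTIONS = (0, 1, 2)                  # transmission rate levels
POWER = {0: 0.0, 1: 1.0, 2: 2.5}     # power cost per rate level
THROUGHPUT = {                       # expected rate, discounted by sensing errors
    "idle": {0: 0.0, 1: 0.9, 2: 1.7},
    "busy": {0: 0.0, 1: 0.2, 2: 0.3},
}
POWER_BUDGET = 1.0

def evaluate(policy):
    """Expected throughput and average power of a pure policy {state: action}."""
    tput = sum(STATE_PROB[s] * THROUGHPUT[s][policy[s]] for s in STATES)
    power = sum(STATE_PROB[s] * POWER[policy[s]] for s in STATES)
    return tput, power

# Exhaustive search over the 9 pure policies, keeping only feasible ones.
best = max(
    (dict(zip(STATES, acts)) for acts in itertools.product(ACTIONS, repeat=2)),
    key=lambda p: evaluate(p)[0] if evaluate(p)[1] <= POWER_BUDGET else -1.0,
)
```

With these numbers the aggressive rate (2) is infeasible in the idle state because its average power alone exceeds the budget, so the search settles on a moderate rate in both states; the LP formulation generalizes this by optimizing over state-action occupation measures.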