ERIC Educational Resources Information Center
Martins, Rosane Maria; Chaves, Magali Ribeiro; Pirmez, Luci; Rust da Costa Carmo, Luiz Fernando
2001-01-01
Discussion of the need to filter and retrieve relevant information from the Internet focuses on the use of mobile agents, specific software components based on distributed artificial intelligence and integrated systems. Surveys agent technology and discusses the agent-building package used to develop two applications using IBM's Aglet…
DOE Office of Scientific and Technical Information (OSTI.GOV)
VOLTTRON is an agent execution platform providing services to its agents that allow them to easily communicate with physical devices and other resources. VOLTTRON delivers an innovative distributed control and sensing software platform that supports modern control strategies, including agent-based and transaction-based controls. It enables mobile and stationary software agents to perform information gathering, processing, and control actions. VOLTTRON can independently manage a wide range of applications, such as HVAC systems, electric vehicles, distributed energy resources, or entire building loads, leading to improved operational efficiency.
Proceedings 3rd NASA/IEEE Workshop on Formal Approaches to Agent-Based Systems (FAABS-III)
NASA Technical Reports Server (NTRS)
Hinchey, Michael (Editor); Rash, James (Editor); Truszkowski, Walt (Editor); Rouff, Christopher (Editor)
2004-01-01
These proceedings contain 18 papers and 4 poster presentations, covering topics such as multi-agent systems, agent-based control, formalism, norms, and physical and biological models of agent-based systems. Applications presented in the proceedings include systems analysis, software engineering, computer networks, and robot control.
Lessons Learned from Autonomous Sciencecraft Experiment
NASA Technical Reports Server (NTRS)
Chien, Steve A.; Sherwood, Rob; Tran, Daniel; Cichy, Benjamin; Rabideau, Gregg; Castano, Rebecca; Davies, Ashley; Mandl, Dan; Frye, Stuart; Trout, Bruce;
2005-01-01
An Autonomous Science Agent has been flying onboard the Earth Observing One spacecraft since 2003. This software enables the spacecraft to autonomously detect and respond to science events occurring on the Earth, such as volcanoes, flooding, and snow melt. The package includes AI-based software systems that perform science data analysis, deliberative planning, and robust run-time execution. This software is in routine use to fly the EO-1 mission. In this paper we briefly review the agent architecture and discuss lessons learned from this multi-year flight effort pertinent to the deployment of software agents in critical applications.
Integrating CLIPS applications into heterogeneous distributed systems
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1991-01-01
SOCIAL is an advanced, object-oriented development tool for integrating intelligent and conventional applications across heterogeneous hardware and software platforms. SOCIAL defines a family of 'wrapper' objects called agents, which incorporate predefined capabilities for distributed communication and control. Developers embed applications within agents and establish interactions between distributed agents via non-intrusive message-based interfaces. This paper describes a predefined SOCIAL agent that is specialized for integrating C Language Integrated Production System (CLIPS)-based applications. The agent's high-level Application Programming Interface supports bidirectional flow of data, knowledge, and commands to other agents, enabling CLIPS applications to initiate interactions autonomously, and respond to requests and results from heterogeneous remote systems. The design and operation of CLIPS agents are illustrated with two distributed applications that integrate CLIPS-based expert systems with other intelligent systems for isolating and mapping problems in the Space Shuttle Launch Processing System at the NASA Kennedy Space Center.
Smart Aerospace eCommerce: Using Intelligent Agents in a NASA Mission Services Ordering Application
NASA Technical Reports Server (NTRS)
Moleski, Walt; Luczak, Ed; Morris, Kim; Clayton, Bill; Scherf, Patricia; Obenschain, Arthur F. (Technical Monitor)
2002-01-01
This paper describes how intelligent agent technology was successfully prototyped and then deployed in a smart eCommerce application for NASA. An intelligent software agent called the Intelligent Service Validation Agent (ISVA) was added to an existing web-based ordering application to validate complex orders for spacecraft mission services. This integration of intelligent agent technology with conventional web technology satisfies an immediate NASA need to reduce manual order processing costs. The ISVA agent checks orders for completeness, consistency, and correctness, and notifies users of detected problems. ISVA uses NASA business rules and a knowledge base of NASA services, and is implemented using the Java Expert System Shell (Jess), a fast rule-based inference engine. The paper discusses the design of the agent and knowledge base, and the prototyping and deployment approach. It also discusses future directions and other applications, and discusses lessons-learned that may help other projects make their aerospace eCommerce applications smarter.
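ISVA's rules are written in Jess, but the rule-based validation pattern the abstract describes can be sketched in Python. This is a minimal illustration; the field names (`mission_id`, `services`, `antenna`) and the rule contents are invented, not ISVA's actual schema or business rules:

```python
# Rule-based order validation in the style described above: each rule
# inspects the order and returns a problem message, or None if it passes.
# All field names and rule contents here are hypothetical examples.
RULES = [
    lambda o: None if o.get("mission_id") else "order is missing a mission_id",
    lambda o: None if o.get("services") else "order requests no services",
    lambda o: ("tracking service requires an antenna assignment"
               if "tracking" in o.get("services", []) and not o.get("antenna")
               else None),
]

def validate(order):
    """Run every rule against the order and collect detected problems
    (completeness, consistency, correctness) to report to the user."""
    return [msg for rule in RULES if (msg := rule(order)) is not None]

problems = validate({"mission_id": "M-17", "services": ["tracking"]})
```

A production rule engine such as Jess adds pattern matching and forward chaining on top of this idea; the sketch only shows the check-and-report loop.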
NASA Technical Reports Server (NTRS)
Shoham, Yoav
1994-01-01
The goal of our research is a methodology for creating robust software in distributed and dynamic environments. The approach taken is to endow software objects with explicit information about one another, to have them interact through a commitment mechanism, and to equip them with a speech-act-based communication language. System-level applications include software interoperation and compositionality. A government application of specific interest is an infrastructure for coordination among multiple planners. Daily-activity applications include personal software assistants, such as programmable email, scheduling, and newsgroup agents. Research topics include the definition of the mental state of agents, the design of agent languages as well as interpreters for those languages, and mechanisms for coordination within agent societies, such as artificial social laws and conventions.
Application of the AHP method in modeling the trust and reputation of software agents
NASA Astrophysics Data System (ADS)
Zytniewski, Mariusz; Klementa, Marek; Skorupka, Dariusz; Stanek, Stanislaw; Duchaczek, Artur
2016-06-01
Given the unique characteristics of cyberspace and, in particular, the number of inherent security threats, communication between software agents becomes a highly complex issue and a major challenge that, on the one hand, needs to be continuously monitored and, on the other, awaits new solutions addressing its vulnerabilities. An approach that has recently come into view mimics mechanisms typical of social systems and is based on trust and reputation, which assist agents in deciding which other agents to interact with. The paper offers an enhancement to existing trust and reputation models, involving the application of the AHP method that is widely used for decision support in social systems, notably for risk analysis. To this end, it is proposed to expand the underlying conceptual basis by including such notions as self-trust and social trust, and to apply these to software agents. The discussion is concluded with an account of an experiment aimed at testing the effectiveness of the proposed solution.
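The core AHP computation that such trust models build on can be sketched as follows. This is a minimal illustration using the row geometric-mean approximation of the priority vector; the comparison values and trust criteria are invented, not taken from the paper:

```python
import math

def ahp_weights(M):
    """Derive priority weights from a pairwise comparison matrix M,
    where M[i][j] says how strongly criterion i is preferred over j
    (and M[j][i] = 1/M[i][j]), via the row geometric-mean method."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical trust criteria for weighing another agent:
# direct experience, third-party reputation reports, self-trust.
M = [
    [1,     3,     5],
    [1 / 3, 1,     2],
    [1 / 5, 1 / 2, 1],
]
weights = ahp_weights(M)  # largest weight goes to direct experience
```

A full AHP treatment would also compute a consistency ratio for the matrix; the sketch stops at the priority vector.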
Activity-Centric Approach to Distributed Programming
NASA Technical Reports Server (NTRS)
Levy, Renato; Satapathy, Goutam; Lang, Jun
2004-01-01
The first phase of an effort to develop a NASA version of the Cybele software system has been completed. To give meaning to even a highly abbreviated summary of the modifications to be embodied in the NASA version, it is necessary to present the following background information on Cybele: Cybele is a proprietary software infrastructure for use by programmers in developing agent-based application programs [complex application programs that contain autonomous, interacting components (agents)]. Cybele provides support for event handling from multiple sources, multithreading, concurrency control, migration, and load balancing. A Cybele agent follows a programming paradigm, called activity-centric programming, that provides an abstraction over system-level thread mechanisms. Activity-centric programming relieves application programmers of the complex tasks of thread management, concurrency control, and event management. To provide such functionality, activity-centric programming demands support from other software layers. This concludes the background information. In the first phase of the present development, a new architecture for Cybele was defined. In this architecture, Cybele follows a modular, service-based approach to coupling the programming and service layers of the software architecture. In a service-based approach, the functionalities supported by activity-centric programming are apportioned, according to their characteristics, among several groups called services. A well-defined interface among all such services serves as a path that facilitates the maintenance and enhancement of those services without adverse effect on the whole software framework. The activity-centric application programming interface (API) is part of a kernel. The kernel API calls the services through their published interfaces. This approach makes any application code written exclusively against the API portable across any configuration of Cybele.
Protecting software agents from malicious hosts using quantum computing
NASA Astrophysics Data System (ADS)
Reisner, John; Donkor, Eric
2000-07-01
We evaluate how quantum computing can be applied to security problems for software agents. Agent-based computing, which merges technological advances in artificial intelligence and mobile computing, is a rapidly growing domain, especially in applications such as electronic commerce, network management, information retrieval, and mission planning. System security is one of the more prominent research areas in agent-based computing, and the specific problem of protecting a mobile agent from a potentially hostile host is one of the most difficult of these challenges. In this work, we describe our agent model and discuss the capabilities and limitations of classical solutions to the malicious host problem. Quantum computing may be extremely helpful in addressing the limitations of classical solutions to this problem. This paper highlights some of the areas where quantum computing could be applied to agent security.
The Role of Web-Based Simulations in Technology Education
ERIC Educational Resources Information Center
Page, Tom
2009-01-01
This paper discusses the theoretical underpinning and main aspects of the development and application of the web-orientation agent (WOA) and presents preliminary results concerning its use in university studies. The web-orientation agent (WOA) is a software-based tool which produces an interactive learning environment offering support and guidance…
Application and Implications of Agent Technology for Librarians.
ERIC Educational Resources Information Center
Nardi, Bonnie A.; O'Day, Vicki L.
1998-01-01
Examines intelligent software agents, presents nine design principles aimed specifically at the technology perspective (to personalize task performance and general principles), and discusses what librarians can do that software agents (agents defined as activity-aware software programs) cannot do. Describes an information ecology that integrates…
Integrating agent-based models and GIS as a virtual urban dynamic laboratory
NASA Astrophysics Data System (ADS)
Chen, Peng; Liu, Miaolong
2007-06-01
Based on the agent-based model (ABM) and a spatial data model, this paper discusses a tight-coupling method for integrating GIS and ABM. Using object orientation for both spatial data and spatial process models facilitates their integration, allowing exploration and explanation of spatial-temporal phenomena such as urban dynamics. To better understand how tight coupling might proceed, and to evaluate the possible functional and efficiency gains from such a coupling, the agent-based model and the spatial data model are discussed first, followed by the relationships governing the interaction between spatial data models and agent-based process models. A realistic crowd-flow simulation experiment is then presented. Using tools provided by general GIS systems and a few specific programming languages, a new software system integrating GIS and MAS has been developed as a virtual laboratory for simulating pedestrian flows in a crowd activity centre. Supported by this system, the dynamic evolution of pedestrian flows (the dispersal of spectators) in a crowd activity centre, the Shanghai Stadium, has been simulated successfully as an applicable case. The paper closes by pointing out new research problems for future work.
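A pedestrian-dispersal dynamic of the kind simulated above can be reduced to a toy model: agents on a grid step toward the nearest exit each tick and leave when they reach it. This sketch is purely illustrative and unrelated to the authors' GIS/MAS implementation:

```python
def disperse(positions, exits, steps):
    """Advance a toy crowd-dispersal model: each tick, every pedestrian
    moves one grid cell toward its nearest exit (Chebyshev distance);
    pedestrians that reach an exit cell leave the simulation."""
    def toward(p, e):
        step = lambda a, b: a + (b > a) - (b < a)  # move one cell toward b
        return (step(p[0], e[0]), step(p[1], e[1]))

    for _ in range(steps):
        remaining = []
        for p in positions:
            e = min(exits, key=lambda q: max(abs(p[0] - q[0]), abs(p[1] - q[1])))
            nxt = toward(p, e)
            if nxt not in exits:      # agent still inside the venue
                remaining.append(nxt)
        positions = remaining
    return positions

# Two pedestrians, one exit: after two ticks the nearer one has left.
left_inside = disperse([(2, 2), (0, 3)], exits=[(0, 0)], steps=2)
```

A real coupled model would draw the walkable space and exits from GIS layers and add collision avoidance; the sketch only shows the tick loop.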
Intelligent Agents for the Digital Battlefield
1998-11-01
A specific outcome of our long-term research will be the development of a collaborative agent technology system, CATS, that will provide the underlying software infrastructure needed to build large, heterogeneous, distributed agent applications. CATS will provide a software environment through which multiple intelligent agents may interact with other agents, both human and computational. In addition, CATS will contain a number of intelligent agent components that will be useful for a wide variety of applications.
GDSCalc: A Web-Based Application for Evaluating Discrete Graph Dynamical Systems
Elmeligy Abdelhamid, Sherif H.; Kuhlman, Chris J.; Marathe, Madhav V.; Mortveit, Henning S.; Ravi, S. S.
2015-01-01
Discrete dynamical systems are used to model various realistic systems in network science, from social unrest in human populations to regulation in biological networks. A common approach is to model the agents of a system as vertices of a graph, and the pairwise interactions between agents as edges. Agents are in one of a finite set of states at each discrete time step and are assigned functions that describe how their states change based on neighborhood relations. Full characterization of state transitions of one system can give insights into fundamental behaviors of other dynamical systems. In this paper, we describe a discrete graph dynamical systems (GDSs) application called GDSCalc for computing and characterizing system dynamics. It is an open access system that is used through a web interface. We provide an overview of GDS theory. This theory is the basis of the web application; i.e., an understanding of GDS provides an understanding of the software features, while abstracting away implementation details. We present a set of illustrative examples to demonstrate its use in education and research. Finally, we compare GDSCalc with other discrete dynamical system software tools. Our perspective is that no single software tool will perform all computations that may be required by all users; tools typically have particular features that are more suitable for some tasks. We situate GDSCalc within this space of software tools. PMID:26263006
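The synchronous update scheme underlying a GDS can be sketched directly. The 4-cycle graph, threshold local function, and starting configuration below are illustrative choices, not examples taken from GDSCalc:

```python
def gds_step(adj, state, local_fn):
    """One synchronous GDS update: every vertex applies its local
    function to its own state and its neighbors' states, all relative
    to the same time step."""
    return {v: local_fn(state[v], [state[u] for u in adj[v]]) for v in adj}

# A 4-cycle with a threshold-2 local function: a vertex is 1 next step
# when at least two of (itself + its neighbors) are currently 1.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
threshold2 = lambda s, nbrs: int(s + sum(nbrs) >= 2)

state = {0: 1, 1: 1, 2: 0, 3: 0}
nxt = gds_step(adj, state, threshold2)  # this configuration is a fixed point
```

Iterating `gds_step` and recording the visited configurations yields the phase-space characterization (fixed points, cycles) that tools like GDSCalc compute exhaustively.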
Using web technology and Java mobile software agents to manage outside referrals.
Murphy, S. N.; Ng, T.; Sittig, D. F.; Barnett, G. O.
1998-01-01
A prototype, web-based referral application was created with the objective of providing outside primary care providers (PCPs) the means to refer patients to the Massachusetts General Hospital and the Brigham and Women's Hospital. The application was designed to achieve two primary objectives: providing the consultant with enough data to make decisions even at the initial visit, and providing the PCP with a prompt response from the consultant. The system uses a web browser/server to initiate the referral and Java mobile software agents to support the workflow of the referral. This combination provides a light client implementation that can run on the wide variety of hardware and software platforms found in PCP offices, and can guarantee a high degree of security for the PCP's computer. Agents can be adapted to support the wide variety of data types that may be used in referral transactions, including reports with complex presentation needs and scanned (faxed) images. Agents can also be delivered to the PCP as running applications that perform ongoing queries and alerts at the PCP's office. Finally, the agent architecture is designed to scale in a natural and seamless manner for unforeseen future needs. PMID:9929190
Agents, assemblers, and ANTS: scheduling assembly with market and biological software mechanisms
NASA Astrophysics Data System (ADS)
Toth-Fejel, Tihamer T.
2000-06-01
Nanoscale assemblers will need robust, scalable, flexible, and well-understood mechanisms such as software agents to control them. This paper discusses assemblers and agents, and proposes a taxonomy of their possible interaction. Molecular assembly is seen as a special case of general assembly, subject to many of the same issues, such as the advantages of convergent assembly, and the problem of scheduling. This paper discusses the contract net architecture of ANTS, an agent-based scheduling application under development. It also describes an algorithm for least commitment scheduling, which uses probabilistic committed capacity profiles of resources over time, along with realistic costs, to provide an abstract search space over which the agents can wander to quickly find optimal solutions.
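The contract-net mechanism at the heart of ANTS can be reduced to a single announce-bid-award round; the agent names and bid logic below are invented for illustration:

```python
def contract_net(task, agents):
    """One minimal contract-net round: announce `task` to every
    contractor, collect bids (None means the agent declines), and
    award the task to the lowest bidder."""
    bids = {name: bid_fn(task) for name, bid_fn in agents.items()}
    bids = {name: b for name, b in bids.items() if b is not None}
    if not bids:
        return None  # no contractor can take the task
    winner = min(bids, key=bids.get)
    return winner, bids[winner]

# Hypothetical assembler agents bidding an estimated cost for a task.
agents = {
    "assembler_a": lambda task: 4.0 if task == "weld" else None,
    "assembler_b": lambda task: 2.5,
    "assembler_c": lambda task: None,
}
award = contract_net("weld", agents)
```

A least-commitment scheduler like the one described would replace the scalar bids with probabilistic committed-capacity profiles over time, but the negotiation skeleton is the same.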
A Coupled Simulation Architecture for Agent-Based/Geohydrological Modelling
NASA Astrophysics Data System (ADS)
Jaxa-Rozen, M.
2016-12-01
The quantitative modelling of social-ecological systems can provide useful insights into the interplay between social and environmental processes, and their impact on emergent system dynamics. However, such models should acknowledge the complexity and uncertainty of both of the underlying subsystems. For instance, the agent-based models which are increasingly popular for groundwater management studies can be made more useful by directly accounting for the hydrological processes which drive environmental outcomes. Conversely, conventional environmental models can benefit from an agent-based depiction of the feedbacks and heuristics which influence the decisions of groundwater users. From this perspective, this work describes a Python-based software architecture which couples the popular NetLogo agent-based platform with the MODFLOW/SEAWAT geohydrological modelling environment. This approach enables users to implement agent-based models in NetLogo's user-friendly platform, while benefiting from the full capabilities of MODFLOW/SEAWAT packages or reusing existing geohydrological models. The software architecture is based on the pyNetLogo connector, which provides an interface between the NetLogo agent-based modelling software and the Python programming language. This functionality is then extended and combined with Python's object-oriented features, to design a simulation architecture which couples NetLogo with MODFLOW/SEAWAT through the FloPy library (Bakker et al., 2016). The Python programming language also provides access to a range of external packages which can be used for testing and analysing the coupled models, which is illustrated for an application of Aquifer Thermal Energy Storage (ATES).
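The alternation that pyNetLogo and FloPy mediate in the architecture above can be sketched with stand-in models. Both classes and all numbers here are toy assumptions, not NetLogo or MODFLOW behavior:

```python
class FarmerAgents:
    """Stand-in for the NetLogo side: each agent pumps one unit only
    while the head at its well stays above a personal cutoff."""
    def __init__(self, cutoffs):
        self.cutoffs = cutoffs

    def decide_pumping(self, heads):
        return [1.0 if h > c else 0.0 for h, c in zip(heads, self.cutoffs)]

class Aquifer:
    """Stand-in for the MODFLOW/SEAWAT side: head in each cell drops
    with pumping and recovers by a fixed recharge each step."""
    def __init__(self, heads, recharge=0.2):
        self.heads = heads
        self.recharge = recharge

    def step(self, pumping):
        self.heads = [h - p + self.recharge
                      for h, p in zip(self.heads, pumping)]

def coupled_run(agents, aquifer, steps):
    # The coupled loop: agents observe heads and choose pumping rates,
    # then the groundwater model advances with those rates.
    for _ in range(steps):
        aquifer.step(agents.decide_pumping(aquifer.heads))
    return aquifer.heads

final_heads = coupled_run(FarmerAgents([5.0, 5.0]), Aquifer([10.0, 4.0]), steps=3)
```

In the real architecture, `decide_pumping` would be a NetLogo command invoked through pyNetLogo and `step` a MODFLOW time step driven through FloPy; only the observe-decide-advance alternation is shown here.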
The Methodology for Developing Mobile Agent Application for Ubiquitous Environment
NASA Astrophysics Data System (ADS)
Matsuzaki, Kazutaka; Yoshioka, Nobukazu; Honiden, Shinichi
This study provides a methodology that enables flexible and reusable development of mobile agent applications for mobility-aware indoor environments. The methodology, named the Workflow-Awareness model, is based on the concept of a pair of mobile agents cooperating to perform a given task. A monolithic mobile agent application with numerous concerns in a mobility-aware setting is divided into a master agent (MA) and a shadow agent (SA) according to the type of task. The MA executes the main application logic, which includes monitoring the user's physical movement and coordinating various services. The SA performs additional environment-dependent tasks to help the MA achieve efficient execution without losing application logic. "Workflow-awareness (WFA)" means that the SA knows the MA's execution state transitions, so the SA can provide the proper task at the proper time. A prototype implementation of the methodology uses AspectJ, which automates WFA by weaving communication modules into both the MA and the SA. The methodology's usefulness is analyzed with respect to efficiency and software engineering. Regarding efficiency, the overhead of WFA is small relative to total execution time; from a software engineering view, WFA provides a mechanism for deploying one application in various situations.
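The MA/SA relationship can be sketched as an observer pattern: the master broadcasts each workflow state transition and the shadow reacts with environment-specific side tasks. Class, state, and task names are illustrative, not the paper's API, and the real system weaves the notification in with AspectJ rather than explicit calls:

```python
class MasterAgent:
    """MA sketch: runs the main workflow and broadcasts every state
    transition to attached shadow agents (the workflow-awareness idea)."""
    def __init__(self):
        self.shadows = []
        self.log = []

    def attach(self, shadow):
        self.shadows.append(shadow)

    def transition(self, state):
        self.log.append(state)
        for s in self.shadows:
            s.on_state(state)

class ShadowAgent:
    """SA sketch: reacts to MA workflow states with side tasks chosen
    for the current environment (task names are hypothetical)."""
    def __init__(self):
        self.performed = []

    def on_state(self, state):
        if state == "user_moved":
            self.performed.append("prefetch_local_services")

ma, sa = MasterAgent(), ShadowAgent()
ma.attach(sa)
for st in ["start", "user_moved", "done"]:
    ma.transition(st)
```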
A Unified Approach to Model-Based Planning and Execution
NASA Technical Reports Server (NTRS)
Muscettola, Nicola; Dorais, Gregory A.; Fry, Chuck; Levinson, Richard; Plaunt, Christian; Norvig, Peter (Technical Monitor)
2000-01-01
Writing autonomous software is complex, requiring the coordination of functionally and technologically diverse software modules. System and mission engineers must rely on specialists familiar with the different software modules to translate requirements into application software. Also, each module often encodes the same requirement in different forms. The results are high costs and reduced reliability due to the difficulty of tracking discrepancies among these encodings. In this paper we describe a unified approach to planning and execution that we believe provides a single representational and computational framework for an autonomous agent. We identify the four main components whose interplay provides the basis for the agent's autonomous behavior: the domain model, the plan database, the plan running module, and the planner modules. This representational and problem-solving approach can be applied at all levels of the architecture of a complex agent, such as Remote Agent. In the rest of the paper we briefly describe the Remote Agent architecture. The new agent architecture proposed here aims at achieving the full Remote Agent functionality. We then give the fundamental ideas behind the new agent architecture and point out some implications of its structure, mainly in the area of reactivity and the interaction between reactive and deliberative decision making. We conclude with related work and current status.
A Software Framework for Remote Patient Monitoring by Using Multi-Agent Systems Support
2017-01-01
Background Although there have been significant advances in network, hardware, and software technologies, the health care environment has not taken advantage of these developments to solve many of its inherent problems. Research activities in these 3 areas make it possible to apply advanced technologies to address many of these issues such as real-time monitoring of a large number of patients, particularly where a timely response is critical. Objective The objective of this research was to design and develop innovative technological solutions to offer a more proactive and reliable medical care environment. The short-term and primary goal was to construct IoT4Health, a flexible software framework to generate a range of Internet of things (IoT) applications, containing components such as multi-agent systems that are designed to perform Remote Patient Monitoring (RPM) activities autonomously. An investigation into its full potential to conduct such patient monitoring activities in a more proactive way is an expected future step. Methods A framework methodology was selected to evaluate whether the RPM domain had the potential to generate customized applications that could achieve the stated goal of being responsive and flexible within the RPM domain. As a proof of concept of the software framework’s flexibility, 3 applications were developed with different implementations for each framework hot spot to demonstrate potential. Agents4Health was selected to illustrate the instantiation process and IoT4Health’s operation. To develop more concrete indicators of the responsiveness of the simulated care environment, an experiment was conducted while Agents4Health was operating, to measure the number of delays incurred in monitoring the tasks performed by agents. Results IoT4Health’s construction can be highlighted as our contribution to the development of eHealth solutions. As a software framework, IoT4Health offers extensibility points for the generation of applications. 
Applications can extend the framework in the following ways: identification, collection, storage, recovery, visualization, monitoring, anomaly detection, resource notification, and dynamic reconfiguration. Based on other outcomes involving observation of the resulting applications, it was noted that its design contributed toward more proactive patient monitoring. Through these experimental systems, anomalies were detected in real time, with agents sending notifications instantly to health providers. Conclusions We conclude that the cost-benefit of constructing a more generic and complex system instead of a custom-made software system demonstrated the worth of the approach, making it possible to generate applications in this domain in a more timely fashion. PMID:28347973
Model-Driven Architecture for Agent-Based Systems
NASA Technical Reports Server (NTRS)
Gradanin, Denis; Singh, H. Lally; Bohner, Shawn A.; Hinchey, Michael G.
2004-01-01
The Model Driven Architecture (MDA) approach uses a platform-independent model to define system functionality, or requirements, using some specification language. The requirements are then translated to a platform-specific model for implementation. An agent architecture based on the human cognitive model of planning, the Cognitive Agent Architecture (Cougaar), is selected as the implementation platform. The resulting Cougaar MDA prescribes certain kinds of models to be used, how those models may be prepared, and the relationships among the different kinds of models. Using the existing Cougaar architecture, the level of application composition is elevated from individual components to domain-level model specifications in order to generate software artifacts. Software artifact generation is based on a metamodel. Each component maps to a UML structured component, which is then converted into multiple artifacts: Cougaar/Java code, documentation, and test cases.
A Buyer Behaviour Framework for the Development and Design of Software Agents in E-Commerce.
ERIC Educational Resources Information Center
Sproule, Susan; Archer, Norm
2000-01-01
Software agents are computer programs that run in the background and perform tasks autonomously as delegated by the user. This paper blends models from marketing research and findings from the field of decision support systems to build a framework for the design of software agents to support e-commerce buying applications. (Contains 35…
Integration of the Remote Agent for the NASA Deep Space One Autonomy Experiment
NASA Technical Reports Server (NTRS)
Dorais, Gregory A.; Bernard, Douglas E.; Gamble, Edward B., Jr.; Kanefsky, Bob; Kurien, James; Muscettola, Nicola; Nayak, P. Pandurang; Rajan, Kanna; Lau, Sonie (Technical Monitor)
1998-01-01
This paper describes the integration of the Remote Agent (RA), a spacecraft autonomy system which is scheduled to control the Deep Space 1 spacecraft during a flight experiment in 1999. The RA is a reusable, model-based autonomy system that is quite different from software typically used to control an aerospace system. We describe the integration challenges we faced, how we addressed them, and the lessons learned. We focus on those aspects of integrating the RA that were either easier or more difficult than integrating a more traditional large software application, because the RA is a model-based autonomous system. A number of characteristics of the RA made the integration process easier. One example is the model-based nature of the RA. Since the RA is model-based, most of its behavior is not hard-coded into procedural program code. Instead, engineers specify high-level models of the spacecraft's components, from which the Remote Agent automatically derives correct system-wide behavior on the fly. This high-level, modular, and declarative software description allowed some interfaces between RA components, and between the RA and the flight software, to be automatically generated and tested for completeness against the Remote Agent's models. In addition, the Remote Agent's model-based diagnosis system automatically diagnoses when the RA models are not consistent with the behavior of the spacecraft. In flight, this feature is used to diagnose failures in the spacecraft hardware. During integration, it proved valuable in finding problems in the spacecraft simulator or flight software. Furthermore, when modifications are made to the spacecraft hardware or flight software, the RA models are easily changed because they only capture a description of the spacecraft; one does not have to maintain procedural code that implements the correct behavior for every expected situation. On the other hand, several features of the RA made it more difficult to integrate than typical flight software.
For example, the definition of correct behavior is more difficult to specify for a system that is expected to reason about and flexibly react to its environment than for a traditional flight software system. Consequently, whenever a change is made to the RA it is more time consuming to determine if the resulting behavior is correct. We conclude the paper with a discussion of future work on the Remote Agent as well as recommendations to ease integration of similar autonomy projects.
NASA Astrophysics Data System (ADS)
Rimland, Jeffrey; McNeese, Michael; Hall, David
2013-05-01
Although the capability of computer-based artificial intelligence techniques for decision-making and situational awareness has seen notable improvement over the last several decades, the current state of the art still falls short of creating computer systems capable of autonomously making complex decisions and judgments in many domains where data is nuanced and accountability is high. However, there is a great deal of potential for hybrid systems in which software applications augment human capabilities by focusing the analyst's attention on relevant information elements, based on both a priori knowledge of the analyst's goals and the processing/correlation of a series of data streams too numerous and heterogeneous for the analyst to digest without assistance. Researchers at Penn State University are exploring ways in which an information framework influenced by Klein's Recognition-Primed Decision (RPD) model, Endsley's model of situational awareness, and the Joint Directors of Laboratories (JDL) data fusion process model can be implemented through a novel combination of Complex Event Processing (CEP) and Multi-Agent Software (MAS). Though originally designed for stock market and financial applications, the high-performance, data-driven nature of CEP techniques provides a natural complement to the proven capabilities of MAS systems for modeling naturalistic decision-making, performing process adjudication, and optimizing networked processing and cognition via the use of "mobile agents." This paper addresses the challenges and opportunities of such a framework for augmenting human observational capability as well as enabling collaborative context-aware reasoning in both human teams and hybrid human/software agent teams.
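A CEP rule of the kind that would focus an analyst's (or agent's) attention can be sketched as a sliding-window pattern match over an event stream; the event names and window size below are invented for illustration:

```python
from collections import deque

def detect(stream, pattern=("pressure_spike", "valve_open"), window=3):
    """Toy complex-event-processing rule: report every index pair where
    the first pattern event is followed by the second within `window`
    events; a downstream agent would be notified of each match."""
    recent = deque(maxlen=window)   # sliding window of (index, event)
    matches = []
    for i, event in enumerate(stream):
        recent.append((i, event))
        firsts = [j for j, e in recent if e == pattern[0]]
        if event == pattern[1] and firsts:
            matches.append((firsts[0], i))
    return matches

stream = ["pressure_spike", "noise", "valve_open", "valve_open"]
alerts = detect(stream)  # only the first valve_open is within the window
```

Production CEP engines express such rules declaratively and run them over many streams at once; a multi-agent layer would then route each match to the agent responsible for acting on it.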
A Software Framework for Remote Patient Monitoring by Using Multi-Agent Systems Support.
Fernandes, Chrystinne Oliveira; Lucena, Carlos José Pereira De
2017-03-27
Although there have been significant advances in network, hardware, and software technologies, the health care environment has not taken advantage of these developments to solve many of its inherent problems. Research activities in these 3 areas make it possible to apply advanced technologies to address many of these issues such as real-time monitoring of a large number of patients, particularly where a timely response is critical. The objective of this research was to design and develop innovative technological solutions to offer a more proactive and reliable medical care environment. The short-term and primary goal was to construct IoT4Health, a flexible software framework to generate a range of Internet of things (IoT) applications, containing components such as multi-agent systems that are designed to perform Remote Patient Monitoring (RPM) activities autonomously. An investigation into its full potential to conduct such patient monitoring activities in a more proactive way is an expected future step. A framework methodology was selected to evaluate whether the RPM domain had the potential to generate customized applications that could achieve the stated goal of being responsive and flexible within the RPM domain. As a proof of concept of the software framework's flexibility, 3 applications were developed with different implementations for each framework hot spot to demonstrate potential. Agents4Health was selected to illustrate the instantiation process and IoT4Health's operation. To develop more concrete indicators of the responsiveness of the simulated care environment, an experiment was conducted while Agents4Health was operating, to measure the number of delays incurred in monitoring the tasks performed by agents. IoT4Health's construction can be highlighted as our contribution to the development of eHealth solutions. As a software framework, IoT4Health offers extensibility points for the generation of applications. 
Applications can extend the framework in the following ways: identification, collection, storage, recovery, visualization, monitoring, anomaly detection, resource notification, and dynamic reconfiguration. Based on other outcomes involving observation of the resulting applications, it was noted that its design contributed toward more proactive patient monitoring. Through these experimental systems, anomalies were detected in real time, with agents sending notifications instantly to the health providers. We conclude that the cost-benefit of constructing a more generic and complex system instead of a custom-made software system demonstrated the worth of the approach, making it possible to generate applications in this domain in a more timely fashion. ©Chrystinne Oliveira Fernandes, Carlos José Pereira De Lucena. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 27.03.2017.
NASA Astrophysics Data System (ADS)
Tomàs-Buliart, Joan; Fernández, Marcel; Soriano, Miguel
Critical infrastructures are usually controlled by software entities. To monitor the correct functioning of these entities, a solution based on the use of mobile agents is proposed. Some proposals to detect modifications of mobile agents, such as digital signatures of code, exist, but they are oriented toward protecting software against modification or verifying that an agent has been executed correctly. The aim of our proposal is to guarantee that the software is being executed correctly by a non-trusted host. The way proposed to achieve this objective is by improving the Self-Validating Branch-Based Software Watermarking scheme by Myles et al. The proposed modification is the incorporation of an external element, called a sentinel, which controls branch targets. This technique, applied to mobile agents, can guarantee the correct operation of an agent or, at least, can detect suspicious behavior of a malicious host during the execution of the agent, instead of detecting it only after the execution of the agent has finished.
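The sentinel idea can be sketched as follows: an external checker holds expected fingerprints of branch targets and validates each target before control transfers. This is an illustrative simplification, not the actual Myles et al. scheme; the fingerprint table, site names, and code snippets are invented for the example.

```python
import hashlib

# Hypothetical branch-target table: for each branch site, the hash of the
# code block it is required to jump to (layout and names are illustrative).
EXPECTED = {
    "branch_0": hashlib.sha256(b"def handle_request(): ...").hexdigest(),
    "branch_1": hashlib.sha256(b"def shutdown(): ...").hexdigest(),
}

class Sentinel:
    """External element that validates a branch target before the jump."""
    def check(self, site, target_code):
        actual = hashlib.sha256(target_code).hexdigest()
        return actual == EXPECTED.get(site)

sentinel = Sentinel()
ok = sentinel.check("branch_0", b"def handle_request(): ...")
tampered = sentinel.check("branch_0", b"def handle_request(): steal()")
```

In the mobile-agent setting this check would run during execution on the untrusted host, so a tampered target is flagged while the agent runs rather than after it returns.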
Agent-based Modeling with MATSim for Hazards Evacuation Planning
NASA Astrophysics Data System (ADS)
Jones, J. M.; Ng, P.; Henry, K.; Peters, J.; Wood, N. J.
2015-12-01
Hazard evacuation planning requires robust modeling tools and techniques, such as least cost distance or agent-based modeling, to gain an understanding of a community's potential to reach safety before event (e.g. tsunami) arrival. Least cost distance modeling provides a static view of the evacuation landscape with an estimate of travel times to safety from each location in the hazard space. With this information, practitioners can assess a community's overall ability for timely evacuation. More information may be needed if evacuee congestion creates bottlenecks in the flow patterns. Dynamic movement patterns are best explored with agent-based models that simulate movement of and interaction between individual agents as evacuees through the hazard space, reacting to potential congestion areas along the evacuation route. The multi-agent transport simulation model MATSim is an agent-based modeling framework that can be applied to hazard evacuation planning. Developed jointly by universities in Switzerland and Germany, MATSim is open-source software written in Java and freely available for modification or enhancement. We successfully used MATSim to illustrate tsunami evacuation challenges in two island communities in California, USA, that are impacted by limited escape routes. However, working with MATSim's data preparation, simulation, and visualization modules in an integrated development environment requires a significant investment of time to develop the software expertise to link the modules and run a simulation. To facilitate our evacuation research, we packaged the MATSim modules into a single application tailored to the needs of the hazards community. By exposing the modeling parameters of interest to researchers in an intuitive user interface and hiding the software complexities, we bring agent-based modeling closer to practitioners and provide access to the powerful visual and analytic information that this modeling can provide.
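The congestion effect described above, where flow limits on shared edges create bottlenecks that static least-cost models miss, can be shown with a toy agent-based simulation. This is an illustrative sketch only, not MATSim: the graph, capacity rule, and time-stepping are invented for the example.

```python
# Minimal agent-based evacuation sketch on a road graph (illustrative, not MATSim).
# Each time step, every agent advances one edge toward the safe node, but an
# edge admits at most `capacity` agents per step, so queues form behind it.
def evacuate(next_hop, capacity, agents, safe, max_steps=100):
    positions = list(agents)
    arrival = [None] * len(agents)
    for step in range(1, max_steps + 1):
        moved = {}  # edge -> number of agents already sent over it this step
        for i, pos in enumerate(positions):
            if pos == safe:
                continue
            edge = (pos, next_hop[pos])
            if moved.get(edge, 0) < capacity:
                moved[edge] = moved.get(edge, 0) + 1
                positions[i] = next_hop[pos]
                if positions[i] == safe and arrival[i] is None:
                    arrival[i] = step
        if all(p == safe for p in positions):
            break
    return arrival

# Chain A -> B -> SAFE with edge capacity 1: three agents starting at A
# cannot all move at once, so arrival times spread out.
next_hop = {"A": "B", "B": "SAFE"}
times = evacuate(next_hop, capacity=1, agents=["A", "A", "A"], safe="SAFE")
```

A static travel-time model would report the same two-step distance for all three agents; the agent-based run shows the queue forming at the limited escape route, which is exactly the effect the island-community studies needed to capture.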
NASA Technical Reports Server (NTRS)
Truszkowski, Walt; Obenschain, Arthur F. (Technical Monitor)
2002-01-01
Currently, spacecraft ground systems have a well-defined and somewhat standard architecture and operations concept. Based on domain analysis studies of various control centers conducted over the years, it is clear that ground systems have core capabilities and functionality that are common across all ground systems. This observation alone supports the realization of reuse. Additionally, spacecraft ground systems are increasing in their ability to do things autonomously. They are being engineered using advanced expert-systems technology to provide automated support for operators. A clearer understanding of the possible roles of agent technology is advancing the prospects of greater autonomy for these systems. Many of their functional and management tasks are or could be supported by applied agent technology, the dynamics of the ground system's infrastructure could be monitored by agents, there are intelligent agent-based approaches to user interfaces, etc. The premise of this paper is that the concepts associated with software reuse, applicable in consideration of classically engineered ground systems, can be updated to address their application in highly agent-based realizations of future ground systems. As a somewhat simplified example, consider the following situation involving human agents in a ground system context. Let Group A of controllers be working on Mission X. They are responsible for the command, control, and health and safety of the Mission X spacecraft. Let us suppose that Mission X successfully completes its mission and is turned off. Group A could be dispersed or perhaps move to another Mission Y. In this case there would be reuse of the human agents from Mission X to Mission Y. The Group A agents perform their well-understood functions in a somewhat different but related context. There will be a learning or familiarization process that the Group A agents go through to make the new context, determined by the new Mission Y, understood. 
This simplified scenario highlights some of the major issues that need to be addressed when considering the situation where Group A is composed of software-based agents (not their human counterparts) and they migrate from one mission support system to another. This paper will address: definition of an agent architecture appropriate to support reuse; identification of the non-mission-specific agent capabilities required; appropriate knowledge representation schemes for mission-specific knowledge; agent interface with mission-specific knowledge (a type of learning); development of a fully operational group of cooperative software agents for ground system support; architecture and operation of a repository of reusable agents that could be the source of intelligent components for realizing an autonomous (or nearly autonomous) agent-based ground system; and an agent-based approach to repository management and operation (an intelligent interface for human use of the repository in a ground-system development activity).
Integrating manufacturing softwares for intelligent planning execution: a CIIMPLEX perspective
NASA Astrophysics Data System (ADS)
Chu, Bei Tseng B.; Tolone, William J.; Wilhelm, Robert G.; Hegedus, M.; Fesko, J.; Finin, T.; Peng, Yun; Jones, Chris H.; Long, Junshen; Matthews, Mike; Mayfield, J.; Shimp, J.; Su, S.
1997-01-01
Recent developments have made it possible to interoperate complex business applications at much lower costs. Application interoperation, along with business process re-engineering, can result in significant savings by eliminating work created by disconnected business processes due to isolated business applications. However, we believe much greater productivity benefits can be achieved by facilitating timely decision-making that utilizes information from multiple enterprise perspectives. The CIIMPLEX enterprise integration architecture is designed to enable such productivity gains by helping people carry out integrated enterprise scenarios. An enterprise scenario is typically triggered by some external event. The goal of an enterprise scenario is to make the right decisions considering the full context of the problem. Enterprise scenarios are difficult for people to carry out because of the interdependencies among various actions. One can easily be overwhelmed by the large amount of information. We propose the use of software agents to help gather relevant information and present it in the appropriate context of an enterprise scenario. The CIIMPLEX enterprise integration architecture is based on the FAIME methodology for application interoperation and plug-and-play. It also explores the use of software agents in application plug-and-play.
Plug-In Tutor Agents: Still Pluggin'
ERIC Educational Resources Information Center
Ritter, Steven
2016-01-01
"An Architecture for Plug-in Tutor Agents" (Ritter and Koedinger 1996) proposed a software architecture designed around the idea that tutors could be built as plug-ins for existing software applications. Looking back on the paper now, we can see that certain assumptions about the future of software architecture did not come to be, making…
A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture
NASA Technical Reports Server (NTRS)
Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.
2005-01-01
Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of a system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.
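The blackboard-and-plugin channel structure described above can be approximated in ordinary code: one process per component, with a dedicated channel (here, a queue) from the blackboard to each plugin. This is a rough executable analogue of the CSP model, not actual Cougaar or CSP notation; the message names are invented.

```python
import queue
import threading

# Blackboard process: receives publications on its input channel and
# forwards each one to every plugin over that plugin's own channel.
def blackboard(in_chan, plugin_chans, n_msgs):
    for _ in range(n_msgs):
        item = in_chan.get()
        for ch in plugin_chans:
            ch.put(item)

# Plugin process: consumes from its channel and records what it saw.
def plugin(chan, received, n_msgs):
    for _ in range(n_msgs):
        received.append(chan.get())

in_chan = queue.Queue()
chans = [queue.Queue(), queue.Queue()]   # one channel per plugin
logs = [[], []]
threads = [threading.Thread(target=blackboard, args=(in_chan, chans, 2))]
threads += [threading.Thread(target=plugin, args=(c, l, 2))
            for c, l in zip(chans, logs)]
for t in threads:
    t.start()
in_chan.put("task-added")
in_chan.put("task-changed")
for t in threads:
    t.join()
```

Because each plugin has its own channel, the per-plugin delivery order is deterministic, which is the sort of property the CSP model is meant to make checkable.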
Cardoso de Moraes, João Luís; de Souza, Wanderley Lopes; Pires, Luís Ferreira; do Prado, Antonio Francisco
2016-10-01
In Pervasive Healthcare, novel information and communication technologies are applied to support the provision of health services anywhere, at any time and to anyone. Since health systems may offer their health records in different electronic formats, the openEHR Foundation prescribes the use of archetypes for describing clinical knowledge in order to achieve semantic interoperability between these systems. Software agents have been applied to simulate human skills in some healthcare procedures. This paper presents a methodology, based on the use of openEHR archetypes and agent technology, which aims to overcome the weaknesses typically found in legacy healthcare systems, thereby adding value to the systems. This methodology was applied in the design of an agent-based system, which was used in a realistic healthcare scenario in which a medical staff meeting to prepare for a cardiac surgery was supported. We conducted experiments with this system in a distributed environment composed of three cardiology clinics and a center of cardiac surgery, all located in the city of Marília (São Paulo, Brazil). We evaluated this system according to the Technology Acceptance Model. The case study confirmed the acceptance of our agent-based system by healthcare professionals and patients, who reacted positively with respect to the usefulness of this system in particular, and with respect to task delegation to software agents in general. The case study also showed that a software agent-based interface and a tools-based alternative must be provided to the end users, which should allow them to perform the tasks themselves or to delegate these tasks to other people. A Pervasive Healthcare model requires efficient and secure information exchange between healthcare providers. 
The proposed methodology allows designers to build communication systems for message exchange among heterogeneous healthcare systems, and to shift from systems that rely on informal communication among actors to a more automated and less error-prone agent-based system. Our methodology preserves the significant investment of many years in the legacy systems and allows developers to extend them by adding new features, providing proactive assistance to the end users and increasing user mobility with appropriate support. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Agent-based models of cellular systems.
Cannata, Nicola; Corradini, Flavio; Merelli, Emanuela; Tesei, Luca
2013-01-01
Software agents are particularly suitable for engineering models and simulations of cellular systems. In a very natural and intuitive manner, individual software components are delegated to reproduce "in silico" the behavior of individual components of living systems at a given level of resolution. Individuals' actions and interactions among individuals allow complex collective behavior to emerge. In this chapter we first introduce the readers to software agents and multi-agent systems, reviewing the evolution of agent-based modeling of biomolecular systems in the last decade. We then describe the main tools, platforms, and methodologies available for programming societies of agents, possibly also profiting from toolkits that do not require advanced programming skills.
Software agents for the dissemination of remote terrestrial sensing data
NASA Technical Reports Server (NTRS)
Toomey, Christopher N.; Simoudis, Evangelos; Johnson, Raymond W.; Mark, William S.
1994-01-01
Remote terrestrial sensing (RTS) data is constantly being collected from a variety of space-based and earth-based sensors. The collected data, and especially 'value-added' analyses of the data, are finding growing application for commercial, government, and scientific purposes. The scale of this data collection and analysis is truly enormous; e.g., by 1995, the amount of data available in just one sector, NASA space science, will reach 5 petabytes. Moreover, the amount of data, and the value of analyzing the data, are expected to increase dramatically as new satellites and sensors become available (e.g., NASA's Earth Observing System satellites). Lockheed and other companies are beginning to provide data and analysis commercially. A critical issue for the exploitation of collected data is the dissemination of data and value-added analyses to a diverse and widely distributed customer base. Customers must be able to use their computational environment (eventually the National Information Infrastructure) to obtain timely and complete information, without having to know the details of where the relevant data resides and how it is accessed. Customers must be able to routinely use standard, widely available (and, therefore, low cost) analyses, while also being able to readily create on demand highly customized analyses to make crucial decisions. The diversity of user needs creates a difficult software problem: how can users easily state their needs, while the computational environment assumes the responsibility of finding (or creating) relevant information, and then delivering the results in a form that users understand? A software agent is a self-contained, active software module that contains an explicit representation of its operational knowledge. This explicit representation allows agents to examine their own capabilities in order to modify their goals to meet changing needs and to take advantage of dynamic opportunities. 
In addition, the explicit representation allows agents to advertise their capabilities and results to other agents, thereby allowing the collection of agents to reuse each other's work.
Intelligent Software Agents: Sensor Integration and Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulesz, James J; Lee, Ronald W
2013-01-01
In a post-Macondo world the buzzwords are Integrity Management and Incident Response Management. The twin processes are not new, but the opportunity to link the two is novel. Intelligent software agents can be used with sensor networks in distributed and centralized computing systems to enhance real-time monitoring of system integrity as well as manage the follow-on incident response to changing, and potentially hazardous, environmental conditions. The software components are embedded at the sensor network nodes in surveillance systems used for monitoring unusual events. When an event occurs, the software agents establish a new concept of operation at the sensing node, post the event status to a blackboard for software agents at other nodes to see, and then react quickly and efficiently to monitor the scale of the event. The technology addresses a current challenge in sensor networks that prevents a rapid and efficient response when a sensor measurement indicates that an event has occurred. By using intelligent software agents - which can be stationary or mobile, interact socially, and adapt to changing situations - the technology offers features that are particularly important when systems need to adapt to active circumstances. For example, when a release is detected, the local software agent collaborates with other agents at the node to exercise the appropriate operation, such as: targeted detection, increased detection frequency, decreased detection frequency for other non-alarming sensors, and determination of environmental conditions so that adjacent nodes can be informed that an event is occurring and when it will arrive. The software agents at the nodes can also post the data in a targeted manner, so that agents at other nodes and the command center can exercise appropriate operations to recalibrate the overall sensor network and associated intelligence systems. 
The paper describes the concepts and provides examples of real-world implementations, including the Threat Detection and Analysis System (TDAS) at the International Port of Memphis and the Biological Warning and Incident Characterization System (BWIC) Environmental Monitoring (EM) Component. Technologies developed for these 24/7 operational systems have applications for improved real-time system-integrity awareness as well as for incident response (as needed) in production and field applications.
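The node-level reaction described in this abstract, where an alarming sensor posts to a shared blackboard and neighboring agents raise their sampling rate, can be sketched in a few lines. The class, threshold, and sampling periods are all hypothetical, invented only to illustrate the pattern.

```python
# Illustrative sketch of blackboard-mediated adaptation in a sensor network
# (not the TDAS/BWIC implementation): an alarming node posts an event, and
# every agent that sees it shortens its detection period.
class NodeAgent:
    def __init__(self, name, blackboard, base_period=60):
        self.name = name
        self.blackboard = blackboard  # shared list standing in for the blackboard
        self.period = base_period     # seconds between measurements

    def observe(self, reading, threshold=100):
        """Post an event to the blackboard when a reading exceeds threshold."""
        if reading > threshold:
            self.blackboard.append(("event", self.name))

    def react(self):
        """Sample faster while any event is posted; else keep the base rate."""
        if any(tag == "event" for tag, _ in self.blackboard):
            self.period = 5

blackboard = []
nodes = [NodeAgent("n1", blackboard), NodeAgent("n2", blackboard)]
nodes[0].observe(150)   # n1 detects a release and posts it
for n in nodes:
    n.react()           # all nodes, including non-alarming n2, adapt
```

The key feature is that n2 never saw the release directly; it adapts purely because the event was posted where its agent could see it, which is the coordination mechanism the abstract describes.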
Process Management inside ATLAS DAQ
NASA Astrophysics Data System (ADS)
Alexandrov, I.; Amorim, A.; Badescu, E.; Burckhart-Chromek, D.; Caprini, M.; Dobson, M.; Duval, P. Y.; Hart, R.; Jones, R.; Kazarov, A.; Kolos, S.; Kotov, V.; Liko, D.; Lucio, L.; Mapelli, L.; Mineev, M.; Moneta, L.; Nassiakou, M.; Pedro, L.; Ribeiro, A.; Roumiantsev, V.; Ryabov, Y.; Schweiger, D.; Soloviev, I.; Wolters, H.
2002-10-01
The Process Management component of the online software of the future ATLAS experiment data acquisition system is presented. The purpose of the Process Manager is to perform basic job control of the software components of the data acquisition system. It is capable of starting, stopping and monitoring the status of those components on the data acquisition processors, independent of the underlying operating system. Its architecture is designed on the basis of a client-server model using CORBA-based communication. The server part relies on C++ software agent objects acting as an interface between the local operating system and client applications. Some of the major design challenges for the software agents were to achieve the maximum degree of autonomy possible and to create processes that are aware of dynamic conditions in their environment and able to determine corresponding actions. Issues such as the performance of the agents in terms of the time needed for process creation and destruction, the scalability of the system taking into consideration the final ATLAS configuration, and minimizing the use of hardware resources were also of critical importance. Besides the details given on the architecture and the implementation, we also present scalability and performance test results for the Process Manager system.
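The basic job-control contract of such a Process Manager (start, monitor status, stop, independent of the client) can be shown in miniature. The real system uses C++ agent objects over CORBA, so the class below is only an illustrative stand-in built on OS process primitives.

```python
import subprocess
import sys

# Minimal sketch of a per-job software agent that interfaces between the
# local operating system and its clients (illustrative, not the ATLAS code).
class ProcessAgent:
    def __init__(self, argv):
        self.argv = argv
        self.proc = None

    def start(self):
        self.proc = subprocess.Popen(self.argv)

    def status(self):
        if self.proc is None:
            return "not-started"
        return "running" if self.proc.poll() is None else "exited"

    def stop(self):
        if self.status() == "running":
            self.proc.terminate()
        self.proc.wait()

# Launch a long-running dummy job, poll it, then tear it down.
agent = ProcessAgent([sys.executable, "-c", "import time; time.sleep(30)"])
agent.start()
running = agent.status()
agent.stop()
stopped = agent.status()
```

Clients only ever talk to the agent object, never to the OS directly, which is the property that lets the real system stay independent of the underlying operating system.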
Rule-based statistical data mining agents for an e-commerce application
NASA Astrophysics Data System (ADS)
Qin, Yi; Zhang, Yan-Qing; King, K. N.; Sunderraman, Rajshekhar
2003-03-01
Intelligent data mining techniques have useful e-Business applications. Because an e-Commerce application is related to multiple domains such as statistical analysis, market competition, price comparison, profit improvement and personal preferences, this paper presents a hybrid knowledge-based e-Commerce system fusing intelligent techniques, statistical data mining, and personal information to enhance the QoS (Quality of Service) of e-Commerce. A Web-based e-Commerce application software system, eDVD Web Shopping Center, was successfully implemented using Java servlets and an Oracle8i database server. Simulation results have shown that the hybrid intelligent e-Commerce system is able to make smart decisions for different customers.
MonALISA, an agent-based monitoring and control system for the LHC experiments
NASA Astrophysics Data System (ADS)
Balcas, J.; Kcira, D.; Mughal, A.; Newman, H.; Spiropulu, M.; Vlimant, J. R.
2017-10-01
MonALISA, which stands for Monitoring Agents using a Large Integrated Services Architecture, has been developed over the last fifteen years by the California Institute of Technology (Caltech) and its partners with the support of the software and computing program of the CMS and ALICE experiments at the Large Hadron Collider (LHC). The framework is based on the Dynamic Distributed Service Architecture and is able to provide complete system monitoring, performance metrics of applications, jobs or services, system control, and global optimization services for complex systems. A short overview and status of MonALISA is given in this paper.
Architectures and Evaluation for Adjustable Control Autonomy for Space-Based Life Support Systems
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Schreckenghost, Debra K.
2001-01-01
In the past five years, a number of automation applications for control of crew life support systems have been developed and evaluated in the Adjustable Autonomy Testbed at NASA's Johnson Space Center. This paper surveys progress on an adjustable autonomous control architecture for situations where software and human operators work together to manage anomalies and other system problems. When problems occur, the level of control autonomy can be adjusted, so that operators and software agents can work together on diagnosis and recovery. In 1997 adjustable autonomy software was developed to manage gas transfer and storage in a closed life support test. Four crewmembers lived and worked in a chamber for 91 days, with both air and water recycling. CO2 was converted to O2 by gas processing systems and wheat crops. With the automation software, significantly fewer hours were spent monitoring operations. System-level validation testing of the software by interactive hybrid simulation revealed problems both in software requirements and implementation. Since that time, we have been developing multi-agent approaches for automation software and human operators, to cooperatively control systems and manage problems. Each new capability has been tested and demonstrated in realistic dynamic anomaly scenarios, using the hybrid simulation tool.
Flexibility Support for Homecare Applications Based on Models and Multi-Agent Technology
Armentia, Aintzane; Gangoiti, Unai; Priego, Rafael; Estévez, Elisabet; Marcos, Marga
2015-01-01
In developed countries, public health systems are under pressure due to the increasing percentage of population over 65. In this context, homecare based on ambient intelligence technology seems to be a suitable solution to allow elderly people to continue to enjoy the comforts of home and help optimize medical resources. Thus, current technological developments make it possible to build complex homecare applications that demand, among others, flexibility mechanisms for being able to evolve as context does (adaptability), as well as avoiding service disruptions in the case of node failure (availability). The solution proposed in this paper copes with these flexibility requirements through the whole life-cycle of the target applications: from design phase to runtime. The proposed domain modeling approach allows medical staff to design customized applications, taking into account the adaptability needs. It also guides software developers during system implementation. The application execution is managed by a multi-agent based middleware, making it possible to meet adaptation requirements, assuring at the same time the availability of the system even for stateful applications. PMID:26694416
Research on mixed network architecture collaborative application model
NASA Astrophysics Data System (ADS)
Jing, Changfeng; Zhao, Xi'an; Liang, Song
2009-10-01
When facing the complex requirements of city development, ever-growing spatial data, rapid development of geographical business, and increasing business complexity, collaboration between multiple users and departments is needed urgently; however, conventional GIS software (based on the Client/Server or Browser/Server model) does not support this well. Collaborative application is one possible solution. Collaborative applications have four main problems to resolve: consistency and co-edit conflicts, real-time responsiveness, unconstrained operation, and spatial data recoverability. In this paper, an application model called AMCM, based on agents and a multi-level cache, is put forward. AMCM can be used in a mixed network structure and supports distributed collaboration. An agent is an autonomous, interactive, initiative and reactive computing entity in a distributed environment. Agents have been used in many fields such as computer science and automation. Agents bring new methods for cooperation and for access to spatial data. A multi-level cache holds part of the full data. It reduces the network load and improves the access and handling of spatial data, especially when editing spatial data. With agent technology, we make full use of agents' intelligence for managing the cache and for cooperative editing, which brings a new method for distributed cooperation and improves efficiency.
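The cache side of this design can be sketched as a small in-memory tile cache in front of the full spatial-data store, so repeated edits of the same tiles avoid network fetches. The structure, tile names, and eviction policy (LRU) below are assumptions for illustration, not details from the paper.

```python
from collections import OrderedDict

# Illustrative multi-level cache sketch: a bounded LRU cache holding part
# of the full spatial dataset, which stands in for the remote server.
class TileCache:
    def __init__(self, store, capacity=2):
        self.store = store          # full dataset (stands in for the server)
        self.cache = OrderedDict()  # tile id -> data, in LRU order
        self.capacity = capacity
        self.misses = 0             # each miss models a network fetch

    def get(self, tile_id):
        if tile_id in self.cache:
            self.cache.move_to_end(tile_id)       # refresh recency on a hit
        else:
            self.misses += 1
            self.cache[tile_id] = self.store[tile_id]
            if len(self.cache) > self.capacity:
                self.cache.popitem(last=False)    # evict least recently used
        return self.cache[tile_id]

store = {"t1": "roads", "t2": "parcels", "t3": "rivers"}
cache = TileCache(store, capacity=2)
for tid in ["t1", "t2", "t1", "t3", "t1"]:
    cache.get(tid)
```

Five accesses cost only three simulated fetches here; in a collaborative editing session, where users revisit the same tiles repeatedly, this is where the claimed reduction in network load comes from.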
A Software Product Line Process to Develop Agents for the IoT.
Ayala, Inmaculada; Amor, Mercedes; Fuentes, Lidia; Troya, José M
2015-07-01
One of the most important challenges of this decade is the Internet of Things (IoT), which aims to enable things to be connected anytime, anyplace, with anything and anyone, ideally using any path/network and any service. IoT systems are usually composed of heterogeneous and interconnected lightweight devices that support applications that are subject to change in their external environment and in the functioning of these devices. The autonomous management of the variability of these changes is a challenge in the development of these systems. Agents are a good option for developing self-managed IoT systems due to their distributed nature, context-awareness and self-adaptation. Our goal is to enhance the development of IoT applications using agents and software product lines (SPL). Specifically, we propose to use Self-StarMAS (multi-agent system, MAS) agents and to define an SPL process using the Common Variability Language. In this contribution, we propose an SPL process for Self-StarMAS, paying particular attention to agents embedded in sensor motes.
A New Approach To Secure Federated Information Bases Using Agent Technology.
ERIC Educational Resources Information Center
Weippi, Edgar; Klug, Ludwig; Essmayr, Wolfgang
2003-01-01
Discusses database agents which can be used to establish federated information bases by integrating heterogeneous databases. Highlights include characteristics of federated information bases, including incompatible database management systems, schemata, and frequently changing context; software agent technology; Java agents; system architecture;…
Behavioral biometrics for verification and recognition of malicious software agents
NASA Astrophysics Data System (ADS)
Yampolskiy, Roman V.; Govindaraju, Venu
2008-04-01
Homeland security requires technologies capable of positive and reliable identification of humans for law enforcement, government, and commercial applications. As artificially intelligent agents improve in their abilities and become a part of our everyday life, the possibility of using such programs for undermining homeland security increases. Virtual assistants, shopping bots, and game playing programs are used daily by millions of people. We propose applying statistical behavior modeling techniques developed by us for recognition of humans to the identification and verification of intelligent and potentially malicious software agents. Our experimental results demonstrate feasibility of such methods for both artificial agent verification and even for recognition purposes.
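One simple way to realize statistical behavior modeling for agent verification, offered here only as a rough sketch, is to represent an agent's observed action frequencies as a vector and verify a claimed identity by similarity against an enrolled profile. The feature vectors, cosine metric, and threshold below are invented for illustration and are not the authors' actual method.

```python
import math

# Cosine similarity between two behavior vectors (action-frequency profiles).
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Verification: accept the claimed identity if the observed behavior is
# close enough to the enrolled profile (threshold is a made-up choice).
def verify(profile, observed, threshold=0.95):
    return cosine(profile, observed) >= threshold

enrolled = [0.5, 0.3, 0.2]      # claimed agent's enrolled action frequencies
same_agent = [0.48, 0.32, 0.2]  # a fresh observation of the same agent
impostor = [0.1, 0.1, 0.8]      # a behaviorally different (possibly malicious) agent
ok = verify(enrolled, same_agent)
bad = verify(enrolled, impostor)
```

Recognition (rather than verification) would compare the observed vector against every enrolled profile and pick the closest, which is the distinction the abstract draws between the two tasks.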
Bosse, Stefan
2015-01-01
Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques. PMID:25690550
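The activity-transition-graph idea in the abstract above can be illustrated with a toy sketch (not Bosse's stack-machine implementation; the activity names and threshold rule are invented): an agent is a self-contained state container plus a graph of activities, so its whole state can be serialized and handed to another node.

```python
import json

# Toy ATG agent: each activity is a function that mutates the agent's state
# and returns the name of the next activity. Because the state is one plain
# dict, it can be serialized and "migrated" between network nodes.
def make_agent():
    return {"activity": "sense", "data": [], "threshold": 5, "alarm": False}

def sense(agent, reading):
    agent["data"].append(reading)
    return "process"

def process(agent, reading):
    # Transition condition: escalate when the reading exceeds the threshold.
    return "notify" if reading > agent["threshold"] else "sense"

def notify(agent, reading):
    agent["alarm"] = True
    return "sense"

ATG = {"sense": sense, "process": process, "notify": notify}

def step(agent, reading):
    agent["activity"] = ATG[agent["activity"]](agent, reading)

agent = make_agent()
for r in [1, 1, 9]:
    step(agent, r)                      # run the current activity
    while agent["activity"] != "sense":
        step(agent, r)                  # follow transitions until idle again

migrated = json.loads(json.dumps(agent))   # state is self-contained, so it serializes
print(agent["alarm"], migrated == agent)
```

The third reading (9) exceeds the threshold, so the agent traverses sense, process, notify and latches its alarm flag.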
CulSim: A simulator of emergence and resilience of cultural diversity
NASA Astrophysics Data System (ADS)
Ulloa, Roberto
CulSim is an agent-based computer simulation software that allows further exploration of influential and recent models of the emergence of cultural groups grounded in sociological theories. CulSim provides a collection of tools to analyze the resilience of cultural diversity when events affect agents, institutions or global parameters of the simulations; in combination, events can be used to approximate historical circumstances. The software provides a graphical and text-based user interface, and so makes this agent-based modeling methodology accessible to a variety of users from different research fields.
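CulSim builds on Axelrod-style models of cultural dissemination. A minimal sketch of that underlying mechanism (a toy illustration, not CulSim's code; the grid size, feature count and step count are arbitrary choices) looks like:

```python
import random

def axelrod_step(grid, size, features, rng):
    # Pick a random site and one of its four lattice neighbours (periodic).
    x, y = rng.randrange(size), rng.randrange(size)
    dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
    a = grid[x][y]
    b = grid[(x + dx) % size][(y + dy) % size]
    shared = sum(1 for f in range(features) if a[f] == b[f])
    # Interact with probability equal to the cultural similarity.
    if 0 < shared < features and rng.random() < shared / features:
        f = rng.choice([f for f in range(features) if a[f] != b[f]])
        a[f] = b[f]          # copy one differing trait from the neighbour

rng = random.Random(1)
SIZE, FEATURES, TRAITS = 8, 3, 4
grid = [[[rng.randrange(TRAITS) for _ in range(FEATURES)]
         for _ in range(SIZE)] for _ in range(SIZE)]
for _ in range(40000):
    axelrod_step(grid, SIZE, FEATURES, rng)
cultures = {tuple(cell) for row in grid for cell in row}
print("distinct cultures:", len(cultures))
```

Repeated local imitation typically collapses the initial diversity into a small number of cultural regions; CulSim's "events" perturb exactly this kind of dynamic.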
A review of agent-based modeling approach in the supply chain collaboration context
NASA Astrophysics Data System (ADS)
Arvitrida, N. I.
2018-04-01
Collaboration is considered the key to supply chain management (SCM) success. This issue has been addressed by many studies in recent years, but there is still little research employing an agent-based modeling (ABM) approach to study business partnerships in SCM. This paper reviews the use of ABM in modeling collaboration in supply chains and describes the scope of ABM application in the existing literature. The review reveals that ABM can be an effective tool for addressing various aspects of supply chain relationships, but its applications in SCM studies are still limited. Moreover, where ABM is applied in the SCM context, most studies focus on software architecture rather than on analyzing the supply chain issues themselves. This paper also provides SCM researchers with insights into opportunities to use ABM in studying complexity in supply chain collaboration.
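A hedged sketch of the kind of ABM the review surveys (an invented example, not taken from any reviewed paper): a retailer agent selects supply partners by a trust score updated from observed fill rates.

```python
import random

class Supplier:
    def __init__(self, name, reliability, rng):
        self.name, self.reliability, self.rng = name, reliability, rng
    def fill(self, qty):
        # Deliver the full quantity with probability `reliability`, else half.
        return qty if self.rng.random() < self.reliability else qty // 2

class Retailer:
    def __init__(self, suppliers, rng):
        self.rng = rng
        self.trust = {s.name: 0.5 for s in suppliers}
        self.suppliers = {s.name: s for s in suppliers}
    def order(self, qty):
        name = max(self.trust, key=self.trust.get)      # most-trusted partner
        if self.rng.random() < 0.1:                     # occasional exploration
            name = self.rng.choice(sorted(self.suppliers))
        got = self.suppliers[name].fill(qty)
        # Exponentially smoothed trust update from the observed service level.
        self.trust[name] = 0.8 * self.trust[name] + 0.2 * (got / qty)
        return name, got

rng = random.Random(7)
retailer = Retailer([Supplier("reliable", 0.95, rng),
                     Supplier("flaky", 0.4, rng)], rng)
for _ in range(300):
    retailer.order(100)
print(retailer.trust)
```

Even this toy shows the emergent partnership dynamic the review highlights: trust concentrates orders on the dependable partner without any central coordination.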
A hierarchical distributed control model for coordinating intelligent systems
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1991-01-01
A hierarchical distributed control (HDC) model for coordinating cooperative problem-solving among intelligent systems is described. The model was implemented using SOCIAL, an innovative object-oriented tool for integrating heterogeneous, distributed software systems. SOCIAL embeds applications in 'wrapper' objects called Agents, which supply predefined capabilities for distributed communication, control, data specification, and translation. The HDC model is realized in SOCIAL as a 'Manager' Agent that coordinates interactions among application Agents. The HDC Manager indexes the capabilities of application Agents, routes request messages to suitable server Agents, and stores results in a commonly accessible 'Bulletin-Board'. This centralized control model is illustrated in a fault diagnosis application for launch operations support of the Space Shuttle fleet at NASA Kennedy Space Center.
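The Manager pattern described above (capability index, request routing, bulletin board) can be sketched in a few lines (a toy illustration, not the SOCIAL implementation; the capability names are invented):

```python
class ManagerAgent:
    """Indexes server capabilities, routes requests, and stores results on a
    shared 'bulletin board' (toy version of the HDC Manager pattern)."""
    def __init__(self):
        self.capabilities = {}     # capability name -> server agent (callable)
        self.bulletin_board = {}   # request id -> posted result

    def register(self, capability, agent):
        self.capabilities[capability] = agent

    def request(self, req_id, capability, payload):
        server = self.capabilities[capability]   # route to a suitable server
        self.bulletin_board[req_id] = server(payload)
        return self.bulletin_board[req_id]

manager = ManagerAgent()
# Two application "agents" wrapped as plain callables for the sketch.
manager.register("diagnose", lambda data: {"fault": max(data, key=data.get)})
manager.register("log", lambda data: "logged %d readings" % len(data))

result = manager.request("r1", "diagnose", {"valve": 0.9, "pump": 0.2})
print(result)
```

The application Agents never address each other directly; all coordination flows through the Manager, which is exactly the centralization the abstract describes.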
Software Agents Applications Using Real-Time CORBA
NASA Astrophysics Data System (ADS)
Fowell, S.; Ward, R.; Nielsen, M.
This paper describes current projects being performed by SciSys in the area of the use of software agents, built using CORBA middleware, to improve operations within autonomous satellite/ground systems. These concepts have been developed and demonstrated in a series of experiments variously funded by ESA's Technology Flight Opportunity Initiative (TFO) and Leading Edge Technology for SMEs (LET-SME), and the British National Space Centre's (BNSC) National Technology Programme. Some of this earlier work has already been reported in [1]. This paper will address the trends, issues and solutions associated with this software agent architecture concept, together with its implementation using CORBA within an on-board environment, that is to say, taking account of its real-time and resource-constrained nature.
Exposure to TCDD from base perimeter application of Agent Orange in Vietnam.
Ross, John H; Hewitt, Andrew; Armitage, James; Solomon, Keith; Watkins, Deborah K; Ginevan, Michael E
2015-04-01
Using recognized methods routinely employed by pesticide regulatory agencies, the exposures of military personnel who were mixer/loader/applicators (M/L/A) of Agent Orange (AO) for perimeter foliage at bases during the Vietnam War were estimated. From the fraction of TCDD in AO, the absorbed dosage of the manufacturing contaminant was estimated. Dermal exposure from spray drift to residents of the bases was calculated using internationally recognized software that accounts for proximity, foliar density of the application site, droplet size and wind speed, among other factors, and produces estimates of deposition. Those who directly handled AO generally had much higher exposures than those further from the areas of use. The differences in exposure potential varied by M/L/A activity, but were typically orders of magnitude greater than those of bystanders. However, even the most-exposed M/L/A involved in perimeter application had lifetime exposures comparable to persons living in the U.S. at the time, i.e., ~1.3 to 5 pg TCDD/kg bodyweight. Copyright © 2014 Elsevier B.V. All rights reserved.
Using Ontologies to Formalize Services Specifications in Multi-Agent Systems
NASA Technical Reports Server (NTRS)
Breitman, Karin Koogan; Filho, Aluizio Haendchen; Haeusler, Edward Hermann
2004-01-01
One key issue in multi-agent systems (MAS) is their ability to interact and exchange information autonomously across applications. To secure agent interoperability, designers must rely on a communication protocol that allows software agents to exchange meaningful information. In this paper we propose using ontologies as such a communication protocol. Ontologies capture the semantics of the operations and services provided by agents, allowing interoperability and information exchange in a MAS. Ontologies are a formal, machine-processable representation that captures the semantics of a domain and allows meaningful information to be derived by way of logical inference. In our proposal we use a formal knowledge representation language (OWL) that translates into Description Logics (a subset of first-order logic), thus eliminating ambiguities and providing a solid base for machine-based inference. The main contribution of this approach is to make the requirements explicit and centralize the specification in a single document (the ontology itself), at the same time that it provides a formal, unambiguous representation that can be processed by automated inference machines.
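A toy illustration of ontology-based matchmaking (the paper uses OWL and a Description Logic reasoner; here a hand-coded taxonomy and a subsumption check stand in for both, and all class and service names are invented):

```python
# Toy ontology: child class -> parent class. A real system would load an OWL
# document and delegate subsumption to a DL reasoner.
SUBCLASS_OF = {
    "WeatherReport": "Report",
    "Report": "Information",
    "TemperatureReading": "WeatherReport",
}

def subsumed_by(cls, ancestor):
    # Walk up the taxonomy; True if `ancestor` is on the path to the root.
    while cls is not None:
        if cls == ancestor:
            return True
        cls = SUBCLASS_OF.get(cls)
    return False

# Advertised services: service name -> ontology class of its output.
SERVICES = {"forecast": "WeatherReport",
            "thermometer": "TemperatureReading",
            "news": "Report"}

def matchmake(requested_class):
    # Any service whose advertised output is subsumed by the request matches,
    # so agents find services by meaning rather than by exact name.
    return sorted(s for s, out in SERVICES.items()
                  if subsumed_by(out, requested_class))

print(matchmake("WeatherReport"))
```

Requesting a `WeatherReport` matches both the forecast service and the thermometer, because a `TemperatureReading` is (transitively) a `WeatherReport`; string matching alone would find neither.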
A Software Product Line Process to Develop Agents for the IoT
Ayala, Inmaculada; Amor, Mercedes; Fuentes, Lidia; Troya, José M.
2015-01-01
One of the most important challenges of this decade is the Internet of Things (IoT), which aims to enable things to be connected anytime, anyplace, with anything and anyone, ideally using any path/network and any service. IoT systems are usually composed of heterogeneous and interconnected lightweight devices that support applications that are subject to change in their external environment and in the functioning of these devices. Managing the variability of these changes autonomously is a challenge in the development of these systems. Agents are a good option for developing self-managed IoT systems due to their distributed nature, context-awareness and self-adaptation. Our goal is to enhance the development of IoT applications using agents and software product lines (SPL). Specifically, we propose to use Self-StarMAS (MAS, multi-agent system) agents and to define an SPL process using the Common Variability Language. In this contribution, we propose an SPL process for Self-StarMAS, paying particular attention to agents embedded in sensor motes. PMID:26140350
NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Pena, Joaquin (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which an evolutionary system is managed and viewed as a software product line. In some embodiments, the core architecture is a relatively unchanging part of the system, and each version of the system is viewed as a product from the product line. Each software product is generated from the core architecture with some agent-based additions. The result may be a multi-agent system software product line.
Multi-Agent Diagnosis and Control of an Air Revitalization System for Life Support in Space
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Kowing, Jeffrey; Nieten, Joseph; Graham, Jeffrey S.; Schreckenghost, Debra; Bonasso, Pete; Fleming, Land D.; MacMahon, Matt; Thronesbery, Carroll
2000-01-01
An architecture of interoperating agents has been developed to provide control and fault management for advanced life support systems in space. In this adjustable autonomy architecture, software agents coordinate with human agents and provide support in novel fault management situations. This architecture combines the Livingstone model-based mode identification and reconfiguration (MIR) system with the 3T architecture for autonomous flexible command and control. The MIR software agent performs model-based state identification and diagnosis. MIR identifies novel recovery configurations and the set of commands required for the recovery. The 3T procedural executive and the human operator use the diagnoses and recovery recommendations, and provide command sequencing. User interface extensions have been developed to support human monitoring of both 3T and MIR data and activities. This architecture has been demonstrated performing control and fault management for an oxygen production system for air revitalization in space. The software operates in a dynamic simulation testbed.
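Livingstone-style mode identification can be caricatured as consistency-based diagnosis over component modes (a toy sketch, not the MIR code; the two-component model is invented):

```python
from itertools import product

# Each component has modes with predicted observations; diagnosis searches for
# the mode assignment consistent with the data that has the fewest faults.
MODEL = {
    "valve": {"open": {"flow": "high"}, "stuck": {"flow": "none"}},
    "pump":  {"on": {"pressure": "high"}, "failed": {"pressure": "low"}},
}
NOMINAL = {"valve": "open", "pump": "on"}

def diagnose(observations):
    candidates = []
    for assignment in product(*[[(c, m) for m in MODEL[c]] for c in MODEL]):
        predicted = {}
        for comp, mode in assignment:
            predicted.update(MODEL[comp][mode])
        # Keep assignments whose predictions agree with every observation.
        if all(predicted.get(k) == v for k, v in observations.items()):
            faults = sum(1 for comp, mode in assignment
                         if NOMINAL[comp] != mode)
            candidates.append((faults, dict(assignment)))
    return min(candidates, key=lambda c: c[0])[1] if candidates else None

print(diagnose({"flow": "none", "pressure": "high"}))   # isolates the stuck valve
```

From the resulting mode assignment, an executive (3T in the paper) would then select a recovery procedure; the real MIR additionally searches for the reconfiguration commands themselves.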
Jeon, Myounghoon; Walker, Bruce N; Gable, Thomas M
2015-09-01
Research has suggested that interaction with an in-vehicle software agent can improve a driver's psychological state and increase road safety. The present study explored the possibility of using an in-vehicle software agent to mitigate effects of driver anger on driving behavior. After either anger or neutral mood induction, 60 undergraduates drove in a simulator with two types of agent intervention. Results showed that both speech-based agents not only enhanced driver situation awareness and driving performance, but also reduced drivers' anger levels and perceived workload. Regression models showed that a driver's anger influences driving performance measures, mediated by situation awareness. The practical implications include guidelines for the design of social interaction with in-vehicle software agents. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Apex Reference Manual 3.0 Beta
NASA Technical Reports Server (NTRS)
Freed, Michael A.
2005-01-01
Apex is a toolkit for constructing software that behaves intelligently and responsively in demanding task environments. Reflecting its origin at NASA, where Apex continues to be developed, current applications include: a) providing autonomous mission management and tactical control capabilities for unmanned aerial vehicles, including an autonomous surveillance helicopter and a simulation prototype of an unmanned fixed-wing aircraft to be used for wildfire mapping; b) simulating human air traffic controllers, pilots and astronauts to help predict how people might respond to changes in equipment or procedures; and c) predicting the precise duration and sequence of routine human behaviors based on a human-computer interaction engineering technique called CPM-GOMS. Among Apex's components are a set of implemented reasoning services, such as those for reactive planning and temporal pattern recognition; a software architecture that embeds and integrates these services and allows additional reasoning elements to be added as extensions; a formal language for specifying agent knowledge; a simulation environment to facilitate prototyping and analysis; and Sherpa, a set of tools for visualizing autonomy logic and runtime behavior. In combination, these are meant to provide a flexible and usable framework for creating, testing, and deploying intelligent agent software. Overall, our goal in developing Apex is to lower economic barriers to developing intelligent software agents. New ideas about how to extend or modify the system are evaluated in terms of their impact in reducing the time, expertise, and inventiveness required to build and maintain applications. For example, potential enhancements to the AI reasoning capabilities in the system are reviewed not only for usefulness and distinctiveness, but also for their impact on the readability and general usability of Apex's behavior representation language (PDL) and on the transparency of resulting behavior.
A second central part of our approach is to iteratively refine Apex based on lessons learned from as diverse a set of applications as possible. Many applications have been developed by users outside the core development team, including engineers, researchers, and students. Usability is thus a central concern for every aspect of Apex visible to a user, including PDL, Sherpa, the Apex installation process, APIs, and user documentation. Apex users vary in their areas of expertise and in their familiarity with autonomy technology. A focus on usability, a development philosophy summarized by the project motto "Usable Autonomy," has been an important part of enabling diverse users to employ Apex successfully and to provide the feedback needed to guide iterative, user-centered refinement.
Elliptic Curve Cryptography with Security System in Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Huang, Xu; Sharma, Dharmendra
2010-10-01
The rapid progress of wireless communications and embedded micro-electro-mechanical system technologies has made wireless sensor networks (WSN) very popular and even part of our daily life. WSN designs are generally application driven: a particular application's requirements will determine how the network behaves. Elliptic curve cryptography (ECC) has attracted increasing attention in recent years due to its linear scalability, small software footprint, low hardware implementation cost, low bandwidth requirement, and high device performance. It is noted that today's software applications are mainly characterized by their component-based structures, which are usually heterogeneous and distributed, including in WSNs. But WSNs typically need to configure themselves automatically and support ad hoc routing. Agent technology provides a method for handling increasing software complexity and supporting rapid and accurate decision making. Building on our previous works [1, 2], this paper makes three contributions: (a) a fuzzy controller for a dynamic sliding window size to improve the performance of running ECC; (b) a hidden generator point, presented for the first time, for protection from man-in-the-middle attacks; and (c) a first investigation of applying multi-agent techniques to key exchange. Security systems have been drawing great attention as cryptographic algorithms have gained popularity due to the properties that make them suitable for use in constrained environments such as mobile sensor information applications, where computing resources and power availability are limited. ECC is one of the high-potential candidates for WSNs, requiring less computational power, communication bandwidth, and memory than other cryptosystems.
To save pre-computed storage, there is a recent trend in sensor networks for sensor group leaders, rather than individual sensors, to communicate with the end database, which highlights the need to prevent man-in-the-middle attacks. A hidden generator point that offers good protection from the man-in-the-middle (MitM) attack, which has become one of the major worries for sensor networks with multi-agent systems, is also discussed.
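For illustration, a toy elliptic-curve Diffie-Hellman exchange over a tiny curve (parameters chosen for readability, not security; this is not the paper's implementation, and the hidden-generator idea is only alluded to in the comments):

```python
# Toy ECDH over y^2 = x^3 + 2x + 3 over GF(97). Real WSN deployments use
# standardized curves; the paper's "hidden generator point" amounts to the
# parties deriving a non-public base point instead of the fixed G below.
P, A, B = 97, 2, 3
G = (3, 6)                    # on the curve: 36 = 27 + 6 + 3 (mod 97)

def point_add(p1, p2):
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                       # point at infinity
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def scalar_mult(k, point):
    result = None                         # double-and-add
    while k:
        if k & 1:
            result = point_add(result, point)
        point = point_add(point, point)
        k >>= 1
    return result

alice_priv, bob_priv = 2, 3
alice_pub = scalar_mult(alice_priv, G)
bob_pub = scalar_mult(bob_priv, G)
shared_a = scalar_mult(alice_priv, bob_pub)   # a * (b * G)
shared_b = scalar_mult(bob_priv, alice_pub)   # b * (a * G)
print(shared_a == shared_b, shared_a)
```

Both sides arrive at the same shared point; a MitM attacker who cannot learn or substitute the (hidden) base point has a harder time injecting their own public keys.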
NASA Astrophysics Data System (ADS)
Black, Randy; Bai, Haowei; Michalicek, Andrew; Shelton, Blaine; Villela, Mark
2008-01-01
Currently, autonomy in space applications is limited by a variety of technology gaps. Innovative application of wireless technology and avionics architectural principles drawn from the Orion crew exploration vehicle provide solutions for several of these gaps. The Vision for Space Exploration envisions extensive use of autonomous systems. Economic realities preclude continuing the level of operator support currently required of autonomous systems in space. In order to decrease the number of operators, more autonomy must be afforded to automated systems. However, certification authorities have been notoriously reluctant to certify autonomous software in the presence of humans or when costly missions may be jeopardized. The Orion avionics architecture, drawn from advanced commercial aircraft avionics, is based upon several architectural principles including partitioning in software. Robust software partitioning provides "brick wall" separation between software applications executing on a single processor, along with controlled data movement between applications. Taking advantage of these attributes, non-deterministic applications can be placed in one partition and a "Safety" application created in a separate partition. This "Safety" partition can track the position of astronauts or critical equipment and prevent any unsafe command from executing. Only the Safety partition need be certified to a human rated level. As a proof-of-concept demonstration, Honeywell has teamed with the Ultra WideBand (UWB) Working Group at NASA Johnson Space Center to provide tracking of humans, autonomous systems, and critical equipment. Using UWB the NASA team can determine positioning to within less than one inch resolution, allowing a Safety partition to halt operation of autonomous systems in the event that an unplanned collision is imminent. Another challenge facing autonomous systems is the coordination of multiple autonomous agents. 
Current approaches address the issue as one of networking and coordination of multiple independent units, each with its own mission. As a proof-of-concept, Honeywell is developing and testing various algorithms that lead to a deterministic, fault-tolerant, reliable wireless backplane. Just as advanced avionics systems control several subsystems, actuators, sensors, displays, etc., a single "master" autonomous agent (or base station computer) could control multiple autonomous systems. The problem is simplified to controlling a flexible body consisting of several sensors and actuators, rather than one of coordinating multiple independent units. By filling technology gaps associated with space-based autonomous systems, wireless technology and Orion architectural principles provide the means for decreasing operational costs and simplifying problems associated with collaboration of multiple autonomous systems.
Multi-agent integrated password management (MIPM) application secured with encryption
NASA Astrophysics Data System (ADS)
Awang, Norkhushaini; Zukri, Nurul Hidayah Ahmad; Rashid, Nor Aimuni Md; Zulkifli, Zuhri Arafah; Nazri, Nor Afifah Mohd
2017-10-01
Users use weak passwords and reuse them on different websites and applications. Password managers are a solution for storing login information for websites and helping users log in automatically. This project developed a system that acts as an agent managing passwords. Multi-Agent Integrated Password Management (MIPM) is an application using encryption that provides users with secure storage of their login account information, such as their usernames, emails and passwords. This project was developed on an Android platform with an encryption agent using the Java Agent Development Environment (JADE). The purpose of the embedded agents is to act as third-party software to ease the encryption process, and in the future, the developed encryption agents can form part of the security system. This application can be used by computer and mobile users. Currently, users log into many applications, which requires them to use unique passwords to prevent password leakage. The crypto agent handles the encryption process using the Advanced Encryption Standard (AES) 128-bit encryption algorithm. As a whole, MIPM is developed as an Android application to provide a secure platform to store passwords and has high potential to be commercialised for public use.
Brahms Mobile Agents: Architecture and Field Tests
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten; Kaskiris, Charis; vanHoof, Ron
2002-01-01
We have developed a model-based, distributed architecture that integrates diverse components in a system designed for lunar and planetary surface operations: an astronaut's space suit, cameras, rover/All-Terrain Vehicle (ATV), robotic assistant, other personnel in a local habitat, and a remote mission support team (with time delay). Software processes, called agents, implemented in the Brahms language, run on multiple, mobile platforms. These mobile agents interpret and transform available data to help people and robotic systems coordinate their actions to make operations more safe and efficient. The Brahms-based mobile agent architecture (MAA) uses a novel combination of agent types so the software agents may understand and facilitate communications between people and between system components. A state-of-the-art spoken dialogue interface is integrated with Brahms models, supporting a speech-driven field observation record and rover command system (e.g., "return here later" and "bring this back to the habitat"). This combination of agents, rover, and model-based spoken dialogue interface constitutes a personal assistant. An important aspect of the methodology involves first simulating the entire system in Brahms, then configuring the agents into a run-time system.
Contingency theoretic methodology for agent-based web-oriented manufacturing systems
NASA Astrophysics Data System (ADS)
Durrett, John R.; Burnell, Lisa J.; Priest, John W.
2000-12-01
The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, and computing and telecommunications infrastructures. We introduce a contingency-theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change the way they view the software design process: from the solution of a problem to the dynamic creation of teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational-information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software are discussed.
Code of Federal Regulations, 2011 CFR
2011-01-01
... through the AES. A service center shall be certified to transmit electronically to the AES. The USPPI, authorized agent, or service center may use a software package designed by a certified vendor to file EEI... software vendor or service center shall complete certification testing. Service centers may only transmit...
Dynamic electronic institutions in agent oriented cloud robotic systems.
Nagrath, Vineet; Morel, Olivier; Malik, Aamir; Saad, Naufal; Meriaudeau, Fabrice
2015-01-01
The dot-com bubble burst in the year 2000, followed by a swift movement towards resource virtualization and the cloud computing business model. Cloud computing emerged not as a new form of computing or network technology but as a mere remoulding of existing technologies to suit a new business model. Cloud robotics is understood as the adaptation of cloud computing ideas for robotic applications. Current efforts in cloud robotics stress developing robots that utilize the computing and service infrastructure of the cloud, without debating the underlying business model. HTM5 is an OMG MDA-based meta-model for agent-oriented development of cloud robotic systems. The trade-view of HTM5 promotes peer-to-peer trade amongst software agents. HTM5 agents represent various cloud entities and implement their business logic on cloud interactions. Trade in a peer-to-peer cloud robotic system is based on relationships and contracts amongst several agent subsets. Electronic Institutions are associations of heterogeneous intelligent agents which interact with each other following predefined norms. In Dynamic Electronic Institutions (DEIs), the process of formation, reformation and dissolution of institutions is automated, leading to run-time adaptations in groups of agents. DEIs in agent-oriented cloud robotic ecosystems bring order and group intellect. This article presents DEI implementations through the HTM5 methodology.
Managing the Evolution of an Enterprise Architecture using a MAS-Product-Line Approach
NASA Technical Reports Server (NTRS)
Pena, Joaquin; Hinchey, Michael G.; Resinas, manuel; Sterritt, Roy; Rash, James L.
2006-01-01
We view an evolutionary system as being a software product line. The core architecture is the unchanging part of the system, and each version of the system may be viewed as a product from the product line. Each "product" may be described as the core architecture with some agent-based additions. The result is a multiagent system software product line. We describe such a Software Product Line-based approach using the MaCMAS Agent-Oriented methodology. The approach scales to enterprise architectures, as a multiagent system is an appropriate means of representing a changing enterprise architecture and the interaction between components in it.
Microgrid and Inverter Control and Simulator Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-09-13
A collection of software that can simulate the operation of an inverter on a microgrid or control a real inverter. In addition, it can simulate the control of multiple nodes on a microgrid. Applications: simulation of inverters and microgrids; control of inverters on microgrids. The MMI submodule is designed to control custom inverter hardware, and to simulate that hardware. The INVERTER submodule is only the simulator code, and is of an earlier generation than the simulator in MMI. The MICROGRID submodule is an agent-based simulator of multiple nodes on a microgrid which presents a web interface. The WIND submodule produces movies of wind data with a web interface.
NASA Technical Reports Server (NTRS)
Benard, Doug; Dorais, Gregory A.; Gamble, Ed; Kanefsky, Bob; Kurien, James; Millar, William; Muscettola, Nicola; Nayak, Pandu; Rouquette, Nicolas; Rajan, Kanna;
2000-01-01
Remote Agent (RA) is a model-based, reusable artificial intelligence (AI) software system that enables goal-based spacecraft commanding and robust fault recovery. RA was flight validated during an experiment on board DS1 between May 17 and May 21, 1999.
Particle-based simulations of self-motile suspensions
NASA Astrophysics Data System (ADS)
Hinz, Denis F.; Panchenko, Alexander; Kim, Tae-Yeon; Fried, Eliot
2015-11-01
A simple model for simulating flows of active suspensions is investigated. The approach is based on dissipative particle dynamics. While the model is potentially applicable to a wide range of self-propelled particle systems, the specific class of self-motile bacterial suspensions is considered as a modeling scenario. To mimic the rod-like geometry of a bacterium, two dissipative particle dynamics particles are connected by a stiff harmonic spring to form an aggregate dissipative particle dynamics molecule. Bacterial motility is modeled through a constant self-propulsion force applied along the axis of each such aggregate molecule. The model accounts for hydrodynamic interactions between self-propelled agents through the pairwise dissipative interactions conventional to dissipative particle dynamics. Numerical simulations are performed using a customized version of the open-source software package LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator). Detailed studies of the influence of agent concentration, pairwise dissipative interactions, and Stokes friction on the statistics of the system are provided. The simulations are used to explore the influence of hydrodynamic interactions in active suspensions. For high agent concentrations in combination with dominating pairwise dissipative forces, strongly correlated motion patterns and fluid-like spectral distributions of kinetic energy are found. In contrast, systems dominated by Stokes friction exhibit weaker spatial correlations of the velocity field. These results indicate that hydrodynamic interactions may play an important role in the formation of spatially extended structures in active suspensions.
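The two-particle swimmer described above can be sketched with a minimal integrator (a toy illustration, not the LAMMPS model: the pairwise dissipative and random DPD forces are omitted, and all parameter values are invented):

```python
import math

# Parameters (invented for the sketch): spring stiffness, rest length,
# propulsion force, Stokes friction coefficient, time step; unit masses.
K, REST, F_PROP, GAMMA, DT = 100.0, 1.0, 2.0, 1.0, 0.01

def step(pos, vel):
    (x1, y1), (x2, y2) = pos
    dx, dy = x2 - x1, y2 - y1
    r = math.hypot(dx, dy)
    ux, uy = dx / r, dy / r                  # unit vector along the rod axis
    fs = K * (r - REST)                      # harmonic spring force magnitude
    forces = [(fs * ux + F_PROP * ux - GAMMA * vel[0][0],
               fs * uy + F_PROP * uy - GAMMA * vel[0][1]),
              (-fs * ux + F_PROP * ux - GAMMA * vel[1][0],
               -fs * uy + F_PROP * uy - GAMMA * vel[1][1])]
    for i in (0, 1):                         # semi-implicit Euler update
        vel[i] = [vel[i][0] + DT * forces[i][0], vel[i][1] + DT * forces[i][1]]
        pos[i] = [pos[i][0] + DT * vel[i][0], pos[i][1] + DT * vel[i][1]]

pos = [[0.0, 0.0], [1.0, 0.0]]               # rod initially at its rest length
vel = [[0.0, 0.0], [0.0, 0.0]]
for _ in range(1000):
    step(pos, vel)
center = [(pos[0][0] + pos[1][0]) / 2, (pos[0][1] + pos[1][1]) / 2]
print("center of mass:", center)
```

With propulsion along the axis balanced against Stokes friction, the swimmer approaches a terminal speed of F_PROP/GAMMA while the stiff spring keeps its length essentially fixed; the paper's many-swimmer statistics arise from adding the DPD pair forces this sketch leaves out.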
Adaptive awareness for personal and small group decision making.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perano, Kenneth J.; Tucker, Steve; Pancerella, Carmen M.
2003-12-01
Many situations call for the use of sensors monitoring physiological and environmental data. In order to use the large amounts of sensor data to affect decision making, we are coupling heterogeneous sensors with small, light-weight processors, other powerful computers, wireless communications, and embedded intelligent software. The result is an adaptive awareness and warning tool, which provides both situation awareness and personal awareness to individuals and teams. Central to this tool is a sensor-independent architecture, which combines both software agents and a reusable core software framework that manages the available hardware resources and provides services to the agents. Agents can recognize cues from the data, warn humans about situations, and act as decision-making aids. Within the agents, self-organizing maps (SOMs) are used to process physiological data in order to provide personal awareness. We have employed a novel clustering algorithm to train the SOM to discern individual body states and activities. This awareness tool has broad applicability to emergency teams, military squads, military medics, individual exercise and fitness monitoring, health monitoring for sick and elderly persons, and environmental monitoring in public places. This report discusses our hardware decisions, software framework, and a pilot awareness tool, which has been developed at Sandia National Laboratories.
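The SOM processing the abstract mentions can be illustrated with a single training step: find the best-matching unit for an input vector and pull neighboring units toward it. This is the textbook update rule on a 1-D grid, not Sandia's clustering algorithm; sizes and rates are made up.

```python
import numpy as np

def som_update(weights, x, lr=0.1, sigma=1.0):
    """One training step of a self-organizing map on a 1-D grid of units.
    `weights` has shape (n_units, n_features); a real physiological-data
    SOM would use a 2-D grid and decaying lr/sigma -- this is a sketch."""
    dists = np.linalg.norm(weights - x, axis=1)
    bmu = int(np.argmin(dists))                        # best-matching unit
    grid = np.arange(len(weights))
    h = np.exp(-((grid - bmu) ** 2) / (2 * sigma**2))  # neighborhood kernel
    return weights + lr * h[:, None] * (x - weights), bmu
```

Repeating this step over a stream of sensor vectors lets clusters of units come to represent distinct body states.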
Mobile Agents: A Distributed Voice-Commanded Sensory and Robotic System for Surface EVA Assistance
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten; Alena, Rick; Crawford, Sekou; Dowding, John; Graham, Jeff; Kaskiris, Charis; Tyree, Kim S.; vanHoof, Ronnie
2003-01-01
A model-based, distributed architecture integrates diverse components in a system designed for lunar and planetary surface operations: spacesuit biosensors, cameras, GPS, and a robotic assistant. The system transmits data and assists communication between the extra-vehicular activity (EVA) astronauts, the crew in a local habitat, and a remote mission support team. Software processes ("agents"), implemented in a system called Brahms, run on multiple, mobile platforms, including the spacesuit backpacks, all-terrain vehicles, and robot. These "mobile agents" interpret and transform available data to help people and robotic systems coordinate their actions to make operations more safe and efficient. Different types of agents relate platforms to each other ("proxy agents"), devices to software ("comm agents"), and people to the system ("personal agents"). A state-of-the-art spoken dialogue interface enables people to communicate with their personal agents, supporting a speech-driven navigation and scheduling tool, field observation record, and rover command system. An important aspect of the engineering methodology involves first simulating the entire hardware and software system in Brahms, and then configuring the agents into a runtime system. Design of mobile agent functionality has been based on ethnographic observation of scientists working in Mars analog settings in the High Canadian Arctic on Devon Island and the southeast Utah desert. The Mobile Agents system is developed iteratively in the context of use, with people doing authentic work. This paper provides a brief introduction to the architecture and emphasizes the method of empirical requirements analysis, through which observation, modeling, design, and testing are integrated in simulated EVA operations.
A web-based approach for electrocardiogram monitoring in the home.
Magrabi, F; Lovell, N H; Celler, B G
1999-05-01
A Web-based electrocardiogram (ECG) monitoring service, in which a longitudinal clinical record is used for the management of patients, is described. The Web application is used to collect clinical data from the patient's home. A database on the server acts as a central repository where this clinical information is stored. A Web browser provides access to the patient's records and ECG data. We discuss the technologies used to automate the retrieval and storage of clinical data from a patient database, and the recording and reviewing of clinical measurement data. On the client's Web browser, ActiveX controls embedded in the Web pages provide a link between the various components, including the Web server, the Web page, the specialised client-side ECG review and acquisition software, and the local file system. The ActiveX controls also implement FTP functions to retrieve and submit clinical data to and from the server. An intelligent software agent on the server is activated whenever new ECG data is sent from the home. The agent compares historical data with newly acquired data. Using this method, an optimum patient care strategy can be evaluated, and a summarised report, along with reminders and suggestions for action, is sent to the doctor and patient by email.
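The server-side agent's comparison step might look like the following sketch: compare a new ECG summary against the patient's historical baseline and flag large deviations for the emailed report. The field names and the 15% threshold are illustrative assumptions, not the paper's actual rules.

```python
def review_ecg(history, new):
    """Flag a new ECG summary whose mean heart rate deviates from the
    patient's historical mean by more than a threshold (hypothetical
    record format: dicts with a 'mean_hr' field)."""
    hist_mean = sum(r["mean_hr"] for r in history) / len(history)
    deviation = abs(new["mean_hr"] - hist_mean) / hist_mean
    if deviation > 0.15:
        return (f"ALERT: mean HR {new['mean_hr']} deviates "
                f"{deviation:.0%} from baseline {hist_mean:.0f}")
    return "OK: within historical range"
```

The returned string stands in for the summarised report the agent emails to doctor and patient.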
NASA Astrophysics Data System (ADS)
Bosse, Stefan
2013-05-01
Sensorial materials consisting of high-density, miniaturized, and embedded sensor networks require new robust and reliable data processing and communication approaches. Structural health monitoring is one major field of application for sensorial materials. Each sensor node provides some kind of sensor, electronics, data processing, and communication, with a strong focus on microchip-level implementation to meet the goals of miniaturization and low-power environments, a prerequisite for autonomous behaviour and operation. Reliability requires robustness of the entire system in the presence of node, link, data processing, and communication failures. Interaction between nodes is required to manage and distribute information. One common interaction model is the mobile agent. An agent approach provides stronger autonomy than a traditional object or remote-procedure-call based approach. Agents can decide for themselves which actions are performed, and they are capable of flexible behaviour, reacting to the environment and other agents, providing some degree of robustness. Traditionally, multi-agent systems are abstract programming models which are implemented in software and executed on program-controlled computer architectures. This approach does not scale well to the microchip level, requires fully equipped computers and communication structures, and the hardware architecture does not consider and reflect the requirements for agent processing and interaction. We propose and demonstrate a novel design paradigm for reliable distributed data processing systems and a synthesis methodology and framework for multi-agent systems implementable entirely on the microchip level with resource- and power-constrained digital logic, supporting Agent-On-Chip (AoC) architectures. The agent behaviour and mobility are fully integrated on the microchip using pipelined communicating processes implemented with finite-state machines and register-transfer logic.
The agent behaviour, interaction (communication), and mobility features are modelled and specified on a machine-independent abstract programming level using a state-based agent behaviour language (APL). With this APL, a high-level agent compiler is able to synthesize a hardware model (RTL, VHDL), a software model (C, ML), or a simulation model (XML) suitable for simulating a multi-agent system with the SeSAm simulator framework. Agent communication is provided by a simple tuple-space database implemented on the node level, providing fault-tolerant access to global data. A novel synthesis development kit (SynDK) based on a graph-structured database approach is introduced to support the rapid development of compilers and synthesis tools, used for example for the design and implementation of the APL compiler.
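The state-based agent behaviour that an APL-like language specifies, and that the compiler lowers to a finite-state machine, can be sketched as a handler per state plus a tuple-space write. The states, transitions, and environment layout below are invented for illustration.

```python
def make_agent():
    """A tiny sense/check/report agent as an explicit state machine,
    mimicking the kind of behaviour an APL program would describe."""
    state = {"name": "SENSE", "reading": None}

    def step(env):
        if state["name"] == "SENSE":
            state["reading"] = env["sensor"]       # sample the sensor
            state["name"] = "CHECK"
        elif state["name"] == "CHECK":             # threshold test
            state["name"] = "REPORT" if state["reading"] > env["limit"] else "SENSE"
        elif state["name"] == "REPORT":
            # write into the shared tuple space, as node-level agents do
            env["tuple_space"].append(("alarm", state["reading"]))
            state["name"] = "SENSE"
        return state["name"]

    return step
```

In hardware, each branch of `step` would become a state of a pipelined FSM and the tuple-space append a register-transfer operation.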
Advantages of Brahms for Specifying and Implementing a Multiagent Human-Robotic Exploration System
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten; Kaskiris, Charis; vanHoof, Ron
2003-01-01
We have developed a model-based, distributed architecture that integrates diverse components in a system designed for lunar and planetary surface operations: an astronaut's space suit, cameras, all-terrain vehicles, robotic assistant, crew in a local habitat, and mission support team. Software processes ('agents'), implemented in the Brahms language, run on multiple, mobile platforms. These mobile agents interpret and transform available data to help people and robotic systems coordinate their actions to make operations more safe and efficient. The Brahms-based mobile agent architecture (MAA) uses a novel combination of agent types so the software agents may understand and facilitate communications between people and between system components. A state-of-the-art spoken dialogue interface is integrated with Brahms models, supporting a speech-driven field observation record and rover command system. An important aspect of the methodology involves first simulating the entire system in Brahms, then configuring the agents into a runtime system. Thus, Brahms provides a language, engine, and system builder's toolkit for specifying and implementing multiagent systems.
Metrics of a Paradigm for Intelligent Control
NASA Technical Reports Server (NTRS)
Hexmoor, Henry
1999-01-01
We present metrics for quantifying organizational structures of complex control systems intended for controlling long-lived robotic or other autonomous applications commonly found in space applications. Such advanced control systems are often called integration platforms or agent architectures. Reported metrics span concerns about time, resources, software engineering, and complexities in the world.
Code of Federal Regulations, 2012 CFR
2012-07-01
... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., date bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...
Improving Automated Lexical and Discourse Analysis of Online Chat Dialog
2007-09-01
include spelling- and grammar-checking in our word processing software; voice recognition in our automobiles; and telephone-based conversational agents ... conversational agents can help customers make purchases on-line [3]. In addition, discourse analyzers can automatically separate multiple, interleaved ... a telephone-based conversational agent needs to know if it was asked a question or tasked to do something. Indeed, Stolcke et al. demonstrated that
Intelligent Agents and Their Potential for Future Design and Synthesis Environment
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)
1999-01-01
This document contains the proceedings of the Workshop on Intelligent Agents and Their Potential for Future Design and Synthesis Environment, held at NASA Langley Research Center, Hampton, VA, September 16-17, 1998. The workshop was jointly sponsored by the University of Virginia's Center for Advanced Computational Technology and NASA. Workshop attendees came from NASA, industry and universities. The objectives of the workshop were to assess the status of intelligent agents technology and to identify the potential of software agents for use in future design and synthesis environment. The presentations covered the current status of agent technology and several applications of intelligent software agents. Certain materials and products are identified in this publication in order to specify adequately the materials and products that were investigated in the research effort. In no case does such identification imply recommendation or endorsement of products by NASA, nor does it imply that the materials and products are the only ones or the best ones available for this purpose. In many cases equivalent materials and products are available and would probably produce equivalent results.
A comprehensive overview of the applications of artificial life.
Kim, Kyung-Joong; Cho, Sung-Bae
2006-01-01
We review the applications of artificial life (ALife), the creation of synthetic life on computers to study, simulate, and understand living systems. The definition and features of ALife are shown through application studies. ALife application fields treated include robot control, robot manufacturing, practical robots, computer graphics, natural phenomenon modeling, entertainment, games, music, economics, Internet, information processing, industrial design, simulation software, electronics, security, data mining, and telecommunications. In order to show the status of ALife application research, this review primarily features a survey of about 180 ALife application articles rather than a selected representation of a few articles. Evolutionary computation is the most popular method for designing such applications, but recently swarm intelligence, artificial immune networks, and agent-based modeling have also produced results. Applications were initially restricted to robotics and computer graphics, but presently many different applications in engineering areas are of interest.
EVA: Collaborative Distributed Learning Environment Based in Agents.
ERIC Educational Resources Information Center
Sheremetov, Leonid; Tellez, Rolando Quintero
In this paper, a Web-based learning environment developed within the project called Virtual Learning Spaces (EVA, in Spanish) is presented. The environment is composed of knowledge, collaboration, consulting, experimentation, and personal spaces as a collection of agents and conventional software components working over the knowledge domains. All…
An Ontology for Software Engineering Education
ERIC Educational Resources Information Center
Ling, Thong Chee; Jusoh, Yusmadi Yah; Adbullah, Rusli; Alwi, Nor Hayati
2013-01-01
Software agents communicate using ontologies. It is important to build an ontology for a specific domain such as Software Engineering Education. Building an ontology from scratch is not only hard but also incurs much time and cost. This study aims to propose an ontology through adaptation of an existing ontology which was originally built based on a…
From Goal-Oriented Requirements to Event-B Specifications
NASA Technical Reports Server (NTRS)
Aziz, Benjamin; Arenas, Alvaro E.; Bicarregui, Juan; Ponsard, Christophe; Massonet, Philippe
2009-01-01
In goal-oriented requirements engineering methodologies, goals are structured into refinement trees from high-level system-wide goals down to fine-grained requirements assigned to specific software/hardware/human agents that can realise them. Functional goals assigned to software agents need to be operationalised into specifications of the services that the agent should provide to realise those requirements. In this paper, we propose an approach for operationalising requirements into specifications expressed in the Event-B formalism. Our approach has the benefit of aiding software designers by bridging the gap between declarative requirements and operational system specifications in a rigorous manner, enabling powerful correctness proofs and allowing further refinements down to the implementation level. Our solution is based on verifying that a consistent Event-B machine exhibits properties corresponding to requirements.
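The core check behind this verification, that every event of the machine preserves the machine invariant, can be rendered as a toy executable model. Event-B discharges this obligation by proof; exhaustive checking over a small finite state space stands in for it here, purely for illustration.

```python
def preserves_invariant(invariant, guard, action, states):
    """Toy rendition of the Event-B invariant-preservation obligation:
    for every state satisfying the invariant and the event's guard,
    the state after the event's action must satisfy the invariant too.
    `states` is a small finite universe standing in for a proof."""
    return all(invariant(action(s))
               for s in states
               if guard(s) and invariant(s))
```

For example, an event `decrement` with guard `s > 0` and action `s := s - 1` preserves the invariant `s >= 0`, while `s := s - 2` does not.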
In-vitro terahertz spectroscopy of rat skin under the action of dehydrating agents
NASA Astrophysics Data System (ADS)
Kolesnikov, Aleksandr S.; Kolesnikova, Ekaterina A.; Tuchina, Daria K.; Terentyuk, Artem G.; Nazarov, Maxim; Skaptsov, Alexander A.; Shkurinov, Alexander P.; Tuchin, Valery V.
2014-01-01
In the paper we present the results of a study of rat skin and rat subcutaneous tumor under the action of dehydrating agents in the terahertz (THz) range (15-30 THz). Frustrated Total Internal Reflection (FTIR) spectra were obtained with a Nicolet 6700 infrared Fourier spectrometer and then recalculated into transmittance spectra with Omnic software. Experiments were carried out with healthy and xenografted tumor skin tissue in vitro. As the dehydrating agents, 100% glycerol, 40% glucose-water solution, PEG-600, and propylene glycol were used. To determine the effect of the optical clearing agent (OCA), the alterations of terahertz transmittance for the samples were analyzed. The results have shown that PEG-600 and the 40% glucose-water solution are the most effective dehydrating agents. The transmittance of healthy skin after PEG-600 application increased by approximately 6%, and the transmittance of tumor tissue after PEG-600 and 40% glucose-water solution application increased by approximately 8%. The obtained data can be useful for further application of terahertz radiation to tumor diagnostics.
Validating agent oriented methodology (AOM) for netlogo modelling and simulation
NASA Astrophysics Data System (ADS)
WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan
2017-10-01
AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent-oriented software development. The AOM methodology was proposed to aid developers by introducing techniques, terminology, notation, and guidelines during agent system development. Although AOM is claimed to be capable of developing complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and its adoption is still in its infancy. Among the reasons is that there are few case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. It demonstrates how AOM is useful for epidemiology and ecological studies, and hence further validates AOM in a qualitative manner.
Component Models for Semantic Web Languages
NASA Astrophysics Data System (ADS)
Henriksson, Jakob; Aßmann, Uwe
Intelligent applications and agents on the Semantic Web typically need to be specified with, or interact with specifications written in, many different kinds of formal languages. Such languages include ontology languages, data and metadata query languages, as well as transformation languages. As learnt from years of experience in development of complex software systems, languages need to support some form of component-based development. Components enable higher software quality, better understanding and reusability of already developed artifacts. Any component approach contains an underlying component model, a description detailing what valid components are and how components can interact. With the multitude of languages developed for the Semantic Web, what are their underlying component models? Do we need to develop one for each language, or is a more general and reusable approach achievable? We present a language-driven component model specification approach. This means that a component model can be (automatically) generated from a given base language (actually, its specification, e.g. its grammar). As a consequence, we can provide components for different languages and simplify the development of software artifacts used on the Semantic Web.
Developing framework for agent- based diabetes disease management system: user perspective.
Mohammadzadeh, Niloofar; Safdari, Reza; Rahimi, Azin
2014-02-01
One of the characteristics of agents is mobility, which makes them very suitable for remote electronic health and telemedicine. The aim of this study is to develop a framework for agent-based diabetes information management at the national level by identifying the required agents. The main tool is a questionnaire designed in three sections based on a study of library resources, the performance of major organizations in the field of diabetes inside and outside the country, and interviews with experts in the medical, health information management, and software fields. Questionnaires based on the Delphi method were distributed among 20 experts. In order to design and identify the agents required in health information management for the prevention and appropriate, rapid treatment of diabetes, the results were analyzed using SPSS 17 and plotted with the FreePlane mind-map software. Access to data technology in the proposed framework, in order of priority, is: mobile (mean 1.80), SMS and email (mean 2.80), internet/web (mean 3.30), phone (mean 3.60), and WiFi (mean 4.60). In delivering health care to diabetic patients, considering social and human aspects is essential. A systematic view of the implementation of agent systems, with attention to all aspects such as feedback, user acceptance, budget, motivation, hierarchy, useful standards, affordability for individuals, and the identification of barriers and opportunities, is necessary.
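The mean-rank priorities reported above come from a standard Delphi aggregation: each expert ranks the technologies (1 = highest priority) and the lowest mean rank wins. A minimal sketch, with made-up responses rather than the study's data:

```python
def delphi_priorities(responses):
    """Aggregate Delphi questionnaire ranks: `responses` maps each
    technology to the list of ranks assigned by the experts
    (1 = highest priority). Returns (technology, mean rank) pairs
    sorted from highest to lowest priority."""
    means = {tech: sum(ranks) / len(ranks) for tech, ranks in responses.items()}
    return sorted(means.items(), key=lambda kv: kv[1])
```

Applied to the study's results, "mobile" with mean 1.80 would sort first and "WiFi" with mean 4.60 last.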
Designing Distributed Learning Environments with Intelligent Software Agents
ERIC Educational Resources Information Center
Lin, Fuhua, Ed.
2005-01-01
"Designing Distributed Learning Environments with Intelligent Software Agents" reports on the most recent advances in agent technologies for distributed learning. Chapters are devoted to the various aspects of intelligent software agents in distributed learning, including the methodological and technical issues on where and how intelligent agents…
Evacuation Simulation in Kalayaan Residence Hall, up Diliman Using Gama Simulation Software
NASA Astrophysics Data System (ADS)
Claridades, A. R. C.; Villanueva, J. K. S.; Macatulad, E. G.
2016-09-01
Agent-Based Modeling (ABM) has recently been adopted in some studies for the modelling of events as a dynamic system given a set of events and parameters. In principle, ABM employs individual agents with assigned attributes and behaviors and simulates their behavior around their environment and interaction with other agents. This can be a useful tool in both micro- and macroscale applications. In this study, a model initially created and applied to an academic building was implemented in a dormitory. In particular, this research integrates three-dimensional Geographic Information System (GIS) data with GAMA as the multi-agent-based evacuation simulation and is implemented in Kalayaan Residence Hall. A three-dimensional GIS model is created based on the floor plans and demographic data of the dorm, including the respective pathways as networks, rooms, floors, exits, and appropriate attributes. This model is then re-implemented in GAMA. Different states of the agents and their effect on evacuation time were then observed. GAMA simulation with varying path widths was also performed. It was found that, compared to their original states, panic, eating, and studying hasten evacuation, while on the other hand sleeping and being in the bathroom are impediments. It is also concluded that evacuation time is halved when path widths are doubled; however, it is recommended that further studies model pathways as spaces instead of lines. A more scientific basis for predicting agent behavior in these states is also recommended for more realistic results.
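The path-width finding is what a simple flow model would predict: if exit throughput scales with width, doubling the width halves the clearance time. The toy model below illustrates this arithmetic; the per-state delay values are invented, not the study's calibrated behaviors.

```python
def evacuate_time(agents, path_width):
    """Toy clearance-time model: each agent contributes a number of
    'person-steps' depending on its state, and throughput scales
    linearly with path width. Delay values are illustrative only."""
    state_delay = {"awake": 0, "panicking": -2, "sleeping": 5, "bathroom": 4}
    total = sum(10 + state_delay[a] for a in agents)  # person-steps to clear
    return total / path_width                         # wider path, faster flow

# Doubling the path width halves the clearance time:
t1 = evacuate_time(["awake", "sleeping"], path_width=1.0)
t2 = evacuate_time(["awake", "sleeping"], path_width=2.0)
```

A full ABM replaces the fixed per-state delays with simulated movement and congestion, which is why the study recommends modeling pathways as spaces rather than lines.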
CLOUDCLOUD : general-purpose instrument monitoring and data managing software
NASA Astrophysics Data System (ADS)
Dias, António; Amorim, António; Tomé, António
2016-04-01
An effective experiment is dependent on the ability to store and deliver data and information to all participant parties regardless of their degree of involvement in the specific parts that make the experiment a whole. Having fast, efficient, and ubiquitous access to data will increase visibility and discussion, such that the outcome will have already been reviewed several times, strengthening the conclusions. The CLOUD project aims at providing users with a general-purpose data acquisition, management, and instrument monitoring platform that is fast, easy to use, lightweight, and accessible to all participants of an experiment. This work is now implemented in the CLOUD experiment at CERN and will be fully integrated with the experiment as of 2016. Despite being used in an experiment of the scale of CLOUD, this software can also be used in any size of experiment or monitoring station, from single computers to large networks of computers, to monitor any sort of instrument output without influencing the individual instrument's DAQ. Instrument data and metadata are stored and accessed via a specially designed database architecture, and any type of instrument output is accepted by our continuously growing parsing application. Multiple databases can be used to separate different data-taking periods, or a single database can be used if, for instance, an experiment is continuous. A simple web-based application gives the user total control over the monitored instruments and their data, allowing data visualization and download, upload of processed data, and the ability to edit existing instruments or add new instruments to the experiment. When in a network, new computers are immediately recognized and added to the system and are able to monitor instruments connected to them.
Automatic computer integration is achieved by a locally running Python-based parsing agent that communicates with a main server application, guaranteeing that all instruments assigned to that computer are monitored with parsing intervals as fast as milliseconds. This software (server + agents + interface + database) comes in easy, ready-to-use packages that can be installed on any operating system, including Android and iOS systems. This software is ideal for use in modular experiments or monitoring stations with large variability in instruments and measuring methods, or in large collaborations where data requires homogenization in order to be effectively transmitted to all involved parties. This work presents the software and provides a performance comparison with previously used monitoring systems in the CLOUD experiment at CERN.
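The locally running parsing agent described above amounts to a polling loop: read each assigned instrument, parse its raw output, and push the record to the main server. A minimal sketch, where `parse` and `upload` stand in for the real CLOUD components:

```python
import time

def parsing_agent(instruments, parse, upload, interval=0.001):
    """One polling sweep of a local parsing agent: read every assigned
    instrument, parse the raw output, and upload the record. `parse`
    and `upload` are hypothetical callables, not CLOUD's actual API."""
    records = []
    for inst in instruments:
        raw = inst.read()                  # instrument-specific raw output
        records.append(upload(parse(raw))) # homogenize, then ship to server
        time.sleep(interval)               # millisecond-scale parsing interval
    return records
```

In the real system this sweep runs continuously and the server side writes the uploaded records into the experiment database.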
More About Software for No-Loss Computing
NASA Technical Reports Server (NTRS)
Edmonds, Iarina
2007-01-01
A document presents some additional information on the subject matter of "Integrated Hardware and Software for No- Loss Computing" (NPO-42554), which appears elsewhere in this issue of NASA Tech Briefs. To recapitulate: The hardware and software designs of a developmental parallel computing system are integrated to effectuate a concept of no-loss computing (NLC). The system is designed to reconfigure an application program such that it can be monitored in real time and further reconfigured to continue a computation in the event of failure of one of the computers. The design provides for (1) a distributed class of NLC computation agents, denoted introspection agents, that effects hierarchical detection of anomalies; (2) enhancement of the compiler of the parallel computing system to cause generation of state vectors that can be used to continue a computation in the event of a failure; and (3) activation of a recovery component when an anomaly is detected.
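The state-vector mechanism in item (2) can be sketched as checkpointing: snapshot the computation's state after every step so that, when the recovery component detects a failure, a surviving node can resume from the last good checkpoint. The step functions below are illustrative, not the NLC compiler's generated code.

```python
import copy

def checkpointed_run(steps, state):
    """Run a sequence of step functions, snapshotting a 'state vector'
    after each one so the computation can be continued after a failure."""
    checkpoints = [copy.deepcopy(state)]
    for step in steps:
        state = step(state)
        checkpoints.append(copy.deepcopy(state))
    return state, checkpoints

def resume(steps, checkpoints, failed_at):
    """Re-run from the checkpoint preceding the failed step, as a
    surviving computer would after an anomaly is detected."""
    state = copy.deepcopy(checkpoints[failed_at])
    for step in steps[failed_at:]:
        state = step(state)
    return state
```

The introspection agents of item (1) would supply `failed_at` by detecting which step's anomaly interrupted the run.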
Generic Divide and Conquer Internet-Based Computing
NASA Technical Reports Server (NTRS)
Radenski, Atanas; Follen, Gregory J. (Technical Monitor)
2001-01-01
The rapid growth of internet-based applications and the proliferation of networking technologies have been transforming traditional commercial application areas as well as computer and computational sciences and engineering. This growth stimulates the exploration of new, internet-oriented software technologies that can open new research and application opportunities not only for the commercial world, but also for the scientific and high-performance computing applications community. The general goal of this research project is to contribute to better understanding of the transition to internet-based high-performance computing and to develop solutions for some of the difficulties of this transition. More specifically, our goal is to design an architecture for generic divide and conquer internet-based computing, to develop a portable implementation of this architecture, to create an example library of high-performance divide-and-conquer computing agents that run on top of this architecture, and to evaluate the performance of these agents. We have been designing an architecture that incorporates a master task-pool server and utilizes satellite computational servers that operate on the Internet in a dynamically changing large configuration of lower-end nodes provided by volunteer contributors. Our designed architecture is intended to be complementary to and accessible from computational grids such as Globus, Legion, and Condor. Grids provide remote access to existing high-end computing resources; in contrast, our goal is to utilize idle processor time of lower-end internet nodes. Our project is focused on a generic divide-and-conquer paradigm and its applications that operate on a loose and ever-changing pool of lower-end internet nodes.
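The master task-pool design can be sketched generically: tasks above a threshold are split and enqueued on the pool (in the real system, picked up by satellite servers), while small tasks are solved directly and their results combined. This sequential sketch only illustrates the divide-and-conquer skeleton; the distribution machinery is elided.

```python
def solve(task, threshold, base, divide, combine, pool):
    """Generic divide-and-conquer over a shared task pool. `base`
    solves a small task, `divide` splits a task in two, `combine`
    merges two results. A distributed version would hand popped
    tasks to satellite servers instead of recursing locally."""
    if task <= threshold:
        return base(task)
    pool.extend(divide(task))                  # master enqueues subtasks
    left = solve(pool.pop(), threshold, base, divide, combine, pool)
    right = solve(pool.pop(), threshold, base, divide, combine, pool)
    return combine(left, right)
```

For example, summing an integer by repeatedly halving it reconstitutes the original value, which makes the skeleton easy to check.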
Building distributed rule-based systems using the AI Bus
NASA Technical Reports Server (NTRS)
Schultz, Roger D.; Stobie, Iain C.
1990-01-01
The AI Bus software architecture was designed to support the construction of large-scale, production-quality applications in areas of high technology flux, running in heterogeneous distributed environments and utilizing a mix of knowledge-based and conventional components. These goals led to its current development as a layered, object-oriented library for cooperative systems. This paper describes the concepts and design of the AI Bus and its implementation status as a library of reusable and customizable objects, structured in layers from operating system interfaces up to high-level knowledge-based agents. Each agent is a semi-autonomous process with specialized expertise, and consists of a number of knowledge sources (a knowledge base and inference engine). Inter-agent communication mechanisms are based on blackboards and Actors-style acquaintances. As a conservative first implementation, we used C++ on top of Unix and wrapped an embedded CLIPS with methods for the knowledge source class. This involved designing standard protocols for communication and functions which use these protocols in rules. Embedding several CLIPS objects within a single process was an unexpected problem because of global variables; the solution involved constructing and recompiling a C++ version of CLIPS. We are currently working on a more radical approach to incorporating CLIPS, by separating out its pattern matcher, rule and fact representations, and other components as true object-oriented modules.
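The blackboard communication style the abstract mentions can be sketched in a few lines: agents post and read shared entries by topic instead of addressing each other directly. This is the generic pattern, with the knowledge-source logic reduced to plain functions; it is not the AI Bus C++ API.

```python
class Blackboard:
    """Minimal topic-keyed blackboard: agents contribute entries and
    react to entries posted by others, never calling each other directly."""
    def __init__(self):
        self.entries = []

    def post(self, topic, data):
        self.entries.append((topic, data))

    def read(self, topic):
        return [d for t, d in self.entries if t == topic]

def sensor_agent(bb):
    bb.post("raw", 42)             # one knowledge source contributes data

def analyst_agent(bb):
    for value in bb.read("raw"):   # another reacts to what it finds
        bb.post("analysis", value * 2)
```

In the AI Bus, each knowledge source would additionally carry its own rule base and inference engine, with the blackboard mediating between them.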
Knowledge focus via software agents
NASA Astrophysics Data System (ADS)
Henager, Donald E.
2001-09-01
The essence of military Command and Control (C2) is making knowledge-intensive decisions in a limited amount of time using uncertain, incorrect, or outdated information. It is essential to give decision-makers tools that provide: * management of friendly forces by treating the "friendly resources as a system"; * rapid assessment of the effects of military actions against the "enemy as a system"; * assessment of how an enemy should, can, and could react to friendly military activities. Software agents in the form of mission agents, target agents, maintenance agents, and logistics agents can meet this information challenge. The role of each agent is to know all the details about its assigned mission, target, maintenance, or logistics entity. The Mission Agent would fight for mission resources based on the mission priority and analyze the effect that a proposed mission's results would have on the enemy. The Target Agent (TA) communicates with other targets to determine its role in the system of targets. A system of TAs would be able to inform a planner or analyst of the status of a system of targets, the effect of that status, and the effect of attacks on that system. The system of TAs would also be able to analyze possible enemy reactions to attack by determining ways to minimize the effect of attack, such as rerouting traffic or using deception. The Maintenance Agent would schedule maintenance events and notify the maintenance unit. The Logistics Agent would manage shipment and delivery of supplies to maintain appropriate levels of weapons, fuel, and spare parts. The central idea underlying this class of software agents is knowledge focus. Software agents are created automatically to focus their attention on individual real-world entities (e.g., missions, targets) and view the world from that entity's perspective. The agent autonomously monitors the entity, identifies problems/opportunities, formulates solutions, and informs the decision-maker.
The agent must be able to communicate to receive and disseminate information and provide the decision-maker with assistance via focused knowledge. The agent must also be able to monitor the state of its own environment and make decisions necessary to carry out its delegated tasks. Agents bring three elements to the C2 domain that offer to improve decision-making. First, they provide higher-quality feedback and provide it more often. In doing so, the feedback loop becomes nearly continuous, reducing or eliminating delays in situation updates to decision-makers. Working with the most current information possible improves the control process, thus enabling effects-based operations. Second, the agents accept delegation of actions and perform those actions following an established process. Agents' consistent actions reduce the variability of human input and stabilize the control process. Third, through the delegation of actions, agents ensure 100 percent consideration of plan details.
Launch Commit Criteria Monitoring Agent
NASA Technical Reports Server (NTRS)
Semmel, Glenn S.; Davis, Steven R.; Leucht, Kurt W.; Rowe, Dan A.; Kelly, Andrew O.; Boeloeni, Ladislau
2005-01-01
The Spaceport Processing Systems Branch at NASA Kennedy Space Center has developed and deployed a software agent to monitor the Space Shuttle's ground processing telemetry stream. The application, the Launch Commit Criteria Monitoring Agent, increases situational awareness for system and hardware engineers during Shuttle launch countdown. The agent provides autonomous monitoring of the telemetry stream, automatically alerts system engineers when predefined criteria have been met, identifies limit warnings and violations of launch commit criteria, aids Shuttle engineers through troubleshooting procedures, and provides additional insight to verify appropriate troubleshooting of problems by contractors. The agent has successfully detected launch commit criteria warnings and violations on a simulated playback data stream. Efficiency and safety are improved through increased automation.
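The limit-checking core of such a monitoring agent can be sketched as follows: predefined criteria are evaluated against each telemetry sample, and limit warnings are distinguished from outright violations. Measurement names and thresholds here are invented; actual launch commit criteria are mission-specific and far more numerous.

```python
# Sketch of telemetry limit checking: each criterion carries a warning band
# inside a wider violation band, and every sample is checked against both.
from dataclasses import dataclass

@dataclass
class Criterion:
    measurement: str
    warn_low: float
    warn_high: float
    viol_low: float
    viol_high: float

def check_sample(criteria, sample):
    """Return (measurement, severity) alerts for one telemetry sample."""
    alerts = []
    for c in criteria:
        v = sample.get(c.measurement)
        if v is None:
            continue  # measurement absent from this sample
        if v < c.viol_low or v > c.viol_high:
            alerts.append((c.measurement, "VIOLATION"))
        elif v < c.warn_low or v > c.warn_high:
            alerts.append((c.measurement, "WARNING"))
    return alerts

criteria = [Criterion("tank_pressure_psi", 180, 220, 170, 230)]
print(check_sample(criteria, {"tank_pressure_psi": 226}))
# [('tank_pressure_psi', 'WARNING')] -- outside the warning band, inside limits
```

The deployed agent additionally walks engineers through troubleshooting procedures; only the threshold logic is shown here.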
Biowep: a workflow enactment portal for bioinformatics applications.
Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano
2007-03-08
The huge amount of biological information, its distribution over the Internet, and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of researchers, who lack these skills. A portal enabling them to benefit from these new technologies is still missing. We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface, and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of the workflows are annotated on the basis of their input and output, elaboration type, and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers.
The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software, together with the creation of effective workflows, can significantly improve the automation of in-silico analysis. Biowep is available to interested researchers as a reference portal. They are invited to submit their workflows to the workflow repository. Biowep is being further developed within the Laboratory of Interdisciplinary Technologies in Bioinformatics (LITBIO).
Literature Review on Systems of Systems (SoS): A Methodology With Preliminary Results
2013-11-01
Appendix H describes the Enhanced ISAAC Neural Simulation Toolkit (EINSTein). Appendix I describes the Map Aware Nonuniform Automata (MANA) agent-based model, including a quadrant chart addressing SoS and associated SoSA designs and a table of SoS and SoSA software component maturation scores for the MANA agent-based model.
WE-G-BRA-02: SafetyNet: Automating Radiotherapy QA with An Event Driven Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hadley, S; Kessler, M; Litzenberg, D
2015-06-15
Purpose: Quality assurance is an essential task in radiotherapy that often requires many manual tasks. We investigate the use of an event-driven framework in conjunction with software agents to automate QA and eliminate wait times. Methods: An in-house developed subscription-publication service, EventNet, was added to the Aria OIS to be a message broker for critical events occurring in the OIS and software agents. Software agents operate without user intervention and perform critical QA steps. The results of the QA are documented and the resulting event is generated and passed back to EventNet. Users can subscribe to those events and receive messages based on custom filters designed to send passing or failing results to physicists or dosimetrists. Agents were developed to expedite the following QA tasks: Plan Revision, Plan 2nd Check, SRS Winston-Lutz isocenter, Treatment History Audit, and Treatment Machine Configuration. Results: Plan approval in the Aria OIS was used as the event trigger for the Plan Revision QA and Plan 2nd Check agents. The agents pulled the plan data, executed the prescribed QA, stored the results, and updated EventNet for publication. The Winston-Lutz agent reduced QA time from 20 minutes to 4 minutes and provided a more accurate quantitative estimate of the radiation isocenter. The Treatment Machine Configuration agent automatically reports any changes to the treatment machine or HDR unit configuration. The agents are reliable, act immediately, and execute each task identically every time. Conclusion: An event-driven framework has inverted the data chase in our radiotherapy QA process. Rather than have dosimetrists and physicists push data to QA software and pull results back into the OIS, the software agents perform these steps immediately upon receiving the sentinel events from EventNet. Mr. Keranen is an employee of Varian Medical Systems. Dr. Moran's institution receives research support for her effort for a linear accelerator QA project from Varian Medical Systems. Other quality projects involving her effort are funded by Blue Cross Blue Shield of Michigan, the Breast Cancer Research Foundation, and the NIH.
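The subscription-publication pattern underlying EventNet can be sketched generically. In this illustrative Python model (all names are assumptions, not the authors' or Varian's API), a QA agent is triggered by a plan-approval event, and a notification subscriber uses a custom filter so that only failing results reach a physicist.

```python
# Minimal publish-subscribe broker: agents subscribe with per-subscriber
# filters and react to sentinel events instead of anyone polling for work.
from collections import defaultdict

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> [(filter, handler)]

    def subscribe(self, topic, handler, event_filter=lambda e: True):
        self.subscribers[topic].append((event_filter, handler))

    def publish(self, topic, event):
        for event_filter, handler in self.subscribers[topic]:
            if event_filter(event):
                handler(event)

notifications = []
broker = Broker()
# A QA agent runs on every plan approval and publishes its result...
broker.subscribe("plan.approved",
                 lambda e: broker.publish("qa.result",
                                          {"plan": e["plan"], "passed": True}))
# ...while the physicist's filter passes only failing results through.
broker.subscribe("qa.result", notifications.append,
                 event_filter=lambda e: not e["passed"])

broker.publish("plan.approved", {"plan": "plan-001"})
print(notifications)  # [] -- the passing result was filtered out
```

Publishing a failing `qa.result` event would land in `notifications`, mirroring how EventNet routes only the results a subscriber asked for.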
Govaerts, Paul J; Vaerenberg, Bart; De Ceulaer, Geert; Daemers, Kristin; De Beukelaer, Carina; Schauwers, Karen
2010-08-01
An intelligent agent, Fitting to Outcomes eXpert, was developed to optimize and automate cochlear implant (CI) programming. The current article describes the rationale, development, and features of this tool. Cochlear implant fitting is a time-consuming procedure in which the values of a subset of the available electric parameters are set based primarily on behavioral responses. It is comfort-driven, with high intraindividual and interindividual variability with respect both to the patient and to the clinician, and its validity in terms of process control can be questioned. Good clinical practice would require an outcome-driven approach. An intelligent agent may help solve the complexity of addressing more electric parameters based on a range of outcome measures. A software application was developed that consists of deterministic rules that analyze the map settings in the processor together with psychoacoustic test results (audiogram, A§E phoneme discrimination, A§E loudness scaling, speech audiogram) obtained with that map. The rules were based on daily clinical practice and the expertise of the CI programmers. Data transfer to and from this agent is either manual or through seamless digital communication with the CI fitting database and the psychoacoustic test suite. The agent recommends and executes modifications to the map settings to improve the outcome. Fitting to Outcomes eXpert is an operational intelligent agent, the principles of which are described. Its development and modes of operation are outlined, and a case example is given. Fitting to Outcomes eXpert has been in use for more than a year and appears capable of improving the measured outcome. It is argued that this novel tool allows a systematic approach focusing on outcome, reducing fitting time, and improving the quality of fitting. It introduces principles of artificial intelligence into the process of CI fitting.
Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment
NASA Technical Reports Server (NTRS)
Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun
2006-01-01
Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated, and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability, and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operations components.
We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, and high level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.
Mobile Router Developed and Tested
NASA Technical Reports Server (NTRS)
Ivancic, William D.
2002-01-01
The NASA Glenn Research Center, under a NASA Space Act Agreement with Cisco Systems, has been performing joint networking research to apply Internet-based technologies and protocols to space-based communications. As a result of this research, NASA performed stringent performance testing of the mobile router, including the interaction of routing and the transport-level protocol. In addition, Cisco Systems developed the mobile router for both commercial and Government markets. The code has become part of the Cisco Systems Internetwork Operating System (IOS) as of release 12.2(4)T, which makes this capability available to the community at large. The mobile router is software code that resides in a network router and enables entire networks to roam while maintaining connectivity to the Internet. This router code is pertinent to a myriad of applications for both Government and commercial sectors, including the "wireless battlefield." NASA and the Department of Defense will utilize this technology for near-planetary observation and sensing spacecraft. It is also a key enabling technology for aviation-based information applications. Mobile routing will make it possible for information such as weather, air traffic control, voice, and video to be transmitted to aircraft using Internet-based protocols. This technology shows great promise in reducing congested airways and mitigating aviation disasters due to bad weather. The mobile router can also be incorporated into emergency vehicles (such as ambulances and life-flight aircraft) to provide real-time connectivity back to the hospital and health-care experts, enabling the timely application of emergency care. Commercial applications include entertainment services, Internet protocol (IP) telephone, and Internet connectivity for cruise ships, commercial shipping, tour buses, aircraft, and eventually cars.
A mobile router, which is based on mobile IP, allows hosts (mobile nodes) to seamlessly "roam" among various IP subnetworks. This is essential in many wireless networks. A mobile router, unlike a mobile IP node, allows entire networks to roam. Hence, a device connected to the mobile router does not need to be a mobile node because the mobile router provides the roaming capabilities. There are three basic elements in the mobile IP: the home agent, the foreign agent, and the mobile node. The home agent is a router on a mobile node's home network that tunnels datagrams for delivery to the mobile node when it is away from home. The foreign agent is a router on a remote network that provides routing services to a registered mobile node. The mobile node is a host or router that changes its point of attachment from one network or subnetwork to another. In mobile routing, virtual communications are maintained by the home agent, which forwards all packets for the mobile networks to the foreign agent. The foreign agent passes the packets to the mobile router, which then forwards the packets to the devices on its networks. As the mobile router moves, it will register with its home agent on its whereabouts via the foreign agent to assure continuous connectivity.
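The forwarding chain just described (home agent tunnels to foreign agent, foreign agent passes to the mobile router, mobile router delivers to its attached devices) can be modeled in a few lines. This is a toy sketch of the data path only, with invented class names; real Mobile IP adds tunneling headers, registration lifetimes, and authentication.

```python
# Toy model of the mobile-IP forwarding chain: HomeAgent -> ForeignAgent ->
# MobileRouter -> attached host. Packets are plain dicts for illustration.
class MobileRouter:
    def __init__(self, attached_hosts):
        self.attached_hosts = set(attached_hosts)
        self.delivered = []

    def deliver(self, packet):
        if packet["dst"] in self.attached_hosts:
            self.delivered.append(packet)  # final hop onto the mobile network

class ForeignAgent:
    def __init__(self):
        self.registered = None            # the visiting mobile router

    def forward(self, packet):
        if self.registered:
            self.registered.deliver(packet)

class HomeAgent:
    def __init__(self):
        self.care_of = None               # current foreign agent

    def register(self, foreign_agent):    # mobile router reports its whereabouts
        self.care_of = foreign_agent

    def tunnel(self, packet):
        if self.care_of:
            self.care_of.forward(packet)

mr = MobileRouter({"10.0.0.7"})
fa = ForeignAgent()
fa.registered = mr
ha = HomeAgent()
ha.register(fa)                           # registration via the foreign agent
ha.tunnel({"dst": "10.0.0.7", "data": "hello"})
print(len(mr.delivered))  # 1 -- the packet reached the attached device
```

Note that the host at 10.0.0.7 needs no mobility support of its own: as the abstract states, the mobile router provides the roaming capability for the whole attached network.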
Quantitative characterization of cellular dose in vitro is needed for alignment of doses in vitro and in vivo. We used the agent-based software, CompuCell3D (CC3D), to provide a stochastic description of cell growth in culture. The model was configured so that isolated cells assu...
Overview of the joint services lightweight standoff chemical agent detector (JSLSCAD)
NASA Astrophysics Data System (ADS)
Hammond, Barney; Popa, Mirela
2005-05-01
This paper presents a system-level description of the Joint Services Lightweight Standoff Chemical Agent Detector (JSLSCAD). JSLSCAD is a passive Fourier Transform InfraRed (FTIR) based remote sensing system for detecting chemical warfare agents. Unlike predecessor systems, JSLSCAD is capable of operating while on the move to accomplish reconnaissance, surveillance, and contamination avoidance missions. Additionally, the system is designed to meet the needs for application on air and sea as well as ground mobile and fixed site platforms. The core of the system is a rugged Michelson interferometer with a flexure spring bearing mechanism and bi-directional data acquisition capability. The sensor is interfaced to a small, high performance spatial scanner that provides high-speed, two-axis area coverage. Command, control, and processing electronics have been coupled with real time control software and robust detection/discrimination algorithms. Operator interfaces include local and remote options in addition to interfaces to external communications networks. The modular system design facilitates interfacing to the many platforms targeted for JSLSCAD.
Guidance and Navigation Software Architecture Design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) Test Bed
Eikenberry, Blake D.
2006-12-01
A model-based approach for automated in vitro cell tracking and chemotaxis analyses.
Debeir, Olivier; Camby, Isabelle; Kiss, Robert; Van Ham, Philippe; Decaestecker, Christine
2004-07-01
Chemotaxis may be studied in two main ways: 1) counting cells passing through an insert (e.g., using Boyden chambers), and 2) directly observing cell cultures (e.g., using Dunn chambers), both in response to stationary concentration gradients. This article promotes the use of Dunn chambers and in vitro cell tracking, achieved by video microscopy coupled with automatic image analysis software, in order to extract quantitative and qualitative measurements characterizing the response of cells to a diffusible chemical agent. Previously, we set up a videomicroscopy system coupled with image analysis software that was able to compute cell trajectories from in vitro cell cultures. In the present study, we introduce new software that extends the application field of this system to chemotaxis studies. This software is based on an adapted version of the active contour methodology, enabling each cell to be efficiently tracked for hours and resulting in detailed descriptions of individual cell trajectories. The major advantages of this method come from an improved robustness with respect to variability in cell morphologies between different cell lines and dynamic changes in cell shape during cell migration. Moreover, the software includes a very small number of parameters, none of which requires overly sensitive tuning. Finally, the running time of the software is very short, allowing higher acquisition frequencies and, consequently, improved descriptions of complex cell trajectories, i.e., trajectories including cell division and cell crossing. We validated this software on several artificial and real cell culture experiments in Dunn chambers, including comparisons with manual (human-controlled) analyses. We developed new software and data analysis tools for automated cell tracking which enable cell chemotaxis to be efficiently analyzed. Copyright 2004 Wiley-Liss, Inc.
A Mode of Combined ERP and KMS Knowledge Management System Construction
NASA Astrophysics Data System (ADS)
Yuena, Kang; Yangeng, Wen; Qun, Zhou
The core ideas of ERP and knowledge management are quite similar: both aim to deliver the appropriate knowledge (or goods and funds) to the right people (or positions) at the right time. It is therefore reasonable to believe that adding a knowledge management system to ERP will help companies achieve their goals better. This paper compares, at the methodology level, the logical structure of knowledge management systems and ERP using Hall's three-dimensional systems framework, and finds that they are very similar in the time dimension, logic dimension, and knowledge dimension. This similarity lays a methodological basis for simultaneous planning, implementation, and application. The paper then proposes a knowledge-based, multi-agent ERP management system model. Finally, it describes the process from planning to implementation of a knowledge-managed ERP system with multi-agent interaction from three perspectives: management thinking, software, and systems.
An Agent-Based Model of Farmer Decision Making in Jordan
NASA Astrophysics Data System (ADS)
Selby, Philip; Medellin-Azuara, Josue; Harou, Julien; Klassert, Christian; Yoon, Jim
2016-04-01
We describe an agent-based hydro-economic model of groundwater-irrigated agriculture in the Jordan Highlands. The model employs a Multi-Agent Simulation (MAS) framework and is designed to evaluate direct and indirect outcomes of climate change scenarios and policy interventions on farmer decision making, including annual land use, groundwater use for irrigation, and water sales to a water tanker market. Land use and water use decisions are simulated for groups of farms clustered by location and by behavioural and economic similarity. Decreasing groundwater levels, and the associated increase in pumping costs, are important drivers of change within Jordan's agricultural sector. We describe how this is captured by coupling the agricultural and groundwater models. The agricultural production model employs Positive Mathematical Programming (PMP), a method for calibrating agricultural production functions to observed planted areas. PMP has successfully been used with disaggregated models for policy analysis. We adapt the PMP approach to allow explicit evaluation of the impact of pumping costs, groundwater purchase fees, and a water tanker market. The work demonstrates the applicability of agent-based assessment of agricultural decision making in the Jordan Highlands and its integration with agricultural model calibration methods. The proposed approach is designed and implemented in software such that it can be used to evaluate a variety of physical and human influences on decision making in agricultural water management.
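The PMP calibration step mentioned above can be illustrated with the simplest single-crop case: a quadratic cost term is chosen so that the profit-maximizing planted area reproduces the observed area exactly. The numbers are invented, and the Jordan model's actual functional forms, constraints, and water terms are not shown.

```python
# Sketch of single-crop PMP calibration. Profit(x) = p*y*x - c*x - 0.5*g*x^2,
# so marginal profit is p*y - c - g*x; setting it to zero at the observed
# area pins down the calibration coefficient g.
def calibrate_pmp(price, crop_yield, lin_cost, observed_area):
    """Return g such that marginal profit is zero at the observed area."""
    return (price * crop_yield - lin_cost) / observed_area

def optimal_area(price, crop_yield, lin_cost, g):
    """Profit-maximizing area under the calibrated quadratic cost."""
    return (price * crop_yield - lin_cost) / g

# Invented example: price 400 $/t, yield 3 t/ha, linear cost 700 $/ha,
# observed planted area 50 ha.
g = calibrate_pmp(400.0, 3.0, 700.0, 50.0)
print(optimal_area(400.0, 3.0, 700.0, g))  # 50.0 -- model reproduces the data
```

The point of the calibration is exactly this reproduction: the model matches observed behavior in the base year, so simulated deviations (e.g., from higher pumping costs subtracted from the margin) can be attributed to the policy or climate scenario being tested.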
An agent-oriented approach to automated mission operations
NASA Technical Reports Server (NTRS)
Truszkowski, Walt; Odubiyi, Jide
1994-01-01
As we plan for the next generation of Mission Operations Control Center (MOCC) systems, there are many opportunities for the increased utilization of innovative knowledge-based technologies. The innovative technology discussed here is an advanced use of agent-oriented approaches to the automation of mission operations. The paper presents an overview of this technology and discusses applied operational scenarios currently being investigated and prototyped. A major focus of the current work is the development of a simple user mechanism that would empower operations staff members to create, in real time, software agents to assist them in common, labor-intensive operations tasks. These operational tasks would include: handling routine data and information management functions; amplifying the capabilities of a spacecraft analyst/operator to rapidly identify, analyze, and correct spacecraft anomalies by correlating complex data/information sets and filtering error messages; improving routine monitoring and trend analysis by detecting common failure signatures; and serving as a sentinel for spacecraft changes during critical maneuvers, enhancing the system's capabilities to support nonroutine operational conditions with minimum additional staff. An agent-based testbed is under development. This testbed will allow us to: (1) more clearly understand the intricacies of applying agent-based technology in support of the advanced automation of mission operations and (2) assess the full set of benefits that can be realized by the proper application of agent-oriented technology in a mission operations environment. The testbed under development addresses some of the data management and report generation functions for the Explorer Platform (EP)/Extreme UltraViolet Explorer (EUVE) Flight Operations Team (FOT). We present an overview of agent-oriented technology and a detailed report on the operations concept for the testbed.
Abstract-Reasoning Software for Coordinating Multiple Agents
NASA Technical Reports Server (NTRS)
Clement, Bradley; Barrett, Anthony; Rabideau, Gregg; Knight, Russell
2003-01-01
A computer program for scheduling the activities of multiple agents that share limited resources has been incorporated into the Automated Scheduling and Planning Environment (ASPEN) software system, aspects of which have been reported in several previous NASA Tech Briefs articles. In the original intended application, the agents would be multiple spacecraft and/or robotic vehicles engaged in scientific exploration of distant planets. The program could also be used on Earth in such diverse settings as production lines and military maneuvers. This program includes a planning/scheduling subprogram of the iterative repair type that reasons about the activities of multiple agents at abstract levels in order to greatly improve the scheduling of their use of shared resources. The program summarizes the information about the constraints on, and resource requirements of, abstract activities on the basis of the constraints and requirements that pertain to their potential refinements (decomposition into less-abstract and ultimately to primitive activities). The advantage of reasoning about summary information is that time needed to find consistent schedules is exponentially smaller than the time that would be needed for reasoning about the same tasks at the primitive level.
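The idea of summarizing resource requirements across potential refinements can be sketched as follows. This toy function records, for each shared resource, the range of total demand over the alternative decompositions of an abstract activity, so a scheduler can prune conflicts before committing to any particular refinement. It illustrates the principle only and is not ASPEN's actual representation.

```python
# Summarize the resource requirements of an abstract activity from its
# possible refinements: keep, per resource, the min and max total usage
# any refinement could impose.
def summarize(refinements):
    """Each refinement is a list of (resource, usage) primitive requirements."""
    summary = {}
    for ref in refinements:
        totals = {}
        for resource, usage in ref:
            totals[resource] = totals.get(resource, 0) + usage
        for resource, total in totals.items():
            lo, hi = summary.get(resource, (total, total))
            summary[resource] = (min(lo, total), max(hi, total))
    return summary  # resource -> (min possible usage, max possible usage)

# An invented abstract "observe target" activity with two decompositions:
refinements = [
    [("power_W", 30), ("downlink_kb", 200)],                   # quick snapshot
    [("power_W", 45), ("power_W", 20), ("downlink_kb", 800)],  # full survey
]
print(summarize(refinements))
# {'power_W': (30, 65), 'downlink_kb': (200, 800)}
```

With such summaries, an abstract activity whose maximum possible demand fits within a resource's capacity needs no further expansion to be scheduled safely, which is the source of the exponential savings the abstract reports.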
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kargupta, H.; Stafford, B.; Hamzaoglu, I.
This paper describes an experimental parallel/distributed data mining system, PADMA (PArallel Data Mining Agents), that uses software agents for local data access and analysis and a web-based interface for interactive data visualization. It also presents the results of applying PADMA to detecting patterns in unstructured texts of postmortem reports and laboratory test data for hepatitis C patients.
Scenario-Based Spoken Interaction with Virtual Agents
ERIC Educational Resources Information Center
Morton, Hazel; Jack, Mervyn A.
2005-01-01
This paper describes a CALL approach which integrates software for speaker independent continuous speech recognition with embodied virtual agents and virtual worlds to create an immersive environment in which learners can converse in the target language in contextualised scenarios. The result is a self-access learning package: SPELL (Spoken…
NASA Astrophysics Data System (ADS)
Tošić, Saša; Mitrović, Dejan; Ivanović, Mirjana
2013-10-01
Agent-oriented programming languages are designed to simplify the development of software agents, especially those that exhibit complex, intelligent behavior. This paper presents recent improvements of AgScala, an agent-oriented programming language based on Scala. AgScala includes declarative constructs for managing beliefs, actions and goals of intelligent agents. Combined with object-oriented and functional programming paradigms offered by Scala, it aims to be an efficient framework for developing both purely reactive, and more complex, deliberate agents. Instead of the Prolog back-end used initially, the new version of AgScala relies on Agent Planning Package, a more advanced system for automated planning and reasoning.
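The belief-goal-plan cycle that languages like AgScala make declarative can be illustrated generically. The sketch below is plain Python, not AgScala syntax, and the plan library and belief names are invented for the example.

```python
# Generic sketch of a belief-goal-action cycle: the agent perceives the
# environment, picks an achievable goal, executes its plan's action, and
# records the plan's effect as a new belief.
class Agent:
    def __init__(self):
        self.beliefs = set()
        self.goals = ["door_open"]
        # Plan library: goal -> (precondition beliefs, action, resulting belief)
        self.plans = {"door_open": ({"at_door"}, "push_door", "door_open")}

    def perceive(self, percepts):
        self.beliefs |= set(percepts)

    def step(self):
        """Execute the plan of the first achievable goal; return its action."""
        for goal in list(self.goals):
            pre, action, effect = self.plans[goal]
            if pre <= self.beliefs:          # precondition beliefs all hold
                self.beliefs.add(effect)     # assume the action succeeds
                self.goals.remove(goal)      # goal achieved, drop it
                return action
        return None  # a deliberate agent would invoke a planner here

agent = Agent()
agent.perceive(["at_door"])
print(agent.step())   # push_door
print(agent.goals)    # [] -- the goal was achieved
```

The `return None` branch marks the boundary the abstract draws between purely reactive agents, which simply wait for preconditions to hold, and deliberate agents, which would hand the unmet goal to a planning component such as the Agent Planning Package.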
Intelligent web agents for a 3D virtual community
NASA Astrophysics Data System (ADS)
Dave, T. M.; Zhang, Yanqing; Owen, G. S. S.; Sunderraman, Rajshekhar
2003-08-01
In this paper, we propose an Avatar-based intelligent agent technique for 3D Web-based virtual communities, built on distributed artificial intelligence, intelligent agent techniques, and the databases and knowledge bases of a digital library. One of the goals of this joint NSF (IIS-9980130) and ACM SIGGRAPH Education Committee (ASEC) project is to create a virtual community of educators and students who have a common interest in computer graphics, visualization, and interactive techniques. In this virtual community (ASEC World), Avatars will represent the educators, students, and other visitors to the world. Intelligent agents represented as specially dressed Avatars will be available to assist the visitors to ASEC World. The basic Web client-server architecture of the intelligent knowledge-based avatars is given. The intelligent Web agent software system for the 3D virtual community has been implemented successfully.
The practice of agent-based model visualization.
Dorin, Alan; Geard, Nicholas
2014-01-01
We discuss approaches to agent-based model visualization. Agent-based modeling has its own requirements for visualization, some shared with other forms of simulation software, and some unique to this approach. In particular, agent-based models are typified by complexity, dynamism, nonequilibrium and transient behavior, heterogeneity, and a researcher's interest in both individual- and aggregate-level behavior. These are all traits requiring careful consideration in the design, experimentation, and communication of results. Except in the case of final communication for dissemination, researchers may not make their visualizations public. Hence, the knowledge of how to visualize during these earlier stages is unavailable to the research community in a readily accessible form. Here we explore means by which all phases of agent-based modeling can benefit from visualization, and we provide examples from the available literature and online sources to illustrate key stages and techniques.
Validation techniques of agent based modelling for geospatial simulations
NASA Astrophysics Data System (ADS)
Darvishi, M.; Ahmadi, G.
2014-10-01
One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic, complex behaviours. Studying such phenomena in the laboratory is costly and in most cases impossible. Miniaturizing world phenomena within the framework of a model in order to simulate them is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling indicates growing interest among users in the special capabilities of ABMS. Because ABMS is inherently similar to human cognition, models can be built easily and applied to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS with conventional validation methods. It is therefore necessary to find appropriate validation techniques for ABM. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.
Intelligent Agents for Design and Synthesis Environments: My Summary
NASA Technical Reports Server (NTRS)
Norvig, Peter
1999-01-01
This presentation gives a summary of intelligent agents for design synthesis environments. We'll start with the conclusions, and work backwards to justify them. First, an important assumption is that agents (whatever they are) are good for software engineering. This is especially true for software that operates in an uncertain, changing environment. The "real world" of physical artifacts is like that: uncertain in what we can measure, changing in that things are always breaking down, and we must interact with non-software entities. The second point is that software engineering techniques can contribute to good design. There may have been a time when we wanted to build simple artifacts containing little or no software. But modern aircraft and spacecraft are complex, and rely on a great deal of software. So better software engineering leads to better designed artifacts, especially when we are designing a series of related artifacts and can amortize the costs of software development. The third point is that agents are especially useful for design tasks, above and beyond their general usefulness for software engineering, and the usefulness of software engineering to design.
NASA Astrophysics Data System (ADS)
Huang, Hong-bin; Liu, Wei-ping; Chen, Shun-er; Zheng, Liming
2005-02-01
A new type of CATV network management system, built around a universal MCU and supporting SNMP, is proposed in this paper. From both the hardware and software points of view, the function and design of every module inside the system, including physical-layer communications, protocol processing and data processing, are analyzed. In our design, the management system uses an IP MAN as the data transmission channel, and every controlled object in the management structure has an SNMP agent. The SNMP agent developed here comprises four function modules: a physical-layer communication module, a protocol processing module, an internal data processing module and a MIB management module. The structure and function of each module are designed and demonstrated, and the related hardware circuits, software flow and experimental results are presented. Furthermore, by introducing an RTOS into the software, the universal MCU can conduct multi-threaded tasks such as fast Ethernet controller driving, TCP/IP processing and serial port signal monitoring, which greatly improves CPU efficiency.
DataSync - sharing data via filesystem
NASA Astrophysics Data System (ADS)
Ulbricht, Damian; Klump, Jens
2014-05-01
Research work is usually a cycle: hypothesize, collect data, corroborate the hypothesis, and finally publish the results. At each step in this sequence there are opportunities to build on the work of others. Perhaps candidate physical samples are already listed in the IGSN registry, so there is no need to go on an excursion to acquire them. Perhaps the DataCite catalogue already lists metadata of datasets that meet the constraints of the hypothesis and are open for reappraisal. Working with the measured data to corroborate the hypothesis involves new methods, proven methods, and a variety of software tools, producing a cohort of intermediate data that can be shared with colleagues to discuss research progress and receive a first evaluation. Consequently, the intermediate data should be versioned, so that it is easy to return to the last valid state when you notice you are on the wrong track. Things look different for project managers: they want to know what is currently being done, what has been done, and what the last valid data is, in case somebody has to continue the work. To make life easier for members of small science projects we developed Datasync [1], a software package for sharing and versioning data. Datasync is designed to synchronize directory trees between the computers of a research team over the internet. The software is implemented as a Java application and watches a local directory tree for changes, which are replicated as eSciDoc objects into an eSciDoc infrastructure [2] using the eSciDoc REST API. Modifications to the local filesystem automatically create a new version of an eSciDoc object inside the eSciDoc infrastructure. In this way individual folders can be shared between team members, while project managers can get a general idea of the current status by synchronizing whole project inventories. Additionally, XML metadata from separate files can be managed together with data files inside the eSciDoc objects.
While Datasync's major task is to distribute directory trees, we complement its functionality with the PHP-based application panMetaDocs [3]. panMetaDocs is the successor to panMetaWorks [4] and inherits most of its functionality. Through an internet browser, panMetaDocs provides a web-based overview of the datasets inside the eSciDoc infrastructure. The software allows users to upload further data, to add and edit metadata using the metadata editor, and to disseminate metadata through various channels. In addition, previous versions of a file can be downloaded, and access rights can be defined on files and folders to control their visibility for users of both panMetaDocs and Datasync. panMetaDocs serves as a publication agent for datasets and as a registration agent for dataset DOIs. The application stack presented here allows sharing, versioning, and central storage of data from the very beginning of project activities by using the file synchronization service Datasync. The web application panMetaDocs complements Datasync by providing a dataset publication agent and other tools to handle administrative tasks on the data. [1] http://github.com/ulbricht/datasync [2] http://github.com/escidoc [3] http://panmetadocs.sf.net [4] http://metaworks.pangaea.de
Biermann, Martin
2014-04-01
Clinical trials aiming for regulatory approval of a therapeutic agent must be conducted according to Good Clinical Practice (GCP). Clinical Data Management Systems (CDMS) are specialized software solutions geared toward GCP trials. They are, however, less suited for data management in small non-GCP research projects. For use in researcher-initiated non-GCP studies, we developed a client-server database application based on the public-domain CakePHP framework. The underlying MySQL database uses a simple data model based on only five data tables. The graphical user interface can be run in any web browser inside the hospital network. Data are validated upon entry. Data contained in external database systems can be imported interactively. Data are automatically anonymized on import, with the key lists identifying the subjects logged to a restricted part of the database. Data analysis is performed by separate statistics and analysis software connecting to the database via a generic Open Database Connectivity (ODBC) interface. Since its first pilot implementation in 2011, the solution has been applied to seven different clinical research projects covering different clinical problems in different organ systems such as cancer of the thyroid and the prostate glands. This paper shows how the adoption of a generic web application framework is a feasible, flexible, low-cost, and user-friendly way of managing multidimensional research data in researcher-initiated non-GCP clinical projects. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
MATTS- A Step Towards Model Based Testing
NASA Astrophysics Data System (ADS)
Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.
2016-08-01
In this paper we describe a model-based approach to testing on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve these problems, the software engineering process has to be improved in at least two aspects: 1) software design and 2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time- and resource-consuming activity in the software development process. Generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can also be partially automated. The basic idea behind the presented study was to start from a formal model (e.g. state machines), generate abstract test cases, and then convert them to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
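The generate-abstract-then-concretize flow described above can be sketched as follows. The state machine, its inputs, and its outputs are invented for illustration; the study's actual models and tooling are not shown here.

```python
# Minimal sketch of model-based test generation: from a formal state-machine
# model, derive abstract test cases (transition-covering input sequences) and
# turn them into concrete (input, expected output) pairs by executing the
# model. The machine below is illustrative only.

# Mealy-style model: (state, input) -> (next_state, output)
MODEL = {
    ("OFF", "power_on"):      ("STANDBY", "booted"),
    ("STANDBY", "arm"):       ("ARMED", "armed_ok"),
    ("ARMED", "fire"):        ("STANDBY", "fired"),
    ("STANDBY", "power_off"): ("OFF", "shutdown"),
}

def abstract_tests(initial="OFF"):
    """BFS over the model: one input sequence reaching each transition."""
    tests, frontier, seen = [], [(initial, [])], set()
    while frontier:
        state, path = frontier.pop(0)
        for (src, inp), (dst, _) in MODEL.items():
            if src == state and (src, inp) not in seen:
                seen.add((src, inp))
                tests.append(path + [inp])
                frontier.append((dst, path + [inp]))
    return tests

def concretize(inputs, initial="OFF"):
    """Execute the model to pair each input with its expected output."""
    state, pairs = initial, []
    for inp in inputs:
        state, out = MODEL[(state, inp)]
        pairs.append((inp, out))
    return pairs
```

Each concrete test can then be replayed against the real on-board software, comparing observed outputs with the model's expectations.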
Intelligent sensor and controller framework for the power grid
Akyol, Bora A.; Haack, Jereme Nathan; Craig, Jr., Philip Allen; Tews, Cody William; Kulkarni, Anand V.; Carpenter, Brandon J.; Maiden, Wendy M.; Ciraci, Selim
2015-07-28
Disclosed below are representative embodiments of methods, apparatus, and systems for monitoring and using data in an electric power grid. For example, one disclosed embodiment comprises a sensor for measuring an electrical characteristic of a power line, electrical generator, or electrical device; a network interface; a processor; and one or more computer-readable storage media storing computer-executable instructions. In this embodiment, the computer-executable instructions include instructions for implementing an authorization and authentication module for validating a software agent received at the network interface; instructions for implementing one or more agent execution environments for executing agent code that is included with the software agent and that causes data from the sensor to be collected; and instructions for implementing an agent packaging and instantiation module for storing the collected data in a data container of the software agent and for transmitting the software agent, along with the stored data, to a next destination.
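The claim language above describes a recurring pattern: validate an arriving agent, execute its code, package the collected data into the agent's container, and forward it. A minimal Python sketch of that pattern follows, assuming an HMAC-based integrity check (the patent does not specify a mechanism) and invented field names.

```python
# Hypothetical sketch of a mobile-agent hop: authenticate, execute, package.
# HMAC with a shared key stands in for whatever authorization/authentication
# scheme a real deployment would use.
import hashlib
import hmac
import json

SHARED_KEY = b"grid-demo-key"   # illustrative; real deployments manage keys

def sign(agent):
    """Compute an integrity tag over the agent's code and data container."""
    body = json.dumps(agent, sort_keys=True).encode()
    return hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()

def validate(agent, tag):
    """Authorization/authentication step before granting execution."""
    return hmac.compare_digest(sign(agent), tag)

def execute_and_package(agent, reading):
    """Run the agent code (stubbed as storing one sensor reading) and
    re-sign the agent before it travels to the next destination."""
    agent = dict(agent, data=agent["data"] + [reading])
    return agent, sign(agent)
```

A receiving node would call `validate` on arrival, run `execute_and_package` with its local sensor reading, and transmit the re-signed agent onward.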
Basic MR relaxation mechanisms and contrast agent design.
De León-Rodríguez, Luis M; Martins, André F; Pinho, Marco C; Rofsky, Neil M; Sherry, A Dean
2015-09-01
The diagnostic capabilities of magnetic resonance imaging (MRI) have undergone continuous and substantial evolution by virtue of hardware and software innovations and the development and implementation of exogenous contrast media. Thirty years since the first MRI contrast agent was approved for clinical use, a reliance on MR contrast media persists, largely to improve image quality with higher contrast resolution and to provide additional functional characterization of normal and abnormal tissues. Further development of MR contrast media is an important component in the quest for continued augmentation of diagnostic capabilities. In this review we detail the many important considerations when pursuing the design and use of MR contrast media. We offer a perspective on the importance of chemical stability, particularly kinetic stability, and how this influences one's thinking about the safety of metal-ligand-based contrast agents. We discuss the mechanisms involved in MR relaxation in the context of probe design strategies. A brief description of currently available contrast agents is accompanied by an in-depth discussion that highlights promising MRI contrast agents in the development of future clinical and research applications. Our intention is to give a diverse audience an improved understanding of the factors involved in developing new types of safe and highly efficient MR contrast agents and, at the same time, provide an appreciation of the insights into physiology and disease that newer types of responsive agents can provide. © 2015 Wiley Periodicals, Inc.
"Basic MR Relaxation Mechanisms & Contrast Agent Design"
De León-Rodríguez, Luis M.; Martins, André F.; Pinho, Marco; Rofsky, Neil; Sherry, A. Dean
2015-01-01
The diagnostic capabilities of magnetic resonance imaging (MRI) have undergone continuous and substantial evolution by virtue of hardware and software innovations and the development and implementation of exogenous contrast media. Thirty years since the first MRI contrast agent was approved for clinical use, a reliance on MR contrast media persists largely to improve image quality with higher contrast resolution and to provide additional functional characterization of normal and abnormal tissues. Further development of MR contrast media is an important component in the quest for continued augmentation of diagnostic capabilities. In this review we will detail the many important considerations when pursuing the design and use of MR contrast media. We will offer a perspective on the importance of chemical stability, particularly kinetic stability, and how this influences one's thinking about the safety of metal-ligand based contrast agents. We will discuss the mechanisms involved in magnetic resonance relaxation in the context of probe design strategies. A brief description of currently available contrast agents will be accompanied by an in-depth discussion that highlights promising MRI contrast agents in development for future clinical and research applications. Our intention is to give a diverse audience an improved understanding of the factors involved in developing new types of safe and highly efficient MR contrast agents and, at the same time, provide an appreciation of the insights into physiology and disease that newer types of responsive agents can provide. PMID:25975847
A hardware/software environment to support R&D in intelligent machines and mobile robotic systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mann, R.C.
1990-01-01
The Center for Engineering Systems Advanced Research (CESAR) serves as a focal point at the Oak Ridge National Laboratory (ORNL) for basic and applied research in intelligent machines. R&D at CESAR addresses issues related to autonomous systems, unstructured (i.e., incompletely known) operational environments, and multiple performing agents. Two mobile robot prototypes (HERMIES-IIB and HERMIES-III) are being used to test new developments in several robot component technologies. This paper briefly introduces the computing environment at CESAR, which includes three hypercube concurrent computers (two on board the mobile robots), a graphics workstation, a VAX, and multiple VME-based systems (several on board the mobile robots). The current software environment at CESAR is intended to satisfy several goals, e.g.: code portability; re-usability in different experimental scenarios; modularity; concurrent computer hardware transparent to the applications programmer; future support for multiple mobile robots; support for human-machine interface modules; and support for integration of software from other, geographically disparate laboratories with different hardware set-ups. 6 refs., 1 fig.
Detection of disease outbreaks by the use of oral manifestations.
Torres-Urquidy, M H; Wallstrom, G; Schleyer, T K L
2009-01-01
Oral manifestations of diseases caused by bioterrorist agents could be a potential data source for biosurveillance. This study had the objectives of determining the oral manifestations of diseases caused by bioterrorist agents, measuring the prevalence of these manifestations in emergency department reports, and constructing and evaluating a detection algorithm based on them. We developed a software application to detect oral manifestations in free text and identified positive reports over three years of data. The normal frequency in reports for oral manifestations related to anthrax (including buccal ulcers-sore throat) was 7.46%. The frequency for tularemia was 6.91%. For botulism and smallpox, the frequencies were 0.55% and 0.23%. We simulated outbreaks for these bioterrorism diseases and evaluated the performance of our system. The detection algorithm performed better for smallpox and botulism than for anthrax and tularemia. We found that oral manifestations can be a valuable tool for biosurveillance.
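The detection step described above, scanning free-text reports for manifestation keywords and computing a frequency of positive reports, can be sketched roughly as follows; the keyword lists and reports are illustrative, not the study's actual lexicon or data.

```python
# Simplified sketch of keyword-based biosurveillance over free-text reports.
# Keyword lists and example reports are invented for illustration.
import re

KEYWORDS = {
    "anthrax":   ["buccal ulcer", "sore throat"],
    "tularemia": ["pharyngitis", "mouth ulcer"],
}

def positive(report, disease):
    """True if the report mentions any keyword for the disease."""
    text = report.lower()
    return any(re.search(re.escape(k), text) for k in KEYWORDS[disease])

def frequency(reports, disease):
    """Percentage of reports mentioning at least one keyword."""
    hits = sum(positive(r, disease) for r in reports)
    return 100.0 * hits / len(reports)
```

A detection algorithm would then compare the observed frequency against the normal baseline (such as the 7.46% reported above for anthrax-related manifestations) to flag a possible outbreak.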
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nehrir, M. Hashem
In this project we collaborated with two DOE national laboratories, Pacific Northwest National Laboratory (PNNL) and Lawrence Berkeley National Laboratory (LBL). Dr. Hammerstrom of PNNL initially supported our project and was on the graduate committee of one of the Ph.D. students (graduated in 2014) who was supported by this project. He is also a committee member of a current graduate student of the PI who was supported by this project in the last two years (August 2014-July 2016). That graduate student is now supported by the Electrical and Computer Engineering (ECE) Department at Montana State University (MSU). Dr. Chris Marney of LBL provided actual load data, along with the software WebOpt, developed at LBL for microgrid (MG) design, for our project. NEC Labs America, an industry partner, also supported our project, providing expertise and modest financial support. We also used the software HOMER, originally developed at the National Renewable Energy Laboratory (NREL), with the most recent version made available to us by HOMER Energy, Inc., for MG (hybrid energy system) unit sizing. We compared the findings from WebOpt and HOMER and designed appropriately sized hybrid systems for our case studies. The objective of the project was to investigate real-time power management strategies for MGs using intelligent control, considering maximum feasible energy sustainability, reliability and efficiency while minimizing cost and undesired environmental impact (emissions). Through analytic and simulation studies, we evaluated the suitability of several heuristic and artificial-intelligence (AI)-based optimization techniques that had potential for real-time MG power management, including genetic algorithms (GA), ant colony optimization (ACO), particle swarm optimization (PSO), and multi-agent systems (MAS), which are based on the negotiation of smart software-based agents. We found that PSO and MAS, in particular distributed MAS, were more efficient and better suited for our work.
We investigated the following:
• Intelligent load control - demand response (DR) - for frequency stabilization in islanded MGs (partially supported by PNNL).
• The impact of high penetration of solar photovoltaic (PV)-generated power at the distribution level (partially supported by PNNL).
• The application of AI approaches to renewable (wind, PV) power forecasting (proposed by the reviewers of our proposal).
• Application of AI approaches and DR for real-time MG power management (partially supported by NEC Labs America).
• Application of DR in dealing with the variability of wind power.
• Real-time MG power management using DR and storage (partially supported by NEC Labs America).
• Application of DR in enhancing the performance of the load-frequency controller.
• MAS-based wholesale and retail power market design for the smart grid.
ERIC Educational Resources Information Center
Farzaneh, Mandana; Vanani, Iman Raeesi; Sohrabi, Babak
2012-01-01
E-learning is one of the most important learning approaches within which intelligent software agents can be efficiently used so as to automate and facilitate the process of learning. The aim of this paper is to illustrate a comprehensive categorization of intelligent software agent features, which is valuable for being deployed in the virtual…
Evaluating Software Assurance Knowledge and Competency of Acquisition Professionals
2014-10-01
…of ISO 12207-2008, both internationally and in the United States [7]. That standard documents a comprehensive set of activities and supporting… As the threat of cyberattacks grows, organizations must ensure that their procurement agents acquire high-quality, secure software. ISO 12207 and the Software Assurance Competency…
Nankivil, Derek; Gonzalez, Alex; Arrieta, Esdras; Rowaan, Cornelis; Aguilar, Mariela C; Sotolongo, Krystal; Cabot, Florence A; Yoo, Sonia H; Parel, Jean-Marie A
2014-06-19
To develop a safe, noninvasive, noncontact, continuous in vivo method to measure the dehydration rate of the precorneal tear film and to compare the effectiveness of a viscoelastic agent in maintaining the precorneal tear film to that of a balanced salt solution. Software was designed to analyze the corneal reflection produced by the operating microscope's coaxial illumination. The software characterized the shape of the reflection, which became distorted as the precorneal tear film evaporated; characterization was accomplished by fitting an ellipse to the reflection and measuring its projected surface area. Balanced salt solution Plus (BSS+) and a 2% hydroxypropylmethylcellulose viscoelastic were used as the test agents. The tear film evaporation rate was characterized and compared over a period of 20 minutes in 20 eyes from 10 New Zealand white rabbits. The ellipse axes ratio and surface area were found to decrease initially after each application of either viscoelastic or BSS+ and then to increase linearly as the tear film began to evaporate (P < 0.001) for eyes treated with BSS+ only. Eyes treated with BSS+ required 7.5 ± 2.7 applications to maintain sufficient corneal hydration during the 20-minute test period, whereas eyes treated with viscoelastic required 1.4 ± 0.5 applications. The rates of evaporation differed significantly (P < 0.043) between viscoelastic and BSS+. The shape and surface area of the corneal reflection are strongly correlated with the state of the tear film. Rabbits' corneas treated with viscoelastic remained hydrated significantly longer than corneas treated with BSS+. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.
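The shape characterization described above, fitting an ellipse to the corneal reflection and tracking its axes ratio and projected area, can be approximated with image moments. The following Python sketch illustrates the idea only; it is not the study's implementation, and the pixel data are synthetic.

```python
# Illustrative sketch: from a binary mask of the corneal reflection, image
# moments give an equivalent ellipse whose axes ratio and area can be
# tracked as the tear film evaporates and the reflection distorts.
import math

def ellipse_metrics(pixels):
    """pixels: iterable of (x, y) inside the reflection.
    Returns (axes_ratio, projected_area) of the equivalent ellipse."""
    pts = list(pixels)
    n = len(pts)
    cx = sum(x for x, _ in pts) / n
    cy = sum(y for _, y in pts) / n
    mxx = sum((x - cx) ** 2 for x, _ in pts) / n
    myy = sum((y - cy) ** 2 for _, y in pts) / n
    mxy = sum((x - cx) * (y - cy) for x, y in pts) / n
    # Eigenvalues of the 2x2 covariance matrix give squared axis scales.
    t, d = mxx + myy, mxx * myy - mxy ** 2
    root = math.sqrt(max(t * t / 4 - d, 0.0))
    l1, l2 = t / 2 + root, t / 2 - root
    # A solid ellipse with semi-axes a, b has variances a^2/4 and b^2/4.
    a, b = 2 * math.sqrt(l1), 2 * math.sqrt(max(l2, 0.0))
    return (b / a, math.pi * a * b)

# An undistorted (circular) reflection should give an axes ratio near 1.
disk = [(x, y) for x in range(-20, 21) for y in range(-20, 21)
        if x * x + y * y <= 400]
ratio, area = ellipse_metrics(disk)
```

As the reflection distorts, the ratio drops below 1 and the projected area shrinks, which is the trend the study's software tracks over time.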
Open multi-agent control architecture to support virtual-reality-based man-machine interfaces
NASA Astrophysics Data System (ADS)
Freund, Eckhard; Rossmann, Juergen; Brasch, Marcel
2001-10-01
Projective Virtual Reality is a new and promising approach to intuitively operable man-machine interfaces for commanding and supervising complex automation systems. The user-interface part of Projective Virtual Reality builds heavily on the latest Virtual Reality techniques, a task-deduction component and automatic action-planning capabilities. In order to realize man-machine interfaces for complex applications, not only the Virtual Reality part has to be considered; the capabilities of the underlying robot and automation controller are also of great importance. This paper presents a control architecture that has proved to be an ideal basis for the realization of complex robotic and automation systems controlled by Virtual-Reality-based man-machine interfaces. The architecture not only provides a well-suited framework for the real-time control of a multi-robot system but also supports Virtual Reality metaphors and augmentations that make it easier for the user to command and supervise a complex system. The developed control architecture has already been used for a number of applications. Its capability to integrate information from sensors of different levels of abstraction in real time helps make the realized automation system very responsive to real-world changes. In this paper, the architecture is described comprehensively, its main building blocks are discussed, and one realization, built on an open-source real-time operating system, is presented. The software design and the features of the architecture that make it generally applicable to the distributed control of automation agents in real-world applications are explained. Its application to the commanding and control of experiments in the Columbus space laboratory, the European contribution to the International Space Station (ISS), is one example that is described.
Kamel Boulos, Maged N; Cai, Qiang; Padget, Julian A; Rushton, Gerard
2006-04-01
Confidentiality constraints often preclude the release of disaggregate data about individuals, which limits the types and accuracy of the geographical health analyses that can be done. Access to individually geocoded (disaggregate) data often involves lengthy and cumbersome approval procedures through review boards and committees (and sometimes is not possible). Moreover, current confidentiality-preserving solutions compatible with fine-level spatial analyses either lack flexibility or yield less than optimal results (because of the confidentiality-preserving changes they introduce to disaggregate data), or both. In this paper, we present a simulation case study to illustrate how some analyses cannot be done (or will suffer if done) on aggregate data. We then briefly review some existing confidentiality-preserving techniques, and move on to explore a solution based on software agents with the potential of providing flexible, controlled (software-only) access to unmodified confidential disaggregate data while returning only results that do not expose any person-identifiable details. The solution is thus appropriate for micro-scale geographical analyses where no person-identifiable details are required in the final results (i.e., only aggregate results are needed). Our proposed software agent technique also enables post-coordinated analyses to be designed and carried out on the confidential database(s) as needed, compared to a more conventional solution based on the Web Services model, which would only support a rigid, pre-coordinated (pre-determined) and rather limited set of analyses. The paper also provides an exploratory discussion of the mobility, security, and trust issues associated with software agents, as well as possible directions and solutions to address these issues, including the use of virtual organizations.
Successful partnerships between stakeholder organizations, proper collaboration agreements, clear policies, and unambiguous interpretations of laws and regulations are also much needed to support and ensure the success of any technological solution.
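The "aggregate results only" behavior of such an agent can be sketched in a few lines: the agent runs next to the confidential records and releases grouped counts only, suppressing any cell smaller than a threshold so that no person-identifiable detail leaves the enclave. The field names and threshold below are invented for illustration.

```python
# Hedged sketch of an aggregate-only query agent: disaggregate records stay
# inside; only suppressed group counts come out.
from collections import Counter

def aggregate_only(records, group_by, min_cell=5):
    """Return {group: count}, suppressing cells below min_cell."""
    counts = Counter(r[group_by] for r in records)
    return {g: c for g, c in counts.items() if c >= min_cell}
```

A post-coordinated analysis would be expressed as such an agent function, shipped to the data holder, and only its suppressed aggregate output returned.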
Development of Methodology for Programming Autonomous Agents
NASA Technical Reports Server (NTRS)
Erol, Kutluhan; Levy, Renato; Lang, Lun
2004-01-01
A brief report discusses the rationale for, and the development of, a methodology for generating computer code for autonomous-agent-based systems. The methodology is characterized as enabling an increase in the reusability of the generated code among and within such systems, thereby making it possible to reduce the time and cost of development of the systems. The methodology is also characterized as enabling reduction of the incidence of those software errors that are attributable to the human failure to anticipate distributed behaviors caused by the software. A major conceptual problem said to be addressed in the development of the methodology was that of how to efficiently describe the interfaces between several layers of agent composition by use of a language that is both familiar to engineers and descriptive enough to describe such interfaces unambivalently
Agent-based modeling: Methods and techniques for simulating human systems
Bonabeau, Eric
2002-01-01
Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407
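As a toy illustration of the "bottom-up" principle behind agent-based modeling (illustrative only, not one of Bonabeau's actual models), the sketch below grows system-level diffusion out of purely local agent rules: each agent on a ring adopts an innovation once a neighbour has adopted it.

```python
# Minimal agent-based diffusion sketch. Macro behaviour (the adoption
# curve) emerges from a micro rule applied independently by each agent.

def step(adopted):
    """One synchronous update: an agent adopts if it or a neighbour has."""
    n = len(adopted)
    return [adopted[i] or adopted[(i - 1) % n] or adopted[(i + 1) % n]
            for i in range(n)]

def run(n_agents=10, seed_index=0, steps=3):
    """Seed one adopter, iterate the local rule, return adopter count."""
    state = [i == seed_index for i in range(n_agents)]
    for _ in range(steps):
        state = step(state)
    return sum(state)

if __name__ == "__main__":
    print(run())  # adoption spreads one agent per side per step -> 7
```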
ERIC Educational Resources Information Center
Gu, X.; Blackmore, K. L.
2015-01-01
This paper presents the results of a systematic review of agent-based modelling and simulation (ABMS) applications in the higher education (HE) domain. Agent-based modelling is a "bottom-up" modelling paradigm in which system-level behaviour (macro) is modelled through the behaviour of individual local-level agent interactions (micro).…
Applying Dynamic Fuzzy Petri Net to Web Learning System
ERIC Educational Resources Information Center
Chen, Juei-Nan; Huang, Yueh-Min; Chu, William
2005-01-01
This investigation presents a DFPN (Dynamic Fuzzy Petri Net) model to increase the flexibility of the tutoring agent's behaviour and thus provide a learning content structure for a lecture course. The tutoring agent is a software assistant for a single user, who may be an expert in an e-Learning course. Based on each learner's behaviour, the…
2009-11-12
Service (IaaS) Software -as-a- Service ( SaaS ) Cloud Computing Types Platform-as-a- Service (PaaS) Based on Type of Capability Based on access Based...Mellon University Software -as-a- Service ( SaaS ) Application-specific capabilities, e.g., service that provides customer management Allows organizations...as a Service ( SaaS ) Model of software deployment in which a provider licenses an application to customers for use as a service on
An Evolvable Multi-Agent Approach to Space Operations Engineering
NASA Technical Reports Server (NTRS)
Mandutianu, Sanda; Stoica, Adrian
1999-01-01
A complex system of spacecraft and ground tracking stations, as well as a constellation of satellites or spacecraft, has to be able to reliably withstand sudden environment changes, resource fluctuations, dynamic resource configuration, limited communication bandwidth, etc., while maintaining the consistency of the system as a whole. It is not known in advance when a change in the environment might occur or when a particular exchange will happen. A higher degree of sophistication for the communication mechanisms between different parts of the system is required. The actual behavior has to be determined while the system is performing and the course of action can be decided at the individual level. Under such circumstances, the solution will highly benefit from increased on-board and on the ground adaptability and autonomy. An evolvable architecture based on intelligent agents that communicate and cooperate with each other can offer advantages in this direction. This paper presents an architecture of an evolvable agent-based system (software and software/hardware hybrids) as well as some plans for further implementation.
A Mechanism to Avoid Collusion Attacks Based on Code Passing in Mobile Agent Systems
NASA Astrophysics Data System (ADS)
Jaimez, Marc; Esparza, Oscar; Muñoz, Jose L.; Alins-Delgado, Juan J.; Mata-Díaz, Jorge
Mobile agents are software entities consisting of code, data, state and itinerary that can migrate autonomously from host to host executing their code. Despite its benefits, security issues strongly restrict the use of code mobility. The protection of mobile agents against the attacks of malicious hosts is considered the most difficult security problem to solve in mobile agent systems. In particular, collusion attacks have been barely studied in the literature. This paper presents a mechanism that avoids collusion attacks based on code passing. Our proposal is based on a Multi-Code agent, which contains a different variant of the code for each host. A Trusted Third Party is responsible for providing the information to extract its own variant to the hosts, and for taking trusted timestamps that will be used to verify time coherence.
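A minimal sketch of the two mechanisms the abstract describes, with invented names rather than the paper's API: each host can extract only its own code variant, and the Trusted Third Party verifies that the timestamps gathered along the itinerary are coherent (strictly increasing).

```python
# Illustrative sketch of a Multi-Code agent (hypothetical names): one
# code variant per host, plus a trusted-timestamp coherence check.

def variant_for(host, variants):
    """Each host can extract only its own code variant."""
    return variants.get(host)

def timestamps_coherent(stamps):
    """Trusted-timestamp check: visits must be strictly ordered in time."""
    return all(a < b for a, b in zip(stamps, stamps[1:]))

if __name__ == "__main__":
    variants = {"host1": "code_v1", "host2": "code_v2"}
    print(variant_for("host2", variants))      # code_v2
    print(timestamps_coherent([10, 20, 30]))   # True
    print(timestamps_coherent([10, 30, 20]))   # False: incoherent itinerary
```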
Smarter Software For Enhanced Vehicle Health Monitoring and Inter-Planetary Exploration
NASA Technical Reports Server (NTRS)
Larson, William E.; Goodrich, Charles H.; Steinrock, Todd (Technical Monitor)
2001-01-01
The existing philosophy for space mission control was born in the early days of the space program when technology did not exist to put significant control responsibility onboard the spacecraft. NASA relied on a team of ground control experts to troubleshoot systems when problems occurred. As computing capability improved, more responsibility was handed over to the systems software. However, there is still a large contingent of both launch and flight controllers supporting each mission. New technology can update this philosophy to increase mission assurance and reduce the cost of inter-planetary exploration. The advent of model-based diagnosis and intelligent planning software enables spacecraft to handle most routine problems automatically and allocate resources in a flexible way to realize mission objectives. The manifests for recent missions include multiple subsystems and complex experiments. Spacecraft must operate at longer distances from earth where communications delays make earthbound command and control impractical. NASA's Ames Research Center (ARC) has demonstrated the utility of onboard diagnosis and planning with the Remote Agent experiment in 1999. KSC has pioneered model-based diagnosis and demonstrated its utility for ground support operations. KSC and ARC are cooperating in research to improve the state of the art of this technology. This paper highlights model-based reasoning applications for Moon and Mars missions including in-situ resource utilization and enhanced vehicle health monitoring.
The Evolution of Sonic Ecosystems
NASA Astrophysics Data System (ADS)
McCormack, Jon
This chapter describes a novel type of artistic artificial life software environment. Agents that have the ability to make and listen to sound populate a synthetic world. An evolvable, rule-based classifier system drives agent behavior. Agents compete for limited resources in a virtual environment that is influenced by the presence and movement of people observing the system. Electronic sensors create a link between the real and virtual spaces, virtual agents evolve implicitly to try to maintain the interest of the human audience, whose presence provides them with life-sustaining food.
Hyperspectral fluorescence imaging with multi wavelength LED excitation
NASA Astrophysics Data System (ADS)
Luthman, A. Siri; Dumitru, Sebastian; Quirós-Gonzalez, Isabel; Bohndiek, Sarah E.
2016-04-01
Hyperspectral imaging (HSI) can combine morphological and molecular information, yielding potential for real-time and high-throughput multiplexed fluorescent contrast agent imaging. Multiplexed readout from targets, such as cell surface receptors overexpressed in cancer cells, could improve both sensitivity and specificity of tumor identification. There remains, however, a need for compact and cost-effective implementations of the technology. We have implemented a low-cost wide-field multiplexed fluorescence imaging system, which combines LED excitation at 590, 655 and 740 nm with a compact commercial solid-state HSI system operating in the range 600-1000 nm. A key challenge for using reflectance-based HSI is the separation of contrast agent fluorescence from the reflectance of the excitation light. Here, we illustrate how it is possible to address this challenge in software, using two offline reflectance removal methods, prior to least-squares spectral unmixing. We made a quantitative comparison of the methods using data acquired from dilutions of contrast agents prepared in well-plates. We then established the capability of our HSI system for non-invasive in vivo fluorescence imaging in small animals using the optimal reflectance removal method. The HSI system presented here enables quantitative unmixing of at least four fluorescent contrast agents (Alexa Fluor 610, 647, 700 and 750) simultaneously in living mice. Successful unmixing of the four fluorescent contrast agents was possible both with pure contrast agents and with mixtures. The system could in principle also be applied to imaging of ex vivo tissue or intraoperative imaging in a clinical setting. These data suggest a promising approach for developing clinical applications of HSI based on multiplexed fluorescence contrast agent imaging.
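The least-squares unmixing step that follows reflectance removal can be sketched with synthetic endmember spectra (toy data, not the study's measurements):

```python
# Hedged sketch of least-squares spectral unmixing: recover per-
# fluorophore abundances from a mixed pixel, given known endmember
# spectra. The spectra below are synthetic toy data.
import numpy as np

def unmix(endmembers, pixel):
    """Solve pixel ~ endmembers @ abundances in the least-squares sense."""
    abundances, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
    return abundances

if __name__ == "__main__":
    # Two toy "fluorophore" spectra sampled at four wavelengths
    E = np.array([[1.0, 0.1],
                  [0.8, 0.3],
                  [0.2, 0.9],
                  [0.1, 1.0]])
    pixel = E @ np.array([0.7, 0.3])  # a 70/30 mixture
    print(np.round(unmix(E, pixel), 3))
```

In practice a non-negative least-squares solver is often preferred for abundances, since physical concentrations cannot be negative.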
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Schreckenghost, Debra K.
2001-01-01
The Adjustable Autonomy Testbed (AAT) is a simulation-based testbed located in the Intelligent Systems Laboratory in the Automation, Robotics and Simulation Division at NASA Johnson Space Center. The purpose of the testbed is to support evaluation and validation of prototypes of adjustable autonomous agent software for control and fault management for complex systems. The AAT project has developed prototype adjustable autonomous agent software and human interfaces for cooperative fault management. This software builds on current autonomous agent technology by altering the architecture, components and interfaces for effective teamwork between autonomous systems and human experts. Autonomous agents include a planner, flexible executive, low level control and deductive model-based fault isolation. Adjustable autonomy is intended to increase the flexibility and effectiveness of fault management with an autonomous system. The test domain for this work is control of advanced life support systems for habitats for planetary exploration. The CONFIG hybrid discrete event simulation environment provides flexible and dynamically reconfigurable models of the behavior of components and fluids in the life support systems. Both discrete event and continuous (discrete time) simulation are supported, and flows and pressures are computed globally. This provides fast dynamic simulations of interacting hardware systems in closed loops that can be reconfigured during operations scenarios, producing complex cascading effects of operations and failures. Current object-oriented model libraries support modeling of fluid systems, and models have been developed of physico-chemical and biological subsystems for processing advanced life support gases. In FY01, water recovery system models will be developed.
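The discrete-event side of a CONFIG-style simulation can be sketched minimally (illustrative only; CONFIG itself supports hybrid discrete/continuous models): events are processed in time order, and an event's action may schedule further events, producing the cascading effects the abstract mentions.

```python
# Minimal discrete-event simulation sketch (hypothetical event names).
# Events are (time, name, follow-ups) tuples kept in a priority queue;
# processing one event can schedule others, modeling failure cascades.
import heapq

def simulate(initial_events, horizon):
    """Process events in time order up to horizon; return the event log."""
    queue = list(initial_events)
    heapq.heapify(queue)
    log = []
    while queue:
        time, name, followups = heapq.heappop(queue)
        if time > horizon:
            break
        log.append((time, name))
        for delay, nxt_name, nxt_followups in followups:
            heapq.heappush(queue, (time + delay, nxt_name, nxt_followups))
    return log

if __name__ == "__main__":
    # A valve failure at t=1 triggers a pressure alarm 2 time units later
    events = [(1, "valve_fails", [(2, "pressure_alarm", [])])]
    print(simulate(events, horizon=10))
    # [(1, 'valve_fails'), (3, 'pressure_alarm')]
```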
Dynamic Network Security Control Using Software Defined Networking
2016-03-24
[Extraction residue from a thesis on dynamic network security control using software-defined networking: acknowledgments; a note that RBAC frames access to objects as a user to member-of-group relationship, producing a set of rules that govern access; and a listing of Python agent modules (Agent.py, Event.py, Message.py, ModSysStatus.py) from a figure titled "Agent Design."]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Auld, Joshua; Hope, Michael; Ley, Hubert
This paper discusses the development of an agent-based modelling software development kit, and the implementation and validation of a model using it that integrates dynamic simulation of travel demand, network supply and network operations. A description is given of the core utilities in the kit: a parallel discrete event engine, interprocess exchange engine, and memory allocator, as well as a number of ancillary utilities: visualization library, database IO library, and scenario manager. The overall framework emphasizes the design goals of generality, code agility, and high performance. This framework allows the modeling of several aspects of the transportation system that are typically handled with separate stand-alone software applications, in a high-performance and extensible manner. Integrating models such as dynamic traffic assignment and disaggregate demand models has been a long-standing challenge for transportation modelers. The integrated approach shows a possible way to resolve this difficulty. The simulation model built from the POLARIS framework is a single, shared-memory process for handling all aspects of the integrated urban simulation. The resulting gains in computational efficiency and performance allow planning models to be extended to include previously separate aspects of the urban system, enhancing the utility of such models from the planning perspective. Initial tests with case studies involving traffic management center impacts on various network events such as accidents, congestion and weather events, show the potential of the system.
Advanced Autonomous Systems for Space Operations
NASA Astrophysics Data System (ADS)
Gross, A. R.; Smith, B. D.; Muscettola, N.; Barrett, A.; Mjolssness, E.; Clancy, D. J.
2002-01-01
New missions of exploration and space operations will require unprecedented levels of autonomy to successfully accomplish their objectives. Inherently high levels of complexity, cost, and communication distances will preclude the degree of human involvement common to current and previous space flight missions. With exponentially increasing capabilities of computer hardware and software, including networks and communication systems, a new balance of work is being developed between humans and machines. This new balance holds the promise of not only meeting the greatly increased space exploration requirements, but simultaneously dramatically reducing the design, development, test, and operating costs. New information technologies, which take advantage of knowledge-based software, model-based reasoning, and high performance computer systems, will enable the development of a new generation of design and development tools, schedulers, and vehicle and system health management capabilities. Such tools will provide a degree of machine intelligence and associated autonomy that has previously been unavailable. These capabilities are critical to the future of advanced space operations, since the science and operational requirements specified by such missions, as well as the budgetary constraints will limit the current practice of monitoring and controlling missions by a standing army of ground-based controllers. System autonomy capabilities have made great strides in recent years, for both ground and space flight applications. Autonomous systems have flown on advanced spacecraft, providing new levels of spacecraft capability and mission safety. Such on-board systems operate by utilizing model-based reasoning that provides the capability to work from high-level mission goals, while deriving the detailed system commands internally, rather than having to have such commands transmitted from Earth. 
This enables missions of a complexity and at communication distances not otherwise possible, as well as many more efficient and low-cost applications. In addition, utilizing component and system modeling and reasoning capabilities, autonomous systems will play an increasing role in ground operations for space missions, where they will both reduce the human workload and provide greater levels of monitoring and system safety. This paper will focus specifically on new and innovative software for remote, autonomous, space systems flight operations. Topics to be presented will include a brief description of key autonomous control concepts, the Remote Agent program that commanded the Deep Space 1 spacecraft to new levels of system autonomy, recent advances in distributed autonomous system capabilities, and concepts for autonomous vehicle health management systems. A brief description of teaming spacecraft and rovers for complex exploration missions will also be provided. New on-board software for autonomous science data acquisition for planetary exploration will be described, as well as advanced systems for safe planetary landings. A new multi-agent architecture that addresses some of the challenges of autonomous systems will be presented. Autonomous operation of ground systems will also be considered, including software for autonomous in-situ propellant production and management, and closed-loop ecological life support systems (CELSS). Finally, plans and directions for the future will be discussed.
SHINE Virtual Machine Model for In-flight Updates of Critical Mission Software
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2008-01-01
This software is a new target for the Spacecraft Health Inference Engine (SHINE) knowledge base that compiles a knowledge base to a language called Tiny C - an interpreted version of C that can be embedded on flight processors. This new target allows portions of a running SHINE knowledge base to be updated on a "live" system without needing to halt and restart the containing SHINE application. This enhancement will directly provide this capability without the risk of software validation problems and can also enable complete integration of BEAM and SHINE into a single application. This innovation enables SHINE deployment in domains where autonomy is used during flight-critical applications that require updates. This capability eliminates the need for halting the application and performing potentially serious total system uploads before resuming the application with the loss of system integrity. This software enables additional applications at JPL (microsensors, embedded mission hardware) and increases the marketability of these applications outside of JPL.
Study on the E-commerce platform based on the agent
NASA Astrophysics Data System (ADS)
Fu, Ruixue; Qin, Lishuan; Gao, Yinmin
2011-10-01
To solve the problem of dynamic integration in e-commerce, a multi-agent architecture for an electronic commerce platform system based on agents and ontology is introduced, comprising three major types of agent, an ontology, and a rule collection. In this architecture, service agents and rules are used to realize business process reengineering, the reuse of software components, and the agility of the electronic commerce platform. To illustrate the architecture, a simulation was performed; the results imply that the architecture provides an efficient way to design and implement a flexible, distributed, open, and intelligent electronic commerce platform system that solves the problem of dynamic integration in e-commerce. The objective of this paper is to present the architecture of the electronic commerce platform system and to show how agents and ontology support it.
Experimental study on foam coverage on simulated longwall roof.
Reed, W R; Zheng, Y; Klima, S; Shahan, M R; Beck, T W
2017-01-01
Testing was conducted to determine the ability of foam to maintain roof coverage in a simulated longwall mining environment. Approximately 27 percent of respirable coal mine dust can be attributed to longwall shield movement, and developing controls for this dust source has been difficult. The application of foam is a possible dust control method for this source. Laboratory testing of two foam agents was conducted to determine the ability of the foam to adhere to a simulated longwall face roof surface. Two different foam generation methods were used: compressed air and blower air. Using a new imaging technology, image processing and analysis utilizing ImageJ software produced quantifiable results of foam roof coverage. For compressed air foam in 3.3 m/s (650 fpm) ventilation, 98 percent of agent A was intact while 95 percent of agent B was intact on the roof at three minutes after application. At 30 minutes after application, 94 percent of agent A was intact while only 20 percent of agent B remained. For blower air in 3.3 m/s (650 fpm) ventilation, the results were dependent upon nozzle type. Three different nozzles were tested. At 30 min after application, 74 to 92 percent of foam agent A remained, while 3 to 50 percent of foam agent B remained. Compressed air foam seems to remain intact for longer durations and is easier to apply than blower air foam. However, more water drained from the foam when using compressed air foam, which demonstrates that blower air foam retains more water at the roof surface. Agent A seemed to be the better performer as far as roof application is concerned. This testing demonstrates that roof application of foam is feasible and is able to withstand a typical face ventilation velocity, establishing this technique's potential for longwall shield dust control.
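The image-analysis step can be sketched as a simple intensity threshold over a frame, in the spirit of the ImageJ workflow the study describes (toy pixel values and threshold, not the study's data):

```python
# Hedged sketch of quantifying percent foam coverage by thresholding
# pixel intensities, analogous in spirit to the ImageJ analysis.
def coverage_percent(image, threshold=128):
    """Fraction of pixels at or above threshold, as a percentage."""
    pixels = [p for row in image for p in row]
    covered = sum(1 for p in pixels if p >= threshold)
    return 100.0 * covered / len(pixels)

if __name__ == "__main__":
    frame = [[200, 200, 50],   # bright pixels = foam, dark = bare roof
             [200, 130, 40],
             [90, 210, 220]]
    print(round(coverage_percent(frame), 1))  # 66.7
```

Comparing this percentage across frames taken at 3 and 30 minutes would give the kind of intact-foam time series reported in the abstract.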
NASA Astrophysics Data System (ADS)
Osnos, V. B.; Kuneevsky, V. V.; Larionov, V. M.; Saifullin, E. R.; Gainetdinov, A. V.; Vankov, Yu V.; Larionova, I. V.
2017-01-01
The method of natural thermal convection with heat agent recirculation (NTC HAR) in oil reservoirs is described. The effectiveness of this method for heating oil reservoirs with water saturation values from 0 to 0.5 is analyzed, with the Ashalchinskoye oil field taken as the test case. The CMG STARS software was used for the calculations. The dynamics of cumulative production, recovery factor, and specific energy consumption per 1 m3 of crude oil produced when the heat exchanger with heat agent is applied are determined for different initial water saturations and presented as graphs.
ERIC Educational Resources Information Center
Antony, Laljith
2016-01-01
Failing to prevent leaks of confidential and proprietary information to unauthorized users from software applications is a major challenge that companies face. Access control policies defined in software applications with access control mechanisms are unable to prevent information leaks from software applications to unauthorized users. Role-based…
Applications of Agent Based Approaches in Business (A Three Essay Dissertation)
ERIC Educational Resources Information Center
Prawesh, Shankar
2013-01-01
The goal of this dissertation is to investigate the enabling role that agent based simulation plays in business and policy. The aforementioned issue has been addressed in this dissertation through three distinct, but related essays. The first essay is a literature review of different research applications of agent based simulation in various…
Security patterns and a weighting scheme for mobile agents
NASA Astrophysics Data System (ADS)
Walker, Jessie J.
The notion of mobility has always been a prime factor in human endeavor and achievement. This need to migrate by humans has been distilled into software entities, which act as their representatives in distant environments. Software agents are developed to act on behalf of a user. Mobile agents were born from the understanding that it is often much more useful to move the code (program) to where the resources are located, instead of connecting remotely. Within the mobile agent research community, security has traditionally been the most defining issue facing the community and preventing the paradigm from gaining wide acceptance. There are still numerous difficult problems being addressed with very few practical solutions, such as the malicious host and agent problems. These problems are some of the most active areas of research within the mobile agent community. The major principles, facets, fundamental concepts, techniques and architectures of the field are well understood within the community. This is evident from the many mobile agent systems developed in the last decade that share common core components such as agent management, communication facilities, and mobility services. In other words, new mobile agent systems and frameworks provide few new insights into agent system architecture, mobility services, agent coordination, or communication that could be useful to the agent research community, although in many instances they do validate, refine, and demonstrate the reuse of previously proposed and discussed mobile agent research elements. Since mobile agent research for the last decade has been defined by security and related issues, our research into security patterns is within this narrow arena of mobile agent research. The research presented in this thesis examines the issue of mobile agent security from the standpoint of security patterns documented from the universe of mobile agent systems.
In addition, we explore how these documented security patterns can be quantitatively compared based on a unique weighting scheme. The scheme is formalized into a theory that can be used to improve the development of secure mobile agents and agent-based systems.
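One plausible shape for such a weighting scheme, with criteria and weights invented purely for illustration (not the thesis's actual scheme), is a normalised weighted sum over per-criterion ratings of each security pattern:

```python
# Illustrative weighting scheme for comparing security patterns.
# Criteria, weights, and ratings below are hypothetical.
def pattern_score(ratings, weights):
    """Weighted sum of per-criterion ratings, normalised by total weight."""
    total = sum(weights.values())
    return sum(ratings[c] * w for c, w in weights.items()) / total

if __name__ == "__main__":
    weights = {"confidentiality": 3, "integrity": 2, "performance": 1}
    sandbox_pattern = {"confidentiality": 4, "integrity": 5, "performance": 2}
    print(pattern_score(sandbox_pattern, weights))  # 4.0
```

Scores computed this way let two candidate patterns be ranked under the same set of priorities, which is the kind of quantitative comparison the abstract describes.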
An Application of Artificial Intelligence to the Implementation of Electronic Commerce
NASA Astrophysics Data System (ADS)
Srivastava, Anoop Kumar
In this paper, we present an application of Artificial Intelligence (AI) to the implementation of Electronic Commerce, providing a multi-autonomous-agent framework. Our agent-based architecture supports flexible design of a spectrum of multiagent systems (MAS) by distributing computation and by providing a unified interface to data and programs. The autonomous agents are intelligent and provide autonomy, simple communication and computation, and well-developed semantics. The steps of design and implementation are discussed in depth, and the structure of the Electronic Marketplace, an ontology, the agent model, and the interaction patterns between agents are given. We have developed mechanisms for coordination between agents using a language called Virtual Enterprise Modeling Language (VEML), an integration of Java and the Knowledge Query and Manipulation Language (KQML). VEML gives application programmers the ability to develop different kinds of MAS based on their requirements and applications. We have implemented a multi-autonomous-agent system called the VE System and demonstrate its efficacy by discussing experimental results and its salient features.
Ant-Based Cyber Defense (also known as
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glenn Fink, PNNL
2015-09-29
ABCD is a four-level hierarchy with human supervisors at the top, a top-level agent called a Sergeant controlling each enclave, Sentinel agents located at each monitored host, and mobile Sensor agents that swarm through the enclaves to detect cyber malice and misconfigurations. The code comprises four parts: (1) the core agent framework, (2) the user interface and visualization, (3) test-range software to create a network of virtual machines including a simulated Internet and user and host activity emulation scripts, and (4) a test harness to allow the safe running of adversarial code within the framework of monitored virtual machines.
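The four-level hierarchy can be sketched with hypothetical classes (not the actual ABCD code): sensor findings roll up through the host sentinels to the enclave sergeant, which summarises them for the human supervisor.

```python
# Hypothetical sketch of the ABCD reporting hierarchy. Class and method
# names are invented for illustration.
class Sensor:
    """Mobile agent that reports a single finding (or None)."""
    def __init__(self, finding):
        self.finding = finding
    def report(self):
        return self.finding

class Sentinel:
    """Per-host agent that collects its sensors' findings."""
    def __init__(self, sensors):
        self.sensors = sensors
    def report(self):
        return [s.report() for s in self.sensors]

class Sergeant:
    """Enclave-level agent that rolls up all sentinel reports."""
    def __init__(self, sentinels):
        self.sentinels = sentinels
    def summary(self):
        return sum((s.report() for s in self.sentinels), [])

if __name__ == "__main__":
    sgt = Sergeant([Sentinel([Sensor("port scan"), Sensor(None)]),
                    Sentinel([Sensor("bad config")])])
    print([f for f in sgt.summary() if f])  # ['port scan', 'bad config']
```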
Methodology of decreasing software complexity using ontology
NASA Astrophysics Data System (ADS)
Dąbrowska-Kubik, Katarzyna
2015-09-01
In this paper a model of a web application's source code, based on the OSD ontology (Ontology for Software Development), is proposed. The model is applied to the implementation and maintenance phases of the software development process through the DevOntoCreator tool [5]. The aim of this solution is to decrease the software complexity of the source code by applying many different maintenance techniques, such as creating documentation and eliminating dead code, cloned code, and previously known bugs [1][2]. This approach should make savings on the software maintenance costs of web applications possible.
Knowledge-based control of an adaptive interface
NASA Technical Reports Server (NTRS)
Lachman, Roy
1989-01-01
The analysis, development strategy, and preliminary design for an intelligent, adaptive interface is reported. The design philosophy couples knowledge-based system technology with standard human factors approaches to interface development for computer workstations. An expert system has been designed to drive the interface for application software. The intelligent interface will be linked to application packages, one at a time, that are planned for multiple-application workstations aboard Space Station Freedom. Current requirements call for most Space Station activities to be conducted at the workstation consoles. One set of activities will consist of standard data management services (DMS). DMS software includes text processing, spreadsheets, data base management, etc. Text processing was selected for the first intelligent interface prototype because text-processing software can be developed initially as fully functional but limited to a small set of commands. The program's complexity can then be increased incrementally. A model of the operator's behavior and three types of instructions to the underlying application software are included in the rule base. A conventional expert-system inference engine searches the data base for antecedents to rules and sends the consequents of fired rules as commands to the underlying software. Plans for putting the expert system on top of a second application, a database management system, will be carried out following behavioral research on the first application. The intelligent interface design is suitable for use with ground-based workstations now common in government, industrial, and educational organizations.
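The inference cycle described above (match rule antecedents against facts, fire rules, emit consequents as commands to the underlying software) can be sketched as a tiny forward-chaining loop; the rule names are hypothetical, not drawn from the project's rule base.

```python
# Minimal forward-chaining sketch of the adaptive interface's inference
# cycle. Facts and rule names are invented for illustration.
def infer(facts, rules):
    """Fire every rule whose antecedents hold; return fired commands."""
    commands = []
    facts = set(facts)
    fired = True
    while fired:
        fired = False
        for antecedents, consequent in rules:
            if set(antecedents) <= facts and consequent not in facts:
                facts.add(consequent)       # consequent becomes a new fact
                commands.append(consequent)  # ...and a command to the app
                fired = True
    return commands

if __name__ == "__main__":
    rules = [({"novice_user", "editing"}, "show_menu_hints"),
             ({"show_menu_hints"}, "enlarge_menu")]
    print(infer({"novice_user", "editing"}, rules))
    # ['show_menu_hints', 'enlarge_menu']
```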
Advanced software development workstation project ACCESS user's guide
NASA Technical Reports Server (NTRS)
1990-01-01
ACCESS is a knowledge based software information system designed to assist the user in modifying retrieved software to satisfy user specifications. A user's guide is presented for the knowledge engineer who wishes to create for ACCESS a knowledge base consisting of representations of objects in some software system. This knowledge is accessible to an end user who wishes to use the catalogued software objects to create a new application program or an input stream for an existing system. The application specific portion of an ACCESS knowledge base consists of a taxonomy of object classes, as well as instances of these classes. All objects in the knowledge base are stored in an associative memory. ACCESS provides a standard interface for the end user to browse and modify objects. In addition, the interface can be customized by the addition of application specific data entry forms and by specification of display order for the taxonomy and object attributes. These customization options are described.
Agile-Lean Software Engineering (ALSE) Evaluating Kanban in Systems Engineering
2013-03-06
Boeing), Garry Roedler (Lockheed Martin), Karl Scotland (Rally Software, UK), Alan Shalloway (NetObjectives), Neil Shirk (Lockheed Martin... Neil Siegel (Northrop Grumman), James Sutton (Jubata Group). Thanks are also due to the members of the SERC Research Council, particularly Barry... Incremental Commitment Model to Brownfield Systems Development, Proceedings, CSER 2009, April 2009. 16. Heath, B. et al. (2009). A survey of agent-based
Software agents and the route to the information economy.
Kephart, Jeffrey O
2002-05-14
Humans are on the verge of losing their status as the sole economic species on the planet. In private laboratories and in the Internet laboratory, researchers and developers are creating a variety of autonomous economically motivated software agents endowed with algorithms for maximizing profit or utility. Many economic software agents will function as miniature businesses, purchasing information inputs from other agents, combining and refining them into information goods and services, and selling them to humans or other agents. Their mutual interactions will form the information economy: a complex economic web of information goods and services that will adapt to the ever-changing needs of people and agents. The information economy will be the largest multiagent system ever conceived and an integral part of the world's economy. I discuss a possible route toward this vision, beginning with present-day Internet trends suggesting that agents will charge one another for information goods and services. Then, to establish that agents can be competent price setters, I describe some laboratory experiments pitting software bidding agents against human bidders. The agents' superior performance suggests they will be used on a broad scale, which in turn suggests that interactions among agents will become frequent and significant. How will this affect macroscopic economic behavior? I describe some interesting phenomena that my colleagues and I have observed in simulations of large populations of automated buyers and sellers, such as price war cycles. I conclude by discussing fundamental scientific challenges that remain to be addressed as we journey toward the information economy.
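The undercutting dynamics behind the price war cycles mentioned above can be reproduced with a toy simulation. All parameters are invented (prices in integer cents), and this is not the author's simulation code:

```python
# Two sellers repeatedly undercut each other; when the price floor is
# reached, both reset toward the monopoly ceiling, producing the cyclic
# "price war" pattern observed in large agent populations.

def price_war(p1=100, p2=95, floor=20, ceiling=100, cut=5, steps=40):
    history = []
    for _ in range(steps):
        low = min(p1, p2)
        if low - cut < floor:
            # Undercutting no longer pays: reset to high prices.
            p1, p2 = ceiling, ceiling - cut
        elif p1 <= p2:
            p2 = p1 - cut          # seller 2 undercuts seller 1
        else:
            p1 = p2 - cut          # seller 1 undercuts seller 2
        history.append(min(p1, p2))
    return history

h = price_war()
print(h[:4], min(h))  # [90, 85, 80, 75] 20
```

The recorded market price ratchets down in steps, snaps back to near the ceiling, and repeats, a sawtooth cycle qualitatively like the one the abstract describes.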
The Real-Time ObjectAgent Software Architecture for Distributed Satellite Systems
2001-01-01
real-time operating system selection are also discussed. The fourth section describes a simple demonstration of real-time ObjectAgent. Finally, the... experience with C++. After selecting the programming language, it was necessary to select a target real-time operating system (RTOS) and embedded... ObjectAgent software to run on the OSE Real-Time Operating System. In addition, she is responsible for the integration of ObjectAgent
NASA Astrophysics Data System (ADS)
Chooramun, N.; Lawrence, P. J.; Galea, E. R.
2017-08-01
In all evacuation simulation tools, the space through which agents navigate and interact is represented by one of the following methods: Coarse regions, Fine nodes, or Continuous regions. Each spatial representation method has its benefits and limitations. For instance, the Coarse approach allows simulations to be processed very rapidly, but is unable to represent the interactions of agents from an individual perspective; the Continuous approach provides a detailed representation of agent movement and interaction but suffers from relatively poor computational performance. The Fine nodal approach is a compromise between the Continuous and Coarse approaches: it allows agent interaction to be modelled while providing good computational performance. Our approach to representing space in an evacuation simulation tool differs in that it allows evacuation simulations to be run using a combination of Coarse regions, Fine nodes and Continuous regions. This approach, which we call Hybrid Spatial Discretisation (HSD), is implemented within the buildingEXODUS evacuation simulation software. The HSD incorporates the benefits of each spatial representation method whilst providing an optimal environment for representing agent movement and interaction. In this work, we demonstrate the effectiveness of the HSD through its application to a moderately large case comprising an underground rail tunnel station with a population of 2,000 agents.
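A minimal sketch of the hybrid idea, assuming invented transit times and a much-simplified agent state (this is not buildingEXODUS code): each region type advances an agent at a different level of detail and cost, and one floor plan mixes all three:

```python
# Each spatial representation advances agents differently; a hybrid
# geometry dispatches per region. Transit times are hypothetical.

class CoarseRegion:
    # Aggregate flow: fast to compute, no individual interaction.
    def advance(self, agent, dt):
        agent["progress"] += dt / 10.0

class FineNodalRegion:
    # Agents hop between discrete nodes; interaction resolved per node.
    def advance(self, agent, dt):
        agent["progress"] += dt / 20.0

class ContinuousRegion:
    # Full 2-D movement: most detailed, most expensive.
    def advance(self, agent, dt):
        agent["progress"] += dt / 40.0

def step(regions, agents, dt=1.0):
    """Advance every agent using the representation of its current region."""
    for agent in agents:
        regions[agent["region"]].advance(agent, dt)

regions = {"platform": CoarseRegion(),
           "stairs": FineNodalRegion(),
           "concourse": ContinuousRegion()}
agents = [{"region": "platform", "progress": 0.0},
          {"region": "concourse", "progress": 0.0}]
step(regions, agents)
```

The design point is the dispatch: the simulation loop is uniform, while each region class trades fidelity against cost.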
EVA: An Interactive Web-Based Collaborative Learning Environment
ERIC Educational Resources Information Center
Sheremetov, Leonid; Arenas, Adolfo Guzman
2002-01-01
In this paper, a Web-based learning environment developed within the project called Virtual Learning Spaces (EVA, in Spanish) is described. The environment is composed of knowledge, collaboration, consulting and experimentation spaces as a collection of agents and conventional software components working over the knowledge domains. All user…
Building Software Agents for Planning, Monitoring, and Optimizing Travel
2004-01-01
defined as plans in the Theseus Agent Execution language (Barish et al. 2002). In the Web environment, sources can be quite slow and the latencies of... executor is based on a dataflow paradigm: actions are executed as soon as the data becomes available. Second, Theseus performs the actions in a... while Theseus provides an expressive language for defining information gathering and monitoring plans. The Theseus language supports capabilities
Enhancing E-Health Information Systems with Agent Technology
Nguyen, Minh Tuan; Fuhrer, Patrik; Pasquier-Rocha, Jacques
2009-01-01
Agent technology is an emerging and promising research area in software technology, which increasingly contributes to the development of value-added information systems for large healthcare organizations. Through the MediMAS prototype, resulting from a case study conducted at a local Swiss hospital, this paper aims at presenting the advantages of reinforcing such a complex E-health man-machine information organization with software agents. The latter will work on behalf of human agents, taking care of routine tasks, and thus increasing the speed, consistency, and ultimately the reliability of the information exchanges. We further claim that the modeling of the software agent layer can be methodically derived from the actual “classical” laboratory organization and practices, as well as seamlessly integrated with the existing information system. PMID:19096509
The Effectiveness of Interactivity in Multimedia Software Tutorials
ERIC Educational Resources Information Center
Whitman, Lisa
2013-01-01
Many people face the challenge of finding effective computer-based software instruction, including employees who must learn how to use software applications for their job and students of distance education classes. Therefore, it is important to conduct research on how computer-based multimedia software tutorials should be designed so they are as…
Autonomous Agents and Intelligent Assistants for Exploration Operations
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2000-01-01
Human exploration of space will involve remote autonomous crew and systems in long missions. Data to earth will be delayed and limited. Earth control centers will not receive continuous real-time telemetry data, and there will be communication round trips of up to one hour. There will be reduced human monitoring on the planet and earth. When crews are present on the planet, they will be occupied with other activities, and system management will be a low priority task. Earth control centers will use multi-tasking "night shift" and on-call specialists. A new project at Johnson Space Center is developing software to support teamwork between distributed human and software agents in future interplanetary work environments. The Engineering and Mission Operations Directorates at Johnson Space Center (JSC) are combining laboratories and expertise to carry out this project, by establishing a testbed for human-centered design, development and evaluation of intelligent autonomous and assistant systems. Intelligent autonomous systems for managing systems on planetary bases will communicate their knowledge to support distributed multi-agent mixed-initiative operations. Intelligent assistant agents will respond to events by developing briefings and responses according to instructions from human agents on earth and in space.
Automatic Conflict Detection on Contracts
NASA Astrophysics Data System (ADS)
Fenech, Stephen; Pace, Gordon J.; Schneider, Gerardo
Many software applications are based on collaborating, yet competing, agents or virtual organisations exchanging services. Contracts, expressing obligations, permissions and prohibitions of the different actors, can be used to protect the interests of the organisations engaged in such service exchange. However, the potentially dynamic composition of services with different contracts, and the combination of service contracts with local contracts, can give rise to unexpected conflicts, exposing the need for automatic techniques for contract analysis. In this paper we look at automatic analysis techniques for contracts written in the contract language CL. We present a trace semantics of CL suitable for conflict analysis, and a decision procedure for detecting conflicts (together with its proof of soundness, completeness and termination). We also discuss its implementation and look into the applications of the contract analysis approach we present. These techniques are applied to a small case study of an airline check-in desk.
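The core conflict notion, an action that is simultaneously obliged and prohibited, can be illustrated with a toy check. This is not the CL trace-semantics decision procedure; the clause encoding and the check-in example clauses are invented:

```python
# A contract state is a set of deontic clauses: ("O", action) obliges the
# action, ("F", action) forbids it. A conflict is any action that appears
# under both modalities at once.

def conflicts(clauses):
    """Return the set of actions both obliged and forbidden."""
    obliged = {a for kind, a in clauses if kind == "O"}
    forbidden = {a for kind, a in clauses if kind == "F"}
    return obliged & forbidden

# Hypothetical airline check-in desk state: the composed contracts both
# oblige and forbid opening the desk.
desk = [("O", "check_in_passenger"), ("F", "open_desk"), ("O", "open_desk")]
print(conflicts(desk))  # {'open_desk'}
```

The paper's procedure is far stronger, since it explores all reachable states of the contract rather than a single flat clause set, but each explored state is checked for essentially this condition.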
Clinical software development for the Web: lessons learned from the BOADICEA project
2012-01-01
Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. 
BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389
Clinical software development for the Web: lessons learned from the BOADICEA project.
Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F
2012-04-10
Security Verification Techniques Applied to PatchLink COTS Software
NASA Technical Reports Server (NTRS)
Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer
2006-01-01
Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.
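Property-based testing of the kind mentioned above checks that a stated property holds over many generated inputs rather than a few hand-picked cases. A hedged toy example, unrelated to the actual UC Davis tester or the PatchLink artifact; the sanitizer and property are invented:

```python
# Property under test: "sanitized strings contain no shell metacharacters".
# The tester generates random inputs and checks the property on each.

import random
import string

META = set(";|&$`")

def sanitize(s):
    """Strip shell metacharacters (toy implementation under test)."""
    return "".join(ch for ch in s if ch not in META)

def check_property(trials=1000, seed=0):
    rng = random.Random(seed)              # fixed seed: reproducible run
    alphabet = string.ascii_letters + ";|&$`"
    for _ in range(trials):
        s = "".join(rng.choice(alphabet) for _ in range(12))
        if set(sanitize(s)) & META:
            return False                   # property violated by input s
    return True

print(check_property())  # True
```

Real property-based testers (and the model-based FMF approach) add input shrinking and formal property specifications, but the generate-and-check loop is the common core.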
Next Generation System and Software Architectures: Challenges from Future NASA Exploration Missions
NASA Technical Reports Server (NTRS)
Sterritt, Roy; Rouff, Christopher A.; Hinchey, Michael G.; Rash, James L.; Truszkowski, Walt
2006-01-01
The four key objective properties of a system that are required of it in order for it to qualify as "autonomic" are now well-accepted-self-configuring, self-healing, self-protecting, and self-optimizing- together with the attribute properties-viz. self-aware, environment-aware, self-monitoring and self- adjusting. This paper describes the need for next generation system software architectures, where components are agents, rather than objects masquerading as agents, and where support is provided for self-* properties (both existing self-chop and emerging self-* properties). These are discussed as exhibited in NASA missions, and in particular with reference to a NASA concept mission, ANTS, which is illustrative of future NASA exploration missions based on the technology of intelligent swarms.
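A self-monitoring, self-adjusting loop of the kind implied by the self-* properties can be sketched as follows; the agent, its health metric, and the threshold are invented for illustration and do not come from ANTS:

```python
# Toy autonomic agent: it monitors its own load (self-monitoring) and
# grows its worker pool when overloaded (self-configuring/self-optimizing).

class AutonomicAgent:
    def __init__(self):
        self.workers = 2                       # reconfigurable capacity

    def health(self, queue_len):
        """Load per worker; lower is healthier."""
        return queue_len / self.workers

    def monitor_and_adjust(self, queue_len, threshold=5.0):
        if self.health(queue_len) > threshold:
            self.workers += 1                  # self-adjust: add capacity
            return "self-adjusted"
        return "nominal"

a = AutonomicAgent()
print(a.monitor_and_adjust(20))  # self-adjusted
```

Genuine autonomic systems close this loop continuously and add self-healing and self-protecting reactions; the sketch shows only the monitor-analyze-act skeleton.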
USDA-ARS?s Scientific Manuscript database
This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-08-01
An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to the market pressures that have motivated a multilevel supply chain structure in other widget industries: design cost recovery, improved quality through specialization, and rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors' inevitable adoption of emerging, object-based, distributed computing frameworks, initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems.
However, the lack of a standard real-time distributed object operating system, of a standard Computer-Aided Software Environment (CASE) tool notation, and of a standard CASE tool repository has limited the realization of component software. The approach to fulfilling this need is the software component factory innovation. The factory approach takes advantage of emerging standards such as UML, CORBA, Java and the Internet. The key technical innovation of the software component factory is the ability to assemble and test new system configurations, as well as new tools, on demand from existing tools and architecture design repositories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jurrus, Elizabeth; Engel, Dave; Star, Keith
The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages, and it has had an impact on the study of a broad range of chemical, biological, and biomedical applications. APBS addresses three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advancing computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this manuscript, we discuss the models and capabilities that have recently been implemented within the APBS software package, including: a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph-theory-based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics.
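For context, the continuum electrostatics model that APBS solves is the Poisson-Boltzmann equation, shown here in its standard linearized form:

```latex
\nabla \cdot \left[ \epsilon(\mathbf{r}) \, \nabla \phi(\mathbf{r}) \right]
  \;-\; \bar{\kappa}^{2}(\mathbf{r}) \, \phi(\mathbf{r})
  \;=\; -4\pi \rho(\mathbf{r})
```

where \(\epsilon(\mathbf{r})\) is the position-dependent dielectric coefficient, \(\bar{\kappa}(\mathbf{r})\) the modified Debye-Hückel screening factor, \(\phi(\mathbf{r})\) the electrostatic potential, and \(\rho(\mathbf{r})\) the fixed biomolecular charge density. The solvers listed in the abstract differ in how they discretize and solve this equation (and its full nonlinear form).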
An Integrated Software Package to Enable Predictive Simulation Capabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Fitzhenry, Erin B.; Jin, Shuangshuang
The power grid is increasing in complexity due to the deployment of smart grid technologies. Such technologies vastly increase the size and complexity of power grid systems for simulation and modeling. This increasing complexity necessitates not only the use of high-performance computing (HPC) techniques, but also a smooth, well-integrated interplay between HPC applications. This paper presents a new software package that couples HPC applications with a web-based visualization tool on top of a middleware framework that supports data communication between the different applications. Case studies with a large power system demonstrate the predictive capability brought by the integrated software package, as well as the better situational awareness provided by the web-based visualization tool in live mode. Test results validate the effectiveness and usability of the integrated software package.
A Study of the Effectiveness of Web-Based Homework in Teaching Undergraduate Business Statistics
ERIC Educational Resources Information Center
Palocsay, Susan W.; Stevens, Scott P.
2008-01-01
Web-based homework (WBH) Technology can simplify the creation and grading of assignments as well as provide a feasible platform for assessment testing, but its effect on student learning in business statistics is unknown. This is particularly true of the latest software development of Web-based tutoring agents that dynamically evaluate individual…
Towards a Food Safety Knowledge Base Applicable in Crisis Situations and Beyond
Falenski, Alexander; Weiser, Armin A.; Thöns, Christian; Appel, Bernd; Käsbohrer, Annemarie; Filter, Matthias
2015-01-01
In case of contamination in the food chain, fast action is required in order to reduce the numbers of affected people. In such situations, being able to predict the fate of agents in foods would help risk assessors and decision makers in assessing the potential effects of a specific contamination event and thus enable them to deduce the appropriate mitigation measures. One efficient strategy supporting this is using model based simulations. However, application in crisis situations requires ready-to-use and easy-to-adapt models to be available from the so-called food safety knowledge bases. Here, we illustrate this concept and its benefits by applying the modular open source software tools PMM-Lab and FoodProcess-Lab. As a fictitious sample scenario, an intentional ricin contamination at a beef salami production facility was modelled. Predictive models describing the inactivation of ricin were reviewed, relevant models were implemented with PMM-Lab, and simulations on residual toxin amounts in the final product were performed with FoodProcess-Lab. Due to the generic and modular modelling concept implemented in these tools, they can be applied to simulate virtually any food safety contamination scenario. Apart from the application in crisis situations, the food safety knowledge base concept will also be useful in food quality and safety investigations. PMID:26247028
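Predictive inactivation models of the kind reviewed above are often log-linear in time. A minimal sketch with an invented D-value, which is not a measured ricin parameter and stands in only to show the model's shape:

```python
# First-order (log-linear) inactivation: log10(N/N0) = -t/D, where D is
# the time for a tenfold reduction under the given process conditions.
# The D-value used below is hypothetical.

def residual_fraction(minutes, d_value):
    """Surviving fraction of the agent after `minutes` of treatment."""
    return 10 ** (-minutes / d_value)

# Fraction of activity left after 30 min if D = 10 min (assumed).
print(residual_fraction(30, 10))  # 0.001
```

Tools like PMM-Lab let such primary models be combined with secondary models that make D depend on temperature, pH, and other process parameters along the production chain.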
Kortüm, K; Reznicek, L; Leicht, S; Ulbig, M; Wolf, A
2013-07-01
The importance and complexity of clinical trials are continuously increasing, especially in innovative specialties like ophthalmology. An efficient organisational structure at the clinical trial site is therefore essential, and in the Internet era it can be provided by web-based applications. In total, 3 software applications (Vibe on Prem, SharePoint and an open source application) were evaluated at a clinical trial site in ophthalmology. The assessment criteria were: reliability, ease of administration, usability, scheduling, task lists, knowledge management, operating costs and worldwide availability. Vibe on Prem, customised by the local university, met the assessment criteria best; the other applications scored lower. By introducing a web-based application for administering and organising an ophthalmological trial site, studies can be conducted in a more efficient and reliable manner. Georg Thieme Verlag KG Stuttgart · New York.
NASA Technical Reports Server (NTRS)
Sterritt, Roy (Inventor); Hinchey, Michael G. (Inventor); Penn, Joaquin (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which, in some embodiments, an agent-oriented specification modeled with MaCMAS is analyzed, flaws in the agent-oriented specification modeled with MaCMAS are corrected, and an implementation is derived from the corrected agent-oriented specification. Described herein are systems, methods and apparatus that produce fully (mathematically) tractable development of agent-oriented specifications modeled with the methodology fragment for analyzing complex multiagent systems (MaCMAS) and policies for autonomic systems, from requirements through to code generation. The systems, methods and apparatus described herein are illustrated through an example showing how user-formulated policies can be translated into a formal model which can then be converted to code. The requirements-based programming systems, methods and apparatus described herein may provide faster, higher quality development and maintenance of autonomic systems based on user formulation of policies.
A Large Scale, High Resolution Agent-Based Insurgency Model
2013-09-30
CUDA) is NVIDIA Corporation's software development model for General Purpose Programming on Graphics Processing Units (GPGPU) (NVIDIA Corporation... Conference. Argonne National Laboratory, Argonne, IL, October 2005. NVIDIA Corporation. NVIDIA CUDA Programming Guide 2.0 [Online]. NVIDIA Corporation
Natural language processing-based COTS software and related technologies survey.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stickland, Michael G.; Conrad, Gregory N.; Eaton, Shelley M.
Natural language processing-based knowledge management software, traditionally developed for security organizations, is now becoming commercially available. An informal survey was conducted to discover and examine current NLP and related technologies and potential applications for information retrieval, information extraction, summarization, categorization, terminology management, link analysis, and visualization for possible implementation at Sandia National Laboratories. This report documents our current understanding of the technologies, lists software vendors and their products, and identifies potential applications of these technologies.
The Consumer Juggernaut: Web-Based and Mobile Applications as Innovation Pioneer
NASA Astrophysics Data System (ADS)
Messerschmitt, David G.
As happened previously in electronics, software targeted at consumers is increasingly the focus of investment and innovation. Some of the areas where it is leading are animated interfaces, treating users as a community, audio and video information, software as a service, agile software development, and the integration of business models with software design. As a risk-taking and experimental market, and as a source of ideas, consumer software can benefit other areas of applications software. The influence of consumer software can be magnified by research into the internal organizations and processes of the innovative firms at its foundation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Svetlana Shasharina
The goal of the Center for Technology for Advanced Scientific Component Software (TASCS) is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X's work in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into applications, testing the tools in those applications, and modifying the tools to be more usable.
System integration test plan for HANDI 2000 business management system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, D.
This document presents the system integration test plan for the Commercial-Off-The-Shelf (COTS) PassPort and PeopleSoft software, and the custom software created to work with the COTS products. The PassPort software is an integrated application for Accounts Payable, Contract Management, Inventory Management, Purchasing, and Material Safety Data Sheets. The PeopleSoft software is an integrated application for Project Costing, General Ledger, Human Resources/Training, Payroll, and Base Benefits.
Miniature microwave applicator for murine bladder hyperthermia studies.
Salahi, Sara; Maccarini, Paolo F; Rodrigues, Dario B; Etienne, Wiguins; Landon, Chelsea D; Inman, Brant A; Dewhirst, Mark W; Stauffer, Paul R
2012-01-01
Novel combinations of heat with chemotherapeutic agents are often studied in murine tumour models. Currently, no device exists to selectively heat small tumours at depth in mice. In this project we modelled, built and tested a miniature microwave heat applicator, the physical dimensions of which can be scaled to adjust the volume and depth of heating to focus on the tumour volume. Of particular interest is a device that can selectively heat murine bladder. Using Avizo(®) segmentation software, we created a numerical mouse model based on micro-MRI scan data. The model was imported into HFSS™ (Ansys) simulation software and parametric studies were performed to optimise the dimensions of a water-loaded circular waveguide for selective power deposition inside a 0.15 mL bladder. A working prototype was constructed operating at 2.45 GHz. Heating performance was characterised by mapping fibre-optic temperature sensors along catheters inserted at depths of 0-1 mm (subcutaneous), 2-3 mm (vaginal), and 4-5 mm (rectal) below the abdominal wall, with the mid depth catheter adjacent to the bladder. Core temperature was monitored orally. Thermal measurements confirm the simulations which demonstrate that this applicator can provide local heating at depth in small animals. Measured temperatures in murine pelvis show well-localised bladder heating to 42-43°C while maintaining normothermic skin and core temperatures. Simulation techniques facilitate the design optimisation of microwave antennas for use in pre-clinical applications such as localised tumour heating in small animals. Laboratory measurements demonstrate the effectiveness of a new miniature water-coupled microwave applicator for localised heating of murine bladder.
New Technologies and the World Ahead: The Top 20 Plus 5
2011-01-01
Specialized Agent Software Programs. Bots represent the next great milestone in software development. The general deployment of bots is projected to be in...knowledge and areas of interest. Powerful personal-agent programs will search the Internet and its databases based on...language-capable chatbot and avatar interfaces that can control electronic data and also change and manipulate things in the physical world. These
NASA Astrophysics Data System (ADS)
Hu, Philip; Mingozzi, Marco; Higgins, Laura M.; Ganapathy, Vidya; Zevon, Margot; Riman, Richard E.; Roth, Charles M.; Moghe, Prabhas V.; Pierce, Mark C.
2015-03-01
We report the design, calibration, and testing of a pre-clinical small animal imaging platform for use with short-wave infrared (SWIR) emitting contrast agents. Unlike materials emitting at visible or near-infrared wavelengths, SWIR-emitting agents require detection systems with sensitivity in the 1-2 μm wavelength region, beyond the range of commercially available small animal imagers. We used a collimated 980 nm laser beam to excite rare-earth-doped NaYF4:Er,Yb nanocomposites, as an example of a SWIR emitting material under development for biomedical imaging applications. This beam was raster scanned across the animal, with fluorescence in the 1550 nm wavelength region detected by an InGaAs area camera. Background adjustment and intensity non-uniformity corrections were applied in software. The final SWIR fluorescence image was overlaid onto a standard white-light image for registration of contrast agent uptake with respect to anatomical features.
Devices development and techniques research for space life sciences
NASA Astrophysics Data System (ADS)
Zhang, A.; Liu, B.; Zheng, C.
The development process and current status of devices and techniques for space life science in China, and the main research results in this field achieved by the Shanghai Institute of Technical Physics (SITP), CAS, are reviewed concisely in this paper. Based on an analysis of the requirements of devices and techniques for supporting space life science experiments and research, a design concept is put forward: develop different intelligent modules with professional functions and standard interfaces that are easy to integrate into a system. The realization of an experiment system with intelligent distributed control based on a field bus is discussed at three levels. Typical sensing or control function cells with certain self-determination, control, data management, and communication abilities, called Intelligent Agents, are designed and developed. A digital hardware network system consisting of the distributed Agents as intelligent nodes is constructed with normative, open field bus technology. Multitask, real-time control application software is developed in an embedded RTOS environment and implanted into the system hardware, so that a space life science experiment platform characterized by multiple tasks, multiple processes, professional function, and rapid integration can be constructed.
A Multi Agent Based Approach for Prehospital Emergency Management.
Safdari, Reza; Shoshtarian Malak, Jaleh; Mohammadzadeh, Niloofar; Danesh Shahraki, Azimeh
2017-07-01
To demonstrate an architecture that automates the prehospital emergency process and categorizes specialized care according to the situation at the right time, in order to reduce patient mortality and morbidity. Prehospital emergency processes were analyzed using existing prehospital management systems and frameworks, and the extracted processes were modeled using sequence diagrams in Rational Rose software. The system's main agents were identified and modeled via a component diagram, considering the main system actors and logically dividing business functionalities; finally, a conceptual architecture for prehospital emergency management was proposed. The proposed architecture was simulated using AnyLogic simulation software; the AnyLogic agent model, state chart, and process model were used to model the system. Multi-agent systems (MAS) have had great success in distributed, complex, and dynamic problem-solving environments, and utilizing autonomous agents provides intelligent decision-making capabilities. The proposed architecture presents prehospital management operations. The main identified agents are: EMS Center, Ambulance, Traffic Station, Healthcare Provider, Patient, Consultation Center, National Medical Record System, and a quality-of-service monitoring agent. In a critical situation such as a prehospital emergency, we are coping with sophisticated processes such as ambulance navigation, healthcare provider and service assignment, consultation, recalling a patient's past medical history through a centralized EHR system, and monitoring healthcare quality in real time. The main advantage of our work has been the utilization of a multi-agent system. Our future work will include implementing the proposed architecture and evaluating its impact on improving the quality of patient care.
2014-10-01
designed an Internet-based and mobile application (software) to assist with the following domains pertinent to diabetes self-management: 1...management that provides education, reminders, and support. The new tool is an Internet-based and mobile application (software), now called Tracking...is mobile, provides decision support with actionable options, and is based on user input, will enhance diabetes self-care, improve glycemic control
NASA Technical Reports Server (NTRS)
Stephan, Amy; Erikson, Carol A.
1991-01-01
As an initial attempt to introduce expert system technology into an onboard environment, a model-based diagnostic system using the TRW MARPLE software tool was integrated with prototype flight hardware and its corresponding control software. Because this experiment was designed primarily to test the effectiveness of the model-based reasoning technique used, the expert system ran on a separate hardware platform, and interactions between the control software and the model-based diagnostics were limited. While this project met its objective of showing that model-based reasoning can effectively isolate failures in flight hardware, it also identified the need for an integrated development path for expert system and control software for onboard applications. In developing expert systems that are ready for flight, developers must evaluate artificial intelligence techniques to determine whether they offer a real advantage onboard, identify which diagnostic functions should be performed by the expert system and which are better left to the procedural software, and work closely with both the hardware and the software developers from the beginning of a project to produce a well-designed and thoroughly integrated application.
Adaptive Long-Term Monitoring at Environmental Restoration Sites (ER-0629)
2009-05-01
Figures: Figure 2-1, General Flowchart of Software Application; Figure 2-2, Overview of the Genetic Algorithm Approach; Figure 2-3, Example of a...(and Model Builder) are highlighted on Figure 2-1, which is a general flowchart illustrating the application of the software. The software is applied...monitoring event (e.g., contaminant mass based on interpolation) that modeling is provided by Model Builder.
Programs Model the Future of Air Traffic Management
NASA Technical Reports Server (NTRS)
2010-01-01
Through Small Business Innovation Research (SBIR) contracts with Ames Research Center, Intelligent Automation Inc., based in Rockville, Maryland, advanced specialized software the company had begun developing with U.S. Department of Defense funding. The agent-based infrastructure now allows NASA's Airspace Concept Evaluation System to explore ways of improving the utilization of the National Airspace System (NAS), providing flexible modeling of every part of the NAS down to individual planes, airports, control centers, and even weather. The software has been licensed to a number of aerospace and robotics customers, and has even been used to model the behavior of crowds.
Application of Toxic Chinese Medicine in Chinese Pharmacopoeia
NASA Astrophysics Data System (ADS)
Zhao, Hui; Feng, Yu; Mao, Mingsan
2018-01-01
Objective: To explore the application characteristics of proprietary Chinese medicine prescriptions containing toxic herbs in the Pharmacopoeia. Methods: In this paper, according to clinical application, the Pharmacopoeia's proprietary Chinese medicines are divided into exterior-releasing agents, Qushu (summer-heat-relieving) agents, purgative agents, heat-clearing agents, Wenli (interior-warming) agents, cough- and asthma-relieving agents, resuscitation agents, Gutian (astringent) agents, Fuzheng (reinforcing) agents, Anshen (tranquilizing) agents, and hemostatic agents. The prescriptions and clinical applications of the proprietary Chinese medicines containing toxic herbs were analyzed and sorted, and the compatibility and application characteristics of the toxic herbs were summarized. Results: Toxic Chinese herbal medicines have long played a role in treatment with traditional Chinese medicine; holistic and dialectical analysis of toxic Chinese medicines shows that they are indispensable in proprietary Chinese medicines [2]. Conclusion: The proprietary Chinese medicines included in the Pharmacopoeia are not only effective in clinical treatment; the application and understanding of their toxic components also enrich knowledge of the toxic characteristics of traditional Chinese medicine and of the links between treatment and disease pathology, providing a basis and theoretical guidance for clinical application in patients [3].
Stellman, Jeanne Mager; Stellman, Steven D; Weber, Tracy; Tomasallo, Carrie; Stellman, Andrew B; Christian, Richard
2003-03-01
Between 1961 and 1971, U.S. military forces dispersed more than 19 million gallons of phenoxy and other herbicidal agents in the Republic of Vietnam, including more than 12 million gallons of dioxin-contaminated Agent Orange, yet only comparatively limited epidemiologic and environmental research has been carried out on the distribution and health effects of this contamination. As part of a response to a National Academy of Sciences' request for development of exposure methodologies for carrying out epidemiologic research, a conceptual framework for estimating exposure opportunity to herbicides and a geographic information system (GIS) have been developed. The GIS is based on a relational database system that integrates extensive data resources on dispersal of herbicides (e.g., HERBS records of Ranch Hand aircraft flight paths, gallonage, and chemical agent), locations of military units and bases, dynamic movement of combat troops in Vietnam, and locations of civilian population centers. The GIS can provide a variety of proximity counts for exposure to 9,141 herbicide application missions. In addition, the GIS can be used to generate a quantitative exposure opportunity index that accounts for quantity of herbicide sprayed, distance, and environmental decay of a toxic factor such as dioxin, and is flexible enough to permit substitution of other mathematical exposure models by the user. The GIS thus provides a basis for estimation of herbicide exposure for use in large-scale epidemiologic studies. To facilitate widespread use of the GIS, a user-friendly software package was developed to permit researchers to assign exposure opportunity indexes to troops, locations, or individuals.
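The quantitative exposure opportunity index described above combines gallonage sprayed, distance, and environmental decay of a toxic factor. The following is a minimal, illustrative sketch of such an index, assuming inverse-distance attenuation and first-order exponential decay; it is not the authors' published model, whose mathematical form the abstract does not specify:

```python
import math

def exposure_opportunity_index(missions, half_life_days, d0=1.0):
    """Illustrative exposure opportunity index (a sketch, not the
    published Stellman GIS model): each spray mission contributes its
    gallonage, attenuated by distance from the flight path and by
    exponential environmental decay since the spraying date.

    missions: iterable of (gallons, distance_km, days_elapsed) tuples
    half_life_days: assumed environmental half-life of the toxic factor
    d0: reference distance (km) inside which no attenuation is applied
    """
    decay = math.log(2) / half_life_days
    return sum(
        gallons * (d0 / max(distance_km, d0)) * math.exp(-decay * days)
        for gallons, distance_km, days in missions
    )
```

The abstract notes the GIS is flexible enough to permit substitution of other mathematical exposure models, so the attenuation and decay terms here stand in for whatever model a researcher supplies.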
2015-03-01
Biotechnology on the Battlefield: An Application of Agent-based Modelling for Emerging Technology Assessment...wounds might be treatable using advanced biotechnologies to control haemorrhaging and reduce blood loss until medical evacuation can be completed. This
Multimodal nanoparticle imaging agents: design and applications
NASA Astrophysics Data System (ADS)
Burke, Benjamin P.; Cawthorne, Christopher; Archibald, Stephen J.
2017-10-01
Molecular imaging, where the location of molecules or nanoscale constructs can be tracked in the body to report on disease or biochemical processes, is rapidly expanding to include combined modality or multimodal imaging. No single imaging technique can offer the optimum combination of properties (e.g. resolution, sensitivity, cost, availability). The rapid technological advances in hardware to scan patients, and software to process and fuse images, are pushing the boundaries of novel medical imaging approaches, and hand-in-hand with this is the requirement for advanced and specific multimodal imaging agents. These agents can be detected using a selection from radioisotope, magnetic resonance and optical imaging, among others. Nanoparticles offer great scope in this area as they lend themselves, via facile modification procedures, to act as multifunctional constructs. They have relevance as therapeutics and drug delivery agents that can be tracked by molecular imaging techniques with the particular development of applications in optically guided surgery and as radiosensitizers. There has been a huge amount of research work to produce nanoconstructs for imaging, and the parameters for successful clinical translation and validation of therapeutic applications are now becoming much better understood. It is an exciting time of progress for these agents as their potential is closer to being realized with translation into the clinic. The coming 5-10 years will be critical, as we will see if the predicted improvement in clinical outcomes becomes a reality. Some of the latest advances in combination modality agents are selected and the progression pathway to clinical trials analysed. This article is part of the themed issue 'Challenges for chemistry in molecular imaging'.
Performing Verification and Validation in Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1999-01-01
The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.
Software/hardware distributed processing network supporting the Ada environment
NASA Astrophysics Data System (ADS)
Wood, Richard J.; Pryk, Zen
1993-09-01
A high-performance, fault-tolerant, distributed network has been developed, tested, and demonstrated. The network is based on the MIPS Computer Systems, Inc. R3000 RISC processor, VHSIC ASICs for high-speed, reliable inter-node communications, and compatible commercial memory and I/O boards. The network is an evolution of the Advanced Onboard Signal Processor (AOSP) architecture. It supports Ada application software with an Ada-implemented operating system. A six-node implementation (capable of expansion up to 256 nodes) of the RISC multiprocessor architecture provides 120 MIPS of scalar throughput, 96 Mbytes of RAM, and 24 Mbytes of non-volatile memory. The network provides for all ground processing applications, has merit as a space-qualified RISC-based network, and interfaces to advanced Computer-Aided Software Engineering (CASE) tools for application software development.
Software reuse in spacecraft planning and scheduling systems
NASA Technical Reports Server (NTRS)
Mclean, David; Tuchman, Alan; Broseghini, Todd; Yen, Wen; Page, Brenda; Johnson, Jay; Bogovich, Lynn; Burkhardt, Chris; Mcintyre, James; Klein, Scott
1993-01-01
The use of a software toolkit and development methodology that supports software reuse is described. The toolkit includes source-code-level library modules and stand-alone tools which support such tasks as data reformatting and report generation, simple relational database applications, user interfaces, tactical planning, strategic planning, and documentation. The current toolkit is written in C and supports applications that run on IBM PCs under DOS and on UNIX-based workstations under OpenLook and Motif. The toolkit is fully integrated for building scheduling systems that reuse AI knowledge base technology. A typical scheduling scenario and three examples of applications that utilize the reuse toolkit are briefly described. In addition to the tools themselves, a description of the software evolution and reuse methodology that was used is presented.
Vasilyev, K N
2013-01-01
When developing new software products and adapting existing software, project leaders have to decide which functionalities to keep, adapt, or develop. They have to consider that the cost of making errors during the specification phase is extremely high. In this paper a formalised approach is proposed that considers the main criteria for selecting new software functions; applying this approach minimises the chances of making errors in selecting the functions to implement. Based on work on software development and support projects in the area of water resources and flood damage evaluation in economic terms at CH2M HILL (the developers of the flood modelling package ISIS), the author has defined seven criteria for selecting functions to be included in a software product. The approach is based on evaluating the relative significance of the functions to be included in the software product. Evaluation is achieved by considering each criterion and the weighting coefficients of each criterion in turn and applying the method of normalisation. This paper includes a description of this new approach and examples of its application in the development of new software products in the area of water resources management.
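The evaluation scheme summarized above — per-criterion weighting coefficients plus normalisation — can be sketched roughly as follows. This is a minimal illustration with assumed data structures, not the author's actual seven-criterion formulation:

```python
def rank_functions(scores, weights):
    """Rank candidate software functions by a weighted sum of
    normalised criterion scores (illustrative sketch).

    scores:  {function_name: [raw score per criterion]}
    weights: [weighting coefficient per criterion]
    """
    total_w = sum(weights)
    w = [x / total_w for x in weights]  # normalise weights to sum to 1
    n = len(weights)
    # Collect each criterion's scores across all candidate functions.
    cols = [[s[k] for s in scores.values()] for k in range(n)]
    ranked = {}
    for name, s in scores.items():
        # Min-max normalise each criterion score to [0, 1].
        norm = [
            (s[k] - min(cols[k])) / (max(cols[k]) - min(cols[k]))
            if max(cols[k]) > min(cols[k]) else 1.0
            for k in range(n)
        ]
        ranked[name] = sum(w[k] * norm[k] for k in range(n))
    return sorted(ranked.items(), key=lambda kv: -kv[1])
```

Functions at the top of the returned list are the strongest candidates for inclusion under the chosen criteria and weights.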
Software Template for Instruction in Mathematics
NASA Technical Reports Server (NTRS)
Shelton, Robert O.; Moebes, Travis A.; Beall, Anna
2005-01-01
Intelligent Math Tutor (IMT) is a software system that serves as a template for creating software for teaching mathematics. IMT can be easily connected to artificial-intelligence software and other analysis software through input and output of files. IMT provides an easy-to-use interface for generating courses that include tests that contain both multiple-choice and fill-in-the-blank questions, and enables tracking of test scores. IMT makes it easy to generate software for Web-based courses or to manufacture compact disks containing executable course software. IMT also can function as a Web-based application program, with features that run quickly on the Web, while retaining the intelligence of a high-level language application program with many graphics. IMT can be used to write application programs in text, graphics, and/or sound, so that the programs can be tailored to the needs of most handicapped persons. The course software generated by IMT follows a "back to basics" approach of teaching mathematics by inducing the student to apply creative mathematical techniques in the process of learning. Students are thereby made to discover mathematical fundamentals and thereby come to understand mathematics more deeply than they could through simple memorization.
Applications and testing of the LSCAD system
NASA Astrophysics Data System (ADS)
Althouse, Mark L.; Gross, Robert L.; Ditillo, John T.; Lagna, William M.; Kolodzey, Steve J.; Keiser, Christopher C.; Nasers, Gary D.
1996-06-01
The lightweight standoff chemical agent detector (LSCAD) is an infrared Michelson interferometer operating in the 8-13 micron band, designed primarily for military contamination-avoidance and early-warning applications. The system is designed to be operated autonomously from a vehicle while on the move and to provide 360-degree coverage. The first group of prototypes was delivered in 1994 and has undergone integration into several platforms, including the HMMWV, the M2 Bradley Fighting Vehicle, the M109 self-propelled Howitzer, and the Pioneer and Hurricane unmanned air vehicles (UAVs). Additional vehicles and platforms are planned. To meet the restrictions of military applications, the prototype interferometer subsystem has a weight of about 10 lbs and is approximately 0.20 cu ft in size. The full system size and weight depend upon the particular platform and its operational requirements. LSCAD employs onboard instrument control, data collection, analysis, and target detection decision software, all of which are critical to real-time operation. The hardware, software, and test results are discussed.
Large Scale Portability of Hospital Information System Software
Munnecke, Thomas H.; Kuhn, Ingeborg M.
1986-01-01
As part of its Decentralized Hospital Computer Program (DHCP) the Veterans Administration installed new hospital information systems in 169 of its facilities during 1984 and 1985. The application software for these systems is based on the ANS MUMPS language, is public domain, and is designed to be operating-system and hardware independent. The software, developed by VA employees, is built upon a layered approach, where application packages layer on a common data dictionary which is supported by a Kernel of software. Communications between facilities are based on public-domain Department of Defense ARPANET standards for domain naming, mail transfer protocols, and message formats, layered on a variety of communications technologies.
NanoDesign: Concepts and Software for a Nanotechnology Based on Functionalized Fullerenes
NASA Technical Reports Server (NTRS)
Globus, Al; Jaffe, Richard; Chancellor, Marisa K. (Technical Monitor)
1996-01-01
Eric Drexler has proposed a hypothetical nanotechnology based on diamond and investigated the properties of such molecular systems. While attractive, diamondoid nanotechnology is not physically accessible with straightforward extensions of current laboratory techniques. We propose a nanotechnology based on functionalized fullerenes and investigate carbon-nanotube-based gears with teeth added via a benzyne reaction known to occur with C60. The gears are single-walled carbon nanotubes with appended benzyne groups for teeth. Fullerenes are in widespread laboratory use and can be functionalized in many ways. Companion papers computationally demonstrate the properties of these gears (they appear to work) and the accessibility of the benzyne/nanotube reaction. This paper describes the molecular design techniques and rationale, as well as the software that implements these design techniques. The software is a set of persistent C++ objects controlled by Tcl command scripts. The C++/Tcl interface is automatically generated by a software system called tcl_c++, developed by the author and described here. The objects keep track of different portions of the molecular machinery to allow different simulation techniques and boundary conditions to be applied as appropriate. This capability has been required to demonstrate (computationally) our gear's feasibility. A new distributed software architecture featuring a WWW universal client, CORBA distributed objects, and agent software is under consideration. The software architecture is intended to eventually enable a widely dispersed group to develop complex simulated molecular machines.
Documenting clinical pharmacist intervention before and after the introduction of a web-based tool.
Nurgat, Zubeir A; Al-Jazairi, Abdulrazaq S; Abu-Shraie, Nada; Al-Jedai, Ahmed
2011-04-01
To develop a database for documenting pharmacist interventions through a web-based application. The secondary endpoint was to determine whether the new, web-based application provides any benefits with regard to documentation compliance by clinical pharmacists and ease of calculating cost savings, compared with our previous method of documenting pharmacist interventions. A tertiary care hospital in Saudi Arabia. The documentation of interventions using a web-based documentation application was retrospectively compared with previous methods of documenting clinical pharmacists' interventions (multi-user PC software). The number and types of interventions recorded by pharmacists, data mining of archived data, efficiency, cost savings, and the accuracy of the data generated. The number of documented clinical interventions increased from 4,926 using the multi-user PC software to 6,840 with the web-based application. On average, we observed 653 interventions per clinical pharmacist using the web-based application, an increase compared to an average of 493 interventions using the old multi-user PC software. However, using a paired Student's t-test, there was no statistically significant difference between the two means (P = 0.201). Using a χ² test, which captured management level and the type of system used, we found a strong effect of management level (P < 2.2 × 10⁻¹⁶) on the number of documented interventions. We also found a moderately significant relationship between educational level and the number of interventions documented (P = 0.045). The mean ± SD time required to document an intervention using the web-based application was 66.55 ± 8.98 s. Using the web-based application, 29.06% of documented interventions resulted in cost savings, while using the multi-user PC software only 4.75% of interventions did so. The majority of cost savings across both platforms resulted from the discontinuation of unnecessary drugs and a change in dosage regimen.
Data collection using the web-based application was consistently more complete when compared to the multi-user PC software. The web-based application is an efficient system for documenting pharmacist interventions. Its flexibility and accessibility, as well as its detailed report functionality is a useful tool that will hopefully encourage other primary and secondary care facilities to adopt similar applications.
NASA Astrophysics Data System (ADS)
Wang, H. T.; Chen, T. T.; Yan, C.; Pan, H.
2018-05-01
For the App recommendation domain of mobile phone software, App recommendations are made using an item-based collaborative filtering algorithm combined with weighted Slope One, further improving on the traditional collaborative filtering algorithm's problems of cold start, data-matrix sparseness, and other issues. The recommendation algorithm is parallelized on the Spark platform, and the Spark Streaming real-time computing framework is introduced to improve the real-time performance of software application recommendations.
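The weighted Slope One scheme the abstract builds on is compact enough to sketch directly. Below is a minimal, illustrative Python implementation (the data layout and function names are my own, not from the paper): average rating deviations between item pairs are learned once, and predictions weight each deviation by how many users rated both items.

```python
from collections import defaultdict

def train(ratings):
    """Build average rating deviations between item pairs.

    ratings: {user: {item: rating}}
    Returns (dev, freq): dev[i][j] = mean(r_i - r_j),
    freq[i][j] = number of users who rated both i and j.
    """
    freq = defaultdict(lambda: defaultdict(int))
    dev = defaultdict(lambda: defaultdict(float))
    for user_ratings in ratings.values():
        for i, ri in user_ratings.items():
            for j, rj in user_ratings.items():
                if i == j:
                    continue
                freq[i][j] += 1
                dev[i][j] += ri - rj
    for i in dev:
        for j in dev[i]:
            dev[i][j] /= freq[i][j]
    return dev, freq

def predict(user_ratings, target, dev, freq):
    """Weighted Slope One prediction of this user's rating for `target`."""
    num = den = 0.0
    for j, rj in user_ratings.items():
        if j == target or target not in freq or j not in freq[target]:
            continue
        num += (dev[target][j] + rj) * freq[target][j]
        den += freq[target][j]
    return num / den if den else None  # None: cold-start, no co-rated items
```

The `freq` weighting is what distinguishes the weighted variant: item pairs co-rated by many users dominate the prediction, which partially mitigates sparseness.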
2008-09-01
Berners-Lee, T. (1999). Weaving the Web. New York: HarperCollins Publishers, Inc.
Berners-Lee, T., Hendler, J., & Lassila, O. (2001). The Semantic Web: an environment where software agents roaming from page to page can readily carry out sophisticated tasks for users.
The Impact of Software Culture on the Management of Community Data
NASA Astrophysics Data System (ADS)
Collins, J. A.; Pulsifer, P. L.; Sheffield, E.; Lewis, S.; Oldenburg, J.
2013-12-01
The Exchange for Local Observations and Knowledge of the Arctic (ELOKA), a program hosted at the National Snow and Ice Data Center (NSIDC), supports the collection, curation, and distribution of Local and Traditional Knowledge (LTK) data, as well as some quantitative data products. Investigations involving LTK data often involve community participation, and therefore require flexible and robust user interfaces to support a reliable process of data collection and management. Often, investigators focused on LTK and community-based monitoring choose to use ELOKA's data services based on our ability to provide rapid proofs of concept and economical delivery of a usable product. To satisfy these two overarching criteria, ELOKA is experimenting with modifications to its software development culture, both in terms of how software applications are developed and the kinds of software applications (or components) being developed. Over the past several years, NSIDC has shifted its software development culture from one of assigning individual scientific programmers to support particular principal investigators or projects, to an Agile software methodology implementation using Scrum practices. ELOKA has participated in this process by working with other product owners to schedule and prioritize development work, which is then implemented by a team of application developers. Scrum, along with practices such as Test Driven Development (TDD) and pair programming, improves the quality of the software product delivered to the user community. To meet the need for rapid prototyping and to maximize product development and support with limited developer input, our software development efforts are now focused on creating a platform of application modules that can be quickly customized to suit the needs of a variety of LTK projects. This approach is in contrast to the strategy of delivering custom applications for individual projects.
To date, we have integrated components of the Nunaliit Atlas framework (a Java/JavaScript client-server web-based application) with an existing Ruby on Rails application. This approach requires transitioning individual applications to expose a service layer, thus allowing interapplication communication via RESTful services. In this presentation we will report on our experiences using Agile Scrum practices, our efforts to move from custom solutions to a platform of customizable modules, and the impact of each on our ability to support researchers and Arctic residents in the domain of community-based observations and knowledge.
Classification software technique assessment
NASA Technical Reports Server (NTRS)
Jayroe, R. R., Jr.; Atkinson, R.; Dasarathy, B. V.; Lybanon, M.; Ramapryian, H. K.
1976-01-01
A catalog of software options is presented for the use of local user communities to obtain software for analyzing remotely sensed multispectral imagery. The resources required to utilize a particular software program are described. Descriptions of how a particular program analyzes data and the performance of that program for an application and data set provided by the user are shown. An effort is made to establish a statistical performance base for various software programs with regard to different data sets and analysis applications, to determine the status of the state-of-the-art.
Design and implementation of a cloud based lithography illumination pupil processing application
NASA Astrophysics Data System (ADS)
Zhang, Youbao; Ma, Xinghua; Zhu, Jing; Zhang, Fang; Huang, Huijie
2017-02-01
Pupil parameters are important parameters for evaluating the quality of a lithography illumination system. In this paper, a cloud-based, full-featured pupil processing application is implemented. A web browser is used for the UI (User Interface), the WebSocket protocol and JSON format are used for communication between the client and the server, and the computing part is implemented on the server side, where the application integrates a variety of high-quality professional libraries, such as the image processing libraries libvips and ImageMagick, and LaTeX for automatic report generation. The cloud-based framework takes advantage of the server's superior computing power and rich software collection, and the program can run anywhere there is a modern browser thanks to its web UI design. Compared with the traditional software delivery model (purchased, licensed, shipped, downloaded, installed, maintained, and upgraded), the new cloud-based approach, which requires no installation and is easy to use and maintain, opens up a new way of working. Cloud-based applications may well be the future of software development.
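The client/server contract described above (WebSocket transport carrying JSON messages) can be illustrated with a small server-side dispatcher. This is a hedged sketch only: the command names, the `handle_message` helper, and the request shape are hypothetical, and the WebSocket transport itself is omitted so the message-handling logic stands alone.

```python
import json

# Hypothetical command handlers standing in for the pupil-processing routines.
HANDLERS = {
    "ellipticity": lambda args: {"value": args["major"] / args["minor"]},
    "echo": lambda args: args,
}

def handle_message(raw):
    """Decode a JSON request, dispatch it, and encode a JSON reply.

    Models the contract sketched in the abstract: the browser sends
    {"cmd": ..., "args": {...}} over a websocket and receives a JSON
    reply with an "ok" flag back.
    """
    try:
        msg = json.loads(raw)
        result = HANDLERS[msg["cmd"]](msg.get("args", {}))
        return json.dumps({"ok": True, "result": result})
    except (KeyError, json.JSONDecodeError) as exc:
        return json.dumps({"ok": False, "error": str(exc)})
```

Keeping the dispatcher pure (string in, string out) makes the computing core testable independently of whichever WebSocket library carries the messages.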
Software Agents to Assist in Distance Learning Environments
ERIC Educational Resources Information Center
Choy, Sheung-On; Ng, Sin-Chun; Tsang, Yiu-Chung
2005-01-01
The Open University of Hong Kong (OUHK) is a distance education university with about 22,500 students. In fulfilling its mission, the university has adopted various Web-based and electronic means to support distance learning. For instance, OUHK uses a Web-based course management system (CMS) to provide students with a flexible way to obtain course…
Improvements to the APBS biomolecular solvation software suite.
Jurrus, Elizabeth; Engel, Dave; Star, Keith; Monson, Kyle; Brandi, Juan; Felberg, Lisa E; Brookes, David H; Wilson, Leighton; Chen, Jiahui; Liles, Karina; Chun, Minju; Li, Peter; Gohara, David W; Dolinsky, Todd; Konecny, Robert; Koes, David R; Nielsen, Jens Erik; Head-Gordon, Teresa; Geng, Weihua; Krasny, Robert; Wei, Guo-Wei; Holst, Michael J; McCammon, J Andrew; Baker, Nathan A
2018-01-01
The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages, and has had an impact on the study of a broad range of chemical, biological, and biomedical applications. APBS addresses the three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advances in computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this article, we discuss the models and capabilities that have recently been implemented within the APBS software package, including analytical and semi-analytical Poisson-Boltzmann solvers, an optimized boundary element solver, a geometric flow solvation model, a graph theory-based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics. © 2017 The Protein Society.
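To give a concrete flavor of the continuum-electrostatics problem APBS solves in 3-D, here is a toy 1-D finite-difference solver for the linearized Poisson-Boltzmann (Debye-Hückel) equation phi'' = kappa² phi. It is purely illustrative and unrelated to APBS's actual numerics; the function name, boundary conditions, and discretization choices are assumptions made for this sketch.

```python
def solve_lpb_1d(kappa, length, n):
    """Solve the 1-D linearized Poisson-Boltzmann (Debye-Hueckel) equation
    phi'' = kappa**2 * phi on [0, length] with phi(0) = 1, phi(length) = 0,
    using second-order finite differences and a pure-Python Thomas solve.
    Returns phi at the n interior grid points x_i = (i + 1) * h.
    """
    h = length / (n + 1)
    # Tridiagonal system: phi[i-1] - (2 + (kappa*h)**2) * phi[i] + phi[i+1] = 0
    diag = [-(2.0 + (kappa * h) ** 2)] * n
    rhs = [0.0] * n
    rhs[0] = -1.0  # phi(0) = 1 boundary value moved to the right-hand side
    # Forward elimination (sub- and super-diagonal entries are all 1).
    for i in range(1, n):
        w = 1.0 / diag[i - 1]
        diag[i] -= w
        rhs[i] -= w * rhs[i - 1]
    # Back substitution.
    phi = [0.0] * n
    phi[-1] = rhs[-1] / diag[-1]
    for i in range(n - 2, -1, -1):
        phi[i] = (rhs[i] - phi[i + 1]) / diag[i]
    return phi
```

With kappa = 1 and a long domain, the numerical solution should track the screened-decay profile, roughly exp(-kappa·x) away from the far boundary.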
Gignac, Paul M; Kley, Nathan J; Clarke, Julia A; Colbert, Matthew W; Morhardt, Ashley C; Cerio, Donald; Cost, Ian N; Cox, Philip G; Daza, Juan D; Early, Catherine M; Echols, M Scott; Henkelman, R Mark; Herdina, A Nele; Holliday, Casey M; Li, Zhiheng; Mahlow, Kristin; Merchant, Samer; Müller, Johannes; Orsbon, Courtney P; Paluh, Daniel J; Thies, Monte L; Tsai, Henry P; Witmer, Lawrence M
2016-06-01
Morphologists have historically had to rely on destructive procedures to visualize the three-dimensional (3-D) anatomy of animals. More recently, however, non-destructive techniques have come to the forefront. These include X-ray computed tomography (CT), which has been used most commonly to examine the mineralized, hard-tissue anatomy of living and fossil metazoans. One relatively new and potentially transformative aspect of current CT-based research is the use of chemical agents to render visible, and differentiate between, soft-tissue structures in X-ray images. Specifically, iodine has emerged as one of the most widely used of these contrast agents among animal morphologists due to its ease of handling, cost effectiveness, and differential affinities for major types of soft tissues. The rapid adoption of iodine-based contrast agents has resulted in a proliferation of distinct specimen preparations and scanning parameter choices, as well as an increasing variety of imaging hardware and software preferences. Here we provide a critical review of the recent contributions to iodine-based, contrast-enhanced CT research to enable researchers just beginning to employ contrast enhancement to make sense of this complex new landscape of methodologies. We provide a detailed summary of recent case studies, assess factors that govern success at each step of the specimen storage, preparation, and imaging processes, and make recommendations for standardizing both techniques and reporting practices. Finally, we discuss potential cutting-edge applications of diffusible iodine-based contrast-enhanced computed tomography (diceCT) and the issues that must still be overcome to facilitate the broader adoption of diceCT going forward. © 2016 The Authors. Journal of Anatomy published by John Wiley & Sons Ltd on behalf of Anatomical Society.
V&V Within Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1996-01-01
Verification and Validation (V&V) is used to increase the level of assurance of critical software, particularly that of safety-critical and mission-critical software. V&V is a systems engineering discipline that evaluates the software in a systems context, and is currently applied during the development of a specific application system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.
NASA Astrophysics Data System (ADS)
Criado, Javier; Padilla, Nicolás; Iribarne, Luis; Asensio, Jose-Andrés
Due to the globalization of the information and knowledge society on the Internet, modern Web-based Information Systems (WIS) must be flexible and prepared to be easily accessible and manageable in real time. Recently, special interest has been paid to the globalization of information through a common vocabulary (i.e., ontologies) and to standardized ways of retrieving information on the Web (i.e., powerful search engines and intelligent software agents). These same principles of globalization and standardization should also apply to the user interfaces of WIS, but these are still built on traditional development paradigms. In this paper we present an approach to reducing this globalization/standardization gap in the generation of WIS user interfaces by using a real-time, bottom-up composition perspective with COTS-interface components (interface widgets) and trading services.
Runtime Performance Monitoring Tool for RTEMS System Software
NASA Astrophysics Data System (ADS)
Cho, B.; Kim, S.; Park, H.; Kim, H.; Choi, J.; Chae, D.; Lee, J.
2007-08-01
RTEMS is a commercial-grade real-time operating system that supports multi-processor computers. However, there are not many development tools for RTEMS. In this paper, we report a new RTEMS-based runtime performance monitoring tool. We have implemented a lightweight runtime monitoring task with an extension to the RTEMS APIs. Using our tool, software developers can verify various performance-related parameters during runtime. Our tool can be used during the software development phase as well as for in-orbit operation. Our implemented target agent is lightweight and has small overhead, using a SpaceWire interface. Efforts to reduce overhead and to add other monitoring parameters are currently under research.
Study of a unified hardware and software fault-tolerant architecture
NASA Technical Reports Server (NTRS)
Lala, Jaynarayan; Alger, Linda; Friend, Steven; Greeley, Gregory; Sacco, Stephen; Adams, Stuart
1989-01-01
A unified architectural concept, called the Fault Tolerant Processor Attached Processor (FTP-AP), that can tolerate hardware as well as software faults is proposed for applications requiring ultrareliable computation capability. An emulation of the FTP-AP architecture, consisting of a breadboard Motorola 68010-based quadruply redundant Fault Tolerant Processor, four VAX 750s as attached processors, and four versions of a transport aircraft yaw damper control law, is used as a testbed in the AIRLAB to examine a number of critical issues. Solutions of several basic problems associated with N-Version software are proposed and implemented on the testbed. This includes a confidence voter to resolve coincident errors in N-Version software. A reliability model of N-Version software that is based upon the recent understanding of software failure mechanisms is also developed. The basic FTP-AP architectural concept appears suitable for hosting N-Version application software while at the same time tolerating hardware failures. Architectural enhancements for greater efficiency, software reliability modeling, and N-Version issues that merit further research are identified.
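The "confidence voter" for resolving coincident N-version errors can be sketched as follows. This is an assumed weighting scheme for illustration, not the FTP-AP design: version outputs are clustered within a tolerance, the largest agreement group wins, and ties between equally sized groups are broken by summed per-version reliability weights.

```python
def confidence_vote(outputs, reliability, tol=1e-6):
    """Vote among N-version outputs, resolving ties with reliability weights.

    outputs: one numeric result per software version.
    reliability: prior weight in (0, 1] for each version (e.g. from its
    observed failure history); these weights are an illustrative assumption.
    Returns the representative value of the winning agreement group.
    """
    groups = []  # each entry: (member_indices, representative_value)
    for i, v in enumerate(outputs):
        for members, rep in groups:
            if abs(v - rep) <= tol:  # agrees with an existing group
                members.append(i)
                break
        else:
            groups.append(([i], v))  # starts a new group
    # Prefer the largest group; break ties by total reliability.
    best = max(groups, key=lambda g: (len(g[0]),
                                      sum(reliability[i] for i in g[0])))
    return best[1]
```

A plain majority voter cannot decide a 1-versus-1 split; the reliability tiebreak is one simple way such a voter can still return a single answer.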
A Stigmergy Collaboration Approach in the Open Source Software Developer Community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Xiaohui; Pullum, Laura L; Treadwell, Jim N
2009-01-01
The communication model of some self-organized online communities is significantly different from that of traditional social-network-based communities. It is problematic to use social network analysis to analyze the collaboration structure and emergent behaviors of these communities because they lack peer-to-peer connections. Stigmergy theory provides an explanation of the collaboration model of these communities. In this research, we present a stigmergy approach for building an agent-based simulation of the collaboration model in the open source software (OSS) developer community. We used a group of actors who collaborate on OSS projects through forums as our frame of reference and investigated how the choices actors make in contributing their work on the projects determine the global status of the whole OSS project. In our simulation, the forum posts serve as the digital pheromone, and the modified Pierre-Paul Grassé pheromone model is used for computing the developer agents' behavior selection probability.
Target Trailing With Safe Navigation for Maritime Autonomous Surface Vehicles
NASA Technical Reports Server (NTRS)
Wolf, Michael; Kuwata, Yoshiaki; Zarzhitsky, Dimitri V.
2013-01-01
This software implements a motion-planning module for a maritime autonomous surface vehicle (ASV). The module trails a given target while also avoiding static and dynamic surface hazards. When surface hazards are other moving boats, the motion planner must apply International Regulations for Avoiding Collisions at Sea (COLREGS). A key subset of these rules has been implemented in the software. In case contact with the target is lost, the software can receive and follow a "reacquisition route," provided by a complementary system, until the target is reacquired. The programmatic intention is that the trailed target is a submarine, although any mobile naval platform could serve as the target. The algorithmic approach to combining motion with a (possibly moving) goal location, while avoiding local hazards, may be applicable to robotic rovers, automated landing systems, and autonomous airships. The software operates in JPL's CARACaS (Control Architecture for Robotic Agent Command and Sensing) software architecture and relies on other modules for environmental perception data and information on the predicted detectability of the target, as well as the low-level interface to the boat controls.
A Multi-Agent Environment for Negotiation
NASA Astrophysics Data System (ADS)
Hindriks, Koen V.; Jonker, Catholijn M.; Tykhonov, Dmytro
In this chapter we introduce the System for Analysis of Multi-Issue Negotiation (SAMIN). SAMIN offers a negotiation environment that supports and facilitates the setup of a variety of negotiations. The environment has been designed to analyse negotiation processes between human negotiators, between human and software agents, and between software agents. It offers a range of different agents, different domains, and other options useful for defining a negotiation setup. The environment has been used to test and evaluate a range of negotiation strategies in various domains, playing against other negotiating agents as well as humans. We discuss some of the results obtained by means of these experiments.
A multi-agent approach to intelligent monitoring in smart grids
NASA Astrophysics Data System (ADS)
Vallejo, D.; Albusac, J.; Glez-Morcillo, C.; Castro-Schez, J. J.; Jiménez, L.
2014-04-01
In this paper, we propose a scalable multi-agent architecture to give support to smart grids, paying special attention to the intelligent monitoring of distribution substations. The data gathered by multiple sensors are used by software agents that are responsible for monitoring different aspects or events of interest, such as normal voltage values or unbalanced intensity values that can end up blowing fuses and decreasing the quality of service of end consumers. The knowledge bases of these agents have been built by means of a formal model for normality analysis that has been successfully used in other surveillance domains. The architecture facilitates the integration of new agents and can be easily configured and deployed to monitor different environments. The experiments have been conducted over a power distribution network.
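A normality-analysis monitoring agent of the kind described can be sketched with a simple trapezoidal membership function over voltage readings. The thresholds, the `VoltageAgent` class, and the alarm rule are hypothetical illustrations; the formal normality model in the paper is richer than this.

```python
def normality_degree(value, low, high, margin):
    """Degree in [0, 1] to which a sensor reading is 'normal': 1 inside
    [low, high], falling linearly to 0 over `margin` units outside it
    (a trapezoidal membership sketch of a normality model)."""
    if low <= value <= high:
        return 1.0
    if value < low:
        return max(0.0, 1.0 - (low - value) / margin)
    return max(0.0, 1.0 - (value - high) / margin)

class VoltageAgent:
    """Monitoring agent that flags an event when normality drops too low.

    Default thresholds (230 V nominal band) are illustrative assumptions.
    """
    def __init__(self, low=220.0, high=240.0, margin=10.0, threshold=0.5):
        self.low, self.high = low, high
        self.margin, self.threshold = margin, threshold

    def observe(self, reading):
        score = normality_degree(reading, self.low, self.high, self.margin)
        return {"reading": reading, "normality": score,
                "alarm": score < self.threshold}
```

Because each agent owns one such knowledge base, adding a new monitored aspect (e.g., unbalanced intensity) means deploying another agent with its own membership functions, which is the scalability property the architecture claims.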
Web-based interactive 2D/3D medical image processing and visualization software.
Mahmoudi, Seyyed Ehsan; Akhondi-Asl, Alireza; Rahmani, Roohollah; Faghih-Roohi, Shahrooz; Taimouri, Vahid; Sabouri, Ahmad; Soltanian-Zadeh, Hamid
2010-05-01
There are many medical image processing software tools available for research and diagnosis purposes. However, most of these tools are available only as local applications. This limits the accessibility of the software to a specific machine, and thus the data and processing power of that application are not available to other workstations. Further, there are operating system and processing power limitations which prevent such applications from running on every type of workstation. By developing web-based tools, it is possible for users to access the medical image processing functionalities wherever the internet is available. In this paper, we introduce a pure web-based, interactive, extendable, 2D and 3D medical image processing and visualization application that requires no client installation. Our software uses a four-layered design consisting of an algorithm layer, web-user-interface layer, server communication layer, and wrapper layer. To compete with extendibility of the current local medical image processing software, each layer is highly independent of other layers. A wide range of medical image preprocessing, registration, and segmentation methods are implemented using open source libraries. Desktop-like user interaction is provided by using AJAX technology in the web-user-interface. For the visualization functionality of the software, the VRML standard is used to provide 3D features over the web. Integration of these technologies has allowed implementation of our purely web-based software with high functionality without requiring powerful computational resources in the client side. The user-interface is designed such that the users can select appropriate parameters for practical research and clinical studies. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.
Applications of agent-based modeling to nutrient movement Lake Michigan
As part of an ongoing project aiming to provide useful information for nearshore management (harmful algal blooms, nutrient loading), we explore the value of agent-based models in Lake Michigan. Agent-based models follow many individual “agents” moving through a simul...
High-throughput state-machine replication using software transactional memory.
Zhao, Wenbing; Yang, William; Zhang, Honglei; Yang, Jack; Luo, Xiong; Zhu, Yueqin; Yang, Mary; Luo, Chaomin
2016-11-01
State-machine replication is a common way of constructing general-purpose fault tolerance systems. To ensure replica consistency, requests must be executed sequentially according to some total order at all non-faulty replicas. Unfortunately, this could severely limit the system throughput. This issue has been partially addressed by identifying non-conflicting requests based on application semantics and executing these requests concurrently. However, identifying and tracking non-conflicting requests requires intimate knowledge of application design and implementation, and a custom fault tolerance solution developed for one application cannot be easily adopted by other applications. Software transactional memory offers a new way of constructing concurrent programs. In this article, we present the mechanisms needed to retrofit existing concurrency control algorithms designed for software transactional memory for state-machine replication. The main benefit of using software transactional memory in state-machine replication is that general-purpose concurrency control mechanisms can be designed without deep knowledge of application semantics. As such, new fault tolerance systems based on state-machine replication with excellent throughput can be easily designed and maintained. In this article, we introduce three different concurrency control mechanisms for state-machine replication using software transactional memory: ordered strong strict two-phase locking, conventional timestamp-based multiversion concurrency control, and speculative timestamp-based multiversion concurrency control. Our experiments show that the speculative timestamp-based multiversion concurrency control mechanism has the best performance in all types of workload, while the conventional timestamp-based multiversion concurrency control offers the worst performance due to a high abort rate in the presence of even moderate contention between transactions.
The ordered strong strict two-phase locking mechanism offers the simplest solution with excellent performance in low contention workload, and fairly good performance in high contention workload.
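The ordered strong strict two-phase locking idea, granting each data item's lock to requests strictly in total-order sequence so that concurrent execution still matches the agreed serial order, can be sketched as follows. This is a simplified single-process illustration with names of my own; it is not the paper's implementation.

```python
import threading

class OrderedLockManager:
    """Grant each data item's lock to requests strictly in total order."""
    def __init__(self):
        self.cond = threading.Condition()
        self.queues = {}  # item -> request sequence numbers, in total order

    def enroll(self, seq, items):
        # Called in total order, before concurrent execution begins.
        for item in items:
            self.queues.setdefault(item, []).append(seq)

    def acquire(self, seq, item):
        # Block until this request reaches the front of the item's queue.
        with self.cond:
            self.cond.wait_for(lambda: self.queues[item][0] == seq)

    def release(self, seq, items):
        with self.cond:
            for item in items:
                self.queues[item].pop(0)
            self.cond.notify_all()

def run_in_order(requests, state):
    """Execute requests on concurrent threads; lock order enforces the total order.

    requests: list of (item_set, fn) in total order; each fn mutates `state`.
    """
    mgr = OrderedLockManager()
    for seq, (items, _) in enumerate(requests):
        mgr.enroll(seq, items)

    def worker(seq, items, fn):
        for item in sorted(items):
            mgr.acquire(seq, item)
        fn(state)  # strong strict 2PL: all locks held until after execution
        mgr.release(seq, items)

    threads = [threading.Thread(target=worker, args=(s, it, fn))
               for s, (it, fn) in enumerate(requests)]
    for t in reversed(threads):  # deliberately start threads out of order
        t.start()
    for t in threads:
        t.join()
    return state
```

Requests touching disjoint items proceed in parallel, while conflicting requests serialize in the agreed order, which is exactly the replica-consistency property the abstract describes.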
Measures and metrics for software development
NASA Technical Reports Server (NTRS)
1984-01-01
The evaluations of and recommendations for the use of software development measures based on the practical and analytical experience of the Software Engineering Laboratory are discussed. The basic concepts of measurement and system of classification for measures are described. The principal classes of measures defined are explicit, analytic, and subjective. Some of the major software measurement schemes appearing in the literature are derived. The applications of specific measures in a production environment are explained. These applications include prediction and planning, review and assessment, and evaluation and selection.
The Need for V&V in Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1997-01-01
V&V is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors as early as possible during the development process; the application system provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment, addressing errors related to an entire domain or product line rather than a single application. This framework includes the activities of traditional application-level V&V, and extends these activities into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
Zheng, Song; Zhang, Qi; Zheng, Rong; Huang, Bi-Qin; Song, Yi-Lin; Chen, Xin-Chu
2017-01-01
In recent years, the smart home field has gained wide attention for its broad application prospects. However, families using smart home systems must usually adopt various heterogeneous smart devices, including sensors and other devices, which makes it more difficult to manage and control the home system. How to design a unified control platform to deal with the collaborative control problem of heterogeneous smart devices is one of the greatest challenges in the current smart home field. The main contribution of this paper is to propose a universal smart home control platform architecture (IAPhome) based on a multi-agent system and communication middleware, which shows significant adaptability and advantages in many aspects, including heterogeneous device connectivity, collaborative control, human-computer interaction, and user self-management. The communication middleware is an important foundation of the design and implementation of this architecture, making it possible to integrate heterogeneous smart devices in a flexible way. A concrete method of applying the multi-agent software technique to solve the integrated control problem of the smart home system is also presented. The proposed platform architecture has been tested in a real smart home environment, and the results indicate the effectiveness of our approach for solving the collaborative control problem of different smart devices. PMID:28926957
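The role of communication middleware in decoupling heterogeneous device agents can be illustrated with a minimal topic-based publish/subscribe bus. The topic names and the `LampAgent` example are hypothetical; they are not part of IAPhome.

```python
from collections import defaultdict

class MessageBus:
    """Minimal topic-based publish/subscribe bus: a sketch of communication
    middleware that decouples heterogeneous device agents from one another."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self.subscribers[topic]:
            handler(payload)

class LampAgent:
    """Agent wrapping a (hypothetical) heterogeneous lamp device; it only
    speaks bus messages and never talks to other agents directly."""
    def __init__(self, bus):
        self.on = False
        bus.subscribe("home/livingroom/lamp", self.handle)

    def handle(self, payload):
        self.on = (payload.get("command") == "on")
```

Because agents only share topic names and message shapes, a new device type can be integrated by adding one agent class, without touching any existing agent, which is the flexibility claim the abstract makes for middleware-based integration.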
The Application of V&V within Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward
1996-01-01
Verification and Validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In reuse-based software engineering, decisions on the requirements, design, and even implementation of domain assets can be made prior to beginning development of a specific system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.
A usability evaluation of medical software at an expert conference setting.
Bond, Raymond Robert; Finlay, Dewar D; Nugent, Chris D; Moore, George; Guldenring, Daniel
2014-01-01
A usability test was employed to evaluate two medical software applications at an expert conference setting. One software application is a medical diagnostic tool (electrocardiogram [ECG] viewer) and the other is a medical research tool (electrode misplacement simulator [EMS]). These novel applications have yet to be adopted by the healthcare domain, thus, (1) we wanted to determine the potential user acceptance of these applications and (2) we wanted to determine the feasibility of evaluating medical diagnostic and medical research software at a conference setting as opposed to the conventional laboratory setting. The medical diagnostic tool (ECG viewer) was evaluated using seven delegates and the medical research tool (EMS) was evaluated using 17 delegates that were recruited at the 2010 International Conference on Computing in Cardiology. Each delegate/participant was required to use the software and undertake a set of predefined tasks during the session breaks at the conference. User interactions with the software were recorded using screen-recording software. The 'think-aloud' protocol was also used to elicit verbal feedback from the participants whilst they attempted the pre-defined tasks. Before and after each session, participants completed a pre-test and a post-test questionnaire respectively. The average duration of a usability session at the conference was 34.69 min (SD=10.28). However, taking into account that 10 min was dedicated to the pre-test and post-test questionnaires, the average time dedication to user interaction of the medical software was 24.69 min (SD=10.28). Given we have shown that usability data can be collected at conferences, this paper details the advantages of conference-based usability studies over the laboratory-based approach. For example, given delegates gather at one geographical location, a conference-based usability evaluation facilitates recruitment of a convenient sample of international subject experts. 
This would otherwise be very expensive to arrange. A conference-based approach also allows data to be collected over a few days as opposed to months, by avoiding the administrative duties normally involved in a laboratory-based approach, e.g. mailing invitation letters as part of a recruitment campaign. Following analysis of the user video recordings, 41 (previously unknown) use errors were identified in the advanced ECG viewer and 29 were identified in the EMS application. All use errors were given a consensus severity rating by two independent usability experts. On a rating scale of 1 to 4 (where 1=cosmetic and 4=critical), the average severity rating for the ECG viewer was 2.24 (SD=1.09) and the average severity rating for the EMS application was 2.34 (SD=0.97). We were also able to extract task completion rates and times from the video recordings to determine the effectiveness of the software applications. For example, six out of seven tasks were completed by all participants when using both applications. This statistic alone suggests both applications already have a high degree of usability. As well as extracting data from the video recordings, we were also able to extract data from the questionnaires. Using a semantic differential scale (where 1=poor and 5=excellent), delegates highly rated the 'responsiveness', 'usefulness', 'learnability' and the 'look and feel' of both applications. This study has shown the potential user acceptance and user-friendliness of the novel EMS and ECG viewer applications within the healthcare domain. It has also shown that both medical diagnostic software and medical research software can be evaluated for their usability at an expert conference setting. The primary advantage of a conference-based usability evaluation over a laboratory-based evaluation is the high concentration of experts at one location, which is convenient, less time consuming and less expensive. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Sarmaga, Don; DuBois, Jeffrey A; Lyon, Martha E
2011-01-01
Background Off-meter dosed photometric glucose-oxidase-based glucose meters have been reported to be susceptible to interference by hydrogen-peroxide-based disinfecting agents. The objective of this study was to determine if a single application of hydrogen-peroxide-containing Accel® wipe to disinfect an on-meter dosed amperometric glucose-oxidase-based glucose meter will influence its performance. Method The performance of five on-meter dosed amperometric glucose-oxidase-based glucose meters was determined before and after disinfecting the devices with a single application of either CaviWipes® (14.3% isopropanol and 0.23% diisobutyl-phenoxy-ethoxyethyl dimethyl benzyl ammonium chloride) or Accel (0.5% hydrogen peroxide) wipes. Replicate glucose measurements were conducted before disinfecting the devices, immediately after disinfecting, and then 1 and 2 min postdisinfecting, with measurements in triplicate. Analysis was sequentially completed for five different meters. Results were analyzed by a two-way analysis of variance (Analyze-it software). Results No clinical (<0.3 mmol/liter) or statistical differences (p > .05) in glucose concentration were detected when the on-meter dosed amperometric glucose-oxidase-based glucose meters were disinfected with either CaviWipes or Accel wipes and measured immediately or 1 or 2 min postdisinfecting. No clinically significant difference in glucose concentration was detected between meters (<0.3 mmol/liter). Conclusion The on-meter dosed glucose oxidase amperometric-based glucose meters are not analytically susceptible to interference by a single application of hydrogen-peroxide-containing Accel disinfectant wipes. PMID:22226263
Review of Software Platforms for Agent Based Models
2008-04-01
Platform summary (section, domain, scripting language): EINSTein, 4.3.2, battlefield, Python (optional, for batch runs); MANA, 4.3.3, battlefield, N/A; MASON, 4.3.4, general, Java; NetLogo, 4.3.5, general, Logo variant. ...through the use of relatively simple Python scripts. It also has built-in functions for parameter sweeps, and can plot the resulting fitness landscape... Nonetheless its ease of use, and support for automatic drawing of agents in 2D or 3D, make this a suitable platform for beginner programmers.
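The review notes that some platforms support batch runs and parameter sweeps through relatively simple Python scripts. A minimal sketch of such a sweep, using a toy random-walk model as a stand-in for a real agent-based simulation (all model names and parameters here are illustrative, not drawn from any of the surveyed platforms):

```python
import itertools
import random

def run_model(n_agents, move_prob, steps=100, seed=0):
    """Toy stand-in for one agent-based-model run: random walkers on a
    line; returns the mean final distance from the origin."""
    rng = random.Random(seed)
    positions = [0] * n_agents
    for _ in range(steps):
        for i in range(n_agents):
            if rng.random() < move_prob:
                positions[i] += rng.choice((-1, 1))
    return sum(abs(p) for p in positions) / n_agents

# Sweep a 2-D parameter grid and record the resulting "fitness landscape".
grid = itertools.product([10, 50], [0.2, 0.5, 0.8])
landscape = {(n, p): run_model(n, p) for n, p in grid}
for (n, p), fitness in sorted(landscape.items()):
    print(f"agents={n:3d} move_prob={p:.1f} -> {fitness:.2f}")
```

A real platform would replace `run_model` with a headless invocation of the simulation engine and typically average over multiple random seeds per grid point.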
Automated Construction of Node Software Using Attributes in a Ubiquitous Sensor Network Environment
Lee, Woojin; Kim, Juil; Kang, JangMook
2010-01-01
In sensor networks, nodes must often operate in a demanding environment, facing restrictions such as restricted computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN) applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications using attributes. In the proposed technique, application construction proceeds by first developing a model for the sensor network and then designing node software by setting the values of the predefined attributes. After that, the sensor network model and the design of the node software are verified. The final source code of the node software is automatically generated from the sensor network model. We illustrate the efficiency of the proposed technique with a gas/light monitoring application, through a case study of a Gas and Light Monitoring System based on the Nano-Qplus operating system. We evaluate the technique using a quantitative metric: the memory size of execution code for node software. Using the proposed approach, developers are able to easily construct sensor network applications and rapidly generate a large amount of node software at a time in a ubiquitous sensor network environment. PMID:22163678
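The abstract describes generating node source code from a sensor-network model plus predefined attribute values. A minimal sketch of attribute-driven code generation, assuming a hypothetical attribute schema and template (the actual Nano-Qplus attributes and generator are not given in the abstract):

```python
from string import Template

# Hypothetical attribute set for one node; the names are illustrative,
# not the paper's Nano-Qplus schema.
node_attrs = {
    "node_id": 3,
    "sensor": "gas",
    "sample_period_ms": 500,
    "alarm_threshold": 0.7,
}

NODE_TEMPLATE = Template("""\
# auto-generated node software for node $node_id
SENSOR = "$sensor"
PERIOD_MS = $sample_period_ms
THRESHOLD = $alarm_threshold

def step(reading):
    return "ALARM" if reading > THRESHOLD else "OK"
""")

# Substitute attribute values into the template to produce node source,
# then load it (a stand-in for compiling and flashing to the node).
source = NODE_TEMPLATE.substitute(node_attrs)
namespace = {}
exec(source, namespace)
print(namespace["step"](0.9))   # -> ALARM
```

The real approach additionally verifies the sensor-network model and node design before generation; that step is omitted here.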
NASA Astrophysics Data System (ADS)
Brambilla, Marco; Ceri, Stefano; Valle, Emanuele Della; Facca, Federico M.; Tziviskou, Christina
Although Semantic Web Services are expected to produce a revolution in the development of Web-based systems, very few enterprise-wide design experiences are available; one of the main reasons is the lack of sound Software Engineering methods and tools for the deployment of Semantic Web applications. In this chapter, we present an approach to software development for the Semantic Web based on classical Software Engineering methods (i.e., formal business process development, computer-aided and component-based software design, and automatic code generation) and on semantic methods and tools (i.e., ontology engineering, semantic service annotation and discovery).
Pynamic: the Python Dynamic Benchmark
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, G L; Ahn, D H; de Supinksi, B R
2007-07-10
Python is widely used in scientific computing to facilitate application development and to support features such as computational steering. Making full use of some of Python's popular features, which improve programmer productivity, leads to applications that access extremely high numbers of dynamically linked libraries (DLLs). As a result, some important Python-based applications severely stress a system's dynamic linking and loading capabilities and also cause significant difficulties for most development environment tools, such as debuggers. Furthermore, using the Python paradigm for large scale MPI-based applications can create significant file IO and further stress tools and operating systems. In this paper, we present Pynamic, the first benchmark program to support configurable emulation of a wide range of the DLL usage of Python-based applications for large scale systems. Pynamic has already accurately reproduced system software and tool issues encountered by important large Python-based scientific applications on our supercomputers. Pynamic provided insight for our system software and tool vendors, and our application developers, into the impact of several design decisions. As we describe the Pynamic benchmark, we will highlight some of the issues discovered in our large scale system software and tools using Pynamic.
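Pynamic itself generates and links large numbers of C shared libraries; as a loose illustration of the start-up pattern it emulates, the sketch below generates and dynamically imports many trivial Python modules (a stand-in for the DLL-heavy load phase, not Pynamic's actual mechanism):

```python
import importlib
import os
import sys
import tempfile
import time

# Emulate, very loosely, an application that pulls in many dynamically
# loaded modules at start-up. Plain Python modules stand in for the
# C shared libraries that Pynamic generates.
n_modules = 50
workdir = tempfile.mkdtemp()
sys.path.insert(0, workdir)

for i in range(n_modules):
    with open(os.path.join(workdir, f"libmod_{i}.py"), "w") as f:
        f.write(f"VALUE = {i}\n")

start = time.perf_counter()
mods = [importlib.import_module(f"libmod_{i}") for i in range(n_modules)]
elapsed = time.perf_counter() - start

total = sum(m.VALUE for m in mods)
print(f"loaded {len(mods)} modules in {elapsed * 1e3:.1f} ms, total={total}")
```

Scaling `n_modules` into the thousands is what stresses the loader and debugger machinery the abstract describes.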
Flight Software Development for the CHEOPS Instrument with the CORDET Framework
NASA Astrophysics Data System (ADS)
Cechticky, V.; Ottensamer, R.; Pasetti, A.
2015-09-01
CHEOPS is an ESA S-class mission dedicated to the precise measurement of radii of already known exoplanets using ultra-high precision photometry. The instrument flight software controlling the instrument and handling the science data is developed by the University of Vienna using the CORDET Framework offered by P&P Software GmbH. The CORDET Framework provides a generic software infrastructure for PUS-based applications. This paper describes how the framework is used for the CHEOPS application software to provide a consistent solution to the communication and control services, event handling and FDIR procedures. This approach is innovative in four respects: (a) it is a true third-party reuse; (b) re-use is done at specification, validation and code level; (c) the re-usable assets and their qualification data package are entirely open-source; (d) re-use is based on call-back, with the application developer providing functions which are called by the reusable architecture.
NASA Astrophysics Data System (ADS)
Lanciotti, E.; Merino, G.; Bria, A.; Blomer, J.
2011-12-01
In a distributed computing model such as WLCG, experiment-specific application software has to be efficiently distributed to any site of the Grid. Application software is currently installed in a shared area of the site, visible to all Worker Nodes (WNs) of the site through some protocol (NFS, AFS or other). The software is installed at the site by jobs which run on a privileged node of the computing farm where the shared area is mounted in write mode. This model presents several drawbacks which cause a non-negligible rate of job failure. An alternative model for software distribution based on the CERN Virtual Machine File System (CernVM-FS) has been tried at PIC, the Spanish Tier-1 site of WLCG. The test bed used and the results are presented in this paper.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akyol, Bora A.; Allwardt, Craig H.; Beech, Zachary W.
VOLTTRON is a flexible, reliable, and scalable platform for distributed control and sensing. VOLTTRON serves in four primary roles: • A reference platform for researchers to quickly develop control applications for transactive energy. • A reference platform with flexible data store support for energy analytics applications, whether in academia or in commercial enterprise. • A platform from which commercial enterprises can develop products without license issues and easily integrate into their product lines. • An accelerator to drive industry adoption of transactive energy and advanced building energy analytics. Pacific Northwest National Laboratory, with funding from the U.S. Department of Energy's Building Technologies Office, developed and maintains VOLTTRON as an open-source community project. VOLTTRON source code includes agent execution software; agents that perform critical services that enable and enhance VOLTTRON functionality; and numerous agents that utilize the platform to perform a specific function (fault detection, demand response, etc.). The platform supports energy, operational, and financial transactions between networked entities (equipment, organizations, buildings, grid, etc.) and enhances the control infrastructure of existing buildings through the use of open-source device communication, control protocols, and integrated analytics.
MathBrowser: Web-Enabled Mathematical Software with Application to the Chemistry Curriculum, v 1.0
NASA Astrophysics Data System (ADS)
Goldsmith, Jack G.
1997-10-01
MathSoft: Cambridge, MA, 1996; free via ftp from www.mathsoft.com. The movement to provide computer-based applications in chemistry has come to focus on three main areas: software aimed at specific applications (drawing, simulation, data analysis, etc.), multimedia applications designed to assist in the presentation of conceptual information, and packages to be used in conjunction with a particular textbook at a specific point in the chemistry curriculum. The result is a situation where no single software package devoted to problem solving can be used across a large segment of the curriculum. Adoption of World Wide Web (WWW) technology by a manufacturer of mathematical software, however, has produced software that provides an attractive means of providing a problem-solving resource to students in courses from freshman through senior level.
A Critical Review of Mode of Action (MOA) Assignment ...
There are various structure-based classification schemes to categorize chemicals based on mode of action (MOA), which have been applied in both ecological and human health toxicology. With increasing calls to assess thousands of chemicals, some of which have little available information other than structure, a clear understanding of how each of these MOA schemes was devised, what information they are based on, and the limitations of each approach is critical. Several groups are developing low-tier methods to more easily classify or assess chemicals, using approaches such as the ecological threshold of concern (eco-TTC) and chemical activity. Evaluation of these approaches and determination of their domain of applicability is partly dependent on the MOA classification that is used. The most commonly used MOA classification schemes for ecotoxicology include Verhaar and Russom (included in ASTER), both of which are used to predict acute aquatic toxicity MOA. Verhaar is a QSAR-based system that classifies chemicals into one of 4 classes, with a 5th class specified for those chemicals that are not classified in the other 4. ASTER/Russom includes 8 classifications: narcotics (3 groups), oxidative phosphorylation uncouplers, respiratory inhibitors, electrophiles/proelectrophiles, AChE inhibitors, or CNS seizure agents. Other methodologies include TEST (Toxicity Estimation Software Tool), a computational chemistry-based application that allows prediction to one of 5 broad MOA
The evolution of gadolinium based contrast agents: from single-modality to multi-modality
NASA Astrophysics Data System (ADS)
Zhang, Li; Liu, Ruiqing; Peng, Hui; Li, Penghui; Xu, Zushun; Whittaker, Andrew K.
2016-05-01
Gadolinium-based contrast agents are extensively used as magnetic resonance imaging (MRI) contrast agents due to their outstanding signal enhancement and ease of chemical modification. However, it is increasingly recognized that information obtained from single-modality molecular imaging cannot satisfy the growing requirements on efficiency and accuracy for clinical diagnosis and medical research, due to limitations inherent in any single molecular imaging technique. To compensate for the deficiencies of single-function MRI contrast agents, the combination of multi-modality imaging has become a research hotspot in recent years. This review presents an overview of recent developments in the functionalization of gadolinium-based contrast agents and their use in biomedical applications.
An overview of platforms for cloud based development.
Fylaktopoulos, G; Goumas, G; Skolarikis, M; Sotiropoulos, A; Maglogiannis, I
2016-01-01
This paper provides an overview of the state of the art technologies for software development in cloud environments. The surveyed systems cover the whole spectrum of cloud-based development, including integrated programming environments, code repositories, software modeling, composition and documentation tools, and application management and orchestration. In this work we evaluate the existing cloud development ecosystem based on a wide number of characteristics, such as applicability (e.g. programming and database technologies supported), productivity enhancement (e.g. editor capabilities, debugging tools), support for collaboration (e.g. repository functionality, version control) and post-development application hosting, and we compare the surveyed systems. The survey shows that software engineering in the cloud era has made its initial steps, showing potential to provide concrete implementation and execution environments for cloud-based applications. However, a number of important challenges need to be addressed for this approach to be viable. These challenges are discussed in the article, and a conclusion is drawn that although several steps have been made, a compact and reliable solution does not yet exist.
Miniature Microwave Applicator for Murine Bladder Hyperthermia Studies
Salahi, Sara; Maccarini, Paolo F.; Rodrigues, Dario B.; Etienne, Wiguins; Landon, Chelsea D.; Inman, Brant A.; Dewhirst, Mark W.; Stauffer, Paul R.
2012-01-01
Purpose Novel combinations of heat with chemotherapeutic agents are often studied in murine tumor models. Currently, no device exists to selectively heat small tumors at depth in mice. In this project, we modelled, built and tested a miniature microwave heat applicator, the physical dimensions of which can be scaled to adjust the volume and depth of heating to focus on the tumor volume. Of particular interest is a device that can selectively heat murine bladder. Materials and Methods Using Avizo® segmentation software, we created a numerical mouse model based on micro-MRI scan data. The model was imported into HFSS™ simulation software and parametric studies were performed to optimize the dimensions of a water-loaded circular waveguide for selective power deposition inside a 0.15ml bladder. A working prototype was constructed operating at 2.45GHz. Heating performance was characterized by mapping fiber-optic temperature sensors along catheters inserted at depths of 0-1mm (subcutaneous), 2-3mm (vaginal), and 4-5mm (rectal) below the abdominal wall, with the mid-depth catheter adjacent to the bladder. Core temperature was monitored orally. Results Thermal measurements confirm the simulations which demonstrate that this applicator can provide local heating at depth in small animals. Measured temperatures in murine pelvis show well-localized bladder heating to 42-43°C while maintaining normothermic skin and core temperatures. Conclusions Simulation techniques facilitate the design optimization of microwave antennas for use in pre-clinical applications such as localized tumor heating in small animals. Laboratory measurements demonstrate the effectiveness of a new miniature water-coupled microwave applicator for localized heating of murine bladder. PMID:22690856
Brownian dynamics simulation of rigid particles of arbitrary shape in external fields.
Fernandes, Miguel X; de la Torre, José García
2002-12-01
We have developed a Brownian dynamics simulation algorithm to generate Brownian trajectories of an isolated, rigid particle of arbitrary shape in the presence of electric fields or any other external agents. Starting from the generalized diffusion tensor, which can be calculated with the existing HYDRO software, the new program BROWNRIG (including a case-specific subprogram for the external agent) carries out a simulation that is analyzed later to extract the observable dynamic properties. We provide a variety of examples of utilization of this method, which serve as tests of its performance, and also illustrate its applicability. Examples include free diffusion, transport in an electric field, and diffusion in a restricting environment.
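The simulation described here follows the standard Brownian dynamics update: a deterministic drift proportional to D·F/kBT plus a Gaussian random displacement with variance 2DΔt. A one-dimensional free-diffusion sketch of that step (illustrative only; BROWNRIG works with the full generalized diffusion tensor computed by HYDRO):

```python
import random

def brownian_trajectory(d_coeff, force, dt, n_steps, kT=1.0, seed=1):
    """One 1-D Brownian dynamics trajectory: deterministic drift
    (D*F/kT)*dt plus a Gaussian step with variance 2*D*dt (the
    fluctuation-dissipation term for free diffusion)."""
    rng = random.Random(seed)
    x, path = 0.0, [0.0]
    sigma = (2.0 * d_coeff * dt) ** 0.5
    for _ in range(n_steps):
        x += (d_coeff * force / kT) * dt + rng.gauss(0.0, sigma)
        path.append(x)
    return path

# Sanity check: with no external agent (force = 0), the mean-squared
# displacement should grow as ~2*D*t.
D, dt, steps, runs = 0.5, 0.01, 1000, 200
msd = sum(brownian_trajectory(D, 0.0, dt, steps, seed=s)[-1] ** 2
          for s in range(runs)) / runs
print(f"MSD ~ {msd:.2f} (theory 2*D*t = {2 * D * dt * steps:.2f})")
```

The rigid-body version generalizes `x` to a 6-vector of position and orientation and `sigma` to the Cholesky factor of 2·D·Δt for the full diffusion tensor.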
Study of fault tolerant software technology for dynamic systems
NASA Technical Reports Server (NTRS)
Caglayan, A. K.; Zacharias, G. L.
1985-01-01
The major aim of this study is to investigate the feasibility of using systems-based failure detection, isolation and compensation (FDIC) techniques in building fault-tolerant software and extending them, whenever possible, to the domain of software fault tolerance. First, it is shown that systems-based FDIC methods can be extended to develop software error detection techniques by using system models for software modules. In particular, it is demonstrated that systems-based FDIC techniques can yield consistency checks that are easier to implement than acceptance tests based on software specifications. Next, it is shown that systems-based failure compensation techniques can be generalized to the domain of software fault tolerance in developing software error recovery procedures. Then, the feasibility of using fault-tolerant software in flight software is investigated. In particular, possible system and version instabilities, and functional performance degradation that may occur in N-version programming applications to flight software, are illustrated. Finally, a comparative analysis of N-version and recovery block techniques in the context of generic blocks in flight software is presented.
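The N-version programming scheme mentioned above can be illustrated with a majority-voting executive. This is a generic sketch, not the study's implementation; the function names and the seeded bug are invented for illustration:

```python
from collections import Counter

def n_version_execute(versions, x):
    """Run independently developed versions of the same function and
    accept the majority result; disagreeing versions are reported so a
    recovery procedure could isolate the faulty module."""
    results = [v(x) for v in versions]
    winner, _votes = Counter(results).most_common(1)[0]
    faulty = [i for i, r in enumerate(results) if r != winner]
    return winner, faulty

# Three illustrative "versions" of a saturation function; the third
# contains a deliberately seeded sign bug.
v1 = lambda x: max(-1.0, min(1.0, x))
v2 = lambda x: max(-1.0, min(1.0, x))
v3 = lambda x: max(-1.0, min(1.0, -x))   # faulty version

result, faulty = n_version_execute([v1, v2, v3], 0.3)
print(result, faulty)   # -> 0.3 [2]
```

Exact-equality voting is the simplest case; real flight-software voters typically compare within a tolerance, which is one source of the version instabilities the study analyzes.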
A Unified Algebraic and Logic-Based Framework Towards Safe Routing Implementations
2015-08-13
Software-defined networks (SDN). We developed a declarative platform for implementing SDN protocols using declarative... and debugging several SDN applications. Example-based SDN synthesis: the recent emergence of software-defined networks offers an opportunity to design... domain of software-defined networks (SDN), using declarative networking.
Multi-agent robotic systems and applications for satellite missions
NASA Astrophysics Data System (ADS)
Nunes, Miguel A.
A revolution in the space sector is happening. It is expected that in the next decade there will be more satellites launched than in the previous sixty years of space exploration. Major challenges are associated with this growth of space assets such as the autonomy and management of large groups of satellites, in particular with small satellites. There are two main objectives for this work. First, a flexible and distributed software architecture is presented to expand the possibilities of spacecraft autonomy and in particular autonomous motion in attitude and position. The approach taken is based on the concept of distributed software agents, also referred to as multi-agent robotic system. Agents are defined as software programs that are social, reactive and proactive to autonomously maximize the chances of achieving the set goals. Part of the work is to demonstrate that a multi-agent robotic system is a feasible approach for different problems of autonomy such as satellite attitude determination and control and autonomous rendezvous and docking. The second main objective is to develop a method to optimize multi-satellite configurations in space, also known as satellite constellations. This automated method generates new optimal mega-constellations designs for Earth observations and fast revisit times on large ground areas. The optimal satellite constellation can be used by researchers as the baseline for new missions. The first contribution of this work is the development of a new multi-agent robotic system for distributing the attitude determination and control subsystem for HiakaSat. The multi-agent robotic system is implemented and tested on the satellite hardware-in-the-loop testbed that simulates a representative space environment. The results show that the newly proposed system for this particular case achieves an equivalent control performance when compared to the monolithic implementation. 
In terms of computational efficiency, it is found that the multi-agent robotic system has a consistently lower CPU load of 0.29 +/- 0.03 compared to 0.35 +/- 0.04 for the monolithic implementation, a 17.1% reduction. The second contribution of this work is the development of a multi-agent robotic system for the autonomous rendezvous and docking of multiple spacecraft. To compute the maneuvers, guidance, navigation and control algorithms are implemented as part of the multi-agent robotic system. The navigation and control functions are implemented using existing algorithms, but one important contribution of this section is the introduction of a new six-degrees-of-freedom guidance method which is part of the guidance, navigation and control architecture. This new method is an explicit solution to the guidance problem, and is particularly useful for real-time guidance in attitude and position, as opposed to typical guidance methods which are based on numerical solutions and therefore are computationally intensive. A simulation scenario is run for docking four CubeSats deployed radially from a launch vehicle. Considering fully actuated CubeSats, the simulations show docking maneuvers that are successfully completed within 25 minutes, which is approximately 30% of a full orbital period in low Earth orbit. The final section investigates the problem of optimization of satellite constellations for fast revisit time, and introduces a new method to generate different constellation configurations that are evaluated with a genetic algorithm. Two case studies are presented. The first is the optimization of a constellation for rapid coverage of the oceans of the globe in 24 hours or less. Results show that for an 80 km sensor swath width, 50 satellites are required to cover the oceans with a 24 hour revisit time. The second constellation configuration study focuses on the optimization for the rapid coverage of the North Atlantic Tracks for air traffic monitoring in 3 hours or less. 
The results show that for a fixed swath width of 160 km and for a 3 hour revisit time 52 satellites are required.
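The constellation-optimization approach, evaluating candidate configurations with a genetic algorithm, can be sketched in miniature. Here satellite phase angles on a single circular orbit are evolved to minimize the largest angular gap, a crude proxy for worst-case revisit time; all parameters and the fitness function are illustrative, not the dissertation's method:

```python
import random

def max_gap(phases):
    """Largest angular gap (deg) between satellites on one orbit: a
    crude stand-in for worst-case revisit time over a ground track."""
    p = sorted(x % 360 for x in phases)
    return max((p[(i + 1) % len(p)] - p[i]) % 360 for i in range(len(p)))

def evolve(n_sats=8, pop=40, gens=60, seed=3):
    """Toy genetic algorithm: elitist selection, one-point crossover on
    the phase list, and random-reset mutation."""
    rng = random.Random(seed)
    popn = [[rng.uniform(0, 360) for _ in range(n_sats)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=max_gap)           # lower max gap = fitter
        elite = popn[: pop // 4]
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_sats)
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:       # mutation
                child[rng.randrange(n_sats)] = rng.uniform(0, 360)
            children.append(child)
        popn = elite + children
    return min(popn, key=max_gap)

best = evolve()
print(f"best worst-case gap: {max_gap(best):.1f} deg (ideal 45.0)")
```

A realistic fitness function would propagate each candidate constellation and measure revisit gaps over the target ground area, which is far more expensive per evaluation but slots into the same evolutionary loop.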
Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.
2001-01-01
The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human centered design, development and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed, to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.
Data Centric Development Methodology
ERIC Educational Resources Information Center
Khoury, Fadi E.
2012-01-01
Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…
NIRP Core Software Suite v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitener, Dustin Heath; Folz, Wesley; Vo, Duong
The NIRP Core Software Suite is a core set of code that supports multiple applications. It includes miscellaneous base code for data objects, mathematical equations, and user interface components; the framework also includes several fully-developed software applications that exist as stand-alone tools to complement other applications. The stand-alone tools are described below. Analyst Manager: an application to manage contact information for people (analysts) that use the software products. This information is often included in generated reports and may be used to identify the owners of calculations. Radionuclide Viewer: an application for viewing the DCFPAK radiological data; complements the Mixture Manager tool. Mixture Manager: an application to create and manage radionuclide mixtures that are commonly used in other applications. High Explosive Manager: an application to manage explosives and their properties. Chart Viewer: an application to view charts of data (e.g. meteorology charts). Other applications may use this framework to create charts specific to their data needs.
Biocompatible blood pool MRI contrast agents based on hyaluronan
Zhu, Wenlian; Artemov, Dmitri
2010-01-01
Biocompatible gadolinium blood pool contrast agents based on a biopolymer, hyaluronan, were investigated for magnetic resonance angiography application. Hyaluronan, a non-sulfated linear glycosaminoglycan composed of 2000–25,000 repeating disaccharide subunits of D-glucuronic acid and N-acetylglucosamine with molecular weight up to 20 MDa, is a major component of the extracellular matrix. Two gadolinium contrast agents based on 16 and 74 kDa hyaluronan were synthesized, both with R1 relaxivity around 5 mM−1 s−1 per gadolinium at 9.4 T at 25°C. These two hyaluronan based agents show significant enhancement of the vasculature for an extended period of time. Initial excretion was primarily through the renal system. Later uptake was observed in the stomach and lower gastrointestinal tract. Macromolecular hyaluronan-based gadolinium agents have a high clinical translation potential as hyaluronan is already approved by FDA for a variety of medical applications. PMID:21504061
Software for Partly Automated Recognition of Targets
NASA Technical Reports Server (NTRS)
Opitz, David; Blundell, Stuart; Bain, William; Morris, Matthew; Carlson, Ian; Mangrich, Mark; Selinsky, T.
2002-01-01
The Feature Analyst is a computer program for assisted (partially automated) recognition of targets in images. This program was developed to accelerate the processing of high-resolution satellite image data for incorporation into geographic information systems (GIS). This program creates an advanced user interface that embeds proprietary machine-learning algorithms in commercial image-processing and GIS software. A human analyst provides samples of target features from multiple sets of data, then the software develops a data-fusion model that automatically extracts the remaining features from selected sets of data. The program thus leverages the natural ability of humans to recognize objects in complex scenes, without requiring the user to explain the human visual recognition process by means of lengthy software. Two major subprograms are the reactive agent and the thinking agent. The reactive agent strives to quickly learn the user's tendencies while the user is selecting targets and to increase the user's productivity by immediately suggesting the next set of pixels that the user may wish to select. The thinking agent utilizes all available resources, taking as much time as needed, to produce the most accurate autonomous feature-extraction model possible.
The new meaning of quality in the information age.
Prahalad, C K; Krishnan, M S
1999-01-01
Software applications are now a mission-critical source of competitive advantage for most companies. They are also a source of great risk, as the Y2K bug has made clear. Yet many line managers still haven't confronted software issues--partly because they aren't sure how best to define the quality of the applications in their IT infrastructures. Some companies such as Wal-Mart and the Gap have successfully integrated the software in their networks, but most have accumulated an unwieldy number of incompatible applications--all designed to perform the same tasks. The authors provide a framework for measuring the performance of software in a company's IT portfolio. Quality traditionally has been measured according to a product's ability to meet certain specifications; other views of quality have emerged that measure a product's adaptability to customers' needs and a product's ability to encourage innovation. To judge software quality properly, argue the authors, managers must measure applications against all three approaches. Understanding the domain of a software application is an important part of that process. The domain is the body of knowledge about a user's needs and expectations for a product. Software domains change frequently based on how a consumer chooses to use, for example, Microsoft Word or a spreadsheet application. The domain can also be influenced by general changes in technology, such as the development of a new software platform. Thus, applications can't be judged only according to whether they conform to specifications. The authors discuss how to identify domain characteristics and software risks and suggest ways to reduce the variability of software domains.
A SOA-based approach to geographical data sharing
NASA Astrophysics Data System (ADS)
Li, Zonghua; Peng, Mingjun; Fan, Wei
2009-10-01
In the last few years, large volumes of spatial data have become available in different government departments in China, but these data are mainly used within those departments. With the e-government project initiated, spatial data sharing becomes more and more necessary. Currently, the Web is used not only for document searching but also for the provision and use of services, known as Web services, which are published in a directory and may be automatically discovered by software agents. Particularly in the spatial domain, the possibility of accessing these large spatial datasets via Web services has motivated research into the new field of Spatial Data Infrastructure (SDI) implemented using service-oriented architecture. In this paper a Service-Oriented Architecture (SOA) based Geographical Information System (GIS) is proposed, and a prototype system is deployed based on Open Geospatial Consortium (OGC) standards in Wuhan, China, so that all authorized departments can access the spatial data within the government intranet and these spatial data can be easily integrated into various kinds of applications.
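The OGC-standard access described above ultimately comes down to well-formed HTTP requests. A minimal Python sketch of assembling a WMS GetMap request URL; the endpoint, layer name and bounding box are invented, while the parameter names follow the OGC WMS 1.3.0 convention:

```python
from urllib.parse import urlencode

def build_getmap_url(base_url, layer, bbox, width, height,
                     crs="EPSG:4326", fmt="image/png", version="1.3.0"):
    """Assemble an OGC WMS GetMap request URL.

    `base_url` and `layer` are hypothetical; the query parameter
    names follow the OGC WMS 1.3.0 specification.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical intranet endpoint serving a land-use layer over Wuhan.
url = build_getmap_url("http://gis.example.gov/wms", "landuse",
                       (30.0, 114.0, 31.0, 115.0), 512, 512)
```

Any authorized client on the intranet, GIS desktop or custom application alike, can consume such a URL, which is the interoperability point the SOA approach relies on.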
Validation of highly reliable, real-time knowledge-based systems
NASA Technical Reports Server (NTRS)
Johnson, Sally C.
1988-01-01
Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.
Static and Dynamic Verification of Critical Software for Space Applications
NASA Astrophysics Data System (ADS)
Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.
Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European driven global navigation system Galileo and its associated applications, e.g. air traffic management, vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long lifetime of operation, and being extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. Traditionally, however, these have been applied in isolation. One of the main reasons is the immaturity of this field with regard to its application to the growing software content of space systems. This paper presents an innovative way of combining static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of Software FMEA and FTA with fault-injection techniques.
The case study described herein is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA, and the Xception tool for fault injection. Keywords: Verification & Validation, RAMS, Onboard software, SFMEA, SFTA, Fault-injection. 1 This work is being performed under the project STADY (Applied Static And Dynamic Verification Of Critical Software), ESA/ESTEC Contract Nr. 15751/02/NL/LvH.
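The dynamic, fault-injection side of such a combined approach can be illustrated with a toy harness; this is not the Xception tool's interface, and the bit-flip model and guarded routine below are invented for illustration. Faults are injected into an otherwise healthy input, and the campaign counts how many are caught by a defensive check:

```python
import random

def inject_bit_flip(value: int, bit: int) -> int:
    """Simulate a single-event upset by flipping one bit of an integer."""
    return value ^ (1 << bit)

def guarded_divide(num: int, den: int) -> float:
    """Toy 'onboard' routine with a defensive check (the fault detector)."""
    if den == 0:
        raise ValueError("fault detected: zero divisor")
    return num / den

def run_campaign(trials: int, seed: int = 42) -> int:
    """Inject random bit flips into a healthy divisor; count detections."""
    rng = random.Random(seed)
    detected = 0
    for _ in range(trials):
        den = inject_bit_flip(1, rng.randrange(8))  # corrupt the divisor
        try:
            guarded_divide(10, den)
        except ValueError:
            detected += 1
    return detected
```

The ratio of detected to injected faults is the kind of coverage figure that fault-injection campaigns feed back into the SFMEA/SFTA analyses.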
Software and package applicating for network meta-analysis: A usage-based comparative study.
Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao
2017-12-21
To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used the software and packages to work through a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, covering both programming and non-programming software. They were developed mainly on the basis of Bayesian or frequentist theory. Most types of software are easy to operate and master, calculate exactly, or produce excellent graphics. However, no single software performed accurate calculations with superior graphing; this could only be achieved by combining two or more types of software. This study suggests that users should choose the appropriate software according to their programming background, operational habits, and financial means, and then consider combining BUGS with R (or Stata) software to perform the NMA. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
Elements of decisional dynamics: An agent-based approach applied to artificial financial market
NASA Astrophysics Data System (ADS)
Lucas, Iris; Cotsaftis, Michel; Bertelle, Cyrille
2018-02-01
This paper introduces an original mathematical formalization for describing agents' decision-making processes in problems affected by both individual and collective behaviors in systems characterized by nonlinear, path-dependent, and self-organizing interactions. An application to artificial financial markets is proposed by designing a multi-agent system based on the proposed formalization. In this application, the agents' decision-making process is based on fuzzy logic rules, and the price dynamics is purely deterministic, following the basic matching rules of a central order book. Finally, with most parameters under evolutionary control, the computational agent-based system is able to replicate several stylized facts of financial time series (distributions of stock returns showing a heavy tail with positive excess kurtosis, absence of autocorrelations in stock returns, and the volatility clustering phenomenon).
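The abstract does not spell out the matching rules; a minimal sketch of the kind of central order book such simulations typically use, with price-time priority matching, might look as follows (Python, all names invented):

```python
import heapq

class OrderBook:
    """Minimal central order book with price-time priority matching."""

    def __init__(self):
        self._bids = []  # max-heap via negated price
        self._asks = []  # min-heap
        self._seq = 0    # arrival order breaks price ties

    def submit(self, side: str, price: float, qty: int):
        """Add an order and return the list of (price, qty) trades it triggers."""
        self._seq += 1
        if side == "buy":
            heapq.heappush(self._bids, (-price, self._seq, qty))
        else:
            heapq.heappush(self._asks, (price, self._seq, qty))
        return self._match()

    def _match(self):
        trades = []
        while self._bids and self._asks and -self._bids[0][0] >= self._asks[0][0]:
            bp, bseq, bqty = self._bids[0]
            ap, aseq, aqty = self._asks[0]
            qty = min(bqty, aqty)
            price = ap if aseq < bseq else -bp  # the resting order sets the price
            trades.append((price, qty))
            heapq.heappop(self._bids); heapq.heappop(self._asks)
            if bqty > qty:  # re-queue any unfilled remainder
                heapq.heappush(self._bids, (bp, bseq, bqty - qty))
            if aqty > qty:
                heapq.heappush(self._asks, (ap, aseq, aqty - qty))
        return trades
```

Because matching is fully deterministic, all price variability in such a model comes from the agents' (here, fuzzy-rule-driven) order submissions, which is what makes the replicated stylized facts attributable to agent behavior.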
Vignally, P; Fondi, G; Taggi, F; Pitidis, A
2011-03-31
In Italy the European Union Injury Database reports the involvement of chemical products in 0.9% of home and leisure accidents. The Emergency Department registry on domestic accidents in Italy and the Poison Control Centres record that 90% of cases of exposure to toxic substances occur in the home. It is not rare for the effects of chemical agents to be observed in hospitals, with a high potential risk of damage - the rate of this cause of hospital admission is double the domestic injury average. The aim of this study was to monitor the effects of injuries caused by caustic agents in Italy using automatic free-text recognition in Emergency Department medical databases. We created a Stata software program to automatically identify caustic or corrosive injury cases using an agent-specific list of keywords. We focused attention on the procedure's sensitivity and specificity. Ten hospitals in six regions of Italy participated in the study. The program identified 112 cases of injury by caustic or corrosive agents. Checking the cases by quality controls (based on manual reading of ED reports), we assessed 99 cases as true positive, i.e. 88.4% of the patients were automatically recognized by the software as being affected by caustic substances (99% CI: 80.6%- 96.2%), that is to say 0.59% (99% CI: 0.45%-0.76%) of the whole sample of home injuries, a value almost three times as high as that expected (p < 0.0001) from European codified information. False positives were 11.6% of the recognized cases (99% CI: 5.1%- 21.5%). Our automatic procedure for caustic agent identification proved to have excellent product recognition capacity with an acceptable level of excess sensitivity. Contrary to our a priori hypothesis, the automatic recognition system provided a level of identification of agents possessing caustic effects that was significantly much greater than was predictable on the basis of the values from current codifications reported in the European Database.
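The automatic recognition step described above is essentially keyword matching over free text, followed by a manual-review check of the flagged cases. A Python sketch of the idea (the study used a Stata program; the keyword list and example reports below are invented):

```python
import re

# Hypothetical agent-specific keyword list; the study's actual Stata
# keyword list is not reproduced in the abstract.
CAUSTIC_KEYWORDS = ["caustic", "corrosive", "bleach", "lye", "ammonia",
                    "drain cleaner", "sodium hydroxide"]

def flag_caustic(report: str, keywords=CAUSTIC_KEYWORDS) -> bool:
    """Flag an ED free-text report if any keyword occurs as a whole word/phrase."""
    text = report.lower()
    return any(re.search(r"\b" + re.escape(k) + r"\b", text) for k in keywords)

def positive_predictive_value(flagged_reports, manual_labels) -> float:
    """Share of automatically flagged cases confirmed true by manual review."""
    true_pos = sum(1 for _r, ok in zip(flagged_reports, manual_labels) if ok)
    return true_pos / len(flagged_reports)

reports = ["ingested drain cleaner at home", "fell off a ladder",
           "skin burn from bleach splash"]
flagged = [r for r in reports if flag_caustic(r)]
```

In the study's terms, the 88.4% figure is exactly this positive predictive value: flagged cases confirmed by manual reading of the ED reports, divided by all flagged cases.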
Open-Source web-based geographical information system for health exposure assessment
2012-01-01
This paper presents the design and development of an open source web-based Geographical Information System allowing users to visualise, customise and interact with spatial data within their web browser. The developed application shows that by using solely Open Source software it was possible to develop a customisable web based GIS application that provides functions necessary to convey health and environmental data to experts and non-experts alike without the requirement of proprietary software. PMID:22233606
NASA Astrophysics Data System (ADS)
Schmitz, Oliver; Beelen, Rob M. J.; de Bakker, Merijn P.; Karssenberg, Derek
2015-04-01
Constructing spatio-temporal numerical models to support risk assessment, such as assessing the exposure of humans to air pollution, often requires the integration of field-based and agent-based modelling approaches. Continuous environmental variables such as air pollution are best represented using the field-based approach, which considers phenomena as continuous fields having attribute values at all locations. When calculating human exposure to such pollutants it is, however, preferable to consider the population as a set of individuals, each with a particular activity pattern. This makes it possible to account for the spatio-temporal variation in a pollutant along the space-time paths travelled by individuals, determined, for example, by home and work locations, the road network, and travel times. Modelling this activity pattern requires an agent-based or individual-based modelling approach. In general, field- and agent-based models are constructed with the help of separate software tools, while both approaches should play together in an interacting way and preferably should be combined into one modelling framework, which would allow for efficient and effective implementation of models by domain specialists. To overcome this lack of integrated modelling frameworks, we aim at the development of concepts and software for an integrated field-based and agent-based modelling framework. Concepts merging field- and agent-based modelling were implemented by extending PCRaster (http://www.pcraster.eu), a field-based modelling library implemented in C++, with components for 1) representation of discrete, mobile agents, 2) spatial networks and algorithms, by integrating the NetworkX library (http://networkx.github.io), therefore allowing the calculation of e.g. shortest routes or total transport costs between locations, and 3) functions for field-network interactions, allowing field-based attribute values to be assigned to networks (i.e.
as edge weights), such as aggregated or averaged concentration values. We demonstrate the approach using six land use regression (LUR) models developed in the ESCAPE (European Study of Cohorts for Air Pollution Effects) project. These models calculate several air pollutants (e.g. NO2, NOx, PM2.5) for the entire Netherlands at a high (5 m) resolution. Using these air pollution maps, we compare the exposure of individuals calculated at the x, y locations of their homes and work places, and aggregated over the close surroundings of these locations. In addition, total exposure is accumulated over daily activity patterns, summing exposure at home, at the work place, and while travelling between home and workplace, by routing individuals over the Dutch road network using the shortest route. Finally, we illustrate how routes can be calculated with the minimum total exposure (instead of the shortest distance).
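The minimum-exposure routing described above is an ordinary shortest-path computation with pollutant dose, rather than distance, as the edge weight; in the framework itself this is delegated to NetworkX. A dependency-free sketch of the same computation with Dijkstra's algorithm, on an invented toy road network:

```python
import heapq

def dijkstra(graph, src, dst, weight):
    """Shortest path by Dijkstra; `graph[u]` maps neighbour -> attribute dict."""
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, attrs in graph[u].items():
            nd = d + attrs[weight]
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Toy network: 'length' in metres, 'exposure' = invented NO2 concentration
# integrated over the edge (what a field-to-network assignment would produce).
roads = {
    "home": {"a": {"length": 400, "exposure": 22000},
             "b": {"length": 500, "exposure": 10000}},
    "a":    {"home": {"length": 400, "exposure": 22000},
             "work": {"length": 400, "exposure": 24000}},
    "b":    {"home": {"length": 500, "exposure": 10000},
             "work": {"length": 500, "exposure": 11000}},
    "work": {"a": {"length": 400, "exposure": 24000},
             "b": {"length": 500, "exposure": 11000}},
}

shortest = dijkstra(roads, "home", "work", "length")    # minimise distance
cleanest = dijkstra(roads, "home", "work", "exposure")  # minimise pollutant dose
```

The two routes differ precisely when the shorter road is the more polluted one, which is the trade-off the ESCAPE-based demonstration illustrates.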
Agent-based user-adaptive service provision in ubiquitous systems
NASA Astrophysics Data System (ADS)
Saddiki, H.; Harroud, H.; Karmouch, A.
2012-11-01
With the increasing availability of smartphones, tablets and other computing devices, technology consumers have grown accustomed to performing all of their computing tasks anytime, anywhere and on any device. There is a greater need to support ubiquitous connectivity and accommodate users by providing software as network-accessible services. In this paper, we propose a MAS-based approach to adaptive service composition and provision that automates the selection and execution of a suitable composition plan for a given service. With agents capable of autonomous and intelligent behavior, the composition plan is selected in a dynamic negotiation driven by a utility-based decision-making mechanism; and the composite service is built by a coalition of agents each providing a component necessary to the target service. The same service can be built in variations for catering to dynamic user contexts and further personalizing the user experience. Also multiple services can be grouped to satisfy new user needs.
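The utility-based decision mechanism for picking a composition plan can be sketched as a simple argmax over candidate plans with context-dependent weights; the plan attributes and weights below are invented for illustration:

```python
# Hypothetical candidate composition plans for one target service.
PLANS = [
    {"name": "local-render", "cost": 0.2, "latency_ms": 120, "quality": 0.6},
    {"name": "cloud-render", "cost": 0.6, "latency_ms": 300, "quality": 0.9},
    {"name": "hybrid",       "cost": 0.4, "latency_ms": 180, "quality": 0.8},
]

def utility(plan, weights):
    """Higher is better: reward quality, penalise cost and latency."""
    return (weights["quality"] * plan["quality"]
            - weights["cost"] * plan["cost"]
            - weights["latency"] * plan["latency_ms"] / 1000.0)

def select_plan(plans, weights):
    """Pick the plan maximising utility for the current user context."""
    return max(plans, key=lambda p: utility(p, weights))

# A device on a slow network weights latency heavily.
mobile_ctx = {"quality": 1.0, "cost": 0.5, "latency": 2.0}
best = select_plan(PLANS, mobile_ctx)
```

Varying the weight vector per user context is what lets the same service be "built in variations", as the abstract puts it; in the full system this argmax is the outcome of a negotiation among agents rather than a single function call.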
Using Combined SFTA and SFMECA Techniques for Space Critical Software
NASA Astrophysics Data System (ADS)
Nicodemos, F. G.; Lahoz, C. H. N.; Abdala, M. A. D.; Saotome, O.
2012-01-01
This work addresses the combined Software Fault Tree Analysis (SFTA) and Software Failure Modes, Effects and Criticality Analysis (SFMECA) techniques applied to space critical software of satellite launch vehicles. The combined approach is under research as part of the Verification and Validation (V&V) efforts to increase software dependability and as future application in other projects under development at Instituto de Aeronáutica e Espaço (IAE). The applicability of such approach was conducted on system software specification and applied to a case study based on the Brazilian Satellite Launcher (VLS). The main goal is to identify possible failure causes and obtain compensating provisions that lead to inclusion of new functional and non-functional system software requirements.
Logistics Process Analysis Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
2008-03-31
LPAT is the integrated system resulting from the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.
A Framework for Performing Verification and Validation in Reuse Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1997-01-01
Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
ERIC Educational Resources Information Center
Curtis, Rick
This paper summarizes information about using computer hardware and software to aid in making purchase decisions that are based on user needs. The two major options in hardware are IBM-compatible machines and the Apple Macintosh line. The three basic software applications include word processing, database management, and spreadsheet applications.…
Mother ship and physical agents collaboration
NASA Astrophysics Data System (ADS)
Young, Stuart H.; Budulas, Peter P.; Emmerman, Philip J.
1999-07-01
This paper discusses ongoing research at the U.S. Army Research Laboratory that investigates the feasibility of developing a collaboration architecture between small physical agents and a mother ship. This includes the distribution of planning, perception, mobility, processing and communications requirements between the mother ship and the agents. Small physical agents of the future will be virtually everywhere on the battlefield of the 21st century. A mother ship that is coupled to a team of small collaborating physical agents (conducting tasks such as Reconnaissance, Surveillance, and Target Acquisition (RSTA); logistics; sentry; and communications relay) will be used to build a completely effective and mission-capable intelligent system. The mother ship must have long-range mobility to deploy the small, highly maneuverable agents that will operate in urban environments and more localized areas, and it acts as a logistics base for the smaller agents. The mother ship also establishes a robust communications network between the agents and is the primary information disseminating and receiving point to the external world. Because of its global knowledge and processing power, the mother ship does the high-level control and planning for the collaborative physical agents. This high-level control and interaction between the mother ship and its agents (including inter-agent collaboration) will be based on a software agent architecture. The mother ship incorporates multi-resolution battlefield visualization and analysis technology, which aids in mission planning and sensor fusion.
Effective Team Support: From Modeling to Software Agents
NASA Technical Reports Server (NTRS)
Remington, Roger W. (Technical Monitor); John, Bonnie; Sycara, Katia
2003-01-01
The purpose of this research contract was to perform multidisciplinary research between CMU psychologists, computer scientists and engineers and NASA researchers to design a next generation collaborative system to support a team of human experts and intelligent agents. To achieve robust performance enhancement of such a system, we had proposed to perform task and cognitive modeling to thoroughly understand the impact technology makes on the organization and on key individual personnel. Guided by cognitively-inspired requirements, we would then develop software agents that support the human team in decision making, information filtering, information distribution and integration to enhance team situational awareness. During the period covered by this final report, we made substantial progress in modeling infrastructure and task infrastructure. Work is continuing under a different contract to complete empirical data collection, cognitive modeling, and the building of software agents to support the team's tasks.
NASA Technical Reports Server (NTRS)
Remington, Roger W. (Technical Monitor); John, Bonnie E.; Sycara, Katia
2005-01-01
The purpose of this research contract was to perform multidisciplinary research between CMU psychologists, computer scientists and NASA researchers to design a next generation collaborative system to support a team of human experts and intelligent agents. To achieve robust performance enhancement of such a system, we had proposed to perform task and cognitive modeling to thoroughly understand the impact technology makes on the organization and on key individual personnel. Guided by cognitively-inspired requirements, we would then develop software agents that support the human team in decision making, information filtering, information distribution and integration to enhance team situational awareness. During the period covered by this final report, we made substantial progress in completing a system for empirical data collection, cognitive modeling, and the building of software agents to support a team's tasks, and in running experiments for the collection of baseline data.
Incorporating BDI Agents into Human-Agent Decision Making Research
NASA Astrophysics Data System (ADS)
Kamphorst, Bart; van Wissen, Arlette; Dignum, Virginia
Artificial agents, people, institutes and societies all have the ability to make decisions. Decision making as a research area therefore involves a broad spectrum of sciences, ranging from Artificial Intelligence to economics to psychology. The Colored Trails (CT) framework is designed to aid researchers in all fields in examining decision-making processes. It is developed both to study interaction between multiple actors (humans or software agents) in a dynamic environment, and to study and model the decision making of these actors. However, agents in the current implementation of CT lack the explanatory power to help understand the reasoning processes involved in decision making. The BDI paradigm, proposed in the agent research area to describe rational agents, enables the specification of agents that reason with abstract concepts such as beliefs, goals, plans and events. In this paper, we present CTAPL: an extension to CT that allows BDI software agents written in the practical agent programming language 2APL to reason about and interact with a CT environment.
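A minimal sketch of the BDI deliberation cycle that languages such as 2APL implement, specialized to a Colored-Trails-like setting; the goal names, plan bodies, and the assumption that executing a plan achieves its goal are all invented for illustration:

```python
class BDIAgent:
    """Toy BDI agent: beliefs, goals (desires), and a plan library."""

    def __init__(self, beliefs, goals, plan_library):
        self.beliefs = set(beliefs)       # propositions the agent holds true
        self.goals = list(goals)          # desires, in priority order
        self.plan_library = plan_library  # goal -> (precondition set, actions)

    def deliberate(self):
        """Commit to the first goal whose plan's precondition is believed."""
        for goal in self.goals:
            precond, actions = self.plan_library[goal]
            if precond <= self.beliefs:   # precondition satisfied
                return goal, actions
        return None, []

    def execute(self):
        """Run one cycle: deliberate, then (assume we) achieve the goal."""
        goal, actions = self.deliberate()
        if goal is not None:
            self.beliefs.add(goal)        # simplification: actions always succeed
            self.goals.remove(goal)
        return actions

agent = BDIAgent(
    beliefs={"has_chips"},
    goals=["trade_chips", "move_to_goal"],
    plan_library={
        "trade_chips": ({"has_chips"}, ["propose_exchange", "commit"]),
        "move_to_goal": ({"trade_chips"}, ["step_east", "step_east"]),
    },
)
first = agent.execute()   # trading chips comes first and enables moving
second = agent.execute()
```

The explanatory power the paper seeks comes from exactly this structure: the agent's behavior can be traced back to which beliefs satisfied which plan's precondition.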
AADL and Model-based Engineering
2014-10-20
Feiler, Oct 20, 2014. © 2014 Carnegie Mellon University. Fragmentary slide content: "We Rely on Software for Safe Aircraft Operation"; embedded software systems; developer, system engineer and hardware engineer roles; compute platform, runtime architecture and application software layers; data stream characteristics and latency; why system-level failures still occur despite fault tolerance techniques being deployed in systems.
System software for the finite element machine
NASA Technical Reports Server (NTRS)
Crockett, T. W.; Knott, J. D.
1985-01-01
The Finite Element Machine is an experimental parallel computer developed at Langley Research Center to investigate the application of concurrent processing to structural engineering analysis. This report describes system-level software which has been developed to facilitate use of the machine by applications researchers. The overall software design is outlined, and several important parallel processing issues are discussed in detail, including processor management, communication, synchronization, and input/output. Based on experience using the system, the hardware architecture and software design are critiqued, and areas for further work are suggested.
Using Modern Methodologies with Maintenance Software
NASA Technical Reports Server (NTRS)
Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.
2014-01-01
Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance. The MPS software is classified as either class B (mission critical) or class C (mission important). The scheduling of tasks is difficult because mission needs must be addressed before any other tasks, and those needs often spring up unexpectedly. Keeping track of the tasks everyone is working on is also difficult because each person is working on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In the Scrum development environment, teams pick the tasks to be completed within a sprint based on priority. The team specifies the sprint length, usually a month or less. Scrum is typically used for new development of one application. The Scrum methodology defines a scrum master, a facilitator who makes sure that everything moves smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for the Scrum methodology: MPS has many software applications in maintenance, team members who are working on disparate applications, many users, and is interruptible based on mission needs, issues and requirements. To use Scrum, the methodology needed adaptation to MPS. Scrum was chosen because it is adaptable. This paper is about the development of the process for using Scrum, a new development methodology, with a team that works on disparate interruptible tasks on multiple software applications.
The comparison of the use of holonic and agent-based methods in modelling of manufacturing systems
NASA Astrophysics Data System (ADS)
Foit, K.; Banaś, W.; Gwiazda, A.; Hryniewicz, P.
2017-08-01
The rapid evolution in the field of industrial automation and manufacturing is often called the 4th Industrial Revolution. Worldwide availability of internet access contributes to competition between manufacturers and creates opportunities for buying materials and parts and for creating partnership networks, such as cloud manufacturing, grid manufacturing (MGrid), virtual enterprises, etc. The effect of this evolution is the need to search for new solutions in the field of manufacturing systems modelling and simulation. During the last decade researchers have developed the agent-based approach to modelling. This methodology was taken from computer science but adapted to the philosophy of industrial automation and robotization. The operation of an agent-based system depends on the simultaneous acting of different agents that may have different roles. On the other hand, there is the holon-based approach, which uses structures created by holons. It differs from the agent-based structure in some aspects, while others are quite similar in both methodologies. The aim of this paper is to present both methodologies and discuss their similarities and differences. This could help in selecting the optimal method of modelling, according to the considered problem and software resources.
Online Maps and Cloud-Supported Location-Based Services across a Manifold of Devices
NASA Astrophysics Data System (ADS)
Kröpfl, M.; Buchmüller, D.; Leberl, F.
2012-07-01
Online mapping, miniaturization of computing devices, the "cloud", Global Navigation Satellite Systems (GNSS) and cell tower triangulation all coalesce into an entirely novel infrastructure for numerous innovative map applications. This impacts the planning of human activities, navigating and tracking these activities as they occur, and finally documenting their outcome for either a single user or a network of connected users in a larger context. In this paper, we provide an example of a simple geospatial application making use of this model, which we use to explain the basic steps necessary to deploy an application involving a web service hosting geospatial information and client software consuming the web service through an API. The application allows an insurance claim specialist to add claims, including a claim location, to a cloud-based database. A field agent then uses a smartphone application to query the database by proximity, and heads out to capture photographs as supporting documentation for the claim. Once the photos have been uploaded to the web service, a second web service for image matching is called to match the current photograph to previously submitted assets. Image matching is used as a pre-verification step to determine whether the coverage of the respective object is sufficient for the claim specialist to process the claim. The development of the application was based on Microsoft's® Bing Maps™, Windows Phone™, Silverlight™, Windows Azure™ and Visual Studio™, and was completed in approximately 30 labour hours split between two developers.
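The proximity query at the heart of the field agent's workflow can be sketched with a plain haversine distance filter (the claim records and radius here are illustrative; the actual application used the Bing Maps APIs):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS-84 points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def claims_near(claims, lat, lon, radius_km):
    """Return claim ids within radius_km of (lat, lon), nearest first."""
    hits = [(haversine_km(lat, lon, c["lat"], c["lon"]), c["id"]) for c in claims]
    return [cid for d, cid in sorted(hits) if d <= radius_km]

claims = [
    {"id": "A-17", "lat": 48.2082, "lon": 16.3738},  # Vienna
    {"id": "B-03", "lat": 47.0707, "lon": 15.4395},  # Graz
    {"id": "C-55", "lat": 48.3069, "lon": 14.2858},  # Linz
]
print(claims_near(claims, 48.2082, 16.3738, radius_km=150))
```

A production service would push this filter into the database (a spatial index) rather than scanning all claims, but the distance test is the same.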
NASA Astrophysics Data System (ADS)
Laracuente, Nicholas; Grossman, Carl
2013-03-01
We developed an algorithm and software to calculate autocorrelation functions from real-time photon-counting data using the fast, parallel capabilities of graphics processing units (GPUs). Recent developments in hardware and software have allowed for general-purpose computing with inexpensive GPU hardware. These devices are better suited to emulating hardware autocorrelators than traditional CPU-based software applications because they emphasize parallel throughput over sequential speed. Incoming data are binned in a standard multi-tau scheme with configurable points-per-bin size and are mapped into a GPU memory pattern to reduce time-expensive memory accesses. Applications include dynamic light scattering (DLS) and fluorescence correlation spectroscopy (FCS) experiments. We ran the software on a 64-core graphics PCI card in a computer with a 3.2 GHz Intel i5 CPU running Linux. FCS measurements were made on Alexa-546 and Texas Red dyes in a standard buffer (PBS). Software correlations were compared to hardware correlator measurements on the same signals. Supported by HHMI and Swarthmore College.
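The multi-tau binning scheme mentioned in the abstract can be sketched on the CPU in NumPy (a simplified reference implementation, not the authors' GPU code; the points-per-bin and level counts are illustrative):

```python
import numpy as np

def multi_tau_g2(counts, m=8, levels=5):
    """Simplified multi-tau autocorrelation: compute g2 at fine lags,
    then repeatedly sum adjacent bins so lag coverage grows geometrically
    at roughly constant cost per level."""
    x = np.asarray(counts, dtype=float)
    lags, g2 = [], []
    dt = 1  # current bin width in units of the original bin
    for level in range(levels):
        start = 1 if level == 0 else m // 2 + 1  # avoid re-computing covered lags
        for k in range(start, m + 1):
            if k >= x.size:
                break
            lags.append(k * dt)
            g2.append(np.mean(x[:-k] * x[k:]) / x.mean() ** 2)
        # coarsen: merge adjacent bins, doubling the bin width for the next level
        half = x.size // 2
        x = x[: 2 * half].reshape(half, 2).sum(axis=1)
        dt *= 2
    return np.array(lags), np.array(g2)

# uncorrelated shot noise: g2 should hover near 1 at all lags
rng = np.random.default_rng(0)
lags, g2 = multi_tau_g2(rng.poisson(5.0, size=20000))
print(lags[:6], np.round(g2[:3], 2))
```

The GPU version parallelizes the products inside each `np.mean` across cores; the binning logic is unchanged.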
Training Programs in Applications Software.
ERIC Educational Resources Information Center
Modianos, Doan T.; Cornwell, Larry W.
1988-01-01
Description of training programs for using business applications software highlights implementing programs for Lotus 1-2-3 and dBASE III Plus. The amount of computer experience of the users and the difference in training methods needed are discussed, and the use of a Macintosh computer for producing notes is explained. (LRW)
Application of Mobile Agents in Web-Based Learning Environment.
ERIC Educational Resources Information Center
Hong, Hong; Kinshuk; He, Xiaoqin; Patel, Ashok; Jesshope, Chris
Web-based learning environments are strongly driven by the information revolution and the Internet, but they have a number of common deficiencies, such as slow access, no adaptivity to the individual student, limitation by bandwidth, and more. This paper outlines the benefits of mobile agents technology, and describes its application in Web-based…
Advanced Software Development Workstation Project, phase 3
NASA Technical Reports Server (NTRS)
1991-01-01
ACCESS provides a generic capability to develop software information system applications which are explicitly intended to facilitate software reuse. In addition, it provides the capability to retrofit existing large applications with a user-friendly front end for preparation of input streams in a way that will reduce required training time, improve the productivity of even experienced users, and increase accuracy. Current and past work shows that ACCESS will be scalable to much larger object bases.
Software development for safety-critical medical applications
NASA Technical Reports Server (NTRS)
Knight, John C.
1992-01-01
There are many computer-based medical applications in which safety and not reliability is the overriding concern. Reduced, altered, or no functionality of such systems is acceptable as long as no harm is done. A precise, formal definition of what software safety means is essential, however, before any attempt can be made to achieve it. Without this definition, it is not possible to determine whether a specific software entity is safe. A set of definitions pertaining to software safety will be presented and a case study involving an experimental medical device will be described. Some new techniques aimed at improving software safety will also be discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cintuglu, Mehmet Hazar; Youssef, Tarek; Mohammed, Osama A.
2016-08-10
This article presents the development and application of a real-time testbed for multi-agent system interoperability. As utility-independent private microgrids are installed constantly, standardized interoperability frameworks are required to define behavioral models of the individual agents for expandability and plug-and-play operation. In this paper, we propose a comprehensive hybrid agent framework combining the Foundation for Intelligent Physical Agents (FIPA), IEC 61850, and Data Distribution Service (DDS) standards. The IEC 61850 logical node concept is extended using FIPA-based agent communication language (ACL) with application-specific attributes and deliberative behavior modeling capability. The DDS middleware is adopted to enable a real-time publisher-subscriber interoperability mechanism between platforms. The proposed multi-agent framework was validated in a laboratory-based testbed involving developed intelligent electronic device (IED) prototypes and actual microgrid setups. Experimental results were demonstrated for both decentralized and distributed control approaches. Secondary and tertiary control levels of a microgrid were demonstrated for a decentralized hierarchical control case study. A consensus-based economic dispatch case study was demonstrated as a distributed control example. Finally, it was shown that the developed agent platform is industrially applicable for actual smart grid field deployment.
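The publisher-subscriber mechanism at the core of the DDS-based design can be illustrated with a minimal topic-based bus (a generic pattern sketch, not the DDS API or the authors' framework; agent and topic names are hypothetical):

```python
from collections import defaultdict

class Bus:
    """Minimal topic-based publish-subscribe bus, standing in for DDS middleware."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to receive every message on the topic."""
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver the message to all subscribers of the topic."""
        for cb in self.subscribers[topic]:
            cb(message)

bus = Bus()
log = []
# two hypothetical microgrid agents subscribing to breaker-status updates
bus.subscribe("breaker/status", lambda msg: log.append(("agent-1", msg)))
bus.subscribe("breaker/status", lambda msg: log.append(("agent-2", msg)))
bus.publish("breaker/status", {"id": "XCBR1", "closed": True})
print(log)
```

Real DDS adds what this sketch omits: discovery across hosts, typed topics, and quality-of-service contracts (deadlines, reliability) that make the pattern usable for real-time control.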
Software Authority Transition through Multiple Distributors
Han, Kyusunk; Shon, Taeshik
2014-01-01
The rapid growth in the use of smartphones and tablets has changed the software distribution ecosystem. The trend today is to purchase software through application stores rather than from traditional offline markets. Smartphone and tablet users can install applications easily by purchasing from the online store deployed in their device. Several systems, such as Android or PC-based OS units, allow users to install software from multiple sources. Such openness, however, can promote serious threats, including malware and illegal usage. In order to prevent such threats, several stores use online authentication techniques. These methods can, however, also present a problem whereby even licensed users cannot use their purchased application. In this paper, we discuss these issues and provide an authentication method that will make purchased applications available to the registered user at all times. PMID:25143971
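One way to keep a purchased application verifiable without a live store connection, in the spirit of the problem the paper addresses, is a signed license token; a minimal HMAC sketch (the key handling and names are illustrative assumptions, not the authors' protocol):

```python
import hmac, hashlib

STORE_KEY = b"store-secret-key"  # held by the distributor (illustrative only)

def issue_license(user_id, app_id):
    """Distributor signs (user, app) at purchase time; the token travels with the app."""
    msg = f"{user_id}:{app_id}".encode()
    return hmac.new(STORE_KEY, msg, hashlib.sha256).hexdigest()

def verify_license(user_id, app_id, token):
    """Check the token against a freshly computed signature, in constant time."""
    expected = issue_license(user_id, app_id)
    return hmac.compare_digest(expected, token)

tok = issue_license("alice", "com.example.app")
print(verify_license("alice", "com.example.app", tok))
print(verify_license("mallory", "com.example.app", tok))
```

A deployed scheme would use an asymmetric signature instead, so the device stores only a public verification key rather than the store's secret, and verification stays offline without exposing the signing key.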
[Development and practice evaluation of blood acid-base imbalance analysis software].
Chen, Bo; Huang, Haiying; Zhou, Qiang; Peng, Shan; Jia, Hongyu; Ji, Tianxing
2014-11-01
To develop computer software for blood gas and acid-base imbalance analysis that systematically, rapidly, accurately, and automatically determines the type of acid-base imbalance, and to evaluate its clinical application. Using the VBA programming language, computer-aided diagnostic software for judging acid-base balance was developed. The clinical data of 220 patients admitted to the Second Affiliated Hospital of Guangzhou Medical University were retrospectively analyzed. Arterial blood gas values [pH value, HCO₃⁻, arterial partial pressure of carbon dioxide (PaCO₂)] and electrolyte data (Na⁺ and Cl⁻) were collected. Data were entered into the software for acid-base imbalance judgment. At the same time, the type of acid-base imbalance was determined manually using the H-H compensation formula. The consistency of judgment results from software and manual calculation was evaluated, and the judgment times of the two methods were compared. The clinical diagnoses of the types of acid-base imbalance for the 220 patients were: 65 normal cases, 90 cases of simple type, 41 of mixed type, and 24 of triplex type. Compared with manual calculation, the accuracy of the software's judgment was 100% for the normal and triplex types, 98.9% for the simple type, and 78.0% for the mixed type, for a total accuracy of 95.5%. The Kappa value between software and manual judgment was 0.935 (P=0.000), demonstrating very good consistency. The time for the software to determine acid-base imbalances was significantly shorter than manual judgment (seconds: 18.14 ± 3.80 vs. 43.79 ± 23.86, t=7.466, P=0.000), so the software method was much faster than the manual method.
Software judgment is rapid, accurate, and convenient; it can replace manual judgment, improve the work efficiency and quality of clinicians, and has great value for clinical application and promotion.
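The kind of rule-based judgment the software automates can be sketched with textbook threshold rules for the simple disturbances (the ranges and logic here are standard textbook values for illustration, not the paper's VBA implementation, which also applies H-H compensation formulas to resolve mixed and triplex types):

```python
def classify_simple(ph, hco3, paco2):
    """Classify a simple acid-base disturbance from arterial blood gas values.
    Textbook normal ranges: pH 7.35-7.45, HCO3- 22-26 mmol/L, PaCO2 35-45 mmHg."""
    if 7.35 <= ph <= 7.45 and 22 <= hco3 <= 26 and 35 <= paco2 <= 45:
        return "normal"
    if ph < 7.35:  # acidemia: metabolic if HCO3- is low, else respiratory
        return "metabolic acidosis" if hco3 < 22 else "respiratory acidosis"
    if ph > 7.45:  # alkalemia: metabolic if HCO3- is high, else respiratory
        return "metabolic alkalosis" if hco3 > 26 else "respiratory alkalosis"
    return "mixed/compensated (requires compensation formulas)"

print(classify_simple(7.30, 15, 30))
print(classify_simple(7.50, 30, 47))
```

Distinguishing mixed and triplex disturbances is exactly where the expected-compensation formulas come in, which is why the software's accuracy dropped for the mixed type.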
Flexible control techniques for a lunar base
NASA Technical Reports Server (NTRS)
Kraus, Thomas W.
1992-01-01
The fundamental elements found in every terrestrial control system can be employed in all lunar applications. These elements include sensors which measure physical properties, controllers which acquire sensor data and calculate a control response, and actuators which apply the control output to the process. The unique characteristics of the lunar environment will certainly require the development of new control system technology. However, weightlessness, harsh atmospheric conditions, temperature extremes, and radiation hazards will most significantly impact the design of sensors and actuators. The controller and associated control algorithms, which are the most complex element of any control system, can be derived in their entirety from existing technology. Lunar process control applications -- ranging from small-scale research projects to full-scale processing plants -- will benefit greatly from the controller advances being developed today. In particular, new software technology aimed at commercial process monitoring and control applications will almost completely eliminate the need for custom programs and the lengthy development and testing cycle they require. The applicability of existing industrial software to lunar applications has other significant advantages in addition to cost and quality. This software is designed to run on standard hardware platforms and takes advantage of existing LAN and telecommunications technology. Further, in order to exploit the existing commercial market, the software is being designed to be implemented by users of all skill levels -- typically users who are familiar with their process, but not necessarily with software or control theory. This means that specialized technical support personnel will not need to be on-hand, and the associated costs are eliminated. 
Finally, the latest industrial software designed for the commercial market is extremely flexible, in order to fit the requirements of many types of processing applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology which is being proven in commercial applications on Earth.
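The sensor-controller-actuator loop common to all such control systems reduces, in software, to a periodic control-law update; a generic discrete PID sketch (the gains and the toy first-order process are illustrative, not from any lunar design):

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*sum(e)*dt + Kd*de/dt."""
    def __init__(self, kp, ki, kd, setpoint, dt):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint, self.dt = setpoint, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measurement):
        """One controller step: read the sensor value, return the actuator command."""
        error = self.setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# regulate a toy first-order process toward a setpoint of 20 units
pid = PID(kp=0.8, ki=0.2, kd=0.05, setpoint=20.0, dt=1.0)
temp = 0.0
for _ in range(50):
    temp += pid.update(temp) * 0.5   # actuator's effect on the process each step
print(round(temp, 1))
```

The point of the abstract is that this controller layer is environment-independent: the lunar-specific engineering lives in the sensors and actuators feeding and driven by `update`.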
Detailed requirements document for common software of shuttle program information management system
NASA Technical Reports Server (NTRS)
Everette, J. M.; Bradfield, L. D.; Horton, C. L.
1975-01-01
Common software was investigated as a method for minimizing development and maintenance cost of the shuttle program information management system (SPIMS) applications while reducing the time-frame of their development. Those requirements satisfying these criteria are presented along with the stand-alone modules which may be used directly by applications. The SPIMS applications operating on the CYBER 74 computer, are specialized information management systems which use System 2000 as a data base manager. Common software provides the features to support user interactions on a CRT terminal using form input and command response capabilities. These features are available as subroutines to the applications.
IMS software developments for the detection of chemical warfare agent
NASA Technical Reports Server (NTRS)
Klepel, ST.; Graefenhain, U.; Lippe, R.; Stach, J.; Starrock, V.
1995-01-01
Interference compounds like gasoline, diesel, or smoke from burning wood or fuel are present in common battlefield situations. These compounds can cause detectors to respond with a false positive or interfere with the detector's ability to respond to target compounds such as chemical warfare agents. To ensure proper response of the ion mobility spectrometer to chemical warfare agents, two special software packages were developed and incorporated into the Bruker RAID-1. The programs suppress interfering signals caused by car exhaust or smoke gases resulting from burning materials and correct for the influence of variable sample gas humidity, which is important for the detection and quantification of blister agents like mustard gas or lewisite.
Software for Sharing and Management of Information
NASA Technical Reports Server (NTRS)
Chen, James R.; Wolfe, Shawn R.; Wragg, Stephen D.
2003-01-01
DIAMS is a set of computer programs that implements a system of collaborative agents that serve multiple, geographically distributed users communicating via the Internet. DIAMS provides a user interface as a Java applet that runs on each user's computer and works within the context of the user's Internet-browser software. DIAMS helps all its users to manage, gain access to, share, and exchange information in databases that they maintain on their computers. One of the DIAMS agents is a personal agent that helps its owner find information most relevant to current needs. It provides software tools and utilities for users to manage their information repositories with dynamic organization and virtual views. Capabilities for generating flexible hierarchical displays are integrated with capabilities for indexed-query searching to support effective access to information. Automatic indexing methods are employed to support users' queries and communication between agents. The catalog of a repository is kept in object-oriented storage to facilitate sharing of information. Collaboration between users is aided by matchmaker agents and by automated exchange of information. The matchmaker agents are designed to establish connections between users who have similar interests and expertise.
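The automatic indexing and indexed-query searching described above can be sketched as a tiny inverted index (a generic illustration, not DIAMS code; the documents are made up):

```python
from collections import defaultdict

def build_index(docs):
    """Automatic indexing: map each term to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def query(index, terms):
    """Conjunctive indexed query: ids of documents containing every term."""
    sets = [index.get(t.lower(), set()) for t in terms]
    return set.intersection(*sets) if sets else set()

docs = {
    "d1": "mobile agents share information repositories",
    "d2": "matchmaker agents connect users with similar interests",
    "d3": "hierarchical displays support information access",
}
print(sorted(query(build_index(docs), ["agents", "information"])))
```

The same index can back agent-to-agent communication: a matchmaker comparing two users' indexed interest terms is a set-intersection away from finding shared expertise.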
Next Generation Remote Agent Planner
NASA Technical Reports Server (NTRS)
Jonsson, Ari K.; Muscettola, Nicola; Morris, Paul H.; Rajan, Kanna
1999-01-01
In May 1999, as part of a unique technology validation experiment onboard the Deep Space One spacecraft, the Remote Agent became the first complete autonomous spacecraft control architecture to run as flight software onboard an active spacecraft. As one of the three components of the architecture, the Remote Agent Planner had the task of laying out the course of action to be taken, which included activities such as turning, thrusting, data gathering, and communicating. Building on the successful approach developed for the Remote Agent Planner, the Next Generation Remote Agent Planner is a completely redesigned and reimplemented version of the planner. The new system provides all the key capabilities of the original planner, while adding functionality, improving performance and providing a modular and extendible implementation. The goal of this ongoing project is to develop a system that provides both a basis for future applications and a framework for further research in the area of autonomous planning for spacecraft. In this article, we present an introductory overview of the Next Generation Remote Agent Planner. We present a new and simplified definition of the planning problem, describe the basics of the planning process, lay out the new system design and examine the functionality of the core reasoning module.
Development of Integrated Modular Avionics Application Based on Simulink and XtratuM
NASA Astrophysics Data System (ADS)
Fons-Albert, Borja; Usach-Molina, Hector; Vila-Carbo, Joan; Crespo-Lorente, Alfons
2013-08-01
This paper presents an integral approach to designing avionics applications that meets the requirements for software development and execution in this application domain. Software design follows the model-based design process and is performed in Simulink. This approach allows easy and quick testbench development and helps satisfy DO-178B requirements through the use of proper tools. The software execution platform is based on XtratuM, a minimal bare-metal hypervisor designed in our research group. XtratuM provides support for IMA-SP (Integrated Modular Avionics for Space) architectures. This approach allows the generated code of a Simulink model to be executed on top of Lithos as a XtratuM partition. Lithos is an ARINC-653-compliant RTOS for XtratuM. The paper concentrates on how to smoothly port Simulink designs to XtratuM, solving problems such as application partitioning, automatic code generation, real-time tasking, and interfacing. This process is illustrated with an autopilot design test using a flight simulator.
Design and Implementation of Embedded Computer Vision Systems Based on Particle Filters
2010-01-01
…methodology for hardware/software implementation of multi-dimensional particle filter applications, and we explore this in the third application, which is a 3D… and hence multiprocessor implementation of particle filters is an important option to examine. A significant body of work exists on optimizing generic…
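For orientation, the basic bootstrap particle filter that such implementations parallelize can be sketched in one dimension (a generic sketch; the motion model, noise levels, and particle count are illustrative, not from the report):

```python
import random, math

def particle_filter_step(particles, control, measurement, meas_std):
    """One bootstrap particle-filter step: predict, weight, resample (1-D)."""
    # predict: propagate each particle through a noisy motion model
    particles = [p + control + random.gauss(0, 0.1) for p in particles]
    # weight: Gaussian likelihood of the measurement given each particle
    weights = [math.exp(-0.5 * ((measurement - p) / meas_std) ** 2) for p in particles]
    # resample: draw a new particle set in proportion to the weights
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(1)
particles = [random.uniform(-5, 5) for _ in range(500)]
truth = 0.0
for _ in range(10):
    truth += 0.5                       # the tracked object moves 0.5 per step
    z = truth + random.gauss(0, 0.3)   # noisy sensor reading
    particles = particle_filter_step(particles, 0.5, z, 0.3)
estimate = sum(particles) / len(particles)
print(round(estimate, 2), "vs truth", truth)
```

The predict and weight loops are independent per particle, which is why hardware and multiprocessor mappings of particle filters pay off; resampling is the step that forces communication between processing elements.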
Galván, Pedro; Cane, Virgilio; Samudio, Margarita; Cabello, Agueda; Cabral, Margarita; Basogain, Xavier; Rivas, Ronald; Hilario, Enrique
2014-01-01
To report preliminary results of the application of the BONIS system in community tele-epidemiological surveillance in Paraguay. A study of viability and implementation was carried out in the Family Health Unit located in Bañado Sur in the city of Asunción by the Paraguay River. The system automatically records personal data and symptoms of individuals who make telephone reports, and suspected cases of dengue are classified and prioritized. This information goes to community agents for follow-up and to specialists in charge of epidemiological surveillance. From April 2010 to August 2011, 1 028 calls to the system were logged. Of 157 reported cases of fever, home visits were made to 140 (89.2%); of these, fever and headache or body ache were confirmed in 52 (37.1%) cases, and headache or body ache without fever in 58 (41.4%) cases. Community agents referred 49 (35.0%) of them for medical consultation and blood tests, and took blood samples in the homes of 19; of those tested, 56 (82.3%) were positive for dengue and 12 (17.4%) for influenza. Paraguay has a low-cost community tele-epidemiological surveillance system based on information and communication technologies and open-source software, which is scalable to other health symptoms and disorders of interest. To enable its acceptance and application, education programs should be developed to strengthen the management and promotion of community health.
Modeling and Analysis of Space Based Transceivers
NASA Technical Reports Server (NTRS)
Moore, Michael S.; Price, Jeremy C.; Reinhart, Richard; Liebetreu, John; Kacpura, Tom J.
2005-01-01
This paper presents the tool chain, methodology, and results of an on-going study being performed jointly by Space Communication Experts at NASA Glenn Research Center (GRC), General Dynamics C4 Systems (GD), and Southwest Research Institute (SwRI). The team is evaluating the applicability and tradeoffs concerning the use of Software Defined Radio (SDR) technologies for Space missions. The Space Telecommunications Radio Systems (STRS) project is developing an approach toward building SDR-based transceivers for space communications applications based on an accompanying software architecture that can be used to implement transceivers for NASA space missions. The study is assessing the overall cost and benefit of employing SDR technologies in general, and of developing a software architecture standard for its space SDR transceivers. The study is considering the cost and benefit of existing architectures, such as the Joint Tactical Radio Systems (JTRS) Software Communications Architecture (SCA), as well as potential new space-specific architectures.
Comprehensive Quantitative Analysis on Privacy Leak Behavior
Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan
2013-01-01
Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046
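The paper's overall leak degree combines the four metrics; one plausible combination rule is a weighted geometric mean (the weights and functional form here are assumptions for illustration, not the PPN-based definition in the paper):

```python
def overall_leak_degree(possibility, severity, crypticity, manipulability,
                        weights=(0.4, 0.3, 0.15, 0.15)):
    """Combine four per-behavior metrics (each assumed in [0, 1]) into one score
    via a weighted geometric mean; a zero in any metric zeroes the overall score."""
    metrics = (possibility, severity, crypticity, manipulability)
    score = 1.0
    for m, w in zip(metrics, weights):
        score *= m ** w
    return score

# hypothetical scores for one piece of software
print(round(overall_leak_degree(0.8, 0.9, 0.5, 0.6), 3))
```

A geometric rather than arithmetic mean encodes one defensible design choice: a behavior that is impossible (possibility = 0) should score zero overall, no matter how severe it would be if it occurred.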
NASA Astrophysics Data System (ADS)
Weiss, Brian A.; Fronczek, Lisa; Morse, Emile; Kootbally, Zeid; Schlenoff, Craig
2013-05-01
Transformative Apps (TransApps) is a Defense Advanced Research Projects Agency (DARPA) funded program whose goal is to develop a range of militarily-relevant software applications ("apps") to enhance the operational effectiveness of military personnel on (and off) the battlefield. TransApps is also developing a military apps marketplace to facilitate rapid development and dissemination of applications that address user needs by connecting engaged communities of end-users with development groups. The National Institute of Standards and Technology's (NIST) role in the TransApps program is to design and implement evaluation procedures to assess the performance of: 1) the various software applications, 2) software-hardware interactions, and 3) the supporting online application marketplace. Specifically, NIST is responsible for evaluating 50+ tactically-relevant applications operating on numerous Android™-powered platforms. NIST efforts include functional regression testing and quantitative performance testing. This paper discusses the evaluation methodologies employed to assess the performance of three key program elements: 1) handheld-based applications and their integration with various hardware platforms, 2) client-based applications, and 3) network technologies operating on both the handheld and client systems, along with their integration into the application marketplace. Handheld-based applications are assessed using a combination of utility- and usability-based checklists and quantitative performance tests. Client-based applications are assessed to replicate current overseas disconnected operations (i.e., no network connectivity between handhelds) and to assess the connected operations envisioned for later use. Finally, networked applications are assessed on handhelds to establish performance baselines for when connectivity becomes common usage.
ERIC Educational Resources Information Center
Lawrence, Virginia
No longer just a user of commercial software, the 21st century teacher is a designer of interactive software based on theories of learning. This software, a comprehensive study of straightline equations, enhances conceptual understanding, sketching, graphic interpretive and word problem solving skills as well as making connections to real-life and…
Framework Based Guidance Navigation and Control Flight Software Development
NASA Technical Reports Server (NTRS)
McComas, David
2007-01-01
This viewgraph presentation describes NASA's guidance navigation and control flight software development background. The contents include: 1) NASA/Goddard Guidance Navigation and Control (GN&C) Flight Software (FSW) Development Background; 2) GN&C FSW Development Improvement Concepts; and 3) GN&C FSW Application Framework.
45 CFR 307.5 - Mandatory computerized support enforcement systems.
Code of Federal Regulations, 2013 CFR
2013-10-01
... hardware, operational system software, and electronic linkages with the separate components of an... plans to use and how they will interface with the base system; (3) Provide documentation that the... and for operating costs including hardware, operational software and applications software of a...
45 CFR 307.5 - Mandatory computerized support enforcement systems.
Code of Federal Regulations, 2014 CFR
2014-10-01
... hardware, operational system software, and electronic linkages with the separate components of an... plans to use and how they will interface with the base system; (3) Provide documentation that the... and for operating costs including hardware, operational software and applications software of a...
45 CFR 307.5 - Mandatory computerized support enforcement systems.
Code of Federal Regulations, 2012 CFR
2012-10-01
... hardware, operational system software, and electronic linkages with the separate components of an... plans to use and how they will interface with the base system; (3) Provide documentation that the... and for operating costs including hardware, operational software and applications software of a...
Greg's expertise is in the design and development of software for engineering applications. His experience includes project management, software architectural design, and various software development efforts, including the pre- and post-processors used in the analysis of data from both land-based and offshore wind…
The Hidden Cost of Buying a Computer.
ERIC Educational Resources Information Center
Johnson, Michael
1983-01-01
In order to process data in a computer, application software must be either developed or purchased. Costs for modifications of the software package and maintenance are often hidden. The decision to buy or develop software packages should be based upon factors of time and maintenance. (MLF)
Component-based integration of chemistry and optimization software.
Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L
2004-11-15
Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform.
DAQ: Software Architecture for Data Acquisition in Sounding Rockets
NASA Technical Reports Server (NTRS)
Ahmad, Mohammad; Tran, Thanh; Nichols, Heidi; Bowles-Martinez, Jessica N.
2011-01-01
A multithreaded software application was developed by the Jet Propulsion Laboratory (JPL) to collect a set of correlated imagery, Inertial Measurement Unit (IMU), and GPS data for a Wallops Flight Facility (WFF) sounding rocket flight. The data set will be used to advance Terrain Relative Navigation (TRN) technology algorithms being researched at JPL. This paper describes the software architecture and the tests used to meet the timing and data rate requirements for the software used to collect the dataset. Also discussed are the challenges of using commercial off-the-shelf (COTS) flight hardware and open source software, including multiple Camera Link (C-link) based cameras, a Pentium-M based computer, and the Linux Fedora 11 operating system. Additionally, the paper discusses the history of the software architecture's usage in other JPL projects and its applicability to future missions, such as cubesats, UAVs, and research planes/balloons. Also discussed are the human aspects of the project, especially JPL's Phaeton program, and the results of the launch.
ICCE Policy Statement on Network and Multiple Machine Software.
ERIC Educational Resources Information Center
International Council for Computers in Education, Eugene, OR.
Designed to provide educators with guidance for the lawful reproduction of computer software, this document contains suggested guidelines, sample forms, and several short articles concerning software copyright and license agreements. The initial policy statement calls for educators to provide software developers (or their agents) with a…
Fluorine-19 MRI Contrast Agents for Cell Tracking and Lung Imaging
Fox, Matthew S.; Gaudet, Jeffrey M.; Foster, Paula J.
2015-01-01
Fluorine-19 (19F)-based contrast agents for magnetic resonance imaging stand to revolutionize imaging-based research and clinical trials in several fields of medical intervention. First, their use in characterizing in vivo cell behavior may help bring cellular therapy closer to clinical acceptance. Second, their use in lung imaging provides novel noninvasive interrogation of the ventilated airspaces without the need for complicated, hard-to-distribute hardware. This article reviews the current state of 19F-based cell tracking and lung imaging using magnetic resonance imaging and describes the link between the methods across these fields and how they may mutually benefit from solutions to mutual problems encountered when imaging 19F-containing compounds, as well as hardware and software advancements. PMID:27042089
Informatic search strategies to discover analogues and variants of natural product archetypes.
Johnston, Chad W; Connaty, Alex D; Skinnider, Michael A; Li, Yong; Grunwald, Alyssa; Wyatt, Morgan A; Kerr, Russell G; Magarvey, Nathan A
2016-03-01
Natural products are a crucial source of antimicrobial agents, but reliance on low-resolution bioactivity-guided approaches has led to diminishing interest in discovery programmes. Here, we demonstrate that two in-house automated informatic platforms can be used to target classes of biologically active natural products, specifically, peptaibols. We demonstrate that mass spectrometry-based informatic approaches can be used to detect natural products with high sensitivity, identifying desired agents present in complex microbial extracts. Using our specialised software packages, we could elaborate specific branches of chemical space, uncovering new variants of trichopolyn and demonstrating a way forward in mining natural products as a valuable source of potential pharmaceutical agents.
A development framework for distributed artificial intelligence
NASA Technical Reports Server (NTRS)
Adler, Richard M.; Cottman, Bruce H.
1989-01-01
The authors describe distributed artificial intelligence (DAI) applications in which multiple organizations of agents solve multiple domain problems. They then describe work in progress on a DAI system development environment, called SOCIAL, which consists of three primary language-based components. The Knowledge Object Language defines models of knowledge representation and reasoning. The metaCourier language supplies the underlying functionality for interprocess communication and control access across heterogeneous computing environments. The metaAgents language defines models for agent organization, coordination, control, and resource management. Application agents and agent organizations will be constructed by combining metaAgents and metaCourier building blocks with task-specific functionality such as diagnostic or planning reasoning. This architecture hides the implementation details of communications, control, and integration in distributed processing environments, enabling application developers to concentrate on the design and functionality of the intelligent agents and agent networks themselves.
Agent Based Software for the Autonomous Control of Formation Flying Spacecraft
NASA Technical Reports Server (NTRS)
How, Jonathan P.; Campbell, Mark; Dennehy, Neil (Technical Monitor)
2003-01-01
Distributed satellite systems is an enabling technology for many future NASA/DoD earth and space science missions, such as MMS, MAXIM, Leonardo, and LISA [1, 2, 3]. While formation flying offers significant science benefits, to reduce the operating costs for these missions it will be essential that these multiple vehicles effectively act as a single spacecraft by performing coordinated observations. Autonomous guidance, navigation, and control as part of a coordinated fleet-autonomy is a key technology that will help accomplish this complex goal. This is no small task, as most current space missions require significant input from the ground for even relatively simple decisions such as thruster burns. Work for the NMP DS1 mission focused on the development of the New Millennium Remote Agent (NMRA) architecture for autonomous spacecraft control systems. NMRA integrates traditional real-time monitoring and control with components for constraint-based planning, robust multi-threaded execution, and model-based diagnosis and reconfiguration. The complexity of using an autonomous approach for space flight software was evident when most of its capabilities were stripped off prior to launch (although more capability was uplinked subsequently, and the resulting demonstration was very successful).
Design of Genetic Algorithms for Topology Control of Unmanned Vehicles
2010-01-01
We present genetic algorithms (GAs) as a decentralised topology control mechanism, distributed among active running software agents, to achieve a uniform spread of terrestrial unmanned vehicles (UVs) over an unknown geographical terrain. The topology control of UVs using a decentralised solution over an unknown geographical terrain is a challenging…
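The GA-based spreading idea summarised in that abstract can be illustrated with a toy sketch. The following is a hypothetical illustration, not the report's actual algorithm: each chromosome is a candidate layout of vehicle positions on a grid, and fitness rewards layouts whose closest pair of vehicles is as far apart as possible (a simple proxy for "uniform spread").

```python
import random

def min_pairwise_distance(positions):
    """Fitness: the smallest squared distance between any two vehicles."""
    return min(
        (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
        for i, a in enumerate(positions)
        for b in positions[i + 1:]
    )

def evolve_spread(n_vehicles=5, grid=20, pop_size=30, generations=60, seed=1):
    """Toy GA: evolve (x, y) vehicle layouts toward a uniform spread."""
    rng = random.Random(seed)

    def random_layout():
        return [(rng.randrange(grid), rng.randrange(grid)) for _ in range(n_vehicles)]

    population = [random_layout() for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the better half, breed the rest from it.
        population.sort(key=min_pairwise_distance, reverse=True)
        survivors = population[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(survivors)):
            mum, dad = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_vehicles)
            child = mum[:cut] + dad[cut:]          # one-point crossover
            if rng.random() < 0.3:                 # mutation: relocate one vehicle
                k = rng.randrange(n_vehicles)
                child[k] = (rng.randrange(grid), rng.randrange(grid))
            children.append(child)
        population = survivors + children
    return max(population, key=min_pairwise_distance)
```

A real decentralised variant would run selection locally on each agent using only neighbour information; the centralised loop above just shows the encoding and fitness idea.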
Software for Partly Automated Recognition of Targets
NASA Technical Reports Server (NTRS)
Opitz, David; Blundell, Stuart; Bain, William; Morris, Matthew; Carlson, Ian; Mangrich, Mark
2003-01-01
The Feature Analyst is a computer program for assisted (partially automated) recognition of targets in images. This program was developed to accelerate the processing of high-resolution satellite image data for incorporation into geographic information systems (GIS). This program creates an advanced user interface that embeds proprietary machine-learning algorithms in commercial image-processing and GIS software. A human analyst provides samples of target features from multiple sets of data, then the software develops a data-fusion model that automatically extracts the remaining features from selected sets of data. The program thus leverages the natural ability of humans to recognize objects in complex scenes, without requiring the user to explain the human visual recognition process by means of lengthy software. Two major subprograms are the reactive agent and the thinking agent. The reactive agent strives to quickly learn the user's tendencies while the user is selecting targets and to increase the user's productivity by immediately suggesting the next set of pixels that the user may wish to select. The thinking agent utilizes all available resources, taking as much time as needed, to produce the most accurate autonomous feature-extraction model possible.
2011-01-01
Background: This study aims to identify the statistical software applications most commonly employed for data analysis in health services research (HSR) studies in the U.S. The study also examines the extent to which information describing the specific analytical software utilized is provided in published articles reporting on HSR studies. Methods: Data were extracted from a sample of 1,139 articles (including 877 original research articles) published between 2007 and 2009 in three U.S. HSR journals that were considered to be representative of the field based upon a set of selection criteria. Descriptive analyses were conducted to categorize patterns in statistical software usage in those articles. The data were stratified by calendar year to detect trends in software use over time. Results: Only 61.0% of original research articles in prominent U.S. HSR journals identified the particular type of statistical software application used for data analysis. Stata and SAS were overwhelmingly the most commonly used software applications (employed in 46.0% and 42.6% of articles, respectively). However, SAS use grew considerably during the study period compared to other applications. Stratification of the data revealed that the type of statistical software used varied considerably by whether authors were from the U.S. or from other countries. Conclusions: The findings highlight a need for HSR investigators to identify more consistently the specific analytical software used in their studies. Knowing that information can be important because different software packages might produce varying results, owing to differences in their underlying estimation methods. PMID:21977990
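The percentages reported in that abstract are simple proportions over extracted records. A hypothetical sketch of such a tally (the record format is invented for illustration, not the study's actual pipeline):

```python
from collections import Counter

def software_shares(records):
    """records: one entry per article, the software name it identified,
    or None when the article did not identify its software.
    Returns (fraction of articles identifying software,
             share of each package among all articles)."""
    total = len(records)
    named = [r for r in records if r is not None]
    counts = Counter(named)
    return len(named) / total, {s: n / total for s, n in counts.items()}
```

For example, with five articles naming Stata, four naming SAS, and one naming nothing, 90% of articles identify their software and Stata's share is 50%.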
Defensive Swarm: An Agent Based Modeling Analysis
2017-12-01
Scalability is therefore quite important to modeling in this highly variable domain. One can force the software to run the gamut of options to see… changes in operating constructs or procedures. Additionally, modelers can run thousands of iterations, testing the model under different circumstances.
Real time software for a heat recovery steam generator control system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valdes, R.; Delgadillo, M.A.; Chavez, R.
1995-12-31
This paper addresses the development and successful implementation of real-time software for the Heat Recovery Steam Generator (HRSG) control system of a combined-cycle power plant. The real-time software for the HRSG control system physically resides in a Control and Acquisition System (SAC), which is a component of a distributed control system (DCS). The SAC is a programmable controller. The DCS installed at the Gomez Palacio power plant in Mexico accomplishes the functions of logic, analog, and supervisory control. The DCS is based on microprocessors, and the architecture consists of workstations operating as a Man-Machine Interface (MMI), linked to SAC controllers by means of a communication system. The HRSG real-time software is composed of an operating system, drivers, dedicated computer programs, and application computer programs. The operating system used for the development of this software was the MultiTasking Operating System (MTOS). The application software developed at IIE for the HRSG control system basically consisted of a set of digital algorithms for the regulation of the main process variables of the HRSG. By using the multitasking feature of MTOS, the algorithms are executed pseudo-concurrently. In this way, the application programs continuously use the resources of the operating system to perform their functions through a uniform service interface. The application software of the HRSG consists of three tasks, each with dedicated responsibilities. The drivers were developed for the handling of the hardware resources of the SAC controller, which in turn allows signal acquisition and data communication with the MMI. The dedicated programs were developed for hardware diagnostics, task initialization, access to the database, and fault tolerance. The application software and the dedicated software for the HRSG control system were developed using the C programming language due to its compactness, portability, and efficiency.
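The pseudo-concurrent task structure described in that abstract (several dedicated regulation tasks scheduled by a multitasking OS, each with its own I/O channels) can be sketched with ordinary threads. This is a hypothetical miniature in Python, not the MTOS/C implementation; task names, setpoints, and the proportional gain are invented for illustration:

```python
import threading
import queue

def control_task(name, inputs, outputs, setpoint, n_cycles):
    """One regulation task: read a measurement (a blocking call,
    standing in for a driver), emit a proportional correction, repeat."""
    for _ in range(n_cycles):
        measurement = inputs.get()
        correction = 0.5 * (setpoint - measurement)
        outputs.put((name, correction))

def run_plant(n_cycles=3):
    """Run two pseudo-concurrent regulation tasks, as a multitasking
    OS would, with the main thread acting as the sensor scan."""
    drum_in, drum_out = queue.Queue(), queue.Queue()
    temp_in, temp_out = queue.Queue(), queue.Queue()
    tasks = [
        threading.Thread(target=control_task,
                         args=("drum_level", drum_in, drum_out, 50.0, n_cycles)),
        threading.Thread(target=control_task,
                         args=("steam_temp", temp_in, temp_out, 500.0, n_cycles)),
    ]
    for t in tasks:
        t.start()
    for _ in range(n_cycles):          # simulated sensor scans
        drum_in.put(48.0)
        temp_in.put(510.0)
    for t in tasks:
        t.join()
    return list(drum_out.queue), list(temp_out.queue)
```

Each task blocks on its input queue, so the OS scheduler interleaves them, which is the essence of the pseudo-concurrent execution the paper describes.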
Integrating automated support for a software management cycle into the TAME system
NASA Technical Reports Server (NTRS)
Sunazuka, Toshihiko; Basili, Victor R.
1989-01-01
Software managers are interested in the quantitative management of software quality, cost, and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control, and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
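The GQM paradigm mentioned in that abstract refines a measurement goal into questions, and each question into the metrics that answer it. A small, hypothetical goal/question/metric tree (the goal, questions, and metric names below are invented for illustration):

```python
# Hypothetical GQM tree: one goal, refined into questions,
# each answered by one or more metrics.
gqm = {
    "goal": "Improve delivered software quality",
    "questions": [
        {
            "question": "Is the defect rate decreasing release over release?",
            "metrics": ["defects per KLOC", "post-release defect count"],
        },
        {
            "question": "Are reviews catching defects early?",
            "metrics": ["review coverage (%)", "review defects / total defects"],
        },
    ],
}

def all_metrics(tree):
    """Collect every metric the measurement plan must provide."""
    return [m for q in tree["questions"] for m in q["metrics"]]
```

Walking the tree bottom-up is what ties each collected number back to a management goal, which is the point of goal-oriented measurement.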
Integrated Systems Health Management (ISHM) Toolkit
NASA Technical Reports Server (NTRS)
Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim
2013-01-01
A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.
Simplified Deployment of Health Informatics Applications by Providing Docker Images.
Löbe, Matthias; Ganslandt, Thomas; Lotzmann, Lydia; Mate, Sebastian; Christoph, Jan; Baum, Benjamin; Sariyar, Murat; Wu, Jie; Stäubert, Sebastian
2016-01-01
Due to the specific needs of biomedical researchers, in-house development of software is widespread. A common problem is maintaining and enhancing software after the funded project has ended. Even if many tools are made open source, only a couple of projects manage to attract a user base large enough to ensure sustainability. Reasons for this include the complex installation and configuration of biomedical software, as well as an ambiguous terminology for the features provided, all of which make evaluation of software laborious. Docker is a para-virtualization technology based on Linux containers that eases the deployment of applications and facilitates evaluation. We investigated a suite of software developments funded by a large umbrella organization for networked medical research within the last 10 years and created Docker containers for a number of applications to support their utilization and dissemination.
Development of a 32-bit UNIX-based ELAS workstation
NASA Technical Reports Server (NTRS)
Spiering, Bruce A.; Pearson, Ronnie W.; Cheng, Thomas D.
1987-01-01
A mini/microcomputer UNIX-based image analysis workstation has been designed and is being implemented to use the Earth Resources Laboratory Applications Software (ELAS). The hardware system includes a MASSCOMP 5600 computer, which is a 32-bit UNIX-based system (compatible with the AT&T System V and Berkeley 4.2 BSD operating systems), a floating point accelerator, a 474-megabyte fixed disk, a tri-density magnetic tape drive, and an 1152 by 910 by 12-plane color graphics/image interface. The software conversion includes reconfiguring the ELAS driver Master Task, then recompiling and testing the converted application modules. This hardware and software configuration is a self-sufficient image analysis workstation which can be used as a stand-alone system or networked with other compatible workstations.
Building quality into medical product software design.
Mallory, S R
1993-01-01
The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.
The Cloud-Based Integrated Data Viewer (IDV)
NASA Astrophysics Data System (ADS)
Fisher, Ward
2015-04-01
Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While there is a suite of tools and methodologies used in traditional software engineering environments to mitigate this issue, they are typically ignored by developers lacking a background in software engineering. The result is a large body of software which is simultaneously critical and difficult to maintain. Visualization software is particularly vulnerable to this problem, given its inherent dependency on particular graphics hardware and software APIs. The advent of cloud computing has provided a solution to this problem which was not previously practical on a large scale: application streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little-to-no re-engineering required. Through application streaming we are able to bring the same visualization to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness application streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform and include a brief discussion of the underlying technologies involved. We will also discuss the differences between local software and software-as-a-service.
NASA Technical Reports Server (NTRS)
Rowell, Lawrence F.; Davis, John S.
1989-01-01
The Environment for Application Software Integration and Execution (EASIE) provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers that face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. Volume 1, Executive Overview, gives an overview of the functions provided by EASIE and describes their use. Three operational design systems based upon the EASIE software are briefly described.
Research into software executives for space operations support
NASA Technical Reports Server (NTRS)
Collier, Mark D.
1990-01-01
Research concepts pertaining to a software (workstation) executive which will support a distributed processing command and control system characterized by high-performance graphics workstations used as computing nodes are presented. Although a workstation-based distributed processing environment offers many advantages, it also introduces a number of new concerns. In order to solve these problems, allow the environment to function as an integrated system, and present a functional development environment to application programmers, it is necessary to develop an additional layer of software. This 'executive' software integrates the system, provides real-time capabilities, and provides the tools necessary to support the application requirements.
NASA Astrophysics Data System (ADS)
Alford, W. A.; Kawamura, Kazuhiko; Wilkes, Don M.
1997-12-01
This paper discusses the problem of integrating human intelligence and skills into an intelligent manufacturing system. Our center has joined the Holonic Manufacturing Systems (HMS) Project, an international consortium dedicated to developing holonic systems technologies. One of our contributions to this effort is in Work Package 6: flexible human integration. This paper focuses on one activity, namely, human integration into motion guidance and coordination. Much research on intelligent systems focuses on creating totally autonomous agents. At the Center for Intelligent Systems (CIS), we design robots that interact directly with a human user. We focus on using the natural intelligence of the user to simplify the design of a robotic system. The problem is finding ways for the user to interact with the robot that are efficient and comfortable for the user. Manufacturing applications impose the additional constraint that the manufacturing process should not be disturbed; that is, frequent interaction with the user could degrade real-time performance. Our research in human-robot interaction is based on a concept called human directed local autonomy (HuDL). Under this paradigm, the intelligent agent selects and executes a behavior or skill based upon directions from a human user. The user interacts with the robot via speech, gestures, or other media. Our control software is based on the intelligent machine architecture (IMA), an object-oriented architecture which facilitates cooperation and communication among intelligent agents. In this paper we describe our research testbed, a dual-arm humanoid robot and human user, and the use of this testbed for a human-directed sorting task. We also discuss some proposed experiments for evaluating the integration of the human into the robot system. At the time of this writing, the experiments have not been completed.
When more of the same is better
NASA Astrophysics Data System (ADS)
Fontanari, José F.
2016-01-01
Problem solving (e.g., drug design, traffic engineering, software development) by task forces represents a substantial portion of the economy of developed countries. Here we use an agent-based model of cooperative problem-solving systems to study the influence of diversity on the performance of a task force. We assume that agents cooperate by exchanging information on their partial success and use that information to imitate the more successful agent in the system (the model). The agents differ only in their propensities to copy the model. We find that, for easy tasks, the optimal organization is a homogeneous system composed of agents with the highest possible copy propensities. For difficult tasks, we find that diversity can prevent the system from being trapped in sub-optimal solutions. However, when the system size is adjusted to maximize performance, the homogeneous systems outperform the heterogeneous systems; i.e., for optimal performance, sameness should be preferred to diversity.
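The imitate-the-model dynamic described in that abstract can be sketched as a toy simulation. This is an illustrative caricature under invented assumptions, not Fontanari's exact model: agents search for an all-ones bit string, and each round every agent either copies one bit from the current best agent (with probability equal to its copy propensity) or flips one of its own bits at random:

```python
import random

def imitative_search(target_len=12, n_agents=8, copy_props=None,
                     max_rounds=500, seed=0):
    """Toy imitative search.  An agent's fitness is its number of ones.
    Each round an agent copies one bit from the best agent ('the model')
    with probability equal to its copy propensity, else it mutates a
    random bit of its own (exploration).  Returns rounds used."""
    rng = random.Random(seed)
    if copy_props is None:
        copy_props = [0.8] * n_agents          # homogeneous task force
    agents = [[rng.randint(0, 1) for _ in range(target_len)]
              for _ in range(n_agents)]
    for rounds in range(1, max_rounds + 1):
        best = list(max(agents, key=sum))      # snapshot of the model
        if sum(best) == target_len:
            return rounds                      # task solved
        for agent, p in zip(agents, copy_props):
            k = rng.randrange(target_len)
            if rng.random() < p:
                agent[k] = best[k]             # imitate the model
            else:
                agent[k] = 1 - agent[k]        # explore by mutation
    return max_rounds                          # budget exhausted
```

Varying `copy_props` (e.g., mixing high and low copy propensities) is the knob the paper's diversity question turns on: heavy imitation converges fast on easy landscapes but can trap the whole force on hard ones.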
A Hazardous Gas Detection System for Aerospace and Commercial Applications
NASA Technical Reports Server (NTRS)
Hunter, G. W.; Neudeck, P. G.; Chen, L. - Y.; Makel, D. B.; Liu, C. C.; Wu, Q. H.; Knight, D.
1998-01-01
The detection of explosive conditions in aerospace propulsion applications is important for safety and economic reasons. Microfabricated hydrogen, oxygen, and hydrocarbon sensors as well as the accompanying hardware and software are being developed for a range of aerospace safety applications. The development of these sensors is being done using MEMS (Micro ElectroMechanical Systems) based technology and SiC-based semiconductor technology. The hardware and software allows control and interrogation of each sensor head and reduces accompanying cabling through multiplexing. These systems are being applied on the X-33 and on an upcoming STS-95 Shuttle mission. A number of commercial applications are also being pursued. It is concluded that this MEMS-based technology has significant potential to reduce costs and increase safety in a variety of aerospace applications.
Antunes, Deborah; Jorge, Natasha A. N.; Caffarena, Ernesto R.; Passetti, Fabio
2018-01-01
RNA molecules are essential players in many fundamental biological processes. Prokaryotes and eukaryotes have distinct RNA classes with specific structural features and functional roles. Computational prediction of protein structures is a research field in which high-confidence three-dimensional protein models can be proposed based on the sequence alignment between target and templates. However, to date, only a few approaches have been developed for the computational prediction of RNA structures. Similar to proteins, RNA structures may be altered due to the interaction with various ligands, including proteins, other RNAs, and metabolites. A riboswitch is a molecular mechanism, found in the three kingdoms of life, in which the RNA structure is modified by the binding of a metabolite. It can regulate multiple gene expression mechanisms, such as transcription, translation initiation, and mRNA splicing and processing. Due to their nature, these entities also act on the regulation of gene expression and the detection of small metabolites, and have the potential to help in the discovery of new classes of antimicrobial agents. In this review, we describe software and web servers currently available for riboswitch aptamer identification and secondary and tertiary structure prediction, including applications. PMID:29403526
Models and Frameworks: A Synergistic Association for Developing Component-Based Applications
Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara
2014-01-01
The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability, either of designs or of component-based implementations. This paper, which is based on the model-driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by the automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications at a higher level of abstraction than objects with the higher levels of code reuse provided by frameworks. To illustrate the benefits of the proposed approach, two representative case studies, one using an existing framework and the other an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858
Programmable ubiquitous telerobotic devices
NASA Astrophysics Data System (ADS)
Doherty, Michael; Greene, Matthew; Keaton, David; Och, Christian; Seidl, Matthew L.; Waite, William; Zorn, Benjamin G.
1997-12-01
We are investigating a field of research that we call ubiquitous telepresence, which involves the design and implementation of low-cost robotic devices that can be programmed and operated from anywhere on the Internet. These devices, which we call ubots, can be used for academic purposes (e.g., a biologist could remotely conduct a population survey), commercial purposes (e.g., a house could be shown remotely by a real-estate agent), and for recreation and education (e.g., someone could tour a museum remotely). We anticipate that such devices will become increasingly common due to recent changes in hardware and software technology. In particular, current hardware technology enables such devices to be constructed very cheaply (less than $500), and current software and network technology allows highly portable code to be written and downloaded across the Internet. In this paper, we present our prototype system architecture, and the ubot implementation we have constructed based on it. The hardware technology we use is the Handy Board, a 6811-based controller board with digital and analog inputs and outputs. Our software includes a network layer based on TCP/IP and software layers written in Java. Our software enables users across the Internet to program the behavior of the vehicle and to receive image feedback from a camera mounted on it.
Multiagent pursuit-evasion games: Algorithms and experiments
NASA Astrophysics Data System (ADS)
Kim, Hyounjin
Deployment of intelligent agents has been made possible through advances in control software, microprocessors, sensor/actuator technology, communication technology, and artificial intelligence. Intelligent agents now play important roles in many applications where human operation is too dangerous or inefficient. There is little doubt that the world of the future will be filled with intelligent robotic agents employed to autonomously perform tasks, or embedded in systems all around us, extending our capabilities to perceive, reason and act, and replacing human efforts. There are numerous real-world applications in which a single autonomous agent is not suitable and multiple agents are required. However, after years of active research in multi-agent systems, current technology is still far from achieving many of these real-world applications. Here, we consider the problem of deploying a team of unmanned ground vehicles (UGV) and unmanned aerial vehicles (UAV) to pursue a second team of UGV evaders while concurrently building a map in an unknown environment. This pursuit-evasion game encompasses many of the challenging issues that arise in operations using intelligent multi-agent systems. We cast the problem in a probabilistic game theoretic framework and consider two computationally feasible pursuit policies: greedy and global-max. We also formulate this probabilistic pursuit-evasion game as a partially observable Markov decision process and employ a policy search algorithm to obtain a good pursuit policy from a restricted class of policies. The estimated value of this policy is guaranteed to be uniformly close to the optimal value in the given policy class under mild conditions. To implement this scenario on real UAVs and UGVs, we propose a distributed hierarchical hybrid system architecture which emphasizes the autonomy of each agent yet allows for coordinated team efforts. 
We then describe our implementation on a fleet of UGVs and UAVs, detailing components such as high level pursuit policy computation, inter-agent communication, navigation, sensing, and regulation. We present both simulation and experimental results on real pursuit-evasion games between our fleet of UAVs and UGVs and evaluate the pursuit policies, relating expected capture times to the speed and intelligence of the evaders and the sensing capabilities of the pursuers. The architecture and algorithms described in this dissertation are general enough to be applied to many real-world applications.
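The two pursuit policies named above can be illustrated with a toy sketch over a grid-world probability map (a simplified illustration of the general idea, not the dissertation's actual algorithms; the grid, function names, and belief values are assumptions). The greedy policy moves to the adjacent cell with the highest current evader probability, while global-max heads toward the most probable cell in the whole map:

```python
def greedy_pursuit_step(belief, pos):
    """One step of a greedy pursuit policy: move to the adjacent cell
    (or stay) with the highest estimated evader probability."""
    rows, cols = len(belief), len(belief[0])
    moves = [(0, 1), (0, -1), (1, 0), (-1, 0), (0, 0)]
    best_move, best_p = (0, 0), -1.0
    for dr, dc in moves:
        r, c = pos[0] + dr, pos[1] + dc
        if 0 <= r < rows and 0 <= c < cols and belief[r][c] > best_p:
            best_p, best_move = belief[r][c], (dr, dc)
    return best_move

def global_max_target(belief):
    """The global-max policy instead heads toward the most likely cell overall."""
    return max(
        ((r, c) for r in range(len(belief)) for c in range(len(belief[0]))),
        key=lambda rc: belief[rc[0]][rc[1]],
    )

# Toy belief map over a 3x3 grid: evader most likely at (2, 2),
# with a smaller mode at (0, 1).
belief = [[0.0, 0.3, 0.0],
          [0.0, 0.0, 0.0],
          [0.0, 0.0, 0.7]]
step = greedy_pursuit_step(belief, (1, 1))   # chases the nearby 0.3 mode
target = global_max_target(belief)           # heads for the 0.7 mode
```

The example makes the difference between the two policies concrete: greedy is myopic and latches onto the nearby secondary mode, while global-max commits to the map's global maximum.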
Huang, Jiahua; Zhou, Hai; Zhang, Binbin; Ding, Biao
2015-09-01
This article develops new Web-based failure database software for orthopaedic implants. The software follows the browser/server (B/S) model, uses ASP dynamic web technology as its main development language to achieve data interactivity, and uses Microsoft Access for the database; these mature technologies make the software easy to extend and upgrade. The article presents the design and development rationale of the software, its working process and functions, and its relevant technical features. With this software, many different types of failure events of orthopaedic implants can be stored and statistically analyzed; at the macroscopic level, the data can be used to evaluate the reliability of orthopaedic implants and operations, and ultimately to guide doctors in improving the level of clinical treatment.
Modular Infrastructure for Rapid Flight Software Development
NASA Technical Reports Server (NTRS)
Pires, Craig
2010-01-01
This slide presentation reviews the use of modular infrastructure to assist in the development of flight software. A feature of this program is the use of a model-based approach for application-unique software. Two programs on which this approach was used are reviewed: the development of software for the Hover Test Vehicle (HTV), and the Lunar Atmosphere and Dust Environment Experiment (LADEE).
Production roll out plan for HANDI 2000 business management system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, D.E.
The Hanford Data Integration 2000 (HANDI 2000) Project will result in an integrated and comprehensive set of functional applications containing core information necessary to support the Project Hanford Management Contract (PHMC). It is based on the Commercial-Off-The-Shelf (COTS) product solution with commercially proven business processes. The COTS product solution set, of Passport (PP) and PeopleSoft (PS) software, supports finance, supply, human resources, and payroll activities under the current PHMC direction. The PP software is an integrated application for Accounts Payable, Contract Management, Inventory Management, Purchasing, and Material Safety Data Sheets (MSDS). The PS software is an integrated application for Projects, General Ledger, Human Resources Training, Payroll, and Base Benefits. This set of software constitutes the Business Management System (BMS) and MSDS, a subset of the HANDI 2000 suite of systems. The primary objective of the Production Roll Out Plan is to communicate the methods and schedules for implementation and roll out to end users of BMS.
The NASA/Army Autonomous Rotorcraft Project
NASA Technical Reports Server (NTRS)
Whalley, M.; Freed, M.; Takahashi, M.; Christian, D.; Patterson-Hine, A.; Schulein, G.; Harris, R.
2002-01-01
An overview of the NASA Ames Research Center Autonomous Rotorcraft Project (ARP) is presented. The project brings together several technologies to address NASA and US Army autonomous vehicle needs, including a reactive planner for mission planning and execution, control system design incorporating a detailed understanding of the platform dynamics, and health monitoring and diagnostics. A candidate reconnaissance and surveillance mission is described. The autonomous agent architecture and its application to the candidate mission are presented. Details of the vehicle hardware and software development are provided.
2017-05-19
LightCycler® 96 desktop software. Positive and negative samples were identified using the “ Qualitative Detection” analysis function using the default...Institute of Infectious Diseases, Fort Detrick, MD 21702, United States A R T I C L E I N F O Keywords: West Nile virus Virus inactivation Sample buffer... samples using a commercially available SDS- PAGE sample buffer for proteomic studies. Using this method, we demonstrate its utility by identification
An Ontology-based Architecture for Integration of Clinical Trials Management Applications
Shankar, Ravi D.; Martins, Susana B.; O’Connor, Martin; Parrish, David B.; Das, Amar K.
2007-01-01
Management of complex clinical trials involves the coordinated use of a myriad of software applications by trial personnel. The applications typically use distinct knowledge representations and generate an enormous amount of information during the course of a trial. It is therefore vital that the applications exchange trial semantics for efficient management of the trials and subsequent analysis of clinical trial data. Existing model-based frameworks do not address the requirements of semantic integration of heterogeneous applications. We have built an ontology-based architecture to support interoperation of clinical trial software applications. Central to our approach is a suite of clinical trial ontologies, which we call Epoch, that define the vocabulary and semantics necessary to represent information on clinical trials. We are continuing to demonstrate and validate our approach with different clinical trials management applications and with a growing number of clinical trials. PMID:18693919
Evaluation of Inventory Reduction Strategies: Balad Air Base Case Study
2012-03-01
produced by conducting individual simulations using a unique random seed generated by the default Anylogic © random number generator. The...develops an agent-based simulation model of the sustainment supply chain supporting Balad AB during its closure using the software AnyLogic ®. The...research. The goal of USAF Stockage Policy is to maximize customer support while minimizing inventory costs (DAF, 2011:1). USAF stocking decisions
LabVIEW-based control software for para-hydrogen induced polarization instrumentation.
Agraz, Jose; Grunfeld, Alexander; Li, Debiao; Cunningham, Karl; Willey, Cindy; Pozos, Robert; Wagner, Shawn
2014-04-01
The elucidation of cell metabolic mechanisms is the modern underpinning of the diagnosis, treatment, and in some cases the prevention of disease. Para-Hydrogen induced polarization (PHIP) enhances magnetic resonance imaging (MRI) signals over 10,000 fold, allowing for the MRI of cell metabolic mechanisms. This signal enhancement is the result of hyperpolarizing endogenous substances used as contrast agents during imaging. PHIP instrumentation hyperpolarizes Carbon-13 ((13)C) based substances using a process requiring control of a number of factors: chemical reaction timing, gas flow, monitoring of a static magnetic field (Bo), radio frequency (RF) irradiation timing, reaction temperature, and gas pressures. Current PHIP instruments control the hyperpolarization process manually, which precludes precise control of the factors listed above and leads to non-reproducible results. We discuss the design and implementation of a LabVIEW-based computer program that automatically and precisely controls the delivery and manipulation of gases and samples, monitoring gas pressures, environmental temperature, and RF sample irradiation. We show that automated control over the hyperpolarization process results in the hyperpolarization of hydroxyethylpropionate. The implementation of this software enables fast prototyping of PHIP instrumentation for the evaluation of a myriad of (13)C based endogenous contrast agents used in molecular imaging.
Virtual acoustic environments for comprehensive evaluation of model-based hearing devices.
Grimm, Giso; Luberadzka, Joanna; Hohmann, Volker
2018-06-01
The objective was to create virtual acoustic environments (VAEs) with interactive dynamic rendering for applications in audiology. A toolbox for creation and rendering of dynamic virtual acoustic environments (TASCAR) that allows direct user interaction was developed for application in hearing aid research and audiology. The software architecture and the simulation methods used to produce VAEs are outlined, and example environments are described and analysed. The proposed software thus provides a tool for the simulation of VAEs, and a set of VAEs rendered with it is described.
Application of the GNU Radio platform in the multistatic radar
NASA Astrophysics Data System (ADS)
Szlachetko, Boguslaw; Lewandowski, Andrzej
2009-06-01
This paper presents the application of a Software Defined Radio-based platform in a multistatic radar. The platform consists of a four-sensor linear antenna, Universal Software Radio Peripheral (USRP) hardware (the radio frequency frontend), and GNU Radio PC software. The paper describes the architecture of the digital signal processing performed by the USRP's FPGA (digital down-converting blocks) and by the PC host (implementation of the multichannel digital beamforming). Preliminary results of signal recording performed by our experimental platform are presented.
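The multichannel digital beamforming mentioned above can be sketched, for a narrowband signal, as delay-and-sum beamforming on a four-sensor uniform linear array: phase-align each sensor toward a steering angle and average. This is a generic textbook sketch, not the authors' implementation; the array geometry, sample count, and function names are illustrative assumptions.

```python
import cmath
import math

def delay_and_sum(channels, spacing, wavelength, steer_deg):
    """Narrowband delay-and-sum beamformer for a uniform linear array:
    apply the per-sensor steering phase 2*pi*d*m*sin(theta)/lambda,
    then average across the sensors."""
    theta = math.radians(steer_deg)
    out = []
    for n in range(len(channels[0])):
        acc = 0j
        for m, ch in enumerate(channels):
            phase = 2 * math.pi * spacing * m * math.sin(theta) / wavelength
            acc += ch[n] * cmath.exp(-1j * phase)
        out.append(acc / len(channels))
    return out

# Simulate a plane wave arriving from 30 degrees at a 4-sensor array
# with half-wavelength spacing (8 complex baseband samples per channel).
wavelength, spacing, arrival_deg = 1.0, 0.5, 30.0
phi = 2 * math.pi * spacing * math.sin(math.radians(arrival_deg)) / wavelength
channels = [[cmath.exp(1j * (0.3 * n + m * phi)) for n in range(8)]
            for m in range(4)]

matched = delay_and_sum(channels, spacing, wavelength, 30.0)
mismatched = delay_and_sum(channels, spacing, wavelength, -30.0)
gain_matched = sum(abs(x) for x in matched) / len(matched)
gain_mismatched = sum(abs(x) for x in mismatched) / len(mismatched)
```

Steering at the true arrival angle coherently sums the channels (unit gain), while steering at the mirror angle places the wave in a null of this 4-element array, which is the spatial selectivity a multistatic radar exploits.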
JCell--a Java-based framework for inferring regulatory networks from time series data.
Spieth, C; Supper, J; Streichert, F; Speer, N; Zell, A
2006-08-15
JCell is a Java-based application for reconstructing gene regulatory networks from experimental data. The framework provides several algorithms to identify genetic and metabolic dependencies based on experimental data conjoint with mathematical models to describe and simulate regulatory systems. Owing to the modular structure, researchers can easily implement new methods. JCell is a pure Java application with additional scripting capabilities and thus widely usable, e.g. on parallel or cluster computers. The software is freely available for download at http://www-ra.informatik.uni-tuebingen.de/software/JCell.
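One of the simplest mathematical models used for this kind of network inference is a linear dynamical model of expression levels, x(t+1) = W x(t), whose interaction weights can be recovered from a time series by least squares. The sketch below is a toy illustration of that general idea under noise-free data, not JCell's actual algorithms; the two-gene system and all names are assumptions.

```python
def mat2_inv(a):
    """Inverse of a 2x2 matrix."""
    (p, q), (r, s) = a
    det = p * s - q * r
    return [[s / det, -q / det], [-r / det, p / det]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(a):
    return [list(row) for row in zip(*a)]

def fit_linear_network(states):
    """Least-squares fit of W in x(t+1) = W x(t) from a state time series
    (list of 2-gene expression vectors): W = Y X^T (X X^T)^(-1)."""
    X = transpose(states[:-1])   # 2 x (T-1): states at time t
    Y = transpose(states[1:])    # 2 x (T-1): states at time t+1
    XT = transpose(X)
    return matmul(matmul(Y, XT), mat2_inv(matmul(X, XT)))

# Ground-truth regulatory weights: gene 0 activates gene 1,
# gene 1 represses gene 0.
W_true = [[0.9, -0.4], [0.5, 0.8]]
x = [1.0, 0.5]
series = [x]
for _ in range(6):
    x = [W_true[0][0] * x[0] + W_true[0][1] * x[1],
         W_true[1][0] * x[0] + W_true[1][1] * x[1]]
    series.append(x)

W_hat = fit_linear_network(series)  # recovers W_true from the time series
```

With noise-free data the fit is exact; real expression data would require regularization and model selection, which is where frameworks such as JCell provide their alternative algorithms.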
2017-04-06
Research Hypothesis … 15; Research Design …user community and of accommodating advancing software applications by the vendors. Research Design: My approach to this project was to conduct… design descriptions, requirements specifications, test documentation, interface requirement specifications, product specifications, and software
ERIC Educational Resources Information Center
Muller, Eugene W.
1985-01-01
Develops generalizations for empirical evaluation of software based upon suitability of several research designs--pretest posttest control group, single-group pretest posttest, nonequivalent control group, time series, and regression discontinuity--to type of software being evaluated, and on circumstances under which evaluation is conducted. (MBR)
2008-12-01
between our current project and the historical projects. Therefore to refine the historical volatility estimate of the previously completed software... historical volatility estimates obtained in the form of beliefs and plausibility based on subjective probabilities that take into consideration unique
Estimation of toxicity using a Java based software tool
A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone applic...
NASA Technical Reports Server (NTRS)
1976-01-01
Only a few efforts are currently underway to develop an adequate technology base for the various themes. Particular attention must be given to software commonality and evolutionary capability, to increased system integrity and autonomy, and to improved communications among the program users, the program developers, and the programs themselves. There is a need for a quantum improvement in software development methods and for increased awareness of software by all concerned. Major thrusts identified include: (1) data and systems management; (2) software technology for autonomous systems; (3) technology and methods for improving the software development process; (4) advances related to systems of software elements, including their architecture, their attributes as systems, and their interfaces with users and other systems; and (5) applications of software, including both the basic algorithms used in a number of applications and the software specific to a particular theme or discipline area. The impact of each theme on software is assessed.
The selection of adhesive systems for resin-based luting agents.
Carville, Rebecca; Quinn, Frank
2008-01-01
The use of resin-based luting agents is ever expanding with the development of adhesive dentistry. A multitude of different adhesive systems are used with resin-based luting agents, and new products are introduced to the market frequently. Traditional adhesives generally required a multiple step bonding procedure prior to cementing with active resin-based luting materials; however, combined agents offer a simple application procedure. Self-etching 'all-in-one' systems claim that there is no need for the use of a separate adhesive process. The following review addresses the advantages and disadvantages of the available adhesive systems used with resin-based luting agents.
Software engineering processes for Class D missions
NASA Astrophysics Data System (ADS)
Killough, Ronnie; Rose, Debi
2013-09-01
Software engineering processes are often seen as anathemas; thoughts of CMMI key process areas and NPR 7150.2A compliance matrices can motivate a software developer to consider other career fields. However, with adequate definition, common-sense application, and an appropriate level of built-in flexibility, software engineering processes provide a critical framework in which to conduct a successful software development project. One problem is that current models seem to be built around an underlying assumption of "bigness," and assume that all elements of the process are applicable to all software projects regardless of size and tolerance for risk. This is best illustrated in NASA's NPR 7150.2A in which, aside from some special provisions for manned missions, the software processes are to be applied based solely on the criticality of the software to the mission, completely agnostic of the mission class itself. That is, the processes applicable to a Class A mission (high priority, very low risk tolerance, very high national significance) are precisely the same as those applicable to a Class D mission (low priority, high risk tolerance, low national significance). This paper will propose changes to NPR 7150.2A, taking mission class into consideration, and discuss how some of these changes are being piloted for a current Class D mission—the Cyclone Global Navigation Satellite System (CYGNSS).
Unified web-based network management based on distributed object orientated software agents
NASA Astrophysics Data System (ADS)
Djalalian, Amir; Mukhtar, Rami; Zukerman, Moshe
2002-09-01
This paper presents an architecture that provides a unified web interface to managed network devices that support CORBA, OSI, or Internet-based network management protocols. A client gains access to managed devices through a web browser, which is used to issue management operations and receive event notifications. The proposed architecture is compatible with both the OSI Management Reference Model and CORBA. The steps required for designing the building blocks of such an architecture are identified.
NASA Technical Reports Server (NTRS)
Guarro, Sergio B.
2010-01-01
This report validates and documents the detailed features and practical application of the framework for software intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.
Ultrasonography in gastroenterology.
Ødegaard, Svein; Nesje, Lars B; Hausken, Trygve; Gilja, Odd Helge
2015-06-01
Ultrasonography (US) is a safe and available real-time, high-resolution imaging method, which during the last decades has been increasingly integrated as a clinical tool in gastroenterology. New US applications have emerged with enhanced data software and new technical solutions, including strain evaluation, three-dimensional imaging and use of ultrasound contrast agents. Specific gastroenterologic applications have been developed by combining US with other diagnostic or therapeutic methods, such as endoscopy, manometry, puncture needles, diathermy and stents. US provides detailed structural information about visceral organs without hazard to the patients and can play an important clinical role by reducing the need for invasive procedures. This paper presents different aspects of US in gastroenterology, with a special emphasis on the contribution from Nordic scientists in developing clinical applications.
Hynes, Martin; Wang, Han; Kilmartin, Liam
2009-01-01
Over the last decade, there has been substantial research interest in the application of accelerometry data for many forms of automated gait and activity analysis algorithms. This paper introduces a summary of new "off-the-shelf" mobile phone handset platforms containing embedded accelerometers which support the development of custom software to implement real time analysis of the accelerometer data. An overview of the main software programming environments which support the development of such software, including the Java ME based JSR 256 API, the C++ based Motion Sensor API and the Python based "aXYZ" module, is provided. Finally, a sample application is introduced and its performance evaluated in order to illustrate how a standard mobile phone can be used to detect gait activity using such a non-intrusive and easily accepted sensing platform.
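The kind of real-time gait analysis described above can be reduced to a minimal sketch: threshold the deviation of the acceleration magnitude from gravity and count rising edges as steps. The threshold value and the synthetic trace below are illustrative assumptions, not the paper's algorithm or data.

```python
import math

def detect_steps(samples, gravity=9.81, threshold=1.5):
    """Count steps by thresholding the deviation of the 3-axis
    acceleration magnitude from gravity, using a simple
    rising-edge detector so each impact is counted once."""
    steps, above = 0, False
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if abs(mag - gravity) > threshold:
            if not above:
                steps += 1
                above = True
        else:
            above = False
    return steps

# Synthetic accelerometer trace: rest segments separated by
# three impact spikes along the vertical axis.
rest = [(0.0, 0.0, 9.81)] * 5
spike = [(0.0, 0.0, 12.5)]
trace = rest + spike + rest + spike + rest + spike + rest
n = detect_steps(trace)  # counts the three spikes
```

On a handset, the same loop would consume samples from the platform's accelerometer API (e.g. JSR 256 on Java ME) instead of a synthetic list; production algorithms typically add low-pass filtering and adaptive thresholds.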
Agent independent task planning
NASA Technical Reports Server (NTRS)
Davis, William S.
1990-01-01
Agent-Independent Planning is a technique that allows the construction of activity plans without regard to the agent that will perform them. Once generated, a plan is then validated and translated into instructions for a particular agent, whether a robot, crewmember, or software-based control system. Because Space Station Freedom (SSF) is planned for orbital operations for approximately thirty years, it will almost certainly experience numerous enhancements and upgrades, including upgrades in robotic manipulators. Agent-Independent Planning provides the capability to construct plans for SSF operations, independent of specific robotic systems, by combining techniques of object oriented modeling, nonlinear planning and temporal logic. Since a plan is validated using the physical and functional models of a particular agent, new robotic systems can be developed and integrated with existing operations in a robust manner. This technique also provides the capability to generate plans for crewmembers with varying skill levels, and later apply these same plans to more sophisticated robotic manipulators made available by evolutions in technology.
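The translation step described above, from an abstract plan to instructions for a particular agent, can be sketched as a lookup against an agent's capability model, with validation failing when the agent cannot perform an abstract action. The data structures, action names, and instruction strings here are illustrative assumptions, not the actual SSF planning system.

```python
def translate_plan(plan, agent):
    """Translate an agent-independent plan (abstract actions plus
    arguments) into agent-specific instructions, validating each
    action against the agent's capability model."""
    instructions = []
    for action, args in plan:
        if action not in agent["capabilities"]:
            raise ValueError(f"{agent['name']} cannot perform {action!r}")
        instructions.append(agent["capabilities"][action].format(**args))
    return instructions

# One abstract plan, validated and translated for two different agents.
plan = [("move_to", {"target": "rack_3"}),
        ("grasp", {"object": "ORU-7"})]

robot = {"name": "manipulator", "capabilities": {
    "move_to": "SERVO TO {target}",
    "grasp": "CLOSE GRIPPER ON {object}"}}
crew = {"name": "crewmember", "capabilities": {
    "move_to": "Translate to {target}",
    "grasp": "Retrieve {object}"}}

robot_steps = translate_plan(plan, robot)
crew_steps = translate_plan(plan, crew)
```

Because the plan itself never mentions an agent, a new robotic system is integrated by supplying a new capability model rather than by regenerating plans, which is the robustness property the abstract emphasizes.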
Developing a Treatment Planning Software Based on TG-43U1 Formalism for Cs-137 LDR Brachytherapy.
Sina, Sedigheh; Faghihi, Reza; Soleimani Meigooni, Ali; Siavashpour, Zahra; Mosleh-Shirazi, Mohammad Amin
2013-08-01
The old Treatment Planning Systems (TPSs) used for intracavitary brachytherapy with the Cs-137 Selectron source utilize traditional dose calculation methods that consider each source as a point source. Using such methods introduces significant errors in dose estimation. Since 1995, TG-43 has been the main dose calculation formalism in TPSs. The purpose of this study is to design and establish treatment planning software for the Cs-137 Selectron brachytherapy source based on the TG-43U1 formalism, applying the effects of the applicator and dummy spacers. The two software packages used for treatment planning of Cs-137 sources in Iran (STPS and PLATO) are based on the old formalisms. In the new planning system, the dosimetry parameters of each pellet at different positions inside the applicators were obtained with the MCNP4c code, and the dose distribution around every combination of active and inactive pellets was obtained by summing the single-pellet doses. The accuracy of this algorithm was checked by comparing its results for specific combinations of active and inactive pellets with MC simulations. Finally, the uncertainty of the old dose calculation formalism was investigated by comparing the results of the STPS and PLATO packages with those obtained by the new algorithm. For a typical arrangement of 10 active pellets in the applicator, the percentage difference between doses obtained by the new algorithm at 1 cm distance from the tip of the applicator and those obtained by the old formalisms is about 30%, while the difference between the results of MCNP and the new algorithm is less than 5%. According to the results, the old dosimetry formalisms overestimate the dose, especially towards the applicator's tip, while the TG-43U1-based software performs the calculations more accurately.
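The superposition step described above, the dose around a pellet train as the sum of single-pellet contributions, can be sketched with a drastically simplified point-source form of TG-43 (anisotropy and radial dose function taken as 1, and a placeholder dose-rate constant). All parameter values and names below are illustrative assumptions, not the paper's MCNP-derived dosimetry data.

```python
def pellet_dose_rate(source_pos, point, air_kerma=1.0, dose_rate_const=1.11,
                     radial_g=lambda r: 1.0):
    """Simplified point-source TG-43 dose rate at `point` from one pellet:
    D = S_k * Lambda * g(r) / r^2, with the anisotropy factor taken as 1.
    The default Lambda and g(r) are placeholders, not measured values."""
    r2 = sum((a - b) ** 2 for a, b in zip(source_pos, point))
    return air_kerma * dose_rate_const * radial_g(r2 ** 0.5) / r2

def total_dose_rate(active_pellets, point, **kw):
    """Superposition: the dose around any arrangement of active pellets
    is the sum of the single-pellet contributions."""
    return sum(pellet_dose_rate(p, point, **kw) for p in active_pellets)

# Train of 3 active pellets spaced 0.25 cm along z; dose at 1 cm off-axis.
pellets = [(0.0, 0.0, 0.25 * k) for k in range(3)]
point = (1.0, 0.0, 0.0)
d = total_dose_rate(pellets, point)
```

In the actual TPS, each pellet's dosimetry parameters additionally depend on its position inside the applicator (obtained from MCNP4c), so the per-pellet function would be a table lookup rather than a closed-form expression.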
76 FR 66125 - Petition for Waiver of Compliance
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-25
..., as well as testing. BNSF states it created a Web-based software application that it characterizes as.... Specifically, BNSF is proposing to use Web-based software to satisfy the ``hands-on'' portion of training... scenario. The employee must maneuver the avatar in the virtual setting and perform all inspection tasks...
Software Defined Radio Standard Architecture and its Application to NASA Space Missions
NASA Technical Reports Server (NTRS)
Andro, Monty; Reinhart, Richard C.
2006-01-01
A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development, such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community, with appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture, and other aspects considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing, and software maintenance. While software defined radios offer greater flexibility, they also pose challenges to radio development for the space environment in terms of size, mass, power consumption, and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, industry, academia, and standards bodies are key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and give a brief overview of a candidate architecture under consideration for space-based platforms.
Agents, Bayes, and Climatic Risks - a modular modelling approach
NASA Astrophysics Data System (ADS)
Haas, A.; Jaeger, C.
2005-08-01
When insurance firms, energy companies, governments, NGOs, and other agents strive to manage climatic risks, it is by no means clear what the aggregate outcome should and will be. As a framework for investigating this subject, we present the LAGOM model family. It is based on modules depicting learning social agents. To manage climate risks, our agents use second-order probabilities and update them by means of a Bayesian mechanism, while differing in priors and risk aversion. The interactions between these modules and the aggregate outcomes of their actions are implemented using further modules. The software system is implemented as a series of parallel processes using the CIAMn approach. It is possible to couple modules irrespective of the language they are written in, the operating system under which they run, and the physical location of the machine.
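The Bayesian mechanism described above can be illustrated with a minimal sketch. Assuming, purely for illustration (this is not the LAGOM implementation), that each agent's second-order belief about an event probability is a Beta distribution, a conjugate update after each observed outcome looks like:

```python
def update_beta(alpha, beta, event_occurred):
    """Conjugate Bayesian update of a Beta(alpha, beta) belief about an
    event probability after one observed Bernoulli outcome."""
    return (alpha + 1, beta) if event_occurred else (alpha, beta + 1)

def mean(alpha, beta):
    """Expected event probability under the current Beta belief."""
    return alpha / (alpha + beta)

# Two agents with different priors: the risk-averse one starts more pessimistic.
cautious = (4.0, 2.0)   # prior mean 4/6 ~ 0.67
neutral = (1.0, 1.0)    # uniform prior, mean 0.5

# Both observe the same sequence of loss events but keep distinct beliefs.
for occurred in [True, False, True, True]:
    cautious = update_beta(*cautious, occurred)
    neutral = update_beta(*neutral, occurred)

print(mean(*cautious))  # 7/10 = 0.7
print(mean(*neutral))   # 4/6 ~ 0.667
```

Agents that differ only in their priors converge toward each other as shared evidence accumulates, which is the behavior the abstract's "differing in priors and risk aversion" setup exercises.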
A Linguistic Model in Component Oriented Programming
NASA Astrophysics Data System (ADS)
Crăciunean, Daniel Cristian; Crăciunean, Vasile
2016-12-01
Component-oriented programming, when well organized, can bring a large increase in efficiency to the development of large software systems. This paper proposes a model for building software systems by assembling components that can operate independently of each other. The model is based on a computing environment that runs parallel and distributed applications. The paper introduces concepts such as the abstract aggregation scheme and the aggregation application. Basically, an aggregation application is an application obtained by combining corresponding components. In our model, an aggregation application is a word in a language.
Melanin-Based Contrast Agents for Biomedical Optoacoustic Imaging and Theranostic Applications.
Longo, Dario Livio; Stefania, Rachele; Aime, Silvio; Oraevsky, Alexander
2017-08-07
Optoacoustic imaging emerged in the early 1990s as a new biomedical imaging technology that generates images by illuminating tissues with short laser pulses and detecting the resulting ultrasound waves. This technique takes advantage of the spectroscopic approach to molecular imaging and delivers high-resolution images in the depth of tissue. The resolution of optoacoustic imaging is scalable, so that biomedical systems from cellular organelles to large organs can be visualized and, more importantly, characterized based on their optical absorption coefficient, which is proportional to the concentration of absorbing chromophores. Optoacoustic imaging was shown to be useful both in preclinical research using small animal models and in clinical applications. Applications in the field of molecular imaging offer abundant opportunities for the development of highly specific and effective contrast agents for quantitative optoacoustic imaging. Recent efforts are being made in the direction of nontoxic biodegradable contrast agents (such as nanoparticles made of melanin) that are potentially applicable in clinical optoacoustic imaging. In order to increase the efficiency and specificity of contrast agents and probes, they need to be made smart and capable of controlled accumulation in the target cells. This review was written in recognition of the potential breakthroughs in medical optoacoustic imaging that can be enabled by efficient and nontoxic melanin-based optoacoustic contrast agents.
Melanin-Based Contrast Agents for Biomedical Optoacoustic Imaging and Theranostic Applications
Longo, Dario Livio; Aime, Silvio
2017-01-01
Optoacoustic imaging emerged in the early 1990s as a new biomedical imaging technology that generates images by illuminating tissues with short laser pulses and detecting the resulting ultrasound waves. This technique takes advantage of the spectroscopic approach to molecular imaging and delivers high-resolution images in the depth of tissue. The resolution of optoacoustic imaging is scalable, so that biomedical systems from cellular organelles to large organs can be visualized and, more importantly, characterized based on their optical absorption coefficient, which is proportional to the concentration of absorbing chromophores. Optoacoustic imaging was shown to be useful both in preclinical research using small animal models and in clinical applications. Applications in the field of molecular imaging offer abundant opportunities for the development of highly specific and effective contrast agents for quantitative optoacoustic imaging. Recent efforts are being made in the direction of nontoxic biodegradable contrast agents (such as nanoparticles made of melanin) that are potentially applicable in clinical optoacoustic imaging. In order to increase the efficiency and specificity of contrast agents and probes, they need to be made smart and capable of controlled accumulation in the target cells. This review was written in recognition of the potential breakthroughs in medical optoacoustic imaging that can be enabled by efficient and nontoxic melanin-based optoacoustic contrast agents. PMID:28783106
μ-PADs for detection of chemical warfare agents.
Pardasani, Deepak; Tak, Vijay; Purohit, Ajay K; Dubey, D K
2012-12-07
Conventional methods for the detection of chemical warfare agents (CWAs) based on chromogenic reactions are time- and solvent-intensive. The development of cost-, time- and solvent-effective microfluidic paper-based analytical devices (μ-PADs) for the detection of nerve and vesicant agents is described. The detection of analytes was based upon their reactions with rhodamine hydroxamate and para-nitrobenzyl pyridine, producing red and blue colours, respectively. Reactions were optimized on the μ-PADs to produce limits of detection (LODs) as low as 100 μM for sulfur mustard in aqueous samples. Results were quantified with the help of a simple desktop scanner and Photoshop software. Sarin showed a linear response in the two concentration ranges of 20-100 mM and 100-500 mM, whereas the response of sulfur mustard was found to be linear in the concentration range of 10-75 mM. Results were precise enough to establish the μ-PADs as a valuable tool for security personnel fighting chemical terrorism.
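The scanner-based quantification step described above amounts to fitting a calibration line of colour intensity against concentration and inverting it for unknown samples. A minimal sketch with purely illustrative numbers (not the paper's data):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

# Hypothetical calibration: sulfur-mustard concentration (mM) vs mean
# colour-channel intensity read from scanned spot images (invented values).
conc = [10, 25, 50, 75]
intensity = [30, 60, 110, 160]

m, b = fit_line(conc, intensity)

def estimate_conc(measured_intensity):
    """Invert the calibration line to quantify an unknown sample."""
    return (measured_intensity - b) / m

print(estimate_conc(110))  # lies on the fitted line at 50 mM
```

In practice the intensity would come from averaging pixel values over each reaction spot; the linear range quoted in the abstract bounds where such a fit is valid.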
Introduction to Architectures: HSCB Information - What It Is and How It Fits (or Doesn’t Fit)
2010-10-01
Simulation Interoperability Workshop, 01E-SIW-080 [15] Barry G. Silverman, Gnana Gharathy, Kevin O'Brien, Jason Cornwell, "Human Behavior Models for Agents...Workshop, 10F-SIW-023, September 2010. [17] Christiansen, John H., "A flexible object-based software framework for modelling complex systems with
Specification-based software sizing: An empirical investigation of function metrics
NASA Technical Reports Server (NTRS)
Jeffery, Ross; Stathis, John
1993-01-01
For some time the software industry has espoused the need for improved specification-based software size metrics. This paper reports on a study of nineteen recently developed systems in a variety of application domains. The systems were developed by a single software services corporation using a variety of languages. The study investigated several metric characteristics. It shows that: earlier research into inter-item correlation within the overall function count is partially supported; a priori function counts, by themselves, do not explain the majority of the effort variation in software development in the organization studied; documentation quality is critical to accurate function identification; and rater error is substantial in manual function counting. The implications of these findings for organizations using function-based metrics are explored.
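Function counts of the kind studied here are weighted sums over the five IFPUG function types. A minimal sketch using the standard average-complexity weights (the counts below are invented for illustration):

```python
# IFPUG average-complexity weights for the five function types:
# external inputs (EI), external outputs (EO), external inquiries (EQ),
# internal logical files (ILF), external interface files (EIF).
WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}

def unadjusted_fp(counts):
    """Unadjusted function points: weighted sum of the counted functions."""
    return sum(WEIGHTS[t] * n for t, n in counts.items())

# Hypothetical counts taken from a requirements specification.
spec = {"EI": 12, "EO": 8, "EQ": 5, "ILF": 4, "EIF": 2}
print(unadjusted_fp(spec))  # 12*4 + 8*5 + 5*4 + 4*10 + 2*7 = 162
```

The "rater error" finding in the abstract concerns exactly the counting step feeding this sum: two raters who classify the same specification items differently produce different totals even though the arithmetic is trivial.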
Applications of software-defined radio (SDR) technology in hospital environments.
Chávez-Santiago, Raúl; Mateska, Aleksandra; Chomu, Konstantin; Gavrilovska, Liljana; Balasingham, Ilangko
2013-01-01
A software-defined radio (SDR) is a radio communication system where the major part of its functionality is implemented by means of software in a personal computer or embedded system. Such a design paradigm has the major advantage of producing devices that can receive and transmit widely different radio protocols based solely on the software used. This flexibility opens several application opportunities in hospital environments, where a large number of wired and wireless electronic devices must coexist in confined areas like operating rooms and intensive care units. This paper outlines some possible applications in the 2360-2500 MHz frequency band. These applications include the integration of wireless medical devices in a common communication platform for seamless interoperability, and cognitive radio (CR) for body area networks (BANs) and wireless sensor networks (WSNs) for medical environmental surveillance. The description of a proof-of-concept CR prototype is also presented.
The component-based architecture of the HELIOS medical software engineering environment.
Degoulet, P; Jean, F C; Engelmann, U; Meinzer, H P; Baud, R; Sandblad, B; Wigertz, O; Le Meur, R; Jagermann, C
1994-12-01
The constitution of highly integrated health information networks and the growth of multimedia technologies raise new challenges for the development of medical applications. We describe in this paper the general architecture of the HELIOS medical software engineering environment, devoted to the development and maintenance of multimedia distributed medical applications. HELIOS is made of a set of software components federated by a communication channel called the HELIOS Unification Bus. The HELIOS kernel includes three main components: the Analysis-Design Environment, the Object Information System, and the Interface Manager. HELIOS services consist of a collection of toolkits providing the necessary facilities to medical application developers. They include Image-Related services, a Natural Language Processor, a Decision Support System, and Connection services. The project gives special attention to both object-oriented approaches and software reusability, which are considered crucial steps towards the development of more reliable, coherent, and integrated applications.
DigiSeis—A software component for digitizing seismic signals using the PC sound card
NASA Astrophysics Data System (ADS)
Amin Khan, Khalid; Akhter, Gulraiz; Ahmad, Zulfiqar
2012-06-01
An innovative software-based approach to develop an inexpensive experimental seismic recorder is presented. This approach requires no hardware as the built-in PC sound card is used for digitization of seismic signals. DigiSeis, an ActiveX component is developed to capture the digitized seismic signals from the sound card and deliver them to applications for processing and display. A seismic recorder application software SeisWave is developed over this component, which provides real-time monitoring and display of seismic events picked by a pair of external geophones. This recorder can be used as an educational aid for conducting seismic experiments. It can also be connected with suitable seismic sensors to record earthquakes. The software application and the ActiveX component are available for download. This component can be used to develop seismic recording applications according to user specific requirements.
The Use of Software Agents for Autonomous Control of a DC Space Power System
NASA Technical Reports Server (NTRS)
May, Ryan D.; Loparo, Kenneth A.
2014-01-01
In order to enable manned deep-space missions, the spacecraft must be controlled autonomously using on-board algorithms. A control architecture is proposed to enable this autonomous operation for a spacecraft electric power system and is then implemented using a highly distributed network of software agents. These agents collaborate and compete with each other in order to implement each of the control functions. A subset of this control architecture is tested against a steady-state power system simulation and found to be able to solve a constrained optimization problem with competing objectives using only local information.
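One common way for distributed agents to solve a constrained optimization problem using only local information is price-based coordination: a coordinator broadcasts a price signal, and each load agent responds using only its own utility function. The abstract does not specify the paper's mechanism, so the sketch below is a generic illustration with invented utilities:

```python
def demand(price, a):
    """Each load agent locally maximizes a*log(1+p) - price*p.
    Setting the derivative a/(1+p) - price to zero gives p = a/price - 1."""
    return max(0.0, a / price - 1.0)

def allocate(agent_prefs, supply, iters=200, step=0.01):
    """Price coordinator: raise the price when total demand exceeds supply,
    lower it when supply goes unused. Agents only ever see the price."""
    price = 1.0
    for _ in range(iters):
        total = sum(demand(price, a) for a in agent_prefs)
        price = max(1e-6, price + step * (total - supply))
    return price, [demand(price, a) for a in agent_prefs]

# Three loads with different preference weights sharing 6.0 units of power.
price, loads = allocate([2.0, 4.0, 6.0], supply=6.0)
print(round(sum(loads), 2))  # total demand settles near the 6.0 supply limit
```

The appeal for spacecraft use is that no agent needs the global system model: the price iteration carries all the coupling between competing objectives.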
Remote hardware-reconfigurable robotic camera
NASA Astrophysics Data System (ADS)
Arias-Estrada, Miguel; Torres-Huitzil, Cesar; Maya-Rueda, Selene E.
2001-10-01
In this work, a camera with integrated image processing capabilities is discussed. The camera is based on an imager coupled to an FPGA device (Field Programmable Gate Array) which contains an architecture for real-time computer vision low-level processing. The architecture can be reprogrammed remotely for application specific purposes. The system is intended for rapid modification and adaptation for inspection and recognition applications, with the flexibility of hardware and software reprogrammability. FPGA reconfiguration allows the same ease of upgrade in hardware as a software upgrade process. The camera is composed of a digital imager coupled to an FPGA device, two memory banks, and a microcontroller. The microcontroller is used for communication tasks and FPGA programming. The system implements a software architecture to handle multiple FPGA architectures in the device, and the possibility to download a software/hardware object from the host computer into its internal context memory. System advantages are: small size, low power consumption, and a library of hardware/software functionalities that can be exchanged during run time. The system has been validated with an edge detection and a motion processing architecture, which will be presented in the paper. Applications targeted are in robotics, mobile robotics, and vision based quality control.
Knowledge-based approach for generating target system specifications from a domain model
NASA Technical Reports Server (NTRS)
Gomaa, Hassan; Kerschberg, Larry; Sugumaran, Vijayan
1992-01-01
Several institutions in industry and academia are pursuing research efforts in domain modeling to address unresolved issues in software reuse. To demonstrate the concepts of domain modeling and software reuse, a prototype software engineering environment is being developed at George Mason University to support the creation of domain models and the generation of target system specifications. This prototype environment, which is application domain independent, consists of an integrated set of commercial off-the-shelf software tools and custom-developed software tools. This paper describes the knowledge-based tool that was developed as part of the environment to generate target system specifications from a domain model.
A Biosequence-based Approach to Software Characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oehmen, Christopher S.; Peterson, Elena S.; Phillips, Aaron R.
For many applications, it is desirable to have some process for recognizing when software binaries are closely related without relying on them to be identical or to have identical segments. Some examples include monitoring utilization of high performance computing centers or service clouds, detecting freeware in licensed code, and enforcing application whitelists. But doing so in a dynamic environment is a nontrivial task, because most approaches to software similarity require extensive and time-consuming analysis of a binary, or they fail to recognize executables that are similar but nonidentical. Presented herein is a novel biosequence-based method for quantifying the similarity of executable binaries. Using this method, it is shown in an example application on large-scale multi-author codes that 1) the biosequence-based method has a statistical performance in recognizing and distinguishing between a collection of real-world high performance computing applications better than 90% of ideal; and 2) an example of using family tree analysis to tune identification for a code subfamily can achieve better than 99% of ideal performance.
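The core idea, mapping binary content into a protein-like alphabet so that sequence-alignment machinery can score similarity, can be sketched as follows. This toy version uses Python's difflib as a stand-in for the BLAST-style alignment the authors' method implies; the byte-to-letter mapping is invented for illustration:

```python
from difflib import SequenceMatcher

AMINO = "ACDEFGHIKLMNPQRSTVWY"  # the 20-letter protein alphabet

def to_sequence(binary: bytes) -> str:
    """Map each byte into the 20-letter alphabet so tools built for
    biological sequences can compare executables."""
    return "".join(AMINO[b % len(AMINO)] for b in binary)

def similarity(bin_a: bytes, bin_b: bytes) -> float:
    """Alignment-style similarity in [0, 1]; 1.0 means identical sequences.
    autojunk=False keeps frequent letters from being discarded as junk."""
    return SequenceMatcher(None, to_sequence(bin_a), to_sequence(bin_b),
                           autojunk=False).ratio()

base = bytes(range(200))
patched = bytes(range(100)) + b"\x00\x01\x02" + bytes(range(100, 200))

print(similarity(base, base))     # identical binaries score 1.0
print(similarity(base, patched))  # small insertion barely lowers the score
```

The key property the abstract relies on is visible here: an inserted or modified region lowers the score gracefully instead of destroying the match, which is exactly what hash- or segment-based comparisons fail to do.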
Sarkar, Archana; Dutta, Arup; Dhingra, Usha; Dhingra, Pratibha; Verma, Priti; Juyal, Rakesh; Black, Robert E; Menon, Venugopal P; Kumar, Jitendra; Sazawal, Sunil
2006-08-01
In settings in developing countries, children often socialize with multiple socializing agents (peers, siblings, neighbors) apart from their parents, and thus, a measurement of a child's social interactions should be expanded beyond parental interactions. Since the environment plays a role in shaping a child's development, the measurement of child-socializing agents' interactions is important. We developed and used a computerized observational software Behavior and Social Interaction Software (BASIS) with a preloaded coding scheme installed on a handheld Palm device to record complex observations of interactions between children and socializing agents. Using BASIS, social interaction assessments were conducted on 573 preschool children for 1 h in their natural settings. Multiple screens with a set of choices in each screen were designed that included the child's location, broad activity, state, and interactions with child-socializing agents. Data were downloaded onto a computer and systematically analyzed. BASIS, installed on Palm OS (M-125), enabled the recording of the complex interactions of child-socializing agents that could not be recorded with manual forms. Thus, this tool provides an innovative and relatively accurate method for the systematic recording of social interactions in an unrestricted environment.
Autonomous Mission Operations for Sensor Webs
NASA Astrophysics Data System (ADS)
Underbrink, A.; Witt, K.; Stanley, J.; Mandl, D.
2008-12-01
We present interim results of a 2005 ROSES AIST project entitled "Using Intelligent Agents to Form a Sensor Web for Autonomous Mission Operations", or SWAMO. The goal of the SWAMO project is to shift the control of spacecraft missions from a ground-based, centrally controlled architecture to a collaborative, distributed set of intelligent agents. The network of intelligent agents intends to reduce management requirements by utilizing model-based system prediction and autonomic model/agent collaboration. SWAMO agents are distributed throughout the Sensor Web environment, which may include multiple spacecraft, aircraft, ground systems, and ocean systems, as well as manned operations centers. The agents monitor and manage sensor platforms, Earth sensing systems, and Earth sensing models and processes. The SWAMO agents form a Sensor Web of agents via peer-to-peer coordination. Some of the intelligent agents are mobile and able to traverse between on-orbit and ground-based systems. Other agents in the network are responsible for encapsulating system models to perform prediction of future behavior of the modeled subsystems and components to which they are assigned. The software agents use semantic web technologies to enable improved information sharing among the operational entities of the Sensor Web. The semantics include ontological conceptualizations of the Sensor Web environment, plus conceptualizations of the SWAMO agents themselves. By conceptualizations of the agents, we mean knowledge of their state, operational capabilities, current operational capacities, Web Service search and discovery results, agent collaboration rules, etc. The need for ontological conceptualizations of the agents is to enable autonomous and autonomic operations of the Sensor Web. The SWAMO ontology enables automated decision making and responses to the dynamic Sensor Web environment and to end-user science requests. The current ontology is compatible with Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) Sensor Model Language (SensorML) concepts and structures. The agents are currently deployed on the U.S. Naval Academy MidSTAR-1 satellite and are actively managing the power subsystem on-orbit without the need for human intervention.
Resource Management for Real-Time Adaptive Agents
NASA Technical Reports Server (NTRS)
Welch, Lonnie; Chelberg, David; Pfarr, Barbara; Fleeman, David; Parrott, David; Tan, Zhen-Yu; Jain, Shikha; Drews, Frank; Bruggeman, Carl; Shuler, Chris
2003-01-01
Increased autonomy and automation in onboard flight systems offer numerous potential benefits, including cost reduction and greater flexibility. The existence of generic mechanisms for automation is critical for handling unanticipated science events and anomalies, where limitations in traditional control software with fixed, predetermined algorithms can mean loss of science data and missed opportunities for observing important terrestrial events. We have developed such a mechanism by adding a Hierarchical Agent-based Real-Time (HART) technology extension to our Dynamic Resource Management (DRM) middleware. Traditional DRM provides mechanisms to monitor the real-time performance of distributed applications and to move applications among processors to improve real-time performance. In the HART project we have designed and implemented a performance adaptation mechanism to improve real-time performance. To use this mechanism, applications are developed that can run at various levels of quality. The DRM can choose a setting for the quality level of an application dynamically at run-time in order to manage satellite resource usage more effectively. A ground-based prototype of a satellite system that captures and processes images has also been developed as part of this project to be used as a benchmark for evaluating the resource management framework. A significant enhancement of this generic, mission-independent framework allows scientists to specify the utility, or "scientific benefit," of science observations under various conditions like cloud cover and compression method. The resource manager then uses these benefit tables to determine in real-time how to set the quality levels for applications to maximize overall system utility as defined by the scientists running the mission. We also show how maintenance functions like health and safety data can be integrated into the utility framework. Once this framework has been certified for missions and successfully flight tested, it can be reused with little development overhead for other missions. In contrast, current space missions like Swift manage similar types of resource trade-offs completely within the scientific application code itself, and such code must be re-certified and tested for each mission even if a large portion of the code base is shared. This final report discusses some of the major issues motivating this research effort, provides a literature review of the related work, discusses the resource management framework and ground-based satellite system prototype that has been developed, indicates what work is yet to be performed, and provides a list of publications resulting from this work.
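Choosing quality levels from benefit tables to maximize total utility under a resource budget can be sketched as a small exhaustive search. The tables below are invented for illustration; the HART framework's actual tables and solver are not described in this summary:

```python
from itertools import product

# Hypothetical benefit tables: (cpu_cost, scientific_benefit) per quality level.
APPS = {
    "imager":     [(1, 2), (2, 5), (4, 7)],   # low / medium / high quality
    "compressor": [(1, 1), (3, 4)],
    "health":     [(1, 3)],                   # health & safety always runs
}

def best_levels(apps, cpu_budget):
    """Exhaustively pick one quality level per application, maximizing total
    benefit without exceeding the CPU budget (fine for a handful of apps)."""
    names = list(apps)
    best, best_benefit = None, -1
    for choice in product(*(range(len(apps[n])) for n in names)):
        cost = sum(apps[n][i][0] for n, i in zip(names, choice))
        benefit = sum(apps[n][i][1] for n, i in zip(names, choice))
        if cost <= cpu_budget and benefit > best_benefit:
            best, best_benefit = dict(zip(names, choice)), benefit
    return best, best_benefit

levels, benefit = best_levels(APPS, cpu_budget=6)
print(levels, benefit)
```

With a tight budget the search trades a high-quality imager run against a better compressor setting, which mirrors the cloud-cover/compression trade-off the report describes; a flight implementation would replace brute force with an incremental or knapsack-style solver.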
Genoviz Software Development Kit: Java tool kit for building genomics visualization applications.
Helt, Gregg A; Nicol, John W; Erwin, Ed; Blossom, Eric; Blanchard, Steven G; Chervitz, Stephen A; Harmon, Cyrus; Loraine, Ann E
2009-08-25
Visualization software can expose previously undiscovered patterns in genomic data and advance biological science. The Genoviz Software Development Kit (SDK) is an open source, Java-based framework designed for rapid assembly of visualization software applications for genomics. The Genoviz SDK framework provides a mechanism for incorporating adaptive, dynamic zooming into applications, a desirable feature of genome viewers. Visualization capabilities of the Genoviz SDK include automated layout of features along genetic or genomic axes; support for user interactions with graphical elements (Glyphs) in a map; a variety of Glyph sub-classes that promote experimentation with new ways of representing data in graphical formats; and support for adaptive, semantic zooming, whereby objects change their appearance depending on zoom level and zooming rate adapts to the current scale. Freely available demonstration and production quality applications, including the Integrated Genome Browser, illustrate Genoviz SDK capabilities. Separation between graphics components and genomic data models makes it easy for developers to add visualization capability to pre-existing applications or build new applications using third-party data models. Source code, documentation, sample applications, and tutorials are available at http://genoviz.sourceforge.net/.
A problem of optimal control and observation for distributed homogeneous multi-agent system
NASA Astrophysics Data System (ADS)
Kruglikov, Sergey V.
2017-12-01
The paper considers the implementation of an algorithm for controlling a distributed complex of several mobile multi-robots. The concept of a unified information space of the controlling system is applied. The presented information and mathematical models of participants and obstacles, as real agents, and of goals and scenarios, as virtual agents, form the basis of the algorithmic and software background for a computer decision support system. The controlling scheme assumes indirect management of the robotic team on the basis of an optimal control and observation problem predicting intelligent behavior in a dynamic, hostile environment. A basic benchmark problem is compound cargo transportation by a group of participants under a distributed control scheme in terrain with multiple obstacles.
An Agent Inspired Reconfigurable Computing Implementation of a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Weir, John M.; Wells, B. Earl
2003-01-01
Many software systems have been successfully implemented using an agent paradigm which employs a number of independent entities that communicate with one another to achieve a common goal. The distributed nature of such a paradigm makes it an excellent candidate for use in high speed reconfigurable computing hardware environments such as those present in modern FPGAs. In this paper, a distributed genetic algorithm that can be applied to the agent-based reconfigurable hardware model is introduced. The effectiveness of this new algorithm is evaluated by comparing the quality of the solutions found by the new algorithm with those found by traditional genetic algorithms. The performance of a reconfigurable hardware implementation of the new algorithm on an FPGA is compared to traditional single processor implementations.
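A distributed genetic algorithm of the island-model kind, where independent populations evolve in parallel and periodically exchange their best individuals, can be sketched in software as follows. This is a generic illustration on a onemax toy problem, not the paper's FPGA implementation:

```python
import random

random.seed(1)
GENES = 20  # toy objective: maximize the number of 1-bits

def fitness(ind):
    return sum(ind)

def evolve(pop, generations=30, mutation=0.05):
    """One island: tournament selection, uniform crossover, bit-flip mutation."""
    for _ in range(generations):
        nxt = []
        for _ in range(len(pop)):
            a = max(random.sample(pop, 3), key=fitness)
            b = max(random.sample(pop, 3), key=fitness)
            child = [random.choice(pair) for pair in zip(a, b)]
            child = [1 - g if random.random() < mutation else g for g in child]
            nxt.append(child)
        pop = nxt
    return pop

def island_ga(islands=4, size=20, rounds=3):
    """Independent islands evolve, then migrate their best individual to a
    ring neighbor, mimicking agents exchanging results over an FPGA fabric."""
    pops = [[[random.randint(0, 1) for _ in range(GENES)] for _ in range(size)]
            for _ in range(islands)]
    for _ in range(rounds):
        pops = [evolve(p) for p in pops]
        bests = [max(p, key=fitness) for p in pops]
        for i, p in enumerate(pops):
            p[0] = bests[(i - 1) % islands]  # ring migration
    return max((max(p, key=fitness) for p in pops), key=fitness)

best = island_ga()
print(fitness(best))
```

The independence of islands between migrations is what maps naturally onto agent-style hardware blocks: each island needs only its own population memory plus a narrow channel for the migrated individual.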
An application framework for computer-aided patient positioning in radiation therapy.
Liebler, T; Hub, M; Sanner, C; Schlegel, W
2003-09-01
The importance of exact patient positioning in radiation therapy increases with the ongoing improvements in irradiation planning and treatment. Therefore, new ways to overcome precision limitations of current positioning methods in fractionated treatment have to be found. The Department of Medical Physics at the German Cancer Research Centre (DKFZ) follows different video-based approaches to increase repositioning precision. In this context, the modular software framework FIVE (Fast Integrated Video-based Environment) has been designed and implemented. It is both hardware- and platform-independent and supports merging position data by integrating various computer-aided patient positioning methods. A highly precise optical tracking system and several subtraction imaging techniques have been realized as modules to supply basic video-based repositioning techniques. This paper describes the common framework architecture, the main software modules and their interfaces. An object-oriented software engineering process has been applied using the UML, C++ and the Qt library. The significance of the current framework prototype for application in patient positioning, as well as its extension to further application areas, will be discussed. Particularly in experimental research, where special system adjustments are often necessary, the open design of the software allows problem-oriented extensions and adaptations.
iCrowd: agent-based behavior modeling and crowd simulator
NASA Astrophysics Data System (ADS)
Kountouriotis, Vassilios I.; Paterakis, Manolis; Thomopoulos, Stelios C. A.
2016-05-01
Initially designed in the context of the TASS (Total Airport Security System) FP-7 project, the Crowd Simulation platform developed by the Integrated Systems Lab of the Institute of Informatics and Telecommunications at N.C.S.R. Demokritos has evolved into a complete domain-independent agent-based behavior simulator with an emphasis on crowd behavior and building evacuation simulation. Under continuous development, it reflects an effort to implement a modern, multithreaded, data-oriented simulation engine employing the latest state-of-the-art programming technologies and paradigms. It is based on an extensible architecture that separates core services from the individual layers of agent behavior, offering a concrete simulation kernel designed for high performance and stability. Its primary goal is to deliver an abstract platform to facilitate implementation of several Agent-Based Simulation solutions with applicability in several domains of knowledge, such as: (i) Crowd behavior simulation during [in/out]door evacuation. (ii) Non-Player Character AI for game-oriented applications and gamification activities. (iii) Vessel traffic modeling and simulation for Maritime Security and Surveillance applications. (iv) Urban and Highway Traffic and Transportation Simulations. (v) Social Behavior Simulation and Modeling.
Learning Fraction Comparison by Using a Dynamic Mathematics Software--GeoGebra
ERIC Educational Resources Information Center
Poon, Kin Keung
2018-01-01
GeoGebra is a mathematics software system that can serve as a tool for inquiry-based learning. This paper deals with the application of fraction comparison software, constructed with GeoGebra, for use in a dynamic mathematics environment. The corresponding teaching and learning issues have also been discussed.
Checklists for the Evaluation of Educational Software: Critical Review and Prospects.
ERIC Educational Resources Information Center
Tergan, Sigmar-Olaf
1998-01-01
Reviews the strengths and weaknesses of checklists for the evaluation of computer software and outlines the consequences for their practical application. Suggests an approach based on an instructional design model and a comprehensive framework to cope with problems of validity and predictive power of software evaluation. Discusses prospects of the…
Learning fraction comparison by using a dynamic mathematics software - GeoGebra
NASA Astrophysics Data System (ADS)
Poon, Kin Keung
2018-04-01
GeoGebra is a mathematics software system that can serve as a tool for inquiry-based learning. This paper deals with the application of fraction comparison software, constructed with GeoGebra, for use in a dynamic mathematics environment. The corresponding teaching and learning issues have also been discussed.
Applications of artificial intelligence to mission planning
NASA Technical Reports Server (NTRS)
Ford, Donnie R.; Floyd, Stephen A.; Rogers, John S.
1990-01-01
The following subject areas are covered: object-oriented programming task; rule-based programming task; algorithms for resource allocation; connecting a Symbolics to a VAX; FORTRAN from Lisp; trees and forest task; software data structure conversion; software functionality modifications and enhancements; portability of resource allocation to a TI MicroExplorer; frontier of feasibility software system; and conclusions.
Extensive Evaluation of Using a Game Project in a Software Architecture Course
ERIC Educational Resources Information Center
Wang, Alf Inge
2011-01-01
This article describes an extensive evaluation of introducing a game project to a software architecture course. In this project, university students have to construct and design a type of software architecture, evaluate the architecture, implement an application based on the architecture, and test this implementation. In previous years, the domain…
Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa
2016-01-01
The spread of Information and Communications Technologies (ICT) in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005 to 2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve use outcomes. The ISO/IEC 25000 standard is shown as the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.
[Research progress of probe design software of oligonucleotide microarrays].
Chen, Xi; Wu, Zaoquan; Liu, Zhengchun
2014-02-01
DNA microarrays have become an essential medical genetic diagnostic tool thanks to their high throughput, miniaturization and automation. The design and selection of oligonucleotide probes are critical for preparing gene chips of high quality. Several probe design software packages have been developed and are available to perform this work. Each package targets different kinds of sequences and shows different advantages and limitations. In this article, the research and development of these packages are reviewed against three main criteria: specificity, sensitivity and melting temperature (Tm). In addition, based on experimental results from the literature, the packages are classified according to their applications. This review should help users choose appropriate probe design software. It should also reduce the cost of microarrays, improve the application efficiency of microarrays, and promote both the research and development (R&D) and commercialization of high-performance probe design software.
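None of the reviewed packages is quoted in the abstract, but the Tm criterion can be illustrated with the classic Wallace rule, a standard back-of-the-envelope estimate for short oligos (a sketch only; real probe design tools typically use more elaborate thermodynamic models):

```python
def wallace_tm(probe: str) -> int:
    """Melting temperature (deg C) of a short oligonucleotide probe via
    the Wallace rule: Tm = 2*(A+T) + 4*(G+C). A rough estimate, valid
    only for oligos of about 14 bases or fewer."""
    probe = probe.upper()
    if set(probe) - set("ACGT"):
        raise ValueError("probe must contain only A, C, G, T")
    at = probe.count("A") + probe.count("T")
    gc = probe.count("G") + probe.count("C")
    return 2 * at + 4 * gc

# A 12-mer with 50 % GC content: 2*6 + 4*6 = 36 deg C.
assert wallace_tm("ACGTACGTACGT") == 36
```

Because GC pairs contribute twice the AT weight, the rule also makes the GC-content/specificity trade-off discussed in such reviews easy to see.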
Program For Generating Interactive Displays
NASA Technical Reports Server (NTRS)
Costenbader, Jay; Moleski, Walt; Szczur, Martha; Howell, David; Engelberg, Norm; Li, Tin P.; Misra, Dharitri; Miller, Philip; Neve, Leif; Wolf, Karl;
1991-01-01
Sun/Unix version of Transportable Applications Environment Plus (TAE+) computer program provides integrated, portable software environment for developing and running interactive window, text, and graphical-object-based application software systems. Enables programmer or nonprogrammer to construct easily custom software interface between user and application program and to move resulting interface program and its application program to different computers. Plus viewed as productivity tool for application developers and application end users, who benefit from resultant consistent and well-designed user interface sheltering them from intricacies of computer. Available in form suitable for following six different groups of computers: DEC VAX station and other VMS VAX computers, Macintosh II computers running AUX, Apollo Domain Series 3000, DEC VAX and reduced-instruction-set-computer workstations running Ultrix, Sun 3- and 4-series workstations running Sun OS, and IBM RT/PC and PS/2 computers running AIX…
Another Program For Generating Interactive Graphics
NASA Technical Reports Server (NTRS)
Costenbader, Jay; Moleski, Walt; Szczur, Martha; Howell, David; Engelberg, Norm; Li, Tin P.; Misra, Dharitri; Miller, Philip; Neve, Leif; Wolf, Karl;
1991-01-01
VAX/Ultrix version of Transportable Applications Environment Plus (TAE+) computer program provides integrated, portable software environment for developing and running interactive window, text, and graphical-object-based application software systems. Enables programmer or nonprogrammer to construct easily custom software interface between user and application program and to move resulting interface program and its application program to different computers. When used throughout company for wide range of applications, makes both application program and computer seem transparent, with noticeable improvements in learning curve. Available in form suitable for following six different groups of computers: DEC VAX station and other VMS VAX computers, Macintosh II computers running AUX, Apollo Domain Series 3000, DEC VAX and reduced-instruction-set-computer workstations running Ultrix, Sun 3- and 4-series workstations running Sun OS, IBM RT/PCs and PS/2 computers running AIX, and HP 9000 S…
User-friendly technology to help family carers cope.
Chambers, Mary; Connor, Samantha L
2002-12-01
Increases in the older adult population are occurring simultaneously with a growth in new technology. Modern technology presents an opportunity to enhance the quality of life and independence of older people and their family carers through communication and access to health care information. To evaluate the usability of a multimedia software application designed to provide family carers of the elderly or disabled with information, advice and psychological support to increase their coping capacity. The interactive application consisted of an information-based package that provided carers with advice on the promotion of psychological health, including relaxation and other coping strategies. The software application also included a carer self-assessment instrument, designed to provide both family and professional carers with information to assess how family carers were coping with their care-giving role. Usability evaluation was carried out in two stages. In the first stage (verification), user trials and an evaluation questionnaire were used to refine and develop the content and usability of the multimedia software application. In the second stage (demonstration), evaluation questionnaires were used to appraise the usability of the modified software application. The findings showed that the majority of users found the software to be usable and informative. Some areas of the software's navigation were highlighted for improvement. The authors conclude that with further refinement, the software application has the potential to offer information and support to those who are caring for the elderly and disabled at home.
Software for marine ecological environment comprehensive monitoring system based on MCGS
NASA Astrophysics Data System (ADS)
Wang, X. H.; Ma, R.; Cao, X.; Cao, L.; Chu, D. Z.; Zhang, L.; Zhang, T. P.
2017-08-01
The automatic integrated monitoring software for the marine ecological environment, based on MCGS configuration software, is designed and developed to realize real-time automatic monitoring of many marine ecological parameters. A DTU data transmission terminal performs network communication and transmits the data to the user data center in a timely manner. The software adopts a modular design and has the advantages of a stable and flexible data structure, strong portability and scalability, a clear interface, simple user operation and convenient maintenance. A six-month continuous site comparison test showed that the relative error of the parameters monitored by the system, such as temperature, salinity, turbidity, pH and dissolved oxygen, was within 5% of the standard method, and the relative error of the nutrient parameters was within 15%. Meanwhile, the system required little maintenance, had a low failure rate, and provided stable and efficient continuous monitoring. Field application shows that the software is stable and the data communication is reliable, and it has a good application prospect in the field of comprehensive marine ecological environment monitoring.
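The acceptance criterion described, relative error within 5% (or 15% for nutrients) of the standard method, amounts to a one-line check; a minimal sketch, with function names of our own choosing:

```python
def relative_error(measured: float, reference: float) -> float:
    """Relative error of a sensor reading against the standard-method value."""
    if reference == 0:
        raise ValueError("reference value must be non-zero")
    return abs(measured - reference) / abs(reference)

def within_tolerance(measured: float, reference: float, tol: float) -> bool:
    """Acceptance test as described in the abstract: e.g. tol=0.05 for
    temperature/salinity/turbidity/pH/DO, tol=0.15 for nutrients."""
    return relative_error(measured, reference) <= tol

# A pH reading of 7.9 against a standard-method 8.0 is a 1.25 % error.
assert within_tolerance(7.9, 8.0, 0.05)
assert not within_tolerance(10.0, 8.0, 0.15)   # 25 % error fails the 15 % band
```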
Speeding up the screening of steroids in urine: development of a user-friendly library.
Galesio, M; López-Fdez, H; Reboiro-Jato, M; Gómez-Meire, Silvana; Glez-Peña, D; Fdez-Riverola, F; Lodeiro, Carlos; Diniz, M E; Capelo, J L
2013-12-11
This work presents a novel database search engine - MLibrary - designed to assist the user in the detection and identification of androgenic anabolic steroids (AAS) and their metabolites by matrix-assisted laser desorption/ionization (MALDI) and mass spectrometry-based strategies. The detection of AAS in the samples was accomplished (i) by searching the mass spectrometric (MS) spectra against the library developed, to identify possible positives, and (ii) by comparing the tandem mass spectrometric (MS/MS) spectra produced after fragmentation of the possible positives with a complete set of spectra previously assigned to the software. The urinary screening for anabolic agents plays a major role in anti-doping laboratories, as these agents represent the most abused drug class in sports. With the help of the MLibrary software application, the use of MALDI techniques for doping control is simplified and the time for evaluation and interpretation of the results is reduced. To do so, the search engine takes as input several MALDI-TOF-MS and MALDI-TOF-MS/MS spectra. It aids the researcher in an automatic mode by identifying possible positives in a single MS analysis and then confirming their presence in tandem MS analysis by comparing the experimental tandem mass spectrometric data with the database. Furthermore, the search engine can potentially be expanded to compounds other than AAS. The applicability of the MLibrary tool is shown through the analysis of spiked urine samples.
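MLibrary's internals are not published in the abstract, so the two-stage search it describes (flag possible positives by MS, confirm by MS/MS) can only be sketched generically; all names, tolerances and library entries below are hypothetical:

```python
def match_precursor(mz, library, tol=0.5):
    """Stage 1: flag library compounds whose precursor m/z lies within
    +/- tol of an observed MS peak (the 'possible positives')."""
    return [name for name, (prec, frags) in library.items()
            if abs(prec - mz) <= tol]

def confirm_fragments(observed_frags, ref_frags, tol=0.5, min_frac=0.7):
    """Stage 2: confirm a possible positive if enough reference MS/MS
    fragments are matched among the observed fragment peaks."""
    hits = sum(any(abs(o - r) <= tol for o in observed_frags) for r in ref_frags)
    return hits / len(ref_frags) >= min_frac

# Hypothetical two-entry library: name -> (precursor m/z, fragment m/z list).
LIB = {"steroid_A": (289.2, [97.1, 109.1, 123.1]),
       "steroid_B": (305.2, [121.1, 147.1])}

assert match_precursor(289.4, LIB) == ["steroid_A"]
assert confirm_fragments([97.0, 109.2, 123.3], LIB["steroid_A"][1])
```

The two-pass design mirrors the workflow in the abstract: a cheap precursor scan narrows the candidate list before the more expensive MS/MS comparison.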
Software Reviews. PC Software for Artificial Intelligence Applications.
ERIC Educational Resources Information Center
Epp, Helmut; And Others
1988-01-01
Contrasts artificial intelligence and conventional programming languages. Reviews Personal Consultant Plus, Smalltalk/V, and Nexpert Object, which are PC-based products inspired by problem-solving paradigms. Provides information on background and operation of each. (RT)
Application of insoluble fibers in the fining of wine phenolics.
Guerrero, Raúl F; Smith, Paul; Bindon, Keren A
2013-05-08
The application of animal-derived proteins as wine fining agents has been subject to increased regulation in recent years. As an alternative to protein-based fining agents, insoluble plant-derived fibers have the capacity to adsorb red wine tannins. Changes in red wine tannin were analyzed following application of fibers derived from apple and grape and protein-based fining agents. Other changes in wine composition, namely, color, monomeric phenolics, metals, and turbidity, were also determined. Wine tannin was maximally reduced by application of an apple pomace fiber and a grape pomace fiber (G4), removing 42 and 38%, respectively. Potassium caseinate maximally removed 19% of wine tannin, although applied at a lower dose. Fibers reduced anthocyanins, total phenolics, and wine color density, but changes in wine hue were minor. Proteins and apple fiber selectively removed high molecular mass phenolics, whereas grape fibers removed those of both high and low molecular mass. The results show that insoluble fibers may be considered as alternative fining agents for red wines.
Method and apparatus for managing transactions with connected computers
Goldsmith, Steven Y.; Phillips, Laurence R.; Spires, Shannon V.
2003-01-01
The present invention provides a method and apparatus that make use of existing computer and communication resources and that reduce the errors and delays common to complex transactions such as international shipping. The present invention comprises an agent-based collaborative work environment that assists geographically distributed commercial and government users in the management of complex transactions such as the transshipment of goods across the U.S.-Mexico border. Software agents can mediate the creation, validation and secure sharing of shipment information and regulatory documentation over the Internet, using the World-Wide Web to interface with human users.
NASA Astrophysics Data System (ADS)
Barnett, Barry S.; Bovik, Alan C.
1995-04-01
This paper presents a real time full motion video conferencing system based on the Visual Pattern Image Sequence Coding (VPISC) software codec. The prototype system hardware is comprised of two personal computers, two camcorders, two frame grabbers, and an ethernet connection. The prototype system software has a simple structure. It runs under the Disk Operating System, and includes a user interface, a video I/O interface, an event driven network interface, and a free running or frame synchronous video codec that also acts as the controller for the video and network interfaces. Two video coders have been tested in this system. Simple implementations of Visual Pattern Image Coding and VPISC have both proven to support full motion video conferencing with good visual quality. Future work will concentrate on expanding this prototype to support the motion compensated version of VPISC, as well as encompassing point-to-point modem I/O and multiple network protocols. The application will be ported to multiple hardware platforms and operating systems. The motivation for developing this prototype system is to demonstrate the practicality of software based real time video codecs. Furthermore, software video codecs are not only cheaper, but are more flexible system solutions because they enable different computer platforms to exchange encoded video information without requiring on-board protocol compatible video codec hardware. Software based solutions enable true low cost video conferencing that fits the `open systems' model of interoperability that is so important for building portable hardware and software applications.
Aircraft interrogation and display system: A ground support equipment for digital flight systems
NASA Technical Reports Server (NTRS)
Glover, R. D.
1982-01-01
A microprocessor-based general purpose ground support equipment for electronic systems was developed. The hardware and software are designed to permit diverse applications in support of aircraft flight systems and simulation facilities. The implementation of the hardware and the structure of the software are described, along with the application of the system to an ongoing research aircraft project.
ERIC Educational Resources Information Center
Murray, Tom
2016-01-01
Intelligent Tutoring Systems authoring tools are highly complex educational software applications used to produce highly complex software applications (i.e. ITSs). How should our assumptions about the target users (authors) impact the design of authoring tools? In this article I first reflect on the factors leading to my original 1999 article on…
Fiber optic interferometry for industrial process monitoring and control applications
NASA Astrophysics Data System (ADS)
Marcus, Michael A.
2002-02-01
Over the past few years we have been developing applications for a high-resolution (sub-micron accuracy) fiber optic coupled dual Michelson interferometer-based instrument. It is being utilized in a variety of applications including monitoring liquid layer thickness uniformity on coating hoppers, film base thickness uniformity measurement, digital camera focus assessment, optical cell path length assessment and imager and wafer surface profile mapping. The instrument includes both coherent and non-coherent light sources, custom application dependent optical probes and sample interfaces, a Michelson interferometer, custom electronics, a Pentium-based PC with data acquisition cards and LabWindows CVI or LabView based application specific software. This paper describes the development evolution of this instrument platform and applications highlighting robust instrument design, hardware, software, and user interfaces development. The talk concludes with a discussion of a new high-speed instrument configuration, which can be utilized for high speed surface profiling and as an on-line web thickness gauge.
Precise control and animation creation over the DMD for projection-based applications
NASA Astrophysics Data System (ADS)
Koudsi, Badia
2014-03-01
Digital micromirror devices (DMDs) are used in a variety of display and projection applications to produce high resolution images, both static and animated. A common obstacle to working with DMDs in research and development applications is the steep learning curve required to obtain proficiency in programming the boards that control the behavior of the DMDs. This can discourage developers who wish to use DMDs in new or novel research and development applications which might benefit from their light-control properties. A new software package called Light Animator has been developed that provides a user-friendly and more intuitive interface for controlling the DMD. The software allows users to address the micromirror array by drawing and animating objects in a style similar to that of commercial drawing programs. Sequences and animation are controlled by dividing the sequence into frames, which the user can draw individually or the software can fill in for the user. Examples and descriptions of the software operation are described and operational performance measures are provided. Potential applications include 3D volumetric displays, a 3D scanner when combining the DMD with a CCD camera, and almost any 2D application for which DMDs are currently used. The software's capabilities allow scientists to develop applications more easily and effectively.
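Light Animator's internals are not published in the abstract, but the described behaviour of letting "the software fill in" frames between user-drawn keyframes can be sketched with a simple hold-last-keyframe fill (the function name and frame representation are ours):

```python
def fill_frames(keyframes: dict, n_frames: int):
    """Expand sparse user-drawn keyframes {index: frame} into a full
    sequence by holding each keyframe until the next one appears, a
    simple stand-in for the 'software fills in' behaviour described."""
    if 0 not in keyframes:
        raise ValueError("frame 0 must be a keyframe")
    seq, current = [], None
    for i in range(n_frames):
        current = keyframes.get(i, current)   # hold the last drawn frame
        seq.append(current)
    return seq

# Two 2x2 binary mirror patterns drawn at frames 0 and 2 of a 4-frame sequence.
on  = [[1, 1], [1, 1]]
off = [[0, 0], [0, 0]]
assert fill_frames({0: on, 2: off}, 4) == [on, on, off, off]
```

A real tool could interpolate grayscale duty cycles instead of holding binary patterns, but the hold fill is the minimal case for a binary mirror array.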
NASA Astrophysics Data System (ADS)
Lyu, Bo-Han; Wang, Chen; Tsai, Chun-Wei
2017-08-01
Jasper Display Corp. (JDC) offers a high-reflectivity, high-resolution Liquid Crystal on Silicon Spatial Light Modulator (LCoS-SLM) that includes an associated controller ASIC and LabVIEW-based modulation software. Based on this LCoS-SLM, also called the Education Kit (EDK), we provide a training platform that includes a series of optical theory lessons and experiments for university students. The EDK not only provides LabVIEW-based operation software to produce Computer Generated Holograms (CGH) for generating basic diffraction or holographic images, but also provides simulation software to verify the experimental results. We believe that a robust LCoS-SLM, operation software, simulation software, training system, and training course can help students study fundamental optics, wave optics, and Fourier optics more easily. Building on this foundation, they can develop unique skills and create new optoelectronic applications in the future.
Smart roadside initiative : user manual.
DOT National Transportation Integrated Search
2015-09-01
This document provides the user instructions for the Smart Roadside Initiative (SRI) applications including mobile and web-based SRI applications. These applications include smartphone-enabled information exchange and notification, and software compo...
Bidding Agents That Perpetrate Auction Fraud
NASA Astrophysics Data System (ADS)
Trevathan, Jarrod; McCabe, Alan; Read, Wayne
This paper presents a software bidding agent that inserts fake bids on the seller's behalf to inflate an auction's price. This behaviour is referred to as shill bidding. Shill bidding is strictly prohibited by online auctioneers, as it defrauds unsuspecting buyers by forcing them to pay more for the item. The malicious bidding agent was constructed to aid in developing shill detection techniques. We have previously documented a simple shill bidding agent that incrementally increases the auction price until it reaches the desired profit target, or it becomes too risky to continue bidding. This paper presents an adaptive shill bidding agent which, when used over a series of auctions with substitutable items, can revise its strategy based on bidding behaviour in past auctions. The adaptive agent applies a novel prediction technique referred to as the Extremum Consistency (EC) algorithm to determine the optimal price to aspire to. The EC algorithm has successfully been used in handwritten signature verification for determining the maximum and minimum values in an input stream. The agent's ability to inflate the price has been tested in a simulated marketplace and experimental results are presented.
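The paper's exact agent parameters are not given in the abstract; the earlier simple agent's strategy of incrementally inflating the price up to a profit target can be sketched as follows (the function name, increment model and rival model are all hypothetical):

```python
def shill_bids(start_price, increment, target_price, rival_bid):
    """Sketch of the simple shill agent described: after each rival bid,
    place a minimal raising bid until the price would exceed the shill's
    target, then stop (continuing risks winning the item)."""
    price, bids = start_price, []
    while True:
        bid = max(price, rival_bid(price)) + increment
        if bid > target_price:
            break          # too risky to keep inflating the price
        bids.append(bid)
        price = bid
    return bids

# With a rival that always matches the current price (hypothetical
# behaviour), the shill raises in fixed steps up to its target of 30.
assert shill_bids(10, 5, 30, lambda p: p) == [15, 20, 25, 30]
```

The adaptive agent in the paper replaces the fixed `target_price` with a value predicted by the EC algorithm from past auctions, which this sketch does not attempt to model.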
Use of Continuous Integration Tools for Application Performance Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vergara Larrea, Veronica G; Joubert, Wayne; Fuson, Christopher B
High performance computing systems are becoming increasingly complex, both in node architecture and in the multiple layers of software stack required to compile and run applications. As a consequence, the likelihood is increasing for application performance regressions to occur as a result of routine upgrades of system software components which interact in complex ways. The purpose of this study is to evaluate the effectiveness of continuous integration tools for application performance monitoring on HPC systems. In addition, this paper also describes a prototype system for application performance monitoring based on Jenkins, a Java-based continuous integration tool. The monitoring system described leverages several features in Jenkins to track application performance results over time. Preliminary results and lessons learned from monitoring applications on Cray systems at the Oak Ridge Leadership Computing Facility are presented.
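The Jenkins configuration itself is not shown in the abstract; the core of any such monitor, comparing the latest runtime against a historical baseline, can be sketched as a check a CI job could fail on (the threshold and baseline choice are our assumptions, not the paper's):

```python
from statistics import median

def regression_alert(history, latest, threshold=0.10):
    """Flag a performance regression when the latest runtime exceeds the
    median of past runs by more than `threshold` (fractional slowdown).
    A CI build step (e.g. in Jenkins) can fail the job on this alert."""
    if not history:
        return False          # no baseline yet, nothing to compare against
    baseline = median(history)
    return (latest - baseline) / baseline > threshold

runs = [102.0, 98.5, 100.3, 99.8]          # seconds, hypothetical history
assert not regression_alert(runs, 101.0)   # within run-to-run noise
assert regression_alert(runs, 115.0)       # ~15 % slower -> alert
```

Using the median rather than the mean keeps a single anomalous historical run from shifting the baseline, a common choice for noisy HPC timings.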
Sense and Respond Logistics: Integrating Prediction, Responsiveness, and Control Capabilities
2006-01-01
Acronyms: SAR, sense and respond; SCM, Supply Chain Management; SCN, Supply Chain Network; SIDA, sense, interpret, decide, act; SOS, source of supply; TCN, … Excerpts: …commodity supply chain management (SCM), will have WS-SCMs that focus on integrating information for a particular MDS. …developed applications of ABMs for SCM. Applications of Agents and Agent-Based Modeling: agents have been used in telecommunications, e-commerce…
ERIC Educational Resources Information Center
Govindasamy, Malliga K.
2014-01-01
Agent technology has become one of the dynamic and most interesting areas of computer science in recent years. The dynamism of this technology has resulted in computer generated characters, known as pedagogical agent, entering the digital learning environments in increasing numbers. Commonly deployed in implementing tutoring strategies, these…
Software for Real-Time Analysis of Subsonic Test Shot Accuracy
2014-03-01
Excerpts: …used the C++ programming language, the Open Source Computer Vision (OpenCV®) software library, and the Microsoft Windows® Application Programming… …video for comparison through OpenCV image analysis tools. Based on the comparison, the software then computed the coordinates of each shot relative to… DWB researchers wanted to use the Open Source Computer Vision (OpenCV) software library for capturing and analyzing frames of video. OpenCV contains…
Collecting, Visualising, Communicating and Modelling Geographic Data for the Sciences
NASA Astrophysics Data System (ADS)
Crooks, A.; Hudson-Smith, A.; Milton, R.; Smith, D.; Batty, M.; Neuhaus, F.
2009-12-01
New web technologies and task-specific software packages and services are fundamentally changing the way we share, collect, visualise, communicate and distribute geographic information. Coupled with these new technologies is the emergence of rich, fine-scale and extensive geographical datasets of the built environment. Such technologies and data are providing opportunities for both the social and physical sciences that were unimaginable ten years ago. Within this paper we discuss such changes from our own experiences at the Centre for Advanced Spatial Analysis. Specifically, how it is now possible to harness the crowd to collect people's opinions about topical events, such as the current financial crisis, in real time and map the results, through the use of our GMapCreator software and the MapTube website. Furthermore, such tools allow for widespread dissemination and visualisation of geographic data to whoever has an internet connection. We will explore how one can use new datasets to visualise the city using our Virtual London model as an example. Within the model, individual buildings are tagged with multiple attributes, providing a lens to explore the urban structure and offering a plethora of research applications. We then turn to how one can visualise and communicate such data through low-cost software and virtual worlds such as Crysis and Second Life, with a look into their potential for modelling, and finally how we have disseminated much of this information through weblogs (blogs) such as Digital Urban, GIS and Agent-based Modelling, and Urban Tick.
NASA Astrophysics Data System (ADS)
Ratib, Osman; Rosset, Antoine; Dahlbom, Magnus; Czernin, Johannes
2005-04-01
Display and interpretation of multidimensional data obtained from the combination of 3D data acquired from different modalities (such as PET-CT) require complex software tools allowing the user to navigate and modify the different image parameters. With faster scanners it is now possible to acquire dynamic images of a beating heart or the transit of a contrast agent, adding a fifth dimension to the data. We developed DICOM-compliant software for real time navigation in very large sets of 5-dimensional data based on an intuitive multidimensional jog-wheel widely used in the video-editing industry. The software, provided under open source licensing, allows interactive, single-handed navigation through 3D images while adjusting blending of image modalities, image contrast and intensity, and the rate of cine display of dynamic images. In this study we focused our effort on the user interface and means for interactively navigating these large data sets while easily and rapidly changing multiple parameters such as image position, contrast, intensity, blending of colors, and magnification. Conventional mouse-driven user interfaces, which require the user to manipulate cursors and sliders on the screen, are too cumbersome and slow. We evaluated several hardware devices and identified a category of multipurpose jog-wheel device used in the video-editing industry that is particularly suitable for rapidly navigating in five dimensions while adjusting several display parameters interactively. The application of this tool will be demonstrated in cardiac PET-CT imaging and functional cardiac MRI studies.
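The actual jog-wheel driver is not described in detail in the abstract; the interaction model, one wheel adjusting whichever of the five display parameters is active, can be sketched as follows (parameter names and step sizes are invented for illustration):

```python
class JogNavigator:
    """Minimal model of single-handed 5-D navigation: one wheel, one
    'active parameter' that its detents adjust (slice position, cardiac
    phase, modality blend, contrast window, intensity level)."""
    STEPS = {"slice": 1, "phase": 1, "blend": 0.05, "window": 10, "level": 5}

    def __init__(self):
        self.active = "slice"
        self.state = {"slice": 0, "phase": 0, "blend": 0.5,
                      "window": 400, "level": 40}

    def select(self, param):
        """Switch which parameter the wheel controls."""
        if param not in self.STEPS:
            raise KeyError(param)
        self.active = param

    def turn(self, detents):
        """Apply a wheel turn of `detents` clicks to the active parameter."""
        self.state[self.active] += detents * self.STEPS[self.active]
        return self.state[self.active]

nav = JogNavigator()
assert nav.turn(3) == 3                    # three detents forward through slices
nav.select("blend")
assert abs(nav.turn(-2) - 0.4) < 1e-9      # modality blend 0.5 -> 0.4
```

The point of the design is that no on-screen cursor or slider is ever touched: one hand selects a parameter and turns the wheel, which is what makes the interaction faster than a mouse-driven interface.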
Investigation into the development of computer aided design software for space based sensors
NASA Technical Reports Server (NTRS)
Pender, C. W.; Clark, W. L.
1987-01-01
The described effort is phase one of the development of Computer Aided Design (CAD) software to be used to perform radiometric sensor design. The software package will be referred to as SCAD and is directed toward the preliminary phase of the design of space-based sensor systems. The approach being followed is to develop a modern, graphics-intensive, user-friendly software package using existing software as building blocks. The emphasis is directed toward the development of a shell containing menus, smart defaults, and interfaces that can accommodate a wide variety of existing application software packages. The shell will offer expected utilities such as graphics, tailored menus, and a variety of drivers for I/O devices. Following the development of the shell, the development of SCAD is planned as chiefly the selection and integration of appropriate building blocks. The phase one development activities have included: selecting the hardware to be used with SCAD; determining the scope of SCAD; preliminarily evaluating a number of software packages for applicability to SCAD; determining a method for achieving required capabilities where voids exist; and establishing a strategy for binding the software modules into an easy-to-use tool kit.
Tethys: A Platform for Water Resources Modeling and Decision Support Apps
NASA Astrophysics Data System (ADS)
Swain, N. R.; Christensen, S. D.; Jones, N.; Nelson, E. J.
2014-12-01
Cloud-based applications or apps are a promising medium through which water resources models and data can be conveyed in a user-friendly environment—making them more accessible to decision-makers and stakeholders. In the context of this work, a water resources web app is a web application that exposes limited modeling functionality for a scenario exploration activity in a structured workflow (e.g.: land use change runoff analysis, snowmelt runoff prediction, and flood potential analysis). The technical expertise required to develop water resources web apps can be a barrier to many potential developers of water resources apps. One challenge that developers face is in providing spatial storage, analysis, and visualization for the spatial data that is inherent to water resources models. The software projects that provide this functionality are non-standard to web development and there are a large number of free and open source software (FOSS) projects to choose from. In addition, it is often required to synthesize several software projects to provide all of the needed functionality. Another challenge for the developer will be orchestrating the use of several software components. Consequently, the initial software development investment required to deploy an effective water resources cloud-based application can be substantial. The Tethys Platform has been developed to lower the technical barrier and minimize the initial development investment that prohibits many scientists and engineers from making use of the web app medium. Tethys synthesizes several software projects including PostGIS for spatial storage, 52°North WPS for spatial analysis, GeoServer for spatial publishing, Google Earth™, Google Maps™ and OpenLayers for spatial visualization, and Highcharts for plotting tabular data. The software selection came after a literature review of software projects being used to create existing earth sciences web apps. 
All of the software is linked via a Python-powered software development kit (SDK). Tethys developers use the SDK to build their apps and incorporate the needed functionality from the software suite. The presentation will include several apps that have been developed using Tethys to demonstrate its capabilities. Based upon work supported by the National Science Foundation under Grant No. 1135483.
Software components for medical image visualization and surgical planning
NASA Astrophysics Data System (ADS)
Starreveld, Yves P.; Gobbi, David G.; Finnis, Kirk; Peters, Terence M.
2001-05-01
Purpose: The development of new applications in medical image visualization and surgical planning requires the completion of many common tasks such as image reading and re-sampling, segmentation, volume rendering, and surface display. Intra-operative use requires an interface to a tracking system and image registration, and the application requires basic, easy-to-understand user interface components. Rapid changes in computer and end-application hardware, as well as in operating systems and network environments, make it desirable to have a hardware- and operating-system-independent collection of reusable software components that can be assembled rapidly to prototype new applications. Methods: Using the OpenGL-based Visualization Toolkit as a base, we have developed a set of components that implement the above mentioned tasks. The components are written in both C++ and Python, but all are accessible from Python, a byte-compiled scripting language. The components have been used on the Red Hat Linux, Silicon Graphics Irix, Microsoft Windows, and Apple OS X platforms. Rigorous object-oriented software design methods have been applied to ensure hardware independence and a standard application programming interface (API). There are components to acquire, display, and register images from MRI, MRA, CT, Computed Rotational Angiography (CRA), Digital Subtraction Angiography (DSA), 2D and 3D ultrasound, video and physiological recordings. Interfaces to various tracking systems for intra-operative use have also been implemented. Results: The described components have been implemented and tested. To date they have been used to create image manipulation and viewing tools, a deep brain functional atlas, a 3D ultrasound acquisition and display platform, a prototype minimally invasive robotic coronary artery bypass graft planning system, a tracked neuro-endoscope guidance system and a frame-based stereotaxy neurosurgery planning tool.
The frame-based stereotaxy module has been licensed and certified for use in a commercial image guidance system. Conclusions: It is feasible to encapsulate image manipulation and surgical guidance tasks in individual, reusable software modules. These modules allow for faster development of new applications. The strict application of object oriented software design methods allows individual components of such a system to make the transition from the research environment to a commercial one.
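The component pattern the abstract describes — small reusable units behind a uniform, scriptable API that can be chained into new applications — can be sketched in plain Python. Everything below (the class names, the two toy components, the pipeline runner) is invented for illustration and is not the authors' actual library:

```python
from abc import ABC, abstractmethod

class ImageComponent(ABC):
    """Common interface every reusable component implements."""
    @abstractmethod
    def process(self, image):
        ...

class Resampler(ImageComponent):
    """Nearest-neighbour downsampling by an integer factor."""
    def __init__(self, factor):
        self.factor = factor
    def process(self, image):
        # Keep every `factor`-th row and column.
        return [row[::self.factor] for row in image[::self.factor]]

class Thresholder(ImageComponent):
    """Binary segmentation: 1 where the pixel meets the level, else 0."""
    def __init__(self, level):
        self.level = level
    def process(self, image):
        return [[1 if px >= self.level else 0 for px in row] for row in image]

def run_pipeline(image, components):
    """Assemble an application by chaining components through one API."""
    for c in components:
        image = c.process(image)
    return image

if __name__ == "__main__":
    img = [[i * j for j in range(4)] for i in range(4)]
    out = run_pipeline(img, [Resampler(2), Thresholder(2)])
    print(out)  # → [[0, 0], [0, 1]]
```

Because every component exposes the same `process` interface, new applications are prototyped by recombining components rather than rewriting them — the property the abstract credits for the research-to-commercial transition.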
TSI-Enhanced Pedagogical Agents to Engage Learners in Virtual Worlds
ERIC Educational Resources Information Center
Leung, Steve; Virwaney, Sandeep; Lin, Fuhua; Armstrong, AJ; Dubbelboer, Adien
2013-01-01
Building pedagogical applications in virtual worlds is a multi-disciplinary endeavor that involves learning theories, application development framework, and mediated communication theories. This paper presents a project that integrates game-based learning, multi-agent system architecture (MAS), and the theory of Transformed Social Interaction…
System and method for deriving a process-based specification
NASA Technical Reports Server (NTRS)
Hinchey, Michael Gerard (Inventor); Rouff, Christopher A. (Inventor); Rash, James Larry (Inventor)
2009-01-01
A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.
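The core idea — inferring a process model from a non-empty set of traces and checking it against them — can be illustrated with a toy sketch. The inference in the patent is mathematical and far more general; the transition-relation model and all names below are invented:

```python
# Toy sketch: "infer" a process model (a transition relation) from event
# traces, then verify that the model reproduces every input trace.

def infer_process(traces):
    """Build the set of observed (previous_event, event) transitions."""
    transitions = set()
    for trace in traces:
        prev = "START"
        for event in trace:
            transitions.add((prev, event))
            prev = event
    return transitions

def accepts(transitions, trace):
    """Replay a trace against the inferred model."""
    prev = "START"
    for event in trace:
        if (prev, event) not in transitions:
            return False
        prev = event
    return True

traces = [["open", "read", "close"], ["open", "write", "close"]]
model = infer_process(traces)
assert all(accepts(model, t) for t in traces)  # model covers its own traces
print(accepts(model, ["read"]))  # → False: "read" cannot start a run
```

The round trip (traces in, model out, traces accepted) is the toy analogue of the patent's requirement that the process-based specification be mathematically equivalent to the trace-based one.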
Design for Run-Time Monitor on Cloud Computing
NASA Astrophysics Data System (ADS)
Kang, Mikyung; Kang, Dong-In; Yun, Mira; Park, Gyung-Leen; Lee, Junghoon
Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet, as well as the infrastructure. A cloud is a type of parallel and distributed system, consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. Large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design a Run-Time Monitor (RTM), which is system software that monitors application behavior at run-time, analyzes the collected information, and optimizes resources on cloud computing. RTM monitors application software through library instrumentation, as well as the underlying hardware through performance counters, optimizing its computing configuration based on the analyzed data.
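One way to picture the monitor-then-analyze design is a small instrumentation sketch; the class, the decorator, and the latency budget below are all hypothetical, not the paper's implementation:

```python
import time
from collections import defaultdict

class RuntimeMonitor:
    """Records per-function call counts and wall-clock time (toy RTM)."""
    def __init__(self):
        self.stats = defaultdict(lambda: {"calls": 0, "total_s": 0.0})

    def instrument(self, fn):
        """Library-instrumentation step: wrap a call to collect metrics."""
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                rec = self.stats[fn.__name__]
                rec["calls"] += 1
                rec["total_s"] += time.perf_counter() - start
        return wrapper

    def over_budget(self, budget_s):
        """Analysis step: functions whose mean latency exceeds the budget."""
        return [name for name, rec in self.stats.items()
                if rec["total_s"] / rec["calls"] > budget_s]

monitor = RuntimeMonitor()

@monitor.instrument
def slow_task():
    time.sleep(0.02)

slow_task()
print(monitor.over_budget(0.01))  # → ['slow_task']
```

A real RTM would feed such reports into a resource optimizer; here the `over_budget` list simply stands in for the "adapt its service configuration" step.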
MNE Scan: Software for real-time processing of electrophysiological data.
Esch, Lorenz; Sun, Limin; Klüber, Viktor; Lew, Seok; Baumgarten, Daniel; Grant, P Ellen; Okada, Yoshio; Haueisen, Jens; Hämäläinen, Matti S; Dinh, Christoph
2018-06-01
Magnetoencephalography (MEG) and electroencephalography (EEG) are noninvasive techniques to study the electrophysiological activity of the human brain. Thus, they are well suited for real-time monitoring and analysis of neuronal activity. Real-time MEG/EEG data processing allows adjustment of the stimuli to the subject's responses for optimizing the acquired information, especially by providing dynamically changing displays to enable neurofeedback. We introduce MNE Scan, an acquisition and real-time analysis software package based on the multipurpose software library MNE-CPP. MNE Scan allows the development and application of acquisition and novel real-time processing methods in both research and clinical studies. The MNE Scan development follows a strict software engineering process to enable the approvals required for clinical software. We tested the performance of MNE Scan in several device-independent use cases, including a clinical epilepsy study, real-time source estimation, and a Brain Computer Interface (BCI) application. Compared to existing tools, we propose a modular software package that addresses the clinical software requirements expected by certification authorities, while remaining extendable and freely accessible. We conclude that MNE Scan is the first step in creating a device-independent open-source software package to facilitate the transition from basic neuroscience research to both applied sciences and clinical applications. Copyright © 2018 Elsevier B.V. All rights reserved.
Zhou, Ji; Applegate, Christopher; Alonso, Albor Dobon; Reynolds, Daniel; Orford, Simon; Mackiewicz, Michal; Griffiths, Simon; Penfield, Steven; Pullen, Nick
2017-01-01
Plants demonstrate dynamic growth phenotypes that are determined by genetic and environmental factors. Phenotypic analysis of growth features over time is a key approach to understand how plants interact with environmental change as well as respond to different treatments. Although the importance of measuring dynamic growth traits is widely recognised, available open software tools are limited in terms of batch image processing, multiple-trait analyses, software usability and cross-referencing results between experiments, making automated phenotypic analysis problematic. Here, we present Leaf-GP (Growth Phenotypes), an easy-to-use and open software application that can be executed on different computing platforms. To facilitate diverse scientific communities, we provide three software versions, including a graphic user interface (GUI) for personal computer (PC) users, a command-line interface for high-performance computer (HPC) users, and a well-commented interactive Jupyter Notebook (also known as the iPython Notebook) for computational biologists and computer scientists. The software is capable of extracting multiple growth traits automatically from large image datasets. We have utilised it in Arabidopsis thaliana and wheat (Triticum aestivum) growth studies at the Norwich Research Park (NRP, UK). By quantifying a number of growth phenotypes over time, we have identified diverse plant growth patterns between different genotypes under several experimental conditions. As Leaf-GP has been evaluated with noisy image series acquired by different imaging devices (e.g. smartphones and digital cameras) and still produced reliable biological outputs, we believe that our automated analysis workflow and customised computer-vision-based feature extraction software can serve the broader plant research community in growth and development studies.
Furthermore, because we implemented Leaf-GP based on open Python-based computer vision, image analysis and machine learning libraries, we believe that our software not only can contribute to biological research, but also demonstrates how to utilise existing open numeric and scientific libraries (e.g. Scikit-image, OpenCV, SciPy and Scikit-learn) to build sound plant phenomics analytic solutions in an efficient and effective way. Leaf-GP is a sophisticated software application that provides three approaches to quantify growth phenotypes from large image series. We demonstrate its usefulness and high accuracy based on two biological applications: (1) the quantification of growth traits for Arabidopsis genotypes under two temperature conditions; and (2) measuring wheat growth in the glasshouse over time. The software is easy to use and cross-platform, and can be executed on Mac OS, Windows and HPC, with open Python-based scientific libraries preinstalled. Our work demonstrates how to integrate computer vision, image analysis, machine learning and software engineering in plant phenomics software implementation. To serve the plant research community, our modular source code, detailed comments, executables (.exe for Windows; .app for Mac), and experimental results are freely available at https://github.com/Crop-Phenomics-Group/Leaf-GP/releases.
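The kind of trait Leaf-GP extracts can be illustrated without its actual library stack. The sketch below thresholds a synthetic greenness-index series and reports projected leaf area per time point; the arrays and the threshold are made up, whereas Leaf-GP itself builds on Scikit-image and OpenCV:

```python
def projected_leaf_area(image, green_threshold=0.5):
    """Count pixels whose greenness index exceeds the threshold."""
    return sum(1 for row in image for g in row if g > green_threshold)

# Synthetic three-image growth series (values = greenness index in [0, 1]).
series = [
    [[0.1, 0.6], [0.2, 0.1]],
    [[0.7, 0.6], [0.2, 0.8]],
    [[0.7, 0.9], [0.6, 0.8]],
]
areas = [projected_leaf_area(img) for img in series]
print(areas)                   # → [1, 3, 4]
assert areas == sorted(areas)  # area grows monotonically in this toy series
```

Batch processing in the real tool is the same loop over an image series, with segmentation and multiple traits (area, perimeter, leaf count) replacing the single pixel count here.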
Software development to implement the TxDOT culvert rating guide.
DOT National Transportation Integrated Search
2013-05-01
This implementation project created CULVLR: Culvert Load Rating, Version 1.0.0, a Windows-based desktop application software package that automates the process by which Texas Department of Transportation (TxDOT) engineers and their consultants ...
Boehm-Sturm, Philipp; Haeckel, Akvile; Hauptmann, Ralf; Mueller, Susanne; Kuhl, Christiane K; Schellenberger, Eyk A
2018-02-01
Purpose To synthesize two low-molecular-weight iron chelates and compare their T1 contrast effects with those of a commercial gadolinium-based contrast agent for their applicability in dynamic contrast material-enhanced (DCE) magnetic resonance (MR) imaging. Materials and Methods The animal experiments were approved by the local ethics committee. Two previously described iron (Fe) chelates of pentetic acid (Fe-DTPA) and of trans-cyclohexane diamine tetraacetic acid (Fe-tCDTA) were synthesized with stability constants several orders of magnitude higher than those of gadolinium-based contrast agents. The T1 contrast effects of the two chelates were compared with those of gadopentetate dimeglumine in blood serum phantoms at 1.5 T, 3 T, and 7 T. For in vivo studies, a human breast cancer cell line (MDA-231) was implanted in five mice per group. The dynamic contrast effects of the chelates were compared by performing DCE MR imaging with intravenous application of Fe-DTPA or Fe-tCDTA on day 1 and DCE MR imaging in the same tumors with gadopentetate dimeglumine on day 2. Quantitative DCE maps were generated with software and were compared by means of a one-tailed Pearson correlation test. Results Relaxivities in serum (0.94 T at room temperature) of Fe-tCDTA (r1 = 2.2 mmol^-1·sec^-1, r2 = 2.5 mmol^-1·sec^-1) and Fe-DTPA (r1 = 0.9 mmol^-1·sec^-1, r2 = 0.9 mmol^-1·sec^-1) were approximately twofold and fivefold lower, respectively, compared with those of gadopentetate dimeglumine (r1 = 4.1 mmol^-1·sec^-1, r2 = 4.8 mmol^-1·sec^-1). Used at moderately higher concentrations, however, the iron chelates generated similar contrast effects at T1-weighted MR imaging in vitro in serum, in vivo in blood, and for DCE MR imaging of breast cancer xenografts.
The volume transfer constant values for Fe-DTPA and Fe-tCDTA in the same tumors correlated well with those observed for gadopentetate dimeglumine (Fe-tCDTA Pearson R, 0.99; P = .0003; Fe-DTPA Pearson R, 0.97; P = .003). Conclusion Iron-based contrast agents are promising as alternatives for contrast enhancement at T1-weighted MR imaging and have the potential to contribute to the safety of MR imaging. © RSNA, 2017. Online supplemental material is available for this article.
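The dose implication of the reported relaxivities can be checked with simple arithmetic: to match the gadolinium agent's relaxation-rate increase, ΔR1 = r1 · C, the iron-chelate concentration must rise by the ratio of r1 values. A sketch using the serum r1 values from the abstract (the 0.1 mmol Gd dose is an assumed example, not a value from the study):

```python
# Serum relaxivities at 0.94 T from the abstract, in mmol^-1 s^-1.
r1 = {"Gd-DTPA": 4.1, "Fe-tCDTA": 2.2, "Fe-DTPA": 0.9}

def matching_concentration(agent, gd_dose_mmol=0.1):
    """Concentration of `agent` giving the same ΔR1 as the Gd dose."""
    return gd_dose_mmol * r1["Gd-DTPA"] / r1[agent]

print(round(matching_concentration("Fe-tCDTA"), 3))  # → 0.186 (~1.9x dose)
print(round(matching_concentration("Fe-DTPA"), 3))   # → 0.456 (~4.6x dose)
```

The ~2x and ~4.6x factors are consistent with the abstract's "approximately twofold and fivefold lower" relaxivities and its "moderately higher concentrations" claim.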
The research and practice of spacecraft software engineering
NASA Astrophysics Data System (ADS)
Chen, Chengxin; Wang, Jinghua; Xu, Xiaoguang
2017-06-01
In order to ensure the safety and reliability of spacecraft software products, it is necessary to carry out engineering management. First, the paper reviews the problems in domestic and foreign spacecraft software engineering management: unsystematic planning, ill-defined classification management, and the lack of a continuous improvement mechanism. It then proposes a software engineering management solution based on a system-integration viewpoint from the perspective of the spacecraft system. Finally, an application to a spacecraft is given as an example. The research provides a reference for executing spacecraft software engineering management and improving software product quality.
NASA Technical Reports Server (NTRS)
McNelis, Anne M.; Beach, Raymond F.; Soeder, James F.; McNelis, Nancy B.; May, Ryan; Dever, Timothy P.; Trase, Larry
2014-01-01
The development of distributed hierarchical and agent-based control systems will allow for reliable autonomous energy management and power distribution for on-orbit missions. Power is one of the most critical systems on board a space vehicle, requiring quick response time when a fault or emergency is identified. As NASA's missions with human presence extend beyond low Earth orbit, autonomous control of vehicle power systems will be necessary and will need to function reliably for long periods of time. In the design of autonomous electrical power system (EPS) control systems, there is a need to dynamically simulate and verify the EPS controller functionality prior to use on-orbit. This paper presents the work at NASA Glenn Research Center in Cleveland, Ohio, where the development of a controls laboratory is being completed that will be utilized to demonstrate advanced prototype EPS controllers for space, aeronautical and terrestrial applications. The control laboratory hardware and software, and the application of an autonomous controller for demonstration with the ISS electrical power system, are the subject of this paper.
Xue, Shenghui; Qiao, Jingjuan; Pu, Fan; Cameron, Mathew; Yang, Jenny J.
2014-01-01
Magnetic resonance imaging (MRI) of disease biomarkers, especially cancer biomarkers, could potentially improve our understanding of the disease and of drug activity during preclinical and clinical drug treatment and patient stratification. MRI contrast agents with high relaxivity and targeting capability to tumor biomarkers are in high demand. Extensive work has been done to develop MRI contrast agents. However, only a few reports show that protein residues can function as ligands to bind Gd3+ with high binding affinity, selectivity, and relaxivity. In this paper, we focus on reporting our current progress on designing a novel class of protein-based Gd3+ MRI contrast agents (ProCAs) equipped with several desirable capabilities for in vivo application of MRI of tumor biomarkers. We first discuss our strategy for improving relaxivity by a novel protein-based design. We then discuss the effect of the increased relaxivity of ProCAs on improving the detection limits of MRI contrast agents, especially for in vivo application. We further report our efforts to improve in vivo imaging capability and our achievements in molecular imaging of cancer biomarkers with potential preclinical and clinical applications. PMID:23335551
Clustering recommendations to compute agent reputation
NASA Astrophysics Data System (ADS)
Bedi, Punam; Kaur, Harmeet
2005-03-01
Traditional centralized approaches to security are difficult to apply to multi-agent systems, which are nowadays used in e-commerce applications. Developing a notion of trust that is based on the reputation of an agent can provide a softer notion of security that is sufficient for many multi-agent applications. Our paper proposes a mechanism for computing the reputation of the trustee agent for use by the trustier agent. The trustier agent computes the reputation based on its own experience as well as the experience the peer agents have with the trustee agents. The trustier agents intentionally interact with the peer agents to get their experience information in the form of recommendations. We have also considered the case of unintentional encounters between the referee agents and the trustee agent, which can be direct or indirect through a set of interacting agents. Clustering is done to filter out the noise in the recommendations in the form of outliers. The trustier agent clusters the recommendations received from referee agents on the basis of the distances between recommendations, using the hierarchical agglomerative method. The dendrogram hence obtained is cut at the required similarity level, which restricts the maximum distance between any two recommendations within a cluster. The cluster with the maximum number of elements denotes the views of the majority of recommenders. The center of this cluster represents the reputation of the trustee agent, which can be computed using the c-means algorithm.
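The filtering step can be sketched with scalar recommendations and simplified single-linkage clustering standing in for the full hierarchical/c-means machinery; the values and the gap threshold below are invented:

```python
def cluster(recommendations, max_gap):
    """Single-linkage clustering of scalars: split where the sorted gap
    exceeds max_gap (the 'cut the dendrogram at a similarity level' step)."""
    values = sorted(recommendations)
    clusters, current = [], [values[0]]
    for v in values[1:]:
        if v - current[-1] <= max_gap:
            current.append(v)
        else:
            clusters.append(current)
            current = [v]
    clusters.append(current)
    return clusters

def reputation(recommendations, max_gap=0.1):
    """Majority view: centre of the largest cluster; outliers are ignored."""
    largest = max(cluster(recommendations, max_gap), key=len)
    return sum(largest) / len(largest)

# Most referees agree (~0.8); one hostile outlier (0.1) is filtered out.
recs = [0.82, 0.78, 0.80, 0.85, 0.1]
print(round(reputation(recs), 2))  # → 0.81
```

Without the clustering step, a plain average of the same recommendations would be dragged down to 0.67 by the single outlier, which is exactly the noise the paper's mechanism is designed to remove.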
Open source software integrated into data services of Japanese planetary explorations
NASA Astrophysics Data System (ADS)
Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.
2015-12-01
Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). DARTS provides the data through simple methods such as HTTP directory listing for long-term preservation, while also aiming to provide rich web applications for ease of access, built with modern web technologies on top of open source software. This presentation showcases the use of open source software across our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS). As the WMS server, the open source software MapServer is adopted. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for SELENE's data. The main purpose of this application is public outreach. The NASA World Wind Java SDK is used for its development. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations. It uses Highcharts to draw graphs in web browsers. Flow is a tool to simulate the field of view of an instrument onboard a spacecraft. This tool itself is open source software developed by JAXA/ISAS, and its license is the BSD 3-Clause License. The SPICE Toolkit is essential to compile Flow. The SPICE Toolkit is also open source software, developed by NASA/JPL, and its website distributes data for many spacecraft. Nowadays, open source software is an indispensable tool for integrating DARTS services.
Autonomic and Coevolutionary Sensor Networking
NASA Astrophysics Data System (ADS)
Boonma, Pruet; Suzuki, Junichi
Wireless sensor network (WSN) applications are often required to balance the tradeoffs among conflicting operational objectives (e.g., latency and power consumption) and operate at an optimal tradeoff. This chapter proposes and evaluates an architecture, called BiSNET/e, which allows WSN applications to overcome this issue. BiSNET/e is designed to support three major types of WSN applications, among them hybrid applications. Each application is implemented as a decentralized group of agents, which is analogous to a bee colony (application) consisting of bees (agents). Agents collect sensor data or detect an event (a significant change in sensor reading) on individual nodes, and carry sensor data to base stations. They perform these data collection and event detection functionalities by sensing their surrounding network conditions and adaptively invoking behaviors such as pheromone emission, reproduction, migration, swarming and death. Each agent has its own behavior policy, as a set of genes, which defines how to invoke its behaviors. BiSNET/e allows agents to evolve their behavior policies (genes) across generations and autonomously adapt their performance to given objectives. Simulation results demonstrate that, in all three types of applications, agents evolve to find optimal tradeoffs among conflicting objectives and adapt to dynamic network conditions such as traffic fluctuations and node failures/additions. Simulation results also illustrate that, in hybrid applications, data collection agents and event detection agents coevolve to augment their adaptability and performance.
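The gene-evolution idea can be caricatured in a few lines. Gene semantics, the fitness function, and all parameters below are invented, and BiSNET/e's actual behaviors (pheromone emission, migration, swarming) are not modeled:

```python
import random

def fitness(genes):
    """Toy two-objective fitness: lower latency and power are both better."""
    latency, power = genes
    return -(latency + power)

def evolve(population, generations=30, mutation=0.05, rng=random.Random(7)):
    """Elitist generational evolution of behavior-policy gene vectors."""
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: len(population) // 2]  # keep the fitter half
        offspring = [[max(0.0, g + rng.uniform(-mutation, mutation))
                      for g in rng.choice(parents)] for _ in parents]
        population = parents + offspring
    return max(population, key=fitness)

rng = random.Random(1)
start = [[rng.random(), rng.random()] for _ in range(10)]
best = evolve(start)
# Elitism guarantees the best policy never regresses across generations.
assert fitness(best) >= max(fitness(g) for g in start)
print(len(best))  # → 2 (two genes: a latency knob and a power knob)
```

The elitist selection mirrors the chapter's claim that agents "evolve to find optimal tradeoffs": surviving policies are those that score best on the combined objective.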
Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sussman, Alan
2014-10-21
This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.
RINGMesh: A programming library for developing mesh-based geomodeling applications
NASA Astrophysics Data System (ADS)
Pellerin, Jeanne; Botella, Arnaud; Bonneau, François; Mazuyer, Antoine; Chauvin, Benjamin; Lévy, Bruno; Caumon, Guillaume
2017-07-01
RINGMesh is a C++ open-source programming library for manipulating discretized geological models. It is designed to ease the development of applications and workflows that use discretized 3D models. It is neither a geomodeler nor a meshing software package. RINGMesh implements functionalities to read discretized surface-based or volumetric structural models and to check their validity. The models can then be exported in various file formats. RINGMesh provides data structures to represent geological structural models, defined by their discretized boundary surfaces and/or by discretized volumes. A programming interface allows the development of new geomodeling methods and the integration of external software. The goal of RINGMesh is to help researchers focus on the implementation of their specific method rather than on tedious tasks common to many applications. The documented code is open-source and distributed under the modified BSD license. It is available at https://www.ring-team.org/index.php/software/ringmesh.
A software tool of digital tomosynthesis application for patient positioning in radiotherapy.
Yan, Hui; Dai, Jian-Rong
2016-03-08
Digital tomosynthesis (DTS) is an imaging modality that reconstructs tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose in data acquisition. It is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed and the reconstruction processes were accelerated by a graphics processing unit (GPU). Two reconstruction and two registration processes are required for the DTS application, unlike the conventional CBCT application, which requires one image reconstruction process and one image registration process. The reconstruction stage consists of the production of two types of DTS. One type is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS), which represents the real patient position in the treatment room. The other type is reconstructed from digitally reconstructed radiographs (DRRs) and is named reference DTS (RDTS), which represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are reconstructed from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts in the lateral and longitudinal axes are obtained from the matching between ODTS and RDTS in the coronal view, while the target shifts in the longitudinal and vertical axes are obtained from the matching in the sagittal view. In this software, both the DRR and DTS reconstruction algorithms were implemented in GPU environments for acceleration. A comprehensive evaluation of this software tool was performed, covering geometric accuracy, image quality, registration accuracy, and reconstruction efficiency.
The average correlation coefficient between the DRR/DTS images generated by the GPU-based and CPU-based algorithms is 0.99. Based on the measurements of a cube phantom on DTS, the geometric errors are within 0.5 mm in three axes. For both the cube phantom and a pelvic phantom, the registration errors are within 0.5 mm in three axes. Compared with the performance of the CPU-based algorithms, the performance of the DRR and DTS reconstructions is improved by a factor of 15 to 20. A GPU-based software tool was developed for the DTS application for patient positioning in radiotherapy. The geometric and registration accuracy met the clinical requirements of patient setup in radiotherapy. The high performance of the DRR and DTS reconstruction algorithms was achieved by the GPU-based computation environment. It is a useful software tool for researchers and clinicians evaluating the DTS application in patient positioning for radiotherapy.
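The registration step — finding the couch shift that aligns the onboard image to the reference in one view — reduces, in toy form, to an exhaustive search over translations minimizing a similarity cost. The sketch below uses integer shifts and sum of squared differences (SSD) on synthetic arrays; it is not the clinical algorithm:

```python
def ssd(a, b):
    """Sum of squared differences between two equal-size 2D images."""
    return sum((x - y) ** 2 for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def shift(image, dy, dx, fill=0):
    """Translate an image by (dy, dx), padding exposed pixels with `fill`."""
    h, w = len(image), len(image[0])
    return [[image[r - dy][c - dx] if 0 <= r - dy < h and 0 <= c - dx < w
             else fill for c in range(w)] for r in range(h)]

def register(reference, onboard, search=2):
    """Exhaustively find the (dy, dx) minimizing SSD against the reference."""
    return min(((dy, dx) for dy in range(-search, search + 1)
                          for dx in range(-search, search + 1)),
               key=lambda s: ssd(reference, shift(onboard, *s)))

ref = [[0, 0, 0, 0], [0, 9, 9, 0], [0, 9, 9, 0], [0, 0, 0, 0]]
onb = shift(ref, 1, 0)     # patient displaced one row "down"
print(register(ref, onb))  # → (-1, 0): shift back up to realign
```

Running this search in two orthogonal views, as the paper does, yields all three axis shifts; GPU acceleration matters because clinical images make each SSD evaluation expensive.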
Choice: 36 band feature selection software with applications to multispectral pattern recognition
NASA Technical Reports Server (NTRS)
Jones, W. C.
1973-01-01
Feature selection software was developed at the Earth Resources Laboratory that is capable of inputting up to 36 channels and selecting channel subsets according to several criteria based on divergence. One of the criteria used is compatible with the table look-up classifier requirements. The software indicates which channel subset best separates (based on average divergence) each class from all other classes. The software employs an exhaustive search technique, and computer time is not prohibitive. A typical task to select the best 4 of 22 channels for 12 classes takes 9 minutes on a Univac 1108 computer.
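The exhaustive search is easy to sketch: score every k-channel subset and keep the best. The per-channel scores below are a made-up stand-in for the pairwise class divergences the real software computes from multispectral statistics:

```python
from itertools import combinations

def best_subset(divergence, k):
    """Return the k channels maximizing total divergence (toy criterion)."""
    channels = list(divergence)
    return max(combinations(channels, k),
               key=lambda subset: sum(divergence[c] for c in subset))

# Hypothetical per-channel average divergence scores for 6 channels.
divergence = {1: 0.9, 2: 0.4, 3: 0.7, 4: 0.2, 5: 0.8, 6: 0.5}
print(best_subset(divergence, 4))  # → (1, 3, 5, 6)
```

The combinatorics explain why 9 minutes on a Univac 1108 was acceptable: choosing 4 of 22 channels means evaluating C(22, 4) = 7315 subsets, small enough for brute force even on 1970s hardware.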
Continuation of research into software for space operations support, volume 1
NASA Technical Reports Server (NTRS)
Collier, Mark D.; Killough, Ronnie; Martin, Nancy L.
1990-01-01
A prototype workstation executive called the Hardware Independent Software Development Environment (HISDE) was developed. Software technologies relevant to workstation executives were researched and evaluated, and HISDE was used as a test bed for prototyping efforts. New X Windows software concepts and technology were introduced into workstation executives and related applications. The four research efforts performed included: (1) research into the usability and efficiency of Motif (an X Windows based graphical user interface), which consisted of converting the existing Athena widget based HISDE user interface to Motif, demonstrating the usability of Motif and providing insight into the level of effort required to translate an application from one widget set to another; (2) prototyping a real-time data display widget, which consisted of researching methods for, and prototyping the selected method of, displaying textual values in an efficient manner; (3) an X Windows performance evaluation, which consisted of a series of performance measurements that demonstrated the ability of low-level X Windows to display textual information; (4) converting the Display Manager, the application used by NASA for data display during operational mode, to X Windows/Motif.
Sharing Research Models: Using Software Engineering Practices for Facilitation
Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.
2011-01-01
Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems’ behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices— the iterative software development process, object-oriented methodology, and Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780
Multirobot autonomous landmine detection using distributed multisensor information aggregation
NASA Astrophysics Data System (ADS)
Jumadinova, Janyl; Dasgupta, Prithviraj
2012-06-01
We consider the problem of distributed sensor information fusion by multiple autonomous robots within the context of landmine detection. We assume that different landmines can be composed of different types of material and that robots are equipped with different types of sensors, while each robot has only one type of landmine detection sensor on it. We introduce a novel technique that uses a market-based information aggregation mechanism called a prediction market. Each robot is provided with a software agent that uses the sensory input of the robot and performs the calculations of the prediction market technique. The result of the agent's calculations is a 'belief' representing the confidence of the agent in identifying the object as a landmine. The beliefs from different robots are aggregated by the market mechanism and passed on to a decision maker agent. The decision maker agent uses this aggregate belief information about a potential landmine and makes decisions about which other robots should be deployed to its location, so that the landmine can be confirmed rapidly and accurately. Our experimental results show that, for identical data distributions and settings, using our prediction market-based information aggregation technique improves the accuracy of object classification compared to two other commonly used techniques.
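The aggregation and deployment decision can be sketched as follows. A weighted average stands in for the full prediction-market mechanism, the weights stand in for each agent's past predictive performance, and all numbers and thresholds are illustrative:

```python
def aggregate_belief(beliefs, weights):
    """Market-style aggregation: performance-weighted mean of agent beliefs."""
    return sum(b * w for b, w in zip(beliefs, weights)) / sum(weights)

def decision(aggregate, confirm=0.9, reject=0.1):
    """Decision-maker agent: act only when the aggregate is decisive."""
    if aggregate >= confirm:
        return "confirmed"
    if aggregate <= reject:
        return "rejected"
    return "deploy more sensors"

beliefs = [0.95, 0.70, 0.40]   # three robots with different sensor types
weights = [0.5, 0.3, 0.2]      # stand-ins for each agent's past accuracy
agg = aggregate_belief(beliefs, weights)
print(round(agg, 3), decision(agg))  # → 0.765 deploy more sensors
```

The middle branch is what triggers the paper's deployment step: an ambiguous aggregate sends robots with complementary sensor types to the suspect location until the belief crosses a decision threshold.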
Yap, Kevin Yi-Lwern; See, Cheng Shang; Kuo, En Yi; Chui, Wai Keung; Chan, Alexandre
2012-02-01
Patients with cancer who use complementary and alternative medicines (CAMs) in conjunction with chemotherapy treatment are at risk of manifesting anticancer drug-CAM interactions (DCIs), which may lead to negative therapeutic outcomes. This article describes a novel iPhone application developed for the Mobile Internet, called OncoRx-MI, which identifies DCIs of single-agent and multiple-agent chemotherapy regimen (CReg) prescriptions. Drug-, CAM-, and DCI-related information was compiled from various hardcopy and softcopy sources, and published literature from PubMed. Overall management plans for the CRegs were then developed. The iPhone Web documents were constructed using Adobe software and programming scripts, and mounted onto a third-party server. DCI searches are based on CReg acronyms, and OncoRx-MI is designed to fit the iPhone screen configuration for improved usability. A small usability study was also carried out and the user feedback presented. OncoRx-MI is able to detect over 2700 interactions between 256 CRegs and 166 CAMs, making up a total of over 4400 DCI pairs. The CAMs are classified into seven categories based on their uses in supportive care, and non-cancer-related CAMs are also included. The majority of the DCIs are pharmacokinetic in nature (79%), involving the induction and inhibition of the cytochrome P450 isozymes and p-glycoprotein. Pharmacodynamic DCIs include hepatotoxicity (39%), altered corticosteroid efficacies (30%), and increased risks of hypoglycemia (4%), hypertensive crisis (2%), bleeding, and serotonin syndrome (1% each). OncoRx-MI is the first mobile application of its kind that allows searching of DCIs for CRegs through 3G networks, and is intended to improve pharmaceutical care of patients with cancer by assisting health care practitioners in managing CReg interactions in their clinical practices.
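At its core, the DCI search the abstract describes is a lookup from (regimen acronym, CAM) pairs to interaction records. The sketch below uses invented regimens, CAMs, and mechanisms, not entries from the actual OncoRx-MI database:

```python
# Hypothetical interaction records; keys are (regimen acronym, CAM name).
DCI_DB = {
    ("CHOP", "St John's Wort"): "pharmacokinetic: CYP3A4 induction",
    ("FOLFOX", "ginkgo"): "pharmacodynamic: increased bleeding risk",
}

def find_interactions(regimen, cams):
    """Return {CAM: mechanism} for every CAM interacting with the regimen."""
    return {cam: DCI_DB[(regimen, cam)]
            for cam in cams if (regimen, cam) in DCI_DB}

hits = find_interactions("CHOP", ["St John's Wort", "fish oil"])
print(hits)  # only the CAM with a known interaction record is returned
```

Keying the lookup on the regimen acronym, as OncoRx-MI does, lets one query cover every drug in a multi-agent regimen at once instead of checking drugs individually.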
Managing Scientific Software Complexity with Bocca and CCA
Allan, Benjamin A.; Norris, Boyana; Elwasif, Wael R.; ...
2008-01-01
In high-performance scientific software development, the emphasis is often on short time to first solution. Even when the development of new components mostly reuses existing components or libraries and only small amounts of new code must be created, dealing with the component glue code and software build processes to obtain complete applications is still tedious and error-prone. Component-based software meant to reduce complexity at the application level increases complexity to the extent that the user must learn and remember the interfaces and conventions of the component model itself. To address these needs, we introduce Bocca, the first tool to enable application developers to perform rapid component prototyping while maintaining robust software-engineering practices suitable to HPC environments. Bocca provides project management and a comprehensive build environment for creating and managing applications composed of Common Component Architecture components. Of critical importance for high-performance computing (HPC) applications, Bocca is designed to operate in a language-agnostic way, simultaneously handling components written in any of the languages commonly used in scientific applications: C, C++, Fortran, Python and Java. Bocca automates the tasks related to the component glue code, freeing the user to focus on the scientific aspects of the application. Bocca embraces the philosophy pioneered by Ruby on Rails for web applications: start with something that works, and evolve it to the user's purpose.
NASA Technical Reports Server (NTRS)
Horsham, Gary A. P.
1992-01-01
The structure and composition of a new, emerging software application, which models and analyzes space exploration scenario options for feasibility based on technology development projections, is presented. The software application consists of four main components: a scenario generator for designing and inputting scenario options and constraints; a processor which performs algorithmic coupling and options analyses of mission activity requirements and technology capabilities; a results display which graphically and textually shows coupling and options analysis results; and a data/knowledge base which contains information on a variety of mission activities and (power and propulsion) technology system capabilities. The general long-range study process used by NASA to support recent studies is briefly introduced to provide the primary basis for comparison for discussing the potential advantages to be gained from developing and applying this kind of application. A hypothetical example of a scenario option is presented to convey what the application is, how it works, and when it might be applied.
NASA Technical Reports Server (NTRS)
Horsham, Gary A. P.
1991-01-01
The structure and composition of a new, emerging software application, which models and analyzes space exploration scenario options for feasibility based on technology development projections, is presented. The software application consists of four main components: a scenario generator for designing and inputting scenario options and constraints; a processor which performs algorithmic coupling and options analyses of mission activity requirements and technology capabilities; a results display which graphically and textually shows coupling and options analysis results; and a data/knowledge base which contains information on a variety of mission activities and (power and propulsion) technology system capabilities. The general long-range study process used by NASA to support recent studies is briefly introduced to provide the primary basis for comparison for discussing the potential advantages to be gained from developing and applying this kind of application. A hypothetical example of a scenario option is presented to convey what the application is, how it works, and when it might be applied.
NASA Technical Reports Server (NTRS)
Fountain T.; Tilak, S.; Shin, P.; Hubbard, P.; Freudinger, L.
2009-01-01
The Open Source DataTurbine Initiative is an international community of scientists and engineers sharing a common interest in real-time streaming data middleware and applications. The technology base of the OSDT Initiative is the DataTurbine open source middleware. Key applications of DataTurbine include coral reef monitoring, lake monitoring and limnology, biodiversity and animal tracking, structural health monitoring and earthquake engineering, airborne environmental monitoring, and environmental sustainability. DataTurbine software emerged as a commercial product in the 1990s from collaborations between NASA and private industry. In October 2007, a grant from the US National Science Foundation (NSF) Office of Cyberinfrastructure allowed us to transition DataTurbine from a proprietary software product into an open source software initiative. This paper describes the DataTurbine software and highlights key applications in environmental monitoring.
Software-defined Radio Based Measurement Platform for Wireless Networks
Chao, I-Chun; Lee, Kang B.; Candell, Richard; Proctor, Frederick; Shen, Chien-Chung; Lin, Shinn-Yan
2015-01-01
End-to-end latency is critical to many distributed applications and services that are based on computer networks. There has been a dramatic push to adopt wireless networking technologies and protocols (such as WiFi, ZigBee, WirelessHART, Bluetooth, ISA100.11a, etc.) into time-critical applications. Examples of such applications include industrial automation, telecommunications, power utility, and financial services. While performance measurement of wired networks has been extensively studied, measuring and quantifying the performance of wireless networks face new challenges and demand different approaches and techniques. In this paper, we describe the design of a measurement platform based on the technologies of software-defined radio (SDR) and IEEE 1588 Precision Time Protocol (PTP) for evaluating the performance of wireless networks. PMID:27891210
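With sender and receiver clocks disciplined by IEEE 1588 PTP, the core end-to-end measurement reduces to differencing timestamps taken at each end. A minimal illustrative sketch (function names and the sample numbers are invented, not taken from the paper):

```python
def one_way_latency_us(tx_ns, rx_ns):
    """One-way latency in microseconds from PTP-synchronized nanosecond timestamps."""
    return (rx_ns - tx_ns) / 1000.0

def jitter_us(latencies):
    """Peak-to-peak jitter over a series of latency samples."""
    return max(latencies) - min(latencies)

# Three packets: (transmit timestamp, receive timestamp) in nanoseconds.
samples = [one_way_latency_us(t, r) for t, r in
           [(1000, 251000), (2_000_000, 2_263_000), (4_000_000, 4_247_000)]]
print(samples)             # [250.0, 263.0, 247.0]
print(jitter_us(samples))  # 16.0
```

Without synchronized clocks, only round-trip measurements are meaningful; the point of the PTP-based platform is to make these one-way differences valid.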
Software-defined Radio Based Measurement Platform for Wireless Networks.
Chao, I-Chun; Lee, Kang B; Candell, Richard; Proctor, Frederick; Shen, Chien-Chung; Lin, Shinn-Yan
2015-10-01
End-to-end latency is critical to many distributed applications and services that are based on computer networks. There has been a dramatic push to adopt wireless networking technologies and protocols (such as WiFi, ZigBee, WirelessHART, Bluetooth, ISA100.11a, etc.) into time-critical applications. Examples of such applications include industrial automation, telecommunications, power utility, and financial services. While performance measurement of wired networks has been extensively studied, measuring and quantifying the performance of wireless networks face new challenges and demand different approaches and techniques. In this paper, we describe the design of a measurement platform based on the technologies of software-defined radio (SDR) and IEEE 1588 Precision Time Protocol (PTP) for evaluating the performance of wireless networks.
A multi-agent system for coordinating international shipping
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldsmith, S.Y.; Phillips, L.R.; Spires, S.V.
1998-05-01
Moving commercial cargo across the US-Mexico border is currently a complex, paper-based, error-prone process that incurs expensive inspections and delays at several ports of entry in the Southwestern US. Improved information handling will dramatically reduce border dwell time, variation in delivery time, and inventories, and will give better control of the shipment process. The Border Trade Facilitation System (BTFS) is an agent-based collaborative work environment that assists geographically distributed commercial and government users with transshipment of goods across the US-Mexico border. Software agents mediate the creation, validation and secure sharing of shipment information and regulatory documentation over the Internet, using the World Wide Web to interface with human actors. Agents are organized into agencies, each representing a commercial or government organization. Agents perform four specific functions on behalf of their user organizations: (1) agents with domain knowledge elicit commercial and regulatory information from human specialists through forms presented via web browsers; (2) agents mediate information from forms with diverse ontologies, copying invariant data from one form to another, thereby eliminating the need for duplicate data entry; (3) cohorts of distributed agents coordinate the work flow among the various information providers and monitor overall progress of the documentation and the location of the shipment to ensure that all regulatory requirements are met prior to arrival at the border; (4) agents provide status information to human actors and attempt to influence them when problems are predicted.
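Function (2) above, copying invariant data between forms whose fields follow different ontologies, can be sketched as a simple field mapping. The field names and the mapping are invented for illustration; the BTFS agents mediate far richer ontologies:

```python
# Hypothetical mapping between a US export form and a Mexican import form.
US_TO_MX = {
    "shipper_name": "nombre_remitente",
    "gross_weight_kg": "peso_bruto_kg",
    "hs_code": "fraccion_arancelaria",
}

def mediate(us_form, mx_form):
    """Copy shared fields into the counterpart form so no one re-keys them.

    Fields already filled in by a human are never overwritten.
    """
    out = dict(mx_form)
    for us_field, mx_field in US_TO_MX.items():
        if us_field in us_form:
            out.setdefault(mx_field, us_form[us_field])
    return out

us = {"shipper_name": "Acme", "gross_weight_kg": 1200, "hs_code": "8471.30"}
print(mediate(us, {}))
```

A real mediation agent would resolve units and vocabularies, not just field names, but the invariant-data principle is the same.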
Study on Spacelab software development and integration concepts
NASA Technical Reports Server (NTRS)
1974-01-01
A study was conducted to define the complexity and magnitude of the Spacelab software challenge. The study was based on current Spacelab program concepts, anticipated flight schedules, and ground operation plans. The study was primarily directed toward identifying and solving problems related to the experiment flight application and tests and checkout software executing in the Spacelab onboard command and data management subsystem (CDMS) computers and electrical ground support equipment (EGSE). The study provides a conceptual base from which it is possible to proceed into the development phase of the Software Test and Integration Laboratory (STIL) and establishes guidelines for the definition of standards which will ensure that the total Spacelab software is understood prior to entering development.
Study on a novel laser target detection system based on software radio technique
NASA Astrophysics Data System (ADS)
Song, Song; Deng, Jia-hao; Wang, Xue-tian; Gao, Zhen; Sun, Ji; Sun, Zhi-hui
2008-12-01
This paper presents the application of software radio techniques to a laser target detection system based on pseudo-random code modulation. Starting from the theory of software radio, the basic framework of the system, the hardware platform, and the implementation of the software system are detailed. The block diagram of the system, the DSP circuit, the block diagram of the pseudo-random code generator, and the flow diagram of the signal processing software are also designed. Experimental results have shown that the application of software radio techniques provides a novel method to realize the modularization, miniaturization and intelligence of the laser target detection system, and that upgrades and improvements of the system become simpler, more convenient, and cheaper.
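The pseudo-random code generator and the delay-recovering correlation at the heart of such a detector can be sketched as follows. A maximal-length 4-bit LFSR (polynomial x^4 + x^3 + 1) is assumed purely for illustration; the actual system's code length and hardware differ:

```python
def lfsr_sequence(seed=0b1111, length=15):
    """15-chip maximal-length pseudo-random sequence as +/-1 chips."""
    state = seed
    out = []
    for _ in range(length):
        out.append(1 if state & 1 else -1)
        fb = (state ^ (state >> 1)) & 1      # feedback from the two tapped bits
        state = (state >> 1) | (fb << 3)     # shift right, feed back into MSB
    return out

def circular_correlation(code, echo):
    """Correlate the received echo against every cyclic shift of the code."""
    n = len(code)
    return [sum(code[i] * echo[(i + k) % n] for i in range(n)) for k in range(n)]

code = lfsr_sequence()
echo = code[-5:] + code[:-5]        # echo of the code delayed by 5 chips
corr = circular_correlation(code, echo)
print(corr.index(max(corr)))        # 5 -> recovered chip delay
```

The sharp autocorrelation peak of the m-sequence (15 at the true lag, -1 elsewhere) is what lets the delay, and hence the target range, be recovered against noise.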
Object-oriented design of medical imaging software.
Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R
1994-01-01
A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva, as part of a hospital wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of noncomputer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. This software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different operating systems: the Unix X-11/OSF-Motif based workstations, and the Macintosh family.
Sakhteman, Amirhossein; Zare, Bijan
2016-01-01
An interactive Windows-based application, Modelface, is presented for the Modeller software. The application is able to run all steps of homology modeling, including PDB-to-FASTA generation, running Clustal, model building and loop refinement. Other modules of Modeller, including energy calculation, energy minimization and the ability to make single-point mutations in PDB structures, are also implemented inside Modelface. The API is a simple batch-based application with no memory occupation and is free of charge for academic use. The application is also able to repair missing atom types in PDB structures, making it suitable for many molecular modeling studies such as docking and molecular dynamics simulation. Some successful instances of modeling studies using Modelface are also reported. PMID:28243276
The LUE data model for representation of agents and fields
NASA Astrophysics Data System (ADS)
de Jong, Kor; Schmitz, Oliver; Karssenberg, Derek
2017-04-01
Traditionally, agent-based and field-based modelling environments use different data models to represent the state of the information they manipulate. In agent-based modelling, involving the representation of phenomena as objects bounded in space and time, agents are often represented by classes, each of which represents a particular kind of agent and all its properties. Such classes can be used to represent entities like people, birds, cars and countries. In field-based modelling, involving the representation of the environment as continuous fields, fields are often represented by a discretization of space, using multidimensional arrays, each mostly storing a single attribute. Such arrays can be used to represent the elevation of the land surface, the pH of the soil, or the population density in an area, for example. Representing a population of agents by class instances grouped in collections is an intuitive way of organizing information. A drawback, though, is that models that store collections of class instances (each instance grouping its properties) execute more slowly than models that group collections of properties. The field representation, on the other hand, is convenient for the efficient execution of models. Another drawback is that, because the data models used are so different, integrating agent-based and field-based models becomes difficult, since the model builder has to deal with multiple concepts, and often multiple modelling environments. With the development of the LUE data model [1] we aim at representing agents and fields within a single paradigm, by combining the advantages of the data models used in agent-based and field-based modelling. This removes the barrier to writing integrated agent-based and field-based models. The resulting data model is intuitive to use and allows for efficient execution of models. LUE is both a high-level conceptual data model and a low-level physical data model.
The LUE conceptual data model is a generalization of the data models used in agent-based and field-based modelling. The LUE physical data model [2] is an implementation of the LUE conceptual data model in HDF5. In our presentation we will provide details of our approach to organizing information about agents and fields. We will show examples of agent and field data represented by the conceptual and physical data model. References: [1] de Bakker, M.P., de Jong, K., Schmitz, O., Karssenberg, D., 2016. Design and demonstration of a data model to integrate agent-based and field-based modelling. Environmental Modelling and Software. http://dx.doi.org/10.1016/j.envsoft.2016.11.016 [2] de Jong, K., 2017. LUE source code. https://github.com/pcraster/lue
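The efficiency trade-off described above is essentially array-of-structures versus structure-of-arrays. A toy contrast of the two layouts (property names invented; LUE's actual representation is an HDF5 schema, not Python dictionaries):

```python
# Agent-based style: array of structures -- intuitive, one record per agent.
agents_aos = [{"id": i, "x": float(i), "biomass": 2.0 * i} for i in range(4)]

# Field/LUE style: structure of arrays -- one array per property,
# efficient for whole-population operations.
agents_soa = {
    "id": list(range(4)),
    "x": [float(i) for i in range(4)],
    "biomass": [2.0 * i for i in range(4)],
}

# The same model step written against both layouts:
total_aos = sum(a["biomass"] for a in agents_aos)   # hops between records
total_soa = sum(agents_soa["biomass"])              # streams one contiguous array
print(total_aos, total_soa)  # 12.0 12.0
```

In compiled modelling environments the structure-of-arrays layout additionally enables vectorization and contiguous memory access, which is the performance argument made above.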
Starlink Software Developments
NASA Astrophysics Data System (ADS)
Bly, M. J.; Giaretta, D.; Currie, M. J.; Taylor, M.
Some current and upcoming software developments from Starlink were demonstrated. These included invoking traditional Starlink applications via web services, the current version of the ORAC-DR reduction pipeline, and some new Java-based tools including Treeview, an interactive explorer of hierarchical data structures.
NASA Astrophysics Data System (ADS)
Wasielewska, K.; Ganzha, M.
2012-10-01
In this paper we consider combining ontologically demarcated information with Saaty's Analytic Hierarchy Process (AHP) [1] for the multicriterial assessment of offers during contract negotiations. The context for the proposal is provided by the Agents in Grid project (AiG; [2]), which aims at the development of an agent-based infrastructure for efficient resource management in the Grid. In the AiG project, software agents representing users can either (1) join a team and earn money, or (2) find a team to execute a job. Moreover, agents form teams, whose managers negotiate the terms of potential collaboration with clients and workers. Here, ontologically described contracts (Service Level Agreements) are the results of autonomous multiround negotiations. Therefore, given the relatively complex nature of the negotiated contracts, multicriterial assessment of proposals plays a crucial role. The AHP method is based on pairwise comparisons of criteria and relies on the judgement of a panel of experts. It measures how well an offer serves the objective of a decision maker. In this paper, we describe how the AHP method can be used to assess ontologically described contract proposals.
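The pairwise-comparison step of AHP can be sketched with the standard geometric-mean approximation of the principal eigenvector. The criteria and judgment values below are invented for illustration, not taken from the AiG project:

```python
from math import prod

def ahp_priorities(pairwise):
    """Criterion weights from a reciprocal pairwise-comparison matrix,
    via the geometric-mean approximation of the principal eigenvector."""
    n = len(pairwise)
    geo = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(geo)
    return [g / total for g in geo]

# Invented example: assessing an offer on price, availability, reliability.
pairwise = [
    [1.0, 3.0, 5.0],      # price is 3x as important as availability, 5x reliability
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
weights = ahp_priorities(pairwise)
print([round(w, 3) for w in weights])  # [0.637, 0.258, 0.105]
```

Each offer is then scored as the weight-sum of its per-criterion ratings, which is the "how well does an offer serve the objective" measure referred to above.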
ERIC Educational Resources Information Center
Thompson, Kate; Reimann, Peter
2010-01-01
A classification system that was developed for the use of agent-based models was applied to strategies used by school-aged students to interrogate an agent-based model and a system dynamics model. These were compared, and relationships between learning outcomes and the strategies used were also analysed. It was found that the classification system…
RELAP-7 Software Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling
This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework, MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability for all reactor system simulation scenarios.
Ground and Space Radar Volume Matching and Comparison Software
NASA Technical Reports Server (NTRS)
Morris, Kenneth; Schwaller, Mathew
2010-01-01
This software enables easy comparison of ground- and space-based radar observations. The software was initially designed to compare ground radar reflectivity from operational, ground-based S- and C-band meteorological radars with comparable measurements from the Tropical Rainfall Measuring Mission (TRMM) satellite's Precipitation Radar (PR) instrument. The software is also applicable to other ground-based and space-based radars. The ground and space radar volume matching and comparison software was developed in response to requirements defined by the Ground Validation System (GVS) of Goddard's Global Precipitation Mission (GPM) project. This software innovation is specifically concerned with simplifying the comparison of ground- and space-based radar measurements for the purpose of GPM algorithm and data product validation. This software is unique in that it provides an operational environment to routinely create comparison products, and uses a direct geometric approach to derive common volumes of space- and ground-based radar data. In this approach, spatially coincident volumes are defined by the intersection of individual space-based Precipitation Radar rays with each of the conical elevation sweeps of the ground radar. Thus, the resampled volume elements of the space and ground radar reflectivity can be directly compared to one another.
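The direct geometric idea, where a near-vertical space-radar ray pierces successive conical elevation sweeps, can be illustrated with a flat-earth simplification. This is only a sketch; the operational software uses proper (e.g. 4/3-earth) beam-propagation geometry, and the numbers are invented:

```python
from math import tan, radians

def sweep_height_km(ground_range_km, elevation_deg):
    """Height (flat-earth approximation) at which a vertical space-radar ray,
    at the given ground range from the ground radar, pierces an elevation sweep."""
    return ground_range_km * tan(radians(elevation_deg))

# One PR ray, 50 km from the ground radar, crossing three typical sweeps:
for elev in (0.5, 1.5, 2.4):
    print(round(sweep_height_km(50.0, elev), 2))
```

Each such intersection height defines the center of one matched volume in which the PR gates and the ground-radar bins are averaged and compared.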
Runtime Verification of Pacemaker Functionality Using Hierarchical Fuzzy Colored Petri-nets.
Majma, Negar; Babamir, Seyed Morteza; Monadjemi, Amirhassan
2017-02-01
Today, implanted medical devices are increasingly used for many patients with diverse health problems. However, several runtime problems and errors are reported by the relevant organizations, some even resulting in patient death. One of those devices is the pacemaker, a device that helps the patient regulate the heartbeat by connecting to the cardiac vessels. The device is directed by its software, so any failure in this software causes a serious malfunction. This study therefore aims at a better way of monitoring the device's software behavior to decrease the risk of failure. Accordingly, we supervise the runtime function and status of the software. Software verification means examining whether the running software satisfies the limitations and needs of the system's users. In this paper, a method to verify the pacemaker software, based on the fuzzy function of the device, is presented. The functional limitations of the device are identified and expressed as fuzzy rules, and the device is then verified with a hierarchical Fuzzy Colored Petri-net (FCPN) built from these software limits. Building on the experience gained in our previous studies, which used (1) Fuzzy Petri-nets (FPNs) to verify insulin pumps, (2) Colored Petri-nets (CPNs) to verify the pacemaker, and (3) a software agent with Petri-net-based knowledge to verify the pacemaker, the runtime behavior of the pacemaker software is examined here with an HFCPN. This is a step forward compared to the earlier work: the HFCPN in this paper reduces the complexity relative to the FPNs and CPNs used in our previous studies. By presenting the Petri-net (PN) in a hierarchical form, the verification runtime decreased by 90.61% compared to the verification runtime in the earlier work. Since runtime verification requires an inference engine, we used the HFCPN to enhance the performance of the inference engine.
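As a greatly simplified illustration of one fuzzy limit rule of the kind such a monitor evaluates (the membership shape and the pacing-rate limits are invented, not taken from the paper):

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function rising on [a, b], falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def pacing_ok_degree(rate_bpm):
    """Degree to which an observed pacing rate satisfies a fuzzy constraint
    centered on 90 bpm (hypothetical limits for illustration)."""
    return tri(rate_bpm, 50.0, 90.0, 130.0)

print(pacing_ok_degree(90.0))   # 1.0  -> fully compliant
print(pacing_ok_degree(120.0))  # 0.25 -> marginal, may fire a warning
print(pacing_ok_degree(140.0))  # 0.0  -> rule violated
```

In the actual approach, many such rules are composed into the transitions of a hierarchical fuzzy colored Petri net, which acts as the runtime inference engine.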
A Stigmergy Approach for Open Source Software Developer Community Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Xiaohui; Beaver, Justin M; Potok, Thomas E
2009-01-01
The stigmergy collaboration approach provides a hypothesized explanation of how online groups work together. In this research, we present a stigmergy approach for building an agent-based open source software (OSS) developer community collaboration simulation. We use groups of actors who collaborate on OSS projects as our frame of reference and investigate how the choices actors make in contributing their work to the projects determine the global status of the whole OSS project. In our simulation, forum posts and project code serve as the digital pheromone, and a modified Pierre-Paul Grasse pheromone model is used to compute each developer agent's behavior-selection probability.
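The pheromone-driven choice can be sketched as evaporation followed by proportional selection. This is an illustrative simplification; the modified Grasse model in the paper is more elaborate, and the numbers are invented:

```python
def selection_probabilities(pheromone, evaporation=0.1):
    """Probability that a developer agent picks each project, after evaporation.

    pheromone: accumulated digital pheromone (forum posts + code) per project.
    """
    remaining = [p * (1.0 - evaporation) for p in pheromone]
    total = sum(remaining)
    return [r / total for r in remaining]

# Three OSS projects with different accumulated activity:
probs = selection_probabilities([40.0, 10.0, 50.0])
print(probs)  # [0.4, 0.1, 0.5]
```

Because agents preferentially join active projects and their contributions deposit further pheromone, the model reproduces the rich-get-richer dynamics seen in real OSS communities.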
The Application of Flash in Web-Based Multimedia Courseware Development
ERIC Educational Resources Information Center
Chen, Jun; Wang, Zu-Yuan; Wu, Yuren
2009-01-01
Purpose: The purpose of this paper is to introduce some new functions achieved in a web-based multimedia courseware, which is developed by Flash software and used by part-time graduate students. Design/methodology/approach: The courseware uses Adobe Flash CS3 as its development software, which supports Actionscript language, FMS and FLV technology…
NASA Technical Reports Server (NTRS)
Jones, Kennie H.; Randall, Donald P.; Stallcup, Scott S.; Rowell, Lawrence F.
1988-01-01
The Environment for Application Software Integration and Execution, EASIE, provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational data base management system. In volume 2, a SYSTEM LIBRARY PROCESSOR is used to construct a DATA DICTIONARY describing all relations defined in the data base, and a TEMPLATE LIBRARY. A TEMPLATE is a description of all subsets of relations (including conditional selection criteria and sorting specifications) to be accessed as input or output for a given application. Together, these form the SYSTEM LIBRARY, which is used to automatically produce the data base schema, FORTRAN subroutines to retrieve/store data from/to the data base, and instructions to a generic REVIEWER program providing review/modification of data for a given template. Automation of these functions eliminates much of the tedious, error-prone work required by the usual approach to data base integration.
Web Application Software for Ground Operations Planning Database (GOPDb) Management
NASA Technical Reports Server (NTRS)
Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey
2013-01-01
A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.
An experimental evaluation of software redundancy as a strategy for improving reliability
NASA Technical Reports Server (NTRS)
Eckhardt, Dave E., Jr.; Caglayan, Alper K.; Knight, John C.; Lee, Larry D.; Mcallister, David F.; Vouk, Mladen A.; Kelly, John P. J.
1990-01-01
The strategy of using multiple versions of independently developed software as a means to tolerate residual software design faults is suggested by the success of hardware redundancy for tolerating hardware failures. Although it is generally accepted that the independence of hardware failures resulting from physical wearout can lead to substantial increases in reliability for redundant hardware structures, a similar conclusion is not immediate for software. The degree to which design faults are manifested as independent failures determines the effectiveness of redundancy as a method for improving software reliability. Interest in multi-version software centers on whether it provides an adequate measure of increased reliability to warrant its use in critical applications. The effectiveness of multi-version software is studied by comparing estimates of the failure probabilities of these systems with the failure probabilities of single versions. The estimates are obtained under a model of dependent failures and compared with estimates obtained when failures are assumed to be independent. The experimental results are based on twenty versions of an aerospace application developed and certified by sixty programmers from four universities. Descriptions of the application, development and certification processes, and operational evaluation are given together with an analysis of the twenty versions.
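The independence-versus-dependence comparison discussed above can be made concrete with a back-of-envelope model, assuming simple 3-version majority voting and a single common-failure probability. This is a crude sketch, not the paper's dependent-failure model:

```python
def majority_fail_independent(p):
    """P(at least 2 of 3 versions fail on an input) if failures are independent."""
    return 3 * p**2 * (1 - p) + p**3

def majority_fail_correlated(p, p_common):
    """Crude dependence model: with probability p_common a shared design
    fault fails all versions together; otherwise failures are independent."""
    return p_common + (1 - p_common) * majority_fail_independent(p)

p = 0.01  # per-version failure probability (invented)
print(majority_fail_independent(p))        # ~3e-4: large gain over a single version
print(majority_fail_correlated(p, 0.001))  # gain largely erased by the common fault
```

Even a small correlated-failure probability dominates the system failure rate, which is why the degree of failure independence is the central empirical question of the study.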
Software Safety Risk in Legacy Safety-Critical Computer Systems
NASA Technical Reports Server (NTRS)
Hill, Janice; Baggs, Rhoda
2007-01-01
Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with only meeting certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for formally ascertaining a software safety risk assessment that provides measurements of software safety for legacy systems, which may or may not have the suite of software engineering documentation now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.
García-Jacas, César R; Marrero-Ponce, Yovani; Acevedo-Martínez, Liesner; Barigye, Stephen J; Valdés-Martiní, José R; Contreras-Torres, Ernesto
2014-07-05
The present report introduces the QuBiLS-MIDAS software belonging to the ToMoCoMD-CARDD suite for the calculation of three-dimensional molecular descriptors (MDs) based on the two-linear (bilinear), three-linear, and four-linear (multilinear or N-linear) algebraic forms. Thus, it is unique software that computes these tensor-based indices. These descriptors, establish relations for two, three, and four atoms by using several (dis-)similarity metrics or multimetrics, matrix transformations, cutoffs, local calculations and aggregation operators. The theoretical background of these N-linear indices is also presented. The QuBiLS-MIDAS software was developed in the Java programming language and employs the Chemical Development Kit library for the manipulation of the chemical structures and the calculation of the atomic properties. This software is composed by a desktop user-friendly interface and an Abstract Programming Interface library. The former was created to simplify the configuration of the different options of the MDs, whereas the library was designed to allow its easy integration to other software for chemoinformatics applications. This program provides functionalities for data cleaning tasks and for batch processing of the molecular indices. In addition, it offers parallel calculation of the MDs through the use of all available processors in current computers. The studies of complexity of the main algorithms demonstrate that these were efficiently implemented with respect to their trivial implementation. Lastly, the performance tests reveal that this software has a suitable behavior when the amount of processors is increased. Therefore, the QuBiLS-MIDAS software constitutes a useful application for the computation of the molecular indices based on N-linear algebraic maps and it can be used freely to perform chemoinformatics studies. Copyright © 2014 Wiley Periodicals, Inc.
Access Control for Cooperation Systems Based on Group Situation
NASA Astrophysics Data System (ADS)
Kim, Minsoo; Joshi, James B. D.; Kim, Minkoo
Cooperation systems characterize many emerging environments such as ubiquitous and pervasive systems. Agent-based cooperation systems have been proposed in the literature to address the challenges of such emerging application environments. A key aspect of such agent-based cooperation systems is the group situation, which changes dynamically and governs the requirements of the cooperation. While individual agent context is important, the overall cooperation behavior is driven more by the group context, because of the relationships and interactions between agents. Dynamic access control based on group situation is a crucial challenge in such cooperation systems. In this paper we propose a dynamic role-based access control model for cooperation systems based on group situation. The model emphasizes capability-based agent-to-role mapping and group-situation-based permission assignment to capture dynamic access policies that evolve continuously.
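A minimal Python sketch of the two mechanisms the abstract names, capability-based agent-to-role mapping and group-situation-based permission assignment; all role, capability, and permission names are hypothetical:

```python
ROLE_REQUIREMENTS = {            # role -> capabilities an agent must hold
    "coordinator": {"plan", "communicate"},
    "worker": {"sense"},
}
SITUATION_PERMISSIONS = {        # (role, group situation) -> permissions
    ("coordinator", "normal"): {"read", "assign_task"},
    ("coordinator", "emergency"): {"read", "assign_task", "override"},
    ("worker", "normal"): {"read"},
    ("worker", "emergency"): {"read", "report"},
}

def roles_for(agent_capabilities):
    """Map an agent to every role whose required capabilities it covers."""
    return {r for r, need in ROLE_REQUIREMENTS.items()
            if need <= agent_capabilities}

def permissions_for(agent_capabilities, group_situation):
    """Union of the agent's role permissions under the current situation."""
    perms = set()
    for role in roles_for(agent_capabilities):
        perms |= SITUATION_PERMISSIONS.get((role, group_situation), set())
    return perms

agent = {"plan", "communicate", "sense"}
print(sorted(permissions_for(agent, "normal")))
print(sorted(permissions_for(agent, "emergency")))
```

The same agent thus gains or loses permissions as the group situation changes, without any re-assignment of roles.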
Analysis of a hardware and software fault tolerant processor for critical applications
NASA Technical Reports Server (NTRS)
Dugan, Joanne B.
1993-01-01
Computer systems for critical applications must be designed to tolerate software faults as well as hardware faults. A unified approach to tolerating hardware and software faults is characterized by classifying faults in terms of duration (transient or permanent) rather than source (hardware or software). Errors arising from transient faults can be handled through masking or voting, but errors arising from permanent faults require system reconfiguration to bypass the failed component. Most errors which are caused by software faults can be considered transient, in that they are input-dependent. Software faults are triggered by a particular set of inputs. Quantitative dependability analysis of systems which exhibit a unified approach to fault tolerance can be performed by a hierarchical combination of fault tree and Markov models. A methodology for analyzing hardware and software fault tolerant systems is applied to the analysis of a hypothetical system, loosely based on the Fault Tolerant Parallel Processor. The models consider both transient and permanent faults, hardware and software faults, independent and related software faults, automatic recovery, and reconfiguration.
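The unified fault-handling idea, masking transient errors by voting while reconfiguring around permanent faults, can be sketched as follows (a toy three-channel system, not the FTPP design):

```python
from collections import Counter

def vote(outputs):
    """Mask a transient error by majority voting over redundant channels."""
    value, count = Counter(outputs).most_common(1)[0]
    return value if count > len(outputs) // 2 else None  # None: no majority

def handle_fault(system, channel, duration):
    """Unified handling keyed on duration, not source: transient faults
    are masked by voting; permanent faults force reconfiguration to
    bypass the failed channel."""
    if duration == "transient":
        return vote(system["outputs"])          # keep configuration, outvote
    system["active"].remove(channel)            # permanent: bypass component
    return vote([system["outputs"][c] for c in system["active"]])

system = {"active": [0, 1, 2], "outputs": [42, 42, 7]}  # channel 2 faulty
print(handle_fault(system, channel=2, duration="transient"))
print(handle_fault(system, channel=2, duration="permanent"))
```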
Applang - A DSL for specification of mobile applications for android platform based on textX
NASA Astrophysics Data System (ADS)
Kosanović, Milan; Dejanović, Igor; Milosavljević, Gordana
2016-06-01
Mobile platforms have become a ubiquitous part of our daily lives, putting ever more pressure on software developers to build more applications, faster, with support for different mobile operating systems. To foster faster development of mobile services and applications and to support various mobile operating systems, new software development approaches must be adopted. Domain-Specific Languages (DSLs) are a viable approach that promises to solve the problem of target-platform diversity as well as to facilitate rapid application development and shorter time-to-market. This paper presents Applang, a DSL for the specification of mobile applications for the Android platform, based on the textX meta-language. An application is described using the Applang DSL, and the source code for a target platform is automatically generated by the provided code generator. The same application, defined in a single Applang source, can be transformed to various targets with little or no manual modification.
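textX's real API differs in detail; this pure-Python sketch only illustrates the underlying idea of describing an app in a tiny DSL and generating target-platform source. The two-line specification and the emitted Java-like code are both hypothetical, not actual Applang syntax:

```python
import re

# A hypothetical two-line Applang-style specification (invented syntax).
SPEC = """
screen Main
button SayHello -> showToast "Hello!"
"""

def generate_android(spec):
    """Emit sketchy Java-like Android source from the mini-DSL."""
    lines = []
    for m in re.finditer(r"screen (\w+)", spec):
        lines.append(f"public class {m.group(1)}Activity extends Activity {{")
    for m in re.finditer(r'button (\w+) -> (\w+) "([^"]*)"', spec):
        lines.append(f"    // wired from DSL: button {m.group(1)}")
        lines.append(f'    void on{m.group(1)}() {{ {m.group(2)}("{m.group(3)}"); }}')
    lines.append("}")
    return "\n".join(lines)

print(generate_android(SPEC))
```

In the textX approach the grammar instead produces a metamodel, and the generator walks the parsed model rather than raw text; a second generator targeting another platform reuses the same model.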
A Primer for Agent-Based Simulation and Modeling in Transportation Applications
DOT National Transportation Integrated Search
2013-11-01
Agent-based modeling and simulation (ABMS) methods have been applied in a spectrum of research domains. This primer focuses on ABMS in the transportation interdisciplinary domain, describes the basic concepts of ABMS and the recent progress of ABMS i...
The FoReVer Methodology: A MBSE Framework for Formal Verification
NASA Astrophysics Data System (ADS)
Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald
2013-08-01
The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the space industry and has so far been addressed through rigorous system and software development processes and stringent verification and validation regimes. The Model Based Space System Engineering process (MBSSE), derived in the System and Software Functional Requirement Techniques (SSFRT) study, focused on the application of model-based engineering technologies to support the space system and software development processes, from mission-level requirements to software implementation, through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, where we aim to develop methodological, theoretical and technological support for a systematic approach to space avionics system development in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.
NASA Astrophysics Data System (ADS)
Kassim, Mohar; Zaidi, Ahmad Mujahid Ahmad; Sholihin Mokhtar, Rahmat
2018-05-01
Mobile software applications have become part of today's lifestyle. This mobile app is designed to help society be physically active. The application, named UPNM Cardio Fitness, is developed on the Android platform. Its original purpose is to measure and analyse the cardiovascular fitness of 18-year-old male military cadet officers through a 2.4 km run test. The application is backed by a database built on Google Fusion Tables, which stores and analyses the data received. The application consists of two parts: information about the individual and their respective fitness norms, which can be accessed either automatically or manually. The classification of the norms is obtained from the fitness norms of 120 male cadets aged 18. The norms are grouped into five categories: Excellent, Very Good, Good, Moderate and Poor. The software consists of five hyperlinks: the main page, individual information, test result, file and record. The application was created using MIT App Inventor software on Windows 7. The application has enabled researchers, particularly in the Science Training programme at UPNM, to carry out tests and to identify the fitness level of their trainees immediately, accurately, and systematically.
An intelligent agent for optimal river-reservoir system management
NASA Astrophysics Data System (ADS)
Rieker, Jeffrey D.; Labadie, John W.
2012-09-01
A generalized software package is presented for developing an intelligent agent for stochastic optimization of complex river-reservoir system management and operations. Reinforcement learning is an approach to artificial intelligence for developing a decision-making agent that learns the best operational policies without the need for explicit probabilistic models of hydrologic system behavior. The agent learns these strategies experientially in a Markov decision process through observational interaction with the environment and simulation of the river-reservoir system using well-calibrated models. The graphical user interface for the reinforcement learning process controller includes numerous learning method options and dynamic displays for visualizing the adaptive behavior of the agent. As a case study, the generalized reinforcement learning software is applied to developing an intelligent agent for optimal management of water stored in the Truckee river-reservoir system of California and Nevada for the purpose of streamflow augmentation for water quality enhancement. The intelligent agent successfully learns long-term reservoir operational policies that specifically focus on mitigating water temperature extremes during persistent drought periods that jeopardize the survival of threatened and endangered fish species.
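A toy Q-learning loop in the spirit of the approach described here; the five-level storage model, reward, and learning parameters are all illustrative inventions, not the Truckee system model:

```python
import random

random.seed(1)

# Toy MDP: storage level 0..4, action = release 0 or 1 unit per step.
# Reward penalizes extreme levels, loosely mimicking water-quality goals.
def step(level, release):
    inflow = random.choice([0, 1])               # stochastic hydrology
    nxt = max(0, min(4, level + inflow - release))
    return nxt, -abs(nxt - 2)                    # best near mid storage

Q = {(s, a): 0.0 for s in range(5) for a in (0, 1)}
alpha, gamma, eps = 0.2, 0.9, 0.1
level = 2
for _ in range(5000):                            # experiential learning loop
    a = random.choice([0, 1]) if random.random() < eps else \
        max((0, 1), key=lambda x: Q[(level, x)])
    nxt, r = step(level, a)
    best_next = max(Q[(nxt, 0)], Q[(nxt, 1)])
    Q[(level, a)] += alpha * (r + gamma * best_next - Q[(level, a)])
    level = nxt

policy = {s: max((0, 1), key=lambda a: Q[(s, a)]) for s in range(5)}
print(policy)   # learned release decision per storage level
```

The real software replaces this toy transition function with well-calibrated river-reservoir simulation models and adds a GUI over the learning process.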
Scalco, Andrea; Ceschi, Andrea; Sartori, Riccardo
2018-01-01
It is likely that computer simulations will assume a greater role in the near future in investigating and understanding reality (Rand & Rust, 2011). In particular, agent-based models (ABMs) offer a method of investigating social phenomena that blends the knowledge of the social sciences with the advantages of virtual simulations. Within this context, developing algorithms able to recreate the reasoning engine of autonomous virtual agents is one of the most fragile aspects, and it is crucial to ground such models in well-supported psychological theoretical frameworks. For this reason, the present work discusses the application of the theory of planned behavior (TPB; Ajzen, 1991) in the context of agent-based modeling: it is argued that this framework might be more helpful than others for developing a valid representation of human behavior in computer simulations. Accordingly, the current contribution considers issues related to applying the model proposed by the TPB inside computer simulations and suggests potential solutions, in the hope of helping to shorten the distance between the fields of psychology and computer science.
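One common way to wire the TPB into an agent's reasoning engine is a weighted sum of the three antecedents with a behavioral threshold; the weights and threshold below are illustrative, not calibrated values from the paper:

```python
# TPB (Ajzen, 1991): intention is driven by attitude, subjective norm,
# and perceived behavioral control; weights here are hypothetical.
def intention(attitude, subjective_norm, perceived_control,
              w=(0.4, 0.3, 0.3)):
    return w[0] * attitude + w[1] * subjective_norm + w[2] * perceived_control

def acts(agent, neighbors, threshold=0.5):
    """Agent performs the behavior when intention crosses a threshold;
    the subjective norm is read off the neighbors' current behavior."""
    norm = sum(n["behaving"] for n in neighbors) / max(len(neighbors), 1)
    return intention(agent["attitude"], norm, agent["control"]) >= threshold

agent = {"attitude": 0.8, "control": 0.6}
peers = [{"behaving": 1}, {"behaving": 0}, {"behaving": 1}]
print(acts(agent, peers))
```

Because the norm term depends on neighbors, individually plausible rules like this one yield emergent population dynamics when iterated over a whole agent society.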
Nano/micromotors for security/defense applications. A review.
Singh, Virendra V; Wang, Joseph
2015-12-14
The new capabilities of man-made micro/nanomotors open up considerable opportunities for diverse security and defense applications. This review highlights new micromotor-based strategies for enhanced security monitoring and detoxification of chemical and biological warfare agents (CBWA). The movement of receptor-functionalized nanomotors offers great potential for sensing and isolating target bio-threats from complex samples. New mobile reactive materials based on zeolite or activated carbon offer considerable promise for the accelerated removal of chemical warfare agents. A wide range of proof-of-concept motor-based approaches, including the detection and destruction of anthrax spores, 'on-off' nerve-agent detection or effective neutralization of chemical warfare agents have thus been demonstrated. The propulsion of micromotors and their corresponding bubble tails impart significant mixing that greatly accelerates such detoxification processes. These nanomotors will thus empower sensing and destruction where stirring large quantities of decontaminating reagents and controlled mechanical agitation are impossible or undesired. New technological breakthroughs and greater sophistication of micro/nanoscale machines will lead to rapid translation of the micromotor research activity into practical defense applications, addressing the escalating threat of CBWA.
A mHealth Application for Chronic Wound Care: Findings of a User Trial
Friesen, Marcia R.; Hamel, Carole; McLeod, Robert D.
2013-01-01
This paper reports on the findings of a user trial of a mHealth application for pressure ulcer (bedsore) documentation. Pressure ulcers are a leading iatrogenic cause of death in developed countries and significantly impact quality of life for those affected. Pressure ulcers will be an increasing public health concern as the population ages. Electronic information systems are being explored to improve consistency and accuracy of documentation, improve the patient and caregiver experience, and ultimately improve patient outcomes. A software application was developed for Android smartphones and tablets and was trialed in a personal care home in Western Canada. The software application provides an electronic medical record for chronic wounds, replacing nurses' paper-based charting, and is positioned for integration with the facility's larger eHealth framework. The mHealth application offers three intended benefits over paper-based charting of chronic wounds: (1) the capacity for remote consultation (telehealth between facilities, practitioners, and/or remote communities), (2) data organization and analysis, including built-in alerts and automatically generated text-based and graph-based wound histories with wound images, and (3) tutorial support for non-specialized caregivers. The user trial yielded insights regarding the software application's design and functionality in the clinical setting, and highlighted the key role of wound photographs in enhancing patient and caregiver experiences, enhancing communication between multiple healthcare professionals, and leveraging the software's telehealth capacities. PMID:24256739
NREL Software Models Performance of Wind Plants (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2015-01-01
This NREL Highlight, developed for the February 2015 Alliance S&T Meeting, describes NREL's Simulator for Offshore Wind Farm Applications (SOWFA) software, used in collaboration with Norway-based Statoil to optimize the layouts and controls of wind plant arrays.
ERIC Educational Resources Information Center
Seman, Laio Oriel; Hausmann, Romeu; Bezerra, Eduardo Augusto
2018-01-01
Contribution: This paper presents the "PBL classroom model," an agent-based simulation (ABS) that allows testing of several scenarios of a project-based learning (PBL) application by considering different levels of soft skills and students' perception of the methodology. Background: While the community has made great advances in…
The agent-based spatial information semantic grid
NASA Astrophysics Data System (ADS)
Cui, Wei; Zhu, YaQiong; Zhou, Yong; Li, Deren
2006-10-01
Analyzing the characteristics of multi-agent systems and geographic ontology, the concept of the Agent-based Spatial Information Semantic Grid (ASISG) is defined and its architecture is advanced. ASISG is composed of multi-agent systems and a geographic ontology. The multi-agent systems comprise User Agents, a General Ontology Agent, Geo-Agents, Broker Agents, Resource Agents, Spatial Data Analysis Agents, Spatial Data Access Agents, a Task Execution Agent and a Monitor Agent. The ASISG architecture has three layers: the fabric layer, the grid management layer and the application layer. The fabric layer, composed of the Data Access Agent, Resource Agent and Geo-Agent, encapsulates the data of spatial information systems and exhibits a conceptual interface to the grid management layer. The grid management layer, composed of the General Ontology Agent, Task Execution Agent, Monitor Agent and Data Analysis Agent, uses a hybrid method to manage all resources registered with the General Ontology Agent, which is described by a general ontology system. The hybrid method combines resource dissemination and resource discovery: dissemination pushes resources from Local Ontology Agents to the General Ontology Agent, while discovery pulls resources from the General Ontology Agent to Local Ontology Agents. A Local Ontology Agent is derived from a specific domain and describes the semantic information of a local GIS. The Local Ontology Agents can be filtered to construct a virtual organization that provides a global scheme. The virtual organization lightens users' burden because they need not search information site by site manually. The application layer, composed of the User Agent, Geo-Agent and Task Execution Agent, provides a corresponding interface to a domain user.
The functions that ASISG should provide are: 1) Integration of different spatial information systems at the semantic level: the grid management layer establishes a virtual environment that seamlessly integrates all GIS nodes. 2) When the resource management system searches data across different spatial information systems, it transfers the meaning of the different Local Ontology Agents rather than accessing data directly, so search and query operate at the semantic level. 3) The data access procedure is transparent to users: they can access information from a remote site as if from a local disk, because the General Ontology Agent automatically links data through the Data Agents that map ontology concepts to GIS data. 4) The capability of processing massive spatial data: storing, accessing and managing spatial data from TB to PB; efficiently analyzing and processing spatial data to produce models, information and knowledge; and providing 3D and multimedia visualization services. 5) The capability of high-performance computing and processing of spatial information: solving spatial problems with high precision, high quality, and on a large scale, and processing spatial information in real time or on time, with high speed and high efficiency. 6) The capability of sharing spatial resources: distributed heterogeneous spatial information resources are shared, integrated and inter-operated at the semantic level, so as to make the best use of spatial information resources such as computing resources, storage devices, spatial data (integrated from GIS, RS and GPS), spatial applications and services, and GIS platforms. 7) The capability of integrating legacy GIS systems: an ASISG can not only be used to construct new advanced spatial application systems but also integrate legacy GIS systems, preserving extensibility and inheritance and protecting users' investment. 8) The capability of collaboration.
Large-scale spatial information applications and services always involve different departments in different geographic places, so remote and uniform services are needed. 9) The capability of supporting integration of heterogeneous systems: large-scale spatial information systems are usually composite applications, so ASISG should provide interoperation and consistency by adopting open, applied technology standards. 10) The capability of adapting to dynamic changes: business requirements, application patterns, management strategies, and IT products change endlessly in any department, so ASISG should be self-adaptive. Two examples are provided in this paper, giving a detailed account of how to design a semantic grid based on multi-agent systems and ontology. In conclusion, a semantic grid for spatial information systems could improve the integration and interoperability of a spatial information grid.
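The hybrid dissemination/discovery scheme of the grid management layer can be sketched roughly as follows; the class and site names are hypothetical:

```python
# Sketch of the hybrid registry: local ontology agents PUSH resource
# descriptions to a general ontology agent (dissemination), and user
# agents PULL matching sites from it (discovery).
class GeneralOntologyAgent:
    def __init__(self):
        self.registry = {}                      # concept -> set of GIS sites

    def disseminate(self, site, concepts):      # push from a local agent
        for c in concepts:
            self.registry.setdefault(c, set()).add(site)

    def discover(self, concept):                # pull by a user agent
        return sorted(self.registry.get(concept, set()))

goa = GeneralOntologyAgent()
goa.disseminate("gis-a.example", {"river", "landuse"})
goa.disseminate("gis-b.example", {"river", "elevation"})
print(goa.discover("river"))      # both sites serve the 'river' concept
```

This is what lets users query by concept rather than searching each GIS node site by site.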
Liu, Yuxin; Li, Luoyuan; Guo, Quanwei; Wang, Lu; Liu, Dongdong; Wei, Ziwei; Zhou, Jing
2016-01-01
Lanthanide-based contrast agents have attracted increasing attention for their unique properties and potential applications in cancer theranostics. To date, many of these agents have been studied extensively in cells and small animal models; however, the performance of these theranostic nanoparticles requires further improvement. In this study, a novel CsLu2F7:Yb,Er,Tm-based visual therapeutic platform was developed for imaging-guided synergistic cancer therapy. Due to the presence of the heavy alkali metal cesium (Cs) in the host lattice, the nanoplatform provides higher-resolution X-ray CT imaging than many other reported lanthanide-based CT contrast agents. Furthermore, by using the targeting RGD motif, the chemotherapy drug alpha-tocopheryl succinate (α-TOS), and the photothermal coupling agent ICG, this nanoplatform simultaneously provides multifunctional imaging and targeted synergistic therapy. To demonstrate the theranostic performance of this novel nanoplatform in vivo, visual diagnosis in a small animal model was realized by UCL/CT imaging, further integrated with targeted chemo-photothermal synergistic therapy. These results provide evidence for the successful construction of a novel lanthanide-based nanoplatform coupling multimodal imaging diagnosis with potential application in synergistic cancer theranostics.
Object oriented development of engineering software using CLIPS
NASA Technical Reports Server (NTRS)
Yoon, C. John
1991-01-01
Engineering applications involve numeric complexity and the manipulation of large amounts of data. Traditionally, numeric computation has been the main concern in developing engineering software. As engineering application software became larger and more complex, management of resources such as data, rather than numeric complexity, became the major software design problem. Object-oriented design and implementation methodologies can improve the reliability, flexibility, and maintainability of the resulting software; however, some tasks are better solved with the traditional procedural paradigm. The C Language Integrated Production System (CLIPS), with its deffunction and defgeneric constructs, supports the procedural paradigm. The natural blending of the object-oriented and procedural paradigms has been cited as the reason for the popularity of the C++ language. The object-oriented features of the CLIPS Object Oriented Language (COOL) are more versatile than those of C++. A software design methodology based on object-oriented and procedural approaches, appropriate for engineering software and to be implemented in CLIPS, is outlined. A method for sensor placement for Space Station Freedom is being implemented in COOL as a sample problem.
Ma, Yakun; Ge, Yanxiu; Li, Lingbing
2017-02-01
Nanogel-based multifunctional drug delivery systems, especially hybrid nanogels and multicompartment nanogels, have drawn increasingly extensive attention from researchers in pharmacy because they can achieve superior functionality through the synergistic enhancement of each component's properties. The unique hybrid and compartmentalized structures offer great potential for the co-delivery of multiple agents, even agents with different physicochemical properties. In addition, hybrid nanogels encapsulating optical and magnetic resonance imaging contrast agents can be utilized in imaging techniques for disease diagnosis. More importantly, through nanogel-based multifunctional drug delivery systems, stimuli-responsive features can readily be employed to design targeted drug release. This review summarizes the construction of diverse hybrid nanogels and multicompartment nanogels. Applications in the co-delivery of multiple agents and of imaging agents for diagnosis, as well as in the design of stimuli-responsive multifunctional nanogels for drug delivery, are also reviewed and discussed, along with future prospects for the application of multifunctional nanogels. Copyright © 2016 Elsevier B.V. All rights reserved.
Calyx{trademark} EA implementation at AECB
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-12-31
This report describes a project to examine the applicability of a knowledge-based decision support software for environmental assessment (Calyx) to assist the Atomic Energy Control Board in environmental screenings, assessment, management, and database searches. The report begins with background on the Calyx software and then reviews activities with regard to modification of the Calyx knowledge base for application to the nuclear sector. This is followed by lists of standard activities handled by the software and activities specific to the Board; the hierarchy of environmental components developed for the Board; details of impact rules that describe the conditions under which environmental impacts will occur (the bulk of the report); information on mitigation and monitoring rules and on instance data; and considerations for future work on implementing Calyx at the Board. Appendices include an introduction to expert systems and an overview of the Calyx knowledge base structure.
Updates to the NASA Space Telecommunications Radio System (STRS) Architecture
NASA Technical Reports Server (NTRS)
Kacpura, Thomas J.; Handler, Louis M.; Briones, Janette; Hall, Charles S.
2008-01-01
This paper describes an update of the Space Telecommunications Radio System (STRS) open architecture for NASA space based radios. The STRS architecture has been defined as a framework for the design, development, operation and upgrade of space based software defined radios, where processing resources are constrained. The architecture has been updated based upon reviews by NASA missions, radio providers, and component vendors. The STRS Standard prescribes the architectural relationship between the software elements used in software execution and defines the Application Programmer Interface (API) between the operating environment and the waveform application. Modeling tools have been adopted to present the architecture. The paper will present a description of the updated API, configuration files, and constraints. Minimum compliance is discussed for early implementations. The paper then closes with a summary of the changes made and discussion of the relevant alignment with the Object Management Group (OMG) SWRadio specification, and enhancements to the specialized signal processing abstraction.
Simulation-Based Verification of Autonomous Controllers via Livingstone PathFinder
NASA Technical Reports Server (NTRS)
Lindsey, A. E.; Pecheur, Charles
2004-01-01
AI software is often used as a means of providing greater autonomy to automated systems, enabling them to cope with harsh and unpredictable environments. Due in part to the enormous space of possible situations they aim to address, autonomous systems pose a serious challenge to traditional test-based verification approaches. Efficient verification approaches need to be perfected before these systems can reliably control critical applications. This publication describes Livingstone PathFinder (LPF), a verification tool for autonomous control software. LPF applies state-space exploration algorithms to an instrumented testbed consisting of the controller embedded in a simulated operating environment. Although LPF has focused on applications of NASA's Livingstone model-based diagnosis system, the architecture is modular and adaptable to other systems. This article presents different facets of LPF and experimental results from applying the software to a Livingstone model of the main propulsion feed subsystem for a prototype space vehicle.
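The core LPF idea, exhaustive state-space exploration of a controller embedded in a simulated environment, can be sketched on a toy model (the valve scenario and safety property below are invented for illustration, not Livingstone's):

```python
from collections import deque

# Toy testbed: controller commands plus one injectable fault event.
EVENTS = ["valve_stuck", "cmd_open", "cmd_close"]

def simulate(state, event):
    """Transition of the controller + simulated environment."""
    mode, stuck = state
    if event == "valve_stuck":
        return (mode, True)
    if event == "cmd_open":
        return (mode if stuck else "open", stuck)
    return (mode if stuck else "closed", stuck)

def safe(state):
    mode, stuck = state
    return not (stuck and mode == "open")   # property: no stuck-open valve

def explore(initial):
    """Breadth-first exploration; returns a violating state or None."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        if not safe(s):
            return s                         # counterexample found
        for e in EVENTS:
            n = simulate(s, e)
            if n not in seen:
                seen.add(n)
                frontier.append(n)
    return None

print(explore(("closed", False)))
```

Unlike single-trace testing, the search systematically interleaves fault injections with commands, so the stuck-open scenario is found without hand-crafting that sequence.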
Formalism Challenges of the Cougaar Model Driven Architecture
NASA Technical Reports Server (NTRS)
Bohner, Shawn A.; George, Boby; Gracanin, Denis; Hinchey, Michael G.
2004-01-01
The Cognitive Agent Architecture (Cougaar) is one of the most sophisticated distributed agent architectures developed to date. As part of its research and evolution, Cougaar is being studied for application to large, logistics-based applications for the Department of Defense (DoD). Anticipating future complex applications of Cougaar, we are investigating the Model Driven Architecture (MDA) approach to understand how effective it would be for increasing productivity in Cougaar-based development efforts. Recognizing the sophistication of the Cougaar development environment and the limitations of transformation technologies for agents, we have systematically developed an approach that combines component assembly in the large with transformation in the small. This paper describes some of the key elements that went into the Cougaar Model Driven Architecture approach and the characteristics that drove it.
Research on environmental impact of water-based fire extinguishing agents
NASA Astrophysics Data System (ADS)
Wang, Shuai
2018-02-01
This paper presents the current status of the application of water-based fire extinguishing agents and the environmental considerations motivating the need for toxicity research. It also systematically reviews the currently available test methods for the toxicity and environmental impact of water-based fire extinguishing agents, illustrates the main requirements and relevant test methods, and offers findings for future research. The paper also notes the limitations of the current study.
Software Engineering and Its Application to Avionics
1988-01-01
"Automated Software Development Methodology (ASDM): An Architecture of a Knowledge-Based Expert System," Master's Thesis, Florida Atlantic University, Boca... The operating system provides the control and application services within the multiprocessor system. The processes that make up the application software... as a high-value target may no longer be occupied by the time the film is processed and analyzed. With the high mobility of today's enemy forces
NASA Astrophysics Data System (ADS)
Subekti, P.; Hambali, E.; Suryani, A.; Suryadarma, P.
2017-05-01
This study analyzes the potential application of a palm oil-based foaming agent for fighting peat fires in Indonesia. From the literature review, it is known that the foaming agent is able to form foam that extinguishes fire, wrapping and cooling the burning peat. Developing the production and application of foaming agents in Indonesia is necessary because peat fires occur almost every year and cause smoke haze. A potential raw material for producing an environmentally friendly foaming agent as a foam extinguishant for peat fires in Indonesia, among others, is palm oil, owing to its abundant and sustainable availability and the ready degradation of the foam product in the environment of the burnt areas. Producing foaming agents for firefighting in Indonesia is one alternative for reducing the time needed to control fires and the impact of smog disasters. Applying palm oil as a raw material for firefighting also contributes to increasing added value and to the development of the palm oil downstream industry.