Sample records for real-time object database

  1. Commanding and Controlling Satellite Clusters (IEEE Intelligent Systems, November/December 2000)

    DTIC Science & Technology

    2000-01-01

    real-time operating system, a message-passing OS well suited for distributed...ground flight processors, ObjectAgent, RTOS, SCL, RDMS, space command language, real-time operating system, relational database management system, TS-21 RDMS...engineer with Princeton Satellite Systems. She is working with others to develop ObjectAgent software to run on the OSE Real-Time Operating System.

  2. Application of real-time database to LAMOST control system

    NASA Astrophysics Data System (ADS)

    Xu, Lingzhe; Xu, Xinqi

    2004-09-01

    The QNX-based real-time database is one of the main features of the control system of the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST). It serves as a storage platform for data flow, recording and updating in a timely manner the various statuses of moving components in the telescope structure as well as environmental parameters around it. The database is integrated harmoniously into the administration of the Telescope Control System (TCS). The paper presents the methodology and technical tips used in designing the EMPRESS database GUI software package, such as the dynamic creation of control widgets, dynamic queries, and shared memory. A seamless connection between EMPRESS and QNX's graphical development tool, the Photon Application Builder (PhAB), has been realized, giving the package a Windows look and feel under a Unix-like operating system. In particular, the real-time behavior of the database is analyzed and shown to satisfy the needs of the control system.

  3. SkyDOT: a publicly accessible variability database, containing multiple sky surveys and real-time data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Starr, D. L.; Wozniak, P. R.; Vestrand, W. T.

    2002-01-01

    SkyDOT (Sky Database for Objects in Time-Domain) is a Virtual Observatory currently comprised of data from the RAPTOR, ROTSE I, and OGLE II survey projects. This makes it a very large time-domain database. In addition, the RAPTOR project provides SkyDOT with real-time variability data as well as stereoscopic information. With its web interface, we believe SkyDOT will be a very useful tool for both astronomers and the public. Our main task has been to construct an efficient relational database containing all existing data, while handling a real-time inflow of data. We also provide a useful web interface allowing easy access to both astronomers and the public. Initially, this server will allow common searches, specific queries, and access to light curves. In the future we will include machine learning classification tools and access to spectral information.
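
    As a toy illustration of the kind of relational storage SkyDOT describes, a light curve is just a per-object, time-ordered query over an observation table. The schema and values below are hypothetical, not SkyDOT's actual design:

```python
import sqlite3

# Hypothetical minimal schema for a time-domain survey database:
# each row is one observation of one object at one time.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE observation (
    object_id INTEGER,
    mjd       REAL,   -- time of observation (Modified Julian Date)
    mag       REAL    -- measured magnitude
)""")
con.executemany(
    "INSERT INTO observation VALUES (?, ?, ?)",
    [(1, 52000.1, 14.2), (1, 52000.2, 14.1), (2, 52000.1, 16.8)],
)

def light_curve(object_id):
    """Return the (time, magnitude) series for one object."""
    rows = con.execute(
        "SELECT mjd, mag FROM observation WHERE object_id = ? ORDER BY mjd",
        (object_id,),
    )
    return rows.fetchall()

print(light_curve(1))  # → [(52000.1, 14.2), (52000.2, 14.1)]
```

    A real-time inflow of observations would map onto this as a stream of INSERTs, with the web interface issuing queries like `light_curve` on demand.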

  4. The Starlite Project - Prototyping Real-Time Software.

    DTIC Science & Technology

    1992-11-01

    by ONR under contract N00014-91-J-1102, by DOE, and by NOSC...knowledge of transactions and a temporal data model. A multiversion data object is one...environment. Section 4 presents experiments with priority-based synchronization algorithms and multiversion data objects using the prototyping environment...priority-based synchronization algorithms and between a multiversion database and its corresponding single-version database, through the sensitivity

  5. An Evaluation of Database Solutions to Spatial Object Association

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, V S; Kurc, T; Saltz, J

    2008-06-24

    Object association is a common problem encountered in many applications. Spatial object association, also referred to as crossmatch of spatial datasets, is the problem of identifying and comparing objects in two datasets based on their positions in a common spatial coordinate system--one of the datasets may correspond to a catalog of objects observed over time in a multi-dimensional domain; the other dataset may consist of objects observed in a snapshot of the domain at a time point. The use of database management systems to solve the object association problem provides portability across different platforms and also greater flexibility. Increasing dataset sizes in today's applications, however, have made object association a data/compute-intensive problem that requires targeted optimizations for efficient execution. In this work, we investigate how database-based crossmatch algorithms can be deployed on different database system architectures and evaluate the deployments to understand the impact of architectural choices on crossmatch performance and associated trade-offs. We investigate the execution of two crossmatch algorithms on (1) a parallel database system with active disk style processing capabilities, (2) a high-throughput network database (MySQL Cluster), and (3) shared-nothing databases with replication. We have conducted our study in the context of a large-scale astronomy application with real use-case scenarios.
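
    A crossmatch of the sort evaluated above can be sketched, under heavy simplification (flat coordinates and brute-force search, rather than the indexed database execution plans the paper studies):

```python
import math

def crossmatch(catalog_a, catalog_b, radius):
    """Naive spatial crossmatch: for each object in catalog_a, find the
    nearest object in catalog_b within `radius`, in a common flat
    coordinate system. A real implementation would use spatial indexing
    and spherical geometry."""
    matches = []
    for i, (xa, ya) in enumerate(catalog_a):
        best, best_d = None, radius
        for j, (xb, yb) in enumerate(catalog_b):
            d = math.hypot(xa - xb, ya - yb)
            if d <= best_d:
                best, best_d = j, d
        if best is not None:
            matches.append((i, best))
    return matches

# Object 0 matches catalog_b[1]; object 1 has no counterpart in range.
print(crossmatch([(0.0, 0.0), (5.0, 5.0)], [(3.0, 0.0), (0.1, 0.0)], 0.5))
```

    The brute-force loop is O(n*m); the architectural choices compared in the paper are essentially different ways of parallelizing and indexing this same matching predicate.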

  6. The StarLite Project Prototyping Real-Time Software

    DTIC Science & Technology

    1991-10-01

    multiversion data objects using the prototyping environment. Section 5 concludes the paper. 2. Message-Based Simulation When prototyping distributed...phase locking and priority-based synchronization algorithms, and between a multiversion database and its corresponding single-version database, through...its deadline, since the transaction is only aborted in the validation phase. 4.5. A Multiversion Database System To illustrate the effectiveness of the
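
    The multiversion data objects studied in the StarLite work can be illustrated with a minimal sketch (the class and timestamps below are illustrative, not StarLite's actual interface): each write appends a new version rather than overwriting, so a read-only transaction can read a consistent snapshot without blocking writers.

```python
class MultiversionObject:
    """Sketch of a multiversion data object: each write appends a
    (timestamp, value) version, and readers see the newest version
    visible at their snapshot time."""
    def __init__(self):
        self.versions = []  # list of (write_timestamp, value), ascending

    def write(self, ts, value):
        self.versions.append((ts, value))

    def read(self, snapshot_ts):
        # Return the newest version written at or before snapshot_ts.
        visible = [v for ts, v in self.versions if ts <= snapshot_ts]
        return visible[-1] if visible else None

x = MultiversionObject()
x.write(1, "a")
x.write(3, "b")
print(x.read(2))  # a reader with snapshot time 2 still sees "a"
print(x.read(3))  # → b
```

    This is the essential contrast with a single-version database, where the same read would either block on the writer's lock or abort one of the transactions.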

  7. Point pattern match-based change detection in a constellation of previously detected objects

    DOEpatents

    Paglieroni, David W.

    2016-06-07

    A method and system is provided that applies attribute- and topology-based change detection to objects that were detected on previous scans of a medium. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, detection strength, size, elongation, orientation, etc. The locations define a three-dimensional network topology forming a constellation of previously detected objects. The change detection system stores attributes of the previously detected objects in a constellation database. The change detection system detects changes by comparing the attributes and topological consistency of newly detected objects encountered during a new scan of the medium to previously detected objects in the constellation database. The change detection system may receive the attributes of the newly detected objects as the objects are detected by an object detection system in real time.
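
    A stripped-down sketch of the attribute-comparison step (locations only; the patent's topological-consistency check over the constellation is omitted, and all names and values are illustrative):

```python
def detect_changes(constellation, detections, tol):
    """Toy attribute-based change detection: each object is
    (x, y, strength). A new detection matches a previously detected
    object if it lies within `tol` of it on both axes; unmatched
    detections are reported as changes."""
    changes = []
    for d in detections:
        matched = any(abs(d[0] - p[0]) <= tol and abs(d[1] - p[1]) <= tol
                      for p in constellation)
        if not matched:
            changes.append(d)
    return changes

previous = [(10.0, 20.0, 0.9), (30.0, 40.0, 0.7)]   # constellation database
new_scan = [(10.1, 20.0, 0.8), (55.0, 60.0, 0.6)]   # newly detected objects
print(detect_changes(previous, new_scan, tol=0.5))  # → [(55.0, 60.0, 0.6)]
```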

  8. The StarLite Project

    DTIC Science & Technology

    1988-09-01

    The current prototyping tool also provides a multiversion data object control mechanism. In a real-time database system, synchronization protocols...data in distributed real-time systems. The semantic information of read-only transactions is exploited for improved efficiency, and a multiversion...are discussed. Index Terms: distributed system, replication, read-only transaction, consistency, multiversion.

  9. A storage scheme for the real-time database supporting the on-line commitment

    NASA Astrophysics Data System (ADS)

    Dai, Hong-bin; Jing, Yu-jian; Wang, Hui

    2013-07-01

    Modern SCADA (Supervisory Control and Data Acquisition) systems have been applied to various aspects of everyday life. As time goes on, the requirements of the systems' applications vary, so the data structure of the real-time database, which is the core of a SCADA system, often needs modification. As a result, a commitment consisting of a sequence of configuration operations that modify the data structure of the real-time database is performed from time to time. Although it is simple to perform the commitment off-line, by first stopping and then restarting the system while all the data in the real-time database are reconstructed, it is much preferable, and in some cases necessary, to perform the commitment on-line, during which the real-time database can still provide real-time service and the system continues working normally. In this paper, a storage scheme for the data in the real-time database is proposed. It helps the real-time database support on-line commitment, during which real-time service remains available.
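
    The on-line commitment idea can be caricatured as building the migrated data structure in the background and swapping it in atomically, so reads keep being served throughout; this is an assumption-laden toy, not the paper's actual storage scheme:

```python
import threading

class RealtimeDB:
    """Sketch of on-line commitment: configuration operations build a
    migrated copy of the data structure off to the side, then swap it
    in atomically, so real-time reads are never interrupted."""
    def __init__(self, records):
        self._records = records
        self._lock = threading.Lock()

    def read(self, key):
        return self._records.get(key)  # served at all times

    def commit(self, migrate):
        # Build the new structure from a copy, then swap references.
        new_records = migrate(dict(self._records))
        with self._lock:
            self._records = new_records

db = RealtimeDB({"pump1": 3.2})
# Schema change: bare readings become records with an explicit unit.
db.commit(lambda r: {k: {"value": v, "unit": "bar"} for k, v in r.items()})
print(db.read("pump1"))  # → {'value': 3.2, 'unit': 'bar'}
```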

  10. Detection of dominant flow and abnormal events in surveillance video

    NASA Astrophysics Data System (ADS)

    Kwak, Sooyeong; Byun, Hyeran

    2011-02-01

    We propose an algorithm for abnormal event detection in surveillance video. The proposed algorithm is based on a semi-unsupervised learning method, a feature-based approach that does not detect each moving object individually. The algorithm identifies dominant flow in crowded environments without individual object tracking, using a latent Dirichlet allocation model. It can also automatically detect and localize an abnormally moving object in real-life video. Performance tests were conducted on several real-life databases, and the results show that the proposed algorithm can efficiently detect abnormally moving objects in real time. The proposed algorithm can be applied to any situation in which abnormal directions or abnormal speeds are to be detected.
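
    As a much simpler stand-in for the paper's latent-Dirichlet-allocation model, dominant flow can be caricatured as the most common direction bin over observed motion vectors, with everything outside that bin flagged as abnormal:

```python
from collections import Counter

def direction_bin(degrees, n_bins=8):
    """Quantize a motion direction (degrees) into one of n_bins bins."""
    return int(degrees % 360 // (360 / n_bins))

def dominant_flow(directions, n_bins=8):
    """Return the most frequent direction bin (the 'dominant flow')."""
    return Counter(direction_bin(d, n_bins) for d in directions).most_common(1)[0][0]

def abnormal(directions, n_bins=8):
    """Flag motions whose direction bin disagrees with the dominant flow."""
    dom = dominant_flow(directions, n_bins)
    return [d for d in directions if direction_bin(d, n_bins) != dom]

flows = [10, 12, 15, 8, 11, 200]   # most motion heads in one direction
print(abnormal(flows))  # → [200]
```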

  11. A service-oriented data access control model

    NASA Astrophysics Data System (ADS)

    Meng, Wei; Li, Fengmin; Pan, Juchen; Song, Song; Bian, Jiali

    2017-01-01

    The development of mobile computing, cloud computing, and distributed computing meets growing individual service needs. For complex application systems, ensuring real-time, dynamic, and fine-grained data access control is an urgent problem. After analyzing common data access control models, the paper proposes a service-oriented access control model built on the mandatory access control model. By regarding system services as subjects and database data as objects, the model defines access levels and access identification for subjects and objects, and ensures that system services access databases securely.
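
    The subject/object check such a model describes might look like the following sketch (the level names and service identifiers are invented for illustration): a service may access data only if its level dominates the object's level and its identification is on the object's access list.

```python
# Mandatory-access-control-style check: services are subjects,
# database objects carry a level and a list of permitted identities.
LEVELS = {"public": 0, "internal": 1, "secret": 2}

def may_access(service_level, service_id, obj_level, obj_allowed_ids):
    """Allow access only when the subject's level dominates the
    object's level AND the subject's identification is permitted."""
    return (LEVELS[service_level] >= LEVELS[obj_level]
            and service_id in obj_allowed_ids)

print(may_access("internal", "billing-svc", "public", {"billing-svc"}))  # → True
print(may_access("public", "billing-svc", "secret", {"billing-svc"}))    # → False
```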

  12. Real time monitoring of slope stability in eastern Oklahoma.

    DOT National Transportation Integrated Search

    2014-01-01

    There were three primary objectives of the proposed research. The first was to establish a comprehensive landslide database, the second was to create a first-cut regional landslide map, and the third was to relate safe and stable constructed slop...

  13. RAPTOR-scan: Identifying and Tracking Objects Through Thousands of Sky Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidoff, Sherri; Wozniak, Przemyslaw

    2004-09-28

    The RAPTOR-scan system mines data for optical transients associated with gamma-ray bursts and is used to create a catalog for the RAPTOR telescope system. RAPTOR-scan can detect and track individual astronomical objects across data sets containing millions of observed points. Accurately identifying a real object over many optical images (clustering the individual appearances) is necessary in order to analyze object light curves. To achieve this, RAPTOR telescope observations are sent in real time to a database. Each morning, a program based on the DBSCAN algorithm clusters the observations and labels each one with an object identifier. Once clustering is complete, the analysis program may be used to query the database and produce light curves, maps of the sky field, or other informative displays. Although RAPTOR-scan was designed for the RAPTOR optical telescope system, it is a general tool designed to identify objects in a collection of astronomical data and facilitate quick data analysis. RAPTOR-scan will be released as free software under the GNU General Public License.
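
    DBSCAN itself is easy to state: points that are density-reachable from each other share a cluster label, and sparse points are noise. A minimal pure-Python version (not RAPTOR-scan's implementation, and brute-force rather than indexed) is:

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one cluster label per point, -1 = noise."""
    labels = [None] * len(points)
    cluster = -1

    def neighbors(i):
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]

    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1          # noise (a cluster may still claim it)
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in nbrs if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point: labeled, not expanded
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_nbrs = neighbors(j)
            if len(j_nbrs) >= min_pts:
                queue.extend(j_nbrs)  # core point: expand the cluster
    return labels

# Two tight groups of observations (two objects) and one stray point:
pts = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.1, 5), (5, 5.1), (9, 9)]
print(dbscan(pts, eps=0.5, min_pts=2))  # → [0, 0, 0, 1, 1, 1, -1]
```

    In the RAPTOR-scan setting, each cluster label would play the role of an object identifier assigned to all appearances of one astronomical object.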

  14. A GIS-Enabled, Michigan-Specific, Hierarchical Groundwater Modeling and Visualization System

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Li, S.; Mandle, R.; Simard, A.; Fisher, B.; Brown, E.; Ross, S.

    2005-12-01

    Efficient management of groundwater resources relies on a comprehensive database that represents the characteristics of the natural groundwater system as well as analysis and modeling tools to describe the impacts of decision alternatives. Many agencies in Michigan have spent several years compiling expensive and comprehensive surface water and groundwater inventories and other related spatial data that describe their respective areas of responsibility. However, most often this wealth of descriptive data has only been utilized for basic mapping purposes. The benefits from analyzing these data, using GIS analysis functions or externally developed analysis models or programs, has yet to be systematically realized. In this talk, we present a comprehensive software environment that allows Michigan groundwater resources managers and frontline professionals to make more effective use of the available data and improve their ability to manage and protect groundwater resources, address potential conflicts, design cleanup schemes, and prioritize investigation activities. In particular, we take advantage of the Interactive Ground Water (IGW) modeling system and convert it to a customized software environment specifically for analyzing, modeling, and visualizing the Michigan statewide groundwater database. The resulting Michigan IGW modeling system (IGW-M) is completely window-based, fully interactive, and seamlessly integrated with a GIS mapping engine. The system operates in real-time (on the fly) providing dynamic, hierarchical mapping, modeling, spatial analysis, and visualization. 
    Specifically, IGW-M allows water resources and environmental professionals in Michigan to:
    * Access and utilize the extensive data from the statewide groundwater database, interactively manipulate GIS objects, and display and query the associated data and attributes;
    * Analyze and model the statewide groundwater database, interactively convert GIS objects into numerical model features, automatically extract data and attributes, and simulate unsteady groundwater flow and contaminant transport in response to water and land management decisions;
    * Visualize and map model simulations and predictions with data from the statewide groundwater database in a seamless interactive environment.
    IGW-M has the potential to significantly improve the productivity of Michigan groundwater management investigations. It changes the role of engineers and scientists in modeling and analyzing the statewide groundwater database from heavily physical to cognitive problem-solving and decision-making tasks. The seamless real-time integration, real-time visual interaction, and real-time processing capability allow a user to focus on critical management issues, conflicts, and constraints; to quickly and iteratively examine conceptual approximations, management and planning scenarios, and site characterization assumptions; to identify dominant processes; to evaluate data worth and sensitivity; and to guide further data-collection activities. We illustrate the power and effectiveness of the IGW-M modeling and visualization system with a real case study and a real-time, live demonstration.

  15. Aurorasaurus Database of Real-Time, Soft-Sensor Sourced Aurora Data for Space Weather Research

    NASA Astrophysics Data System (ADS)

    Kosar, B.; MacDonald, E.; Heavner, M.

    2017-12-01

    Aurorasaurus is an innovative citizen science project focused on two fundamental objectives: collecting real-time, ground-based signals of auroral visibility from citizen scientists (soft sensors), and incorporating this new type of data into scientific investigations pertaining to the aurora. The project has been live since the fall of 2014, and as of summer 2017 the database comprised approximately 12,000 observations (5,295 direct reports and 6,413 verified tweets). In this presentation, we will focus on demonstrating the utility of this robust, science-quality data set for space weather research. These data scale with the size of the event and are well suited to capturing the largest, rarest events. Emerging state-of-the-art computational methods based on statistical inference, such as machine learning frameworks and data-model integration methods, can offer new insights that could potentially lead to better real-time assessment and space weather prediction when citizen science data are combined with traditional sources.

  16. Spatial Data Integration Using Ontology-Based Approach

    NASA Astrophysics Data System (ADS)

    Hasani, S.; Sadeghi-Niaraki, A.; Jelokhani-Niaraki, M.

    2015-12-01

    In today's world, the necessity of spatial data for various organizations has become so crucial that many of them have begun to produce spatial data themselves. In some circumstances, the need for real-time integrated data requires a sustainable mechanism for real-time integration. A case in point is disaster management, which requires obtaining real-time data from various sources of information. One of the problematic challenges in such situations is the high degree of heterogeneity between the data of different organizations. To address this issue, we introduce an ontology-based method that provides sharing and integration capabilities for existing databases. In addition to resolving semantic heterogeneity, our proposed method also provides better access to information. Our approach consists of three steps. In the first step, the objects in a relational database are identified, the semantic relationships between them are modelled, and subsequently the ontology of each database is created. In the second step, the resulting ontology is inserted into the database, and the relationship of each ontology class is inserted into a newly created column in the database tables. The last step consists of a platform based on a service-oriented architecture that allows integration of data using the concept of ontology mapping. The proposed approach, in addition to being fast and low-cost, makes the process of data integration easy, and the data remain unchanged, thus taking advantage of existing legacy applications.
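
    The ontology-mapping idea, reduced to a toy: each organization's schema terms map to shared ontology concepts, so records from heterogeneous schemas become directly comparable. The mapping and schemas below are invented for illustration:

```python
# Each organization's local column names map to shared ontology concepts.
ONTOLOGY_MAP = {
    "org_a": {"lat": "Location.latitude", "lon": "Location.longitude"},
    "org_b": {"y_coord": "Location.latitude", "x_coord": "Location.longitude"},
}

def integrate(org, record):
    """Rewrite a local record in terms of shared ontology concepts;
    unmapped keys pass through unchanged."""
    return {ONTOLOGY_MAP[org].get(k, k): v for k, v in record.items()}

a = integrate("org_a", {"lat": 35.7, "lon": 51.4})
b = integrate("org_b", {"y_coord": 35.7, "x_coord": 51.4})
print(a == b)  # same concepts, despite different source schemas → True
```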

  17. Performance Evaluation of a Firm Real-Time DataBase System

    DTIC Science & Technology

    1995-01-01

    after its deadline has passed. StarBase differs from previous real-time database work in that a) it relies on a real-time operating system which...StarBase, running on a real-time operating system kernel, RT-Mach. We discuss how performance was evaluated in StarBase using the StarBase workload

  18. Collaborative Resource Allocation

    NASA Technical Reports Server (NTRS)

    Wang, Yeou-Fang; Wax, Allan; Lam, Raymond; Baldwin, John; Borden, Chester

    2007-01-01

    Collaborative Resource Allocation Networking Environment (CRANE) Version 0.5 is a prototype created to prove the newest concept of using a distributed environment to schedule Deep Space Network (DSN) antenna times in a collaborative fashion. This program is for all space-flight and terrestrial science project users and DSN schedulers to perform scheduling activities and conflict resolution, both synchronously and asynchronously. Project schedulers can, for the first time, participate directly in scheduling their tracking times into the official DSN schedule, and negotiate directly with other projects in an integrated scheduling system. A master schedule covers long-range, mid-range, near-real-time, and real-time scheduling time frames all in one, rather than the current method of separate functions that are supported by different processes and tools. CRANE also provides private workspaces (both dynamic and static), data sharing, scenario management, user control, rapid messaging (based on Java Message Service), data/time synchronization, workflow management, notification (including emails), conflict checking, and a linkage to a schedule generation engine. The data structure with corresponding database design combines object trees with multiple associated mortal instances and relational database to provide unprecedented traceability and simplify the existing DSN XML schedule representation. These technologies are used to provide traceability, schedule negotiation, conflict resolution, and load forecasting from real-time operations to long-range loading analysis up to 20 years in the future. CRANE includes a database, a stored procedure layer, an agent-based middle tier, a Web service wrapper, a Windows Integrated Analysis Environment (IAE), a Java application, and a Web page interface.

  19. Real-Time Simulation

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Coryphaeus Software, founded in 1989 by former NASA electronics engineer Steve Lakowske, creates real-time 3D software. Designer's Workbench, the company's flagship product, is a modeling and simulation tool for the development of both static and dynamic 3D databases. Other products soon followed. Activation, specifically designed for game developers, allows developers to play and test 3D games before they commit to a target platform. Game publishers can shorten development time and prove the "playability" of a title, maximizing their chances of introducing a smash hit. Another product, EasyT, lets users create massive, realistic representations of Earth terrains that can be viewed and traversed in real time. Finally, EasyScene software controls the actions among interactive objects within a virtual world. Coryphaeus products are used on Silicon Graphics workstations and supercomputers to simulate real-world performance in synthetic environments. Customers include aerospace, aviation, architectural, and engineering firms, game developers, and the entertainment industry.

  20. ControlShell: A real-time software framework

    NASA Technical Reports Server (NTRS)

    Schneider, Stanley A.; Chen, Vincent W.; Pardo-Castellote, Gerardo

    1994-01-01

    The ControlShell system is a programming environment that enables the development and implementation of complex real-time software. It includes many building tools for complex systems, such as a graphical finite state machine (FSM) tool to provide strategic control. ControlShell has a component-based design, providing interface definitions and mechanisms for building real-time code modules along with providing basic data management. Some of the system-building tools incorporated in ControlShell are a graphical data flow editor, a component data requirement editor, and a state-machine editor. It also includes a distributed data flow package, an execution configuration manager, a matrix package, and an object database and dynamic binding facility. This paper presents an overview of ControlShell's architecture and examines the functions of several of its tools.
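
    A finite state machine of the kind ControlShell's FSM tool builds graphically reduces, in sketch form, to a transition table mapping (state, event) pairs to next states; the states and events below are invented for illustration:

```python
class StateMachine:
    """Minimal finite state machine for strategic control: a transition
    table maps (state, event) to the next state; unknown events leave
    the state unchanged."""
    def __init__(self, initial, transitions):
        self.state = initial
        self.transitions = transitions  # {(state, event): next_state}

    def fire(self, event):
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state

fsm = StateMachine("idle", {
    ("idle", "start"): "tracking",
    ("tracking", "fault"): "safe_mode",
    ("safe_mode", "reset"): "idle",
})
fsm.fire("start")
print(fsm.fire("fault"))  # → safe_mode
```

    In a component-based framework, each such machine would sit above the data-flow components, sequencing which configuration of real-time modules is active.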

  1. A Physics-driven Neural Networks-based Simulation System (PhyNNeSS) for multimodal interactive virtual environments involving nonlinear deformable objects

    PubMed Central

    De, Suvranu; Deo, Dhannanjay; Sankaranarayanan, Ganesh; Arikatla, Venkata S.

    2012-01-01

    Background While an update rate of 30 Hz is considered adequate for real time graphics, a much higher update rate of about 1 kHz is necessary for haptics. Physics-based modeling of deformable objects, especially when large nonlinear deformations and complex nonlinear material properties are involved, at these very high rates is one of the most challenging tasks in the development of real time simulation systems. While some specialized solutions exist, there is no general solution for arbitrary nonlinearities. Methods In this work we present PhyNNeSS - a Physics-driven Neural Networks-based Simulation System - to address this long-standing technical challenge. The first step is an off-line pre-computation step in which a database is generated by applying carefully prescribed displacements to each node of the finite element models of the deformable objects. In the next step, the data is condensed into a set of coefficients describing neurons of a Radial Basis Function network (RBFN). During real-time computation, these neural networks are used to reconstruct the deformation fields as well as the interaction forces. Results We present realistic simulation examples from interactive surgical simulation with real time force feedback. As an example, we have developed a deformable human stomach model and a Penrose-drain model used in the Fundamentals of Laparoscopic Surgery (FLS) training tool box. Conclusions A unique computational modeling system has been developed that is capable of simulating the response of nonlinear deformable objects in real time. The method distinguishes itself from previous efforts in that a systematic physics-based pre-computational step allows training of neural networks which may be used in real time simulations. We show, through careful error analysis, that the scheme is scalable, with the accuracy being controlled by the number of neurons used in the simulation. 
PhyNNeSS has been integrated into SoFMIS (Software Framework for Multimodal Interactive Simulation) for general use. PMID:22629108
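
    The RBFN reconstruction step described above reduces to evaluating a weighted sum of Gaussian basis functions at run time; a hypothetical 1-D sketch (real PhyNNeSS networks act on full nodal deformation fields, and these centers, weights, and sigma are invented):

```python
import math

def rbf_predict(x, centers, weights, sigma):
    """Evaluate a Gaussian radial basis function network at x:
    output = sum_i w_i * exp(-((x - c_i) / sigma)^2)."""
    phis = [math.exp(-((x - c) / sigma) ** 2) for c in centers]
    return sum(w * p for w, p in zip(weights, phis))

# Hypothetical network with two neurons:
y = rbf_predict(0.0, centers=[0.0, 1.0], weights=[2.0, 0.5], sigma=1.0)
# At x=0 the first basis function is 1 and the second is exp(-1).
print(round(y, 3))  # → 2.184
```

    Because each evaluation is a fixed, small number of exponentials and multiply-adds, the cost per query is constant, which is what makes the 1 kHz haptic update rate plausible once training has been done off-line.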

  2. SCEGRAM: An image database for semantic and syntactic inconsistencies in scenes.

    PubMed

    Öhlschläger, Sabine; Võ, Melissa Le-Hoa

    2017-10-01

    Our visual environment is not random, but follows compositional rules according to what objects are usually found where. Despite the growing interest in how such semantic and syntactic rules - a scene grammar - enable effective attentional guidance and object perception, no common image database containing highly controlled object-scene modifications has been publicly available. Such a database is essential in minimizing the risk that low-level features drive high-level effects of interest, which has been discussed as a possible source of controversial study results. To generate the first database of this kind - SCEGRAM - we took photographs of 62 real-world indoor scenes in six consistency conditions that contain semantic and syntactic (both mild and extreme) violations as well as their combinations. Importantly, scenes were always paired, so that an object was semantically consistent in one scene (e.g., ketchup in a kitchen) and inconsistent in the other (e.g., ketchup in a bathroom). Low-level salience did not differ between object-scene conditions and was generally moderate. Additionally, SCEGRAM contains consistency ratings for every object-scene condition, as well as object-absent scenes and object-only images. Finally, a cross-validation using eye movements replicated previous results of longer dwell times for both semantic and syntactic inconsistencies compared to consistent controls. In sum, the SCEGRAM image database is the first to contain well-controlled semantic and syntactic object-scene inconsistencies that can be used in a broad range of cognitive paradigms (e.g., verbal and pictorial priming, change detection, object identification, etc.), including paradigms addressing developmental aspects of scene grammar. SCEGRAM can be retrieved for research purposes from http://www.scenegrammarlab.com/research/scegram-database/ .

  3. A Web-Based Database for Nurse Led Outreach Teams (NLOT) in Toronto.

    PubMed

    Li, Shirley; Kuo, Mu-Hsing; Ryan, David

    2016-01-01

    A web-based system can provide access to real-time data and information. Healthcare is moving towards digitizing patients' medical information and securely exchanging it through web-based systems. In one of Ontario's health regions, Nurse Led Outreach Teams (NLOT) provide emergency mobile nursing services to help reduce unnecessary transfers from long-term care homes to emergency departments. Currently the NLOT team uses a Microsoft Access database to keep track of the health information on the residents that they serve. The Access database lacks scalability, portability, and interoperability. The objective of this study is the development of a web-based database using Oracle Application Express that is easily accessible from mobile devices. The web-based database will allow NLOT nurses to enter and access resident information anytime and from anywhere.

  4. Generation of large scale urban environments to support advanced sensor and seeker simulation

    NASA Astrophysics Data System (ADS)

    Giuliani, Joseph; Hershey, Daniel; McKeown, David, Jr.; Willis, Carla; Van, Tan

    2009-05-01

    One of the key aspects for the design of a next generation weapon system is the need to operate in cluttered and complex urban environments. Simulation systems rely on accurate representation of these environments and require automated software tools to construct the underlying 3D geometry and associated spectral and material properties that are then formatted for various objective seeker simulation systems. Under an Air Force Small Business Innovative Research (SBIR) contract, we have developed an automated process to generate 3D urban environments with user defined properties. These environments can be composed from a wide variety of source materials, including vector source data, pre-existing 3D models, and digital elevation models, and rapidly organized into a geo-specific visual simulation database. This intermediate representation can be easily inspected in the visible spectrum for content and organization and interactively queried for accuracy. Once the database contains the required contents, it can then be exported into specific synthetic scene generation runtime formats, preserving the relationship between geometry and material properties. To date an exporter for the Irma simulation system developed and maintained by AFRL/Eglin has been created and a second exporter to Real Time Composite Hardbody and Missile Plume (CHAMP) simulation system for real-time use is currently being developed. This process supports significantly more complex target environments than previous approaches to database generation. In this paper we describe the capabilities for content creation for advanced seeker processing algorithms simulation and sensor stimulation, including the overall database compilation process and sample databases produced and exported for the Irma runtime system. We also discuss the addition of object dynamics and viewer dynamics within the visual simulation into the Irma runtime environment.

  5. Key features for ATA / ATR database design in missile systems

    NASA Astrophysics Data System (ADS)

    Özertem, Kemal Arda

    2017-05-01

    Automatic target acquisition (ATA) and automatic target recognition (ATR) are two vital tasks for missile systems, and having robust detection and recognition algorithms is crucial for overall system performance. A robust target detection and recognition algorithm requires an extensive image database. Automatic target recognition algorithms use the image database in the training and testing steps of the algorithm. This directly affects recognition performance, since training accuracy is driven by the quality of the image database. In addition, the performance of an automatic target detection algorithm can be measured effectively using an image database. There are two main ways to design an ATA / ATR database. The first, easy way is to use a scene generator. A scene generator can model objects by considering material information, atmospheric conditions, detector type, and territory. Designing an image database with a scene generator is inexpensive, and it allows many different scenarios to be created quickly and easily. However, the major drawback of using a scene generator is its low fidelity, since the images are created virtually. The second, difficult way is to design the database from real-world images. Designing an image database with real-world images is far more costly and time-consuming; however, it offers high fidelity, which is critical for missile algorithms. In this paper, critical concepts in ATA / ATR database design with real-world images are discussed. Each concept is discussed from the perspectives of ATA and ATR separately. For the implementation stage, some possible solutions and trade-offs for creating the database are proposed, and all proposed approaches are compared with regard to their pros and cons.

  6. Capability of a Mobile Monitoring System to Provide Real-Time Data Broadcasting and Near Real-Time Source Attribution

    NASA Astrophysics Data System (ADS)

    Erickson, M.; Olaguer, J.; Wijesinghe, A.; Colvin, J.; Neish, B.; Williams, J.

    2014-12-01

    It is becoming increasingly important to understand the emissions and health effects of industrial facilities. Many areas have no or limited sustained monitoring capabilities, making it difficult to quantify the major pollution sources affecting human health, especially in fence line communities. Developments in real-time monitoring and micro-scale modeling offer unique ways to tackle these complex issues. This presentation will demonstrate the capability of coupling real-time observations with micro-scale modeling to provide real-time information and near real-time source attribution. The Houston Advanced Research Center constructed the Mobile Acquisition of Real-time Concentrations (MARC) laboratory. MARC consists of a Ford E-350 passenger van outfitted with a Proton Transfer Reaction Mass Spectrometer (PTR-MS) and meteorological equipment. This allows for the fast measurement of various VOCs important to air quality. The data recorded from the van is uploaded to an off-site database and the information is broadcast to a website in real-time. This provides for off-site monitoring of MARC's observations, which allows off-site personnel to provide immediate input to the MARC operators on how to best achieve project objectives. The information stored in the database can also be used to provide near real-time source attribution. An inverse model has been used to ascertain the amount, location, and timing of emissions based on MARC measurements in the vicinity of industrial sites. The inverse model is based on a 3D micro-scale Eulerian forward and adjoint air quality model known as the HARC model. The HARC model uses output from the Quick Urban and Industrial Complex (QUIC) wind model and requires a 3D digital model of the monitored facility based on lidar or industrial permit data. MARC is one of the instrument platforms deployed during the 2014 Benzene and other Toxics Exposure Study (BEE-TEX) in Houston, TX. 
The main goal of the study is to quantify and explain the origin of ambient exposure to hazardous air pollutants in an industrial fence line community near the Houston Ship Channel. Preliminary results derived from analysis of MARC observations during the BEE-TEX experiment will be presented.
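The source-attribution step described above can be illustrated with a toy linear inversion: if a forward dispersion model provides the sensitivity of each receptor measurement to each candidate source, emission rates can be recovered by least squares. The sketch below is a deliberately minimal two-source example with a made-up sensitivity matrix `G`; the actual HARC/QUIC adjoint model is a full 3D transport model, not a static matrix.

```python
def estimate_emissions(G, c):
    """Toy source attribution: solve min ||G e - c||^2 for two sources via the
    normal equations (Cramer's rule). Illustrative only; the HARC model uses a
    3D Eulerian forward/adjoint transport model, not a static sensitivity matrix.
    G: rows of per-receptor sensitivities to each source; c: observed concentrations."""
    # Normal equations: (G^T G) e = G^T c, written out for the 2x2 case
    a11 = sum(g[0] * g[0] for g in G)
    a12 = sum(g[0] * g[1] for g in G)
    a22 = sum(g[1] * g[1] for g in G)
    b1 = sum(g[0] * ci for g, ci in zip(G, c))
    b2 = sum(g[1] * ci for g, ci in zip(G, c))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)
```

With synthetic measurements generated from known emission rates, the inversion recovers those rates exactly, which is the basic consistency check one would run before applying such a scheme to real monitoring data.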

  7. Attribute index and uniform design based multiobjective association rule mining with evolutionary algorithm.

    PubMed

    Zhang, Jie; Wang, Yuping; Feng, Junhong

    2013-01-01

In association rule mining, evaluating an association rule requires repeatedly scanning the database to compare the whole database with the antecedent, the consequent, and the whole rule. In order to decrease the number of comparisons and the time consumed, we present an attribute index strategy. It needs to scan the database only once, to create the attribute index of each attribute. All metrics used to evaluate an association rule then require no further database scans; the data are acquired solely by means of the attribute indices. The paper treats association rule mining as a multiobjective problem rather than a single-objective one. In order to make the acquired solutions scatter uniformly toward the Pareto frontier in the objective space, an elitism policy and uniform design are introduced. The paper presents an algorithm for attribute index and uniform design based multiobjective association rule mining with an evolutionary algorithm, abbreviated IUARMMEA. It no longer requires a user-specified minimum support and minimum confidence, but uses a simple attribute index. It uses a well-designed real encoding to extend its application scope. Experiments performed on several databases demonstrate that the proposed algorithm has excellent performance and can significantly reduce the number of comparisons and the time consumed.
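The indexing idea in this abstract can be sketched simply: one pass over the transaction database builds, for each attribute (item), the set of transaction ids containing it; the support and confidence of any candidate rule then come from set intersections rather than database rescans. This is a minimal illustration of the general strategy, not the authors' IUARMMEA implementation, and the transaction data below is invented.

```python
from collections import defaultdict

def build_attribute_index(transactions):
    """Single database scan: map each item to the set of transaction ids containing it."""
    index = defaultdict(set)
    for tid, items in enumerate(transactions):
        for item in items:
            index[item].add(tid)
    return index

def rule_metrics(index, antecedent, consequent, n):
    """Support and confidence of antecedent -> consequent, computed from the
    attribute index alone (set intersections), with no further database scans."""
    tids_a = set.intersection(*(index[i] for i in antecedent))
    tids_ab = tids_a & set.intersection(*(index[i] for i in consequent))
    support = len(tids_ab) / n
    confidence = len(tids_ab) / len(tids_a) if tids_a else 0.0
    return support, confidence
```

Each rule evaluation inside the evolutionary loop then costs a few set intersections instead of a full pass over the data, which is where the reported reduction in comparisons comes from.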

  8. Attribute Index and Uniform Design Based Multiobjective Association Rule Mining with Evolutionary Algorithm

    PubMed Central

    Wang, Yuping; Feng, Junhong

    2013-01-01

In association rule mining, evaluating an association rule requires repeatedly scanning the database to compare the whole database with the antecedent, the consequent, and the whole rule. In order to decrease the number of comparisons and the time consumed, we present an attribute index strategy. It needs to scan the database only once, to create the attribute index of each attribute. All metrics used to evaluate an association rule then require no further database scans; the data are acquired solely by means of the attribute indices. The paper treats association rule mining as a multiobjective problem rather than a single-objective one. In order to make the acquired solutions scatter uniformly toward the Pareto frontier in the objective space, an elitism policy and uniform design are introduced. The paper presents an algorithm for attribute index and uniform design based multiobjective association rule mining with an evolutionary algorithm, abbreviated IUARMMEA. It no longer requires a user-specified minimum support and minimum confidence, but uses a simple attribute index. It uses a well-designed real encoding to extend its application scope. Experiments performed on several databases demonstrate that the proposed algorithm has excellent performance and can significantly reduce the number of comparisons and the time consumed. PMID:23766683

  9. Expert system decision support for low-cost launch vehicle operations

    NASA Technical Reports Server (NTRS)

    Szatkowski, G. P.; Levin, Barry E.

    1991-01-01

Progress in assessing the feasibility, benefits, and risks associated with AI expert systems applied to low-cost expendable launch vehicle systems is described. Part one identified potential application areas in vehicle operations and on-board functions, assessed measures of cost benefit, and identified key technologies to aid in the implementation of decision support systems in this environment. Part two of the program began the development of prototypes to demonstrate real-time vehicle checkout with controller and diagnostic/analysis intelligent systems and to gather true measures of cost savings vs. conventional software, verification and validation requirements, and maintainability improvement. The main objective of the expert advanced development projects was to provide a robust intelligent system for control/analysis that must be performed within a specified real-time window in order to meet the demands of the given application. The efforts to develop the two prototypes are described. Prime emphasis was on a controller expert system to show real-time performance in a cryogenic propellant loading application, and on experimental safety validation of this system, using commercial off-the-shelf software tools and object-oriented programming techniques. This smart ground support equipment prototype is based in C with embedded expert system rules written in the CLIPS protocol. The relational database, ORACLE, provides non-real-time data support. The second demonstration develops the vehicle/ground intelligent automation concept from phase one to show cooperation between multiple expert systems. This automated test conductor (ATC) prototype utilizes a knowledge-bus approach for intelligent information processing, using virtual sensors and blackboards to solve complex problems. It incorporates distributed processing of real-time data and object-oriented techniques for command, configuration control, and auto-code generation.

  10. Geospatial Database for Strata Objects Based on Land Administration Domain Model (ladm)

    NASA Astrophysics Data System (ADS)

    Nasorudin, N. N.; Hassan, M. I.; Zulkifli, N. A.; Rahman, A. Abdul

    2016-09-01

Recently in our country, the construction of buildings has become more complex, and a strata objects database has become more important for registering the real world, as people now own and use multiple levels of space. Furthermore, strata titles are increasingly important and need to be well managed. LADM, also known as ISO 19152, is a standard model for land administration that allows integrated 2D and 3D representation of spatial units. The aim of this paper is to develop a strata objects database using LADM. The paper discusses the current 2D geospatial database and the need for a 3D geospatial database in the future. It also attempts to develop a strata objects database using the standard data model (LADM) and to analyze the developed database against the LADM data model. The current cadastre system in Malaysia, including strata titles, is discussed; the problems in the 2D geospatial database are listed, and the future need for a 3D geospatial database is examined. The processes for designing a strata objects database are conceptual, logical, and physical database design. The strata objects database will allow us to find both non-spatial and spatial strata title information, and thus shows the location of the strata unit. This development of a strata objects database may help in handling strata titles and related information.
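A minimal sketch of the logical-to-physical design step might look as follows, using SQLite and two hypothetical tables loosely named after LADM's LA_BAUnit and LA_SpatialUnit classes. The table layouts and sample rows here are invented for illustration; the real ISO 19152 model defines far richer attributes and associations, and a production system would store true 3D geometry.

```python
import sqlite3

# Hypothetical, heavily simplified tables named after LADM's LA_BAUnit and
# LA_SpatialUnit classes; ISO 19152 defines far richer attributes/associations.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE la_ba_unit (          -- basic administrative unit, e.g. a strata title
    uid       INTEGER PRIMARY KEY,
    name      TEXT,
    unit_type TEXT
);
CREATE TABLE la_spatial_unit (     -- the 3D strata parcel linked to its title
    suid       INTEGER PRIMARY KEY,
    ba_unit_id INTEGER REFERENCES la_ba_unit(uid),
    level      INTEGER,            -- storey of the strata unit
    geometry   TEXT                -- placeholder; a real system stores 3D geometry
);
""")
conn.execute("INSERT INTO la_ba_unit VALUES (1, 'Parcel A', 'strata')")
conn.execute("INSERT INTO la_spatial_unit VALUES (10, 1, 3, 'POLYHEDRON(...)')")
# Join spatial and non-spatial strata information, as the abstract describes
row = conn.execute("""SELECT b.name, s.level
                      FROM la_ba_unit b JOIN la_spatial_unit s
                      ON s.ba_unit_id = b.uid""").fetchone()
```

The join mirrors the paper's goal of retrieving non-spatial title information together with the location (here, the storey) of the strata unit.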

  11. Reducing process delays for real-time earthquake parameter estimation - An application of KD tree to large databases for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Yin, Lucy; Andrews, Jennifer; Heaton, Thomas

    2018-05-01

Earthquake parameter estimation using nearest-neighbor searches over a large database of observations can produce reliable predictions. However, in the real-time application of Earthquake Early Warning (EEW) systems, the accuracy gained from a large database is penalized by a significant delay in processing time. We propose using a multidimensional binary search tree (KD tree) data structure to organize large seismic databases and reduce the processing time of the nearest-neighbor search. We evaluated the performance of the KD tree on the Gutenberg Algorithm, a database-searching algorithm for EEW. We constructed an offline test to predict peak ground motions using a database with feature sets of waveform filter-bank characteristics, and compared the results with the observed seismic parameters. We concluded that a large database provides more accurate predictions of ground motion information, such as peak ground acceleration, velocity, and displacement (PGA, PGV, PGD), than of source parameters, such as hypocentral distance. Organizing the database with a KD tree reduced the average search time by 85% relative to the exhaustive method, making the approach feasible for real-time implementation. The algorithm is straightforward, and the results will reduce the overall warning delivery time for EEW.
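The speedup comes from the standard KD-tree construction and pruned nearest-neighbor search, which the sketch below illustrates in pure Python. The point set is a toy stand-in for the paper's large feature database; this is the generic data structure, not the Gutenberg Algorithm itself.

```python
import math

def build_kdtree(points, depth=0):
    """Recursively build a KD tree: split on axis (depth mod k) at the median."""
    if not points:
        return None
    axis = depth % len(points[0])
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {
        "point": points[mid],
        "left": build_kdtree(points[:mid], depth + 1),
        "right": build_kdtree(points[mid + 1:], depth + 1),
    }

def nearest(node, target, depth=0, best=None):
    """Nearest neighbor of target (Euclidean), pruning branches that cannot win."""
    if node is None:
        return best
    def dist(p):
        return math.dist(p, target)
    if best is None or dist(node["point"]) < dist(best):
        best = node["point"]
    axis = depth % len(target)
    diff = target[axis] - node["point"][axis]
    near, far = (node["left"], node["right"]) if diff < 0 else (node["right"], node["left"])
    best = nearest(near, target, depth + 1, best)
    # Visit the far branch only if the splitting plane is closer than the best so far
    if abs(diff) < dist(best):
        best = nearest(far, target, depth + 1, best)
    return best
```

For a balanced tree the query cost drops from O(n) per exhaustive scan to roughly O(log n), which is the effect the paper exploits to make large-database lookups feasible within EEW latency budgets.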

  12. Context indexing of digital cardiac ultrasound records in PACS

    NASA Astrophysics Data System (ADS)

    Lobodzinski, S. Suave; Meszaros, Georg N.

    1998-07-01

Recent wide adoption of the DICOM 3.0 standard by ultrasound equipment vendors created a need for practical clinical implementations of cardiac imaging study visualization, management, and archiving. DICOM 3.0 defines only a logical and physical format for exchanging image data (still images, video, patient and study demographics). All DICOM-compliant imaging studies must presently be archived on a 650 MB recordable compact disc. This is a severe limitation for ultrasound applications, where studies 3 to 10 minutes long are common practice. In addition, DICOM digital echocardiography objects require physiological signal indexing, content segmentation, and characterization. Since DICOM 3.0 is an interchange standard only, it does not define how to database composite video objects. The goal of this research was therefore to address the issues of efficient storage, retrieval, and management of DICOM-compliant cardiac video studies in a distributed PACS environment. Our Web-based implementation has the advantage of accommodating both DICOM-defined entity-relation modules (equipment data, patient data, video format, etc.) in standard relational database tables and digital indexed video with its attributes in an object-relational database. The object-relational data model facilitates content indexing of full-motion cardiac imaging studies through bidirectional hyperlink generation that ties searchable video attributes and related objects to individual video frames in the temporal domain. Benefits realized from the use of bidirectionally hyperlinked data models in an object-relational database include: (1) real-time video indexing during image acquisition, (2) random access and frame-accurate instant playback of previously recorded full-motion imaging data, and (3) time savings from faster and more accurate access to data through multiple navigation mechanisms such as multidimensional queries on an index, queries on a hyperlink attribute, free search, and browsing.

  13. Real-time door detection for indoor autonomous vehicle

    NASA Astrophysics Data System (ADS)

    He, Zhihao; Zhu, Ming

    2017-07-01

Indoor Autonomous Vehicles (IAVs) are used in many indoor scenes, such as hotels and hospitals. Door detection is a key issue in guiding an IAV into rooms. In this paper, we consider door detection for the indoor navigation of IAVs. Since real-time properties are important for real-world IAVs, the detection algorithm must be fast enough. Most monocular-camera-based door detection models need a perfect detection of the four line segments of the door or the four corners. But in many situations, line segments can be extended or cut off, and there can be many falsely detected corners. Few such models can effectively distinguish doors from door-like objects with similar shapes. We propose a 2-D vision model of the door made up of line segments. The number of parts detected is used to determine the possibility of a door. Our algorithm is tested on a database of doors; its robustness and real-time performance are verified. The precision is 89.4%. The average time consumed to process a 640x320 image is 44.73 ms.

  14. 3-D Object Pose Determination Using Complex EGI

    DTIC Science & Technology

    1990-10-01

    the length of edges of the polyhedron from the EGI. Dane and Bajcsy [4] make use of the Gaussian Image to spatially segment a group of range points...involving real range data of two smooth objects were conducted. The two smooth objects are the torus and ellipsoid, whose databases have been created...in the simulations earlier. 5.0.1 Implementational Issues The torus and ellipsoid were crafted out of clay to resemble the models whose databases were

  15. PEP725: real time monitoring of phenological events in Austria, Germany, Sweden and Switzerland

    NASA Astrophysics Data System (ADS)

    Ungersboeck, Markus; Bolmgren, Kjell; Huebner, Thomas; Kaspar, Frank; Langvall, Ola; Paul, Anita; Pietragalla, Barbara; Scheifinger, Helfried; Koch, Elisabeth

    2017-04-01

The main objective of PEP725 (Pan European Phenological database; http://www.pep725.eu/) is to promote and facilitate phenological research by delivering a pan-European phenological database with open, unrestricted data access for science, research, and education. The first datasets in PEP725 date back to 1868; however, only a few observations are available before 1950. From 1951 onwards, phenological networks all over Europe developed rapidly. So far, more than 11 923 489 observations of 121 different plants are available in the PEP725 database. Approximately 40 % of all data are flowering records, 10 % are fruit ripeness observations, and another 10 % are leaf unfolding observations. The PEP725 database is updated annually. Deutscher Wetterdienst and MeteoSwiss have recently begun to let their observers upload observations via the web in real time; ZAMG introduced this web-based feature already in 2007 (phenowatch.at), and the observers of SWE-NPN (the Swedish National Phenology Network) have submitted their observations through the web application naturenskalender.se since its start in 2008. Since spring 2016, a real-time animated monitoring tool shows how the "green wave" in spring moves from 46° northern latitude up to the Arctic Circle and how the "brown wave" in autumn moves in the opposite direction. In 2015 the "green wave" sped up from approximately 4.4 days/degree latitude for hazel flowering to 2.9 days/degree latitude for willow flowering and 2.25 days/degree latitude for birch leaf unfolding. Other European countries, for instance Italy, the Netherlands, and the UK, have been visualizing ground phenology in real time for some years, but these efforts always end at the national borders. PEP725 is funded by ZAMG, the Austrian ministry of science, research and economy, and EUMETNET, the network of European meteorological services. 
So far 21 European meteorological services and 7 partners from different phenological network operators have joined PEP725.

  16. The utilization of neural nets in populating an object-oriented database

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Hill, Scott E.; Cromp, Robert F.

    1989-01-01

Existing NASA-supported scientific databases are usually developed, managed, and populated in a tedious, error-prone, and self-limiting way in terms of what can be described in a relational Database Management System (DBMS). The next generation of Earth remote sensing platforms (i.e., the Earth Observing System (EOS)) will be capable of generating data at a rate of over 300 Mb per second from a suite of instruments designed for different applications. What is needed is an innovative approach that creates object-oriented databases that segment, characterize, catalog, and are manageable in a domain-specific context, and whose contents are available interactively and in near real time to the user community. Described here is work in progress that utilizes an artificial neural net approach to characterize satellite imagery of undefined objects into high-level data objects. The characterized data is then dynamically allocated to an object-oriented database where it can be reviewed and assessed by a user. The definition, development, and evolution of the overall data system model are steps in the creation of an application-driven, knowledge-based scientific information system.

  17. The Real-Time ObjectAgent Software Architecture for Distributed Satellite Systems

    DTIC Science & Technology

    2001-01-01

real-time operating system selection are also discussed. The fourth section describes a simple demonstration of real-time ObjectAgent. Finally, the...experience with C++. After selecting the programming language, it was necessary to select a target real-time operating system (RTOS) and embedded...ObjectAgent software to run on the OSE Real Time Operating System. In addition, she is responsible for the integration of ObjectAgent

  18. Realization of Real-Time Clinical Data Integration Using Advanced Database Technology

    PubMed Central

    Yoo, Sooyoung; Kim, Boyoung; Park, Heekyong; Choi, Jinwook; Chun, Jonghoon

    2003-01-01

    As information & communication technologies have advanced, interest in mobile health care systems has grown. In order to obtain information seamlessly from distributed and fragmented clinical data from heterogeneous institutions, we need solutions that integrate data. In this article, we introduce a method for information integration based on real-time message communication using trigger and advanced database technologies. Messages were devised to conform to HL7, a standard for electronic data exchange in healthcare environments. The HL7 based system provides us with an integrated environment in which we are able to manage the complexities of medical data. We developed this message communication interface to generate and parse HL7 messages automatically from the database point of view. We discuss how easily real time data exchange is performed in the clinical information system, given the requirement for minimum loading of the database system. PMID:14728271
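The trigger-generated messaging described here can be illustrated with a minimal HL7 v2.x pipe-delimited message builder and parser. The application names, patient identifiers, and control values below are invented for illustration, and a production interface would use a full HL7 engine rather than hand-built strings.

```python
from datetime import datetime

def build_hl7_message(sending_app, receiving_app, patient_id, patient_name):
    """Assemble a minimal HL7 v2.3 ADT^A01 message (all field values illustrative)."""
    ts = datetime(2003, 1, 15, 9, 30).strftime("%Y%m%d%H%M%S")
    msh = f"MSH|^~\\&|{sending_app}||{receiving_app}||{ts}||ADT^A01|MSG00001|P|2.3"
    pid = f"PID|1||{patient_id}||{patient_name}"
    return "\r".join([msh, pid])   # HL7 v2 separates segments with carriage returns

def parse_segments(message):
    """Split an HL7 v2 message into a {segment id: field list} dictionary."""
    segments = {}
    for seg in message.split("\r"):
        fields = seg.split("|")
        segments[fields[0]] = fields[1:]
    return segments
```

In the architecture the paper describes, a database trigger would call a builder like this when clinical data changes, and the receiving system would parse the segments back into its own tables.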

  19. Redefining the Practice of Peer Review Through Intelligent Automation Part 1: Creation of a Standardized Methodology and Referenceable Database.

    PubMed

    Reiner, Bruce I

    2017-10-01

    Conventional peer review practice is compromised by a number of well-documented biases, which in turn limit standard of care analysis, which is fundamental to determination of medical malpractice. In addition to these intrinsic biases, other existing deficiencies exist in current peer review including the lack of standardization, objectivity, retrospective practice, and automation. An alternative model to address these deficiencies would be one which is completely blinded to the peer reviewer, requires independent reporting from both parties, utilizes automated data mining techniques for neutral and objective report analysis, and provides data reconciliation for resolution of finding-specific report differences. If properly implemented, this peer review model could result in creation of a standardized referenceable peer review database which could further assist in customizable education, technology refinement, and implementation of real-time context and user-specific decision support.

  20. Modeling Real-Time Applications with Reusable Design Patterns

    NASA Astrophysics Data System (ADS)

    Rekhis, Saoussen; Bouassida, Nadia; Bouaziz, Rafik

Real-Time (RT) applications, which manipulate important volumes of data, need to be managed with RT databases that deal with time-constrained data and time-constrained transactions. In spite of their numerous advantages, RT database development remains a complex task, since developers must study many design issues related to the RT domain. In this paper, we tackle this problem by proposing RT design patterns that allow the modeling of structural and behavioral aspects of RT databases. We show how RT design patterns can provide design assistance through architecture reuse for recurring design problems. In addition, we present a UML profile that represents the patterns and further facilitates their reuse. This profile proposes, on the one hand, UML extensions for modeling the variability of patterns in the RT context and, on the other hand, extensions inspired by the MARTE (Modeling and Analysis of Real-Time Embedded systems) profile.

  1. The INGV Real Time Strong Motion Database

    NASA Astrophysics Data System (ADS)

    Massa, Marco; D'Alema, Ezio; Mascandola, Claudia; Lovati, Sara; Scafidi, Davide; Gomez, Antonio; Carannante, Simona; Franceschina, Gianlorenzo; Mirenna, Santi; Augliera, Paolo

    2017-04-01

INGV real-time strong-motion data sharing is assured by the INGV Strong Motion Database. ISMD (http://ismd.mi.ingv.it) was designed in the last months of 2011 in cooperation among different INGV departments, with the aim of organizing the distribution of INGV strong-motion data using standard procedures for data acquisition and processing. The first version of the web portal was published soon after the occurrence of the 2012 Emilia (Northern Italy), Mw 6.1, seismic sequence. At that time ISMD was the first European real-time web portal devoted to the engineering seismology community. After four years of successful operation, the thousands of accelerometric waveforms collected in the archive required a technological improvement of the system, in order to better organize the archiving of new data and to answer user requests more efficiently. ISMD 2.0 is based on PostgreSQL (www.postgresql.org), an open-source object-relational database. The main purpose of the web portal is to distribute, a few minutes after the origin time, the accelerometric waveforms and related metadata of Italian earthquakes with ML≥3.0. Data are provided both in raw SAC (counts) and automatically corrected ASCII (gal) formats. The web portal also provides, for each event, a detailed description of the ground motion parameters (i.e. Peak Ground Acceleration, Velocity and Displacement, Arias and Housner Intensities), data converted to velocity and displacement, response spectra up to 10.0 s, and general maps concerning the recent and historical seismicity of the area, together with information about its seismic hazard. The focal parameters of the events are provided by the INGV National Earthquake Center (CNT, http://cnt.rm.ingv.it). Moreover, the database provides a detailed site characterization section for each strong-motion station, based on geological, geomorphological, and geophysical information. At present (i.e. 
January 2017), ISMD includes 987 Italian earthquakes (121,185 waveforms) with ML≥3.0, recorded since 1 January 2012 by 204 accelerometric stations belonging to the INGV strong-motion network and regional partners.

  2. StarBase: A Firm Real-Time Database Manager for Time-Critical Applications

    DTIC Science & Technology

    1995-01-01

Mellon University [10]. StarBase differs from previous RT-DBMS work [1, 2, 3] in that a) it relies on a real-time operating system which provides...simulation studies, StarBase uses a real-time operating system to provide basic real-time functionality and deals with issues beyond transaction...resource scheduling provided by the underlying real-time operating system. Issues of data contention are dealt with by use of a priority

  3. Decision Support for Emergency Operations Centers

    NASA Technical Reports Server (NTRS)

    Harvey, Craig; Lawhead, Joel; Watts, Zack

    2005-01-01

The Flood Disaster Mitigation Decision Support System (DSS) is a computerized information system that allows regional emergency-operations government officials to make decisions regarding the dispatch of resources in response to flooding. The DSS implements a real-time model of inundation utilizing recently acquired lidar elevation data as well as real-time data from flood gauges, and other instruments within and upstream of an area that is or could become flooded. The DSS information is updated as new data become available. The model generates real-time maps of flooded areas and predicts flood crests at specified locations. The inundation maps are overlaid with information on population densities, property values, hazardous materials, evacuation routes, official contact information, and other information needed for emergency response. The program maintains a database and a Web portal through which real-time data from instrumentation are gathered into the database. Also included in the database is a geographic information system, from which the program obtains the overlay data for areas of interest as needed. The portal makes some portions of the database accessible to the public. Access to other portions of the database is restricted to government officials according to various levels of authorization. The Flood Disaster Mitigation DSS has been integrated into a larger DSS named REACT (Real-time Emergency Action Coordination Tool), which also provides emergency operations managers with data for any type of impact area such as floods, fires, bomb

  4. Science information systems: Archive, access, and retrieval

    NASA Technical Reports Server (NTRS)

    Campbell, William J.

    1991-01-01

    The objective of this research is to develop technology for the automated characterization and interactive retrieval and visualization of very large, complex scientific data sets. Technologies will be developed for the following specific areas: (1) rapidly archiving data sets; (2) automatically characterizing and labeling data in near real-time; (3) providing users with the ability to browse contents of databases efficiently and effectively; (4) providing users with the ability to access and retrieve system independent data sets electronically; and (5) automatically alerting scientists to anomalies detected in data.

  5. Real-Time Continuous Response Spectra Exceedance Calculation Displayed in a Web-Browser Enables Rapid and Robust Damage Evaluation by First Responders

    NASA Astrophysics Data System (ADS)

    Franke, M.; Skolnik, D. A.; Harvey, D.; Lindquist, K.

    2014-12-01

    A novel and robust approach is presented that provides near real-time earthquake alarms for critical structures at distributed locations and large facilities using real-time estimation of response spectra obtained from near free-field motions. Influential studies dating back to the 1980s identified spectral response acceleration as a key ground motion characteristic that correlates well with observed damage in structures. Thus, monitoring and reporting on exceedance of spectra-based thresholds are useful tools for assessing the potential for damage to facilities or multi-structure campuses based on input ground motions only. With as little as one strong-motion station per site, this scalable approach can provide rapid alarms on the damage status of remote towns, critical infrastructure (e.g., hospitals, schools) and points of interests (e.g., bridges) for a very large number of locations enabling better rapid decision making during critical and difficult immediate post-earthquake response actions. Details on the novel approach are presented along with an example implementation for a large energy company. Real-time calculation of PSA exceedance and alarm dissemination are enabled with Bighorn, an extension module based on the Antelope software package that combines real-time spectral monitoring and alarm capabilities with a robust built-in web display server. Antelope is an environmental data collection software package from Boulder Real Time Technologies (BRTT) typically used for very large seismic networks and real-time seismic data analyses. The primary processing engine produces continuous time-dependent response spectra for incoming acceleration streams. It utilizes expanded floating-point data representations within object ring-buffer packets and waveform files in a relational database. This leads to a very fast method for computing response spectra for a large number of channels. 
A Python script evaluates these response spectra for exceedance of one or more specified spectral limits, reporting any such exceedances via alarm packets that are put in the object ring-buffer for use by any alarm processes that need them. The web-display subsystem allows alert dissemination, interactive exploration, and alarm cancellation via the WWW.
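The core computation, deriving a response spectrum from an incoming acceleration stream and testing it against spectral limits, can be sketched for a single channel as follows. This is a simplified central-difference integration of a damped single-degree-of-freedom oscillator, not BRTT's Bighorn implementation; the thresholds and input signal are illustrative.

```python
import math

def psa(accel, dt, period, damping=0.05):
    """Peak pseudo-spectral acceleration of a damped SDOF oscillator driven by
    ground acceleration `accel`, solving u'' + 2*z*wn*u' + wn^2*u = -ag with a
    central-difference scheme (stable for dt < period/pi)."""
    wn = 2.0 * math.pi / period        # natural circular frequency
    zw = damping * wn
    u_prev = u = 0.0
    peak = 0.0
    for ag in accel:
        u_next = (-ag - wn * wn * u + (2.0 * u - u_prev) / dt**2
                  + zw * u_prev / dt) / (1.0 / dt**2 + zw / dt)
        u_prev, u = u, u_next
        peak = max(peak, abs(u))
    return wn * wn * peak              # PSA = wn^2 * peak relative displacement

def exceeded_periods(accel, dt, limits):
    """Periods whose computed PSA exceeds its alarm limit ({period: PSA limit})."""
    return sorted(T for T, lim in limits.items() if psa(accel, dt, T) > lim)
```

Driving the oscillator at its resonant period produces a PSA well above the peak ground acceleration, while off-resonance periods stay low, which is exactly the discrimination the spectral alarm thresholds rely on.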

  6. Shape-based human detection for threat assessment

    NASA Astrophysics Data System (ADS)

    Lee, Dah-Jye; Zhan, Pengcheng; Thomas, Aaron; Schoenberger, Robert B.

    2004-07-01

Detection of intrusions for early threat assessment requires the capability of distinguishing whether the intrusion is a human, an animal, or another object. Most low-cost security systems use simple electronic motion detection sensors to monitor motion or the location of objects within the perimeter. Although cost-effective, these systems suffer from high rates of false alarm, especially when monitoring open environments: any moving object, including an animal, can falsely trigger the security system. Other security systems that utilize video equipment require human interpretation of the scene in order to make real-time threat assessments. A shape-based human detection technique has been developed for accurate early threat assessment in open and remote environments. Potential threats are isolated from the static background scene using differential motion analysis, and contours of the intruding objects are extracted for shape analysis. Contour points are simplified by removing redundant points connecting short, straight line segments and preserving only those with shape significance. Contours are represented in tangent space for comparison with shapes stored in a database. A power cepstrum technique has been developed to search for the best-matched contour in the database and to distinguish a human from other objects across different viewing angles and distances.
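The tangent-space representation mentioned above can be illustrated with a turning-function sketch: each contour becomes a sequence of cumulative turning angles, which is invariant to rotation and translation, and a simple distance between two such sequences scores shape similarity. This is a toy stand-in for the paper's power-cepstrum matching; the equal-vertex-count restriction is a simplification.

```python
import math

def turning_function(contour):
    """Cumulative turning angles along a closed polygon: a tangent-space
    representation invariant to rotation and translation."""
    n = len(contour)
    headings = [math.atan2(contour[(i + 1) % n][1] - contour[i][1],
                           contour[(i + 1) % n][0] - contour[i][0])
                for i in range(n)]
    turns = [0.0]
    for i in range(1, n):
        d = headings[i] - headings[i - 1]
        d = (d + math.pi) % (2.0 * math.pi) - math.pi   # wrap each turn to [-pi, pi)
        turns.append(turns[-1] + d)
    return turns

def shape_distance(a, b):
    """Mean absolute difference of turning functions (requires equal vertex
    counts and aligned starting vertices, a simplification of real matching)."""
    ta, tb = turning_function(a), turning_function(b)
    if len(ta) != len(tb):
        return float("inf")
    return sum(abs(x - y) for x, y in zip(ta, tb)) / len(ta)
```

A contour compared against a rotated copy of itself scores near zero, while a genuinely different shape scores higher, which is the basis for ranking database matches.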

  7. Architectural Implications for Spatial Object Association Algorithms*

    PubMed Central

    Kumar, Vijay S.; Kurc, Tahsin; Saltz, Joel; Abdulla, Ghaleb; Kohn, Scott R.; Matarazzo, Celeste

    2013-01-01

    Spatial object association, also referred to as crossmatch of spatial datasets, is the problem of identifying and comparing objects in two or more datasets based on their positions in a common spatial coordinate system. In this work, we evaluate two crossmatch algorithms that are used for astronomical sky surveys, on the following database system architecture configurations: (1) Netezza Performance Server®, a parallel database system with active disk style processing capabilities, (2) MySQL Cluster, a high-throughput network database system, and (3) a hybrid configuration consisting of a collection of independent database system instances with data replication support. Our evaluation provides insights about how architectural characteristics of these systems affect the performance of the spatial crossmatch algorithms. We conducted our study using real use-case scenarios borrowed from a large-scale astronomy application known as the Large Synoptic Survey Telescope (LSST). PMID:25692244
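
    The crossmatch problem itself is easy to state in code. Below is a hedged, simplified sketch of positional association using spatial grid binning so each object is compared only against nearby cells; real sky-survey crossmatch works on spherical coordinates with more elaborate indexing, and all names here are illustrative.

```python
from collections import defaultdict

def crossmatch(cat_a, cat_b, radius):
    """Return (i, j) pairs where object i of cat_a lies within
    `radius` of object j of cat_b, in a planar coordinate system."""
    cell = radius
    grid = defaultdict(list)
    for j, (x, y) in enumerate(cat_b):
        grid[(int(x // cell), int(y // cell))].append(j)
    matches = []
    for i, (x, y) in enumerate(cat_a):
        cx, cy = int(x // cell), int(y // cell)
        # only the 3x3 block of neighboring cells can contain matches
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for j in grid[(cx + dx, cy + dy)]:
                    bx, by = cat_b[j]
                    if (x - bx) ** 2 + (y - by) ** 2 <= radius ** 2:
                        matches.append((i, j))
    return matches
```

The binning is what the database architectures in the study must support efficiently, whether via active-disk processing or data partitioning.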

  8. Objects Architecture: A Comprehensive Design Approach for Real-Time, Distributed, Fault-Tolerant, Reactive Operating Systems.

    DTIC Science & Technology

    1987-09-01

    real-time operating system should be efficient from the real-time point...5,8]) system naming scheme. 3.2 Protecting Objects Real-time embedded systems usually neglect protection mechanisms. However, a real-time operating system cannot...allocation mechanism should adhere to application constraints. This strong relationship between a real-time operating system and the application

  9. Registration of terrestrial mobile laser data on 2D or 3D geographic database by use of a non-rigid ICP approach.

    NASA Astrophysics Data System (ADS)

    Monnier, F.; Vallet, B.; Paparoditis, N.; Papelard, J.-P.; David, N.

    2013-10-01

    This article presents a generic and efficient method to register terrestrial mobile data with imperfect location on a geographic database that has better overall accuracy but less detail. The registration method proposed in this paper is based on a semi-rigid point-to-plane ICP ("Iterative Closest Point"). The main applications of such registration are to improve existing geographic databases, particularly in terms of accuracy, level of detail and diversity of represented objects. Other applications include fine geometric modelling and fine façade texturing, and object extraction such as trees, poles, road signs, marks, facilities, vehicles, etc. The geopositioning system of mobile mapping systems is affected by GPS masks that are only partially corrected by an Inertial Navigation System (INS), which can cause an important drift. As this drift varies non-linearly, but slowly in time, it is modelled by a translation defined as a piecewise linear function of time whose variation over time is minimized (rigidity term). For each iteration of the ICP, the drift is estimated in order to minimize the distance between laser points and planar model primitives (data attachment term). The method has been tested on real data (a scan of the city of Paris of 3.6 million laser points registered on a 3D model of approximately 71,400 triangles).
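
    The drift model (a piecewise-linear translation balancing a data-attachment term against a rigidity term) reduces, for one drift component, to a small regularized least-squares problem. The sketch below is an assumed simplification of that inner step, not the paper's full 3D point-to-plane solver; knot placement, the rigidity weight, and all names are illustrative.

```python
import numpy as np

def estimate_drift(times, offsets, knots, rigidity=1.0):
    """Fit drift as a piecewise-linear function of time (values at
    `knots`), minimizing the squared misfit to observed offsets plus a
    rigidity penalty on differences between consecutive knot values."""
    times, offsets, knots = map(np.asarray, (times, offsets, knots))
    # Design matrix: linear-interpolation weights of each observation
    A = np.zeros((len(times), len(knots)))
    for i, t in enumerate(times):
        j = min(max(np.searchsorted(knots, t) - 1, 0), len(knots) - 2)
        w = (t - knots[j]) / (knots[j + 1] - knots[j])
        A[i, j], A[i, j + 1] = 1.0 - w, w
    # Rigidity term: first differences of knot values
    D = np.diff(np.eye(len(knots)), axis=0)
    lhs = A.T @ A + rigidity * (D.T @ D)
    return np.linalg.solve(lhs, A.T @ offsets)
```

A constant observed offset is recovered exactly, since it incurs zero rigidity penalty; a rapidly varying offset would be smoothed toward a slowly varying drift, which is the intended behavior.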

  10. Real-time magnetic resonance imaging and electromagnetic articulography database for speech production research (TC)

    PubMed Central

    Narayanan, Shrikanth; Toutios, Asterios; Ramanarayanan, Vikram; Lammert, Adam; Kim, Jangwon; Lee, Sungbok; Nayak, Krishna; Kim, Yoon-Chul; Zhu, Yinghua; Goldstein, Louis; Byrd, Dani; Bresch, Erik; Ghosh, Prasanta; Katsamanis, Athanasios; Proctor, Michael

    2014-01-01

    USC-TIMIT is an extensive database of multimodal speech production data, developed to complement existing resources available to the speech research community and with the intention of being continuously refined and augmented. The database currently includes real-time magnetic resonance imaging data from five male and five female speakers of American English. Electromagnetic articulography data have also been presently collected from four of these speakers. The two modalities were recorded in two independent sessions while the subjects produced the same 460 sentence corpus used previously in the MOCHA-TIMIT database. In both cases the audio signal was recorded and synchronized with the articulatory data. The database and companion software are freely available to the research community. PMID:25190403

  11. [Research on Zhejiang blood information network and management system].

    PubMed

    Yan, Li-Xing; Xu, Yan; Meng, Zhong-Hua; Kong, Chang-Hong; Wang, Jian-Min; Jin, Zhen-Liang; Wu, Shi-Ding; Chen, Chang-Shui; Luo, Ling-Fei

    2007-02-01

    This research aimed to develop the first province-level centralized blood information database and real-time communication network in China. Multiple technologies were used, including local area network database separate operation, a real-time data concentration and distribution mechanism, allopatric (off-site) backup, and an optical fiber virtual private network (VPN). As a result, the centralized blood information database and management system were successfully constructed, covering all of Zhejiang province, and real-time exchange of blood data was realised. In conclusion, its implementation promotes volunteer blood donation and ensures blood safety in Zhejiang, and especially strengthens the quick response to public health emergencies. This project lays the foundation for centralized testing and allotment among blood banks in Zhejiang, and can serve as a reference for contemporary blood bank information systems in China.

  12. Creating of Central Geospatial Database of the Slovak Republic and Procedures of its Revision

    NASA Astrophysics Data System (ADS)

    Miškolci, M.; Šafář, V.; Šrámková, R.

    2016-06-01

    The article describes the creation of the initial three-dimensional geodatabase, from planning and design through the determination of technological and manufacturing processes to the practical use of the Central Geospatial Database (CGD; the official name in Slovak is Centrálna Priestorová Databáza, CPD), and briefly describes the procedures of its revision. CGD ensures proper collection, processing, storing, transferring and displaying of digital geospatial information. CGD is used by the Ministry of Defense (MoD) for defense and crisis management tasks and by the Integrated Rescue System. For military personnel, CGD runs on the MoD intranet; for other users outside the MoD it is transformed into ZbGIS (the Primary Geodatabase of the Slovak Republic) and runs on a public web site. CGD is a global set of geospatial information: a vector computer model which completely covers the entire territory of Slovakia. The seamless CGD is created by digitizing the real world using photogrammetric stereoscopic methods and measurements of object properties. The basic vector model of CGD (from photogrammetric processing) is then taken to the field for inspection and additional gathering of object properties across the whole mapping area. Finally, real-world objects are spatially modeled as entities of a three-dimensional database. CGD makes it possible to know the territory in all three spatial dimensions. Every entity in CGD records its time of collection, which allows users to assess the timeliness of the information. CGD can be utilized for geographical analysis, geo-referencing, cartographic purposes and various special-purpose mapping, and has the ambition to cover the needs not only of the MoD but to become a reference model for the national geographical infrastructure.

  13. Real-Time Payload Control and Monitoring on the World Wide Web

    NASA Technical Reports Server (NTRS)

    Sun, Charles; Windrem, May; Givens, John J. (Technical Monitor)

    1998-01-01

    World Wide Web (W3) technologies such as the Hypertext Transfer Protocol (HTTP) and the Java object-oriented programming environment offer a powerful, yet relatively inexpensive, framework for distributed application software development. This paper describes the design of a real-time payload control and monitoring system that was developed with W3 technologies at NASA Ames Research Center. Based on the Java Development Kit (JDK) 1.1, the system uses an event-driven "publish and subscribe" approach to inter-process communication and graphical user-interface construction. A C Language Integrated Production System (CLIPS) compatible inference engine provides the back-end intelligent data processing capability, while the Oracle Relational Database Management System (RDBMS) provides the data management function. Preliminary evaluation shows acceptable performance for some classes of payloads, with Java's portability and multimedia support identified as the most significant benefits.
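
    The event-driven "publish and subscribe" pattern the abstract describes can be sketched in a few lines. This is a generic illustration of the pattern only (shown in Python rather than the system's Java, and with invented names), not NASA's implementation.

```python
class EventBus:
    """Minimal publish/subscribe bus: handlers register for a topic
    and every published event on that topic is fanned out to them."""

    def __init__(self):
        self._subs = {}

    def subscribe(self, topic, handler):
        self._subs.setdefault(topic, []).append(handler)

    def publish(self, topic, event):
        # deliver to subscribers of this topic only
        for handler in self._subs.get(topic, []):
            handler(event)
```

In such a design, a telemetry decoder publishes events while GUI widgets and the inference engine subscribe, so the components never call each other directly.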

  14. A Summary of the Naval Postgraduate School Research Program

    DTIC Science & Technology

    1989-08-30

    Fundamental Theory for Automatically Combining Changes to Software Systems; Database-System Approach to Software Engineering Environments (SEEs); Multilevel Database Security; Temporal Database Management and Real-Time Database Computers; The Multi-lingual, Multi-Model, Multi-Backend Database

  15. Semantic guidance of eye movements in real-world scenes

    PubMed Central

    Hwang, Alex D.; Wang, Hsueh-Cheng; Pomplun, Marc

    2011-01-01

    The perception of objects in our visual world is influenced by not only their low-level visual features such as shape and color, but also their high-level features such as meaning and semantic relations among them. While it has been shown that low-level features in real-world scenes guide eye movements during scene inspection and search, the influence of semantic similarity among scene objects on eye movements in such situations has not been investigated. Here we study guidance of eye movements by semantic similarity among objects during real-world scene inspection and search. By selecting scenes from the LabelMe object-annotated image database and applying Latent Semantic Analysis (LSA) to the object labels, we generated semantic saliency maps of real-world scenes based on the semantic similarity of scene objects to the currently fixated object or the search target. An ROC analysis of these maps as predictors of subjects’ gaze transitions between objects during scene inspection revealed a preference for transitions to objects that were semantically similar to the currently inspected one. Furthermore, during the course of a scene search, subjects’ eye movements were progressively guided toward objects that were semantically similar to the search target. These findings demonstrate substantial semantic guidance of eye movements in real-world scenes and show its importance for understanding real-world attentional control. PMID:21426914
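
    The core quantity behind the semantic saliency maps is the cosine similarity between LSA vectors of object labels. A minimal sketch of that computation follows; the toy two-dimensional vectors and function names are illustrative, not the study's actual LSA space.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def semantic_saliency(target_vec, object_vecs):
    """Saliency of each scene object = cosine similarity of its LSA
    label vector to the currently fixated (or target) object."""
    return {name: cosine(vec, target_vec) for name, vec in object_vecs.items()}
```

Objects whose label vectors point in nearly the same direction as the fixated object's score close to 1 and would be predicted as likely next fixation targets.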

  16. Semantic guidance of eye movements in real-world scenes.

    PubMed

    Hwang, Alex D; Wang, Hsueh-Cheng; Pomplun, Marc

    2011-05-25

    The perception of objects in our visual world is influenced by not only their low-level visual features such as shape and color, but also their high-level features such as meaning and semantic relations among them. While it has been shown that low-level features in real-world scenes guide eye movements during scene inspection and search, the influence of semantic similarity among scene objects on eye movements in such situations has not been investigated. Here we study guidance of eye movements by semantic similarity among objects during real-world scene inspection and search. By selecting scenes from the LabelMe object-annotated image database and applying latent semantic analysis (LSA) to the object labels, we generated semantic saliency maps of real-world scenes based on the semantic similarity of scene objects to the currently fixated object or the search target. An ROC analysis of these maps as predictors of subjects' gaze transitions between objects during scene inspection revealed a preference for transitions to objects that were semantically similar to the currently inspected one. Furthermore, during the course of a scene search, subjects' eye movements were progressively guided toward objects that were semantically similar to the search target. These findings demonstrate substantial semantic guidance of eye movements in real-world scenes and show its importance for understanding real-world attentional control.

  17. Real-Time Occlusion Handling in Augmented Reality Based on an Object Tracking Approach

    PubMed Central

    Tian, Yuan; Guan, Tao; Wang, Cheng

    2010-01-01

    To produce a realistic augmentation in Augmented Reality, the correct relative positions of real objects and virtual objects are very important. In this paper, we propose a novel real-time occlusion handling method based on an object tracking approach. Our method is divided into three steps: selection of the occluding object, object tracking and occlusion handling. The user selects the occluding object using an interactive segmentation method. The contour of the selected object is then tracked in the subsequent frames in real-time. In the occlusion handling step, all the pixels on the tracked object are redrawn on the unprocessed augmented image to produce a new synthesized image in which the relative position between the real and virtual object is correct. The proposed method has several advantages. First, it is robust and stable, since it remains effective when the camera is moved through large changes of viewing angles and volumes or when the object and the background have similar colors. Second, it is fast, since the real object can be tracked in real-time. Last, a smoothing technique provides seamless merging between the augmented and virtual object. Several experiments are provided to validate the performance of the proposed method. PMID:22319278

  18. Earthquake forecasting studies using radon time series data in Taiwan

    NASA Astrophysics Data System (ADS)

    Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong

    2017-04-01

    For a few decades, a growing number of studies have shown the usefulness of seismogeochemical data interpreted as geochemical precursory signals for impending earthquakes, and radon is identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data. The data are obtained from a network of radon monitoring stations established along different faults of Taiwan. The continuous time series radon data for earthquake studies have been recorded, and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve the data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtration of these environmental parameters, in order to create a real-time database that supports our earthquake precursory study. In recent years, an automatically operating real-time database has been developed using R, an open-source programming language, to carry out statistical computation on the data. To integrate our data with our working procedure, we use the popular open-source web application stack AMP (Apache, MySQL, and PHP), creating a website that effectively displays and helps us manage the real-time database.

  19. Accelerating Pathology Image Data Cross-Comparison on CPU-GPU Hybrid Systems

    PubMed Central

    Wang, Kaibo; Huai, Yin; Lee, Rubao; Wang, Fusheng; Zhang, Xiaodong; Saltz, Joel H.

    2012-01-01

    As an important application of spatial databases in pathology imaging analysis, cross-comparing the spatial boundaries of a huge amount of segmented micro-anatomic objects demands extremely data- and compute-intensive operations, requiring high throughput at an affordable cost. However, the performance of spatial database systems has not been satisfactory since their implementations of spatial operations cannot fully utilize the power of modern parallel hardware. In this paper, we provide a customized software solution that exploits GPUs and multi-core CPUs to accelerate spatial cross-comparison in a cost-effective way. Our solution consists of an efficient GPU algorithm and a pipelined system framework with task migration support. Extensive experiments with real-world data sets demonstrate the effectiveness of our solution, which improves the performance of spatial cross-comparison by over 18 times compared with a parallelized spatial database approach. PMID:23355955

  20. Architectural Implications for Spatial Object Association Algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, V S; Kurc, T; Saltz, J

    2009-01-29

    Spatial object association, also referred to as cross-match of spatial datasets, is the problem of identifying and comparing objects in two or more datasets based on their positions in a common spatial coordinate system. In this work, we evaluate two crossmatch algorithms that are used for astronomical sky surveys, on the following database system architecture configurations: (1) Netezza Performance Server®, a parallel database system with active disk style processing capabilities, (2) MySQL Cluster, a high-throughput network database system, and (3) a hybrid configuration consisting of a collection of independent database system instances with data replication support. Our evaluation provides insights about how architectural characteristics of these systems affect the performance of the spatial crossmatch algorithms. We conducted our study using real use-case scenarios borrowed from a large-scale astronomy application known as the Large Synoptic Survey Telescope (LSST).

  1. World Ocean Database and the Global Temperature and Salinity Profile Program Database: Synthesis of historical and near real-time ocean profile data

    NASA Astrophysics Data System (ADS)

    Boyer, T.; Sun, L.; Locarnini, R. A.; Mishonov, A. V.; Hall, N.; Ouellet, M.

    2016-02-01

    The World Ocean Database (WOD) contains systematically quality controlled historical and recent ocean profile data (temperature, salinity, oxygen, nutrients, carbon cycle variables, biological variables) ranging from Captain Cook's second voyage (1773) to this year's Argo floats. The US National Centers for Environmental Information (NCEI) also hosts the Global Temperature and Salinity Profile Program (GTSPP) Continuously Managed Database (CMD), which provides quality controlled near-real-time ocean profile data and higher-level quality controlled temperature and salinity profiles from 1990 to present. Both databases are used extensively for ocean and climate studies. Synchronization of these two databases will allow easier access and use of comprehensive regional and global ocean profile data sets for ocean and climate studies. Synchronizing consists of two distinct phases: 1) a retrospective comparison of data in WOD and GTSPP to ensure that the most comprehensive and highest quality data set is available to researchers without the need to individually combine and contrast the two datasets, and 2) web services to allow the constantly accruing near-real-time data in the GTSPP CMD and the continuous addition and quality control of historical data in WOD to be made available to researchers together, seamlessly.

  2. Regional early flood warning system: design and implementation

    NASA Astrophysics Data System (ADS)

    Chang, L. C.; Yang, S. N.; Kuo, C. L.; Wang, Y. F.

    2017-12-01

    This study proposes a prototype of a regional early flood inundation warning system for Tainan City, Taiwan. AI technology is used to forecast multi-step-ahead regional flood inundation maps during storm events. The computing time is only a few seconds, which enables real-time regional flood inundation forecasting. A database is built to organize data and information for building the real-time forecasting models, maintaining the relations of forecasted points, and displaying forecasted results, while real-time data acquisition is another key task, since the model requires immediate access to rain gauge information to provide forecast services. All database-related programs are constructed in Microsoft SQL Server using Visual C# to extract real-time hydrological data, manage data, store the forecasted data and provide the information to the visual map-based display. The regional early flood inundation warning system uses up-to-date Web technologies, driven by the database and real-time data acquisition, to display the on-line forecasted flood inundation depths in the study area. The friendly interface includes on-line sequential display of the inundation area on Google Maps, the maximum inundation depth and its location, and a KMZ file download of the results, which can be viewed in Google Earth. The developed system can provide all the relevant information and on-line forecast results, helping city authorities to make decisions during typhoon events and take actions to mitigate losses.

  3. Managing Contention and Timing Constraints in a Real-Time Database System

    DTIC Science & Technology

    1995-01-01

    In order to realize many of these goals, StarBase is constructed on top of RT-Mach, a real-time operating system developed at Carnegie Mellon...University [11]. StarBase differs from previous RT-DBMS work [1, 2, 3] in that a) it relies on a real-time operating system which provides priority...CPU and resource scheduling provided by the underlying real-time operating system. Issues of data contention are dealt with by use of a priority

  4. Laptop Computer - Based Facial Recognition System Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. A. Cain; G. B. Singleton

    2001-03-01

    The objective of this project was to assess the performance of the leading commercial-off-the-shelf (COTS) facial recognition software package when used as a laptop application. We performed the assessment to determine the system's usefulness for enrolling facial images in a database from remote locations and conducting real-time searches against a database of previously enrolled images. The assessment involved creating a database of 40 images and conducting 2 series of tests to determine the product's ability to recognize and match subject faces under varying conditions. This report describes the test results and includes a description of the factors affecting the results. After an extensive market survey, we selected Visionics' FaceIt® software package for evaluation and a review of the Facial Recognition Vendor Test 2000 (FRVT 2000). This test was co-sponsored by the US Department of Defense (DOD) Counterdrug Technology Development Program Office, the National Institute of Justice, and the Defense Advanced Research Projects Agency (DARPA). Administered in May-June 2000, the FRVT 2000 assessed the capabilities of facial recognition systems that were currently available for purchase on the US market. Our selection of this Visionics product does not indicate that it is the "best" facial recognition software package for all uses. It was the most appropriate package based on the specific requirements of this specific application. In this assessment, the system configuration was evaluated for effectiveness in identifying individuals by searching for facial images captured from video displays against those stored in a facial image database. An additional criterion was that the system be capable of operating discreetly.
For this application, an operational facial recognition system would consist of one central computer hosting the master image database, with multiple standalone systems configured with duplicates of the master operating in remote locations. Remote users could perform real-time searches where network connectivity is not available. As images are enrolled at the remote locations, periodic database synchronization is necessary.

  5. Real-Time Classification of Hand Motions Using Ultrasound Imaging of Forearm Muscles.

    PubMed

    Akhlaghi, Nima; Baker, Clayton A; Lahlou, Mohamed; Zafar, Hozaifah; Murthy, Karthik G; Rangwala, Huzefa S; Kosecka, Jana; Joiner, Wilsaan M; Pancrazio, Joseph J; Sikdar, Siddhartha

    2016-08-01

    Surface electromyography (sEMG) has been the predominant method for sensing electrical activity for a number of applications involving muscle-computer interfaces, including myoelectric control of prostheses and rehabilitation robots. Ultrasound imaging for sensing mechanical deformation of functional muscle compartments can overcome several limitations of sEMG, including the inability to differentiate between deep contiguous muscle compartments, low signal-to-noise ratio, and lack of a robust graded signal. The objective of this study was to evaluate the feasibility of real-time graded control using a computationally efficient method to differentiate between complex hand motions based on ultrasound imaging of forearm muscles. Dynamic ultrasound images of the forearm muscles were obtained from six able-bodied volunteers and analyzed to map muscle activity based on the deformation of the contracting muscles during different hand motions. Each participant performed 15 different hand motions, including digit flexion, different grips (i.e., power grasp and pinch grip), and grips in combination with wrist pronation. During the training phase, we generated a database of activity patterns corresponding to different hand motions for each participant. During the testing phase, novel activity patterns were classified using a nearest neighbor classification algorithm based on that database. The average classification accuracy was 91%. Real-time image-based control of a virtual hand showed an average classification accuracy of 92%. Our results demonstrate the feasibility of using ultrasound imaging as a robust muscle-computer interface. Potential clinical applications include control of multiarticulated prosthetic hands, stroke rehabilitation, and fundamental investigations of motor control and biomechanics.
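
    The classification stage the abstract describes is a plain nearest-neighbor lookup against the per-participant database of activity patterns. A minimal sketch follows; the feature vectors and names are toy illustrations, not the study's actual image-derived activity maps.

```python
def classify(pattern, database):
    """Nearest-neighbor classification: return the label of the
    stored activity pattern with the smallest squared Euclidean
    distance to the novel pattern."""
    best_label, best_d = None, float("inf")
    for label, ref in database:
        d = sum((p - r) ** 2 for p, r in zip(pattern, ref))
        if d < best_d:
            best_label, best_d = label, d
    return best_label
```

During the training phase each hand motion contributes one or more (label, pattern) entries; at test time a novel ultrasound-derived pattern is assigned the label of its closest neighbor.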

  6. A High Speed Mobile Courier Data Access System That Processes Database Queries in Real-Time

    NASA Astrophysics Data System (ADS)

    Gatsheni, Barnabas Ndlovu; Mabizela, Zwelakhe

    A secure high-speed query-processing mobile courier data access (MCDA) system for a courier company has been developed. This system uses wireless networks in combination with wired networks so that an offsite worker (the courier) can update a live database at the courier centre in real time. The system is protected by a VPN based on IPsec. To our knowledge, no system to date performs the task for the courier as proposed in this paper.

  7. Column Store for GWAC: A High-cadence, High-density, Large-scale Astronomical Light Curve Pipeline and Distributed Shared-nothing Database

    NASA Astrophysics Data System (ADS)

    Wan, Meng; Wu, Chao; Wang, Jing; Qiu, Yulei; Xin, Liping; Mullender, Sjoerd; Mühleisen, Hannes; Scheers, Bart; Zhang, Ying; Nes, Niels; Kersten, Martin; Huang, Yongpan; Deng, Jinsong; Wei, Jianyan

    2016-11-01

    The ground-based wide-angle camera array (GWAC), a part of the SVOM space mission, will search for various types of optical transients by continuously imaging a field of view (FOV) of 5,000 square degrees every 15 s. Each exposure consists of 36 × 4k × 4k pixels, typically resulting in 36 × ~175,600 extracted sources. For a modern time-domain astronomy project like GWAC, which produces massive amounts of data with a high cadence, it is challenging to search for short timescale transients in both real-time and archived data, and to build long-term light curves for variable sources. Here, we develop a high-cadence, high-density light curve pipeline (HCHDLP) to process the GWAC data in real-time, and design a distributed shared-nothing database to manage the massive amount of archived data, which will be used to generate a source catalog with more than 100 billion records during 10 years of operation. First, we develop HCHDLP based on the column-store DBMS MonetDB, taking advantage of MonetDB's high performance when applied to massive data processing. To realize the real-time functionality of HCHDLP, we optimize the pipeline in its source association function, including both time and space complexity from outside the database (SQL semantics) and inside (RANGE-JOIN implementation), as well as in its strategy of building complex light curves. The optimized source association function is accelerated by three orders of magnitude. Second, we build a distributed database using a two-level time partitioning strategy via the MERGE TABLE and REMOTE TABLE technology of MonetDB. Intensive tests validate that our database architecture achieves both linear scalability in response time and concurrent access by multiple users. In summary, our studies provide guidance for a solution to GWAC for real-time data processing and management of massive data.

  8. Flexible augmented reality architecture applied to environmental management

    NASA Astrophysics Data System (ADS)

    Correia, Nuno M. R.; Romao, Teresa; Santos, Carlos; Trabuco, Adelaide; Santos, Rossana; Romero, Luis; Danado, Jose; Dias, Eduardo; Camara, Antonio; Nobre, Edmundo

    2003-05-01

    Environmental management often requires in loco observation of the area under analysis. Augmented Reality (AR) technologies allow real-time superimposition of synthetic objects on real images, providing augmented knowledge about the surrounding world. Users of an AR system can visualize the real surrounding world together with additional data generated in real time in a contextual way. The work reported in this paper was done in the scope of the ANTS (Augmented Environments) project. ANTS is an AR project that explores the development of an augmented reality technological infrastructure for environmental management. This paper presents the architecture and the most relevant modules of ANTS. The system's architecture follows the client-server model and is based on several independent, but functionally interdependent modules. It has a flexible design, which allows the transfer of some modules to and from the client side, according to the available processing capacities of the client device and the application's requirements. It combines several techniques to identify the user's position and orientation, allowing the system to adapt to the particular characteristics of each environment. The determination of the data associated with a certain location involves the use of both a 3D model of the location and the multimedia geo-referenced database.

  9. Detection and recognition of targets by using signal polarization properties

    NASA Astrophysics Data System (ADS)

    Ponomaryov, Volodymyr I.; Peralta-Fabi, Ricardo; Popov, Anatoly V.; Babakov, Mikhail F.

    1999-08-01

    The quality of radar target recognition can be enhanced by exploiting polarization signatures. A specialized X-band polarimetric radar was used for target recognition in experimental investigations. The following polarization characteristics connected to the object's geometrical properties were investigated: the amplitudes of the polarization matrix elements; an anisotropy coefficient; a depolarization coefficient; an asymmetry coefficient; the energy of the backscattered signal; and an object shape factor. A large quantity of polarimetric radar data was measured and processed to form a database of different objects under different weather conditions. The histograms of polarization signatures were approximated by a Nakagami distribution, then used for real-time target recognition. The Neyman-Pearson criterion was used for target detection, and the criterion of maximum a posteriori probability was used for the recognition problem. Some results of experimental verification of pattern recognition and detection of objects with different electrophysical and geometrical characteristics in urban clutter are presented in this paper.
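
    The combination of a Nakagami signature model with a Neyman-Pearson style decision can be sketched as a likelihood-ratio test. The parameter values and function names below are illustrative assumptions, not the paper's fitted distributions; in practice the threshold would be set from the desired false-alarm probability.

```python
import math

def nakagami_pdf(x, m, omega):
    """Nakagami PDF, used here to model a polarization-signature
    histogram: f(x) = 2 m^m x^(2m-1) / (Gamma(m) Omega^m) * exp(-m x^2 / Omega)."""
    return (2.0 * m**m * x**(2 * m - 1) / (math.gamma(m) * omega**m)
            * math.exp(-m * x * x / omega))

def np_detect(x, m_tgt, om_tgt, m_clt, om_clt, threshold):
    """Neyman-Pearson style test: declare 'target' when the
    target/clutter likelihood ratio exceeds the threshold."""
    lr = nakagami_pdf(x, m_tgt, om_tgt) / nakagami_pdf(x, m_clt, om_clt)
    return lr > threshold
```

With m = 1 the Nakagami PDF reduces to the Rayleigh PDF, a convenient sanity check on the implementation.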

  10. Deterministic object tracking using Gaussian ringlet and directional edge features

    NASA Astrophysics Data System (ADS)

    Krieger, Evan W.; Sidike, Paheding; Aspiras, Theus; Asari, Vijayan K.

    2017-10-01

    Challenges currently existing for intensity-based histogram feature tracking methods in wide area motion imagery (WAMI) data include object structural information distortions, background variations, and object scale change. These issues are caused by different pavement or ground types and by changes in the sensor or altitude. All of these challenges need to be overcome in order to have a robust object tracker while attaining a computation time appropriate for real-time processing. To achieve this, we present a novel method, the Directional Ringlet Intensity Feature Transform (DRIFT), which employs Kirsch kernel filtering for edge features and a ringlet feature mapping for rotational invariance. The method also includes an automatic scale-change component to obtain accurate object boundaries, and improvements for lowering computation times. We evaluated the DRIFT algorithm on two challenging WAMI datasets, namely Columbus Large Image Format (CLIF) and Large Area Image Recorder (LAIR), to assess its robustness and efficiency. Additional evaluations on general tracking video sequences are performed using the Visual Tracker Benchmark and Visual Object Tracking 2014 databases to demonstrate the algorithm's ability to handle additional challenges in long, complex sequences including scale change. Experimental results show that the proposed approach yields competitive results compared to state-of-the-art object tracking methods on the testing datasets.
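    Kirsch kernel filtering, the edge-feature stage named above, can be sketched per pixel as the maximum response over eight directional 3x3 masks, each weighting three consecutive compass neighbors with +5 and the remaining five with -3. This is the standard Kirsch formulation; DRIFT's exact normalization may differ:

```python
def kirsch_response(img, y, x):
    """Maximum response over the eight 3x3 Kirsch directional masks at (y, x)."""
    # the eight neighbors of (y, x) in clockwise order, starting at top-left
    ring = [img[y-1][x-1], img[y-1][x], img[y-1][x+1], img[y][x+1],
            img[y+1][x+1], img[y+1][x], img[y+1][x-1], img[y][x-1]]
    best = None
    for k in range(8):
        # +5 on three consecutive ring pixels (one compass direction), -3 on the rest
        resp = sum((5 if (i - k) % 8 < 3 else -3) * v for i, v in enumerate(ring))
        best = resp if best is None else max(best, resp)
    return best
```

A uniform patch gives a zero response (5*3 - 3*5 = 0 per unit of intensity), while a vertical step edge produces a strong positive maximum in the mask aligned with it.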

  11. Randomized Approaches for Nearest Neighbor Search in Metric Space When Computing the Pairwise Distance Is Extremely Expensive

    NASA Astrophysics Data System (ADS)

    Wang, Lusheng; Yang, Yong; Lin, Guohui

    Finding the closest object for a query in a database is a classical problem in computer science. For some modern biological applications, computing the similarity between two objects might be very time consuming. For example, it takes a long time to compute the edit distance between two whole chromosomes or the alignment cost of two 3D protein structures. In this paper, we study the nearest neighbor search problem in metric space, where the pairwise distance between two objects in the database is known and we want to minimize the number of distances computed on-line between the query and objects in the database in order to find the closest object. We have designed two randomized approaches for indexing metric space databases, where objects are purely described by their distances to each other. Analysis and experiments show that our approaches only need to compute distances to O(log n) objects in order to find the closest object, where n is the total number of objects in the database.
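    The general idea of saving on-line distance computations in a metric space can be sketched with randomized pivots and triangle-inequality pruning: since |d(q,p) - d(p,o)| <= d(q,o), any object whose lower bound already exceeds the current best cannot win and its expensive distance is never computed. This is a generic sketch of the pruning principle, not the paper's specific pair of indexing schemes:

```python
import random

def nearest_neighbor(query, objects, dist, pivots=3, seed=0):
    """Pivot-based NN search that minimizes on-line distance computations.
    In the paper's setting the pivot-object distances dist(p, o) are part of
    the precomputed pairwise-distance index; only query distances cost time."""
    rng = random.Random(seed)
    piv = rng.sample(objects, min(pivots, len(objects)))
    dq = {id(p): dist(query, p) for p in piv}     # the only mandatory query distances
    computed = len(piv)
    best, best_d = None, float("inf")
    for o in objects:
        # triangle-inequality lower bound on d(query, o) from every pivot
        lb = max(abs(dq[id(p)] - dist(p, o)) for p in piv)
        if lb >= best_d:
            continue                               # pruned: cannot beat current best
        d = dist(query, o)
        computed += 1
        if d < best_d:
            best, best_d = o, d
    return best, best_d, computed
```

Because the bound is valid for any metric, pruning never discards the true nearest neighbor; it only skips distance evaluations.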

  12. A comparison of moving object detection methods for real-time moving object detection

    NASA Astrophysics Data System (ADS)

    Roshan, Aditya; Zhang, Yun

    2014-06-01

    Moving object detection has a wide variety of applications, from traffic monitoring, site monitoring, automatic theft identification and face detection to military surveillance. Many methods have been developed across the globe for moving object detection, but it is very difficult to find one which can work globally in all situations and with different types of videos. The purpose of this paper is to evaluate existing moving object detection methods which can be implemented in software on a desktop or laptop for real-time object detection. There are several moving object detection methods noted in the literature, but few of them are suitable for real-time moving object detection. Most of the methods which provide for real-time detection are further limited by the number of objects and the scene complexity. This paper evaluates the four most commonly used moving object detection methods: the background subtraction technique, the Gaussian mixture model, and wavelet-based and optical-flow-based methods. The work is based on evaluation of these four moving object detection methods using two different sets of cameras and two different scenes. The moving object detection methods have been implemented using MatLab and the results are compared based on completeness of detected objects, noise, light-change sensitivity, processing time, etc. After comparison, it is observed that the optical-flow-based method took the least processing time and successfully detected the boundaries of moving objects, which implies that it can be implemented for real-time moving object detection.
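    The simplest of the compared families, background subtraction, can be sketched with an exponential running-average background model; the Gaussian mixture method evaluated in the paper generalizes this idea to a per-pixel mixture of Gaussians. The update rate `alpha` and threshold here are illustrative defaults:

```python
def update_background(bg, frame, alpha=0.05):
    """Exponential running average: the background slowly tracks the scene,
    so transient moving objects barely perturb it."""
    return [[(1 - alpha) * b + alpha * f for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

def foreground_mask(bg, frame, thresh=30):
    """Pixels deviating from the background model by more than thresh are foreground."""
    return [[abs(f - b) > thresh for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]
```

In a real pipeline the mask would then be cleaned with morphological filtering before extracting object boundaries.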

  13. Designing a Multi-Petabyte Database for LSST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becla, Jacek; Hanushevsky, Andrew; Nikolaev, Sergei

    2007-01-10

    The 3.2 giga-pixel LSST camera will produce approximately half a petabyte of archive images every month. These data need to be reduced in under a minute to produce real-time transient alerts, and then added to the cumulative catalog for further analysis. The catalog is expected to grow by about three hundred terabytes per year. The data volume, the real-time transient alerting requirements of the LSST, and its spatio-temporal aspects require innovative techniques to build an efficient data access system at reasonable cost. As currently envisioned, the system will rely on a database for catalogs and metadata. Several database systems are being evaluated to understand how they perform at these data rates, data volumes, and access patterns. This paper describes the LSST requirements, the challenges they impose, the data access philosophy, results to date from evaluating available database technologies against LSST requirements, and the proposed database architecture to meet the data challenges.

  14. The PARIGA server for real time filtering and analysis of reciprocal BLAST results.

    PubMed

    Orsini, Massimiliano; Carcangiu, Simone; Cuccuru, Gianmauro; Uva, Paolo; Tramontano, Anna

    2013-01-01

    BLAST-based similarity searches are commonly used in several applications involving both nucleotide and protein sequences. These applications span from simple tasks such as mapping sequences over a database to more complex procedures such as clustering or annotation processes. When the amount of analysed data increases, manual inspection of BLAST results becomes a tedious procedure. Tools for parsing or filtering BLAST results for different purposes are then required. We describe here PARIGA (http://resources.bioinformatica.crs4.it/pariga/), a server that enables users to perform all-against-all BLAST searches on two sets of sequences selected by the user. Moreover, since it stores the two BLAST outputs in a python-serialized-objects database, results can be filtered according to several parameters in real-time fashion, without re-running the process and avoiding additional programming efforts. Results can be interrogated by the user using logical operations, for example to retrieve cases where two queries match the same targets, or where sequences from the two datasets are reciprocal best hits, or where a query matches a target in multiple regions. The PARIGA web server is designed to be a helpful tool for managing the results of sequence similarity searches. The design and implementation of the server renders all operations very fast and easy to use.
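    The reciprocal-best-hit filter mentioned above can be sketched on tabular hit lists. The three-column (query, target, score) rows are a simplification; real BLAST output carries e-values and many more fields:

```python
def best_hits(hits):
    """Reduce (query, target, score) rows to each query's single best target."""
    best = {}
    for q, t, s in hits:
        if q not in best or s > best[q][1]:
            best[q] = (t, s)
    return {q: t for q, (t, s) in best.items()}

def reciprocal_best_hits(a_vs_b, b_vs_a):
    """Pairs (a, b) where a's best hit in set B is b AND b's best hit in set A
    is a -- the symmetric filter applied to the two stored BLAST outputs."""
    ab, ba = best_hits(a_vs_b), best_hits(b_vs_a)
    return sorted((a, b) for a, b in ab.items() if ba.get(b) == a)
```

A pair survives only if the relationship holds in both search directions, which is why both all-against-all outputs must be stored.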

  15. The Toxic Exposure Surveillance System (TESS): Risk assessment and real-time toxicovigilance across United States poison centers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, William A.; Litovitz, Toby L.; Belson, Martin G.

    2005-09-01

    The Toxic Exposure Surveillance System (TESS) is a uniform data set of US poison center cases. Categories of information include the patient, the caller, the exposure, the substance(s), clinical toxicity, treatment, and medical outcome. The TESS database was initiated in 1985, and provides a baseline of more than 36.2 million cases through 2003. The database has been utilized for a number of safety evaluations. Consideration of the strengths and limitations of TESS data must be incorporated into data interpretation. Real-time toxicovigilance was initiated in 2003 with continuous uploading of new cases from all poison centers to a central database. Real-time toxicovigilance utilizing general and specific approaches is systematically run against TESS, further increasing the potential utility of poison center experiences as a means of early identification of potential public health threats.

  16. Pseudonymisation of radiology data for research purposes

    NASA Astrophysics Data System (ADS)

    Noumeir, Rita; Lemay, Alain; Lina, Jean-Marc

    2005-04-01

    Medical image processing methods and algorithms, developed by researchers, need to be validated and tested. Test data should ideally be real clinical data, especially when that clinical data is varied and exists in large volume. Nowadays, clinical data is accessible electronically and has important value for researchers. However, the usage of clinical data for research purposes must respect data confidentiality, the patient's right to privacy, and patient consent. In fact, clinical data is nominative, given that it contains information about the patient such as name, age, and identification number. Evidently, clinical data should be de-identified before being exported to research databases. However, the same patient is usually followed over a long period of time. The disease progression and the diagnostic evolution represent extremely valuable information for researchers as well. Our objective is to build a research database from de-identified clinical data while enabling the database to be easily incremented by exporting new pseudonymous data acquired over a long period of time. Pseudonymisation is data de-identification such that data belonging to the same individual in the clinical environment bear the same relation to each other in the de-identified research version. In this paper, we propose a software architecture that enables the implementation of a research database that can be incremented over time. We also evaluate its security and discuss its security pitfalls.
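    One standard way to meet the stated requirement, that the same patient maps to the same research identifier across successive exports while the clinical identifier stays unrecoverable, is a keyed hash. This is a generic sketch of that idea, not necessarily the scheme the paper implements; the key would have to be held only inside the clinical environment:

```python
import hashlib
import hmac

def pseudonym(patient_id, secret_key):
    """Keyed-hash pseudonym: deterministic per patient (so new exports line up
    with earlier ones), but not invertible without the secret key."""
    return hmac.new(secret_key, patient_id.encode(), hashlib.sha256).hexdigest()[:16]
```

Re-exporting data for the same patient months later yields the same pseudonym, which is exactly what lets the research database grow incrementally.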

  17. Providing accurate near real-time fire alerts for Protected Areas through NASA FIRMS: Opportunities and Challenges

    NASA Astrophysics Data System (ADS)

    Ilavajhala, S.; Davies, D.; Schmaltz, J. E.; Wong, M.; Murphy, K. J.

    2013-12-01

    The NASA Fire Information for Resource Management System (FIRMS) is at the forefront of providing global near real-time (NRT) MODIS thermal anomaly / hotspot location data to end-users. FIRMS serves the data via an interactive Web GIS named Web Fire Mapper, NRT active fire data downloads, archive data downloads for MODIS hotspots dating back to 1999, and a hotspot email alert system. The FIRMS Email Alerts system has been successfully alerting users of fires in their area of interest in near real-time and/or via daily and weekly email summaries, with an option to receive MODIS hotspot data as a text file (CSV) attachment. Currently, there are more than 7000 email alert subscriptions from more than 100 countries. Specifically, the email alerts system is designed to generate and send an email alert for any region or area on the globe, with a special focus on providing alerts for protected areas worldwide. For many protected areas, email alerts are particularly useful for early fire detection, monitoring ongoing fires, and allocating resources to protect wildlife and natural resources of particular value. For protected areas, FIRMS uses the World Database on Protected Areas (WDPA) supplied by the United Nations Environment Program - World Conservation Monitoring Centre (UNEP-WCMC). Maintaining the most up-to-date, accurate boundary geometry for the protected areas used by the email alerts is a challenge, as the WDPA is continuously updated due to changing boundaries and the merging or delisting of certain protected areas. Because of this dynamic nature, the FIRMS protected areas database is frequently out-of-date with respect to the most current version of the WDPA database. To maintain the most up-to-date boundary information for protected areas and to be in compliance with the WDPA terms and conditions, FIRMS needs to constantly update its database of protected areas.
Currently, FIRMS strives to keep its database up to date by downloading the most recent WDPA database at regular intervals, processing it, and ingesting it into the FIRMS spatial database. However, due to the large size of the database, the process of downloading, processing and ingesting it is quite time consuming. The FIRMS team is currently working on developing a method to update the protected areas database via the web at regular intervals or on demand. Using such a solution, FIRMS will be able to access the most up-to-date extents of any protected area and the corresponding spatial geometries in real time, keeping users' protected area boundaries in sync with those of the most recent WDPA database and thus serving more accurate email alerts. Furthermore, any client accessing the WDPA protected areas database could potentially use this real-time access solution. This talk primarily focuses on the challenges FIRMS faces in sending accurate email alerts for protected areas, along with the solution the FIRMS team is developing. It also introduces the FIRMS fire information system and its components, with a special emphasis on the FIRMS email alerts system.
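    Matching hotspot detections against a subscriber's protected area can be sketched as a cheap bounding-box prefilter; a production system like FIRMS would then test the survivors against the full WDPA polygon geometry, and the tuple layout below is purely illustrative:

```python
def hotspots_in_area(hotspots, bbox):
    """Keep (lat, lon) hotspot detections that fall inside a protected area's
    bounding box: (lat_min, lat_max, lon_min, lon_max)."""
    lat_min, lat_max, lon_min, lon_max = bbox
    return [(lat, lon) for lat, lon in hotspots
            if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max]
```

If the matched list is non-empty, the alert system would format it as the CSV attachment and email it to subscribers of that area.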

  18. Digital Image Support in the ROADNet Real-time Monitoring Platform

    NASA Astrophysics Data System (ADS)

    Lindquist, K. G.; Hansen, T. S.; Newman, R. L.; Vernon, F. L.; Nayak, A.; Foley, S.; Fricke, T.; Orcutt, J.; Rajasekar, A.

    2004-12-01

    The ROADNet real-time monitoring infrastructure has allowed researchers to integrate geophysical monitoring data from a wide variety of signal domains. Antelope-based data transport, relational-database buffering and archiving, backup/replication/archiving through the Storage Resource Broker, and a variety of web-based distribution tools create a powerful monitoring platform. In this work we discuss our use of the ROADNet system for the collection and processing of digital image data. Remote cameras have been deployed at approximately 32 locations as of September 2004, including the SDSU Santa Margarita Ecological Reserve, the Imperial Beach pier, and the Pinon Flats geophysical observatory. Fire monitoring imagery has been obtained through a connection to the HPWREN project. Near-real-time images obtained from the R/V Roger Revelle include records of seafloor operations by the JASON submersible, as part of a maintenance mission for the H2O underwater seismic observatory. We discuss acquisition mechanisms and the packet architecture for image transport via Antelope orbservers, including multi-packet support for arbitrarily large images. Relational database storage supports archiving of timestamped images, image-processing operations, grouping of related images and cameras, support for motion-detect triggers, thumbnail images, pre-computed video frames, support for time-lapse movie generation and storage of time-lapse movies. Available ROADNet monitoring tools include both orbserver-based display of incoming real-time images and web-accessible searching and distribution of images and movies driven by the relational database (http://mercali.ucsd.edu/rtapps/rtimbank.php). An extension to the Kepler Scientific Workflow System also allows real-time image display via the Ptolemy project. Custom time-lapse movies may be made from the ROADNet web pages.

  19. Recent achievements in real-time computational seismology in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, S.; Liang, W.; Huang, B.

    2012-12-01

    Real-time computational seismology is now achievable, but it requires tight coupling between seismic databases and high-performance computing. We have developed a real-time moment tensor monitoring system (RMT) using continuous BATS records and the moment tensor inversion (CMT) technique. A real-time online earthquake simulation service (ROS) is also open to researchers and the public for earthquake science education. Combining RMT with ROS, an earthquake report based on computational seismology can be provided within 5 minutes of an earthquake occurring (RMT obtains point-source information in under 120 s; ROS completes a 3D simulation in under 3 minutes). All of these computational results are now posted on the internet in real time. For more information, visit the real-time computational seismology earthquake report webpage (RCS).

  20. Volcanic observation data and simulation database at NIED, Japan (Invited)

    NASA Astrophysics Data System (ADS)

    Fujita, E.; Ueda, H.; Kozono, T.

    2009-12-01

    NIED (National Research Institute for Earth Science and Disaster Prevention) has a project to develop two volcanic database systems: (1) a volcanic observation database and (2) a volcanic simulation database. The volcanic observation database is the archive center for data obtained by the geophysical observation networks at the Mt. Fuji, Miyake, Izu-Oshima, Iwo-jima and Nasu volcanoes, central Japan. The data consist of seismic records (both high-sensitivity and broadband), ground deformation measurements (tiltmeter, GPS) and those from other sensors (e.g., rain gauge, gravimeter, magnetometer, pressure gauge). These data are originally stored in “WIN format,” the Japanese standard format also used by Hi-net (High sensitivity seismic network Japan, http://www.hinet.bosai.go.jp/). NIED has joined WOVOdat and we have prepared to upload our data via an XML format. Our concept of the XML format is: 1) a common format for intermediate files uploaded into the WOVOdat DB; 2) a format for data files downloaded from the WOVOdat DB; 3) a format for data exchange between observatories without the WOVOdat DB; 4) a common format for data files in each observatory; 5) a format for data communication between systems and software; and 6) a format for software. NIED is now preparing (2), the volcanic simulation database. The objective of this project is to support the development of a “real-time” hazard map, i.e., a system that can evaluate volcanic hazard in an emergency, taking up-to-date conditions into account. Our system will include lava flow simulation (LavaSIM) and pyroclastic flow simulation (grvcrt). The database will keep many precomputed simulation cases, so that the most probable case can be picked as a first evaluation once an eruption starts. The final goal of both databases is to realize volcanic eruption prediction and forecasting in real time through the combination of monitoring data and numerical simulations.

  1. Evaluation of web-based annotation of ophthalmic images for multicentric clinical trials.

    PubMed

    Chalam, K V; Jain, P; Shah, V A; Shah, Gaurav Y

    2006-06-01

    An Internet browser-based annotation system can be used to identify and describe features in digitized retinal images, in multicentric clinical trials, in real time. In this web-based annotation system, the user employs a mouse to draw and create annotations on a transparent layer that encapsulates the observations and interpretations of a specific image. Multiple annotation layers may be overlaid on a single image. These layers may correspond to annotations by different users on the same image or annotations of a temporal sequence of images of a disease process over a period of time. In addition, geometrical properties of annotated figures may be computed and measured. The annotations are stored in a central repository database on a server, from which they can be retrieved by multiple users in real time. This system facilitates objective evaluation of digital images and comparison of double-blind readings of digital photographs, with an identifiable audit trail. Annotation of ophthalmic images allowed clinically feasible and useful interpretation to track properties of an area of fundus pathology. This provided an objective method to monitor properties of pathologies over time, an essential component of multicentric clinical trials. The annotation system also allowed users to view stereoscopic image pairs. This web-based annotation system is useful and valuable in monitoring patient care, in multicentric clinical trials, telemedicine, teaching and routine clinical settings.

  2. Automatic Reacquisition of Satellite Positions by Detecting Their Expected Streaks in Astronomical Images

    NASA Astrophysics Data System (ADS)

    Levesque, M.

    Artificial satellites, and particularly space junk, drift continuously from their known orbits. In the surveillance-of-space context, they must be observed frequently to ensure that the corresponding orbital parameter database entries are up-to-date. Autonomous ground-based optical systems are periodically tasked to observe these objects, calculate the difference between their predicted and real positions and update object orbital parameters. The real satellite positions are provided by the detection of the satellite streaks in the astronomical images specifically acquired for this purpose. This paper presents the image processing techniques used to detect and extract the satellite positions. The methodology includes several processing steps including: image background estimation and removal, star detection and removal, an iterative matched filter for streak detection, and finally false alarm rejection algorithms. This detection methodology is able to detect very faint objects. Simulated data were used to evaluate the methodology's performance and determine the sensitivity limits where the algorithm can perform detection without false alarm, which is essential to avoid corruption of the orbital parameter database.
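    The heart of the matched-filter step described above can be sketched as sampling the image along the predicted streak: averaging N pixels along the hypothesized line raises the signal-to-noise ratio of a faint streak by roughly sqrt(N) relative to a single pixel. The sampling scheme below is a simplified assumption (nearest-pixel rounding, background and stars already removed):

```python
def streak_response(img, x0, y0, dx, dy, length):
    """Mean intensity sampled along the predicted line (x0, y0) + t*(dx, dy).
    A streak lying along the path yields a response near the streak brightness;
    a wrong hypothesis averages down toward the background level."""
    total = 0.0
    for t in range(length):
        x = int(round(x0 + t * dx))
        y = int(round(y0 + t * dy))
        total += img[y][x]
    return total / length
```

Detection then reduces to comparing this response against the background mean plus a few noise standard deviations, and the iterative step refines (x0, y0, dx, dy) around the best response.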

  3. Remote consultation and diagnosis in medical imaging using a global PACS backbone network

    NASA Astrophysics Data System (ADS)

    Martinez, Ralph; Sutaria, Bijal N.; Kim, Jinman; Nam, Jiseung

    1993-10-01

    A Global PACS is a national network which interconnects several PACS networks at medical and hospital complexes using a national backbone network. A Global PACS environment enables new and beneficial operations between radiologists and physicians when they are located in different geographical locations. One operation allows the radiologist to view the same image folder at both the Local and Remote sites so that a diagnosis can be performed. The paper describes the user interface, database management, and network communication software which has been developed in the Computer Engineering Research Laboratory and Radiology Research Laboratory. Specifically, a design for a file management system in a distributed environment is presented. In the remote consultation and diagnosis operation, a set of images is requested from the database archive system and sent to the Local and Remote workstation sites on the Global PACS network. Viewing the same images, the radiologists use pointing overlay commands, or frames, to point out features on the images. Each workstation transfers these frames to the other workstation so that an interactive session for diagnosis takes place. In this phase, we use fixed frames and variable-size frames to outline an object. The data packets for these frames traverse the national backbone in real time. We accomplish this feature by using TCP/IP protocol sockets for communications. The remote consultation and diagnosis operation has been tested in real time between the University Medical Center and the Bowman Gray School of Medicine at Wake Forest University, over the Internet. In this paper, we show the feasibility of the operation in a Global PACS environment. Future improvements to the system will include real-time voice and interactive compressed video scenarios.

  4. Surviving the Glut: The Management of Event Streams in Cyberphysical Systems

    NASA Astrophysics Data System (ADS)

    Buchmann, Alejandro

    Alejandro Buchmann is Professor in the Department of Computer Science, Technische Universität Darmstadt, where he heads the Databases and Distributed Systems Group. He received his MS (1977) and PhD (1980) from the University of Texas at Austin. He was an Assistant/Associate Professor at the Institute for Applied Mathematics and Systems IIMAS/UNAM in Mexico, doing research on databases for CAD, geographic information systems, and object-oriented databases. At Computer Corporation of America (later Xerox Advanced Information Systems) in Cambridge, Mass., he worked in the areas of active databases and real-time databases, and at GTE Laboratories, Waltham, in the areas of distributed object systems and the integration of heterogeneous legacy systems. In 1991 he returned to academia and joined T.U. Darmstadt. His current research interests are at the intersection of middleware, databases, event-based distributed systems, ubiquitous computing, and very large distributed systems (P2P, WSN). Much of the current research is concerned with guaranteeing quality-of-service and reliability properties in these systems, for example scalability, performance, transactional behaviour, consistency, and end-to-end security. Many research projects imply collaboration with industry and cover a broad spectrum of application domains. Further information can be found at http://www.dvs.tu-darmstadt.de

  5. View generated database

    NASA Technical Reports Server (NTRS)

    Downward, James G.

    1992-01-01

    This document represents the final report for the View Generated Database (VGD) project, NAS7-1066. It documents the work done on the project up to the point at which all project work was terminated due to lack of project funds. The VGD was to provide the capability to accurately represent any real-world object or scene as a computer model. Such models include both an accurate spatial/geometric representation of surfaces of the object or scene, as well as any surface detail present on the object. Applications of such models are numerous, including acquisition and maintenance of work models for tele-autonomous systems, generation of accurate 3-D geometric/photometric models for various 3-D vision systems, and graphical models for realistic rendering of 3-D scenes via computer graphics.

  6. Application based on ArcObject inquiry and Google maps demonstration to real estate database

    NASA Astrophysics Data System (ADS)

    Hwang, JinTsong

    2007-06-01

    The real estate industry in Taiwan has been flourishing in recent years. Acquiring varied and abundant information about real estate for sale is a shared goal of consumers and brokerages. Therefore, before visiting a property, it is important to get all the pertinent information possible. Not only is this beneficial for real estate agents, who can present sellers with the most complete information and thereby solidify the buyer's interest, but it may also save time and manpower costs. Most brokerage sites use the Internet as a publicity medium; however, their content is limited to the specific property itself, and their query functions mostly support only search by condition. This paper proposes a query interface on a website which provides zone queries via spatial analysis for non-GIS users, developed as a user-friendly interface with ArcObject in VB6, in addition to query by condition. The inquiry results are shown on a web page with embedded Google Maps and UrMap API functions. In addition, the inquiry results are presented in multimedia form, including hyperlinks to Google Earth showing the property's surroundings, virtual reality scenes of the house, panoramas of the building interior, and so on. The website therefore provides an additional spatial solution for querying and presenting abundant real estate information in both two-dimensional and three-dimensional views.

  7. System for Performing Single Query Searches of Heterogeneous and Dispersed Databases

    NASA Technical Reports Server (NTRS)

    Maluf, David A. (Inventor); Okimura, Takeshi (Inventor); Gurram, Mohana M. (Inventor); Tran, Vu Hoang (Inventor); Knight, Christopher D. (Inventor); Trinh, Anh Ngoc (Inventor)

    2017-01-01

    The present invention is a distributed computer system of heterogeneous databases joined in an information grid and configured with an Application Programming Interface hardware which includes a search engine component for performing user-structured queries on multiple heterogeneous databases in real time. This invention reduces overhead associated with the impedance mismatch that commonly occurs in heterogeneous database queries.
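    The single-query pattern behind such a system can be sketched as a fan-out over per-database adapters whose results are merged into one list. The adapter interface and result fields below are purely illustrative, not the patented system's API:

```python
def federated_query(term, sources):
    """Fan one user query out to several heterogeneous back-ends and merge the
    results, tagging each row with the source it came from (hypothetical
    adapter interface: each source is a callable term -> list of dicts)."""
    results = []
    for name, search in sources.items():
        for row in search(term):
            results.append({"source": name, **row})
    return results
```

Each adapter hides its back-end's native query dialect, which is one common way to absorb the impedance mismatch between heterogeneous databases.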

  8. 3D-SURFER 2.0: web platform for real-time search and characterization of protein surfaces.

    PubMed

    Xiong, Yi; Esquivel-Rodriguez, Juan; Sael, Lee; Kihara, Daisuke

    2014-01-01

    The increasing number of uncharacterized protein structures necessitates the development of computational approaches for function annotation using the protein tertiary structures. Protein structure database search is the basis of any structure-based functional elucidation of proteins. 3D-SURFER is a web platform for real-time protein surface comparison of a given protein structure against the entire PDB using 3D Zernike descriptors. It can smoothly navigate the protein structure space in real-time from one query structure to another. A major new feature of Release 2.0 is the ability to compare the protein surface of a single chain, a single domain, or a single complex against databases of protein chains, domains, complexes, or a combination of all three in the latest PDB. Additionally, two types of protein structures can now be compared: all-atom-surface and backbone-atom-surface. The server can also accept a batch job for a large number of database searches. Pockets in protein surfaces can be identified by VisGrid and LIGSITEcsc. The server is available at http://kiharalab.org/3d-surfer/.
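    What makes database-wide surface comparison fast enough for real time is that each surface is reduced offline to a short 3D Zernike descriptor vector, so a search is just vector distances against precomputed entries. A minimal sketch of the comparison step (the descriptor extraction itself is the expensive offline part and is omitted):

```python
import math

def descriptor_distance(d1, d2):
    """Euclidean distance between two fixed-length descriptor vectors; ranking
    database entries by this distance is the on-line part of the search."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(d1, d2)))
```

Ranking a whole database then amounts to one such distance per entry followed by a sort, which runs interactively even for large collections.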

  9. Distributed On-line Monitoring System Based on Modem and Public Phone Net

    NASA Astrophysics Data System (ADS)

    Chen, Dandan; Zhang, Qiushi; Li, Guiru

    In order to solve the monitoring problem of urban sewage disposal, a distributed on-line monitoring system is proposed. By introducing modem-based dial-up communication technology, the serial communication program solves the information transmission problem between the master station and the slave stations. The serial communication program is realized with the MSComm control of C++ Builder 6.0. The software includes a real-time data operation part and a history data handling part, using Microsoft SQL Server 2000 for the database and C++ Builder 6.0 for the user interface. The monitoring center displays a user interface with alarm information for over-standard data and real-time curves. Practical application shows that the system successfully accomplishes real-time data acquisition from the data-gathering stations and stores the data in the terminal database.

  10. Hardware accelerator design for tracking in smart camera

    NASA Astrophysics Data System (ADS)

    Singh, Sanjay; Dunga, Srinivasa Murali; Saini, Ravi; Mandal, A. S.; Shekhar, Chandra; Vohra, Anil

    2011-10-01

    Smart cameras are important components in video analysis. For video analysis, smart cameras need to detect interesting moving objects, track such objects from frame to frame, and perform analysis of object tracks in real time. Therefore, the use of real-time tracking is prominent in smart cameras. A software implementation of a tracking algorithm on a general-purpose processor (like a PowerPC) achieves only a low frame rate, far from real-time requirements. This paper presents a SIMD-based hardware accelerator designed for real-time tracking of objects in a scene. The system is designed and simulated using VHDL and implemented on a Xilinx XUP Virtex-II Pro FPGA. The resulting frame rate is 30 frames per second for 250x200-resolution grayscale video.

  11. Rotation And Scale Invariant Object Recognition Using A Distributed Associative Memory

    NASA Astrophysics Data System (ADS)

    Wechsler, Harry; Zimmerman, George Lee

    1988-04-01

    This paper describes an approach to two-dimensional object recognition. Complex-log conformal mapping is combined with a distributed associative memory to create a system that recognizes objects regardless of changes in rotation or scale. Recalled information from the memorized database is used to classify an object, reconstruct the memorized version of the object, and estimate the magnitude of changes in scale or rotation. The system's response is resistant to moderate amounts of noise and occlusion. Several experiments, using real gray-scale images, are presented to show the feasibility of our approach.
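
    A minimal sketch of the complex-log (log-polar) mapping at the heart of this approach: rotating the input becomes a pure shift in the angular coordinate, and scaling becomes a pure shift in the log-radius coordinate, which is what lets the associative memory's recall tolerate both. Plain Python on a single point; the image-sampling details are omitted.

```python
import math


def complex_log_map(x, y):
    """Map an image point (x, y) to log-polar coordinates (log r, theta)."""
    r = math.hypot(x, y)
    return math.log(r), math.atan2(y, x)


log_r, theta = complex_log_map(3.0, 4.0)

# Rotate the point about the origin by 30 degrees: only theta shifts.
angle = math.pi / 6
xr = 3.0 * math.cos(angle) - 4.0 * math.sin(angle)
yr = 3.0 * math.sin(angle) + 4.0 * math.cos(angle)
log_r2, theta2 = complex_log_map(xr, yr)
```

    Doubling the point's scale instead would leave theta unchanged and shift log r by log 2.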

  12. Connecting real-time data to algorithms and databases: EarthCube's Cloud-Hosted Real-time Data Services for the Geosciences (CHORDS)

    NASA Astrophysics Data System (ADS)

    Daniels, M. D.; Graves, S. J.; Kerkez, B.; Chandrasekar, V.; Vernon, F.; Martin, C. L.; Maskey, M.; Keiser, K.; Dye, M. J.

    2015-12-01

    The Cloud-Hosted Real-time Data Services for the Geosciences (CHORDS) project was funded under the National Science Foundation's EarthCube initiative. CHORDS addresses the ever-increasing importance of real-time scientific data in the geosciences, particularly in mission-critical scenarios where informed decisions must be made rapidly. Access to constant streams of real-time data also allows many new transient phenomena in space-time to be observed; however, many of these streaming data are either completely inaccessible or available only through proprietary in-house tools or displays. Small research teams do not have the resources to develop tools for the broad dissemination of their unique real-time data and require an easy-to-use, scalable, cloud-based solution to facilitate this access. CHORDS will make these diverse streams of real-time data available to the broader geosciences community. This talk will highlight recently developed CHORDS portal tools and processing systems which address some of the gaps in handling real-time data, particularly in the provisioning of data from the "long-tail" scientific community through a simple interface that is deployed in the cloud, is scalable, and can be customized by research teams. A running portal, with operational data feeds from across the nation, will be presented. The processing within the CHORDS system will expose these real-time streams via standard services from the Open Geospatial Consortium (OGC) in a way that is simple and transparent to the data provider, while maximizing the usage of these investments. The ingestion of high-velocity, high-volume, and diverse data has allowed the project to explore a NoSQL database implementation. Broad use of the CHORDS framework by geoscientists will help to facilitate adaptive experimentation, model assimilation, and real-time hypothesis testing.

  13. Compressed multi-block local binary pattern for object tracking

    NASA Astrophysics Data System (ADS)

    Li, Tianwen; Gao, Yun; Zhao, Lei; Zhou, Hao

    2018-04-01

    Both robustness and real-time performance are very important for object tracking in real environments. Trackers based on deep learning have difficulty satisfying the real-time requirements of tracking. Compressive sensing provides technical support for real-time tracking. In this paper, an object is tracked via a multi-block local binary pattern feature. The feature vector is extracted from the multi-block local binary pattern and compressed via a sparse random Gaussian matrix used as the measurement matrix. Experiments showed that the proposed tracker runs in real time and outperforms existing compressive trackers based on Haar-like features on many challenging video sequences in terms of accuracy and robustness.
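
    The compression step can be illustrated as a random projection: a Gaussian measurement matrix maps a high-dimensional feature vector to a short compressed vector. This sketch uses a dense matrix for simplicity (the paper uses a sparse variant) and a stand-in histogram rather than a real multi-block LBP feature.

```python
import random


def compress(feature, m, seed=0):
    """Project an n-dimensional feature vector to m dimensions with a
    random Gaussian measurement matrix (dense here, sparse in the paper)."""
    rng = random.Random(seed)
    n = len(feature)
    R = [[rng.gauss(0.0, 1.0) for _ in range(n)] for _ in range(m)]
    return [sum(r_ij * f_j for r_ij, f_j in zip(row, feature)) for row in R]


feature = [float(i % 7) for i in range(256)]  # stand-in for an LBP histogram
compressed = compress(feature, m=16)
```

    Fixing the seed makes the measurement matrix reproducible across frames, which is essential so that compressed features from different frames remain comparable.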

  14. Performance related issues in distributed database systems

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1991-01-01

    The key elements of research performed during the year-long effort of this project are: investigating the effects of heterogeneity in distributed real-time systems; studying the requirements of TRAC towards building a heterogeneous database system; studying the effects of performance modeling on distributed database performance; and experimenting with an ORACLE-based heterogeneous system.

  15. Developing a Near Real-time System for Earthquake Slip Distribution Inversion

    NASA Astrophysics Data System (ADS)

    Zhao, Li; Hsieh, Ming-Che; Luo, Yan; Ji, Chen

    2016-04-01

    Advances in observational and computational seismology in the past two decades have enabled completely automatic, real-time determination of the focal mechanisms of earthquake point sources. However, seismic radiation from moderate and large earthquakes often exhibits a strong finite-source directivity effect, which is critically important for accurate ground motion estimation and earthquake damage assessment. Therefore, an effective procedure to determine earthquake rupture processes in near real-time is in high demand for hazard mitigation and risk assessment purposes. In this study, we develop an efficient waveform inversion approach for solving for finite-fault models in 3D structure. Full slip distribution inversions are carried out based on the fault planes identified in the point-source solutions. To ensure efficiency in calculating 3D synthetics during slip distribution inversions, a database of strain Green tensors (SGT) is established for a 3D structural model with realistic surface topography. The SGT database enables rapid calculation of accurate synthetic seismograms for waveform inversion on a regular desktop or even a laptop PC. We demonstrate our source inversion approach using two moderate earthquakes (Mw ~6.0) in Taiwan and in mainland China. Our results show that the 3D velocity model provides better waveform fitting with more spatially concentrated slip distributions. Our source inversion technique based on the SGT database is effective for semi-automatic, near real-time determination of finite-source solutions for seismic hazard mitigation purposes.
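
    Once Green's functions come out of an SGT-style database, a slip inversion of this kind is linear: observed waveform samples d are modeled as G m, where columns of G hold subfault responses and m holds subfault slips. A toy least-squares solve, with two subfaults and entirely synthetic numbers, illustrates that step (the real inversion is far larger and regularized).

```python
def lstsq_2(G, d):
    """Solve min ||G m - d|| for two unknowns via the normal equations."""
    a = sum(g[0] * g[0] for g in G)
    b = sum(g[0] * g[1] for g in G)
    c = sum(g[1] * g[1] for g in G)
    r0 = sum(g[0] * di for g, di in zip(G, d))
    r1 = sum(g[1] * di for g, di in zip(G, d))
    det = a * c - b * b
    return ((c * r0 - b * r1) / det, (a * r1 - b * r0) / det)


# Synthetic Green's functions: rows are time samples, columns are 2 subfaults.
G = [[1.0, 0.0], [0.5, 1.0], [0.0, 0.5], [1.0, 1.0]]
true_slip = (2.0, 3.0)
d = [g[0] * true_slip[0] + g[1] * true_slip[1] for g in G]  # noise-free "data"
slip = lstsq_2(G, d)
```

    With noise-free synthetic data the solve recovers the assumed slips exactly.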

  16. Programming Wireless Handheld Devices for Applications in Teaching Astronomy

    NASA Astrophysics Data System (ADS)

    Budiardja, R.; Saranathan, V.; Guidry, M.

    2002-12-01

    Wireless technology implemented with handheld devices has attractive features because of the potential to access large amounts of data and the prospect of on-the-fly computational analysis from a device that can be carried in a shirt pocket. We shall describe applications of such technology to the general paradigm of making digital wireless connections from the field to upload information and queries to network servers, executing (potentially complex) data analysis and/or database operations on fast network computers, and returning real-time information from this analysis to the handheld device in the field. As illustration, we shall describe several client/server programs that we have written for applications in teaching introductory astronomy. For example, one program allows static and dynamic properties of astronomical objects to be accessed in a remote observation laboratory setting using a digital cell phone or PDA. Another implements interactive quizzing over a cell phone or PDA using a 700-question introductory astronomy quiz database, thus permitting students to study for astronomy quizzes in any environment in which they have a few free minutes and a digital cell phone or wireless PDA. The presentation will include hands-on demonstrations with real devices.

  17. Characteristic Markers of the WNT Signaling Pathways Are Differentially Expressed in Osteoarthritic Cartilage

    PubMed Central

    Dehne, T.; Lindahl, A.; Brittberg, M.; Pruss, A.; Ringe, J.; Sittinger, M.; Karlsson, C.

    2012-01-01

    Objective: It is well known that expression of markers for WNT signaling is dysregulated in osteoarthritic (OA) bone. However, it is still not fully known if the expression of these markers also is affected in OA cartilage. The aim of this study was therefore to examine this issue. Methods: Human cartilage biopsies from OA and control donors were subjected to genome-wide oligonucleotide microarrays. Genes involved in WNT signaling were selected using the BioRetis database, KEGG pathway analysis was searched using DAVID software tools, and cluster analysis was performed using Genesis software. Results from the microarray analysis were verified using quantitative real-time PCR and immunohistochemistry. In order to study the impact of cytokines for the dysregulated WNT signaling, OA and control chondrocytes were stimulated with interleukin-1 and analyzed with real-time PCR for their expression of WNT-related genes. Results: Several WNT markers displayed a significantly altered expression in OA compared to normal cartilage. Interestingly, inhibitors of the canonical and planar cell polarity WNT signaling pathways displayed significantly increased expression in OA cartilage, while the Ca2+/WNT signaling pathway was activated. Both real-time PCR and immunohistochemistry verified the microarray results. Real-time PCR analysis demonstrated that interleukin-1 upregulated expression of important WNT markers. Conclusions: WNT signaling is significantly affected in OA cartilage. The result suggests that both the canonical and planar cell polarity WNT signaling pathways were partly inhibited while the Ca2+/WNT pathway was activated in OA cartilage. PMID:26069618

  18. Thoracolumbar spine fractures in frontal impact crashes.

    PubMed

    Pintar, Frank A; Yoganandan, Narayan; Maiman, Dennis J; Scarboro, Mark; Rudd, Rodney W

    2012-01-01

    There is currently no injury assessment for thoracic or lumbar spine fractures in the motor vehicle crash standards throughout the world. Compression-related thoracolumbar fractures are occurring in frontal impacts and yet the mechanism of injury is poorly understood. The objective of this investigation was to characterize these injuries using real world crash data from the US-DOT-NHTSA NASS-CDS and CIREN databases. Thoracic and lumbar AIS vertebral body fracture codes were searched for in the two databases. The NASS database was used to characterize population trends as a function of crash year and vehicle model year. The CIREN database was used to examine a case series in more detail. From the NASS database there were 2000-4000 occupants in frontal impacts with thoracic and lumbar vertebral body fractures per crash year. There was an increasing trend in incidence rate of thoracolumbar fractures in frontal impact crashes as a function of vehicle model year from 1986 to 2008; this was not the case for other crash types. From the CIREN database, the thoracolumbar spine was most commonly fractured at either the T12 or L1 level. Major, burst type fractures occurred predominantly at T12, L1 or L5; wedge fractures were most common at L1. Most CIREN occupants were belted; there were slightly more females involved; they were almost all in bucket seats; impact location occurred approximately half the time on the road and half off the road. The type of object struck also seemed to have some influence on fractured spine level, suggesting that the crash deceleration pulse may be influential in the type of compression vector that migrates up the spinal column. Future biomechanical studies are required to define mechanistically how these fractures are influenced by these many factors.

  19. The VLBA correlator: Real-time in the distributed era

    NASA Technical Reports Server (NTRS)

    Wells, D. C.

    1992-01-01

    The correlator is the signal processing engine of the Very Long Baseline Array (VLBA). Radio signals are recorded on special wideband (128 Mb/s) digital recorders at the 10 telescopes, with sampling times controlled by hydrogen maser clocks. The magnetic tapes are shipped to the Array Operations Center in Socorro, New Mexico, where they are played back simultaneously into the correlator. Real-time software and firmware control the playback drives to achieve synchronization, compute models of the wavefront delay, control the numerous modules of the correlator, and record FITS files of the fringe visibilities at the back end of the correlator. In addition to the more than 3000 custom VLSI chips that handle the massive data flow of the signal processing, the correlator contains a total of more than 100 programmable computers with 8-, 16-, and 32-bit CPUs. Code is downloaded into front-end CPUs depending on the operating mode. Low-level code is assembly language; high-level code is C running under a real-time OS. We use VxWorks on Motorola MVME147 CPUs. Code development is done on a complex of SPARC workstations connected to the real-time CPUs by Ethernet. The overall management of the correlation process depends on a database management system. We use Ingres running on a Sparcstation-2. We transfer logging information from the database of the VLBA Monitor and Control System to our database using Ingres/NET. Job scripts are computed and transferred to the real-time computers using NFS, and correlation job execution logs and status flow back by the same route. Operator status and control displays use windows on workstations, interfaced to the real-time processes by network protocols. The extensive network protocol support provided by VxWorks is invaluable. The VLBA Correlator's dependence on network protocols is an example of the radical transformation of the real-time world over the past five years. Real-time computing is becoming more like conventional computing. Paradoxically, 'conventional' computing is also adopting practices from the real-time world: semaphores, shared memory, light-weight threads, and concurrency. This appears to be a convergence of thinking.

  20. Causal relations among events and states in dynamic geographical phenomena

    NASA Astrophysics Data System (ADS)

    Huang, Zhaoqiang; Feng, Xuezhi; Xuan, Wenling; Chen, Xiuwan

    2007-06-01

    Conventional geographical information systems record only a static state of the real world. However, geographical phenomena contain not only static information but also dynamic information. How to record dynamic information and reveal the relations among dynamic information is therefore an important issue for a spatio-temporal information system. From an ontological perspective, we can initially divide the spatio-temporal entities in the world into continuants and occurrents. Continuant entities endure through some extended (although possibly very short) interval of time (e.g., houses, roads, cities, and real estate). Occurrent entities happen and are then gone (e.g., a house repair job, a road construction project, urban expansion, a real-estate transition). From an information-system perspective, continuants and occurrents that have a unique identity in the system are referred to as objects and events, respectively. In current spatio-temporal information systems, change is represented implicitly by static snapshots. In previous models, objects are considered the fundamental components of the system, and change is modeled by considering time-varying attributes of these objects. In spatio-temporal databases, temporal information, whether interval or instant, is involved, and the underlying data structures and indexes for temporal data have been investigated considerably. However, there is an absence of explicit ways of considering events, which affect the attributes of objects or their states. The research issue of this paper is therefore how to model events in conceptual models of dynamic geographical phenomena and how to represent the causal relations among events and objects or states. First, the paper reviews conceptual modeling in temporal GIS. Second, it discusses the spatio-temporal entities: objects and events. Third, it investigates the causal relations among events and states. Qualitative spatio-temporal change is an important issue in dynamic geographic-scale phenomena. In real-estate transitions, events and states need to be represented explicitly. In modeling the evolution of a dynamic system, the view of causality cannot be avoided. An object's transition is represented by the state of the object. An event causes the states of objects to change and causes other events to happen. Events are closely connected with objects. The basic causal relations are the state-event and event-state relationships. Lastly, the paper concludes with an overview of the causal relations among events and states and points out future work.
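
    The event-state and state-event causal relations argued for above can be captured with a small data model in which events explicitly cause and record state transitions on objects; the class and field names below are illustrative only, not taken from the paper.

```python
class GeoObject:
    """A continuant (e.g. a real-estate parcel) with an explicit state."""

    def __init__(self, name, state):
        self.name, self.state, self.history = name, state, []


class Event:
    """An occurrent that causes a state transition on an object."""

    def __init__(self, name, new_state):
        self.name, self.new_state = name, new_state

    def apply(self, obj):
        # Record the causal link explicitly: event -> (old state, new state).
        obj.history.append((self.name, obj.state, self.new_state))
        obj.state = self.new_state


parcel = GeoObject("parcel-42", "residential")
Event("rezoning", "commercial").apply(parcel)
```

    Unlike the snapshot representation, the history list keeps the event that caused each transition, so event-state relations can be queried directly.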

  1. Real-time Author Co-citation Mapping for Online Searching.

    ERIC Educational Resources Information Center

    Lin, Xia; White, Howard D.; Buzydlowski, Jan

    2003-01-01

    Describes the design and implementation of a prototype visualization system, AuthorLink, to enhance author searching. AuthorLink is based on author co-citation analysis and visualization mapping algorithms. AuthorLink produces interactive author maps in real time from a database of 1.26 million records supplied by the Institute for Scientific…
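
    Author co-citation counts of the kind AuthorLink maps can be tallied directly from reference lists: two authors are co-cited whenever both appear in the references of the same citing record. A minimal sketch with made-up author names:

```python
from collections import Counter
from itertools import combinations


def cocitation_counts(records):
    """Count, over all citing records, how often each author pair is co-cited."""
    counts = Counter()
    for cited_authors in records:
        # Sort so each unordered pair has one canonical key.
        for pair in combinations(sorted(set(cited_authors)), 2):
            counts[pair] += 1
    return counts


records = [["White", "Lin", "Small"], ["White", "Small"], ["Lin", "White"]]
counts = cocitation_counts(records)
```

    The resulting pair counts form the co-citation matrix that visualization mapping algorithms then project into an interactive author map.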

  2. Real-time solution of linear computational problems using databases of parametric reduced-order models with arbitrary underlying meshes

    NASA Astrophysics Data System (ADS)

    Amsallem, David; Tezaur, Radek; Farhat, Charbel

    2016-12-01

    A comprehensive approach for real-time computations using a database of parametric, linear, projection-based reduced-order models (ROMs) based on arbitrary underlying meshes is proposed. In the offline phase of this approach, the parameter space is sampled and linear ROMs defined by linear reduced operators are pre-computed at the sampled parameter points and stored. Then, these operators and associated ROMs are transformed into counterparts that satisfy a certain notion of consistency. In the online phase of this approach, a linear ROM is constructed in real-time at a queried but unsampled parameter point by interpolating the pre-computed linear reduced operators on matrix manifolds and therefore computing an interpolated linear ROM. The proposed overall model reduction framework is illustrated with two applications: a parametric inverse acoustic scattering problem associated with a mockup submarine, and a parametric flutter prediction problem associated with a wing-tank system. The second application is implemented on a mobile device, illustrating the capability of the proposed computational framework to operate in real-time.
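
    The online step can be caricatured as interpolation between pre-computed reduced operators at sampled parameter points. The actual framework interpolates on matrix manifolds after transforming the operators for consistency, so the element-wise linear interpolation below, with synthetic 2x2 operators, is only a shape-level sketch.

```python
def interpolate_rom(A0, A1, p0, p1, p):
    """Linearly interpolate reduced operators stored at parameters p0 and p1."""
    w = (p - p0) / (p1 - p0)
    return [[(1 - w) * a + w * b for a, b in zip(r0, r1)]
            for r0, r1 in zip(A0, A1)]


# Reduced operators pre-computed offline at p = 0.0 and p = 1.0 (synthetic)
A0 = [[2.0, 0.0], [0.0, 1.0]]
A1 = [[4.0, 1.0], [1.0, 3.0]]
A_mid = interpolate_rom(A0, A1, 0.0, 1.0, 0.5)  # queried, unsampled point
```

    Because the interpolated operator is tiny, assembling and solving the resulting ROM is cheap enough for the real-time, on-device use the paper demonstrates.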

  3. Flight Testing an Integrated Synthetic Vision System

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Arthur, Jarvis J., III; Bailey, Randall E.; Prinzel, Lawrence J., III

    2005-01-01

    NASA's Synthetic Vision Systems (SVS) project is developing technologies with practical applications to eliminate low visibility conditions as a causal factor to civil aircraft accidents while replicating the operational benefits of clear day flight operations, regardless of the actual outside visibility condition. A major thrust of the SVS project involves the development/demonstration of affordable, certifiable display configurations that provide intuitive out-the-window terrain and obstacle information with advanced pathway guidance for transport aircraft. The SVS concept being developed at NASA encompasses the integration of tactical and strategic Synthetic Vision Display Concepts (SVDC) with Runway Incursion Prevention System (RIPS) alerting and display concepts, real-time terrain database integrity monitoring equipment (DIME), and Enhanced Vision Systems (EVS) and/or improved Weather Radar for real-time object detection and database integrity monitoring. A flight test evaluation was jointly conducted (in July and August 2004) by NASA Langley Research Center and an industry partner team under NASA's Aviation Safety and Security, Synthetic Vision System project. A Gulfstream GV aircraft was flown over a 3-week period in the Reno/Tahoe International Airport (NV) local area and an additional 3-week period in the Wallops Flight Facility (VA) local area to evaluate integrated Synthetic Vision System concepts. The enabling technologies (RIPS, EVS and DIME) were integrated into the larger SVS concept design. This paper presents experimental methods and the high level results of this flight test.

  4. The Application of Lidar to Synthetic Vision System Integrity

    NASA Technical Reports Server (NTRS)

    Campbell, Jacob L.; UijtdeHaag, Maarten; Vadlamani, Ananth; Young, Steve

    2003-01-01

    One goal in the development of a Synthetic Vision System (SVS) is to create a system that can be certified by the Federal Aviation Administration (FAA) for use at various flight criticality levels. As part of NASA's Aviation Safety Program, Ohio University and NASA Langley have been involved in the research and development of real-time terrain database integrity monitors for SVS. Integrity monitors based on a consistency check with onboard sensors may be required if the inherent terrain database integrity is not sufficient for a particular operation. Sensors such as the radar altimeter and weather radar, which are available on most commercial aircraft, are currently being investigated for use in a real-time terrain database integrity monitor. This paper introduces the concept of using a Light Detection And Ranging (LiDAR) sensor as part of a real-time terrain database integrity monitor. A LiDAR system consists of a scanning laser ranger, an inertial measurement unit (IMU), and a Global Positioning System (GPS) receiver. Information from these three sensors can be combined to generate synthesized terrain models (profiles), which can then be compared to the stored SVS terrain model. This paper discusses an initial performance evaluation of the LiDAR-based terrain database integrity monitor using LiDAR data collected over Reno, Nevada. The paper will address the consistency checking mechanism and test statistic, sensitivity to position errors, and a comparison of the LiDAR-based integrity monitor to a radar altimeter-based integrity monitor.
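
    At its core, the consistency check compares a sensor-synthesized terrain profile against the stored database profile and thresholds a disparity statistic. The statistic (mean absolute disparity), threshold, and elevation values below are illustrative, not those of the cited monitor.

```python
def integrity_check(sensed, stored, threshold):
    """Pass the terrain database only if the mean absolute disparity between
    the sensed and stored elevation profiles stays within the threshold."""
    disparities = [abs(s, ) if False else abs(s - t) for s, t in zip(sensed, stored)]
    stat = sum(disparities) / len(disparities)
    return stat, stat <= threshold


lidar_profile = [1503.2, 1510.8, 1522.4, 1534.0]    # synthesized elevations (m)
database_profile = [1504.0, 1511.0, 1521.0, 1535.0]  # stored SVS terrain (m)
stat, ok = integrity_check(lidar_profile, database_profile, threshold=5.0)
```

    A failed check would flag the database as inconsistent with the sensed terrain rather than pinpoint which source is in error; that distinction is part of what the paper's test statistic analysis addresses.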

  5. Real-time color measurement using active illuminant

    NASA Astrophysics Data System (ADS)

    Tominaga, Shoji; Horiuchi, Takahiko; Yoshimura, Akihiko

    2010-01-01

    This paper proposes a method for real-time color measurement using an active illuminant. A synchronous measurement system is constructed by combining a high-speed active spectral light source and a high-speed monochrome camera. The light source is a programmable spectral source capable of emitting an arbitrary spectrum at high speed. This system has the essential advantage of capturing spectral images at high frame rates without using filters. The new method of real-time colorimetry differs from the traditional method based on colorimeters or spectrometers. We project the color-matching functions onto an object surface as spectral illuminants. Then we can obtain the CIE XYZ tristimulus values directly from the camera outputs at every point on the surface. We describe the principle of our colorimetric technique based on projection of the color-matching functions and the procedure for realizing a real-time measurement system for a moving object. In an experiment, we examine the performance of real-time color measurement for a static object and a moving object.
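
    Projecting the color-matching functions as illuminants means each camera output is already the inner product of the surface's spectral signal with a color-matching function. That inner product, done here in software over a coarse wavelength grid with made-up numbers (not the CIE tables), shows what the optical system computes:

```python
def tristimulus(spectrum, cmf, dl):
    """Discrete inner products of a spectral signal with three
    color-matching functions, over a uniform wavelength step dl."""
    return tuple(sum(s * f for s, f in zip(spectrum, channel)) * dl
                 for channel in cmf)


# Toy 4-sample spectral signal and color-matching functions (illustrative only)
spectrum = [0.2, 0.8, 1.0, 0.4]
cmf = ([0.1, 0.3, 0.6, 0.8],   # x-bar
       [0.2, 0.9, 0.8, 0.3],   # y-bar
       [0.9, 0.4, 0.1, 0.0])   # z-bar
X, Y, Z = tristimulus(spectrum, cmf, dl=10.0)
```

    In the proposed system this summation happens optically at every pixel simultaneously, which is what makes per-point colorimetry of a moving object feasible in real time.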

  6. Security Event Recognition for Visual Surveillance

    NASA Astrophysics Data System (ADS)

    Liao, W.; Yang, C.; Yang, M. Ying; Rosenhahn, B.

    2017-05-01

    With the rapidly increasing deployment of surveillance cameras, reliable methods for automatically analyzing surveillance video and recognizing special events are demanded by many practical applications. This paper proposes a novel, effective framework for security event analysis in surveillance videos. First, a convolutional neural network (CNN) framework is used to detect objects of interest in the given videos. Second, the owners of the objects are recognized and monitored in real time as well. If anyone moves any object, that person is verified as to whether he or she is its owner. If not, the event is further analyzed and distinguished between two different scenes: moving the object away or stealing it. To validate the proposed approach, a new video dataset consisting of various scenarios is constructed for these more complex tasks. For comparison purposes, experiments are also carried out on benchmark databases related to the task of abandoned-luggage detection. The experimental results show that the proposed approach outperforms state-of-the-art methods and is effective in recognizing complex security events.

  7. Aerial vehicles collision avoidance using monocular vision

    NASA Astrophysics Data System (ADS)

    Balashov, Oleg; Muraviev, Vadim; Strotov, Valery

    2016-10-01

    In this paper, an image-based collision avoidance algorithm that provides detection of nearby aircraft and distance estimation is presented. The approach requires a vision system with a single moving camera and additional information about the carrier's speed and orientation from onboard sensors. The main idea is to create a multi-step approach based on preliminary detection, region-of-interest (ROI) selection, contour segmentation, object matching, and localization. The proposed algorithm is able to detect small targets but, unlike many other approaches, is designed to work with large-scale objects as well. To localize the aerial vehicle's position, a system of equations relating object coordinates in space to the observed image is solved. The solution of this system gives the current position and speed of the detected object in space. Using this information, distance and time to collision can be estimated. Experimental research on real video sequences and modeled data was performed. The video database contained different types of aerial vehicles: aircraft, helicopters, and UAVs. The presented algorithm is able to detect aerial vehicles from several kilometers away under regular daylight conditions.
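
    Given the localized relative position and velocity, range and time-to-collision follow from the closing speed. A sketch with synthetic state vectors (the vision-based localization step itself is not shown):

```python
import math


def time_to_collision(rel_pos, rel_vel):
    """Range and time to collision from 3-D relative position and velocity."""
    rng = math.sqrt(sum(p * p for p in rel_pos))
    # Range rate along the line of sight; negative means the object closes in.
    range_rate = sum(p * v for p, v in zip(rel_pos, rel_vel)) / rng
    ttc = rng / -range_rate if range_rate < 0 else math.inf
    return rng, ttc


# Intruder 2000 m ahead, closing head-on at 50 m/s (synthetic values)
rng, ttc = time_to_collision((2000.0, 0.0, 0.0), (-50.0, 0.0, 0.0))
```

    An object holding a constant bearing with a negative range rate is the classic collision geometry; a non-negative range rate yields an infinite time to collision here.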

  8. A framework for intelligent data acquisition and real-time database searching for shotgun proteomics.

    PubMed

    Graumann, Johannes; Scheltema, Richard A; Zhang, Yong; Cox, Jürgen; Mann, Matthias

    2012-03-01

    In the analysis of complex peptide mixtures by MS-based proteomics, many more peptides elute at any given time than can be identified and quantified by the mass spectrometer. This makes it desirable to optimally allocate peptide sequencing and narrow mass range quantification events. In computer science, intelligent agents are frequently used to make autonomous decisions in complex environments. Here we develop and describe a framework for intelligent data acquisition and real-time database searching and showcase selected examples. The intelligent agent is implemented in the MaxQuant computational proteomics environment, termed MaxQuant Real-Time. It analyzes data as it is acquired on the mass spectrometer, constructs isotope patterns and SILAC pair information as well as controls MS and tandem MS events based on real-time and prior MS data or external knowledge. Re-implementing a top10 method in the intelligent agent yields similar performance to the data dependent methods running on the mass spectrometer itself. We demonstrate the capabilities of MaxQuant Real-Time by creating a real-time search engine capable of identifying peptides "on-the-fly" within 30 ms, well within the time constraints of a shotgun fragmentation "topN" method. The agent can focus sequencing events onto peptides of specific interest, such as those originating from a specific gene ontology (GO) term, or peptides that are likely modified versions of already identified peptides. Finally, we demonstrate enhanced quantification of SILAC pairs whose ratios were poorly defined in survey spectra. MaxQuant Real-Time is flexible and can be applied to a large number of scenarios that would benefit from intelligent, directed data acquisition. Our framework should be especially useful for new instrument types, such as the quadrupole-Orbitrap, that are currently becoming available.
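
    Stripped of everything mass-spectrometry-specific, the agent's core scheduling decision is a topN pick over survey-scan precursors with a dynamic exclusion list and an interest boost for targeted masses. All m/z values, intensities, and the selection policy details below are invented for illustration; they are not MaxQuant Real-Time's actual rules.

```python
def pick_precursors(survey, n, excluded, interesting):
    """Choose up to n precursor m/z values for fragmentation.

    survey: list of (mz, intensity); excluded: already-sequenced m/z values;
    interesting: m/z values the agent should prioritize (e.g. a GO-term list).
    """
    candidates = [(mz, i) for mz, i in survey if mz not in excluded]
    # Interesting precursors first, then the rest by decreasing intensity.
    candidates.sort(key=lambda c: (c[0] not in interesting, -c[1]))
    return [mz for mz, _ in candidates[:n]]


survey = [(445.1, 9e5), (512.3, 7e5), (601.2, 3e5), (733.4, 8e5)]
picked = pick_precursors(survey, n=2, excluded={445.1}, interesting={601.2})
```

    Note how the low-intensity but "interesting" precursor jumps the queue; this is the directed acquisition behavior the abstract describes for GO-term and modified-peptide targeting.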

  9. A Framework for Intelligent Data Acquisition and Real-Time Database Searching for Shotgun Proteomics*

    PubMed Central

    Graumann, Johannes; Scheltema, Richard A.; Zhang, Yong; Cox, Jürgen; Mann, Matthias

    2012-01-01

    In the analysis of complex peptide mixtures by MS-based proteomics, many more peptides elute at any given time than can be identified and quantified by the mass spectrometer. This makes it desirable to optimally allocate peptide sequencing and narrow mass range quantification events. In computer science, intelligent agents are frequently used to make autonomous decisions in complex environments. Here we develop and describe a framework for intelligent data acquisition and real-time database searching and showcase selected examples. The intelligent agent is implemented in the MaxQuant computational proteomics environment, termed MaxQuant Real-Time. It analyzes data as it is acquired on the mass spectrometer, constructs isotope patterns and SILAC pair information as well as controls MS and tandem MS events based on real-time and prior MS data or external knowledge. Re-implementing a top10 method in the intelligent agent yields similar performance to the data dependent methods running on the mass spectrometer itself. We demonstrate the capabilities of MaxQuant Real-Time by creating a real-time search engine capable of identifying peptides “on-the-fly” within 30 ms, well within the time constraints of a shotgun fragmentation “topN” method. The agent can focus sequencing events onto peptides of specific interest, such as those originating from a specific gene ontology (GO) term, or peptides that are likely modified versions of already identified peptides. Finally, we demonstrate enhanced quantification of SILAC pairs whose ratios were poorly defined in survey spectra. MaxQuant Real-Time is flexible and can be applied to a large number of scenarios that would benefit from intelligent, directed data acquisition. Our framework should be especially useful for new instrument types, such as the quadrupole-Orbitrap, that are currently becoming available. PMID:22171319

  10. CropEx Web-Based Agricultural Monitoring and Decision Support

    NASA Technical Reports Server (NTRS)

    Harvey, Craig; Lawhead, Joel

    2011-01-01

    CropEx is a Web-based agricultural Decision Support System (DSS) that monitors changes in crop health over time. It is designed to be used by a wide range of both public and private organizations, including individual producers and regional government offices with a vested interest in tracking vegetation health. The database and data management system automatically retrieve and ingest data for the area of interest; a second database stores the results of processing and supports the DSS. The processing engine allows server-side analysis of imagery, with support for image sub-setting and a set of core raster operations for image classification, creation of vegetation indices, and change detection. The system includes the Web-based (CropEx) interface, the data ingestion system, the server-side processing engine, and a database processing engine. It contains a Web-based interface with multi-tiered security profiles for multiple users. The interface provides the ability to identify areas of interest for specific users, user profiles, and methods of processing and data types for selected or created areas of interest. A compilation of programs is used to ingest available data into the system, classify those data, profile them for quality, and make them available to the processing engine immediately upon their availability to the system (near real time). The processing engine consists of methods and algorithms used to process the data in real time without copying, storing, or moving the raw data. The engine makes results available to the database processing engine for storage and further manipulation. The database processing engine ingests data from the image processing engine, distills the results into numerical indices, and stores each index for an area of interest.
This process happens each time new data is ingested and processed for the area of interest, and upon subsequent database entries, the database processing engine qualifies each value for each area of interest and conducts a logical processing of results indicating when and where thresholds are exceeded. Reports are provided at regular, operator-determined intervals that include variances from thresholds and links to view raw data for verification, if necessary. The technology and method of development allow the code base to easily be modified for varied use in the real-time and near-real-time processing environments. In addition, the final product will be demonstrated as a means for rapid draft assessment of imagery.
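The index-distillation and threshold-qualification step described above might look like the following sketch. The NDVI formula is a standard vegetation index; the function names, the stored history, and the threshold value are illustrative assumptions, not the CropEx code:

```python
def ndvi(nir, red):
    """Normalized difference vegetation index for one pixel pair."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def qualify(aoi_history, threshold):
    """Qualify the latest distilled index for an area of interest:
    return (latest_index, variance_from_threshold, alert_flag)."""
    latest = aoi_history[-1]
    return latest, latest - threshold, latest < threshold

# Index values stored by the database processing engine, one per ingest.
history = [0.62, 0.58, 0.41]
latest, variance, alert = qualify(history, threshold=0.50)
```

A report built at an operator-determined interval would collect the variance and flag for each area of interest, linking back to the raw imagery for verification.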

  11. Robust real-time horizon detection in full-motion video

    NASA Astrophysics Data System (ADS)

    Young, Grace B.; Bagnall, Bryan; Lane, Corey; Parameswaran, Shibin

    2014-06-01

    The ability to detect the horizon on a real-time basis in full-motion video is an important capability to aid and facilitate real-time processing of full-motion videos for purposes such as object detection, recognition and other video/image segmentation applications. In this paper, we propose a method for real-time horizon detection that is designed to be used as a front-end processing unit for a real-time marine object detection system that carries out object detection and tracking on full-motion videos captured by ship/harbor-mounted cameras, Unmanned Aerial Vehicles (UAVs) or any other method of surveillance for Maritime Domain Awareness (MDA). Unlike existing horizon detection work, we cannot assume a priori the angle or nature (e.g., a straight line) of the horizon, due to the nature of the application domain and the data. Therefore, the proposed real-time algorithm is designed to identify the horizon at any angle and irrespective of objects appearing close to and/or occluding the horizon line (e.g., trees, vehicles at a distance) by accounting for its non-linear nature. We use a simple two-stage hierarchical methodology, leveraging color-based features, to quickly isolate the region of the image containing the horizon and then perform a more fine-grained horizon detection operation. In this paper, we present our real-time horizon detection results using our algorithm on real-world full-motion video data from a variety of surveillance sensors like UAVs and ship-mounted cameras, confirming the real-time applicability of this method and its ability to detect the horizon with no a priori assumptions.
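The coarse first stage of a two-stage horizon search can be illustrated as follows, assuming a simple per-row intensity statistic as the color-based feature. This is a toy illustration of the idea of quickly isolating the horizon-bearing region, not the authors' algorithm:

```python
import numpy as np

def coarse_horizon_row(image):
    """Stage 1: locate the row with the strongest sky-to-sea transition
    by scanning differences of the mean per-row color intensity.
    A finer stage would then search only near this row."""
    row_means = image.mean(axis=(1, 2))      # mean intensity per row
    gradient = np.abs(np.diff(row_means))    # change between adjacent rows
    return int(np.argmax(gradient)) + 1      # first row below the jump

# Synthetic 6-row frame: bright "sky" above, dark "sea" from row 3 down.
frame = np.zeros((6, 8, 3))
frame[:3] = 200.0   # sky
frame[3:] = 40.0    # sea
```

Restricting the second, fine-grained stage to a band around the returned row is what makes the hierarchy cheap enough for real time.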

  12. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation when the original specification and perhaps high-level design is non-object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions which emphasizes data and control flows followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects) each having its own time-behavior defined by a set of states and state-transition rules and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly-connected models which progress from the logical models of object-oriented real-time systems analysis through the physical architectural models to the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.

  13. Enhancements to the EPANET-RTX (Real-Time Analytics) ...

    EPA Pesticide Factsheets

    Technical brief and software The U.S. Environmental Protection Agency (EPA) developed EPANET-RTX as a collection of object-oriented software libraries comprising the core data access, data transformation, and data synthesis (real-time analytics) components of a real-time hydraulic and water quality modeling system. While EPANET-RTX uses the hydraulic and water quality solvers of EPANET, the object libraries are a self-contained set of building blocks for software developers. “Real-time EPANET” promises to change the way water utilities, commercial vendors, engineers, and the water community think about modeling.

  14. Controlled Substance Reconciliation Accuracy Improvement Using Near Real-Time Drug Transaction Capture from Automated Dispensing Cabinets.

    PubMed

    Epstein, Richard H; Dexter, Franklin; Gratch, David M; Perino, Michael; Magrann, Jerry

    2016-06-01

    Accurate accounting of controlled drug transactions by inpatient hospital pharmacies is a requirement in the United States under the Controlled Substances Act. At many hospitals, manual distribution of controlled substances from pharmacies is being replaced by automated dispensing cabinets (ADCs) at the point of care. Despite the promise of improved accountability, a high prevalence (15%) of controlled substance discrepancies between ADC records and anesthesia information management systems (AIMS) has been published, with a similar incidence (15.8%; 95% confidence interval [CI], 15.3% to 16.2%) noted at our institution. Most reconciliation errors are clerical. In this study, we describe a method to capture drug transactions in near real-time from our ADCs, compare them with documentation in our AIMS, and evaluate subsequent improvement in reconciliation accuracy. ADC controlled substance transactions are transmitted to a hospital interface server, parsed, reformatted, and sent to a software script written in Perl. The script extracts the data and writes them to a SQL Server database. Concurrently, controlled drug totals for each patient under care are documented in the AIMS and compared with the balance of the ADC transactions (i.e., vending, transferring, wasting, and returning drug). Every minute, a reconciliation report is available to anesthesia providers over the hospital Intranet from AIMS workstations. The report lists all patients, the current provider, the balance of ADC transactions, the totals from the AIMS, the difference, and whether the case is still ongoing or has concluded. Accuracy and latency of the ADC transaction capture process were assessed via simulation and by comparison with pharmacy database records, maintained by the vendor on a central server located remotely from the hospital network. 
For assessment of reconciliation accuracy over time, data were collected from our AIMS from January 2012 to June 2013 (Baseline), July 2013 to April 2014 (Next Day Reports), and May 2014 to September 2015 (Near Real-Time Reports) and reconciled against pharmacy records from the central pharmacy database maintained by the vendor. Control chart (batch means) methods were used between successive epochs to determine if improvement had taken place. During simulation, 100% of 10,000 messages, transmitted at a rate of 1295 per minute, were accurately captured and inserted into the database. Latency (transmission time to local database insertion time) was 46.3 ± 0.44 milliseconds (SEM). During acceptance testing, only 1 of 1384 transactions analyzed had a difference between the near real-time process and what was in the central database; this was for a "John Doe" patient whose name had been changed subsequent to data capture. Once a transaction was entered at the ADC workstation, 84.9% (n = 18 bins; 95% CI, 78.4% to 91.3%) of these transactions were available in the database on the AIMS server within 2 minutes. Within 5 minutes, 98.2% (n = 18 bins; 95% CI, 97.2% to 99.3%) were available. Among 145,642 transactions present in the central pharmacy database, only 24 were missing from the local database table (mean = 0.018%; 95% CI, 0.002% to 0.034%). Implementation of near real-time reporting improved the controlled substance reconciliation error rate compared to the previous Next Day Reports epoch, from 8.8% to 5.2% (difference = -3.6%; 95% CI, -4.3% to -2.8%; P < 10). Errors were distributed among staff, with 50% of discrepancies accounted for by 12.4% of providers and 80% accounted for by 28.5% of providers executing transactions during the Near Real-Time Reports epoch. The near real-time system for the capture of transactional data flowing over the hospital network was highly accurate, reliable, and exhibited acceptable latency. 
Other institutions can use this methodology to implement similar data capture for transactions from their drug ADCs. Reconciliation accuracy improved significantly as a result of implementation. Our approach may be of particular utility at facilities with limited pharmacy resources to audit anesthesia records for controlled substance administration and reconcile them against dispensing records.
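The per-patient reconciliation arithmetic (the balance of vending, transferring, wasting, and returning transactions versus the AIMS-documented total) reduces to a signed sum. A minimal sketch, with hypothetical transaction codes and units:

```python
def adc_balance(transactions):
    """Net drug dispensed to the provider: vended or transferred-in
    amounts count positive; wasted or returned amounts count negative."""
    sign = {"vend": 1, "transfer_in": 1, "waste": -1, "return": -1}
    return sum(sign[kind] * amount for kind, amount in transactions)

def reconcile(transactions, aims_total):
    """Compare the ADC transaction balance with the AIMS-documented
    administration total; a nonzero difference is a discrepancy."""
    balance = adc_balance(transactions)
    return {"adc": balance, "aims": aims_total, "difference": balance - aims_total}

# Fentanyl (mcg): 100 vended, 40 wasted, but only 50 documented as given.
report = reconcile([("vend", 100), ("waste", 40)], aims_total=50)
```

In a near real-time report, a nonzero `difference` for an ongoing case lets the provider correct the record before the case concludes, rather than the next day.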

  15. Software-safety and software quality assurance in real-time applications Part 2: Real-time structures and languages

    NASA Astrophysics Data System (ADS)

    Schoitsch, Erwin

    1988-07-01

    Our society depends more and more on the reliability of embedded (real-time) computer systems, even in everyday life. Considering the complexity of the real world, this might become a severe threat. Real-time programming is a discipline important not only in process control and data acquisition systems, but also in fields like communication, office automation, interactive databases, interactive graphics and operating systems development. General concepts of concurrent programming and constructs for process-synchronization are discussed in detail. Tasking and synchronization concepts, methods of process communication, and interrupt- and timeout handling in systems based on semaphores, signals, conditional critical regions or on real-time languages like Concurrent PASCAL, MODULA, CHILL and ADA are explained and compared with each other and with respect to their implications for quality and safety.

  16. Case Study Analyses of the Success DC-8 Scanning Lidar Database

    NASA Technical Reports Server (NTRS)

    Uthe, Edward E.

    2000-01-01

    Under project SUCCESS (Subsonic Aircraft Contrail and Cloud Effects Special Study) funded by the Atmospheric Effects of Aviation Program, SRI International (SRI) developed an angular scanning backscatter lidar for operation on the NASA DC-8 research aircraft and deployed the scanning lidar during the SUCCESS field campaign. The primary purpose of the lidar was to generate real-time video displays of clouds and contrails above, ahead of, and below the DC-8 as a means to help position the aircraft for optimum cloud and contrail sampling by onboard in situ sensors, and to help extend the geometrical domain of the in situ sampling records. A large, relatively complex lidar database was collected and several data examples were processed to illustrate the value of the lidar data for interpreting the other data records collected during SUCCESS. These data examples were used to develop a journal publication for the special SUCCESS Geophysical Research Letters issue (reprint presented as Appendix A). The data examples justified analysis of a larger part of the DC-8 lidar database, which is the objective of the current study.

  17. Kerman Photovoltaic Power Plant R&D data collection computer system operations and maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosen, P.B.

    1994-06-01

    The Supervisory Control and Data Acquisition (SCADA) system at the Kerman PV Plant monitors 52 analog, 44 status, 13 control, and 4 accumulator data points in real-time. A Remote Terminal Unit (RTU) polls 7 peripheral data acquisition units that are distributed throughout the plant once every second, and stores all analog, status, and accumulator points that have changed since the last scan. The R&D Computer, which is connected to the SCADA RTU via an RS-232 serial link, polls the RTU once every 5-7 seconds and records any values that have changed since the last scan. A SCADA software package called RealFlex runs on the R&D computer and stores all updated data values taken from the RTU, along with a time-stamp for each, in a historical real-time database. From this database, averages of all analog data points and snapshots of all status points are generated every 10 minutes and appended to a daily file. These files are downloaded via modem by PVUSA/Davis staff every day, and the data is placed into the PVUSA database.
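The report-by-exception scan and the 10-minute averaging described above can be sketched as follows; the tag names and values are invented for illustration, not the Kerman point list:

```python
def poll_changed(previous, current):
    """Report-by-exception: return only the points whose value changed
    since the last scan, as the RTU does on its one-second poll."""
    return {tag: val for tag, val in current.items() if previous.get(tag) != val}

def ten_minute_average(samples):
    """Average the time-stamped analog samples collected in one window,
    as appended to the daily file every 10 minutes."""
    return sum(value for _, value in samples) / len(samples)

changed = poll_changed({"irradiance": 805, "temp": 41},
                       {"irradiance": 812, "temp": 41})
avg = ten_minute_average([(0, 810.0), (300, 820.0), (600, 830.0)])
```

Storing only changed values keeps the historical database small while still letting the averaging pass reconstruct each 10-minute window.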

  18. Rapid production of optimal-quality reduced-resolution representations of very large databases

    DOEpatents

    Sigeti, David E.; Duchaineau, Mark; Miller, Mark C.; Wolinsky, Murray; Aldrich, Charles; Mineev-Weinstein, Mark B.

    2001-01-01

    View space representation data is produced in real time from a world space database representing terrain features. The world space database is first preprocessed. A database is formed having one element for each spatial region corresponding to a finest selected level of detail. A multiresolution database is then formed by merging elements, and a strict error metric, independent of the parameters defining the view space, is computed for each element at each level of detail. The multiresolution database and associated strict error metrics are then processed in real time to produce real-time frame representations. View parameters for a view volume, comprising a view location and field of view, are selected. The strict error metric is combined with the view parameters to produce a view-dependent error metric. Elements with the coarsest resolution are chosen for an initial representation. First elements are selected from the initial representation data set that are at least partially within the view volume. The first elements are placed in a split queue ordered by the value of the view-dependent error metric. Unless the number of elements in the queue meets or exceeds a predetermined number, or the largest error metric is less than or equal to a selected upper error metric bound, the element at the head of the queue is force split and the resulting elements are inserted into the queue. Force splitting continues until this determination is positive, forming a first multiresolution set of elements. The first multiresolution set of elements is then output as reduced-resolution view space data representing the terrain features.
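The force-split loop over the error-ordered queue can be illustrated with a toy one-dimensional version, where "splitting" halves an element's span and its error. The halving rule is a stand-in for the patented view-dependent metric, and all names are illustrative:

```python
import heapq

def refine(initial, max_elements, error_bound, split):
    """Greedily force-split the element with the largest view-dependent
    error until the element budget is reached or all errors are in bound."""
    # heapq is a min-heap, so store negated errors to pop the largest first.
    queue = [(-err, elem) for elem, err in initial]
    heapq.heapify(queue)
    while len(queue) < max_elements and -queue[0][0] > error_bound:
        _, elem = heapq.heappop(queue)
        for child, err in split(elem):
            heapq.heappush(queue, (-err, child))
    return sorted(elem for _, elem in queue)

# Toy split rule: halve an element's span; each child gets half the error.
def halve(elem):
    lo, hi = elem
    mid = (lo + hi) / 2
    return [((lo, mid), (hi - lo) / 2), ((mid, hi), (hi - lo) / 2)]

mesh = refine([((0.0, 8.0), 8.0)], max_elements=4, error_bound=1.0, split=halve)
```

The two stopping tests mirror the patent's determination: an element-count budget (frame-rate control) and an error bound (quality control), whichever is hit first.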

  19. Real Time Monitor of Grid job executions

    NASA Astrophysics Data System (ADS)

    Colling, D. J.; Martyniak, J.; McGough, A. S.; Křenek, A.; Sitera, J.; Mulač, M.; Dvořák, F.

    2010-04-01

    In this paper we describe the architecture and operation of the Real Time Monitor (RTM), developed by the Grid team in the HEP group at Imperial College London. This is arguably the most popular dissemination tool within the EGEE [1] Grid, having been used on many occasions, including the GridFest and LHC inauguration events held at CERN in October 2008. The RTM gathers information from EGEE sites hosting Logging and Bookkeeping (LB) services. Information is cached locally at a dedicated server at Imperial College London and made available for clients to use in near real time. The system consists of three main components: the RTM server, the enquirer, and an Apache Web Server which is queried by clients. The RTM server queries the LB servers at fixed time intervals, collecting job related information and storing this in a local database. Job related data includes not only job state (i.e. Scheduled, Waiting, Running or Done) along with timing information, but also other attributes such as Virtual Organization and Computing Element (CE) queue, if known. The job data stored in the RTM database is read by the enquirer every minute and converted to an XML format which is stored on the Web Server. This decouples the RTM server database from the client, removing the bottleneck caused by many clients simultaneously accessing the database. This information can be visualized through either a 2D or 3D Java based client, with live job data either being overlaid on to a 2-dimensional map of the world or rendered in 3 dimensions over a globe map using OpenGL.
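The enquirer's task of turning cached database rows into a client-facing XML snapshot can be sketched as below; the element and attribute names are assumptions for illustration, not the RTM schema:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def jobs_to_xml(rows):
    """Convert cached job records into an XML snapshot served from the
    Web Server, so clients never query the RTM database directly."""
    root = Element("jobs")
    for job_id, state, vo, ce in rows:
        job = SubElement(root, "job", id=job_id, state=state)
        SubElement(job, "vo").text = vo    # Virtual Organization
        SubElement(job, "ce").text = ce    # Computing Element queue
    return tostring(root, encoding="unicode")

xml = jobs_to_xml([("42", "Running", "cms", "ce01.example.org")])
```

Regenerating this static document once a minute is what decouples the many visualization clients from the single database behind it.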

  20. BIO-Plex Information System Concept

    NASA Technical Reports Server (NTRS)

    Jones, Harry; Boulanger, Richard; Arnold, James O. (Technical Monitor)

    1999-01-01

    This paper describes a suggested design for an integrated information system for the proposed BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) at Johnson Space Center (JSC), including distributed control systems, central control, networks, database servers, personal computers and workstations, applications software, and external communications. The system will have an open commercial computing and networking, architecture. The network will provide automatic real-time transfer of information to database server computers which perform data collection and validation. This information system will support integrated, data sharing applications for everything, from system alarms to management summaries. Most existing complex process control systems have information gaps between the different real time subsystems, between these subsystems and central controller, between the central controller and system level planning and analysis application software, and between the system level applications and management overview reporting. An integrated information system is vitally necessary as the basis for the integration of planning, scheduling, modeling, monitoring, and control, which will allow improved monitoring and control based on timely, accurate and complete data. Data describing the system configuration and the real time processes can be collected, checked and reconciled, analyzed and stored in database servers that can be accessed by all applications. The required technology is available. The only opportunity to design a distributed, nonredundant, integrated system is before it is built. Retrofit is extremely difficult and costly.

  1. An Introduction to Database Structure and Database Machines.

    ERIC Educational Resources Information Center

    Detweiler, Karen

    1984-01-01

    Enumerates principal management objectives of database management systems (data independence, quality, security, multiuser access, central control) and criteria for comparison (response time, size, flexibility, other features). Conventional database management systems, relational databases, and database machines used for backend processing are…

  2. Designing a multi-petabyte database for LSST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becla, J; Hanushevsky, A

    2005-12-21

    The 3.2 giga-pixel LSST camera will produce over half a petabyte of raw images every month. This data needs to be reduced in under a minute to produce real-time transient alerts, and then cataloged and indexed to allow efficient access and simplify further analysis. The indexed catalogs alone are expected to grow at a speed of about 600 terabytes per year. The sheer volume of data, the real-time transient alerting requirements of the LSST, and its spatio-temporal aspects require cutting-edge techniques to build an efficient data access system at reasonable cost. As currently envisioned, the system will rely on a database for catalogs and metadata. Several database systems are being evaluated to understand how they will scale and perform at these data volumes in anticipated LSST access patterns. This paper describes the LSST requirements, the challenges they impose, the data access philosophy, and the database architecture that is expected to be adopted in order to meet the data challenges.

  3. The implementation of aerial object recognition algorithm based on contour descriptor in FPGA-based on-board vision system

    NASA Astrophysics Data System (ADS)

    Babayan, Pavel; Smirnov, Sergey; Strotov, Valery

    2017-10-01

    This paper describes an aerial object recognition algorithm for on-board and stationary vision systems. The suggested algorithm is intended to recognize objects of a specific kind using a set of reference objects defined by 3D models. The proposed algorithm is based on building an outer contour descriptor. The algorithm consists of two stages: learning and recognition. The learning stage is devoted to exploring the reference objects. Using the 3D models we build a database of training images by rendering each 3D model from viewpoints evenly distributed on a sphere. The sphere point distribution follows the geosphere principle. The gathered training image set is used for calculating descriptors, which are used in the recognition stage of the algorithm. The recognition stage focuses on estimating the similarity of the captured object and the reference objects by matching an observed image descriptor against the reference object descriptors. The experimental research was performed using a set of models of aircraft of different types (airplanes, helicopters, UAVs). The proposed orientation estimation algorithm showed good accuracy in all case studies. The real-time performance of the algorithm in an FPGA-based vision system was demonstrated.
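The recognition-stage matching of an observed descriptor against the reference database can be sketched as a nearest-descriptor search. Cosine similarity here stands in for whatever matching score the authors use, and the descriptor vectors are toy values:

```python
import numpy as np

def match_object(observed, reference_db):
    """Recognition stage: return the reference object whose contour
    descriptor is most similar (cosine similarity) to the observed one."""
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(reference_db, key=lambda name: cosine(observed, reference_db[name]))

# Toy descriptor database built at the learning stage, one entry per model.
db = {"airplane":   np.array([1.0, 0.2, 0.1]),
      "helicopter": np.array([0.1, 1.0, 0.3])}
best = match_object(np.array([0.9, 0.25, 0.1]), db)
```

In the real system each model contributes many descriptors (one per rendered viewpoint), so the winning descriptor also yields an orientation estimate.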

  4. Real-Time Laser Ultrasound Tomography for Profilometry of Solids

    NASA Astrophysics Data System (ADS)

    Zarubin, V. P.; Bychkov, A. S.; Karabutov, A. A.; Simonova, V. A.; Kudinov, I. A.; Cherepetskaya, E. B.

    2018-01-01

    We studied the possibility of applying laser ultrasound tomography to profilometry of solids. The proposed approach provides high spatial resolution and efficiency, as well as profilometry of contaminated objects or objects submerged in liquids. Algorithms for the construction of tomograms and recognition of the profiles of studied objects using the NVIDIA CUDA parallel programming technology are proposed. A prototype of the real-time laser ultrasound profilometer was used to obtain the profiles of solid surfaces of revolution. The proposed method allows the real-time determination of the surface position for cylindrical objects with an approximation accuracy of up to 16 μm.

  5. Automatic labeling and characterization of objects using artificial neural networks

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Hill, Scott E.; Cromp, Robert F.

    1989-01-01

    Existing NASA-supported scientific databases are usually developed, managed and populated in a tedious, error-prone and self-limiting way in terms of what can be described in a relational Data Base Management System (DBMS). The next generation of Earth remote sensing platforms, i.e., the Earth Observing System (EOS), will be capable of generating data at a rate of over 300 Mb per second from a suite of instruments designed for different applications. What is needed is an innovative approach that creates object-oriented databases that segment, characterize, catalog and are manageable in a domain-specific context and whose contents are available interactively and in near-real-time to the user community. Described here is work in progress that utilizes an artificial neural net approach to characterize satellite imagery of undefined objects into high-level data objects. The characterized data is then dynamically allocated to an object-oriented database where it can be reviewed and assessed by a user. The definition, development, and evolution of the overall data system model are steps in the creation of an application-driven knowledge-based scientific information system.

  6. Real-time Transients from Palomar-QUEST Synoptic Sky Survey

    NASA Astrophysics Data System (ADS)

    Mahabal, Ashish A.; Drake, A.; Djorgovski, S. G.; Donalek, C.; Glikman, E.; Graham, M. J.; Williams, R.; Baltay, C.; Rabinowitz, D.; Bauer, A.; Ellman, N.; Lauer, R.; PQ Team Indiana

    2006-12-01

    The data from the driftscans of the Palomar-QUEST synoptic sky survey are now routinely processed in real time. We describe here the various components of the pipeline. We search for both variable and transient objects, including supernovae, variable AGN, GRB orphan afterglows, cataclysmic variables, interesting stellar flares, novae, and other types of variable stars, and do not exclude the possibility of entirely new types of objects or phenomena. In order to flag as many asteroids as possible we have been doing two 4-hour scans of the same area, covering 250 sq. deg and detecting over a million sources. Flagging a source as a candidate transient requires detection in at least two filters besides its absence in the fiducial sky constructed from past images. We use various software filters to eliminate instrument artifacts and false alarms due to the proximity of bright, saturated stars, which dominate the initial detection rate. This leaves up to a couple of hundred asteroids and genuine transients. Previously known asteroids are flagged through an automated comparison with a database of known asteroids, and new ones through apparent motion. In the end, we have typically 10-20 astrophysical transients remaining per night, and we are currently working on their automated classification and spectroscopic follow-up. We present preliminary results from real-time follow-up of a few candidates carried out with the Palomar 200-inch telescope as part of a pilot project. Finally we outline the plans for the much harder problem of classifying the transients more accurately for distribution through VOEventNet to astronomers interested only in specific types of transients; more details and the overall setting are covered in our VOEventNet poster (Drake et al.)
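The candidate-flagging logic described above (detection in at least two filters, absence from the fiducial sky, no match to a known asteroid) reduces to a boolean test. The positions and catalogs below are toy stand-ins for the survey's actual spatial cross-matching:

```python
def flag_transient(detection, fiducial_sky, known_asteroids):
    """A source is a candidate transient if it was detected in at least
    two filters, is absent from the fiducial (historical) sky, and does
    not coincide with a catalogued asteroid."""
    pos, filters = detection
    return (len(set(filters)) >= 2
            and pos not in fiducial_sky
            and pos not in known_asteroids)

# A source at (RA, Dec) seen in g and r, with no prior counterpart.
candidate = flag_transient(((120.5, -15.2), ["g", "r"]),
                           fiducial_sky={(120.5, -15.3)},
                           known_asteroids=set())
```

A production pipeline would match positions within an astrometric tolerance rather than by exact equality, but the gating logic is the same.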

  7. Real-time visual tracking of less textured three-dimensional objects on mobile platforms

    NASA Astrophysics Data System (ADS)

    Seo, Byung-Kuk; Park, Jungsik; Park, Hanhoon; Park, Jong-Il

    2012-12-01

    Natural feature-based approaches are still challenging for mobile applications (e.g., mobile augmented reality), because they are feasible only in limited environments such as highly textured and planar scenes/objects, and they need powerful mobile hardware for fast and reliable tracking. In many cases where conventional approaches are not effective, three-dimensional (3-D) knowledge of target scenes would be beneficial. We present a well-established framework for real-time visual tracking of less textured 3-D objects on mobile platforms. Our framework is based on model-based tracking that efficiently exploits partially known 3-D scene knowledge such as object models and a background's distinctive geometric or photometric knowledge. Moreover, we elaborate on implementation in order to make it suitable for real-time vision processing on mobile hardware. The performance of the framework is tested and evaluated on recent commercially available smartphones, and its feasibility is shown by real-time demonstrations.

  8. Video enhancement workbench: an operational real-time video image processing system

    NASA Astrophysics Data System (ADS)

    Yool, Stephen R.; Van Vactor, David L.; Smedley, Kirk G.

    1993-01-01

    Video image sequences can be exploited in real-time, giving analysts rapid access to information for military or criminal investigations. Video-rate dynamic range adjustment subdues fluctuations in image intensity, thereby assisting discrimination of small or low-contrast objects. Contrast-regulated unsharp masking enhances differentially shadowed or otherwise low-contrast image regions. Real-time removal of localized hotspots, when combined with automatic histogram equalization, may enhance resolution of objects directly adjacent. In video imagery corrupted by zero-mean noise, real-time frame averaging can assist resolution and location of small or low-contrast objects. To maximize analyst efficiency, lengthy video sequences can be screened automatically for low-frequency, high-magnitude events. Combined zoom, roam, and automatic dynamic range adjustment permit rapid analysis of facial features captured by video cameras recording crimes in progress. When trying to resolve small objects in murky seawater, stereo video places the moving imagery in an optimal setting for human interpretation.
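The frame-averaging step works because zero-mean noise cancels: averaging N frames cuts the noise standard deviation by a factor of sqrt(N). A minimal sketch with synthetic frames (the scene and noise parameters are invented for illustration):

```python
import numpy as np

def frame_average(frames):
    """Average N co-registered frames; zero-mean noise shrinks by
    sqrt(N), raising the visibility of small, low-contrast objects."""
    return np.mean(np.stack(frames), axis=0)

# A flat synthetic scene corrupted by zero-mean Gaussian noise (sigma=10).
rng = np.random.default_rng(0)
scene = np.full((32, 32), 100.0)
noisy = [scene + rng.normal(0.0, 10.0, scene.shape) for _ in range(64)]
averaged = frame_average(noisy)
```

With 64 frames the residual noise is about sigma/8, which is why the technique helps only for static or well-registered content.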

  9. Objective speech quality evaluation of real-time speech coders

    NASA Astrophysics Data System (ADS)

    Viswanathan, V. R.; Russell, W. H.; Huggins, A. W. F.

    1984-02-01

    This report describes the work performed in two areas: subjective testing of a real-time 16 kbit/s adaptive predictive coder (APC) and objective speech quality evaluation of real-time coders. The speech intelligibility of the APC coder was tested using the Diagnostic Rhyme Test (DRT), and the speech quality was tested using the Diagnostic Acceptability Measure (DAM) test, under eight operating conditions involving channel error, acoustic background noise, and tandem link with two other coders. The test results showed that the DRT and DAM scores of the APC coder equalled or exceeded the corresponding test scores of the 32 kbit/s CVSD coder. In the area of objective speech quality evaluation, the report describes the development, testing, and validation of a procedure for automatically computing several objective speech quality measures, given only the tape-recordings of the input speech and the corresponding output speech of a real-time speech coder.

  10. Software engineering aspects of real-time programming concepts

    NASA Astrophysics Data System (ADS)

    Schoitsch, Erwin

    1986-08-01

    Real-time programming is a discipline of great importance not only in process control, but also in fields like communication, office automation, interactive databases, interactive graphics and operating systems development. General concepts of concurrent programming and constructs for process-synchronization are discussed in detail. Tasking and synchronization concepts, methods of process communication, interrupt and timeout handling in systems based on semaphores, signals, conditional critical regions or on real-time languages like Concurrent PASCAL, MODULA, CHILL and ADA are explained and compared with each other. The second part deals with structuring and modularization of technical processes to build reliable and maintainable real-time systems. Software-quality and software engineering aspects are considered throughout the paper.

  11. University Real Estate Development Database: A Database-Driven Internet Research Tool

    ERIC Educational Resources Information Center

    Wiewel, Wim; Kunst, Kara

    2008-01-01

    The University Real Estate Development Database is an Internet resource developed by the University of Baltimore for the Lincoln Institute of Land Policy, containing over six hundred cases of university expansion outside of traditional campus boundaries. The University Real Estate Development database is a searchable collection of real estate…

  12. Real-time immune-inspired optimum state-of-charge trajectory estimation using upcoming route information preview and neural networks for plug-in hybrid electric vehicles fuel economy

    NASA Astrophysics Data System (ADS)

    Mozaffari, Ahmad; Vajedi, Mahyar; Azad, Nasser L.

    2015-06-01

    The main proposition of the current investigation is to develop a computational intelligence-based framework which can be used for the real-time estimation of the optimum battery state-of-charge (SOC) trajectory in plug-in hybrid electric vehicles (PHEVs). The estimated SOC trajectory can then be employed for intelligent power management to significantly improve the fuel economy of the vehicle. The devised intelligent SOC trajectory builder takes advantage of an upcoming route information preview to achieve the lowest possible total cost of electricity and fossil fuel. To reduce the complexity of real-time optimization, the authors propose an immune system-based clustering approach which allows categorizing the route information into a predefined number of segments. The intelligent real-time optimizer is also inspired by interactions in biological immune systems, and is called the artificial immune algorithm (AIA). The objective function of the optimizer is derived from a computationally efficient artificial neural network (ANN) which is trained on a database obtained from a high-fidelity model of the vehicle built in the Autonomie software. The simulation results demonstrate that the integration of the immune-inspired clustering tool, the AIA, and the ANN results in a powerful framework which can generate a near-globally-optimum SOC trajectory for the baseline vehicle, that is, the Toyota Prius PHEV. The outcomes of the current investigation prove that by taking advantage of intelligent approaches, it is possible to design a computationally efficient and powerful SOC trajectory builder for the intelligent power management of PHEVs.
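    As a rough illustration of what the clustering step buys (this is a naive fixed-boundary segmentation, not the paper's immune-inspired algorithm), a route's speed profile can be reduced to a handful of per-segment features before optimization:

    ```python
    # Reduce a per-sample route speed trace to n_segments averaged features,
    # the kind of dimensionality reduction that makes real-time optimization
    # tractable. Boundaries here are uniform; the paper clusters adaptively.
    def segment_route(speeds, n_segments):
        n = len(speeds)
        bounds = [round(i * n / n_segments) for i in range(n_segments + 1)]
        return [sum(speeds[a:b]) / (b - a) for a, b in zip(bounds, bounds[1:])]

    profile = [10, 12, 11, 30, 32, 31, 55, 54, 56]   # hypothetical km/h samples
    print(segment_route(profile, 3))  # [11.0, 31.0, 55.0]
    ```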

  13. A Comparison of Alternative Methods of Obtaining Defense Logistics Agency (DLA) Cognizance Spare Parts for Contractor Furnished Equipment (CFE) during Initial Outfitting of New Construction U.S. Navy Ships

    DTIC Science & Technology

    1991-12-01

    database, the Real Time Operation Management Information System (ROMIS), and Fitting Out Management Information System (FOMIS). These three configuration...Codes ROMIS Real Time Operation Management Information System SCLSIS Ship’s Configuration and Logistics Information System SCN Shipbuilding and

  14. Climate Signals: An On-Line Digital Platform for Mapping Climate Change Impacts in Real Time

    NASA Astrophysics Data System (ADS)

    Cutting, H.

    2016-12-01

    Climate Signals is an on-line digital platform for cataloging and mapping the impacts of climate change. The platform specifies and details the chains of connections between greenhouse gas emissions and individual climate events. Currently in open-beta release, the platform is designed to engage and serve the general public, news media, and policy-makers, particularly in real time during extreme climate events. Climate Signals consists of a curated relational database of events and their links to climate change, a mapping engine, and a gallery of climate change monitors offering real-time data. For each event in the database, an infographic engine provides a custom attribution "tree" that illustrates the connections to climate change. In addition, links to key contextual resources are aggregated and curated for each event. All event records are fully annotated with detailed source citations and corresponding hyperlinks. The system of attribution used to link events to climate change in real time is detailed here. This open-beta release is offered for public user testing and engagement. Launched in May 2016, the operation of this platform offers lessons for public engagement in climate change impacts.

  15. Real-time speech gisting for ATC applications

    NASA Astrophysics Data System (ADS)

    Dunkelberger, Kirk A.

    1995-06-01

    Command and control within the ATC environment remains primarily voice-based. Hence, automatic real-time, speaker-independent, continuous speech recognition (CSR) has many obvious applications and implied benefits to the ATC community: automated target tagging, aircraft compliance monitoring, controller training, automatic alarm disabling, display management, and many others. However, while current state-of-the-art CSR systems provide upwards of 98% word accuracy in laboratory environments, recent low-intrusion experiments in ATCT environments demonstrated less than 70% word accuracy in spite of significant investments in recognizer tuning. Acoustic channel irregularities and the varieties of controller/pilot grammar impact current CSR algorithms at their weakest points. It is shown herein, however, that real-time context- and environment-sensitive gisting can provide key command phrase recognition rates of greater than 95% using the same low-intrusion approach. The combination of real-time inexact syntactic pattern recognition techniques and a tight integration of the CSR, gisting, and ATC database accessor system components is the key to these high phrase recognition rates. A system concept for real-time gisting in the ATC context is presented herein. After establishing an application context, the discussion presents a minimal CSR technology context, then focuses on the gisting mechanism, desirable interfaces into the ATCT database environment, and data and control flow within the prototype system. Results of recent tests for a subset of the functionality are presented together with suggestions for further research.
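    The gisting idea — matching a small grammar of key command phrases rather than transcribing everything — can be sketched as follows; the phrase patterns and callsign are invented for illustration and are not the paper's grammar:

    ```python
    import re

    # Toy "gisting": scan recognizer output for a few key ATC command phrases
    # instead of requiring a fully correct transcript.
    COMMANDS = {
        "descend": re.compile(r"\b(descend and maintain|descend to)\s+(\d+)"),
        "turn":    re.compile(r"\bturn (left|right) heading\s+(\d{3})"),
        "contact": re.compile(r"\bcontact (tower|ground|departure)"),
    }

    def gist(utterance):
        """Return (command, groups) for the first key phrase found, else None."""
        text = utterance.lower()
        for name, pattern in COMMANDS.items():
            m = pattern.search(text)
            if m:
                return name, m.groups()
        return None

    print(gist("Delta two one four turn left heading 270"))
    # ('turn', ('left', '270'))
    ```

    Because only the command slots must be recognized correctly, errors elsewhere in the utterance do not break the match — which is why phrase-level rates can far exceed word-level accuracy.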

  16. Real-time detection of moving objects from moving vehicles using dense stereo and optical flow

    NASA Technical Reports Server (NTRS)

    Talukder, Ashit; Matthies, Larry

    2004-01-01

    Dynamic scene perception is very important for autonomous vehicles operating around other moving vehicles and humans. Most work on real-time object tracking from moving platforms has used sparse features or assumed flat scene structures. We have recently extended a real-time, dense stereo system to include real-time, dense optical flow, enabling more comprehensive dynamic scene analysis. We describe algorithms to robustly estimate 6-DOF robot egomotion in the presence of moving objects using dense flow and dense stereo. We then use dense stereo and egomotion estimates to identify other moving objects while the robot itself is moving. We present results showing accurate egomotion estimation and detection of moving people and vehicles under general 6-DOF motion of the robot and independently moving objects. The system runs at 18.3 Hz on a 1.4 GHz Pentium M laptop, computing 160x120 disparity maps and optical flow fields, egomotion, and moving object segmentation. We believe this is a significant step toward general unconstrained dynamic scene analysis for mobile robots, as well as toward improved position estimation where GPS is unavailable.
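    The detection step can be caricatured as a residual test: flag pixels whose measured flow disagrees with the flow predicted from the robot's egomotion and stereo depth. A toy sketch (pure Python; the threshold and flow values are illustrative, and the actual system operates on dense 160x120 fields):

    ```python
    import math

    # Pixels whose measured optical flow deviates from the egomotion-predicted
    # flow by more than a threshold are flagged as independently moving.
    def moving_object_mask(measured, predicted, thresh=1.0):
        """measured/predicted: dicts pixel -> (u, v) flow; returns moving pixels."""
        moving = set()
        for px, (mu, mv) in measured.items():
            pu, pv = predicted.get(px, (0.0, 0.0))
            if math.hypot(mu - pu, mv - pv) > thresh:
                moving.add(px)
        return moving

    # Hypothetical 4x4 image: uniform 0.5 px/frame flow induced by egomotion,
    # plus two pixels on an independently moving object.
    predicted = {(r, c): (0.5, 0.0) for r in range(4) for c in range(4)}
    measured = dict(predicted)
    measured[(1, 1)] = (3.5, 0.0)
    measured[(1, 2)] = (3.5, 0.0)
    print(sorted(moving_object_mask(measured, predicted)))  # [(1, 1), (1, 2)]
    ```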

  17. A scientific database for real-time Neutron Monitor measurements - taking Neutron Monitors into the 21st century

    NASA Astrophysics Data System (ADS)

    Steigies, Christian

    2012-07-01

    The Neutron Monitor Database project, www.nmdb.eu, was funded in 2008 and 2009 by the European Commission's 7th Framework Programme (FP7). Neutron monitors (NMs) have been in use worldwide since the International Geophysical Year (IGY) in 1957, and cosmic ray data from the IGY and the improved NM64 NMs have been distributed since that time, but a common data format existed only for data with one-hour resolution. These data were first distributed in printed books, later via the World Data Center ftp server. In the 1990s the first NM stations started to record data at higher resolutions (typically 1 minute) and publish them on their web pages. However, every NM station chose its own format, making it cumbersome to work with this distributed data. In NMDB, all European and some neighboring NM stations came together to agree on a common format for high-resolution data and made it available via a centralized database. The goal of NMDB is to make all data from all NM stations available in real time. The original NMDB network has recently been joined by the Bartol Research Institute (Newark, DE, USA), the National Autonomous University of Mexico, and the North-West University (Potchefstroom, South Africa). The data are accessible to everyone via an easy-to-use web interface, but expert users can also directly access the database to build applications like real-time space weather alerts. Even though SQL databases are used today by most web services (blogs, wikis, social media, e-commerce), the power of an SQL database has not yet been fully realized by the scientific community. In training courses, we teach how to make use of NMDB, how to join NMDB, and how to ensure data quality. The present status of the extended NMDB will be presented. The consortium welcomes further data providers to help increase the scientific contributions of the worldwide neutron monitor network to heliospheric physics and space weather.
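    The kind of direct SQL access NMDB offers expert users can be mimicked with an in-memory table; the schema, station names, and count values below are illustrative stand-ins, not NMDB's actual schema:

    ```python
    import sqlite3

    # Miniature stand-in for a station table of pressure-corrected 1-minute
    # count rates; all names and numbers here are invented for illustration.
    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE revised (
        station TEXT, start_datetime TEXT, corr_for_pressure REAL)""")
    con.executemany(
        "INSERT INTO revised VALUES (?, ?, ?)",
        [("KIEL", "2012-07-01 00:00:00", 168.2),
         ("KIEL", "2012-07-01 00:01:00", 167.9),
         ("OULU", "2012-07-01 00:00:00", 105.4)])

    # The kind of aggregate query an alert application might run:
    rows = con.execute(
        """SELECT station, AVG(corr_for_pressure)
           FROM revised WHERE station = ? GROUP BY station""",
        ("KIEL",)).fetchall()
    print([(s, round(v, 2)) for s, v in rows])  # [('KIEL', 168.05)]
    ```

    The point of the common format is exactly this: one query shape works across every contributing station.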

  18. XTCE (XML Telemetric and Command Exchange) Standard Making It Work at NASA. Can It Work For You?

    NASA Technical Reports Server (NTRS)

    Munoz-Fernandez, Michela; Smith, Danford S.; Rice, James K.; Jones, Ronald A.

    2017-01-01

    The XML Telemetric and Command Exchange (XTCE) standard is intended as a way to describe telemetry and command databases to be exchanged across centers and space agencies. XTCE usage has the potential to lead to consolidation of the Mission Operations Center (MOC) monitor and control displays for mission cross-support, reducing equipment and configuration costs, as well as a decrease in the turnaround time for telemetry and command modifications during all mission phases. The adoption of XTCE will reduce software maintenance costs by reducing the variation between our existing mission dictionaries. The main objective of this poster is to show how powerful XTCE is in terms of interoperability across centers and missions. We provide results for a use case in which two centers use their local tools to process and display the same mission telemetry in their MOCs independently of one another. In our use case we first quantified the ability of XTCE to capture the telemetry definitions of the mission using our suite of support tools (conversion, validation, and compliance measurement). The next step was to show processing and monitoring of the same telemetry in two mission centers. Once the database was converted to XTCE using our tool, the XTCE file became our primary database and was shared among the various tool chains through their XTCE importers, and was ultimately configured to ingest the telemetry stream and display or capture the telemetered information in similar ways. Summary results include the ability to take a real mission database and real mission telemetry and display them on various tools from two centers, as well as the use of freely available COTS tools.
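    A much-simplified, unofficial XTCE-style fragment shows why a shared XML dictionary travels well between centers: any tool chain can recover the same parameter definitions. The element names follow XTCE conventions but omit namespaces and most of the required structure, and the parameter names are invented:

    ```python
    import xml.etree.ElementTree as ET

    # Simplified XTCE-like telemetry dictionary fragment (illustrative only;
    # the real schema is namespaced and far richer).
    XTCE_FRAGMENT = """
    <SpaceSystem name="DemoSat">
      <TelemetryMetaData>
        <ParameterSet>
          <Parameter name="BatteryVoltage" parameterTypeRef="FloatType"/>
          <Parameter name="ModeFlag" parameterTypeRef="EnumType"/>
        </ParameterSet>
      </TelemetryMetaData>
    </SpaceSystem>
    """

    # Any center's importer can extract the same name -> type mapping:
    root = ET.fromstring(XTCE_FRAGMENT)
    params = {p.get("name"): p.get("parameterTypeRef")
              for p in root.iter("Parameter")}
    print(params)  # {'BatteryVoltage': 'FloatType', 'ModeFlag': 'EnumType'}
    ```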

  19. A method for real-time visual stimulus selection in the study of cortical object perception.

    PubMed

    Leeds, Daniel D; Tarr, Michael J

    2016-06-01

    The properties utilized by visual object perception in the mid- and high-level ventral visual pathway are poorly understood. To better establish and explore possible models of these properties, we adopt a data-driven approach in which we repeatedly interrogate neural units using functional Magnetic Resonance Imaging (fMRI) to establish each unit's image selectivity. This approach to imaging necessitates a search through a broad space of stimulus properties using a limited number of samples. To more quickly identify the complex visual features underlying human cortical object perception, we implemented a new functional magnetic resonance imaging protocol in which visual stimuli are selected in real-time based on BOLD responses to recently shown images. Two variations of this protocol were developed, one relying on natural object stimuli and a second based on synthetic object stimuli, both embedded in feature spaces based on the complex visual properties of the objects. During fMRI scanning, we continuously controlled stimulus selection in the context of a real-time search through these image spaces in order to maximize neural responses across pre-determined 1 cm3 brain regions. Elsewhere we have reported the patterns of cortical selectivity revealed by this approach (Leeds et al., 2014). In contrast, here our objective is to present more detailed methods and explore the technical and biological factors influencing the behavior of our real-time stimulus search. We observe that: 1) searches converged more reliably when exploring a more precisely parameterized space of synthetic objects; 2) real-time estimation of cortical responses to stimuli is reasonably consistent; 3) search behavior was acceptably robust to delays in stimulus displays and subject motion effects. Overall, our results indicate that real-time fMRI methods may provide a valuable platform for continuing study of localized neural selectivity, both for visual object representation and beyond.
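    The closed-loop search can be caricatured as hill climbing in a stimulus feature space: measure a response each trial, then move toward stimuli that evoke stronger responses. This toy sketch is schematic, with an invented one-dimensional feature space and response function, not the authors' algorithm:

    ```python
    # Each "trial" measures the current stimulus and its feature-space
    # neighbors, then moves to whichever evoked the largest response.
    def hill_climb(measure, start, lo, hi, n_trials):
        x = start
        for _ in range(n_trials):
            neighbors = [max(lo, x - 1), x, min(hi, x + 1)]
            x = max(neighbors, key=measure)
        return x

    # Toy "BOLD response" peaked at feature value 7 (purely illustrative):
    response = lambda x: -(x - 7) ** 2
    print(hill_climb(response, start=2, lo=0, hi=10, n_trials=8))  # 7
    ```

    The paper's observation that searches converge more reliably in a precisely parameterized synthetic space corresponds, in this caricature, to the response surface being smoother over the feature coordinates.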
Copyright © 2016 Elsevier Inc. All rights reserved.

  20. A method for real-time visual stimulus selection in the study of cortical object perception

    PubMed Central

    Leeds, Daniel D.; Tarr, Michael J.

    2016-01-01

    The properties utilized by visual object perception in the mid- and high-level ventral visual pathway are poorly understood. To better establish and explore possible models of these properties, we adopt a data-driven approach in which we repeatedly interrogate neural units using functional Magnetic Resonance Imaging (fMRI) to establish each unit’s image selectivity. This approach to imaging necessitates a search through a broad space of stimulus properties using a limited number of samples. To more quickly identify the complex visual features underlying human cortical object perception, we implemented a new functional magnetic resonance imaging protocol in which visual stimuli are selected in real-time based on BOLD responses to recently shown images. Two variations of this protocol were developed, one relying on natural object stimuli and a second based on synthetic object stimuli, both embedded in feature spaces based on the complex visual properties of the objects. During fMRI scanning, we continuously controlled stimulus selection in the context of a real-time search through these image spaces in order to maximize neural responses across predetermined 1 cm3 brain regions. Elsewhere we have reported the patterns of cortical selectivity revealed by this approach (Leeds 2014). In contrast, here our objective is to present more detailed methods and explore the technical and biological factors influencing the behavior of our real-time stimulus search. We observe that: 1) Searches converged more reliably when exploring a more precisely parameterized space of synthetic objects; 2) Real-time estimation of cortical responses to stimuli are reasonably consistent; 3) Search behavior was acceptably robust to delays in stimulus displays and subject motion effects. Overall, our results indicate that real-time fMRI methods may provide a valuable platform for continuing study of localized neural selectivity, both for visual object representation and beyond. PMID:26973168

  1. Software Design Document GT Real-Time Software Host CSCI (9B). Volume 1, Sections 1.0 - 2.12.19.2

    DTIC Science & Technology

    1991-06-01

    78 2.4.2.8 bOsig-frame-rate.c … 79 2.4.2.9 bO-database-info.c … 79 … 93 2.4.3 Ballistics Database Interaction … 94 2.4.3.1 bxbvolintc … database-disable … 417 2.12.6.11 _handle-point-lights … 418 2.12.6.12 _reset-model-pointers

  2. Real-time optical multiple object recognition and tracking system and method

    NASA Technical Reports Server (NTRS)

    Chao, Tien-Hsin (Inventor); Liu, Hua Kuang (Inventor)

    1987-01-01

    The invention relates to an apparatus and associated methods for the optical recognition and tracking of multiple objects in real time. Multiple point spatial filters are employed that pre-define the objects to be recognized at run-time. The system takes the basic technology of a Vander Lugt filter and adds a hololens. The technique replaces time-, space- and cost-intensive digital techniques. In place of multiple objects, the system can also recognize multiple orientations of a single object. This latter capability has potential for space applications where space and weight are at a premium.

  3. Automated segmentation and feature extraction of product inspection items

    NASA Astrophysics Data System (ADS)

    Talukder, Ashit; Casasent, David P.

    1997-03-01

    X-ray film and linescan images of pistachio nuts on conveyor trays for product inspection are considered. The final objective is the categorization of pistachios into good, blemished and infested nuts. A crucial step before classification is the separation of touching products and the extraction of features essential for classification. This paper addresses new detection and segmentation algorithms to isolate touching or overlapping items. These algorithms employ a new filter, a new watershed algorithm, and morphological processing to produce nutmeat-only images. Tests on a large database of x-ray film and real-time x-ray linescan images of around 2900 small, medium and large nuts showed excellent segmentation results. A new technique to detect and segment dark regions in nutmeat images is also presented and tested on approximately 300 x-ray film and approximately 300 real-time linescan x-ray images with 95-97 percent detection and correct segmentation. New algorithms are described that determine nutmeat fill ratio and locate splits in nutmeat. The techniques formulated in this paper are of general use in many different product inspection and computer vision problems.

  4. Electro-Optical Inspection For Tolerance Control As An Integral Part Of A Flexible Machining Cell

    NASA Astrophysics Data System (ADS)

    Renaud, Blaise

    1986-11-01

    Institut CERAC has been involved in optical metrology and 3-dimensional surface control for the last couple of years. Among the industrial applications considered is the on-line shape evaluation of machined parts within the manufacturing cell. The specific objective is to measure the machining errors and to compare them with the tolerances set by designers. An electro-optical sensing technique has been developed which relies on a projection Moire contouring optical method. A prototype inspection system has been designed, making use of video detection and computer image processing. Moire interferograms are interpreted, and the metrological information automatically retrieved. A structured database can be generated for subsequent data analysis and for real-time closed-loop corrective actions. A real-time kernel embedded into a synchronisation network (Petri net) for the control of concurrent processes in the Electro-Optical Inspection (EOI) station was realised and implemented in a MODULA-2 program DIN01. The prototype system for on-line automatic tolerance control taking place within a flexible machining cell is described in this paper, together with the fast-prototype synchronisation program.

  5. WWW.NMDB.EU: The real-time Neutron Monitor database

    NASA Astrophysics Data System (ADS)

    Klein, Karl-Ludwig; Steigies, Christian T.; NMDB Consortium

    2010-05-01

    The Real time database for high-resolution neutron monitor measurements (NMDB), which was supported by the 7th framework program of the European Commission, hosts data on cosmic rays in the GeV range from European and some non-European neutron monitor stations. It offers a variety of applications ranging from the representation and retrieval of cosmic ray data over solar energetic particle alerts to the calculation of ionisation doses in the atmosphere and radiation dose rates at aircraft altitudes. Furthermore the web site comprises public outreach pages in several languages and offers training material on cosmic rays for university students and researchers and engineers who want to get familiar with cosmic rays and neutron monitor measurements. This contribution presents an overview of the provided services and indications on how to access the database. Operators of other neutron monitor stations are welcome to submit their data to NMDB.

  6. National database for calculating fuel available to wildfires

    Treesearch

    Donald McKenzie; Nancy H.F. French; Roger D. Ottmar

    2012-01-01

    Wildfires are increasingly emerging as an important component of Earth system models, particularly those that involve emissions from fires and their effects on climate. Currently, there are few resources available for estimating emissions from wildfires in real time, at subcontinental scales, in a spatially consistent manner. Developing subcontinental-scale databases...

  7. Could Blobs Fuel Storage-Based Convergence between HPC and Big Data?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matri, Pierre; Alforov, Yevhen; Brandon, Alvaro

    The increasingly large data sets processed on HPC platforms raise major challenges for the underlying storage layer. A promising alternative to POSIX-IO-compliant file systems are simpler blobs (binary large objects), or object storage systems. Such systems offer lower overhead and better performance at the cost of largely unused features such as file hierarchies or permissions. Similarly, blobs are increasingly considered for replacing distributed file systems for big data analytics or as a base for storage abstractions such as key-value stores or time-series databases. This growing interest in such object storage on HPC and big data platforms raises the question: Are blobs the right level of abstraction to enable storage-based convergence between HPC and Big Data? In this paper we study the impact of blob-based storage for real-world applications on HPC and cloud environments. The results show that blob-based storage convergence is possible, leading to a significant performance improvement on both platforms.

  8. WWW.NMDB.EU: The real-time Neutron Monitor database

    NASA Astrophysics Data System (ADS)

    Klein, Karl-Ludwig; Steigies, Christian T.; Wimmer-Schweingruber, Robert F.; Kudela, Karel; Strharsky, Igor; Langer, Ronald; Usoskin, Ilya; Ibragimov, Askar; Flückiger, Erwin O.; Bütikofer, Rolf; Eroshenko, Eugenia; Belov, Anatoly; Yanke, Victor; Fuller, Nicolas; Mavromichalaki, Helen; Papaioannou, Athanasios; Sarlanis, Christos; Souvatzoglou, George; Plainaki, Christina; Gerontidou, Maria; Papailiou, Maria-Christina; Mariatos, George; Chilingaryan, Ashot; Hovsepyan, G.; Reymers, Artur; Parisi, Mario; Kryakunova, Olga; Tsepakina, Irina; Nikolayevskiy, Nikolay; Dorman, Lev; Pustil'nik, Lev; García-Población, Oscar

    The Real time database for high-resolution neutron monitor measurements(NMDB), which was supported by the 7th Framework Programme of the European Commission, hosts data on cosmic rays in the GeV range from European and some non-European neutron monitor stations. Besides real-time data and historical data over several decades in a unified format, it offers data products such as galactic cosmic ray spectra and applications including solar energetic particle alerts and the calculation of ionisation rates in the atmosphere and effective radiation dose rates at aircraft altitudes. Furthermore the web site comprises public outreach pages in several languages and offers training material on cosmic rays for university students and researchers and engineers who want to become familiar with cosmic rays and neutron monitor measurements. This contribution presents an overview of the provided services and indications on how to access the database. Operators of other neutron monitor stations are welcome to submit their data to NMDB.

  9. Airport take-off noise assessment aimed at identify responsible aircraft classes.

    PubMed

    Sanchez-Perez, Luis A; Sanchez-Fernandez, Luis P; Shaout, Adnan; Suarez-Guerra, Sergio

    2016-01-15

    Assessment of aircraft noise is an important task for today's airports in the fight against environmental noise pollution, given recent findings on the negative effects of noise exposure on human health. Noise monitoring and estimation around airports mostly use aircraft noise signals only for computing statistical indicators, and depend on additional data sources to determine required inputs such as the aircraft class responsible for the noise. Accordingly, there have been efforts to improve noise monitoring and estimation systems by creating methods that obtain more information from the aircraft noise signals themselves, especially real-time aircraft class recognition. This paper therefore proposes a multilayer neural-fuzzy model for aircraft class recognition based on take-off noise signal segmentation. It uses a fuzzy inference system to build a final response for each class p based on the aggregation of K parallel neural network outputs Op(k) with respect to Linear Predictive Coding (LPC) features extracted from K adjacent signal segments. In extensive experiments over two databases with real-time take-off noise measurements, the proposed model performs better than other methods in the literature, particularly when aircraft classes are strongly correlated with each other. A new strictly cross-checked database is introduced, including more complex classes and real-time take-off noise measurements from modern aircraft. The new model is at least 5% more accurate on the previous database and successfully classifies 87% of measurements in the new database. Copyright © 2015 Elsevier B.V. All rights reserved.
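    The aggregation of the K per-segment outputs Op(k) into a final class response can be sketched schematically; plain averaging below stands in for the paper's fuzzy inference system, and the class names and scores are invented:

    ```python
    # Combine the K per-segment network outputs O_p(k) for each aircraft
    # class p, then pick the best-supported class.
    def classify(segment_scores):
        """segment_scores: {class_name: [score per signal segment]}."""
        aggregated = {p: sum(o) / len(o) for p, o in segment_scores.items()}
        best = max(aggregated, key=aggregated.get)
        return best, aggregated

    scores = {
        "heavy_jet":  [0.7, 0.8, 0.6],   # O_p(k) over K = 3 take-off segments
        "light_prop": [0.2, 0.1, 0.3],
    }
    best, agg = classify(scores)
    print(best)  # heavy_jet
    ```

    Aggregating across segments is what makes the scheme robust to a single noisy segment, the same motivation the fuzzy layer serves in the paper.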

  10. Collaborative real-time scheduling of multiple PTZ cameras for multiple object tracking in video surveillance

    NASA Astrophysics Data System (ADS)

    Liu, Yu-Che; Huang, Chung-Lin

    2013-03-01

    This paper proposes a multi-PTZ-camera control mechanism to acquire close-up imagery of human objects in a surveillance system. The control algorithm is based on the output of multi-camera, multi-target tracking. The three main concerns of the algorithm are (1) imagery of the human object's face for biometric purposes, (2) optimal video quality of the human objects, and (3) minimum hand-off time. Here, we define an objective function based on the expected capture conditions, such as the camera-subject distance, pan and tilt angles of capture, face visibility, and others. Such an objective function serves to effectively balance the number of captures per subject and the quality of captures. In the experiments, we demonstrate the performance of the system, which operates in real time under real-world conditions on three PTZ cameras.
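    An objective function of the kind described — scoring candidate camera-subject assignments by capture conditions — might look like the following sketch; the linear form, weights, and candidate values are assumptions, not the paper's model:

    ```python
    import math

    # Score a candidate assignment: prefer close subjects, small pan/tilt
    # slews (shorter hand-off time), and visible faces.
    def capture_score(distance_m, slew_deg, face_visible,
                      w_dist=1.0, w_slew=0.5, w_face=3.0):
        """Higher is better; weights are illustrative."""
        return w_face * face_visible - w_dist * math.log1p(distance_m) \
               - w_slew * (slew_deg / 180.0)

    candidates = {
        "cam1->subjectA": capture_score(8.0, 20.0, True),
        "cam2->subjectA": capture_score(25.0, 90.0, False),
    }
    print(max(candidates, key=candidates.get))  # cam1->subjectA
    ```

    A scheduler would evaluate such scores over all feasible camera-subject pairs each cycle and assign cameras to maximize the total.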

  11. [Design and implementation of online statistical analysis function in information system of air pollution and health impact monitoring].

    PubMed

    Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun

    2018-01-01

    To implement an online statistical analysis function in the information system for air pollution and health impact monitoring, and to obtain data analysis results in real time. Descriptive statistics, time-series analysis, and multivariate regression analysis were implemented online using SQL and visualization tools on top of the database software. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each part of the data online, with interactive connections to the database; and generates export sheets that can be loaded directly into R, SAS, and SPSS online. The information system for air pollution and health impact monitoring implements its statistical analysis function online and can provide real-time analysis results to its users.
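    The online descriptive-statistics layer can be sketched with stdlib SQL; the table, column names, and values are illustrative, since the abstract does not specify the system's schema:

    ```python
    import sqlite3

    # In-memory stand-in for a pollutant-exposure table; one GROUP BY query
    # yields the per-city summary a monitoring dashboard would display.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE pm25 (city TEXT, day TEXT, ug_m3 REAL)")
    con.executemany("INSERT INTO pm25 VALUES (?, ?, ?)", [
        ("A", "2018-01-01", 35.0), ("A", "2018-01-02", 55.0),
        ("B", "2018-01-01", 80.0), ("B", "2018-01-02", 60.0)])

    summary = con.execute("""
        SELECT city, COUNT(*), AVG(ug_m3), MIN(ug_m3), MAX(ug_m3)
        FROM pm25 GROUP BY city ORDER BY city""").fetchall()
    print(summary)  # [('A', 2, 45.0, 35.0, 55.0), ('B', 2, 70.0, 60.0, 80.0)]
    ```

    The same result set could be written out as a flat sheet for import into R, SAS, or SPSS, matching the export path the abstract describes.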

  12. Mining of high utility-probability sequential patterns from uncertain databases

    PubMed Central

    Zhang, Binbin; Fournier-Viger, Philippe; Li, Ting

    2017-01-01

    High-utility sequential pattern mining (HUSPM) has become an important issue in the field of data mining. Several HUSPM algorithms have been designed to mine high-utility sequential patterns (HUSPs). They have been applied in several real-life situations such as consumer behavior analysis and event detection in sensor networks. Nonetheless, most studies on HUSPM have focused on mining HUSPs in precise data. But in real life, uncertainty is an important factor, as data is collected using various types of sensors that are more or less accurate. Hence, data collected in a real-life database can be annotated with existence probabilities. This paper presents a novel pattern mining framework called high utility-probability sequential pattern mining (HUPSPM) for mining high utility-probability sequential patterns (HUPSPs) in uncertain sequence databases. A baseline algorithm with three optional pruning strategies is presented to mine HUPSPs. Moreover, to speed up the mining process, a projection mechanism is designed to create a database projection for each processed sequence, which is smaller than the original database. Thus, the number of unpromising candidates can be greatly reduced, as well as the execution time for mining HUPSPs. Substantial experiments on both real-life and synthetic datasets show that the designed algorithm performs well in terms of runtime, number of candidates, memory usage, and scalability for different minimum utility and minimum probability thresholds. PMID:28742847
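    The probability side of the framework can be sketched under a simplifying assumption — one existence probability per sequence, whereas the paper's model is richer: the expected support of a pattern is then the summed probability of the sequences containing it as a subsequence:

    ```python
    # Expected support of a sequential pattern in a toy uncertain database.
    def contains(seq, pattern):
        """True if pattern is a (not necessarily contiguous) subsequence of seq."""
        it = iter(seq)
        return all(item in it for item in pattern)

    def expected_support(db, pattern):
        return sum(prob for seq, prob in db if contains(seq, pattern))

    db = [                       # (item sequence, existence probability)
        (["a", "b", "c"], 0.5),
        (["a", "c"],      0.25),
        (["b", "a"],      0.8),
    ]
    print(expected_support(db, ["a", "c"]))  # 0.75
    ```

    A minimum-probability threshold on this quantity is what lets unpromising candidates be pruned early.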

  13. ControlShell - A real-time software framework

    NASA Technical Reports Server (NTRS)

    Schneider, Stanley A.; Ullman, Marc A.; Chen, Vincent W.

    1991-01-01

    ControlShell is designed to enable modular design and implementation of real-time software. It is an object-oriented tool-set for real-time software system programming. It provides a series of execution and data interchange mechanisms that form a framework for building real-time applications. These mechanisms allow a component-based approach to real-time software generation and management. By defining a set of interface specifications for intermodule interaction, ControlShell provides a common platform that is the basis for real-time code development and exchange.

  14. The Deep Lens Survey: Real-time Optical Transient and Moving Object Detection

    NASA Astrophysics Data System (ADS)

    Becker, Andy; Wittman, David; Stubbs, Chris; Dell'Antonio, Ian; Loomba, Dinesh; Schommer, Robert; Tyson, J. Anthony; Margoniner, Vera; DLS Collaboration

    2001-12-01

    We report on the real-time optical transient program of the Deep Lens Survey (DLS). Meeting the DLS core science weak-lensing objective requires repeated visits to the same part of the sky, 20 visits for 63 sub-fields in 4 filters, on a 4-m telescope. These data are reduced in real-time, and differenced against each other on all available timescales. Our observing strategy is optimized to allow sensitivity to transients on several minute, one day, one month, and one year timescales. The depth of the survey allows us to detect and classify both moving and stationary transients down to ~ 25th magnitude, a relatively unconstrained region of astronomical variability space. All transients and moving objects, including asteroids, Kuiper belt (or trans-Neptunian) objects, variable stars, supernovae, 'unknown' bursts with no apparent host, orphan gamma-ray burst afterglows, as well as airplanes, are posted on the web in real-time for use by the community. We emphasize our sensitivity to detect and respond in real-time to orphan afterglows of gamma-ray bursts, and present one candidate orphan in the field of Abell 1836. See http://dls.bell-labs.com/transients.html.

  15. Real-time terrain storage generation from multiple sensors towards mobile robot operation interface.

    PubMed

    Song, Wei; Cho, Seoungjae; Xi, Yulong; Cho, Kyungeun; Um, Kyhyun

    2014-01-01

    A mobile robot mounted with multiple sensors is used to rapidly collect 3D point clouds and video images so as to allow accurate terrain modeling. In this study, we develop a real-time terrain storage generation and representation system including a nonground point database (PDB), ground mesh database (MDB), and texture database (TDB). A voxel-based flag map is proposed for incrementally registering large-scale point clouds in a terrain model in real time. We quantize the 3D point clouds into 3D grids of the flag map as a comparative table in order to remove the redundant points. We integrate the large-scale 3D point clouds into a nonground PDB and a node-based terrain mesh using the CPU. Subsequently, we program a graphics processing unit (GPU) to generate the TDB by mapping the triangles in the terrain mesh onto the captured video images. Finally, we produce a nonground voxel map and a ground textured mesh as a terrain reconstruction result. Our proposed methods were tested in an outdoor environment. Our results show that the proposed system was able to rapidly generate terrain storage and provide high resolution terrain representation for mobile mapping services and a graphical user interface between remote operators and mobile robots.
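
    The voxel-based flag map described above can be sketched as follows; the voxel size, coordinates, and class names are assumptions for illustration, not the authors' implementation. Incoming 3D points are quantized to voxel indices, and a point is kept only if its voxel has not been flagged yet, which removes redundant points during incremental registration.

```python
# Illustrative sketch of a voxel-based flag map for removing redundant
# points from incrementally registered point clouds. Voxel size and
# sample coordinates are invented.

def quantize(point, voxel_size):
    """Map a 3D point to its integer voxel index."""
    return tuple(int(c // voxel_size) for c in point)

class FlagMap:
    def __init__(self, voxel_size=0.1):
        self.voxel_size = voxel_size
        self.flags = set()    # occupied voxel indices
        self.points = []      # retained (non-redundant) points

    def register(self, cloud):
        """Add points whose voxel is not yet flagged; return count added."""
        added = 0
        for p in cloud:
            key = quantize(p, self.voxel_size)
            if key not in self.flags:
                self.flags.add(key)
                self.points.append(p)
                added += 1
        return added

fm = FlagMap(voxel_size=0.1)
scan1 = [(0.01, 0.02, 0.00), (0.05, 0.01, 0.03), (0.25, 0.0, 0.0)]
n1 = fm.register(scan1)   # first two points share a voxel -> 2 kept
scan2 = [(0.02, 0.03, 0.01), (0.55, 0.1, 0.0)]
n2 = fm.register(scan2)   # first point redundant -> 1 kept
```

    The flag map acts as the "comparative table" of the abstract: a constant-time membership check per point, which is what makes incremental registration feasible in real time.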

  16. Real-Time Terrain Storage Generation from Multiple Sensors towards Mobile Robot Operation Interface

    PubMed Central

    Cho, Seoungjae; Xi, Yulong; Cho, Kyungeun

    2014-01-01

    A mobile robot mounted with multiple sensors is used to rapidly collect 3D point clouds and video images so as to allow accurate terrain modeling. In this study, we develop a real-time terrain storage generation and representation system including a nonground point database (PDB), ground mesh database (MDB), and texture database (TDB). A voxel-based flag map is proposed for incrementally registering large-scale point clouds in a terrain model in real time. We quantize the 3D point clouds into 3D grids of the flag map as a comparative table in order to remove the redundant points. We integrate the large-scale 3D point clouds into a nonground PDB and a node-based terrain mesh using the CPU. Subsequently, we program a graphics processing unit (GPU) to generate the TDB by mapping the triangles in the terrain mesh onto the captured video images. Finally, we produce a nonground voxel map and a ground textured mesh as a terrain reconstruction result. Our proposed methods were tested in an outdoor environment. Our results show that the proposed system was able to rapidly generate terrain storage and provide high resolution terrain representation for mobile mapping services and a graphical user interface between remote operators and mobile robots. PMID:25101321

  17. Building a new space weather facility at the National Observatory of Athens

    NASA Astrophysics Data System (ADS)

    Kontogiannis, Ioannis; Belehaki, Anna; Tsiropoula, Georgia; Tsagouri, Ioanna; Anastasiadis, Anastasios; Papaioannou, Athanasios

    2016-01-01

    The PROTEAS project has been initiated at the Institute of Astronomy, Astrophysics, Space Applications and Remote Sensing (IAASARS) of the National Observatory of Athens (NOA). One of its main objectives is to provide observations, processed data and space weather nowcasting and forecasting products, designed to support the space weather research community and operators of commercial and industrial systems. The space weather products to be released by this facility, will be the result of the exploitation of ground-based, as well as space-borne observations and of model results and tools already available or under development by IAASARS researchers. The objective will be achieved through: (a) the operation of a small full-disk solar telescope to conduct regular observations of the Sun in the H-alpha line; (b) the construction of a database with near real-time solar observations which will be available to the community through a web-based facility (HELIOSERVER); (c) the development of a tool for forecasting Solar Energetic Particle (SEP) events in relation to observed solar eruptive events; (d) the upgrade of the Athens Digisonde with digital transceivers and the capability of operating in bi-static link mode and (e) the sustainable operation of the European Digital Upper Atmosphere Server (DIAS) upgraded with additional data sets integrated in an interface with the HELIOSERVER and with improved models for the real-time quantification of the effects of solar eruptive events in the ionosphere.

  18. Real Time Baseball Database

    NASA Astrophysics Data System (ADS)

    Fukue, Yasuhiro

    The author describes the system outline, features, and operation of the "Nikkan Sports Realtime Baseball Database," which was developed and operated by Nikkan Sports Shimbun, K. K. The system enables numerical data from professional baseball games to be entered as the games proceed, with the database updated in real time, just in time. Besides serving as a support tool for preparing newspapers, the data are also available to broadcasting media and to general users through NTT Dial Q2 and other channels.

  19. Accuracy assessment of the Precise Point Positioning method applied for surveys and tracking moving objects in GIS environment

    NASA Astrophysics Data System (ADS)

    Ilieva, Tamara; Gekov, Svetoslav

    2017-04-01

    The Precise Point Positioning (PPP) method gives users the opportunity to determine point locations using a single GNSS receiver. The accuracy of point locations determined by PPP is better than that of standard point positioning, owing to the precise satellite orbit and clock corrections that are developed and maintained by the International GNSS Service (IGS). The aim of our current research is the accuracy assessment of the PPP method applied for surveys and tracking moving objects in a GIS environment. The PPP data are collected using a software application that we developed beforehand, which allows different sets of attribute data for the measurements and their accuracy to be used. The results of the PPP measurements are compared directly within the geospatial database to several other sets of terrestrial data: measurements obtained by total stations, and real-time kinematic and static GNSS.
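
    The accuracy assessment described above amounts to comparing PPP-derived coordinates against reference terrestrial measurements; a minimal sketch, with invented point IDs and coordinates and a simple horizontal RMSE, might look like this.

```python
# Hedged sketch: per-point horizontal error between PPP coordinates and
# reference coordinates (e.g. from a total station), summarized as RMSE.
# All coordinates below are made-up illustration data.

import math

def horizontal_error(p, ref):
    """2D (easting, northing) distance between a PPP fix and reference."""
    return math.hypot(p[0] - ref[0], p[1] - ref[1])

def rmse(ppp, reference):
    errors = [horizontal_error(ppp[k], reference[k]) for k in ppp]
    return math.sqrt(sum(e * e for e in errors) / len(errors))

ppp = {"P1": (100.03, 200.04), "P2": (150.00, 250.05)}
ref = {"P1": (100.00, 200.00), "P2": (150.00, 250.00)}
err = rmse(ppp, ref)   # both points are 5 cm off -> RMSE 0.05 m
```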

  20. Navigation integrity monitoring and obstacle detection for enhanced-vision systems

    NASA Astrophysics Data System (ADS)

    Korn, Bernd; Doehler, Hans-Ullrich; Hecker, Peter

    2001-08-01

    Typically, Enhanced Vision (EV) systems consist of two main parts, sensor vision and synthetic vision. Synthetic vision usually generates a virtual out-the-window view using databases and accurate navigation data, e.g. provided by differential GPS (DGPS). The reliability of the synthetic vision depends highly on both the accuracy of the database used and the integrity of the navigation data. But especially in GPS-based systems, the integrity of the navigation cannot be guaranteed. Furthermore, only objects that are stored in the database can be displayed to the pilot. Consequently, unexpected obstacles are invisible, and this might cause severe problems. Therefore, additional information has to be extracted from sensor data to overcome these problems. In particular, the sensor data analysis has to identify obstacles and monitor the integrity of databases and navigation. Furthermore, if a lack of integrity arises, navigation data, e.g. the relative position of runway and aircraft, have to be extracted directly from the sensor data. The main contribution of this paper concerns the realization of these three sensor data analysis tasks within our EV system, which uses the HiVision 35 GHz MMW radar of EADS, Ulm as the primary EV sensor. For the integrity monitoring, objects extracted from radar images are registered with both database objects and objects (e.g. other aircraft) transmitted via data link. This results in a classification into known and unknown radar image objects and, consequently, in a validation of the integrity of database and navigation. Furthermore, special runway structures are searched for in the radar image where they should appear. The outcome of this runway check contributes to the integrity analysis, too. Concurrent to this investigation, a radar-image-based navigation is performed, using neither precision navigation nor detailed database information, to determine the aircraft's position relative to the runway.
The performance of our approach is demonstrated with real data acquired during extensive flight tests to several airports in Northern Germany.
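
    The registration of radar-image objects against database and data-link objects, and the resulting known/unknown classification, can be illustrated with a minimal nearest-neighbour gating sketch; the positions, units, and gate threshold are assumptions, not values from the paper.

```python
# Illustrative integrity check: radar-image objects are matched to the
# nearest database object; matches within a distance gate are "known",
# the rest "unknown" (potential obstacles or integrity failures).

import math

def classify(radar_objects, database_objects, gate=15.0):
    known, unknown = [], []
    for r in radar_objects:
        d = min(math.dist(r, db) for db in database_objects)
        (known if d <= gate else unknown).append(r)
    return known, unknown

db = [(0.0, 0.0), (100.0, 50.0)]                     # database objects
radar = [(3.0, 4.0), (102.0, 48.0), (500.0, 500.0)]  # extracted objects
known, unknown = classify(radar, db)                  # 2 known, 1 unknown
```

    A surviving "unknown" object would trigger an obstacle warning, while a systematic offset between known pairs would indicate a navigation or database integrity problem.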

  1. Geospatial database for heritage building conservation

    NASA Astrophysics Data System (ADS)

    Basir, W. N. F. W. A.; Setan, H.; Majid, Z.; Chong, A.

    2014-02-01

    Heritage buildings are icons from the past that exist in the present. Through heritage architecture, we can learn about the economic issues and social activities of the past. Nowadays, heritage buildings are under threat from natural disasters, uncertain weather, pollution and other factors. In order to preserve this heritage for future generations, recording and documenting of heritage buildings is required. With the development of information systems and data collection techniques, it is possible to create a 3D digital model. This 3D information plays an important role in recording and documenting heritage buildings. 3D modelling and virtual reality techniques have demonstrated the ability to visualize the real world in 3D, and can provide a better platform for communication and understanding of heritage buildings. Combining 3D modelling with Geographic Information System (GIS) technology will create a database that supports various analyses of spatial data in the form of a 3D model. The objectives of this research are to determine the reliability of the Terrestrial Laser Scanning (TLS) technique for data acquisition of heritage buildings and to develop a geospatial database for heritage building conservation purposes. The result of the data acquisition becomes a guideline for 3D model development. This 3D model is exported to GIS format in order to develop a database for heritage building conservation, in which the requirements of the conservation process are included. Through this research, a proper database for storing and documenting heritage building conservation data will be developed.

  2. Crisis Management Demonstration and Development Facility (CM/DDF)

    DTIC Science & Technology

    1980-12-01

    summaries were input to the DDF in real time over leased communication lines. Thus, EWAMS had an up-to-date database on which to operate. Users were...ready access to the historical record of recent U.S. crisis operations for DOD personnel and to prescribe actions in real time. The user searched the...personnel showed increasing interest in, and demand for, crisis management tools. Based on the state-of-the-art at the time, it was expected that

  3. Pragmatic precision oncology: the secondary uses of clinical tumor molecular profiling

    PubMed Central

    Thota, Ramya; Staggs, David B; Johnson, Douglas B; Warner, Jeremy L

    2016-01-01

    Background Precision oncology increasingly utilizes molecular profiling of tumors to determine treatment decisions with targeted therapeutics. The molecular profiling data is valuable in the treatment of individual patients as well as for multiple secondary uses. Objective To automatically parse, categorize, and aggregate clinical molecular profile data generated during cancer care, and to use this data to address multiple secondary use cases. Methods A system to parse, categorize and aggregate molecular profile data was created. A naïve Bayesian classifier categorized results according to clinical groups. The accuracy of these systems was validated against a published, expertly curated subset of molecular profiling data. Results Following one year of operation, 819 samples have been accurately parsed and categorized to generate a data repository of 10,620 genetic variants. The database has been used for operational, clinical trial, and discovery science research. Conclusions A real-time database of molecular profiling data is a pragmatic solution to several knowledge management problems in the practice and science of precision oncology. PMID:27026612
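
    The categorization step can be illustrated with a minimal multinomial naïve Bayes classifier; the gene names, feature tokens, and clinical categories below are invented for illustration and do not reflect the study's data or its actual classifier.

```python
# Minimal multinomial naive Bayes with Laplace smoothing, sketching how
# parsed variant tokens could be assigned to clinical groups.

import math
from collections import defaultdict

class NaiveBayes:
    def fit(self, docs, labels):
        self.label_counts = defaultdict(int)
        self.word_counts = defaultdict(lambda: defaultdict(int))
        self.vocab = set()
        for words, y in zip(docs, labels):
            self.label_counts[y] += 1
            for w in words:
                self.word_counts[y][w] += 1
                self.vocab.add(w)
        self.total = len(labels)

    def predict(self, words):
        best, best_lp = None, -math.inf
        for y, n in self.label_counts.items():
            lp = math.log(n / self.total)   # log prior
            denom = sum(self.word_counts[y].values()) + len(self.vocab)
            for w in words:                  # smoothed log likelihoods
                lp += math.log((self.word_counts[y][w] + 1) / denom)
            if lp > best_lp:
                best, best_lp = y, lp
        return best

# Invented training examples: token lists from parsed variant reports.
docs = [["BRAF", "V600E", "missense"],
        ["EGFR", "L858R", "missense"],
        ["TP53", "synonymous"],
        ["KRAS", "synonymous"]]
labels = ["actionable", "actionable", "not_actionable", "not_actionable"]

nb = NaiveBayes()
nb.fit(docs, labels)
category = nb.predict(["BRAF", "missense"])   # -> "actionable"
```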

  4. [Research and Development of A Kinect Based Virtual System for Upper Limb Rehabilitation].

    PubMed

    Ding, Weili; Zheng, Yazhuo; Su, Yuping; Li, Xiaoli; Wei, Xiuli

    2015-06-01

    We developed a rehabilitation system using virtual reality techniques and the Kinect. The system organically combines rehabilitation training with human-machine interaction (HMI) and serious games, and provides a game and motion database to meet different patients' demands. Extended interfaces to the game database are provided in two ways: personalized games can be developed with Virtools, and Flash games suitable for patients' rehabilitation can be downloaded directly from the Internet. In addition, the system provides patients with flexible interaction and an easy control mode, and also offers real-time data recording. An objective and subjective evaluation method is proposed to review the effectiveness of the rehabilitation training. According to the results of short questionnaires and the evaluation of patients' rehabilitation training, the system, compared with traditional rehabilitation, can record and analyze the training data, which is useful for making rehabilitation plans. Greater entertainment value and lower cost will increase patients' motivation, which helps to improve rehabilitation effectiveness.

  5. Real-Time Network Management

    DTIC Science & Technology

    1998-07-01

    Report No. WH97JR00-A002 Sponsored by REAL-TIME NETWORK MANAGEMENT FINAL TECHNICAL REPORT July 1998 Defense Advanced...Approved for public release; distribution unlimited. Report No. WH97JR00-A002 REAL-TIME NETWORK MANAGEMENT Synectics Corporation...2.1.2.1 WAN-class Networks 12 2.1.2.2 IEEE 802.3-class Networks 13 2.2 Task 2 - Object Modeling for Architecture 14 2.2.1 Managed Objects 14 2.2.2

  6. Real-time optical holographic tracking of multiple objects

    NASA Technical Reports Server (NTRS)

    Chao, Tien-Hsin; Liu, Hua-Kuang

    1989-01-01

    A coherent optical correlation technique for real-time simultaneous tracking of several different objects making independent movements is described, and experimental results are presented. An evaluation of this system compared with digital computing systems is made. The real-time processing capability is obtained through the use of a liquid crystal television spatial light modulator and a dichromated gelatin multifocus hololens. A coded reference beam is utilized in the separation of the output correlation plane associated with each input target so that independent tracking can be achieved.

  7. Multiple objects tracking with HOGs matching in circular windows

    NASA Astrophysics Data System (ADS)

    Miramontes-Jaramillo, Daniel; Kober, Vitaly; Díaz-Ramírez, Víctor H.

    2014-09-01

    In recent years, with the development of new technologies like smart TVs, Kinect, Google Glass, and Oculus Rift, tracking applications have become very important. When tracking uses a matching algorithm, a good prediction algorithm is required to reduce the search area for each tracked object, as well as the processing time. In this work, we analyze the performance of different tracking algorithms based on prediction and matching for real-time tracking of multiple objects. The matching algorithm used utilizes histograms of oriented gradients. It carries out matching in circular windows, and possesses rotation invariance and tolerance to viewpoint and scale changes. The proposed algorithm is implemented on a personal computer with a GPU, and its performance is analyzed in terms of processing time in real scenarios. This implementation takes advantage of current technologies and helps process video sequences in real time for tracking several objects at the same time.
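
    The matching step can be sketched as a gradient-orientation histogram accumulated over pixels inside a circular window, compared by normalized correlation. This is a simplified illustration, not the paper's algorithm: it omits the rotation-invariance machinery, and the toy image is an assumption.

```python
# Sketch: HOG descriptor restricted to a circular window, compared by
# normalized correlation. Gradients use central differences.

import math

def hog_circular(img, cx, cy, radius, bins=8):
    hist = [0.0] * bins
    h, w = len(img), len(img[0])
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if (x - cx) ** 2 + (y - cy) ** 2 > radius ** 2:
                continue                      # outside the circular window
            gx = img[y][x + 1] - img[y][x - 1]
            gy = img[y + 1][x] - img[y - 1][x]
            mag = math.hypot(gx, gy)
            if mag == 0:
                continue
            ang = math.atan2(gy, gx) % (2 * math.pi)
            hist[int(ang / (2 * math.pi) * bins) % bins] += mag
    s = sum(hist) or 1.0
    return [v / s for v in hist]              # L1-normalized histogram

def correlation(h1, h2):
    num = sum(a * b for a, b in zip(h1, h2))
    den = math.sqrt(sum(a * a for a in h1) * sum(b * b for b in h2)) or 1.0
    return num / den

img = [[x for x in range(8)] for _ in range(8)]   # horizontal ramp image
h1 = hog_circular(img, 3, 3, 2)
h2 = hog_circular(img, 4, 4, 2)
score = correlation(h1, h2)   # identical gradient structure -> 1.0
```

    In a tracker, the window centre would come from the predictor, and the correlation score would decide whether the candidate position matches the tracked object.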

  8. Creating a High-Frequency Electronic Database in the PICU: The Perpetual Patient.

    PubMed

    Brossier, David; El Taani, Redha; Sauthier, Michael; Roumeliotis, Nadia; Emeriaud, Guillaume; Jouvet, Philippe

    2018-04-01

    Our objective was to construct a prospective high-quality and high-frequency database combining patient therapeutics and clinical variables in real time, automatically fed by the information system and network architecture available through fully electronic charting in our PICU. The purpose of this article is to describe the data acquisition process from bedside to the research electronic database. Descriptive report and analysis of a prospective database. A 24-bed PICU, medical ICU, surgical ICU, and cardiac ICU in a tertiary care free-standing maternal child health center in Canada. All patients less than 18 years old were included at admission to the PICU. None. Between May 21, 2015, and December 31, 2016, 1,386 consecutive PICU stays from 1,194 patients were recorded in the database. Data were prospectively collected from admission to discharge, every 5 seconds from monitors and every 30 seconds from mechanical ventilators and infusion pumps. These data were linked to the patient's electronic medical record. The database total volume was 241 GB. The patients' median age was 2.0 years (interquartile range, 0.0-9.0). Data were available for all mechanically ventilated patients (n = 511; recorded duration, 77,678 hr), and respiratory failure was the most frequent reason for admission (n = 360). The complete pharmacologic profile was synced to the database for all PICU stays. Following this implementation, a validation phase is in process and several research projects are ongoing using this high-fidelity database. Using the existing bedside information system and network architecture of our PICU, we implemented an ongoing high-fidelity prospectively collected electronic database, preventing the continuous loss of scientific information. This offers the opportunity to develop research on clinical decision support systems and computational models of cardiorespiratory physiology, for example.
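
    The acquisition pattern described above, fixed-interval device samples appended to a per-patient store, can be sketched with an in-memory SQLite table; the schema, signal names, and values are assumptions for illustration, not the PICU system's actual design.

```python
# Hypothetical sketch of high-frequency time-series ingestion: device
# signals arrive at fixed intervals (5 s for monitors, 30 s for
# ventilators and pumps) and are appended as rows keyed by patient,
# signal name, and acquisition time.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE vitals (
        patient_id TEXT,
        signal     TEXT,       -- e.g. 'HR', 'SpO2' (invented names)
        t_epoch    INTEGER,    -- acquisition time, seconds
        value      REAL
    )
""")

def record(patient_id, signal, t_epoch, value):
    conn.execute("INSERT INTO vitals VALUES (?, ?, ?, ?)",
                 (patient_id, signal, t_epoch, value))

# Monitor stream sampled every 5 seconds.
for i, hr in enumerate([92, 94, 95, 93]):
    record("pt-001", "HR", i * 5, hr)

rows = conn.execute(
    "SELECT COUNT(*), AVG(value) FROM vitals WHERE signal='HR'"
).fetchone()   # (4, 93.5)
```

    At the stated rates, one monitored patient alone generates over 17,000 rows per signal per day, which is why the authors report a 241 GB volume for the cohort.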

  9. Real-time object detection, tracking and occlusion reasoning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Divakaran, Ajay; Yu, Qian; Tamrakar, Amir

    A system for object detection and tracking includes technologies to, among other things, detect and track moving objects, such as pedestrians and/or vehicles, in a real-world environment, handle static and dynamic occlusions, and continue tracking moving objects across the fields of view of multiple different cameras.

  10. Real-Time and Post-Processed Georeferencing for Hyperspectral Drone Remote Sensing

    NASA Astrophysics Data System (ADS)

    Oliveira, R. A.; Khoramshahi, E.; Suomalainen, J.; Hakala, T.; Viljanen, N.; Honkavaara, E.

    2018-05-01

    The use of drones and photogrammetric technologies is increasing rapidly in different applications. Currently, the drone processing workflow is in most cases based on sequential image acquisition and post-processing, but there is great interest in real-time solutions. Fast and reliable real-time drone data processing can benefit, for instance, environmental monitoring tasks in precision agriculture and in forests. Recent developments in miniaturized and low-cost inertial measurement systems and GNSS sensors, and real-time kinematic (RTK) position data, are offering new perspectives for comprehensive remote sensing applications. The combination of these sensors with light-weight and low-cost multi- or hyperspectral frame sensors on drones provides the opportunity of creating near real-time or real-time remote sensing data of a target object. We have developed a system with direct georeferencing onboard a drone, to be used together with hyperspectral frame cameras in real-time remote sensing applications. The objective of this study is to evaluate the real-time georeferencing by comparing it with post-processing solutions. Experimental data sets were captured in agricultural and forested test sites using the system. The accuracy of the onboard georeferencing data was better than 0.5 m. The results showed that real-time remote sensing is promising and feasible in both test sites.

  11. A low-cost mobile adaptive tracking system for chronic pulmonary patients in home environment.

    PubMed

    Işik, Ali Hakan; Güler, Inan; Sener, Melahat Uzel

    2013-01-01

    The main objective of this study is to present a real-time mobile adaptive tracking system for patients diagnosed with diseases such as asthma or chronic obstructive pulmonary disease, together with the results of its application at home. The main role of the system is to support and track, in real time, chronic pulmonary patients who are comfortable in their home environment. It is not intended to replace the doctor, regular treatment, or diagnosis. In this study, the Java 2 Micro Edition-based system is integrated with a portable spirometer, a smartphone, extensible markup language-based Web services, a Web server, and Web pages for visualizing pulmonary function test results. The Bluetooth(®) (Bluetooth SIG, Kirkland, WA) virtual serial port protocol is used to obtain the test results from the spirometer. General packet radio service, wireless local area network, or third-generation wireless networks are used to send the test results from the smartphone to the remote database. The system provides real-time classification of test results with the back-propagation artificial neural network algorithm on a mobile smartphone. It also generates appropriate short message service-based notifications and sends all data to the Web server. In this study, the test results of 486 patients, obtained from Atatürk Chest Diseases and Thoracic Surgery Training and Research Hospital in Ankara, Turkey, are used as the training and test set for the algorithm. The algorithm has 98.7% accuracy, 97.83% specificity, 97.63% sensitivity, and 0.946 correlation values. The results show that the system is cheap (900 Euros) and reliable. The developed real-time system provides improvement in classification accuracy and facilitates the tracking of chronic pulmonary patients.
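
    The reported accuracy, sensitivity, and specificity all follow from a confusion matrix over the classified test results; a sketch with invented counts (not the study's 486-patient data) shows the computation.

```python
# How accuracy, sensitivity and specificity are derived from a binary
# confusion matrix. The counts below are illustrative only.

def metrics(tp, tn, fp, fn):
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)      # true-positive rate
    specificity = tn / (tn + fp)      # true-negative rate
    return accuracy, sensitivity, specificity

acc, sens, spec = metrics(tp=45, tn=50, fp=2, fn=3)
# acc = 95/100, sens = 45/48, spec = 50/52
```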

  12. GIS-project: geodynamic globe for global monitoring of geological processes

    NASA Astrophysics Data System (ADS)

    Ryakhovsky, V.; Rundquist, D.; Gatinsky, Yu.; Chesalova, E.

    2003-04-01

    A multilayer geodynamic globe at the scale 1:10,000,000 was created at the end of the nineties in the GIS Center of the Vernadsky Museum. A special software-and-hardware complex, with a set of multitarget object-oriented databases, was developed for its visualization. The globe includes separate thematic covers represented by digital sets of spatial geological, geochemical, and geophysical information (maps, schemes, profiles, stratigraphic columns, arranged databases, etc.). At present the largest databases included in the globe program concern petrochemical and isotopic data on magmatic rocks of the World Ocean and the large and superlarge mineral deposits. Software by the Environmental Systems Research Institute (ESRI), USA, as well as the ArcScan vectorizer, was used for digitizing the covers and adapting the databases (ARC/INFO 7.0, 8.0). All layers of the geoinformational project were obtained by scanning separate objects and transferring them to real geographic coordinates in an equidistant conic projection. The covers were then projected onto plane degree-system geographic coordinates. Attributive databases were formed for each thematic layer, and in the last stage all covers were combined into a single information system. Separate digital covers represent mathematical descriptions of geological objects and the relations between them, such as the Earth's altimetry, active fault systems, seismicity, etc. Principles of cartographic generalization were taken into consideration during cover compilation, with projection and coordinate systems precisely matching the given scale. The globe allows us to form, in interactive mode, mutually coordinated object-oriented databases and the thematic covers directly connected with them. These can be extended to the whole Earth and near-Earth space, and to the best known parts of the divergent and convergent boundaries of the lithosphere plates.
Such covers and time series reflect in diagram form the total combination and dynamics of data on the geological structure, geophysical fields, seismicity, geomagnetism, composition of rock complexes, and metallogeny of different areas of the Earth's surface. They give us the possibility to scale, detail, and develop 3D spatial visualization. The information filling the covers can be replenished with new data, both in the existing databases and in newly formed ones. The integrated analysis of the data allows us to define more precisely our ideas on regularities in the development of lithosphere and mantle inhomogeneities using some original technologies. It also enables us to work out 3D digital models for the geodynamic development of tectonic zones at convergent and divergent plate boundaries, with the purpose of integrated monitoring of mineral resources and establishing correlations between seismicity, magmatic activity, and metallogeny in time-spatial coordinates. The created multifold geoinformation system makes it possible to execute an integral analysis of geoinformation flows in interactive mode and, in particular, to establish regularities in the time-spatial distribution and dynamics of the main structural units of the lithosphere, as well as to illuminate the connection between stages of their development and the epochs of large and superlarge mineral deposit formation. We are now trying to use the system to predict large oil and gas concentrations in the main sedimentary basins. The work was supported by RFBR (grants 93-07-14680, 96-07-89499, 99-07-90030, 00-15-98535, 02-07-90140) and MTC.

  13. GeneSCF: a real-time based functional enrichment tool with support for multiple organisms.

    PubMed

    Subhash, Santhilal; Kanduri, Chandrasekhar

    2016-09-13

    High-throughput technologies such as ChIP-sequencing, RNA-sequencing, DNA sequencing and quantitative metabolomics generate a huge volume of data. Researchers often rely on functional enrichment tools to interpret the biological significance of the affected genes from these high-throughput studies. However, currently available functional enrichment tools need to be updated frequently to adapt to new entries from the functional database repositories. Hence there is a need for a simplified tool that can perform functional enrichment analysis using updated information directly from source databases such as KEGG, Reactome or Gene Ontology. In this study, we focused on designing a command-line tool called GeneSCF (Gene Set Clustering based on Functional annotations) that can predict the functionally relevant biological information for a set of genes in a real-time updated manner. It is designed to handle information from more than 4000 organisms from freely available prominent functional databases like KEGG, Reactome and Gene Ontology. We successfully employed our tool on two published datasets to predict the biologically relevant functional information. The core features of this tool were tested on Linux machines without the need to install additional dependencies. GeneSCF is more reliable than other enrichment tools because of its ability to use reference functional databases in real time to perform enrichment analysis. It is easy to integrate with other pipelines available for downstream analysis of high-throughput data. More importantly, GeneSCF can run multiple gene lists simultaneously on different organisms, thereby saving time for the users. Since the tool is designed to be ready-to-use, there is no need for any complex compilation and installation procedures.
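
    The core of a functional enrichment analysis of the kind GeneSCF performs can be sketched as a hypergeometric overlap test per gene set; the pathways, gene names, and genome size below are invented for illustration, and this is not GeneSCF's actual code.

```python
# Sketch: for each gene set (e.g. a KEGG pathway), count the overlap
# with the input gene list and compute a hypergeometric p-value for
# observing at least that much overlap by chance.

from math import comb

def hypergeom_pvalue(k, K, n, N):
    """P(overlap >= k) when drawing n genes from N, with K in the set."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

def enrich(gene_list, gene_sets, genome_size):
    genes = set(gene_list)
    results = {}
    for name, members in gene_sets.items():
        k = len(genes & set(members))
        results[name] = (k, hypergeom_pvalue(k, len(members),
                                             len(genes), genome_size))
    return results

pathways = {"cell_cycle": ["CDK1", "CCNB1", "TP53", "RB1"],
            "glycolysis": ["HK2", "PFKM", "PKM"]}
hits = ["CDK1", "TP53", "RB1", "MYC"]      # invented input gene list
res = enrich(hits, pathways, genome_size=20)
```

    In a real tool the gene sets would be fetched live from the source database (this is GeneSCF's distinguishing feature) and the p-values corrected for multiple testing.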

  14. High-performance hardware implementation of a parallel database search engine for real-time peptide mass fingerprinting

    PubMed Central

    Bogdán, István A.; Rivers, Jenny; Beynon, Robert J.; Coca, Daniel

    2008-01-01

    Motivation: Peptide mass fingerprinting (PMF) is a method for protein identification in which a protein is fragmented by a defined cleavage protocol (usually proteolysis with trypsin), and the masses of these products constitute a ‘fingerprint’ that can be searched against theoretical fingerprints of all known proteins. In the first stage of PMF, the raw mass spectrometric data are processed to generate a peptide mass list. In the second stage this protein fingerprint is used to search a database of known proteins for the best protein match. Although current software solutions can typically deliver a match in a relatively short time, a system that can find a match in real time could change the way in which PMF is deployed and presented. In a paper published earlier we presented a hardware design of a raw mass spectra processor that, when implemented in Field Programmable Gate Array (FPGA) hardware, achieves almost 170-fold speed gain relative to a conventional software implementation running on a dual processor server. In this article we present a complementary hardware realization of a parallel database search engine that, when running on a Xilinx Virtex 2 FPGA at 100 MHz, delivers 1800-fold speed-up compared with an equivalent C software routine, running on a 3.06 GHz Xeon workstation. The inherent scalability of the design means that processing speed can be multiplied by deploying the design on multiple FPGAs. The database search processor and the mass spectra processor, running on a reconfigurable computing platform, provide a complete real-time PMF protein identification solution. Contact: d.coca@sheffield.ac.uk PMID:18453553
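
    The second PMF stage, searching the peptide mass list against theoretical fingerprints, can be sketched as tolerance-window mass matching with a simple match-count score; the masses, protein names, and tolerance are illustrative assumptions, far simpler than the parallel FPGA engine described.

```python
# Sketch of PMF database search: count observed peptide masses that
# match a protein's theoretical fingerprint within a tolerance, and
# rank proteins by match count.

def count_matches(observed, theoretical, tol=0.2):
    return sum(any(abs(m - t) <= tol for t in theoretical)
               for m in observed)

def search(observed, database, tol=0.2):
    scores = {name: count_matches(observed, masses, tol)
              for name, masses in database.items()}
    return max(scores, key=scores.get), scores

# Invented theoretical fingerprints (peptide masses in Da).
database = {
    "protein_A": [842.5, 1045.6, 1234.7, 2211.1],
    "protein_B": [900.4, 1102.5, 1500.8],
}
observed = [842.6, 1234.6, 2211.0]
best, scores = search(observed, database)   # protein_A, 3 matches
```

    The hardware engine parallelizes exactly this inner comparison across database entries, which is why the speed-up scales with the number of FPGAs deployed.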

  15. Realistic Real-Time Outdoor Rendering in Augmented Reality

    PubMed Central

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering techniques for outdoor Augmented Reality (AR) have been an attractive topic for the last two decades, judging by the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR rendering systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and both virtual and real objects. A few realistic rendering techniques have been designed to overcome this obstacle, most of which concern non-real-time rendering. However, the problem still remains, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering that takes into account the interaction between sky colours and objects in AR systems, with respect to shadows, at any specific location, date, and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, sky colour is generated with respect to the position of the sun. The second step involves the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique to integrate sky colours and shadows, through their effects on virtual objects in the AR system, is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realistic AR systems. PMID:25268480

  16. Realistic real-time outdoor rendering in augmented reality.

    PubMed

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

    Realistic rendering of outdoor Augmented Reality (AR) has been an attractive topic over the last two decades, judging by the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and both virtual and real objects. A few realistic rendering techniques have been designed to overcome this obstacle, but most are limited to non-real-time rendering, so the problem remains, especially in outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering that accounts for the interaction between sky colours and objects in AR systems, with respect to shadows at any specific location, date and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, sky colour is generated with respect to the position of the sun. Second, shadows are generated using the Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps) algorithm. Lastly, a technique to integrate sky colours and shadows, through their effects on virtual objects in the AR system, is introduced. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, addressing a key problem of realistic AR systems.

  17. James Webb Space Telescope: Supporting Multiple Ground System Transitions in One Year

    NASA Technical Reports Server (NTRS)

    Detter, Ryan; Fatig, Curtis; Steck, Jane

    2004-01-01

    Ideas, requirements, and concepts developed during the very early phases of mission design often conflict with the reality of the situation once the prime contracts are awarded. This happened for the James Webb Space Telescope (JWST) as well. The high-level requirement of a common real-time ground system for both the Integration and Test (I&T) and Operations phases of the mission is meant to reduce the cost and time needed later in mission development for re-certification of databases, command and control systems, scripts, display pages, etc. In the case of JWST, the early Phase A flight software development needed a real-time ground system and database before the spacecraft prime contractor had been selected. To compound the situation, the very low-level requirements for the real-time ground system were not well defined. These two situations caused the initial real-time ground system to be swapped for a system previously used by the flight software development team. To meet the high-level requirement, a third ground system was then selected based on the prime spacecraft contractor's needs and JWST Project decisions. The JWST ground system team has responded to each of these changes successfully. The lessons learned from each transition have not only made each subsequent transition smoother, but have also resolved issues earlier in mission development than would normally occur.

  18. Real-Time Integrity Monitoring of Stored Geo-Spatial Data Using Forward-Looking Remote Sensing Technology

    NASA Technical Reports Server (NTRS)

    Young, Steven D.; Harrah, Steven D.; deHaag, Maarten Uijt

    2002-01-01

    Terrain Awareness and Warning Systems (TAWS) and Synthetic Vision Systems (SVS) provide pilots with displays of stored geo-spatial data (e.g. terrain, obstacles, and/or features). As comprehensive validation is impractical, these databases typically have no quantifiable level of integrity. This lack of a quantifiable integrity level is one of the constraints that has limited certification and operational approval of TAWS/SVS to "advisory-only" systems for civil aviation. Previous work demonstrated the feasibility of using a real-time monitor to bound database integrity by using downward-looking remote sensing technology (i.e. radar altimeters). This paper describes an extension of the integrity monitor concept to include a forward-looking sensor to cover additional classes of terrain database faults and to reduce the exposure time associated with integrity threats. An operational concept is presented that combines established feature extraction techniques with a statistical assessment of similarity measures between the sensed and stored features using principles from classical detection theory. Finally, an implementation is presented that uses existing commercial-off-the-shelf weather radar sensor technology.

  19. Integrated radiologist's workstation enabling the radiologist as an effective clinical consultant

    NASA Astrophysics Data System (ADS)

    McEnery, Kevin W.; Suitor, Charles T.; Hildebrand, Stan; Downs, Rebecca; Thompson, Stephen K.; Shepard, S. Jeff

    2002-05-01

    Since February 2000, radiologists at the M. D. Anderson Cancer Center have accessed clinical information through an internally developed radiologist's clinical interpretation workstation called RadStation. This project provides a fully integrated digital dictation workstation with clinical data review. RadStation enables the radiologist as an effective clinical consultant, with access to pertinent sources of clinical information at the time of dictation. Data sources include not only prior radiology reports from the radiology information system (RIS) but also pathology data, laboratory data, histories and physicals, clinic notes, and operative reports. With integrated clinical information access, a radiologist's interpretation not only comments on morphologic findings but can also evaluate study findings in the context of the pertinent clinical presentation and history. Image access is enabled through the integration of an enterprise image archive (Stentor, San Francisco). Database integration is achieved by a combination of real-time HL7 messaging and queries to SQL-based legacy databases. A three-tier system architecture accommodates expanding access to additional databases, including the real-time patient schedule as well as patient medications and allergies.

  20. Real-Time, Interactive Sonic Boom Display

    NASA Technical Reports Server (NTRS)

    Haering, Jr., Edward A. (Inventor); Plotkin, Kenneth J. (Inventor)

    2012-01-01

    The present invention is an improved real-time, interactive sonic boom display for aircraft. Using physical properties obtained via various sensors and databases, the invention determines, in real time, sonic boom impact locations and intensities for aircraft traveling at supersonic speeds. The information is provided to the pilot via a display that lists a selectable set of maneuvers available to mitigate sonic boom issues. Upon selection of a maneuver, the expected result of the maneuver is displayed, and the pilot may either proceed with the maneuver or provide new data to the system in order to calculate a different maneuver.

  1. LASER BIOLOGY AND MEDICINE: Optoacoustic laser monitoring of cooling and freezing of tissues

    NASA Astrophysics Data System (ADS)

    Larin, Kirill V.; Larina, I. V.; Motamedi, M.; Esenaliev, R. O.

    2002-11-01

    Real-time monitoring of the cooling and freezing of tissues, cells, and other biological objects with high spatial and temporal resolution, which is necessary for the selective destruction of cancerous and benign tumours during cryotherapy, as well as for preventing damage to the structure and functioning of biological objects in cryobiology, is considered. The optoacoustic method, based on the measurement and analysis of acoustic waves induced by short laser pulses, is proposed for monitoring the cooling and freezing of tissue. The effect of cooling and freezing on the amplitude and time profile of acoustic signals generated in real tissues and in a model object is studied. The experimental results indicate that the optoacoustic laser technique can be used for real-time monitoring of the cooling and freezing of biological objects with submillimetre spatial resolution and high contrast.

  2. The Danish Microbiology Database (MiBa) 2010 to 2013.

    PubMed

    Voldstedlund, M; Haarh, M; Mølbak, K

    2014-01-09

    The Danish Microbiology Database (MiBa) is a national database that receives copies of reports from all Danish departments of clinical microbiology. The database was launched in order to provide healthcare personnel with nationwide access to microbiology reports and to enable real-time surveillance of communicable diseases and microorganisms. The establishment and management of MiBa has been a collaborative process among stakeholders, and the present paper summarises lessons learned from this nationwide endeavour which may be relevant to similar projects in the rapidly changing landscape of health informatics.

  3. The accuracy of real-time procedure coding by theatre nurses: a comparison with the central national system.

    PubMed

    Maclean, Donald; Younes, Hakim Ben; Forrest, Margaret; Towers, Hazel K

    2012-03-01

    Accurate and timely clinical data are required for clinical and organisational purposes and are especially important for patient management, audit of surgical performance and the electronic health record. The recent introduction of computerised theatre management systems has enabled real-time (point-of-care) operative procedure coding by clinical staff; however, the accuracy of these data is unknown. The aim of this Scottish study was to compare the accuracy of theatre nurses' real-time coding on the local theatre management system with the central Scottish Morbidity Record (SMR01). Paired procedural codes were recorded, qualitatively graded for precision and compared (n = 1038). In this study, real-time, point-of-care coding by theatre nurses resulted in significant coding errors compared with the central SMR01 database. Improved collaboration between full-time coders and clinical staff using computerised decision support systems is suggested.

  4. NOAA Propagation Database Value in Tsunami Forecast Guidance

    NASA Astrophysics Data System (ADS)

    Eble, M. C.; Wright, L. M.

    2016-02-01

    The National Oceanic and Atmospheric Administration (NOAA) Center for Tsunami Research (NCTR) has developed a tsunami forecasting capability that combines a graphical user interface with data ingestion and numerical models to produce estimates of tsunami wave arrival times, amplitudes, current or water flow rates, and flooding at specific coastal communities. The capability integrates several key components: deep-ocean observations of tsunamis in real time, a basin-wide pre-computed propagation database of water levels and flow velocities based on potential pre-defined seismic unit sources, an inversion or fitting algorithm to refine the tsunami source based on the observations during an event, and tsunami forecast models. As tsunami waves propagate across the ocean, observations from the deep ocean are automatically ingested into the application in real time to better define the source of the tsunami itself. Since the passage of tsunami waves over a deep-ocean reporting site is not immediate, we explore the value of the NOAA propagation database in providing placeholder forecasts in advance of deep-ocean observations. The propagation database consists of water elevations and flow velocities pre-computed for 50 × 100 km unit sources in a continuous series along all known ocean subduction zones. The 2011 Japan Tohoku tsunami is presented as the case study.
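    The inversion step described above — refining the tsunami source by fitting deep-ocean observations with the pre-computed unit-source waveforms — is, at its core, a linear least-squares problem. A minimal sketch with synthetic waveforms (the two `unit_sources` columns and their weights are invented for illustration, not taken from the NOAA database):

    ```python
    import numpy as np

    # Hypothetical pre-computed unit-source waveforms at a deep-ocean gauge
    # (one column per unit source; rows are time samples)
    t = np.linspace(0.0, 1.0, 50)
    unit_sources = np.column_stack([np.sin(2 * np.pi * t), np.sin(4 * np.pi * t)])

    # Synthetic "observation": 2.0 x unit source 1 + 0.5 x unit source 2
    observed = unit_sources @ np.array([2.0, 0.5])

    # Inversion: find the unit-source weights that best fit the observation
    weights, *_ = np.linalg.lstsq(unit_sources, observed, rcond=None)
    ```

    The fitted weights then scale the corresponding pre-computed propagation results to produce the forecast, which is why placeholder forecasts can be issued even before the fit is refined by incoming observations.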

  5. Real time reconstruction of quasiperiodic multi parameter physiological signals

    NASA Astrophysics Data System (ADS)

    Ganeshapillai, Gartheeban; Guttag, John

    2012-12-01

    A modern intensive care unit (ICU) has automated analysis systems that depend on continuous uninterrupted real time monitoring of physiological signals such as electrocardiogram (ECG), arterial blood pressure (ABP), and photo-plethysmogram (PPG). These signals are often corrupted by noise, artifacts, and missing data. We present an automated learning framework for real time reconstruction of corrupted multi-parameter nonstationary quasiperiodic physiological signals. The key idea is to learn a patient-specific model of the relationships between signals, and then reconstruct corrupted segments using the information available in correlated signals. We evaluated our method on MIT-BIH arrhythmia data, a two-channel ECG dataset with many clinically significant arrhythmias, and on the CinC challenge 2010 data, a multi-parameter dataset containing ECG, ABP, and PPG. For each, we evaluated both the residual distance between the original signals and the reconstructed signals, and the performance of a heartbeat classifier on a reconstructed ECG signal. At an SNR of 0 dB, the average residual distance on the CinC data was roughly 3% of the energy in the signal, and on the arrhythmia database it was roughly 16%. The difference is attributable to the large amount of diversity in the arrhythmia database. Remarkably, despite the relatively high residual difference, the classification accuracy on the arrhythmia database was still 98%, indicating that our method restored the physiologically important aspects of the signal.
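    The reconstruction idea above — learn a patient-specific relationship between channels during clean periods, then fill a corrupted segment from a correlated channel — can be illustrated with a simple linear cross-channel model. The signals here are synthetic stand-ins; the actual method learns far richer models of nonstationary quasiperiodic signals:

    ```python
    import numpy as np

    t = np.linspace(0, 10, 500)
    abp = np.sin(t)            # stand-in for a clean, correlated ABP channel
    ecg = 0.8 * abp + 0.1      # stand-in ECG, linearly related to ABP

    # Training: learn the cross-channel relationship from an uncorrupted window
    slope, intercept = np.polyfit(abp[:300], ecg[:300], 1)

    # Reconstruction: fill a "corrupted" ECG segment from the intact ABP channel
    reconstructed = slope * abp[300:] + intercept
    residual = float(np.max(np.abs(reconstructed - ecg[300:])))
    ```

    The paper's residual-distance evaluation compares the reconstructed segment against the original in just this way, normalised by the signal energy.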

  6. Real-time Geographic Information System (GIS) for Monitoring the Area of Potential Water Level Using Rule Based System

    NASA Astrophysics Data System (ADS)

    Anugrah, Wirdah; Suryono; Suseno, Jatmiko Endro

    2018-02-01

    Management of water resources based on a Geographic Information System (GIS) can provide substantial benefits for water availability planning. Monitoring potential water levels is needed in the development, agriculture, and energy sectors, among others. In this research, a web-based water resource information system is developed using the real-time GIS concept for monitoring the potential water level of an area, applying a rule-based system method. The GIS consists of hardware, software, and a database. Following the web-based GIS architecture, this study uses a set of networked computers running the Apache web server and the PHP programming language with a MySQL database. An ultrasonic wireless sensor system is used as the water level data input, which also includes time and geographic location information. The GIS maps the five sensor locations and processes the data through a rule-based system to determine the potential water level of the area. Water level monitoring results can be displayed on thematic maps by overlaying more than one layer, as tables generated from the database, and as graphs based on the time of events and the water level values.
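    The rule-based determination of potential water level can be illustrated with a small threshold cascade. The thresholds and category names below are hypothetical, since the abstract does not specify the actual rule base:

    ```python
    def water_level_status(level_cm):
        """Classify a sensor reading into a potential-water-level category.

        Hypothetical thresholds; a real rule base would be site-specific.
        """
        if level_cm < 50:
            return "normal"
        elif level_cm < 100:
            return "alert"
        else:
            return "danger"

    # Example: classify readings from the five sensor locations
    statuses = [water_level_status(v) for v in (30, 75, 150, 45, 110)]
    ```

    Each classified reading, together with its timestamp and coordinates, would then drive the thematic-map overlay described above.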

  7. Preliminary investigation of submerged aquatic vegetation mapping using hyperspectral remote sensing.

    PubMed

    William, David J; Rybicki, Nancy B; Lombana, Alfonso V; O'Brien, Tim M; Gomez, Richard B

    2003-01-01

    The use of airborne hyperspectral remote sensing imagery for automated mapping of submerged aquatic vegetation (SAV) in the tidal Potomac River was investigated for near-real-time resource assessment and monitoring. Airborne hyperspectral imagery and field spectrometer measurements were obtained in October 2000. A spectral library database containing selected ground-based and airborne sensor spectra was developed for use in image processing. The spectral library is used to automate the processing of hyperspectral imagery for potential real-time material identification and mapping. Field-based spectra were compared to the airborne imagery using the database to identify and map two species of SAV (Myriophyllum spicatum and Vallisneria americana). The overall accuracy of the vegetation maps derived from hyperspectral imagery was determined by comparison to a product that combined aerial photography and field-based sampling at the end of the SAV growing season. The algorithms and databases developed in this study will be useful with current and forthcoming space-based hyperspectral remote sensing systems.
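    Spectral-library matching of the kind described is commonly implemented by comparing each pixel spectrum against the library entries with a similarity measure such as the spectral angle. This is a minimal sketch with hypothetical four-band reflectance spectra, not the study's actual library or algorithm:

    ```python
    import math

    def spectral_angle(a, b):
        # Spectral angle (radians) between a pixel spectrum and a library spectrum;
        # smaller angle means more similar spectral shape
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

    # Hypothetical reflectance spectra for the two SAV species
    library = {
        "M_spicatum": [0.10, 0.15, 0.40, 0.35],
        "V_americana": [0.12, 0.30, 0.25, 0.20],
    }

    pixel = [0.11, 0.16, 0.39, 0.36]
    best = min(library, key=lambda s: spectral_angle(pixel, library[s]))
    ```

    Running this per pixel over the whole image yields the automated species map that is then validated against the photography-plus-field-sampling product.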

  8. Dual-dimensional microscopy: real-time in vivo three-dimensional observation method using high-resolution light-field microscopy and light-field display.

    PubMed

    Kim, Jonghyun; Moon, Seokil; Jeong, Youngmo; Jang, Changwon; Kim, Youngmin; Lee, Byoungho

    2018-06-01

    Here, we present dual-dimensional microscopy that captures both two-dimensional (2-D) and light-field images of an in-vivo sample simultaneously, synthesizes an upsampled light-field image in real time, and visualizes it with a computational light-field display system in real time. Compared with conventional light-field microscopy, the additional 2-D image greatly enhances the lateral resolution at the native object plane up to the diffraction limit and compensates for the image degradation at the native object plane. The whole process from capturing to displaying is done in real time with the parallel computation algorithm, which enables the observation of the sample's three-dimensional (3-D) movement and direct interaction with the in-vivo sample. We demonstrate a real-time 3-D interactive experiment with Caenorhabditis elegans. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  9. Real-time, in-situ detection of volatile profiles for the prevention of aflatoxin fungal contamination in pistachios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bond, Tiziana C.; Chang, Allan; Zhou, Jenny

    The objective of this project is to provide a proof of concept that will demonstrate the feasibility of a Raman in-situ warning system for detecting and removing developing fungal hot spots from pistachio stockpiles and transit containers, thus decreasing human health risks and product loss as a result of contamination. The proposed project has the following goals: to calibrate the Raman fingerprinting of biomarkers, standalone and in premixed samples; to build a database with the vibrational profiles distinctive of the signatures of the bouquet emitted by contaminated pistachios; and to test the improvement in the detection of the detectable markers with enhanced Raman on a small probe.

  10. Mining moving object trajectories in location-based services for spatio-temporal database update

    NASA Astrophysics Data System (ADS)

    Guo, Danhuai; Cui, Weihong

    2008-10-01

    Advances in wireless transmission and mobile technology applied to Location-based Services (LBS) flood us with vast amounts of moving-object data. The data gathered from the position sensors of mobile phones, PDAs, or vehicles hide interesting and valuable knowledge and describe the behavior of moving objects. The correlation between the temporal movement patterns of moving objects and the spatio-temporal attributes of geo-features has been ignored, and the value of spatio-temporal trajectory data has not been fully exploited either. Urban expansion and frequent town plan changes produce a large amount of outdated or imprecise data in the spatial databases of LBS, and these data cannot be updated in a timely and efficient way by manual processing. In this paper we introduce a data mining approach to movement pattern extraction for moving objects, build a model to describe the relationship between the movement patterns of LBS mobile objects and their environment, and propose a spatio-temporal database update strategy for LBS databases based on spatio-temporal trajectory mining. Experimental evaluation reveals excellent performance of the proposed model and strategy. Our original contributions include the formulation of a model of the interaction between a trajectory and its environment, the design of a spatio-temporal database update strategy based on moving-object data mining, and an experimental application of spatio-temporal database update by mining moving-object trajectories.

  11. Building a Lego wall: Sequential action selection.

    PubMed

    Arnold, Amy; Wing, Alan M; Rotshtein, Pia

    2017-05-01

    The present study draws together two distinct lines of enquiry into the selection and control of sequential action: motor sequence production and action selection in everyday tasks. Participants were asked to build 2 different Lego walls. The walls were designed to have hierarchical structures with shared and dissociated colors and spatial components. Participants built 1 wall at a time, under low and high load cognitive states. Selection times for correctly completed trials were measured using 3-dimensional motion tracking. The paradigm enabled precise measurement of the timing of actions, while using real objects to create an end product. The experiment demonstrated that action selection was slowed at decision boundary points, relative to boundaries where no between-wall decision was required. Decision points also affected selection time prior to the actual selection window. Dual-task conditions increased selection errors. Errors mostly occurred at boundaries between chunks and especially when these required decisions. The data support hierarchical control of sequenced behavior. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  12. Proceedings of the Real-Time Systems Engineering Workshop

    DTIC Science & Technology

    2001-08-01

    real-time systems engineering. The workshop was held as part of the SEI Symposium in Washington, DC, during September 2000. The objective of the workshop was to identify key issues and obtain feedback from attendees concerning real-time systems engineering and interoperability. This report summarizes the workshop in terms of foundation, management, and technical topics, and it contains a discussion related to developing a community of interest for real-time systems

  13. Educational treasures in radiology: a free online program for Radiology Boards preparation.

    PubMed

    Talanow, Roland

    2011-01-01

    An objective tool is desired that optimally prepares examinees for the Radiology boards examination. Such a program should present pertinent radiological content and simulations of the kind expected in the real examination. Many countries require written boards examinations for Radiology certification eligibility, yet no objective measure exists to tell whether an examinee is ready to pass the exam. Time pressure and the computer environment may also be unfamiliar to examinees, and traditional preparation lectures don't simulate the "real" Radiology exam because they don't provide that special environment of multiple-choice questions and timing. This online program consists of four parts. The entry section allows users to create questions with additional fields for comprehensive information. Sections include Pediatrics/Mammography/GI/IR/Nucs/Thoracic/Musculoskeletal/GU/Neuro/Ultrasound/Cardiac/OB/GYN and Miscellaneous. Experienced radiologists and educators evaluate and release/delete these entries in the administrator section. In the exam section, users can create (un)timed customized exams for individual needs and learning pace. Exams can either include all sections or only specific sections, to gear learning towards areas with weaker performance. Comprehensive statistics unveil the user's strengths and weaknesses, helping to focus on "weak" areas. In the search section, a comprehensive search and review can be performed across the entire database by keywords/topics or within specific sections only. www.RadiologyBoards.org is a new working concept for Radiology boards preparation that detects and addresses the examinee's weaknesses and ultimately increases the examinee's confidence level for the final exam. It is beneficial for Radiology residents and also for board-certified radiologists wishing to refresh and maintain their radiological knowledge.

  14. Osteoporosis therapies: evidence from health-care databases and observational population studies.

    PubMed

    Silverman, Stuart L

    2010-11-01

    Osteoporosis is a well-recognized disease with severe consequences if left untreated. Randomized controlled trials are the most rigorous method for determining the efficacy and safety of therapies. Nevertheless, randomized controlled trials underrepresent the real-world patient population and are costly in both time and money. Modern technology has enabled researchers to use information gathered from large health-care or medical-claims databases to assess the practical utilization of available therapies in appropriate patients. Observational database studies lack randomization but, if carefully designed and successfully completed, can provide valuable information that complements results obtained from randomized controlled trials and extends our knowledge to real-world clinical patients. Randomized controlled trials comparing fracture outcomes among osteoporosis therapies are difficult to perform. In this regard, large observational database studies could be useful in identifying clinically important differences among therapeutic options. Database studies can also provide important information with regard to osteoporosis prevalence, health economics, and compliance and persistence with treatment. This article describes the strengths and limitations of both randomized controlled trials and observational database studies, discusses considerations for observational study design, and reviews a wealth of information generated by database studies in the field of osteoporosis.

  15. Real-time detection of natural objects using AM-coded spectral matching imager

    NASA Astrophysics Data System (ADS)

    Kimachi, Akira

    2004-12-01

    This paper describes the application of the amplitude-modulation (AM)-coded spectral matching imager (SMI) to real-time detection of natural objects such as human beings, animals, vegetables, or geological objects or phenomena, which are much more liable to change with time than artificial products while often exhibiting characteristic spectral functions associated with specific activity states. The AM-SMI produces a correlation between the spectral functions of the object and a reference at each pixel of the correlation image sensor (CIS) in every frame, based on orthogonal amplitude modulation (AM) of each spectral channel and simultaneous demodulation of all channels on the CIS. This principle makes the SMI suitable for monitoring the dynamic behavior of natural objects in real time by looking at a particular spectral reflectance or transmittance function. A twelve-channel multispectral light source was developed with improved spatial uniformity of spectral irradiance compared to a previous one. Experimental results of spectral matching imaging of human skin and vegetable leaves are demonstrated, as well as a preliminary feasibility test of imaging a reflective object using a test color chart.

  16. Real-time detection of natural objects using AM-coded spectral matching imager

    NASA Astrophysics Data System (ADS)

    Kimachi, Akira

    2005-01-01

    This paper describes the application of the amplitude-modulation (AM)-coded spectral matching imager (SMI) to real-time detection of natural objects such as human beings, animals, vegetables, or geological objects or phenomena, which are much more liable to change with time than artificial products while often exhibiting characteristic spectral functions associated with specific activity states. The AM-SMI produces a correlation between the spectral functions of the object and a reference at each pixel of the correlation image sensor (CIS) in every frame, based on orthogonal amplitude modulation (AM) of each spectral channel and simultaneous demodulation of all channels on the CIS. This principle makes the SMI suitable for monitoring the dynamic behavior of natural objects in real time by looking at a particular spectral reflectance or transmittance function. A twelve-channel multispectral light source was developed with improved spatial uniformity of spectral irradiance compared to a previous one. Experimental results of spectral matching imaging of human skin and vegetable leaves are demonstrated, as well as a preliminary feasibility test of imaging a reflective object using a test color chart.

  17. Real three-dimensional objects: effects on mental rotation.

    PubMed

    Felix, Michael C; Parker, Joshua D; Lee, Charles; Gabriel, Kara I

    2011-08-01

    The current experiment investigated real three-dimensional (3D) objects with regard to performance on a mental rotation task and whether the appearance of sex differences may be mediated by experiences with spatially related activities. 40 men and 40 women were presented with alternating timed trials consisting of real-3D objects or two-dimensional illustrations of 3D objects. Sex differences in spatially related activities did not significantly influence the finding that men outperformed women on mental rotation of either stimulus type. However, on measures related to spatial activities, self-reported proficiency using maps correlated positively with performance only on trials with illustrations whereas self-reported proficiency using GPS correlated negatively with performance regardless of stimulus dimensionality. Findings may be interpreted as suggesting that rotating real-3D objects utilizes distinct but overlapping spatial skills compared to rotating two-dimensional representations of 3D objects, and real-3D objects can enhance mental rotation performance.

  18. A Real-Time Method to Estimate Speed of Object Based on Object Detection and Optical Flow Calculation

    NASA Astrophysics Data System (ADS)

    Liu, Kaizhan; Ye, Yunming; Li, Xutao; Li, Yan

    2018-04-01

    In recent years, Convolutional Neural Networks (CNNs) have been widely used in the computer vision field and have made great progress in tasks such as object detection and classification. Moreover, combining CNNs, that is, making multiple CNN frameworks work synchronously and share their output information, can produce useful information that none of them can provide singly. Here we introduce a method to estimate the speed of an object in real time by combining two CNNs: YOLOv2 and FlowNet. In every frame, YOLOv2 provides the object size, location, and type, while FlowNet provides the optical flow of the whole image. On one hand, the object size and location help select the object's region of the optical-flow image, from which the average optical flow of each object is calculated. On the other hand, the object type and size help establish the relationship between optical flow and true speed by means of optical theory and prior knowledge. With these two key pieces of information, the speed of the object can be estimated. This method estimates the speed of multiple objects in real time using only a normal camera, even when the camera itself is moving, with an error that is acceptable in most application fields such as autonomous driving or robot vision.
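    The per-object speed computation the abstract outlines — average the optical flow inside the detector's bounding box, then convert pixels to metres using the object's known real-world size — can be sketched as follows. The `estimate_speed` function, its parameters, and the uniform synthetic flow field are illustrative assumptions, not code from the paper:

    ```python
    import numpy as np

    def estimate_speed(flow, bbox, real_width_m, fps):
        """Average the optical flow inside a detection box and scale to m/s."""
        x, y, w, h = bbox
        # Mean flow vector (pixels/frame) over the object's bounding box
        mean_flow = flow[y:y + h, x:x + w].reshape(-1, 2).mean(axis=0)
        pixel_speed = float(np.hypot(mean_flow[0], mean_flow[1]))
        # Pixel-to-metre scale from the object's known real-world width
        metres_per_pixel = real_width_m / w
        return pixel_speed * metres_per_pixel * fps

    # Uniform synthetic flow of 2 px/frame in x and y over a 100x100 image
    flow = np.full((100, 100, 2), 2.0)
    speed = estimate_speed(flow, (10, 10, 20, 20), real_width_m=2.0, fps=30)
    ```

    In the full method, the detector's object type supplies the prior for `real_width_m`, which is what lets a single camera recover metric speed.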

  19. Rule-based deduplication of article records from bibliographic databases.

    PubMed

    Jiang, Yu; Lin, Can; Meng, Weiyi; Yu, Clement; Cohen, Aaron M; Smalheiser, Neil R

    2014-01-01

    We recently designed and deployed a metasearch engine, Metta, that sends queries and retrieves search results from five leading biomedical databases: PubMed, EMBASE, CINAHL, PsycINFO and the Cochrane Central Register of Controlled Trials. Because many articles are indexed in more than one of these databases, it is desirable to deduplicate the retrieved article records. This is not a trivial problem because data fields contain a lot of missing and erroneous entries, and because certain types of information are recorded differently (and inconsistently) in the different databases. The present report describes our rule-based method for deduplicating article records across databases and includes an open-source script module that can be deployed freely. Metta was designed to satisfy the particular needs of people who are writing systematic reviews in evidence-based medicine. These users want the highest possible recall in retrieval, so it is important to err on the side of not deduplicating any records that refer to distinct articles, and it is important to perform deduplication online in real time. Our deduplication module is designed with these constraints in mind. Articles that share the same publication year are compared sequentially on parameters including PubMed ID number, digital object identifier, journal name, article title and author list, using text approximation techniques. In a review of Metta searches carried out by public users, we found that the deduplication module was more effective at identifying duplicates than EndNote without making any erroneous assignments.
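    The rule cascade described — exact identifier matches first, then fuzzy title comparison among records sharing a publication year — can be sketched as follows. This is a simplified illustration of the idea, not Metta's deployed module:

    ```python
    from difflib import SequenceMatcher

    def is_duplicate(a, b, title_threshold=0.9):
        """Rule cascade: exact identifier matches first, then fuzzy title match."""
        if a.get("pmid") and a.get("pmid") == b.get("pmid"):
            return True
        if a.get("doi") and a.get("doi") == b.get("doi"):
            return True
        if a.get("year") == b.get("year"):
            # Text-approximation step: compare titles case-insensitively
            ratio = SequenceMatcher(None, a["title"].lower(), b["title"].lower()).ratio()
            return ratio >= title_threshold
        return False

    r1 = {"pmid": "24434031", "doi": None, "year": 2014,
          "title": "Rule-based deduplication of article records"}
    r2 = {"pmid": None, "doi": None, "year": 2014,
          "title": "Rule-Based Deduplication of Article Records"}
    ```

    Setting a high title threshold errs on the side of not merging distinct articles, matching the stated recall-first requirement of systematic reviewers.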

  20. Rule-based deduplication of article records from bibliographic databases

    PubMed Central

    Jiang, Yu; Lin, Can; Meng, Weiyi; Yu, Clement; Cohen, Aaron M.; Smalheiser, Neil R.

    2014-01-01

    We recently designed and deployed a metasearch engine, Metta, that sends queries and retrieves search results from five leading biomedical databases: PubMed, EMBASE, CINAHL, PsycINFO and the Cochrane Central Register of Controlled Trials. Because many articles are indexed in more than one of these databases, it is desirable to deduplicate the retrieved article records. This is not a trivial problem because data fields contain a lot of missing and erroneous entries, and because certain types of information are recorded differently (and inconsistently) in the different databases. The present report describes our rule-based method for deduplicating article records across databases and includes an open-source script module that can be deployed freely. Metta was designed to satisfy the particular needs of people who are writing systematic reviews in evidence-based medicine. These users want the highest possible recall in retrieval, so it is important to err on the side of not deduplicating any records that refer to distinct articles, and it is important to perform deduplication online in real time. Our deduplication module is designed with these constraints in mind. Articles that share the same publication year are compared sequentially on parameters including PubMed ID number, digital object identifier, journal name, article title and author list, using text approximation techniques. In a review of Metta searches carried out by public users, we found that the deduplication module was more effective at identifying duplicates than EndNote without making any erroneous assignments. PMID:24434031

  1. Implementation of the ground level enhancement alert software at NMDB database

    NASA Astrophysics Data System (ADS)

    Mavromichalaki, Helen; Souvatzoglou, George; Sarlanis, Christos; Mariatos, George; Papaioannou, Athanasios; Belov, Anatoly; Eroshenko, Eugenia; Yanke, Victor; NMDB Team

    2010-11-01

    The European Commission is supporting the real-time database for high-resolution neutron monitor measurements (NMDB) as an e-Infrastructures project in the Seventh Framework Programme in the Capacities section. The realization of the NMDB will provide the opportunity for several applications most of which will be implemented in real-time. An important application will be the establishment of an Alert signal when dangerous solar particle events are heading to the Earth, resulting into a ground level enhancement (GLE) registered by neutron monitors (NMs). The cosmic ray community has been occupied with the question of establishing such an Alert for many years and recently several groups succeeded in creating a proper algorithm capable of detecting space weather threats in an off-line mode. A lot of original work has been done to this direction and every group working in this field performed routine runs for all GLE cases, resulting into statistical analyses of GLE events. The next step was to make this algorithm as accurate as possible and most importantly, working in real-time. This was achieved when, during the last GLE observed so far, a real-time GLE Alert signal was produced. In this work, the steps of this procedure as well as the functionality of this algorithm for both the scientific community and users are being discussed. Nevertheless, the transition of the Alert algorithm to the NMDB is also being discussed.
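    The alert logic described above can be illustrated as a coincidence test across stations: flag a GLE only when several neutron monitors simultaneously show a count-rate increase above k standard deviations of their quiet-time baseline. The thresholds and data layout below are assumptions for illustration, not the NMDB team's actual algorithm.

```python
def gle_alert(stations, k=3.0, min_stations=3):
    """Return True when enough stations exceed a k-sigma rate increase.

    stations: list of (baseline_mean, baseline_std, current_rate) tuples,
    one per neutron monitor. k and min_stations are illustrative values.
    """
    exceeding = sum(
        1 for mean, std, rate in stations
        if std > 0 and (rate - mean) / std >= k
    )
    # Requiring a multi-station coincidence suppresses single-detector noise
    return exceeding >= min_stations
```

Requiring coincident increases at several independent monitors is what makes a real-time alert robust against local instrumental glitches.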

  2. Second-Tier Database for Ecosystem Focus, 2003-2004 Annual Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    University of Washington, Columbia Basin Research, DART Project Staff,

    2004-12-01

    The Second-Tier Database for Ecosystem Focus (Contract 00004124) provides direct and timely public access to Columbia Basin environmental, operational, fishery and riverine data resources for federal, state, public and private entities essential to sound operational and resource management. The database also assists with juvenile and adult mainstem passage modeling supporting federal decisions affecting the operation of the FCRPS. The Second-Tier Database known as Data Access in Real Time (DART) integrates public data for effective access, consideration and application. DART also provides analysis tools and performance measures for evaluating the condition of Columbia Basin salmonid stocks. These services are critical to BPA's implementation of its fish and wildlife responsibilities under the Endangered Species Act (ESA).

  3. A PCR primer bank for quantitative gene expression analysis.

    PubMed

    Wang, Xiaowei; Seed, Brian

    2003-12-15

    Although gene expression profiling by microarray analysis is a useful tool for assessing global levels of transcriptional activity, variability associated with the data sets usually requires that observed differences be validated by some other method, such as real-time quantitative polymerase chain reaction (real-time PCR). However, non-specific amplification of non-target genes is frequently observed in the latter, confounding the analysis in approximately 40% of real-time PCR attempts when primer-specific labels are not used. Here we present an experimentally validated algorithm for the identification of transcript-specific PCR primers on a genomic scale that can be applied to real-time PCR with sequence-independent detection methods. An online database, PrimerBank, has been created for researchers to retrieve primer information for their genes of interest. PrimerBank currently contains 147 404 primers encompassing most known human and mouse genes. The primer design algorithm has been tested by conventional and real-time PCR for a subset of 112 primer pairs with a success rate of 98.2%.

  4. Design of real-time communication system for image recognition based colony picking instrument

    NASA Astrophysics Data System (ADS)

    Wang, Qun; Zhang, Rongfu; Yan, Hua; Wu, Huamin

    2017-11-01

    In order to achieve automated observation and picking of monoclonal colonies, an overall design and realization of a real-time communication system based on a high-throughput monoclonal auto-picking instrument is proposed. The real-time communication system is composed of a PC-PLC communication system and a Central Control Computer (CCC)-PLC communication system. A set of dedicated short-range communication protocols between the PC and PLC was developed based on the RS232 synchronous serial communication method. Furthermore, the system uses a SQL SERVER database to realize data interaction between the PC and CCC. Moreover, the communication between the CCC and PC adopts Socket Ethernet communication based on the TCP/IP protocol; the TCP full-duplex data channel ensures real-time data exchange as well as improving system reliability and security. We tested the communication system using specially developed test software. The test results show that the system can realize communication in an efficient, safe and stable way between the PLC, PC and CCC, keeping real-time control of the PLC and colony information collection.
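    A loopback sketch of the TCP full-duplex channel described above, using Python's standard socket module. The message format and function names are placeholders, not the instrument's actual protocol.

```python
import socket
import threading

def start_echo_server(host="127.0.0.1", port=0):
    """Minimal TCP server that echoes each message back, standing in for
    the CCC-PC data channel (a sketch, not the real instrument protocol)."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))  # port 0: let the OS pick a free port
    srv.listen(1)

    def serve():
        conn, _ = srv.accept()
        with conn:
            while True:
                data = conn.recv(1024)
                if not data:
                    break
                conn.sendall(data)  # full duplex: reply on the same channel
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()  # (host, chosen_port)

def send_status(addr, payload: bytes) -> bytes:
    """Send one status message and wait for the acknowledgement."""
    with socket.create_connection(addr) as cli:
        cli.sendall(payload)
        return cli.recv(1024)
```

Because TCP is connection-oriented and acknowledged, a simple request/acknowledge exchange like this already gives the reliability the abstract attributes to the CCC-PC link.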

  5. Quasi-real-time telemedical checkup system for x-ray examination of UGI tract based on high-speed network

    NASA Astrophysics Data System (ADS)

    Sakano, Toshikazu; Yamaguchi, Takahiro; Fujii, Tatsuya; Okumura, Akira; Furukawa, Isao; Ono, Sadayasu; Suzuki, Junji; Ando, Yutaka; Kohda, Ehiichi; Sugino, Yoshinori; Okada, Yoshiyuki; Amaki, Sachi

    2000-05-01

    We constructed a high-speed medical information network testbed, which is one of the largest testbeds in Japan, and applied it to practical medical checkups for the first time. The constructed testbed, which we call IMPACT, consists of a Super-High Definition Imaging system, a video conferencing system, a remote database system, and a 6 - 135 Mbps ATM network. The interconnected facilities include the School of Medicine at Keio University, a company's clinic, and an NTT R&D center, all in and around Tokyo. We applied IMPACT to the mass screening of the upper gastrointestinal (UGI) tract at the clinic. All 5419 radiographic images acquired at the clinic for 523 employees were digitized (2048 X 1698 X 12 bits) and transferred to a remote database at NTT. We then picked about 50 images from five patients and sent them to nine radiological specialists at Keio University. The processing, which includes film digitization, image data transfer, and database registration, took 574 seconds per patient on average. The average reading time at Keio University was 207 seconds. The overall processing time was estimated to be 781 seconds per patient. From these experimental results, we conclude that quasi-real-time tele-medical checkups are possible with our prototype system.

  6. Ubiquitous Mobile Educational Data Management by Teachers, Students and Parents: Does Technology Change School-Family Communication and Parental Involvement?

    ERIC Educational Resources Information Center

    Blau, Ina; Hameiri, Mira

    2017-01-01

    Digital educational data management has become an integral part of school practices. Accessing school database by teachers, students, and parents from mobile devices promotes data-driven educational interactions based on real-time information. This paper analyses mobile access of educational database in a large sample of 429 schools during an…

  7. Massive Scale Cyber Traffic Analysis: A Driver for Graph Database Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joslyn, Cliff A.; Choudhury, S.; Haglin, David J.

    2013-06-19

    We describe the significance and prominence of network traffic analysis (TA) as a graph- and network-theoretical domain for advancing research in graph database systems. TA involves observing and analyzing the connections between clients, servers, hosts, and actors within IP networks, both at particular times and as extended over time. Towards that end, NetFlow (or more generically, IPFLOW) data are available from routers and servers which summarize coherent groups of IP packets flowing through the network. IPFLOW databases are routinely interrogated statistically and visualized for suspicious patterns. But the ability to cast IPFLOW data as a massive graph and query it interactively, in order to e.g. identify connectivity patterns, is less well advanced, due to a number of factors including scaling, and their hybrid nature combining graph connectivity and quantitative attributes. In this paper, we outline requirements and opportunities for graph-structured IPFLOW analytics based on our experience with real IPFLOW databases. Specifically, we describe real use cases from the security domain, cast them as graph patterns, show how to express them in two graph-oriented query languages, SPARQL and Datalog, and use these examples to motivate a new class of "hybrid" graph-relational systems.
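    One connectivity pattern of the kind mentioned above, a source fanning out to many distinct destinations (a scan-like behaviour), can be expressed directly over flow records. This plain-Python sketch stands in for the SPARQL/Datalog formulations discussed in the paper; the record layout is an assumption.

```python
from collections import defaultdict

def fanout_sources(flows, min_targets=3):
    """Find source IPs contacting many distinct destinations in
    NetFlow-style records.

    flows: iterable of (src_ip, dst_ip) pairs
    min_targets: illustrative fan-out threshold for a 'scan' pattern
    """
    targets = defaultdict(set)
    for src, dst in flows:
        targets[src].add(dst)  # graph view: src -> set of neighbours
    return {src for src, dsts in targets.items() if len(dsts) >= min_targets}
```

In graph terms this is just an out-degree filter on the flow graph; the hybrid graph-relational systems the paper motivates would let the same pattern carry quantitative attributes (bytes, ports, times) alongside the connectivity.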

  8. Study on real-time images compounded using spatial light modulator

    NASA Astrophysics Data System (ADS)

    Xu, Jin; Chen, Zhebo; Ni, Xuxiang; Lu, Zukang

    2007-01-01

    Image compounding technology is often used in film and film production. Conventionally, image compounding relies on image-processing algorithms: the useful objects, details, background or other elements are first extracted from the images, and then all of this information is compounded into one image. With this method the film system needs a powerful processor, because the processing is very complex, and the compounded image is obtained only after some delay. In this paper, we introduce a new method of real-time image compounding with which compounding can be performed at the same time as the movie is shot. The system is made up of two camera lenses, a spatial light modulator array and an image sensor. The spatial light modulator can be a liquid crystal display (LCD), liquid crystal on silicon (LCoS), a thin-film-transistor liquid crystal display (TFT-LCD), a Deformable Micro-mirror Device (DMD), and so on. First, one camera lens, which we call the first imaging lens, images the object onto the spatial light modulator's panel. Second, an image is output to the panel of the spatial light modulator, so that the image of the object and the image output by the modulator are spatially compounded on the panel. Third, the other camera lens, which we call the second imaging lens, images the compounded image onto the image sensor. After these three steps, the compounded image is obtained from the image sensor. Because the spatial light modulator can output images continuously, the compounding is continuous as well, and the procedure is completed in real time. To put a real object into an invented background, the invented background scene is output on the spatial light modulator while the real object is imaged by the first imaging lens; the compounded images are then obtained from the image sensor in real time.
In the same way, to put a real background behind an invented object, the invented object is output on the spatial light modulator while the real background is imaged by the first imaging lens, and the compounded images are again obtained from the image sensor in real time. Since most spatial light modulators can only modulate light intensity, only black-and-white images can be compounded when a single panel without a colour filter is used; to obtain colour compounded images, a system like a three-panel spatial-light-modulator projector is needed. The paper gives the framework of the system's optical design. In all experiments, the spatial light modulator used was liquid crystal on silicon (LCoS). At the end of the paper, some original pictures and compounded pictures are given. Although the system has a few shortcomings, we can conclude that compounding images with this system incurs no delay from mathematical compounding processing; it is a truly real-time image compounding system.
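    The optical compounding described above has a simple digital analogue: the scene imaged onto the modulator panel is attenuated pixel-by-pixel by the pattern the panel displays. The sketch below models only the intensity-modulation case (a single panel without a colour filter) and is an illustration of the principle, not the authors' optical system.

```python
import numpy as np

def optical_compound(scene, slm_pattern):
    """Digital analogue of single-panel optical compounding.

    scene, slm_pattern: grayscale intensity arrays in [0, 1].
    A modulator can only attenuate light, so the compounded image is the
    elementwise product of the scene and the displayed pattern.
    """
    return np.clip(scene * slm_pattern, 0.0, 1.0)
```

A fully transparent pixel (pattern value 1) passes the scene unchanged, an opaque pixel (value 0) replaces it with black, and intermediate values blend, which is why the compounding happens with no computational delay in the optical system.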

  9. Software architecture for a distributed real-time system in Ada, with application to telerobotics

    NASA Technical Reports Server (NTRS)

    Olsen, Douglas R.; Messiora, Steve; Leake, Stephen

    1992-01-01

    The architecture and software design methodology presented are described in the context of a telerobotic application in Ada, specifically the Engineering Test Bed (ETB), which was developed to support the Flight Telerobotic Servicer (FTS) Program at GSFC. However, the nature of the architecture is such that it has applications to any multiprocessor distributed real-time system. The ETB architecture, which is a derivation of the NASA/NBS Standard Reference Model (NASREM), defines a hierarchy for representing a telerobot system. Within this hierarchy, a module is a logical entity consisting of the software associated with a set of related hardware components in the robot system. A module comprises submodules, which are cyclically executing processes that each perform a specific set of functions. The submodules in a module can run on separate processors. The submodules in the system communicate via command/status (C/S) interface channels, which are used to send commands down and relay status back up the system hierarchy. Submodules also communicate via setpoint data links, which are used to transfer control data from one submodule to another. A submodule invokes submodule algorithms (SMAs) to perform algorithmic operations. Data that describe or model a physical component of the system are stored as objects in the World Model (WM). The WM is a system-wide distributed database that is accessible to submodules in all modules of the system for creating, reading, and writing objects.
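    The command/status pattern described above can be sketched as a cyclically executing submodule that reads commands from the level above, acts on them, and relays status back up each cycle. The class and queue names are illustrative stand-ins, not the ETB's Ada interfaces.

```python
from queue import Queue, Empty

class Submodule:
    """Sketch of one cyclically executing submodule with C/S channels."""

    def __init__(self):
        self.commands = Queue()   # C/S channel: commands flowing down
        self.status = Queue()     # C/S channel: status flowing up
        self.setpoint = 0.0       # value passed on via a setpoint data link

    def cycle(self):
        """One pass of the cyclic executive."""
        try:
            cmd = self.commands.get_nowait()
        except Empty:
            cmd = None            # no new command this cycle
        if cmd is not None:
            self.setpoint = cmd   # act on the command
        # Relay status back up the hierarchy every cycle
        self.status.put(("ok", self.setpoint))
```

Because the status report is emitted every cycle whether or not a command arrived, the level above always has a fresh view of the submodule, which is the essence of the C/S channel design.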

  10. Remote sensing and GIS integration: Towards intelligent imagery within a spatial data infrastructure

    NASA Astrophysics Data System (ADS)

    Abdelrahim, Mohamed Mahmoud Hosny

    2001-11-01

    In this research, an "Intelligent Imagery System Prototype" (IISP) was developed. IISP is an integration tool that facilitates the environment for active, direct, and on-the-fly usage of high resolution imagery, internally linked to hidden GIS vector layers, to query the real world phenomena and, consequently, to perform exploratory types of spatial analysis based on a clear/undisturbed image scene. The IISP was designed and implemented using the software components approach to verify the hypothesis that a fully rectified, partially rectified, or even unrectified digital image can be internally linked to a variety of different hidden vector databases/layers covering the end user area of interest, and consequently may be reliably used directly as a base for "on-the-fly" querying of real-world phenomena and for performing exploratory types of spatial analysis. Within IISP, differentially rectified, partially rectified (namely, IKONOS GEOCARTERRA(TM)), and unrectified imagery (namely, scanned aerial photographs and captured video frames) were investigated. The system was designed to handle four types of spatial functions, namely, pointing query, polygon/line-based image query, database query, and buffering. The system was developed using ESRI MapObjects 2.0a as the core spatial component within Visual Basic 6.0. When used to perform the pre-defined spatial queries using different combinations of image and vector data, the IISP provided the same results as those obtained by querying pre-processed vector layers even when the image used was not orthorectified and the vector layers had different parameters. In addition, the real-time pixel location orthorectification technique developed and presented within the IKONOS GEOCARTERRA(TM) case provided a horizontal accuracy (RMSE) of +/- 2.75 metres. This accuracy is very close to the accuracy level obtained when purchasing the orthorectified IKONOS PRECISION products (RMSE of +/- 1.9 metres). 
The latter cost approximately four times as much as the IKONOS GEOCARTERRA(TM) products. The developed IISP is a step closer towards the direct and active involvement of high-resolution remote sensing imagery in querying the real world and performing exploratory types of spatial analysis. (Abstract shortened by UMI.)

  11. A Distributed Web-based Solution for Ionospheric Model Real-time Management, Monitoring, and Short-term Prediction

    NASA Astrophysics Data System (ADS)

    Kulchitsky, A.; Maurits, S.; Watkins, B.

    2006-12-01

    With the widespread availability of the Internet today, many people can monitor various scientific research activities. It is important to accommodate this interest by providing on-line access to dynamic and illustrative Web resources that demonstrate different aspects of ongoing research. It is especially important to explain these research activities to high school and undergraduate students, thereby providing more information for making decisions concerning their future studies. Such Web resources are also important for clarifying scientific research for the general public, in order to achieve better awareness of research progress in various fields. Particularly rewarding is the dissemination of information about ongoing projects within universities and research centers to their local communities. The benefits of this type of scientific outreach are mutual, since the development of Web-based automatic systems is a prerequisite for many research projects targeting real-time monitoring and/or modeling of natural conditions. Continuous operation of such systems also provides ongoing research opportunities for the statistically massive validation of the models. We have developed a Web-based system to run the University of Alaska Fairbanks Polar Ionospheric Model in real time. This model makes use of networking and computational resources at the Arctic Region Supercomputing Center. The system was designed to be portable among various operating systems and computational resources, and its components can be installed across different computers, separating Web servers and computational engines. The core of the system is a Real-Time Management module (RMM) written in Python, which coordinates remote input data transfers, the ionospheric model runs, MySQL database filling, and PHP scripts for the Web-page preparations. The RMM downloads current geophysical inputs as soon as they become available at different on-line depositories. 
This information is processed to provide inputs for the next ionospheric model time step and then stored in a MySQL database as the first part of the time-specific record. The RMM then performs synchronization of the input times with the current model time, prepares a decision on initialization for the next model time step, and monitors its execution. Then, as soon as the model completes computations for the next time step, the RMM visualizes the current model output into various short-term (about 1-2 hours) forecasting products and compares prior results with available ionospheric measurements. The RMM places prepared images into the MySQL database, which can be located on a different computer node, and then proceeds to the next time interval, continuing the time loop. The upper-level interface of this real-time system is a PHP-based Web site (http://www.arsc.edu/SpaceWeather/new). This site provides general information about the Earth's polar and adjacent mid-latitude ionosphere, allows for monitoring of current developments and short-term forecasts, and facilitates access to the comparison archive stored in the database.
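    The RMM time loop described above reduces to a simple skeleton in which the transfer, model, storage and visualization stages are injected as callables. The function names and the 15-minute step are assumptions for illustration, not the actual RMM code.

```python
import time

def management_loop(fetch_inputs, run_model_step, store, visualize,
                    step_seconds=900, steps=None):
    """Skeleton of a real-time management loop in the style of the RMM.

    fetch_inputs, run_model_step, store, visualize: placeholder callables
    for the data-transfer, model, database and plotting stages.
    steps: run forever if None (real-time mode), else a fixed step count.
    """
    n = 0
    while steps is None or n < steps:
        inputs = fetch_inputs()           # current geophysical inputs
        result = run_model_step(inputs)   # advance the model one time step
        store(inputs, result)             # e.g. fill a MySQL record
        visualize(result)                 # short-term forecast products
        n += 1
        if steps is None:
            time.sleep(step_seconds)      # wait for the next model time
    return n
```

Separating the stages behind callables is what lets the Web server, database and computational engine live on different machines, as the abstract describes.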

  12. Looking inward and back: Real-time monitoring of visual working memories.

    PubMed

    Suchow, Jordan W; Fougnie, Daryl; Alvarez, George A

    2017-04-01

    Confidence in our memories is influenced by many factors, including beliefs about the perceptibility or memorability of certain kinds of objects and events, as well as knowledge about our skill sets, habits, and experiences. Notoriously, our knowledge and beliefs about memory can lead us astray, causing us to be overly confident in eyewitness testimony or to overestimate the frequency of recent experiences. Here, using visual working memory as a case study, we stripped away all these potentially misleading cues, requiring observers to make confidence judgments by directly assessing the quality of their memory representations. We show that individuals can monitor the status of information in working memory as it degrades over time. Our findings suggest that people have access to information reflecting the existence and quality of their working memories, and furthermore, that they can use this information to guide their behavior. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. A novel association rule mining approach using TID intermediate itemset.

    PubMed

    Aqra, Iyad; Herawan, Tutut; Abdul Ghani, Norjihan; Akhunzada, Adnan; Ali, Akhtar; Bin Razali, Ramdan; Ilahi, Manzoor; Raymond Choo, Kim-Kwang

    2018-01-01

    Designing an efficient association rule mining (ARM) algorithm for multilevel knowledge-based transactional databases that is appropriate for real-world deployment is of paramount concern. However, dynamic decision making that modifies the threshold, either to minimize or maximize the output knowledge, forces the extant state-of-the-art algorithms to rescan the entire database. This process incurs a heavy computation cost and is not feasible for real-time applications. This paper addresses the problem of efficiently updating the threshold dynamically for a given purpose. It contributes a novel ARM approach that creates an intermediate itemset and applies a threshold to extract categorical frequent itemsets with diverse threshold values, improving overall efficiency because the whole database no longer needs to be scanned. Once the entire itemset is built, the real support can be obtained without rebuilding the itemset (e.g. itemset TID lists are intersected to obtain the actual support). Moreover, the algorithm supports extracting many frequent itemsets according to a pre-determined minimum support for an independent purpose. Additionally, the experimental results demonstrate that the proposed approach can be deployed in any mining system in a fully parallel mode, increasing the efficiency of the real-time association rule discovery process. The proposed approach outperforms the extant state of the art, with promising results that reduce computation cost, increase accuracy, and produce all possible itemsets.
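    The TID intermediate itemset idea above can be sketched as follows: one pass over the database builds a transaction-ID list per item, after which the support of any itemset under any threshold is just a list intersection, with no rescan. This is a simplified illustration of the idea, not the paper's exact data structure.

```python
def tid_lists(transactions):
    """Build the TID intermediate structure in a single database pass.

    transactions: iterable of item collections; the transaction ID is the
    position in the iterable. Returns {item: set of TIDs containing it}.
    """
    tids = {}
    for tid, items in enumerate(transactions):
        for item in items:
            tids.setdefault(item, set()).add(tid)
    return tids

def support(tids, itemset):
    """Actual support of an itemset via TID-list intersection.

    Changing the minimum-support threshold later needs no rescan: the
    same intersections are simply re-filtered against the new threshold.
    """
    sets = [tids[item] for item in itemset]
    return len(set.intersection(*sets))
```

Because the intersections are independent per itemset, this structure also parallelizes naturally, matching the fully parallel deployment the abstract claims.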

  14. A novel association rule mining approach using TID intermediate itemset

    PubMed Central

    Ali, Akhtar; Bin Razali, Ramdan; Ilahi, Manzoor; Raymond Choo, Kim-Kwang

    2018-01-01

    Designing an efficient association rule mining (ARM) algorithm for multilevel knowledge-based transactional databases that is appropriate for real-world deployment is of paramount concern. However, dynamic decision making that modifies the threshold, either to minimize or maximize the output knowledge, forces the extant state-of-the-art algorithms to rescan the entire database. This process incurs a heavy computation cost and is not feasible for real-time applications. This paper addresses the problem of efficiently updating the threshold dynamically for a given purpose. It contributes a novel ARM approach that creates an intermediate itemset and applies a threshold to extract categorical frequent itemsets with diverse threshold values, improving overall efficiency because the whole database no longer needs to be scanned. Once the entire itemset is built, the real support can be obtained without rebuilding the itemset (e.g. itemset TID lists are intersected to obtain the actual support). Moreover, the algorithm supports extracting many frequent itemsets according to a pre-determined minimum support for an independent purpose. Additionally, the experimental results demonstrate that the proposed approach can be deployed in any mining system in a fully parallel mode, increasing the efficiency of the real-time association rule discovery process. The proposed approach outperforms the extant state of the art, with promising results that reduce computation cost, increase accuracy, and produce all possible itemsets. PMID:29351287

  15. Using LabView for real-time monitoring and tracking of multiple biological objects

    NASA Astrophysics Data System (ADS)

    Nikolskyy, Aleksandr I.; Krasilenko, Vladimir G.; Bilynsky, Yosyp Y.; Starovier, Anzhelika

    2017-04-01

    Real-time study and tracking of the movement dynamics of various biological objects is important and widely researched today. The features of the objects, the conditions of their visualization and the model parameters strongly influence the choice of optimal methods and algorithms for a specific task. Therefore, to automate the adaptation of recognition and tracking algorithms, several LabView tracker projects are considered in the article. The projects allow templates for training and retraining the system to be changed quickly, and they adapt to the speed of the objects and the statistical characteristics of noise in the images. New functions for comparing images or their features, descriptors and pre-processing methods will be discussed. The experiments carried out to test the trackers on real video files will be presented and analyzed.

  16. A Prototype Lisp-Based Soft Real-Time Object-Oriented Graphical User Interface for Control System Development

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan; Wong, Edmond; Simon, Donald L.

    1994-01-01

    A prototype Lisp-based soft real-time object-oriented Graphical User Interface for control system development is presented. The Graphical User Interface executes alongside a test system in laboratory conditions to permit observation of the closed loop operation through animation, graphics, and text. Since it must perform interactive graphics while updating the screen in real time, techniques are discussed which allow quick, efficient data processing and animation. Examples from an implementation are included to demonstrate some typical functionalities which allow the user to follow the control system's operation.

  17. Real-time surgical simulation for deformable soft-tissue objects with a tumour using Boundary Element techniques

    NASA Astrophysics Data System (ADS)

    Wang, P.; Becker, A. A.; Jones, I. A.; Glover, A. T.; Benford, S. D.; Vloeberghs, M.

    2009-08-01

    A virtual-reality real-time simulation of surgical operations that incorporates a hard tumour is presented. The software is based on the Boundary Element (BE) technique. A review of the BE formulation for real-time analysis of two-domain deformable objects, using the pre-solution technique, is presented. The two-domain BE software is incorporated into a surgical simulation system called VIRS to simulate the initiation of a cut on the surface of the soft tissue and the extension of the cut deeper until the tumour is reached.

  18. Estimating Real-Time Zenith Tropospheric Delay over Africa Using IGS-RTS Products

    NASA Astrophysics Data System (ADS)

    Abdelazeem, M.

    2017-12-01

    Zenith Tropospheric Delay (ZTD) is a crucial parameter for atmospheric modeling, severe weather monitoring and forecasting applications. Currently, the international global navigation satellite system (GNSS) real-time service (IGS-RTS) products are used extensively in real-time atmospheric modeling applications. The objective of this study is to develop a real-time zenith tropospheric delay estimation model over Africa using the IGS-RTS products. The real-time ZTDs are estimated based on the real-time precise point positioning (PPP) solution. GNSS observations from a number of reference stations are processed over a period of 7 days. Then, the estimated real-time ZTDs are compared with their IGS tropospheric product counterparts. The findings indicate that the estimated real-time ZTDs have millimeter-level accuracy in comparison with the IGS counterparts.

  19. Object oriented design (OOD) in real-time hardware-in-the-loop (HWIL) simulations

    NASA Astrophysics Data System (ADS)

    Morris, Joe; Richard, Henri; Lowman, Alan; Youngren, Rob

    2006-05-01

    Using Object Oriented Design (OOD) concepts in AMRDEC's Hardware-in-the-Loop (HWIL) real-time simulations allows the user to interchange parts of the simulation to meet test requirements. A large-scale three-spectral-band simulator, connected via a high-speed reflective memory ring for time-critical data transfers to PC controllers connected by non-real-time Ethernet protocols, is used to separate software objects from logical entities close to their respective controlled hardware. Each standalone object does its own dynamic initialization, real-time processing, and end-of-run processing; therefore it can be easily maintained and updated. A Resource Allocation Program (RAP) is also utilized, along with a device table, to allocate, organize, and document the communication protocol between the software and hardware components. A GUI display program lists all allocations and deallocations of HWIL memory and hardware resources. This interactive program is also used to clean up defunct allocations of dead processes. Three examples are presented using the OOD and RAP concepts. The first is the control of an ACUTRONICS-built three-axis flight table using the same control for calibration and real-time functions. The second is the transportability of a six-degree-of-freedom (6-DOF) simulation from an Onyx residence to a Linux PC. The third is the replacement of the 6-DOF simulation with a replay program that drives the facility with archived run data for demonstration or analysis purposes.

  20. Online decoding of object-based attention using real-time fMRI.

    PubMed

    Niazi, Adnan M; van den Broek, Philip L C; Klanke, Stefan; Barth, Markus; Poel, Mannes; Desain, Peter; van Gerven, Marcel A J

    2014-01-01

    Visual attention is used to selectively filter relevant information depending on current task demands and goals. Visual attention is called object-based attention when it is directed to coherent forms or objects in the visual field. This study used real-time functional magnetic resonance imaging for moment-to-moment decoding of attention to spatially overlapped objects belonging to two different object categories. First, a whole-brain classifier was trained on pictures of faces and places. Subjects then saw transparently overlapped pictures of a face and a place, and attended to only one of them while ignoring the other. The category of the attended object, face or place, was decoded on a scan-by-scan basis using the previously trained decoder. The decoder performed at 77.6% accuracy, indicating that, despite competing bottom-up sensory input, object-based visual attention biased neural patterns towards those of the attended object. Furthermore, a comparison between different classification approaches indicated that the representation of faces and places is distributed rather than focal. This implies that real-time decoding of object-based attention requires a multivariate decoding approach that can detect these distributed patterns of cortical activity. © 2013 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  1. Accuracy of LightCycler(R) SeptiFast for the detection and identification of pathogens in the blood of patients with suspected sepsis: a systematic review protocol.

    PubMed

    Dark, Paul; Wilson, Claire; Blackwood, Bronagh; McAuley, Danny F; Perkins, Gavin D; McMullan, Ronan; Gates, Simon; Warhurst, Geoffrey

    2012-01-01

    Background: There is growing interest in the potential utility of molecular diagnostics in improving the detection of life-threatening infection (sepsis). LightCycler® SeptiFast is a multipathogen, probe-based real-time PCR system that targets DNA sequences of bacteria and fungi in blood samples within a few hours. We report here the protocol of the first systematic review of published clinical diagnostic accuracy studies of this technology compared with blood culture in the setting of suspected sepsis. Methods/design: Data sources: the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects (DARE), the Health Technology Assessment Database (HTA), the NHS Economic Evaluation Database (NHSEED), The Cochrane Library, MEDLINE, EMBASE, ISI Web of Science, BIOSIS Previews, MEDION and the Aggressive Research Intelligence Facility Database (ARIF). Study selection: diagnostic accuracy studies that compare the real-time PCR technology with standard culture results performed on a patient's blood sample during the management of sepsis. Data extraction: three reviewers, working independently, will determine the level of evidence, methodological quality and a standard data set relating to demographics and diagnostic accuracy metrics for each study. Statistical analysis/data synthesis: heterogeneity of studies will be investigated using a coupled forest plot of sensitivity and specificity and a scatter plot in Receiver Operating Characteristic (ROC) space. A bivariate model will be used to estimate summary sensitivity and specificity. The authors will investigate reporting biases using funnel plots based on effective sample size and regression tests of asymmetry. Subgroup analyses are planned for adults, children and infection setting (hospital vs community) if sufficient data are uncovered.
Dissemination: Recommendations will be made to the Department of Health (as part of an open-access HTA report) as to whether the real-time PCR technology has sufficient clinical diagnostic accuracy potential to move forward to efficacy testing during the provision of routine clinical care. Registration: PROSPERO-NIHR Prospective Register of Systematic Reviews (CRD42011001289).

  2. Information Network Model Query Processing

    NASA Astrophysics Data System (ADS)

    Song, Xiaopu

    Information Networking Model (INM) [31] is a novel database model for managing real-world objects and relationships. It naturally and directly supports various kinds of static and dynamic relationships between objects; in INM, objects are networked through various natural and complex relationships. INM Query Language (INM-QL) [30] is designed to explore such an information network: it retrieves information about schemas, instances, their attributes, relationships, and context-dependent information, and processes query results in a user-specified form. An INM database management system supporting INM-QL has been implemented using Berkeley DB. This thesis focuses mainly on the implementation of the subsystem that processes INM-QL effectively and efficiently. The subsystem provides a lexical and syntactic analyzer for INM-QL, and it chooses appropriate evaluation strategies and index mechanisms to process INM-QL queries without the user's intervention. It also uses an intermediate result structure to hold intermediate query results, together with other helper structures, to reduce the complexity of query processing.

  3. Large holographic displays for real-time applications

    NASA Astrophysics Data System (ADS)

    Schwerdtner, A.; Häussler, R.; Leister, N.

    2008-02-01

    Holography is generally accepted as the ultimate approach to displaying three-dimensional scenes or objects. In principle, the reconstruction of an object from a perfect hologram would appear indistinguishable from viewing the corresponding real-world object. Up to now, two main obstacles have prevented large-screen Computer-Generated Holograms (CGH) from achieving a satisfactory laboratory prototype, not to mention a marketable one: the small cell pitch of a CGH results in a huge number of hologram cells and a very high computational load for encoding the CGH. These seemingly inevitable technological hurdles have long limited the use of holography to special applications, such as optical filtering, interference, beam forming, digital holography for capturing the 3-D shape of objects, and others. SeeReal Technologies has developed a new approach for real-time capable CGH using the so-called Tracked Viewing Windows technology to overcome these problems. The paper will show that today's state-of-the-art reconfigurable Spatial Light Modulators (SLM), especially today's feasible LCD panels, are suited to reconstructing large 3-D scenes that can be observed from large viewing angles. To achieve this, the original holographic concept of each part of the CGH containing information from the entire scene has been abandoned. This substantially reduces the hologram resolution, and thus the computational load, by several orders of magnitude, making real-time computation possible. A monochrome real-time prototype measuring 20 inches has been built and was demonstrated at the SID 2007 conference and exhibition and at several other events.

  4. Real-time Astrometry Using Phase Congruency

    NASA Astrophysics Data System (ADS)

    Lambert, A.; Polo, M.; Tang, Y.

    Phase congruency is a computer vision technique that proves to perform well for determining the tracks of optical objects (Flewelling, AMOS 2014). We report on a real-time implementation of this using an FPGA and CMOS Image Sensor, with on-sky data. The lightweight instrument can provide tracking update signals to the mount of the telescope, as well as determine abnormal objects in the scene.

  5. Road Risk Modeling and Cloud-Aided Safety-Based Route Planning.

    PubMed

    Li, Zhaojian; Kolmanovsky, Ilya; Atkins, Ella; Lu, Jianbo; Filev, Dimitar P; Michelini, John

    2016-11-01

    This paper presents a safety-based route planner that exploits vehicle-to-cloud-to-vehicle (V2C2V) connectivity. Time and road risk index (RRI) are considered as metrics to be balanced based on user preference. To evaluate road segment risk, a road and accident database from the highway safety information system is mined with a hybrid neural network model to predict RRI. Real-time factors such as time of day, day of the week, and weather are included as correction factors to the static RRI prediction. With real-time RRI and expected travel time, route planning is formulated as a multiobjective network flow problem and further reduced to a mixed-integer programming problem. A V2C2V implementation of our safety-based route planning approach is proposed to facilitate access to real-time information and computing resources. A real-world case study, route planning through the city of Columbus, Ohio, is presented. Several scenarios illustrate how the "best" route can be adjusted to favor time versus safety metrics.
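
    The balance between travel time and road risk described above can be illustrated with a single-objective simplification: fold the two metrics into one edge weight using a user-preference weight α and run a shortest-path search. This is only a sketch; the paper formulates the full problem as a multiobjective network flow reduced to a mixed-integer program, and the road graph, RRI values, and function names below are hypothetical.

```python
import heapq

def safest_route(graph, start, goal, alpha=0.5):
    """Dijkstra search over the combined cost alpha*time + (1-alpha)*risk.

    graph: {node: [(neighbor, travel_time, rri), ...]} with hypothetical
    road risk index (RRI) values per segment; alpha encodes user preference
    between time (alpha=1) and safety (alpha=0)."""
    pq = [(0.0, start, [start])]
    best = {}
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in best and best[node] <= cost:
            continue
        best[node] = cost
        for nbr, t, rri in graph.get(node, []):
            step = alpha * t + (1 - alpha) * rri
            heapq.heappush(pq, (cost + step, nbr, path + [nbr]))
    return float("inf"), []

# Two routes from A to D: fast but risky via B, slower but safe via C.
g = {
    "A": [("B", 10, 8.0), ("C", 15, 1.0)],
    "B": [("D", 10, 8.0)],
    "C": [("D", 15, 1.0)],
}
print(safest_route(g, "A", "D", alpha=1.0)[1])  # ['A', 'B', 'D'] (time only)
print(safest_route(g, "A", "D", alpha=0.0)[1])  # ['A', 'C', 'D'] (risk only)
```

Sweeping α between 0 and 1 reproduces the paper's idea of adjusting the "best" route to favor time versus safety metrics.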

  6. Ultrafast Target Recognition via Super-Parallel Holograph Based Correlator, RAM and Associative Memory

    DTIC Science & Technology

    2008-03-11

    JTC) based on a dynamic material answers the challenge of fast correlation with large databases. Images retrieved from the SPHRAM and used as the...transform (JTC) and matched spatial filter or VanderLugt (VLC) correlators, either of which can be implemented in real-time by degenerate four-wave mixing in...proposed system, consisting of the SPHROM coupled with a shift-invariant real-time VLC. The correlation is performed in the VLC architecture to

  7. NASA-Langley Web-Based Operational Real-time Cloud Retrieval Products from Geostationary Satellites

    NASA Technical Reports Server (NTRS)

    Palikonda, Rabindra; Minnis, Patrick; Spangenberg, Douglas A.; Khaiyer, Mandana M.; Nordeen, Michele L.; Ayers, Jeffrey K.; Nguyen, Louis; Yi, Yuhong; Chan, P. K.; Trepte, Qing Z.; hide

    2006-01-01

    At NASA Langley Research Center (LaRC), radiances from multiple satellites are analyzed in near real-time to produce cloud products over many regions on the globe. These data are valuable for many applications such as diagnosing aircraft icing conditions and model validation and assimilation. This paper presents an overview of the multiple products available, summarizes the content of the online database, and details web-based satellite browsers and tools to access satellite imagery and products.

  8. Memristive Computational Architecture of an Echo State Network for Real-Time Speech Emotion Recognition

    DTIC Science & Technology

    2015-05-28

    recognition is simpler and requires less computational resources compared to other inputs such as facial expressions. The Berlin database of Emotional...Processing Magazine, IEEE, vol. 18, no. 1, pp. 32–80, 2001. [15] K. R. Scherer, T. Johnstone, and G. Klasmeyer, “Vocal expression of emotion...Network for Real-Time Speech-Emotion Recognition 5a. CONTRACT NUMBER IN-HOUSE 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 62788F 6. AUTHOR(S) Q

  9. Humanoid Robotics: Real-Time Object Oriented Programming

    NASA Technical Reports Server (NTRS)

    Newton, Jason E.

    2005-01-01

    Programming of robots in today's world is often done in a procedural fashion, without object oriented programming. In order to maintain a robust architecture that allows easy expansion of capabilities and a truly modular design, object oriented programming is required. However, concepts in object oriented programming are not typically applied to a real-time environment. The Fujitsu HOAP-2 is the test bed for the development of a humanoid robot framework that abstracts control of the robot into simple logical commands in a real-time robotic system while allowing full access to all sensory data. In addition to interfacing between the motor and sensory systems, this paper discusses the software which operates multiple independently developed control systems simultaneously and the safety measures which keep the humanoid from damaging itself and its environment while running these systems. The use of this software decreases development time and costs and allows changes to be made while keeping results safe and predictable.
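
    The combination of a common control interface and a safety layer described above can be sketched with a little object oriented Python. This is a generic illustration, not the HOAP-2 framework's actual API; the class names and joint limits are hypothetical.

```python
from abc import ABC, abstractmethod

class Controller(ABC):
    """Every control system exposes the same small interface, so
    independently developed controllers can be swapped or run side by side."""
    @abstractmethod
    def command(self):
        """Return the joint targets for this control cycle."""

class SafetyLimiter(Controller):
    """Wraps any controller and clamps its output to safe joint limits,
    keeping the robot from damaging itself regardless of the inner logic."""
    def __init__(self, inner, low=-1.0, high=1.0):
        self.inner, self.low, self.high = inner, low, high

    def command(self):
        return [max(self.low, min(self.high, q)) for q in self.inner.command()]

class WaveArm(Controller):
    """A toy control system producing raw targets, some out of range."""
    def command(self):
        return [0.4, 2.5, -3.0]

safe = SafetyLimiter(WaveArm())
print(safe.command())  # [0.4, 1.0, -1.0]
```

The decorator-style wrapping is what lets safety measures apply uniformly to any control system plugged into the framework.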

  10. Prevention of data duplication for high throughput sequencing repositories

    PubMed Central

    Gabdank, Idan; Chan, Esther T; Davidson, Jean M; Hilton, Jason A; Davis, Carrie A; Baymuradov, Ulugbek K; Narayanan, Aditi; Onate, Kathrina C; Graham, Keenan; Miyasato, Stuart R; Dreszer, Timothy R; Strattan, J Seth; Jolanki, Otto; Tanaka, Forrest Y; Hitz, Benjamin C

    2018-01-01

    Abstract Prevention of unintended duplication is one of the ongoing challenges many databases have to address. When working with high-throughput sequencing data, the complexity of that challenge increases with the complexity of the definition of a duplicate. In a computational data model, a data object represents a real entity like a reagent or a biosample, much as a card represents a book in a paper library catalog. Duplicated data objects not only waste storage, they can mislead users into assuming the model represents more than the single entity. Even if it is clear that two objects represent a single entity, data duplication opens the door to potential inconsistencies between the objects, since the content of the duplicated objects can be updated independently, allowing the metadata associated with the objects to diverge. This is analogous to a paper library catalog mistakenly containing two cards for a single copy of a book: if these cards simultaneously list two different individuals as the current borrower, it is difficult to determine which of the two actually has the book. Unfortunately, in a large database with multiple submitters, unintended duplication is to be expected. In this article, we present three principal guidelines the Encyclopedia of DNA Elements (ENCODE) Portal follows in order to prevent unintended duplication of both actual files and data objects: definition of identifiable data objects (I), object uniqueness validation (II) and a de-duplication mechanism (III). In addition to explaining our modus operandi, we elaborate on the methods used for identification of sequencing data files. A comparison of the approach taken by the ENCODE Portal with other widely used biological data repositories is provided. Database URL: https://www.encodeproject.org/ PMID:29688363
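
    The three guidelines, identifiable objects (I), uniqueness validation on submission (II), and content-based de-duplication of files (III), can be sketched in miniature as follows. The class and method names are hypothetical, and the ENCODE Portal's real identity rules are far richer than a single key or digest.

```python
import hashlib

class Repository:
    """Toy sketch of duplication prevention for a sequencing repository."""

    def __init__(self):
        self._objects = {}      # (I) identifying key -> metadata object
        self._file_hashes = {}  # content digest -> canonical accession

    def submit_object(self, key, metadata):
        # (II) uniqueness validation: reject a second object for the same entity
        if key in self._objects:
            raise ValueError(f"duplicate object for {key!r}")
        self._objects[key] = metadata

    def submit_file(self, accession, content: bytes):
        # (III) de-duplication: identical bytes map to one stored accession
        digest = hashlib.sha256(content).hexdigest()
        if digest in self._file_hashes:
            return self._file_hashes[digest]  # existing accession, no new copy
        self._file_hashes[digest] = accession
        return accession

repo = Repository()
repo.submit_object("experiment:A", {"assay": "RNA-seq"})
print(repo.submit_file("FILE001", b"ACGTACGT"))  # FILE001
print(repo.submit_file("FILE002", b"ACGTACGT"))  # FILE001 (same content)
```

Hashing file content rather than trusting filenames is what catches re-submission of the same sequencing data under a new name.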

  11. RECENT DEVELOPMENTS IN HYDROWEB DATABASE Water level time series on lakes and reservoirs (Invited)

    NASA Astrophysics Data System (ADS)

    Cretaux, J.; Arsen, A.; Calmant, S.

    2013-12-01

    We present the current state of the Hydroweb database as well as developments in progress. It provides offline water level time series on rivers, reservoirs and lakes based on altimetry data from several satellites (Topex/Poseidon, ERS, Jason-1&2, GFO and ENVISAT). The major developments in Hydroweb concern an operational data centre with automatic acquisition and processing of IGDR data for updating time series in near real time (both for lakes and rivers), and the use of additional remote sensing data, such as satellite imagery allowing the calculation of lake surface areas. A lake data centre is under development at Legos in coordination with the Hydrolare Project led by SHI (State Hydrological Institute of the Russian Academy of Science). It will provide the level-surface-volume variations of about 230 lakes and reservoirs, calculated through a combination of various satellite images (Modis, Asar, Landsat, Cbers) and radar altimetry (Topex/Poseidon, Jason-1 & 2, GFO, Envisat, ERS2, AltiKa). The final objective is to propose a data centre fully based on remote sensing techniques and controlled by in situ infrastructure for the Global Terrestrial Network for Lakes (GTN-L) under the supervision of WMO and GCOS. In a longer perspective, the Hydroweb database will integrate data from future missions (Jason-3, Jason-CS, Sentinel-3A/B) and will finally serve for the design of the SWOT mission. The products of Hydroweb will be used as input data for simulation of the SWOT products (water height and surface variations of lakes and rivers). In the future, the SWOT mission will allow monitoring, on a sub-monthly basis, of worldwide lakes and reservoirs bigger than 250 * 250 m, and Hydroweb will host water level and extent products from this mission.

  12. A Dynamic Time Warping Approach to Real-Time Activity Recognition for Food Preparation

    NASA Astrophysics Data System (ADS)

    Pham, Cuong; Plötz, Thomas; Olivier, Patrick

    We present a dynamic time warping based activity recognition system for the analysis of low-level food preparation activities. Accelerometers embedded into kitchen utensils provide continuous sensor data streams while people are using them for cooking. The recognition framework analyzes frames of contiguous sensor readings in real-time with low latency. It thereby adapts to the idiosyncrasies of utensil use by automatically maintaining a template database. We demonstrate the effectiveness of the classification approach by a number of real-world practical experiments on a publicly available dataset. The adaptive system shows superior performance compared to a static recognizer. Furthermore, we demonstrate the generalization capabilities of the system by gradually reducing the amount of training samples. The system achieves excellent classification results even if only a small number of training samples is available, which is especially relevant for real-world scenarios.
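
    The core matching step of such a recognizer can be sketched with a plain dynamic time warping distance plus nearest-template classification. This is a generic illustration under simplified assumptions (1-D series, absolute-difference local cost), not the paper's implementation, and the adaptive template maintenance it describes is omitted.

```python
def dtw_distance(a, b):
    """O(len(a)*len(b)) dynamic time warping distance between two series.

    D[i][j] holds the cheapest alignment cost of a[:i] against b[:j];
    each cell extends the best of the three neighboring alignments."""
    inf = float("inf")
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def classify(frame, templates):
    """Label a sensor frame by its nearest template under DTW distance."""
    return min(templates, key=lambda label: dtw_distance(frame, templates[label]))

# Hypothetical acceleration templates for two utensil activities.
templates = {"stir": [0, 1, 0, 1, 0], "scrape": [0, 3, 3, 3, 0]}
print(classify([0, 1, 1, 0], templates))  # stir
```

DTW tolerates the timing variation between executions of the same activity, which is exactly why it suits frames of contiguous accelerometer readings.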

  13. Predictive Modeling for the Growth of Salmonella Enteritidis in Chicken Juice by Real-Time Polymerase Chain Reaction.

    PubMed

    Noviyanti, Fia; Hosotani, Yukie; Koseki, Shigenobu; Inatsu, Yasuhiro; Kawasaki, Susumu

    2018-04-02

    The goals of this study were to monitor the growth kinetics of Salmonella Enteritidis in chicken juice using real-time polymerase chain reaction (PCR) and to evaluate its efficacy by comparing the results with an experimental database. Salmonella Enteritidis was inoculated into chicken juice samples at an initial inoculum of 10⁴ CFU/mL, and the inoculated samples were incubated at six different temperatures (10, 15, 20, 25, 30, and 35°C). Sampling was carried out for 36 h to observe the growth of Salmonella Enteritidis. Total DNA was extracted from the samples, and the copy number of the Salmonella invasion gene (invA) was quantified by real-time PCR and converted to Salmonella Enteritidis cell concentration. Growth kinetics data were analyzed with the Baranyi and Roberts model to obtain growth parameters, whereas Ratkowsky's square-root model was used to describe the effect of temperature on the growth parameters of Salmonella Enteritidis. The growth parameters obtained from experiments conducted at constant temperatures were validated with growth data from chicken juice samples incubated under temperature conditions fluctuating between 5°C and 30°C in 30-min periods. A high correlation was observed between the maximum growth rate (μmax) and storage temperature, indicating that the real-time PCR monitoring method provides a precise estimation of Salmonella Enteritidis growth in food material with a microbial flora. Moreover, the μmax data were consistent with data from the Microbial Responses Viewer database and ComBase. The results of this study suggest that real-time PCR monitoring provides a precise estimation of Salmonella Enteritidis growth in food materials with a background microbial flora.
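
    Ratkowsky's square-root secondary model mentioned above relates the maximum growth rate to temperature as √μmax = b·(T − Tmin). A minimal sketch with illustrative parameter values (b and Tmin below are assumptions for demonstration, not the study's fitted values):

```python
def mu_max(T, b=0.03, Tmin=5.0):
    """Ratkowsky square-root model: sqrt(mu_max) = b * (T - Tmin).

    Returns the maximum specific growth rate (1/h) at temperature T (deg C);
    below the notional minimum growth temperature Tmin, growth rate is zero."""
    if T <= Tmin:
        return 0.0
    return (b * (T - Tmin)) ** 2

# Growth rate rises quadratically with (T - Tmin) over the tested range.
for T in (10, 15, 20, 25, 30, 35):
    print(T, round(mu_max(T), 4))
```

Fitting amounts to regressing √μmax (from the Baranyi and Roberts primary-model fits) linearly against temperature to recover b and Tmin.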

  14. Does real-time objective feedback and competition improve performance and quality in manikin CPR training--a prospective observational study from several European EMS.

    PubMed

    Smart, J R; Kranz, K; Carmona, F; Lindner, T W; Newton, A

    2015-10-15

    Previous studies have reported that the quality of cardiopulmonary resuscitation (CPR) is important for patient survival. Real-time objective feedback during manikin training has been shown to improve CPR performance, and objective measurement could facilitate competition and help motivate participants to improve their CPR performance. The aims of this study were to investigate whether real-time objective feedback on manikins helps improve CPR performance and whether competition between separate European Emergency Medical Services (EMS), and between participants at each EMS, helps motivate staff to train. Ten European EMS took part in the study, which was carried out in two stages. At Stage 1, each EMS provided 20 pre-hospital professionals. A questionnaire was completed and standardised assessment scenarios were performed for adult and infant out-of-hospital cardiac arrest (OHCA). CPR performance was objectively measured and recorded, but no feedback was given. Between Stages 1 and 2, each EMS was given access to manikins for 6 months and instructed on how to use them with objective real-time CPR feedback available. Stage 2 was a repeat of Stage 1, with additional questionnaire items relating to the usefulness of feedback and the competitive nature of the study (using a 10-point Likert score). The EMS that improved the most from Stage 1 to Stage 2 was declared the winner. An independent-samples Student t-test was used to analyse the objective CPR metrics, with the significance level taken as p < 0.05. The overall mean improvement in CPR performance from Stage 1 to Stage 2 was significant, and the improvement was greater for the infant assessment. Participants found the real-time feedback very useful (mean score of 8.5) and very easy to use (mean score of 8.2). Competition between EMS organisations recorded a mean score of 5.8 and competition between participants a mean score of 6.0.
The results suggest that the use of real-time objective feedback can significantly help improve CPR performance. Competition, especially between participants, appeared to encourage staff to practice, and this study suggests that competition might have a useful role in helping motivate staff to perform CPR training.

  15. Graphics processing unit (GPU) real-time infrared scene generation

    NASA Astrophysics Data System (ADS)

    Christie, Chad L.; Gouthas, Efthimios (Themie); Williams, Owen M.

    2007-04-01

    VIRSuite, the GPU-based suite of software tools developed at DSTO for real-time infrared scene generation, is described. The tools include the painting of scene objects with radiometrically-associated colours, translucent object generation, polar plot validation and versatile scene generation. Special features include radiometric scaling within the GPU and the presence of zoom anti-aliasing at the core of VIRSuite. Extension of the zoom anti-aliasing construct to cover target embedding and the treatment of translucent objects is described.

  16. Help Me Please!: Designing and Developing Application for Emergencies

    NASA Astrophysics Data System (ADS)

    Hong, Ng Ken; Hafit, Hanayanti; Wahid, Norfaradilla; Kasim, Shahreen; Yusof, Munirah Mohd

    2017-08-01

    Help Me Please! is an Android emergency-button application designed to transmit emergency messages with real-time information to target receivers. The purpose of developing this application is to help people report emergency circumstances via Short Message Service (SMS) on the Android platform. The application receives the current location from the Global Positioning System (GPS), obtains the current time from the mobile device, and sends this information to the receivers when the user presses the emergency button. Simultaneously, the application keeps sending emergency alerts to the receivers and updates the database at the time interval set by the user, until the user stops the function. An object-oriented software development model was employed to guide the development of this application, using Java and Android Studio. In conclusion, this application plays an important role in the rescue process when emergency circumstances happen: rescue becomes more effective when others are notified of the emergency and of the user's current location at the earliest possible moment.

  17. Cost-efficient scheduling of FAST observations

    NASA Astrophysics Data System (ADS)

    Luo, Qi; Zhao, Laiping; Yu, Ce; Xiao, Jian; Sun, Jizhou; Zhu, Ming; Zhong, Yi

    2018-03-01

    A cost-efficient schedule for the Five-hundred-meter Aperture Spherical radio Telescope (FAST) must maximize the number of observable proposals and the overall scientific priority, and minimize the overall slew cost generated by telescope shifting, while taking into account constraints including the visibility of astronomical objects, user-defined observable times, and avoidance of Radio Frequency Interference (RFI). In this contribution, we first solve the problem of maximizing the number of observable proposals and the scientific priority by modeling it as a Minimum Cost Maximum Flow (MCMF) problem; the optimal schedule can then be found by any MCMF solution algorithm. Then, to minimize the slew cost of the generated schedule, we devise a method based on detecting maximally-matchable edges to reduce the problem size, and propose a backtracking algorithm to find the perfect matching with minimum slew cost. Experiments on a real dataset from the NASA/IPAC Extragalactic Database (NED) show that the proposed scheduler can increase the usage of available times with high scientific priority and reduce the slew cost significantly in a very short time.
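
    The MCMF modeling idea can be illustrated with a toy successive-shortest-paths implementation that assigns unit-capacity proposals to feasible time slots, with arc costs standing in for slew cost. This is a generic textbook sketch under simplified assumptions, not FAST's scheduler, which additionally handles scientific priorities and problem-size reduction.

```python
def min_cost_max_flow(n, edges, s, t):
    """Successive shortest paths with Bellman-Ford (handles residual
    negative costs; assumes no negative cycles). edges: [u, v, cap, cost]."""
    graph = [[] for _ in range(n)]
    for u, v, cap, cost in edges:
        graph[u].append([v, cap, cost, len(graph[v])])       # forward edge
        graph[v].append([u, 0, -cost, len(graph[u]) - 1])    # residual edge
    flow = total_cost = 0
    while True:
        dist = [float("inf")] * n
        dist[s] = 0
        parent = [None] * n
        for _ in range(n - 1):
            updated = False
            for u in range(n):
                if dist[u] == float("inf"):
                    continue
                for i, (v, cap, cost, _) in enumerate(graph[u]):
                    if cap > 0 and dist[u] + cost < dist[v]:
                        dist[v] = dist[u] + cost
                        parent[v] = (u, i)
                        updated = True
            if not updated:
                break
        if dist[t] == float("inf"):
            return flow, total_cost
        v = t                      # push one unit along the cheapest path
        while v != s:
            u, i = parent[v]
            graph[u][i][1] -= 1
            graph[graph[u][i][0]][graph[u][i][3]][1] += 1
            v = u
        flow += 1
        total_cost += dist[t]

# Toy instance: source 0, proposals 1-2, slots 3-4, sink 5;
# proposal->slot costs stand in for slew cost between pointings.
edges = [
    [0, 1, 1, 0], [0, 2, 1, 0],
    [1, 3, 1, 2], [1, 4, 1, 5],
    [2, 3, 1, 1],
    [3, 5, 1, 0], [4, 5, 1, 0],
]
print(min_cost_max_flow(6, edges, 0, 5))  # (2, 6): both proposals scheduled
```

Maximizing flow schedules as many proposals as possible, while the minimum-cost criterion picks the assignment with the least total slew cost.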

  18. Quantum search of a real unstructured database

    NASA Astrophysics Data System (ADS)

    Broda, Bogusław

    2016-02-01

    A simple circuit implementation of the oracle for Grover's quantum search of a real unstructured classical database is proposed. The oracle contains a kind of quantumly accessible classical memory, which stores the database.
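
    Grover's search over a small database is easy to simulate classically, which makes the oracle's role concrete: it only phase-flips the amplitude of the marked entry, and the diffusion (inversion-about-the-mean) step then amplifies that entry. A state-vector sketch for N = 8 entries, independent of any particular circuit or memory implementation:

```python
import math

def grover(n_qubits, marked, iterations=None):
    """Simulate Grover's search; returns the final amplitude vector."""
    N = 2 ** n_qubits
    if iterations is None:
        iterations = int(round(math.pi / 4 * math.sqrt(N)))  # ~pi/4*sqrt(N)
    amp = [1 / math.sqrt(N)] * N          # uniform superposition
    for _ in range(iterations):
        amp[marked] = -amp[marked]         # oracle: phase-flip marked entry
        mean = sum(amp) / N                # diffusion: inversion about mean
        amp = [2 * mean - a for a in amp]
    return amp

probs = [a * a for a in grover(3, marked=5)]
print(max(range(8), key=probs.__getitem__))  # most likely outcome: 5
```

After the optimal ~π/4·√N iterations, measuring the register yields the marked database index with high probability.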

  19. Multiple Solutions of Real-time Tsunami Forecasting Using Short-term Inundation Forecasting for Tsunamis Tool

    NASA Astrophysics Data System (ADS)

    Gica, E.

    2016-12-01

    The Short-term Inundation Forecasting for Tsunamis (SIFT) tool, developed by NOAA Center for Tsunami Research (NCTR) at the Pacific Marine Environmental Laboratory (PMEL), is used in forecast operations at the Tsunami Warning Centers in Alaska and Hawaii. The SIFT tool relies on a pre-computed tsunami propagation database, real-time DART buoy data, and an inversion algorithm to define the tsunami source. The tsunami propagation database is composed of 50×100km unit sources, simulated basin-wide for at least 24 hours. Different combinations of unit sources, DART buoys, and length of real-time DART buoy data can generate a wide range of results within the defined tsunami source. For an inexperienced SIFT user, the primary challenge is to determine which solution, among multiple solutions for a single tsunami event, would provide the best forecast in real time. This study investigates how the use of different tsunami sources affects simulated tsunamis at tide gauge locations. Using the tide gauge at Hilo, Hawaii, a total of 50 possible solutions for the 2011 Tohoku tsunami are considered. Maximum tsunami wave amplitude and root mean square error results are used to compare tide gauge data and the simulated tsunami time series. Results of this study will facilitate SIFT users' efforts to determine if the simulated tide gauge tsunami time series from a specific tsunami source solution would be within the range of possible solutions. This study will serve as the basis for investigating more historical tsunami events and tide gauge locations.
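
    The root-mean-square comparison between tide-gauge data and the simulated time series of each candidate source solution can be sketched as follows. The series values and solution names are illustrative, not the study's data.

```python
import math

def rmse(observed, simulated):
    """Root-mean-square misfit between a tide-gauge record and one
    simulated tsunami time series (equal length, same sampling assumed)."""
    pairs = list(zip(observed, simulated))
    return math.sqrt(sum((o - s) ** 2 for o, s in pairs) / len(pairs))

# Rank candidate source solutions by misfit against the gauge record
# (toy data; the study compares 50 solutions at the Hilo tide gauge).
gauge = [0.0, 0.5, 1.2, 0.8, 0.1]
candidates = {
    "sources_A": [0.0, 0.4, 1.1, 0.9, 0.2],
    "sources_B": [0.0, 1.5, 0.2, 1.6, 0.7],
}
best = min(candidates, key=lambda k: rmse(gauge, candidates[k]))
print(best)  # sources_A
```

Together with maximum wave amplitude, such a misfit metric lets a SIFT user judge whether a given source solution falls within the range of acceptable forecasts.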

  20. Remote Numerical Simulations of the Interaction of High Velocity Clouds with Random Magnetic Fields

    NASA Astrophysics Data System (ADS)

    Santillan, Alfredo; Hernandez--Cervantes, Liliana; Gonzalez--Ponce, Alejandro; Kim, Jongsoo

    Numerical simulations of the interaction of High Velocity Clouds (HVC) with the magnetized Galactic Interstellar Medium (ISM) are a powerful tool for describing the evolution of these objects in our Galaxy. In this work we present a new project, the Theoretical Virtual Observatories, oriented toward performing numerical simulations in real time through a Web page. It is a powerful astrophysical computational tool that consists of an intuitive graphical user interface (GUI) and a database produced by numerical calculations. On this Web site the user can make use of the existing numerical simulations from the database or run a new simulation, introducing initial conditions such as temperatures, densities, velocities, and magnetic field intensities for both the ISM and the HVC. The prototype is programmed using Linux, Apache, MySQL, and PHP (LAMP), based on the open-source philosophy. All simulations were performed with the MHD code ZEUS-3D, which solves the ideal MHD equations by finite differences on a fixed Eulerian mesh. Finally, we present typical results that can be obtained with this tool.

  1. Real-time detection of moving objects from moving vehicles using dense stereo and optical flow

    NASA Technical Reports Server (NTRS)

    Talukder, Ashit; Matthies, Larry

    2004-01-01

    Dynamic scene perception is very important for autonomous vehicles operating around other moving vehicles and humans. Most work on real-time object tracking from moving platforms has used sparse features or assumed flat scene structures. We have recently extended a real-time, dense stereo system to include real-time, dense optical flow, enabling more comprehensive dynamic scene analysis. We describe algorithms to robustly estimate 6-DOF robot egomotion in the presence of moving objects using dense flow and dense stereo. We then use dense stereo and egomotion estimates to identify other moving objects while the robot itself is moving. We present results showing accurate egomotion estimation and detection of moving people and vehicles under general 6-DOF motion of the robot and independently moving objects. The system runs at 18.3 Hz on a 1.4 GHz Pentium M laptop, computing 160x120 disparity maps and optical flow fields, egomotion, and moving object segmentation. We believe this is a significant step toward general unconstrained dynamic scene analysis for mobile robots, as well as for improved position estimation where GPS is unavailable.

  3. A new method for recognizing quadric surfaces from range data and its application to telerobotics and automation, final phase

    NASA Technical Reports Server (NTRS)

    Mielke, Roland; Dcunha, Ivan; Alvertos, Nicolas

    1994-01-01

    In the final phase of the proposed research, a complete top-down three-dimensional object recognition scheme has been proposed. The three-dimensional objects included spheres, cones, cylinders, ellipsoids, paraboloids, and hyperboloids. Utilizing a newly developed blob determination technique, a given range scene with several non-cluttered quadric surfaces is segmented. Next, using the alignment scheme developed earlier (phase 1), each of the segmented objects is aligned in a desired coordinate system. For each of the quadric surfaces, a set of distinct features (curves) is obtained based upon their intersections with certain predetermined planes. A database with entities such as the equations of the planes and the angular bounds of these planes has been created for each of the quadric surfaces. Real range data of spheres, cones, cylinders, and parallelepipeds have been utilized for the recognition process. The developed algorithm gave excellent results for the real data as well as for several sets of simulated range data.
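    The scheme above ultimately distinguishes quadric types. One standard way to illustrate the idea is to classify a quadric from the eigenvalue signature of its quadratic-form matrix (a hedged sketch, not the authors' plane-intersection method; the signature alone cannot separate every case, e.g. a cone from a hyperboloid, which also needs the linear and constant terms):

```python
import numpy as np

def quadric_signature(A, tol=1e-9):
    """Eigenvalue sign signature (#positive, #negative, #zero) of the 3x3
    quadratic-form matrix A of a quadric surface x^T A x + b^T x + c = 0."""
    w = np.linalg.eigvalsh(A)
    pos = int(np.sum(w > tol))
    neg = int(np.sum(w < -tol))
    return pos, neg, 3 - pos - neg

def classify(A):
    # Partial lookup table for illustration; other signatures exist.
    return {
        (3, 0, 0): "ellipsoid (sphere if eigenvalues equal)",
        (2, 1, 0): "hyperboloid or cone",
        (2, 0, 1): "elliptic cylinder or paraboloid",
    }.get(quadric_signature(A), "other")

# Unit sphere: x^2 + y^2 + z^2 = 1  ->  A = I
print(classify(np.eye(3)))
# Circular cylinder: x^2 + y^2 = 1  ->  A = diag(1, 1, 0)
print(classify(np.diag([1.0, 1.0, 0.0])))
```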

  4. Scientist/AMPS equipment interface study

    NASA Technical Reports Server (NTRS)

    Anderson, H. R.

    1977-01-01

    The principal objective was to determine, for each experiment, how the operating procedures and modes of equipment onboard Shuttle can be managed in real time or near-real time to enhance the quality of results. As part of this determination, the data and display devices that a man will need for real-time management are defined. The secondary objectives, as listed in the RFQ and technical proposal, were to: (1) determine what quantities are to be measured; (2) determine permissible background levels; (3) decide in what portions of space measurements are to be made; (4) estimate bit rates; (5) establish time-lines for operating the experiments on a mission or set of missions; and (6) determine the minimum set of hardware needed for real-time display. Experiment descriptions and requirements were written. The requirements of the various experiments are combined and a minimal set of joint requirements is defined.

  5. Online gaming for learning optimal team strategies in real time

    NASA Astrophysics Data System (ADS)

    Hudas, Gregory; Lewis, F. L.; Vamvoudakis, K. G.

    2010-04-01

    This paper first presents an overall view of dynamical decision-making in teams, both cooperative and competitive. Strategies for team decision problems, including optimal control, zero-sum two-player games (H-infinity control), and so on, are normally solved off-line by solving associated matrix equations such as the Riccati equation. With that approach, however, players cannot change their objectives online in real time without calling for a completely new off-line solution for the new strategies. Therefore, in this paper we give a method for learning optimal team strategies online in real time as team dynamical play unfolds. In the linear quadratic regulator case, for instance, the method learns the Riccati equation solution online without ever solving the Riccati equation. This allows for truly dynamical team decisions in which objective functions can change in real time and the system dynamics can be time-varying.
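    The LQR claim above, recovering the Riccati solution without solving the Riccati equation in closed form, can be illustrated with repeated dynamic-programming backups (an offline iterative sketch of the fixed-point idea only; the paper's method learns online from measured play data):

```python
import numpy as np

def lqr_value_iteration(A, B, Q, R, iters=200):
    """Iterate the Bellman backup P <- Q + A'P(A - BK) with
    K = (R + B'PB)^-1 B'PA. The iterates converge to the solution of the
    discrete algebraic Riccati equation without solving it in closed form."""
    P = np.zeros_like(Q)
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)  # feedback gain
        P = Q + A.T @ P @ (A - B @ K)
    return P, K

# Scalar example: x_{k+1} = 0.9 x_k + u_k, with Q = R = 1.
A = np.array([[0.9]]); B = np.array([[1.0]])
Q = np.array([[1.0]]); R = np.array([[1.0]])
P, K = lqr_value_iteration(A, B, Q, R)
# At convergence P satisfies the DARE: P = Q + A'PA - A'PB(R+B'PB)^-1 B'PA
```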

  6. Implement of the Owner Distinction Function for Healing-Type Pet Robots

    NASA Astrophysics Data System (ADS)

    Nambo, Hidetaka; Kimura, Haruhiko; Hirose, Sadaki

    In recent years, robotics technology has progressed rapidly, and robots are widely applied in many fields. One of the most typical robots is the pet robot, which is modeled on an animal pet such as a dog or a cat. Animal pets are also known to have a healing effect, so studies applying pet robots to Animal Assisted Therapy in place of animal pets have begun to be investigated. We have likewise investigated an owner-distinction method for pet robots, to enhance their healing effect. In this paper, taking implementation on pet robots into account, a real-time owner-distinction method is proposed. Concretely, the method provides a real-time matching algorithm and an oblivion mechanism. Real-time matching means that matching and data acquisition are processed simultaneously. The oblivion mechanism deletes features of owners from the pet robot's database; it reduces matching cost and database size, and it enables the robot to follow a change of owners. Furthermore, the effectiveness and practicality of the method are evaluated by experiments.
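    The oblivion mechanism described above, deleting owner features to bound database size and matching cost, might be sketched as a least-recently-matched eviction policy (a hypothetical illustration of one way such forgetting could work; the class and method names are invented, not the authors' design):

```python
from collections import OrderedDict

class OwnerFeatureStore:
    """Bounded store of owner feature vectors that 'forgets' the least
    recently matched owner, keeping matching cost and database size fixed."""
    def __init__(self, capacity=3):
        self.capacity = capacity
        self.features = OrderedDict()   # owner_id -> feature vector

    def observe(self, owner_id, feature):
        if owner_id in self.features:
            self.features.move_to_end(owner_id)   # recently matched
        self.features[owner_id] = feature
        if len(self.features) > self.capacity:
            self.features.popitem(last=False)     # oblivion: drop oldest

store = OwnerFeatureStore(capacity=2)
store.observe("alice", [0.1, 0.2])
store.observe("bob", [0.3, 0.1])
store.observe("alice", [0.1, 0.2])   # alice matched again -> most recent
store.observe("carol", [0.5, 0.5])   # capacity exceeded -> bob forgotten
```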

  7. Development of Hydrometeorological Monitoring and Forecasting as AN Essential Component of the Early Flood Warning System:

    NASA Astrophysics Data System (ADS)

    Manukalo, V.

    2012-12-01

    Defining issue: River inundations are the most common and destructive natural hazards in Ukraine. Among non-structural flood management and protection measures, the creation of an Early Flood Warning System is extremely important for the timely recognition of dangerous situations in flood-prone areas. Hydrometeorological information and forecasts are of core importance in this system. The primary factors affecting the reliability and lead time of forecasts include the accuracy, speed, and reliability with which real-time data are collected. The existing conception of monitoring and forecasting as individual activities resulted in a need to reconsider it as an integrated monitoring and forecasting approach, from "sensors to database and forecasters". Result presentation: The Project "Development of Flood Monitoring and Forecasting in the Ukrainian Part of the Dniester River Basin" is presented. The project is developed by the Ukrainian Hydrometeorological Service in conjunction with the Water Management Agency and the Energy Company "Ukrhydroenergo". The implementation of the Project is funded by the Ukrainian Government and the World Bank. The author is nominated as the person responsible for coordinating the activity of the organizations involved in the Project. The term of the Project implementation is 2012-2014.
    The principal objectives of the Project are: a) designing an integrated automatic hydrometeorological measurement network (including the use of remote sensing technologies); b) constructing a hydrometeorological GIS database coupled with electronic maps for flood risk assessment; c) constructing interfaces between the classic numerical database, GIS, satellite images, and radar data collection; d) providing real-time data dissemination from observation points to forecasting centers; e) developing hydrometeorological forecasting methods; f) providing flood hazard risk assessment at different temporal and spatial scales; g) providing automatic dissemination of current information, forecasts, and warnings to consumers. Besides scientific and technical issues, the implementation of these objectives requires the solution of a number of organizational issues. Thus, as a result of the increased complexity of the types of hydrometeorological data, and in order to develop forecasting methods, a reconsideration of the meteorological and hydrological measurement networks should be carried out. An "optimal density of measuring networks" is proposed, taking into account two principal terms: a) minimizing the uncertainty in characterizing the spatial distribution of hydrometeorological parameters; b) minimizing the Total Life Cycle Cost of the creation and maintenance of the measurement networks. Much attention will be given to training Ukrainian disaster management authorities from the Ministry of Emergencies and the Water Management Agency to identify the flood hazard risk level and to indicate the best protection measures on the basis of continuous monitoring and forecasts of the evolution of meteorological and hydrological conditions in the river basin.

  8. Review and assessment of the database and numerical modeling for turbine heat transfer

    NASA Technical Reports Server (NTRS)

    Gladden, H. J.; Simoneau, R. J.

    1988-01-01

    The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high-temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding while the analytical efforts were approximately 25 percent. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.

  9. Real-time inspection by submarine images

    NASA Astrophysics Data System (ADS)

    Tascini, Guido; Zingaretti, Primo; Conte, Giuseppe

    1996-10-01

    A real-time application of computer vision concerning the tracking and inspection of a submarine pipeline is described. The objective is to develop automatic procedures for supporting human operators in the real-time analysis of images acquired by cameras mounted on underwater remotely operated vehicles (ROVs). Implementation of such procedures gives rise to a human-machine system for underwater pipeline inspection that can automatically detect and signal the presence of the pipe, of its structural or accessory elements, and of dangerous or alien objects in its neighborhood. The possibility of modifying the image acquisition rate in simulations performed on video-recorded images is used to prove that the system performs all necessary processing, with acceptable robustness, in real time up to a speed of about 2.5 kn, well above what actual ROVs and safety constraints allow.

  10. Nanocubes for real-time exploration of spatiotemporal datasets.

    PubMed

    Lins, Lauro; Klosowski, James T; Scheidegger, Carlos

    2013-12-01

    Consider real-time exploration of large multidimensional spatiotemporal datasets with billions of entries, each defined by a location, a time, and other attributes. Are certain attributes correlated spatially or temporally? Are there trends or outliers in the data? Answering these questions requires aggregation over arbitrary regions of the domain and attributes of the data. Many relational databases implement the well-known data cube aggregation operation, which in a sense precomputes every possible aggregate query over the database. Data cubes are sometimes assumed to take a prohibitively large amount of space, and to consequently require disk storage. In contrast, we show how to construct a data cube that fits in a modern laptop's main memory, even for billions of entries; we call this data structure a nanocube. We present algorithms to compute and query a nanocube, and show how it can be used to generate well-known visual encodings such as heatmaps, histograms, and parallel coordinate plots. When compared to exact visualizations created by scanning an entire dataset, nanocube plots have bounded screen error across a variety of scales, thanks to a hierarchical structure in space and time. We demonstrate the effectiveness of our technique on a variety of real-world datasets, and present memory, timing, and network bandwidth measurements. We find that the timings for the queries in our examples are dominated by network and user-interaction latencies.
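    The data-cube idea above, precomputing aggregates over a spatial hierarchy and time so that queries avoid scanning the raw data, can be shown in miniature (a toy sketch of the principle only, not the nanocube data structure; the quadtree-path encoding is an assumption for illustration):

```python
from collections import defaultdict

def build_cube(events):
    """Count events under every quadtree prefix (spatial hierarchy) x hour
    bucket, so any prefix/hour aggregate is a single lookup, not a scan.
    events: iterable of (quad_path, hour) where quad_path is a string of
    quadrant digits, coarsest first (e.g. "012")."""
    cube = defaultdict(int)
    for path, hour in events:
        for d in range(len(path) + 1):        # every ancestor prefix
            cube[(path[:d], hour)] += 1
    return cube

events = [("012", 9), ("013", 9), ("330", 9), ("012", 10)]
cube = build_cube(events)
cube[("01", 9)]   # events at hour 9 under quadrant prefix "01"
cube[("", 10)]    # events at hour 10 over the whole domain
```

The real nanocube shares structure aggressively between these keys to fit billions of entries in main memory; this sketch materializes every key separately.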

  11. IDA and the Technical Cooperation Program Real-Time Systems and Ada Workshop, 21-23 June 1988

    DTIC Science & Technology

    1988-06-01

    IDA Memorandum Report M-540, IDA and the Technical Cooperation Program Real-Time Systems and Ada Workshop, 21-23 June 1988, documents the results of the workshop... Technology (ODUSD R&AT). Funding was provided by the STARS Joint Program Office. The objectives were to (1) define requirements for using Ada in real-time systems, (2) identify and clarify known Ada real-time issues, (3) identify near-term and long-term solutions, and (4) provide assessment and...

  12. Understanding and Analyzing Latency of Near Real-time Satellite Data

    NASA Astrophysics Data System (ADS)

    Han, W.; Jochum, M.; Brust, J.

    2016-12-01

    Acquiring and disseminating time-sensitive satellite data in a timely manner is a major concern for researchers and decision makers in weather forecasting, severe weather warning, disaster and emergency response, environmental monitoring, and related fields. Understanding and analyzing the latency of near real-time satellite data helps characterize the whole data transmission flow, identify possible issues, and better connect data providers and users. The STAR (Center for Satellite Applications and Research of NOAA) Central Data Repository (SCDR) is a central repository that acquires, manipulates, and disseminates various types of near real-time satellite datasets to internal and external users. In this system, important timestamps, including observation beginning/end, processing, uploading, downloading, and ingestion, are retrieved and organized in the database, so the length of each transmission phase can be determined easily. The open source NoSQL database MongoDB was selected to manage the timestamp information because of its dynamic schema, aggregation, and data processing features. A user-friendly interface was developed to visualize and characterize the latency interactively. Taking the Himawari-8 HSD (Himawari Standard Data) file as an example, the data transmission phases, including creating the HSD file from the satellite observation, uploading the file to HimawariCloud, updating the file link on the webpage, and downloading and ingesting the file into SCDR, are worked out from the above-mentioned timestamps. The latencies can be observed by period of time, day of week, or hour of day in chart or table format, and anomalous latencies can be detected and reported through the user interface. Latency analysis gives data providers and users actionable insight into how to improve the transmission of near real-time satellite data and enhance its acquisition and management.
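    Given the ordered timestamps described above, per-phase latencies reduce to pairwise differences between adjacent timestamps. A minimal sketch (the field names are illustrative, not SCDR's actual schema):

```python
from datetime import datetime

def phase_latencies(record):
    """Derive per-phase transmission latencies (seconds) from the ordered
    timestamps stored for each file; total latency is the sum of phases."""
    order = ["observed", "created", "uploaded", "downloaded", "ingested"]
    t = [datetime.fromisoformat(record[k]) for k in order]
    return {f"{a}->{b}": (tb - ta).total_seconds()
            for a, b, ta, tb in zip(order, order[1:], t, t[1:])}

rec = {"observed":   "2016-07-01T02:00:00",
       "created":    "2016-07-01T02:08:00",
       "uploaded":   "2016-07-01T02:10:30",
       "downloaded": "2016-07-01T02:12:00",
       "ingested":   "2016-07-01T02:12:45"}
lat = phase_latencies(rec)
# e.g. lat["observed->created"] is the file-creation latency in seconds
```

With the timestamps in a database, the same differences can be computed in an aggregation pipeline and grouped by day of week or hour of day to surface anomalies.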

  13. The IPAC Image Subtraction and Discovery Pipeline for the Intermediate Palomar Transient Factory

    NASA Astrophysics Data System (ADS)

    Masci, Frank J.; Laher, Russ R.; Rebbapragada, Umaa D.; Doran, Gary B.; Miller, Adam A.; Bellm, Eric; Kasliwal, Mansi; Ofek, Eran O.; Surace, Jason; Shupe, David L.; Grillmair, Carl J.; Jackson, Ed; Barlow, Tom; Yan, Lin; Cao, Yi; Cenko, S. Bradley; Storrie-Lombardi, Lisa J.; Helou, George; Prince, Thomas A.; Kulkarni, Shrinivas R.

    2017-01-01

    We describe the near real-time transient-source discovery engine for the intermediate Palomar Transient Factory (iPTF), currently in operations at the Infrared Processing and Analysis Center (IPAC), Caltech. We coin this system the IPAC/iPTF Discovery Engine (or IDE). We review the algorithms used for PSF-matching, image subtraction, detection, photometry, and machine-learned (ML) vetting of extracted transient candidates. We also review the performance of our ML classifier. For a limiting signal-to-noise ratio of 4 in relatively unconfused regions, bogus candidates from processing artifacts and imperfect image subtractions outnumber real transients by ≃10:1. This can be considerably higher for image data with inaccurate astrometric and/or PSF-matching solutions. Despite this occasionally high contamination rate, the ML classifier is able to identify real transients with an efficiency (or completeness) of ≃97% for a maximum tolerable false-positive rate of 1% when classifying raw candidates. All subtraction-image metrics, source features, ML probability-based real-bogus scores, contextual metadata from other surveys, and possible associations with known Solar System objects are stored in a relational database for retrieval by the various science working groups. We review our efforts in mitigating false-positives and our experience in optimizing the overall system in response to the multitude of science projects underway with iPTF.
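    The stated operating point above, a maximum tolerable false-positive rate of 1%, corresponds to thresholding the real-bogus score at roughly the 99th percentile of known-bogus candidate scores. A hedged sketch (the scores here are uniform mock data, not output from the iPTF classifier):

```python
def threshold_for_fpr(bogus_scores, max_fpr=0.01):
    """Pick a score threshold such that at most max_fpr of known-bogus
    candidates score above it (i.e. would be passed as 'real')."""
    s = sorted(bogus_scores, reverse=True)
    k = int(max_fpr * len(s))            # number of allowed false positives
    return s[k] if k < len(s) else s[-1]

bogus = [i / 1000 for i in range(1000)]  # mock bogus-candidate scores
t = threshold_for_fpr(bogus, max_fpr=0.01)
# candidates scoring above t are kept; about 1% of bogus ones slip through
```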

  14. The IPAC Image Subtraction and Discovery Pipeline for the Intermediate Palomar Transient Factory

    NASA Technical Reports Server (NTRS)

    Masci, Frank J.; Laher, Russ R.; Rebbapragada, Umaa D.; Doran, Gary B.; Miller, Adam A.; Bellm, Eric; Kasliwal, Mansi; Ofek, Eran O.; Surace, Jason; Shupe, David L.; hide

    2016-01-01

    We describe the near real-time transient-source discovery engine for the intermediate Palomar Transient Factory (iPTF), currently in operations at the Infrared Processing and Analysis Center (IPAC), Caltech. We coin this system the IPAC/iPTF Discovery Engine (or IDE). We review the algorithms used for PSF-matching, image subtraction, detection, photometry, and machine-learned (ML) vetting of extracted transient candidates. We also review the performance of our ML classifier. For a limiting signal-to-noise ratio of 4 in relatively unconfused regions, bogus candidates from processing artifacts and imperfect image subtractions outnumber real transients by approximately equal to 10:1. This can be considerably higher for image data with inaccurate astrometric and/or PSF-matching solutions. Despite this occasionally high contamination rate, the ML classifier is able to identify real transients with an efficiency (or completeness) of approximately equal to 97% for a maximum tolerable false-positive rate of 1% when classifying raw candidates. All subtraction-image metrics, source features, ML probability-based real-bogus scores, contextual metadata from other surveys, and possible associations with known Solar System objects are stored in a relational database for retrieval by the various science working groups. We review our efforts in mitigating false-positives and our experience in optimizing the overall system in response to the multitude of science projects underway with iPTF.

  15. Supporting Building Portfolio Investment and Policy Decision Making through an Integrated Building Utility Data Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aziz, Azizan; Lasternas, Bertrand; Alschuler, Elena

    The American Recovery and Reinvestment Act stimulus funding of 2009 for smart grid projects resulted in the tripling of smart meter deployment. In 2012, the Green Button initiative provided utility customers with access to their real-time energy usage. The availability of finely granular data provides an enormous potential for energy data analytics and energy benchmarking. The sheer volume of time-series utility data from a large number of buildings also poses challenges in data collection, quality control, and database management for rigorous and meaningful analyses. In this paper, we describe a building portfolio-level data analytics tool for operational optimization, business investment, and policy assessment using utility data at intervals from 15 minutes to monthly. The analytics tool is developed on top of the U.S. Department of Energy's Standard Energy Efficiency Data (SEED) platform, an open source software application that manages energy performance data of large groups of buildings. To support the significantly large volume of granular interval data, we integrated a parallel time-series database with the existing relational database. The time-series database improves on the current utility data input, focusing on real-time data collection, storage, analytics, and data quality control. The fully integrated data platform supports APIs for utility app development by third-party software developers. These apps will provide actionable intelligence for building owners and facilities managers. Unlike a commercial system, this platform is an open source platform funded by the U.S. Government, accessible to the public, researchers, and other developers, to support initiatives in reducing building energy consumption.
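    The core data-management task above, storing granular interval readings and rolling them up for portfolio-level analysis, can be sketched with a plain SQL aggregation (a minimal illustration using SQLite; the table and column names are assumptions, not the SEED platform's actual schema):

```python
import sqlite3

# Store 15-minute interval meter readings and roll them up to monthly
# totals per building with a GROUP BY on the month prefix of the timestamp.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (building_id TEXT, ts TEXT, kwh REAL)")
rows = [("B1", "2023-01-01T00:00", 1.5),
        ("B1", "2023-01-01T00:15", 2.0),
        ("B1", "2023-02-01T00:00", 3.0)]
db.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)
monthly = db.execute("""
    SELECT building_id, substr(ts, 1, 7) AS month, SUM(kwh)
    FROM readings GROUP BY building_id, month ORDER BY month""").fetchall()
# monthly now holds one (building, month, total kWh) row per group
```

A dedicated time-series database performs the same kind of rollup, but is built to sustain high-rate inserts and range scans over billions of readings.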

  16. Real-time Human Activity Recognition

    NASA Astrophysics Data System (ADS)

    Albukhary, N.; Mustafah, Y. M.

    2017-11-01

    The traditional Closed-circuit Television (CCTV) system requires humans to monitor the CCTV feed 24/7, which is inefficient and costly. Therefore, there is a need for a system that can recognize human activity effectively in real time. This paper concentrates on recognizing simple activities such as walking, running, sitting, standing, and landing by using image processing techniques. Firstly, object detection is done using background subtraction to detect moving objects. Then, object tracking and object classification are constructed so that different persons can be differentiated using feature detection. Geometrical attributes of each tracked object, namely the centroid and aspect ratio, are then used so that simple activities can be detected.
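    The geometric-attribute step above, computing the centroid and aspect ratio of the foreground blob, can be sketched with simple frame differencing (a simplified single-blob illustration; the function name and threshold are assumptions, and a real system would use a learned background model and connected-component labeling):

```python
import numpy as np

def detect_blob(frame, background, thresh=30):
    """Background subtraction by absolute differencing, then the centroid
    and height/width aspect ratio of the foreground bounding box.
    A tall blob (ratio > 1) suggests a standing person, a wide one lying."""
    fg = np.abs(frame.astype(int) - background.astype(int)) > thresh
    ys, xs = np.nonzero(fg)
    if len(xs) == 0:
        return None
    h = ys.max() - ys.min() + 1
    w = xs.max() - xs.min() + 1
    centroid = (float(xs.mean()), float(ys.mean()))
    return centroid, h / w

bg = np.zeros((20, 20), dtype=np.uint8)
frame = bg.copy()
frame[4:16, 9:12] = 255        # a tall 12x3 foreground "person"
centroid, aspect = detect_blob(frame, bg)
```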

  17. Detection and Tracking of Moving Objects with Real-Time Onboard Vision System

    NASA Astrophysics Data System (ADS)

    Erokhin, D. Y.; Feldman, A. B.; Korepanov, S. E.

    2017-05-01

    Detection of moving objects in a video sequence received from a moving video sensor is one of the most important problems in computer vision. The main purpose of this work is to develop a set of algorithms that can detect and track moving objects in a real-time computer vision system. This set includes three main parts: an algorithm for the estimation and compensation of geometric transformations of images, an algorithm for the detection of moving objects, and an algorithm for tracking the detected objects and predicting their position. The results can be applied to onboard vision systems of aircraft, including small and unmanned aircraft.
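    The prediction step above can be illustrated with a constant-velocity extrapolation (a deliberately minimal sketch of the core idea; an onboard tracker would typically wrap this in a Kalman filter to handle noise and missed detections):

```python
import numpy as np

def predict_next(track, steps=1):
    """Constant-velocity position prediction for a tracked object:
    estimate velocity from the last two observed positions, extrapolate."""
    p_prev, p_last = np.asarray(track[-2]), np.asarray(track[-1])
    v = p_last - p_prev            # displacement per frame
    return p_last + steps * v

# Object observed at three frames, moving +2 px/frame in x, +1 in y.
track = [(10.0, 5.0), (12.0, 6.0), (14.0, 7.0)]
pred = predict_next(track)
```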

  18. Romanian Complex Data Center for Dense Seismic network

    NASA Astrophysics Data System (ADS)

    Neagoe, Cristian; Ionescu, Constantin; Marius Manea, Liviu

    2010-05-01

    Since 2002 the National Institute for Earth Physics (NIEP) has developed its own real-time digital seismic network, consisting of 96 seismic stations, of which 35 have broadband sensors and 24 are equipped with short-period sensors, plus two seismic arrays, transmitting data in real time to the National Data Center (NDC) and the Eforie Nord (EFOR) Seismic Observatory. EFOR is the back-up for the NDC and also a monitoring center for Black Sea tsunamis. Seismic stations are equipped with Quanterra Q330 and K2 digitizers, broadband seismometers (STS2, CMG40T, CMG 3ESP, CMG3T), and Episensor Kinemetrics acceleration sensors (+/- 2g). SeedLink, which is part of Seiscomp 2.5, and Antelope are the software packages used for real-time (RT) acquisition and for data exchange. Communication from the digital seismic stations to the National Data Center in Bucharest and the Eforie Nord Seismic Observatory is assured by five providers (GPRS, VPN, satellite, radio, and Internet communication). For acquisition and data processing at the two reception and processing centers, Antelope 4.11 is used, running on two workstations, one for real-time and the other for offline processing, together with a Seiscomp 3 server that works as back-up for Antelope 4.11. Both the acquisition and analysis systems produce information about the local and global parameters of earthquakes; in addition, Antelope is used for manual processing (event association, magnitude calculation, database creation, sending seismic bulletins, calculation of PGA and PGV, etc.), for generating ShakeMap products, and for interacting with global data centers.
    In order to make all this information easily available across the Web, and also to lay the grounds for a more modular and flexible development environment, the National Data Center developed tools to centralize data from software such as Antelope, which uses a dedicated database system (Datascope, a database system based on text files), into a more general-purpose database, MySQL. MySQL acts as a hub between the different acquisition and analysis systems used in the data center while also providing better connectivity at no expense in security. Mirroring certain data to MySQL also allows the National Data Center to easily share information with the public via the new application being developed, and to mix in data collected from the public (e.g. information about the damage observed after an earthquake, which in turn is used to produce macroseismic intensity indices that are then stored in the database and also made available via the web application). For internal usage there is also a web application which, using data stored in the database, displays earthquake information such as location, magnitude, and depth in semi-real time, thus aiding the personnel on duty. Another use for the collected data is to create and maintain contact lists to which the data center sends notifications (SMS and email) based on the parameters of the earthquake. For future development, amongst other things, the Data Center plans to develop the means to cross-check the generated data between the different acquisition and analysis systems (e.g. comparing data generated by Antelope with data generated by Seiscomp).

  19. An approach for access differentiation design in medical distributed applications built on databases.

    PubMed

    Shoukourian, S K; Vasilyan, A M; Avagyan, A A; Shukurian, A K

    1999-01-01

    A formalized "top to bottom" design approach was described in [1] for distributed applications built on databases, which were considered as a medium between virtual and real user environments for a specific medical application. Merging different components within a unified distributed application poses essential new problems for software. In particular, protection tools that are sufficient separately become deficient during integration, due to specific additional links and relationships not considered formerly. For example, it is impossible to protect a shared object in the virtual operating room using only DBMS protection tools if the object is stored as a record in database tables. The solution of the problem should be sought within the more general application framework; appropriate tools are absent or unavailable. The present paper suggests a detailed outline of a design and testing toolset for access differentiation systems (ADS) in distributed medical applications that use databases. An appropriate formal model, as well as tools for its mapping to a DBMS, are suggested. Remote users connected via global networks are considered too.

  20. Database and interactive monitoring system for the photonics and electronics of RPC Muon Trigger in CMS experiment

    NASA Astrophysics Data System (ADS)

    Wiacek, Daniel; Kudla, Ignacy M.; Pozniak, Krzysztof T.; Bunkowski, Karol

    2005-02-01

    The main task of the RPC (Resistive Plate Chamber) Muon Trigger monitoring system designed for the CMS (Compact Muon Solenoid) experiment (at the LHC at CERN, Geneva) is the visualization of data describing the structure of the electronic trigger system (e.g. geometry and imagery) and the way it processes data, and the automatic generation of files with VHDL source code used for programming the FPGA matrices. In the near future, the system will enable the analysis of the condition, operation, and efficiency of individual Muon Trigger elements, the registration of information about Muon Trigger devices, and the presentation of previously obtained results in an interactive presentation layer. A broad variety of different database and programming concepts for the design of the Muon Trigger monitoring system is presented in this article. The structure and architecture of the system and its principle of operation are described. One of the ideas behind building this system is to use object-oriented programming and design techniques to describe real electronics systems through abstract object models stored in a database, and to implement these models in the Java language.

  1. Determining root correspondence between previously and newly detected objects

    DOEpatents

    Paglieroni, David W.; Beer, N Reginald

    2014-06-17

    A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.
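    The comparison described in the abstract above, matching newly detected objects against the constellation database of previous detections by their attributes, can be sketched as a nearest-attribute test (a hypothetical illustration using location only; the patent's attributes also include time of detection, size, elongation, and orientation, and its topology comparison is not shown):

```python
import math

def detect_changes(previous, new, max_dist=5.0):
    """Flag new detections that have no previously detected counterpart
    within max_dist in attribute space (here just 2-D location)."""
    def close(a, b):
        return math.dist(a["loc"], b["loc"]) <= max_dist
    return [obj for obj in new
            if not any(close(obj, prev) for prev in previous)]

previous = [{"loc": (0.0, 0.0)}, {"loc": (50.0, 20.0)}]
new      = [{"loc": (1.0, 1.0)},      # matches a previous object
            {"loc": (100.0, 100.0)}]  # no counterpart -> flagged as change
changes = detect_changes(previous, new)
```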

  2. RefPrimeCouch—a reference gene primer CouchApp

    PubMed Central

    Silbermann, Jascha; Wernicke, Catrin; Pospisil, Heike; Frohme, Marcus

    2013-01-01

    To support a quantitative real-time polymerase chain reaction standardization project, a new reference gene database application was required. The new database application was built with the explicit goal of simplifying not only the development process but also making the user interface more responsive and intuitive. To this end, CouchDB was used as the backend with a lightweight dynamic user interface implemented client-side as a one-page web application. Data entry and curation processes were streamlined using an OpenRefine-based workflow. The new RefPrimeCouch database application provides its data online under an Open Database License. Database URL: http://hpclife.th-wildau.de:5984/rpc/_design/rpc/view.html PMID:24368831

  3. RefPrimeCouch--a reference gene primer CouchApp.

    PubMed

    Silbermann, Jascha; Wernicke, Catrin; Pospisil, Heike; Frohme, Marcus

    2013-01-01

    To support a quantitative real-time polymerase chain reaction standardization project, a new reference gene database application was required. The new database application was built with the explicit goal of simplifying not only the development process but also making the user interface more responsive and intuitive. To this end, CouchDB was used as the backend with a lightweight dynamic user interface implemented client-side as a one-page web application. Data entry and curation processes were streamlined using an OpenRefine-based workflow. The new RefPrimeCouch database application provides its data online under an Open Database License. Database URL: http://hpclife.th-wildau.de:5984/rpc/_design/rpc/view.html.

  4. Performance assessment of EMR systems based on post-relational database.

    PubMed

    Yu, Hai-Yan; Li, Jing-Song; Zhang, Xiao-Guang; Tian, Yu; Suzuki, Muneou; Araki, Kenji

    2012-08-01

    Post-relational databases provide high performance and are currently widely used in American hospitals. As few hospital information systems (HIS) in either China or Japan are based on post-relational databases, here we introduce a new-generation electronic medical records (EMR) system called Hygeia, which was developed with the post-relational database Caché and the latest platform Ensemble. Utilizing the benefits of a post-relational database, Hygeia is equipped with an "integration" feature that allows all system users to access data, with a fast response time, anywhere and at any time. Performance tests of the databases in EMR systems were implemented in both China and Japan. First, a comparison test was conducted between a post-relational database, Caché, and a relational database, Oracle, embedded in the EMR systems of a medium-sized first-class hospital in China. Second, a user terminal test was done on the EMR system Izanami, which is based on the identical database Caché and operates efficiently at the Miyazaki University Hospital in Japan. The results proved that the post-relational database Caché works faster than the relational database Oracle and showed perfect performance in the real-time EMR system.

  5. Real-time micro-modelling of city evacuations

    NASA Astrophysics Data System (ADS)

    Löhner, Rainald; Haug, Eberhard; Zinggerling, Claudio; Oñate, Eugenio

    2018-01-01

    A methodology to integrate geographical information system (GIS) data with large-scale pedestrian simulations has been developed. Advances in automatic data acquisition and archiving from GIS databases, automatic input for pedestrian simulations, as well as scalable pedestrian simulation tools have made it possible to simulate pedestrians at the individual level for complete cities in real time. An example that simulates the evacuation of the city of Barcelona demonstrates that this is now possible. This is the first step towards a fully integrated crowd prediction and management tool that takes into account not only data gathered in real time from cameras, cell phones or other sensors, but also merges these with advanced simulation tools to predict the future state of the crowd.

  6. Designing an End-to-End System for Data Storage, Analysis, and Visualization for an Urban Environmental Observatory

    NASA Astrophysics Data System (ADS)

    McGuire, M. P.; Welty, C.; Gangopadhyay, A.; Karabatis, G.; Chen, Z.

    2006-05-01

    The urban environment is formed by complex interactions between natural and human dominated systems, the study of which requires the collection and analysis of very large datasets that span many disciplines. Recent advances in sensor technology and automated data collection have improved the ability to monitor urban environmental systems and are making the idea of an urban environmental observatory a reality. This in turn has created a number of challenges in data management and analysis. We present the design of an end-to-end system to store, analyze, and visualize data from a prototype urban environmental observatory based at the Baltimore Ecosystem Study, a National Science Foundation Long Term Ecological Research site (BES LTER). We first present an object-relational design of an operational database to store high-resolution spatial datasets as well as data from sensor networks, archived data from the BES LTER, data from external sources such as USGS NWIS and EPA Storet, and metadata. The second component of the system design is a spatiotemporal data warehouse consisting of a data staging plan and a multidimensional data model designed for the spatiotemporal analysis of monitoring data. The system design also includes applications for multi-resolution exploratory data analysis, multi-resolution data mining, and spatiotemporal visualization based on the spatiotemporal data warehouse. It further includes interfaces with water quality models such as HSPF, SWMM, and SWAT, as well as applications for real-time sensor network visualization, data discovery, data download, QA/QC, and backup and recovery, all of which are based on the operational database. The system design includes both internet and workstation-based interfaces. Finally, we present the design of a laboratory for spatiotemporal analysis and visualization as well as real-time monitoring of the sensor network.

  7. AERONET Version 3 Release: Providing Significant Improvements for Multi-Decadal Global Aerosol Database and Near Real-Time Validation

    NASA Technical Reports Server (NTRS)

    Holben, Brent; Slutsker, Ilya; Giles, David; Eck, Thomas; Smirnov, Alexander; Sinyuk, Aliaksandr; Schafer, Joel; Sorokin, Mikhail; Rodriguez, Jon; Kraft, Jason; hide

    2016-01-01

    Aerosols are highly variable in space, time and properties. Global assessment from satellite platforms and model predictions relies on validation from AERONET, a highly accurate ground-based network. Version 3 represents a significant improvement in accuracy and quality.

  8. Space Situational Awareness Data Processing Scalability Utilizing Google Cloud Services

    NASA Astrophysics Data System (ADS)

    Greenly, D.; Duncan, M.; Wysack, J.; Flores, F.

    Space Situational Awareness (SSA) is a fundamental and critical component of current space operations. The term SSA encompasses the awareness, understanding and predictability of all objects in space. As the population of orbital space objects and debris increases, the number of collision avoidance maneuvers grows, prompting the need for accurate and timely process measures. The SSA mission continually evolves toward near real-time assessment and analysis, demanding higher processing capabilities. By conventional methods, meeting these demands requires the integration of new hardware to keep pace with the growing complexity of maneuver planning algorithms. SpaceNav has implemented a highly scalable architecture that tracks satellites and debris by utilizing powerful virtual machines on the Google Cloud Platform. SpaceNav algorithms for processing CDMs outpace conventional means. A robust processing environment for tracking data, collision avoidance maneuvers and various other aspects of SSA can be created and deleted on demand. The migration of SpaceNav tools and algorithms into the Google Cloud Platform will be discussed, along with the trials and tribulations involved. Information will be shared on how and why certain cloud products were used, as well as the integration techniques that were implemented. Key items to be presented are: 1. Scientific algorithms and SpaceNav tools integrated into a scalable architecture: (a) maneuver planning; (b) parallel processing; (c) Monte Carlo simulations; (d) optimization algorithms; (e) software application development and integration into the Google Cloud Platform. 2. Compute Engine processing: (a) Application Engine automated processing; (b) performance testing and performance scalability; (c) Cloud MySQL databases and database scalability; (d) cloud data storage; (e) redundancy and availability.
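    Monte Carlo simulation is a natural fit for this kind of on-demand cloud scaling, because samples are independent and batches can be farmed out to separate virtual machines and averaged. As a hedged illustration (not SpaceNav's actual algorithm or parameters), a conjunction-style collision probability estimate might look like:

```python
import random

def estimate_collision_probability(mean_miss_km, sigma_km, threshold_km,
                                   n_samples=100_000, seed=42):
    """Estimate P(miss distance < threshold) by Monte Carlo sampling.

    Models the miss distance as Gaussian. Each sample is independent, so
    batches of samples can run on separate cloud workers and be combined.
    """
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_samples)
               if abs(rng.gauss(mean_miss_km, sigma_km)) < threshold_km)
    return hits / n_samples

# Illustrative values only: 1 km mean miss, 0.5 km uncertainty, 200 m threshold.
p = estimate_collision_probability(mean_miss_km=1.0, sigma_km=0.5, threshold_km=0.2)
```

    Scaling out then amounts to running this function on many workers with different seeds and averaging the results.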

  9. A New Parallel Approach for Accelerating the GPU-Based Execution of Edge Detection Algorithms

    PubMed Central

    Emrani, Zahra; Bateni, Soroosh; Rabbani, Hossein

    2017-01-01

    Real-time image processing is used in a wide variety of applications, such as medical care and industrial processes. In medical care, this technique can display important patient information graphically, which can supplement and support the treatment process. Medical decisions made based on real-time images are more accurate and reliable. According to recent research, graphics processing unit (GPU) programming is a useful method for improving the speed and quality of medical image processing and is one route to real-time image processing. Edge detection is an early stage in most image processing methods for the extraction of features and object segments from a raw image. The Canny method, Sobel and Prewitt filters, and the Roberts’ Cross technique are some examples of edge detection algorithms that are widely used in image processing and machine vision. In this work, these algorithms are implemented using the Compute Unified Device Architecture (CUDA), Open Source Computer Vision (OpenCV), and Matrix Laboratory (MATLAB) platforms. An existing parallel method for the Canny approach has been modified further to run in a fully parallel manner, by replacing the breadth-first search procedure with a parallel method. These algorithms have been compared by testing them on a database of optical coherence tomography images. The comparison of results shows that the proposed implementation of the Canny method on GPU using the CUDA platform improves the speed of execution by 2–100× compared to the central processing unit-based implementation using the OpenCV and MATLAB platforms. PMID:28487831
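    The per-pixel independence that makes edge detection GPU-friendly is easy to see in a minimal sketch. The following pure-Python Sobel gradient magnitude is a stand-in for the paper's CUDA kernels; on a GPU, each output pixel in the double loop would simply be computed by its own thread:

```python
def sobel_magnitude(img):
    """Gradient magnitude via 3x3 Sobel kernels (output is zero at the border).

    img: 2D list of grayscale values. Each output pixel depends only on its
    3x3 neighbourhood, so all pixels can be computed in parallel.
    """
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]  # horizontal gradient kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]  # vertical gradient kernel
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A vertical step edge: the filter responds along the columns next to the step.
img = [[0, 0, 10, 10]] * 4
edges = sobel_magnitude(img)
```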

  11. SensorDB: a virtual laboratory for the integration, visualization and analysis of varied biological sensor data.

    PubMed

    Salehi, Ali; Jimenez-Berni, Jose; Deery, David M; Palmer, Doug; Holland, Edward; Rozas-Larraondo, Pablo; Chapman, Scott C; Georgakopoulos, Dimitrios; Furbank, Robert T

    2015-01-01

    To our knowledge, there is no software or database solution that supports large volumes of biological time series sensor data efficiently and enables data visualization and analysis in real time. Existing solutions for managing data typically use unstructured file systems or relational databases. These systems are not designed to provide instantaneous response to user queries. Furthermore, they do not support rapid data analysis and visualization to enable interactive experiments. In large-scale experiments, this behaviour slows research discovery and discourages the widespread sharing and reuse of data that could otherwise inform critical decisions in a timely manner and encourage effective collaboration between groups. In this paper we present SensorDB, a web-based virtual laboratory that can manage large volumes of biological time series sensor data while supporting rapid data queries and real-time user interaction. SensorDB is sensor agnostic and uses web-based, state-of-the-art cloud and storage technologies to efficiently gather, analyse and visualize data. Collaboration and data sharing between different agencies and groups is thereby facilitated. SensorDB is available online at http://sensordb.csiro.au.
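    The core operation such a system must make instantaneous is a time-range query over a sensor stream. SensorDB's internals are not described here, so the following is only a toy in-memory analogue, using binary search so range queries stay fast as streams grow:

```python
import bisect

class SensorStore:
    """Toy in-memory time-series store: per-sensor sorted (timestamp, value) lists."""

    def __init__(self):
        self._streams = {}  # sensor_id -> ([timestamps], [values])

    def insert(self, sensor_id, timestamp, value):
        ts, vs = self._streams.setdefault(sensor_id, ([], []))
        i = bisect.bisect(ts, timestamp)  # keep the stream sorted on insert
        ts.insert(i, timestamp)
        vs.insert(i, value)

    def query(self, sensor_id, t_start, t_end):
        """Return values with t_start <= timestamp <= t_end (O(log n) lookup)."""
        ts, vs = self._streams.get(sensor_id, ([], []))
        lo = bisect.bisect_left(ts, t_start)
        hi = bisect.bisect_right(ts, t_end)
        return vs[lo:hi]

# Hypothetical plant-sensor readings (sensor name and values are illustrative).
db = SensorStore()
for t, v in [(1, 20.5), (2, 21.0), (3, 21.7), (4, 22.1)]:
    db.insert("leaf_temp_01", t, v)
readings = db.query("leaf_temp_01", 2, 3)
```

    A production system would back this with a cloud store and indexes, but the query contract is the same.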

  12. Real-Time Mapping: Contemporary Challenges and the Internet of Things as the Way Forward

    NASA Astrophysics Data System (ADS)

    Bęcek, Kazimierz

    2016-12-01

    The Internet of Things (IoT) is an emerging technology that was conceived in 1999. The key components of the IoT are intelligent sensors, which represent objects of interest. The adjective `intelligent' is used here in the information gathering sense, not the psychological sense. Some 30 billion sensors that `know' the current status of objects they represent are already connected to the Internet. Various studies indicate that the number of installed sensors will reach 212 billion by 2020. Various scenarios of IoT projects show sensors being able to exchange data with the network as well as between themselves. In this contribution, we discuss the possibility of deploying the IoT in cartography for real-time mapping. A real-time map is prepared using data harvested through querying sensors representing geographical objects, and the concept of a virtual sensor for abstract objects, such as a land parcel, is presented. A virtual sensor may exist as a data record in the cloud. Sensors are identified by an Internet Protocol address (IP address), which implies that geographical objects through their sensors would also have an IP address. This contribution is an updated version of a conference paper presented by the author during the International Federation of Surveyors 2014 Congress in Kuala Lumpur. The author hopes that the use of the IoT for real-time mapping will be considered by the mapmaking community.
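    A virtual sensor in this sense is just a cloud-hosted data record that answers queries at an IP address. A minimal sketch of the concept (all names, addresses, and attribute values below are illustrative):

```python
from dataclasses import dataclass, field

@dataclass
class VirtualSensor:
    """A virtual sensor: a data record representing an abstract map object.

    It has no hardware; its 'reading' is simply its current attribute set,
    served from the cloud when a mapmaking client queries its address.
    """
    ip_address: str
    object_type: str
    attributes: dict = field(default_factory=dict)

    def query(self):
        # What a real-time map would harvest when polling this sensor.
        return {"ip": self.ip_address, "type": self.object_type, **self.attributes}

# A land parcel, an abstract object with no physical sensor attached.
parcel = VirtualSensor("2001:db8::42", "land_parcel",
                       {"owner": "unknown", "area_m2": 1250})
status = parcel.query()
```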

  13. Earlinet database: new design and new products for a wider use of aerosol lidar data

    NASA Astrophysics Data System (ADS)

    Mona, Lucia; D'Amico, Giuseppe; Amato, Francesco; Linné, Holger; Baars, Holger; Wandinger, Ulla; Pappalardo, Gelsomina

    2018-04-01

    The EARLINET database is undergoing a complete reshaping to meet the wide demand for more intuitive products and the even broader requirements of new initiatives such as Copernicus, the European Earth observation programme. The new design has been carried out in continuity with the past, to take advantage of the long-term database. In particular, the new structure will provide information suitable for synergy with other instruments, near real-time (NRT) applications, validation and process studies, and climate applications.

  14. Real-Time MENTAT programming language and architecture

    NASA Technical Reports Server (NTRS)

    Grimshaw, Andrew S.; Silberman, Ami; Liu, Jane W. S.

    1989-01-01

    Real-time MENTAT, a programming environment designed to simplify the task of programming real-time applications in distributed and parallel environments, is described. It is based on the same data-driven computation model and object-oriented programming paradigm as MENTAT. It provides an easy-to-use mechanism to exploit parallelism, language constructs for the expression and enforcement of timing constraints, and run-time support for scheduling and executing real-time programs. The real-time MENTAT programming language is an extended C++. The extensions are added to facilitate automatic detection of data flow and generation of data flow graphs, to express the timing constraints of individual granules of computation, and to provide scheduling directives for the run-time system. A high-level view of the real-time MENTAT system architecture and programming language constructs is provided.

  15. Methods and apparatus for constructing and implementing a universal extension module for processing objects in a database

    NASA Technical Reports Server (NTRS)

    Li, Chung-Sheng (Inventor); Smith, John R. (Inventor); Chang, Yuan-Chi (Inventor); Jhingran, Anant D. (Inventor); Padmanabhan, Sriram K. (Inventor); Hsiao, Hui-I (Inventor); Choy, David Mun-Hien (Inventor); Lin, Jy-Jine James (Inventor); Fuh, Gene Y. C. (Inventor); Williams, Robin (Inventor)

    2004-01-01

    Methods and apparatus for providing a multi-tier object-relational database architecture are disclosed. In one illustrative embodiment of the present invention, a multi-tier database architecture comprises an object-relational database engine as a top tier, one or more domain-specific extension modules as a bottom tier, and one or more universal extension modules as a middle tier. The individual extension modules of the bottom tier operationally connect with the one or more universal extension modules which, themselves, operationally connect with the database engine. The domain-specific extension modules preferably provide such functions as search, index, and retrieval services of images, video, audio, time series, web pages, text, XML, spatial data, etc. The domain-specific extension modules may include one or more IBM DB2 extenders, Oracle data cartridges and/or Informix datablades, although other domain-specific extension modules may be used.

  16. CALS Database Usage and Analysis Tool Study

    DTIC Science & Technology

    1991-09-01

    The paper treats inference aggregation and cardinality aggregation as two distinct aspects of the aggregation problem and develops the concept of a semantic... Keywords: inference aggregation, cardinality aggregation. The bibliography includes "NIDX - An Expert System for Real-Time..." and the 1989 IEEE Symposium on Research in Security and Privacy, Oakland, CA, May 1989 (Baur, D.S.; Eichelman, F.R. II; Herrera, R.M.; Irgon, A.E.).

  17. Genevar: a database and Java application for the analysis and visualization of SNP-gene associations in eQTL studies.

    PubMed

    Yang, Tsun-Po; Beazley, Claude; Montgomery, Stephen B; Dimas, Antigone S; Gutierrez-Arcelus, Maria; Stranger, Barbara E; Deloukas, Panos; Dermitzakis, Emmanouil T

    2010-10-01

    Genevar (GENe Expression VARiation) is a database and Java tool designed to integrate multiple datasets and to provide analysis and visualization of associations between sequence variation and gene expression. Genevar allows researchers to investigate expression quantitative trait loci (eQTL) associations within a gene locus of interest in real time. The database and application can be installed on a standard computer in database mode and, in addition, on a server to share discoveries among affiliations or the broader community over the Internet via web services protocols. http://www.sanger.ac.uk/resources/software/genevar.

  18. Neuromorphic Event-Based 3D Pose Estimation

    PubMed Central

    Reverter Valeiras, David; Orchard, Garrick; Ieng, Sio-Hoi; Benosman, Ryad B.

    2016-01-01

    Pose estimation is a fundamental step in many artificial vision tasks. It consists of estimating the 3D pose of an object with respect to a camera from the object's 2D projection. Current state-of-the-art implementations operate on images. These implementations are computationally expensive, especially for real-time applications. Scenes with fast dynamics exceeding 30–60 Hz can rarely be processed in real time using conventional hardware. This paper presents a new method for event-based 3D object pose estimation, making full use of the high temporal resolution (1 μs) of asynchronous visual events output from a single neuromorphic camera. Given an initial estimate of the pose, each incoming event is used to update the pose by combining both 3D and 2D criteria. We show that the asynchronous high temporal resolution of the neuromorphic camera allows us to solve the problem in an incremental manner, achieving real-time performance at an update rate of several hundred kHz on a conventional laptop. We show that the high temporal resolution of neuromorphic cameras is a key feature for performing accurate pose estimation. Experiments are provided showing the performance of the algorithm on real data, including fast moving objects, occlusions, and cases where the neuromorphic camera and the object are both in motion. PMID:26834547

  19. Using real objects to teach about climate change: an ethnographic perspective

    NASA Astrophysics Data System (ADS)

    Conner, L.; Perin, S.; Coats, V.; Sturm, M.

    2017-12-01

    Informal educators frequently use real objects to connect visitors with science content that can otherwise seem abstract. Our NSF-funded project, "Hot Times in Cold Places," leverages this premise to teach about climate change through real objects associated with the nation's only permafrost tunnel, located in Fox, Alaska. We posit that touching real ice, holding Pleistocene bones, and seeing ice wedges in context allow learners to understand climate change in a direct and visceral manner. We are conducting ethnographic research to understand visitor experience at both the tunnel itself and at a permafrost museum exhibit that we are creating as part of the project. Research questions include: 1) What is the nature of visitor talk with respect to explanations about permafrost, tipping points, climate change, and geological time? 2) How do attributes of "realness" (scale, resolution, uniqueness, history and adherence to an original) affect visitors' experience of objects, as perceived through the senses and emotions? We use naturalistic observation, interviews, and videotaping to answer these questions. Analysis focuses on child-to-child talk, reciprocal talk between educator and child, and reciprocal talk between parent and child. Our results elucidate the value of real objects, versus replicated and virtual ones, in informal learning, especially in the context of climate change education. An understanding of these factors can help informal learning educators make informed choices about program and exhibit design.

  20. LivePhantom: Retrieving Virtual World Light Data to Real Environments.

    PubMed

    Kolivand, Hoshang; Billinghurst, Mark; Sunar, Mohd Shahrizal

    2016-01-01

    To achieve realistic Augmented Reality (AR), shadows play an important role in creating a 3D impression of a scene. Casting virtual shadows on real and virtual objects is one of the topics of research being conducted in this area. In this paper, we propose a new method for creating complex AR indoor scenes using real time depth detection to exert virtual shadows on virtual and real environments. A Kinect camera was used to produce a depth map for the physical scene mixing into a single real-time transparent tacit surface. Once this is created, the camera's position can be tracked from the reconstructed 3D scene. Real objects are represented by virtual object phantoms in the AR scene enabling users holding a webcam and a standard Kinect camera to capture and reconstruct environments simultaneously. The tracking capability of the algorithm is shown and the findings are assessed drawing upon qualitative and quantitative methods making comparisons with previous AR phantom generation applications. The results demonstrate the robustness of the technique for realistic indoor rendering in AR systems.

  2. Real time eye tracking using Kalman extended spatio-temporal context learning

    NASA Astrophysics Data System (ADS)

    Munir, Farzeen; Minhas, Fayyaz ul Amir Asfar; Jalil, Abdul; Jeon, Moongu

    2017-06-01

    Real-time eye tracking has numerous applications in human-computer interaction, such as mouse cursor control in a computer system. It is useful for persons with muscular or motion impairments. However, tracking the movement of the eye is complicated by occlusion due to blinking, head movement, screen glare, rapid eye movements, etc. In this work, we present the algorithmic and construction details of a real-time eye tracking system. Our proposed system is an extension of spatio-temporal context learning through Kalman filtering. Spatio-temporal context learning offers state-of-the-art accuracy in general object tracking, but its performance suffers under object occlusion. The addition of the Kalman filter allows the proposed method to model the dynamics of eye motion and provide robust eye tracking in cases of occlusion. We demonstrate the effectiveness of this tracking technique by controlling the computer cursor in real time through eye movements.
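    The general shape of such a tracker is a constant-velocity Kalman filter that skips the measurement update while the eye is occluded and coasts on its prediction instead. The 1D sketch below is a generic illustration of that idea, not the authors' exact formulation:

```python
def kalman_track(measurements, q=0.01, r=1.0):
    """1D constant-velocity Kalman filter; None marks an occluded frame (blink).

    State is [position, velocity]. On occlusion the update step is skipped
    and the prediction kept, which is what bridges blinks.
    """
    x, v = measurements[0], 0.0          # state estimate
    p = [[1.0, 0.0], [0.0, 1.0]]         # state covariance
    track = []
    for z in measurements[1:]:
        # Predict: x' = x + v, v' = v; covariance grows by process noise q.
        x, v = x + v, v
        p = [[p[0][0] + 2 * p[0][1] + p[1][1] + q, p[0][1] + p[1][1]],
             [p[0][1] + p[1][1], p[1][1] + q]]
        if z is not None:
            # Update with measurement noise r (measurement observes position).
            s = p[0][0] + r
            k0, k1 = p[0][0] / s, p[0][1] / s
            x, v = x + k0 * (z - x), v + k1 * (z - x)
            p = [[(1 - k0) * p[0][0], (1 - k0) * p[0][1]],
                 [p[1][0] - k1 * p[0][0], p[1][1] - k1 * p[0][1]]]
        track.append(x)
    return track

# Eye moving right at ~1 px/frame with a two-frame blink in the middle.
track = kalman_track([0, 1, 2, 3, None, None, 6, 7])
```

    During the blink the estimate keeps advancing at the learned velocity, so tracking resumes smoothly when measurements return.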

  3. The ATLAS conditions database architecture for the Muon spectrometer

    NASA Astrophysics Data System (ADS)

    Verducci, Monica; ATLAS Muon Collaboration

    2010-04-01

    The Muon System, facing the challenging requirement of conditions data storage, has started to use the conditions database project 'COOL' extensively as the basis for all its conditions data storage, both at CERN and throughout the worldwide collaboration, as decided by the ATLAS Collaboration. The management of the Muon COOL conditions database will be one of the most challenging applications for the Muon System, both in terms of data volumes and rates and in terms of the variety of data stored. The Muon conditions database is responsible for storing almost all of the 'non-event' data and detector quality flags needed for debugging detector operations and for performing reconstruction and analysis. The COOL database allows database applications to be written independently of the underlying database technology and ensures long-term compatibility with the entire ATLAS software. COOL implements an interval-of-validity database: objects stored or referenced in COOL have an associated start and end time between which they are valid. The data is stored in folders, which are themselves arranged in a hierarchical structure of folder sets. The structure is simple and mainly optimized to store and retrieve objects associated with a particular time. In this work, an overview of the entire Muon conditions database architecture is given, including the different sources of the data and the storage model used. In addition, the software interfaces used to access the conditions data are described; particular emphasis is given to the offline reconstruction framework ATHENA and the services developed to provide the conditions data to the reconstruction.
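    The interval-of-validity model at the heart of COOL (each stored object valid between a start and end time, retrieved by timestamp) can be sketched generically as follows. This is an illustration of the concept only, not the COOL API, and the payload fields are invented:

```python
import bisect

class IOVFolder:
    """Toy interval-of-validity folder: non-overlapping [start, end) payloads."""

    def __init__(self):
        self._starts = []   # sorted validity-start times
        self._entries = []  # (start, end, payload), parallel to _starts

    def store(self, start, end, payload):
        i = bisect.bisect(self._starts, start)
        self._starts.insert(i, start)
        self._entries.insert(i, (start, end, payload))

    def retrieve(self, time):
        """Return the payload valid at `time`, or None if no interval covers it."""
        i = bisect.bisect_right(self._starts, time) - 1
        if i >= 0:
            start, end, payload = self._entries[i]
            if start <= time < end:
                return payload
        return None

# Hypothetical alignment constants valid over two successive run ranges.
alignment = IOVFolder()
alignment.store(0, 100, {"chamber_shift_mm": 0.12})
alignment.store(100, 200, {"chamber_shift_mm": 0.15})
payload = alignment.retrieve(150)
```

    Reconstruction code then asks the folder for "the conditions valid at this event's time" without caring how or where the data is stored.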

  4. Object Recognition and Localization: The Role of Tactile Sensors

    PubMed Central

    Aggarwal, Achint; Kirchner, Frank

    2014-01-01

    Tactile sensors, because of their intrinsic insensitivity to lighting conditions and water turbidity, provide promising opportunities for augmenting the capabilities of vision sensors in applications involving object recognition and localization. This paper presents two approaches for haptic object recognition and localization for ground and underwater environments. The first approach called Batch Ransac and Iterative Closest Point augmented Particle Filter (BRICPPF) is based on an innovative combination of particle filters, Iterative-Closest-Point algorithm, and a feature-based Random Sampling and Consensus (RANSAC) algorithm for database matching. It can handle a large database of 3D-objects of complex shapes and performs a complete six-degree-of-freedom localization of static objects. The algorithms are validated by experimentation in ground and underwater environments using real hardware. To our knowledge this is the first instance of haptic object recognition and localization in underwater environments. The second approach is biologically inspired, and provides a close integration between exploration and recognition. An edge following exploration strategy is developed that receives feedback from the current state of recognition. A recognition by parts approach is developed which uses the BRICPPF for object sub-part recognition. Object exploration is either directed to explore a part until it is successfully recognized, or is directed towards new parts to endorse the current recognition belief. This approach is validated by simulation experiments. PMID:24553087

  5. Real-time object recognition in multidimensional images based on joined extended structural tensor and higher-order tensor decomposition methods

    NASA Astrophysics Data System (ADS)

    Cyganek, Boguslaw; Smolka, Bogdan

    2015-02-01

    In this paper a system for real-time recognition of objects in multidimensional video signals is proposed. Object recognition is done by pattern projection into the tensor subspaces obtained from the factorization of the signal tensors representing the input signal. However, instead of taking only the intensity signal, the novelty of this paper is to first build the Extended Structural Tensor representation from the intensity signal, which conveys information on signal intensities as well as on higher-order statistics of the input signals. In this way the higher-order input pattern tensors are built from the training samples. Then, the tensor subspaces are built based on the Higher-Order Singular Value Decomposition of the prototype pattern tensors. Finally, recognition relies on measuring the distance of a test pattern projected into the tensor subspaces obtained from the training tensors. Due to the high dimensionality of the input data, tensor-based methods require substantial memory and computational resources. However, recent achievements in multi-core microprocessor and graphics card technology allow real-time operation of the multidimensional methods, as is shown and analyzed in this paper with real examples of object detection in digital images.
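    The final recognition step, projecting a test pattern into a learned subspace and measuring the residual, can be shown in miniature. The sketch below substitutes a hand-picked orthonormal basis for the HOSVD-derived tensor subspaces, so it illustrates only the distance computation, not the decomposition itself:

```python
def subspace_distance(x, basis):
    """Distance from vector x to span(basis); basis vectors must be orthonormal.

    The projection residual is small when x belongs to the class whose
    training samples spanned the subspace.
    """
    proj = [0.0] * len(x)
    for u in basis:
        c = sum(xi * ui for xi, ui in zip(x, u))       # coefficient <x, u>
        proj = [p + c * ui for p, ui in zip(proj, u)]  # accumulate projection
    return sum((xi - pi) ** 2 for xi, pi in zip(x, proj)) ** 0.5

# Toy class subspace spanned by the first two coordinate axes.
basis = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
in_class = subspace_distance([3.0, 4.0, 0.0], basis)      # lies in the span
out_of_class = subspace_distance([0.0, 0.0, 5.0], basis)  # orthogonal to it
```

    Classification then assigns a test pattern to the class whose subspace yields the smallest residual.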

  6. Real-Time Processing System for the JET Hard X-Ray and Gamma-Ray Profile Monitor Enhancement

    NASA Astrophysics Data System (ADS)

    Fernandes, Ana M.; Pereira, Rita C.; Neto, André; Valcárcel, Daniel F.; Alves, Diogo; Sousa, Jorge; Carvalho, Bernardo B.; Kiptily, Vasily; Syme, Brian; Blanchard, Patrick; Murari, Andrea; Correia, Carlos M. B. A.; Varandas, Carlos A. F.; Gonçalves, Bruno

    2014-06-01

    The Joint European Torus (JET) is currently undertaking an enhancement programme which includes tests of relevant diagnostics with real-time processing capabilities for the International Thermonuclear Experimental Reactor (ITER). Accordingly, a new real-time processing system was developed and installed at JET for the gamma-ray and hard X-ray profile monitor diagnostic. The new system is connected to 19 CsI(Tl) photodiodes in order to obtain the line-integrated profiles of the gamma-ray and hard X-ray emissions. Moreover, it was designed to overcome the former data acquisition (DAQ) limitations while exploiting the required real-time features. The new DAQ hardware, based on the Advanced Telecommunications Computing Architecture (ATCA) standard, includes reconfigurable digitizer modules with embedded field-programmable gate array (FPGA) devices capable of acquiring and simultaneously processing data in real time from the 19 detectors. A suitable algorithm was developed and implemented in the FPGAs, which are able to deliver the corresponding energy of the acquired pulses. The processed data is sent periodically, during the discharge, through the JET real-time network and stored in the JET scientific databases at the end of the pulse. The interface between the ATCA digitizers, the JET control and data acquisition system (CODAS), and the JET real-time network is provided by the Multithreaded Application Real-Time executor (MARTe). This work attained two of the major milestones required by next-generation fusion devices: the ability to process and simultaneously supply high-volume data rates in real time.

  7. Lexical leverage: Category knowledge boosts real-time novel word recognition in two-year- olds

    PubMed Central

    Borovsky, Arielle; Ellis, Erica M.; Evans, Julia L.; Elman, Jeffrey L.

    2016-01-01

    Recent research suggests that infants tend to add words to their vocabulary that are semantically related to other known words, though it is not clear why this pattern emerges. In this paper, we explore whether infants leverage their existing vocabulary and semantic knowledge when interpreting novel label-object mappings in real time. We initially identified categorical domains for which individual 24-month-old infants have relatively higher and lower levels of knowledge, irrespective of overall vocabulary size. Next, we taught infants novel words in these higher and lower knowledge domains and then asked whether their subsequent real-time recognition of these items varied as a function of their category knowledge. While our participants successfully acquired the novel label-object mappings in our task, there were important differences in the way infants recognized these words in real time. Namely, infants showed more robust recognition of high (vs. low) domain knowledge words. These findings suggest that dense semantic structure facilitates early word learning and real-time novel word recognition. PMID:26452444

  8. Bats' avoidance of real and virtual objects: implications for the sonar coding of object size.

    PubMed

    Goerlitz, Holger R; Genzel, Daria; Wiegrebe, Lutz

    2012-01-01

    Fast movement in complex environments requires the controlled evasion of obstacles. Sonar-based obstacle evasion involves analysing the acoustic features of object-echoes (e.g., echo amplitude) that correlate with this object's physical features (e.g., object size). Here, we investigated sonar-based obstacle evasion in bats emerging in groups from their day roost. Using video-recordings, we first show that the bats evaded a small real object (ultrasonic loudspeaker) despite the familiar flight situation. Secondly, we studied the sonar coding of object size by adding a larger virtual object. The virtual object echo was generated by real-time convolution of the bats' calls with the acoustic impulse response of a large spherical disc and played from the loudspeaker. Contrary to the real object, the virtual object did not elicit evasive flight, despite the spectro-temporal similarity of real and virtual object echoes. Yet, their spatial echo features differ: virtual object echoes lack the spread of angles of incidence from which the echoes of large objects arrive at a bat's ears (sonar aperture). We hypothesise that this mismatch of spectro-temporal and spatial echo features caused the lack of virtual object evasion and suggest that the sonar aperture of object echoscapes contributes to the sonar coding of object size. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Virtual acoustics displays

    NASA Technical Reports Server (NTRS)

    Wenzel, Elizabeth M.; Fisher, Scott S.; Stone, Philip K.; Foster, Scott H.

    1991-01-01

    The real time acoustic display capabilities are described which were developed for the Virtual Environment Workstation (VIEW) Project at NASA-Ames. The acoustic display is capable of generating localized acoustic cues in real time over headphones. An auditory symbology, a related collection of representational auditory 'objects' or 'icons', can be designed using ACE (Auditory Cue Editor), which links both discrete and continuously varying acoustic parameters with information or events in the display. During a given display scenario, the symbology can be dynamically coordinated in real time with 3-D visual objects, speech, and gestural displays. The types of displays feasible with the system range from simple warnings and alarms to the acoustic representation of multidimensional data or events.

  10. Virtual acoustics displays

    NASA Astrophysics Data System (ADS)

    Wenzel, Elizabeth M.; Fisher, Scott S.; Stone, Philip K.; Foster, Scott H.

    1991-03-01

    The real time acoustic display capabilities are described which were developed for the Virtual Environment Workstation (VIEW) Project at NASA-Ames. The acoustic display is capable of generating localized acoustic cues in real time over headphones. An auditory symbology, a related collection of representational auditory 'objects' or 'icons', can be designed using ACE (Auditory Cue Editor), which links both discrete and continuously varying acoustic parameters with information or events in the display. During a given display scenario, the symbology can be dynamically coordinated in real time with 3-D visual objects, speech, and gestural displays. The types of displays feasible with the system range from simple warnings and alarms to the acoustic representation of multidimensional data or events.

  11. Attribute and topology based change detection in a constellation of previously detected objects

    DOEpatents

    Paglieroni, David W.; Beer, Reginald N.

    2016-01-19

    A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.
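
    The attribute comparison at the heart of this patent can be sketched as follows, assuming a simple nearest-location matcher; the real system also compares size, elongation, orientation, and the network topology itself, all omitted here:

```python
from math import hypot

def detect_changes(previous, current, max_dist=5.0):
    """Compare newly detected objects against a constellation of
    previously detected ones. Objects are dicts with an 'id' and an
    (x, y) 'pos'; a current object matches a previous one if it lies
    within max_dist of it. Returns (matched, appeared, disappeared)
    as sets of object ids.
    """
    matched, appeared = set(), set()
    unclaimed = dict((o["id"], o) for o in previous)
    for obj in current:
        best, best_d = None, max_dist
        for pid, prev in unclaimed.items():
            d = hypot(obj["pos"][0] - prev["pos"][0],
                      obj["pos"][1] - prev["pos"][1])
            if d <= best_d:
                best, best_d = pid, d
        if best is not None:
            matched.add(best)
            del unclaimed[best]
        else:
            appeared.add(obj["id"])
    return matched, appeared, set(unclaimed)

prev = [{"id": "a", "pos": (0, 0)}, {"id": "b", "pos": (10, 10)}]
curr = [{"id": "n1", "pos": (1, 1)}, {"id": "n2", "pos": (30, 30)}]
print(detect_changes(prev, curr))  # → ({'a'}, {'n2'}, {'b'})
```

    Objects that find no counterpart in the constellation are reported as changes, which is the essence of the claimed detection step.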

  12. Automated data processing architecture for the Gemini Planet Imager Exoplanet Survey

    NASA Astrophysics Data System (ADS)

    Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Maire, Jérôme; Marchis, Franck; Graham, James R.; Macintosh, Bruce; Ammons, S. Mark; Bailey, Vanessa P.; Barman, Travis S.; Bruzzone, Sebastian; Bulger, Joanna; Cotten, Tara; Doyon, René; Duchêne, Gaspard; Fitzgerald, Michael P.; Follette, Katherine B.; Goodsell, Stephen; Greenbaum, Alexandra Z.; Hibon, Pascale; Hung, Li-Wei; Ingraham, Patrick; Kalas, Paul; Konopacky, Quinn M.; Larkin, James E.; Marley, Mark S.; Metchev, Stanimir; Nielsen, Eric L.; Oppenheimer, Rebecca; Palmer, David W.; Patience, Jennifer; Poyneer, Lisa A.; Pueyo, Laurent; Rajan, Abhijith; Rantakyrö, Fredrik T.; Schneider, Adam C.; Sivaramakrishnan, Anand; Song, Inseok; Soummer, Remi; Thomas, Sandrine; Wallace, J. Kent; Ward-Duong, Kimberly; Wiktorowicz, Sloane J.

    2018-01-01

    The Gemini Planet Imager Exoplanet Survey (GPIES) is a multiyear direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the Data Cruncher, combines multiple data reduction pipelines (DRPs) together to process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our DRPs. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.
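
    The file-indexing idea can be sketched with SQLite standing in for the MySQL backend; the schema, column names, and file names below are invented for illustration and are not the actual GPIES database design:

```python
import sqlite3

# Hypothetical, minimal index of survey files: name, target, observing
# mode, and whether a reduced product exists yet.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE files (
    name TEXT PRIMARY KEY, target TEXT, mode TEXT, reduced INTEGER)""")
rows = [
    ("S20180101E0001.fits", "51 Eri", "spec", 1),
    ("S20180101E0002.fits", "51 Eri", "pol", 0),
    ("S20180102E0001.fits", "HR 8799", "spec", 1),
]
con.executemany("INSERT INTO files VALUES (?, ?, ?, ?)", rows)

# Front-end style query: all reduced spectroscopic files, in order.
reduced_spec = [r[0] for r in con.execute(
    "SELECT name FROM files WHERE mode = 'spec' AND reduced = 1 ORDER BY name")]
print(reduced_spec)
```

    A homogeneous index like this is what lets a pipeline such as the Data Cruncher locate every file affected by a DRP upgrade and queue it for reprocessing.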

  13. Second Annual Conference on Astronomical Data Analysis Software and Systems. Abstracts

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Abstracts from the conference are presented. The topics covered include the following: next generation software systems and languages; databases, catalogs, and archives; user interfaces/visualization; real-time data acquisition/scheduling; and IRAF/STSDAS/PROS status reports.

  14. Reactive Aggregate Model Protecting Against Real-Time Threats

    DTIC Science & Technology

    2014-09-01

    on the underlying functionality of three core components. • MS SQL server 2008 backend database. • Microsoft IIS running on Windows server 2008...services. The capstone tested a Linux-based Apache web server with the following software implementations: • MySQL as a Linux-based backend server for...malicious compromise. 1. Assumptions • GINA could connect to a backend MS SQL database through proper configuration of DotNetNuke. • GINA had access

  15. Physical Oceanography Program Science Abstracts.

    DTIC Science & Technology

    1985-04-01

    substantial part of the database used by the U.S. Navy and the U.S. National Weather Service to generate, in real-time, subsurface temperature maps...quality, 1ST database which incorporates GTS bathymessages and on-ship recordings from the Pacific for the period 1979 through 1983. Access to these data...Investigator: Stanley M. Flatté Frank S. Henyey INTERNAL-WAVE NONLINEAR INTERACTIONS BY THE EIKONAL METHOD We have been involved in the study of

  16. An efficient sequential approach to tracking multiple objects through crowds for real-time intelligent CCTV systems.

    PubMed

    Li, Liyuan; Huang, Weimin; Gu, Irene Yu-Hua; Luo, Ruijiang; Tian, Qi

    2008-10-01

    Efficiency and robustness are the two most important issues for multiobject tracking algorithms in real-time intelligent video surveillance systems. We propose a novel 2.5-D approach to real-time multiobject tracking in crowds, which is formulated as a maximum a posteriori estimation problem and is approximated through an assignment step and a location step. Observing that the occluding object is usually less affected by the occluded objects, sequential solutions for the assignment and the location are derived. A novel dominant color histogram (DCH) is proposed as an efficient object model. The DCH can be regarded as a generalized color histogram, where dominant colors are selected based on a given distance measure. Compared with conventional color histograms, the DCH only requires a few color components (31 on average). Furthermore, our theoretical analysis and evaluation on real data have shown that DCHs are robust to illumination changes. Using the DCH, efficient implementations of sequential solutions for the assignment and location steps are proposed. The assignment step includes the estimation of the depth order for the objects in a dispersing group, one-by-one assignment, and feature exclusion from the group representation. The location step includes the depth-order estimation for the objects in a new group, the two-phase mean-shift location, and the exclusion of tracked objects from the new position in the group. Multiobject tracking results and evaluation from public data sets are presented. Experiments on image sequences captured from crowded public environments have shown good tracking results, where about 90% of the objects have been successfully tracked with the correct identification numbers by the proposed method. Our results and evaluation have indicated that the method is efficient and robust for tracking multiple objects (three or more) in complex occlusion for real-world surveillance scenarios.
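
    A rough sketch of the dominant-color idea, assuming a frequency-based selection rule; the published DCH selects dominant colors with a distance measure in a real color space, so the functions below are only a conceptual approximation:

```python
from collections import Counter

def dominant_color_histogram(pixels, coverage=0.9):
    """Keep only the most frequent colors until `coverage` of the
    pixels is accounted for, yielding a compact object model with a
    handful of components instead of a full histogram.
    """
    counts = Counter(pixels)
    total = len(pixels)
    hist, covered = {}, 0
    for color, n in counts.most_common():
        hist[color] = n / total
        covered += n
        if covered / total >= coverage:
            break
    return hist

def histogram_intersection(h1, h2):
    """Standard histogram-intersection similarity between two models."""
    return sum(min(h1.get(c, 0.0), h2.get(c, 0.0)) for c in set(h1) | set(h2))

pixels = ["red"] * 6 + ["blue"] * 3 + ["green"]
print(dominant_color_histogram(pixels))  # → {'red': 0.6, 'blue': 0.3}
```

    During tracking, `histogram_intersection` (or a similar measure) would score candidate image regions against each object's compact model.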

  17. Deep Learning for Real-Time Capable Object Detection and Localization on Mobile Platforms

    NASA Astrophysics Data System (ADS)

    Particke, F.; Kolbenschlag, R.; Hiller, M.; Patiño-Studencki, L.; Thielecke, J.

    2017-10-01

    Industry 4.0 is one of the most formative terms of our time. Research focuses in particular on smart, autonomous mobile platforms, which substantially lighten workloads and optimize production processes. In order to interact with humans, the platforms need an in-depth knowledge of the environment. Hence, it is required to detect a variety of static and non-static objects. The goal of this paper is to propose an accurate and real-time capable object detection and localization approach for use on mobile platforms. A method is introduced to use the powerful detection capabilities of a neural network for the localization of objects. To this end, detection information from a neural network is combined with depth information from an RGB-D camera, which is mounted on a mobile platform. As detection network, YOLO Version 2 (YOLOv2) is used on a mobile robot. In order to find the detected object in the depth image, the bounding boxes, predicted by YOLOv2, are mapped to the corresponding regions in the depth image. This provides a powerful and extremely fast approach for establishing a real-time-capable Object Locator. In the evaluation, the localization approach proves very accurate, though accuracy depends on the detected object itself and on additional parameters, which are analysed in this paper.
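
    The bounding-box-to-depth mapping described above can be sketched as follows; the box format, the median statistic, and the zero-means-invalid convention are assumptions for illustration, not details from the paper:

```python
def localize(bbox, depth_image):
    """Map a detector bounding box onto a depth image and estimate the
    object's distance as the median depth inside the box (the median is
    more robust than the mean to background pixels leaking into the box).

    bbox is (x, y, w, h) in pixel coordinates; depth_image is a list of
    rows of depth values in metres; zero values (no measurement) are
    ignored. Returns None if the box contains no valid depth.
    """
    x, y, w, h = bbox
    depths = []
    for row in depth_image[y:y + h]:
        for d in row[x:x + w]:
            if d > 0:
                depths.append(d)
    if not depths:
        return None
    depths.sort()
    return depths[len(depths) // 2]

depth = [
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 1.2, 1.3, 4.0],
    [0.0, 1.2, 1.1, 4.0],
    [0.0, 0.0, 0.0, 0.0],
]
print(localize((1, 1, 2, 2), depth))  # → 1.2
```

    Combined with the camera intrinsics, the box centre and this depth give the object's 3-D position relative to the platform.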

  18. Real-time ECG monitoring and arrhythmia detection using Android-based mobile devices.

    PubMed

    Gradl, Stefan; Kugler, Patrick; Lohmuller, Clemens; Eskofier, Bjoern

    2012-01-01

    We developed an application for Android™-based mobile devices that allows real-time electrocardiogram (ECG) monitoring and automated arrhythmia detection by analyzing ECG parameters. ECG data provided by pre-recorded files or acquired live by accessing a Shimmer™ sensor node via Bluetooth™ can be processed and evaluated. The application is based on the Pan-Tompkins algorithm for QRS-detection and contains further algorithm blocks to detect abnormal heartbeats. The algorithm was validated using the MIT-BIH Arrhythmia and MIT-BIH Supraventricular Arrhythmia databases. More than 99% of all QRS complexes were detected correctly by the algorithm. Overall sensitivity for abnormal beat detection was 89.5% with a specificity of 80.6%. The application is available for download and may be used for real-time ECG-monitoring on mobile devices.
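
    A heavily simplified sketch of the derivative-square-integrate stage of a Pan-Tompkins-style QRS detector; the real algorithm adds band-pass filtering, adaptive thresholds, and search-back, all omitted here, and the parameter values are illustrative:

```python
def detect_qrs(ecg, fs=360, threshold_ratio=0.5):
    """Return sample indices of detected beats in an ECG trace.

    Differentiates and squares the signal to emphasise steep QRS
    slopes, applies a ~150 ms moving-window integration, then picks
    peaks above a fixed fraction of the maximum, enforcing a 200 ms
    refractory period between beats.
    """
    squared = [(ecg[i + 1] - ecg[i]) ** 2 for i in range(len(ecg) - 1)]
    win = max(1, int(0.15 * fs))
    integrated = [sum(squared[max(0, i - win + 1):i + 1]) / win
                  for i in range(len(squared))]
    thr = threshold_ratio * max(integrated)
    refractory = int(0.2 * fs)
    beats, last = [], -refractory
    for i, v in enumerate(integrated):
        if v >= thr and i - last >= refractory:
            beats.append(i)
            last = i
    return beats

# Toy trace at fs=10 Hz with two sharp spikes.
beats = detect_qrs([0, 0, 0, 0, 0, 5, 0, 0, 0, 0, 0, 5, 0, 0, 0, 0, 0], fs=10)
print(beats)  # → [4, 10]
```

    Beat-to-beat intervals derived from such indices are what the arrhythmia rules then inspect for abnormal rhythms.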

  19. Image correlation method for DNA sequence alignment.

    PubMed

    Curilem Saldías, Millaray; Villarroel Sassarini, Felipe; Muñoz Poblete, Carlos; Vargas Vásquez, Asticio; Maureira Butler, Iván

    2012-01-01

    The complexity of searches and the volume of genomic data make sequence alignment one of the most active research areas in bioinformatics. New alignment approaches have incorporated digital signal processing techniques. Among these, correlation methods are highly sensitive. This paper proposes a novel sequence alignment method based on 2-dimensional images, where each nucleic acid base is represented as a fixed gray intensity pixel. Query and known database sequences are coded to their pixel representation and sequence alignment is handled as an object-recognition-in-a-scene problem. Query and database become object and scene, respectively. An image correlation process is carried out in order to search for the best match between them. Given that this procedure can be implemented in an optical correlator, the correlation could eventually be accomplished at light speed. This paper shows an initial research stage where results were "digitally" obtained by simulating an optical correlation of DNA sequences represented as images. A total of 303 queries (variable lengths from 50 to 4500 base pairs) and 100 scenes represented by 100 x 100 images each (in total, one million base pair database) were considered for the image correlation analysis. The results showed that correlations reached very high sensitivity (99.01%), specificity (98.99%) and outperformed BLAST when mutation numbers increased. However, digital correlation processes were a hundred times slower than BLAST. We are currently starting an initiative to evaluate the correlation speed of a real experimental optical correlator. By doing this, we expect to fully exploit optical correlation light properties. As the optical correlator works jointly with the computer, digital algorithms should also be optimized. The results presented in this paper are encouraging and support the study of image correlation methods on sequence alignment.
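
    The encoding-and-correlation idea can be sketched in one dimension, with arbitrary gray levels per base; the paper uses 2-D images and (simulated) optical correlation, so this is only a conceptual stand-in:

```python
GRAY = {"A": 0.2, "C": 0.4, "G": 0.6, "T": 0.8}  # assumed gray levels

def encode(seq):
    """Represent a DNA sequence as a list of gray intensities."""
    return [GRAY[b] for b in seq]

def best_alignment(query, database):
    """Slide the encoded query along the encoded database sequence and
    return (offset, score) of the best match, where score is the
    fraction of positions with identical gray values.
    """
    q, db = encode(query), encode(database)
    best = (0, -1.0)
    for off in range(len(db) - len(q) + 1):
        score = sum(1 for i in range(len(q)) if q[i] == db[off + i]) / len(q)
        if score > best[1]:
            best = (off, score)
    return best

print(best_alignment("GATT", "ACGATTACA"))  # → (2, 1.0)
```

    In the paper's formulation this sliding comparison becomes a 2-D image correlation, which an optical correlator can evaluate in parallel.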

  20. A Smarter Pathway for Delivering Cue Exposure Therapy? The Design and Development of a Smartphone App Targeting Alcohol Use Disorder

    PubMed Central

    Stenager, Elsebeth; Nielsen, Bent; Nielsen, Anette Søgaard; Yu, Fei

    2017-01-01

    Background Although the number of alcohol-related treatments in app stores is proliferating, none of them are based on a psychological framework and supported by empirical evidence. Cue exposure treatment (CET) with urge-specific coping skills (USCS) is often used in Danish treatment settings. It is an evidence-based psychological approach that focuses on promoting “confrontation with alcohol cues” as a means of reducing urges and the likelihood of relapse. Objective The objective of this study was to describe the design and development of a CET-based smartphone app; an innovative delivery pathway for treating alcohol use disorder (AUD). Methods The treatment is based on Monti and coworkers’ manual for CET with USCS (2002). It was created by a multidisciplinary team of psychiatrists, psychologists, programmers, and graphic designers as well as patients with AUD. A database was developed for the purpose of registering and monitoring training activities. A final version of the CET app and database was developed after several user tests. Results The final version of the CET app includes an introduction, 4 sessions featuring USCS, 8 alcohol exposure videos promoting the use of one of the USCS, and a results component providing an overview of training activities and potential progress. Real-time urges are measured before, during, and after exposure to alcohol cues and are registered in the app together with other training activity variables. Data packages are continuously sent in encrypted form to an external database and will be merged with other data (in an internal database) in the future. Conclusions The CET smartphone app is currently being tested in a large-scale randomized controlled trial with the aim of clarifying whether it can be classified as an evidence-based treatment solution. The app has the potential to augment the reach of psychological treatment for AUD. PMID:28137701

  1. Using decision-tree classifier systems to extract knowledge from databases

    NASA Technical Reports Server (NTRS)

    St.clair, D. C.; Sabharwal, C. L.; Hacke, Keith; Bond, W. E.

    1990-01-01

    One difficulty in applying artificial intelligence techniques to the solution of real world problems is that the development and maintenance of many AI systems, such as those used in diagnostics, require large amounts of human resources. At the same time, databases frequently exist which contain information about the process(es) of interest. Recently, efforts to reduce development and maintenance costs of AI systems have focused on using machine learning techniques to extract knowledge from existing databases. Research is described in the area of knowledge extraction using a class of machine learning techniques called decision-tree classifier systems. Results of this research suggest ways of performing knowledge extraction which may be applied in numerous situations. In addition, a measurement called the concept strength metric (CSM) is described which can be used to determine how well the resulting decision tree can differentiate between the concepts it has learned. The CSM can be used to determine whether or not additional knowledge needs to be extracted from the database. An experiment involving real world data is presented to illustrate the concepts described.

  2. Virtual time and time warp on the JPL hypercube. [operating system implementation for distributed simulation

    NASA Technical Reports Server (NTRS)

    Jefferson, David; Beckman, Brian

    1986-01-01

    This paper describes the concept of virtual time and its implementation in the Time Warp Operating System at the Jet Propulsion Laboratory. Virtual time is a distributed synchronization paradigm that is appropriate for distributed simulation, database concurrency control, real time systems, and coordination of replicated processes. The Time Warp Operating System is targeted toward the distributed simulation application and runs on a 32-node JPL Mark II Hypercube.
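
    A minimal sketch of the optimistic rollback that virtual time enables, assuming a single process with state snapshots; antimessages, re-execution of rolled-back events, and global virtual time from the real Time Warp Operating System are all omitted:

```python
class TimeWarpProcess:
    """Toy optimistic process: it executes events eagerly, saves a
    state snapshot after each, and rolls back to an earlier snapshot
    when a 'straggler' event arrives with a timestamp in its past.
    """
    def __init__(self, state=0):
        self.state = state
        self.lvt = 0                   # local virtual time
        self.history = [(0, state)]    # (timestamp, state snapshot)

    def handle(self, timestamp, delta):
        if timestamp < self.lvt:       # straggler: roll back
            while self.history[-1][0] > timestamp:
                self.history.pop()
            self.lvt, self.state = self.history[-1]
        self.state += delta
        self.lvt = timestamp
        self.history.append((timestamp, self.state))
        return self.state

p = TimeWarpProcess()
print(p.handle(10, 1))   # → 1
print(p.handle(20, 2))   # → 3
print(p.handle(15, 5))   # → 6  (rolled back past t=20, then applied +5)
```

    The snapshot list plays the role of Time Warp's state queue; in the full system, rolled-back events would also be re-executed and their side effects cancelled with antimessages.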

  3. Key technology research of HILS based on real-time operating system

    NASA Astrophysics Data System (ADS)

    Wang, Fankai; Lu, Huiming; Liu, Che

    2018-03-01

    To address the long development cycle of traditional simulation and the lack of real-time characteristics in purely digital simulation, this paper designs a HILS (Hardware In the Loop Simulation) system based on the real-time operating platform xPC. The system handles communication between the HMI and Simulink models through the MATLAB engine interface, and realizes system setting, offline simulation, model compiling and downloading, and related functions. Using the xPC application interface and integrating the TeeChart ActiveX chart component realizes the monitoring function for the real-time target application. Each functional block in the system is encapsulated in the form of a DLL, and data interaction between modules is realized by MySQL database technology. When the HILS system runs, it searches for the address of the online xPC target by means of the Ping command to establish TCP/IP communication between the two machines. The technical effectiveness of the developed system is verified through a typical power station control system.

  4. Real-time depth camera tracking with geometrically stable weight algorithm

    NASA Astrophysics Data System (ADS)

    Fu, Xingyin; Zhu, Feng; Qi, Feng; Wang, Mingming

    2017-03-01

    We present an approach for real-time camera tracking with depth stream. Existing methods are prone to drift in sceneries without sufficient geometric information. First, we propose a new weight method for an iterative closest point algorithm commonly used in real-time dense mapping and tracking systems. By detecting uncertainty in pose and increasing weight of points that constrain unstable transformations, our system achieves accurate and robust trajectory estimation results. Our pipeline can be fully parallelized with GPU and incorporated into the current real-time depth camera tracking system seamlessly. Second, we compare the state-of-the-art weight algorithms and propose a weight degradation algorithm according to the measurement characteristics of a consumer depth camera. Third, we use Nvidia Kepler Shuffle instructions during warp and block reduction to improve the efficiency of our system. Results on the public TUM RGB-D database benchmark demonstrate that our camera tracking system achieves state-of-the-art results both in accuracy and efficiency.

  5. Real-time detection of small and dim moving objects in IR video sequences using a robust background estimator and a noise-adaptive double thresholding

    NASA Astrophysics Data System (ADS)

    Zingoni, Andrea; Diani, Marco; Corsini, Giovanni

    2016-10-01

    We developed an algorithm for automatically detecting small and poorly contrasted (dim) moving objects in real-time, within video sequences acquired through a steady infrared camera. The algorithm is suitable for different situations since it is independent of the background characteristics and of changes in illumination. Unlike other solutions, small objects of any size (up to single-pixel), either hotter or colder than the background, can be successfully detected. The algorithm is based on accurately estimating the background at the pixel level and then rejecting it. A novel approach permits background estimation to be robust to changes in the scene illumination and to noise, and not to be biased by the transit of moving objects. Care was taken in avoiding computationally costly procedures, in order to ensure the real-time performance even using low-cost hardware. The algorithm was tested on a dataset of 12 video sequences acquired in different conditions, providing promising results in terms of detection rate and false alarm rate, independently of background and objects characteristics. In addition, the detection map was produced frame by frame in real-time, using cheap commercial hardware. The algorithm is particularly suitable for applications in the fields of video-surveillance and computer vision. Its reliability and speed permit it to be used also in critical situations, like in search and rescue, defence and disaster monitoring.
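
    The background-rejection and double-thresholding stages can be sketched in one dimension as follows; the exponential background model, the threshold factors, and the hysteresis-style growing of detections are assumptions about one plausible realization, not the authors' exact algorithm:

```python
def update_background(bg, frame, alpha=0.05):
    """Exponential running-average background estimate, per pixel."""
    return [(1 - alpha) * b + alpha * f for b, f in zip(bg, frame)]

def double_threshold(residual, sigma, k_high=5.0, k_low=2.5):
    """Noise-adaptive double thresholding on a background-subtracted
    residual (1-D sketch): pixels above k_high*sigma seed detections,
    and neighbouring pixels above k_low*sigma are attached to them.
    """
    high = [abs(r) >= k_high * sigma for r in residual]
    low = [abs(r) >= k_low * sigma for r in residual]
    out = list(high)
    # Grow the seeds left and right through the low-threshold mask.
    for i in range(1, len(out)):
        if low[i] and out[i - 1]:
            out[i] = True
    for i in range(len(out) - 2, -1, -1):
        if low[i] and out[i + 1]:
            out[i] = True
    return out

residual = [0.1, 0.3, 1.4, 3.0, 1.3, 0.2]
print(double_threshold(residual, sigma=0.5))
```

    Tying both thresholds to the noise estimate `sigma` is what keeps the false-alarm rate stable as scene conditions change, while the low threshold recovers the dim fringes of objects whose cores exceed the high threshold.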

  6. The 'Real Welfare' scheme: benchmarking welfare outcomes for commercially farmed pigs.

    PubMed

    Pandolfi, F; Stoddart, K; Wainwright, N; Kyriazakis, I; Edwards, S A

    2017-10-01

    Animal welfare standards have been incorporated in EU legislation and in farm assurance schemes, based on scientific information and aiming to safeguard the welfare of the species concerned. Recently, emphasis has shifted from resource-based measures of welfare to animal-based measures, which are considered to assess more accurately the welfare status. The data used in this analysis were collected from April 2013 to May 2016 through the 'Real Welfare' scheme in order to assess on-farm pig welfare, as required for those finishing pigs under the UK Red Tractor Assurance scheme. The assessment involved five main measures (percentage of pigs requiring hospitalization, percentage of lame pigs, percentage of pigs with severe tail lesions, percentage of pigs with severe body marks and enrichment use ratio) and optional secondary measures (percentage of pigs with mild tail lesions, percentage of pigs with dirty tails, percentage of pigs with mild body marks, percentage of pigs with dirty bodies), with associated information about the environment and the enrichment in the farms. For the complete database, a sample of pens was assessed from 1928 farm units. Repeated measures were taken in the same farm unit over time, giving 112 240 records at pen level. These concerned a total of 13 480 289 pigs present on the farm during the assessments, with 5 463 348 pigs directly assessed using the 'Real Welfare' protocol. The three most common enrichment types were straw, chain and plastic objects. The main substrate was straw which was present in 67.9% of the farms. Compared with 2013, a significant increase of pens with undocked-tail pigs, substrates and objects was observed over time (P0.3). The results from the first 3 years of the scheme demonstrate a reduction of the prevalence of animal-based measures of welfare problems and highlight the value of this initiative.

  7. Kinect Posture Reconstruction Based on a Local Mixture of Gaussian Process Models.

    PubMed

    Liu, Zhiguang; Zhou, Liuyang; Leung, Howard; Shum, Hubert P H

    2016-11-01

    Depth sensor based 3D human motion estimation hardware such as Kinect has made interactive applications more popular recently. However, it is still challenging to accurately recognize postures from a single depth camera due to the inherently noisy data derived from depth images and self-occluding action performed by the user. In this paper, we propose a new real-time probabilistic framework to enhance the accuracy of live captured postures that belong to one of the action classes in the database. We adopt the Gaussian Process model as a prior to leverage the position data obtained from Kinect and marker-based motion capture system. We also incorporate a temporal consistency term into the optimization framework to constrain the velocity variations between successive frames. To ensure that the reconstructed posture resembles the accurate parts of the observed posture, we embed a set of joint reliability measurements into the optimization framework. A major drawback of Gaussian Process is its cubic learning complexity when dealing with a large database due to the inverse of a covariance matrix. To solve the problem, we propose a new method based on a local mixture of Gaussian Processes, in which Gaussian Processes are defined in local regions of the state space. Due to the significantly decreased sample size in each local Gaussian Process, the learning time is greatly reduced. At the same time, the prediction speed is enhanced as the weighted mean prediction for a given sample is determined by the nearby local models only. Our system also allows incrementally updating a specific local Gaussian Process in real time, which enhances the likelihood of adapting to run-time postures that are different from those in the database. Experimental results demonstrate that our system can generate high quality postures even under severe self-occlusion situations, which is beneficial for real-time applications such as motion-based gaming and sport training.
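
    The local-models idea can be sketched with a tiny kernel-regression stand-in for each local Gaussian Process; the contiguous 1-D partitioning and nearest-centre model selection below are illustrative simplifications of the paper's state-space regions:

```python
from math import exp

def fit_local_models(X, y, n_regions=2):
    """Split 1-D training data into contiguous local regions, each a
    small kernel-regression 'model' (a stand-in for a local GP), so
    training cost scales with region size, not the whole database.
    """
    order = sorted(range(len(X)), key=lambda i: X[i])
    size = (len(order) + n_regions - 1) // n_regions
    models = []
    for s in range(0, len(order), size):
        idx = order[s:s + size]
        models.append(([X[i] for i in idx], [y[i] for i in idx]))
    return models

def predict(models, x, bandwidth=1.0):
    """Predict using only the local model whose centre is nearest x."""
    centres = [sum(mx) / len(mx) for mx, _ in models]
    mx, my = models[min(range(len(models)),
                        key=lambda j: abs(centres[j] - x))]
    w = [exp(-((x - xi) / bandwidth) ** 2) for xi in mx]
    return sum(wi * yi for wi, yi in zip(w, my)) / sum(w)

X = [0.0, 1.0, 2.0, 10.0, 11.0, 12.0]
y = [0.0, 0.0, 0.0, 5.0, 5.0, 5.0]
models = fit_local_models(X, y)
print(predict(models, 11.0))  # ≈ 5.0
```

    The payoff mirrors the paper's: each local model inverts only its own small covariance-like structure, and a new sample can update one region without retraining the rest.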

  8. Accuracy of LightCycler® SeptiFast for the detection and identification of pathogens in the blood of patients with suspected sepsis: a systematic review protocol

    PubMed Central

    Wilson, Claire; Blackwood, Bronagh; McAuley, Danny F; Perkins, Gavin D; McMullan, Ronan; Gates, Simon; Warhurst, Geoffrey

    2012-01-01

    Background There is growing interest in the potential utility of molecular diagnostics in improving the detection of life-threatening infection (sepsis). LightCycler® SeptiFast is a multipathogen probe-based real-time PCR system targeting DNA sequences of bacteria and fungi present in blood samples within a few hours. We report here the protocol of the first systematic review of published clinical diagnostic accuracy studies of this technology when compared with blood culture in the setting of suspected sepsis. Methods/design Data sources: the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects (DARE), the Health Technology Assessment Database (HTA), the NHS Economic Evaluation Database (NHSEED), The Cochrane Library, MEDLINE, EMBASE, ISI Web of Science, BIOSIS Previews, MEDION and the Aggressive Research Intelligence Facility Database (ARIF). Study selection: diagnostic accuracy studies that compare the real-time PCR technology with standard culture results performed on a patient's blood sample during the management of sepsis. Data extraction: three reviewers, working independently, will determine the level of evidence, methodological quality and a standard data set relating to demographics and diagnostic accuracy metrics for each study. Statistical analysis/data synthesis: heterogeneity of studies will be investigated using a coupled forest plot of sensitivity and specificity and a scatter plot in Receiver Operator Characteristic (ROC) space. Bivariate model method will be used to estimate summary sensitivity and specificity. The authors will investigate reporting biases using funnel plots based on effective sample size and regression tests of asymmetry. Subgroup analyses are planned for adults, children and infection setting (hospital vs community) if sufficient data are uncovered. 
Dissemination: recommendations will be made to the Department of Health (as part of an open-access HTA report) as to whether the real-time PCR technology has sufficient clinical diagnostic accuracy potential to move forward to efficacy testing during the provision of routine clinical care. Registration: PROSPERO, NIHR Prospective Register of Systematic Reviews (CRD42011001289). PMID:22240646

  9. Review and assessment of the database and numerical modeling for turbine heat transfer

    NASA Technical Reports Server (NTRS)

    Gladden, H. J.; Simoneau, R. J.

    1989-01-01

    The objectives of the NASA Hot Section Technology (HOST) Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high-temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding while the analytical efforts were approximately 25 percent. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.

  10. Fusion of Building Information and Range Imaging for Autonomous Location Estimation in Indoor Environments

    PubMed Central

    Kohoutek, Tobias K.; Mautz, Rainer; Wegner, Jan D.

    2013-01-01

    We present a novel approach for autonomous location estimation and navigation in indoor environments using range images and prior scene knowledge from a GIS database (CityGML). What makes this task challenging is the arbitrary relative spatial relation between the GIS and the Time-of-Flight (ToF) range camera, further complicated by a markerless configuration. We propose to estimate the camera's pose solely based on matching of GIS objects with their detected locations in image sequences. We develop a coarse-to-fine matching strategy that is able to match point clouds without any initial parameters. Experiments with a state-of-the-art ToF point cloud show that our proposed method delivers an absolute camera position with decimeter accuracy, which is sufficient for many real-world applications (e.g., collision avoidance). PMID:23435055

  11. An Imaging Sensor-Aided Vision Navigation Approach that Uses a Geo-Referenced Image Database.

    PubMed

    Li, Yan; Hu, Qingwu; Wu, Meng; Gao, Yang

    2016-01-28

    Vision navigation determines position and attitude via real-time image processing of data collected from imaging sensors, without requiring a high-performance global positioning system (GPS) or an inertial measurement unit (IMU). It is widely used in indoor navigation, deep-space navigation, and multiple-sensor-integrated mobile mapping. This paper proposes a novel imaging-sensor-aided vision navigation approach that uses a high-accuracy geo-referenced image database (GRID) for high-precision navigation of multiple-sensor platforms in environments with poor GPS coverage. First, the framework of GRID-aided vision navigation is developed with sequence images from land-based mobile mapping systems that integrate multiple sensors. Second, a highly efficient GRID storage management model is established based on the linear index of a road segment for fast image search and retrieval. Third, a robust image matching algorithm is presented to search and match a real-time image with the GRID. The image matched with the real-time scene is then used to calculate the 3D navigation parameters of the multiple-sensor platforms. Experimental results show that the proposed approach retrieves images efficiently and achieves navigation accuracies of 1.2 m in the horizontal plane and 1.8 m in height under GPS loss within 5 min and 1500 m.
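
    The "linear index of a road segment" storage model can be pictured as keying each geo-referenced image by a segment identifier and a chainage (distance along the segment), so that retrieving match candidates for the current position is a fast range lookup. A toy sketch with invented names; the paper's actual schema is not given here:

```python
import bisect

class GridIndex:
    """Toy linear index: images keyed by chainage along a road segment."""
    def __init__(self):
        self.segments = {}  # segment_id -> sorted list of (chainage_m, image_id)

    def add(self, segment_id, chainage_m, image_id):
        images = self.segments.setdefault(segment_id, [])
        bisect.insort(images, (chainage_m, image_id))

    def candidates(self, segment_id, chainage_m, window_m=50.0):
        """Return image ids within +/- window_m of the query position."""
        images = self.segments.get(segment_id, [])
        lo = bisect.bisect_left(images, (chainage_m - window_m, ""))
        hi = bisect.bisect_right(images, (chainage_m + window_m, "\uffff"))
        return [img for _, img in images[lo:hi]]

idx = GridIndex()
for d in range(0, 1000, 100):        # images every 100 m along segment "A1"
    idx.add("A1", float(d), f"img_{d:04d}")
print(idx.candidates("A1", 420.0))   # ['img_0400']
```

    The point of the linear index is that only the handful of images near the query chainage are passed to the (much more expensive) robust image matcher.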

  12. An Imaging Sensor-Aided Vision Navigation Approach that Uses a Geo-Referenced Image Database

    PubMed Central

    Li, Yan; Hu, Qingwu; Wu, Meng; Gao, Yang

    2016-01-01

    In determining position and attitude, vision navigation via real-time image processing of data collected from imaging sensors is advanced without a high-performance global positioning system (GPS) and an inertial measurement unit (IMU). Vision navigation is widely used in indoor navigation, far space navigation, and multiple sensor-integrated mobile mapping. This paper proposes a novel vision navigation approach aided by imaging sensors and that uses a high-accuracy geo-referenced image database (GRID) for high-precision navigation of multiple sensor platforms in environments with poor GPS. First, the framework of GRID-aided vision navigation is developed with sequence images from land-based mobile mapping systems that integrate multiple sensors. Second, a highly efficient GRID storage management model is established based on the linear index of a road segment for fast image searches and retrieval. Third, a robust image matching algorithm is presented to search and match a real-time image with the GRID. Subsequently, the image matched with the real-time scene is considered to calculate the 3D navigation parameter of multiple sensor platforms. Experimental results show that the proposed approach retrieves images efficiently and has navigation accuracies of 1.2 m in a plane and 1.8 m in height under GPS loss in 5 min and within 1500 m. PMID:26828496

  13. REVIEW OF THE RADNET AIR MONITORING NETWORK ...

    EPA Pesticide Factsheets

    RadNet, formerly known as ERAMS, has been operating since the 1970s, monitoring environmental radiation across the country, supporting responses to radiological emergencies, and providing important information on background levels of radiation in the environment. The original purpose of the system was to monitor fallout from weapons testing. Even though upgrades to and reconfiguration of the system have been planned for some time, the events of 9/11/01 gave impetus to a thorough upgrade of RadNet, primarily directed at providing more timely data and covering a larger portion of the nation's population. Moreover, the demands upon RadNet are now based upon homeland security support in addition to existing EPA monitoring responsibilities. Beginning in FY05 and continuing into FY13, up to 135 near real-time air monitors will be put into operation across the country to provide decision-making data to EPA officials. Data will be transmitted from the monitors in all 50 states to a central database at the National Air and Radiation Environmental Laboratory (NAREL) in Montgomery, Alabama. The data will then be assessed and verified and made available to federal and state officials and, eventually, the public. A data flow model is being constructed to provide the most effective and efficient use of verified data obtained from the new RadNet system. The objective of the near-real-time air monitoring component of RadNet is to provide verified decision-making data to fed

  14. Observer Interface Analysis for Standardization to a Cloud Based Real-Time Space Situational Awareness (SSA)

    NASA Astrophysics Data System (ADS)

    Eilers, J.

    2013-09-01

    The interface analysis for an observer of space objects shows that a standard is necessary. This standardized dataset serves as input for a cloud-based service aimed at a near real-time Space Situational Awareness (SSA) system. The system offers all the advantages of a cloud-based solution, such as redundancy, scalability, and an easy way to distribute information. For the standard based on the interface analysis of the observer, the information can be separated into three parts. One part is the information about the observer, e.g. a ground station. The next part is the information about the sensors used by the observer. The last part is the data from the detected object. The backbone of the SSA system is the cloud-based service, which includes the consistency check for the observed objects, a database for the objects, the algorithms and analysis, as well as the visualization of the results. This paper also provides an approximation of the needed computational power and data storage, and a financial approach to delivering this service to a broad community. In this context, "cloud" means that neither the user nor the observer has to think about the infrastructure of the calculation environment. The decision whether the IT infrastructure should be built by a conglomerate of different nations or rented on the market should be based on an efficiency analysis. Combinations are also possible, such as starting on a rented cloud and then moving to a private cloud owned by the government. One of the advantages of a cloud solution is scalability. There are about 3,000 satellites in space, 900 of them active, and in total about 17,000 detected space objects orbiting Earth. For the computation, however, the problem is not N(active) against N(all) but N(active) against the N(apo/peri) subset of N(all) whose apogee/perigee ranges overlap. Instead of 15.3 million possible collisions, only approximately 2.3 million possible conjunctions must be computed. In general, this Space Situational Awareness system can be used as a tool by satellite system owners for collision avoidance.
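
    The scalability argument rests on simple pair counting: screening ~900 active satellites against ~17,000 catalogued objects naively gives about 15.3 million pairs, and the apogee/perigee pre-filter is what cuts this to the quoted ~2.3 million. The sketch below reproduces the naive count and the gating test; the filtered total depends on the actual catalogue, so it is not recomputed here:

```python
N_ACTIVE = 900       # actively controlled satellites
N_ALL = 17_000       # all catalogued space objects

# Naive screening: every active satellite against every catalogued object.
naive_pairs = N_ACTIVE * N_ALL
print(naive_pairs)   # 15300000 -- the "15.3 million" in the text

# Apogee/perigee gating: two objects can only conjunct if their altitude
# bands overlap. Toy orbits given as (perigee_km, apogee_km):
def bands_overlap(a, b):
    return a[0] <= b[1] and b[0] <= a[1]

print(bands_overlap((500, 600), (550, 900)))  # True: shells overlap
print(bands_overlap((500, 600), (700, 900)))  # False: disjoint shells
```

    Because the gate is a constant-time interval test, it can be applied to all 15.3 million pairs cheaply before any expensive propagation is run on the surviving ~2.3 million.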

  15. Effective real-time vehicle tracking using discriminative sparse coding on local patches

    NASA Astrophysics Data System (ADS)

    Chen, XiangJun; Ye, Feiyue; Ruan, Yaduan; Chen, Qimei

    2016-01-01

    A visual tracking framework that provides an object detector and tracker, focusing on effective and efficient visual tracking in surveillance for real-world intelligent transport system applications, is proposed. The framework casts the tracking task as problems of object detection, feature representation, and classification, which is different from appearance model-matching approaches. Through a feature representation of discriminative sparse coding on local patches, called DSCLP, which trains a dictionary on local clustered patches sampled from both positive and negative datasets, the discriminative power and robustness have been improved remarkably, which makes our method more robust to complex realistic settings with all kinds of degraded image quality. Moreover, by catching objects through one-time background subtraction, along with offline dictionary training, computation time is dramatically reduced, which enables our framework to achieve real-time tracking performance even in a high-definition sequence with heavy traffic. Experiment results show that our work outperforms some state-of-the-art methods in terms of speed, accuracy, and robustness, and exhibits increased robustness in complex real-world scenarios with degraded image quality caused by vehicle occlusion, image blur from rain or fog, and change in viewpoint or scale.

  16. The NMDB collaboration

    NASA Astrophysics Data System (ADS)

    Steigies, C. T.

    2015-12-01

    Since the International Geophysical Year (IGY) in 1957-58, cosmic rays have been routinely measured by many ground-based Neutron Monitors (NM) around the world. The World Data Center for Cosmic Rays (WDCCR) was established as a part of this activity and provides a database of cosmic-ray neutron observations in unified formats. However, that standard data comprises only one-hour averages, whereas most NM stations were enhanced at the end of the 20th century to provide data in one-minute resolution or even better. This data was only available on the web sites of the institutes operating the stations, and every station invented its own data format for the high-resolution measurements. There were some efforts to collect data from several stations and to make this data available on FTP servers; however, none of these efforts could provide real-time data for all stations. The EU FP7 project NMDB (real-time database for high-resolution Neutron Monitor measurements, http://nmdb.eu) was funded by the European Commission, and a new database was set up by several Neutron Monitor stations in Europe and Asia to store high-resolution data and to provide access to the data in real time (i.e. with less than five minutes delay). By storing the measurements in a database, a standard format for the high-resolution measurements is enforced. This database is complementary to the WDCCR, as it does not (yet) provide all historical data, but the creation of this effort has spurred a new collaboration between Neutron Monitor scientists worldwide: (new) stations have gone online (again), new projects are building on the results of NMDB, and new users outside of the cosmic-ray community are starting to use NM data for new applications such as soil moisture measurements using cosmic rays. These applications are facilitated by the easy access to the data through the http://nest.nmdb.eu interface, which offers access to all NMDB data for all users.
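
    The format standardization described here is a by-product of the database schema: one row per station per minute, with fixed, typed columns. A minimal sketch using an in-memory SQLite stand-in; the real NMDB runs its own MySQL schema, so the table and column names below are illustrative only:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE nm_1min (
        station   TEXT NOT NULL,      -- station short code, e.g. 'OULU'
        ts_utc    TEXT NOT NULL,      -- start of the 1-minute interval (UTC)
        uncorr    REAL,               -- uncorrected count rate
        corr_p    REAL,               -- pressure-corrected count rate
        pressure  REAL,               -- station barometric pressure (mbar)
        PRIMARY KEY (station, ts_utc) -- one row per station per minute
    )""")
con.execute(
    "INSERT INTO nm_1min VALUES ('OULU', '2015-12-01T00:00:00', 105.2, 106.8, 1013.2)")
row = con.execute(
    "SELECT corr_p FROM nm_1min WHERE station='OULU' AND ts_utc='2015-12-01T00:00:00'"
).fetchone()
print(row[0])  # 106.8
```

    Any station wishing to appear in such a table must deliver its counts in the schema's units and cadence, which is exactly how heterogeneous per-station formats were retired.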

  17. Target tracking and surveillance by fusing stereo and RFID information

    NASA Astrophysics Data System (ADS)

    Raza, Rana H.; Stockman, George C.

    2012-06-01

    Ensuring security in high risk areas such as an airport is an important but complex problem. Effectively tracking personnel, containers, and machines is a crucial task. Moreover, security and safety require understanding the interaction of persons and objects. Computer vision (CV) has been a classic tool; however, variable lighting, imaging, and random occlusions present difficulties for real-time surveillance, resulting in erroneous object detection and trajectories. Determining object ID via CV at any instance of time in a crowded area is computationally prohibitive, yet the trajectories of personnel and objects should be known in real time. Radio Frequency Identification (RFID) can be used to reliably identify target objects and can even locate targets at coarse spatial resolution, while CV provides fuzzy features for target ID at finer resolution. Our research demonstrates benefits obtained when most objects are "cooperative" by being RFID tagged. Fusion provides a method to simplify the correspondence problem in 3D space. A surveillance system can query for unique object ID as well as tag ID information, such as target height, texture, shape and color, which can greatly enhance scene analysis. We extend geometry-based tracking so that intermittent information on ID and location can be used in determining a set of trajectories of N targets over T time steps. We show that partial target information obtained through RFID can reduce computation time (by 99.9% in some cases) and also increase the likelihood of producing correct trajectories. We conclude that real-time decision-making should be possible if the surveillance system can integrate information effectively between the sensor level and activity understanding level.
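
    The claimed computational saving comes from collapsing the frame-to-frame correspondence search: with N unlabeled targets there are up to N! candidate target-track assignments per step, and every RFID-identified target drops out of that search. Illustrative arithmetic only, not the paper's geometry-based tracker:

```python
from math import factorial

def assignments(n_targets, n_identified):
    """Candidate target-track assignments in one step when n_identified
    targets carry unique RFID tags (their correspondence is already fixed)."""
    return factorial(n_targets - n_identified)

N = 10
print(assignments(N, 0))   # 3628800 assignments with vision alone
print(assignments(N, 8))   # 2 once 8 of the 10 targets are RFID-tagged
# Reduction: 1 - 2/3628800, i.e. well beyond the "99.9% in some cases"
# computation-time savings reported above.
```

    The same collapse explains the accuracy claim: with fewer feasible assignments, an occlusion or detection error is far less likely to be absorbed into a wrong but consistent set of trajectories.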

  18. The First Ground-Level Enhancement of Solar Cycle 24 on 17 May 2012 and Its Real-Time Detection

    NASA Astrophysics Data System (ADS)

    Papaioannou, A.; Souvatzoglou, G.; Paschalis, P.; Gerontidou, M.; Mavromichalaki, H.

    2014-01-01

    Ground-level enhancements (GLEs) are defined as sudden increases in the recorded intensity of cosmic-ray particles, usually by neutron monitors (NMs). In this work we present a time-shifting analysis (TSA) for the first arriving particles that were detected at Earth by NMs. We also present an automated real-time GLE alert that has been developed and is operating via the Neutron Monitor Database (NMDB), which successfully identified the 17 May 2012 event, designated as GLE71. We discuss the time evolution of the real-time GLE alert that was issued for GLE71 and present the event onset times for the NMs that contributed to this alert, based on their archived data. A comparison with their real-time time-stamps was made to illustrate the necessity for high-resolution data (e.g. 1-min time resolution) made available every minute. The first results on the propagation of relativistic protons recorded by NMs, as inferred by the TSA, imply that they are most probably accelerated by the coronal-mass-ejection-driven shock. Furthermore, the successful usage of NM data and the corresponding achievement of issuing a timely GLE alert are discussed.
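
    The shape of such an automated alert can be sketched as a station-level increase test combined with a multi-station coincidence rule, so that a single noisy monitor cannot trigger a false alarm. The threshold values and the coincidence count below are illustrative assumptions, not the operational NMDB algorithm's parameters:

```python
def station_alert(rates, baseline, threshold=0.05):
    """A station signals if its latest 1-min count rate exceeds the running
    baseline by more than `threshold` (a fractional increase, assumed here)."""
    return rates[-1] > baseline * (1.0 + threshold)

def gle_alert(stations, min_stations=3):
    """Issue a GLE alert only when enough stations signal simultaneously.
    `stations` is a list of (recent_rates, baseline) pairs."""
    signalling = sum(1 for rates, base in stations if station_alert(rates, base))
    return signalling >= min_stations

quiet = ([100.0, 101.0, 100.0], 100.0)   # no increase
onset = ([100.0, 102.0, 112.0], 100.0)   # 12% jump in the latest minute
print(gle_alert([onset, onset, onset, quiet]))  # True: three stations agree
print(gle_alert([onset, quiet, quiet, quiet]))  # False: lone spike ignored
```

    This also makes concrete why the paper stresses 1-min data delivered every minute: the coincidence window only works if all stations report on the same near-real-time cadence.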

  19. Genevar: a database and Java application for the analysis and visualization of SNP-gene associations in eQTL studies

    PubMed Central

    Yang, Tsun-Po; Beazley, Claude; Montgomery, Stephen B.; Dimas, Antigone S.; Gutierrez-Arcelus, Maria; Stranger, Barbara E.; Deloukas, Panos; Dermitzakis, Emmanouil T.

    2010-01-01

    Summary: Genevar (GENe Expression VARiation) is a database and Java tool designed to integrate multiple datasets, and provides analysis and visualization of associations between sequence variation and gene expression. Genevar allows researchers to investigate expression quantitative trait loci (eQTL) associations within a gene locus of interest in real time. The database and application can be installed on a standard computer in database mode and, in addition, on a server to share discoveries among affiliations or the broader community over the Internet via web services protocols. Availability: http://www.sanger.ac.uk/resources/software/genevar Contact: emmanouil.dermitzakis@unige.ch PMID:20702402

  20. The Long Valley Caldera GIS database

    USGS Publications Warehouse

    Battaglia, Maurizio; Williams, M.J.; Venezky, D.Y.; Hill, D.P.; Langbein, J.O.; Farrar, C.D.; Howle, J.F.; Sneed, M.; Segall, P.

    2003-01-01

    This database provides an overview of the studies being conducted by the Long Valley Observatory in eastern California from 1975 to 2001. The database includes geologic, monitoring, and topographic datasets related to Long Valley caldera. The CD-ROM contains a scan of the original geologic map of the Long Valley region by R. Bailey. Real-time data of the current activity of the caldera (including earthquakes, ground deformation and the release of volcanic gas), information about volcanic hazards and the USGS response plan are available online at the Long Valley observatory web page (http://lvo.wr.usgs.gov). If you have any comments or questions about this database, please contact the Scientist in Charge of the Long Valley observatory.

  1. The application of holography as a real-time three-dimensional motion picture camera

    NASA Technical Reports Server (NTRS)

    Kurtz, R. L.

    1973-01-01

    A historical introduction to holography is presented, as well as a basic description of sideband holography for stationary objects. A brief theoretical development of both time-dependent and time-independent holography is also provided, along with an analytical and intuitive discussion of a unique holographic arrangement which allows the resolution of front surface detail from an object moving at high speeds. As an application of such a system, a real-time three-dimensional motion picture camera system is discussed and the results of a recent demonstration of the world's first true three-dimensional motion picture are given.

  2. High speed optical object recognition processor with massive holographic memory

    NASA Technical Reports Server (NTRS)

    Chao, T.; Zhou, H.; Reyes, G.

    2002-01-01

    Real-time object recognition using a compact grayscale optical correlator will be introduced. A holographic memory module for storing a large bank of optimum correlation filters, to accommodate the large data throughput rate needed for many real-world applications, has also been developed. System architecture of the optical processor and the holographic memory will be presented. Application examples of this object recognition technology will also be demonstrated.

  3. ANZA Seismic Network- From Monitoring to Science

    NASA Astrophysics Data System (ADS)

    Vernon, F.; Eakin, J.; Martynov, V.; Newman, R.; Offield, G.; Hindley, A.; Astiz, L.

    2007-05-01

    The ANZA Seismic Network (http://eqinfo.ucsd.edu) utilizes broadband and strong motion sensors with 24-bit dataloggers combined with real-time telemetry to monitor local and regional seismicity in southernmost California. The ANZA network provides real-time data to the IRIS DMC, California Integrated Seismic Network (CISN), other regional networks, and the Advanced National Seismic System (ANSS), in addition to providing near real-time information and monitoring to the greater San Diego community. Twelve high dynamic range broadband and strong motion sensors adjacent to the San Jacinto Fault zone contribute data for earthquake source studies and continue the monitoring of the seismic activity of the San Jacinto fault initiated 24 years ago. Five additional stations are located in the San Diego region, with one more station on San Clemente Island. The ANZA network uses the advanced wireless networking capabilities of the NSF High Performance Wireless Research and Education Network (http://hpwren.ucsd.edu) to provide the communication infrastructure for the real-time telemetry of ANZA seismic stations. The ANZA network uses the Antelope data acquisition software. The combination of high quality hardware, communications, and software allows for an annual network uptime in excess of 99.5%, with a median annual station real-time data return rate of 99.3%. Approximately 90,000 events, dominantly local sources but including regional and teleseismic events, comprise the ANZA network waveform database. All waveform data and event data are managed using the Datascope relational database. The ANZA network data has been used in a variety of scientific research, including detailed structure of the San Jacinto Fault Zone, earthquake source physics, spatial and temporal studies of aftershocks, array studies of teleseismic body waves, and array studies on the source of microseisms.
    To augment the location, detection, and high frequency observations of the seismic source spectrum from local earthquakes, the ANZA network is receiving real-time data from borehole arrays located at the UCSD Thornton Hospital, and from UCSB's Borrego Valley and Garner Valley Downhole Arrays. Finally, the ANZA network is acquiring data from seven PBO sites, each with 300-meter-deep MEMS accelerometers, passive seismometers, and a borehole strainmeter.

  4. Near real-time measurement of forces applied by an optical trap to a rigid cylindrical object

    NASA Astrophysics Data System (ADS)

    Glaser, Joseph; Hoeprich, David; Resnick, Andrew

    2014-07-01

    An automated data acquisition and processing system is established to measure the force applied by an optical trap to an object of unknown composition in real time. Optical traps have been in use for the past 40 years to manipulate microscopic particles, but the magnitude of applied force is often unknown and requires extensive instrument characterization. Measuring or calculating the force applied by an optical trap to nonspherical particles presents additional difficulties which are also overcome with our system. Extensive experiments and measurements using well-characterized objects were performed to verify the system performance.

  5. A Distributed Operating System for BMD Applications.

    DTIC Science & Technology

    1982-01-01

    Defense) applications executing on distributed hardware with local and shared memories. The objective was to develop real-time operating system functions...make the Basic Real-Time Operating System, and the set of new EPL language primitives that provide BMD application processes with efficient mechanisms

  6. REAL TIME CONTROL OF URBAN DRAINAGE NETWORKS

    EPA Science Inventory

    Real-time control (RTC) is a custom-designed, computer-assisted management technology for a specific sewerage network to meet the operational objectives of its collection/conveyance system. RTC can operate in several modes, including a mode that is activated during a wet weather ...

  7. Adaptive route choice modeling in uncertain traffic networks with real-time information.

    DOT National Transportation Integrated Search

    2013-03-01

    The objective of the research is to study travelers' route choice behavior in uncertain traffic networks : with real-time information. The research is motivated by two observations of the traffic system: 1) : the system is inherently uncertain with r...

  8. Top-down modulation of visual processing and knowledge after 250 ms supports object constancy of category decisions

    PubMed Central

    Schendan, Haline E.; Ganis, Giorgio

    2015-01-01

    People categorize objects more slowly when visual input is highly impoverished instead of optimal. While bottom-up models may explain a decision with optimal input, perceptual hypothesis testing (PHT) theories implicate top-down processes with impoverished input. Brain mechanisms and the time course of PHT are largely unknown. This event-related potential study used a neuroimaging paradigm that implicated prefrontal cortex in top-down modulation of occipitotemporal cortex. Subjects categorized more impoverished and less impoverished real and pseudo objects. PHT theories predict larger impoverishment effects for real than pseudo objects because top-down processes modulate knowledge only for real objects, but different PHT variants predict different timing. Consistent with parietal-prefrontal PHT variants, around 250 ms, the earliest impoverished real object interaction started on an N3 complex, which reflects interactive cortical activity for object cognition. N3 impoverishment effects localized to both prefrontal and occipitotemporal cortex for real objects only. The N3 also showed knowledge effects by 230 ms that localized to occipitotemporal cortex. Later effects reflected (a) word meaning in temporal cortex during the N400, (b) internal evaluation of prior decision and memory processes and secondary higher-order memory involving anterotemporal parts of a default mode network during posterior positivity (P600), and (c) response related activity in posterior cingulate during an anterior slow wave (SW) after 700 ms. Finally, response activity in supplementary motor area during a posterior SW after 900 ms showed impoverishment effects that correlated with RTs. Convergent evidence from studies of vision, memory, and mental imagery which reflects purely top-down inputs, indicates that the N3 reflects the critical top-down processes of PHT. A hybrid multiple-state interactive, PHT and decision theory best explains the visual constancy of object cognition. 
PMID:26441701

  9. Automatic helmet-wearing detection for law enforcement using CCTV cameras

    NASA Astrophysics Data System (ADS)

    Wonghabut, P.; Kumphong, J.; Satiennam, T.; Ung-arunyawee, R.; Leelapatra, W.

    2018-04-01

    The objective of this research is to develop an application for enforcing helmet wearing using CCTV cameras. The developed application aims to help law enforcement by police, eventually changing risk behaviours and consequently reducing the number of accidents and their severity. Conceptually, the application software, implemented in C++ using the OpenCV library, uses two CCTV cameras with different angles of view. Video frames recorded by the wide-angle CCTV camera are used to detect motorcyclists. If any motorcyclist without a helmet is found, the zoomed (narrow-angle) CCTV camera is activated to capture an image of the violating motorcyclist and the motorcycle license plate in real time. Captured images are managed by a database implemented in MySQL for ticket issuing. The results show that the developed program is able to detect 81% of motorcyclists across various motorcycle types during daytime and night-time. The validation results reveal that the program achieves 74% accuracy in detecting motorcyclists without helmets.
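
    The two-camera control flow described above (wide-angle detection triggers narrow-angle evidence capture, which feeds the ticketing database) can be sketched as follows. This is a hedged Python stub of the pipeline logic only: the detector is stubbed out, and all class and field names are invented (the actual system is C++/OpenCV with MySQL):

```python
from dataclasses import dataclass, field

@dataclass
class Violation:
    frame_id: int
    plate_image: str   # evidence captured by the narrow-angle camera

@dataclass
class EnforcementPipeline:
    """Wide-angle frames are scanned per frame; a no-helmet detection
    activates the zoomed camera and queues evidence for ticketing."""
    tickets: list = field(default_factory=list)

    def zoom_capture(self, frame_id):
        # Stand-in for activating the narrow-angle CCTV camera.
        return f"plate_{frame_id}.jpg"

    def process(self, frame_id, detections):
        # detections: list of (rider_id, wearing_helmet) from the wide camera.
        for rider, helmeted in detections:
            if not helmeted:
                self.tickets.append(Violation(frame_id, self.zoom_capture(frame_id)))

p = EnforcementPipeline()
p.process(1, [("r1", True), ("r2", False)])  # one violation in frame 1
p.process(2, [("r3", True)])                 # no violation in frame 2
print(len(p.tickets))  # 1
```

    Keeping the expensive zoomed capture behind the cheap wide-angle check is the design point: the narrow-angle camera is only spent on frames that already contain a suspected violation.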

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duchaineau, M.; Wolinsky, M.; Sigeti, D.E.

    Real-time terrain rendering for interactive visualization remains a demanding task. We present a novel algorithm with several advantages over previous methods: our method is unusually stingy with polygons yet achieves real-time performance and is scalable to arbitrary regions and resolutions. The method provides a continuous terrain mesh of specified triangle count having provably minimum error in restricted but reasonably general classes of permissible meshes and error metrics. Our method provides an elegant solution to guaranteeing certain elusive types of consistency in scenes produced by multiple scene generators which share a common finest-resolution database but which otherwise operate entirely independently. This consistency is achieved by exploiting the freedom of choice of error metric allowed by the algorithm to provide, for example, multiple exact lines-of-sight in real time. Our methods rely on an off-line pre-processing phase to construct a multi-scale data structure consisting of triangular terrain approximations enhanced ("thickened") with world-space error information. In real time, this error data is efficiently transformed into screen space, where it is used to guide a greedy top-down triangle subdivision algorithm which produces the desired minimal-error continuous terrain mesh. Our algorithm has been implemented and it operates at real-time rates.
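
    The greedy top-down subdivision can be pictured as a priority queue of triangles keyed by screen-space error: repeatedly split the worst triangle until the triangle budget is reached. A schematic sketch only; the halved child error stands in for re-projecting the precomputed world-space error bounds, and this is not the paper's actual split/merge-queue implementation:

```python
import heapq

def refine(root_errors, budget):
    """Greedy top-down refinement: split the triangle with the largest
    screen-space error until `budget` triangles are in the mesh.
    Splitting yields two children, each with half the parent's error.
    Returns the worst error remaining in the final mesh."""
    heap = [(-e, i) for i, e in enumerate(root_errors)]  # max-heap via negation
    heapq.heapify(heap)
    count = len(heap)
    next_id = count
    while count < budget:
        neg_err, _ = heapq.heappop(heap)     # worst triangle
        for _ in range(2):                   # replace it with two children
            heapq.heappush(heap, (neg_err / 2.0, next_id))
            next_id += 1
        count += 1                           # net gain: one triangle per split
    return max(-e for e, _ in heap)

# Two root triangles with errors 8 and 1; refine to a 6-triangle mesh.
print(refine([8.0, 1.0], 6))  # 2.0
```

    The greedy order is what gives the minimal-error property within the permissible mesh class: the budget is always spent where the screen-space error bound is currently largest.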

  11. Remote collection and analysis of witness reports on flash floods

    NASA Astrophysics Data System (ADS)

    Gourley, Jonathan; Erlingis, Jessica; Smith, Travis; Ortega, Kiel; Hong, Yang

    2010-05-01

    Typically, flash floods are studied ex post facto in response to a major impact event. A complement to field investigations is developing a detailed database of flash flood events, including minor events and null reports (i.e., where heavy rain occurred but there was no flash flooding), based on public survey questions conducted in near-real time. The Severe Hazards Analysis and Verification Experiment (SHAVE) has been in operation at the National Severe Storms Laboratory (NSSL) in Norman, OK, USA during the summers since 2006. The experiment employs undergraduate students to analyse real-time products from weather radars, target specific regions within the conterminous US, and poll public residences and businesses regarding the occurrence and severity of hail, wind, tornadoes, and now flash floods. In addition to providing a rich learning experience for students, SHAVE has been successful in creating high-resolution datasets of severe hazards used for algorithm and model verification. This talk describes the criteria used to initiate the flash flood survey, the specific questions asked and information entered to the database, and then provides an analysis of results for flash flood data collected during the summer of 2008. It is envisioned that specific details provided by the SHAVE flash flood observation database will complement databases collected by operational agencies and thus lead to better tools to predict the likelihood of flash floods and ultimately reduce their impacts on society.

  12. GETPrime: a gene- or transcript-specific primer database for quantitative real-time PCR.

    PubMed

    Gubelmann, Carine; Gattiker, Alexandre; Massouras, Andreas; Hens, Korneel; David, Fabrice; Decouttere, Frederik; Rougemont, Jacques; Deplancke, Bart

    2011-01-01

    The vast majority of genes in humans and other organisms undergo alternative splicing, yet the biological function of splice variants is still very poorly understood in large part because of the lack of simple tools that can map the expression profiles and patterns of these variants with high sensitivity. High-throughput quantitative real-time polymerase chain reaction (qPCR) is an ideal technique to accurately quantify nucleic acid sequences including splice variants. However, currently available primer design programs do not distinguish between splice variants and also differ substantially in overall quality, functionality or throughput mode. Here, we present GETPrime, a primer database supported by a novel platform that uniquely combines and automates several features critical for optimal qPCR primer design. These include the consideration of all gene splice variants to enable either gene-specific (covering the majority of splice variants) or transcript-specific (covering one splice variant) expression profiling, primer specificity validation, automated best primer pair selection according to strict criteria and graphical visualization of the latter primer pairs within their genomic context. GETPrime primers have been extensively validated experimentally, demonstrating high transcript specificity in complex samples. Thus, the free-access, user-friendly GETPrime database allows fast primer retrieval and visualization for genes or groups of genes of most common model organisms, and is available at http://updepla1srv1.epfl.ch/getprime/. Database URL: http://deplanckelab.epfl.ch.
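
    Two of the listed criteria (strict primer selection rules and specificity against all splice variants) can be illustrated with a toy screen: melting temperature via the Wallace rule (Tm = 2(A+T) + 4(G+C), a rough estimate valid only for short oligos) and specificity defined as "exactly one exact occurrence in the transcript set". This illustrates the kind of filter GETPrime automates; it is not GETPrime's algorithm, and the sequences are invented:

```python
def wallace_tm(primer):
    """Wallace rule: Tm = 2*(A+T) + 4*(G+C), for short oligonucleotides."""
    p = primer.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def is_specific(primer, transcripts):
    """Toy specificity test: the primer occurs exactly once across all
    transcripts (so it amplifies one splice variant unambiguously)."""
    hits = sum(t.count(primer) for t in transcripts.values())
    return hits == 1

transcripts = {
    "geneX-v1": "ATGGCGTACGTTAGCAGGCTTACG",
    "geneX-v2": "ATGGCGTACGTTAGCA",  # splice variant sharing the 5' end
}
print(wallace_tm("GCAGGCTTACG"))                # 36
print(is_specific("GCAGGCTTACG", transcripts))  # True: only in variant 1
print(is_specific("ATGGCGTACG", transcripts))   # False: hits both variants
```

    The distinction drawn in the abstract maps directly onto this test: a transcript-specific primer must pass it, while a gene-specific primer is deliberately chosen to hit (most of) the variants.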

  13. GETPrime: a gene- or transcript-specific primer database for quantitative real-time PCR

    PubMed Central

    Gubelmann, Carine; Gattiker, Alexandre; Massouras, Andreas; Hens, Korneel; David, Fabrice; Decouttere, Frederik; Rougemont, Jacques; Deplancke, Bart

    2011-01-01

    The vast majority of genes in humans and other organisms undergo alternative splicing, yet the biological function of splice variants is still very poorly understood in large part because of the lack of simple tools that can map the expression profiles and patterns of these variants with high sensitivity. High-throughput quantitative real-time polymerase chain reaction (qPCR) is an ideal technique to accurately quantify nucleic acid sequences including splice variants. However, currently available primer design programs do not distinguish between splice variants and also differ substantially in overall quality, functionality or throughput mode. Here, we present GETPrime, a primer database supported by a novel platform that uniquely combines and automates several features critical for optimal qPCR primer design. These include the consideration of all gene splice variants to enable either gene-specific (covering the majority of splice variants) or transcript-specific (covering one splice variant) expression profiling, primer specificity validation, automated best primer pair selection according to strict criteria and graphical visualization of the latter primer pairs within their genomic context. GETPrime primers have been extensively validated experimentally, demonstrating high transcript specificity in complex samples. Thus, the free-access, user-friendly GETPrime database allows fast primer retrieval and visualization for genes or groups of genes of most common model organisms, and is available at http://updepla1srv1.epfl.ch/getprime/. Database URL: http://deplanckelab.epfl.ch. PMID:21917859

  14. Multi-Level Pre-Correlation RFI Flagging for Real-Time Implementation on UniBoard

    NASA Astrophysics Data System (ADS)

    Dumez-Viou, Cédric; Weber, Rodolphe; Ravier, Philippe

    2016-03-01

    Because of the denser active use of the spectrum, and because of radio telescopes' higher sensitivity, radio frequency interference (RFI) mitigation has become a sensitive topic for current and future radio telescope designs. Even though quite sophisticated approaches have been proposed in recent years, the majority of operational RFI mitigation procedures are based on post-correlation flagging of corrupted data. Moreover, given the huge amount of data delivered by current and next-generation radio telescopes, all these RFI detection procedures have to be at least automatic and, if possible, real-time. In this paper, the implementation of a real-time pre-correlation RFI detection and flagging procedure on generic high-performance computing platforms based on field-programmable gate arrays (FPGAs) is described, simulated and tested. One of these boards, UniBoard, developed under a Joint Research Activity in the RadioNet FP7 European programme, is based on eight FPGAs interconnected by a high-speed transceiver mesh. It provides up to 4 TMACs with Altera® Stratix IV FPGAs and a 160 Gbps data rate for the input data stream. The proposed concept is to continuously monitor the data quality at different stages of the digital preprocessing pipeline between the antennas and the correlator, at both the station level and the core level. In this way, the detectors are applied at stages where different time-frequency resolutions can be achieved and where the interference-to-noise ratio (INR) is maximal, right before any dilution of RFI characteristics by subsequent channelizations or signal recombinations. The detection decisions could be linked to an RFI statistics database or attached to the data for later-stage flagging. Considering the high input/output data rate of the pre-correlation stages, only real-time, go-through detectors (i.e., no iterative processing) can be implemented. In this paper, a real-time and adaptive detection scheme is described. An ongoing case study has been set up with the Electronic Multi-Beam Radio Astronomy Concept (EMBRACE) radio telescope facility at Nançay Observatory. The objective is to evaluate the performance of this concept in terms of hardware complexity, detection efficiency and additional RFI metadata rate cost. The UniBoard implementation scheme is described.
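    The "go-through" constraint means each sample is tested once against a threshold, with no iterative refinement. A minimal sketch of such a single-pass power-threshold flagger is shown below; the function and threshold are illustrative, and the real UniBoard detectors run as FPGA firmware, not Python:

```python
# Minimal sketch of a non-iterative ("go-through") pre-correlation RFI
# flagger: flag any time/frequency cell whose power exceeds k times the
# per-channel median. Names and the factor k=5 are illustrative.

def flag_rfi(power, k=5.0):
    """`power` is a list of spectra (one list of channel powers per time
    step). Returns a same-shaped boolean mask of flagged cells."""
    nchan = len(power[0])
    medians = []
    for c in range(nchan):
        col = sorted(row[c] for row in power)
        medians.append(col[len(col) // 2])
    # Single pass over the data: each cell is tested exactly once.
    return [[row[c] > k * medians[c] for c in range(nchan)] for row in power]
```

    In hardware the median would be replaced by a running estimate, with the threshold tuned to the INR achievable at each pipeline stage.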

  15. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.

  16. Objective assessment of MPEG-2 video quality

    NASA Astrophysics Data System (ADS)

    Gastaldo, Paolo; Zunino, Rodolfo; Rovetta, Stefano

    2002-07-01

    The increasing use of video compression standards in broadcasting television systems has required, in recent years, the development of video quality measurements that take into account artifacts specifically caused by digital compression techniques. In this paper, we present a methodology for the objective quality assessment of MPEG video streams by using circular back-propagation feedforward neural networks. Mapping neural networks can render nonlinear relationships between objective features and subjective judgments, thus avoiding any simplifying assumption on the complexity of the model. The neural network processes an instantaneous set of input values and yields an associated estimate of perceived quality. Therefore, the neural-network approach turns objective quality assessment into adaptive modeling of subjective perception. The objective features used for the estimate are chosen according to their assessed relevance to perceived quality and are continuously extracted in real time from compressed video streams. The overall system mimics perception but does not require any analytical model of the underlying physical phenomenon. The capability to process compressed video streams represents an important advantage over existing approaches, since avoiding the stream-decoding process greatly enhances real-time performance. Experimental results confirm that the system provides satisfactory, continuous-time approximations for actual scoring curves concerning real test videos.
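    The core mapping is a trained network taking a feature vector to a quality score. A generic one-hidden-layer forward pass of that kind is sketched below; the actual model is a trained circular back-propagation network, and the weights here are placeholders:

```python
import math

# Generic feedforward pass mapping objective stream features to a
# perceived-quality score in (0, 1). The paper's network is a trained
# circular back-propagation variant; these weights are placeholders.

def mlp_quality(features, W1, b1, w2, b2):
    sig = lambda t: 1.0 / (1.0 + math.exp(-t))
    hidden = [sig(sum(w * x for w, x in zip(row, features)) + b)
              for row, b in zip(W1, b1)]
    return sig(sum(w * h for w, h in zip(w2, hidden)) + b2)
```

    With positive weights the score rises monotonically with the input features; training would shape this mapping to match subjective scoring curves.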

  17. Real-time range generation for ladar hardware-in-the-loop testing

    NASA Astrophysics Data System (ADS)

    Olson, Eric M.; Coker, Charles F.

    1996-05-01

    Real-time closed loop simulation of LADAR seekers in a hardware-in-the-loop facility can reduce program risk and cost. This paper discusses an implementation of real-time range imagery generated in a synthetic environment at the Kinetic Kill Vehicle Hardware-in-the-Loop facility at Eglin AFB, for the stimulation of LADAR seekers and algorithms. The computer hardware platform used was a Silicon Graphics Incorporated Onyx Reality Engine. This computer contains graphics hardware and is optimized for generating visible or infrared imagery in real-time. A by-product of the rendering process, in the form of a depth buffer, is generated from all objects in view. The depth buffer is an array of integer values that contributes to the proper rendering of overlapping objects and can be converted to range values using a mathematical formula. This paper presents an optimized software approach to the generation of the scenes, calculation of the range values, and outputting the range data for a LADAR seeker.
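    The abstract does not reproduce the conversion formula; the standard inversion for a perspective-projected depth buffer, assuming depth values normalized to [0, 1] (OpenGL-style) and hypothetical near/far clip distances, looks like this:

```python
# Standard inversion of the non-linear perspective depth mapping,
# assuming a depth buffer normalized to [0, 1]: z = 0 at the near clip
# plane, z = 1 at the far clip plane. near/far values are assumptions.

def depth_to_range(z, near, far):
    """Convert a normalized depth value to metric range along the ray."""
    return (near * far) / (far - z * (far - near))
```

    Integer depth buffers are first normalized, e.g. `z = z_int / (2**24 - 1)` for a 24-bit buffer. Note the mapping is highly non-linear: half the integer codes cover ranges near the near plane.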

  18. New technique for real-time distortion-invariant multiobject recognition and classification

    NASA Astrophysics Data System (ADS)

    Hong, Rutong; Li, Xiaoshun; Hong, En; Wang, Zuyi; Wei, Hongan

    2001-04-01

    A real-time hybrid distortion-invariant OPR system was established to perform 3D multiobject distortion-invariant automatic pattern recognition. Wavelet-transform techniques were used for digital preprocessing of the input scene, suppressing the noisy background and enhancing the recognized object. A three-layer backpropagation artificial neural network was used in correlation-signal post-processing to perform multiobject distortion-invariant recognition and classification. The C-80 and NOA real-time processing capability and multithread programming technology were used to perform high-speed parallel multitask processing and speed up the post-processing of ROIs. The reference filter library (RFL) was constructed for distorted versions of the 3D object model images based on distortion-parameter tolerance measurements for rotation, azimuth and scale. Real-time optical correlation recognition testing of this OPR system demonstrates that, using the preprocessing and post-processing described, the nonlinear algorithm of optimum filtering, the RFL construction technique and multithread programming, a high recognition probability and recognition rate were obtained for the real-time multiobject distortion-invariant OPR system. The recognition reliability and rate were improved greatly. These techniques are very useful for automatic target recognition.

  19. Interactive MPEG-4 low-bit-rate speech/audio transmission over the Internet

    NASA Astrophysics Data System (ADS)

    Liu, Fang; Kim, JongWon; Kuo, C.-C. Jay

    1999-11-01

    The recently developed MPEG-4 technology enables the coding and transmission of natural and synthetic audio-visual data in the form of objects. In an effort to extend the object-based functionality of MPEG-4 to real-time Internet applications, architectural prototypes of multiplex and transport layers tailored for transmission of MPEG-4 data over IP are under debate among the Internet Engineering Task Force (IETF) and the MPEG-4 Systems Ad Hoc group. In this paper, we present an architecture for an interactive MPEG-4 speech/audio transmission system over the Internet. It utilizes a framework of Real Time Streaming Protocol (RTSP) over Real-time Transport Protocol (RTP) to provide controlled, on-demand delivery of real-time speech/audio data. Based on a client-server model, a couple of low-bit-rate bit streams (real-time speech/audio and pre-encoded speech/audio) are multiplexed and transmitted via a single RTP channel to the receiver. The MPEG-4 Scene Description (SD) and Object Descriptor (OD) bit streams are securely sent through the RTSP control channel. Upon reception, an initial MPEG-4 audio-visual scene is constructed after de-multiplexing, decoding of bit streams, and scene composition. A receiver is allowed to manipulate the initial audio-visual scene presentation locally, or interactively arrange scene changes by sending requests to the server. A server may also choose to update the client with new streams and a list of contents for user selection.

  20. Development of real-time voltage stability monitoring tool for power system transmission network using Synchrophasor data

    NASA Astrophysics Data System (ADS)

    Pulok, Md Kamrul Hasan

    Intelligent and effective monitoring of power system stability in control centers is one of the key issues in smart grid technology to prevent unwanted power system blackouts. Voltage stability analysis is one of the most important requirements for control center operation in the smart grid era. With the advent of Phasor Measurement Unit (PMU), or Synchrophasor, technology, real-time monitoring of the voltage stability of a power system is now a reality. This work utilizes real-time PMU data to derive a voltage stability index to monitor voltage-stability-related contingency situations in power systems. The developed tool uses PMU data to calculate a voltage stability index whose numerical value indicates the relative closeness of instability. The IEEE 39-bus New England power system was modeled and run on a Real-Time Digital Simulator that streams PMU data over the Internet using the IEEE C37.118 protocol. A phasor data concentrator (PDC) was set up to receive the streaming PMU data and store it in a Microsoft SQL database server. The developed voltage stability monitoring (VSM) tool then retrieves phasor measurements from the SQL server, performs real-time state estimation of the whole network, calculates the voltage stability index, ranks the most vulnerable transmission lines in real time, and finally shows all results in a graphical user interface. All these actions are done in near real time. Control centers can easily monitor the system's condition using this tool and take precautionary actions if needed.
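    The thesis's exact index is not given in the abstract. As a stand-in, the sketch below computes the Fast Voltage Stability Index (FVSI), a common PMU-friendly line index that approaches 1.0 near voltage collapse, and ranks lines by it; all values and names are illustrative:

```python
# Hypothetical stand-in for the tool's index: the Fast Voltage Stability
# Index (FVSI) per line, plus a ranking of lines by stress. Values near
# 1.0 indicate proximity to voltage collapse.

def fvsi(v_send, q_recv, z_line, x_line):
    """FVSI = 4 * Z^2 * Q_r / (V_s^2 * X) for one transmission line,
    from sending-end voltage magnitude and receiving-end reactive power."""
    return 4.0 * z_line**2 * q_recv / (v_send**2 * x_line)

def rank_lines(lines):
    """Rank (line_id, index) pairs from most to least stressed."""
    return sorted(lines, key=lambda item: item[1], reverse=True)
```

    In a PDC-fed pipeline, `v_send` and `q_recv` would come from the state estimate refreshed at each PMU reporting interval.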

  1. Simultaneous real-time data collection methods

    NASA Technical Reports Server (NTRS)

    Klincsek, Thomas

    1992-01-01

    This paper describes the development of electronic test equipment which executes, supervises, and reports on various tests. This validation process uses computers to analyze test results and report conclusions. The test equipment consists of an electronics component and the data collection and reporting unit. The PC software, display screens, and real-time database are described. Pass-fail procedures and data replay are discussed. The OS/2 operating system and Presentation Manager user interface system were used to create a highly interactive automated system. The system outputs are hardcopy printouts and MS-DOS-format files which may be used as input for other PC programs.

  2. 41 CFR 102-75.210 - What must a transferee agency include in its request for an exception from the 100 percent...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 75-REAL PROPERTY DISPOSAL Utilization of Excess Real... exception would further essential agency program objectives and at the same time be consistent with...

  3. Pathological speech signal analysis and classification using empirical mode decomposition.

    PubMed

    Kaleem, Muhammad; Ghoraani, Behnaz; Guergachi, Aziz; Krishnan, Sridhar

    2013-07-01

    Automated classification of normal and pathological speech signals can provide an objective and accurate mechanism for pathological speech diagnosis, and is an active area of research. A large part of this research is based on analysis of acoustic measures extracted from sustained vowels. However, sustained vowels do not reflect real-world attributes of voice as effectively as continuous speech, which can take into account important attributes of speech such as rapid voice onset and termination, changes in voice frequency and amplitude, and sudden discontinuities in speech. This paper presents a methodology based on empirical mode decomposition (EMD) for classification of continuous normal and pathological speech signals obtained from a well-known database. EMD is used to decompose randomly chosen portions of speech signals into intrinsic mode functions, which are then analyzed to extract meaningful temporal and spectral features, including true instantaneous features which can capture discriminative information hidden in signals at local time-scales. A total of six features are extracted, and a linear classifier is used with the feature vector to classify continuous speech portions obtained from a database consisting of 51 normal and 161 pathological speakers. A classification accuracy of 95.7% is obtained, demonstrating the effectiveness of the methodology.
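    The feature-plus-linear-classifier stage can be sketched with two simple frame features standing in for the paper's six EMD-derived ones; the features (short-time energy, zero-crossing rate) and the decision weights below are illustrative only, not the trained model:

```python
# Illustrative stand-in for the classification stage: two simple frame
# features (short-time energy, zero-crossing rate) instead of the six
# EMD-derived features, and an untrained linear decision rule.

def frame_features(x):
    """Short-time energy and zero-crossing rate of one speech frame."""
    n = len(x)
    energy = sum(v * v for v in x) / n
    zcr = sum(1 for a, b in zip(x, x[1:]) if a * b < 0) / (n - 1)
    return [energy, zcr]

def linear_classify(feats, w, b):
    """Sign of w . feats + b decides the class label."""
    score = sum(wi * fi for wi, fi in zip(w, feats)) + b
    return "pathological" if score > 0 else "normal"
```

    In the paper's pipeline, the weights would be learned from the labeled 51-normal / 161-pathological training set rather than chosen by hand.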

  4. Research and design of photovoltaic power monitoring system based on ZigBee

    NASA Astrophysics Data System (ADS)

    Zhu, Lijuan; Yun, Zhonghua; Bianbawangdui; Bianbaciren

    2018-01-01

    In order to monitor and study the impact of environmental parameters on photovoltaic cells, a photovoltaic cell monitoring system based on ZigBee is designed. The system uses ZigBee wireless communication technology to achieve real-time acquisition of P-I-V curves and environmental parameters at the terminal nodes and transfer the data to the coordinator, which communicates with an STM32 through its serial port. The STM32 in turn transfers the data over a serial port to a host application written in LabVIEW, where the collected data are displayed in real time and stored in a background database. The experimental results show that the system has stable performance, accurate measurement, high sensitivity and high reliability, and can effectively collect photovoltaic cell characteristics and environmental parameters in real time.

  5. Virtual reality for mobility devices: training applications and clinical results: a review.

    PubMed

    Erren-Wolters, Catelijne Victorien; van Dijk, Henk; de Kort, Alexander C; Ijzerman, Maarten J; Jannink, Michiel J

    2007-06-01

    Virtual reality technology is an emerging technology that can possibly address the problems encountered in training (elderly) people to handle a mobility device. The objective of this review was to study different virtual reality training applications as well as their clinical implications for patients with mobility problems. Computerized literature searches were performed using the MEDLINE, Cochrane, CIRRIE and REHABDATA databases. This resulted in eight peer-reviewed journal articles. The included studies could be divided into three categories on the basis of their study objective: five studies were related to training driving skills, two to physical exercise training and one to leisure activity. This review suggests that virtual reality is a potentially useful means to improve the use of a mobility device, in training driving skills, in maintaining physical condition, and as a leisure-time activity. Although this field of research appears to be in its early stages, the included studies pointed out a promising transfer of training in a virtual environment to the real-life use of mobility devices.

  6. Characterization of real-time computers

    NASA Technical Reports Server (NTRS)

    Shin, K. G.; Krishna, C. M.

    1984-01-01

    A real-time system consists of a computer controller and controlled processes. Despite the synergistic relationship between these two components, they have been traditionally designed and analyzed independently of and separately from each other; namely, computer controllers by computer scientists/engineers and controlled processes by control scientists. As a remedy for this problem, in this report real-time computers are characterized by performance measures based on computer controller response time that are: (1) congruent to the real-time applications, (2) able to offer an objective comparison of rival computer systems, and (3) experimentally measurable/determinable. These measures, unlike others, provide the real-time computer controller with a natural link to controlled processes. In order to demonstrate their utility and power, these measures are first determined for example controlled processes on the basis of control performance functionals. They are then used for two important real-time multiprocessor design applications - the number-power tradeoff and fault-masking and synchronization.

  7. Reflective and refractive objects for mixed reality.

    PubMed

    Knecht, Martin; Traxler, Christoph; Winklhofer, Christoph; Wimmer, Michael

    2013-04-01

    In this paper, we present a novel rendering method which integrates reflective or refractive objects into a differential instant radiosity (DIR) framework usable for mixed-reality (MR) applications. These objects are special from the light-interaction point of view, as they both reflect and refract incident rays and may therefore cause high-frequency lighting effects known as caustics. Using instant-radiosity (IR) methods to approximate these high-frequency lighting effects would require a large number of virtual point lights (VPLs) and is therefore not desirable under real-time constraints. Instead, our approach combines differential instant radiosity with three other methods. One method handles more accurate reflections than simple cubemaps by using impostors. Another method calculates two refractions in real time, and the third method uses small quads to create caustic effects. Our proposed method replaces the parts of light paths that involve reflective or refractive objects with these three methods and thus integrates tightly into DIR. In contrast to previous methods that introduce reflective or refractive objects into MR scenarios, our method produces caustics that also emit additional indirect light. The method runs at real-time frame rates, and the results show that reflective and refractive objects with caustics improve the overall impression of MR scenarios.

  8. Finite-Element Methods for Real-Time Simulation of Surgery

    NASA Technical Reports Server (NTRS)

    Basdogan, Cagatay

    2003-01-01

    Two finite-element methods have been developed for mathematical modeling of the time-dependent behaviors of deformable objects and, more specifically, the mechanical responses of soft tissues and organs in contact with surgical tools. These methods may afford the computational efficiency needed to satisfy the requirement to obtain computational results in real time for simulating surgical procedures as described in Simulation System for Training in Laparoscopic Surgery (NPO-21192) on page 31 in this issue of NASA Tech Briefs. Simulation of the behavior of soft tissue in real time is a challenging problem because of the complexity of soft-tissue mechanics. The responses of soft tissues are characterized by nonlinearities and by spatial inhomogeneities and rate and time dependences of material properties. Finite-element methods seem promising for integrating these characteristics of tissues into computational models of organs, but they demand much central-processing-unit (CPU) time and memory, and the demand increases with the number of nodes and degrees of freedom in a given finite-element model. Hence, as finite-element models become more realistic, it becomes more difficult to compute solutions in real time. In both of the present methods, one uses approximate mathematical models trading some accuracy for computational efficiency and thereby increasing the feasibility of attaining real-time update rates. The first of these methods is based on modal analysis. In this method, one reduces the number of differential equations by selecting only the most significant vibration modes of an object (typically, a suitable number of the lowest-frequency modes) for computing deformations of the object in response to applied forces.
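    The modal-reduction idea can be sketched numerically: keep only the lowest-frequency modes of a linear model and superpose them to approximate the deformation. The toy stiffness matrix and the unit mass matrix assumed below are illustrative, not the paper's tissue models:

```python
import numpy as np

# Sketch of modal reduction for a linear system K u = f, assuming a unit
# mass matrix for brevity. Keeping only the lowest-frequency modes trades
# accuracy for speed, which is the paper's core idea.

def modal_deformation(K, f, n_modes):
    w2, phi = np.linalg.eigh(K)       # squared mode frequencies, shapes
    phi_r = phi[:, :n_modes]          # lowest-frequency subset
    q = (phi_r.T @ f) / w2[:n_modes]  # modal amplitudes (static response)
    return phi_r @ q                  # approximate deformation
```

    With all modes retained this reproduces the full solution `K^{-1} f` exactly; dropping the stiff, high-frequency modes removes degrees of freedom that contribute little to visible deformation.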

  9. Using Amazon Web Services (AWS) to enable real-time, remote sensing of biophysical and anthropogenic conditions in green infrastructure systems in Philadelphia, an ultra-urban application of the Internet of Things (IoT)

    NASA Astrophysics Data System (ADS)

    Montalto, F. A.; Yu, Z.; Soldner, K.; Israel, A.; Fritch, M.; Kim, Y.; White, S.

    2017-12-01

    Urban stormwater utilities are increasingly using decentralized "green" infrastructure (GI) systems to capture stormwater and achieve compliance with regulations. Because environmental conditions and design vary by GSI facility, monitoring of GSI systems under a range of conditions is essential. Conventional monitoring efforts can be costly because in-field data logging requires intense data transmission rates. The Internet of Things (IoT) can be used to more cost-effectively collect, store, and publish GSI monitoring data. Using 3G mobile networks, a cloud-based database was built on an Amazon Web Services (AWS) EC2 virtual machine to store and publish data collected with environmental sensors deployed in the field. This database can store multi-dimensional time series data, as well as photos and other observations logged by citizen scientists through a public engagement mobile app, via a new Application Programming Interface (API). Also on the AWS EC2 virtual machine, a real-time QAQC flagging algorithm was developed to validate the sensor data streams.
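    The abstract does not detail the QAQC algorithm; a minimal version of that kind of validation is a range check plus a step (spike) check, with thresholds per sensor. The function and thresholds below are hypothetical:

```python
# Minimal sketch of real-time QAQC flagging for a sensor stream: flag a
# sample if it is outside a physical range or jumps too far from the
# previous sample. Thresholds are hypothetical and sensor-specific.

def qaqc_flags(series, lo, hi, max_step):
    """Return one boolean per sample: True means the sample is flagged."""
    flags, prev = [], None
    for v in series:
        bad = (not (lo <= v <= hi)
               or (prev is not None and abs(v - prev) > max_step))
        flags.append(bad)
        prev = v
    return flags
```

    In a streaming deployment this runs per incoming record, writing the flag alongside the value so downstream users can filter without losing raw data.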

  10. Handheld Devices with Wide-Area Wireless Connectivity: Applications in Astronomy Educational Technology and Remote Computational Control

    NASA Astrophysics Data System (ADS)

    Budiardja, R. D.; Lingerfelt, E. J.; Guidry, M. W.

    2003-05-01

    Wireless technology implemented with handheld devices has attractive features because of the potential to access large amounts of data and the prospect of on-the-fly computational analysis from a device that can be carried in a shirt pocket. We shall describe applications of such technology to the general paradigm of making digital wireless connections from the field to upload information and queries to network servers, executing (potentially complex) programs and controlling data analysis and/or database operations on fast network computers, and returning real-time information from this analysis to the handheld device in the field. As illustration, we shall describe several client/server programs that we have written for applications in teaching introductory astronomy. For example, one program allows static and dynamic properties of astronomical objects to be accessed in a remote observation laboratory setting using a digital cell phone or PDA. Another implements interactive quizzing over a cell phone or PDA using a 700-question introductory astronomy quiz database, thus permitting students to study for astronomy quizzes in any environment in which they have a few free minutes and a digital cell phone or wireless PDA. Another allows one to control and monitor a computation done on a Beowulf cluster by changing the parameters of the computation remotely and retrieving the result when the computation is done. The presentation will include hands-on demonstrations with real devices. *Managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.

  11. BanTeC: a software tool for management of corneal transplantation.

    PubMed

    López-Alvarez, P; Caballero, F; Trias, J; Cortés, U; López-Navidad, A

    2005-11-01

    Until recently, all cornea information at our tissue bank was managed manually; no specific database or computer tool had been implemented to provide electronic versions of documents and medical reports. The main objective of the BanTeC project was therefore to create a computerized system to integrate and classify all the information and documents used in the center in order to facilitate management of retrieved and transplanted corneal tissues. We used the Windows platform to develop the project. Microsoft Access and Microsoft Jet Engine were used at the database level, and Data Access Objects was the chosen data access technology. In short, the BanTeC software seeks to computerize the tissue bank. All the initial stages of the development have now been completed, from specification of needs, program design and implementation of the software components, to the total integration of the final result in the real production environment. BanTeC will allow the generation of statistical reports for analysis to improve our performance.

  12. Real-Time Optical Surveillance of LEO/MEO with Small Telescopes

    NASA Astrophysics Data System (ADS)

    Zimmer, P.; McGraw, J.; Ackermann, M.

    J.T. McGraw and Associates, LLC operates two proof-of-concept wide-field imaging systems to test novel techniques for uncued surveillance of LEO/MEO/GEO and, in collaboration with the University of New Mexico (UNM), uses a third small telescope for rapidly queued same-orbit follow-up observations. Using our GPU-accelerated detection scheme, the proof-of-concept systems operating at sites near and within Albuquerque, NM, have detected objects fainter than V=13 at greater than 6 sigma significance. This detection limit corresponds approximately to a 16 cm object with an albedo of 0.12 at 1000 km altitude. Dozens of objects are measured during each operational twilight period, many of which have no corresponding catalog object. The two proof-of-concept systems, separated by ~30 km, work together by taking simultaneous images of the same orbital volume to constrain the orbits of detected objects using parallax measurements. These detections are followed up by imaging photometric observations taken at UNM to confirm and further constrain the initial orbit determination, independently assess the objects and verify the quality of the derived orbits. This work continues to demonstrate that scalable optical systems designed for real-time detection of fast-moving objects, which can then be handed off to other instruments capable of tracking and characterizing them, can provide valuable real-time surveillance data at LEO and beyond, which substantively informs the SSA process.
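    The two-site parallax constraint admits a back-of-the-envelope version: for a ~30 km baseline, the small-angle approximation gives range directly from the measured parallax angle. The real pipeline fits full orbits; this sketch only illustrates the geometry:

```python
import math

# Small-angle range estimate from simultaneous two-site imaging: an
# object at range r seen from sites separated by baseline b shows a
# parallax of roughly b / r radians against the star background.

def parallax_range_km(baseline_km, parallax_arcsec):
    theta = math.radians(parallax_arcsec / 3600.0)
    return baseline_km / theta
```

    A 30 km baseline and a 1000 km object give a parallax of about 0.03 rad, i.e. well over a degree, which is why even a modest baseline separates LEO objects cleanly from the sidereal background.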

  13. Analysis of Information Requirements and Design of the Consolidated AFIT Database and Information System (CADIS) with an AFIT/CI Implementation Design.

    DTIC Science & Technology

    1982-12-01

    management, plus the comments received from the faculty and staff. A major assumption in this thesis is that automated database techniques offer the...and major advantage of a DBMS is that of real-time, on-line data accessibility. Routine queries, reports and ad hoc queries can be performed...used or as applications programs evolve. Such changes can have a major impact on the organization and storage of data and ultimately on the response

  14. Real-time monitoring system of composite aircraft wings utilizing Fibre Bragg Grating sensor

    NASA Astrophysics Data System (ADS)

    Vorathin, E.; Hafizi, Z. M.; Che Ghani, S. A.; Lim, K. S.

    2016-10-01

    Embedment of Fibre Bragg Grating (FBG) sensors in composite aircraft wings leads to the advancement of structural condition monitoring. The monitored aircraft wings have the capability to give a real-time response under critical loading circumstances. The main objective of this paper is to develop a real-time FBG monitoring system for composite aircraft wings to view real-time changes when the structure undergoes static loadings and dynamic impact. The implementation of a matched-edge-filter FBG interrogation system to convert wavelength variations into strain readings shows that the structure responds instantly in real time when undergoing loadings and dynamic impact. This smart monitoring system updates changes instantly in real time and shows the weight induced on the composite aircraft wings without error. It also shows good agreement with an acoustic emission (AE) sensor in the dynamic test.
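    The wavelength-to-strain step has a standard form. The sketch below assumes a typical effective photo-elastic coefficient for silica fibre (0.22) and a temperature-compensated reading; the interrogator described above would supply the wavelength values:

```python
# Standard conversion from FBG Bragg-wavelength shift to axial strain:
# strain = (dLambda / Lambda0) / (1 - p_e), where p_e is the effective
# photo-elastic coefficient (0.22 is a typical silica value; an
# assumption here, as is temperature compensation).

def fbg_strain(wavelength_nm, base_wavelength_nm, p_e=0.22):
    shift = wavelength_nm - base_wavelength_nm
    return shift / (base_wavelength_nm * (1.0 - p_e))
```

    For example, a 1.2 nm shift on a 1550 nm grating corresponds to roughly 993 microstrain.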

  15. A Context-Aware Method for Authentically Simulating Outdoors Shadows for Mobile Augmented Reality.

    PubMed

    Barreira, Joao; Bessa, Maximino; Barbosa, Luis; Magalhaes, Luis

    2018-03-01

    Visual coherence between virtual and real objects is a major issue in creating convincing augmented reality (AR) applications. To achieve this seamless integration, actual light conditions must be determined in real time to ensure that virtual objects are correctly illuminated and cast consistent shadows. In this paper, we propose a novel method to estimate daylight illumination and use this information in outdoor AR applications to render virtual objects with coherent shadows. The illumination parameters are acquired in real time from context-aware live sensor data. The method works under unprepared natural conditions. We also present a novel and rapid implementation of a state-of-the-art skylight model, from which the illumination parameters are derived. The Sun's position is calculated based on the user location and time of day, with the relative rotational differences estimated from a gyroscope, compass and accelerometer. The results illustrated that our method can generate visually credible AR scenes with consistent shadows rendered from recovered illumination.
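    The Sun-position step can be sketched with a crude declination model; the paper presumably uses a more precise ephemeris together with the device's compass and gyroscope, so the formula and constants below are simplifying assumptions:

```python
import math

# Rough solar elevation from latitude, day of year and local solar time,
# using a simplified declination model. A real AR pipeline would combine
# a precise ephemeris with the device's rotation sensors.

def sun_elevation_deg(lat_deg, day_of_year, solar_hour):
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    lat, d = math.radians(lat_deg), math.radians(decl)
    sin_el = (math.sin(lat) * math.sin(d)
              + math.cos(lat) * math.cos(d) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_el))
```

    Elevation (with azimuth, derived analogously) fixes the shadow direction; the skylight model then supplies the illumination colour and intensity.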

  16. Real-time photo-magnetic imaging.

    PubMed

    Nouizi, Farouk; Erkol, Hakan; Luk, Alex; Unlu, Mehmet B; Gulsen, Gultekin

    2016-10-01

    We previously introduced a new high-resolution diffuse optical imaging modality termed photo-magnetic imaging (PMI). PMI irradiates the object under investigation with near-infrared light and monitors the variations of temperature using magnetic resonance thermometry (MRT). In this paper, we present a real-time PMI image reconstruction algorithm that uses analytic methods to solve the forward problem and assemble the Jacobian matrix much faster. The new algorithm is validated using real MRT-measured temperature maps. In fact, it accelerates the reconstruction process by more than 250 times compared to a single iteration of the FEM-based algorithm, which opens the possibility of real-time PMI.

  17. Monitoring SLAC High Performance UNIX Computing Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lettsome, Annette K.; /Bethune-Cookman Coll. /SLAC

    2005-12-15

Knowledge of the effectiveness and efficiency of computers is important when working with high-performance systems. Monitoring such systems makes it possible to anticipate faults or system failures. Ganglia is a software system designed to retrieve specific monitoring information from high-performance computing systems. An alternative storage facility for Ganglia's collected data is needed, since its default storage system, the round-robin database (RRD), struggles with data integrity. The creation of a script-driven MySQL database solves this dilemma. This paper describes the process followed in the creation and implementation of the MySQL database for use by Ganglia. Comparisons between data storage by both databases are made using gnuplot and Ganglia's real-time graphical user interface.
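A script-driven metric store of the kind described can be sketched as follows. The schema, table, and column names are assumptions for illustration, and SQLite (from the standard library) stands in for MySQL so the example is self-contained:

```python
import sqlite3
import time

# Illustrative schema: one row per sample. Unlike an RRD, nothing is
# ever averaged away, so full-resolution history is retained.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE metrics (
        host   TEXT    NOT NULL,
        metric TEXT    NOT NULL,
        value  REAL    NOT NULL,
        ts     INTEGER NOT NULL   -- unix timestamp
    )""")

def store_sample(host, metric, value, ts=None):
    """Append one monitoring sample (e.g., parsed from Ganglia's XML)."""
    conn.execute("INSERT INTO metrics VALUES (?, ?, ?, ?)",
                 (host, metric, value, ts if ts is not None else int(time.time())))

store_sample("node01", "load_one", 0.5, ts=1000)
store_sample("node01", "load_one", 1.5, ts=1060)

row = conn.execute(
    "SELECT COUNT(*), AVG(value) FROM metrics WHERE metric = 'load_one'"
).fetchone()
print(row)  # → (2, 1.0)
```

The same aggregate queries can then feed gnuplot or a web front end, which is essentially the comparison the paper makes against the RRD back end.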

  18. Global building inventory for earthquake loss estimation and risk management

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David; Porter, Keith

    2010-01-01

We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat’s demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature.

  19. Patient privacy protection using anonymous access control techniques.

    PubMed

    Weerasinghe, D; Rajarajan, M; Elmufti, K; Rakocevic, V

    2008-01-01

The objective of this study is to develop a solution to preserve security and privacy in a healthcare environment where health-sensitive information will be accessed by many parties and stored in various distributed databases. The solution should maintain anonymous medical records and should be able to link anonymous medical information in distributed databases into a single patient medical record together with the patient identity. In this paper we present a protocol that can be used to authenticate and authorize patients to healthcare services without providing patient identification. A healthcare service can identify the patient using a separate temporary identity in each identification session, and medical records are linked to these temporary identities. Temporary identities can be used to enable record linkage and to reverse-track the real patient identity in critical medical situations. The proposed protocol provides the main security and privacy services, such as user anonymity, message privacy and confidentiality, user authentication, user authorization, and protection against message replay attacks. The medical environment validates the patient at the healthcare service as a real and registered patient for medical services. Using the proposed protocol, the patient's anonymous medical records at different healthcare services can be linked into one single report, and it is possible to securely reverse-track the anonymous patient to the real identity. The protocol protects patient privacy with secure anonymous authentication to healthcare services and medical record registries in accordance with European and UK legislation, under which the patient's real identity is not disclosed with the distributed patient medical records.
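The temporary-identity idea, unlinkable per-session pseudonyms that a trusted party can still reverse-track, can be sketched with an HMAC-based derivation. This is an illustrative stand-in, not the paper's actual protocol; `REGISTRY_KEY`, the in-memory registry, and the function names are all assumptions:

```python
import hashlib
import hmac

# Key held only by the trusted identity provider (assumption: in the
# real protocol this role is played by a secure registry service).
REGISTRY_KEY = b"held-only-by-the-trusted-identity-provider"

def temp_identity(patient_id: str, session_nonce: str) -> str:
    """Derive an unlinkable per-session pseudonym for one patient."""
    msg = f"{patient_id}|{session_nonce}".encode()
    return hmac.new(REGISTRY_KEY, msg, hashlib.sha256).hexdigest()[:16]

# The trusted party records pseudonym -> patient so that, in a medical
# emergency, anonymous records can be reverse-tracked to the identity.
registry = {}

def register_session(patient_id: str, nonce: str) -> str:
    pseudo = temp_identity(patient_id, nonce)
    registry[pseudo] = patient_id
    return pseudo

a = register_session("patient-042", "session-1")
b = register_session("patient-042", "session-2")
assert a != b                      # sessions look unrelated to observers
assert registry[a] == registry[b]  # ...but the trusted party can link them
```

Records at different healthcare services, each keyed by a different pseudonym, can thus be merged into one report only by the party holding the registry.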

  20. Vision-based overlay of a virtual object into real scene for designing room interior

    NASA Astrophysics Data System (ADS)

    Harasaki, Shunsuke; Saito, Hideo

    2001-10-01

In this paper, we introduce a geometric registration method for augmented reality (AR) and an application system, an interior simulator, in which a virtual (CG) object can be overlaid onto a real-world space. The interior simulator is developed as an example AR application of the proposed method. Using it, users can visually simulate the placement of virtual furniture and articles in a living room, viewing from many different locations and orientations in real time, so that they can easily design the interior without placing real furniture and articles. In our system, two base images of the real-world space are captured from two different views to define a projective coordinate frame for the 3D space. Each projective view of a virtual object in the base images is then registered interactively. After this coordinate determination, an image sequence of the real-world space is captured by a hand-held camera while tracking non-metrically measured feature points, and virtual objects are overlaid onto the image sequence using the relationships between the images. With the proposed system, 3D position-tracking devices, such as magnetic trackers, are not required for the overlay of virtual objects. Experimental results demonstrate that 3D virtual furniture can be overlaid onto an image sequence of a living room scene nearly at video rate (20 frames per second).

  1. STC synthesis of real-time driver information for congestion management : research project capsule.

    DOT National Transportation Integrated Search

    2014-02-01

The main focus of this synthesis report is to compile a technical summary of past and current research, as well as the state of the practice, on the role of real-time information in congestion mitigation programs. The specific objectives are to...

  2. A DBMS-based medical teleconferencing system.

    PubMed

    Chun, J; Kim, H; Lee, S; Choi, J; Cho, H

    2001-01-01

    This article presents the design of a medical teleconferencing system that is integrated with a multimedia patient database and incorporates easy-to-use tools and functions to effectively support collaborative work between physicians in remote locations. The design provides a virtual workspace that allows physicians to collectively view various kinds of patient data. By integrating the teleconferencing function into this workspace, physicians are able to conduct conferences using the same interface and have real-time access to the database during conference sessions. The authors have implemented a prototype based on this design. The prototype uses a high-speed network test bed and a manually created substitute for the integrated patient database.

  3. A DBMS-based Medical Teleconferencing System

    PubMed Central

    Chun, Jonghoon; Kim, Hanjoon; Lee, Sang-goo; Choi, Jinwook; Cho, Hanik

    2001-01-01

    This article presents the design of a medical teleconferencing system that is integrated with a multimedia patient database and incorporates easy-to-use tools and functions to effectively support collaborative work between physicians in remote locations. The design provides a virtual workspace that allows physicians to collectively view various kinds of patient data. By integrating the teleconferencing function into this workspace, physicians are able to conduct conferences using the same interface and have real-time access to the database during conference sessions. The authors have implemented a prototype based on this design. The prototype uses a high-speed network test bed and a manually created substitute for the integrated patient database. PMID:11522766

  4. Development of a real-time clinical decision support system upon the web mvc-based architecture for prostate cancer treatment

    PubMed Central

    2011-01-01

    Background A real-time clinical decision support system (RTCDSS) with interactive diagrams enables clinicians to instantly and efficiently track patients' clinical records (PCRs) and improve their quality of clinical care. We propose a RTCDSS to process online clinical informatics from multiple databases for clinical decision making in the treatment of prostate cancer based on Web Model-View-Controller (MVC) architecture, by which the system can easily be adapted to different diseases and applications. Methods We designed a framework upon the Web MVC-based architecture in which the reusable and extractable models can be conveniently adapted to other hospital information systems and which allows for efficient database integration. Then, we determined the clinical variables of the prostate cancer treatment based on participating clinicians' opinions and developed a computational model to determine the pretreatment parameters. Furthermore, the components of the RTCDSS integrated PCRs and decision factors for real-time analysis to provide evidence-based diagrams upon the clinician-oriented interface for visualization of treatment guidance and health risk assessment. Results The resulting system can improve quality of clinical treatment by allowing clinicians to concurrently analyze and evaluate the clinical markers of prostate cancer patients with instantaneous clinical data and evidence-based diagrams which can automatically identify pretreatment parameters. Moreover, the proposed RTCDSS can aid interactions between patients and clinicians. Conclusions Our proposed framework supports online clinical informatics, evaluates treatment risks, offers interactive guidance, and provides real-time reference for decision making in the treatment of prostate cancer. 
The developed clinician-oriented interface can assist clinicians in conveniently presenting evidence-based information to patients and can be readily adapted to an existing hospital information system and be easily applied in other chronic diseases. PMID:21385459

  5. Development of a real-time clinical decision support system upon the Web MVC-based architecture for prostate cancer treatment.

    PubMed

Lin, Hsueh-Chun; Wu, Hsi-Chin; Chang, Chih-Hung; Li, Tsai-Chung; Liang, Wen-Miin; Wang, Jong-Yi

    2011-03-08

    A real-time clinical decision support system (RTCDSS) with interactive diagrams enables clinicians to instantly and efficiently track patients' clinical records (PCRs) and improve their quality of clinical care. We propose a RTCDSS to process online clinical informatics from multiple databases for clinical decision making in the treatment of prostate cancer based on Web Model-View-Controller (MVC) architecture, by which the system can easily be adapted to different diseases and applications. We designed a framework upon the Web MVC-based architecture in which the reusable and extractable models can be conveniently adapted to other hospital information systems and which allows for efficient database integration. Then, we determined the clinical variables of the prostate cancer treatment based on participating clinicians' opinions and developed a computational model to determine the pretreatment parameters. Furthermore, the components of the RTCDSS integrated PCRs and decision factors for real-time analysis to provide evidence-based diagrams upon the clinician-oriented interface for visualization of treatment guidance and health risk assessment. The resulting system can improve quality of clinical treatment by allowing clinicians to concurrently analyze and evaluate the clinical markers of prostate cancer patients with instantaneous clinical data and evidence-based diagrams which can automatically identify pretreatment parameters. Moreover, the proposed RTCDSS can aid interactions between patients and clinicians. Our proposed framework supports online clinical informatics, evaluates treatment risks, offers interactive guidance, and provides real-time reference for decision making in the treatment of prostate cancer. The developed clinician-oriented interface can assist clinicians in conveniently presenting evidence-based information to patients and can be readily adapted to an existing hospital information system and be easily applied in other chronic diseases.

  6. Time Series Discord Detection in Medical Data using a Parallel Relational Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodbridge, Diane; Rintoul, Mark Daniel; Wilson, Andrew T.

Recent advances in sensor technology have made continuous real-time health monitoring available in both hospital and non-hospital settings. Because high-frequency medical sensors produce huge volumes of data, storing and processing continuous medical data is an emerging big-data area. Detecting anomalies in real time is especially important for detecting and preventing patient emergencies. A time series discord is the subsequence with the maximum distance to the rest of the time series subsequences, meaning that it exhibits abnormal or unusual data trends. In this study, we implemented two versions of a time series discord detection algorithm on a high-performance parallel database management system (DBMS) and applied them to 240 Hz waveform data collected from 9,723 patients. The initial brute-force version of the discord detection algorithm takes each possible subsequence and calculates the distance to its nearest non-self match to find the biggest discords in the time series. For the heuristic version of the algorithm, a combination of an array and a trie structure was applied to order the time series data for better time efficiency. The results showed efficient data loading, decoding, and discord searches in a large amount of data, benefiting from the time series discord detection algorithm and the architectural characteristics of the parallel DBMS, including data compression, data pipelining, and task scheduling.
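The brute-force version described above can be sketched in pure Python. The paper's implementation runs inside a parallel DBMS; the function name and the toy signal here are assumptions for illustration:

```python
import math

def brute_force_discord(series, m):
    """Return (index, score) of the top discord: the length-m
    subsequence whose distance to its nearest non-self match is
    largest. Windows overlapping the candidate are excluded, since
    they would otherwise match it trivially."""
    n = len(series) - m + 1
    best_idx, best_score = -1, -1.0
    for i in range(n):
        nearest = math.inf
        for j in range(n):
            if abs(i - j) < m:          # skip trivial self matches
                continue
            d = math.dist(series[i:i + m], series[j:j + m])
            nearest = min(nearest, d)
        if nearest > best_score:
            best_idx, best_score = i, nearest
    return best_idx, best_score

# A smooth periodic signal with a short injected spike: the discord
# window should land on (or overlap) the spike.
signal = [math.sin(0.2 * t) for t in range(100)]
for t in range(50, 54):
    signal[t] += 3.0
idx, score = brute_force_discord(signal, 8)
assert 43 <= idx <= 53   # window overlaps the anomaly at indices 50-53
```

The quadratic cost of this double loop is exactly what motivates both the paper's heuristic (array plus trie ordering) and running the search inside a parallel DBMS.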

  7. Time Series Discord Detection in Medical Data using a Parallel Relational Database [PowerPoint]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodbridge, Diane; Wilson, Andrew T.; Rintoul, Mark Daniel

Recent advances in sensor technology have made continuous real-time health monitoring available in both hospital and non-hospital settings. Because high-frequency medical sensors produce huge volumes of data, storing and processing continuous medical data is an emerging big-data area. Detecting anomalies in real time is especially important for detecting and preventing patient emergencies. A time series discord is the subsequence with the maximum distance to the rest of the time series subsequences, meaning that it exhibits abnormal or unusual data trends. In this study, we implemented two versions of a time series discord detection algorithm on a high-performance parallel database management system (DBMS) and applied them to 240 Hz waveform data collected from 9,723 patients. The initial brute-force version of the discord detection algorithm takes each possible subsequence and calculates the distance to its nearest non-self match to find the biggest discords in the time series. For the heuristic version of the algorithm, a combination of an array and a trie structure was applied to order the time series data for better time efficiency. The results showed efficient data loading, decoding, and discord searches in a large amount of data, benefiting from the time series discord detection algorithm and the architectural characteristics of the parallel DBMS, including data compression, data pipelining, and task scheduling.

  8. Real-time teleophthalmology versus face-to-face consultation: A systematic review.

    PubMed

    Tan, Irene J; Dobson, Lucy P; Bartnik, Stephen; Muir, Josephine; Turner, Angus W

    2017-08-01

Introduction Advances in imaging capabilities and the evolution of real-time teleophthalmology have the potential to provide increased coverage to areas with limited ophthalmology services. However, there is limited research assessing the diagnostic accuracy of real-time teleophthalmology against face-to-face consultation. This systematic review aims to determine whether real-time teleophthalmology provides accuracy comparable to face-to-face consultation for the diagnosis of common eye conditions. Methods A search of the PubMed, Embase, Medline and Cochrane databases and manual citation review was conducted on 6 February and 7 April 2016. Included studies involved real-time telemedicine in the field of ophthalmology or optometry and assessed diagnostic accuracy against gold-standard face-to-face consultation. The revised quality assessment of diagnostic accuracy studies (QUADAS-2) tool assessed risk of bias. Results Twelve studies were included, with participants ranging from four to 89 years old. A broad range of conditions was assessed, including corneal and retinal pathologies, strabismus, oculoplastics and post-operative review. Quality assessment identified a high or unclear risk of bias in patient selection (75%) due to undisclosed recruitment processes. The index test showed a high risk of bias in the included studies, due to the varied interpretation and conduct of real-time teleophthalmology methods. Reference standard risk was overall low (75%), as was the risk due to flow and timing (75%). Conclusion In terms of diagnostic accuracy, real-time teleophthalmology was considered superior to face-to-face consultation in one study and comparable in six studies. Store-and-forward image transmission coupled with real-time videoconferencing is a suitable alternative where internet transmission speeds are poor.

  9. Object tracking with stereo vision

    NASA Technical Reports Server (NTRS)

    Huber, Eric

    1994-01-01

    A real-time active stereo vision system incorporating gaze control and task directed vision is described. Emphasis is placed on object tracking and object size and shape determination. Techniques include motion-centroid tracking, depth tracking, and contour tracking.

  10. A novel framework for intelligent surveillance system based on abnormal human activity detection in academic environments.

    PubMed

    Al-Nawashi, Malek; Al-Hazaimeh, Obaida M; Saraee, Mohamad

    2017-01-01

Abnormal activity detection plays a crucial role in surveillance applications, and a surveillance system that can perform robustly in an academic environment has become an urgent need. In this paper, we propose a novel framework for an automatic real-time video-based surveillance system which can simultaneously perform tracking, semantic scene learning, and abnormality detection in an academic environment. To develop our system, we divided the work into three phases: a preprocessing phase, an abnormal human activity detection phase, and a content-based image retrieval phase. For moving-object detection, we used the temporal-differencing algorithm and then located the motion regions using a Gaussian function. Furthermore, a shape model based on the OMEGA equation was used as a filter to separate the detected objects into human and non-human. For object activity analysis, we evaluated and analyzed the human activities of the detected objects, classifying them into two groups, normal and abnormal activities, with a support vector machine. The system then provides an automatic warning in case of abnormal human activities. It also embeds a method to retrieve the detected object from the database for object recognition and identification using content-based image retrieval. Finally, a software-based simulation using MATLAB was performed, and the results of the conducted experiments showed an excellent surveillance system that can simultaneously perform tracking, semantic scene learning, and abnormality detection in an academic environment with no human intervention.
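The temporal-differencing step can be sketched in miniature. This toy version thresholds per-pixel intensity change between consecutive grayscale frames and boxes the result; the Gaussian localization and OMEGA-based shape filtering described in the abstract are omitted:

```python
def frame_difference(prev, curr, threshold=30):
    """Temporal differencing: mark pixels whose intensity changed by
    more than `threshold` between consecutive grayscale frames."""
    return [[1 if abs(c - p) > threshold else 0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

def bounding_box(mask):
    """Smallest (top, left, bottom, right) box around changed pixels."""
    ys = [y for y, row in enumerate(mask) if any(row)]
    xs = [x for row in mask for x, v in enumerate(row) if v]
    return (min(ys), min(xs), max(ys), max(xs)) if ys else None

# A 6x6 scene in which a bright 2x2 object moves one pixel to the right.
prev = [[0] * 6 for _ in range(6)]
curr = [[0] * 6 for _ in range(6)]
for y in (2, 3):
    prev[y][1] = prev[y][2] = 200
    curr[y][2] = curr[y][3] = 200
mask = frame_difference(prev, curr)
print(bounding_box(mask))  # → (2, 1, 3, 3)
```

Note that differencing flags both the vacated and newly occupied pixels (the overlap cancels), which is why a localization step such as the paper's Gaussian fitting follows it.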

  11. The programming language HAL: A specification

    NASA Technical Reports Server (NTRS)

    1971-01-01

HAL accomplishes three significant objectives: (1) increased readability, through the use of a natural two-dimensional mathematical format; (2) increased reliability, by providing for selective recognition of common data and subroutines, and by incorporating specific data-protect features; (3) real-time control facility, by including a comprehensive set of real-time control commands and signal conditions. Although HAL is designed primarily for programming on-board computers, it is general enough to meet nearly all the needs in the production, verification and support of aerospace and other real-time applications.

  12. A near-real-time full-parallax holographic display for remote operations

    NASA Technical Reports Server (NTRS)

    Iavecchia, Helene P.; Huff, Lloyd; Marzwell, Neville I.

    1991-01-01

    A near real-time, full parallax holographic display system was developed that has the potential to provide a 3-D display for remote handling operations in hazardous environments. The major components of the system consist of a stack of three spatial light modulators which serves as the object source of the hologram; a near real-time holographic recording material (such as thermoplastic and photopolymer); and an optical system for relaying SLM images to the holographic recording material and to the observer for viewing.

  13. Converting optical scanning holograms of real objects to binary Fourier holograms using an iterative direct binary search algorithm.

    PubMed

    Leportier, Thibault; Park, Min Chul; Kim, You Seok; Kim, Taegeun

    2015-02-09

    In this paper, we present a three-dimensional holographic imaging system. The proposed approach records a complex hologram of a real object using optical scanning holography, converts the complex form to binary data, and then reconstructs the recorded hologram using a spatial light modulator (SLM). The conversion from the recorded hologram to a binary hologram is achieved using a direct binary search algorithm. We present experimental results that verify the efficacy of our approach. To the best of our knowledge, this is the first time that a hologram of a real object has been reconstructed using a binary SLM.
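The iterative direct binary search step can be sketched in miniature. The cost function below (squared error against the target field inside a small reconstruction-plane "signal window", evaluated with a naive DFT) and the 8x8 grid are illustrative assumptions, not the paper's exact formulation; what the sketch shows is the core DBS mechanic: flip a pixel, keep the flip only if the error drops.

```python
import cmath
import random

N = 8
random.seed(1)
# Hypothetical "recorded" complex hologram (a stand-in for data from
# optical scanning holography).
target = [[complex(random.uniform(-1, 1), random.uniform(-1, 1))
           for _ in range(N)] for _ in range(N)]

def field_at(holo, p, q):
    """One reconstruction-plane sample: a single 2D DFT coefficient."""
    return sum(holo[u][v] * cmath.exp(-2j * cmath.pi * (p * u + q * v) / N)
               for u in range(N) for v in range(N))

# Error is measured only inside a small signal window of the
# reconstruction plane, so pixel flips interact with one another.
window = [(p, q) for p in range(2, 6) for q in range(2, 6)]
ref = {pq: field_at(target, *pq) for pq in window}

def error(binary):
    return sum(abs(field_at(binary, p, q) - ref[(p, q)]) ** 2
               for p, q in window)

# Direct binary search: flip one pixel at a time and keep the flip
# only if the windowed reconstruction error decreases.
binary = [[random.choice([0, 1]) for _ in range(N)] for _ in range(N)]
initial = current = error(binary)
for u in range(N):
    for v in range(N):
        binary[u][v] ^= 1
        trial = error(binary)
        if trial < current:
            current = trial
        else:
            binary[u][v] ^= 1   # revert a non-improving flip
assert current <= initial
```

Because only improving flips are accepted, the error is monotonically non-increasing; in practice the sweep over all pixels is repeated until no flip helps, at which point the binary pattern is sent to the binary SLM.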

  14. Real-time database drawn from an electronic health record for a thoracic surgery unit: high-quality clinical data saving time and human resources.

    PubMed

    Salati, Michele; Pompili, Cecilia; Refai, Majed; Xiumè, Francesco; Sabbatini, Armando; Brunelli, Alessandro

    2014-06-01

The aim of the present study was to verify whether the implementation of an electronic health record (EHR) in our thoracic surgery unit allows creation of a high-quality clinical database while saving time and costs. Before August 2011, multiple individuals compiled the on-paper documents/records and a single data manager entered selected data into the database (traditional database, tDB). Since the adoption of an EHR in August 2011, multiple individuals have been responsible for compiling the EHR, which automatically generates a real-time database (EHR-based database, eDB), without the need for a data manager. During the initial period of implementation of the EHR, periodic meetings were held with all physicians involved in its use in order to monitor and standardize the data registration process. Data quality of the first 100 anatomical lung resections recorded in the eDB was assessed by measuring the total number of missing values (MVs: existing but unreported values) and inaccurate values (wrong data) occurring in 95 core variables. The average MV of the eDB was compared with that occurring in the same variables of the last 100 records registered in the tDB. A learning curve was constructed by plotting the number of MVs in the eDB and tDB with the patients arranged by date of registration. The tDB and eDB had similar MVs (0.74 vs 1, P = 0.13). The learning curve showed an initial phase of about 35 records, where the MV in the eDB was higher than that in the tDB (1.9 vs 0.74, P = 0.03), and a subsequent phase, where the MV was similar in the two databases (0.7 vs 0.74, P = 0.6). The inaccuracy rate across these two phases in the eDB was stable (0.5 vs 0.3, P = 0.3). Using the EHR saved an average of 9 min per patient, totalling 15 h for obtaining a dataset of 100 patients with respect to the tDB. The implementation of the EHR allowed streamlining of the process of clinical data recording.
It saved time and human resource costs, without compromising the quality of data.

  15. Object-oriented integrated approach for the design of scalable ECG systems.

    PubMed

    Boskovic, Dusanka; Besic, Ingmar; Avdagic, Zikrija

    2009-01-01

The paper presents the implementation of Object-Oriented (OO) integrated approaches to the design of scalable Electro-Cardio-Graph (ECG) systems. The purpose of this methodology is to preserve real-world structure and relations with the aim of minimizing information loss during the modeling process, especially for Real-Time (RT) systems. We report on a case study of a design that integrates OO and RT methods with the Unified Modeling Language (UML) standard notation. OO methods identify objects in the real-world domain and use them as fundamental building blocks for the software system. The experience gained from the strongly defined semantics of the object model is discussed and related problems are analyzed.

  16. An Adaptive Navigation Support System for Conducting Context-Aware Ubiquitous Learning in Museums

    ERIC Educational Resources Information Center

    Chiou, Chuang-Kai; Tseng, Judy C. R.; Hwang, Gwo-Jen; Heller, Shelly

    2010-01-01

    In context-aware ubiquitous learning, students are guided to learn in the real world with personalized supports from the learning system. As the learning resources are realistic objects in the real world, certain physical constraints, such as the limitation of stream of people who visit the same learning object, the time for moving from one object…

  17. XTCE GOVSAT Tool Suite 1.0

    NASA Technical Reports Server (NTRS)

    Rice, J. Kevin

    2013-01-01

The XTCE GOVSAT software suite contains three tools: validation, search, and reporting. The Extensible Markup Language (XML) Telemetric and Command Exchange (XTCE) GOVSAT Tool Suite is written in Java for manipulating XTCE XML files. XTCE is a Consultative Committee for Space Data Systems (CCSDS) and Object Management Group (OMG) specification for describing the format and information in telemetry and command packet streams. These descriptions are files that are used to configure real-time telemetry and command systems for mission operations. XTCE's purpose is to exchange database information between different systems. XTCE GOVSAT consists of rules for narrowing the use of XTCE for missions. The Validation Tool is used to syntax-check GOVSAT XML files. The Search Tool is used to search the GOVSAT XML files (e.g., for command and telemetry mnemonics) and view the results. Finally, the Reporting Tool is used to create command and telemetry reports. These reports can be displayed or printed for use by the operations team.

  18. Molecular surface representation using 3D Zernike descriptors for protein shape comparison and docking.

    PubMed

    Kihara, Daisuke; Sael, Lee; Chikhi, Rayan; Esquivel-Rodriguez, Juan

    2011-09-01

The tertiary structures of proteins have been solved at an increasing pace in recent years. To capitalize on the enormous effort invested in accumulating these structure data, efficient and effective computational methods need to be developed for comparing and searching protein structures and investigating their interactions. We introduce the 3D Zernike descriptor (3DZD), an emerging technique for describing molecular surfaces. The 3DZD is a series expansion of a three-dimensional mathematical function, so a tertiary structure is represented compactly by a vector of coefficients of the terms in the series. A strong advantage of the 3DZD is that it is invariant to rotation of the target object. These two characteristics, compactness and rotation invariance, allow rapid comparison of surface shapes, which is sufficient for real-time structure database screening. In this article, we review various applications of the 3DZD that have recently been proposed.
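The "vector of coefficients" comparison can be sketched directly: because every structure reduces to a fixed-length, rotation-invariant vector, database screening becomes a plain nearest-neighbor search. The descriptor values and structure IDs below are made up for illustration (real 3DZDs typically have on the order of a hundred components):

```python
import math

def screen(query, database, k=2):
    """Rank database entries by Euclidean distance between descriptor
    vectors and return the top-k structure IDs. No alignment or
    rotation search is needed, since the descriptors are already
    rotation-invariant."""
    ranked = sorted(database, key=lambda item: math.dist(query, item[1]))
    return [name for name, _ in ranked[:k]]

# Hypothetical 5-component descriptors.
db = [
    ("1abc", [0.9, 0.1, 0.3, 0.7, 0.2]),
    ("2xyz", [0.1, 0.8, 0.9, 0.2, 0.6]),
    ("3pqr", [0.88, 0.12, 0.28, 0.71, 0.19]),  # near-duplicate of 1abc
]
print(screen([0.9, 0.1, 0.3, 0.7, 0.2], db))  # → ['1abc', '3pqr']
```

This linear scan over short vectors is what makes real-time screening of whole structure databases feasible.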

  19. A computational approach to real-time image processing for serial time-encoded amplified microscopy

    NASA Astrophysics Data System (ADS)

    Oikawa, Minoru; Hiyama, Daisuke; Hirayama, Ryuji; Hasegawa, Satoki; Endo, Yutaka; Sugie, Takahisa; Tsumura, Norimichi; Kuroshima, Mai; Maki, Masanori; Okada, Genki; Lei, Cheng; Ozeki, Yasuyuki; Goda, Keisuke; Shimobaba, Tomoyoshi

    2016-03-01

High-speed imaging is an indispensable technique, particularly for identifying or analyzing fast-moving objects. The serial time-encoded amplified microscopy (STEAM) technique was proposed to capture images with a frame rate 1,000 times faster than conventional methods such as CCD (charge-coupled device) cameras. Applying this high-speed STEAM imaging technique to a real-time system, such as flow cytometry for a cell-sorting system, requires successively processing a large number of captured images with high throughput in real time. We are now developing a high-speed flow cytometer system including a STEAM camera. In this paper, we describe our approach to processing these large amounts of image data in real time. We use an analog-to-digital converter with up to 7.0 Gsamples/s and 8-bit resolution to capture the output voltage signal, which encodes grayscale images from the STEAM camera; the direct data output from the camera is therefore a continuous 7.0 GB/s stream. We provided a field-programmable gate array (FPGA) device as a digital signal pre-processor for image reconstruction and for finding objects in a microfluidic channel at high data rates in real time. We also utilized graphics processing unit (GPU) devices to accelerate the identification of the reconstructed images. We built our prototype system, which includes a STEAM camera, an FPGA device and a GPU device, and evaluated its performance in real-time identification of small particles (beads), as stand-ins for biological cells, flowing through a microfluidic channel.

  20. Eye center localization and gaze gesture recognition for human-computer interaction.

    PubMed

    Zhang, Wenhao; Smith, Melvyn L; Smith, Lyndon N; Farooq, Abdul

    2016-03-01

This paper introduces an unsupervised modular approach for accurate and real-time eye center localization in images and videos, thus allowing a coarse-to-fine, global-to-regional scheme. The trajectories of eye centers in consecutive frames, i.e., gaze gestures, are further analyzed, recognized, and employed to boost the human-computer interaction (HCI) experience. This modular approach makes use of isophote and gradient features to estimate the eye center locations. A selective oriented gradient filter has been specifically designed to remove strong gradients from eyebrows, eye corners, and shadows, which sabotage most eye center localization methods. A real-world implementation utilizing these algorithms has been designed in the form of an interactive advertising billboard to demonstrate the effectiveness of our method for HCI. The eye center localization algorithm has been compared with 10 other algorithms on the BioID database and six other algorithms on the GI4E database. It outperforms all the other algorithms in comparison in terms of localization accuracy. Further tests on the extended Yale Face Database B and self-collected data have proved this algorithm to be robust against moderate head poses and poor illumination conditions. The interactive advertising billboard has manifested outstanding usability and effectiveness in our tests and shows great potential for benefiting a wide range of real-world HCI applications.

  1. A Framework of Simple Event Detection in Surveillance Video

    NASA Astrophysics Data System (ADS)

    Xu, Weiguang; Zhang, Yafei; Lu, Jianjiang; Tian, Yulong; Wang, Jiabao

    Video surveillance plays an increasingly important role in people's social lives. Real-time alerting of threatening events and searching for interesting content in large stores of video footage require a human operator to pay full attention to a monitor for long periods. This labor-intensive mode has limited the effectiveness and efficiency of such systems. A framework for simple event detection is presented to advance the automation of video surveillance. An improved inner key point matching approach is used to compensate for background motion in real time; frame differencing is used to detect the foreground; HOG-based classifiers are used to classify foreground objects into people and cars; mean-shift is used to track the recognized objects. Events are detected based on predefined rules. The maturity of the algorithms guarantees the robustness of the framework, and the improved approach and the easily checked rules enable the framework to work in real time. Future work is also discussed.
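    The frame-differencing step in the pipeline above is the simplest of the listed components and can be sketched in a few lines (thresholds and frame contents are illustrative, not from the paper): pixels whose intensity changes between consecutive frames beyond a threshold are marked as foreground.

```python
import numpy as np

# Two consecutive grayscale frames; an object appears in the second one
prev = np.zeros((60, 80), dtype=np.uint8)
curr = prev.copy()
curr[20:30, 40:55] = 200                 # a bright moving object

# Absolute difference in a signed dtype to avoid uint8 wrap-around
diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
foreground = diff > 30                   # illustrative threshold
ys, xs = np.nonzero(foreground)
print(int(foreground.sum()),
      (int(ys.min()), int(ys.max()), int(xs.min()), int(xs.max())))
# 150 (20, 29, 40, 54): 150 changed pixels inside that bounding box
```

    The resulting mask (after background-motion compensation) is what the HOG classifiers and mean-shift tracker would consume downstream.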

  2. The Internet Compendium: Subject Guides to Humanities Resources.

    ERIC Educational Resources Information Center

    Rosenfeld, Louis; And Others

    This guide describes and evaluates the Internet's humanities resources by subject. It offers information on a multitude of listservs; Usenet newsgroups; forums; electronic journals; topical mailing lists; text archives; Freenets; bulletin boards; FAQs; newsletters; real-time chats; databases; and library catalogs. Internet users can draw upon…

  3. Stereoscopic wide field of view imaging system

    NASA Technical Reports Server (NTRS)

    Prechtl, Eric F. (Inventor); Sedwick, Raymond J. (Inventor); Jonas, Eric M. (Inventor)

    2011-01-01

    A stereoscopic imaging system incorporates a plurality of imaging devices or cameras to generate a high resolution, wide field of view image database from which images can be combined in real time to provide wide field of view or panoramic or omni-directional still or video images.

  4. Single-vehicle crashes along rural mountainous highways in Malaysia: An application of random parameters negative binomial model.

    PubMed

    Rusli, Rusdi; Haque, Md Mazharul; King, Mark; Voon, Wong Shaw

    2017-05-01

    Mountainous highways are generally associated with complex driving environments because of constrained road geometries, limited cross-section elements, inappropriate roadside features, and adverse weather conditions. As a result, single-vehicle (SV) crashes are overrepresented along mountainous roads, particularly in developing countries, but little is known about the roadway geometric, traffic and weather factors contributing to these SV crashes. As such, the main objective of the present study is to investigate SV crashes using detailed data obtained from a rigorous site survey and existing databases. The final dataset included a total of 56 variables representing road geometries including horizontal and vertical alignment, traffic characteristics, real-time weather conditions, cross-sectional elements, roadside features, and spatial characteristics. To account for structured heterogeneities resulting from multiple observations within a site and other unobserved heterogeneities, the study applied a random parameters negative binomial model. Results suggest that rainfall during the crash is positively associated with SV crashes, but real-time visibility is negatively associated. The presence of a road shoulder, particularly a bitumen shoulder or wider shoulders, along mountainous highways is associated with fewer SV crashes. While speeding along downgrade slopes increases the likelihood of SV crashes, proper delineation decreases the likelihood. Findings of this study have significant implications for designing safer highways in mountainous areas, particularly in the context of a developing country. Copyright © 2017 Elsevier Ltd. All rights reserved.
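    The negative binomial family used above is standard for crash counts because its variance exceeds the mean (Var = μ + αμ², the NB2 form). A minimal simulation shows that structure and recovers the overdispersion parameter by the method of moments; the numbers are invented for illustration and this is not the authors' random-parameters model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate NB2 crash counts per site as a Poisson-Gamma mixture:
# mean mu, overdispersion alpha, so Var = mu + alpha * mu**2
mu, alpha, n_sites = 5.0, 0.5, 20000
lam = rng.gamma(shape=1 / alpha, scale=mu * alpha, size=n_sites)
counts = rng.poisson(lam)

# Method-of-moments estimate of the overdispersion parameter
m, v = counts.mean(), counts.var()
alpha_hat = (v - m) / m ** 2
print(round(alpha_hat, 2))   # close to the true alpha = 0.5
```

    A positive α̂ like this is the statistical signature that motivates a negative binomial rather than Poisson specification.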

  5. The automated data processing architecture for the GPI Exoplanet Survey

    NASA Astrophysics Data System (ADS)

    Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Graham, James R.; Macintosh, Bruce

    2017-09-01

    The Gemini Planet Imager Exoplanet Survey (GPIES) is a multi-year direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the GPIES Data Cruncher, combines multiple data reduction pipelines together to intelligently process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow-up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our data reduction pipelines. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.
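    The backend file index described above boils down to a table of data products keyed by target and processing state. A hypothetical minimal version (using SQLite here for self-containment; table and column names are illustrative, not the GPIES MySQL schema):

```python
import sqlite3

# Tiny in-memory index of raw and reduced survey files
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE files (
    path TEXT PRIMARY KEY, target TEXT, obs_date TEXT, reduced INTEGER)""")
rows = [
    ("raw/file_0001.fits", "HD 95086", "2017-09-01", 0),
    ("reduced/file_0001_spdc.fits", "HD 95086", "2017-09-01", 1),
    ("reduced/file_0042_spdc.fits", "51 Eri", "2017-09-02", 1),
]
con.executemany("INSERT INTO files VALUES (?,?,?,?)", rows)

# "Which reduced products exist for this target?" -- the kind of lookup a
# front-end browser or quicklook display would issue.
cur = con.execute(
    "SELECT path FROM files WHERE target=? AND reduced=1", ("HD 95086",))
hits = [r[0] for r in cur]
print(hits)   # ['reduced/file_0001_spdc.fits']
```

    Indexing every product as it is written is what lets the Data Cruncher hand observers fully reduced data within the hour.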

  6. Real-time target tracking of soft tissues in 3D ultrasound images based on robust visual information and mechanical simulation.

    PubMed

    Royer, Lucas; Krupa, Alexandre; Dardenne, Guillaume; Le Bras, Anthony; Marchand, Eric; Marchal, Maud

    2017-01-01

    In this paper, we present a real-time approach that allows tracking deformable structures in 3D ultrasound sequences. Our method consists in obtaining the target displacements by combining robust dense motion estimation and mechanical model simulation. We perform evaluation of our method through simulated data, phantom data, and real-data. Results demonstrate that this novel approach has the advantage of providing correct motion estimation regarding different ultrasound shortcomings including speckle noise, large shadows and ultrasound gain variation. Furthermore, we show the good performance of our method with respect to state-of-the-art techniques by testing on the 3D databases provided by MICCAI CLUST'14 and CLUST'15 challenges. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Implicit Multibody Penalty-Based Distributed Contact.

    PubMed

    Xu, Hongyi; Zhao, Yili; Barbic, Jernej

    2014-09-01

    The penalty method is a simple and popular approach to resolving contact in computer graphics and robotics. Penalty-based contact, however, suffers from stability problems due to the highly variable and unpredictable net stiffness, and this is particularly pronounced in simulations with time-varying distributed geometrically complex contact. We employ semi-implicit integration, exact analytical contact gradients, symbolic Gaussian elimination and an SVD solver to simulate stable penalty-based frictional contact with large, time-varying contact areas, involving many rigid objects and articulated rigid objects in complex conforming contact and self-contact. We also derive implicit proportional-derivative control forces for real-time control of articulated structures with loops. We present challenging contact scenarios such as screwing a hexbolt into a hole, bowls stacked in perfectly conforming configurations, and manipulating many objects using actively controlled articulated mechanisms in real time.
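    The core idea of penalty-based contact is easy to show in one dimension: penetration of depth d produces a restoring spring-damper force F = k·d − c·v. A minimal sketch with semi-implicit Euler (the gains, mass, and time step are illustrative, not taken from the paper):

```python
# A ball dropped onto the floor at y = 0; penetration (y < 0) produces a
# penalty spring-damper force. Semi-implicit Euler updates velocity first,
# which is what keeps stiff penalty springs stable at this time step.
k, c, m, g, dt = 5e4, 50.0, 1.0, 9.81, 1e-3
y, v = 1.0, 0.0                          # start 1 m above the floor, at rest
for _ in range(5000):                    # simulate 5 seconds
    d = max(0.0, -y)                     # penetration depth
    f = k * d - c * v if d > 0 else 0.0  # penalty contact force
    v += dt * (f / m - g)                # velocity update first
    y += dt * v                          # then position
print(y)                                 # settles just below the surface
```

    The final resting penetration is m·g/k, which is exactly the "unpredictable net stiffness" trade-off the abstract describes: stiffer springs mean less penetration but harder integration.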

  8. Real-Time Point Positioning Performance Evaluation of Single-Frequency Receivers Using NASA's Global Differential GPS System

    NASA Technical Reports Server (NTRS)

    Muellerschoen, Ronald J.; Iijima, Byron; Meyer, Robert; Bar-Sever, Yoaz; Accad, Elie

    2004-01-01

    This paper evaluates the performance of a single-frequency receiver using the 1-Hz differential corrections as provided by NASA's global differential GPS system. While the dual-frequency user has the ability to eliminate the ionosphere error by taking a linear combination of observables, the single-frequency user must remove or calibrate this error by other means. To remove the ionosphere error, we take advantage of the fact that the first-order ionospheric group delay on the range observable and the carrier-phase advance are equal in magnitude but opposite in sign. A way to calibrate this error is to use a real-time database of grid points computed by JPL's RTI (Real-Time Ionosphere) software. In both cases we evaluate the positional accuracy of a kinematic carrier phase based point positioning method on a global extent.
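    The equal-magnitude, opposite-sign property means the half-sum of code and carrier-phase observables cancels the first-order ionosphere term (this half-sum is commonly known as the GRAPHIC combination; the numbers below are illustrative meters, not real observations):

```python
# The ionosphere delays the code (range) observable and advances the carrier
# phase by the same amount I, so (P + Phi)/2 cancels the first-order term.
rho = 20_000_000.0     # geometric range plus clock terms (m), illustrative
I = 7.5                # first-order ionospheric delay (m), illustrative

P = rho + I            # pseudorange: delayed by the ionosphere
Phi = rho - I          # carrier phase in meters: advanced by the ionosphere

graphic = 0.5 * (P + Phi)
print(graphic == rho)  # True: the ionosphere term cancels exactly
```

    The price of the combination is that it inherits the carrier-phase ambiguity, which is why the paper's kinematic point positioning still needs to handle phase biases.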

  9. Implementation of a data packet generator using pattern matching for wearable ECG monitoring systems.

    PubMed

    Noh, Yun Hong; Jeong, Do Un

    2014-07-15

    In this paper, a packet generator using a pattern matching algorithm for real-time abnormal heartbeat detection is proposed. The packet generator creates a very small data packet that conveys sufficient crucial information for health condition analysis. The data packet encapsulates real-time ECG signals and transmits them to a smartphone via Bluetooth. An Android application was developed specifically to decode the packet and extract ECG information for health condition analysis. Several graphical presentations are displayed on the smartphone. We evaluated the performance of abnormal heartbeat detection accuracy using the MIT/BIH Arrhythmia Database and real-time experiments. The experimental results confirm that abnormal heartbeat detection is practically possible. We also performed data compression ratio and signal restoration performance evaluations to establish the usefulness of the proposed packet generator, and the results were excellent.
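    A compact binary packet of the kind described can be sketched with Python's `struct` module. The layout below (sync byte, beat-class code from the pattern matcher, heart rate, timestamp) is entirely hypothetical, not the authors' format:

```python
import struct

# Illustrative 8-byte packet: header, beat class, bpm, epoch seconds
PACKET_FMT = "<BBHI"   # little-endian, no padding

def make_packet(beat_class: int, bpm: int, t: int) -> bytes:
    return struct.pack(PACKET_FMT, 0xAA, beat_class, bpm, t)

def parse_packet(raw: bytes):
    header, beat_class, bpm, t = struct.unpack(PACKET_FMT, raw)
    assert header == 0xAA, "bad sync byte"
    return beat_class, bpm, t

pkt = make_packet(beat_class=2, bpm=78, t=1_700_000_000)
print(len(pkt), parse_packet(pkt))   # 8 (2, 78, 1700000000)
```

    Sending a few bytes per beat instead of the raw waveform is what makes continuous Bluetooth transmission to a smartphone practical.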

  10. Georgia's Surface-Water Resources and Streamflow Monitoring Network, 2006

    USGS Publications Warehouse

    Nobles, Patricia L.; ,

    2006-01-01

    The U.S. Geological Survey (USGS) network of 223 real-time monitoring stations, the 'Georgia HydroWatch,' provides real-time water-stage data, with streamflow computed at 198 locations, and rainfall recorded at 187 stations. These sites continuously record data on 15-minute intervals and transmit the data via satellite to be incorporated into the USGS National Water Information System database. These data are automatically posted to the USGS Web site for public dissemination (http://waterdata.usgs.gov/ga/nwis/nwis). The real-time capability of this network provides information to help emergency-management officials protect human life and property during floods, and mitigate the effects of prolonged drought. The map at right shows the USGS streamflow monitoring network for Georgia and major watersheds. Streamflow is monitored at 198 sites statewide, more than 80 percent of which include precipitation gages. Various Federal, State, and local agencies fund these streamflow monitoring stations.

  11. Design and deployment of hybrid-telemedicine applications

    NASA Astrophysics Data System (ADS)

    Ikhu-Omoregbe, N. A.; Atayero, A. A.; Ayo, C. K.; Olugbara, O. O.

    2005-01-01

    With advances in, and the availability of, information and communication technology infrastructures in some nations and institutions, patients are now able to receive healthcare services from doctors and healthcare centers even when they are physically separated. The availability and transfer of patient data, which often include medical images for specialist opinion, is invaluable both to the patient and to the medical practitioner in a telemedicine session. Two existing approaches to telemedicine are real-time and store-and-forward. The real-time approach requires the availability or development of video-conferencing infrastructures, which are expensive, especially for most developing nations of the world, while store-and-forward allows data transmission between any hospitals with a computer and a landline telephone link, which is less expensive but introduces delays. We therefore propose a hybrid design of applications using a hypermedia database capable of harnessing the features of both real-time and store-and-forward telemedicine, deployed over a wireless Virtual Private Network for the participating centers and healthcare providers.

  12. Detecting changes in real-world objects: The relationship between visual long-term memory and change blindness.

    PubMed

    Brady, Timothy F; Konkle, Talia; Oliva, Aude; Alvarez, George A

    2009-01-01

    A large body of literature has shown that observers often fail to notice significant changes in visual scenes, even when these changes happen right in front of their eyes. For instance, people often fail to notice if their conversation partner is switched to another person, or if large background objects suddenly disappear.1,2 These 'change blindness' studies have led to the inference that the amount of information we remember about each item in a visual scene may be quite low.1 However, in recent work we have demonstrated that long-term memory is capable of storing a massive number of visual objects with significant detail about each item.3 In the present paper we attempt to reconcile these findings by demonstrating that observers do not experience 'change blindness' with the real world objects used in our previous experiment if they are given sufficient time to encode each item. The results reported here suggest that one of the major causes of change blindness for real-world objects is a lack of encoding time or attention to each object (see also refs. 4 and 5).

  13. Reporting to Improve Reproducibility and Facilitate Validity Assessment for Healthcare Database Studies V1.0.

    PubMed

    Wang, Shirley V; Schneeweiss, Sebastian; Berger, Marc L; Brown, Jeffrey; de Vries, Frank; Douglas, Ian; Gagne, Joshua J; Gini, Rosa; Klungel, Olaf; Mullins, C Daniel; Nguyen, Michael D; Rassen, Jeremy A; Smeeth, Liam; Sturkenboom, Miriam

    2017-09-01

    Defining a study population and creating an analytic dataset from longitudinal healthcare databases involves many decisions. Our objective was to catalogue scientific decisions underpinning study execution that should be reported to facilitate replication and enable assessment of validity of studies conducted in large healthcare databases. We reviewed key investigator decisions required to operate a sample of macros and software tools designed to create and analyze analytic cohorts from longitudinal streams of healthcare data. A panel of academic, regulatory, and industry experts in healthcare database analytics discussed and added to this list. Evidence generated from large healthcare encounter and reimbursement databases is increasingly being sought by decision-makers. Varied terminology is used around the world for the same concepts. Agreeing on terminology and which parameters from a large catalogue are the most essential to report for replicable research would improve transparency and facilitate assessment of validity. At a minimum, reporting for a database study should provide clarity regarding operational definitions for key temporal anchors and their relation to each other when creating the analytic dataset, accompanied by an attrition table and a design diagram. A substantial improvement in reproducibility, rigor and confidence in real world evidence generated from healthcare databases could be achieved with greater transparency about operational study parameters used to create analytic datasets from longitudinal healthcare databases. © 2017 The Authors. Pharmacoepidemiology & Drug Safety Published by John Wiley & Sons Ltd.

  14. Data Rods: High Speed, Time-Series Analysis of Massive Cryospheric Data Sets Using Object-Oriented Database Methods

    NASA Astrophysics Data System (ADS)

    Liang, Y.; Gallaher, D. W.; Grant, G.; Lv, Q.

    2011-12-01

    Change over time is the central driver of climate change detection. The goal is to diagnose the underlying causes and make projections into the future. In an effort to optimize this process we have developed the Data Rod model, an object-oriented approach that provides the ability to query grid cell changes and their relationships to neighboring grid cells through time. The time series data are organized in time-centric structures called "data rods." A single data rod can be pictured as the multi-spectral data history at one grid cell: a vertical column of data through time. This resolves the long-standing problem of managing time-series data and opens new possibilities for temporal data analysis. This structure enables rapid time-centric analysis at any grid cell across multiple sensors and satellite platforms. Collections of data rods can be spatially and temporally filtered, statistically analyzed, and aggregated for use with pattern matching algorithms. Likewise, individual image pixels can be extracted to generate multi-spectral imagery at any spatial and temporal location. The Data Rods project has created a series of prototype databases to store and analyze massive datasets containing multi-modality remote sensing data. Using object-oriented technology, this method overcomes the operational limitations of traditional relational databases. To demonstrate the speed and efficiency of time-centric analysis using the Data Rods model, we have developed a sea ice detection algorithm. This application determines the concentration of sea ice in a small spatial region across a long temporal window. If performed using traditional analytical techniques, this task would typically require extensive data downloads and spatial filtering. Using Data Rods databases, the exact spatio-temporal data set is immediately available: no extraneous data are downloaded, and all data querying occurs transparently on the server side. Moreover, fundamental statistical calculations such as running averages are easily implemented against the time-centric columns of data.
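    The data-rod idea can be made concrete in a few lines: with the archive held as a (time, row, col) array, extracting a rod is a single strided slice, and a running mean over the rod is a short convolution. The array shapes here are illustrative, not the project's actual datasets.

```python
import numpy as np

rng = np.random.default_rng(1)
archive = rng.random((365, 100, 100))   # one year of daily 100x100 grids

rod = archive[:, 42, 17]                # the vertical column through time
window = 7
running_mean = np.convolve(rod, np.ones(window) / window, mode="valid")

print(rod.shape, running_mean.shape)    # (365,) (359,)
```

    Storing data rod-first means this slice is contiguous on disk, which is the access-pattern advantage over row-per-image relational storage that the abstract describes.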

  15. CB Database: A change blindness database for objects in natural indoor scenes.

    PubMed

    Sareen, Preeti; Ehinger, Krista A; Wolfe, Jeremy M

    2016-12-01

    Change blindness has been a topic of interest in cognitive sciences for decades. Change detection experiments are frequently used for studying various research topics such as attention and perception. However, creating change detection stimuli is tedious and there is no open repository of such stimuli using natural scenes. We introduce the Change Blindness (CB) Database with object changes in 130 colored images of natural indoor scenes. The size and eccentricity are provided for all the changes, as well as reaction time data from a baseline experiment. In addition, we have two specialized satellite databases that are subsets of the 130 images. In one set, changes are seen in rooms or in mirrors in those rooms (Mirror Change Database). In the other, changes occur in a room or out a window (Window Change Database). Both sets have controlled background, change size, and eccentricity. The CB Database is intended to provide researchers with a stimulus set of natural scenes with defined stimulus parameters that can be used for a wide range of experiments. The CB Database can be found at http://search.bwh.harvard.edu/new/CBDatabase.html

  16. The influence of the Bible geographic objects peculiarities on the concept of the spatiotemporal geoinformation system

    NASA Astrophysics Data System (ADS)

    Linsebarth, A.; Moscicka, A.

    2010-01-01

    The article describes the influence of the peculiarities of Bible geographic objects on a spatiotemporal geoinformation system of Bible events. In the proposed concept of this system, special attention is given to the Bible geographic objects and the interrelations between the names of these objects and their locations in geospace. In the Bible, in both the Old and New Testaments, there are hundreds of geographical names, but selecting these names from the Bible text is not easy. The same names are applied to persons and to geographic objects. The next problem that arises is the classification of the geographical objects, because in several cases the same name is used for towns, mountains, hills, valleys, etc. Another serious problem relates to changes of the names over time. The interrelation between an object's name and its location is also complicated: geographic objects with the same name are located in various places, which must be properly correlated with the Bible text. The above-mentioned peculiarities of Bible geographic objects influenced the concept of the proposed system, which consists of three databases: reference, geographic object, and subject/thematic. The crucial component of this system is the proper architecture of the geographic object database. In the paper a very detailed description of this database is presented. The interrelation between the databases allows Bible readers to connect the Bible text with the geography of the terrain on which the Bible events occurred and, additionally, to access other geographical and historical information related to the geographic objects.

  17. Smartphone Mobile Application Delivering Personalized, Real-Time Sun Protection Advice: A Randomized Clinical Trial

    PubMed Central

    Buller, David B.; Berwick, Marianne; Lantz, Kathy; Buller, Mary Klein; Shane, James; Kane, Ilima; Liu, Xia

    2014-01-01

    Importance: Mobile smartphones are rapidly emerging as an effective means of communicating with many Americans. Using mobile applications, they can access remote databases, track time and location, and integrate user input to provide tailored health information. Objective: A smartphone mobile application providing personalized, real-time sun protection advice was evaluated in a randomized trial. Design: The trial was conducted in 2012 and had a randomized pretest-posttest controlled design with a 10-week follow-up. Setting: Data were collected from a nationwide population-based survey panel. Participants: The trial enrolled a sample of n=604 non-Hispanic and Hispanic adults from the Knowledge Panel® aged 18 or older who owned an Android smartphone. Intervention: The mobile application provided advice on sun protection (i.e., protection practices and risk of sunburn) and alerts (to apply/reapply sunscreen and get out of the sun), hourly UV Index, and vitamin D production based on the forecast UV Index, the phone's time and location, and user input. Main Outcomes and Measures: Percent of days using sun protection, time spent outdoors (days and minutes) in the midday sun, and number of sunburns in the past 3 months were collected. Results: Individuals in the treatment group reported more shade use but less sunscreen use than controls. Those who used the mobile app reported spending less time in the sun and using all protection behaviors combined more. Conclusions and Relevance: The mobile application improved some sun protection. Use of the mobile application was lower than expected but associated with increased sun protection. Providing personalized advice when and where people are in the sun may help reduce sun exposure. PMID:25629710

  18. Using Enabling Technologies to Advance Data Intensive Analysis Tools in the JPL Tropical Cyclone Information System

    NASA Astrophysics Data System (ADS)

    Knosp, B.; Gangl, M. E.; Hristova-Veleva, S. M.; Kim, R. M.; Lambrigtsen, B.; Li, P.; Niamsuwan, N.; Shen, T. P. J.; Turk, F. J.; Vu, Q. A.

    2014-12-01

    The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data related to tropical cyclones. The TCIS has been supporting specific science field campaigns, such as the Genesis and Rapid Intensification Processes (GRIP) campaign and the Hurricane and Severe Storm Sentinel (HS3) campaign, by creating near real-time (NRT) data visualization portals. These portals are intended to assist in mission planning, enhance the understanding of current physical processes, and improve model data by comparing them to satellite and aircraft observations. The TCIS NRT portals allow the user to view plots on a Google Earth interface. To complement these visualizations, the team has been developing data analysis tools to let the user actively interrogate areas of Level 2 swath and two-dimensional plots they see on their screen. As expected, these observation and model data are quite voluminous, and bottlenecks in the system architecture can occur when the databases run geospatial searches for data files that need to be read by the tools. To improve the responsiveness of the data analysis tools, the TCIS team has been conducting studies on how best to store Level 2 swath footprints and run sub-second geospatial searches to discover data. The first objective was to improve the sampling accuracy of the footprints being stored in the TCIS database by comparing the Java-based NASA PO.DAAC Level 2 Swath Generator with a TCIS Python swath generator. The second objective was to compare the performance of four database implementations - MySQL, MySQL+Solr, MongoDB, and PostgreSQL - to see which database management system would yield the best geospatial query and storage performance. The final objective was to integrate our chosen technologies with our Joint Probability Density Function (Joint PDF), Wave Number Analysis, and Automated Rotational Center Hurricane Eye Retrieval (ARCHER) tools. In this presentation, we will compare the enabling technologies we tested and discuss which ones we selected for integration into the TCIS data analysis tool architecture. We will also show how these techniques have been automated to provide access to NRT data through our analysis tools.
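    The footprint lookup being benchmarked above reduces, in its simplest form, to a bounding-box overlap query. A sketch using SQLite for self-containment (the schema and file names are illustrative, not the TCIS database; a production system would add a spatial index such as an R*-tree):

```python
import sqlite3

# Each Level 2 swath file is indexed by its footprint's bounding box
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE footprints (
    file TEXT, lat_min REAL, lat_max REAL, lon_min REAL, lon_max REAL)""")
con.executemany("INSERT INTO footprints VALUES (?,?,?,?,?)", [
    ("swath_a.h5", 10.0, 25.0, -80.0, -60.0),
    ("swath_b.h5", 30.0, 45.0, -50.0, -30.0),
])

# Files whose bounding box overlaps a storm-centered search region:
# two boxes overlap iff each one's max exceeds the other's min on both axes
region = {"lat_min": 12.0, "lat_max": 20.0, "lon_min": -75.0, "lon_max": -65.0}
cur = con.execute("""SELECT file FROM footprints
    WHERE lat_max >= :lat_min AND lat_min <= :lat_max
      AND lon_max >= :lon_min AND lon_min <= :lon_max""", region)
hits = [r[0] for r in cur]
print(hits)   # ['swath_a.h5']
```

    The database comparison in the abstract is essentially about how fast each engine can answer this kind of query at scale.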

  19. Real-time object-to-features vectorisation via Siamese neural networks

    NASA Astrophysics Data System (ADS)

    Fedorenko, Fedor; Usilin, Sergey

    2017-03-01

    Object-to-features vectorisation is a hard problem for objects that are difficult to distinguish. Siamese and Triplet neural networks are among the more recent tools for this task. However, most networks used are very deep and prove hard to compute in an Internet of Things setting. In this paper, a computationally efficient neural network is proposed for real-time object-to-features vectorisation into a Euclidean metric space. We use the L2 distance to reflect feature vector similarity during both training and testing. In this way, the feature vectors we develop can be easily classified using a K-Nearest Neighbours classifier. Such an approach can be used to train networks to vectorise "problematic" objects such as images of human faces and keypoint image patches, for example keypoints on Arctic maps and surrounding marine areas.
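    Once a network maps objects into a Euclidean space, classification reduces to nearest neighbours under the L2 distance, as the abstract notes. A sketch of that final step, using random toy embeddings in place of a trained network's output:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical gallery of enrolled embeddings (8-D, well separated)
gallery = {"face_a": rng.normal(0.0, 0.1, 8),
           "face_b": rng.normal(3.0, 0.1, 8)}

def classify(query, gallery):
    # 1-NN: label of the gallery embedding with the smallest L2 distance
    return min(gallery, key=lambda k: np.linalg.norm(gallery[k] - query))

probe = rng.normal(3.0, 0.1, 8)   # a new view of the same object as face_b
print(classify(probe, gallery))   # face_b
```

    Because all the expensive work is in producing the embeddings, the classifier itself stays cheap enough for the IoT setting the paper targets.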

  20. Predicting Loss-of-Control Boundaries Toward a Piloting Aid

    NASA Technical Reports Server (NTRS)

    Barlow, Jonathan; Stepanyan, Vahram; Krishnakumar, Kalmanje

    2012-01-01

    This work presents an approach to predicting loss-of-control with the goal of providing the pilot a decision aid focused on maintaining the pilot's control action within predicted loss-of-control boundaries. The predictive architecture combines quantitative loss-of-control boundaries, a data-based predictive control boundary estimation algorithm and an adaptive prediction method to estimate Markov model parameters in real-time. The data-based loss-of-control boundary estimation algorithm estimates the boundary of a safe set of control inputs that will keep the aircraft within the loss-of-control boundaries for a specified time horizon. The adaptive prediction model generates estimates of the system Markov Parameters, which are used by the data-based loss-of-control boundary estimation algorithm. The combined algorithm is applied to a nonlinear generic transport aircraft to illustrate the features of the architecture.
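    Markov parameters of a discrete-time system are its impulse-response samples, and they can be estimated from input/output data by least squares, which is the flavor of estimation the architecture above performs adaptively. A batch sketch (the true system and noise level are invented for illustration, and this is not the authors' adaptive algorithm):

```python
import numpy as np

rng = np.random.default_rng(3)

# True Markov parameters (impulse response) of a toy FIR system
h_true = np.array([0.0, 0.5, 0.3, 0.1])
u = rng.normal(size=400)                         # excitation input
y = np.convolve(u, h_true)[:400] + 0.01 * rng.normal(size=400)

# Regression matrix of lagged inputs: column k is u delayed by k samples
n = len(h_true)
U = np.column_stack(
    [np.concatenate([np.zeros(k), u[:400 - k]]) for k in range(n)])
h_hat, *_ = np.linalg.lstsq(U, y, rcond=None)
print(np.round(h_hat, 2))                        # close to h_true
```

    A recursive version of this same least-squares fit is what allows the boundary estimation algorithm to track the parameters in real time as the aircraft dynamics change.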

  1. Remote collection and analysis of witness reports on flash floods

    NASA Astrophysics Data System (ADS)

    Gourley, J. J.; Erlingis, J. M.; Smith, T. M.; Ortega, K. L.; Hong, Y.

    2010-11-01

    Typically, flash floods are studied ex post facto in response to a major impact event. A complement to field investigations is developing a detailed database of flash flood events, including minor events and null reports (i.e., where heavy rain occurred but there was no flash flooding), based on public survey questions conducted in near-real time. The Severe hazards analysis and verification experiment (SHAVE) has been in operation at the National Severe Storms Laboratory (NSSL) in Norman, OK, USA during the summers since 2006. The experiment employs undergraduate students to analyse real-time products from weather radars, target specific regions within the conterminous US, and poll public residences and businesses regarding the occurrence and severity of hail, wind, tornadoes, and now flash floods. In addition to providing a rich learning experience for students, SHAVE has also been successful in creating high-resolution datasets of severe hazards used for algorithm and model verification. This paper describes the criteria used to initiate the flash flood survey, the specific questions asked and information entered to the database, and then provides an analysis of results for flash flood data collected during the summer of 2008. It is envisioned that specific details provided by the SHAVE flash flood observation database will complement databases collected by operational agencies (i.e., US National Weather Service Storm Data reports) and thus lead to better tools to predict the likelihood of flash floods and ultimately reduce their impacts on society.

  2. Machine learning for real time remote detection

    NASA Astrophysics Data System (ADS)

    Labbé, Benjamin; Fournier, Jérôme; Henaff, Gilles; Bascle, Bénédicte; Canu, Stéphane

    2010-10-01

    Infrared systems are key to providing enhanced capability to military forces such as automatic control of threats and prevention from air, naval and ground attacks. Key requirements for such a system to produce operational benefits are real-time processing as well as high efficiency in terms of detection and false alarm rate. These are serious issues since the system must deal with a large number of objects and categories to be recognized (small vehicles, armored vehicles, planes, buildings, etc.). Statistical learning based algorithms are promising candidates to meet these requirements when using selected discriminant features and real-time implementation. This paper proposes a new decision architecture benefiting from recent advances in machine learning by using an effective method for level set estimation. While building the decision function, the proposed approach performs variable selection based on a discriminative criterion. Moreover, the use of level sets makes it possible to manage rejection of unknown or ambiguous objects, thus preserving the false alarm rate. Experimental evidence reported on real-world infrared images demonstrates the validity of our approach.

  3. Building strategies for tsunami scenarios databases to be used in a tsunami early warning decision support system: an application to western Iberia

    NASA Astrophysics Data System (ADS)

    Tinti, S.; Armigliato, A.; Pagnoni, G.; Zaniboni, F.

    2012-04-01

    One of the most challenging goals that the geo-scientific community is facing after the catastrophic tsunami occurred on December 2004 in the Indian Ocean is to develop the so-called "next generation" Tsunami Early Warning Systems (TEWS). Indeed, the meaning of "next generation" does not refer to the aim of a TEWS, which obviously remains to detect whether a tsunami has been generated or not by a given source and, in the first case, to send proper warnings and/or alerts in a suitable time to all the countries and communities that can be affected by the tsunami. Instead, "next generation" identifies with the development of a Decision Support System (DSS) that, in general terms, relies on 1) an integrated set of seismic, geodetic and marine sensors whose objective is to detect and characterise the possible tsunamigenic sources and to monitor instrumentally the time and space evolution of the generated tsunami, 2) databases of pre-computed numerical tsunami scenarios to be suitably combined based on the information coming from the sensor environment and to be used to forecast the degree of exposition of different coastal places both in the near- and in the far-field, 3) a proper overall (software) system architecture. The EU-FP7 TRIDEC Project aims at developing such a DSS and has selected two test areas in the Euro-Mediterranean region, namely the western Iberian margin and the eastern Mediterranean (Turkish coasts). In this study, we discuss the strategies that are being adopted in TRIDEC to build the databases of pre-computed tsunami scenarios and we show some applications to the western Iberian margin. In particular, two different databases are being populated, called "Virtual Scenario Database" (VSDB) and "Matching Scenario Database" (MSDB). The VSDB contains detailed simulations of few selected earthquake-generated tsunamis. 
The cases provided by the members of the VSDB are computed "real events"; in other words, they represent the unknowns that the TRIDEC platform must be able to recognise and match during the early crisis management phase. The MSDB contains a very large number (order of thousands) of tsunami simulations performed starting from many different simple earthquake sources of different magnitudes and located in the "vicinity" of the virtual scenario earthquake. Examples from both databases will be presented.

  4. Millisecond timing on PCs and Macs.

    PubMed

    MacInnes, W J; Taylor, T L

    2001-05-01

    A real-time, object-oriented solution for displaying stimuli on Windows 95/98, MacOS, and Linux platforms is presented. The program, written in C++, utilizes a special-purpose window class (GLWindow), OpenGL, and 32-bit graphics acceleration; it avoids display timing uncertainty by substituting the new window class for the default window code on each system. We report the outcome of tests for real-time capability across PC and Mac platforms running a variety of operating systems. The test program, which can be used as a shell for programming real-time experiments and testing specific processors, is available at http://www.cs.dal.ca/~macinnwj. Our aim is to give researchers a sense of the usefulness of the program, to highlight the ability of many multitasking environments to achieve real time, and to caution users about systems that may not achieve real time even under optimal conditions.
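    The kind of real-time capability test described above can be sketched in a few lines: repeatedly schedule a fixed target interval with a busy-wait and record how far each achieved deadline overshoots the requested one. This is a hypothetical Python analogue of the authors' C++ tests, not their code; the interval and frame count are arbitrary.

```python
import time

def measure_interval_jitter(target_ms=10.0, frames=100):
    """Repeat a fixed target interval using a busy-wait (no scheduler
    sleep) and record the overshoot, in ms, between the requested and
    the achieved deadline for each 'frame'. Returns (worst, mean)."""
    errors = []
    next_t = time.perf_counter()
    for _ in range(frames):
        next_t += target_ms / 1000.0
        while time.perf_counter() < next_t:
            pass                      # busy-wait to the next deadline
        errors.append((time.perf_counter() - next_t) * 1000.0)
    return max(errors), sum(errors) / len(errors)

worst, mean = measure_interval_jitter()
print(f"worst overshoot {worst:.3f} ms, mean overshoot {mean:.3f} ms")
```

    On a multitasking OS the worst-case overshoot is the number to watch: a single preempted frame is enough to break millisecond-accurate stimulus presentation.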

  5. Kinota: An Open-Source NoSQL implementation of OGC SensorThings for large-scale high-resolution real-time environmental monitoring

    NASA Astrophysics Data System (ADS)

    Miles, B.; Chepudira, K.; LaBar, W.

    2017-12-01

    The Open Geospatial Consortium (OGC) SensorThings API (STA) specification, ratified in 2016, is a next-generation open standard for enabling real-time communication of sensor data. Building on over a decade of OGC Sensor Web Enablement (SWE) standards, STA offers a rich data model that can represent a range of sensor and phenomenon types (e.g. fixed sensors sensing fixed phenomena, fixed sensors sensing moving phenomena, mobile sensors sensing fixed phenomena, and mobile sensors sensing moving phenomena) and is data agnostic. Additionally, and in contrast to previous SWE standards, STA is developer-friendly, as is evident from its convenient JSON serialization and expressive OData-based query language (with support for geospatial queries); with its Message Queue Telemetry Transport (MQTT) extension, STA is also well suited to efficient real-time data publishing and discovery. All these attributes make STA potentially useful in environmental monitoring sensor networks. Here we present Kinota(TM), an open-source NoSQL implementation of OGC SensorThings for large-scale high-resolution real-time environmental monitoring. Kinota, which roughly stands for Knowledge from Internet of Things Analyses, relies on Cassandra as its underlying data store, a horizontally scalable, fault-tolerant open-source database that is often used to store time-series data for Big Data applications (though integration with other NoSQL or relational databases is possible). With this foundation, Kinota can scale to store data from an arbitrary number of sensors collecting data every 500 milliseconds. Additionally, the Kinota architecture is very modular, allowing adopters to replace parts of the existing implementation when desirable. The architecture is also highly portable, providing the flexibility to choose between cloud providers such as Azure, Amazon, and Google. The scalable, flexible, and cloud-friendly architecture of Kinota makes it ideal for use in next-generation large-scale and high-resolution real-time environmental monitoring networks used in domains such as hydrology, geomorphology, and geophysics, as well as in management applications such as flood early warning and regulatory enforcement.
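    To make the OData-based query language concrete, the sketch below builds an STA request for the Observations of one Datastream restricted to a time window, using the `$filter`, `$orderby`, and `$top` query options from the SensorThings specification. The base URL is a placeholder; this illustrates the STA query style, not Kinota-specific behaviour.

```python
from urllib.parse import urlencode, quote

def sta_observations_url(base, datastream_id, start_iso, end_iso, top=100):
    """Build a SensorThings API query for the Observations of one
    Datastream, filtered to a phenomenonTime window and ordered in
    time, using OData-style query options ($filter/$orderby/$top)."""
    params = {
        "$filter": (f"phenomenonTime ge {start_iso} and "
                    f"phenomenonTime le {end_iso}"),
        "$orderby": "phenomenonTime asc",
        "$top": str(top),
    }
    return (f"{base}/Datastreams({datastream_id})/Observations?"
            + urlencode(params, quote_via=quote))

url = sta_observations_url("https://example.org/v1.0", 42,
                           "2017-06-01T00:00:00Z", "2017-06-02T00:00:00Z")
print(url)
```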

  6. Applications of TRMM-based Multi-Satellite Precipitation Estimation for Global Runoff Simulation: Prototyping a Global Flood Monitoring System

    NASA Technical Reports Server (NTRS)

    Hong, Yang; Adler, Robert F.; Huffman, George J.; Pierce, Harold

    2008-01-01

    Advances in flood monitoring/forecasting have been constrained by the difficulty in estimating rainfall continuously over space (catchment-, national-, continental-, or even global-scale areas) and at flood-relevant time scales. With the recent availability of satellite rainfall estimates at fine time and space resolution, this paper describes a prototype research framework for global flood monitoring by combining real-time satellite observations with a database of global terrestrial characteristics through a hydrologically relevant modeling scheme. Four major components included in the framework are (1) real-time precipitation input from the NASA TRMM-based Multi-satellite Precipitation Analysis (TMPA); (2) a central geospatial database to preprocess the land surface characteristics: water divides, slopes, soils, land use, flow directions, flow accumulation, drainage network, etc.; (3) a modified distributed hydrological model to convert rainfall to runoff and route the flow through the stream network in order to predict the timing and severity of the flood wave; and (4) an open-access web interface to quickly disseminate flood alerts for potential decision-making. Retrospective simulations for 1998-2006 demonstrate that the Global Flood Monitor (GFM) system performs consistently at both station and catchment levels. The GFM website (experimental version) has been running in near real time in an effort to offer a cost-effective solution to the ultimate challenge of building natural disaster early warning systems for the data-sparse regions of the world. The interactive GFM website shows close-up maps of the flood risks overlaid on topography/population or integrated with the Google-Earth visualization tool. An additional capability, extending forecast lead time by assimilating quantitative precipitation forecasts (QPF) into the GFM, will be implemented in the future.
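    Component (3), the rainfall-to-runoff-to-routing chain, can be illustrated with a toy single-cell model: a fixed runoff coefficient converts rain to effective runoff, and one linear reservoir drains a fraction of its storage each time step, smoothing the flood wave. All coefficients are arbitrary illustrations; the GFM uses a distributed hydrological model far beyond this sketch.

```python
def route_runoff(rainfall_mm, runoff_coeff=0.4, storage_k=0.8):
    """Toy rainfall-to-runoff conversion and routing: a runoff
    coefficient turns rain into effective runoff added to storage,
    and a single linear reservoir releases (1 - k) of its storage
    per time step. Returns the outflow series."""
    storage, flows = 0.0, []
    for rain in rainfall_mm:
        storage += runoff_coeff * rain        # add effective runoff
        outflow = (1.0 - storage_k) * storage
        storage -= outflow                    # drain the reservoir
        flows.append(outflow)
    return flows

flows = route_runoff([0, 20, 40, 10, 0, 0])
print([round(f, 2) for f in flows])
```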

  7. MACS-Mar: a real-time remote sensing system for maritime security applications

    NASA Astrophysics Data System (ADS)

    Brauchle, Jörg; Bayer, Steven; Hein, Daniel; Berger, Ralf; Pless, Sebastian

    2018-04-01

    The modular aerial camera system (MACS) is a development platform for optical remote sensing concepts, algorithms and special environments. For real-time services for maritime security (EMSec joint project), a new multi-sensor configuration, MACS-Mar, was realized. It consists of four co-aligned sensor heads in the visible RGB, near infrared (NIR, 700-950 nm), hyperspectral (HS, 450-900 nm) and thermal infrared (TIR, 7.5-14 µm) spectral ranges, a mid-cost navigation system, a processing unit and two data links. On-board image projection, cropping of redundant data and compression enable the instant generation of direct-georeferenced high-resolution image mosaics, automatic object detection, vectorization and annotation of floating objects on the water surface. The results were transmitted over distances up to 50 km in real time via narrow and broadband data links and were visualized in a maritime situation awareness system. For the automatic onboard detection of floating objects, a segmentation and classification workflow based on RGB, NIR and TIR information was developed and tested. In the experiment, object detection achieved a completeness of 95% and a correctness of 53%. Bright backwash of ships mostly led to an overestimation of the number of objects; further refinement using water homogeneity in the TIR, as implemented in the workflow, could not be carried out because of problems with the TIR sensor, and distinctly better results would otherwise be expected. The absolute positional accuracy of the projected real-time imagery was 2 m without postprocessing of images or navigation data; the relative measurement accuracy of distances is in the range of the image resolution, which is about 12 cm for RGB imagery in the EMSec experiment.
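    The completeness and correctness figures quoted above are the standard detection metrics, recall and precision. A small sketch with illustrative counts (chosen only to reproduce similar percentages; they are not the experiment's actual tallies):

```python
def completeness_correctness(tp, fn, fp):
    """Completeness (recall) = TP / (TP + FN); correctness (precision)
    = TP / (TP + FP). High completeness with low correctness means most
    real objects are found but many detections (e.g. bright ship
    backwash) are spurious."""
    return tp / (tp + fn), tp / (tp + fp)

comp, corr = completeness_correctness(tp=95, fn=5, fp=84)
print(f"completeness {comp:.0%}, correctness {corr:.0%}")
```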

  8. DFACS - DATABASE, FORMS AND APPLICATIONS FOR CABLING AND SYSTEMS, VERSION 3.30

    NASA Technical Reports Server (NTRS)

    Billitti, J. W.

    1994-01-01

    DFACS is an interactive multi-user computer-aided engineering tool for system level electrical integration and cabling engineering. The purpose of the program is to provide the engineering community with a centralized database for entering and accessing system functional definitions, subsystem and instrument-end circuit pinout details, and harnessing data. The primary objective is to provide an instantaneous single point of information interchange, thus avoiding error-prone, time-consuming, and costly multiple-path data shuttling. The DFACS program, which is centered around a single database, has built-in menus that provide easy data input and access for all involved system, subsystem, and cabling personnel. The DFACS program allows parallel design of circuit data sheets and harness drawings. It also recombines raw information to automatically generate various project documents and drawings including the Circuit Data Sheet Index, the Electrical Interface Circuits List, Assembly and Equipment Lists, Electrical Ground Tree, Connector List, Cable Tree, Cabling Electrical Interface and Harness Drawings, Circuit Data Sheets, and ECR List of Affected Interfaces/Assemblies. Real-time automatic production of harness drawings and circuit data sheets from the same data reservoir ensures instant system and cabling engineering design harmony. DFACS also contains automatic wire routing procedures and extensive error checking routines designed to minimize the possibility of engineering error. DFACS is designed to run on DEC VAX series computers under VMS using Version 6.3/01 of INGRES QUEL/OSL, a relational database system which is available through Relational Technology, Inc. The program is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard media) or a TK50 tape cartridge. DFACS was developed in 1987 and last updated in 1990. DFACS is a copyrighted work with all copyright vested in NASA. 
DEC, VAX and VMS are trademarks of Digital Equipment Corporation. INGRES QUEL/OSL is a trademark of Relational Technology, Inc.

  9. Real-time detection, quantification, warning, and control of epileptic seizures: the foundations for a scientific epileptology.

    PubMed

    Osorio, I; Frei, M G

    2009-11-01

    Substantive advances in clinical epileptology may be realized through the judicious use of real-time automated seizure detection, quantification, warning, and delivery of therapy in subjects with pharmacoresistant seizures. Materialization of these objectives is likely to elevate epileptology to the level of a mature clinical science.

  10. The SSABLE system - Automated archive, catalog, browse and distribution of satellite data in near-real time

    NASA Technical Reports Server (NTRS)

    Simpson, James J.; Harkins, Daniel N.

    1993-01-01

    Historically, locating and browsing satellite data has been a cumbersome and expensive process. This has impeded the efficient and effective use of satellite data in the geosciences. SSABLE is a new interactive tool for the archive, browse, ordering, and distribution of satellite data based upon X Window, high bandwidth networks, and digital image rendering techniques. SSABLE provides for automatically constructing relational database queries to archived image datasets based on time, date, geographical location, and other selection criteria. SSABLE also provides a visual representation of the selected archived data for viewing on the user's X terminal. SSABLE is a near real-time system; for example, data are added to SSABLE's database within 10 min after capture. SSABLE is network and machine independent; it will run identically on any machine which satisfies the following three requirements: 1) has a bitmapped display (monochrome or greater); 2) is running the X Window system; and 3) is on a network directly reachable by the SSABLE system. SSABLE has been evaluated at over 100 international sites. Network response time in the United States and Canada varies between 4 and 7 s for browse image updates; reported transmission times to Europe and Australia typically are 20-25 s.

  11. Nonstationary EO/IR Clutter Suppression and Dim Object Tracking

    DTIC Science & Technology

    2010-01-01

    Brown, A., and Brown, J., Enhanced Algorithms for EO/IR Electronic Stabilization, Clutter Suppression, and Track-Before-Detect for Multiple Low...estimation-suppression and nonlinear filtering-based multiple-object track-before-detect. These algorithms are suitable for integration into...In such cases, it is imperative to develop efficient real or near-real-time tracking before detection methods. This paper continues the work started

  12. The Internet Compendium: Subject Guides to Health and Science Resources.

    ERIC Educational Resources Information Center

    Rosenfeld, Louis; And Others

    This guide describes and evaluates the Internet's health and science resources by subject. It offers information on a multitude of listservs; Usenet newsgroups; forums; electronic journals; topical mailing lists; text archives; Freenets; bulletin boards; FAQs; newsletters; real-time chats; databases; and library catalogs. From alternative medicine…

  13. Low-cost real-time infrared scene generation for image projection and signal injection

    NASA Astrophysics Data System (ADS)

    Buford, James A., Jr.; King, David E.; Bowden, Mark H.

    1998-07-01

    As cost becomes an increasingly important factor in the development and testing of infrared sensors and flight computers/processors, the need for accurate hardware-in-the-loop (HWIL) simulations is critical. In the past, expensive and complex dedicated scene generation hardware was needed to attain the fidelity necessary for accurate testing. Recent technological advances and innovative applications of established technologies are beginning to allow development of cost-effective replacements for dedicated scene generators. These new scene generators are mainly constructed from commercial-off-the-shelf (COTS) hardware and software components. At the U.S. Army Aviation and Missile Command (AMCOM) Missile Research, Development, and Engineering Center (MRDEC), researchers have developed such a dynamic IR scene generator (IRSG) built around COTS hardware and software. The IRSG is used to provide dynamic inputs to an IR scene projector for in-band seeker testing and for direct signal injection into the seeker or processor electronics. AMCOM MRDEC has developed a second generation IRSG, namely IRSG2, using the latest Silicon Graphics Incorporated (SGI) Onyx2 with Infinite Reality graphics. As reported in previous papers, the SGI Onyx Reality Engine 2 is the platform of the original IRSG, now referred to as IRSG1. IRSG1 has been in operation and used daily for the past three years on several IR projection and signal injection HWIL programs. Using this second generation IRSG, frame rates have increased from 120 Hz to 400 Hz and intensity resolution from 12 bits to 16 bits. The key features of the IRSGs are real-time missile frame rates and frame sizes, a dynamic missile-to-target(s) viewpoint updated each frame in real time by a six-degree-of-freedom (6DOF) system under test (SUT) simulation, multiple dynamic objects (e.g. targets, terrain/background, countermeasures, and atmospheric effects), latency compensation, point-to-extended source anti-aliased targets, and sensor modeling effects. This paper provides a comparison between the IRSG1 and IRSG2 systems and focuses on the IRSG software, real-time features, and database development tools.

  14. Factors impacting time to acceptance and publication for peer-reviewed publications.

    PubMed

    Toroser, Dikran; Carlson, Janice; Robinson, Micah; Gegner, Julie; Girard, Victoria; Smette, Lori; Nilsen, Jon; O'Kelly, James

    2017-07-01

    Timely publication of data is important for the medical community and provides a valuable contribution to data disclosure. The objective of this study was to identify and evaluate times to acceptance and publication for peer-reviewed manuscripts, reviews, and letters to the editor. Key publication metrics for published manuscripts, reviews, and letters to the editor were identified by eight Amgen publications professionals. Data for publications submitted between 1 January 2013 and 1 November 2015 were extracted from a proprietary internal publication-tracking database. Variables included department initiating the study, publication type, number of submissions per publication, and the total number of weeks from first submission to acceptance, online publication, and final publication. A total of 337 publications were identified, of which 300 (89%) were manuscripts. Time from submission to acceptance and publication was generally similar between clinical and real-world evidence (e.g. observational and health economics studies) publications. Median (range) time from first submission to acceptance was 23.4 (0.2-226.2) weeks. Median (range) time from first submission to online (early-release) publication was 29.7 (2.4-162.6) weeks. Median (range) time from first submission to final (print) publication was 36.2 (2.8-230.8) weeks. Time from first submission to acceptance, online publication, and final publication increased with the number of submissions required for acceptance, with similar times noted between each subsequent submission. Analysis of a single-company publication database showed that the median time for manuscripts to be fully published after initial submission was 36.2 weeks, and that time to publication increased with the number of submissions. Causes for multiple submissions and time from clinical trial completion to first submission were not assessed; these were limitations of the study. 
Nonetheless, publication planners should consider these results when evaluating timelines and identifying potential journals early in the publication planning process.

  15. Thinking Out Loud while Studying Text: Rehearsing Key Ideas.

    ERIC Educational Resources Information Center

    Muth, K. Denise; And Others

    1988-01-01

    A study involving 32 undergraduate students was conducted to identify mechanisms by which instructional objectives affect learning. Protocols for thinking out loud were examined for evidence of rehearsal activity. Results suggest that instructional objectives enhanced real-time rehearsal activity, recall, and reading time. (TJH)

  16. Stereo Reconstruction Study

    DTIC Science & Technology

    1983-06-01

    be registered on the agenda. At each step of analysis, the action with the highest score is executed and the database is changed. The agenda controls...activation of production rules according to changes in the database. The agenda is updated whenever the database is changed. Each time, the number of...views of an object. Total prediction has combinatorial complexity. For a polyhedron with n distinct faces, there are 2^n views. Instead, ACRONYM predicts

  17. Object detection and imaging with acoustic time reversal mirrors

    NASA Astrophysics Data System (ADS)

    Fink, Mathias

    1993-11-01

    Focusing an acoustic wave on an object of unknown shape through an inhomogeneous medium of any geometrical shape is a challenge in underground detection. Optimal detection and imaging of objects require the development of such focusing techniques. The use of a time reversal mirror (TRM) represents an original solution to this problem. It realizes in real time a focusing process matched to the object shape, to the geometry of the acoustic interfaces, and to the geometry of the mirror. It is a self-adaptive technique which compensates for any geometrical distortions of the mirror structure as well as for diffraction and refraction effects through the interfaces. Two real-time 64- and 128-channel prototypes have been built in our laboratory, and TRM experiments demonstrating the TRM performance through inhomogeneous solid and liquid media are presented. Applications to medical therapy (kidney stone detection and destruction) and to nondestructive testing of metallurgical samples of different geometries are described. Extension of this study to underground detection and imaging will be discussed.

  18. Smartphone-Based Self-Assessment of Stress in Healthy Adult Individuals: A Systematic Review

    PubMed Central

    Þórarinsdóttir, Helga; Kessing, Lars Vedel

    2017-01-01

    Background Stress is a common experience in today’s society. Smartphone ownership is widespread, and smartphones can be used to monitor health and well-being. Smartphone-based self-assessment of stress can be done in naturalistic settings and may potentially reflect real-time stress level. Objective The objectives of this systematic review were to evaluate (1) the use of smartphones to measure self-assessed stress in healthy adult individuals, (2) the validity of smartphone-based self-assessed stress compared with validated stress scales, and (3) the association between smartphone-based self-assessed stress and smartphone-generated objective data. Methods A systematic review of the scientific literature was conducted and reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. The scientific databases PubMed, PsycINFO, Embase, IEEE, and ACM were searched and supplemented by a hand search of reference lists. The databases were searched for original studies involving healthy individuals older than 18 years, measuring self-assessed stress using smartphones. Results A total of 35 published articles comprising 1464 individuals were included for review. According to the objectives, (1) study designs were heterogeneous, and smartphone-based self-assessed stress was measured using various methods (e.g., dichotomized questions on stress, yes or no; Likert scales on stress; and questionnaires); (2) the validity of smartphone-based self-assessed stress compared with validated stress scales was investigated in 3 studies, and of these, only 1 study found a moderate statistically significant positive correlation (r=.4; P<.05); and (3) in exploratory analyses, smartphone-based self-assessed stress was found to correlate with some of the reported smartphone-generated objective data, including voice features and data on activity and phone usage. 
    Conclusions Smartphones are being used to measure self-assessed stress in different contexts. The evidence of the validity of smartphone-based self-assessed stress is limited and should be investigated further. Smartphone-generated objective data can potentially be used to monitor, predict, and reduce stress levels. PMID:28193600

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, K.; Tsai, H.; Liu, Y. Y.

    Radio frequency identification (RFID) is one of today's most rapidly growing technologies in the automatic data collection industry. Although commercial applications are already widespread, the use of this technology for managing nuclear materials is only in its infancy. Employing an RFID system has the potential to offer an immense payback: enhanced safety and security, reduced need for manned surveillance, real-time access to status and event history data, and overall cost-effectiveness. The Packaging Certification Program (PCP) in the U.S. Department of Energy's (DOE's) Office of Environmental Management (EM), Office of Packaging and Transportation (EM-63), is developing an RFID system for nuclear materials management. The system consists of battery-powered RFID tags with onboard sensors and memories, a reader network, application software, a database server and web pages. The tags monitor and record critical parameters, including the status of seals, movement of objects, and environmental conditions of the nuclear material packages in real time. They also provide instant warnings or alarms when preset thresholds for the sensors are exceeded. The information collected by the readers is transmitted to a dedicated central database server that can be accessed by authorized users across the DOE complex via a secured network. The onboard memory of the tags allows the materials manifest and event history data to reside with the packages throughout their life cycles in storage, transportation, and disposal. Data security is currently based on Advanced Encryption Standard-256. The software provides easy-to-use graphical interfaces that allow access to all vital information once the security and privilege requirements are met. An innovative scheme has been developed for managing batteries in service for more than 10 years without needing to be changed. A miniature onboard dosimeter is being developed for applications that require radiation surveillance. 
    A field demonstration of the RFID system was recently conducted to assess its performance. The preliminary results of the demonstration are reported in this paper.
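    The tag-side warning behaviour described above (alarm when a preset sensor threshold is exceeded) reduces to a simple per-sensor check. The sensor names and limits below are hypothetical, chosen only to illustrate the logic.

```python
def check_thresholds(readings, limits):
    """Return an alarm message for every sensor whose current reading
    exceeds its preset threshold; sensors without a configured limit
    are ignored."""
    return [f"ALARM: {name}={value} exceeds {limits[name]}"
            for name, value in readings.items()
            if name in limits and value > limits[name]]

alarms = check_thresholds({"temperature_C": 41.2, "humidity_pct": 30.0},
                          {"temperature_C": 40.0, "humidity_pct": 80.0})
print(alarms)
```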

  20. Real-time reliability measure-driven multi-hypothesis tracking using 2D and 3D features

    NASA Astrophysics Data System (ADS)

    Zúñiga, Marcos D.; Brémond, François; Thonnat, Monique

    2011-12-01

    We propose a new multi-target tracking approach which is able to reliably track multiple objects even with poor segmentation results due to noisy environments. The approach takes advantage of a new dual object model combining 2D and 3D features through reliability measures. In order to obtain these 3D features, a new classifier associates with each moving region an object class label (e.g. person, vehicle), a parallelepiped model, and visual reliability measures of its attributes. These reliability measures make it possible to properly weight the contribution of noisy, erroneous or false data in order to better maintain the integrity of the object dynamics model. A new multi-target tracking algorithm then uses these object descriptions to generate tracking hypotheses about the objects moving in the scene. This tracking approach is able to manage many-to-many visual target correspondences. To achieve this, the algorithm takes advantage of 3D models for merging dissociated visual evidence (moving regions) potentially corresponding to the same real object, according to previously obtained information. The tracking approach has been validated using publicly accessible video surveillance benchmarks. It runs in real time, and the results are competitive with other tracking algorithms, with minimal (or no) reconfiguration effort between different videos.
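    The reliability-weighting idea can be sketched simply: each per-frame estimate of an object attribute carries a reliability measure in [0, 1], and the fused value down-weights unreliable evidence. This is an illustrative sketch of the weighting principle only, not the paper's exact dynamics model; the width values and reliabilities are invented.

```python
def fuse_attribute(measurements):
    """Combine noisy per-frame estimates of one object attribute
    (value, reliability) into a single reliability-weighted value, so
    poorly segmented frames contribute less. Returns None when no
    measurement carries any reliability."""
    total_w = sum(w for _, w in measurements)
    if total_w == 0.0:
        return None                 # no reliable evidence at all
    return sum(v * w for v, w in measurements) / total_w

# 3D width estimates (metres); the occluded frame gets reliability 0.1
print(fuse_attribute([(0.62, 0.9), (0.55, 0.8), (1.40, 0.1)]))
```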

  1. Real-time automatic metal extraction from medical x-ray images for contrast improvement

    NASA Astrophysics Data System (ADS)

    Prangl, Martin; Hellwagner, Hermann; Spielvogel, Christian; Bischof, Horst; Szkaliczki, Tibor

    2006-03-01

    This paper focuses on an approach for real-time metal extraction from x-ray images taken by modern x-ray machines such as C-arms. Such machines are used for vessel diagnostics, surgical interventions, and cardiology, neurology and orthopedic examinations. They are very fast in taking images from different angles. For this reason, manual adjustment of contrast is infeasible, and automatic adjustment algorithms have been applied to try to select the optimal radiation dose for contrast adjustment. Problems occur when metallic objects, e.g., a prosthesis or a screw, are in the absorption area of interest. In this case, the automatic adjustment mostly fails because the dark, metallic objects lead the algorithm to overdose the x-ray tube. This outshining effect results in overexposed images and bad contrast. To overcome this limitation, metallic objects have to be detected and extracted from the images that are taken as input for the adjustment algorithm. In this paper, we present a real-time solution for extracting metallic objects from x-ray images. We explore the characteristic features of metallic objects in x-ray images and their distinction from bone fragments, which form the basis for a successful approach to object segmentation and classification. Subsequently, we present our edge-based real-time approach for successful and fast automatic segmentation and classification of metallic objects. Finally, experimental results on the effectiveness and performance of our approach, based on a large set of input images, are presented.

  2. Real-time emissions from construction equipment compared with model predictions.

    PubMed

    Heidari, Bardia; Marr, Linsey C

    2015-02-01

    The construction industry is a large source of greenhouse gases and other air pollutants. Measuring and monitoring real-time emissions will provide practitioners with information to assess environmental impacts and improve the sustainability of construction. We employed a portable emission measurement system (PEMS) for real-time measurement of carbon dioxide (CO2), nitrogen oxides (NOx), hydrocarbon, and carbon monoxide (CO) emissions from construction equipment to derive emission rates (mass of pollutant emitted per unit time) and emission factors (mass of pollutant emitted per unit volume of fuel consumed) under real-world operating conditions. Measurements were compared with emissions predicted by methodologies used in three models: NONROAD2008, OFFROAD2011, and a modal statistical model. Measured emission rates agreed with model predictions for some pieces of equipment but were up to 100 times lower for others. Much of the difference was driven by lower fuel consumption rates than predicted. Emission factors during idling and hauling were significantly different from each other and from those of other moving activities, such as digging and dumping. It appears that operating conditions introduce considerable variability in emission factors. Results of this research will aid researchers and practitioners in improving current emission estimation techniques, frameworks, and databases.
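    The two quantities the study derives can be written down directly from their definitions in the abstract: an emission rate is mass of pollutant per unit time, and an emission factor is mass of pollutant per unit volume of fuel consumed. The numbers below are invented for illustration; real studies may also report factors per kilogram of fuel or per kWh.

```python
def emission_stats(pollutant_g, fuel_l, duration_s):
    """From one PEMS measurement interval, return:
    - emission rate: grams of pollutant per second of operation;
    - emission factor: grams of pollutant per litre of fuel burned."""
    return pollutant_g / duration_s, pollutant_g / fuel_l

rate, factor = emission_stats(pollutant_g=180.0, fuel_l=2.5,
                              duration_s=600.0)
print(f"rate {rate:.2f} g/s, factor {factor:.1f} g/L")
```

    The distinction matters for the comparison above: a machine can match a model's emission factor yet emit at a far lower rate simply by burning less fuel than the model assumes.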

  3. Real-Time Tracking by Double Templates Matching Based on Timed Motion History Image with HSV Feature

    PubMed Central

    Li, Zhiyong; Li, Pengfei; Yu, Xiaoping; Hashem, Mervat

    2014-01-01

    It is a challenge to represent the target appearance model for moving object tracking under complex environments. This study presents a novel method with the appearance model described by double templates based on timed motion history image with HSV color histogram feature (tMHI-HSV). The main components include offline and online template initialization, tMHI-HSV-based candidate patch feature histogram calculation, double templates matching (DTM) for object location, and template updating. Firstly, we initialize the target object region and calculate its HSV color histogram feature as the offline template and online template. Secondly, the tMHI-HSV is used to segment the motion region and calculate these candidate object patches' color histograms to represent their appearance models. Finally, we utilize the DTM method to track the target and update the offline and online templates in real time. The experimental results show that the proposed method can efficiently handle the scale variation and pose change of rigid and nonrigid objects, even under illumination change and occlusion. PMID:24592185
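    The appearance model in this abstract is an HSV color histogram compared between templates and candidate patches. A minimal sketch of that building block, using a hue-only histogram and histogram intersection as the similarity measure (the paper's exact binning and similarity metric are not given, so these are assumptions):

```python
import numpy as np

def hsv_histogram(hsv_patch, bins=16):
    """Normalized hue histogram as a simple appearance model.
    `hsv_patch` is an H x W x 3 array with channels in [0, 1]."""
    h = hsv_patch[..., 0].ravel()
    hist, _ = np.histogram(h, bins=bins, range=(0.0, 1.0))
    return hist / max(hist.sum(), 1)

def histogram_intersection(p, q):
    """Similarity in [0, 1]; 1 means identical appearance models."""
    return float(np.minimum(p, q).sum())
```

    In a tracker, each candidate patch from the motion-segmented region would be scored against both the offline and online templates, and the best-scoring patch taken as the new object location.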

  4. Motion coordination and programmable teleoperation between two industrial robots

    NASA Technical Reports Server (NTRS)

    Luh, J. Y. S.; Zheng, Y. F.

    1987-01-01

    Tasks for two coordinated industrial robots always bring the robots into contact with the same object. Motion coordination among the robots and the object must be maintained at all times. To plan coordinated tasks, only one robot's motion is planned, according to the required motion of the object. The second robot follows the first as specified by a set of holonomic equality constraints at every time instant. If any modification of the object's motion is needed in real time, only the first robot's motion has to be modified accordingly; the modification for the second robot is done implicitly through the constraint conditions. Thus the operation is simplified. If the object is physically removed, the second robot still continually follows the first through the constraint conditions. If the first robot is maneuvered through either the teach pendant or the keyboard, the second moves accordingly, forming a teleoperation linked through software programming. Notably, the second robot does not simply duplicate the first robot's motion: the programming of the constraints specifies their relative motion.
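    The holonomic constraint described above amounts to the follower's pose being the leader's pose composed with a fixed offset through the shared rigid object. A minimal planar (SE(2)) sketch using homogeneous transforms; the specific poses and offset are illustrative, not from the paper:

```python
import numpy as np

def se2(x, y, theta):
    """2-D homogeneous transform: rotation by theta, then translation (x, y)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1.0]])

def follower_pose(leader_T, grasp_offset_T):
    """Pose of the second robot implied by the holonomic constraint:
    both end-effectors hold one rigid object, so the follower pose is
    the leader pose composed with a fixed object-frame offset."""
    return leader_T @ grasp_offset_T

leader = se2(1.0, 0.0, 0.0)
offset = se2(0.5, 0.0, 0.0)   # hypothetical: grasps 0.5 m apart on the object
follower = follower_pose(leader, offset)
```

    This is why removing the physical object changes nothing for the follower: its commanded pose is computed from the leader's pose through the constraint, not sensed from the object.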

  5. Research of real-time communication software

    NASA Astrophysics Data System (ADS)

    Li, Maotang; Guo, Jingbo; Liu, Yuzhong; Li, Jiahong

    2003-11-01

    Real-time communication plays an increasingly important role in our work, our lives, and ocean monitoring. With the rapid progress of computer and communication technology, as well as the miniaturization of communication systems, adaptable and reliable real-time communication software is needed in ocean monitoring systems. This paper describes research on real-time communication software based on a point-to-point satellite intercommunication system. An object-oriented design method is adopted, and the software can transmit and receive video, audio, and engineering data over a satellite channel. Several software modules were developed that realize point-to-point satellite intercommunication in the ocean monitoring system. The real-time communication software has three advantages. First, it increases the reliability of the point-to-point satellite intercommunication system. Second, configurable optional parameters greatly increase the flexibility of the system. Third, some hardware is replaced by software, which not only reduces the cost of the system and promotes the miniaturization of the communication system, but also increases its agility.

  6. Performance of a New HPV Cervi-Collect Collection and Transportation Kit

    PubMed Central

    Chernesky, M.; Huang, S.; Jang, D.; Erickson, B.; Salituro, J.; Engel, H.; Gilchrist, J.; Neuscheler, P.; Mak, W. B.; Abravaya, K.

    2012-01-01

    Background. Liquid-based Pap (L-Pap) media are used for Pap and human papillomavirus (HPV) testing. Objectives. To compare RealTime High Risk (HR) HPV testing of a new collection kit (Cervi-Collect) and PreservCyt L-Pap specimens. To determine ease of use and safety of Cervi-Collect. Methods. L-Pap samples (n = 203) were tested with HC2 and RealTime HR HPV and Cervi-Collect with RealTime HR HPV. Discordant samples were genotyped. Results. L-Pap and Cervi-Collect specimens tested by RealTime HR HPV showed 93.1% agreement (Kappa 0.86). RealTime HR HPV and HC2 on L-Pap had 90.3% agreement (Kappa 0.80). RealTime HR HPV on Cervi-Collect and HC2 on L-Pap showed 88.2% agreement (Kappa 0.76). Sixteen of 21 samples which were HC2 negative and RealTime HR HPV positive on L-Pap or Cervi-Collect contained HR HPV genotypes. Eleven healthcare collectors were in strong agreement on a usability and safety questionnaire. Conclusion. Cervi-Collect samples were easy to collect and showed strong agreement with L-Pap samples tested with RealTime HR HPV or HC2. PMID:22174716
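    The agreement statistics quoted in this abstract (e.g., 93.1% agreement, Kappa 0.86) come from 2x2 tables of paired positive/negative results. A minimal sketch of Cohen's kappa, which corrects raw agreement for the agreement expected by chance; the counts in the test are illustrative, not the study's data:

```python
def cohens_kappa(both_pos, pos_neg, neg_pos, both_neg):
    """Cohen's kappa for a 2x2 agreement table between two assays.
    Arguments are the four cell counts: agree-positive, assay-1-only
    positive, assay-2-only positive, agree-negative."""
    n = both_pos + pos_neg + neg_pos + both_neg
    observed = (both_pos + both_neg) / n        # raw percent agreement
    p1 = (both_pos + pos_neg) / n               # assay 1 positive rate
    p2 = (both_pos + neg_pos) / n               # assay 2 positive rate
    expected = p1 * p2 + (1 - p1) * (1 - p2)    # chance agreement
    return (observed - expected) / (1 - expected)
```

    Kappa of 1 means perfect agreement; 0 means no better than chance, which is why a high raw agreement can still correspond to a lower kappa when one result dominates.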

  7. Real-Time Risk Prediction on the Wards: A Feasibility Study.

    PubMed

    Kang, Michael A; Churpek, Matthew M; Zadravecz, Frank J; Adhikari, Richa; Twu, Nicole M; Edelson, Dana P

    2016-08-01

    Failure to detect clinical deterioration in the hospital is common and associated with poor patient outcomes and increased healthcare costs. Our objective was to evaluate the feasibility and accuracy of real-time risk stratification using the electronic Cardiac Arrest Risk Triage score, an electronic health record-based early warning score. We conducted a prospective black-box validation study. Data were transmitted via HL7 feed in real time to an integration engine and database server wherein the scores were calculated and stored without visualization for clinical providers. The high-risk threshold was set a priori. Timing and sensitivity of electronic Cardiac Arrest Risk Triage score activation were compared with standard-of-care Rapid Response Team activation for patients who experienced a ward cardiac arrest or ICU transfer. Three general care wards at an academic medical center. A total of 3,889 adult inpatients. The system generated 5,925 segments during 5,751 admissions. The area under the receiver operating characteristic curve for electronic Cardiac Arrest Risk Triage score was 0.88 for cardiac arrest and 0.80 for ICU transfer, consistent with previously published derivation results. During the study period, eight of 10 patients with a cardiac arrest had high-risk electronic Cardiac Arrest Risk Triage scores, whereas the Rapid Response Team was activated on two of these patients (p < 0.05). Furthermore, electronic Cardiac Arrest Risk Triage score identified 52% (n = 201) of the ICU transfers compared with 34% (n = 129) by the current system (p < 0.001). Patients met the high-risk electronic Cardiac Arrest Risk Triage score threshold a median of 30 hours prior to cardiac arrest or ICU transfer versus 1.7 hours for standard Rapid Response Team activation. Electronic Cardiac Arrest Risk Triage score identified significantly more cardiac arrests and ICU transfers than standard Rapid Response Team activation and did so many hours in advance.
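    The areas under the receiver operating characteristic curve reported above (0.88 for cardiac arrest, 0.80 for ICU transfer) can be computed directly from risk scores via the Mann-Whitney statistic: the probability that a randomly chosen event case scores higher than a randomly chosen non-event case. A minimal sketch with made-up scores:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic.
    Ties between a positive and a negative score count one half."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# hypothetical risk scores: patients who deteriorated vs. those who did not
example = auc([0.9, 0.7, 0.6], [0.5, 0.4, 0.8])
```

    An AUC of 0.5 corresponds to a score with no discrimination; 1.0 to perfect ranking of events above non-events.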

  8. EPICS as a MARTe Configuration Environment

    NASA Astrophysics Data System (ADS)

    Valcarcel, Daniel F.; Barbalace, Antonio; Neto, André; Duarte, André S.; Alves, Diogo; Carvalho, Bernardo B.; Carvalho, Pedro J.; Sousa, Jorge; Fernandes, Horácio; Goncalves, Bruno; Sartori, Filippo; Manduchi, Gabriele

    2011-08-01

    The Multithreaded Application Real-Time executor (MARTe) software provides an environment for the hard real-time execution of codes while leveraging a standardized algorithm development process. The Experimental Physics and Industrial Control System (EPICS) software allows the deployment and remote monitoring of networked control systems. Channel Access (CA) is the protocol that enables communication between distributed EPICS components: it allows process variables belonging to different systems to be set and monitored across the network. The COntrol and Data Acquisition and Communication (CODAC) system for the ITER Tokamak will be EPICS based and will be used to monitor and live-configure the plant controllers. The reconfiguration capability in a hard real-time system requires strict latencies from request to actuation and is a key element in the design of the distributed control algorithm. Presently, MARTe and its objects are configured using a well-defined structured language. After each configuration, all objects are destroyed and the system rebuilt, following the strong hard real-time rule that a real-time system in online mode must behave in a strictly deterministic fashion. This paper presents the design and considerations to use MARTe as a plant controller and enable it to be EPICS monitorable and configurable without disturbing the execution at any time, in particular during a plasma discharge. The solutions designed for this will be presented and discussed.

  9. Optimal Reservoir Operation using Stochastic Model Predictive Control

    NASA Astrophysics Data System (ADS)

    Sahu, R.; McLaughlin, D.

    2016-12-01

    Hydropower operations are typically designed to fulfill contracts negotiated with consumers who need reliable energy supplies, despite uncertainties in reservoir inflows. In addition to providing reliable power the reservoir operator needs to take into account environmental factors such as downstream flooding or compliance with minimum flow requirements. From a dynamical systems perspective, the reservoir operating strategy must cope with conflicting objectives in the presence of random disturbances. In order to achieve optimal performance, the reservoir system needs to continually adapt to disturbances in real time. Model Predictive Control (MPC) is a real-time control technique that adapts by deriving the reservoir release at each decision time from the current state of the system. Here an ensemble-based version of MPC (SMPC) is applied to a generic reservoir to determine both the optimal power contract, considering future inflow uncertainty, and a real-time operating strategy that attempts to satisfy the contract. Contract selection and real-time operation are coupled in an optimization framework that also defines a Pareto trade off between the revenue generated from energy production and the environmental damage resulting from uncontrolled reservoir spills. Further insight is provided by a sensitivity analysis of key parameters specified in the SMPC technique. The results demonstrate that SMPC is suitable for multi-objective planning and associated real-time operation of a wide range of hydropower reservoir systems.
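    The ensemble-based step inside SMPC evaluates each candidate release against every inflow scenario and picks the release with the lowest expected cost. A deliberately simplified one-step sketch of that scenario evaluation; the cost weights, capacity, demand, and inflows are all hypothetical, and a real SMPC would optimize over a multi-step horizon:

```python
import numpy as np

def best_release(storage, inflow_ensemble, candidates,
                 capacity=100.0, demand=10.0):
    """Choose the release minimizing expected cost over an inflow
    ensemble, penalizing contract shortfall and uncontrolled spill."""
    def cost(release, inflow):
        shortfall = max(demand - release, 0.0)                      # unmet contract
        spill = max(storage + inflow - release - capacity, 0.0)     # overflow
        return 5.0 * shortfall + 1.0 * spill + 0.1 * release        # last term: value of stored water
    expected = [np.mean([cost(r, q) for q in inflow_ensemble])
                for r in candidates]
    return candidates[int(np.argmin(expected))]
```

    Re-solving this at every decision time with the latest storage state and a fresh inflow ensemble is what gives the controller its adaptive, receding-horizon character.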

  10. A global building inventory for earthquake loss estimation and risk management

    USGS Publications Warehouse

    Jaiswal, K.; Wald, D.; Porter, K.

    2010-01-01

    We develop a global database of building inventories using taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature. © 2010, Earthquake Engineering Research Institute.

  11. Real-time vision, tactile cues, and visual form agnosia: removing haptic feedback from a “natural” grasping task induces pantomime-like grasps

    PubMed Central

    Whitwell, Robert L.; Ganel, Tzvi; Byrne, Caitlin M.; Goodale, Melvyn A.

    2015-01-01

    Investigators study the kinematics of grasping movements (prehension) under a variety of conditions to probe visuomotor function in normal and brain-damaged individuals. “Natural” prehensile acts are directed at the goal object and are executed using real-time vision. Typically, they also entail the use of tactile, proprioceptive, and kinesthetic sources of haptic feedback about the object (“haptics-based object information”) once contact with the object has been made. Natural and simulated (pantomimed) forms of prehension are thought to recruit different cortical structures: patient DF, who has visual form agnosia following bilateral damage to her temporal-occipital cortex, loses her ability to scale her grasp aperture to the size of targets (“grip scaling”) when her prehensile movements are based on a memory of a target previewed 2 s before the cue to respond or when her grasps are directed towards a visible virtual target but she is denied haptics-based information about the target. In the first of two experiments, we show that when DF performs real-time pantomimed grasps towards a 7.5 cm displaced imagined copy of a visible object such that her fingers make contact with the surface of the table, her grip scaling is in fact quite normal. This finding suggests that real-time vision and terminal tactile feedback are sufficient to preserve DF’s grip scaling slopes. In the second experiment, we examined an “unnatural” grasping task variant in which a tangible target (along with any proxy such as the surface of the table) is denied (i.e., no terminal tactile feedback). To do this, we used a mirror-apparatus to present virtual targets with and without a spatially coincident copy for the participants to grasp. 
We compared the grasp kinematics from trials with and without terminal tactile feedback to a real-time-pantomimed grasping task (one without tactile feedback) in which participants visualized a copy of the visible target as instructed in our laboratory in the past. Compared to natural grasps, removing tactile feedback increased RT, slowed the velocity of the reach, reduced in-flight grip aperture, increased the slopes relating grip aperture to target width, and reduced the final grip aperture (FGA). All of these effects were also observed in the real time-pantomime grasping task. These effects seem to be independent of those that arise from using the mirror in general as we also compared grasps directed towards virtual targets to those directed at real ones viewed directly through a pane of glass. These comparisons showed that the grasps directed at virtual targets increased grip aperture, slowed the velocity of the reach, and reduced the slopes relating grip aperture to the widths of the target. Thus, using the mirror has real consequences on grasp kinematics, reflecting the importance of task-relevant sources of online visual information for the programming and updating of natural prehensile movements. Taken together, these results provide compelling support for the view that removing terminal tactile feedback, even when the grasps are target-directed, induces a switch from real-time visual control towards one that depends more on visual perception and cognitive supervision. Providing terminal tactile feedback and real-time visual information can evidently keep the dorsal visuomotor system operating normally for prehensile acts. PMID:25999834

  12. Real-time vision, tactile cues, and visual form agnosia: removing haptic feedback from a "natural" grasping task induces pantomime-like grasps.

    PubMed

    Whitwell, Robert L; Ganel, Tzvi; Byrne, Caitlin M; Goodale, Melvyn A

    2015-01-01

    Investigators study the kinematics of grasping movements (prehension) under a variety of conditions to probe visuomotor function in normal and brain-damaged individuals. "Natural" prehensile acts are directed at the goal object and are executed using real-time vision. Typically, they also entail the use of tactile, proprioceptive, and kinesthetic sources of haptic feedback about the object ("haptics-based object information") once contact with the object has been made. Natural and simulated (pantomimed) forms of prehension are thought to recruit different cortical structures: patient DF, who has visual form agnosia following bilateral damage to her temporal-occipital cortex, loses her ability to scale her grasp aperture to the size of targets ("grip scaling") when her prehensile movements are based on a memory of a target previewed 2 s before the cue to respond or when her grasps are directed towards a visible virtual target but she is denied haptics-based information about the target. In the first of two experiments, we show that when DF performs real-time pantomimed grasps towards a 7.5 cm displaced imagined copy of a visible object such that her fingers make contact with the surface of the table, her grip scaling is in fact quite normal. This finding suggests that real-time vision and terminal tactile feedback are sufficient to preserve DF's grip scaling slopes. In the second experiment, we examined an "unnatural" grasping task variant in which a tangible target (along with any proxy such as the surface of the table) is denied (i.e., no terminal tactile feedback). To do this, we used a mirror-apparatus to present virtual targets with and without a spatially coincident copy for the participants to grasp. 
We compared the grasp kinematics from trials with and without terminal tactile feedback to a real-time-pantomimed grasping task (one without tactile feedback) in which participants visualized a copy of the visible target as instructed in our laboratory in the past. Compared to natural grasps, removing tactile feedback increased RT, slowed the velocity of the reach, reduced in-flight grip aperture, increased the slopes relating grip aperture to target width, and reduced the final grip aperture (FGA). All of these effects were also observed in the real time-pantomime grasping task. These effects seem to be independent of those that arise from using the mirror in general as we also compared grasps directed towards virtual targets to those directed at real ones viewed directly through a pane of glass. These comparisons showed that the grasps directed at virtual targets increased grip aperture, slowed the velocity of the reach, and reduced the slopes relating grip aperture to the widths of the target. Thus, using the mirror has real consequences on grasp kinematics, reflecting the importance of task-relevant sources of online visual information for the programming and updating of natural prehensile movements. Taken together, these results provide compelling support for the view that removing terminal tactile feedback, even when the grasps are target-directed, induces a switch from real-time visual control towards one that depends more on visual perception and cognitive supervision. Providing terminal tactile feedback and real-time visual information can evidently keep the dorsal visuomotor system operating normally for prehensile acts.

  13. Detection of tumor markers in prostate cancer and comparison of sensitivity between real time and nested PCR.

    PubMed

    Matsuoka, Takayuki; Shigemura, Katsumi; Yamamichi, Fukashi; Fujisawa, Masato; Kawabata, Masato; Shirakawa, Toshiro

    2012-06-27

    The objective of this study is to investigate and compare the sensitivity of conventional PCR, quantitative real time PCR, nested PCR, and western blots for detection of prostate cancer tumor markers using prostate cancer (PCa) cells. We performed conventional PCR, quantitative real time PCR, nested PCR, and western blots using 5 kinds of PCa cells. Prostate specific antigen (PSA), prostate specific membrane antigen (PSMA), and androgen receptor (AR) were compared for their detection sensitivity by real time PCR and nested PCR. In real time PCR, there was a significant correlation between cell number and the RNA concentration obtained (R^2 = 0.9944) for PSA, PSMA, and AR. We found it possible to detect these markers from a single LNCaP cell in both real time and nested PCR. By comparison, nested PCR reached a linear curve in fewer PCR cycles than real time PCR, suggesting that nested PCR may offer PCR results more quickly than real time PCR. In conclusion, nested PCR may offer tumor marker detection in PCa cells more quickly (with fewer PCR cycles) with the same high sensitivity as real time PCR. Further study is necessary to establish and evaluate the best tool for PCa tumor marker detection.
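    The R^2 = 0.9944 figure above is the coefficient of determination for a straight-line fit of measured signal against cell number. A minimal sketch of that computation; the data points in the test are illustrative, not the study's measurements:

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination for a straight-line fit of y on x,
    as used to relate cell number to measured RNA signal."""
    slope, intercept = np.polyfit(x, y, 1)
    y = np.asarray(y, dtype=float)
    pred = slope * np.asarray(x, dtype=float) + intercept
    ss_res = float(((y - pred) ** 2).sum())        # residual sum of squares
    ss_tot = float(((y - y.mean()) ** 2).sum())    # total sum of squares
    return 1.0 - ss_res / ss_tot
```

    An R^2 near 1 indicates the assay response is proportional to input cells over the tested range, which is what justifies quantitation from the standard curve.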

  14. Quality standards for real-world research. Focus on observational database studies of comparative effectiveness.

    PubMed

    Roche, Nicolas; Reddel, Helen; Martin, Richard; Brusselle, Guy; Papi, Alberto; Thomas, Mike; Postma, Dirkje; Thomas, Vicky; Rand, Cynthia; Chisholm, Alison; Price, David

    2014-02-01

    Real-world research can use observational or clinical trial designs, in both cases putting emphasis on high external validity, to complement the classical efficacy randomized controlled trials (RCTs) with high internal validity. Real-world research is made necessary by the variety of factors that can play an important role in modulating effectiveness in real life but are often tightly controlled in RCTs, such as comorbidities and concomitant treatments, adherence, inhalation technique, access to care, strength of doctor-caregiver communication, and socio-economic and other organizational factors. Real-world studies belong to two main categories: pragmatic trials and observational studies, which can be prospective or retrospective. Focusing on comparative database observational studies, the process aimed at ensuring high-quality research can be divided into three parts: preparation of research, analyses and reporting, and discussion of results. Key points include a priori planning of data collection and analyses, identification of appropriate database(s), proper outcomes definition, study registration with commitment to publish, bias minimization through matching and adjustment processes accounting for potential confounders, and sensitivity analyses testing the robustness of results. When these conditions are met, observational database studies can reach a sufficient level of evidence to help create guidelines (i.e., inform clinical and regulatory decision-making).

  15. Working memory is not fixed-capacity: More active storage capacity for real-world objects than for simple stimuli

    PubMed Central

    Brady, Timothy F.; Störmer, Viola S.; Alvarez, George A.

    2016-01-01

    Visual working memory is the cognitive system that holds visual information active to make it resistant to interference from new perceptual input. Information about simple stimuli—colors and orientations—is encoded into working memory rapidly: In under 100 ms, working memory “fills up,” revealing a stark capacity limit. However, for real-world objects, the same behavioral limits do not hold: With increasing encoding time, people store more real-world objects and do so with more detail. This boost in performance for real-world objects is generally assumed to reflect the use of a separate episodic long-term memory system, rather than working memory. Here we show that this behavioral increase in capacity with real-world objects is not solely due to the use of separate episodic long-term memory systems. In particular, we show that this increase is a result of active storage in working memory, as shown by directly measuring neural activity during the delay period of a working memory task using EEG. These data challenge fixed-capacity working memory models and demonstrate that working memory and its capacity limitations are dependent upon our existing knowledge. PMID:27325767

  16. Working memory is not fixed-capacity: More active storage capacity for real-world objects than for simple stimuli.

    PubMed

    Brady, Timothy F; Störmer, Viola S; Alvarez, George A

    2016-07-05

    Visual working memory is the cognitive system that holds visual information active to make it resistant to interference from new perceptual input. Information about simple stimuli-colors and orientations-is encoded into working memory rapidly: In under 100 ms, working memory "fills up," revealing a stark capacity limit. However, for real-world objects, the same behavioral limits do not hold: With increasing encoding time, people store more real-world objects and do so with more detail. This boost in performance for real-world objects is generally assumed to reflect the use of a separate episodic long-term memory system, rather than working memory. Here we show that this behavioral increase in capacity with real-world objects is not solely due to the use of separate episodic long-term memory systems. In particular, we show that this increase is a result of active storage in working memory, as shown by directly measuring neural activity during the delay period of a working memory task using EEG. These data challenge fixed-capacity working memory models and demonstrate that working memory and its capacity limitations are dependent upon our existing knowledge.

  17. GENEASE: Real time bioinformatics tool for multi-omics and disease ontology exploration, analysis and visualization.

    PubMed

    Ghandikota, Sudhir; Hershey, Gurjit K Khurana; Mersha, Tesfaye B

    2018-03-24

    Advances in high-throughput sequencing technologies have made it possible to generate multiple omics data at an unprecedented rate and scale. The accumulation of these omics data far outpaces the rate at which biologists can mine them and generate new hypotheses to test experimentally. There is an urgent need to develop powerful tools to efficiently and effectively search and filter these resources to address specific post-GWAS functional genomics questions. However, to date, these resources are scattered across several databases and often lack a unified portal for data annotation and analytics. In addition, existing tools to analyze and visualize these databases are highly fragmented, forcing researchers to access multiple applications and perform manual interventions for each gene or variant in an ad hoc fashion until all their questions are answered. In this study, we present GENEASE, a web-based one-stop bioinformatics tool designed not only to query and explore multi-omics and phenotype databases (e.g., GTEx, ClinVar, dbGaP, GWAS Catalog, ENCODE, Roadmap Epigenomics, KEGG, Reactome, Gene and Phenotype Ontology) in a single web interface but also to perform seamless post genome-wide association downstream functional and overlap analysis for non-coding regulatory variants. GENEASE accesses over 50 different databases in the public domain, including model organism-specific databases, to facilitate gene/variant and disease exploration, enrichment, and overlap analysis in real time. It is a user-friendly tool with a point-and-click interface containing links to support information including a user manual and examples. GENEASE can be accessed freely at http://research.cchmc.org/mershalab/genease_new/login.html. Tesfaye.Mersha@cchmc.org, Sudhir.Ghandikota@cchmc.org. Supplementary data are available at Bioinformatics online.

  18. Real Time Search Algorithm for Observation Outliers During Monitoring Engineering Constructions

    NASA Astrophysics Data System (ADS)

    Latos, Dorota; Kolanowski, Bogdan; Pachelski, Wojciech; Sołoducha, Ryszard

    2017-12-01

    Real time monitoring of engineering structures in case of an emergency or disaster requires collection of a large amount of data to be processed by specific analytical techniques. A quick and accurate assessment of the state of the object is crucial for a probable rescue action. One of the more significant methods for evaluating large sets of data, whether collected during a specified interval of time or permanently, is time series analysis. This paper presents a search algorithm for those time series elements which deviate from their values expected during monitoring. Quick and proper detection of observations indicating anomalous behavior of the structure allows a variety of preventive actions to be taken. The mathematical formulae used in the algorithm provide maximal sensitivity, detecting even minimal changes in the object's behavior. Sensitivity analyses were conducted for the moving average algorithm as well as for the Douglas-Peucker algorithm used in generalization of linear objects in GIS. In addition to determining the size of deviations from the average, the so-called Hausdorff distance was used. Simulations and verification against laboratory survey data showed that the approach provides sufficient sensitivity for automatic real time analysis of large amounts of data obtained from different and various sensors (total stations, leveling, cameras, radar).
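    The moving-average screening step described above can be sketched as flagging any reading that deviates from the trailing window's mean by more than a multiple of the window's standard deviation. The window size and threshold below are hypothetical, not the paper's tuned values:

```python
import numpy as np

def flag_outliers(series, window=5, k=3.0):
    """Flag readings that deviate from the trailing moving average by
    more than k times the trailing standard deviation -- a minimal
    version of moving-average outlier screening for monitoring data."""
    series = np.asarray(series, dtype=float)
    flags = np.zeros(len(series), dtype=bool)
    for i in range(window, len(series)):
        past = series[i - window:i]          # trailing window, excludes current reading
        mu, sigma = past.mean(), past.std()
        if sigma > 0 and abs(series[i] - mu) > k * sigma:
            flags[i] = True
    return flags
```

    Using only past readings (not the current one) in the window keeps the detector causal, so it can run in real time as observations arrive from the sensors.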

  19. Design of a two-level power system linear state estimator

    NASA Astrophysics Data System (ADS)

    Yang, Tao

    The availability of synchro-phasor data has raised the possibility of a linear state estimator if the inputs are only complex currents and voltages and if there are enough such measurements to meet observability and redundancy requirements. Moreover, the new digital substations can perform some of the computation at the substation itself resulting in a more accurate two-level state estimator. The objective of this research is to develop a two-level linear state estimator processing synchro-phasor data and estimating the states at both the substation level and the control center level. Both the mathematical algorithms that are different from those in the present state estimation procedure and the layered architecture of databases, communications and application programs that are required to support this two-level linear state estimator are described in this dissertation. Besides, as the availability of phasor measurements at substations will increase gradually, this research also describes how the state estimator can be enhanced to handle both the traditional state estimator and the proposed linear state estimator simultaneously. This provides a way to immediately utilize the benefits in those parts of the system where such phasor measurements become available and provides a pathway to transition to the smart grid of the future. The design procedure of the two-level state estimator is applied to two study systems. The first study system is the IEEE-14 bus system. The second one is the 179 bus Western Electricity Coordinating Council (WECC) system. The static database for the substations is constructed from the power flow data of these systems and the real-time measurement database is produced by a power system dynamic simulating tool (TSAT). Time-skew problems that may be caused by communication delays are also considered and simulated. We used the Network Simulator (NS) tool to simulate a simple communication system and analyse its time delay performance. 
These time delays were too small to affect the results, especially since the measurement data is time-stamped and the state estimator for these small systems could be run with subsecond frequency. Keywords: State Estimation, Synchro-Phasor Measurement, Distributed System, Energy Control Center, Substation, Time-skew
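    The key property exploited in this work is that with synchro-phasor inputs the measurement model is linear in the state, so the estimate is a single weighted least-squares solve with no Newton iteration. A minimal sketch (the H matrix, measurements, and weights here are illustrative, not a real network model; complex phasors are also omitted for brevity):

```python
import numpy as np

def linear_state_estimate(H, z, weights):
    """Weighted least-squares state estimate:
        x = (H^T W H)^{-1} H^T W z
    With phasor measurements, z (bus voltages / branch currents) is
    linear in the state x, so one linear solve suffices."""
    W = np.diag(weights)
    A = H.T @ W @ H          # gain matrix
    b = H.T @ W @ z
    return np.linalg.solve(A, b)
```

    In the two-level architecture described above, a solve of this form can run first at each substation over local measurements, with the control-center estimator then combining the substation results.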

  20. On-line interactive virtual experiments on nanoscience

    NASA Astrophysics Data System (ADS)

    Kadar, Manuella; Ileana, Ioan; Hutanu, Constantin

    2009-01-01

    This paper is an overview of the next-generation web that allows students to carry out virtual experiments on nanoscience, physics devices, processes, and processing equipment. Virtual reality is used to support a real university lab in which a student can work through real lab sessions. The web material is presented in an intuitive and highly visual 3D form that is accessible to a diverse group of students. This type of laboratory provides opportunities for professional and practical education for a wide range of users. The expensive equipment and apparatuses that make up the experimental stage in a standard laboratory are used to create virtual educational research laboratories. Students learn how to prepare the apparatuses and facilities for the experiment. The online experiments metadata schema is the format for describing online experiments, much like the schema behind a library catalogue used to describe the books in a library. As an online experiment is a special kind of learning object, its schema is specified as an extension to an established metadata schema for learning objects. The content of the courses, meta-information, readings, and user data are saved on the server in a database as XML objects.

  1. Nonlinear Real-Time Optical Signal Processing.

    DTIC Science & Technology

    1981-06-30

    bandwidth and space-bandwidth products. Real-time homomorphic and logarithmic filtering by halftone nonlinear processing has been achieved. A detailed analysis of degradation due to the finite gamma

  2. Computation offloading for real-time health-monitoring devices.

    PubMed

    Kalantarian, Haik; Sideris, Costas; Tuan Le; Hosseini, Anahita; Sarrafzadeh, Majid

    2016-08-01

    Among the major challenges in the development of real-time wearable health monitoring systems is to optimize battery life. One of the major techniques with which this objective can be achieved is computation offloading, in which portions of computation can be partitioned between the device and other resources such as a server or cloud. In this paper, we describe a novel dynamic computation offloading scheme for real-time wearable health monitoring devices that adjusts the partitioning of data between the wearable device and mobile application as a function of desired classification accuracy.
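The partitioning decision described above can be sketched as a small optimization: among candidate split points, choose the one with the lowest energy cost that still meets the desired classification accuracy. The candidate list, energy costs, and accuracies below are hypothetical, not values from the paper.

```python
# Illustrative sketch of a dynamic offloading decision: given candidate
# partition points with (energy cost, accuracy) estimates, pick the
# cheapest partition meeting the desired classification accuracy.

def choose_partition(partitions, min_accuracy):
    """partitions: list of (name, energy_mJ, accuracy) tuples."""
    feasible = [p for p in partitions if p[2] >= min_accuracy]
    if not feasible:
        # No split meets the target; fall back to the most accurate.
        return max(partitions, key=lambda p: p[2])
    return min(feasible, key=lambda p: p[1])

candidates = [
    ("all-local",    12.0, 0.95),
    ("split-at-fft",  7.5, 0.92),
    ("all-offload",   4.0, 0.85),  # radio cost only; server does the rest
]
print(choose_partition(candidates, 0.90)[0])  # -> split-at-fft
```

In a running system the (energy, accuracy) estimates would be refreshed from measurements, so the chosen split point can change as battery state and link quality change.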

  3. Use of a prototype pulse oximeter for time series analysis of heart rate variability

    NASA Astrophysics Data System (ADS)

    González, Erika; López, Jehú; Hautefeuille, Mathieu; Velázquez, Víctor; Del Moral, Jésica

    2015-05-01

    This work presents the development of a low cost pulse oximeter prototype consisting of pulsed red and infrared commercial LEDs and a broad spectral photodetector used to register time series of heart rate and oxygen saturation of blood. This platform, besides providing these values, like any other pulse oximeter, processes the signals to compute a power spectrum analysis of the patient heart rate variability in real time and, additionally, the device allows access to all raw and analyzed data if databases construction is required or another kind of further analysis is desired. Since the prototype is capable of acquiring data for long periods of time, it is suitable for collecting data in real life activities, enabling the development of future wearable applications.
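The power-spectrum step mentioned above has a standard shape: take the beat-to-beat (RR) intervals, resample them onto an even time grid, and compute a periodogram. The sketch below uses a synthetic RR series (mean 0.8 s with a 0.1 Hz low-frequency oscillation) in place of real oximeter output; the sampling choices are assumptions, not the prototype's.

```python
import numpy as np

# Sketch of heart-rate-variability spectral analysis from RR intervals.
t_beats = np.cumsum(np.full(300, 0.8))                # beat times, s
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * t_beats)   # RR intervals, s

fs = 4.0                                              # resample rate, Hz
t_even = np.arange(t_beats[0], t_beats[-1], 1.0 / fs)
rr_even = np.interp(t_even, t_beats, rr)              # even time grid
rr_even -= rr_even.mean()                             # remove DC

psd = np.abs(np.fft.rfft(rr_even)) ** 2 / len(rr_even)
freqs = np.fft.rfftfreq(len(rr_even), 1.0 / fs)

peak = freqs[np.argmax(psd)]   # dominant HRV frequency, near 0.10 Hz
```

For real-time use the same computation would run on a sliding window of the most recent beats rather than the whole record.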

  4. The hologram as a space of illusion

    NASA Astrophysics Data System (ADS)

    Oliveira, Rosa M.

    2013-03-01

    One of the most interesting aspects of art holography is the study of 3D holographic image. Over the centuries, artists have chased the best way to represent the third dimension as similar to reality as possible. Several steps have been given in this direction, first using perspective, then photography, and later with movies, but all of these representations of reality wouldn't reach the complete objective. The realism of a 3D representation on a 2D support (paper, canvas, celluloid) is completely overcome by holography. In spite of the fact that the holographic plate or film is also a 2D support, the holographic image is a recording of all the information of the object contained in light. Our perception doesn't need to translate the object as real. It is real. Though immaterial, the holographic image is real because it exists in light. The same parallax, the same shape. The representation is no more an imitation of reality but a replacement of the real object or scene. The space where it exists is a space of illusion and multiple objects can occupy the same place in the hologram, depending on the viewer's time and place. This introduces the fourth dimension in the hologram: time, as well as the apparent conflict between the presence and the absence of images, which is just possible in holography.

  5. CPU-GPU mixed implementation of virtual node method for real-time interactive cutting of deformable objects using OpenCL.

    PubMed

    Jia, Shiyu; Zhang, Weizhong; Yu, Xiaokang; Pan, Zhenkuan

    2015-09-01

    Surgical simulators need to simulate interactive cutting of deformable objects in real time. The goal of this work was to design an interactive cutting algorithm that eliminates traditional cutting state classification and can work simultaneously with real-time GPU-accelerated deformation without affecting its numerical stability. A modified virtual node method for cutting is proposed. Deformable object is modeled as a real tetrahedral mesh embedded in a virtual tetrahedral mesh, and the former is used for graphics rendering and collision, while the latter is used for deformation. Cutting algorithm first subdivides real tetrahedrons to eliminate all face and edge intersections, then splits faces, edges and vertices along cutting tool trajectory to form cut surfaces. Next virtual tetrahedrons containing more than one connected real tetrahedral fragments are duplicated, and connectivity between virtual tetrahedrons is updated. Finally, embedding relationship between real and virtual tetrahedral meshes is updated. Co-rotational linear finite element method is used for deformation. Cutting and collision are processed by CPU, while deformation is carried out by GPU using OpenCL. Efficiency of GPU-accelerated deformation algorithm was tested using block models with varying numbers of tetrahedrons. Effectiveness of our cutting algorithm under multiple cuts and self-intersecting cuts was tested using a block model and a cylinder model. Cutting of a more complex liver model was performed, and detailed performance characteristics of cutting, deformation and collision were measured and analyzed. Our cutting algorithm can produce continuous cut surfaces when traditional minimal element creation algorithm fails. Our GPU-accelerated deformation algorithm remains stable with constant time step under multiple arbitrary cuts and works on both NVIDIA and AMD GPUs. GPU-CPU speed ratio can be as high as 10 for models with 80,000 tetrahedrons. 
Forty to sixty percent real-time performance and 100-200 Hz simulation rate are achieved for the liver model with 3,101 tetrahedrons. Major bottlenecks for simulation efficiency are cutting, collision processing and CPU-GPU data transfer. Future work needs to improve on these areas.

  6. Expert database system for quality control

    NASA Astrophysics Data System (ADS)

    Wang, Anne J.; Li, Zhi-Cheng

    1993-09-01

    There are more competitors today. Markets are not homogeneous; they are fragmented into increasingly focused niches requiring greater flexibility in the product mix, shorter manufacturing production runs, and above all, higher quality. In this paper the authors identify a real-time expert system as a way to improve plantwide quality management. The quality control expert database system (QCEDS), by integrating knowledge of experts in operations, quality management, and computer systems, uses all information relevant to quality management (facts as well as rules) to determine if a product meets quality standards. Keywords: expert system, quality control, database

  7. Identification and human condition analysis based on the human voice analysis

    NASA Astrophysics Data System (ADS)

    Mieshkov, Oleksandr Yu.; Novikov, Oleksandr O.; Novikov, Vsevolod O.; Fainzilberg, Leonid S.; Kotyra, Andrzej; Smailova, Saule; Kozbekova, Ainur; Imanbek, Baglan

    2017-08-01

    The paper presents a two-stage biotechnical system for human condition analysis that is based on analysis of human voice signal. At the initial stage, the voice signal is pre-processed and its characteristics in time domain are determined. At the first stage, the developed system is capable of identifying the person in the database on the basis of the extracted characteristics. At the second stage, the model of a human voice is built on the basis of the real voice signals after clustering the whole database.

  8. Developing a database for pedestrians' earthquake emergency evacuation in indoor scenarios.

    PubMed

    Zhou, Junxue; Li, Sha; Nie, Gaozhong; Fan, Xiwei; Tan, Jinxian; Li, Huayue; Pang, Xiaoke

    2018-01-01

    With the booming development of evacuation simulation software, developing an extensive database in indoor scenarios for evacuation models is imperative. In this paper, we conduct a qualitative and quantitative analysis of the collected videotapes and aim to provide a complete and unitary database of pedestrians' earthquake emergency response behaviors in indoor scenarios, including human-environment interactions. Using the qualitative analysis method, we extract keyword groups and keywords that code the response modes of pedestrians and construct a general decision flowchart using chronological organization. Using the quantitative analysis method, we analyze data on the delay time, evacuation speed, evacuation route and emergency exit choices. Furthermore, we study the effect of classroom layout on emergency evacuation. The database for indoor scenarios provides reliable input parameters and allows the construction of real and effective constraints for use in software and mathematical models. The database can also be used to validate the accuracy of evacuation models.

  9. Graphical user interface concepts for tactical augmented reality

    NASA Astrophysics Data System (ADS)

    Argenta, Chris; Murphy, Anne; Hinton, Jeremy; Cook, James; Sherrill, Todd; Snarski, Steve

    2010-04-01

    Applied Research Associates and BAE Systems are working together to develop a wearable augmented reality system under the DARPA ULTRA-Vis program†. Our approach to achieve the objectives of ULTRAVis, called iLeader, incorporates a full color 40° field of view (FOV) see-thru holographic waveguide integrated with sensors for full position and head tracking to provide an unobtrusive information system for operational maneuvers. iLeader will enable warfighters to mark-up the 3D battle-space with symbologic identification of graphical control measures, friendly force positions and enemy/target locations. Our augmented reality display provides dynamic real-time painting of symbols on real objects, a pose-sensitive 360° representation of relevant object positions, and visual feedback for a variety of system activities. The iLeader user interface and situational awareness graphical representations are highly intuitive, nondisruptive, and always tactically relevant. We used best human-factors practices, system engineering expertise, and cognitive task analysis to design effective strategies for presenting real-time situational awareness to the military user without distorting their natural senses and perception. We present requirements identified for presenting information within a see-through display in combat environments, challenges in designing suitable visualization capabilities, and solutions that enable us to bring real-time iconic command and control to the tactical user community.

  10. Accounting Data to Web Interface Using PERL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hargeaves, C

    2001-08-13

    This document will explain the process to create a web interface for the accounting information generated by the High Performance Storage Systems (HPSS) accounting report feature. The accounting report contains useful data but it is not easily accessed in a meaningful way. The accounting report is the only way to see summarized storage usage information. The first step is to take the accounting data, make it meaningful and store the modified data in persistent databases. The second step is to generate the various user interfaces, HTML pages, that will be used to access the data. The third step is to transfer all required files to the web server. The web pages pass parameters to Common Gateway Interface (CGI) scripts that generate dynamic web pages and graphs. The end result is a web page with specific information presented in text with or without graphs. The accounting report has a specific format that allows the use of regular expressions to verify if a line is storage data. Each storage data line is stored in a detailed database file with a name that includes the run date. The detailed database is used to create a summarized database file that also uses run date in its name. The summarized database is used to create the group.html web page that includes a list of all storage users. Scripts that query the database folder to build a list of available databases generate two additional web pages. A master script that is run monthly as part of a cron job, after the accounting report has completed, manages all of these individual scripts. All scripts are written in the PERL programming language. Whenever possible data manipulation scripts are written as filters. All scripts are written to be single source, which means they will function properly on both the open and closed networks at LLNL. The master script handles the command line inputs for all scripts, file transfers to the web server and records run information in a log file.
The rest of the scripts manipulate the accounting data or use the files created to generate HTML pages. Each script will be described in detail herein. The following is a brief description of HPSS taken directly from an HPSS web site. "HPSS is a major development project, which began in 1993 as a Cooperative Research and Development Agreement (CRADA) between government and industry. The primary objective of HPSS is to move very large data objects between high performance computers, workstation clusters, and storage libraries at speeds many times faster than is possible with today's software systems. For example, HPSS can manage parallel data transfers from multiple network-connected disk arrays at rates greater than 1 Gbyte per second, making it possible to access high definition digitized video in real time." The HPSS accounting report is a canned report whose format is controlled by the HPSS developers.
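The filter pattern described above (a regular expression deciding whether a line is storage data, with detail lines then rolled up into a summary database) can be sketched as follows. Python is used here rather than the project's Perl, and the report line format is invented for illustration; the real HPSS accounting layout differs.

```python
import re
from collections import defaultdict

# Match storage-data lines only; headers and footers fail the pattern.
DETAIL_RE = re.compile(r"^(?P<group>\w+)\s+(?P<files>\d+)\s+(?P<bytes>\d+)$")

report = """\
HPSS Accounting Report 2001-08-13
physics   120  5000000
physics    30  1500000
biology    10   200000
End of report
"""

# Roll per-line detail up into a per-group [files, bytes] summary.
summary = defaultdict(lambda: [0, 0])
for line in report.splitlines():
    m = DETAIL_RE.match(line)
    if m:  # only storage-data lines pass the filter
        summary[m["group"]][0] += int(m["files"])
        summary[m["group"]][1] += int(m["bytes"])

print(dict(summary))  # {'physics': [150, 6500000], 'biology': [10, 200000]}
```

Written as a filter (stdin to stdout), the same logic composes with the other scripts in a pipeline, which matches the single-source design goal described in the record.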

  11. Real-Time Motion Tracking for Indoor Moving Sphere Objects with a LiDAR Sensor.

    PubMed

    Huang, Lvwen; Chen, Siyuan; Zhang, Jianfeng; Cheng, Bang; Liu, Mingqing

    2017-08-23

    Object tracking is a crucial research subfield in computer vision with wide applications in navigation, robotics, military systems and so on. In this paper, real-time visualization of 3-D point cloud data from the VLP-16 3D Light Detection and Ranging (LiDAR) sensor is achieved, and on the basis of preprocessing, fast ground segmentation, Euclidean clustering segmentation of outliers, View Feature Histogram (VFH) feature extraction, object model construction and search-based matching of a moving spherical target, the Kalman filter and an adaptive particle filter are used to estimate the position of the moving spherical target in real time. The experimental results show that the Kalman filter has the advantage of high efficiency while the adaptive particle filter has the advantages of high robustness and high precision when tested and validated on three kinds of scenes under conditions of partial target occlusion and interference, different moving speeds and different trajectories. The research can be applied to fruit identification and tracking in natural environments, robot navigation and control, and other fields.
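The Kalman filtering step used for position estimation can be sketched with a constant-velocity model reduced to one axis. The time step, noise levels and target speed below are illustrative assumptions, not the paper's values.

```python
import numpy as np

# Constant-velocity Kalman filter, one axis: state is [position, velocity].
dt = 0.1
F = np.array([[1, dt], [0, 1]])       # state transition
H = np.array([[1.0, 0.0]])            # we observe position only
Q = 1e-4 * np.eye(2)                  # process noise covariance
R = np.array([[1e-2]])                # measurement noise covariance

x = np.array([0.0, 0.0])              # initial state estimate
P = np.eye(2)                         # initial state covariance

rng = np.random.default_rng(1)
true_pos = 0.0
for _ in range(100):
    true_pos += 1.0 * dt              # target moves at 1 m/s
    z = true_pos + 0.1 * rng.standard_normal()   # noisy position fix

    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x = x + K @ (np.array([z]) - H @ x)
    P = (np.eye(2) - K @ H) @ P

print(x)  # estimated [position, velocity]; velocity converges near 1.0
```

The particle filter trades this closed-form update for a sampled posterior, which is what buys the robustness to occlusion reported above at a higher computational cost.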

  12. A Scalable Distributed Approach to Mobile Robot Vision

    NASA Technical Reports Server (NTRS)

    Kuipers, Benjamin; Browning, Robert L.; Gribble, William S.

    1997-01-01

    This paper documents our progress during the first year of work on our original proposal entitled 'A Scalable Distributed Approach to Mobile Robot Vision'. We are pursuing a strategy for real-time visual identification and tracking of complex objects which does not rely on specialized image-processing hardware. In this system perceptual schemas represent objects as a graph of primitive features. Distributed software agents identify and track these features, using variable-geometry image subwindows of limited size. Active control of imaging parameters and selective processing makes simultaneous real-time tracking of many primitive features tractable. Perceptual schemas operate independently from the tracking of primitive features, so that real-time tracking of a set of image features is not hurt by latency in recognition of the object that those features make up. The architecture allows semantically significant features to be tracked with limited expenditure of computational resources, and allows the visual computation to be distributed across a network of processors. Early experiments are described which demonstrate the usefulness of this formulation, followed by a brief overview of our more recent progress (after the first year).

  13. Real-Time Motion Tracking for Indoor Moving Sphere Objects with a LiDAR Sensor

    PubMed Central

    Chen, Siyuan; Zhang, Jianfeng; Cheng, Bang; Liu, Mingqing

    2017-01-01

    Object tracking is a crucial research subfield in computer vision with wide applications in navigation, robotics, military systems and so on. In this paper, real-time visualization of 3-D point cloud data from the VLP-16 3D Light Detection and Ranging (LiDAR) sensor is achieved, and on the basis of preprocessing, fast ground segmentation, Euclidean clustering segmentation of outliers, View Feature Histogram (VFH) feature extraction, object model construction and search-based matching of a moving spherical target, the Kalman filter and an adaptive particle filter are used to estimate the position of the moving spherical target in real time. The experimental results show that the Kalman filter has the advantage of high efficiency while the adaptive particle filter has the advantages of high robustness and high precision when tested and validated on three kinds of scenes under conditions of partial target occlusion and interference, different moving speeds and different trajectories. The research can be applied to fruit identification and tracking in natural environments, robot navigation and control, and other fields. PMID:28832520

  14. Object tracking mask-based NLUT on GPUs for real-time generation of holographic videos of three-dimensional scenes.

    PubMed

    Kwon, M-W; Kim, S-C; Yoon, S-E; Ho, Y-S; Kim, E-S

    2015-02-09

    A new object tracking mask-based novel-look-up-table (OTM-NLUT) method is proposed and implemented on graphics-processing-units (GPUs) for real-time generation of holographic videos of three-dimensional (3-D) scenes. Since the proposed method is designed to be matched with software and memory structures of the GPU, the number of compute-unified-device-architecture (CUDA) kernel function calls and the computer-generated hologram (CGH) buffer size of the proposed method have been significantly reduced. It therefore results in a great increase of the computational speed of the proposed method and enables real-time generation of CGH patterns of 3-D scenes. Experimental results show that the proposed method can generate 31.1 frames of Fresnel CGH patterns with 1,920 × 1,080 pixels per second, on average, for three test 3-D video scenarios with 12,666 object points on three GPU boards of NVIDIA GTX TITAN, and confirm the feasibility of the proposed method in the practical application of electro-holographic 3-D displays.
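The core computation that look-up-table methods accelerate is the point-source summation: each 3-D object point contributes a Fresnel zone pattern to the hologram plane. The sketch below shows that summation directly (no look-up table) on a tiny grid; the wavelength, pixel pitch and object points are assumed values, and a real display would use 1,920 × 1,080 pixels and thousands of points.

```python
import numpy as np

# Direct point-source CGH summation (the baseline an NLUT speeds up).
wavelength = 532e-9            # m (green laser, assumed)
k = 2 * np.pi / wavelength
pitch = 8e-6                   # hologram pixel pitch, m
N = 64                         # hologram is N x N pixels

xs = (np.arange(N) - N / 2) * pitch
X, Y = np.meshgrid(xs, xs)

# Object points: (x, y, z, amplitude); z is distance to hologram plane.
points = [(0.0, 0.0, 0.05, 1.0), (1e-4, 5e-5, 0.06, 0.8)]

field = np.zeros((N, N), dtype=complex)
for px, py, pz, a in points:
    r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
    field += a * np.exp(1j * k * r) / r   # spherical wave from the point

hologram = np.angle(field)     # phase-only CGH pattern
```

Because this sum is embarrassingly parallel over hologram pixels, it maps naturally onto CUDA kernels, which is why kernel-call count and buffer size dominate the GPU performance discussion above.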

  15. Face-to-Face vs. Real-Time Clinical Education: No Significant Difference

    ERIC Educational Resources Information Center

    Mohammed, Y. Q.; Waddington, G.; Donnan, P.

    2007-01-01

    The main objective of this pilot research project was to determine whether the use of an internet broadband link to stream physiotherapy clinical education workshop proceedings in "real-time" is of equivalent educational value to the traditional face-to-face experience. This project looked at the benefits of using the above technology as…

  16. Development of a methodology to measure the effect of ergot alkaloids on forestomach motility using real-time wireless telemetry

    USDA-ARS?s Scientific Manuscript database

    The objectives of these experiments were to characterize rumen motility patterns of cattle fed once daily using a real-time wireless telemetry system, determine when to measure rumen motility with this system, and determine the effect of ruminal dosing of ergot alkaloids on rumen motility. Ruminally...

  17. Integrating Evidence-Based Practice into a Therapeutic Exercise Course: Real-Time Patient Experience

    ERIC Educational Resources Information Center

    Popp, Jennifer K.

    2014-01-01

    Athletic training students need real-time patient experiences in order to transfer the knowledge and skills learned in the classroom into clinical practice. The objective is to present a description of an assignment that could be incorporated into a therapeutic exercise course giving the student an opportunity to evaluate a patient, design a…

  18. Real-Time Imaging of Plant Cell Wall Structure at Nanometer Scale, with Respect to Cellulase Accessibility and Degradation Kinetics (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, S. Y.

    Presentation on real-time imaging of plant cell wall structure at nanometer scale. Objectives are to develop tools to measure biomass at the nanometer scale; elucidate the molecular bases of biomass deconstruction; and identify factors that affect the conversion efficiency of biomass-to-biofuels.

  19. Astronomical data analysis software and systems I; Proceedings of the 1st Annual Conference, Tucson, AZ, Nov. 6-8, 1991

    NASA Technical Reports Server (NTRS)

    Worrall, Diana M. (Editor); Biemesderfer, Chris (Editor); Barnes, Jeannette (Editor)

    1992-01-01

    Consideration is given to a definition of a distribution format for X-ray data, the Einstein on-line system, the NASA/IPAC extragalactic database, COBE astronomical databases, Cosmic Background Explorer astronomical databases, the ADAM software environment, the Groningen Image Processing System, the search for a common data model for astronomical data analysis systems, deconvolution for real and synthetic apertures, pitfalls in image reconstruction, a direct method for spectral and image restoration, and a description of a Poisson imagery super-resolution algorithm. Also discussed are multivariate statistics on HI and IRAS images, faint object classification using neural networks, a matched filter for improving SNR of radio maps, automated aperture photometry of CCD images, an interactive graphics interpreter, the ROSAT extreme ultra-violet sky survey, a quantitative study of optimal extraction, an automated analysis of spectra, applications of synthetic photometry, an algorithm for extra-solar planet system detection and data reduction facilities for the William Herschel telescope.

  20. The Rondonia Lightning Detection Network: Network Description, Science Objectives, Data Processing Archival/Methodology, and Results

    NASA Technical Reports Server (NTRS)

    Blakeslee, R. J.; Bailey, J. C.; Pinto, O.; Athayde, A.; Renno, N.; Weidman, C. D.

    2003-01-01

    A four station Advanced Lightning Direction Finder (ALDF) network was established in the state of Rondonia in western Brazil in 1999 through a collaboration of U.S. and Brazilian participants from NASA, INPE, INMET, and various universities. The network utilizes ALDF IMPACT (Improved Accuracy from Combined Technology) sensors to provide cloud-to-ground lightning observations (i.e., stroke/flash locations, signal amplitude, and polarity) using both time-of-arrival and magnetic direction finding techniques. The observations are collected, processed and archived at a central site in Brasilia and at the NASA/Marshall Space Flight Center in Huntsville, Alabama. Initial, non-quality assured quick-look results are made available in near real-time over the Internet. The network, which is still operational, was deployed to provide ground truth data for the Lightning Imaging Sensor (LIS) on the Tropical Rainfall Measuring Mission (TRMM) satellite that was launched in November 1997. The measurements are also being used to investigate the relationship between the electrical, microphysical and kinematic properties of tropical convection. In addition, the long-time series observations produced by this network will help establish a regional lightning climatological database, supplementing other databases in Brazil that already exist or may soon be implemented. Analytic inversion algorithms developed at the NASA/Marshall Space Flight Center have been applied to the Rondonian ALDF lightning observations to obtain site error corrections and improved location retrievals. The data will also be corrected for the network detection efficiency. The processing methodology and the results from the analysis of four years of network operations will be presented.
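The time-of-arrival principle behind the location retrievals can be illustrated with a toy inversion: given arrival times at several stations, find the stroke position that makes the residuals consistent. A coarse grid search stands in here for the analytic inversion the record describes, and the flat-plane kilometer coordinates and station layout are invented for illustration.

```python
import numpy as np

# Toy time-of-arrival (TOA) source location with four stations.
c = 300.0  # propagation speed, km/ms

stations = np.array([[0, 0], [100, 0], [0, 100], [100, 100]], float)
src = np.array([40.0, 70.0])  # true stroke position (to be recovered)
t0 = 0.5                      # unknown emission time, ms
toa = t0 + np.linalg.norm(stations - src, axis=1) / c

best, best_err = None, np.inf
for x in np.arange(0.0, 101.0, 1.0):
    for y in np.arange(0.0, 101.0, 1.0):
        d = np.linalg.norm(stations - [x, y], axis=1) / c
        # De-meaning eliminates the unknown emission time t0.
        resid = (toa - d) - (toa - d).mean()
        err = np.sum(resid ** 2)
        if err < best_err:
            best, best_err = (x, y), err

print(best)  # recovers the true position (40.0, 70.0)
```

Site error corrections, as mentioned above, amount to estimating and removing systematic per-station timing and bearing biases before this inversion is run.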

  1. Unified Database Development Program. Final Report.

    ERIC Educational Resources Information Center

    Thomas, Everett L., Jr.; Deem, Robert N.

    The objective of the unified database (UDB) program was to develop an automated information system that would be useful in the design, development, testing, and support of new Air Force aircraft weapon systems. Primary emphasis was on the development of: (1) a historical logistics data repository system to provide convenient and timely access to…

  2. Real-time bus arrival information systems return-on-investment study.

    DOT National Transportation Integrated Search

    2006-08-01

    This reference document was prepared for the Office of Research, Demonstration and Innovation of the Federal Transit Administration (FTA). The objectives of the study include developing a methodology for determining the return on investment of real-t...

  3. Measuring Quality of Healthcare Outcomes in Type 2 Diabetes from Routine Data: a Seven-nation Survey Conducted by the IMIA Primary Health Care Working Group.

    PubMed

    Hinton, W; Liyanage, H; McGovern, A; Liaw, S-T; Kuziemsky, C; Munro, N; de Lusignan, S

    2017-08-01

    Background: The Institute of Medicine framework defines six dimensions of quality for healthcare systems: (1) safety, (2) effectiveness, (3) patient centeredness, (4) timeliness of care, (5) efficiency, and (6) equity. Large health datasets provide an opportunity to assess quality in these areas. Objective: To perform an international comparison of the measurability of the delivery of these aims, in people with type 2 diabetes mellitus (T2DM) from large datasets. Method: We conducted a survey to assess healthcare outcomes data quality of existing databases and disseminated this through professional networks. We examined the data sources used to collect the data, frequency of data uploads, and data types used for identifying people with T2DM. We compared data completeness across the six areas of healthcare quality, using selected measures pertinent to T2DM management. Results: We received 14 responses from seven countries (Australia, Canada, Italy, the Netherlands, Norway, Portugal, Turkey and the UK). Most databases reported frequent data uploads and would be capable of near real time analysis of healthcare quality. The majority of recorded data related to safety (particularly medication adverse events) and treatment efficacy (glycaemic control and microvascular disease). Data potentially measuring equity was less well recorded. Recording levels were lowest for patient-centred care, timeliness of care, and system efficiency, with the majority of databases containing no data in these areas. Databases using primary care sources had higher data quality across all areas measured. Conclusion: Data quality could be improved particularly in the areas of patient-centred care, timeliness, and efficiency. Primary care derived datasets may be most suited to healthcare quality assessment. Georg Thieme Verlag KG Stuttgart.

  4. Real-life compliance and persistence among users of subcutaneous and sublingual allergen immunotherapy.

    PubMed

    Kiel, Menno A; Röder, Esther; Gerth van Wijk, Roy; Al, Maiwenn J; Hop, Wim C J; Rutten-van Mölken, Maureen P M H

    2013-08-01

    Subcutaneous allergen immunotherapy (SCIT) and sublingual allergen immunotherapy (SLIT) are safe and effective treatments of allergic rhinitis, but high levels of compliance and persistence are crucial to achieving the desired clinical effects. Our objective was to assess levels and predictors of compliance and persistence among grass pollen, tree pollen, and house dust mite immunotherapy users in real life and to estimate the costs of premature discontinuation. We performed a retrospective analysis of a community pharmacy database from The Netherlands containing data from 6486 patients starting immunotherapy for 1 or more of the allergens of interest between 1994 and 2009. Two thousand seven hundred ninety-six patients received SCIT, and 3690 received SLIT. Time to treatment discontinuation was analyzed using Cox proportional hazards models with time-dependent covariates, where appropriate. Overall, only 18% of users reached the minimally required duration of treatment of 3 years (SCIT, 23%; SLIT, 7%). Median durations for SCIT and SLIT users were 1.7 and 0.6 years, respectively (P < .001). Other independent predictors of premature discontinuation were prescriber, with patients of general practitioners demonstrating longer persistence than those of allergologists and other medical specialists; single-allergen immunotherapy; lower socioeconomic status; and younger age. Of the persistent patients, 56% were never late in picking up their medication from the pharmacy. Direct medication costs per nonpersistent patient discontinuing in the third year of treatment were €3800, an amount that was largely misspent. Real-life persistence is better in SCIT users than in SLIT users, although it is low overall. There is an urgent need for further identification of potential barriers and measures that will enhance persistence and compliance. Copyright © 2013 American Academy of Allergy, Asthma & Immunology. Published by Mosby, Inc. All rights reserved.

  5. Real-time data for estimating a forward-looking interest rate rule of the ECB.

    PubMed

    Bletzinger, Tilman; Wieland, Volker

    2017-12-01

    The purpose of the data presented in this article is to use it in ex post estimations of interest rate decisions by the European Central Bank (ECB), as it is done by Bletzinger and Wieland (2017) [1]. The data is of quarterly frequency from 1999 Q1 until 2013 Q2 and consists of the ECB's policy rate, inflation rate, real output growth and potential output growth in the euro area. To account for forward-looking decision making in the interest rate rule, the data consists of expectations about future inflation and output dynamics. While potential output is constructed based on data from the European Commission's annual macro-economic database, inflation and real output growth are taken from two different sources both provided by the ECB: the Survey of Professional Forecasters and projections made by ECB staff. Careful attention was given to the publication date of the collected data to ensure a real-time dataset only consisting of information which was available to the decision makers at the time of the decision.

  6. The ARTEMIS European driving cycles for measuring car pollutant emissions.

    PubMed

    André, Michel

    2004-12-01

    Over the past 10 years, a variety of work has been undertaken to collect data on the actual driving of European cars and to derive representative real-world driving cycles. A compilation and synthesis of this work is provided in this paper. Within the European research project ARTEMIS, this work was used to derive a set of reference driving cycles. The main objectives were as follows: to derive a common set of reference real-world driving cycles for use within the ARTEMIS project and in ongoing national campaigns of pollutant emission measurements, ensuring the compatibility and integration of all the resulting emission data in the European emission inventory systems; to ensure and validate the representativeness of the database and driving cycles by comparing them against all the available data on driving conditions; and to capture, in three real-world driving cycles (urban, rural road and motorway), the diversity of the observed driving conditions, with sub-cycles allowing a disaggregation of the emissions according to more specific driving conditions (e.g. congested and free-flow urban). Such driving cycles present a real advantage, as they are derived from a large database using a methodology that was widely discussed and approved. In the main, these ARTEMIS driving cycles were designed using the available data, and the method of analysis was based to some extent on previous work. Specific steps were implemented: the study includes characterisation of driving conditions and vehicle uses, and starting conditions and gearbox use are also taken into account.

  7. Development of database of real-world diesel vehicle emission factors for China.

    PubMed

    Shen, Xianbao; Yao, Zhiliang; Zhang, Qiang; Wagner, David Vance; Huo, Hong; Zhang, Yingzhi; Zheng, Bo; He, Kebin

    2015-05-01

    A database of real-world diesel vehicle emission factors, organized by vehicle type and technology, has been developed from tests on more than 300 diesel vehicles in China using a portable emission measurement system. The database provides a better understanding of diesel vehicle emissions under actual driving conditions. We found that although new regulations have significantly reduced real-world emission levels of diesel trucks and buses for most pollutants in China, NOx emissions have been inadequately controlled by the current standards, especially for diesel buses, because of poor real-world driving conditions. We also compared the emission factors in the database with those calculated by emission factor models and used in inventory studies. The emission factors derived from COPERT (COmputer Programme to calculate Emissions from Road Transport) and MOBILE may both underestimate real emission factors, whereas the updated COPERT and PART5 (Highway Vehicle Particulate Emission Modeling Software) models may overestimate emission factors in China. Real-world measurement results and the emission factors used in recent emission inventory studies are inconsistent, which has led to inaccurate estimates of emissions from diesel trucks and buses in recent years. This suggests that emission factors derived from European or US-based models will not truly represent real-world emissions in China. It is therefore useful and necessary to conduct systematic real-world measurements of vehicle emissions in China in order to obtain optimal inputs for emission inventory models. Copyright © 2015. Published by Elsevier B.V.

  8. Electromyography data for non-invasive naturally-controlled robotic hand prostheses

    PubMed Central

    Atzori, Manfredo; Gijsberts, Arjan; Castellini, Claudio; Caputo, Barbara; Hager, Anne-Gabrielle Mittaz; Elsig, Simone; Giatsidis, Giorgio; Bassetto, Franco; Müller, Henning

    2014-01-01

    Recent advances in rehabilitation robotics suggest that it may be possible for hand-amputated subjects to recover at least a significant part of the lost hand functionality. The control of robotic prosthetic hands using non-invasive techniques is still a challenge in real life: myoelectric prostheses offer limited control capabilities, and the control is often unnatural and must be learned through long training times. Meanwhile, results in the scientific literature are promising but still far from fulfilling real-life needs. This work aims to close this gap by allowing worldwide research groups to develop and test movement recognition and force control algorithms on a benchmark scientific database. The database is targeted at studying the relationship between surface electromyography, hand kinematics and hand forces, with the final goal of developing non-invasive, naturally controlled robotic hand prostheses. The validation section verifies that the data are similar to data acquired in real-life conditions, and that recognition of different hand tasks is possible by applying state-of-the-art signal features and machine-learning algorithms. PMID:25977804

  9. Efficient frequent pattern mining algorithm based on node sets in cloud computing environment

    NASA Astrophysics Data System (ADS)

    Billa, V. N. Vinay Kumar; Lakshmanna, K.; Rajesh, K.; Reddy, M. Praveen Kumar; Nagaraja, G.; Sudheer, K.

    2017-11-01

    The ultimate goal of data mining is to discover hidden information that is useful for decision making in the large databases collected by an organization, and it involves many tasks performed during the process. Mining frequent itemsets is one of the most important of these tasks for transactional databases. Such databases hold data at very large scale, and mining them consumes physical memory and time in proportion to the size of the database; a frequent pattern mining algorithm is considered efficient only if it consumes little memory and time to mine the frequent itemsets from a given large database. With these points in mind, we propose a system that mines frequent itemsets in a way optimized for memory and time, using cloud computing to parallelize the process and providing the application as a service. The framework uses a proven efficient algorithm, the FIN algorithm, which operates on Nodesets and a POC (pre-order coding) tree. To evaluate the performance of the system, we conducted experiments comparing the efficiency of the same algorithm applied in a standalone manner and in a cloud computing environment on a real-world traffic accidents data set. The results show that the memory consumption and execution time of the proposed system are much lower than those of the standalone system.
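    The abstract does not reproduce FIN's Nodeset/POC-tree machinery; as a minimal sketch of the underlying task only (naive level-wise support counting in the style of Apriori, with illustrative function names — FIN replaces this counting with far cheaper Nodeset operations), frequent-itemset mining over a transactional database looks like:

```python
from collections import Counter

def frequent_itemsets(transactions, min_support):
    """Return {itemset: support} for all itemsets meeting min_support.

    Naive level-wise miner: count candidates of size k, keep the
    frequent ones, build size-(k+1) candidates from them, repeat.
    """
    items = {i for t in transactions for i in t}
    result = {}
    k = 1
    candidates = [frozenset([i]) for i in sorted(items)]
    while candidates:
        counts = Counter()
        for t in transactions:           # one pass per level
            ts = set(t)
            for c in candidates:
                if c <= ts:              # candidate contained in transaction
                    counts[c] += 1
        frequent = {c: n for c, n in counts.items() if n >= min_support}
        result.update(frequent)
        prev = list(frequent)
        k += 1
        # join step: unions of frequent k-1 itemsets that have size k
        candidates = list({a | b for a in prev for b in prev
                           if len(a | b) == k})
    return result

txns = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}]
freq = frequent_itemsets(txns, min_support=2)
```

    On this toy database every single item and every pair is frequent at support 2, while {a, b, c} (support 1) is pruned — exactly the search space a Nodeset-based miner traverses more cheaply.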

  10. Real-time face and gesture analysis for human-robot interaction

    NASA Astrophysics Data System (ADS)

    Wallhoff, Frank; Rehrl, Tobias; Mayer, Christoph; Radig, Bernd

    2010-05-01

    Human communication relies on a large number of different communication mechanisms like spoken language, facial expressions, and gestures. Facial expressions and gestures are among the main nonverbal communication mechanisms and pass large amounts of information between human dialog partners. Therefore, to allow for intuitive human-machine interaction, real-time capable processing and recognition of facial expressions and of hand and head gestures are of great importance. We present a system that tackles these challenges. The input features for the dynamic head gestures and facial expressions are obtained from a sophisticated three-dimensional model, which is fitted to the user in a real-time capable manner. Applying this model, different kinds of information are extracted from the image data and afterwards handed over to a real-time capable data-transferring framework, the so-called Real-Time DataBase (RTDB). In addition to the head- and facial-related features, low-level image features of the human hand (optical flow, Hu moments) are stored in the RTDB for the evaluation of hand gestures. In general, the input of a single camera is sufficient for the parallel evaluation of the different gestures and facial expressions. The real-time capable recognition of the dynamic hand and head gestures is performed via different Hidden Markov Models, which have proven to be a quick and real-time capable classification method. For the facial expressions, on the other hand, classical decision trees or more sophisticated support vector machines are used for classification. The results of the classification processes are again handed over to the RTDB, where other processes (like a dialog management unit) can easily access them without any blocking effects. In addition, an adjustable amount of history can be stored by the RTDB buffer unit.

  11. Database of mineral deposits in the Islamic Republic of Mauritania (phase V, deliverables 90 and 91): Chapter S in Second projet de renforcement institutionnel du secteur minier de la République Islamique de Mauritanie (PRISM-II)

    USGS Publications Warehouse

    Marsh, Erin; Anderson, Eric D.

    2015-01-01

    Three ore deposit databases from previous studies were evaluated and combined with newly known mineral occurrences into one database, which can now be used to manage information about the known mineral occurrences of Mauritania. The Microsoft Access 2010 database opens with the list of tables and forms held within the database and a Switchboard control panel from which to navigate through the existing mineral deposit data and to enter data for new deposit locations. The database is a helpful tool for organizing the basic information about the mineral occurrences of Mauritania. It is suggested that the database be administered by a single operator in order to avoid the data overlaps and overwrites that can result from shared real-time data entry. It is proposed that the mineral occurrence database be used in concert with the geologic maps and the geophysics and geochemistry datasets, as a publicly advertised interface for the abundant geospatial information that the Mauritanian government can provide to interested parties.

  12. Genotator: a disease-agnostic tool for genetic annotation of disease.

    PubMed

    Wall, Dennis P; Pivovarov, Rimma; Tong, Mark; Jung, Jae-Yoon; Fusaro, Vincent A; DeLuca, Todd F; Tonellato, Peter J

    2010-10-29

    Disease-specific genetic information has been increasing at rapid rates as a consequence of recent improvements and massive cost reductions in sequencing technologies. Numerous systems designed to capture and organize this mounting sea of genetic data have emerged, but these resources differ dramatically in their disease coverage and genetic depth. With few exceptions, researchers must manually search a variety of sites to assemble a complete set of genetic evidence for a particular disease of interest, a process that is both time-consuming and error-prone. We designed a real-time aggregation tool that provides both comprehensive coverage and reliable gene-to-disease rankings for any disease. Our tool, called Genotator, automatically integrates data from 11 externally accessible clinical genetics resources and uses these data in a straightforward formula to rank genes in order of disease relevance. We tested the accuracy of coverage of Genotator in three separate diseases for which there exist specialty curated databases, Autism Spectrum Disorder, Parkinson's Disease, and Alzheimer Disease. Genotator is freely available at http://genotator.hms.harvard.edu. Genotator demonstrated that most of the 11 selected databases contain unique information about the genetic composition of disease, with 2514 genes found in only one of the 11 databases. These findings confirm that the integration of these databases provides a more complete picture than would be possible from any one database alone. Genotator successfully identified at least 75% of the top ranked genes for all three of our use cases, including a 90% concordance with the top 40 ranked candidates for Alzheimer Disease. As a meta-query engine, Genotator provides high coverage of both historical genetic research as well as recent advances in the genetic understanding of specific diseases. 
As such, Genotator provides a real-time aggregation of ranked data that remains current with the pace of research in the disease fields. Genotator's algorithm appropriately transforms query terms to match the input requirements of each targeted database and accurately resolves named synonyms to ensure full coverage of the genetic results with official nomenclature. Genotator generates an Excel-style output that is consistent across disease queries and readily importable to other applications.
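    The "straightforward formula" itself is not given in the abstract. Purely as an illustrative stand-in (this count-based score and the gene/database names are assumptions, not Genotator's actual formula or data), ranking genes by how many independent source databases report a disease link could look like:

```python
def rank_genes(db_hits):
    """Rank genes by disease relevance.

    db_hits maps gene -> set of source databases reporting a link.
    Score = number of independent databases (an assumed proxy for
    evidence strength); ties are broken alphabetically.
    """
    return sorted(db_hits, key=lambda g: (-len(db_hits[g]), g))

# Hypothetical aggregation result for an Alzheimer-style query.
ranking = rank_genes({"APP":   {"db1", "db2", "db3"},
                      "APOE":  {"db1", "db2"},
                      "PSEN1": {"db1", "db2", "db3"}})
```

    A real meta-query engine would weight sources and normalize synonyms before scoring, as the abstract describes.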

  13. Historical seismometry database project: A comprehensive relational database for historical seismic records

    NASA Astrophysics Data System (ADS)

    Bono, Andrea

    2007-01-01

    The recovery and preservation of the patrimony of instrumental recordings of historical earthquakes is without doubt a subject of great interest. This interest is not merely historical but also scientific: the availability of a large amount of parametric information on the seismic activity in a given area is an undoubted help to the seismological researcher. This article presents the new database project of the Sismos group of the National Institute of Geophysics and Volcanology of Rome. The structure of the new scheme distils the experience matured over five years of activity, and we consider it useful for those approaching computer-based "recovery and reprocessing" facilities. In past years several attempts at cataloguing Italian seismicity have followed one another, but they have almost never been real databases. Some succeeded because they were well conceived and organized; others were limited to supplying lists of events with their hypocentral parameters. What makes this project more interesting than previous work is the completeness and generality of the managed information. For example, it will be possible to view the hypocentral information for a given historical earthquake; to retrieve the seismograms in raster, digital or digitized format; and to obtain the arrival times of the phases at the various stations, the instrumental characteristics, and so on. The modern relational logic on which the archive is based allows all these operations to be carried out with little effort. The database described below will completely replace Sismos' current data bank. Some of the organizational principles of this work are similar to those that inspire the databases for real-time monitoring of seismicity in use at the principal international research centres.
A modern design logic is thus introduced into a distinctly historical context. The various design phases are described, from the conceptual level to the physical implementation of the scheme, highlighting at each step the guiding principles, rules and technical-scientific considerations that lead to the final result: a state-of-the-art relational scheme for historical data.

  14. An Approach to Develop 3d Geo-Dbms Topological Operators by Re-Using Existing 2d Operators

    NASA Astrophysics Data System (ADS)

    Xu, D.; Zlatanova, S.

    2013-09-01

    Database systems are continuously extending their capabilities to store, process and analyse 3D data. Topological relationships, which describe how objects interact in space, are one of the important spatial issues. However, spatial operators for 3D objects are still insufficient. In this paper we present the development of a new 3D topological function to distinguish intersections of 3D planar polygons. The development reuses existing 2D functions in the DBMS together with two geometric transformations (rotation and projection). The function is tested on a real dataset to detect overlapping 3D city objects. The paper presents the algorithms and analyses the challenges. Suggestions for improving the current algorithm, as well as possible extensions to handle more 3D topological cases, are discussed at the end.
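    The core idea — rotate/project coplanar 3D polygons into their own plane so that existing 2D operators can do the topological work — can be sketched as follows. The helper names are illustrative; the resulting 2D coordinates would be handed to the DBMS's existing 2D intersection functions, which are not reproduced here:

```python
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])
def norm(a):
    n = dot(a, a) ** 0.5
    return tuple(x / n for x in a)

def project_to_2d(polygon):
    """Express a planar 3D polygon in 2D coordinates of its own plane.

    Builds an orthonormal in-plane basis (u, v) from the first three
    vertices, then projects every vertex onto it, so an existing 2D
    DBMS operator can be applied to the result.
    """
    p0, p1, p2 = polygon[0], polygon[1], polygon[2]
    u = norm(sub(p1, p0))                        # first in-plane axis
    n = norm(cross(sub(p1, p0), sub(p2, p0)))    # plane normal
    v = cross(n, u)                              # second in-plane axis
    return [(dot(sub(p, p0), u), dot(sub(p, p0), v)) for p in polygon]

flat = project_to_2d([(0, 0, 1), (1, 0, 1), (0, 1, 1)])  # triangle in z = 1
```

    Two polygons in the same plane can be projected with the same (p0, u, v) basis and then compared with any 2D overlap test; non-coplanar cases need the additional intersection handling the paper discusses.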

  15. Incorporating Auditory Models in Speech/Audio Applications

    NASA Astrophysics Data System (ADS)

    Krishnamoorthi, Harish

    2011-12-01

    Following the success of incorporating perceptual models in audio coding algorithms, their application in other speech/audio processing systems is expanding. In general, all perceptual speech/audio processing algorithms involve minimization of an objective function that directly or indirectly incorporates properties of human perception. This dissertation investigates the problems associated with directly embedding an auditory model in the objective function formulation and proposes solutions to overcome the high complexity involved, for use in real-time speech/audio algorithms. Specific problems addressed include: 1) the development of approximate but computationally efficient auditory model implementations that are consistent with the principles of psychoacoustics; 2) the development of a mapping scheme that synthesizes a time/frequency domain representation from its equivalent auditory model output. The first problem concerns the high computational complexity of solving perceptual objective functions that require repeated application of the auditory model to evaluate different candidate solutions. A frequency-pruning and a detector-pruning algorithm are developed that efficiently implement the various auditory model stages. The performance of the pruned model is compared to that of the original auditory model for different types of test signals in the SQAM database. Experimental results indicate only a 4-7% relative error in loudness while attaining up to 80-90% reduction in computational complexity. Similarly, a hybrid algorithm is developed specifically for sinusoidal signals that employs the proposed auditory-pattern-combining technique together with a look-up table of representative auditory patterns.
The second problem is to estimate the auditory representation that minimizes a perceptual objective function and to transform the auditory pattern back to its equivalent time/frequency representation, avoiding repeated application of the auditory model stages to test candidate time/frequency vectors. A constrained mapping scheme is developed by linearizing certain auditory model stages, ensuring a time/frequency mapping corresponding to the estimated auditory representation. This paradigm was successfully incorporated in a perceptual speech enhancement algorithm and a sinusoidal component selection task.

  16. [Research progress on real-time deformable models of soft tissues for surgery simulation].

    PubMed

    Xu, Shaoping; Liu, Xiaoping; Zhang, Hua; Luo, Jie

    2010-04-01

    Biological tissues generally exhibit nonlinear, anisotropic, quasi-incompressible and viscoelastic material properties. Simulating the behaviour of elastic objects in real time is one of the current objectives of virtual surgery simulation, and accurately depicting the behaviour of human tissues remains a challenge for researchers. In this paper, we present a classification of the different deformable models that have been developed and discuss the advantages and disadvantages of each. Finally, we compare the deformable models and evaluate the state of the art and the future of the field.

  17. Development of Real Time PCR Using Novel Genomic Target for Detection of Multiple Salmonella Serovars from Milk and Chickens

    USDA-ARS?s Scientific Manuscript database

    Background: A highly sensitive and specific novel genomic and plasmid target-based PCR platform was developed to detect multiple Salmonella serovars (S. Heidelberg, S. Dublin, S. Hadar, S. Kentucky and S. Enteritidis). Through extensive genome mining of protein databases of these serovars and compar...

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian W; Brunhart-Lupo, Nicholas J; Gruchalla, Kenny M

    This brochure describes a system dynamics (SD) simulation framework that supports an end-to-end analysis workflow optimized for deployment on ESIF facilities (Peregrine and the Insight Center). It includes (i) parallel and distributed simulation of SD models, (ii) real-time 3D visualization of running simulations, and (iii) comprehensive database-oriented persistence of simulation metadata, inputs, and outputs.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian W; Brunhart-Lupo, Nicholas J; Gruchalla, Kenny M

    This presentation describes a system dynamics (SD) simulation framework that supports an end-to-end analysis workflow optimized for deployment on ESIF facilities (Peregrine and the Insight Center). It includes (i) parallel and distributed simulation of SD models, (ii) real-time 3D visualization of running simulations, and (iii) comprehensive database-oriented persistence of simulation metadata, inputs, and outputs.

  20. Digital Education Governance: Data Visualization, Predictive Analytics, and "Real-Time" Policy Instruments

    ERIC Educational Resources Information Center

    Williamson, Ben

    2016-01-01

    Educational institutions and governing practices are increasingly augmented with digital database technologies that function as new kinds of policy instruments. This article surveys and maps the landscape of digital policy instrumentation in education and provides two detailed case studies of new digital data systems. The Learning Curve is a…

  1. Cognitive Affordances of the Cyberinfrastructure for Science and Math Learning

    ERIC Educational Resources Information Center

    Martinez, Michael E.; Peters Burton, Erin E.

    2011-01-01

    The "cyberinfrastucture" is a broad informational network that entails connections to real-time data sensors as well as tools that permit visualization and other forms of analysis, and that facilitates access to vast scientific databases. This multifaceted network, already a major boon to scientific discovery, now shows exceptional promise in…

  2. Applying Agrep to r-NSA to solve multiple sequences approximate matching.

    PubMed

    Ni, Bing; Wong, Man-Hon; Lam, Chi-Fai David; Leung, Kwong-Sak

    2014-01-01

    This paper addresses the approximate matching problem in a database consisting of multiple DNA sequences, applying Agrep to a new truncated suffix array, r-NSA. The construction time of the structure is linear in the database size, and indexing a substring in the structure takes constant time. The number of characters processed in applying Agrep is analysed theoretically, and the theoretical upper bound closely approximates the empirical number of characters obtained by enumerating the characters in the actual structure built. Experiments are carried out using (synthetic) random DNA sequences as well as (real) genome sequences, including Hepatitis-B virus and the X-chromosome. Experimental results show that, compared to the straightforward approach of applying Agrep to the multiple sequences individually, the proposed approach solves the matching problem in much shorter time. The speed-up depends on the sequence patterns; for highly similar homologous genome sequences, which are the common case in real-life genomes, it can reach several orders of magnitude.
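    The bit-parallel technique underlying Agrep is the Wu-Manber bitap algorithm. A minimal sketch follows, restricted to substitution errors for brevity (Agrep itself also handles insertions and deletions); as in the original C formulation, 0-bits mark active prefix states:

```python
def bitap_search(text, pattern, k):
    """First start index where `pattern` matches `text` with at most
    k substitution errors, or -1 if there is no such match."""
    m = len(pattern)
    # Character masks: bit i is cleared for characters equal to pattern[i].
    masks = {}
    for i, ch in enumerate(pattern):
        masks[ch] = masks.get(ch, ~0) & ~(1 << i)
    R = [~1] * (k + 1)          # R[d]: states reachable with <= d errors
    for j, ch in enumerate(text):
        mask = masks.get(ch, ~0)
        old = R[0]
        R[0] = (R[0] | mask) << 1          # exact-match transition
        for d in range(1, k + 1):
            tmp = R[d]
            # match transition AND substitution from the (d-1)-error row
            R[d] = ((R[d] | mask) & old) << 1
            old = tmp
        if R[k] & (1 << m) == 0:           # bit m clear: full match
            return j - m + 1
    return -1
```

    Applied per database sequence this is the "straightforward approach"; the paper's contribution is running the same scan once over the shared r-NSA structure instead of once per sequence.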

  3. A practical approach to screen for authorised and unauthorised genetically modified plants.

    PubMed

    Waiblinger, Hans-Ulrich; Grohmann, Lutz; Mankertz, Joachim; Engelbert, Dirk; Pietsch, Klaus

    2010-03-01

    In routine analysis, screening methods based on real-time PCR are most commonly used for the detection of genetically modified (GM) plant material in food and feed. In this paper, it is shown that the combination of five DNA target sequences can be used as a universal screening approach for at least 81 GM plant events authorised or unauthorised for placing on the market and described in publicly available databases. Except for maize event LY038, soybean events DP-305423 and BPS-CV127-9 and cotton event 281-24-236 x 3006-210-23, at least one of the five genetic elements has been inserted in these GM plants and is targeted by this screening approach. For the detection of these sequences, fully validated real-time PCR methods have been selected. A screening table is presented that describes the presence or absence of the target sequences for most of the listed GM plants. These data have been verified either theoretically according to available databases or experimentally using available reference materials. The screening table will be updated regularly by a network of German enforcement laboratories.

  4. A multi-user real time inventorying system for radioactive materials: a networking approach.

    PubMed

    Mehta, S; Bandyopadhyay, D; Hoory, S

    1998-01-01

    A computerized system for radioisotope management and real time inventory coordinated across a large organization is reported. It handles hundreds of individual users and their separate inventory records. Use of highly efficient computer network and database technologies makes it possible to accept, maintain, and furnish all records related to receipt, usage, and disposal of the radioactive materials for the users separately and collectively. The system's central processor is an HP-9000/800 G60 RISC server and users from across the organization use their personal computers to login to this server using the TCP/IP networking protocol, which makes distributed use of the system possible. Radioisotope decay is automatically calculated by the program, so that it can make the up-to-date radioisotope inventory data of an entire institution available immediately. The system is specifically designed to allow use by large numbers of users (about 300) and accommodates high volumes of data input and retrieval without compromising simplicity and accuracy. Overall, it is an example of a true multi-user, on-line, relational database information system that makes the functioning of a radiation safety department efficient.
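    The automatic decay correction such a system performs follows directly from the exponential decay law; a minimal sketch (function and argument names are illustrative, not the reported system's API):

```python
from math import exp, log

def remaining_activity(initial_mci, half_life_days, elapsed_days):
    """Activity remaining after decay: A(t) = A0 * exp(-ln(2) * t / T_half)."""
    return initial_mci * exp(-log(2) * elapsed_days / half_life_days)

# Example: a 1 mCi stock with a ~14.3-day half-life (P-32) after one half-life.
a = remaining_activity(1.0, 14.3, 14.3)
```

    An inventory system applies this correction to every stored record at query time, so the institution-wide totals it reports are always current without manual recalculation.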

  5. BioNetSim: a Petri net-based modeling tool for simulations of biochemical processes.

    PubMed

    Gao, Junhui; Li, Li; Wu, Xiaolin; Wei, Dong-Qing

    2012-03-01

    BioNetSim, Petri net-based software for modeling and simulating biochemical processes, is developed; its design and implementation are presented in this paper, including the logic construction and real-time access to KEGG (Kyoto Encyclopedia of Genes and Genomes) and the BioModel database. Furthermore, glycolysis is simulated as an example of its application. BioNetSim is a helpful tool for researchers to download data, model biological networks, and simulate complicated biochemical processes. Gene regulatory networks, metabolic pathways, signaling pathways, and kinetics of cell interaction are all available in BioNetSim, which makes modeling more efficient and effective. Like other Petri net-based software, BioNetSim does well in graphical presentation and mathematical construction. Moreover, it offers several powerful advantages: (1) it creates models in a database; (2) it provides real-time access to KEGG and BioModel and transfers the data to Petri nets; (3) it provides qualitative analysis, such as computation of constants; and (4) it generates graphs for tracing the concentration of every molecule during the simulation process.
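    The Petri net execution at the heart of such a tool reduces to a marking-update rule: a transition fires when every input place holds enough tokens, consuming and producing tokens accordingly. A deliberately minimal sketch (the dictionary-based interface is hypothetical; BioNetSim's actual model is database-backed):

```python
def fire(marking, pre, post):
    """Fire the first enabled transition of a place/transition Petri net.

    marking -- {place: token count}
    pre     -- {transition: {place: tokens consumed}}
    post    -- {transition: {place: tokens produced}}
    Returns (transition, new_marking), or (None, marking) if none enabled.
    """
    for t in pre:
        if all(marking.get(p, 0) >= n for p, n in pre[t].items()):
            m = dict(marking)                 # do not mutate the input
            for p, n in pre[t].items():
                m[p] -= n                     # consume input tokens
            for p, n in post[t].items():
                m[p] = m.get(p, 0) + n        # produce output tokens
            return t, m
    return None, marking

# Toy biochemical "reaction" a + b -> c modeled as one transition.
pre = {"r1": {"a": 1, "b": 1}}
post = {"r1": {"c": 1}}
t, m = fire({"a": 1, "b": 1, "c": 0}, pre, post)
```

    A quantitative simulator layers rate laws on top of this firing rule; the qualitative analyses the abstract mentions (e.g. invariant constants) operate on the same pre/post incidence structure.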

  6. 41 CFR 102-75.210 - What must a transferee agency include in its request for an exception from the 100 percent...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Section 102-75.210 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 75-REAL PROPERTY DISPOSAL Utilization of Excess Real... exception would further essential agency program objectives and at the same time be consistent with...

  7. Real-time dangling objects sensing: A preliminary design of mobile headset ancillary device for visual impaired.

    PubMed

    Lin, C H; Cheng, P H; Shen, S T

    2014-01-01

    Blind and severely visually impaired people can use tactile canes to assist their walking, but a cane cannot detect objects dangling above their walking route. This research proposes a mobile real-time dangling objects sensing (RDOS) prototype, mounted on a cap, that senses barriers in front of the user. The device uses an inexpensive ultrasonic sensor as a complementary eye, letting blind users perceive dangling objects ahead. The RDOS device can dynamically adjust the sensor's forward angle depending on the user's body height, improving sensing accuracy. Two required algorithms, height-angle measurement and ultrasonic sensor alignment, are proposed with this prototype. The research team also integrated the RDOS prototype with mobile Android devices over Bluetooth to record the walking route.
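    The height-angle relation is not spelled out in the abstract. As an illustrative trigonometric sketch only (the clearance height, warning distance, and function name are assumptions, not the paper's algorithm), the tilt needed for a cap-mounted beam to reach a typical dangling-object height at a chosen warning distance is:

```python
from math import atan2, degrees

def sensor_tilt_deg(user_height_m, warn_distance_m, hazard_clearance_m=2.0):
    """Tilt (degrees, positive = upward) aiming the cap-mounted sensor
    at an assumed dangling-object height at the warning distance."""
    rise = hazard_clearance_m - user_height_m   # vertical offset to cover
    return degrees(atan2(rise, warn_distance_m))

# A 1.70 m tall user warned 1.5 m ahead of ~2.0 m-high hanging objects.
tilt = sensor_tilt_deg(user_height_m=1.7, warn_distance_m=1.5)
```

    Taller users need a shallower (or zero) tilt for the same clearance, which is why the device recalibrates the angle per user.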

  8. Real-time model-based vision system for object acquisition and tracking

    NASA Technical Reports Server (NTRS)

    Wilcox, Brian; Gennery, Donald B.; Bon, Bruce; Litwin, Todd

    1987-01-01

    A machine vision system is described which is designed to acquire and track polyhedral objects moving and rotating in space by means of two or more cameras, programmable image-processing hardware, and a general-purpose computer for high-level functions. The image-processing hardware is capable of performing a large variety of operations on images and on image-like arrays of data. Acquisition utilizes image locations and velocities of the features extracted by the image-processing hardware to determine the three-dimensional position, orientation, velocity, and angular velocity of the object. Tracking correlates edges detected in the current image with edge locations predicted from an internal model of the object and its motion, continually updating velocity information to predict where edges should appear in future frames. With some 10 frames processed per second, real-time tracking is possible.

  9. High Stability Engine Control (HISTEC) Flight Test Results

    NASA Technical Reports Server (NTRS)

    Southwick, Robert D.; Gallops, George W.; Kerr, Laura J.; Kielb, Robert P.; Welsh, Mark G.; DeLaat, John C.; Orme, John S.

    1998-01-01

    The High Stability Engine Control (HISTEC) Program, managed and funded by the NASA Lewis Research Center, is a cooperative effort between NASA and Pratt & Whitney (P&W). The program objective is to develop and flight demonstrate an advanced high stability integrated engine control system that uses real-time, measurement-based estimation of inlet pressure distortion to enhance engine stability. Flight testing was performed using the NASA Advanced Controls Technologies for Integrated Vehicles (ACTIVE) F-15 aircraft at the NASA Dryden Flight Research Center. The flight test configuration, details of the research objectives, and the flight test matrix to achieve those objectives are presented. Flight test results are discussed that show the design approach can accurately estimate distortion and perform real-time control actions for engine accommodation.

  10. HRT-UML: a design method for hard real-time systems based on the UML notation

    NASA Astrophysics Data System (ADS)

    D'Alessandro, Massimo; Mazzini, Silvia; di Natale, Marco; Lipari, Giuseppe

    2002-07-01

    The Hard Real-Time-Unified Modelling Language (HRT-UML) method aims at providing a comprehensive solution to the modeling of Hard Real-Time systems. Experience shows that the design of Hard Real-Time systems needs methodologies suitable for the modeling and analysis of aspects related to time, schedulability and performance. In the context of the European Aerospace community a reference method for design is Hierarchical Object Oriented Design (HOOD) and in particular its extension for the modeling of hard real-time systems, Hard Real-Time-Hierarchical Object Oriented Design (HRT-HOOD), recommended by the European Space Agency (ESA) for the development of on-board systems. On the other hand, in recent years the Unified Modelling Language (UML) has gained very wide acceptance across domains worldwide, becoming a de facto international standard, and tool vendors are very active in this potentially large market. In the Aerospace domain the common opinion is that UML, as a general notation, is not suitable for Hard Real-Time systems, even if its importance is recognized as a standard and as a technological trend for the near future. These considerations suggest replacing the HRT-HOOD method with a customized version of UML that combines the advantages of both notations and compensates for their respective weak points. This approach has the clear advantage of making HRT-HOOD converge on a more powerful and expressive modeling notation. The paper identifies a mapping of the HRT-HOOD semantics onto the UML one, and proposes a UML extension profile, which we call HRT-UML, based on the UML standard extension mechanisms, to fully represent HRT-HOOD design concepts. Finally, it discusses the relationships between our profile and the UML profile for schedulability, performance and time, adopted by the OMG in November 2001.

  11. An energy-efficient transmission scheme for real-time data in wireless sensor networks.

    PubMed

    Kim, Jin-Woo; Barrado, José Ramón Ramos; Jeon, Dong-Keun

    2015-05-20

    The Internet of things (IoT) is a novel paradigm where all things or objects in daily life can communicate with other devices and provide services over the Internet. Things or objects need identifying, sensing, networking and processing capabilities to make the IoT paradigm a reality. The IEEE 802.15.4 standard is one of the main communication protocols proposed for the IoT. The IEEE 802.15.4 standard provides the guaranteed time slot (GTS) mechanism, which supports quality of service (QoS) for real-time data transmission. In spite of these QoS features, the problem of end-to-end delay in the IEEE 802.15.4 standard still remains. To solve this problem, we propose a cooperative medium access control (MAC) protocol for real-time data transmission. We also evaluate the performance of the proposed scheme through simulation. The simulation results demonstrate that the proposed scheme can improve the network performance.
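As a rough illustration of why forwarding within a single superframe reduces end-to-end delay, consider a beacon-enabled IEEE 802.15.4 superframe divided into 16 slots. The beacon-order value (BO = 4), the 250 kb/s timing constants, and the per-hop delay model below are assumptions chosen for illustration, not taken from the proposed protocol.

```python
# In beacon-enabled IEEE 802.15.4, the beacon interval is
# aBaseSuperframeDuration * 2**BO symbols; at 250 kb/s and BO = 4 this is
# 15.36 ms * 16 = 245.76 ms, divided into 16 equal slots.

BASE_SUPERFRAME_MS = 15.36          # aBaseSuperframeDuration at 250 kb/s
BEACON_ORDER = 4                    # illustrative choice of BO
BEACON_INTERVAL_MS = BASE_SUPERFRAME_MS * 2 ** BEACON_ORDER
SLOT_MS = BEACON_INTERVAL_MS / 16   # one GTS-sized slot

def end_to_end_delay_ms(hops, relay_in_same_superframe):
    """Crude per-hop delay model: a relay either forwards in a later slot
    of the same superframe, or waits a full beacon interval for its own
    GTS in the next superframe."""
    per_hop = SLOT_MS if relay_in_same_superframe else BEACON_INTERVAL_MS
    return hops * per_hop
```

Under this toy model, a 3-hop path relayed within one superframe takes 3 slots (about 46 ms) instead of 3 beacon intervals (about 737 ms), which conveys why same-superframe cooperation shrinks end-to-end delay.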

  12. An Energy-Efficient Transmission Scheme for Real-Time Data in Wireless Sensor Networks

    PubMed Central

    Kim, Jin-Woo; Barrado, José Ramón Ramos; Jeon, Dong-Keun

    2015-01-01

    The Internet of things (IoT) is a novel paradigm where all things or objects in daily life can communicate with other devices and provide services over the Internet. Things or objects need identifying, sensing, networking and processing capabilities to make the IoT paradigm a reality. The IEEE 802.15.4 standard is one of the main communication protocols proposed for the IoT. The IEEE 802.15.4 standard provides the guaranteed time slot (GTS) mechanism, which supports quality of service (QoS) for real-time data transmission. In spite of these QoS features, the problem of end-to-end delay in the IEEE 802.15.4 standard still remains. To solve this problem, we propose a cooperative medium access control (MAC) protocol for real-time data transmission. We also evaluate the performance of the proposed scheme through simulation. The simulation results demonstrate that the proposed scheme can improve the network performance. PMID:26007722

  13. A Bit-Encoding Based New Data Structure for Time and Memory Efficient Handling of Spike Times in an Electrophysiological Setup.

    PubMed

    Ljungquist, Bengt; Petersson, Per; Johansson, Anders J; Schouenborg, Jens; Garwicz, Martin

    2018-04-01

    Recent neuroscientific and technical developments of brain machine interfaces have put increasing demands on neuroinformatic databases and data handling software, especially when managing data in real time from large numbers of neurons. Extrapolating these developments, we here set out to construct a scalable software architecture that would enable near-future massive parallel recording, organization and analysis of neurophysiological data on a standard computer. To this end we combined, for the first time in the present context, bit-encoding of spike data with a specific communication format for real-time transfer and storage of neuronal data, synchronized by a common time base across all unit sources. We demonstrate that our architecture can simultaneously handle data from more than one million neurons and provide, in real time (< 25 ms), feedback based on analysis of previously recorded data. In addition to managing recordings from very large numbers of neurons in real time, it also has the capacity to handle the extensive periods of recording time necessary in certain scientific and clinical applications. Furthermore, the proposed bit-encoding has the additional advantage of allowing extremely fast analysis of spatiotemporal spike patterns in a large number of neurons. Thus, we conclude that this architecture is well suited to support current and near-future brain machine interface requirements.
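The bit-encoding idea can be illustrated compactly: spike times are packed one bit per time bin, so coincidence detection across units reduces to a bitwise AND plus a population count. The bin width, helper names, and NumPy packing below are assumptions of this sketch, not the paper's actual storage format.

```python
import numpy as np

def encode_spikes(spike_times_ms, n_bins, bin_ms=1.0):
    """Pack one unit's spike times into a bit vector, one bit per time
    bin, then into bytes (8 bins per byte); spikes past the window are
    dropped."""
    bits = np.zeros(n_bins, dtype=np.uint8)
    idx = (np.asarray(spike_times_ms) / bin_ms).astype(int)
    bits[idx[idx < n_bins]] = 1
    return np.packbits(bits)

def coincident_bins(packed_a, packed_b):
    """Count bins in which both units fired: a vectorized AND plus
    popcount, which is what makes spatiotemporal pattern queries fast."""
    return int(np.unpackbits(np.bitwise_and(packed_a, packed_b)).sum())
```

A 1 ms bin width stores each second of activity per unit in 125 bytes, which conveys how millions of units can stay memory-resident on a standard computer.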

  14. On the reliable use of satellite-derived surface water products for global flood monitoring

    NASA Astrophysics Data System (ADS)

    Hirpa, F. A.; Revilla-Romero, B.; Thielen, J.; Salamon, P.; Brakenridge, R.; Pappenberger, F.; de Groeve, T.

    2015-12-01

    Early flood warning and real-time monitoring systems play a key role in flood risk reduction and disaster response management. To this end, real-time flood forecasting and satellite-based detection systems have been developed at global scale. However, due to the limited availability of up-to-date ground observations, the reliability of these systems for real-time applications has not been assessed in large parts of the globe. In this study, we performed comparative evaluations of commonly used satellite-based global flood detection systems and an operational flood forecasting system using 10 major flood cases reported over three years (2012-2014). Specifically, we assessed the flood detection capabilities of the near real-time global flood maps from the Global Flood Detection System (GFDS) and from the Moderate Resolution Imaging Spectroradiometer (MODIS), and the operational forecasts from the Global Flood Awareness System (GloFAS), for the major flood events recorded in global flood databases. We present the evaluation results of the global flood detection and forecasting systems in terms of correctly indicating the reported flood events and highlight the existing limitations of each system. Finally, we propose possible ways forward to improve the reliability of large-scale flood monitoring tools.
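A minimal sketch of the kind of hit-rate (probability of detection) score such an evaluation might compute per system; the event identifiers and exact-match rule are simplified assumptions, since a real comparison must match detections to reported floods in space and time.

```python
def probability_of_detection(reported, detected):
    """Fraction of reported flood events that a monitoring system
    flagged. Events are matched by identifier here; a real evaluation
    would match detections to reports spatially and temporally."""
    reported = set(reported)
    if not reported:
        return float("nan")
    return len(reported & set(detected)) / len(reported)
```

Scoring each system (e.g. GFDS, MODIS, GloFAS) against the same reported-event list makes their detection capabilities directly comparable.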

  15. Simulation of intelligent object behavior in a virtual reality system

    NASA Astrophysics Data System (ADS)

    Mironov, Sergey F.

    1998-01-01

    This article presents a technique for computer control of power boat movement in real-time marine trainers and arcade games. The author developed and successfully implemented a general technique for intelligent navigation of computer-controlled moving objects that proved appropriate for real-time applications. The technique covers a significant part of the behavioral tasks that arise in such titles. At the same time, it forms part of a more general system that also controls less complicated characters of another nature. Being an open system, it can easily be used by action or arcade programmers to improve the overall quality of character artificial intelligence.

  16. Real Time Flood Alert System (RTFAS) for Puerto Rico

    USGS Publications Warehouse

    Lopez-Trujillo, Dianne

    2010-01-01

    The Real Time Flood Alert System is a web-based computer program, developed as a data integration tool, and designed to increase the ability of emergency managers to rapidly and accurately predict flooding conditions of streams in Puerto Rico. The system includes software and a relational database to determine the spatial and temporal distribution of rainfall, water levels in streams and reservoirs, and associated storms to determine hazardous and potential flood conditions. The computer program was developed as part of a cooperative agreement between the U.S. Geological Survey Caribbean Water Science Center and the Puerto Rico Emergency Management Agency, and integrates information collected and processed by these two agencies and the National Weather Service.
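A toy illustration of the threshold logic an alert integrator of this kind might apply when combining stream stage and rainfall data; the alert levels, the 90% watch margin, and the thresholds below are invented for illustration and are not taken from the RTFAS.

```python
def alert_level(stage_m, flood_stage_m, rain_mm_3h, rain_threshold_mm):
    """Classify one gauge site from its current stage and recent rainfall.
    FLOOD: stage at or above flood stage; WATCH: heavy recent rain or
    stage within 10% of flood stage; NORMAL otherwise."""
    if stage_m >= flood_stage_m:
        return "FLOOD"
    if rain_mm_3h >= rain_threshold_mm or stage_m >= 0.9 * flood_stage_m:
        return "WATCH"
    return "NORMAL"
```

In practice such rules would be evaluated continuously against the relational database of rainfall, stream, and reservoir records the abstract describes, one classification per gauge per update.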

  17. The X-33 range Operations Control Center

    NASA Technical Reports Server (NTRS)

    Shy, Karla S.; Norman, Cynthia L.

    1998-01-01

    This paper describes the capabilities and features of the X-33 Range Operations Center at NASA Dryden Flight Research Center. All the unprocessed data will be collected and transmitted over fiber optic lines to the Lockheed Operations Control Center for real-time flight monitoring of the X-33 vehicle. By using the existing capabilities of the Western Aeronautical Test Range, the Range Operations Center will provide the ability to monitor all down-range tracking sites for the Extended Test Range systems. In addition to radar tracking and aircraft telemetry data, the Telemetry and Radar Acquisition and Processing System is being enhanced to acquire vehicle command data, differential Global Positioning System corrections and telemetry receiver signal level status. The Telemetry and Radar Acquisition and Processing System provides the flexibility to satisfy all X-33 data processing requirements quickly and efficiently. Additionally, the Telemetry and Radar Acquisition and Processing System will run a real-time link margin analysis program. The results of this model will be compared in real time with actual flight data. The hardware and software concepts presented in this paper describe a method of merging all types of data into a common database for real-time display in the Range Operations Center in support of the X-33 program. All types of data will be processed for real-time analysis and display of the range system status to ensure public safety.
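The core idea of merging heterogeneous range data streams into one time-indexed store for real-time display can be sketched minimally. The class, field names, and query methods below are hypothetical, not the actual system design.

```python
import bisect

class CommonStore:
    """Toy time-indexed store merging heterogeneous range data streams
    (telemetry, radar tracks, GPS corrections, signal levels) so a
    display can query them against a common timeline."""

    def __init__(self):
        self.times = []    # sorted timestamps
        self.records = []  # records aligned with self.times

    def ingest(self, t, source, payload):
        """Insert a record in timestamp order, even if streams arrive
        out of order."""
        i = bisect.bisect(self.times, t)
        self.times.insert(i, t)
        self.records.insert(i, {"t": t, "source": source, **payload})

    def latest(self, source):
        """Most recent record from one stream, for status displays."""
        for rec in reversed(self.records):
            if rec["source"] == source:
                return rec
        return None
```

A real implementation would use a database with indexed tables per stream; the point here is only the shared time base that lets dissimilar data types be displayed together.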

  18. Analyzing a multimodal biometric system using real and virtual users

    NASA Astrophysics Data System (ADS)

    Scheidat, Tobias; Vielhauer, Claus

    2007-02-01

    Three main topics of recent research on multimodal biometric systems are addressed in this article: the lack of sufficiently large multimodal test data sets, the influence of cultural aspects, and the data protection issues of multimodal biometric data. In this contribution, different possibilities are presented to extend multimodal databases by generating so-called virtual users, which are created by combining single biometric modality data of different users. Comparative tests on databases containing real and virtual users, based on a multimodal system using handwriting and speech, are presented to study to which degree the use of virtual multimodal databases allows conclusions with respect to recognition accuracy in comparison to real multimodal data. All tests have been carried out on databases created from donations from three different nationality groups. This allows the experimental results to be reviewed both in general and in the context of cultural origin. The results show that in most cases the usage of virtual persons leads to lower accuracy than the usage of real users in terms of the measurement applied: the Equal Error Rate. Finally, this article addresses the general question of how the concept of virtual users may influence the data protection requirements for multimodal evaluation databases in the future.
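Generating virtual users as described, by pairing single-modality data from different real users, can be sketched as follows; the dictionary layout and the choice of ordered pairs are illustrative assumptions of this sketch.

```python
from itertools import permutations

def virtual_users(real_users):
    """Pair the handwriting sample of one real user with the speech
    sample of a *different* real user; each ordered pair of distinct
    users yields one virtual user."""
    return [
        {"handwriting": a["handwriting"], "speech": b["speech"]}
        for a, b in permutations(real_users, 2)
    ]
```

From n real users this yields n*(n-1) virtual users, which is how such schemes enlarge small multimodal test sets without collecting new donations.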

  19. Efficiency in Rule- vs. Plan-Based Movements Is Modulated by Action-Mode.

    PubMed

    Scheib, Jean P P; Stoll, Sarah; Thürmer, J Lukas; Randerath, Jennifer

    2018-01-01

    The rule/plan motor cognition (RPMC) paradigm elicits visually indistinguishable motor outputs, resulting from either plan- or rule-based action-selection, using a combination of essentially interchangeable stimuli. Previous implementations of the RPMC paradigm have used pantomimed movements to compare plan- vs. rule-based action-selection. In the present work we attempt to determine the generalizability of previous RPMC findings to real object interaction by use of a grasp-to-rotate task. In the plan task, participants had to use prospective planning to achieve a comfortable post-handle rotation hand posture. The rule task used implementation intentions (if-then rules) leading to the same comfortable end-state. In Experiment A, we compare RPMC performance of 16 healthy participants in pantomime and real object conditions of the experiment, within-subjects. Higher processing efficiency of rule- vs. plan-based action-selection was supported by diffusion model analysis. Results show a significant response-time increase in the pantomime condition compared to the real object condition and a greater response-time advantage of rule-based vs. plan-based actions in the pantomime compared to the real object condition. In Experiment B, 24 healthy participants performed the real object RPMC task in a task switching vs. a blocked condition. Results indicate that plan-based action-selection leads to longer response-times and less efficient information processing than rule-based action-selection, in line with previous RPMC findings derived from the pantomime action-mode. Particularly in the task switching mode, responses were faster in the rule compared to the plan task, suggesting a modulating influence of cognitive load. Overall, results suggest an advantage of rule-based action-selection over plan-based action-selection, whereby differential mechanisms appear to be involved depending on the action-mode. We propose that cognitive load is a factor that modulates the advantageous effect of implementation intentions in motor cognition on different levels, as illustrated by the varying speed advantages and the variation in diffusion parameters per action-mode or condition, respectively.
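The diffusion model analysis mentioned above treats each response as noisy evidence accumulation toward a decision boundary. A minimal random-walk simulation (all parameters invented, not fitted to this study's data) conveys how a higher drift rate, i.e. higher processing efficiency, produces faster and more accurate responses.

```python
import random

def ddm_trial(drift, threshold=1.0, noise=1.0, dt=0.001, rng=None):
    """One drift-diffusion trial: evidence x random-walks from 0 until
    it crosses +threshold (correct) or -threshold (error).
    Returns (decision_time_s, correct)."""
    rng = rng or random.Random()
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        x += drift * dt + noise * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        t += dt
    return t, x > 0
```

Simulating many trials at a high vs. a low drift rate reproduces the qualitative pattern reported above: the more efficient process yields shorter decision times at higher accuracy.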

  20. Real-time multiple-objective path search for in-vehicle route guidance systems

    DOT National Transportation Integrated Search

    1997-01-01

    The application of multiple-objective route choice for in-vehicle route guidance systems is discussed. A bi-objective path search algorithm is presented and its use demonstrated. A concept of trip quality is introduced that is composed of two objecti...
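A bi-objective path search cannot keep a single best cost per node; it must keep all Pareto-optimal (nondominated) labels. A minimal label-correcting sketch follows; the graph layout and the two cost components (travel time and a "trip quality" penalty) are hypothetical stand-ins for whatever objectives the algorithm in the report actually uses.

```python
import heapq
from collections import defaultdict

def biobjective_paths(graph, src, dst):
    """Return the sorted Pareto-optimal (time, penalty) labels from src
    to dst. graph maps node -> list of (neighbor, time, penalty) edges.
    A label is kept only if no other label at that node is at least as
    good in both objectives."""
    labels = defaultdict(list)        # nondominated labels per node
    heap = [((0, 0), src)]
    while heap:
        (t, p), u = heapq.heappop(heap)
        # discard if an existing label at u dominates (or equals) it
        if any(t2 <= t and p2 <= p for (t2, p2) in labels[u]):
            continue
        # drop existing labels the new one dominates, then keep it
        labels[u] = [(t2, p2) for (t2, p2) in labels[u]
                     if not (t <= t2 and p <= p2)]
        labels[u].append((t, p))
        for v, dt, dp in graph.get(u, []):
            heapq.heappush(heap, ((t + dt, p + dp), v))
    return sorted(labels[dst])
```

The set of labels returned for the destination is exactly the menu of nondominated route trade-offs an in-vehicle guidance system could present to the driver.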
