DOE Office of Scientific and Technical Information (OSTI.GOV)
Coram, Jamie L.; Morrow, James D.; Perkins, David Nikolaus
2015-09-01
This document describes the PANTHER R&D Application, a proof-of-concept user interface application developed under the PANTHER Grand Challenge LDRD. The purpose of the application is to explore interaction models for graph analytics, drive algorithmic improvements from an end-user point of view, and support demonstration of PANTHER technologies to potential customers. The R&D Application implements a graph-centric interaction model that exposes analysts to the algorithms contained within the GeoGraphy graph analytics library. Users define geospatial-temporal semantic graph queries by constructing search templates based on nodes, edges, and the constraints among them. Users then analyze the results of the queries using both geo-spatial and temporal visualizations. Development of this application has made user experience an explicit driver for project and algorithmic level decisions that will affect how analysts one day make use of PANTHER technologies.
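As a rough illustration of the node/edge/constraint search templates described in this abstract, the sketch below matches a small template against a toy attributed graph and post-filters on a temporal constraint. It uses networkx subgraph isomorphism as a stand-in; it is not the GeoGraphy library API, and the node types, attributes, and data are hypothetical.

```python
# Sketch of template-style graph querying, loosely analogous to the node/edge/constraint
# search templates described above. NOT the GeoGraphy API; attribute names and data are toy.
import networkx as nx
from networkx.algorithms import isomorphism

# Toy geospatial-temporal semantic graph: nodes carry a type and a timestamp.
g = nx.DiGraph()
g.add_node("v1", type="vehicle", t=10)
g.add_node("f1", type="facility", t=0)
g.add_node("v2", type="vehicle", t=50)
g.add_edge("v1", "f1", rel="visited")
g.add_edge("v2", "f1", rel="visited")

# Search template: a vehicle that visited a facility.
template = nx.DiGraph()
template.add_node("V", type="vehicle")
template.add_node("F", type="facility")
template.add_edge("V", "F", rel="visited")

matcher = isomorphism.DiGraphMatcher(
    g, template,
    node_match=isomorphism.categorical_node_match("type", None),
    edge_match=isomorphism.categorical_edge_match("rel", None),
)

# Each mapping assigns graph nodes to template roles; extra constraints
# (e.g., temporal windows) can be applied as a post-filter.
for mapping in matcher.subgraph_isomorphisms_iter():
    roles = {role: node for node, role in mapping.items()}
    if g.nodes[roles["V"]]["t"] <= 60:          # toy temporal constraint
        print("match:", roles)
```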
Understanding the health care business model: the financial analysts' point of view.
Bukh, Per Nikolaj; Nielsen, Christian
2010-01-01
This study focuses on how financial analysts understand the strategy of a health care company and which elements, from such a strategy perspective, they perceive as constituting the cornerstone of a health care company's business model. The empirical part of this study is based on semi-structured interviews with analysts following a large health care company listed on the Copenhagen Stock Exchange. The authors analyze how the financial analysts view strategy and value creation within the framework of a business model. Further, the authors analyze whether the characteristics emerging from a comprehensive literature review are reflected in the financial analysts' perceptions of which information is decision-relevant and important to communicate to the financial markets. Among the conclusions of the study is the importance of distinguishing between a health care company's business model and the model by which revenue payments are allocated between end users and reimbursing organizations.
Utilizing semantic Wiki technology for intelligence analysis at the tactical edge
NASA Astrophysics Data System (ADS)
Little, Eric
2014-05-01
Challenges exist for intelligence analysts to efficiently and accurately process large amounts of data collected from a myriad of available data sources. These challenges are even more evident for analysts who must operate within small military units at the tactical edge. In such environments, decisions must be made quickly without guaranteed access to the kinds of large-scale data sources available to analysts working at intelligence agencies. Improved technologies must be provided to analysts at the tactical edge to make informed, reliable decisions, since this is often a critical collection point for important intelligence data. To aid tactical edge users, new types of intelligent, automated technology interfaces are required to allow them to rapidly explore information associated with the intersection of hard and soft data fusion, such as multi-INT signals, semantic models, social network data, and natural language processing of text. The ability to fuse these types of data is paramount to providing decision superiority. For these types of applications, we have developed BLADE. BLADE allows users to dynamically add, delete and link data via a semantic wiki, allowing for improved interaction between different users. Analysts can see information updates in near-real-time due to a common underlying set of semantic models operating within a triple store that allows for updates on related data points from independent users tracking different items (persons, events, locations, organizations, etc.). The wiki can capture pictures, videos and related information. New information added directly to pages is automatically updated in the triple store, and its provenance and pedigree are tracked over time, making that data more trustworthy and easily integrated with other users' pages.
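To make the wiki-to-triple-store idea concrete, here is a minimal toy sketch: pages assert triples, each triple keeps provenance (which user and page added it, when), and simple pattern queries retrieve related facts. It is an illustration only, not BLADE's implementation; all names are made up.

```python
# Toy triple store with provenance, in the spirit of the semantic-wiki design described above.
from datetime import datetime, timezone

triples = []  # each entry: (subject, predicate, object, provenance dict)

def assert_triple(s, p, o, user, page):
    """Record a fact together with who asserted it and where."""
    triples.append((s, p, o, {
        "user": user,
        "page": page,
        "time": datetime.now(timezone.utc).isoformat(),
    }))

def query(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Two analysts tracking different items contribute to the same store.
assert_triple("PersonA", "memberOf", "OrgX", user="analyst1", page="PersonA")
assert_triple("OrgX", "locatedIn", "CityY", user="analyst2", page="OrgX")

# A third analyst asking about OrgX sees both facts, with pedigree attached.
for s, p, o, prov in query():
    if "OrgX" in (s, o):
        print(s, p, o, "-- added by", prov["user"], "on page", prov["page"])
```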
One decade of the Data Fusion Information Group (DFIG) model
NASA Astrophysics Data System (ADS)
Blasch, Erik
2015-05-01
The revision of the Joint Directors of Laboratories (JDL) Information Fusion model in 2004 discussed information processing, incorporated the analyst, and was coined the Data Fusion Information Group (DFIG) model. Since that time, developments in information technology (e.g., cloud computing, applications, and multimedia) have altered the role of the analyst. Data production has outpaced the analyst; however, the analyst still has the role of data refinement and information reporting. In this paper, we highlight three examples being addressed by the DFIG model. One example is the role of the analyst to provide semantic queries (through an ontology) so that the vast amount of data available can be indexed, accessed, retrieved, and processed. The second example is reporting, which requires the analyst to collect the data into a condensed and meaningful form through information management. The last example is the interpretation of the resolved information from data, which must include contextual information not inherent in the data itself. Through a literature review, the DFIG developments in the last decade demonstrate the usability of the DFIG model to bring together the user (analyst or operator) and the machine (information fusion or manager) in a systems design.
An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.
Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V
2014-07-01
We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high-resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely coupled service-oriented architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology become more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus, an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.
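The start/pause/roll-back control loop described above can be sketched with state snapshots: the controller checkpoints the model every step so an analyst can rewind and apply a different intervention. The toy SIR-style update and the intervention hook are illustrative assumptions, not the system's actual epidemic model.

```python
# Sketch of interactive simulation control via snapshots (rewind, intervene, resume).
import copy

class ToyEpidemic:
    def __init__(self, s, i, r, beta=0.3, gamma=0.1):
        self.day, self.s, self.i, self.r = 0, s, i, r
        self.beta, self.gamma = beta, gamma

    def step(self):
        n = self.s + self.i + self.r
        new_inf = self.beta * self.s * self.i / n
        new_rec = self.gamma * self.i
        self.s -= new_inf
        self.i += new_inf - new_rec
        self.r += new_rec
        self.day += 1

class Controller:
    """Runs the model, snapshotting every step so it can roll back on demand."""
    def __init__(self, model):
        self.model = model
        self.history = [copy.deepcopy(model)]

    def run(self, days):
        for _ in range(days):
            self.model.step()
            self.history.append(copy.deepcopy(self.model))

    def rollback(self, day):
        self.model = copy.deepcopy(self.history[day])
        self.history = self.history[:day + 1]

ctrl = Controller(ToyEpidemic(s=9990, i=10, r=0))
ctrl.run(30)
print("day 30 infected:", round(ctrl.model.i))
ctrl.rollback(10)                 # analyst inspects state at day 10 and intervenes
ctrl.model.beta = 0.15            # e.g., a distancing intervention from day 10 onward
ctrl.run(20)
print("day 30 infected with intervention:", round(ctrl.model.i))
```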
A review method for UML requirements analysis model employing system-side prototyping.
Ogata, Shinpei; Matsuura, Saeko
2013-12-01
User interface prototyping is an effective method for users to validate the requirements defined by analysts at an early stage of software development. However, a user interface prototype system offers weak support for the analysts to verify the consistency of the specifications about internal aspects of a system such as business logic. As a result, the inconsistency causes significant rework costs, because it often makes it impossible for the developers to implement the system based on the specifications. For verifying such consistency, functional prototyping is an effective method for the analysts, but it is costly and requires more detailed specifications. In this paper, we propose a review method by which analysts can efficiently verify the consistency among several different kinds of UML diagrams by employing system-side prototyping without a detailed model. The system-side prototype system does not have any functions to achieve business logic, but visualizes the results of the integration among the UML diagrams as Web pages. The usefulness of our proposal was evaluated by applying it to the development of a Library Management System (LMS) for a laboratory, conducted by a group. As a result, our proposal was useful for discovering serious inconsistencies caused by misunderstandings among the members of the group.
Micro-based fact collection tool user's manual
NASA Technical Reports Server (NTRS)
Mayer, Richard
1988-01-01
A procedure designed for use by an analyst to assist in the collection and organization of data gathered during the interview processes associated with system analysis and modeling tasks is described. The basic concept behind the development of this tool is that during the interview process an analyst is presented with assertions of facts by the domain expert. The analyst also makes observations of the domain. These facts need to be collected and preserved in such a way as to allow them to serve as the basis for a number of decision-making processes throughout the system development process. This tool can be thought of as a computerization of the analyst's notebook.
1994-07-01
...provide additional information for the user / policy analyst: Eichers, D., Sola, M., McLernan, G., EPICC User's Manual, Systems Research and Applications... maintenance, and a set of on-line help screens. Each is further discussed below and a full discussion is included in the EPICC User's Manual. Menu based... written documentation (user's manual) that will be provided with the model. The next chapter discusses the validation of the inventory projection and...
Cutter Resource Effectiveness Evaluation (CREE) Program : A Guide for Users and Analysts
DOT National Transportation Integrated Search
1978-03-01
The Cutter Resource Effectiveness Evaluation (CREE) project has developed a sophisticated, user-oriented computer model which can evaluate the effectiveness of any existing Coast Guard craft, or the effectiveness of any of a number of proposed altern...
A users guide for SAMM: a prototype southeast Alaska multiresource model.
D.L. Weyermann; R.D. Fight; L.D. Garrett
1991-01-01
This paper instructs resource analysts on using the southeast Alaska multiresource model (SAMM). SAMM is an interactive microcomputer program that allows users to explore relations among several resources in southeast Alaska (timber, anadromous fish, deer, and hydrology) and the effects of timber management activities (logging, thinning, and road building) on those...
Software for Partly Automated Recognition of Targets
NASA Technical Reports Server (NTRS)
Opitz, David; Blundell, Stuart; Bain, William; Morris, Matthew; Carlson, Ian; Mangrich, Mark; Selinsky, T.
2002-01-01
The Feature Analyst is a computer program for assisted (partially automated) recognition of targets in images. This program was developed to accelerate the processing of high-resolution satellite image data for incorporation into geographic information systems (GIS). This program creates an advanced user interface that embeds proprietary machine-learning algorithms in commercial image-processing and GIS software. A human analyst provides samples of target features from multiple sets of data, then the software develops a data-fusion model that automatically extracts the remaining features from selected sets of data. The program thus leverages the natural ability of humans to recognize objects in complex scenes, without requiring the user to explain the human visual recognition process by means of lengthy software. Two major subprograms are the reactive agent and the thinking agent. The reactive agent strives to quickly learn the user's tendencies while the user is selecting targets and to increase the user's productivity by immediately suggesting the next set of pixels that the user may wish to select. The thinking agent utilizes all available resources, taking as much time as needed, to produce the most accurate autonomous feature-extraction model possible.
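The "reactive agent" behavior described above can be illustrated with a small learn-and-suggest loop: fit a classifier on the pixels the analyst has labeled so far and immediately surface the unlabeled pixels most likely to belong to the target feature. This is an analogue of the idea, not Feature Analyst's proprietary algorithm; the feature vectors are random stand-ins for image data.

```python
# Minimal learn-and-suggest sketch of a "reactive agent" for assisted feature extraction.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
pixels = rng.normal(size=(1000, 4))            # e.g., 4 spectral bands per pixel

# Analyst has labeled a handful of pixels: 1 = target feature, 0 = background.
labeled_idx = np.arange(40)
labels = (pixels[labeled_idx, 0] > np.median(pixels[labeled_idx, 0])).astype(int)  # toy labels

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(pixels[labeled_idx], labels)

# Score every unlabeled pixel and surface the most promising ones for the analyst.
unlabeled_idx = np.arange(40, len(pixels))
scores = model.predict_proba(pixels[unlabeled_idx])[:, 1]
suggestions = unlabeled_idx[np.argsort(scores)[::-1][:10]]
print("suggest labeling pixels:", suggestions)
```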
MetaboAnalystR: an R package for flexible and reproducible analysis of metabolomics data.
Chong, Jasmine; Xia, Jianguo
2018-06-28
The MetaboAnalyst web application has been widely used for metabolomics data analysis and interpretation. Despite its user-friendliness, the web interface has inherent limitations (especially for advanced users) with regard to flexibility in creating customized workflows, support for reproducible analysis, and capacity for dealing with large data. To address these limitations, we have developed a companion R package (MetaboAnalystR) based on the R code base of the web server. The package has been thoroughly tested to ensure that the same R commands will produce identical results from both interfaces. MetaboAnalystR complements the MetaboAnalyst web server to facilitate transparent, flexible and reproducible analysis of metabolomics data. MetaboAnalystR is freely available from https://github.com/xia-lab/MetaboAnalystR. Supplementary data are available at Bioinformatics online.
Oil and Gas Supply Module - NEMS Documentation
2017-01-01
Defines the objectives of the Oil and Gas Supply Model (OGSM), describes the model's basic approach, and provides detail on how the model works. This report is intended as a reference document for model analysts, users, and the public.
Interpreting Black-Box Classifiers Using Instance-Level Visual Explanations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tamagnini, Paolo; Krause, Josua W.; Dasgupta, Aritra
2017-05-14
To realize the full potential of machine learning in diverse real-world domains, it is necessary for model predictions to be readily interpretable and actionable for the human in the loop. Analysts, who are the users but not the developers of machine learning models, often do not trust a model because of the lack of transparency in associating predictions with the underlying data space. To address this problem, we propose Rivelo, a visual analytic interface that enables analysts to understand the causes behind predictions of binary classifiers by interactively exploring a set of instance-level explanations. These explanations are model-agnostic, treating a model as a black box, and they help analysts in interactively probing the high-dimensional binary data space for detecting features relevant to predictions. We demonstrate the utility of the interface with a case study analyzing a random forest model on the sentiment of Yelp reviews about doctors.
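A minimal sketch of a model-agnostic, instance-level explanation in the spirit described above: treat the classifier as a black box over binary features and ask, for one instance, which active features most reduce the predicted probability when switched off. This illustrates the general idea only; it is not the Rivelo algorithm, and the data and hidden rule are invented.

```python
# Instance-level, model-agnostic explanation by feature flipping on binary data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(500, 10))                 # binary data space
y = ((X[:, 0] & X[:, 3]) | X[:, 7]).astype(int)        # hidden rule the model learns

black_box = LogisticRegression(max_iter=1000).fit(X, y)

def explain(instance):
    """Score each active feature by the drop in P(class=1) when it is turned off."""
    base = black_box.predict_proba(instance.reshape(1, -1))[0, 1]
    contributions = {}
    for j in np.flatnonzero(instance):
        perturbed = instance.copy()
        perturbed[j] = 0
        contributions[j] = base - black_box.predict_proba(perturbed.reshape(1, -1))[0, 1]
    return base, contributions

prob, contrib = explain(X[0])
print("predicted P(positive) =", round(prob, 3))
for feat, drop in sorted(contrib.items(), key=lambda kv: -kv[1]):
    print(f"feature {feat}: turning it off changes the prediction by {-drop:+.3f}")
```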
Physics-based and human-derived information fusion for analysts
NASA Astrophysics Data System (ADS)
Blasch, Erik; Nagy, James; Scott, Steve; Okoth, Joshua; Hinman, Michael
2017-05-01
Recent trends in physics-based and human-derived information fusion (PHIF) have amplified the capabilities of analysts; however, with the big data opportunities there is a need for open architecture designs, methods of distributed team collaboration, and visualizations. In this paper, we explore recent trends in information fusion to support user interaction and machine analytics. Challenging scenarios requiring PHIF include combining physics-based video data with human-derived text data for enhanced simultaneous tracking and identification. A driving effort would be to provide analysts with applications, tools, and interfaces that afford effective and affordable solutions for timely decision making. Fusion at scale should be developed to allow analysts to access data, call analytics routines, enter solutions, update models, and store results for distributed decision making.
NASTRAN user's guide (Level 17.5)
NASA Technical Reports Server (NTRS)
Field, E. I.; Herting, D. N.; Morgan, M. J.
1979-01-01
The user's guide is a handbook for engineers and analysts who use the NASTRAN finite element computer program. It supplements the NASTRAN Theoretical Manual (NASA SP-221), the NASTRAN User's Manual (NASA SP-222), the NASTRAN Programmer's Manual (NASA SP-223), and the NASTRAN Demonstration Program Manual (NASA SP-224). It provides modeling hints, attributes of the program, and references to the four manuals listed.
Software for Partly Automated Recognition of Targets
NASA Technical Reports Server (NTRS)
Opitz, David; Blundell, Stuart; Bain, William; Morris, Matthew; Carlson, Ian; Mangrich, Mark
2003-01-01
The Feature Analyst is a computer program for assisted (partially automated) recognition of targets in images. This program was developed to accelerate the processing of high-resolution satellite image data for incorporation into geographic information systems (GIS). This program creates an advanced user interface that embeds proprietary machine-learning algorithms in commercial image-processing and GIS software. A human analyst provides samples of target features from multiple sets of data, then the software develops a data-fusion model that automatically extracts the remaining features from selected sets of data. The program thus leverages the natural ability of humans to recognize objects in complex scenes, without requiring the user to explain the human visual recognition process by means of lengthy software. Two major subprograms are the reactive agent and the thinking agent. The reactive agent strives to quickly learn the user's tendencies while the user is selecting targets and to increase the user's productivity by immediately suggesting the next set of pixels that the user may wish to select. The thinking agent utilizes all available resources, taking as much time as needed, to produce the most accurate autonomous feature-extraction model possible.
2016-11-01
Evaluation of Visualization Tools for Computer Network Defense Analysts: Display Design, Methods, and Results for a User Study, by Christopher J. Garneau and Robert F. Erbacher, US Army Research Laboratory, November 2016 (report period January 2013–September 2015); approved for public...
Emissions Scenario Portal for Visualization of Low Carbon Pathways
NASA Astrophysics Data System (ADS)
Friedrich, J.; Hennig, R. J.; Mountford, H.; Altamirano, J. C.; Ge, M.; Fransen, T.
2016-12-01
This presentation proposal centers on a new project developed collaboratively by the World Resources Institute (WRI), Google Inc., and the Deep Decarbonization Pathways Project (DDPP). The project aims to develop an online, open portal, the Emissions Scenario Portal (ESP), to enable users to easily visualize a range of future greenhouse gas emission pathways linked to different scenarios of economic and energy developments, drawing from a variety of modeling tools. It is targeted at users who are not modeling experts, but instead policy analysts or advisors, investment analysts, and similar audiences who draw on modeled scenarios to inform their work, and who can benefit from better access to, and transparency around, the wide range of emerging scenarios on ambitious climate action. The ESP will provide information from scenarios in a visually appealing and easy-to-understand manner that enables these users to recognize the opportunities to reduce GHG emissions, the implications of the different scenarios, and the underlying assumptions. To facilitate the application of the portal and tools in policy dialogues, a series of country-specific and potentially sector-specific workshops with key decision-makers and analysts, supported by relevant analysis, will be organized by the key partners and also in broader collaboration with others who might wish to convene relevant groups around the information. This project will provide opportunities for modelers to increase their outreach and visibility in the public space and to directly interact with key audiences of emissions scenarios, such as policy analysts and advisors. The information displayed on the portal will cover a wide range of indicators, sectors and important scenario characteristics such as macroeconomic information, emission factors, and policy as well as technology assumptions in order to facilitate comparison. These indicators have been selected based on existing standards (such as the IIASA AR5 database, the Greenhouse Gas Protocol and accounting literature) and stakeholder consultations. Examples of use cases include technical advisers for governments, NGO/civil society advocates, investors and bankers, modelers and academics, and business sustainability officers.
SAFARI, an On-Line Text-Processing System User's Manual.
ERIC Educational Resources Information Center
Chapin, P.G.; And Others.
This report describes for the potential user a set of procedures for processing textual materials on-line. In this preliminary model an information analyst can scan through messages, reports, and other documents on a display scope and select relevant facts, which are processed linguistically and then stored in the computer in the form of logical…
Human-machine interaction to disambiguate entities in unstructured text and structured datasets
NASA Astrophysics Data System (ADS)
Ward, Kevin; Davenport, Jack
2017-05-01
Creating entity network graphs is a manual, time-consuming process for an intelligence analyst. Beyond the traditional big data problems of information overload, individuals are often referred to by multiple names and shifting titles as they advance in their organizations over time, which quickly makes simple string or phonetic alignment methods for entities insufficient. Conversely, automated methods for relationship extraction and entity disambiguation typically produce questionable results with no way for users to vet results, correct mistakes or influence the algorithm's future results. We present an entity disambiguation tool, DRADIS, which aims to bridge the gap between human-centric and machine-centric methods. DRADIS automatically extracts entities from multi-source datasets and models them as a complex set of attributes and relationships. Entities are disambiguated across the corpus using a hierarchical model executed in Spark, allowing it to scale to operational-sized data. Resolution results are presented to the analyst complete with sourcing information for each mention and relationship, allowing analysts to quickly vet the correctness of results as well as correct mistakes. Corrected results are used by the system to refine the underlying model, allowing analysts to optimize the general model to better deal with their operational data. Providing analysts with the ability to validate and correct the model to produce a system they can trust enables them to better focus their time on producing higher quality analysis products.
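A toy sketch of attribute-based entity disambiguation along the lines described above: candidate mentions are compared on several attributes, a weighted similarity is computed, and pairs above a threshold are flagged as possible merges for analyst review. The names, weights, and threshold are illustrative assumptions; DRADIS's hierarchical Spark model is not shown.

```python
# Weighted attribute similarity for candidate entity mention pairs (illustrative only).
from difflib import SequenceMatcher

def sim(a, b):
    """String similarity in [0, 1]; missing values contribute nothing."""
    if not a or not b:
        return 0.0
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

WEIGHTS = {"name": 0.5, "title": 0.2, "organization": 0.3}

def match_score(m1, m2):
    return sum(w * sim(m1.get(attr), m2.get(attr)) for attr, w in WEIGHTS.items())

mentions = [
    {"id": 1, "name": "Robert Smith", "title": "Major", "organization": "3rd Brigade"},
    {"id": 2, "name": "Bob Smith", "title": "Maj.", "organization": "3rd Bde"},
    {"id": 3, "name": "R. Jones", "title": "Analyst", "organization": "HQ"},
]

THRESHOLD = 0.55
for i in range(len(mentions)):
    for j in range(i + 1, len(mentions)):
        score = match_score(mentions[i], mentions[j])
        verdict = "candidate merge (vet with analyst)" if score >= THRESHOLD else "distinct"
        print(mentions[i]["id"], mentions[j]["id"], round(score, 2), verdict)
```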
User's guide to the Reliability Estimation System Testbed (REST)
NASA Technical Reports Server (NTRS)
Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam
1992-01-01
The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
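The modularization idea described above can be illustrated with a tiny roll-up: reliabilities are computed per module and then combined into a system-level figure. The series/parallel combinators and the numbers are illustrative assumptions; REST/RML's actual semantics (message passing, failure-mode effects simulation) are far richer than this.

```python
# Per-module reliabilities combined into a system estimate (illustrative sketch).
import math

def exp_reliability(failure_rate, hours):
    """Reliability of a component with constant failure rate (exponential model)."""
    return math.exp(-failure_rate * hours)

def series(*rs):
    """System works only if every module works."""
    return math.prod(rs)

def parallel(*rs):
    """System works if at least one redundant module works."""
    return 1.0 - math.prod(1.0 - r for r in rs)

t = 10_000  # mission hours
cpu = exp_reliability(1e-5, t)
bus = exp_reliability(5e-6, t)
sensor = exp_reliability(2e-5, t)

# Triplex-redundant CPUs in parallel, then in series with the bus and sensor modules.
system = series(parallel(cpu, cpu, cpu), bus, sensor)
print(f"CPU module: {cpu:.4f}, system: {system:.4f}")
```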
LG-ANALYST: linguistic geometry for master air attack planning
NASA Astrophysics Data System (ADS)
Stilman, Boris; Yakhnis, Vladimir; Umanskiy, Oleg
2003-09-01
We investigate the technical feasibility of implementing LG-ANALYST, a new software tool based on the Linguistic Geometry (LG) approach. The tool will be capable of modeling and providing solutions to Air Force related battlefield problems and of conducting multiple experiments to verify the quality of the solutions it generates. LG-ANALYST will support generation of the Fast Master Air Attack Plan (MAAP) with subsequent conversion into Air Tasking Order (ATO). An Air Force mission is modeled employing abstract board games (ABG). Such a mission may include, for example, an aircraft strike package moving to a target area with the opposing side having ground-to-air missiles, anti-aircraft batteries, fighter wings, and radars. The corresponding abstract board captures 3D air space, terrain, the aircraft trajectories, positions of the batteries, strategic features of the terrain, such as bridges, and their status, radars and illuminated space, etc. Various animated views are provided by LG-ANALYST including a 3D view for realistic representation of the battlespace and a 2D view for ease of analysis and control. LG-ANALYST will allow a user to model full scale intelligent enemy, plan in advance, re-plan and control in real time Blue and Red forces by generating optimal (or near-optimal) strategies for all sides of a conflict.
ListingAnalyst: A program for analyzing the main output file from MODFLOW
Winston, Richard B.; Paulinski, Scott
2014-01-01
ListingAnalyst is a Windows® program for viewing the main output file from MODFLOW-2005, MODFLOW-NWT, or MODFLOW-LGR. It organizes and displays large files quickly without using excessive memory. The sections and subsections of the file are displayed in a tree-view control, which allows the user to navigate quickly to desired locations in the files. ListingAnalyst gathers error and warning messages scattered throughout the main output file and displays them all together in an error and a warning tab. A grid view displays tables in a readable format and allows the user to copy the table into a spreadsheet. The user can also search the file for terms of interest.
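The error- and warning-gathering behavior described above can be sketched as a simple scan that collects flagged lines from a listing file in one place. The marker strings and file name are assumptions for illustration; ListingAnalyst itself parses the listing's full section structure, not just keywords.

```python
# Gather ERROR/WARNING lines from a MODFLOW-style listing file (illustrative sketch).
import pathlib

def collect_messages(path, markers=("ERROR", "WARNING")):
    found = {m: [] for m in markers}
    with open(path, encoding="utf-8", errors="replace") as fh:
        for lineno, line in enumerate(fh, start=1):
            upper = line.upper()
            for marker in markers:
                if marker in upper:
                    found[marker].append((lineno, line.rstrip()))
    return found

if __name__ == "__main__":
    listing = pathlib.Path("model.lst")          # hypothetical listing file name
    if listing.exists():
        messages = collect_messages(listing)
        for kind, hits in messages.items():
            print(f"{kind}: {len(hits)} message(s)")
            for lineno, text in hits[:5]:
                print(f"  line {lineno}: {text}")
    else:
        print("no listing file found; point collect_messages at a MODFLOW .lst file")
```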
Model documentation report: Transportation sector model of the National Energy Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-03-01
This report documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model. This document serves three purposes. First, it is a reference document providing a detailed description of TRAN for model analysts, users, and the public. Second, this report meets the legal requirements of the Energy Information Administration (EIA) to provide adequate documentation in support of its statistical and forecast reports (Public Law 93-275, 57(b)(1)). Third, it permits continuity in model development by providing documentation from which energy analysts can undertake model enhancements, data updates, and parameter refinements.
Evaluating the risk of industrial espionage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bott, T.F.
1998-12-31
A methodology for estimating the relative probabilities of different compromise paths for protected information by insider and visitor intelligence collectors has been developed based on an event-tree analysis of the intelligence collection operation. The analyst identifies target information and ultimate users who might attempt to gain that information. The analyst then uses an event tree to develop a set of compromise paths. Probability models are developed for each of the compromise paths that use parameters based on expert judgment or historical data on security violations. The resulting probability estimates indicate the relative likelihood of different compromise paths and provide an input for security resource allocation. Application of the methodology is demonstrated using a national security example. A set of compromise paths and probability models specifically addressing this example espionage problem are developed. The probability models for hard-copy information compromise paths are quantified as an illustration of the results, using parametric values representative of historical data available in secure facilities, supplemented where necessary by expert judgment.
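An illustrative event-tree calculation in the spirit of this methodology: each compromise path is a sequence of branch events, its probability is the product of the branch probabilities, and paths are ranked to guide resource allocation. The events and numbers below are hypothetical placeholders, not values from the paper.

```python
# Rank hypothetical compromise paths by product of branch probabilities.
compromise_paths = {
    "insider copies hard-copy document": [
        ("has routine access to document", 0.30),
        ("removes copy undetected",        0.10),
        ("delivers copy to ultimate user", 0.80),
    ],
    "visitor photographs display": [
        ("gains escorted access",          0.20),
        ("views protected information",    0.15),
        ("records it undetected",          0.05),
    ],
}

def path_probability(branches):
    p = 1.0
    for _event, prob in branches:
        p *= prob
    return p

ranked = sorted(
    ((name, path_probability(branches)) for name, branches in compromise_paths.items()),
    key=lambda item: item[1], reverse=True,
)
for name, p in ranked:
    print(f"{p:.4f}  {name}")
```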
OpinionFlow: Visual Analysis of Opinion Diffusion on Social Media.
Wu, Yingcai; Liu, Shixia; Yan, Kai; Liu, Mengchen; Wu, Fangzhao
2014-12-01
It is important for many different applications such as government and business intelligence to analyze and explore the diffusion of public opinions on social media. However, the rapid propagation and great diversity of public opinions on social media pose great challenges to effective analysis of opinion diffusion. In this paper, we introduce a visual analysis system called OpinionFlow to empower analysts to detect opinion propagation patterns and glean insights. Inspired by the information diffusion model and the theory of selective exposure, we develop an opinion diffusion model to approximate opinion propagation among Twitter users. Accordingly, we design an opinion flow visualization that combines a Sankey graph with a tailored density map in one view to visually convey diffusion of opinions among many users. A stacked tree is used to allow analysts to select topics of interest at different levels. The stacked tree is synchronized with the opinion flow visualization to help users examine and compare diffusion patterns across topics. Experiments and case studies on Twitter data demonstrate the effectiveness and usability of OpinionFlow.
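The diffusion-with-selective-exposure idea mentioned above can be illustrated with a toy simulation in which users adopt a neighbor's opinion shift only when it is already close to their own. This is a generic sketch, not OpinionFlow's actual model; the network, threshold, and update rule are invented.

```python
# Toy opinion diffusion with selective exposure (illustrative assumptions throughout).
import random

random.seed(42)
N, STEPS, EXPOSURE = 200, 50, 0.3          # users, time steps, openness threshold
opinions = [random.uniform(-1, 1) for _ in range(N)]
followers = {u: random.sample(range(N), 5) for u in range(N)}   # whom each user hears

for _ in range(STEPS):
    updated = opinions[:]
    for u in range(N):
        for v in followers[u]:
            # Selective exposure: only nearby opinions are persuasive.
            if abs(opinions[v] - opinions[u]) < EXPOSURE:
                updated[u] += 0.1 * (opinions[v] - opinions[u])
    opinions = updated

positive = sum(o > 0 for o in opinions)
print(f"after {STEPS} steps: {positive} users lean positive, {N - positive} lean negative")
```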
A content analysis of analyst research: health care through the eyes of analysts.
Nielsen, Christian
2008-01-01
This article contributes to the understanding of how health care companies may communicate their business models by studying financial analysts' reports. The study examines the differences between the information conveyed in recurrent and fundamental analyst reports as well as whether the characteristics of the analysts and their environment affect their business model analyses. A medium-sized health care company in the medical-technology sector, internationally renowned for its state-of-the-art business reporting, was chosen as the basis for the study. An analysis of 111 fundamental and recurrent analyst reports on this company, by each investment bank actively following it, was conducted using a content analysis methodology. The study reveals that the recurrent analyses are concerned with evaluating the information disclosed by the health care company itself and not so much with digging up new information. It also indicates that while maintenance work might be focused on evaluating specific details, fundamental research is more concerned with extending the understanding of the general picture, i.e., the sustainability and performance of the overall business model. The amount of financial information disclosed in either type of report is not correlated to the other disclosures in the reports. In comparison to business reporting practices, the fundamental analyst reports put considerably less weight on social and sustainability, intellectual capital, and corporate governance information, and they disclose much less comparable non-financial information. The suggestion made is that looking at the types of information financial analysts consider important and convey to their "customers," the investors and fund managers, constitutes a valuable indication to health care companies regarding the needs of the financial market users of their reports and other communications. There are some limitations to the possibility of applying statistical tests to the data set, as well as methodological limitations in relation to the exclusion of tables and graphs.
King, Robert; Parker, Simon; Mouzakis, Kon; Fletcher, Winston; Fitzgerald, Patrick
2007-11-01
The Integrated Task Modeling Environment (ITME) is a user-friendly software tool that has been developed to automatically recode low-level data into an empirical record of meaningful task performance. The present research investigated and validated the performance of the ITME software package by conducting complex simulation missions and comparing the task analyses produced by ITME with task analyses produced by experienced video analysts. A very high interrater reliability (≥ .94) existed between experienced video analysts and the ITME for the task analyses produced for each mission. The mean session time:analysis time ratio was 1:24 using video analysis techniques and 1:5 using the ITME. It was concluded that the ITME produced task analyses that were as reliable as those produced by experienced video analysts, and significantly reduced the time cost associated with these analyses.
Collaborative human-machine analysis using a controlled natural language
NASA Astrophysics Data System (ADS)
Mott, David H.; Shemanski, Donald R.; Giammanco, Cheryl; Braines, Dave
2015-05-01
A key aspect of an analyst's task in providing relevant information from data is reasoning about the implications of that data, in order to build a picture of the real world situation. This requires human cognition, based upon domain knowledge about individuals, events and environmental conditions. For a computer system to collaborate with an analyst, it must be capable of following a similar reasoning process to that of the analyst. We describe ITA Controlled English (CE), a subset of English used to represent an analyst's domain knowledge and reasoning in a form that is understandable by both analyst and machine. CE can be used to express domain rules, background data, assumptions and inferred conclusions, thus supporting human-machine interaction. A CE reasoning and modeling system can perform inferences from the data and provide the user with conclusions together with their rationale. We present a logical problem called the "Analysis Game", used for training analysts, which presents "analytic pitfalls" inherent in many problems. We explore an iterative approach to its representation in CE, where a person can develop an understanding of the problem solution by incremental construction of relevant concepts and rules. We discuss how such interactions might occur, and propose that such techniques could lead to better collaborative tools to assist the analyst and avoid the "pitfalls".
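The rule-based inference-with-rationale described above can be sketched with a minimal forward chainer that derives conclusions from facts and reports which rule and bindings justified each conclusion. The rules here are plain Python data, not the ITA Controlled English syntax; facts and the rule are invented.

```python
# Minimal forward chaining with rationale (illustrative; not CE syntax).
facts = {
    ("vehicle1", "seen_near", "border"),
    ("vehicle1", "type", "truck"),
}

# Each rule: (name, premise patterns, conclusion). Terms starting with "?" are variables.
rules = [
    ("suspicious-truck",
     [("?v", "type", "truck"), ("?v", "seen_near", "border")],
     ("?v", "status", "of_interest")),
]

def match(premise, fact, bindings):
    """Unify one premise pattern with one fact under the current variable bindings."""
    new = dict(bindings)
    for p, f in zip(premise, fact):
        if p.startswith("?"):
            if p in new and new[p] != f:
                return None
            new[p] = f
        elif p != f:
            return None
    return new

def satisfy(premises, bindings):
    """Yield every binding set that satisfies all premises against the fact base."""
    if not premises:
        yield bindings
        return
    for fact in facts:
        b = match(premises[0], fact, bindings)
        if b is not None:
            yield from satisfy(premises[1:], b)

def substitute(pattern, bindings):
    return tuple(bindings.get(term, term) for term in pattern)

# Fire rules until nothing new is concluded, reporting the rationale for each conclusion.
changed = True
while changed:
    changed = False
    for name, premises, conclusion in rules:
        for bindings in list(satisfy(premises, {})):
            new_fact = substitute(conclusion, bindings)
            if new_fact not in facts:
                facts.add(new_fact)
                changed = True
                print("concluded", new_fact, "because rule", name, "matched with", bindings)
```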
How Analysts Cognitively “Connect the Dots”
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradel, Lauren; Self, Jessica S.; Endert, Alexander
2013-06-04
As analysts attempt to make sense of a collection of documents, such as intelligence analysis reports, they may wish to "connect the dots" between pieces of information that may initially seem unrelated. This process of synthesizing information requires users to make connections between pairs of documents, creating a conceptual story. We conducted a user study to analyze the process by which users connect pairs of documents and how they spatially arrange information. Users created conceptual stories that connected the dots using organizational strategies that ranged in complexity. We propose taxonomies for the cognitive connections and physical structures used when trying to "connect the dots" between two documents. We compared the user-created stories with a data-mining algorithm that constructs chains of documents using co-occurrence metrics. Using the insight gained into the storytelling process, we offer design considerations for the existing data-mining algorithm and corresponding tools to combine the power of data mining and the complex cognitive processing of analysts.
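A toy sketch of chaining two documents through intermediates that share terms, analogous in spirit to the co-occurrence-based chains mentioned above (not the specific algorithm from the study): documents with term overlap are linked and a breadth-first search finds the shortest chain. The documents and threshold are invented.

```python
# Shortest chain of documents linked by shared terms (illustrative sketch).
from collections import deque

docs = {
    "report_A": {"aliases", "wire transfer", "harbor"},
    "report_B": {"wire transfer", "front company"},
    "report_C": {"front company", "flight school"},
    "report_D": {"flight school", "visa overstay"},
    "report_E": {"fishing permits", "harbor"},
}

def connected(d1, d2, min_shared=1):
    return len(docs[d1] & docs[d2]) >= min_shared

def connect_the_dots(start, end):
    """Breadth-first search for the shortest chain of documents linking start to end."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        chain = queue.popleft()
        if chain[-1] == end:
            return chain
        for nxt in docs:
            if nxt not in seen and connected(chain[-1], nxt):
                seen.add(nxt)
                queue.append(chain + [nxt])
    return None

chain = connect_the_dots("report_A", "report_D")
print(" -> ".join(chain))
for a, b in zip(chain, chain[1:]):
    print(f"  {a} & {b} share: {docs[a] & docs[b]}")
```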
Liquid Fuels Market Module - NEMS Documentation
2017-01-01
Defines the objectives of the Liquid Fuels Market Model (LFMM), describes its basic approach, and provides detail on how it works. This report is intended as a reference document for model analysts, users, and the public. This edition of the LFMM reflects changes made to the module over the past two years for the Annual Energy Outlook 2016.
Using participatory design to develop (public) health decision support systems through GIS.
Dredger, S Michelle; Kothari, Anita; Morrison, Jason; Sawada, Michael; Crighton, Eric J; Graham, Ian D
2007-11-27
Organizations that collect substantial data for decision-making purposes are often characterized as being 'data rich' but 'information poor'. Maps and mapping tools can be very useful for research transfer in converting locally collected data into information. Challenges involved in incorporating GIS applications into the decision-making process within the non-profit (public) health sector include a lack of financial resources for software acquisition and training for non-specialists to use such tools. This on-going project has two primary phases. This paper critically reflects on Phase 1: the participatory design (PD) process of developing a collaborative web-based GIS tool. A case study design is being used whereby the case is defined as the data analyst and manager dyad (a two-person team) in selected Ontario Early Year Centres (OEYCs). Multiple cases are used to support the reliability of findings. With nine producer/user pair participants, the goal in Phase 1 was to identify barriers to map production and, through the participatory design process, develop a web-based GIS tool suited for data analysts and their managers. This study has been guided by the Ottawa Model of Research Use (OMRU) conceptual framework. Due to wide variations in OEYC structures, only some data analysts used mapping software and there was no consistency or standardization in the software being used. Consequently, very little sharing of maps and data occurred among data analysts. Using PD, this project developed a web-based mapping tool (EYEMAP) that was easy to use, protected proprietary data, and permitted limited and controlled sharing between participants. By providing data analysts with training on its use, the project also ensured that data analysts would not break cartographic conventions (e.g. using a choropleth map for count data). Interoperability was built into the web-based solution; that is, EYEMAP can read many different standard mapping file formats (e.g. ESRI, MapInfo, CSV). Based on the evaluation of Phase 1, the PD process has served both as a facilitator and a barrier. In terms of successes, the PD process identified two key components that are important to users: increased data/map sharing functionality and interoperability. Some of the challenges affected developers and users, both individually and as a collective. From a development perspective, this project experienced difficulties in obtaining personnel skilled in web application development and GIS. For users, some data sharing barriers are beyond what a technological tool can address (e.g. third party data). Lastly, the PD process occurs in real time; this is both a strength and a limitation. Programmatic changes at the provincial level and staff turnover at the organizational level made it difficult to maintain buy-in as participants changed over time. The impacts of these successes and challenges will be evaluated more concretely at the end of Phase 2. PD approaches, by their very nature, encourage buy-in to the development process, better address user needs, and create a sense of user investment and ownership.
Users manual for the US baseline corn and soybean segment classification procedure
NASA Technical Reports Server (NTRS)
Horvath, R.; Colwell, R. (Principal Investigator); Hay, C.; Metzler, M.; Mykolenko, O.; Odenweller, J.; Rice, D.
1981-01-01
A user's manual for the classification component of the FY-81 U.S. Corn and Soybean Pilot Experiment in the Foreign Commodity Production Forecasting Project of AgRISTARS is presented. This experiment is one of several major experiments in AgRISTARS designed to measure and advance the remote sensing technologies for cropland inventory. The classification procedure discussed is designed to produce segment proportion estimates for corn and soybeans in the U.S. Corn Belt (Iowa, Indiana, and Illinois) using LANDSAT data. The estimates are produced by an integrated Analyst/Machine procedure. The Analyst selects acquisitions, participates in stratification, and assigns crop labels to selected samples. In concert with the Analyst, the machine digitally preprocesses LANDSAT data to remove external effects, stratifies the data into field like units and into spectrally similar groups, statistically samples the data for Analyst labeling, and combines the labeled samples into a final estimate.
Synfuel program analysis. Volume 2: VENVAL users manual
NASA Astrophysics Data System (ADS)
Muddiman, J. B.; Whelan, J. W.
1980-07-01
This volume is intended for program analysts and is a user's manual for the VENVAL model. It contains specific explanations as to input data requirements and programming procedures for the use of this model. VENVAL is a generalized computer program to aid in evaluation of prospective private sector production ventures. The program can project interrelated values of installed capacity, production, sales revenue, operating costs, depreciation, investment, debt, earnings, taxes, return on investment, depletion, and cash flow measures. It can also compute related public sector and other external costs and revenues if unit costs are furnished.
SURVIAC Bulletin: AFRL Research Audit Trail Viewer (ATV). Volume 19, Issue 1, 2003
2003-01-01
...Trail Viewer, the analyst obtained a close-up view of the detailed aircraft model using the Orbit View, enabled the SkyBox, enabled fictional ter... trails and element projections, several simulated terrain types and SkyBox environments to help the user maintain perspective, file based...
The discounting model selector: Statistical software for delay discounting applications.
Gilroy, Shawn P; Franck, Christopher T; Hantula, Donald A
2017-05-01
Original, open-source computer software was developed and validated against established delay discounting methods in the literature. The software executed approximate Bayesian model selection methods on user-supplied temporal discounting data and computed the effective delay 50 (ED50) from the best performing model. The software was custom-designed to enable behavior analysts to conveniently apply recent statistical methods to temporal discounting data with the aid of a graphical user interface (GUI). The results of independent validation of the approximate Bayesian model selection methods indicated that the program provided results identical to those of the original source paper and its methods. Monte Carlo simulation (n = 50,000) confirmed that the true model was selected most often in each setting. Simulation code and data for this study were posted to an online repository for use by other researchers. The model selection approach was applied to three existing delay discounting data sets from the literature in addition to the data from the source paper. Comparisons of model-selected ED50 were consistent with traditional indices of discounting. Conceptual issues related to the development and use of computer software by behavior analysts and the opportunities afforded by free and open-source software are discussed, and a review of possible expansions of this software is provided. © 2017 Society for the Experimental Analysis of Behavior.
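As a worked sketch of the quantities mentioned above, under the simple hyperbolic model V = A / (1 + kD) the effective delay 50 (ED50) is 1/k. The package itself performs approximate Bayesian selection among several candidate models, which is not reproduced here; the indifference-point data below are invented for illustration.

```python
# Fit a hyperbolic discounting curve and report ED50 = 1/k (illustrative data).
import numpy as np
from scipy.optimize import curve_fit

delays = np.array([1, 7, 30, 90, 180, 365], dtype=float)          # days
indifference = np.array([0.95, 0.80, 0.55, 0.35, 0.22, 0.12])     # proportion of amount

def hyperbolic(d, k):
    return 1.0 / (1.0 + k * d)

(k_hat,), _ = curve_fit(hyperbolic, delays, indifference, p0=[0.01])
ed50 = 1.0 / k_hat
print(f"estimated k = {k_hat:.4f} per day, ED50 = {ed50:.1f} days")
```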
Development and Application of an Analyst Process Model for a Search Task Scenario
2013-12-01
...varied experience levels of the users; we will be looking at not only testing the new tool, but also understanding the impact on user groups that the... each group using the toolsets to complete search tasks. Hypotheses: this research effort seeks to test the following hypotheses: H0... quantitative measures: report quality, errors, and cognitive workload. Due to the crossover design of the experiment, these were analyzed by group and within...
GRODY - GAMMA RAY OBSERVATORY DYNAMICS SIMULATOR IN ADA
NASA Technical Reports Server (NTRS)
Stark, M.
1994-01-01
Analysts use a dynamics simulator to test the attitude control system algorithms used by a satellite. The simulator must simulate the hardware, dynamics, and environment of the particular spacecraft and provide user services which enable the analyst to conduct experiments. Researchers at Goddard's Flight Dynamics Division developed GRODY alongside GROSS (GSC-13147), a FORTRAN simulator which performs the same functions, in a case study to assess the feasibility and effectiveness of the Ada programming language for flight dynamics software development. They used popular object-oriented design techniques to link the simulator's design with its function. GRODY is designed for analysts familiar with spacecraft attitude analysis. The program supports maneuver planning as well as analytical testing and evaluation of the attitude determination and control system used on board the Gamma Ray Observatory (GRO) satellite. GRODY simulates the GRO on-board computer and Control Processor Electronics. The analyst/user sets up and controls the simulation. GRODY allows the analyst to check and update parameter values and ground commands, obtain simulation status displays, interrupt the simulation, analyze previous runs, and obtain printed output of simulation runs. The video terminal screen display allows visibility of command sequences, full-screen display and modification of parameters using input fields, and verification of all input data. Data input available for modification includes alignment and performance parameters for all attitude hardware, simulation control parameters which determine simulation scheduling and simulator output, initial conditions, and on-board computer commands. GRODY generates eight types of output: simulation results data set, analysis report, parameter report, simulation report, status display, plots, diagnostic output (which helps the user trace any problems that have occurred during a simulation), and a permanent log of all runs and errors. The analyst can send results output in graphical or tabular form to a terminal, disk, or hardcopy device, and can choose to have any or all items plotted against time or against each other. Goddard researchers developed GRODY on a VAX 8600 running VMS version 4.0. For near real time performance, GRODY requires a VAX at least as powerful as a model 8600 running VMS 4.0 or a later version. To use GRODY, the VAX needs an Ada Compilation System (ACS), Code Management System (CMS), and 1200K memory. GRODY is written in Ada and FORTRAN.
DataQs analyst guide : best practices for federal and state agency users.
DOT National Transportation Integrated Search
2014-12-01
The DataQs Analyst Guide provides practical guidance and : best practices to address and resolve Requests for Data : Reviews (RDRs) submitted electronically to FMCSA by motor : carriers, commercial drivers, and other persons using the : DataQs system...
Unlocking User-Centered Design Methods for Building Cyber Security Visualizations
2015-08-07
...have rarely linked these methods to a final, deployed tool. Goodall et al. interviewed analysts to derive requirements for a network security tool [14... Goodall, W. Lutters, and A. Komlodi. The work of intrusion detection: rethinking the role of security analysts. AMCIS 2004 Proceedings, 2004. [14] J. R. Goodall, A. A. Ozok, W. G. Lutters, P. Rheingans, and A. Komlodi. A user-centered approach to visualizing network traffic for intrusion...
Software Tool Integrating Data Flow Diagrams and Petri Nets
NASA Technical Reports Server (NTRS)
Thronesbery, Carroll; Tavana, Madjid
2010-01-01
Data Flow Diagram - Petri Net (DFPN) is a software tool for analyzing other software to be developed. The full name of this program reflects its design, which combines the benefit of data-flow diagrams (which are typically favored by software analysts) with the power and precision of Petri-net models, without requiring specialized Petri-net training. (A Petri net is a particular type of directed graph, a description of which would exceed the scope of this article.) DFPN assists a software analyst in drawing and specifying a data-flow diagram, then translates the diagram into a Petri net, then enables graphical tracing of execution paths through the Petri net for verification, by the end user, of the properties of the software to be developed. In comparison with prior means of verifying the properties of software to be developed, DFPN makes verification by the end user more nearly certain, thereby making it easier to identify and correct misconceptions earlier in the development process, when correction is less expensive. After the verification by the end user, DFPN generates a printable system specification in the form of descriptions of processes and data.
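To make the Petri-net execution idea concrete, here is a minimal sketch: places hold tokens, a transition is enabled when all of its input places have tokens, and firing moves tokens along, which is the kind of path tracing an end user would verify. The two-step data-flow net here is a made-up example, not DFPN's generated output.

```python
# Minimal Petri-net firing sketch (illustrative net, not DFPN output).
marking = {"input_ready": 1, "validated": 0, "stored": 0}

transitions = {
    "validate": {"consume": ["input_ready"], "produce": ["validated"]},
    "store":    {"consume": ["validated"],   "produce": ["stored"]},
}

def enabled(name):
    return all(marking[p] > 0 for p in transitions[name]["consume"])

def fire(name):
    assert enabled(name), f"{name} is not enabled"
    for p in transitions[name]["consume"]:
        marking[p] -= 1
    for p in transitions[name]["produce"]:
        marking[p] += 1

# Trace one execution path, as an end user might when verifying the model.
for step in ["validate", "store"]:
    print(f"firing {step:9s} enabled={enabled(step)}")
    fire(step)
print("final marking:", marking)
```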
The Model Analyst’s Toolkit: Scientific Model Development, Analysis, and Validation
2015-02-20
...being integrated within MAT, including Granger causality. Granger causality tests whether a data series helps when predicting future values of another... relations by econometric models and cross-spectral methods. Econometrica: Journal of the Econometric Society, 424-438. Granger, C. W. (1980). Testing... testing dataset. This effort is described in Section 3.2. 3.1 Improvements in Granger Causality User Interface: various metrics of causality are...
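A hedged example of the Granger-causality test mentioned above, run with statsmodels on synthetic series where x genuinely helps predict y (y lags x). It illustrates the statistical test only; MAT's user interface and causality metrics are not reproduced here.

```python
# Granger causality on synthetic lagged series (illustration of the test itself).
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(7)
n = 300
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.6 * y[t - 1] + 0.8 * x[t - 2] + 0.1 * rng.normal()

# Column order matters: the test asks whether the 2nd column Granger-causes the 1st.
data = np.column_stack([y, x])
results = grangercausalitytests(data, maxlag=3)
for lag, (tests, _) in results.items():
    fstat, pvalue = tests["ssr_ftest"][:2]
    print(f"lag {lag}: F = {fstat:.2f}, p = {pvalue:.4f}")
```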
Advanced Technology Lifecycle Analysis System (ATLAS)
NASA Technical Reports Server (NTRS)
O'Neil, Daniel A.; Mankins, John C.
2004-01-01
Developing credible mass and cost estimates for space exploration and development architectures requires multidisciplinary analysis based on physics calculations and parametric estimates derived from historical systems. Within the National Aeronautics and Space Administration (NASA), concurrent engineering environment (CEE) activities integrate discipline-oriented analysis tools through a computer network and accumulate the results of a multidisciplinary analysis team via a centralized database or spreadsheet. Each minute of a design and analysis study within a concurrent engineering environment is expensive due to the size of the team and supporting equipment. The Advanced Technology Lifecycle Analysis System (ATLAS) reduces the cost of architecture analysis by capturing the knowledge of discipline experts in system-oriented spreadsheet models. A framework with a user interface presents a library of system models to an architecture analyst. The analyst selects models of launchers, in-space transportation systems, and excursion vehicles, as well as space and surface infrastructure such as propellant depots, habitats, and solar power satellites. After assembling the architecture from the selected models, the analyst can create a campaign comprised of missions spanning several years. The ATLAS controller passes analyst-specified parameters to the models and data among the models. An integrator workbook calls a history-based parametric analysis cost model to determine the costs. Also, the integrator estimates the flight rates, launched masses, and architecture benefits over the years of the campaign. An accumulator workbook presents the analytical results in a series of bar graphs. In no way does ATLAS compete with a CEE; instead, ATLAS complements a CEE by ensuring that the time of the experts is well spent. Using ATLAS, an architecture analyst can perform technology sensitivity analysis, study many scenarios, and see the impact of design decisions. When the analyst is satisfied with the system configurations, technology portfolios, and deployment strategies, he or she can present the concepts to a team, which will conduct a detailed, discipline-oriented analysis within a CEE. An analog to this approach is the music industry, where a songwriter creates the lyrics and music before entering a recording studio.
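A simple parametric roll-up in the spirit of the system-level models described above: each architecture element carries a mass and a toy parametric cost, and a campaign accumulates flights, launched mass, and cost per year. All elements, coefficients, and the campaign itself are invented for illustration and are not ATLAS models.

```python
# Toy campaign roll-up of mass and parametric cost per year (illustrative only).
elements = {
    "launcher":         {"dry_mass_t": 85.0, "cost_factor": 2.1},
    "propellant_depot": {"dry_mass_t": 30.0, "cost_factor": 3.4},
    "habitat":          {"dry_mass_t": 22.0, "cost_factor": 4.0},
}

def element_cost(name):
    """Toy mass-based parametric cost estimate, in arbitrary cost units."""
    e = elements[name]
    return e["cost_factor"] * e["dry_mass_t"] ** 0.8

campaign = {  # year -> elements flown that year
    2030: ["launcher", "propellant_depot"],
    2031: ["launcher", "habitat"],
    2032: ["launcher", "launcher", "habitat"],
}

for year, flown in sorted(campaign.items()):
    mass = sum(elements[e]["dry_mass_t"] for e in flown)
    cost = sum(element_cost(e) for e in flown)
    print(f"{year}: {len(flown)} flights, {mass:.0f} t launched, cost {cost:.0f} units")
```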
Using participatory design to develop (public) health decision support systems through GIS
Dredger, S Michelle; Kothari, Anita; Morrison, Jason; Sawada, Michael; Crighton, Eric J; Graham, Ian D
2007-01-01
Background Organizations that collect substantial data for decision-making purposes are often characterized as being 'data rich' but 'information poor'. Maps and mapping tools can be very useful for research transfer in converting locally collected data into information. Challenges involved in incorporating GIS applications into the decision-making process within the non-profit (public) health sector include a lack of financial resources for software acquisition and training for non-specialists to use such tools. This on-going project has two primary phases. This paper critically reflects on Phase 1: the participatory design (PD) process of developing a collaborative web-based GIS tool. Methods A case study design is being used whereby the case is defined as the data analyst and manager dyad (a two-person team) in selected Ontario Early Year Centres (OEYCs). Multiple cases are used to support the reliability of findings. With nine producer/user pair participants, the goal in Phase 1 was to identify barriers to map production and, through the participatory design process, develop a web-based GIS tool suited for data analysts and their managers. This study has been guided by the Ottawa Model of Research Use (OMRU) conceptual framework. Results Due to wide variations in OEYC structures, only some data analysts used mapping software and there was no consistency or standardization in the software being used. Consequently, very little sharing of maps and data occurred among data analysts. Using PD, this project developed a web-based mapping tool (EYEMAP) that was easy to use, protected proprietary data, and permitted limited and controlled sharing between participants. By providing data analysts with training on its use, the project also ensured that data analysts would not break cartographic conventions (e.g. using a choropleth map for count data). Interoperability was built into the web-based solution; that is, EYEMAP can read many different standard mapping file formats (e.g. ESRI, MapInfo, CSV). Discussion Based on the evaluation of Phase 1, the PD process has served both as a facilitator and a barrier. In terms of successes, the PD process identified two key components that are important to users: increased data/map sharing functionality and interoperability. Some of the challenges affected developers and users, both individually and as a collective. From a development perspective, this project experienced difficulties in obtaining personnel skilled in web application development and GIS. For users, some data sharing barriers are beyond what a technological tool can address (e.g. third party data). Lastly, the PD process occurs in real time; this is both a strength and a limitation. Programmatic changes at the provincial level and staff turnover at the organizational level made it difficult to maintain buy-in as participants changed over time. The impacts of these successes and challenges will be evaluated more concretely at the end of Phase 2. Conclusion PD approaches, by their very nature, encourage buy-in to the development process, better address user needs, and create a sense of user investment and ownership. PMID:18042298
An Advanced Simulation Framework for Parallel Discrete-Event Simulation
NASA Technical Reports Server (NTRS)
Li, P. P.; Tyrrell, R. Yeung D.; Adhami, N.; Li, T.; Henry, H.
1994-01-01
Discrete-event simulation (DEVS) users have long been faced with a three-way trade-off of balancing execution time, model fidelity, and number of objects simulated. Because of the limits of computer processing power, the analyst is often forced to settle for less than desired performance in one or more of these areas.
MAGIC Computer Simulation. Volume 2: Analyst Manual, Part 1
1971-05-01
A review of the subject MAGIC Computer Simulation User and Analyst Manuals has been conducted based upon a request received from the US Army. ... The MAGIC computer simulation generates target description data consisting of item-by-item listings of the target's components and air ...
MOD Tool (Microwave Optics Design Tool)
NASA Technical Reports Server (NTRS)
Katz, Daniel S.; Borgioli, Andrea; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.
1999-01-01
The Jet Propulsion Laboratory (JPL) is currently designing and building a number of instruments that operate in the microwave and millimeter-wave bands. These include MIRO (Microwave Instrument for the Rosetta Orbiter), MLS (Microwave Limb Sounder), and IMAS (Integrated Multispectral Atmospheric Sounder). These instruments must be designed and built to meet key design criteria (e.g., beamwidth, gain, pointing) obtained from the scientific goals for the instrument. These criteria are frequently functions of the operating environment (both thermal and mechanical). To design and build instruments which meet these criteria, it is essential to be able to model the instrument in its environments. Currently, a number of modeling tools exist. Commonly used tools at JPL include: FEMAP (meshing), NASTRAN (structural modeling), TRASYS and SINDA (thermal modeling), MACOS/IMOS (optical modeling), and POPO (physical optics modeling). Each of these tools is used by an analyst, who models the instrument in one discipline. The analyst then provides the results of this modeling to another analyst, who continues the overall modeling in another discipline. There is a large reengineering task in place at JPL to automate and speed up the structural and thermal modeling disciplines, which does not include MOD Tool. The focus of MOD Tool (and of this paper) is on the fields unique to microwave and millimeter-wave instrument design. These include initial design and analysis of the instrument without thermal or structural loads, the automation of the transfer of this design to a high-end CAD tool, and the analysis of the structurally deformed instrument (due to structural and/or thermal loads). MOD Tool is a distributed tool, with a database of design information residing on a server, physical optics analysis being performed on a variety of supercomputer platforms, and a graphical user interface (GUI) residing on the user's desktop computer. The MOD Tool client is being developed using Tcl/Tk, which allows the user to work on a choice of platforms (PC, Mac, or Unix) after downloading the Tcl/Tk binary, which is readily available on the web. The MOD Tool server is written using Expect, and it resides on a Sun workstation. Client/server communications are performed over a socket; upon a connection from a client, the server spawns a child process dedicated to communicating with that client. The server communicates with other machines, such as supercomputers, using Expect, with the username and password being provided by the user on the client.
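The connection-per-child pattern described for the MOD Tool server (one spawned handler per client connection) can be sketched in Python as a rough analogue; the real server is written in Expect on a Sun workstation, so the code below only mirrors the socket pattern, with a made-up request protocol.

    # Rough analogue of a server that forks a dedicated child per client connection.
    # The actual MOD Tool server uses Expect; this sketch only mirrors the pattern.
    import socketserver

    class AnalysisHandler(socketserver.StreamRequestHandler):
        def handle(self):
            # Each connected client is served by its own forked child process.
            for line in self.rfile:                      # newline-terminated requests
                request = line.decode().strip()
                reply = f"ack: {request}"                # placeholder for real work
                self.wfile.write((reply + "\n").encode())

    class ForkingServer(socketserver.ForkingTCPServer):  # POSIX-only: fork() per client
        allow_reuse_address = True

    if __name__ == "__main__":
        with ForkingServer(("0.0.0.0", 5050), AnalysisHandler) as srv:
            srv.serve_forever()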
Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Waltz, Ed
2016-05-01
Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics applied to raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and these are examined with a view toward matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.
Zhao, Jian; Glueck, Michael; Breslav, Simon; Chevalier, Fanny; Khan, Azam
2017-01-01
User-authored annotations of data can support analysts in the activity of hypothesis generation and sensemaking, where it is not only critical to document key observations, but also to communicate insights between analysts. We present annotation graphs, a dynamic graph visualization that enables meta-analysis of data based on user-authored annotations. The annotation graph topology encodes annotation semantics, which describe the content of and relations between data selections, comments, and tags. We present a mixed-initiative approach to graph layout that integrates an analyst's manual manipulations with an automatic method based on similarity inferred from the annotation semantics. Various visual graph layout styles reveal different perspectives on the annotation semantics. Annotation graphs are implemented within C8, a system that supports authoring annotations during exploratory analysis of a dataset. We apply principles of Exploratory Sequential Data Analysis (ESDA) in designing C8, and further link these to an existing task typology in the visualization literature. We develop and evaluate the system through an iterative user-centered design process with three experts, situated in the domain of analyzing HCI experiment data. The results suggest that annotation graphs are effective as a method of visually extending user-authored annotations to data meta-analysis for discovery and organization of ideas.
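A similarity-driven layout of the kind described (automatic positioning inferred from annotation semantics, which the analyst can then manually adjust) could be sketched as follows; the tag-based Jaccard similarity and the networkx spring layout are illustrative stand-ins, not the C8 implementation.

    # Sketch: place annotation nodes so that annotations with similar tags sit closer.
    # Jaccard similarity over tags stands in for the richer annotation semantics.
    import networkx as nx

    annotations = {
        "a1": {"tags": {"outlier", "speed"}},
        "a2": {"tags": {"speed", "fatigue"}},
        "a3": {"tags": {"ui-layout"}},
    }

    def jaccard(s, t):
        return len(s & t) / len(s | t) if s | t else 0.0

    G = nx.Graph()
    G.add_nodes_from(annotations)
    for a in annotations:
        for b in annotations:
            if a < b:
                w = jaccard(annotations[a]["tags"], annotations[b]["tags"])
                if w > 0:
                    G.add_edge(a, b, weight=w)   # stronger semantic ties attract more

    pos = nx.spring_layout(G, weight="weight", seed=42)   # automatic layout
    pos["a3"] = (1.0, 1.0)                                # analyst's manual override
    print(pos)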
DOE Office of Scientific and Technical Information (OSTI.GOV)
BERG, MICHAEL; RILEY, MARSHALL
System assessments typically yield large quantities of data from disparate sources for an analyst to scrutinize for issues. Netmeld is used to parse input from different file formats, store the data in a common format, allow users to easily query it, and enable analysts to tie different analysis tools together using a common back-end.
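A minimal sketch of the parse-into-a-common-store-and-query idea follows; the file formats, table schema, and query below are hypothetical and far simpler than Netmeld's actual parsers and datastore.

    # Sketch: parse heterogeneous inputs into one common table, then query it.
    import csv, io, sqlite3

    def parse_csv_report(text):
        for row in csv.DictReader(io.StringIO(text)):
            yield (row["ip"], row["port"], row["service"])

    def parse_hosts_file(text):
        for line in text.splitlines():
            if line and not line.startswith("#"):
                ip, name = line.split()[:2]
                yield (ip, None, name)

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE observation (ip TEXT, port TEXT, detail TEXT)")
    db.executemany("INSERT INTO observation VALUES (?,?,?)",
                   list(parse_csv_report("ip,port,service\n10.0.0.5,22,ssh")) +
                   list(parse_hosts_file("10.0.0.5  fileserver\n")))

    # Analysts can now ask one question across both sources.
    for row in db.execute("SELECT * FROM observation WHERE ip = '10.0.0.5'"):
        print(row)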
Buchanan, Verica; Lu, Yafeng; McNeese, Nathan; Steptoe, Michael; Maciejewski, Ross; Cooke, Nancy
2017-03-01
Historically, domains such as business intelligence would require a single analyst to engage with data, develop a model, answer operational questions, and predict future behaviors. However, as the problems and domains become more complex, organizations are employing teams of analysts to explore and model data to generate knowledge. Furthermore, given the rapid increase in data collection, organizations are struggling to develop practices for intelligence analysis in the era of big data. Currently, a variety of machine learning and data mining techniques are available to model data and to generate insights and predictions, and developments in the field of visual analytics have focused on how to effectively link data mining algorithms with interactive visuals to enable analysts to explore, understand, and interact with data and data models. Although studies have explored the role of single analysts in the visual analytics pipeline, little work has explored the role of teamwork and visual analytics in the analysis of big data. In this article, we present an experiment integrating statistical models, visual analytics techniques, and user experiments to study the role of teamwork in predictive analytics. We frame our experiment around the analysis of social media data for box office prediction problems and compare the prediction performance of teams, groups, and individuals. Our results indicate that a team's performance is mediated by the team's characteristics such as openness of individual members to others' positions and the type of planning that goes into the team's analysis. These findings have important implications for how organizations should create teams in order to make effective use of information from their analytic models.
Air Vehicles Division Computational Structural Analysis Facilities Policy and Guidelines for Users
2005-05-01
... set "Thermal" as appropriate and the tolerance to "default". b) Create the model geometry. c) Create the finite elements. d) Create the ... linear, non-linear, dynamic, thermal, and acoustic analysis. The modelling of composite materials, creep, fatigue, and plasticity is also covered ... perform professional, high-quality finite element analysis (FEA). FE analysts from many tasks within AVD are using the facilities to conduct FEA with ...
Real-Time Visualization of Network Behaviors for Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.
Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.
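One common way to flag off-normal behavior against a learned baseline in streaming data is an exponentially weighted running mean and variance with a z-score threshold; the sketch below is a generic illustration of that idea, not the MeDICi pipeline or the behavioral models used in the system.

    # Generic streaming baseline: exponentially weighted mean/variance per network
    # actor, flagging observations that deviate strongly from that actor's baseline.
    import math

    class Baseline:
        def __init__(self, alpha=0.05):
            self.alpha, self.mean, self.var = alpha, None, 0.0

        def update(self, x):
            if self.mean is None:                 # first observation seeds the baseline
                self.mean = x
                return 0.0
            diff = x - self.mean
            self.mean += self.alpha * diff
            self.var = (1 - self.alpha) * (self.var + self.alpha * diff * diff)
            return diff / math.sqrt(self.var) if self.var > 0 else 0.0   # z-like score

    host = Baseline()
    for minute, flows in enumerate([120, 118, 131, 125, 119, 950]):      # toy counts
        z = host.update(flows)
        if abs(z) > 4:
            print(f"minute {minute}: {flows} flows looks off-normal (z={z:.1f})")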
Developing Guidelines for Assessing Visual Analytics Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholtz, Jean
2011-07-01
In this paper, we develop guidelines for evaluating visual analytic environments based on a synthesis of reviews for the entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge and from a user study with professional intelligence analysts. By analyzing the 2009 VAST Challenge reviews we gained a better understanding of what is important to our reviewers, both visualization researchers and professional analysts. We also report on a small user study with professional analysts to determine the important factors that they use in evaluating visual analysis systems. We then looked at guidelines developed by researchers in various domains and synthesized these into an initial set for use by others in the community. In a second part of the user study, we looked at guidelines for a new aspect of visual analytic systems – the generation of reports. Future visual analytic systems have been challenged to help analysts generate their reports. In our study we worked with analysts to understand the criteria they used to evaluate the quality of analytic reports. We propose that this knowledge will be useful as researchers look at systems to automate some of the report generation. Based on these efforts, we produced some initial guidelines for evaluating visual analytic environments and for evaluating analytic reports. It is important to understand that these guidelines are initial drafts and are limited in scope because of the type of tasks for which the visual analytic systems used in the studies in this paper were designed. More research and refinement is needed by the Visual Analytics Community to provide additional evaluation guidelines for different types of visual analytic environments.
Visualization of Spatio-Temporal Relations in Movement Event Using Multi-View
NASA Astrophysics Data System (ADS)
Zheng, K.; Gu, D.; Fang, F.; Wang, Y.; Liu, H.; Zhao, W.; Zhang, M.; Li, Q.
2017-09-01
Spatio-temporal relations among movement events extracted from temporally varying trajectory data can provide useful information about the evolution of individual or collective movers, as well as their interactions with their spatial and temporal contexts. However, the pure statistical tools commonly used by analysts pose many difficulties, due to the large number of attributes embedded in multi-scale and multi-semantic trajectory data. The need for models that operate at multiple scales to search for relations at different locations within time and space, as well as intuitively interpret what these relations mean, also presents challenges. Since analysts do not know where or when these relevant spatio-temporal relations might emerge, these models must compute statistical summaries of multiple attributes at different granularities. In this paper, we propose a multi-view approach to visualize the spatio-temporal relations among movement events. We describe a method for visualizing movement events and spatio-temporal relations that uses multiple displays. A visual interface is presented, and the user can interactively select or filter spatial and temporal extents to guide the knowledge discovery process. We also demonstrate how this approach can help analysts to derive and explain the spatio-temporal relations of movement events from taxi trajectory data.
The Model Analyst’s Toolkit: Scientific Model Development, Analysis, and Validation
2013-05-20
... Charles River's Metronome framework. This framework is built on top of the same Equinox libraries that the popular Eclipse Development Environment uses ... the names are fully visible (see Figure 8). The Metronome framework also provides functionality for undo and redo, so the user can easily correct ... mistakes. [Figure 8: Changing pane sizes and layouts in the new Metronome-enhanced MAT.] This period, we also improved the MAT project file format so ...
User’s Guide for the SAS (Stand-Off Attack Simulation) Computer Model.
1982-01-15
... computer model. SAS is an effective survivability and security system design tool which allows an analyst to compare the relative effectiveness of selected ... mounted against other systems during uploading for dispersal or for non-emergency relocation. GLCM and LANCE must be mobilized and formed into convoys ...
Model Analyst’s Toolkit User Guide, Version 7.1.0
2015-08-01
... (Help > About), environment details (operating system), the metronome.log file located in your MAT 7.1.0 installation folder, and any log file that ... requirements to run the Model Analyst's Toolkit: Windows XP operating system (or higher) with Service Pack 2 and all critical Windows updates installed ... application icon on your desktop; Create a Quick Launch icon - creates a MAT application icon on the taskbar for operating systems released ...
ERIC Educational Resources Information Center
Auster, Ethel; Lawton, Stephen B.
This research study involved a systematic investigation into the relationships among: (1) the techniques used by search analysts during preliminary interviews with users before engaging in online retrieval of bibliographic citations; (2) the amount of new information gained by the user as a result of the search; and (3) the user's ultimate…
MetaboAnalyst 3.0--making metabolomics more meaningful.
Xia, Jianguo; Sinelnikov, Igor V; Han, Beomsoo; Wishart, David S
2015-07-01
MetaboAnalyst (www.metaboanalyst.ca) is a web server designed to permit comprehensive metabolomic data analysis, visualization and interpretation. It supports a wide range of complex statistical calculations and high-quality graphical rendering functions that require significant computational resources. First introduced in 2009, MetaboAnalyst has experienced more than a 50X growth in user traffic (>50 000 jobs processed each month). In order to keep up with the rapidly increasing computational demands and a growing number of requests to support translational and systems biology applications, we performed a substantial rewrite and major feature upgrade of the server. The result is MetaboAnalyst 3.0. By completely re-implementing the MetaboAnalyst suite using the latest web framework technologies, we have been able to substantially improve its performance, capacity and user interactivity. Three new modules have also been added, including: (i) a module for biomarker analysis based on the calculation of receiver operating characteristic curves; (ii) a module for sample size estimation and power analysis for improved planning of metabolomics studies and (iii) a module to support integrative pathway analysis for both genes and metabolites. In addition, popular features found in existing modules have been significantly enhanced by upgrading the graphical output, expanding the compound libraries and by adding support for more diverse organisms. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
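Receiver operating characteristic analysis of the kind offered by the new biomarker module amounts to ranking samples by a candidate biomarker's value and tracing true/false positive rates; the short numpy sketch below illustrates that calculation on made-up data and is not MetaboAnalyst code.

    # Compute an ROC curve and its AUC for a single candidate biomarker (toy data).
    import numpy as np

    scores = np.array([0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2])  # metabolite level
    labels = np.array([1,   1,   0,   1,   0,    0,   1,   0])    # 1 = case, 0 = control

    order = np.argsort(-scores)                 # rank samples from highest to lowest
    labels = labels[order]
    tpr = np.cumsum(labels) / labels.sum()      # true positive rate at each cut-off
    fpr = np.cumsum(1 - labels) / (1 - labels).sum()
    tpr = np.concatenate(([0.0], tpr))          # start the curve at the origin
    fpr = np.concatenate(([0.0], fpr))

    auc = np.trapz(tpr, fpr)                    # area under the ROC curve
    print(f"AUC = {auc:.2f}")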
The DPAC Compensation Model: An Introductory Handbook.
1987-04-01
... introductory and advanced economics courses at the US Air Force Academy, he served for four years as an analyst and action officer in the ... introduces new users to the ACOL framework and provides some guidelines for choosing reasonable values for the four long-run parameters required to run the ... regression coefficients for ACOL and the civilian unemployment rate; for pilots, the number of "new" pilot ...
Knowledge Development Generic Framework Concept
2008-12-18
... requirements. The conceptual model serves as a communication interface among analysts, military staff, and other actors involved ... Systems Analysis will ... It designates all long-lived basic mechanisms of a material and institutional kind, which guarantee the functioning of a complex community. ... (in cooperation with users) analyze and decide whether it is better to communicate an information object automatically ("document-to-people") or via human ...
User's guide for the thermal analyst's help desk expert system
NASA Technical Reports Server (NTRS)
Ormsby, Rachel A.
1994-01-01
A guide for users of the Thermal Analyst's Help Desk is provided. Help Desk is an expert system that runs on a DOS based personal computer and operates within the EXSYS expert system shell. Help Desk is an analysis tool designed to provide users having various degrees of experience with the capability to determine first approximations of thermal capacity for spacecraft and instruments. The five analyses supported in Help Desk are: surface area required for a radiating surface, equilibrium temperature of a surface, enclosure temperature and heat loads for a defined position in orbit, enclosure temperature and heat loads over a complete orbit, and selection of appropriate surface properties. The two geometries supported by Help Desk are a single flat plate and a rectangular box enclosure.
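For the kind of first approximation the Help Desk supports, the equilibrium temperature of a surface follows from a simple radiative balance: absorbed solar plus internal dissipation equals emitted infrared. The sketch below uses that standard textbook balance with assumed property values; it is not the EXSYS rule base itself.

    # First-approximation equilibrium temperature of a sunlit flat plate:
    # absorbed solar + internal heat = emitted infrared (radiating from both faces).
    SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/m^2/K^4
    S = 1361.0                # solar constant near Earth, W/m^2

    def equilibrium_temp(area_m2, absorptivity, emissivity, q_internal_w=0.0,
                         radiating_faces=2):
        absorbed = absorptivity * S * area_m2 + q_internal_w
        radiating_area = radiating_faces * area_m2
        return (absorbed / (emissivity * SIGMA * radiating_area)) ** 0.25   # kelvin

    # Example: 1 m^2 plate with white-paint-like properties (assumed alpha=0.2, eps=0.85).
    print(f"{equilibrium_temp(1.0, 0.2, 0.85):.0f} K")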
NASA Technical Reports Server (NTRS)
Shooman, Martin L.
1991-01-01
Many of the most challenging reliability problems of our present decade involve complex distributed systems such as interconnected telephone switching computers, air traffic control centers, aircraft and space vehicles, and local area and wide area computer networks. In addition to the challenge of complexity, modern fault-tolerant computer systems require very high levels of reliability, e.g., avionic computers with MTTF goals of one billion hours. Most analysts find that it is too difficult to model such complex systems without computer aided design programs. In response to this need, NASA has developed a suite of computer aided reliability modeling programs beginning with CARE 3 and including a group of new programs such as: HARP, HARP-PC, Reliability Analysts Workbench (Combination of model solvers SURE, STEM, PAWS, and common front-end model ASSIST), and the Fault Tree Compiler. The HARP program is studied and how well the user can model systems using this program is investigated. One of the important objectives will be to study how user friendly this program is, e.g., how easy it is to model the system, provide the input information, and interpret the results. The experiences of the author and his graduate students who used HARP in two graduate courses are described. Some brief comparisons were made with the ARIES program which the students also used. Theoretical studies of the modeling techniques used in HARP are also included. Of course no answer can be any more accurate than the fidelity of the model, thus an Appendix is included which discusses modeling accuracy. A broad viewpoint is taken and all problems which occurred in the use of HARP are discussed. Such problems include: computer system problems, installation manual problems, user manual problems, program inconsistencies, program limitations, confusing notation, long run times, accuracy problems, etc.
The Hazard Mapping System (HMS)-a Multiplatform Remote Sensing Approach to Fire and Smoke Detection
NASA Astrophysics Data System (ADS)
Kibler, J.; Ruminski, M. G.
2003-12-01
The HMS is a multiplatform remote sensing approach to detecting fires and smoke over the US and adjacent areas of Canada and Mexico that has been in place since June 2002. This system is an integral part of the National Environmental Satellite and Data Information Service (NESDIS) near-realtime hazard detection and mitigation efforts. The system utilizes NOAA's Geostationary Operational Environmental Satellites (GOES), Polar Operational Environmental Satellites (POES) and the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument on NASA's Terra and Aqua spacecraft. Automated detection algorithms are employed for each of the satellites for the fire detects, while smoke is added by a satellite image analyst. In June 2003 the HMS underwent an upgrade. A number of features were added for users of the products generated on the HMS. Sectors covering Alaska and Hawaii were added. The use of Geographic Information System (GIS) shape files for smoke analysis is a new feature. Shape files show the progression and time of a single smoke plume as each analysis is drawn and then updated. The analyst now has the ability to view GOES, POES, and MODIS data in a single loop. This allows the fire analyst to easily confirm a fire in three different data sets. The upgraded HMS has faster satellite looping and gives the analyst the ability to design a false color image for a particular region. The GOES satellites provide a relatively coarse 4 km infrared resolution at satellite subpoint for thermal fire detection but provide the advantage of a rapid update cycle. GOES imagery is updated every 15 minutes utilizing both GOES-10 and GOES-12. POES imagery from NOAA-15, NOAA-16 and NOAA-17 and MODIS from Terra and Aqua are employed with each satellite providing twice-per-day coverage (more frequent over Alaska). While the frequency of imagery is much less than with GOES, the higher resolution of these satellites (1 km along the suborbital track) allows for detection of smaller and/or cooler burning fires. Each of the algorithms utilizes a number of temporal, thermal and contextual filters in an attempt to screen out false detects. However, false detects do get processed by the algorithms to varying degrees. Therefore, the automated fire detects from each algorithm are quality controlled by an analyst who scans the imagery and may either accept or delete fire points. The analyst also has the ability to manually add additional fire points based on the imagery. Smoke is outlined by the analyst using visible imagery, primarily GOES, which provides 1 km resolution. Occasionally a smoke plume seen in visible imagery is the only indicator of a fire and would be manually added to the fire detect file. The Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) is a forecast model that projects the trajectory and dispersion of a smoke plume over a period of time. HYSPLIT is run for fires, selected by the analyst, that are seen to be producing a significant smoke plume. The analyst defines a smoke-producing area commensurate with the size of the fire and amount of smoke detected. The output is hosted on an Air Resources Lab (ARL) web site which can be accessed from the web site listed below. All of the information is posted to the web page noted below. Besides the interactive GIS presentation, users can view the product in graphical jpg format.
The analyst-edited points, as well as the unedited automated fire detects, are available for users to view directly on the web page or to download. All of the data are also archived and accessible via ftp.
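Thermal-plus-contextual screening of the sort the automated detection algorithms apply can be illustrated with a toy example: flag a pixel whose mid-infrared brightness temperature is both hot in absolute terms and well above its local background. The thresholds and window size below are illustrative and are not the operational GOES/POES/MODIS algorithms.

    # Toy thermal + contextual fire test on a grid of brightness temperatures (kelvin).
    import numpy as np

    def candidate_fires(bt, hot_k=320.0, contrast_k=10.0, window=3):
        """Flag pixels that are hot and stand out from their local background."""
        flagged = np.zeros_like(bt, dtype=bool)
        r = window // 2
        for i in range(r, bt.shape[0] - r):
            for j in range(r, bt.shape[1] - r):
                block = bt[i - r:i + r + 1, j - r:j + r + 1].copy()
                block[r, r] = np.nan                       # exclude the pixel itself
                background = np.nanmean(block)
                if bt[i, j] > hot_k and bt[i, j] - background > contrast_k:
                    flagged[i, j] = True                   # would go to an analyst for QC
        return flagged

    scene = np.full((5, 5), 295.0)
    scene[2, 2] = 335.0                                    # a hot spot
    print(np.argwhere(candidate_fires(scene)))             # -> [[2 2]]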
NASA Astrophysics Data System (ADS)
Czajkowski, M.; Shilliday, A.; LoFaso, N.; Dipon, A.; Van Brackle, D.
2016-09-01
In this paper, we describe and depict the Defense Advanced Research Projects Agency (DARPA)'s OrbitOutlook Data Archive (OODA) architecture. OODA is the infrastructure that DARPA's OrbitOutlook program has developed to integrate diverse data from various academic, commercial, government, and amateur space situational awareness (SSA) telescopes. At the heart of the OODA system is its world model - a distributed data store built to quickly query very large quantities of information spread across multiple processing nodes and data centers. The world model applies a multi-index approach where each index is a distinct view on the data. This allows analysts and analytics (algorithms) to access information through queries with a variety of terms that may be of interest to them. Our indices include: a structured global-graph view of knowledge, a keyword search of data content, an object-characteristic range search, and a geospatial-temporal orientation of spatially located data. The world model also applies a federated approach by connecting to existing databases and integrating them into one single interface as a "one-stop shopping place" for accessing SSA information. In addition to the world model, OODA provides a processing platform on which analysts can explore this data and analytics can execute against it. Analytic algorithms can use OODA to take raw data and build information from it. They can store these products back into the world model, allowing analysts to gain situational awareness with this information. Analysts in turn would help decision makers use this knowledge to address a wide range of SSA problems. OODA is designed to make it easy for software developers who build graphical user interfaces (GUIs) and algorithms to quickly get started with working with this data. This is done through a multi-language software development kit that includes multiple application program interfaces (APIs) and a data model with SSA concepts and terms such as: space observation, observable, measurable, metadata, track, space object, catalog, expectation, and maneuver.
Defense AT and L. Volume 45, Issue 1
2016-02-01
... and government organizations. She currently is a senior research analyst for the MCBL Science and Technology Branch at Fort Leavenworth, Kansas ... core functionality and interface design. Analysts from the Army S&T and MC user communities participated, including MCBL, Army Research Laboratory ... Mica R. Endsley, Ph.D. ... Programs can use the 60-year foundation of scientific research and engineering in the field of human factors to develop robust ...
TOOKUIL: A case study in user interface development for safety code application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, D.L.; Harkins, C.K.; Hoole, J.G.
1997-07-01
Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present-day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for the Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL.
A simple node and conductor data generator for SINDA
NASA Technical Reports Server (NTRS)
Gottula, Ronald R.
1992-01-01
This paper presents a simple, automated method to generate NODE and CONDUCTOR DATA for thermal math models. The method uses personal computer spreadsheets to create SINDA inputs. It was developed in order to make SINDA modeling less time consuming and serves as an alternative to graphical methods. Anyone having some experience using a personal computer can easily implement this process. The user develops spreadsheets to automatically calculate capacitances and conductances based on material properties and dimensional data. The necessary node and conductor information is then taken from the spreadsheets and automatically arranged into the proper format, ready for insertion directly into the SINDA model. This technique provides a number of benefits to the SINDA user, such as a reduction in the number of hand calculations and an ability to very quickly generate a parametric set of NODE and CONDUCTOR DATA blocks. It also provides advantages over graphical thermal modeling systems by retaining the analyst's complete visibility into the thermal network and by permitting user comments anywhere within the DATA blocks.
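The spreadsheet arithmetic described (capacitance from material properties and volume, conductance from conductivity and geometry, then formatting into DATA-block lines) can be sketched as below; the card layout shown is a simplified stand-in, not the exact SINDA input format.

    # Sketch: compute lumped capacitances (rho*cp*V) and linear conductances (k*A/L),
    # then emit simplified NODE / CONDUCTOR lines. Format is illustrative, not SINDA's.
    nodes = [  # id, density kg/m^3, specific heat J/kg-K, volume m^3, initial temp C
        (10, 2700.0, 900.0, 2.0e-5, 20.0),
        (20, 2700.0, 900.0, 2.0e-5, 20.0),
    ]
    conductors = [  # id, node_i, node_j, conductivity W/m-K, area m^2, length m
        (100, 10, 20, 167.0, 1.0e-4, 0.05),
    ]

    print("NODE DATA")
    for nid, rho, cp, vol, t0 in nodes:
        capacitance = rho * cp * vol                    # J/K
        print(f"  {nid}, {t0:.1f}, {capacitance:.4f}")

    print("CONDUCTOR DATA")
    for cid, ni, nj, k, area, length in conductors:
        g = k * area / length                           # W/K
        print(f"  {cid}, {ni}, {nj}, {g:.5f}")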
User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Coleman, Kayla; Gilkey, Lindsay N.
Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a physics-based computational model. This can lend efficiency and rigor to manual parameter perturbation studies already being conducted by analysts. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. Dakota algorithms enrich complex science and engineering models, enabling an analyst to answer crucial questions of: Sensitivity - Which are the most important input factors or parameters entering the simulation, and how do they influence key outputs? Uncertainty - What is the uncertainty or variability in simulation output, given uncertainties in input parameters? How safe, reliable, robust, or variable is my system? (Quantification of margins and uncertainty, QMU) Optimization - What parameter values yield the best performing design or operating condition, given constraints? Calibration - What models and/or parameters best match experimental data? In general, Dakota is the Consortium for Advanced Simulation of Light Water Reactors (CASL) delivery vehicle for verification, validation, and uncertainty quantification (VUQ) algorithms. It permits ready application of the VUQ methods described above to simulation codes by CASL researchers, code developers, and application engineers.
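The "simplest mode" mentioned, automating a parameter variation study through a generic interface to a black-box simulation, can be pictured with a small sketch; the interface below (a Python function standing in for the physics code, a Cartesian-product sweep) is a generic illustration, not Dakota's actual driver or input-deck mechanism.

    # Generic parameter-variation study around a black-box model (stand-in for a code).
    import itertools

    def simulation(inlet_temp_k, flow_rate_kg_s):
        # Placeholder response; a real study would launch the simulation code here.
        return {"peak_temp_k": inlet_temp_k + 85.0 / flow_rate_kg_s}

    study = {
        "inlet_temp_k": [550.0, 560.0, 570.0],
        "flow_rate_kg_s": [1.0, 1.5],
    }

    results = []
    for values in itertools.product(*study.values()):
        params = dict(zip(study.keys(), values))
        results.append({**params, **simulation(**params)})

    # A crude sensitivity signal: spread of the output across the swept inputs.
    peaks = [r["peak_temp_k"] for r in results]
    print(f"{len(results)} runs, peak temperature range {min(peaks):.1f}-{max(peaks):.1f} K")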
User's manual for COAST 4: a code for costing and sizing tokamaks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sink, D. A.; Iwinski, E. M.
1979-09-01
The purpose of this report is to document the computer program COAST 4 for the user/analyst. COAST, COst And Size Tokamak reactors, provides complete and self-consistent size models for the engineering features of D-T burning tokamak reactors and associated facilities involving a continuum of performance including highly beam driven through ignited plasma devices. TNS (The Next Step) devices with no tritium breeding or electrical power production are handled as well as power producing and fissile producing fusion-fission hybrid reactors. The code has been normalized with a TFTR calculation which is consistent with cost, size, and performance data published in the conceptual design report for that device. Information on code development, computer implementation and detailed user instructions are included in the text.
Allocation of surgical procedures to operating rooms.
Ozkarahan, I
1995-08-01
Reduction of health care costs is of paramount importance in our time. This paper is part of research that proposes an expert hospital decision support system for resource scheduling. The proposed system combines mathematical programming, knowledge base, and database technologies, and its friendly interface is suitable for even a novice user. Operating rooms in hospitals represent big investments and must be utilized efficiently. In this paper, a mathematical model similar to job shop scheduling models is first developed. The model loads surgical cases to operating rooms by maximizing room utilization and minimizing overtime in a multiple operating room setting. Then a prototype expert system is developed that replaces the expertise of the operations research analyst for the model, drives the model base and database, and manages the user dialog. Finally, an overview of the sequencing procedures for operations within an operating room is also presented.
2013-07-08
... bias. Moreover, it is to be expected that a rational agent learns and adapts its strategies and knowledge, its metacognitive control (e.g., more ... P. Pirolli and S. K. Card, "The sensemaking process and leverage points for analyst technology," in Proceedings of the International Conference on ... user: the sense-making of qualitative-quantitative methodology," in Sense-Making Methodology Reader: Selected Writings of Brenda Dervin, B. Dervin, L ...
CASL Dakota Capabilities Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Simmons, Chris; Williams, Brian J.
2017-10-10
The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.
NASA Technical Reports Server (NTRS)
1992-01-01
The purpose of QASE RT is to enable system analysts and software engineers to evaluate performance and reliability implications of design alternatives. The program resulted from two Small Business Innovation Research (SBIR) projects. After receiving a description of the system architecture and workload from the user, QASE RT translates the system description into simulation models and executes them. Simulation provides detailed performance evaluation. The results of the evaluations are service and response times, offered load and device utilizations and functional availability.
Common Bolted Joint Analysis Tool
NASA Technical Reports Server (NTRS)
Imtiaz, Kauser
2011-01-01
Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.
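A minimal example of the kind of calculation chain such a tool steps through (preload from installation torque via the short-form torque relation T = K*d*F, bolt stress, and a margin of safety) is sketched below with assumed values; it is a textbook illustration, not comBAT's equations or databases.

    # Textbook bolted-joint quick checks: preload from torque (T = K*d*F), bolt stress,
    # and a simple margin of safety. Values below are illustrative assumptions.
    def preload_from_torque(torque_nm, nut_factor, diameter_m):
        return torque_nm / (nut_factor * diameter_m)          # newtons

    def tensile_stress(preload_n, stress_area_m2):
        return preload_n / stress_area_m2                      # pascals

    def margin_of_safety(allowable, actual, factor_of_safety=1.15):
        return allowable / (factor_of_safety * actual) - 1.0

    torque = 9.0           # N*m assembly torque (assumed)
    K = 0.2                # assumed nut factor (lubrication dependent)
    d = 0.006              # 6 mm nominal diameter
    A_t = 20.1e-6          # tensile stress area of an M6 coarse thread, m^2
    yield_strength = 640e6 # Pa, property-class 8.8 bolt (assumed)

    F = preload_from_torque(torque, K, d)
    sigma = tensile_stress(F, A_t)
    print(f"preload = {F/1000:.1f} kN, stress = {sigma/1e6:.0f} MPa, "
          f"MS = {margin_of_safety(yield_strength, sigma):.2f}")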
Language workbench user interfaces for data analysis
Benson, Victoria M.
2015-01-01
Biological data analysis is frequently performed with command line software. While this practice provides considerable flexibility for computationally savvy individuals, such as investigators trained in bioinformatics, it also creates a barrier to the widespread use of data analysis software by investigators trained as biologists and/or clinicians. Workflow systems such as Galaxy and Taverna have been developed to try to provide generic user interfaces that can wrap command line analysis software. These solutions are useful for problems that can be solved with workflows, and that do not require specialized user interfaces. However, some types of analyses can benefit from custom user interfaces. For instance, developing biomarker models from high-throughput data is a type of analysis that can be expressed more succinctly with specialized user interfaces. Here, we show how Language Workbench (LW) technology can be used to model the biomarker development and validation process. We developed a language that models the concepts of Dataset, Endpoint, Feature Selection Method and Classifier. These high-level language concepts map directly to abstractions that analysts who develop biomarker models are familiar with. We found that user interfaces developed in the Meta-Programming System (MPS) LW provide convenient means to configure a biomarker development project, to train models and view the validation statistics. We discuss several advantages of developing user interfaces for data analysis with a LW, including increased interface consistency, portability and extension by language composition. The language developed during this experiment is distributed as an MPS plugin (available at http://campagnelab.org/software/bdval-for-mps/). PMID:25755929
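The high-level concepts named above (Dataset, Endpoint, Feature Selection Method, Classifier) can be pictured as a small configuration model; the Python dataclasses below are only an analogue of what the MPS language expresses, with hypothetical field names.

    # Analogue of the biomarker-development concepts as a plain configuration model.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Dataset:
        name: str
        samples_file: str          # e.g. an expression matrix (hypothetical path)

    @dataclass
    class Endpoint:
        name: str                  # the clinical outcome being predicted
        positive_label: str

    @dataclass
    class FeatureSelection:
        method: str                # e.g. "t-test", "fold-change"
        num_features: int

    @dataclass
    class Classifier:
        algorithm: str             # e.g. "svm", "naive-bayes"
        cross_validation_folds: int = 5

    @dataclass
    class BiomarkerProject:
        dataset: Dataset
        endpoint: Endpoint
        feature_selection: FeatureSelection
        classifiers: List[Classifier] = field(default_factory=list)

    project = BiomarkerProject(
        Dataset("cohortA", "cohortA_expression.tsv"),
        Endpoint("responder", positive_label="yes"),
        FeatureSelection("t-test", num_features=50),
        [Classifier("svm"), Classifier("naive-bayes")],
    )
    print(project.endpoint.name, len(project.classifiers))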
Communication Problems in Requirements Engineering: A Field Study
NASA Technical Reports Server (NTRS)
Al-Rawas, Amer; Easterbrook, Steve
1996-01-01
The requirements engineering phase of software development projects is characterized by the intensity and importance of communication activities. During this phase, the various stakeholders must be able to communicate their requirements to the analysts, and the analysts need to be able to communicate the specifications they generate back to the stakeholders for validation. This paper describes a field investigation into the problems of communication between disparate communities involved in the requirements specification activities. The results of this study are discussed in terms of their relation to three major communication barriers: (1) ineffectiveness of the current communication channels; (2) restrictions on expressiveness imposed by notations; and (3) social and organizational barriers. The results confirm that organizational and social issues have great influence on the effectiveness of communication. They also show that in general, end-users find the notations used by software practitioners to model their requirements difficult to understand and validate.
Enhanced detection and visualization of anomalies in spectral imagery
NASA Astrophysics Data System (ADS)
Basener, William F.; Messinger, David W.
2009-05-01
Anomaly detection algorithms applied to hyperspectral imagery are able to reliably identify man-made objects from a natural environment based on statistical/geometric likelihood. The process is more robust than target identification, which requires precise prior knowledge of the object of interest, but has an inherently higher false alarm rate. Standard anomaly detection algorithms measure deviation of pixel spectra from a parametric model (either statistical or linear mixing) estimating the image background. The topological anomaly detector (TAD) creates a fully non-parametric, graph theory-based, topological model of the image background and measures deviation from this background using codensity. In this paper we present a large-scale comparative test of TAD against 80+ targets in four full HYDICE images using the entire canonical target set for generation of ROC curves. TAD will be compared against several statistics-based detectors including local RX and subspace RX. Even a perfect anomaly detection algorithm would have a high practical false alarm rate in most scenes simply because the user/analyst is not interested in every anomalous object. To assist the analyst in identifying and sorting objects of interest, we investigate coloring of the anomalies with principal components projections using statistics computed from the anomalies. This gives a very useful colorization of anomalies in which objects of similar material tend to have the same color, enabling an analyst to quickly sort and identify anomalies of highest interest.
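As a reference point for the statistics-based detectors mentioned, the global RX score is simply the Mahalanobis distance of each pixel spectrum from the scene mean under the background covariance; the numpy sketch below shows that baseline on random data and is not the TAD or local/subspace RX implementations used in the comparison.

    # Global RX anomaly scores: Mahalanobis distance of each pixel spectrum from the
    # scene mean, using a background covariance estimated from the whole image.
    import numpy as np

    rng = np.random.default_rng(0)
    pixels = rng.normal(size=(10000, 50))      # 10,000 pixels x 50 bands (toy cube)
    pixels[0] += 6.0                           # implant one anomalous spectrum

    mean = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))   # regularize

    centered = pixels - mean
    rx = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)   # per-pixel RX score

    print("highest-scoring pixel:", int(np.argmax(rx)))          # -> 0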
Staccini, Pascal M.; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius
2002-01-01
Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate the conceptual applicability and practical understandability by clinical staff and members of users teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of clinical context while providing elements for analysis. One of the solutions for filling this gap is to consider the process model itself in the role of a hub as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed notably regarding to its ability to generate data dictionaries and to be used as a navigation tool through the medium of hospital-wide documentation. PMID:12463921
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christon, Mark A.; Bakosi, Jozsef; Lowrie, Robert B.
Hydra-TH is a hybrid finite-element/finite-volume code built using the Hydra toolkit specifically to attack a broad class of incompressible, viscous fluid dynamics problems prevalent in the thermal-hydraulics community. The purpose of this manual is to provide sufficient information for an experienced analyst to use Hydra-TH in an effective way. The Hydra-TH User's Manual presents a brief overview of capabilities and visualization interfaces. The execution and restart models are described before turning to the detailed description of keyword input. Finally, a series of example problems are presented with sufficient data to permit the user to verify the local installation of Hydra-TH, and to permit a convenient starting point for more detailed and complex analyses.
Orbital Debris Engineering Model (ORDEM) v.3
NASA Technical Reports Server (NTRS)
Matney, Mark; Krisko, Paula; Xu, Yu-Lin; Horstman, Matthew
2013-01-01
A model of the manmade orbital debris environment is required by spacecraft designers, mission planners, and others in order to understand and mitigate the effects of the environment on their spacecraft or systems. A manmade environment is dynamic, and can be altered significantly by intent (e.g., the Chinese anti-satellite weapon test of January 2007) or accident (e.g., the collision of Iridium 33 and Cosmos 2251 spacecraft in February 2009). Engineering models are used to portray the manmade debris environment in Earth orbit. The availability of new sensor and in situ data, the re-analysis of older data, and the development of new analytical and statistical techniques has enabled the construction of this more comprehensive and sophisticated model. The primary output of this model is the flux [#debris/area/time] as a function of debris size and year. ORDEM may be operated in spacecraft mode or telescope mode. In the former case, an analyst defines an orbit for a spacecraft and "flies" the spacecraft through the orbital debris environment. In the latter case, an analyst defines a ground-based sensor (telescope or radar) in terms of latitude, azimuth, and elevation, and the model provides the number of orbital debris traversing the sensor's field of view. An upgraded graphical user interface (GUI) is integrated with the software. This upgraded GUI uses project-oriented organization and provides the user with graphical representations of numerous output data products. These range from the conventional flux as a function of debris size for chosen analysis orbits (or views), for example, to the more complex color-contoured two-dimensional (2D) directional flux diagrams in local spacecraft elevation and azimuth.
Omen: identifying potential spear-phishing targets before the email is sent.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wendt, Jeremy Daniel.
2013-07-01
We present the results of a two year project focused on a common social engineering attack method called "spear phishing". In a spear phishing attack, the user receives an email with information specifically focused on the user. This email contains either a malware-laced attachment or a link to download the malware that has been disguised as a useful program. Spear phishing attacks have been one of the most effective avenues for attackers to gain initial entry into a target network. This project focused on a proactive approach to spear phishing. To create an effective, user-specific spear phishing email, the attacker must research the intended recipient. We believe that much of the information used by the attacker is provided by the target organization's own external website. Thus when researching potential targets, the attacker leaves signs of his research in the webserver's logs. We created tools and visualizations to improve cybersecurity analysts' abilities to quickly understand a visitor's visit patterns and interests. Given these suspicious visitors and log-parsing tools, analysts can more quickly identify truly suspicious visitors, search for potential spear-phishing targeted users, and improve security around those users before the spear phishing email is sent.
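Mining a public web server's access logs for researching visitors, as described, amounts to parsing each request line and aggregating per-visitor interest; the sketch below parses combined-log-format lines and counts hits on staff-directory pages, with the log format, paths, and threshold chosen purely for illustration rather than taken from the project's tools.

    # Sketch: aggregate per-visitor interest in "people" pages from access logs.
    import re
    from collections import Counter

    LOG_RE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+)')

    sample_log = """\
    203.0.113.7 - - [10/Mar/2013:10:01:02 -0600] "GET /staff/jsmith.html HTTP/1.1" 200 5120
    203.0.113.7 - - [10/Mar/2013:10:01:09 -0600] "GET /staff/alee.html HTTP/1.1" 200 4801
    198.51.100.4 - - [10/Mar/2013:10:02:11 -0600] "GET /products.html HTTP/1.1" 200 2212
    203.0.113.7 - - [10/Mar/2013:10:03:40 -0600] "GET /staff/ HTTP/1.1" 200 9980
    """

    staff_hits = Counter()
    for line in sample_log.splitlines():
        m = LOG_RE.match(line.strip())
        if m and m.group("path").startswith("/staff/"):   # pages revealing potential targets
            staff_hits[m.group("ip")] += 1

    for ip, hits in staff_hits.items():
        if hits >= 3:
            print(f"{ip} browsed {hits} staff pages: worth an analyst's look")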
Kozlikova, Barbora; Sebestova, Eva; Sustr, Vilem; Brezovsky, Jan; Strnad, Ondrej; Daniel, Lukas; Bednar, David; Pavelka, Antonin; Manak, Martin; Bezdeka, Martin; Benes, Petr; Kotry, Matus; Gora, Artur; Damborsky, Jiri; Sochor, Jiri
2014-09-15
The transport of ligands, ions or solvent molecules into proteins with buried binding sites or through the membrane is enabled by protein tunnels and channels. CAVER Analyst is a software tool for calculation, analysis and real-time visualization of access tunnels and channels in static and dynamic protein structures. It provides an intuitive graphic user interface for setting up the calculation and interactive exploration of identified tunnels/channels and their characteristics. CAVER Analyst is a multi-platform software written in JAVA. Binaries and documentation are freely available for non-commercial use at http://www.caver.cz. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vang, Leng; Prescott, Steven R; Smith, Curtis
In a collaborative scientific research arena, it is important to have an environment where analysts have access to shared information, documents, and software tools, and are able to accurately maintain and track historical changes in models. A new cloud-based environment would be accessible remotely from anywhere, regardless of computing platform, provided the platform has Internet access and proper browser capabilities. Information stored in this environment would be restricted based on user-assigned credentials. This report reviews development of the Cloud-based Architecture Capabilities (CAC) as a web portal for PRA tools.
Communications Effects Server (CES) Model for Systems Engineering Research
2012-01-31
[Architecture diagram excerpts: logical interfaces for a Visualization Tool, HLA Tool, DIS Tool, and STK Tool; an Execution Kernels module; blocks for CES, third-party visualization, analysis, and text-editor tools, HLA tools, and the Analyst User.] ... interoperate with STK when running simulations. GUI Components: Architect - the Architect represents the main network design and visualization ...
1988-10-01
Structured Analysis involves building a logical (non-physical) model of a system, using graphic techniques which enable users, analysts, and designers to ... Design uses tools, especially graphic ones, to render systems readily understandable. Structured Design offers a set of strategies for ... in the overall systems design process, and an overview of the assessment procedures, as well as a guide to the overall assessment.
Measuring the Interestingness of News Articles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pon, R K; Cardenas, A F; Buttler, D J
An explosive growth of online news has taken place. Users are inundated with thousands of news articles, only some of which are interesting. A system to filter out uninteresting articles would aid users that need to read and analyze many articles daily, such as financial analysts and government officials. The most obvious approach for reducing the amount of information overload is to learn keywords of interest for a user (Carreira et al., 2004). Although filtering articles based on keywords removes many irrelevant articles, there are still many uninteresting articles that are highly relevant to keyword searches. A relevant article may not be interesting for various reasons, such as the article's age or if it discusses an event that the user has already read about in other articles. Although it has been shown that collaborative filtering can aid in personalized recommendation systems (Wang et al., 2006), a large number of users is needed. In a limited user environment, such as a small group of analysts monitoring news events, collaborative filtering would be ineffective. The definition of what makes an article interesting--or its 'interestingness'--varies from user to user and is continually evolving, calling for adaptable user personalization. Furthermore, due to the nature of news, most articles are uninteresting since many are similar or report events outside the scope of an individual's concerns. There has been much work in news recommendation systems, but none have yet addressed the question of what makes an article interesting.
2013-09-01
... Result Analysis: In this phase, users and analysts check all the results per objective-question. Then, they consolidate all these results to form ... the CRUD technique. By using both the CRUD and the user goal techniques, we identified all the use cases the iFRE system must perform. Table 3 ... corresponding Focus Area or Critical Operation Issue to simplify the user tasks, and exempts the user from remembering the identifying codes/numbers of ...
User interface for ground-water modeling: Arcview extension
Tsou, Ming‐shu; Whittemore, Donald O.
2001-01-01
Numerical simulation for ground-water modeling often involves handling large input and output data sets. A geographic information system (GIS) provides an integrated platform to manage, analyze, and display disparate data and can greatly facilitate modeling efforts in data compilation, model calibration, and display of model parameters and results. Furthermore, GIS can be used to generate information for decision making through spatial overlay and processing of model results. ArcView is the most widely used Windows-based GIS software that provides a robust user-friendly interface to facilitate data handling and display. An extension is an add-on program to ArcView that provides additional specialized functions. An ArcView interface for the ground-water flow and transport models MODFLOW and MT3D was built as an extension for facilitating modeling. The extension includes preprocessing of spatially distributed (point, line, and polygon) data for model input and postprocessing of model output. An object database is used for linking user dialogs and model input files. The ArcView interface utilizes the capabilities of the 3D Analyst extension. Models can be automatically calibrated through the ArcView interface by external linking to such programs as PEST. The efficient pre- and postprocessing capabilities and calibration link were demonstrated for ground-water modeling in southwest Kansas.
Analysis of field test data on residential heating and cooling
NASA Astrophysics Data System (ADS)
Talbert, S. G.
1980-12-01
The computer program using field site data collected on 48 homes located in six cities in different climatic regions of the United States is discussed. In addition, a User's Guide was prepared for the computer program which is contained in a separate two-volume document entitled User's Guide for REAP: Residential Energy Analysis Program. Feasibility studies were conducted pertaining to potential improvements for REAP, including: the addition of an oil-furnace model; improving the infiltration subroutine; adding active and/or passive solar subroutines; incorporating a thermal energy storage model; and providing dual HVAC systems (e.g., heat pump-gas furnace). The purpose of REAP is to enable building designers and energy analysts to evaluate how such factors as building design, weather conditions, internal heat loads, and HVAC equipment performance, influence the energy requirements of residential buildings.
A suite of models to support the quantitative assessment of spread in pest risk analysis.
Robinet, Christelle; Kehlenbeck, Hella; Kriticos, Darren J; Baker, Richard H A; Battisti, Andrea; Brunel, Sarah; Dupin, Maxime; Eyre, Dominic; Faccoli, Massimo; Ilieva, Zhenya; Kenis, Marc; Knight, Jon; Reynaud, Philippe; Yart, Annie; van der Werf, Wopke
2012-01-01
Pest Risk Analyses (PRAs) are conducted worldwide to decide whether and how exotic plant pests should be regulated to prevent invasion. There is an increasing demand for science-based risk mapping in PRA. Spread plays a key role in determining the potential distribution of pests, but there is no suitable spread modelling tool available for pest risk analysts. Existing models are species specific, biologically and technically complex, and data hungry. Here we present a set of four simple and generic spread models that can be parameterised with limited data. Simulations with these models generate maps of the potential expansion of an invasive species at continental scale. The models have one to three biological parameters. They differ in whether they treat spatial processes implicitly or explicitly, and in whether they consider pest density or pest presence/absence only. The four models represent four complementary perspectives on the process of invasion and, because they have different initial conditions, they can be considered as alternative scenarios. All models take into account habitat distribution and climate. We present an application of each of the four models to the western corn rootworm, Diabrotica virgifera virgifera, using historic data on its spread in Europe. Further tests as proof of concept were conducted with a broad range of taxa (insects, nematodes, plants, and plant pathogens). Pest risk analysts, the intended model users, found the model outputs to be generally credible and useful. The estimation of parameters from data requires insights into population dynamics theory, and this requires guidance. If used appropriately, these generic spread models provide a transparent and objective tool for evaluating the potential spread of pests in PRAs. Further work is needed to validate models, build familiarity in the user community and create a database of species parameters to help realize their potential in PRA practice.
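As a rough illustration of what a "simple and generic" presence/absence spread model with one biological parameter and a habitat mask can look like, here is a minimal toy sketch under assumed parameters; it is not one of the four models described above.

```python
# Toy presence/absence spread model: one spread-rate parameter (cells per year)
# and a habitat mask restricting where the pest can establish.
import numpy as np

def simulate_spread(habitat: np.ndarray, start: tuple[int, int],
                    cells_per_year: int, years: int) -> np.ndarray:
    """Return the year each habitat cell is first invaded (-1 if never invaded)."""
    invaded_year = np.full(habitat.shape, -1, dtype=int)
    invaded_year[start] = 0
    for year in range(1, years + 1):
        occupied = invaded_year >= 0
        frontier = occupied.copy()
        for _ in range(cells_per_year):           # expand one cell per sub-step
            grown = frontier.copy()
            grown[1:, :] |= frontier[:-1, :]
            grown[:-1, :] |= frontier[1:, :]
            grown[:, 1:] |= frontier[:, :-1]
            grown[:, :-1] |= frontier[:, 1:]
            frontier = grown & habitat.astype(bool)   # only suitable habitat is invadable
        newly = frontier & ~occupied
        invaded_year[newly] = year
    return invaded_year

habitat = np.ones((50, 50))                       # assumed uniform habitat grid
result = simulate_spread(habitat, start=(25, 25), cells_per_year=2, years=10)
```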
Intrasystem Analysis Program (IAP) Structural Design Study.
1981-06-01
accuracy constraints, and user competence. This report is designed to serve as a guide in constructing procedures and identifying those aspects of the...parameters. 3.3.3 Userability The term "Userability" refers here to the level of competence assumed for an IAP analyst in need of a procedure. There...media the wires pass through is homogeneous along the length of the wires. Under these assumptions the wave propagation is predominantly transverse
Providing Focus via a Social Media Exploitation Strategy
2014-06-01
networking sites, video/photo sharing websites, forums, message boards, blogs and user-generated content in general as a way to determine the volume...that are constantly being updated by users around the world provide an excellent near-real-time sensor. This sensor can be used to alert analysts...using the platform is to mine the profiles provided by the various platforms. At a minimum, users require a username, but there is usually a large
EIA model documentation: Petroleum Market Model of the National Energy Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1994-12-30
The purpose of this report is to define the objectives of the Petroleum Market Model (PMM), describe its basic approach, and provide detail on how it works. This report is intended as a reference document for model analysts, users, and the public. Documentation of the model is in accordance with EIA's legal obligation to provide adequate documentation in support of its models (Public Law 94-385, section 57.b.2). The PMM models petroleum refining activities, the marketing of products, and the production of natural gas liquids and domestic methanol, and projects petroleum prices and sources of supply for meeting demand. In addition, the PMM estimates domestic refinery capacity expansion and fuel consumption.
SnapShot: Visualization to Propel Ice Hockey Analytics.
Pileggi, H; Stolper, C D; Boyle, J M; Stasko, J T
2012-12-01
Sports analysts live in a world of dynamic games flattened into tables of numbers, divorced from the rinks, pitches, and courts where they were generated. Currently, these professional analysts use R, Stata, SAS, and other statistical software packages for uncovering insights from game data. Quantitative sports consultants seek a competitive advantage both for their clients and for themselves as analytics becomes increasingly valued by teams, clubs, and squads. In order for the information visualization community to support the members of this blossoming industry, it must recognize where and how visualization can enhance the existing analytical workflow. In this paper, we identify three primary stages of today's sports analyst's routine where visualization can be beneficially integrated: 1) exploring a dataspace; 2) sharing hypotheses with internal colleagues; and 3) communicating findings to stakeholders. Working closely with professional ice hockey analysts, we designed and built SnapShot, a system to integrate visualization into the hockey intelligence gathering process. SnapShot employs a variety of information visualization techniques to display shot data, yet given the importance of a specific hockey statistic, shot length, we introduce a technique, the radial heat map. Through a user study, we received encouraging feedback from several professional analysts, both independent consultants and professional team personnel.
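The abstract does not spell out how the radial heat map is built, but the core step is plausibly a polar binning of shots around the goal. The sketch below shows such a binning under assumed rink coordinates and bin edges; it is not SnapShot's actual implementation.

```python
# Hypothetical polar binning of shot locations around the goal for a radial
# heat map: counts accumulate in (angle sector, distance ring) bins.
import numpy as np

def radial_bins(shot_x, shot_y, goal_xy=(89.0, 0.0),
                n_angle_bins=12, ring_edges=(0, 10, 20, 30, 45, 60)):
    dx = np.asarray(shot_x) - goal_xy[0]
    dy = np.asarray(shot_y) - goal_xy[1]
    dist = np.hypot(dx, dy)                       # shot length
    angle = np.degrees(np.arctan2(dy, dx)) % 360  # bearing from the goal
    counts, _, _ = np.histogram2d(
        angle, dist,
        bins=[np.linspace(0, 360, n_angle_bins + 1), np.asarray(ring_edges)])
    return counts                                 # rows: angle sectors, cols: distance rings
```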
Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.
Stolper, Charles D; Perer, Adam; Gotz, David
2014-12-01
As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and to provide interactions that support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.
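One simple way to adapt an algorithm to "produce meaningful partial results" is to write it as a generator that yields its intermediate state after each pass, so a front end can render it and let the analyst stop or steer early. The sketch below applies that pattern to k-means as a generic stand-in; it is not the paper's algorithm.

```python
# Illustrative partial-results pattern: a generator yields intermediate
# cluster assignments after every iteration instead of only the final answer.
import numpy as np

def progressive_kmeans(points: np.ndarray, k: int, iterations: int = 20):
    rng = np.random.default_rng(0)
    centers = points[rng.choice(len(points), k, replace=False)]
    for step in range(iterations):
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = np.argmin(dists, axis=1)
        new_centers = []
        for j in range(k):
            members = points[labels == j]
            # keep the old center if a cluster happens to be empty
            new_centers.append(members.mean(axis=0) if len(members) else centers[j])
        centers = np.array(new_centers)
        yield step, labels, centers               # partial result after each pass

for step, labels, centers in progressive_kmeans(np.random.rand(500, 2), k=3):
    pass  # a visualization layer would redraw here and could break out early
```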
A multi-phase network situational awareness cognitive task analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erbacher, Robert; Frincke, Deborah A.; Wong, Pak C.
The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into making certain that we had feedback from network analysts and managers and understood what their genuine needs are. This article discusses the cognitive task-analysis methodology that we followed to acquire feedback from the analysts. This article also provides the details we acquired from the analysts on their processes, goals, concerns, and the data and metadata that they analyze. Finally, we describe the generation of a novel task-flow diagram representing the activities of the target user base.
Tricco, Andrea C; Zarin, Wasifa; Rios, Patricia; Nincic, Vera; Khan, Paul A; Ghassemi, Marco; Diaz, Sanober; Pham, Ba'; Straus, Sharon E; Langlois, Etienne V
2018-02-12
It is unclear how to engage a wide range of knowledge users in research. We aimed to map the evidence on engaging knowledge users with an emphasis on policy-makers, health system managers, and policy analysts in the knowledge synthesis process through a scoping review. We used the Joanna Briggs Institute guidance for scoping reviews. Nine electronic databases (e.g., MEDLINE), two grey literature sources (e.g., OpenSIGLE), and reference lists of relevant systematic reviews were searched from 1996 to August 2016. We included any type of study describing strategies, barriers and facilitators, or assessing the impact of engaging policy-makers, health system managers, and policy analysts in the knowledge synthesis process. Screening and data abstraction were conducted by two reviewers independently with a third reviewer resolving discrepancies. Frequency and thematic analyses were conducted. After screening 8395 titles and abstracts followed by 394 full-texts, 84 unique documents and 7 companion reports fulfilled our eligibility criteria. All 84 documents were published in the last 10 years, and half were prepared in North America. The most common type of knowledge synthesis with knowledge user engagement was a systematic review (36%). The knowledge synthesis most commonly addressed an issue at the level of national healthcare system (48%) and focused on health services delivery (17%) in high-income countries (86%). Policy-makers were the most common (64%) knowledge users, followed by healthcare professionals (49%) and government agencies as well as patients and caregivers (34%). Knowledge users were engaged in conceptualization and design (49%), literature search and data collection (52%), data synthesis and interpretation (71%), and knowledge dissemination and application (44%). Knowledge users were most commonly engaged as key informants through meetings and workshops as well as surveys, focus groups, and interviews either in-person or by telephone and emails. Knowledge user content expertise/awareness was a common facilitator (18%), while lack of time or opportunity to participate was a common barrier (12%). Knowledge users were most commonly engaged during the data synthesis and interpretation phases of the knowledge synthesis conduct. Researchers should document and evaluate knowledge user engagement in knowledge synthesis. Open Science Framework ( https://osf.io/4dy53/ ).
Hydrologic modeling for monitoring water availability in Africa and the Middle East
NASA Astrophysics Data System (ADS)
McNally, A.; Getirana, A.; Arsenault, K. R.; Peters-Lidard, C. D.; Verdin, J. P.
2015-12-01
Drought impacts water resources required by crops and communities, in turn threatening lives and livelihoods. Early warning systems, which rely on inputs from hydro-climate models, are used to help manage risk and provide humanitarian assistance to the right place at the right time. However, translating advancements in hydro-climate science into action is a persistent and time-consuming challenge: scientists and decision-makers need to work together to enhance the salience, credibility, and legitimacy of the hydrological data products being produced. One organization that tackles this challenge is the Famine Early Warning Systems Network (FEWS NET), which has been using evidence-based approaches to address food security since the 1980s. In this presentation, we describe the FEWS NET Land Data Assimilation System (FLDAS), developed by FEWS NET and NASA hydrologic scientists to maximize the use of limited hydro-climatic observations for humanitarian applications. The FLDAS, an instance of the NASA Land Information System (LIS), comprises land surface models driven by satellite rainfall inputs already familiar to FEWS NET food security analysts. First, we evaluate the quality of model outputs over parts of the Middle East and Africa using remotely sensed soil moisture and vegetation indices. We then describe derived water availability indices that have been identified by analysts as potentially useful sources of information. Specifically, we demonstrate how the Baseline Water Stress and Drought Severity Index detect recent water availability crisis events in the Tigris-Euphrates Basin and the Gaborone Reservoir, Botswana. Finally, we discuss ongoing work to deliver this information to FEWS NET analysts in a timely and user-friendly manner, with the ultimate goal of integrating these water availability metrics into regular decision-making activities.
Community Sourced Knowledge: Solving the Maintenance Problem
2012-05-01
The end-user doesn't need further assistance. KPI – Key Performance Indicator. KPI-1 = Relayed solution to the end-user. KPI-2 = Trouble ticket is...created for the end-user and escalated for further review. KPI-3 = Trouble ticket is created for the end-user. KPI-4 = End of Call. Start of Process...[Flowchart fragment: KPI-1 through KPI-4 and QI decision points, e.g., "Does the analyst understand the end-user's request?" Yes/No; Inform KMT.]
Framework Development Supporting the Safety Portal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prescott, Steven Ralph; Kvarfordt, Kellie Jean; Vang, Leng
2015-07-01
In a collaborative scientific research arena it is important to have an environment where analysts have access to a shared repository of information, documents, and software tools, and are able to accurately maintain and track historical changes in models. The new Safety Portal cloud-based environment will be accessible remotely from anywhere, regardless of computing platform, provided the platform has Internet access and proper browser capabilities. Information stored in this environment would be restricted based on user-assigned credentials. This report discusses current development of a cloud-based web portal for PRA tools.
2016-01-01
Moving Target Indicator, Unmanned Aircraft System (UAS), Rivet Joint, U-2, and ground signals intelligence (PROPHET). At the BCT, Ranger Regiment and... metadata catalog managed by the DIB management office (outside of the DCGS-A system). A metadata record is a searchable description of data, and users across...challenge for users. The system required reboots about every 20 hours for users who had heavy workloads, such as the fire support analysts and data
Dynamics of analyst forecasts and emergence of complexity: Role of information disparity
Ahn, Kwangwon
2017-01-01
We report complex phenomena arising among financial analysts, who gather information and generate investment advice, and elucidate them with the help of a theoretical model. Understanding how analysts form their forecasts is important for better understanding the financial market. Carrying out big-data analysis of the analyst forecast data from I/B/E/S for nearly thirty years, we find skew distributions as evidence for the emergence of complexity, and show how information asymmetry or disparity affects how financial analysts form their forecasts. Here regulations, information dissemination throughout a fiscal year, and interactions among financial analysts are regarded as proxies for a lower level of information disparity. It is found that financial analysts with better access to information display contrasting behaviors: a few analysts become bolder and issue forecasts independent of other forecasts, while the majority of analysts issue more accurate forecasts and flock to each other. The main body of our sample of optimistic forecasts fits a log-normal distribution, with the tail displaying a power law. Based on the Yule process, we propose a model for the dynamics of issuing forecasts, incorporating interactions between analysts. Explaining the empirical data on analyst forecasts well, this provides an appealing instance of understanding social phenomena from the perspective of complex systems. PMID:28498831
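A rough way to see how a Yule-type copying dynamic produces heavy-tailed outcomes is to simulate analysts who either imitate an earlier forecast (herding) or issue an independent one (bold). The sketch below is purely illustrative and is not the calibrated model from the paper.

```python
# Toy Yule-like copying dynamic: each new forecast imitates an existing one
# with probability p_imitate, otherwise it is drawn independently.
import random
from collections import Counter

def simulate_forecasts(n_forecasts=10_000, p_imitate=0.9, seed=1):
    rng = random.Random(seed)
    forecasts = [rng.gauss(0.0, 1.0)]                 # first forecast is independent
    for _ in range(n_forecasts - 1):
        if rng.random() < p_imitate:
            forecasts.append(rng.choice(forecasts))   # flock to an earlier forecast
        else:
            forecasts.append(rng.gauss(0.0, 1.0))     # bold, independent forecast
    return forecasts

copy_counts = Counter(simulate_forecasts())
# The frequencies of individual forecast values are heavily skewed, which is
# the qualitative signature of the heavy-tailed distributions reported above.
```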
Visualization of multi-INT fusion data using Java Viewer (JVIEW)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Aved, Alex; Nagy, James; Scott, Stephen
2014-05-01
Visualization is important for multi-intelligence fusion and we demonstrate issues for presenting physics-derived (i.e., hard) and human-derived (i.e., soft) fusion results. Physics-derived solutions (e.g., imagery) typically involve sensor measurements that are objective, while human-derived (e.g., text) typically involve language processing. Both results can be geographically displayed for user-machine fusion. Attributes of an effective and efficient display are not well understood, so we demonstrate issues and results for filtering, correlation, and association of data for users - be they operators or analysts. Operators require near-real time solutions while analysts have the opportunities of non-real time solutions for forensic analysis. In a use case, we demonstrate examples using the JVIEW concept that has been applied to piloting, space situation awareness, and cyber analysis. Using the open-source JVIEW software, we showcase a big data solution for multi-intelligence fusion application for context-enhanced information fusion.
Manufacturing Cost Levelization Model – A User’s Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morrow, William R.; Shehabi, Arman; Smith, Sarah Josephine
The Manufacturing Cost Levelization Model is a cost-performance techno-economic model that estimates the total large-scale manufacturing costs necessary to produce a given product. It is designed to provide production cost estimates for technology researchers to help guide technology research and development towards an eventual cost-effective product. The model presented in this user’s guide is generic and can be tailored to the manufacturing of any product, including the generation of electricity (as a product). This flexibility, however, requires the user to develop the processes and process efficiencies that represent a full-scale manufacturing facility. The generic model is comprised of several modules that estimate variable costs (material, labor, and operating), fixed costs (capital and maintenance), financing structures (debt and equity financing), and tax implications (taxable income after equipment and building depreciation, debt interest payments, and expenses) of a notional manufacturing plant. A cash-flow method is used to estimate a selling price necessary for the manufacturing plant to recover its total cost of production. A levelized unit sales price ($ per unit of product) is determined by dividing the net present value of the manufacturing plant’s expenses ($) by the net present value of its product output. A user-defined production schedule drives the cash-flow method that determines the levelized unit price. In addition, an analyst can increase the levelized unit price to include a gross profit margin to estimate a product sales price. This model allows an analyst to understand the effect that any input variables could have on the cost of manufacturing a product. In addition, the tool is able to perform sensitivity analysis, which can be used to identify the key variables and assumptions that have the greatest influence on the levelized costs. This component is intended to help technology researchers focus their research attention on tasks that offer the greatest opportunities for cost reduction early in the research and development stages of technology invention.
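The levelization step described above (levelized unit price = NPV of expenses divided by NPV of product output, optionally marked up by a gross margin) can be shown in a few lines. The cash-flow values below are illustrative placeholders, not values from the model.

```python
# Minimal levelized-price calculation: NPV of annual expenses over NPV of
# annual production, plus an optional gross profit margin.
def npv(values, rate):
    return sum(v / (1 + rate) ** t for t, v in enumerate(values, start=1))

annual_expenses = [1_200_000, 950_000, 960_000, 970_000]   # $ per year (assumed)
annual_output   = [0, 80_000, 100_000, 100_000]            # units per year (assumed)
discount_rate = 0.08                                       # assumed discount rate

levelized_price = npv(annual_expenses, discount_rate) / npv(annual_output, discount_rate)
sales_price = levelized_price * 1.15                       # e.g. a 15% gross margin
```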
Stochastic Simulation Tool for Aerospace Structural Analysis
NASA Technical Reports Server (NTRS)
Knight, Norman F.; Moore, David F.
2006-01-01
Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.
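To illustrate the Monte Carlo cause-and-effect idea described above (scatter the design inputs, evaluate the response, rank inputs by influence), here is a generic sketch; the response function, distributions, and variable names are placeholders and are unrelated to the actual Shuttle panel model or the MSC.Robust Design tool.

```python
# Generic Monte Carlo sensitivity sketch: sample scattered inputs, evaluate a
# stand-in response, and rank inputs by |correlation| with the output.
import numpy as np

rng = np.random.default_rng(42)
n = 1000
inputs = {
    "thickness": rng.normal(2.0, 0.05, n),      # assumed mean and scatter
    "modulus":   rng.normal(70e9, 2e9, n),
    "load":      rng.normal(1500.0, 100.0, n),
}

def response(t, E, P):                          # placeholder for a finite element run
    return P / (E * t ** 3)

out = response(inputs["thickness"], inputs["modulus"], inputs["load"])
sensitivity = {name: abs(np.corrcoef(vals, out)[0, 1]) for name, vals in inputs.items()}
# Inputs with the largest |correlation| drive most of the output variability.
```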
Intelligence Dissemination to the Warfighter
2007-12-01
that prevent other JWICS users from exchanging data. The CIA conducts most of their business on the CIAnet, which can pull data from JWICS but...data. Spreadsheets and word processors, in order to retain a high level of user-friendliness, handle several complex background processes that...the "complex adaptive systems", where the onus is placed equally on the analyst and on the tools to be receptive and adaptable. It is the
Timeline analysis tools for law enforcement
NASA Astrophysics Data System (ADS)
Mucks, John
1997-02-01
The timeline analysis system (TAS) was developed by Rome Laboratory to assist intelligence analysts with the comprehension of large amounts of information. Under the TAS program, data visualization, manipulation, and reasoning tools were developed in close coordination with end users. The initial TAS prototype was developed for foreign command and control analysts at Space Command in Colorado Springs and was fielded there in 1989. The TAS prototype replaced manual paper timeline maintenance and analysis techniques and has become an integral part of Space Command's information infrastructure. TAS was designed to be domain independent and has been tailored and proliferated to a number of other users. The TAS program continues to evolve because of strong user support. User-funded enhancements and Rome Lab-funded technology upgrades have significantly enhanced TAS over the years and will continue to do so for the foreseeable future. TAS was recently provided to the New York State Police (NYSP) for evaluation using actual case data. Timeline analysis, it turns out, is a popular methodology in law enforcement. The evaluation has led to a more comprehensive application and evaluation project sponsored by the National Institute of Justice (NIJ). This paper describes the capabilities of TAS, results of the initial NYSP evaluation, and the plan for a more comprehensive NYSP evaluation.
An intelligent interface for satellite operations: Your Orbit Determination Assistant (YODA)
NASA Technical Reports Server (NTRS)
Schur, Anne
1988-01-01
An intelligent interface is often characterized by the ability to adapt evaluation criteria as the environment and user goals change. Some factors that impact these adaptations are redefinition of task goals and, hence, user requirements; time criticality; and system status. To implement adaptations affected by these factors, a new set of capabilities must be incorporated into the human-computer interface design. These capabilities include: (1) dynamic update and removal of control states based on user inputs, (2) generation and removal of logical dependencies as change occurs, (3) uniform and smooth interfacing to numerous processes, databases, and expert systems, and (4) unobtrusive on-line assistance to users. These concepts were applied and incorporated into a human-computer interface using artificial intelligence techniques to create a prototype expert system, Your Orbit Determination Assistant (YODA). YODA is a smart interface that supports, in real time, orbit analysts who must determine the location of a satellite during the station acquisition phase of a mission. Also described is the integration of four knowledge sources required to support the orbit determination assistant: orbital mechanics, spacecraft specifications, characteristics of the mission support software, and orbit analyst experience. This initial effort is continuing with expansion of YODA's capabilities, including evaluation of results of the orbit determination task.
Lee, Bruce Y; Wong, Kim F; Bartsch, Sarah M; Yilmaz, S Levent; Avery, Taliser R; Brown, Shawn T; Song, Yeohan; Singh, Ashima; Kim, Diane S; Huang, Susan S
2013-06-01
As healthcare systems continue to expand and interconnect with each other through patient sharing, administrators, policy makers, infection control specialists, and other decision makers may have to take account of the entire healthcare 'ecosystem' in infection control. We developed a software tool, the Regional Healthcare Ecosystem Analyst (RHEA), that can accept user-inputted data to rapidly create a detailed agent-based simulation model (ABM) of the healthcare ecosystem (ie, all healthcare facilities, their adjoining community, and patient flow among the facilities) of any region to better understand the spread and control of infectious diseases. To demonstrate RHEA's capabilities, we fed extensive data from Orange County, California, USA, into RHEA to create an ABM of a healthcare ecosystem and simulate the spread and control of methicillin-resistant Staphylococcus aureus. Various experiments explored the effects of changing different parameters (eg, degree of transmission, length of stay, and bed capacity). Our model emphasizes how individual healthcare facilities are components of integrated and dynamic networks connected via patient movement and how occurrences in one healthcare facility may affect many other healthcare facilities. A decision maker can utilize RHEA to generate a detailed ABM of any healthcare system of interest, which in turn can serve as a virtual laboratory to test different policies and interventions.
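The abstract describes a region as a network of facilities connected by patient movement. The toy sketch below conveys that agent-based structure with an invented transfer matrix, colonization prevalence, and transmission rate; it is not RHEA's model, parameters, or data.

```python
# Toy regional healthcare model: patients move between facilities via a
# transfer matrix; colonized patients seed transmission in the facility they
# occupy. All numbers are invented for illustration only.
import numpy as np

rng = np.random.default_rng(7)
facilities = ["HospitalA", "HospitalB", "NursingHome"]
transfer = np.array([[0.90, 0.07, 0.03],          # row: from, column: to (per day)
                     [0.05, 0.92, 0.03],
                     [0.02, 0.02, 0.96]])
n_patients = 2000
patients = rng.integers(0, len(facilities), size=n_patients)   # current facility index
colonized = rng.random(n_patients) < 0.05                      # initial carriage

for day in range(365):
    # move each patient according to the transfer matrix
    patients = np.array([rng.choice(len(facilities), p=transfer[f]) for f in patients])
    # within-facility transmission proportional to local colonization prevalence
    for f in range(len(facilities)):
        here = patients == f
        prevalence = colonized[here].mean() if here.any() else 0.0
        newly = here & ~colonized & (rng.random(n_patients) < 0.01 * prevalence)
        colonized |= newly
```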
Visually Exploring Transportation Schedules.
Palomo, Cesar; Guo, Zhan; Silva, Cláudio T; Freire, Juliana
2016-01-01
Public transportation schedules are designed by agencies to optimize service quality under multiple constraints. However, real service usually deviates from the plan. Therefore, transportation analysts need to identify, compare and explain both eventual and systemic performance issues that must be addressed so that better timetables can be created. The purely statistical tools commonly used by analysts pose many difficulties due to the large number of attributes at trip- and station-level for planned and real service. Also challenging is the need for models at multiple scales to search for patterns at different times and stations, since analysts do not know exactly where or when relevant patterns might emerge and need to compute statistical summaries for multiple attributes at different granularities. To aid in this analysis, we worked in close collaboration with a transportation expert to design TR-EX, a visual exploration tool developed to identify, inspect and compare spatio-temporal patterns for planned and real transportation service. TR-EX combines two new visual encodings inspired by Marey's Train Schedule: Trips Explorer for trip-level analysis of frequency, deviation and speed; and Stops Explorer for station-level study of delay, wait time, reliability and performance deficiencies such as bunching. To tackle overplotting and to provide a robust representation for a large number of trips and stops at multiple scales, the system supports variable kernel bandwidths to achieve the level of detail required by users for different tasks. We justify our design decisions based on specific analysis needs of transportation analysts. We provide anecdotal evidence of the efficacy of TR-EX through a series of case studies that explore NYC subway service, which illustrate how TR-EX can be used to confirm hypotheses and derive new insights through visual exploration.
Model documentation report: Residential sector demand module of the national energy modeling system
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This report documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Residential Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, and FORTRAN source code. This reference document provides a detailed description for energy analysts, other users, and the public. The NEMS Residential Sector Demand Module is currently used for mid-term forecasting purposes and energy policy analysis over the forecast horizon of 1993 through 2020. The model generates forecasts of energy demand for the residential sector by service, fuel, and Census Division. Policy impacts resulting from new technologies, market incentives, and regulatory changes can be estimated using the module. 26 refs., 6 figs., 5 tabs.
Analyst-centered models for systems design, analysis, and development
NASA Technical Reports Server (NTRS)
Bukley, A. P.; Pritchard, Richard H.; Burke, Steven M.; Kiss, P. A.
1988-01-01
Much has been written about the possible use of Expert Systems (ES) technology for strategic defense system applications, particularly for battle management algorithms and mission planning. It is proposed that ES (or, more accurately, Knowledge Based System (KBS)) technology can be used in situations for which no human expert exists, namely to create design and analysis environments that allow an analyst to rapidly pose many different possible problem resolutions in game-like fashion and to then work through the solution space in search of the optimal solution. Portions of such an environment exist for expensive AI hardware/software combinations such as the Xerox LOOPS and Intellicorp KEE systems. Efforts are discussed to build an analyst-centered model (ACM) using an ES programming environment, ExperOPS5, for a simple missile system tradeoff study. By analyst centered, it is meant that the focus of learning is for the benefit of the analyst, not the model. The model's environment allows the analyst to pose a variety of "what if" questions without resorting to programming changes. Although not an ES per se, the ACM would allow for a design and analysis environment that is much superior to that of current technologies.
NASA Technical Reports Server (NTRS)
1974-01-01
The purpose of the BRAVO User's Manual is to describe the BRAVO methodology in terms of step-by-step procedures. The BRAVO methodology then becomes a tool which a team of analysts can utilize to perform cost effectiveness analyses on potential future space applications with a relatively general set of input information and a relatively small expenditure of resources. An overview of the BRAVO procedure is given by describing the complete procedure in a general form.
Integration of GCAM-USA into GLIMPSE: Update and ...
The purpose of this presentation is to (i) discuss changes made to the GCAM-USA model to more fully support long-term, coordinated environmental-climate-energy planning within the U.S., and (ii) demonstrate the graphical user interface that has been built to construct modeling scenarios, execute GCAM-USA, and visualize and compare model outputs. GLIMPSE is intended to provide insights into linkages and synergies among the goals of air quality management, climate change mitigation, and long-range energy planning. We have expanded GLIMPSE to also incorporate the open-source Global Change Assessment Model-USA (GCAM-USA), which has state-level representation of the U.S. energy system. With GCAM-USA, GLIMPSE can consider more aspects of the economy, linkages to the water and climate systems, and interactions with other regions of the world. A user-friendly graphical interface allows the system to be applied by analysts to explore a range of policies, such as emission taxes or caps, efficiency standards, and renewable portfolio standards. We expect GLIMPSE to be used within research and planning activities, both within the EPA and beyond.
Integration of bus stop counts data with census data for improving bus service.
DOT National Transportation Integrated Search
2016-04-01
This research project produced an open source transit market data visualization and analysis tool suite, The Bus Transit Market Analyst (BTMA), which contains user-friendly GIS mapping and data analytics tools, and state-of-the-art transit demand...
Guidance Document for PMF Applications with the Multilinear Engine
This document serves as a guide for users of the Multilinear Engine version 2 (ME-2) for source apportionment applications utilizing positive matrix factorization (PMF). It aims to educate experienced source apportionment analysts on available ME rotational tools and provides gui...
ERIC Educational Resources Information Center
White, Owen Roberts
1985-01-01
The author reviews systems providing objective guidelines to facilitate ongoing, daily instructional decisions, focusing on those which utilize the sensitive datum and uniform charting procedures of precision teaching. Potential users are warned that the special education teacher must remain a critical and vigilant analyst of the learning process.…
Integrated multidisciplinary analysis tool IMAT users' guide
NASA Technical Reports Server (NTRS)
Meissner, Frances T. (Editor)
1988-01-01
The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.
Comparison of air-coupled GPR data analysis results determined by multiple analysts
NASA Astrophysics Data System (ADS)
Martino, Nicole; Maser, Ken
2016-04-01
Current bridge deck condition assessment using ground penetrating radar (GPR) requires a trained analyst to manually interpret substructure layering information from B-scan images in order to proceed with an intended analysis (pavement thickness, concrete cover, effects of rebar corrosion, etc.). For example, a recently developed method to rapidly and accurately analyze air-coupled GPR data based on the effects of rebar corrosion requires that a user "picks" a layer of rebar reflections in each B-scan image collected along the length of the deck. These "picks" carry information such as signal amplitude and two-way travel time. When a deck is new, or has little rebar corrosion, the resulting layer of rebar reflections is readily evident and there is little room for subjectivity. However, when a deck is severely deteriorated, the rebar layer may be difficult to identify, and different analysts may make different interpretations of the appropriate layer to analyze. One highly corroded bridge deck was assessed with a number of nondestructive evaluation techniques, including 2 GHz air-coupled GPR. Two trained analysts separately selected the rebar layer in each B-scan image, choosing as much information as possible, even in areas of significant deterioration. The post-processing of the selected data points was then completed and the results from each analyst were contour plotted to observe any discrepancies. The paper describes the differences between ground-coupled and air-coupled GPR systems, the data collection and analysis methods used by two different analysts for one case study, and the results of the two different analyses.
NASA Astrophysics Data System (ADS)
Bianchetti, Raechel Anne
Remotely sensed images have become a ubiquitous part of our daily lives. From novice users, aiding in search and rescue missions using tools such as TomNod, to trained analysts, synthesizing disparate data to address complex problems like climate change, imagery has become central to geospatial problem solving. Expert image analysts are continually faced with rapidly developing sensor technologies and software systems. In response to these cognitively demanding environments, expert analysts develop specialized knowledge and analytic skills to address increasingly complex problems. This study identifies the knowledge, skills, and analytic goals of expert image analysts tasked with identification of land cover and land use change. Analysts participating in this research are currently working as part of a national level analysis of land use change, and are well versed with the use of TimeSync, forest science, and image analysis. The results of this study benefit current analysts as it improves their awareness of their mental processes used during the image interpretation process. The study also can be generalized to understand the types of knowledge and visual cues that analysts use when reasoning with imagery for purposes beyond land use change studies. Here a Cognitive Task Analysis framework is used to organize evidence from qualitative knowledge elicitation methods for characterizing the cognitive aspects of the TimeSync image analysis process. Using a combination of content analysis, diagramming, semi-structured interviews, and observation, the study highlights the perceptual and cognitive elements of expert remote sensing interpretation. Results show that image analysts perform several standard cognitive processes, but flexibly employ these processes in response to various contextual cues. Expert image analysts' ability to think flexibly during their analysis process was directly related to their amount of image analysis experience. Additionally, results show that the basic Image Interpretation Elements continue to be important despite technological augmentation of the interpretation process. These results are used to derive a set of design guidelines for developing geovisual analytic tools and training to support image analysis.
The Variability of Crater Identification Among Expert and Community Crater Analysts
NASA Astrophysics Data System (ADS)
Robbins, S. J.; Antonenko, I.; Kirchoff, M. R.; Chapman, C. R.; Fassett, C. I.; Herrick, R. R.; Singer, K.; Zanetti, M.; Lehan, C.; Huang, D.; Gay, P.
2014-04-01
Statistical studies of impact crater populations have been used to model ages of planetary surfaces for several decades [1]. This assumes that crater counts are approximately invariant and a "correct" population will be identified if the analyst is skilled and diligent. However, the reality is that crater identification is somewhat subjective, so variability between analysts, or even a single analyst's variation from day-to-day, is expected [e.g., 2, 3]. This study was undertaken to quantify that variability within an expert analyst population and between experts and minimally trained volunteers.
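One simple way to quantify analyst-to-analyst variability of the kind described above is to match craters between two catalogs by center distance and diameter agreement and report the matched fraction. The tolerance and array layout below are assumptions for illustration, not the study's protocol.

```python
# Hypothetical catalog comparison: a crater in catalog A counts as matched if
# catalog B has a crater with a nearby center and a similar diameter.
import numpy as np

def match_fraction(cat_a: np.ndarray, cat_b: np.ndarray, tol: float = 0.5) -> float:
    """cat_* are (N, 3) arrays of x, y, diameter. Returns fraction of A matched in B."""
    matched = 0
    for x, y, d in cat_a:
        dist = np.hypot(cat_b[:, 0] - x, cat_b[:, 1] - y)
        similar = np.abs(cat_b[:, 2] - d) < tol * d
        if np.any((dist < tol * d) & similar):
            matched += 1
    return matched / len(cat_a)

# Example with toy catalogs from two analysts (x, y, diameter in arbitrary units):
analyst_a = np.array([[10.0, 12.0, 3.0], [40.0, 41.0, 8.0], [70.0, 15.0, 2.0]])
analyst_b = np.array([[10.5, 11.8, 3.2], [40.2, 40.7, 7.5]])
agreement = match_fraction(analyst_a, analyst_b)   # 2 of 3 craters matched
```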
Bergin, Michael
2011-01-01
Qualitative data analysis is a complex process and demands clear thinking on the part of the analyst. However, a number of deficiencies may obstruct the research analyst during the process, leading to inconsistencies occurring. This paper is a reflection on the use of a qualitative data analysis program, NVivo 8, and its usefulness in identifying consistency and inconsistency during the coding process. The author was conducting a large-scale study of providers and users of mental health services in Ireland. He used NVivo 8 to store, code and analyse the data and this paper reflects some of his observations during the study. The demands placed on the analyst in trying to balance the mechanics of working through a qualitative data analysis program, while simultaneously remaining conscious of the value of all sources are highlighted. NVivo 8 as a qualitative data analysis program is a challenging but valuable means for advancing the robustness of qualitative research. Pitfalls can be avoided during analysis by running queries as the analyst progresses from tree node to tree node rather than leaving it to a stage whereby data analysis is well advanced.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.
Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations, from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigation are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine the data mapping for gaps where the threat model is under-supported by either data or tools. We discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.
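As an illustration of mapping available data into a threat model to find coverage gaps, the sketch below associates a few real ATT&CK technique IDs with assumed data sources and flags unsupported techniques; it is not the mapping produced in the work above.

```python
# Illustrative data-to-threat-model gap check: techniques whose required data
# sources are entirely missing from the analyst's inventory are flagged.
technique_to_sources = {
    "T1059 Command and Scripting Interpreter": {"process creation logs"},
    "T1021 Remote Services": {"authentication logs", "network flow"},
    "T1566 Phishing": {"email gateway logs"},
}
available_sources = {"process creation logs", "network flow"}   # assumed inventory

gaps = {tech for tech, needed in technique_to_sources.items()
        if not needed & available_sources}
# 'gaps' holds techniques the current data cannot support -- candidate design
# spaces for new collection or tooling.
```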
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-02-26
The Natural Gas Transmission and Distribution Model (NGTDM) of the National Energy Modeling System is developed and maintained by the Energy Information Administration (EIA), Office of Integrated Analysis and Forecasting. This report documents the archived version of the NGTDM that was used to produce the natural gas forecasts presented in the Annual Energy Outlook 1996 (DOE/EIA-0383(96)). The purpose of this report is to provide a reference document for model analysts, users, and the public that defines the objectives of the model, describes its basic approach, and provides detail on the methodology employed. Previously this report represented Volume I of a two-volume set. Volume II reported on model performance, detailing convergence criteria and properties, results of sensitivity testing, comparison of model outputs with the literature and/or other model results, and major unresolved issues.
EPA Positive Matrix Factorization (PMF) 3.0 Fundamentals & User Guide
Positive matrix factorization (PMF) is a multivariate factor analysis tool that decomposes a matrix of ambient data into two matrices - factor contributions and factor profiles - which then need to be interpreted by an analyst as to what source types are represented using measure...
Zhang, Zhen; Franklin, Amy; Walji, Muhammad; Zhang, Jiajie; Gong, Yang
2014-01-01
EHR usability has been identified as a major barrier to care quality optimization. One major challenge of improving EHR usability is the lack of systematic training in usability or cognitive ergonomics for EHR designers/developers in the vendor community and EHR analysts making significant configurations in healthcare organizations. A practical solution is to provide usability inspection tools that can be easily operationalized by EHR analysts. This project is aimed at developing a set of usability tools with demonstrated validity and reliability. We present a preliminary study of a metric for cognitive transparency and an exploratory experiment testing its validity in predicting the effectiveness of action-effect mapping. Despite the pilot nature of both, we found high sensitivity and specificity of the metric and higher response accuracy within a shorter time for users to determine action-effect mappings in transparent user interface controls. We plan to expand the sample size in our empirical study. PMID:25954439
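For readers unfamiliar with the validity figures reported above, the sketch below shows how sensitivity and specificity would be computed for a labeled set of interface controls; the labels are placeholders, not the study's data.

```python
# Sensitivity and specificity from paired predictions and ground-truth labels.
def sensitivity_specificity(predicted, actual):
    tp = sum(p and a for p, a in zip(predicted, actual))          # true positives
    tn = sum((not p) and (not a) for p, a in zip(predicted, actual))
    fp = sum(p and (not a) for p, a in zip(predicted, actual))
    fn = sum((not p) and a for p, a in zip(predicted, actual))
    return tp / (tp + fn), tn / (tn + fp)

# predicted: the metric flags a control as cognitively transparent;
# actual: users correctly determined its action-effect mapping (placeholder labels).
sens, spec = sensitivity_specificity(
    predicted=[True, True, False, True, False],
    actual=[True, True, False, False, False])
```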
Advancing satellite operations with intelligent graphical monitoring systems
NASA Technical Reports Server (NTRS)
Hughes, Peter M.; Shirah, Gregory W.; Luczak, Edward C.
1993-01-01
For nearly twenty-five years, spacecraft missions have been operated in essentially the same manner: human operators monitor displays filled with alphanumeric text watching for limit violations or other indicators that signal a problem. The task is performed predominately by humans. Only in recent years have graphical user interfaces and expert systems been accepted within the control center environment to help reduce operator workloads. Unfortunately, the development of these systems is often time consuming and costly. At the NASA Goddard Space Flight Center (GSFC), a new domain specific expert system development tool called the Generic Spacecraft Analyst Assistant (GenSAA) has been developed. Through the use of a highly graphical user interface and point-and-click operation, GenSAA facilitates the rapid, 'programming-free' construction of intelligent graphical monitoring systems to serve as real-time, fault-isolation assistants for spacecraft analysts. Although specifically developed to support real-time satellite monitoring, GenSAA can support the development of intelligent graphical monitoring systems in a variety of space and commercial applications.
BiSet: Semantic Edge Bundling with Biclusters for Sensemaking.
Sun, Maoyuan; Mi, Peng; North, Chris; Ramakrishnan, Naren
2016-01-01
Identifying coordinated relationships is an important task in data analytics. For example, an intelligence analyst might want to discover three suspicious people who all visited the same four cities. Existing techniques that display individual relationships, such as between lists of entities, require repetitious manual selection and significant mental aggregation in cluttered visualizations to find coordinated relationships. In this paper, we present BiSet, a visual analytics technique to support interactive exploration of coordinated relationships. In BiSet, we model coordinated relationships as biclusters and algorithmically mine them from a dataset. Then, we visualize the biclusters in context as bundled edges between sets of related entities. Thus, bundles enable analysts to infer task-oriented semantic insights about potentially coordinated activities. We treat bundles as first-class objects and add a new layer, "in-between", to contain these bundle objects. Based on this, bundles serve to organize entities represented in lists and visually reveal their membership. Users can interact with edge bundles to organize related entities, and vice versa, for sensemaking purposes. With a usage scenario, we demonstrate how BiSet supports the exploration of coordinated relationships in text analytics.
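The "three people, same four cities" example maps directly onto a bicluster: a group of entities sharing a common set of related entities. The brute-force enumeration below is only an illustration of that structure on invented data; BiSet relies on a proper bicluster mining algorithm.

```python
# Illustrative bicluster search: groups of people who all visited the same
# (sufficiently large) set of cities. Data are invented.
from itertools import combinations

visits = {
    "alice": {"paris", "rome", "cairo", "lima"},
    "bob":   {"paris", "rome", "cairo", "lima", "oslo"},
    "carol": {"paris", "rome", "cairo", "lima"},
    "dave":  {"oslo", "lima"},
}

biclusters = []
for size in range(len(visits), 1, -1):            # prefer larger groups first
    for group in combinations(visits, size):
        shared = set.intersection(*(visits[p] for p in group))
        if len(shared) >= 4:                      # e.g. "the same four cities"
            biclusters.append((group, shared))
# biclusters contains (people, shared cities) pairs such as
# (('alice', 'bob', 'carol'), {'paris', 'rome', 'cairo', 'lima'}).
```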
NASA Astrophysics Data System (ADS)
Davenport, Jack H.
2016-05-01
Intelligence analysts demand rapid information fusion capabilities to develop and maintain accurate situational awareness and understanding of dynamic enemy threats in asymmetric military operations. The ability to extract relationships between people, groups, and locations from a variety of text datasets is critical to proactive decision making. The derived network of entities must be automatically created and presented to analysts to assist in decision making. DECISIVE ANALYTICS Corporation (DAC) provides capabilities to automatically extract entities, relationships between entities, semantic concepts about entities, and network models of entities from text and multi-source datasets. DAC's Natural Language Processing (NLP) Entity Analytics model entities as complex systems of attributes and interrelationships which are extracted from unstructured text via NLP algorithms. The extracted entities are automatically disambiguated via machine learning algorithms, and resolution recommendations are presented to the analyst for validation; the analyst's expertise is leveraged in this hybrid human/computer collaborative model. Military capability is enhanced by these NLP Entity Analytics because analysts can now create/update an entity profile with intelligence automatically extracted from unstructured text, thereby fusing entity knowledge from structured and unstructured data sources. Operational and sustainment costs are reduced since analysts do not have to manually tag and resolve entities.
MAGIC Computer Simulation. Volume 1: User Manual
1970-07-01
vulnerability and MAGIC programs. A three-digit code is assigned to each component of the target, such as armor, gun tube; and a two-digit code is assigned to...A review of the subject MAGIC Computer Simulation User and Analyst Manuals has been conducted based upon a request received from the US Army...
Arctic Capability Inventory Tool User Guide: Version 2 (International References)
2011-07-01
drawn from the primary source documents. In cases where the analyst included additional information, the text is included in [square brackets]. The...following: FERP is the all-hazards plan for a coordinated federal response to emergencies. In most cases, departments manage emergencies with event...signed—(20) Andorra, Azerbaijan, Ecuador, Eritrea, Israel, Kazakhstan, Kyrgyzstan, Peru, San Marino, Syria, Tajikistan, Timor-Leste, Turkey
Setting analyst: A practical harvest planning technique
Olivier R.M. Halleux; W. Dale Greene
2001-01-01
Setting Analyst is an ArcView extension that facilitates practical harvest planning for ground-based systems. By modeling the travel patterns of ground-based machines, it compares different harvesting settings based on projected average skidding distance, logging costs, and site disturbance levels. Setting Analyst uses information commonly available to consulting...
Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M
2000-01-01
Hospital information systems have to support quality improvement objectives. The design issues of a health care information system can be classified into three categories: 1) time-oriented and event-labelled storage of patient data; 2) contextual support of decision-making; 3) capabilities for modular upgrading. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualize clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the field of blood transfusion. An object-oriented data model of a process has been defined in order to identify its main components: activity, sub-process, resources, constraints, guidelines, parameters and indicators. Although some aspects of activity, such as "where", "what else", and "why" are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for this approach to be generalised within the organisation, for the processes to be interrelated, and for their characteristics to be shared.
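A minimal object sketch of the process components named above (activity, sub-process, resources, constraints, guidelines, parameters, indicators) is shown below. The field names follow the abstract; the class layout and the example instance are assumptions.

```python
# Hypothetical object model of a care process and its components.
from dataclasses import dataclass, field

@dataclass
class Activity:
    name: str
    resources: list[str] = field(default_factory=list)
    constraints: list[str] = field(default_factory=list)
    guidelines: list[str] = field(default_factory=list)
    parameters: dict[str, str] = field(default_factory=dict)
    indicators: list[str] = field(default_factory=list)

@dataclass
class Process:
    name: str
    activities: list[Activity] = field(default_factory=list)
    sub_processes: list["Process"] = field(default_factory=list)

# Example instance for the blood transfusion domain (illustrative only).
transfusion = Process(
    name="blood transfusion",
    activities=[Activity(name="pre-transfusion bedside check",
                         indicators=["traceability of unit identification"])])
```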
Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M
2001-12-01
Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.
Supporting tactical intelligence using collaborative environments and social networking
NASA Astrophysics Data System (ADS)
Wollocko, Arthur B.; Farry, Michael P.; Stark, Robert F.
2013-05-01
Modern military environments place an increased emphasis on the collection and analysis of intelligence at the tactical level. The deployment of analytical tools at the tactical level helps support the Warfighter's need for rapid collection, analysis, and dissemination of intelligence. However, given the lack of experience and staffing at the tactical level, most of the available intelligence is not exploited. Tactical environments are staffed by a new generation of intelligence analysts who are well-versed in modern collaboration environments and social networking. An opportunity exists to enhance tactical intelligence analysis by exploiting these personnel strengths, but is dependent on appropriately designed information sharing technologies. Existing social information sharing technologies enable users to publish information quickly, but do not unite or organize information in a manner that effectively supports intelligence analysis. In this paper, we present an alternative approach to structuring and supporting tactical intelligence analysis that combines the benefits of existing concepts, and provide detail on a prototype system embodying that approach. Since this approach employs familiar collaboration support concepts from social media, it enables new-generation analysts to identify the decision-relevant data scattered among databases and the mental models of other personnel, increasing the timeliness of collaborative analysis. Also, the approach enables analysts to collaborate visually to associate heterogeneous and uncertain data within the intelligence analysis process, increasing the robustness of collaborative analyses. Utilizing this familiar dynamic collaboration environment, we hope to achieve a significant reduction of time and skill required to glean actionable intelligence in these challenging operational environments.
COMET-AR User's Manual: COmputational MEchanics Testbed with Adaptive Refinement
NASA Technical Reports Server (NTRS)
Moas, E. (Editor)
1997-01-01
The COMET-AR User's Manual provides a reference manual for the Computational Structural Mechanics Testbed with Adaptive Refinement (COMET-AR), a software system developed jointly by Lockheed Palo Alto Research Laboratory and NASA Langley Research Center under contract NAS1-18444. The COMET-AR system is an extended version of an earlier finite element based structural analysis system called COMET, also developed by Lockheed and NASA. The primary extensions are the adaptive mesh refinement capabilities and a new "object-like" database interface that makes COMET-AR easier to extend further. This User's Manual provides a detailed description of the user interface to COMET-AR from the viewpoint of a structural analyst.
User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Coleman, Kayla; Hooper, Russell
2016-11-01
Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility. This manual offers Consortium for Advanced Simulation of Light Water Reactors (CASL) partners a guide to conducting Dakota-based VUQ studies for CASL problems. It motivates various classes of Dakota methods and includes examples of their use on representative application problems. On reading, a CASL analyst should understand why and how to apply Dakota to a simulation problem.
Memory Forensics: Review of Acquisition and Analysis Techniques
2013-11-01
Management Overview: Processes running on modern multitasking operating systems operate on an abstraction of RAM, called virtual memory [7]. In these systems ... information such as user names, email addresses and passwords [7]. Analysts also use tools such as WinHex to identify headers or other suspicious data within ...
Stage Evolution of Office Automation Technological Change and Organizational Learning.
ERIC Educational Resources Information Center
Sumner, Mary
1985-01-01
A study was conducted to identify stage characteristics in terms of technology, applications, the role and responsibilities of the office automation organization, and planning and control strategies; and to describe the respective roles of data processing professionals, office automation analysts, and users in office automation systems development…
Quest: The Interactive Test Analysis System.
ERIC Educational Resources Information Center
Adams, Raymond J.; Khoo, Siek-Toon
The Quest program offers a comprehensive test and questionnaire analysis environment by providing a data analyst (a computer program) with access to the most recent developments in Rasch measurement theory, as well as a range of traditional analysis procedures. This manual helps the user use Quest to construct and validate variables based on…
ERIC Educational Resources Information Center
Cullen, Kevin
2005-01-01
Corporations employ data mining to analyze operations, find trends in recorded information, and look for new opportunities. Libraries are no different. Librarians manage large stores of data--about collections and usage, for example--and they also want to analyze this data to serve their users better. Analysts use data mining to query a data…
Original data preprocessor for Femap/Nastran
NASA Astrophysics Data System (ADS)
Oanta, Emil M.; Panait, Cornel; Raicu, Alexandra
2016-12-01
Automatic data processing and visualization in the finite element analysis of structural problems is a long-standing concern in mechanical engineering. The paper presents the `common database' concept, according to which the same information may be accessed from an analytical model as well as from a numerical one. Input data expressed as comma-separated-value (CSV) files are loaded into the Femap/Nastran environment using original API codes, which automatically generate the geometry of the model, the loads, and the constraints. The original API computer codes are general, so the input data of any model can be generated. In the next stages, the user may create the discretization of the model, set the boundary conditions, and perform a given analysis. If additional accuracy is needed, the analyst may delete the previous discretizations and, using the same automatically loaded information, create other discretizations and perform further analyses. Moreover, if new, more accurate information regarding the loads or constraints is acquired, it may be modelled and then implemented in the data-generating program which creates the `common database', so that new, more accurate models may be easily generated. A further facility is the ability to control the CSV input files, which makes it possible to generate several loading scenarios in Femap/Nastran. In this way, using original intelligent API instruments, the analyst can focus on accurately modelling the phenomena and on creative aspects, while the repetitive and time-consuming activities are performed by the original computer-based instruments. This data processing technique follows Asimov's principle of `minimum change required / maximum desired response'.
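As a hedged sketch of the `common database' idea, the snippet below reads nodes and loads from CSV text into plain data structures that either an analytical calculation or a numerical preprocessor could consume; the column layout is invented, and this is not the authors' Femap API code:

```python
# Generic sketch of reading a 'common database' of nodes and loads from CSV;
# the column layout is an assumption, not the authors' actual format.
import csv
import io

NODES_CSV = "id,x,y,z\n1,0.0,0.0,0.0\n2,1.0,0.0,0.0\n"
LOADS_CSV = "id,fx,fy,fz\n2,0.0,0.0,-500.0\n"


def read_nodes(f):
    """Return {node_id: (x, y, z)} from a CSV stream with columns id,x,y,z."""
    return {int(r["id"]): (float(r["x"]), float(r["y"]), float(r["z"]))
            for r in csv.DictReader(f)}


def read_loads(f):
    """Return a list of (node_id, fx, fy, fz) from a CSV stream with columns id,fx,fy,fz."""
    return [(int(r["id"]), float(r["fx"]), float(r["fy"]), float(r["fz"]))
            for r in csv.DictReader(f)]


nodes = read_nodes(io.StringIO(NODES_CSV))
loads = read_loads(io.StringIO(LOADS_CSV))
# The same structures could feed an analytical calculation directly, or be
# translated into preprocessor commands (e.g., via a CAE system's API) to
# generate geometry, loads, and constraints automatically.
print(len(nodes), "nodes and", len(loads), "load records read from the common database")
```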
Unlocking User-Centered Design Methods for Building Cyber Security Visualizations
2015-10-03
... a final, deployed tool. Goodall et al. interviewed analysts to derive requirements for a network security tool [14], while Stoll et al. explain the ...
IMAT (Integrated Multidisciplinary Analysis Tool) user's guide for the VAX/VMS computer
NASA Technical Reports Server (NTRS)
Meissner, Frances T. (Editor)
1988-01-01
The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system for the VAX/VMS computer developed at the Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.
NASA Technical Reports Server (NTRS)
1974-01-01
The BRAVO User's Manual is presented, describing the BRAVO methodology in terms of step-by-step procedures so that it may be used as a tool for a team of analysts performing cost-effectiveness analyses on potential future space applications. BRAVO requires a relatively general set of input information and a relatively small expenditure of resources. For Vol. 1, see N74-12493; for Vol. 2, see N74-14530.
Instrument Systems Analysis and Verification Facility (ISAVF) users guide
NASA Technical Reports Server (NTRS)
Davis, J. F.; Thomason, J. O.; Wolfgang, J. L.
1985-01-01
The ISAVF facility is primarily an interconnected system of computers, special purpose real-time hardware, and associated generalized software systems, which will permit the Instrument System Analysts, Design Engineers, and Instrument Scientists to perform trade-off studies, specification development, instrument modeling, and verification of instrument hardware performance. It is not the intent of the ISAVF to duplicate or replace existing special purpose facilities such as the Code 710 Optical Laboratories or the Code 750 Test and Evaluation facilities. The ISAVF will provide data acquisition and control services for these facilities, as needed, using remote computer stations attached to the main ISAVF computers via dedicated communication lines.
Modeling a terminology-based electronic nursing record system: an object-oriented approach.
Park, Hyeoun-Ae; Cho, InSook; Byeun, NamSoo
2007-10-01
The aim of this study was to present our perspectives on healthcare information analysis at a conceptual level and the lessons learned from our experience with the development of a terminology-based enterprise electronic nursing record system - which was one of the components in an EMR system at a tertiary teaching hospital in Korea - using an object-oriented system analysis and design concept. To ensure a systematic approach and effective collaboration, the department of nursing constituted a system modeling team comprising a project manager, systems analysts, user representatives, an object-oriented methodology expert, and healthcare informaticists (including the authors). A rational unified process (RUP) and the Unified Modeling Language were used as a development process and for modeling notation, respectively. From the scenario and RUP approach, user requirements were formulated into use case sets and the sequence of activities in the scenario was depicted in an activity diagram. The structure of the system was presented in a class diagram. This approach allowed us to identify clearly the structural and behavioral states and important factors of a terminology-based ENR system (e.g., business concerns and system design concerns) according to the viewpoints of both domain and technical experts.
NASA Technical Reports Server (NTRS)
Birisan, Mihnea; Beling, Peter
2011-01-01
New generations of surveillance drones are being outfitted with numerous high definition cameras. The rapid proliferation of fielded sensors and supporting capacity for processing and displaying data will translate into ever more capable platforms, but with increased capability comes increased complexity and scale that may diminish the usefulness of such platforms to human operators. We investigate methods for alleviating strain on analysts by automatically retrieving content specific to their current task using a machine learning technique known as Multi-Instance Learning (MIL). We use MIL to create a real time model of the analysts' task and subsequently use the model to dynamically retrieve relevant content. This paper presents results from a pilot experiment in which a computer agent is assigned analyst tasks such as identifying caravanning vehicles in a simulated vehicle traffic environment. We compare agent performance between MIL aided trials and unaided trials.
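For readers unfamiliar with the setting, the following is a naive single-instance baseline that only illustrates how bag-level labels can drive instance-level learning; it is not the authors' MIL retrieval model, and the features are random stand-ins for image-chip descriptors:

```python
# Naive multi-instance learning baseline: propagate each bag's label to its
# instances, train an instance-level classifier, and score a bag by the
# maximum predicted probability of its instances. This illustrates the MIL
# setting only; it is not the authors' retrieval model.
import numpy as np
from sklearn.linear_model import LogisticRegression


def fit_mil_baseline(bags, bag_labels):
    X = np.vstack(bags)                                   # all instances
    y = np.concatenate([[label] * len(bag) for bag, label in zip(bags, bag_labels)])
    return LogisticRegression(max_iter=1000).fit(X, y)


def score_bag(clf, bag):
    # A bag (e.g., all image chips from one video frame) is relevant if any
    # instance looks relevant to the analyst's current task.
    return clf.predict_proba(bag)[:, 1].max()


# Toy example with random features standing in for image-chip descriptors.
rng = np.random.default_rng(0)
bags = [rng.normal(size=(5, 8)) + (i % 2) for i in range(20)]   # 20 bags of 5 instances
labels = [i % 2 for i in range(20)]
clf = fit_mil_baseline(bags, labels)
print(round(score_bag(clf, bags[1]), 3))
```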
Rapid Exploitation and Analysis of Documents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buttler, D J; Andrzejewski, D; Stevens, K D
Analysts are overwhelmed with information. They have large archives of historical data, both structured and unstructured, and continuous streams of relevant messages and documents that they need to match to current tasks, digest, and incorporate into their analysis. The purpose of the READ project is to develop technologies to make it easier to catalog, classify, and locate relevant information. We approached this task from multiple angles. First, we tackle the issue of processing large quantities of information in reasonable time. Second, we provide mechanisms that allow users to customize their queries based on latent topics exposed from corpus statistics. Third, we assist users in organizing query results, adding localized expert structure over results. Fourth, we use word sense disambiguation techniques to increase the precision of matching user-generated keyword lists with terms and concepts in the corpus. Fifth, we enhance co-occurrence statistics with latent topic attribution, to aid entity relationship discovery. Finally, we quantitatively analyze the quality of three popular latent modeling techniques to examine under which circumstances each is useful.
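The second capability, exposing latent topics from corpus statistics so users can refine their queries, can be sketched with an off-the-shelf topic model (a generic illustration, not the READ project's code; the documents are invented):

```python
# Generic illustration of exposing latent topics from corpus statistics so a
# user can refine a query; not the READ project's implementation.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "border crossing reported near the checkpoint",
    "shipment of equipment delayed at the port",
    "new checkpoint procedures announced for the border",
    "port authority reviews cargo inspection equipment",
]

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = vectorizer.get_feature_names_out()

# Show the top terms of each latent topic so the user can pick one to
# constrain or expand the original keyword query.
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-4:][::-1]]
    print(f"topic {k}: {', '.join(top)}")
```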
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-11
... end-use and end-user of the U.S. origin commodities to be exported. The information will assist the... information collection). Affected Public: Business or other for-profit organizations. Estimated Number of... become a matter of public record. Dated: January 5, 2012. Gwellnar Banks, Management Analyst, Office of...
Tree value system: users guide.
J.K. Ayer Sachet; D.G. Briggs; R.D. Fight
1989-01-01
This paper instructs resource analysts on use of the Tree Value System (TREEVAL). TREEVAL is a microcomputer system of programs for calculating tree or stand values and volumes based on predicted product recovery. Designed for analyzing silvicultural decisions, the system can also be used for appraisals and for evaluating log bucking. The system calculates results...
SOLVE The performance analyst for hardwood sawmills
Jeff Palmer; Jan Wiedenbeck; Elizabeth Porterfield
2009-01-01
Presents the user's manual and CD-ROM for SOLVE, a computer program that helps sawmill managers improve efficiency and solve problems commonly found in hardwood sawmills. SOLVE provides information on key operational factors including log size distribution, lumber grade yields, lumber recovery factor and overrun, and break-even log costs. (Microsoft Windows® Edition)...
FIESTA—An R estimation tool for FIA analysts
Tracey S. Frescino; Paul L. Patterson; Gretchen G. Moisen; Elizabeth A. Freeman
2015-01-01
FIESTA (Forest Inventory ESTimation for Analysis) is a user-friendly R package that was originally developed to support the production of estimates consistent with current tools available for the Forest Inventory and Analysis (FIA) National Program, such as FIDO (Forest Inventory Data Online) and EVALIDator. FIESTA provides an alternative data retrieval and reporting...
Analytic Steering: Inserting Context into the Information Dialog
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bohn, Shawn J.; Calapristi, Augustin J.; Brown, Shyretha D.
2011-10-23
An analyst’s intrinsic domain knowledge is a primary asset in almost any analysis task. Unstructured text analysis systems that apply unsupervised content analysis approaches can be more effective if they can leverage this domain knowledge in a manner that augments the information discovery process without obfuscating new or unexpected content. Current unsupervised approaches rely upon the prowess of the analyst to submit the right queries or observe generalized document and term relationships from ranked or visual results. We propose a new approach which allows the user to control or steer the analytic view within the unsupervised space. This process is controlled through the data characterization process via user-supplied context in the form of a collection of key terms. We show that steering with an appropriate choice of key terms can provide better relevance to the analytic domain and still enable the analyst to uncover unexpected relationships; this paper discusses cases where various analytic steering approaches can provide enhanced analysis results and cases where analytic steering can have a negative impact on the analysis process.
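One simple way to realize such steering, sketched here under the assumption of a term-boosting scheme that is not necessarily the authors' data characterization process, is to up-weight analyst-supplied key terms before an unsupervised clustering step:

```python
# Sketch of steering an unsupervised text analysis with analyst-supplied key
# terms by boosting their weights before clustering; the boost factor and
# pipeline are illustrative assumptions, not the paper's algorithm.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = [
    "flooding closed the main road into the city",
    "the virus outbreak strained local hospitals",
    "road repairs scheduled after storm damage",
    "hospital staff report a rise in infections",
]
steering_terms = ["hospital", "virus", "infections"]   # analyst-supplied context

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs).toarray()

# Boost columns corresponding to the steering terms so that they dominate the
# similarity structure without discarding the remaining vocabulary.
boost = 3.0
vocab = vec.vocabulary_
for term in steering_terms:
    if term in vocab:
        X[:, vocab[term]] *= boost

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)
```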
Local Debonding and Fiber Breakage in Composite Materials Modeled Accurately
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steven M.
2001-01-01
A prerequisite for full utilization of composite materials in aerospace components is accurate design and life prediction tools that enable the assessment of component performance and reliability. Such tools assist both structural analysts, who design and optimize structures composed of composite materials, and materials scientists who design and optimize the composite materials themselves. NASA Glenn Research Center's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package (http://www.grc.nasa.gov/WWW/LPB/mac) addresses this need for composite design and life prediction tools by providing a widely applicable and accurate approach to modeling composite materials. Furthermore, MAC/GMC serves as a platform for incorporating new local models and capabilities that are under development at NASA, thus enabling these new capabilities to progress rapidly to a stage in which they can be employed by the code's end users.
Intelligence Reach for Expertise (IREx)
NASA Astrophysics Data System (ADS)
Hadley, Christina; Schoening, James R.; Schreiber, Yonatan
2015-05-01
IREx is a search engine for next-generation analysts to find collaborators. U.S. Army Field Manual 2.0 (Intelligence) calls for collaboration within and outside the area of operations, but finding the best collaborator for a given task can be challenging. IREx will be demonstrated as part of Actionable Intelligence Technology Enabled Capability Demonstration (AI-TECD) at the E15 field exercises at Ft. Dix in July 2015. It includes a Task Model for describing a task and its prerequisite competencies, plus a User Model (i.e., a user profile) for individuals to assert their capabilities and other relevant data. These models use a canonical suite of ontologies as a foundation, which enables robust queries and also keeps the models logically consistent. IREx also supports learning validation, where a learner who has completed a course module can search and find a suitable task to practice and demonstrate that their new knowledge can be used in the real world for its intended purpose. The IREx models are in the initial phase of a process to develop them as an IEEE standard. This initiative is currently an approved IEEE Study Group, after which follows a standards working group, then a balloting group, and, if all goes well, an IEEE standard.
NASA Astrophysics Data System (ADS)
Kase, Sue E.; Vanni, Michelle; Knight, Joanne A.; Su, Yu; Yan, Xifeng
2016-05-01
Within operational environments, decisions must be made quickly based on the information available. Identifying an appropriate knowledge base and accurately formulating a search query are critical tasks for decision-making effectiveness in dynamic situations. The spread of graph data management tools for accessing large graph databases is a rapidly emerging research area of potential benefit to the intelligence community. A graph representation provides a natural way of modeling data in a wide variety of domains. Graph structures use nodes, edges, and properties to represent and store data. This research investigates the advantages of information search by graph query initiated by the analyst and interactively refined within the contextual dimensions of the answer space toward a solution. The paper introduces SLQ, a user-friendly graph querying system enabling the visual formulation of schemaless and structureless graph queries. SLQ is demonstrated with an intelligence analyst information search scenario focused on identifying individuals responsible for manufacturing a mosquito-hosted deadly virus. The scenario highlights the interactive construction of graph queries without prior training in complex query languages or graph databases, intuitive navigation through the problem space, and visualization of results in graphical format.
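The underlying idea of matching a small query pattern against a larger attributed graph can be illustrated with generic subgraph matching (the data and labels are invented, and this is not the SLQ system itself):

```python
# Generic sketch of answering a small graph query against a larger attributed
# graph via subgraph matching; the data and labels are invented and this is
# not the SLQ system.
import networkx as nx
from networkx.algorithms import isomorphism

# Knowledge graph: people, facilities, and materials as typed nodes.
G = nx.Graph()
G.add_node("lab_A", kind="facility")
G.add_node("dr_x", kind="person")
G.add_node("virus_strain", kind="material")
G.add_edges_from([("dr_x", "lab_A"), ("lab_A", "virus_strain")])

# Query: a person connected to a facility that handles some material.
Q = nx.Graph()
Q.add_node("p", kind="person")
Q.add_node("f", kind="facility")
Q.add_node("m", kind="material")
Q.add_edges_from([("p", "f"), ("f", "m")])

matcher = isomorphism.GraphMatcher(
    G, Q, node_match=isomorphism.categorical_node_match("kind", None)
)
for mapping in matcher.subgraph_isomorphisms_iter():
    # mapping sends graph nodes to query nodes; invert it for readability.
    print({q: g for g, q in mapping.items()})
```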
Understanding interfirm relationships in business ecosystems with interactive visualization.
Basole, Rahul C; Clear, Trustin; Hu, Mengdie; Mehrotra, Harshit; Stasko, John
2013-12-01
Business ecosystems are characterized by large, complex, and global networks of firms, often from many different market segments, all collaborating, partnering, and competing to create and deliver new products and services. Given the rapidly increasing scale, complexity, and rate of change of business ecosystems, as well as economic and competitive pressures, analysts are faced with the formidable task of quickly understanding the fundamental characteristics of these interfirm networks. Existing tools, however, are predominantly query- or list-centric with limited interactive, exploratory capabilities. Guided by a field study of corporate analysts, we have designed and implemented dotlink360, an interactive visualization system that provides capabilities to gain systemic insight into the compositional, temporal, and connective characteristics of business ecosystems. dotlink360 consists of novel, multiple connected views enabling the analyst to explore, discover, and understand interfirm networks for a focal firm, specific market segments or countries, and the entire business ecosystem. System evaluation by a small group of prototypical users shows supporting evidence of the benefits of our approach. This design study contributes to the relatively unexplored, but promising area of exploratory information visualization in market research and business strategy.
Fan Du; Shneiderman, Ben; Plaisant, Catherine; Malik, Sana; Perer, Adam
2017-06-01
The growing volume and variety of data presents both opportunities and challenges for visual analytics. Addressing these challenges is needed for big data to provide valuable insights and novel solutions for business, security, social media, and healthcare. In the case of temporal event sequence analytics it is the number of events in the data and variety of temporal sequence patterns that challenges users of visual analytic tools. This paper describes 15 strategies for sharpening analytic focus that analysts can use to reduce the data volume and pattern variety. Four groups of strategies are proposed: (1) extraction strategies, (2) temporal folding, (3) pattern simplification strategies, and (4) iterative strategies. For each strategy, we provide examples of the use and impact of this strategy on volume and/or variety. Examples are selected from 20 case studies gathered from either our own work, the literature, or based on email interviews with individuals who conducted the analyses and developers who observed analysts using the tools. Finally, we discuss how these strategies might be combined and report on the feedback from 10 senior event sequence analysts.
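Two of the strategy groups, extraction around a sentinel event and temporal folding, can be illustrated with a toy event table (column names and data are invented; this is not the authors' tool):

```python
# Illustration of two analytic-focus strategies for event sequences:
# (1) extract only events within a window around a sentinel event, and
# (2) fold timestamps onto a coarser scale (day of week). Data are invented.
import pandas as pd

events = pd.DataFrame(
    {
        "patient": [1, 1, 1, 2, 2],
        "event": ["admit", "surgery", "discharge", "admit", "discharge"],
        "time": pd.to_datetime(
            ["2020-01-01", "2020-01-03", "2020-01-10", "2020-02-02", "2020-02-04"]
        ),
    }
)

# Strategy 1: keep only events within 7 days after each patient's admission.
admits = events[events["event"] == "admit"].set_index("patient")["time"]
window = events.join(admits.rename("admit_time"), on="patient")
focused = window[(window["time"] >= window["admit_time"])
                 & (window["time"] <= window["admit_time"] + pd.Timedelta(days=7))]

# Strategy 2: temporal folding, aggregating event counts by day of week.
folded = events.assign(dow=events["time"].dt.day_name()).groupby("dow")["event"].count()

print(focused[["patient", "event"]])
print(folded)
```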
The N-BOD2 user's and programmer's manual
NASA Technical Reports Server (NTRS)
Frisch, H. P.
1978-01-01
A general purpose digital computer program was developed and designed to aid in the analysis of spacecraft attitude dynamics. The program provides the analyst with the capability of automatically deriving and numerically solving the equations of motion of any system that can be modeled as a topological tree of coupled rigid bodies, flexible bodies, point masses, and symmetrical momentum wheels. Two modes of output are available. The composite system equations of motion may be outputted on a line printer in a symbolic form that may be easily translated into common vector-dyadic notation, or the composite system equations of motion may be solved numerically and any desirable set of system state variables outputted as a function of time.
Berenguera, Anna; Pujol-Ribera, Enriqueta; Violan, Concepció; Romaguera, Amparo; Mansilla, Rosa; Giménez, Albert; Almeda, Jesús
2011-01-01
The main aim of this study was to identify the experiences of professionals in nongovernmental organizations (NGO) in Catalonia (Spain) working in HIV/AIDS prevention and control activities and potential areas of improvement of these activities and their evaluation. A further aim was to characterize the experiences, knowledge and practices of users of these organizations with regard to HIV infection and its prevention. A phenomenological qualitative study was conducted with the participation of both professionals and users of Catalan nongovernmental organizations (NGO) working in HIV/AIDS. Theoretical sampling (professionals) and opportunistic sampling (users) were performed. To collect information, the following techniques were used: four focus groups and one triangular group (professionals), 22 semi-structured interviews, and two observations (users). A thematic interpretive content analysis was conducted by three analysts. The professionals of nongovernmental organizations working in HIV/AIDS adopted a holistic approach in their activities, maintained confidentiality, had cultural and professional competence and followed the principles of equality and empathy. The users of these organizations had knowledge of HIV/AIDS and understood the risk of infection. However, a gap was found between knowledge, attitudes and behavior. NGO offer distinct activities adapted to users' needs. Professionals emphasize the need for support and improvement of planning and implementation of current assessment. The preventive activities of these HIV/AIDS organizations are based on a participatory health education model adjusted to people's needs and focused on empowerment. Copyright © 2010 SESPAS. Published by Elsevier España. All rights reserved.
Learning patterns of life from intelligence analyst chat
NASA Astrophysics Data System (ADS)
Schneider, Michael K.; Alford, Mark; Babko-Malaya, Olga; Blasch, Erik; Chen, Lingji; Crespi, Valentino; HandUber, Jason; Haney, Phil; Nagy, Jim; Richman, Mike; Von Pless, Gregory; Zhu, Howie; Rhodes, Bradley J.
2016-05-01
Our Multi-INT Data Association Tool (MIDAT) learns patterns of life (POL) of a geographical area from video analyst observations called out in textual reporting. Typical approaches to learning POLs from video make use of computer vision algorithms to extract locations in space and time of various activities. Such approaches are subject to the detection and tracking performance of the video processing algorithms. Numerous examples of human analysts monitoring live video streams annotating or "calling out" relevant entities and activities exist, such as security analysis, crime-scene forensics, news reports, and sports commentary. This user description typically corresponds with textual capture, such as chat. Although the purpose of these text products is primarily to describe events as they happen, organizations typically archive the reports for extended periods. This archive provides a basis to build POLs. Such POLs are useful for diagnosis to assess activities in an area based on historical context, and for consumers of products, who gain an understanding of historical patterns. MIDAT combines natural language processing, multi-hypothesis tracking, and Multi-INT Activity Pattern Learning and Exploitation (MAPLE) technologies in an end-to-end lab prototype that processes textual products produced by video analysts, infers POLs, and highlights anomalies relative to those POLs with links to "tracks" of related activities performed by the same entity. MIDAT technologies perform well, achieving, for example, a 90% F1-value on extracting activities from the textual reports.
SoccerStories: a kick-off for visual soccer analysis.
Perin, Charles; Vuillemot, Romain; Fekete, Jean-Daniel
2013-12-01
This article presents SoccerStories, a visualization interface to support analysts in exploring soccer data and communicating interesting insights. Currently, most analyses on such data relate to statistics on individual players or teams. However, soccer analysts we collaborated with consider that quantitative analysis alone does not convey the right picture of the game, as context, player positions and phases of player actions are the most relevant aspects. We designed SoccerStories to support the current practice of soccer analysts and to enrich it, both in the analysis and communication stages. Our system provides an overview+detail interface of game phases, and their aggregation into a series of connected visualizations, each visualization being tailored for actions such as a series of passes or a goal attempt. To evaluate our tool, we ran two qualitative user studies on recent games using SoccerStories with data from one of the world's leading live sports data providers. The first study resulted in a series of four articles on soccer tactics, by a tactics analyst, who said he would not have been able to write these otherwise. The second study consisted of an exploratory follow-up to investigate design alternatives for embedding soccer phases into word-sized graphics. For both experiments, we received very enthusiastic feedback, and participants are considering further use of SoccerStories to enhance their current workflow.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-07
... users in the field in preventing the illegal entry of people and goods, or identifying other violations ... (e.g., Employer Identification Number (EIN) or Social Security Number (SSN), where available) ... entry of people and goods, or identifying other violations of law; (2) Allow analysts to conduct ...
NASA Astrophysics Data System (ADS)
Wollocko, Arthur; Danczyk, Jennifer; Farry, Michael; Jenkins, Michael; Voshell, Martin
2015-05-01
The proliferation of sensor technologies continues to impact Intelligence Analysis (IA) work domains. Historical procurement focus on sensor platform development and acquisition has resulted in increasingly advanced collection systems; however, such systems often demonstrate classic data overload conditions by placing increased burdens on already overtaxed human operators and analysts. Support technologies and improved interfaces have begun to emerge to ease that burden, but these often focus on single modalities or sensor platforms rather than underlying operator and analyst support needs, resulting in systems that do not adequately leverage their natural human attentional competencies, unique skills, and training. One particular reason why emerging support tools often fail is due to the gap between military applications and their functions, and the functions and capabilities afforded by cutting edge technology employed daily by modern knowledge workers who are increasingly "digitally native." With the entry of Generation Y into these workplaces, "net generation" analysts, who are familiar with socially driven platforms that excel at giving users insight into large data sets while keeping cognitive burdens at a minimum, are creating opportunities for enhanced workflows. By using these ubiquitous platforms, net generation analysts have trained skills in discovering new information socially, tracking trends among affinity groups, and disseminating information. However, these functions are currently under-supported by existing tools. In this paper, we describe how socially driven techniques can be contextualized to frame complex analytical threads throughout the IA process. This paper focuses specifically on collaborative support technology development efforts for a team of operators and analysts. Our work focuses on under-supported functions in current working environments, and identifies opportunities to improve a team's ability to discover new information and disseminate insightful analytic findings. We describe our Cognitive Systems Engineering approach to developing a novel collaborative enterprise IA system that combines modern collaboration tools with familiar contemporary social technologies. Our current findings detail specific cognitive and collaborative work support functions that defined the design requirements for a prototype analyst collaborative support environment.
The Analyst's "Use" of Theory or Theories: The Play of Theory.
Cooper, Steven H
2017-10-01
Two clinical vignettes demonstrate a methodological approach that guides the analyst's attention to metaphors and surfaces that are the focus of different theories. Clinically, the use of different theories expands the metaphorical language with which the analyst tries to make contact with the patient's unconscious life. Metaphorical expressions may be said to relate to each other as the syntax of unconscious fantasy (Arlow 1979). The unconscious fantasy itself represents a metaphorical construction of childhood experience that has persisted, dynamically expressive and emergent into adult life. This persistence is evident in how, in some instances, long periods of an analysis focus on translating one or a few metaphors, chiefly because the manifest metaphorical expressions of a central theme regularly lead to better understanding of an unconscious fantasy. At times employing another model or theory assists in a level of self-reflection about clinical understanding and clinical decisions. The analyst's choice of theory or theories is unique to the analyst and is not prescriptive, except as illustrating a way to think about these issues. The use of multiple models in no way suggests or implies that theories may be integrated.
Temporal abstraction-based clinical phenotyping with Eureka!
Post, Andrew R; Kurc, Tahsin; Willard, Richie; Rathod, Himanshu; Mansour, Michel; Pai, Akshatha Kalsanka; Torian, William M; Agravat, Sanjay; Sturm, Suzanne; Saltz, Joel H
2013-01-01
Temporal abstraction, a method for specifying and detecting temporal patterns in clinical databases, is very expressive and performs well, but it is difficult for clinical investigators and data analysts to understand. Such patterns are critical in phenotyping patients using their medical records in research and quality improvement. We have previously developed the Analytic Information Warehouse (AIW), which computes such phenotypes using temporal abstraction but requires software engineers to use it. We have extended the AIW's web user interface, Eureka! Clinical Analytics, to support specifying phenotypes using an alternative model that we developed with clinical stakeholders. The software converts phenotypes from this model to that of temporal abstraction prior to data processing. The model can represent all phenotypes in a quality improvement project and a growing set of phenotypes in a multi-site research study. Phenotyping that is accessible to investigators and IT personnel may enable its broader adoption.
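A toy example of the kind of temporal abstraction involved, deriving labeled intervals from time-stamped values, might look as follows (the threshold, column names, and merging rule are assumptions, not Eureka!'s implementation):

```python
# Toy temporal abstraction: turn time-stamped glucose values into labeled
# "hyperglycemia" intervals by thresholding and merging consecutive points.
# Threshold, gap handling, and column names are illustrative assumptions.
import pandas as pd

obs = pd.DataFrame(
    {
        "time": pd.to_datetime(
            ["2021-03-01 06:00", "2021-03-01 12:00", "2021-03-01 18:00",
             "2021-03-02 06:00", "2021-03-02 12:00"]
        ),
        "glucose": [110, 220, 240, 130, 250],
    }
).sort_values("time")

high = obs["glucose"] > 200                     # abstraction: "high glucose" state
run_id = (high != high.shift()).cumsum()        # each change of state starts a new run
intervals = (
    obs[high]
    .groupby(run_id[high])["time"]
    .agg(start="min", end="max")                # one labeled interval per run
    .reset_index(drop=True)
)
print(intervals)   # one row per hyperglycemia interval
```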
Kang, Youn-Ah; Stasko, J
2012-12-01
While the formal evaluation of systems in visual analytics is still relatively uncommon, particularly rare are case studies of prolonged system use by domain analysts working with their own data. Conducting case studies can be challenging, but it can be a particularly effective way to examine whether visual analytics systems are truly helping expert users to accomplish their goals. We studied the use of a visual analytics system for sensemaking tasks on documents by six analysts from a variety of domains. We describe their application of the system along with the benefits, issues, and problems that we uncovered. Findings from the studies identify features that visual analytics systems should emphasize as well as missing capabilities that should be addressed. These findings inform design implications for future systems.
Controlled English to facilitate human/machine analytical processing
NASA Astrophysics Data System (ADS)
Braines, Dave; Mott, David; Laws, Simon; de Mel, Geeth; Pham, Tien
2013-06-01
Controlled English is a human-readable information representation format that is implemented using a restricted subset of the English language, but which is unambiguous and directly accessible by simple machine processes. We have been researching the capabilities of CE in a number of contexts, and exploring the degree to which a flexible and more human-friendly information representation format could aid the intelligence analyst in a multi-agent collaborative operational environment; especially in cases where the agents are a mixture of other human users and machine processes aimed at assisting the human users. CE itself is built upon a formal logic basis, but allows users to easily specify models for a domain of interest in a human-friendly language. In our research we have been developing an experimental component known as the "CE Store" in which CE information can be quickly and flexibly processed and shared between human and machine agents. The CE Store environment contains a number of specialized machine agents for common processing tasks and also supports execution of logical inference rules that can be defined in the same CE language. This paper outlines the basic architecture of this approach, discusses some of the example machine agents that have been developed, and provides some typical examples of the CE language and the way in which it has been used to support complex analytical tasks on synthetic data sources. We highlight the fusion of human and machine processing supported through the use of the CE language and CE Store environment, and show this environment with examples of highly dynamic extensions to the model(s) and integration between different user-defined models in a collaborative setting.
2013-11-01
... by existing cyber-attack detection tools far exceeds the analysts’ cognitive capabilities. Grounded in perceptual and cognitive theory, many visual ... Inspired by the sense-making theory discussed earlier, we model the analytical reasoning process of cyber analysts using three key ... analyst are called "working hypotheses"); each hypothesis could trigger further actions to confirm or disconfirm it. New actions will lead to new ...
LoyalTracker: Visualizing Loyalty Dynamics in Search Engines.
Shi, Conglei; Wu, Yingcai; Liu, Shixia; Zhou, Hong; Qu, Huamin
2014-12-01
The huge amount of user log data collected by search engine providers creates new opportunities to understand user loyalty and defection behavior at an unprecedented scale. However, this also poses a great challenge to analyze the behavior and glean insights into the complex, large data. In this paper, we introduce LoyalTracker, a visual analytics system to track user loyalty and switching behavior towards multiple search engines from the vast amount of user log data. We propose a new interactive visualization technique (flow view) based on a flow metaphor, which conveys a proper visual summary of the dynamics of user loyalty of thousands of users over time. Two other visualization techniques, a density map and a word cloud, are integrated to enable analysts to gain further insights into the patterns identified by the flow view. Case studies and the interview with domain experts are conducted to demonstrate the usefulness of our technique in understanding user loyalty and switching behavior in search engines.
Design Requirements for Communication-Intensive Interactive Applications
NASA Astrophysics Data System (ADS)
Bolchini, Davide; Garzotto, Franca; Paolini, Paolo
Online interactive applications call for new requirements paradigms to capture the growing complexity of computer-mediated communication. Crafting successful interactive applications (such as websites and multimedia) involves modeling the requirements for the user experience, including those leading to content design, usable information architecture and interaction, in profound coordination with the communication goals of all stakeholders involved, ranging from persuasion to social engagement, to call for action. To face this grand challenge, we propose a methodology for modeling communication requirements and provide a set of operational conceptual tools to be used in complex projects with multiple stakeholders. Through examples from real-life projects and lessons-learned from direct experience, we draw on the concepts of brand, value, communication goals, information and persuasion requirements to systematically guide analysts to master the multifaceted connections of these elements as drivers to inform successful communication designs.
Defense Security Enterprise Architecture (DSEA) Product Reference Guide. Revision 1.0
2016-06-01
... research and development efforts and functional requirements to provide an information sharing capability across all defense security domains. The ... Office of the Secretary of Defense (OSD) Research and Development (RDT&E) initiative addressing vertical and horizontal information sharing across the ... legal responsibilities to ensure data received by analysts meets user-specified criteria. This advancement in information sharing is made ...
Using FIESTA , an R-based tool for analysts, to look at temporal trends in forest estimates
Tracey S. Frescino; Paul L. Patterson; Elizabeth A. Freeman; Gretchen G. Moisen
2012-01-01
FIESTA (Forest Inventory Estimation for Analysis) is a user-friendly R package that supports the production of estimates for forest resources based on procedures from Bechtold and Patterson (2005). The package produces output consistent with current tools available for the Forest Inventory and Analysis National Program, such as FIDO (Forest Inventory Data Online) and...
Interfaces Visualize Data for Airline Safety, Efficiency
NASA Technical Reports Server (NTRS)
2014-01-01
As the A-Train Constellation orbits Earth to gather data, NASA scientists and partners visualize, analyze, and communicate the information. To this end, Langley Research Center awarded SBIR funding to Fairfax, Virginia-based WxAnalyst Ltd. to refine the company's existing user interface for Google Earth to visualize data. Hawaiian Airlines is now using the technology to help manage its flights.
ScatterBlogs2: real-time monitoring of microblog messages through user-guided filtering.
Bosch, Harald; Thom, Dennis; Heimerl, Florian; Püttmann, Edwin; Koch, Steffen; Krüger, Robert; Wörner, Michael; Ertl, Thomas
2013-12-01
The number of microblog posts published daily has reached a level that hampers the effective retrieval of relevant messages, and the amount of information conveyed through services such as Twitter is still increasing. Analysts require new methods for monitoring their topic of interest, dealing with the data volume and its dynamic nature. It is of particular importance to provide situational awareness for decision making in time-critical tasks. Current tools for monitoring microblogs typically filter messages based on user-defined keyword queries and metadata restrictions. Used on their own, such methods can have drawbacks with respect to filter accuracy and adaptability to changes in trends and topic structure. We suggest ScatterBlogs2, a new approach to let analysts build task-tailored message filters in an interactive and visual manner based on recorded messages of well-understood previous events. These message filters include supervised classification and query creation backed by the statistical distribution of terms and their co-occurrences. The created filter methods can be orchestrated and adapted afterwards for interactive, visual real-time monitoring and analysis of microblog feeds. We demonstrate the feasibility of our approach for analyzing the Twitter stream in emergency management scenarios.
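The classifier-backed filtering idea, training on labeled messages from a well-understood previous event and then applying the model to new posts, can be sketched as follows (data and model choice are illustrative and far simpler than ScatterBlogs2):

```python
# Sketch of building a message filter from labeled messages of a past event
# and applying it to a new stream; data and model choice are illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

past_messages = [
    "power outage downtown after the storm",
    "flooding on 5th street, cars stranded",
    "great concert tonight at the arena",
    "new coffee shop opened on main street",
]
relevant = [1, 1, 0, 0]   # labels from a well-understood previous event

monitor = make_pipeline(TfidfVectorizer(), LogisticRegression())
monitor.fit(past_messages, relevant)

incoming = ["heavy rain causing street flooding near the river", "half-price pizza today"]
for msg, keep in zip(incoming, monitor.predict(incoming)):
    if keep:
        print("ALERT:", msg)
```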
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, Dustin Yewell
Echo™ is a MATLAB-based software package designed for robust and scalable analysis of complex data workflows. An alternative to tedious, error-prone conventional processes, Echo is based on three transformative principles for data analysis: self-describing data, name-based indexing, and dynamic resource allocation. The software takes an object-oriented approach to data analysis, intimately connecting measurement data with associated metadata. Echo operations in an analysis workflow automatically track and merge metadata and computation parameters to provide a complete history of the process used to generate final results, while automated figure and report generation tools eliminate the potential to mislabel those results. History reporting and visualization methods provide straightforward auditability of analysis processes. Furthermore, name-based indexing on metadata greatly improves code readability for analyst collaboration and reduces opportunities for errors to occur. Echo efficiently manages large data sets using a framework that seamlessly allocates resources such that only the necessary computations to produce a given result are executed. Echo provides a versatile and extensible framework, allowing advanced users to add their own tools and data classes tailored to their own specific needs. Applying these transformative principles and powerful features, Echo greatly improves analyst efficiency and quality of results in many application areas.
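The self-describing data and history-tracking principles can be conveyed with a tiny wrapper (a generic Python sketch; Echo itself is a MATLAB package, so these class and method names are purely illustrative):

```python
# Generic sketch of "self-describing data" with automatic history tracking;
# Echo itself is a MATLAB package, so these class and method names are
# illustrative only.
import datetime


class SelfDescribingSeries:
    def __init__(self, values, metadata):
        self.values = list(values)
        self.metadata = dict(metadata)     # units, sensor name, test id, ...
        self.history = []                  # record of processing steps applied

    def apply(self, name, func, **params):
        """Apply a computation and record what was done, with its parameters."""
        self.values = [func(v, **params) for v in self.values]
        self.history.append(
            {"step": name, "params": params, "at": datetime.datetime.now().isoformat()}
        )
        return self


signal = SelfDescribingSeries([1.0, 2.0, 4.0], {"units": "g", "channel": "accel_x"})
signal.apply("scale", lambda v, factor: v * factor, factor=9.81)

print(signal.values)
print(signal.metadata)
for step in signal.history:      # auditable processing history
    print(step["step"], step["params"])
```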
Azcorra, A; Chiroque, L F; Cuevas, R; Fernández Anta, A; Laniado, H; Lillo, R E; Romo, J; Sguera, C
2018-05-03
Billions of users interact intensively every day via Online Social Networks (OSNs) such as Facebook, Twitter, or Google+. This makes OSNs an invaluable source of information, and a channel of actuation, for sectors like advertising, marketing, or politics. To get the most out of OSNs, analysts need to identify influential users that can be leveraged for promoting products, distributing messages, or improving the image of companies. In this report we propose a new unsupervised method, Massive Unsupervised Outlier Detection (MUOD), based on outlier detection, for providing support in the identification of influential users. MUOD is scalable, and can hence be used in large OSNs. Moreover, it labels the outliers as of shape, magnitude, or amplitude, depending on their features. This allows classifying the outlier users in multiple different classes, which are likely to include different types of influential users. Applying MUOD to a subset of roughly 400 million Google+ users, it has allowed identifying and discriminating automatically sets of outlier users, which present features associated with different definitions of influential users, like capacity to attract engagement, capacity to attract a large number of followers, or high infection capacity.
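A rough, simplified illustration of shape, amplitude, and magnitude indices for user activity curves is given below, computed against the sample mean curve; the actual MUOD algorithm compares each curve with all other curves and applies its own cutoff rules, so this sketch only conveys the flavor of the three indices:

```python
# Simplified illustration of shape / amplitude / magnitude indices for user
# activity curves, computed against the sample mean curve. The real MUOD
# method compares each curve with all others and uses its own cutoffs; this
# sketch only conveys the flavor of the three indices.
import numpy as np

rng = np.random.default_rng(1)
n_users, n_days = 200, 60
days = np.arange(n_days)
base = 5 + 0.1 * days                              # shared growth trend in daily activity
curves = rng.poisson(base, size=(n_users, n_days)).astype(float)
curves[0] *= 10                                    # amplitude outlier (scaled activity)
curves[1] = curves[1][::-1].copy()                 # shape outlier (reversed trend)
curves[2] += 50                                    # magnitude outlier (constant offset)

reference = curves.mean(axis=0)
shape, amplitude, magnitude = [], [], []
for c in curves:
    corr = np.corrcoef(c, reference)[0, 1]
    slope, intercept = np.polyfit(reference, c, 1)
    shape.append(1 - corr)             # weak correlation with the reference => shape outlier
    amplitude.append(abs(slope - 1))   # slope far from 1 => amplitude outlier
    magnitude.append(abs(intercept))   # large offset => magnitude outlier

for name, idx in (("shape", shape), ("amplitude", amplitude), ("magnitude", magnitude)):
    print(name, "outlier candidates:", np.argsort(idx)[-3:][::-1])
```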
Petroleum Market Model of the National Energy Modeling System. Part 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The purpose of this report is to define the objectives of the Petroleum Market Model (PMM), describe its basic approach, and provide detail on how it works. This report is intended as a reference document for model analysts, users, and the public. The PMM models petroleum refining activities, the marketing of petroleum products to consumption regions, the production of natural gas liquids in gas processing plants, and domestic methanol production. The PMM projects petroleum product prices and sources of supply for meeting petroleum product demand. The sources of supply include crude oil, both domestic and imported; other inputs including alcohols and ethers; natural gas plant liquids production; petroleum product imports; and refinery processing gain. In addition, the PMM estimates domestic refinery capacity expansion and fuel consumption. Product prices are estimated at the Census division level and much of the refining activity information is at the Petroleum Administration for Defense (PAD) District level.
Using a Model of Analysts' Judgments to Augment an Item Calibration Process
ERIC Educational Resources Information Center
Hauser, Carl; Thum, Yeow Meng; He, Wei; Ma, Lingling
2015-01-01
When conducting item reviews, analysts evaluate an array of statistical and graphical information to assess the fit of a field test (FT) item to an item response theory model. The process can be tedious, particularly when the number of human reviews (HR) to be completed is large. Furthermore, such a process leads to decisions that are susceptible…
Learning to merge: a new tool for interactive mapping
NASA Astrophysics Data System (ADS)
Porter, Reid B.; Lundquist, Sheng; Ruggiero, Christy
2013-05-01
The task of turning raw imagery into semantically meaningful maps and overlays is a key area of remote sensing activity. Image analysts, in applications ranging from environmental monitoring to intelligence, use imagery to generate and update maps of terrain, vegetation, road networks, buildings and other relevant features. Often these tasks can be cast as a pixel labeling problem, and several interactive pixel labeling tools have been developed. These tools exploit training data, which is generated by analysts using simple and intuitive paint-program annotation tools, in order to tailor the labeling algorithm for the particular dataset and task. In other cases, the task is best cast as a pixel segmentation problem. Interactive pixel segmentation tools have also been developed, but these tools typically do not learn from training data like the pixel labeling tools do. In this paper we investigate tools for interactive pixel segmentation that also learn from user input. The input has the form of segment merging (or grouping). Merging examples are 1) easily obtained from analysts using vector annotation tools, and 2) more challenging to exploit than traditional labels. We outline the key issues in developing these interactive merging tools, and describe their application to remote sensing.
Elements of analytic style: Bion's clinical seminars.
Ogden, Thomas H
2007-10-01
The author finds that the idea of analytic style better describes significant aspects of the way he practices psychoanalysis than does the notion of analytic technique. The latter is comprised to a large extent of principles of practice developed by previous generations of analysts. By contrast, the concept of analytic style, though it presupposes the analyst's thorough knowledge of analytic theory and technique, emphasizes (1) the analyst's use of his unique personality as reflected in his individual ways of thinking, listening, and speaking, his own particular use of metaphor, humor, irony, and so on; (2) the analyst's drawing on his personal experience, for example, as an analyst, an analysand, a parent, a child, a spouse, a teacher, and a student; (3) the analyst's capacity to think in a way that draws on, but is independent of, the ideas of his colleagues, his teachers, his analyst, and his analytic ancestors; and (4) the responsibility of the analyst to invent psychoanalysis freshly for each patient. Close readings of three of Bion's 'Clinical seminars' are presented in order to articulate some of the elements of Bion's analytic style. Bion's style is not presented as a model for others to emulate or, worse yet, imitate; rather, it is described in an effort to help the reader consider from a different vantage point (provided by the concept of analytic style) the way in which he, the reader, practices psychoanalysis.
Training for spacecraft technical analysts
NASA Technical Reports Server (NTRS)
Ayres, Thomas J.; Bryant, Larry
1989-01-01
Deep space missions such as Voyager rely upon a large team of expert analysts who monitor activity in the various engineering subsystems of the spacecraft and plan operations. Senior team members generally come from the spacecraft designers, and new analysts receive on-the-job training. Neither of these methods will suffice for the creation of a new team in the middle of a mission, which may be the situation during the Magellan mission. New approaches are recommended, including electronic documentation, explicit cognitive modeling, and coached practice with archived data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, Dustin Yewell
This document is a white paper marketing proposal for Echo™, a data analysis platform designed for efficient, robust, and scalable creation and execution of complex workflows. Echo’s analysis management system refers to the ability to track, understand, and reproduce workflows used for arriving at results and decisions. Echo improves on traditional scripted data analysis in MATLAB, Python, R, and other languages to allow analysts to make better use of their time. Additionally, the Echo platform provides a powerful data management and curation solution allowing analysts to quickly find, access, and consume datasets. After two years of development and a first release in early 2016, Echo is now available for use with many data types in a wide range of application domains. Echo provides tools that allow users to focus on data analysis and decisions with confidence that results are reported accurately.
Sharing adverse drug event data using business intelligence technology.
Horvath, Monica M; Cozart, Heidi; Ahmad, Asif; Langman, Matthew K; Ferranti, Jeffrey
2009-03-01
Duke University Health System uses computerized adverse drug event surveillance as an integral part of medication safety at 2 community hospitals and an academic medical center. This information must be swiftly communicated to organizational patient safety stakeholders to find opportunities to improve patient care; however, this process is encumbered by highly manual methods of preparing the data. Following the examples of other industries, we deployed a business intelligence tool to provide dynamic safety reports on adverse drug events. Once data were migrated into the health system data warehouse, we developed census-adjusted reports with user-driven prompts. Drill down functionality enables navigation from aggregate trends to event details by clicking report graphics. Reports can be accessed by patient safety leadership either through an existing safety reporting portal or the health system performance improvement Web site. Elaborate prompt screens allow many varieties of reports to be created quickly by patient safety personnel without consultation with the research analyst. The reduction in research analyst workload because of business intelligence implementation made this individual available to additional patient safety projects thereby leveraging their talents more effectively. Dedicated liaisons are essential to ensure clear communication between clinical and technical staff throughout the development life cycle. Design and development of the business intelligence model for adverse drug event data must reflect the eccentricities of the operational system, especially as new areas of emphasis evolve. Future usability studies examining the data presentation and access model are needed.
A Conceptual Modeling Approach for OLAP Personalization
NASA Astrophysics Data System (ADS)
Garrigós, Irene; Pardillo, Jesús; Mazón, Jose-Norberto; Trujillo, Juan
Data warehouses rely on multidimensional models in order to provide decision makers with appropriate structures to intuitively analyze data with OLAP technologies. However, data warehouses may be potentially large and multidimensional structures become increasingly complex to be understood at a glance. Even if a departmental data warehouse (also known as data mart) is used, these structures would be also too complex. As a consequence, acquiring the required information is more costly than expected and decision makers using OLAP tools may get frustrated. In this context, current approaches for data warehouse design are focused on deriving a unique OLAP schema for all analysts from their previously stated information requirements, which is not enough to lighten the complexity of the decision making process. To overcome this drawback, we argue for personalizing multidimensional models for OLAP technologies according to the continuously changing user characteristics, context, requirements and behaviour. In this paper, we present a novel approach to personalizing OLAP systems at the conceptual level based on the underlying multidimensional model of the data warehouse, a user model and a set of personalization rules. The great advantage of our approach is that a personalized OLAP schema is provided for each decision maker contributing to better satisfy their specific analysis needs. Finally, we show the applicability of our approach through a sample scenario based on our CASE tool for data warehouse development.
Operational Reconnaissance for the Anti-Access /Area Denial environment
2015-04-01
... locations, the Air Force Distributed Common Ground System (DCGS) collects, processes, analyzes, and disseminates over 1.3 million megabits of ... DCGS; satellite data link between the aircraft and ground-based receiver; and fiber-optic connection between the receiver, RPA crew, and DCGS. This ... analysts and end users. DCGS Integration: The Air Force global ISR enterprise is not configured to efficiently receive, exploit, or disseminate fighter ...
Purpose-Driven Communities in Multiplex Networks: Thresholding User-Engaged Layer Aggregation
2016-06-01
... dark networks is a non-trivial yet useful task. Because terrorists work hard to hide their relationships/network, analysts have an incomplete picture ... them identify meaningful terrorist communities. This thesis introduces a general-purpose algorithm for community detection in multiplex dark networks ... Keywords: aggregation, dark networks, conductance, cluster adequacy, modularity, Louvain method, shortest path interdiction.
NASA Technical Reports Server (NTRS)
Carpenter, James R.; Berry, Kevin; Gregpru. Late; Speckman, Keith; Hur-Diaz, Sun; Surka, Derek; Gaylor, Dave
2010-01-01
The Orbit Determination Toolbox is an orbit determination (OD) analysis tool based on MATLAB and Java that provides a flexible way to do early mission analysis. The toolbox is primarily intended for advanced mission analysis such as might be performed in concept exploration, proposal, early design phase, or rapid design center environments. The emphasis is on flexibility, but it has enough fidelity to produce credible results. Insight into all flight dynamics source code is provided. MATLAB is the primary user interface and is used for piecing together measurement and dynamic models. The Java Astrodynamics Toolbox is used as an engine for things that might be slow or inefficient in MATLAB, such as high-fidelity trajectory propagation, lunar and planetary ephemeris look-ups, precession, nutation, polar motion calculations, ephemeris file parsing, and the like. The primary analysis functions are sequential filter/smoother and batch least-squares commands that incorporate Monte-Carlo data simulation, linear covariance analysis, measurement processing, and plotting capabilities at the generic level. These functions have a user interface that is based on that of the MATLAB ODE suite. To perform a specific analysis, users write MATLAB functions that implement truth and design system models. The user provides his or her models as inputs to the filter commands. The software provides a capability to publish and subscribe to a software bus that is compliant with the NASA Goddard Mission Services Evolution Center (GMSEC) standards, to exchange data with other flight dynamics tools to simplify the flight dynamics design cycle. Using the publish and subscribe approach allows for analysts in a rapid design center environment to seamlessly incorporate changes in spacecraft and mission design into navigation analysis and vice versa.
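The abstract states that users supply their own truth and design models to the sequential filter commands. The sketch below is a minimal illustration in Python (ODTBX itself is MATLAB/Java, and none of these function names come from the toolbox): a discrete sequential filter that accepts user-supplied dynamics and measurement functions.

# Hedged sketch, not ODTBX code: a discrete sequential filter whose dynamics and
# measurement models are supplied by the user, mirroring the toolbox's design.
import numpy as np

def sequential_filter(x0, P0, dynfun, datfun, measurements, Q, R):
    """dynfun(x) -> (x_next, F); datfun(x) -> (y_pred, H). Both user-supplied."""
    x, P = x0.copy(), P0.copy()
    history = []
    for y in measurements:
        # Propagate state and covariance with the user's dynamics model.
        x, F = dynfun(x)
        P = F @ P @ F.T + Q
        # Update with the user's measurement model.
        y_pred, H = datfun(x)
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (y - y_pred)
        P = (np.eye(len(x)) - K @ H) @ P
        history.append(x.copy())
    return np.array(history)

# Example: a 1-D constant-velocity model with position measurements.
def dyn(x): F = np.array([[1.0, 1.0], [0.0, 1.0]]); return F @ x, F
def dat(x): H = np.array([[1.0, 0.0]]); return H @ x, H

est = sequential_filter(np.zeros(2), np.eye(2), dyn, dat,
                        [np.array([1.0]), np.array([2.1]), np.array([2.9])],
                        Q=0.01 * np.eye(2), R=0.1 * np.eye(1))
print(est[-1])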
Enforced Sparse Non-Negative Matrix Factorization
2016-01-23
documents to find interesting pieces of information. With limited resources, analysts often employ automated text-mining tools that highlight common... represented as an undirected bipartite graph. It has become a common method for generating topic models of text data because it is known to produce good results... model and the convergence rate of the underlying algorithm. I. Introduction A common analyst challenge is searching through large quantities of text
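The fragment above refers to topic modeling of text via non-negative matrix factorization. The following sketch shows a generic multiplicative-update NMF with an L1 penalty that encourages sparsity; it is only illustrative of this family of methods and is not the specific enforced-sparse algorithm of the report.

# Generic sketch: L1-penalized NMF on a term-document matrix via multiplicative updates.
import numpy as np

def sparse_nmf(V, k, l1=0.1, iters=200, eps=1e-9):
    m, n = V.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + l1 + eps)   # L1 term pushes H toward sparsity
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H  # W: term-topic loadings, H: topic-document weights

V = np.abs(np.random.default_rng(1).random((50, 20)))   # toy term-document matrix
W, H = sparse_nmf(V, k=5)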
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.
2005-08-01
The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
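As a worked illustration of the trade-offs the abstract describes (once-through time, backward-iteration probability, rework fraction), the toy calculation below uses my own simplifying assumption of a geometric number of repeat passes; it is not the DART community model.

# Toy model (my assumptions, not DART's): one process step with once-through time t,
# probability p of a backward iteration after the step, and rework fraction r
# (cost of a repeat pass relative to the first pass).
def expected_step_time(t, p, r):
    # Geometric number of repeat passes: E[repeats] = p / (1 - p)
    return t * (1.0 + r * p / (1.0 - p))

base        = expected_step_time(t=10.0, p=0.4, r=0.6)   # 14.0
faster_once = expected_step_time(t=8.0,  p=0.4, r=0.6)   # 11.2
fewer_loops = expected_step_time(t=10.0, p=0.3, r=0.6)   # ~12.57
less_rework = expected_step_time(t=10.0, p=0.4, r=0.4)   # ~12.67
print(base, faster_once, fewer_loops, less_rework)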
NASA Astrophysics Data System (ADS)
Budde, M. E.; Rowland, J.; Anthony, M.; Palka, S.; Martinez, J.; Hussain, R.
2017-12-01
The U.S. Geological Survey (USGS) supports the use of Earth observation data for food security monitoring through its role as an implementing partner of the Famine Early Warning Systems Network (FEWS NET). The USGS Earth Resources Observation and Science (EROS) Center has developed tools designed to aid food security analysts in developing assumptions of agro-climatological outcomes. There are four primary steps to developing agro-climatology assumptions: 1) understanding the climatology, 2) evaluating current climate modes, 3) interpretation of forecast information, and 4) incorporation of monitoring data. Analysts routinely forecast outcomes well in advance of the growing season, which relies on knowledge of climatology. A few months prior to the growing season, analysts can assess large-scale climate modes that might influence seasonal outcomes. Within two months of the growing season, analysts can evaluate seasonal forecast information as indicators. Once the growing season begins, monitoring data, based on remote sensing and field information, can characterize the start of season and remain integral monitoring tools throughout the duration of the season. Each subsequent step in the process can lead to modifications of the original climatology assumption. To support such analyses, we have created an agro-climatology analysis tool that characterizes each step in the assumption building process. Satellite-based rainfall and normalized difference vegetation index (NDVI)-based products support both the climatology and monitoring steps, sea-surface temperature data and knowledge of the global climate system inform the climate modes, and precipitation forecasts at multiple scales support the interpretation of forecast information. Organizing these data for a user-specified area provides a valuable tool for food security analysts to better formulate agro-climatology assumptions that feed into food security assessments. We have also developed a knowledge base for over 80 countries that provides rainfall and NDVI-based products, including annual and seasonal summaries, historical anomalies, coefficient of variation, and number of years below 70% of annual or seasonal averages. These products provide a quick look for analysts to assess the agro-climatology of a country.
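The sketch below illustrates, with made-up numbers, the kinds of seasonal summaries listed above (historical anomaly, coefficient of variation, and count of years below 70% of the average); it is not the USGS tool itself.

# Illustrative only: fabricated seasonal rainfall totals (mm) for a user-specified area.
import numpy as np

seasonal_rain = np.array([412., 388., 501., 275., 430., 360., 298., 455., 390., 310.])
current_season = 290.0

mean = seasonal_rain.mean()
anomaly_pct = 100.0 * (current_season - mean) / mean          # departure from the historical mean
cv = 100.0 * seasonal_rain.std(ddof=1) / mean                 # coefficient of variation
years_below_70 = int((seasonal_rain < 0.7 * mean).sum())      # years below 70% of average

print(f"mean={mean:.0f} mm, anomaly={anomaly_pct:+.1f}%, CV={cv:.1f}%, "
      f"years below 70% of average: {years_below_70}")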
EIA model documentation: Petroleum market model of the national energy modeling system
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-12-28
The purpose of this report is to define the objectives of the Petroleum Market Model (PMM), describe its basic approach, and provide detail on how it works. This report is intended as a reference document for model analysts, users, and the public. Documentation of the model is in accordance with EIA's legal obligation to provide adequate documentation in support of its models. The PMM models petroleum refining activities, the marketing of petroleum products to consumption regions, the production of natural gas liquids in gas processing plants, and domestic methanol production. The PMM projects petroleum product prices and sources of supply for meeting petroleum product demand. The sources of supply include crude oil, both domestic and imported; other inputs including alcohols and ethers; natural gas plant liquids production; petroleum product imports; and refinery processing gain. In addition, the PMM estimates domestic refinery capacity expansion and fuel consumption. Product prices are estimated at the Census division level and much of the refining activity information is at the Petroleum Administration for Defense (PAD) District level.
Payload crew training scheduler (PACTS) user's manual
NASA Technical Reports Server (NTRS)
Shipman, D. L.
1980-01-01
The operation of the payload specialist training scheduler (PACTS) is discussed in this user's manual which is used to schedule payload specialists for mission training on the Spacelab experiments. The PACTS program is a fully automated interactive, computerized scheduling program equipped with tutorial displays. The tutorial displays are sufficiently detailed for use by a program analyst having no computer experience. The PACTS program is designed to operate on the UNIVAC 1108 computer system, and has the capability to load output into a PDP 11/45 Interactive Graphics Display System for printing schedules. The program has the capacity to handle up to three overlapping Spacelab missions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
CHRISTON, MARK A.
2003-06-01
GILA is a finite element code that has been developed specifically to attack the class of transient, incompressible, viscous, fluid dynamics problems that are predominant in the world that surrounds us. The purpose for this document is to provide sufficient information for an experienced analyst to use GILA in an effective way. The GILA User's Manual presents a technical outline of the governing equations for time-dependent incompressible flow, and the explicit and semi-implicit projection methods used in GILA to solve the equations. This manual also presents a brief overview of some of GILA's capabilities along with the keyword input syntax and sample problems.
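For context, a generic first-order projection step for time-dependent incompressible flow is sketched below in LaTeX; GILA's explicit and semi-implicit variants differ in detail, so this is only an outline of the method family, not GILA's formulation.

% Generic projection step for incompressible flow (illustrative sketch only).
\begin{align*}
  &\text{Momentum predictor:} &&
  \frac{\tilde{\mathbf{u}} - \mathbf{u}^{n}}{\Delta t}
    = -(\mathbf{u}^{n}\!\cdot\!\nabla)\mathbf{u}^{n}
      + \nu \nabla^{2}\mathbf{u}^{n} + \mathbf{f}^{n},\\
  &\text{Pressure Poisson:} &&
  \nabla^{2} p^{n+1} = \frac{\rho}{\Delta t}\,\nabla\!\cdot\!\tilde{\mathbf{u}},\\
  &\text{Projection:} &&
  \mathbf{u}^{n+1} = \tilde{\mathbf{u}} - \frac{\Delta t}{\rho}\,\nabla p^{n+1},
  \qquad \nabla\!\cdot\!\mathbf{u}^{n+1} = 0 .
\end{align*}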
A Graphical User-Interface for Propulsion System Analysis
NASA Technical Reports Server (NTRS)
Curlett, Brian P.; Ryall, Kathleen
1992-01-01
NASA LeRC uses a series of computer codes to calculate installed propulsion system performance and weight. The need to evaluate more advanced engine concepts with a greater degree of accuracy has resulted in an increase in complexity of this analysis system. Therefore, a graphical user interface was developed to allow the analyst to more quickly and easily apply these codes. The development of this interface and the rationale for the approach taken are described. The interface consists of a method of pictorially representing and editing the propulsion system configuration, forms for entering numerical data, on-line help and documentation, post processing of data, and a menu system to control execution.
Advancing Climate Change and Impacts Science Through Climate Informatics
NASA Astrophysics Data System (ADS)
Lenhardt, W.; Pouchard, L. C.; King, A. W.; Branstetter, M. L.; Kao, S.; Wang, D.
2010-12-01
This poster will outline the work to date on developing a climate informatics capability at Oak Ridge National Laboratory (ORNL). The central proposition of this effort is that the application of informatics and information science to the domain of climate change science is an essential means to bridge the realm of high performance computing (HPC) and domain science. The goal is to facilitate knowledge capture and the creation of new scientific insights. For example, a climate informatics capability will help with the understanding and use of model results in domain sciences that were not originally in the scope. From there, HPC can also benefit from feedback as the new approaches may lead to better parameterization in the models. In this poster we will summarize the challenges associated with climate change science that can benefit from the systematic application of informatics and we will highlight our work to date in creating the climate informatics capability to address these types of challenges. We have identified three areas that are particularly challenging in the context of climate change science: 1) integrating model and observational data across different spatial and temporal scales, 2) model linkages, i.e. climate models linked to other models such as hydrologic models, and 3) model diagnostics. Each of these has a methodological component and an informatics component. Our project under way at ORNL seeks to develop new approaches and tools in the context of linking climate change and water issues. We are basing our work on the following four use cases: 1) Evaluation/test of CCSM4 biases in hydrology (precipitation, soil water, runoff, river discharge) over the Rio Grande Basin. User: climate modeler. 2) Investigation of projected changes in hydrology of Rio Grande Basin using the VIC (Variable Infiltration Capacity Macroscale) Hydrologic Model. User: watershed hydrologist/modeler. 3) Impact of climate change on agricultural productivity of the Rio Grande Basin. User: climate impact scientist, agricultural economist. 4) Renegotiation of the 1944 “Treaty for the Utilization of Waters of the Colorado and Tijuana Rivers and of the Rio Grande”. User: A US State Department analyst or their counterpart in Mexico.
An interactive program for computer-aided map design, display, and query: EMAPKGS2
Pouch, G.W.
1997-01-01
EMAPKGS2 is a user-friendly, PC-based electronic mapping tool for use in hydrogeologic exploration and appraisal. EMAPKGS2 allows the analyst to construct maps interactively from data stored in a relational database, perform point-oriented spatial queries such as locating all wells within a specified radius, perform geographic overlays, and export the data to other programs for further analysis. EMAPKGS2 runs under Microsoft Windows 3.1 and compatible operating systems. EMAPKGS2 is a public domain program available from the Kansas Geological Survey. EMAPKGS2 is the centerpiece of WHEAT, the Windows-based Hydrogeologic Exploration and Appraisal Toolkit, a suite of user-friendly Microsoft Windows programs for natural resource exploration and management. The principal goals in development of WHEAT have been ease of use, hardware independence, low cost, and end-user extensibility. WHEAT's native data format is a Microsoft Access database. WHEAT stores a feature's geographic coordinates as attributes so they can be accessed easily by the user. The WHEAT programs are designed to be used in conjunction with other Microsoft Windows software to allow the natural resource scientist to perform work easily and effectively. WHEAT and EMAPKGS have been used at several of Kansas' Groundwater Management Districts and the Kansas Geological Survey on groundwater management operations, groundwater modeling projects, and geologic exploration projects. © 1997 Elsevier Science Ltd.
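A point-oriented radius query of the kind mentioned above can be sketched as follows; this is purely illustrative (EMAPKGS2/WHEAT use a Microsoft Access database, while sqlite3 stands in here, and the table and column names are assumptions).

# Illustrative sketch: "all wells within a specified radius" against a relational table.
import sqlite3, math

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE wells (well_id TEXT, x_coord REAL, y_coord REAL)")
conn.executemany("INSERT INTO wells VALUES (?, ?, ?)",
                 [("KS-001", 10.0, 12.0), ("KS-002", 45.0, 80.0), ("KS-003", 13.0, 9.0)])

def wells_within_radius(x0, y0, radius):
    # Point-oriented spatial query: brute-force distance test over all wells.
    rows = conn.execute("SELECT well_id, x_coord, y_coord FROM wells").fetchall()
    return [wid for wid, x, y in rows if math.hypot(x - x0, y - y0) <= radius]

print(wells_within_radius(12.0, 10.0, 5.0))   # -> ['KS-001', 'KS-003']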
Questionnaires for eliciting evaluation data from users of interactive question answering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelly, Diane; Kantor, Paul B.; Morse, Emile
Evaluating interactive question answering (QA) systems with real users can be challenging because traditional evaluation measures based on the relevance of items returned are difficult to employ since relevance judgments can be unstable in multi-user evaluations. The work reported in this paper evaluates, in distinguishing among a set of interactive QA systems, the effectiveness of three questionnaires: a Cognitive Workload Questionnaire (NASA TLX), and Task and System Questionnaires customized to a specific interactive QA application. These Questionnaires were evaluated with four systems, seven analysts, and eight scenarios during a 2-week workshop. Overall, results demonstrate that all three Questionnaires are effective at distinguishing among systems, with the Task Questionnaire being the most sensitive. Results also provide initial support for the validity and reliability of the Questionnaires.
NASA Technical Reports Server (NTRS)
Cooke, C. H.
1975-01-01
STICAP (Stiff Circuit Analysis Program) is a FORTRAN 4 computer program written for the CDC-6400-6600 computer series and SCOPE 3.0 operating system. It provides the circuit analyst a tool for automatically computing the transient responses and frequency responses of large linear time invariant networks, both stiff and nonstiff (algorithms and numerical integration techniques are described). The circuit description and user's program input language is engineer-oriented, making simple the task of using the program. Engineering theories underlying STICAP are examined. A user's manual is included which explains user interaction with the program and gives results of typical circuit design applications. Also, the program structure from a systems programmer's viewpoint is depicted and flow charts and other software documentation are given.
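For reference, the standard state-space description of a linear time-invariant network and its transient and frequency responses are given below in LaTeX; the abstract does not state STICAP's internal formulation, so this is only the textbook form (stiffness corresponds to widely separated eigenvalues of A).

% Textbook LTI network relations (illustrative; not STICAP's internal formulation).
\begin{align*}
  \dot{\mathbf{x}}(t) &= A\,\mathbf{x}(t) + B\,\mathbf{u}(t), &
  \mathbf{y}(t) &= C\,\mathbf{x}(t) + D\,\mathbf{u}(t),\\
  \mathbf{x}(t) &= e^{At}\mathbf{x}(0) + \int_{0}^{t} e^{A(t-\tau)}B\,\mathbf{u}(\tau)\,d\tau
  &&\text{(transient response)},\\
  H(j\omega) &= C\,(j\omega I - A)^{-1}B + D
  &&\text{(frequency response)}.
\end{align*}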
2016-01-07
news. Both of these resemble typical activities of intelligence analysts in OSINT processing and production applications. We assessed two task...intelligence analysts in a number of OSINT processing and production applications. (5) Summary of the most important results In both settings
ASPECTS: an automation-assisted SPE method development system.
Li, Ming; Chou, Judy; King, Kristopher W; Yang, Liyu
2013-07-01
A typical conventional SPE method development (MD) process usually involves deciding the chemistry of the sorbent and eluent based on information about the analyte; experimentally preparing and trying out various combinations of adsorption chemistry and elution conditions; quantitatively evaluating the various conditions; and comparing quantitative results from all combinations of conditions to select the best condition for method qualification. The second and fourth steps have mostly been performed manually until now. We developed an automation-assisted system that expedites the conventional SPE MD process by automating 99% of the second step, and expedites the fourth step by automatically processing the results data and presenting it to the analyst in a user-friendly format. The automation-assisted SPE MD system greatly reduces the manual labor in SPE MD work, prevents analyst errors from causing misinterpretation of quantitative results, and shortens data analysis and interpretation time.
Toward a Sustained, Multi-disciplinary Socioeconomic Community
NASA Astrophysics Data System (ADS)
Pearlman, J.; Pearlman, F.
2014-12-01
Over the last several years the availability of geospatial data has evolved from a scarce and expensive resource, primarily provided by governmental organizations, to an abundant resource, often sourced at no or minimal charge by a much broader community including citizen scientists. In an upcoming workshop (October 28/29, 2014), the consequences of the changing technology, data, and policy landscape will be examined, evaluating the emerging data-driven paradigms and advancing the state-of-the-art methodologies to measure the resulting socioeconomic impacts. Providers and users of geospatial data span a broad range of multi-disciplinary areas, including policy makers and analysts, financial analysts, economists, geospatial practitioners and other experts from government, academia and the private sector. This presentation will focus on the emerging plan for a sustained, multi-disciplinary community to identify and pursue exemplary use cases for further research and applications. Considerations will include the necessary outreach enablers for such a project.
User guidelines and best practices for CASL VUQ analysis using Dakota.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Swiler, Laura Painton; Hooper, Russell
2014-03-01
Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility. This manual offers Consortium for Advanced Simulation of Light Water Reactors (CASL) partners a guide to conducting Dakota-based VUQ studies for CASL problems. It motivates various classes of Dakota methods and includes examples of their use on representative application problems. On reading, a CASL analyst should understand why and how to apply Dakota to a simulation problem. This SAND report constitutes the product of CASL milestone L3:VUQ.V&V.P8.01 and is also being released as a CASL unlimited release report with number CASL-U-2014-0038-000.
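The following Python fragment is a minimal stand-in for the kind of ensemble study Dakota manages (sample uncertain inputs, run an analysis driver per sample, summarize the response); it is not Dakota input syntax, and the simulation function is a placeholder.

# Not Dakota syntax: a minimal ensemble study sketch under assumed inputs and bounds.
import numpy as np
from scipy.stats import qmc

def simulation(x1, x2):            # placeholder for an analysis driver
    return x1 ** 2 + 3.0 * x2

sampler = qmc.LatinHypercube(d=2, seed=42)
samples = qmc.scale(sampler.random(n=100), l_bounds=[0.0, 0.0], u_bounds=[1.0, 2.0])
responses = np.array([simulation(*s) for s in samples])
print(f"mean={responses.mean():.3f}, std={responses.std(ddof=1):.3f}, "
      f"P95={np.percentile(responses, 95):.3f}")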
Supporting Handoff in Asynchronous Collaborative Sensemaking Using Knowledge-Transfer Graphs.
Zhao, Jian; Glueck, Michael; Isenberg, Petra; Chevalier, Fanny; Khan, Azam
2018-01-01
During asynchronous collaborative analysis, handoff of partial findings is challenging because externalizations produced by analysts may not adequately communicate their investigative process. To address this challenge, we developed techniques to automatically capture and help encode tacit aspects of the investigative process based on an analyst's interactions, and streamline explicit authoring of handoff annotations. We designed our techniques to mediate awareness of analysis coverage, support explicit communication of progress and uncertainty with annotation, and implicit communication through playback of investigation histories. To evaluate our techniques, we developed an interactive visual analysis system, KTGraph, that supports an asynchronous investigative document analysis task. We conducted a two-phase user study to characterize a set of handoff strategies and to compare investigative performance with and without our techniques. The results suggest that our techniques promote the use of more effective handoff strategies, help increase an awareness of prior investigative process and insights, as well as improve final investigative outcomes.
Advancing the Implementation of Hydrologic Models as Web-based Applications
NASA Astrophysics Data System (ADS)
Dahal, P.; Tarboton, D. G.; Castronova, A. M.
2017-12-01
Advanced computer simulations are required to understand hydrologic phenomena such as rainfall-runoff response, groundwater hydrology, snow hydrology, etc. Building a hydrologic model instance to simulate a watershed requires investment in data (diverse geospatial datasets such as terrain and soil) and computer resources, typically demands a wide skill set from the analyst, and involves a workflow that is often difficult to reproduce. This work introduces a web-based prototype infrastructure in the form of a web application that provides researchers with easy-to-use access to complete hydrological modeling functionality. This includes creating the necessary geospatial and forcing data, preparing input files for a model by applying complex data preprocessing, running the model for a user-defined watershed, and saving the results to a web repository. The open source Tethys Platform was used to develop the web app front-end Graphical User Interface (GUI). We used HydroDS, a web service that provides data preparation and processing capabilities to support backend computations used by the app. Results are saved in HydroShare, a hydrologic information system that supports the sharing of hydrologic data, models and analysis tools. The TOPographic Kinematic APproximation and Integration (TOPKAPI) model served as the example for which we developed a complete hydrologic modeling service to demonstrate the approach. The final product is a complete modeling system accessible through the web to create input files and run the TOPKAPI hydrologic model for a watershed of interest. We are investigating similar functionality for the preparation of input to the Regional Hydro-Ecological Simulation System (RHESSys). Key Words: hydrologic modeling, web services, hydrologic information system, HydroShare, HydroDS, Tethys Platform
NASA Technical Reports Server (NTRS)
Butler, T. G.
1985-01-01
Some of the problems that confront an analyst in free-body modeling to satisfy rigid-body conditions are discussed, and some remedies for these problems are presented. The problems of detecting these culprits at various levels within the analysis are examined. A new method within NASTRAN for checking the model for defects very early in the analysis, without requiring the analyst to bear the expense of an eigenvalue analysis before discovering these defects, is outlined.
NASA Technical Reports Server (NTRS)
Conway, R.; Matuck, G. N.; Roe, J. M.; Taylor, J.; Turner, A.
1975-01-01
A vortex information display system is described which provides flexible control through system-user interaction for collecting wing-tip-trailing vortex data, processing this data in real time, displaying the processed data, storing raw data on magnetic tape, and post processing raw data. The data is received from two asynchronous laser Doppler velocimeters (LDV's) and includes position, velocity, and intensity information. The raw data is written onto magnetic tape for permanent storage and is also processed in real time to locate vortices and plot their positions as a function of time. The interactive capability enables the user to make real time adjustments in processing data and provides a better definition of vortex behavior. Displaying the vortex information in real time produces a feedback capability to the LDV system operator allowing adjustments to be made in the collection of raw data. Both raw data and processing can be continually upgraded during flyby testing to improve vortex behavior studies. The post-analysis capability permits the analyst to perform in-depth studies of test data and to modify vortex behavior models to improve transport predictions.
Creative user-centered visualization design for energy analysts and modelers.
Goodwin, Sarah; Dykes, Jason; Jones, Sara; Dillingham, Iain; Dove, Graham; Duffy, Alison; Kachkaev, Alexander; Slingsby, Aidan; Wood, Jo
2013-12-01
We enhance a user-centered design process with techniques that deliberately promote creativity to identify opportunities for the visualization of data generated by a major energy supplier. Visualization prototypes developed in this way prove effective in a situation whereby data sets are largely unknown and requirements open - enabling successful exploration of possibilities for visualization in Smart Home data analysis. The process gives rise to novel designs and design metaphors including data sculpting. It suggests: that the deliberate use of creativity techniques with data stakeholders is likely to contribute to successful, novel and effective solutions; that being explicit about creativity may contribute to designers developing creative solutions; that using creativity techniques early in the design process may result in a creative approach persisting throughout the process. The work constitutes the first systematic visualization design for a data rich source that will be increasingly important to energy suppliers and consumers as Smart Meter technology is widely deployed. It is novel in explicitly employing creativity techniques at the requirements stage of visualization design and development, paving the way for further use and study of creativity methods in visualization design.
KAM (Knowledge Acquisition Module): A tool to simplify the knowledge acquisition process
NASA Technical Reports Server (NTRS)
Gettig, Gary A.
1988-01-01
Analysts, knowledge engineers and information specialists are faced with increasing volumes of time-sensitive data in text form, either as free text or highly structured text records. Rapid access to the relevant data in these sources is essential. However, due to the volume and organization of the contents, and limitations of human memory and association, frequently: (1) important information is not located in time; (2) reams of irrelevant data are searched; and (3) interesting or critical associations are missed due to physical or temporal gaps involved in working with large files. The Knowledge Acquisition Module (KAM) is a microcomputer-based expert system designed to assist knowledge engineers, analysts, and other specialists in extracting useful knowledge from large volumes of digitized text and text-based files. KAM formulates non-explicit, ambiguous, or vague relations, rules, and facts into a manageable and consistent formal code. A library of system rules or heuristics is maintained to control the extraction of rules, relations, assertions, and other patterns from the text. These heuristics can be added, deleted or customized by the user. The user can further control the extraction process with optional topic specifications. This allows the user to cluster extracts based on specific topics. Because KAM formalizes diverse knowledge, it can be used by a variety of expert systems and automated reasoning applications. KAM can also perform important roles in computer-assisted training and skill development. Current research efforts include the applicability of neural networks to aid in the extraction process and the conversion of these extracts into standard formats.
NASA Technical Reports Server (NTRS)
Zanley, Nancy L.
1991-01-01
The NASA Science Internet (NSI) Network Operations Staff is responsible for providing reliable communication connectivity for the NASA science community. As the NSI user community expands, so does the demand for greater interoperability with users and resources on other networks (e.g., NSFnet, ESnet), both nationally and internationally. Coupled with the science community's demand for greater access to other resources is the demand for more reliable communication connectivity. Recognizing this, the NASA Science Internet Project Office (NSIPO) expands its Operations activities. By January 1990, Network Operations was equipped with a telephone hotline, and its staff was expanded to six Network Operations Analysts. These six analysts provide 24-hour-a-day, 7-day-a-week coverage to assist site managers with problem determination and resolution. The NSI Operations staff monitors network circuits and their associated routers. In most instances, NSI Operations diagnoses and reports problems before users realize a problem exists. Monitoring of the NSI TCP/IP Network is currently being done with Proteon's Overview monitoring system. The Overview monitoring system displays a map of the NSI network utilizing various colors to indicate the conditions of the components being monitored. Each node or site is polled via the Simple Network Monitoring Protocol (SNMP). If a circuit goes down, Overview alerts the Network Operations staff with an audible alarm and changes the color of the component. When an alert is received, Network Operations personnel immediately verify and diagnose the problem, coordinate repair with other networking service groups, track problems, and document problem and resolution into a trouble ticket data base. NSI Operations offers the NSI science community reliable connectivity by exercising prompt assessment and resolution of network problems.
The Pentagon's Military Analyst Program
ERIC Educational Resources Information Center
Valeri, Andy
2014-01-01
This article provides an investigatory overview of the Pentagon's military analyst program, what it is, how it was implemented, and how it constitutes a form of propaganda. A technical analysis of the program is applied using the theoretical framework of the propaganda model first developed by Noam Chomsky and Edward S. Herman. Definitions…
Evaluation of SNS Beamline Shielding Configurations using MCNPX Accelerated by ADVANTG
DOE Office of Scientific and Technical Information (OSTI.GOV)
Risner, Joel M; Johnson, Seth R.; Remec, Igor
2015-01-01
Shielding analyses for the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory pose significant computational challenges, including highly anisotropic high-energy sources, a combination of deep penetration shielding and an unshielded beamline, and a desire to obtain well-converged nearly global solutions for mapping of predicted radiation fields. The majority of these analyses have been performed using MCNPX with manually generated variance reduction parameters (source biasing and cell-based splitting and Russian roulette) that were largely based on the analyst's insight into the problem specifics. Development of the variance reduction parameters required extensive analyst time, and was often tailored to specific portions of the model phase space. We previously applied a developmental version of the ADVANTG code to an SNS beamline study to perform a hybrid deterministic/Monte Carlo analysis and showed that we could obtain nearly global Monte Carlo solutions with essentially uniform relative errors for mesh tallies that cover extensive portions of the model with typical voxel spacing of a few centimeters. The use of weight window maps and consistent biased sources produced using the FW-CADIS methodology in ADVANTG allowed us to obtain these solutions using substantially less computer time than the previous cell-based splitting approach. While those results were promising, the process of using the developmental version of ADVANTG was somewhat laborious, requiring user-developed Python scripts to drive much of the analysis sequence. In addition, limitations imposed by the size of weight-window files in MCNPX necessitated the use of relatively coarse spatial and energy discretization for the deterministic Denovo calculations that we used to generate the variance reduction parameters. We recently applied the production version of ADVANTG to this beamline analysis, which substantially streamlined the analysis process. We also tested importance function collapsing (in space and energy) capabilities in ADVANTG. These changes, along with the support for parallel Denovo calculations using the current version of ADVANTG, give us the capability to improve the fidelity of the deterministic portion of the hybrid analysis sequence, obtain improved weight-window maps, and reduce both the analyst and computational time required for the analysis process.
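For readers unfamiliar with the method named above, the standard CADIS relations from the published literature are summarized below in LaTeX (they are not quoted from this abstract); FW-CADIS additionally builds the adjoint source from the inverse of an estimated forward flux so that relative errors are evened out across a global mesh tally.

% Standard CADIS relations (for context only); phi^dagger is the adjoint flux
% associated with the response R of interest.
\begin{align*}
  R &= \int\!\!\int q(\mathbf{r},E)\,\phi^{\dagger}(\mathbf{r},E)\,dE\,d\mathbf{r},\\
  \hat{q}(\mathbf{r},E) &= \frac{q(\mathbf{r},E)\,\phi^{\dagger}(\mathbf{r},E)}{R}
  &&\text{(consistent biased source)},\\
  \bar{w}(\mathbf{r},E) &= \frac{R}{\phi^{\dagger}(\mathbf{r},E)}
  &&\text{(weight-window target weights)}.
\end{align*}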
Concept Development and Experimentation Policy and Process: How Analysis Provides Rigour
2010-04-01
modelling and simulation techniques, but in reality the main tool in use is common sense and logic. The main goal of the OA analyst is to bring forward those... doing so she should distinguish between the ideal and the intended or desired models to approach the reality as much as possible. Subsequently, the... and collection of measurements to be conducted. In doing so the analyst must ensure to distinguish between the actual and the perceived reality. From
Venture Evaluation and Review Technique (VERT). Users’/Analysts’ Manual
1979-10-01
real world. Additionally, activity processing times could be entered as a normal, uniform or triangular distribution. Activity times can also be... work or tasks, or if the unit activities are such abstractions of the real world that the estimation of the time, cost and performance parameters for... utilized in that constraining capacity. 7444 The network being processed has passed all the previous error checks. It currently has a real time
Specialized Binary Analysis for Vetting Android APPS Using GUI Logic
2016-04-01
the use of high-level reasoning based on the GUI design logic of an app to enable a security analyst to diagnose and triage the potentially sensitive... execution paths of an app. Levels of Inconsistency We have identified three levels of logical inconsistencies: Event-level inconsistency A sensitive... operation (e.g., taking a picture) is not triggered by user action on a GUI component. Layout-level inconsistency A sensitive operation is triggered by
Evaluation of the Presentation of Network Data via Visualization Tools for Network Analysts
2014-03-01
A. (eds.) The Human Computer Interaction Handbook, pp. 544–582. Lawrence Erlbaum Associates, Mahwah, NJ, 2003. 4. Goodall, John R. Introduction to... of either display type being used in the analysis of cyber security tasks. Goodall (19) is one of few whose work focused on comparing user... relating source IP address to destination IP address and time, Goodall remains the only known approach comparing tabular and graphical displays
Computerized Design Synthesis (CDS), A database-driven multidisciplinary design tool
NASA Technical Reports Server (NTRS)
Anderson, D. M.; Bolukbasi, A. O.
1989-01-01
The Computerized Design Synthesis (CDS) system under development at McDonnell Douglas Helicopter Company (MDHC) is targeted to make revolutionary improvements in both response time and resource efficiency in the conceptual and preliminary design of rotorcraft systems. It makes the accumulated design database and supporting technology analysis results readily available to designers and analysts of technology, systems, and production, and makes powerful design synthesis software available in a user friendly format.
Can Social Networks Assist Analysts Fight Terrorism?
2011-06-01
protecting America from terrorism in a 2007 article. He proposed the intelligence community ought to build a social networking database to track... filming the event, mostly with mobile phones (Shirky 2009). BBC and the U.S. Geological Survey agencies learned of the event from Twitter minutes... North America, Facebook has over 500,000 unique users visit its site every month (eBizMBA 2011). Third only to QQ, China’s top social network, and Skype
End-User Evaluations of Semantic Web Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCool, Rob; Cowell, Andrew J.; Thurman, David A.
Stanford University's Knowledge Systems Laboratory (KSL) is working in partnership with Battelle Memorial Institute and IBM Watson Research Center to develop a suite of technologies for information extraction, knowledge representation & reasoning, and human-information interaction, collectively entitled 'Knowledge Associates for Novel Intelligence' (KANI). We have developed an integrated analytic environment composed of a collection of analyst associates, software components that aid the user at different stages of the information analysis process. An important part of our participatory design process has been to ensure our technologies and designs are tightly integrated with the needs and requirements of our end users. To this end, we perform a sequence of evaluations towards the end of the development process that ensure the technologies are both functional and usable. This paper reports on that process.
Development of a software tool to support chemical and biological terrorism intelligence analysis
NASA Astrophysics Data System (ADS)
Hunt, Allen R.; Foreman, William
1997-01-01
AKELA has developed a software tool which uses a systems analytic approach to model the critical processes which support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these four model components.
Trade Space Specification Tool (TSST) for Rapid Mission Architecture (Version 1.2)
NASA Technical Reports Server (NTRS)
Wang, Yeou-Fang; Schrock, Mitchell; Borden, Chester S.; Moeller, Robert C.
2013-01-01
Trade Space Specification Tool (TSST) is designed to capture quickly ideas in the early spacecraft and mission architecture design and categorize them into trade space dimensions and options for later analysis. It is implemented as an Eclipse RCP Application, which can be run as a standalone program. Users rapidly create concept items with single clicks on a graphical canvas, and can organize and create linkages between the ideas using drag-and-drop actions within the same graphical view. Various views such as a trade view, rules view, and architecture view are provided to help users to visualize the trade space. This software can identify, explore, and assess aspects of the mission trade space, as well as capture and organize linkages/dependencies between trade space components. The tool supports a user-in-the-loop preliminary logical examination and filtering of trade space options to help identify which paths in the trade space are feasible (and preferred) and what analyses need to be done later with executable models. This tool provides multiple user views of the trade space to guide the analyst/team to facilitate interpretation and communication of the trade space components and linkages, identify gaps in combining and selecting trade space options, and guide user decision-making for which combinations of architectural options should be pursued for further evaluation. This software provides an environment to capture mission trade space elements rapidly and assist users for their architecture analysis. This is primarily focused on mission and spacecraft architecture design, rather than general-purpose design application. In addition, it provides more flexibility to create concepts and organize the ideas. The software is developed as an Eclipse plug-in and potentially can be integrated with other Eclipse-based tools.
Burner liner thermal-structural load modeling
NASA Technical Reports Server (NTRS)
Maffeo, R.
1986-01-01
The software package Transfer Analysis Code to Interface Thermal/Structural Problems (TRANCITS) was developed. The TRANCITS code is used to interface temperature data between thermal and structural analytical models. The use of this transfer module allows the heat transfer analyst to select the thermal mesh density and thermal analysis code best suited to solve the thermal problem and gives the same freedoms to the stress analyst, without the efficiency penalties associated with common meshes and the accuracy penalties associated with the manual transfer of thermal data.
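A conceptual sketch of the temperature hand-off such an interface performs is shown below; it is not TRANCITS code, simply an interpolation of nodal temperatures from one mesh onto another using made-up node locations.

# Conceptual sketch only (not TRANCITS): map nodal temperatures from a thermal
# mesh onto the nodes of a differently meshed structural model by interpolation.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(1)
thermal_nodes = rng.random((500, 2))                  # (x, y) of thermal mesh nodes
thermal_T = 300.0 + 50.0 * thermal_nodes[:, 0]        # temperatures on the thermal mesh

structural_nodes = rng.random((200, 2))               # (x, y) of structural mesh nodes
T_struct = griddata(thermal_nodes, thermal_T, structural_nodes,
                    method="linear", fill_value=300.0)
print(T_struct[:5])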
Netzel, Pawel
2017-01-01
The United States is increasingly becoming a multi-racial society. To understand the multiple consequences of this overall trend for our neighborhoods, we need a methodology capable of spatio-temporal analysis of racial diversity at the local level but also across the entire U.S. Furthermore, such methodology should be accessible to stakeholders ranging from analysts to decision makers. In this paper we present a comprehensive framework for visualizing and analyzing diversity data that fulfills such requirements. The first component of our framework is a U.S.-wide, multi-year database of race sub-population grids which is freely available for download. These 30 m resolution grids have been developed using dasymetric modeling and are available for 1990, 2000, and 2010. We summarize numerous advantages of gridded population data over commonly used Census tract-aggregated data. Using these grids frees analysts from constructing their own and allows them to focus on diversity analysis. The second component of our framework is a set of U.S.-wide, multi-year diversity maps at 30 m resolution. A diversity map is our product that classifies the gridded population into 39 communities based on their degrees of diversity, dominant race, and population density. It provides spatial information on diversity in a single, easy-to-understand map that can be utilized by analysts and end users alike. Maps based on subsequent Censuses provide information about spatio-temporal dynamics of diversity. Diversity maps are accessible through the GeoWeb application SocScape (http://sil.uc.edu/webapps/socscape_usa/) for an immediate online exploration. The third component of our framework is a proposal to quantitatively analyze diversity maps using a set of landscape metrics. Because of its form, a grid-based diversity map could be thought of as a diversity “landscape” and analyzed quantitatively using landscape metrics. We give a brief summary of most pertinent metrics and demonstrate how they can be applied to diversity maps. PMID:28358862
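The 39-class diversity map itself is not reproduced here, but the short sketch below (with fabricated counts) computes a common per-cell diversity measure, Shannon entropy, from gridded sub-population counts to illustrate the kind of quantity such maps encode.

# Illustrative only: per-cell Shannon entropy from made-up gridded sub-population counts.
import numpy as np

counts = np.array([            # rows: grid cells, cols: race sub-populations
    [120,  10,   5,   2],
    [ 40,  45,  38,  30],
    [  0, 200,   3,   1],
], dtype=float)

p = counts / counts.sum(axis=1, keepdims=True)
with np.errstate(divide="ignore", invalid="ignore"):
    entropy = -np.nansum(np.where(p > 0, p * np.log(p), 0.0), axis=1)
print(entropy)   # higher values indicate more diverse cells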
Riccardo, Flavia; Shigematsu, Mika; Chow, Catherine; McKnight, C Jason; Linge, Jens; Doherty, Brian; Dente, Maria Grazia; Declich, Silvia; Barker, Mike; Barboza, Philippe; Vaillant, Laetitia; Donachie, Alastair; Mawudeku, Abla; Blench, Michael; Arthur, Ray
2014-01-01
The Early Alerting and Reporting (EAR) project, launched in 2008, is aimed at improving global early alerting and risk assessment and evaluating the feasibility and opportunity of integrating the analysis of biological, chemical, radionuclear (CBRN), and pandemic influenza threats. At a time when no international collaborations existed in the field of event-based surveillance, EAR's innovative approach involved both epidemic intelligence experts and internet-based biosurveillance system providers in the framework of an international collaboration called the Global Health Security Initiative, which involved the ministries of health of the G7 countries and Mexico, the World Health Organization, and the European Commission. The EAR project pooled data from 7 major internet-based biosurveillance systems onto a common portal that was progressively optimized for biological threat detection under the guidance of epidemic intelligence experts from public health institutions in Canada, the European Centre for Disease Prevention and Control, France, Germany, Italy, Japan, the United Kingdom, and the United States. The group became the first end users of the EAR portal, constituting a network of analysts working with a common standard operating procedure and risk assessment tools on a rotation basis to constantly screen and assess public information on the web for events that could suggest an intentional release of biological agents. Following the first 2-year pilot phase, the EAR project was tested in its capacity to monitor biological threats, proving that its working model was feasible and demonstrating the high commitment of the countries and international institutions involved. During the testing period, analysts using the EAR platform did not miss intentional events of a biological nature and did not issue false alarms. Through the findings of this initial assessment, this article provides insights into how the field of epidemic intelligence can advance through an international network and, more specifically, how it was further developed in the EAR project.
A Systems Model for Power Technology Assessment
NASA Technical Reports Server (NTRS)
Hoffman, David J.
2002-01-01
A computer model is under continuing development at NASA Glenn Research Center that enables first-order assessments of space power technology. The model, an evolution of NASA Glenn's Array Design Assessment Model (ADAM), is an Excel workbook that consists of numerous spreadsheets containing power technology performance data and sizing algorithms. Underlying the model is a number of databases that contain default values for various power generation, energy storage and power management and distribution component parameters. These databases are actively maintained by a team of systems analysts so that they contain state-of-the-art data as well as the most recent technology performance projections. Sizing of the power subsystems can be accomplished either by using an assumed mass specific power (W/kg) or energy (Wh/kg) or by a bottom-up calculation that accounts for individual component performance and masses. The power generation, energy storage and power management and distribution subsystems are sized for given mission requirements for a baseline case and up to three alternatives. This allows four different power systems to be sized and compared using consistent assumptions and sizing algorithms. The component sizing models contained in the workbook are modular so that they can be easily maintained and updated. All significant input values have default values loaded from the databases that can be over-written by the user. The default data and sizing algorithms for each of the power subsystems are described in some detail. The user interface and workbook navigational features are also discussed. Finally, an example study case that illustrates the model's capability is presented.
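As an illustration of what a bottom-up sizing pass involves, the fragment below applies generic first-order relations for a simple low-Earth-orbit load profile; the numbers and relations are textbook-style assumptions, not the model's actual databases or algorithms.

# Generic first-order sizing relations (assumed values; not the ADAM-derived model).
P_load   = 1500.0   # W, spacecraft load (daylight and eclipse)
t_day    = 60.0     # min of sunlight per orbit
t_ecl    = 35.0     # min of eclipse per orbit
eta_chg  = 0.85     # battery charge/discharge path efficiency
dod      = 0.30     # allowed battery depth of discharge
sp_array = 80.0     # W/kg, array specific power (assumed)
se_batt  = 100.0    # Wh/kg, battery specific energy (assumed)

# Array must carry the daylight load plus recharge the eclipse energy.
P_array = P_load + (P_load * t_ecl / t_day) / eta_chg
E_batt  = (P_load * t_ecl / 60.0) / (dod * eta_chg)   # Wh of installed capacity

print(f"array power  = {P_array:.0f} W  -> mass ~ {P_array / sp_array:.1f} kg")
print(f"battery size = {E_batt:.0f} Wh -> mass ~ {E_batt / se_batt:.1f} kg")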
Interpersonal psychoanalysis' radical façade.
Hirsch, Irwin
2002-01-01
The participant-observation model initiated the relational turn, as well as the shift from modernism to postmodernism in psychoanalysis. This two-person, coparticipant conceptualization of the psychoanalytic situation moved psychoanalysis from the realm of alleged objective science toward intersubjectivity and hermeneutics. From this perspective, the analyst as subjective other is constantly engaged affectively with the patient in ways that are very often out of awareness. Analyst and patient both, for better or for worse, are believed to unwittingly influence one another. This description of the analytic dyad has led many to mistakenly conclude that interpersonal psychoanalysts advocate witting affective expressiveness, often in the form of deliberate self-disclosure of feelings, as part of a standard analytic stance. Upon closer examination, radical interventions are no more emblematic of interpersonal analysts than they are of analysts from most other traditions, though the interpersonalists have indeed expanded what had theretofore been a rather narrow repertoire of interventions.
Thermal Hardware for the Thermal Analyst
NASA Technical Reports Server (NTRS)
Steinfeld, David
2015-01-01
The presentation will be given at the 26th Annual Thermal Fluids Analysis Workshop (TFAWS 2015) hosted by the Goddard Space Flight Center (GSFC) Thermal Engineering Branch (Code 545). NCTS 21070-1. Most thermal analysts do not have a good background in the hardware that thermally controls the spacecraft they design. SINDA and Thermal Desktop models are nice, but knowing how this applies to the actual thermal hardware (heaters, thermostats, thermistors, MLI blanketing, optical coatings, etc.) is just as important. The course will delve into the thermal hardware and its application techniques on actual spacecraft. Knowledge of how thermal hardware is used and applied will make a thermal analyst a better engineer.
The XSD-Builder Specification Language—Toward a Semantic View of XML Schema Definition
NASA Astrophysics Data System (ADS)
Fong, Joseph; Cheung, San Kuen
In the present database market, the XML database model is a main structure for forthcoming database systems in the Internet environment. As a conceptual schema of an XML database, the XML model has limitations in presenting its data semantics. System analysts have no toolset for modeling and analyzing XML systems. We apply the XML Tree Model (shown in Figure 2) as a conceptual schema of an XML database to model and analyze the structure of an XML database. It is important not only for visualizing, specifying, and documenting structural models, but also for constructing executable systems. The tree model represents the inter-relationships among elements inside different logical schemas such as XML Schema Definition (XSD), DTD, Schematron, XDR, SOX, and DSD (shown in Figure 1; an explanation of the terms in the figure is given in Table 1). The XSD-Builder consists of the XML Tree Model, a source language, a translator, and XSD. The source language is called XSD-Source and mainly provides a user-friendly environment for writing an XSD. The source language is consequently translated by the XSD-Translator. The output of the XSD-Translator is an XSD, which is our target and is called the object language.
Cancer surveillance using data warehousing, data mining, and decision support systems.
Forgionne, G A; Gangopadhyay, A; Adya, M
2000-08-01
This article discusses how data warehousing, data mining, and decision support systems can reduce the national cancer burden or the oral complications of cancer therapies, especially as related to oral and pharyngeal cancers. An information system is presented that will deliver the necessary information technology to clinical, administrative, and policy researchers and analysts in an effective and efficient manner. The system will deliver the technology and knowledge that users need to readily: (1) organize relevant claims data, (2) detect cancer patterns in general and special populations, (3) formulate models that explain the patterns, and (4) evaluate the efficacy of specified treatments and interventions with the formulations. Such a system can be developed through a proven adaptive design strategy, and the implemented system can be tested on State of Maryland Medicaid data (which includes women, minorities, and children).
Intelligence Virtual Analyst Capability: Governing Concepts and Science and Technology Roadmap
2014-12-01
system’s perspective. That is to say: what is the information the user needs to achieve his tasks and objective; and what information does the system need... be able to learn from demonstration, which is to say by looking at examples of how a given task is usually performed. Learning is an important part... address, and phone number. Finally it can also include biometric and genetic information such as face attributes, fingerprints, handwriting, DNA. Time
Horn’s Curve Estimation Through Multi-Dimensional Interpolation
2013-03-01
complex nature of human behavior has not yet been broached. This is not to say analysts play favorites in reaching conclusions, only that varied...Chapter III, Section 3.7. For now, it is sufficient to say underdetermined data presents technical challenges and all such datasets will be excluded from...database lookup table and then use the method of linear interpolation to instantaneously estimate the unknown points on an as-needed basis (say from a user
Assessing the public health impacts of legalizing recreational cannabis use in the USA.
Hall, W; Weier, M
2015-06-01
A major challenge in assessing the public health impact of legalizing cannabis use in Colorado and Washington State is the absence of any experience with legal cannabis markets. The Netherlands created a de facto legalized cannabis market for recreational use, but policy analysts disagree about how it has affected rates of cannabis use. Some US states have created de facto legal supply of cannabis for medical use. So far this policy does not appear to have increased cannabis use or cannabis-related harm. Given experience with more liberal alcohol policies, the legalization of recreational cannabis use is likely to increase use among current users. It is also likely that legalization will increase the number of new users among young adults but it remains uncertain how many may be recruited, within what time frame, among which groups within the population, and how many of these new users will become regular users. © 2015 American Society for Clinical Pharmacology and Therapeutics.
Towards Evaluating and Enhancing the Reach of Online Health Forums for Smoking Cessation
Stearns, Michael; Nambiar, Siddhartha; Nikolaev, Alexander; Semenov, Alexander; McIntosh, Scott
2015-01-01
Online pro-health social networks facilitating smoking cessation through web-assisted interventions have flourished in the past decade. In order to properly evaluate and increase the impact of this form of treatment on society, one needs to understand and be able to quantify its reach, as defined within the widely-adopted RE-AIM framework. In the online communication context, user engagement is an integral component of reach. This paper quantitatively studies the effect of engagement on the users of the Alt.Support.Stop-Smoking forum that served the needs of an online smoking cessation community for more than ten years. The paper then demonstrates how online service evaluation and planning by social network analysts can be applied towards strategic interventions targeting increased user engagement in online health forums. To this end, the challenges and opportunities are identified in the development of thread recommendation systems using core-users as a strategic resource for effective and efficient spread of healthy behaviors, in particular smoking cessation. PMID:26075158
Specialty Task Force: A Strategic Component to Electronic Health Record (EHR) Optimization.
Romero, Mary Rachel; Staub, Allison
2016-01-01
The post-implementation stage comes after an electronic health record (EHR) deployment. Analysts and end users deal with the reality that some of the concepts and designs initially planned and created may not be complementary to the workflow, creating anxiety, dissatisfaction, and failure of early adoption of the system. Problems encountered during deployment are numerous and can vary from simple to complex. Redundant ticket submission creates a backlog for Information Technology personnel, resulting in delays in resolving concerns with the EHR system. The process of optimization allows for evaluation of the system and reassessment of users' needs. A solid and well-executed optimization infrastructure can help minimize unexpected end-user disruptions and help tailor the system to meet regulatory agency goals and practice standards. A well-devised plan to resolve problems during post-implementation is necessary for cost containment and to streamline communication efforts. Creating a specialty-specific collaborative task force is efficacious and expedites resolution of users' concerns through a more structured process.
Development of guidance for states transitioning to new safety analysis tools
NASA Astrophysics Data System (ADS)
Alluri, Priyanka
With about 125 people dying on US roads each day, the US Department of Transportation heightened the awareness of critical safety issues with the passage of SAFETEA-LU (Safe Accountable Flexible Efficient Transportation Equity Act: a Legacy for Users) legislation in 2005. The legislation required each of the states to develop a Strategic Highway Safety Plan (SHSP) and incorporate data-driven approaches to prioritize and evaluate program outcomes; failure to do so resulted in funding sanctions. In conjunction with the legislation, research efforts have also been progressing toward the development of new safety analysis tools such as IHSDM (Interactive Highway Safety Design Model), SafetyAnalyst, and the HSM (Highway Safety Manual). These software and analysis tools are comparatively more advanced in statistical theory and level of accuracy, and tend to be more data-intensive. A review of the 2009 five-percent reports and excerpts from the nationwide survey revealed astonishing facts about the continuing use of traditional methods, including crash frequencies and rates, for site selection and prioritization. The intense data requirements and statistical complexity of advanced safety tools are considered a hindrance to their adoption. In this context, this research aims at identifying the data requirements and data availability for SafetyAnalyst and the HSM by working with both tools. This research sets the stage for working with the Empirical Bayes approach by highlighting some of the biases and issues associated with traditional methods of selecting projects, such as greater emphasis on traffic volume and the regression-to-the-mean phenomenon. Further, the not-so-obvious issue with shorter segment lengths, which affects the results independent of the method used, is also discussed. The more reliable and statistically acceptable Empirical Bayes methodology requires safety performance functions (SPFs), regression equations predicting the relation between crashes and exposure for a subset of the roadway network. These SPFs, specific to a region and analysis period, are often unavailable. Calibration of existing default national SPFs to the state's data could be a feasible solution, but how well the state's data are represented is a legitimate question. With this background, SPFs were generated for various classifications of segments in Georgia and compared against the national default SPFs used in SafetyAnalyst calibrated to Georgia data. Delving deeper into the development of SPFs, the influence of actual and estimated traffic data on the fit of the equations is also studied, questioning the accuracy and reliability of traffic estimations. In addition to SafetyAnalyst, the HSM aims at performing quantitative safety analysis. Applying the HSM methodology to two-way two-lane rural roads, the effect of using multiple CMFs (Crash Modification Factors) is studied. Lastly, data requirements, methodology, constraints, and results are compared between SafetyAnalyst and the HSM.
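For readers unfamiliar with the Empirical Bayes adjustment referenced above, its common HSM/SafetyAnalyst form weights the SPF prediction against the observed crash count, with the weight controlled by the SPF overdispersion parameter. The sketch below uses placeholder coefficients, not the Georgia-calibrated SPFs developed in this research.

```python
import math

def spf_predicted_crashes(aadt, length_mi, years, a=-8.0, b=0.9):
    """Toy safety performance function: crashes = exp(a) * AADT^b * length * years.
    The coefficients a and b are placeholders, not calibrated values."""
    return math.exp(a) * aadt**b * length_mi * years

def empirical_bayes_expected(observed, predicted, overdispersion_k=0.8):
    """Weighted EB estimate: the weight w pulls toward the SPF prediction,
    and shrinks as the predicted count (times overdispersion) grows."""
    w = 1.0 / (1.0 + overdispersion_k * predicted)
    return w * predicted + (1.0 - w) * observed

pred = spf_predicted_crashes(aadt=8000, length_mi=1.2, years=3)
eb = empirical_bayes_expected(observed=9, predicted=pred)
print(f"SPF prediction: {pred:.2f} crashes, EB-expected: {eb:.2f} crashes")
```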
General Mission Analysis Tool (GMAT) Architectural Specification. Draft
NASA Technical Reports Server (NTRS)
Hughes, Steven P.; Conway, Darrel, J.
2007-01-01
Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low-level modeling features to large-scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem that is being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence will model the trajectories of a set of spacecraft evolving over time, calculating relevant parameters during this propagation, and maneuvering individual spacecraft to maintain a set of mission constraints as established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed in the GMAT GUI. The GMAT system was developed from the ground up to run in a platform agnostic environment. The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code using the NASA open source license.
Lamb, Berton Lee; Burkardt, Nina
2008-01-01
When Linda Pilkey-Jarvis and Orrin Pilkey state in their article, "Useless Arithmetic," that "mathematical models are simplified, generalized representations of a process or system," they probably do not mean to imply that these models are simple. Rather, the models are simpler than nature, and that is the heart of the problem with predictive models. We have had a long professional association with the developers and users of one of these simplifications of nature in the form of a mathematical model known as Physical Habitat Simulation (PHABSIM), which is part of the Instream Flow Incremental Methodology (IFIM). The IFIM is a suite of techniques, including PHABSIM, that allows the analyst to incorporate hydrology, hydraulics, habitat, water quality, stream temperature, and other variables into a tradeoff analysis that decision makers can use to design a flow regime to meet management objectives (Stalnaker et al. 1995). Although we are not the developers of the IFIM, we have worked with those who did design it, and we have tried to understand how the IFIM and PHABSIM are actually used in decision making (King, Burkardt, and Clark 2006; Lamb 1989).
Assessing performance of an Electronic Health Record (EHR) using Cognitive Task Analysis.
Saitwal, Himali; Feng, Xuan; Walji, Muhammad; Patel, Vimla; Zhang, Jiajie
2010-07-01
Many Electronic Health Record (EHR) systems fail to provide user-friendly interfaces due to the lack of systematic consideration of human-centered computing issues. Such interfaces can be improved to provide easy to use, easy to learn, and error-resistant EHR systems to the users. To evaluate the usability of an EHR system and suggest areas of improvement in the user interface, the user interface of the AHLTA (Armed Forces Health Longitudinal Technology Application) was analyzed using the Cognitive Task Analysis (CTA) method called GOMS (Goals, Operators, Methods, and Selection rules) and an associated technique called KLM (Keystroke Level Model). The GOMS method was used to evaluate the AHLTA user interface by classifying each step of a given task into Mental (Internal) or Physical (External) operators. This analysis was performed by two analysts independently, and the inter-rater reliability was computed to verify the reliability of the GOMS method. Further evaluation was performed using KLM to estimate the execution time required to perform the given task through application of its standard set of operators. The results are based on the analysis of 14 prototypical tasks performed by AHLTA users. The results show that on average a user needs to go through 106 steps to complete a task. To perform all 14 tasks, they would spend about 22 min (independent of system response time) for data entry, of which 11 min are spent on more effortful mental operators. The inter-rater reliability for all 14 tasks was 0.8 (kappa), indicating good reliability of the method. This paper empirically reveals and identifies the following findings related to the performance of AHLTA: (1) a large number of average total steps to complete common tasks, (2) high average execution time, and (3) a large percentage of mental operators. The user interface can be improved by reducing (a) the total number of steps and (b) the percentage of mental effort required for the tasks. 2010 Elsevier Ireland Ltd. All rights reserved.
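As a rough illustration of the KLM step of such an analysis, task execution time is estimated by summing standard operator durations over the encoded step sequence. The operator times below are commonly cited nominal KLM values and the task encoding is hypothetical, not an AHLTA measurement.

```python
# Nominal Keystroke-Level Model operator times in seconds (commonly cited values,
# not measured from AHLTA): K = keystroke, P = point with mouse, H = home hands,
# B = mouse button press, M = mental preparation.
KLM_SECONDS = {"K": 0.28, "P": 1.10, "H": 0.40, "B": 0.10, "M": 1.35}

def klm_execution_time(operators):
    """Sum operator times for an encoded task string such as 'MHPB'."""
    return sum(KLM_SECONDS[op] for op in operators)

# Hypothetical encoding of one data-entry step: think, home to mouse, point,
# click, home back to keyboard, type a 6-character entry.
task = "MHPB" + "H" + "K" * 6
print(f"Estimated execution time: {klm_execution_time(task):.2f} s")
```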
Microcomputer based software for biodynamic simulation
NASA Technical Reports Server (NTRS)
Rangarajan, N.; Shams, T.
1993-01-01
This paper presents a description of a microcomputer based software package, called DYNAMAN, which has been developed to allow an analyst to simulate the dynamics of a system consisting of a number of mass segments linked by joints. One primary application is in predicting the motion of a human occupant in a vehicle under the influence of a variety of external forces, especially those generated during a crash event. Extensive use of a graphical user interface has been made to aid the user in setting up the input data for the simulation and in viewing the results from the simulation. Among its many applications, it has been successfully used in the prototype design of a moving seat that aids in occupant protection during a crash, by aircraft designers in evaluating occupant injury in airplane crashes, and by users in accident reconstruction for reconstructing the motion of the occupant and correlating the impacts with observed injuries.
Visual interface for space and terrestrial analysis
NASA Technical Reports Server (NTRS)
Dombrowski, Edmund G.; Williams, Jason R.; George, Arthur A.; Heckathorn, Harry M.; Snyder, William A.
1995-01-01
The management of large geophysical and celestial data bases is now, more than ever, the most critical path to timely data analysis. With today's large volume data sets from multiple satellite missions, analysts face the task of defining useful data bases from which data and metadata (information about data) can be extracted readily in a meaningful way. Visualization, following an object-oriented design, is a fundamental method of organizing and handling data. Humans, by nature, easily accept pictorial representations of data. Therefore graphically oriented user interfaces are appealing, as long as they remain simple to produce and use. The Visual Interface for Space and Terrestrial Analysis (VISTA) system, currently under development at the Naval Research Laboratory's Backgrounds Data Center (BDC), has been designed with these goals in mind. Its graphical user interface (GUI) allows the user to perform queries, visualization, and analysis of atmospheric and celestial backgrounds data.
Wildland-urban interface maps vary with purpose and context
Stewart, S.I.; Wilmer, B.; Hammer, R.B.; Aplet, G.H.; Hawbaker, T.J.; Miller, C.; Radeloff, V.C.
2009-01-01
Maps of the wildland-urban interface (WUI) are both policy tools and powerful visual images. Although the growing number of WUI maps serve similar purposes, this article indicates that WUI maps derived from the same data sets can differ in important ways related to their original intended application. We discuss the use of ancillary data in modifying census data to improve WUI maps and offer a cautionary note about this practice. A comparison of two WUI mapping approaches suggests that no single map is "best" because users' needs vary. The analysts who create maps are responsible for ensuring that users understand their purpose, data, and methods; map users are responsible for paying attention to these features and using each map accordingly. These considerations should apply to any analysis but are especially important to analyses of the WUI on which policy decisions will be made.
Using PAFEC as a preprocessor for COSMIC/NASTRAN
NASA Technical Reports Server (NTRS)
Gray, W. H.; Baudry, T. V.
1983-01-01
Programs for Automatic Finite Element Calculations (PAFEC) is a general purpose, three dimensional linear and nonlinear finite element program (ref. 1). PAFEC's features include free format input utilizing engineering keywords, powerful mesh generating facilities, sophisticated data base management procedures, and extensive data validation checks. Presented here is a description of a software interface that permits PAFEC to be used as a preprocessor for COSMIC/NASTRAN. This user friendly software, called PAFCOS, frees the stress analyst from the laborious and error prone procedure of creating and debugging a rigid format COSMIC/NASTRAN bulk data deck. By interactively creating and debugging a finite element model with PAFEC, thus taking full advantage of the free format engineering keyword oriented data structure of PAFEC, the amount of time spent during model generation can be drastically reduced. The PAFCOS software will automatically convert a PAFEC data structure into a COSMIC/NASTRAN bulk data deck. The capabilities and limitations of the PAFCOS software are fully discussed in the following report.
Cybersim: geographic, temporal, and organizational dynamics of malware propagation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santhi, Nandakishore; Yan, Guanhua; Eidenbenz, Stephan
2010-01-01
Cyber-infractions into a nation's strategic security envelope pose a constant and daunting challenge. We present the modular CyberSim tool, which has been developed in response to the need to realistically simulate, at a national level, software vulnerabilities and the resulting malware propagation in online social networks. The CyberSim suite (a) can generate realistic scale-free networks from a database of geocoordinated computers to closely model social networks arising from personal and business email contacts and online communities; (b) maintains for each host a list of installed software, along with the latest published vulnerabilities; (d) allows designated initial nodes where malware gets introduced; (e) simulates, using distributed discrete event-driven technology, the spread of malware exploiting a specific vulnerability, with packet delay and user online behavior models; (f) provides a graphical visualization of the spread of infection, its severity, businesses affected, etc., to the analyst. We present sample simulations on a national level network with millions of computers.
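The scale-free networks in item (a) are typically produced with preferential-attachment generators. The sketch below, using the networkx library, pairs such a graph with a naive probabilistic spread loop; it only illustrates the idea and is not CyberSim's generator, vulnerability database, or discrete-event engine.

```python
# Sketch of a scale-free contact graph plus a naive SI-style malware spread.
import random
import networkx as nx

random.seed(1)
g = nx.barabasi_albert_graph(n=10_000, m=3, seed=1)   # preferential attachment

infected = {0}                                        # designated initial node
for _ in range(10):                                   # ten coarse time steps
    newly = set()
    for host in infected:
        for neighbor in g.neighbors(host):
            if neighbor not in infected and random.random() < 0.05:
                newly.add(neighbor)                   # exploit succeeds w.p. 0.05
    infected |= newly
    print(f"infected hosts: {len(infected)}")
```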
Towards health care process description framework: an XML DTD design.
Staccini, P.; Joubert, M.; Quaranta, J. F.; Aymard, S.; Fieschi, D.; Fieschi, M.
2001-01-01
The development of health care and hospital information systems has to meet users' needs as well as requirements such as the tracking of all care activities and the support of quality improvement. The use of process-oriented analysis is of value in providing analysts with: (i) a systematic description of activities; (ii) the elicitation of the useful data to perform and record care tasks; (iii) the selection of relevant decision-making support. But paper-based tools are not a very suitable way to manage and share the documentation produced during this step. The purpose of this work is to propose a method to implement the results of process analysis according to XML techniques (eXtensible Markup Language). It is based on the IDEF0 activity modeling language (Integration DEfinition for Function modeling). A hierarchical description of a process and its components has been defined through a flat XML file with a grammar of proper metadata tags. Perspectives of this method are discussed. PMID:11825265
NIRP Core Software Suite v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitener, Dustin Heath; Folz, Wesley; Vo, Duong
The NIRP Core Software Suite is a core set of code that supports multiple applications. It includes miscellaneous base code for data objects, mathematical equations, and user interface components; and the framework includes several fully-developed software applications that exist as stand-alone tools to complement other applications. The stand-alone tools are described below. Analyst Manager: An application to manage contact information for people (analysts) who use the software products. This information is often included in generated reports and may be used to identify the owners of calculations. Radionuclide Viewer: An application for viewing the DCFPAK radiological data. Complements the Mixture Manager tool. Mixture Manager: An application to create and manage radionuclide mixtures that are commonly used in other applications. High Explosive Manager: An application to manage explosives and their properties. Chart Viewer: An application to view charts of data (e.g. meteorology charts). Other applications may use this framework to create charts specific to their data needs.
An Exploratory Analysis of Economic Factors in the Navy Total Force Strength Model (NTFSM)
2015-12-01
NTFSM is still in the testing phase and its overall behavior is largely unknown. In particular, the analysts that NTFSM was designed to help are...
1978-04-15
analyst who is concerned with preparing the data base for a war game, selecting optional features of QUICK, designating control parameters, submitting...Computer System Manual CSM UM 9-77, Volume III, 15 April 1978, Command and Control Technical Center (CCTC), QUICK-Reacting...RECALC Mode...Non-RECALC Mode...Mode Selection and JCL Consideration
Rogers, Patrick; Erdal, Selnur; Santangelo, Jennifer; Liu, Jianhua; Schuster, Dara; Kamal, Jyoti
2008-11-06
The Ohio State University Medical Center (OSUMC) Information Warehouse (IW) is a comprehensive data warehousing facility incorporating operational, clinical, and biological data sets from multiple enterprise systems. It is common for users of the IW to request complex ad-hoc queries that often require significant intervention by data analysts. In response to this challenge, we have designed a workflow that leverages synthesized data elements to support such queries in a more timely, efficient manner.
Austin, Jennifer L; Marshall, Jason A
2008-01-01
The field of applied behavior analysis has suffered from a relative dearth of user-friendly books appropriate to a lay audience. Bailey and Burch's book fills this niche with a work that is both entertaining and informative. The book is reviewed in terms of the strengths and limitations of its content, as well as in the context of the importance of effective marketing of behavior analysis.
2005-03-01
minimizing fatalities, permanent injury to personnel, and undesired damage to property and the environment" (DoD, 1996). Various types of weapons are...who influence the sociopolitical environments in which these NLWs might be developed and deployed. The framework walks the analyst and decision-maker...connection occurred, ask if the surface was wet or dry and its nature (concrete, asphalt, or soil/grass). * The first time a user submits a report, ask
1988-11-01
system, using graphic techniques which enable users, analysts, and designers to get a clear and common picture of the system and how its parts fit...boxes into hierarchies suitable for computer implementation. Structured Design uses tools, especially graphic ones, to render systems readily...LSA, PROCESSES, DATA FLOWS, DATA STORES, EXTERNAL ENTITIES, OVERALL SYSTEMS DESIGN PROCESS
Signal Quality and the Reliability of Seismic Observations
NASA Astrophysics Data System (ADS)
Zeiler, C. P.; Velasco, A. A.; Pingitore, N. E.
2009-12-01
The ability to detect, time and measure seismic phases depends on the location, size, and quality of the recorded signals. Additional constraints are an analyst’s familiarity with a seismogenic zone and with the seismic stations that record the energy. Quantification and qualification of an analyst’s ability to detect, time and measure seismic signals have not been calculated or fully assessed. The fundamental measurement for computing the accuracy of a seismic measurement is the signal quality. Several methods have been proposed to measure signal quality; however, the signal-to-noise ratio (SNR) has been adopted as a short-term average over the long-term average. While the standard SNR is an easy and computationally inexpensive term, the overall statistical significance has not been computed for seismic measurement analysis. The prospect of canonizing the process of cataloging seismic arrivals hinges on the ability to repeat measurements made by different methods and analysts. The first step in canonizing phase measurements has been done by the IASPEI, which established a reference for accepted practices in naming seismic phases. The New Manual for Seismological Observatory Practices (NMSOP, 2002) outlines key observations for seismic phases recorded at different distances and proposes to quantify timing uncertainty with a user-specified windowing technique. However, this added measurement would not completely remove bias introduced by different techniques used by analysts to time seismic arrivals. The general guideline to time a seismic arrival is to record the time where a noted change in frequency and/or amplitude begins. This is generally achieved by enhancing the arrivals through filtering or beam forming. However, these enhancements can alter the characteristics of the arrival and how the arrival will be measured. Furthermore, each enhancement has user-specified parameters that can vary between analysts, and this results in reduced ability to repeat measurements between analysts. The SPEAR project (Zeiler and Velasco, 2009) has started to explore the effects of comparing measurements from the same seismograms. Initial results showed that experience and the signal quality are the leading contributors to pick differences. However, the traditional SNR method of measuring signal quality was replaced by a Wide-band Spectral Ratio (WSR) due to a decrease in scatter. This observation brings up an important question of what is the best way to measure signal quality. We compare various methods (traditional SNR, WSR, power spectral density plots, Allan Variance) that have been proposed to measure signal quality and discuss which method provides the best tool to compare arrival time uncertainty.
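The short-term-average over long-term-average form of the SNR mentioned above can be written in a few lines; the window lengths and synthetic trace in this sketch are arbitrary choices for illustration, not the parameters used in the SPEAR analysis.

```python
import numpy as np

def sta_lta(trace, sta_len, lta_len):
    """Classic STA/LTA ratio on the squared signal; window lengths in samples.
    Illustrative windows only, not the SPEAR project's parameters."""
    energy = trace.astype(float) ** 2
    sta = np.convolve(energy, np.ones(sta_len) / sta_len, mode="same")
    lta = np.convolve(energy, np.ones(lta_len) / lta_len, mode="same")
    return sta / np.maximum(lta, 1e-12)          # guard against divide-by-zero

# Synthetic example: noise with a small "arrival" halfway through.
rng = np.random.default_rng(0)
trace = rng.normal(0, 1, 4000)
trace[2000:2200] += rng.normal(0, 4, 200)
ratio = sta_lta(trace, sta_len=50, lta_len=500)
print("peak STA/LTA:", round(float(ratio.max()), 2))
```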
Foundations for context-aware information retrieval for proactive decision support
NASA Astrophysics Data System (ADS)
Mittu, Ranjeev; Lin, Jessica; Li, Qingzhe; Gao, Yifeng; Rangwala, Huzefa; Shargo, Peter; Robinson, Joshua; Rose, Carolyn; Tunison, Paul; Turek, Matt; Thomas, Stephen; Hanselman, Phil
2016-05-01
Intelligence analysts and military decision makers are faced with an onslaught of information. From the now ubiquitous presence of intelligence, surveillance, and reconnaissance (ISR) platforms providing large volumes of sensor data, to vast amounts of open source data in the form of news reports, blog postings, or social media postings, the amount of information available to a modern decision maker is staggering. Whether tasked with leading a military campaign or providing support for a humanitarian mission, being able to make sense of all the information available is a challenge. Due to the volume and velocity of this data, automated tools are required to help support reasoned, human decisions. In this paper we describe several automated techniques that are targeted at supporting decision making. Our approaches include modeling the kinematics of moving targets as motifs; developing normalcy models and detecting anomalies in kinematic data; automatically classifying the roles of users in social media; and modeling geo-spatial regions based on the behavior that takes place in them. These techniques cover a wide range of potential decision maker needs.
Multivariate Statistics Applied to Seismic Phase Picking
NASA Astrophysics Data System (ADS)
Velasco, A. A.; Zeiler, C. P.; Anderson, D.; Pingitore, N. E.
2008-12-01
The initial effort of the Seismogram Picking Error from Analyst Review (SPEAR) project has been to establish a common set of seismograms to be picked by the seismological community. Currently we have 13 analysts from 4 institutions that have provided picks on the set of 26 seismograms. In comparing the picks thus far, we have identified consistent biases between picks from different institutions; effects of the experience of analysts; and the impact of signal-to-noise on picks. The institutional bias in picks brings up the important concern that picks will not be the same between different catalogs. This difference means less precision and accuracy when combining picks from multiple institutions. We also note that depending on the experience level of the analyst making picks for a catalog, the error could fluctuate dramatically. However, the experience level is based on the number of years spent picking seismograms, and this may not be an appropriate criterion for determining an analyst's precision. The common data set of seismograms provides a means to test an analyst's level of precision and biases. The analyst is also limited by the quality of the signal, and we show that the signal-to-noise ratio and pick error are correlated with the location, size, and distance of the event. This makes the standard estimate of picking error based on SNR more complex because additional constraints are needed to accurately constrain the measurement error. We propose to extend the current measurement of error by adding the additional constraints of institutional bias and event characteristics to the standard SNR measurement. We use multivariate statistics to model the data and provide constraints to accurately assess earthquake location and measurement errors.
NASA Astrophysics Data System (ADS)
Rimland, Jeffrey; McNeese, Michael; Hall, David
2013-05-01
Although the capability of computer-based artificial intelligence techniques for decision-making and situational awareness has seen notable improvement over the last several decades, the current state-of-the-art still falls short of creating computer systems capable of autonomously making complex decisions and judgments in many domains where data is nuanced and accountability is high. However, there is a great deal of potential for hybrid systems in which software applications augment human capabilities by focusing the analyst's attention on relevant information elements based on both a priori knowledge of the analyst's goals and the processing/correlation of a series of data streams too numerous and heterogeneous for the analyst to digest without assistance. Researchers at Penn State University are exploring ways in which an information framework influenced by Klein's Recognition Primed Decision (RPD) model, Endsley's model of situational awareness, and the Joint Directors of Laboratories (JDL) data fusion process model can be implemented through a novel combination of Complex Event Processing (CEP) and Multi-Agent Software (MAS). Though originally designed for stock market and financial applications, the high performance data-driven nature of CEP techniques provides a natural complement to the proven capabilities of MAS systems for modeling naturalistic decision-making, performing process adjudication, and optimizing networked processing and cognition via the use of "mobile agents." This paper addresses the challenges and opportunities of such a framework for augmenting human observational capability as well as enabling the ability to perform collaborative context-aware reasoning in both human teams and hybrid human / software agent teams.
Analysis of the Research and Studies Program at the United States Military Academy
2004-09-01
operational assessment methodology, efficiency analysis, recruiting analysis (especially marketing effects), and capability analysis and modeling. Lieutenant...Finally, and arguably the most compelling rationale, is the market force of increased funding. Figure 3 below shows the increase in funding received by...to integrate in a team of analysts from other departments to assist in the effort. First, bringing in analysts from other departments gave those
Analyzing Human-Landscape Interactions: Tools That Integrate
NASA Astrophysics Data System (ADS)
Zvoleff, Alex; An, Li
2014-01-01
Humans have transformed much of Earth's land surface, giving rise to loss of biodiversity, climate change, and a host of other environmental issues that are affecting human and biophysical systems in unexpected ways. To confront these problems, environmental managers must consider human and landscape systems in integrated ways. This means making use of data obtained from a broad range of methods (e.g., sensors, surveys), while taking into account new findings from the social and biophysical science literatures. New integrative methods (including data fusion, simulation modeling, and participatory approaches) have emerged in recent years to address these challenges, and to allow analysts to provide information that links qualitative and quantitative elements for policymakers. This paper brings attention to these emergent tools while providing an overview of the tools currently in use for analysis of human-landscape interactions. Analysts are now faced with a staggering array of approaches in the human-landscape literature; in an attempt to bring increased clarity to the field, we identify the relative strengths of each tool, and provide guidance to analysts on the areas to which each tool is best applied. We discuss four broad categories of tools: statistical methods (including survival analysis, multi-level modeling, and Bayesian approaches), GIS and spatial analysis methods, simulation approaches (including cellular automata, agent-based modeling, and participatory modeling), and mixed-method techniques (such as alternative futures modeling and integrated assessment). For each tool, we offer an example from the literature of its application in human-landscape research. Among these tools, participatory approaches are gaining prominence for analysts to make the broadest possible array of information available to researchers, environmental managers, and policymakers. Further development of new approaches to data fusion and integration across sites or disciplines poses an important challenge for future work in integrating human and landscape components.
System-of-Systems Technology-Portfolio-Analysis Tool
NASA Technical Reports Server (NTRS)
O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne
2012-01-01
Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.
Economic Consequence Analysis of Disasters: The ECAT Software Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, Adam; Prager, Fynn; Chen, Zhenhua
This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
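The "reduced form" step amounts to regressing the CGE outputs on threat characteristics and background conditions so the fitted equation can be evaluated instantly for new scenarios. The sketch below uses synthetic stand-in variables and numpy least squares, not E-CAT's actual simulations, variable set, or VBA code.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for CGE simulation runs: each row is one simulated event.
n_runs = 200
duration = rng.uniform(1, 30, n_runs)        # threat duration (days), illustrative
severity = rng.uniform(0.1, 1.0, n_runs)     # fraction of capacity lost, illustrative
resilience = rng.uniform(0.0, 0.8, n_runs)   # behavioral/resilience adjustment
gdp_loss = (2.0 + 0.8 * duration * severity - 1.5 * resilience
            + rng.normal(0, 0.5, n_runs))    # "CGE" output, $ billions

# Fit the reduced-form equation: gdp_loss ~ b0 + b1*(duration*severity) + b2*resilience
X = np.column_stack([np.ones(n_runs), duration * severity, resilience])
coeffs, *_ = np.linalg.lstsq(X, gdp_loss, rcond=None)
print("reduced-form coefficients:", np.round(coeffs, 3))

# The fitted equation can now be evaluated instantly for a new scenario.
new_x = np.array([1.0, 10 * 0.5, 0.3])
print("rapid estimate ($B):", round(float(new_x @ coeffs), 2))
```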
Trust metrics in information fusion
NASA Astrophysics Data System (ADS)
Blasch, Erik
2014-05-01
Trust is an important concept for machine intelligence and is not consistent across many applications. In this paper, we seek to understand trust from a variety of factors: humans, sensors, communications, intelligence processing algorithms and human-machine displays of information. In modeling the various aspects of trust, we provide an example from machine intelligence that supports the various attributes of measuring trust such as sensor accuracy, communication timeliness, machine processing confidence, and display throughput to convey the various attributes that support user acceptance of machine intelligence results. The example used is fusing video and text whereby an analyst needs trust information in the identified imagery track. We use the proportional conflict redistribution rule as an information fusion technique that handles conflicting data from trusted and mistrusted sources. The discussion of the many forms of trust explored in the paper seeks to provide a systems-level design perspective for information fusion trust quantification.
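As an illustration of proportional conflict redistribution, the sketch below implements the PCR5 rule for the special case of two sources and a two-hypothesis frame; this simplification is for exposition only and is not the paper's full video/text fusion pipeline.

```python
def pcr5_two_sources(m1, m2):
    """PCR5 combination of two basic belief assignments over {'A', 'B', 'AB'},
    where 'AB' denotes the ignorance set A-or-B. Two-hypothesis sketch only."""
    # Conjunctive consensus for non-conflicting intersections.
    m = {
        "A":  m1["A"] * m2["A"] + m1["A"] * m2["AB"] + m1["AB"] * m2["A"],
        "B":  m1["B"] * m2["B"] + m1["B"] * m2["AB"] + m1["AB"] * m2["B"],
        "AB": m1["AB"] * m2["AB"],
    }
    # Redistribute each partial conflict proportionally to the masses involved.
    for x, y in (("A", "B"), ("B", "A")):
        conflict = m1[x] * m2[y]
        if conflict > 0:
            m[x] += m1[x] ** 2 * m2[y] / (m1[x] + m2[y])
            m[y] += m2[y] ** 2 * m1[x] / (m1[x] + m2[y])
    return m

# Hypothetical video vs. text source masses for a track identity A or B.
video = {"A": 0.6, "B": 0.1, "AB": 0.3}   # trusted sensor
text  = {"A": 0.2, "B": 0.6, "AB": 0.2}   # partially mistrusted report
print(pcr5_two_sources(video, text))
```

In this example the conflicting mass m1(A)*m2(B) is split between A and B in proportion to the masses that generated it, rather than being discarded or renormalized away.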
Model Performance Evaluation and Scenario Analysis ...
This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit measures that capture magnitude only, sequence only, and combined magnitude and sequence errors. The performance measures include error analysis, coefficient of determination, Nash-Sutcliffe efficiency, and a new weighted rank method. These performance metrics only provide useful information about the overall model performance. Note that MPESA is based on the separation of observed and simulated time series into magnitude and sequence components. The separation of time series into magnitude and sequence components and the reconstruction back to time series provides diagnostic insights to modelers. For example, traditional approaches lack the capability to identify if the source of uncertainty in the simulated data is due to the quality of the input data or the way the analyst adjusted the model parameters. This report presents a suite of model diagnostics that identify if mismatches between observed and simulated data result from magnitude or sequence related errors. MPESA offers graphical and statistical options that allow HSPF users to compare observed and simulated time series and identify the parameter values to adjust or the input data to modify. The scenario analysis part of the tool
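For reference, two of the goodness-of-fit measures named above have simple closed forms; the generic sketch below is not MPESA's own implementation and omits the magnitude/sequence separation.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1.0 is a perfect fit."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def coefficient_of_determination(obs, sim):
    """R^2 as the squared Pearson correlation between observed and simulated series."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.corrcoef(obs, sim)[0, 1] ** 2)

observed  = [2.1, 3.4, 5.0, 4.2, 3.3, 2.8]
simulated = [2.0, 3.1, 5.4, 4.6, 3.0, 2.9]
print("NSE :", round(nash_sutcliffe(observed, simulated), 3))
print("R^2 :", round(coefficient_of_determination(observed, simulated), 3))
```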
Analysis of Modeling Parameters on Threaded Screws.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vigil, Miquela S.; Brake, Matthew Robert; Vangoethem, Douglas
2015-06-01
Assembled mechanical systems often contain a large number of bolted connections. These bolted connections (joints) are integral aspects of the load path for structural dynamics, and, consequently, are paramount for calculating a structure's stiffness and energy dissipation properties. However, analysts have not found the optimal method to model these bolted joints appropriately. The complexity of the screw geometry causes issues when generating a mesh of the model. This paper will explore different approaches to model a screw-substrate connection. Model parameters such as mesh continuity, node alignment, wedge angles, and thread to body element size ratios are examined. The results of this study will give analysts a better understanding of the influences of these parameters and will aid in finding the optimal method to model bolted connections.
NASA Technical Reports Server (NTRS)
Myers, Jerry G.; Young, M.; Goodenow, Debra A.; Keenan, A.; Walton, M.; Boley, L.
2015-01-01
Model and simulation (MS) credibility is defined as the quality to elicit belief or trust in MS results. NASA-STD-7009 [1] delineates eight components (Verification, Validation, Input Pedigree, Results Uncertainty, Results Robustness, Use History, MS Management, People Qualifications) that address quantifying model credibility, and provides guidance to the model developers, analysts, and end users for assessing MS credibility. Of the eight characteristics, input pedigree, or the quality of the data used to develop model input parameters, governing functions, or initial conditions, can vary significantly. These data quality differences have varying consequences across the range of MS application. NASA-STD-7009 requires that the lowest input data quality be used to represent the entire set of input data when scoring the input pedigree credibility of the model. This requirement provides a conservative assessment of model inputs and maximizes the communication of the potential level of risk of using model outputs. Unfortunately, in practice, this may result in overly pessimistic communication of the MS output, undermining the credibility of simulation predictions to decision makers. This presentation proposes an alternative assessment mechanism, utilizing results parameter robustness, also known as model input sensitivity, to improve the credibility scoring process for specific simulations.
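One way to picture the proposed alternative is to weight each input's pedigree score by that input's normalized output sensitivity instead of taking the minimum over all inputs. The sketch below is only an illustration of that idea under assumed scores and sensitivities; it is not the scoring mechanism defined in NASA-STD-7009 nor necessarily the one proposed in the presentation.

```python
def min_rule_pedigree(scores):
    """Minimum rule: the whole input set is scored at its weakest member."""
    return min(scores.values())

def sensitivity_weighted_pedigree(scores, sensitivities):
    """Illustrative alternative: weight each input's score by its normalized
    output sensitivity, so low-quality but low-influence inputs penalize less."""
    total = sum(sensitivities.values())
    return sum(scores[k] * sensitivities[k] / total for k in scores)

# Hypothetical inputs: pedigree scores on a 0-4 scale and output sensitivities.
scores = {"metabolic_rate": 4, "tissue_density": 3, "activity_level": 1}
sensitivities = {"metabolic_rate": 0.7, "tissue_density": 0.25, "activity_level": 0.05}

print("minimum-rule score   :", min_rule_pedigree(scores))
print("sensitivity-weighted :", round(sensitivity_weighted_pedigree(scores, sensitivities), 2))
```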
Achieving Robustness to Uncertainty for Financial Decision-making
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnum, George M.; Van Buren, Kendra L.; Hemez, Francois M.
2014-01-10
This report investigates the concept of robustness analysis to support financial decision-making. Financial models, which forecast future stock returns or market conditions, depend on assumptions that might be unwarranted and variables that might exhibit large fluctuations from their last-known values. The analysis of robustness explores these sources of uncertainty, and recommends model settings such that the forecasts used for decision-making are as insensitive as possible to the uncertainty. A proof-of-concept is presented with the Capital Asset Pricing Model. The robustness of model predictions is assessed using info-gap decision theory. Info-gaps are models of uncertainty that express the “distance,” or gap of information, between what is known and what needs to be known in order to support the decision. The analysis yields a description of worst-case stock returns as a function of increasing gaps in our knowledge. The analyst can then decide on the best course of action by trading off worst-case performance with “risk”, which is how much uncertainty they think needs to be accommodated in the future. The report also discusses the Graphical User Interface, developed using the MATLAB® programming environment, such that the user can control the analysis through an easy-to-navigate interface. Three directions of future work are identified to enhance the present software. First, the code should be re-written using the Python scientific programming software. This change will achieve greater cross-platform compatibility, better portability, allow for a more professional appearance, and render it independent from a commercial license, which MATLAB® requires. Second, a capability should be developed to allow users to quickly implement and analyze their own models. This will facilitate application of the software to the evaluation of proprietary financial models. The third enhancement proposed is to add the ability to evaluate multiple models simultaneously. When two models reflect past data with similar accuracy, the more robust of the two is preferable for decision-making because its predictions are, by definition, less sensitive to the uncertainty.
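A minimal sketch of the worst-case-versus-uncertainty trade-off described above, using a fractional-error info-gap around the nominal market return in the CAPM; the parameter values are illustrative and this is not the report's MATLAB implementation.

```python
# Info-gap sketch: CAPM expected return r = rf + beta * (rm - rf), with the market
# return rm known only to within a fractional horizon h of its nominal value.
# Illustrative parameters only; not the report's calibrated model.

def capm_return(rf, beta, rm):
    return rf + beta * (rm - rf)

def worst_case_return(rf, beta, rm_nominal, h):
    """For beta > 0 the worst case in |rm - rm_nominal| <= h*|rm_nominal|
    is the lowest admissible market return."""
    return capm_return(rf, beta, rm_nominal - h * abs(rm_nominal))

rf, beta, rm_nominal = 0.02, 1.2, 0.08
for h in (0.0, 0.1, 0.25, 0.5, 1.0):          # increasing gaps in our knowledge
    print(f"h = {h:4.2f}  worst-case return = {worst_case_return(rf, beta, rm_nominal, h):6.3f}")
```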
Bring It to the Pitch: Combining Video and Movement Data to Enhance Team Sport Analysis.
Stein, Manuel; Janetzko, Halldor; Lamprecht, Andreas; Breitkreutz, Thorsten; Zimmermann, Philipp; Goldlucke, Bastian; Schreck, Tobias; Andrienko, Gennady; Grossniklaus, Michael; Keim, Daniel A
2018-01-01
Analysts in professional team sport regularly perform analysis to gain strategic and tactical insights into player and team behavior. Goals of team sport analysis regularly include identification of weaknesses of opposing teams, or assessing performance and improvement potential of a coached team. Current analysis workflows are typically based on the analysis of team videos. Also, analysts can rely on techniques from Information Visualization to depict, e.g., player or ball trajectories. However, video analysis is typically a time-consuming process, where the analyst needs to memorize and annotate scenes. In contrast, visualization typically relies on an abstract data model, often using abstract visual mappings, and is not directly linked to the observed movement context anymore. We propose a visual analytics system that tightly integrates team sport video recordings with abstract visualization of underlying trajectory data. We apply appropriate computer vision techniques to extract trajectory data from video input. Furthermore, we apply advanced trajectory and movement analysis techniques to derive relevant team sport analytic measures for region, event and player analysis in the case of soccer analysis. Our system seamlessly integrates video and visualization modalities, enabling analysts to draw on the advantages of both analysis forms. Several expert studies conducted with team sport analysts indicate the effectiveness of our integrated approach.
General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft
NASA Technical Reports Server (NTRS)
Dove, Edwin; Hughes, Steve
2007-01-01
The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT Development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.
NASA Technical Reports Server (NTRS)
Jones, Scott M.
2007-01-01
This document is intended as an introduction to the analysis of gas turbine engine cycles using the Numerical Propulsion System Simulation (NPSS) code. It is assumed that the analyst has a firm understanding of fluid flow, gas dynamics, thermodynamics, and turbomachinery theory. The purpose of this paper is to provide for the novice the information necessary to begin cycle analysis using NPSS. This paper and the annotated example serve as a starting point and by no means cover the entire range of information and experience necessary for engine performance simulation. NPSS syntax is presented but for a more detailed explanation of the code the user is referred to the NPSS User Guide and Reference document (ref. 1).
Collaborative damage mapping for emergency response: the role of Cognitive Systems Engineering
NASA Astrophysics Data System (ADS)
Kerle, N.; Hoffman, R. R.
2013-01-01
Remote sensing is increasingly used to assess disaster damage, traditionally by professional image analysts. A recent alternative is crowdsourcing by volunteers experienced in remote sensing, using internet-based mapping portals. We identify a range of problems in current approaches, including how volunteers can best be instructed for the task, ensuring that instructions are accurately understood and translate into valid results, or how the mapping scheme must be adapted for different map user needs. The volunteers, the mapping organizers, and the map users all perform complex cognitive tasks, yet little is known about the actual information needs of the users. We also identify problematic assumptions about the capabilities of the volunteers, principally related to the ability to perform the mapping, and to understand mapping instructions unambiguously. We propose that any robust scheme for collaborative damage mapping must rely on Cognitive Systems Engineering and its principal method, Cognitive Task Analysis (CTA), to understand the information and decision requirements of the map and image users, and how the volunteers can be optimally instructed and their mapping contributions merged into suitable map products. We recommend an iterative approach involving map users, remote sensing specialists, cognitive systems engineers and instructional designers, as well as experimental psychologists.
DIDEM - An integrated model for comparative health damage costs calculation of air pollution
NASA Astrophysics Data System (ADS)
Ravina, Marco; Panepinto, Deborah; Zanetti, Maria Chiara
2018-01-01
Air pollution represents a continuous hazard to human health. Administration, companies and population need efficient indicators of the possible effects given by a change in decision, strategy or habit. The monetary quantification of health effects of air pollution through the definition of external costs is increasingly recognized as a useful indicator to support decision and information at all levels. The development of modelling tools for the calculation of external costs can provide support to analysts in the development of consistent and comparable assessments. In this paper, the DIATI Dispersion and Externalities Model (DIDEM) is presented. The DIDEM model calculates the delta-external costs of air pollution comparing two alternative emission scenarios. This tool integrates CALPUFF's advanced dispersion modelling with the latest WHO recommendations on concentration-response functions. The model is based on the impact pathway method. It was designed to work with a fine spatial resolution and a local or national geographic scope. The modular structure allows users to input their own data sets. The DIDEM model was tested on a real case study, represented by a comparative analysis of the district heating system in Turin, Italy. Additional advantages and drawbacks of the tool are discussed in the paper. A comparison with other existing models worldwide is reported.
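The impact-pathway calculation behind such delta-external-cost figures chains a concentration change through exposed population, a concentration-response function, and a unit value. The coefficients and receptor data in the sketch below are made up for illustration and are not DIDEM's WHO-based functions or CALPUFF output.

```python
# Impact-pathway sketch for one pollutant and a handful of receptor cells.
# delta cases = delta concentration * population * concentration-response slope;
# external cost = delta cases * unit value. All numbers are illustrative.

receptors = [
    # (delta PM2.5, ug/m3)  (population)   -- per grid cell, scenario B minus A
    (-1.2, 50_000),
    (-0.4, 120_000),
    ( 0.1,  30_000),
]

CRF_SLOPE = 6e-6      # attributable cases per person per (ug/m3), illustrative
UNIT_COST = 60_000.0  # euros per case, illustrative

delta_cost = sum(dc * pop * CRF_SLOPE * UNIT_COST for dc, pop in receptors)
print(f"delta external cost: {delta_cost:,.0f} EUR (negative = avoided damages)")
```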
Model documentation: Renewable Fuels Module of the National Energy Modeling System
NASA Astrophysics Data System (ADS)
1994-04-01
This report documents the objectives, analytical approach, and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it related to the production of the 1994 Annual Energy Outlook (AEO94) forecasts. The report catalogues and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described. This documentation report serves two purposes. First, it is a reference document for model analysts, model users, and the public interested in the construction and application of the RFM. Second, it meets the legal requirement of the Energy Information Administration (EIA) to provide adequate documentation in support of its models. The RFM consists of six analytical submodules that represent each of the major renewable energy resources -- wood, municipal solid waste (MSW), solar energy, wind energy, geothermal energy, and alcohol fuels. Of these six, four are documented in the following chapters: municipal solid waste, wind, solar and biofuels. Geothermal and wood are not currently working components of NEMS. The purpose of the RFM is to define the technological and cost characteristics of renewable energy technologies, and to pass these characteristics to other NEMS modules for the determination of mid-term forecasted renewable energy demand.
User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Coleman, Kayla; Hooper, Russell W.
2016-10-04
In general, Dakota is the Consortium for Advanced Simulation of Light Water Reactors (CASL) delivery vehicle for verification, validation, and uncertainty quantification (VUQ) algorithms. It permits ready application of the VUQ methods described above to simulation codes by CASL researchers, code developers, and application engineers. More specifically, the CASL VUQ Strategy [33] prescribes the use of Predictive Capability Maturity Model (PCMM) assessments [37]. PCMM is an expert elicitation tool designed to characterize and communicate completeness of the approaches used for computational model definition, verification, validation, and uncertainty quantification associated with an intended application. Exercising a computational model with the methods in Dakota will yield, in part, evidence for a predictive capability maturity model (PCMM) assessment. Table 1.1 summarizes some key predictive maturity related activities (see details in [33]), with examples of how Dakota fits in. This manual offers CASL partners a guide to conducting Dakota-based VUQ studies for CASL problems. It motivates various classes of Dakota methods and includes examples of their use on representative application problems. On reading, a CASL analyst should understand why and how to apply Dakota to a simulation problem.
A Reduced Order Model of Force Displacement Curves for the Failure of Mechanical Bolts in Tension.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, Keegan J.; Sandia National Lab.; Brake, Matthew Robert
2015-12-01
Assembled mechanical systems often contain a large number of bolted connections. These bolted connections (joints) are integral aspects of the load path for structural dynamics, and, consequently, are paramount for calculating a structure's stiffness and energy dissipation properties. However, analysts have not found the optimal method to appropriately model these bolted joints. The complexity of the screw geometry causes issues when generating a mesh of the model. This report will explore different approaches to model a screw-substrate connection. Model parameters such as mesh continuity, node alignment, wedge angles, and thread to body element size ratios are examined. The results of this study will give analysts a better understanding of the influences of these parameters and will aid in finding the optimal method to model bolted connections.
Design Through Manufacturing: The Solid Model-Finite Element Analysis Interface
NASA Technical Reports Server (NTRS)
Rubin, Carol
2002-01-01
State-of-the-art computer aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts reflecting every detail of the finished product. Ideally, in the aerospace industry, these models should fulfill two very important functions: (1) provide numerical control information for automated manufacturing of precision parts, and (2) enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in aircraft and space vehicles. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. Presently, the process of preparing CAD models for FEA consumes a great deal of the analyst's time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolton, P.
The purpose of this task was to support ESH-3 in providing Airborne Release Fraction and Respirable Fraction training to safety analysts at LANL who perform accident analysis, hazard analysis, safety analysis, and/or risk assessments at nuclear facilities. The task included preparation of materials for and the conduct of two 3-day training courses covering the following topics: the safety analysis process; the calculation model; aerosol physics concepts for safety analysis; and an overview of empirically derived airborne release fractions and respirable fractions.
SPICE Module for the Satellite Orbit Analysis Program (SOAP)
NASA Technical Reports Server (NTRS)
Coggi, John; Carnright, Robert; Hildebrand, Claude
2008-01-01
A SPICE module for the Satellite Orbit Analysis Program (SOAP) precisely represents complex motion and maneuvers in an interactive, 3D animated environment with support for user-defined quantitative outputs. (SPICE stands for Spacecraft, Planet, Instrument, Camera-matrix, and Events). This module enables the SOAP software to exploit NASA mission ephemeris represented in the JPL Navigation and Ancillary Information Facility (NAIF) SPICE formats. Ephemeris types supported include position, velocity, and orientation for spacecraft and planetary bodies including the Sun, planets, natural satellites, comets, and asteroids. Entire missions can now be imported into SOAP for 3D visualization, playback, and analysis. The SOAP analysis and display features can now leverage detailed mission files to offer the analyst both a numerically correct and aesthetically pleasing combination of results that can be varied to study many hypothetical scenarios. The software provides a modeling and simulation environment that can encompass a broad variety of problems using orbital prediction. For example, ground coverage analysis, communications analysis, power and thermal analysis, and 3D visualization that provides the user with insight into complex geometric relations are included. The SOAP SPICE module allows distributed science and engineering teams to share common mission models of known pedigree, which greatly reduces duplication of effort and the potential for error. The use of the software spans all phases of the space system lifecycle, from the study of future concepts to operations and anomaly analysis. It allows SOAP software to correctly position and orient all of the principal bodies of the Solar System within a single simulation session along with multiple spacecraft trajectories and the orientation of mission payloads. In addition to the 3D visualization, the user can define numeric variables and x-y plots to quantitatively assess metrics of interest.
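For readers unfamiliar with the SPICE formats, the sketch below shows how the same NAIF kernels can be read in Python with the community spiceypy wrapper. This is an illustration of the data the module consumes, not SOAP code, and the kernel file names are hypothetical placeholders.

```python
# Illustrative only: reading a NAIF SPICE ephemeris with spiceypy, the kind of
# data the SOAP SPICE module consumes. Kernel file names are hypothetical.
import spiceypy as spice

spice.furnsh("naif0012.tls")      # leapseconds kernel (assumed to be present)
spice.furnsh("de430.bsp")         # planetary ephemeris kernel (assumed)

et = spice.str2et("2008-01-01T00:00:00")
# Position of Mars relative to Earth in the J2000 frame, light-time corrected.
pos_km, light_time_s = spice.spkpos("MARS", et, "J2000", "LT+S", "EARTH")
print(pos_km, light_time_s)

spice.kclear()
```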
Online Interactive Tutorials for Creating Graphs With Excel 2007 or 2010
Vanselow, Nicholas R
2012-01-01
Graphic display of clinical data is a useful tool for the behavior-analytic clinician. However, graphs can sometimes be difficult to create. We describe how to access and use an online interactive tutorial that teaches the user to create a variety of graphs often used by behavior analysts. Three tutorials are provided that cover the basics of Microsoft Excel 2007 or 2010, creating graphs for clinical purposes, and creating graphs for research purposes. The uses for this interactive tutorial and other similar programs are discussed. PMID:23326629
NASA Astrophysics Data System (ADS)
Bambace, Luís Antonio Waack; Ceballos, Décio Castilho
CDMA Mobile Satellite Systems (CDMA MSS) are capable of co-directional, co-frequency and co-coverage sharing, and they are strongly interdependent in the case of such sharing. It is also known that the success of any telecommunication project depends on using the right medium for each task. Operators have a clear view of media adequacy in traditional systems, but not necessarily in the case of Mobile Satellite Systems. This creates a risk that an operator pursuing the wrong market objective causes trouble for other systems. This paper deals with the sharing alternatives for up to four CDMA MSS operating in the same frequency band, and analyses both the satellite-to-user downlink and the user-to-satellite uplink. The influence of several items on capacity is treated here. The scope includes: downlink power flux density; code availability; single-system internal interference; inter-system interference; diversity schemes; average link impairments; margins; user cooperation; terminal specifications; and the dependence of the isolation between RHCP and LHCP on fade.
Elimination sequence optimization for SPAR
NASA Technical Reports Server (NTRS)
Hogan, Harry A.
1986-01-01
SPAR is a large-scale computer program for finite element structural analysis. The program allows user specification of the order in which the joints of a structure are to be eliminated, since this order can have significant influence on solution performance, in terms of both storage requirements and computer time. An efficient elimination sequence can improve performance by over 50% for some problems. Obtaining such sequences, however, requires the expertise of an experienced user and can take hours of tedious effort to effect. Thus, an automatic elimination sequence optimizer would enhance productivity by reducing the analysts' problem definition time and by lowering computer costs. Two possible methods for automating the elimination sequence specification were examined. Several algorithms based on graph theory representations of sparse matrices were studied, with mixed results. Significant improvement in program performance was achieved, but sequencing by an experienced user still yields substantially better results. The initial results provide encouraging evidence that the potential benefits of such an automatic sequencer would be well worth the effort.
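Joint elimination ordering is the same problem as sparse-matrix reordering. As a loose illustration of the graph-theoretic approach the abstract mentions (not SPAR's own algorithm), the sketch below uses SciPy's reverse Cuthill-McKee ordering and compares matrix bandwidth before and after reordering.

```python
# A minimal sketch of graph-based elimination ordering: reverse Cuthill-McKee
# reduces matrix bandwidth, a common proxy for sparse factorization cost.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import reverse_cuthill_mckee

# Small symmetric stiffness-like sparsity pattern (values are placeholders).
A = csr_matrix(np.array([
    [4, 0, 0, 1, 0],
    [0, 4, 1, 0, 1],
    [0, 1, 4, 0, 0],
    [1, 0, 0, 4, 1],
    [0, 1, 0, 1, 4],
], dtype=float))

def bandwidth(m: csr_matrix) -> int:
    rows, cols = m.nonzero()
    return int(np.max(np.abs(rows - cols)))

perm = reverse_cuthill_mckee(A, symmetric_mode=True)
A_reordered = A[perm][:, perm]
print("bandwidth before:", bandwidth(A), "after:", bandwidth(A_reordered))
```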
Interactive Data Exploration with Smart Drill-Down
Joglekar, Manas; Garcia-Molina, Hector; Parameswaran, Aditya
2017-01-01
We present smart drill-down, an operator for interactively exploring a relational table to discover and summarize “interesting” groups of tuples. Each group of tuples is described by a rule. For instance, the rule (a, b, ⋆, 1000) tells us that there are a thousand tuples with value a in the first column and b in the second column (and any value in the third column). Smart drill-down presents an analyst with a list of rules that together describe interesting aspects of the table. The analyst can tailor the definition of interesting, and can interactively apply smart drill-down on an existing rule to explore that part of the table. We demonstrate that the underlying optimization problems are NP-Hard, and describe an algorithm for finding the approximately optimal list of rules to display when the user uses a smart drill-down, and a dynamic sampling scheme for efficiently interacting with large tables. Finally, we perform experiments on real datasets on our experimental prototype to demonstrate the usefulness of smart drill-down and study the performance of our algorithms. PMID:28210096
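To make the rule notation concrete, the toy sketch below greedily picks wildcard rules that cover many still-uncovered rows. This is only the flavor of the idea; the paper's actual interestingness scores, optimization guarantees, and sampling scheme are more involved.

```python
# Minimal sketch of wildcard-rule selection over a tiny table (illustrative only).
from itertools import combinations

rows = [("a", "b", "x"), ("a", "b", "y"), ("a", "c", "x"), ("d", "b", "x")] * 100

def matches(rule, row):
    return all(r == "*" or r == v for r, v in zip(rule, row))

def candidate_rules(row):
    """All generalizations of a row obtained by wildcarding column subsets."""
    n = len(row)
    for k in range(n + 1):
        for cols in combinations(range(n), k):
            yield tuple("*" if i in cols else row[i] for i in range(n))

def greedy_drill_down(rows, k=3):
    candidates = {r for row in rows for r in candidate_rules(row)}
    uncovered, chosen = list(rows), []
    for _ in range(k):
        best = max(candidates, key=lambda r: sum(matches(r, row) for row in uncovered))
        chosen.append((best, sum(matches(best, row) for row in rows)))
        uncovered = [row for row in uncovered if not matches(best, row)]
    return chosen

print(greedy_drill_down(rows))
```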
Reference manual for the Thermal Analyst's Help Desk Expert System
NASA Technical Reports Server (NTRS)
Ormsby, Rachel A.
1994-01-01
This document provides technical information and programming guidance for the maintenance and future development of the Thermal Analyst's Help Desk. Help Desk is an expert system that operates within the EXSYS(TM) expert system shell and is used to determine first approximations of thermal capacity for spacecraft and instruments. The five analyses supported in Help Desk are: (1) surface area required for a radiating surface, (2) equilibrium temperature of a surface, (3) enclosure temperature and heat loads for a defined position in orbit, (4) enclosure temperature and heat loads over a complete orbit, and (5) selection of appropriate surface properties. The two geometries supported by Help Desk are a single flat plate and a rectangular box enclosure. The technical information includes the mathematical approach and analytical derivations used in the analyses, such as radiation heat balance, view factor calculation, and orbit determination with coordinate transformation. The programming guide for developers describes techniques for enhancement of Help Desk. Examples are provided showing the addition of new features, user interface development and enhancement, and external program interfaces.
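The first two analyses reduce to a radiation heat balance. The sketch below shows back-of-the-envelope versions under simple assumptions (sun normal to one face, plate radiating from both faces, radiation-only rejection to a cold sink); it is illustrative, not the Help Desk implementation.

```python
# Back-of-the-envelope versions of two Help Desk analyses (illustrative only):
# equilibrium temperature of a sunlit flat plate and required radiator area.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
SOLAR = 1361.0           # solar constant near Earth, W/m^2

def equilibrium_temp_K(absorptivity: float, emissivity: float) -> float:
    """Flat plate, sun normal to one face, radiating from both faces."""
    return (absorptivity * SOLAR / (2.0 * emissivity * SIGMA)) ** 0.25

def radiator_area_m2(heat_load_W: float, emissivity: float, temp_K: float,
                     sink_temp_K: float = 4.0) -> float:
    """Area needed to reject a heat load to a cold sink by radiation alone."""
    return heat_load_W / (emissivity * SIGMA * (temp_K**4 - sink_temp_K**4))

print(equilibrium_temp_K(absorptivity=0.25, emissivity=0.85))   # white-paint-like surface
print(radiator_area_m2(heat_load_W=500.0, emissivity=0.85, temp_K=300.0))
```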
Domino: Extracting, Comparing, and Manipulating Subsets across Multiple Tabular Datasets
Gratzl, Samuel; Gehlenborg, Nils; Lex, Alexander; Pfister, Hanspeter; Streit, Marc
2016-01-01
Answering questions about complex issues often requires analysts to take into account information contained in multiple interconnected datasets. A common strategy in analyzing and visualizing large and heterogeneous data is dividing it into meaningful subsets. Interesting subsets can then be selected and the associated data and the relationships between the subsets visualized. However, neither the extraction and manipulation nor the comparison of subsets is well supported by state-of-the-art techniques. In this paper we present Domino, a novel multiform visualization technique for effectively representing subsets and the relationships between them. By providing comprehensive tools to arrange, combine, and extract subsets, Domino allows users to create both common visualization techniques and advanced visualizations tailored to specific use cases. In addition to the novel technique, we present an implementation that enables analysts to manage the wide range of options that our approach offers. Innovative interactive features such as placeholders and live previews support rapid creation of complex analysis setups. We introduce the technique and the implementation using a simple example and demonstrate scalability and effectiveness in a use case from the field of cancer genomics. PMID:26356916
Explosion Monitoring with Machine Learning: A LSTM Approach to Seismic Event Discrimination
NASA Astrophysics Data System (ADS)
Magana-Zook, S. A.; Ruppert, S. D.
2017-12-01
The streams of seismic data that analysts look at to discriminate natural from man-made events will soon grow from gigabytes of data per day to exponentially larger rates. This is an interesting problem, as the requirement for real-time answers to questions of non-proliferation will remain the same, and the analyst pool cannot grow as fast as the data volume and velocity will. Machine learning is a tool that can solve the problem of seismic explosion monitoring at scale. Using machine learning, and Long Short-Term Memory (LSTM) models in particular, analysts can become more efficient by focusing their attention on signals of interest. From a global dataset of earthquake and explosion events, a model was trained to recognize the different classes of events, given their spectrograms. Optimal recurrent node count and training iterations were found, and cross validation was performed to evaluate model performance. A 10-fold mean accuracy of 96.92% was achieved on a balanced dataset of 30,002 instances. Given that the model is 446.52 MB, it can be used to simultaneously characterize all incoming signals by researchers looking at events in isolation on desktop machines, as well as at scale on all of the nodes of a real-time streaming platform. LLNL-ABS-735911
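The overall shape of such a classifier is an LSTM reading spectrogram frames as a sequence. The sketch below is a minimal PyTorch version under assumed dimensions (129 frequency bins, 200 time frames, two classes); it is not the LLNL architecture or its trained weights.

```python
# Minimal LSTM spectrogram classifier sketch (assumed dimensions; not the LLNL model).
import torch
import torch.nn as nn

class SpectrogramLSTM(nn.Module):
    def __init__(self, n_freq_bins=129, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_freq_bins, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)   # classes: earthquake, explosion

    def forward(self, x):                  # x: (batch, time_steps, n_freq_bins)
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1])          # logits over the two classes

model = SpectrogramLSTM()
dummy = torch.randn(8, 200, 129)           # 8 spectrograms, 200 time frames each
logits = model(dummy)
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 2, (8,)))
loss.backward()
print(logits.shape, float(loss))
```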
Directed area search using socio-biological vision algorithms and cognitive Bayesian reasoning
NASA Astrophysics Data System (ADS)
Medasani, S.; Owechko, Y.; Allen, D.; Lu, T. C.; Khosla, D.
2010-04-01
Volitional search systems that assist the analyst by searching for specific targets or objects such as vehicles, factories, airports, etc., in wide-area overhead imagery need to overcome multiple problems present in current manual and automatic approaches. These problems include finding targets hidden in terabytes of information, relatively few pixels on targets, long intervals between interesting regions, time-consuming analysis requiring many analysts, no a priori representative examples or templates of interest, detecting multiple classes of objects, and the need for very high detection rates and very low false alarm rates. This paper describes a conceptual analyst-centric framework that utilizes existing technology modules to search and locate occurrences of targets of interest (e.g., buildings, mobile targets of military significance, factories, nuclear plants, etc.) from video imagery of large areas. Our framework takes simple queries from the analyst and finds the queried targets with relatively minimal interaction from the analyst. It uses a hybrid approach that combines biologically inspired bottom-up attention, socio-biologically inspired object recognition for volitionally recognizing targets, and hierarchical Bayesian networks for modeling and representing the domain knowledge. This approach has the benefits of high accuracy and a low false alarm rate, and can handle both low-level visual information and high-level domain knowledge in a single framework. Such a system would be of immense help for search and rescue efforts, intelligence gathering, change detection systems, and other surveillance systems.
Oncology Modeling for Fun and Profit! Key Steps for Busy Analysts in Health Technology Assessment.
Beca, Jaclyn; Husereau, Don; Chan, Kelvin K W; Hawkins, Neil; Hoch, Jeffrey S
2018-01-01
In evaluating new oncology medicines, two common modeling approaches are state transition (e.g., Markov and semi-Markov) and partitioned survival. Partitioned survival models have become more prominent in oncology health technology assessment processes in recent years. Our experience in conducting and evaluating models for economic evaluation has highlighted many important and practical pitfalls. As there is little guidance available on best practices for those who wish to conduct them, we provide guidance in the form of 'Key steps for busy analysts,' who may have very little time and require highly favorable results. Our guidance highlights the continued need for rigorous conduct and transparent reporting of economic evaluations regardless of the modeling approach taken, and the importance of modeling that better reflects reality, which includes better approaches to considering plausibility, estimating relative treatment effects, dealing with post-progression effects, and appropriate characterization of the uncertainty from modeling itself.
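For readers unfamiliar with the mechanics, a partitioned survival model derives state occupancy directly from the overall survival (OS) and progression-free survival (PFS) curves rather than from transition probabilities. The sketch below uses made-up exponential curves, costs, utilities, and a 3.5% discount rate purely for illustration.

```python
# Minimal three-state partitioned survival sketch (all inputs are placeholder values).
import numpy as np

t = np.arange(0, 10.0, 1 / 12)            # monthly cycles over 10 years
pfs = np.exp(-0.40 * t)                   # progression-free survival curve
os_ = np.exp(-0.25 * t)                   # overall survival curve

progression_free = pfs
progressed = np.clip(os_ - pfs, 0.0, None)  # occupancy is the gap between curves
dead = 1.0 - os_

cycle = 1 / 12
utilities = 0.80 * progression_free + 0.60 * progressed       # QALY weights
costs = 36_000.0 * progression_free + 60_000.0 * progressed   # annualized costs

discount = 1.035 ** (-t)                                      # 3.5% annual rate
print("QALYs:", np.sum(utilities * discount) * cycle)
print("Costs:", np.sum(costs * discount) * cycle)
```

A state transition (Markov) model would instead track occupancy through explicit transition probabilities, which is one reason the two approaches can diverge after progression.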
Software life cycle dynamic simulation model: The organizational performance submodel
NASA Technical Reports Server (NTRS)
Tausworthe, Robert C.
1985-01-01
The submodel structure of a software life cycle dynamic simulation model is described. The software process is divided into seven phases, each with product, staff, and funding flows. The model is subdivided into an organizational response submodel, a management submodel, a management influence interface, and a model analyst interface. The concentration here is on the organizational response model, which simulates the performance characteristics of a software development subject to external and internal influences. These influences emanate from two sources: the model analyst interface, which configures the model to simulate the response of an implementing organization subject to its own internal influences, and the management submodel that exerts external dynamic control over the production process. A complete characterization is given of the organizational response submodel in the form of parameterized differential equations governing product, staffing, and funding levels. The parameter values and functions are allocated to the two interfaces.
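As a toy illustration of what "parameterized differential equations governing product, staffing, and funding levels" can look like in practice, the sketch below Euler-integrates a made-up coupled system; the parameter values and functional forms are assumptions, not those of the JPL submodel.

```python
# Toy sketch in the spirit of an organizational response submodel:
# Euler integration of coupled product/staffing/funding equations.
dt, weeks = 1.0, 200
product, staff, funding = 0.0, 2.0, 100.0

PRODUCTIVITY = 0.4        # product units per staff-week (assumed)
TARGET_STAFF = 10.0       # management-directed staffing level (assumed)
HIRING_RATE = 0.05        # fraction of the staffing gap closed per week (assumed)
COST_PER_STAFF_WEEK = 0.8

for _ in range(weeks):
    d_product = PRODUCTIVITY * staff
    d_staff = HIRING_RATE * (TARGET_STAFF - staff)
    d_funding = -COST_PER_STAFF_WEEK * staff
    product += d_product * dt
    staff += d_staff * dt
    funding += d_funding * dt
    if funding <= 0.0:     # external influence: stop work when funds run out
        break

print(round(product, 1), round(staff, 1), round(funding, 1))
```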
NSRD-10: Leak Path Factor Guidance Using MELCOR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, David; Humphries, Larry L.
Estimates of the source term from a U.S. Department of Energy (DOE) nuclear facility require that the analysts know how to apply the simulation tools used, such as the MELCOR code, particularly for a complicated facility that may include an air ventilation system and other active systems that can influence the environmental pathway of the materials released. DOE has designated MELCOR 1.8.5, an unsupported version, as a DOE ToolBox code in its Central Registry, which includes a leak-path-factor guidance report written in 2004 that did not include experimental validation data. To continue to use this MELCOR version requires additional verification and validation, which may not be feasible from a project cost standpoint. Instead, the recent MELCOR should be used. Without any developer support and lacking experimental data validation, it is difficult to convince regulators that the calculated source term from the DOE facility is accurate and defensible. This research replaces the obsolete version in the 2004 DOE leak path factor guidance report by using MELCOR 2.1 (the latest version of MELCOR with continuing modeling development and user support) and by including applicable experimental data from the reactor safety arena and from the DOE-HDBK-3010. This research provides best practice values used in MELCOR 2.1 specifically for the leak path determination. With these enhancements, the revised leak-path-guidance report should provide confidence to the DOE safety analyst who would be using MELCOR as a source-term determination tool for mitigated accident evaluations.
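For context, the leak path factor (LPF) enters the standard DOE five-factor source-term formula (MAR x DR x ARF x RF x LPF) from DOE-HDBK-3010. The sketch below shows only that arithmetic; the numbers are placeholders, not values from this report or from MELCOR.

```python
# DOE five-factor source-term arithmetic that a leak-path-factor analysis feeds into.
# All numeric values below are placeholders for illustration.
def source_term(mar_g, dr, arf, rf, lpf):
    """MAR x DR x ARF x RF x LPF (DOE-HDBK-3010 convention)."""
    return mar_g * dr * arf * rf * lpf

# A composite LPF is often approximated as the product of per-compartment factors
# along the release pathway (e.g., room -> corridor -> ventilation -> stack).
lpf = 0.6 * 0.5 * 0.1
print(source_term(mar_g=1000.0, dr=0.25, arf=1e-3, rf=0.5, lpf=lpf))
```

A code like MELCOR replaces the simple per-compartment product with a time-dependent aerosol transport and deposition calculation; the formula above is only the bookkeeping the LPF plugs into.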
DOT National Transportation Integrated Search
2009-01-01
This booklet provides an overview of SafetyAnalyst. SafetyAnalyst is a set of software tools under development to help State and local highway agencies advance their programming of site-specific safety improvements. SafetyAnalyst will incorporate sta...
A crisis in the analyst's life: self-containment, symbolization, and the holding space.
Michelle, Flax
2011-04-01
Most analysts will experience some degree of crisis in the course of their working life. This paper explores the complex interplay between the analyst's affect during a crisis in her life and the affective dynamics of the patient. The central question is "who or what holds the analyst"--especially in times of crisis. Symbolization of affect, facilitated by the analyst's self-created holding environment, is seen as a vital process in order for containment to take place. In the clinical case presented, the analyst's dog was an integral part of the analyst's self-righting through this difficult period; the dog functioned as an "analytic object" within the analysis.
A Visual Analytics Framework for Identifying Topic Drivers in Media Events.
Lu, Yafeng; Wang, Hong; Landis, Steven; Maciejewski, Ross
2017-09-14
Media data has been the subject of large-scale analysis, with applications of text mining being used to provide overviews of media themes and information flows. Such information extracted from media articles has also shown its contextual value when integrated with other data, such as criminal records and stock market pricing. In this work, we explore linking textual media data with curated secondary textual data sources through user-guided semantic lexical matching for identifying relationships and data links. In this manner, critical information can be identified and used to annotate media timelines in order to provide a more detailed overview of events that may be driving media topics and frames. These linked events are further analyzed through an application of causality modeling to model temporal drivers between the data series. Such causal links are then annotated through automatic entity extraction, which enables the analyst to explore persons, locations, and organizations that may be pertinent to the media topic of interest. To demonstrate the proposed framework, two media datasets and an armed conflict event dataset are explored.
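One common way to test for temporal drivers between two such series is Granger causality; this is offered as an illustration of the general idea, not necessarily the causality model used in the paper, and the series below are synthetic.

```python
# Hedged sketch: Granger-causality test between a synthetic event-count series
# and a media-volume series that lags it (statsmodels implementation).
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
events = rng.poisson(5, size=200).astype(float)
# Synthetic media series that lags the event series by one step, plus noise.
media = np.roll(events, 1) + rng.normal(0, 1, size=200)

data = np.column_stack([media, events])    # test: does column 2 drive column 1?
results = grangercausalitytests(data, maxlag=3)
print(results[1][0]["ssr_ftest"])           # (F statistic, p-value, df_denom, df_num)
```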
Human Processes in Intelligence Analysis: Phase I Overview
1979-12-01
Large scale track analysis for wide area motion imagery surveillance
NASA Astrophysics Data System (ADS)
van Leeuwen, C. J.; van Huis, J. R.; Baan, J.
2016-10-01
Wide Area Motion Imagery (WAMI) enables image-based surveillance of areas that can cover multiple square kilometers. Interpreting and analyzing information from such sources becomes increasingly time consuming as more data is added from newly developed methods for information extraction. Captured from a moving Unmanned Aerial Vehicle (UAV), the high-resolution images allow detection and tracking of moving vehicles, but this is a highly challenging task. By using a chain of computer vision detectors and machine learning techniques, we are capable of producing high quality track information for more than 40 thousand vehicles per five minutes. When faced with such a vast number of vehicular tracks, it is useful for analysts to be able to quickly query information based on region of interest, color, maneuvers or other high-level types of information, to gain insight and find relevant activities in the flood of information. In this paper we propose a set of tools, combined in a graphical user interface, which allows data analysts to survey vehicles in a large observed area. In order to retrieve (parts of) images from the high-resolution data, we developed a multi-scale tile-based video file format that allows us to quickly obtain only a part, or a sub-sampling, of the original high-resolution image. By storing tiles of a still image according to a predefined order, we can quickly retrieve a particular region of the image at any relevant scale by skipping to the correct frames and reconstructing the image. Location-based queries allow a user to select tracks around a particular region of interest such as a landmark, building or street. By using an integrated search engine, users can quickly select tracks that are in the vicinity of locations of interest. Another time-reducing method when searching for a particular vehicle is to filter on color or color intensity. Automatic maneuver detection adds information to the tracks that can be used to find vehicles based on their behavior.
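The details of the tile-based file format are not spelled out in the abstract, but the general mechanism of mapping a region of interest to tiles in a predefined row-major order per scale level can be sketched as follows. Tile size, level scheme, and the offset convention below are assumptions for illustration.

```python
# Illustrative sketch: map a pixel region of interest to tile indices and to a
# position in an assumed row-major per-level storage order.
TILE = 512  # tile edge length in pixels (assumed)

def tiles_for_roi(x0, y0, x1, y1, image_w, image_h, level=0):
    """Tile (col, row) indices covering a pixel ROI at a power-of-two scale level."""
    scale = 2 ** level
    w, h = image_w // scale, image_h // scale
    cols = range(max(0, (x0 // scale) // TILE), min((w - 1) // TILE, (x1 // scale) // TILE) + 1)
    rows = range(max(0, (y0 // scale) // TILE), min((h - 1) // TILE, (y1 // scale) // TILE) + 1)
    return [(c, r) for r in rows for c in cols]

def tile_offset(col, row, image_w, level=0):
    """Position of a tile in the assumed row-major storage order for a level."""
    tiles_per_row = -(-(image_w // (2 ** level)) // TILE)   # ceiling division
    return row * tiles_per_row + col

print(tiles_for_roi(10_000, 8_000, 12_000, 9_000, image_w=66_000, image_h=44_000, level=2))
```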
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholtz, Jean; Plaisant, Catherine; Whiting, Mark A.
The evaluation of visual analytics environments was a topic in Illuminating the Path [Thomas 2005] as a critical aspect of moving research into practice. For a thorough understanding of the utility of the systems available, evaluation not only involves assessing the visualizations, interactions or data processing algorithms themselves, but also the complex processes that a tool is meant to support (such as exploratory data analysis and reasoning, communication through visualization, or collaborative data analysis [Lam 2012; Carpendale 2007]). Researchers and practitioners in the field have long identified many of the challenges faced when planning, conducting, and executing an evaluation of a visualization tool or system [Plaisant 2004]. Evaluation is needed to verify that algorithms and software systems work correctly and that they represent improvements over the current infrastructure. Additionally, to effectively transfer new software into a working environment, it is necessary to ensure that the software has utility for the end-users and that the software can be incorporated into the end-user's infrastructure and work practices. Evaluation test beds require datasets, tasks, metrics and evaluation methodologies. As noted in [Thomas 2005], it is difficult and expensive for any one researcher to set up an evaluation test bed, so in many cases evaluation is set up for communities of researchers or for various research projects or programs. Examples of successful community evaluations can be found [Chinchor 1993; Voorhees 2007; FRGC 2012]. As visual analytics environments are intended to facilitate the work of human analysts, one aspect of evaluation needs to focus on the utility of the software to the end-user. This requires representative users, representative tasks, and metrics that measure the utility to the end-user. This is even more difficult as now one aspect of the test methodology is access to representative end-users to participate in the evaluation. In many cases the sensitive nature of data and tasks and difficult access to busy analysts puts even more of a burden on researchers to complete this type of evaluation. User-centered design goes beyond evaluation and starts with the user [Beyer 1997, Shneiderman 2009]. Having some knowledge of the type of data, tasks, and work practices helps researchers and developers know the correct paths to pursue in their work. When access to the end-users is problematic at best and impossible at worst, user-centered design becomes difficult. Researchers are unlikely to go to work on the type of problems faced by inaccessible users. Commercial vendors have difficulties evaluating and improving their products when they cannot observe real users working with their products. In well-established fields such as web site design or office software design, user-interface guidelines have been developed based on the results of empirical studies or the experience of experts. Guidelines can speed up the design process and replace some of the need for observation of actual users [heuristics review references]. In 2006, when the visual analytics community was initially getting organized, no such guidelines existed.
Therefore, we were faced with the problem of developing an evaluation framework for the field of visual analytics that would provide representative situations and datasets, representative tasks and utility metrics, and finally a test methodology which would include a surrogate for representative users, increase interest in conducting research in the field, and provide sufficient feedback to the researchers so that they could improve their systems.
Petroleum Market Model of the National Energy Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-01-01
The purpose of this report is to define the objectives of the Petroleum Market Model (PMM), describe its basic approach, and provide detail on how it works. This report is intended as a reference document for model analysts, users, and the public. The PMM models petroleum refining activities, the marketing of petroleum products to consumption regions, the production of natural gas liquids in gas processing plants, and domestic methanol production. The PMM projects petroleum product prices and sources of supply for meeting petroleum product demand. The sources of supply include crude oil, both domestic and imported; other inputs including alcohols and ethers; natural gas plant liquids production; petroleum product imports; and refinery processing gain. In addition, the PMM estimates domestic refinery capacity expansion and fuel consumption. Product prices are estimated at the Census division level and much of the refining activity information is at the Petroleum Administration for Defense (PAD) District level. This report is organized as follows: Chapter 2, Model Purpose; Chapter 3, Model Overview and Rationale; Chapter 4, Model Structure; Appendix A, Inventory of Input Data, Parameter Estimates, and Model Outputs; Appendix B, Detailed Mathematical Description of the Model; Appendix C, Bibliography; Appendix D, Model Abstract; Appendix E, Data Quality; Appendix F, Estimation Methodologies; Appendix G, Matrix Generator Documentation; Appendix H, Historical Data Processing; and Appendix I, Biofuels Supply Submodule.
Developing an Automated Method for Detection of Operationally Relevant Ocean Fronts and Eddies
NASA Astrophysics Data System (ADS)
Rogers-Cotrone, J. D.; Cadden, D. D. H.; Rivera, P.; Wynn, L. L.
2016-02-01
Since the early 1990s, the U.S. Navy has utilized an observation-based process for identification of frontal systems and eddies. These Ocean Feature Assessments (OFA) rely on trained analysts to identify and position ocean features using satellite-observed sea surface temperatures. Meanwhile, as enhancements and expansion of the Navy's HYbrid Coordinate Ocean Model (HYCOM) and Regional Navy Coastal Ocean Model (RNCOM) domains have proceeded, the Naval Oceanographic Office (NAVO) has provided Tactical Oceanographic Feature Assessments (TOFA) that are based on data-validated model output but also rely on analyst identification of significant features. A recently completed project has migrated OFA production to the ArcGIS-based Acoustic Reach-back Cell Ocean Analysis Suite (ARCOAS), enabling use of additional observational datasets and significantly decreasing production time; however, it has highlighted inconsistencies inherent to this analyst-based identification process. Current efforts are focused on development of an automated method for detecting operationally significant fronts and eddies that integrates model output and observational data on a global scale. Previous attempts to employ techniques from the scientific community have been unable to meet the production tempo at NAVO. Thus, a system that incorporates existing techniques (Marr-Hildreth, Okubo-Weiss, etc.) with internally-developed feature identification methods (from model-derived physical and acoustic properties) is required. Ongoing expansions to the ARCOAS toolset have shown promising early results.
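One of the named ingredients, the Okubo-Weiss parameter, can be computed directly from a gridded model velocity field; strongly negative values flag vorticity-dominated, eddy-like regions. The sketch below is a generic illustration with a synthetic vortex, not NAVO's implementation or thresholds.

```python
# Generic Okubo-Weiss sketch: W = normal_strain^2 + shear_strain^2 - vorticity^2.
import numpy as np

def okubo_weiss(u, v, dx, dy):
    """Okubo-Weiss parameter on a regular grid (u, v in m/s; dx, dy in m)."""
    du_dy, du_dx = np.gradient(u, dy, dx)
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    normal_strain = du_dx - dv_dy
    shear_strain = dv_dx + du_dy
    vorticity = dv_dx - du_dy
    return normal_strain**2 + shear_strain**2 - vorticity**2

# Synthetic solid-body vortex for a smoke test (1 km grid spacing).
y, x = np.mgrid[-50:50, -50:50] * 1000.0
u, v = -0.1 * y / 50000.0, 0.1 * x / 50000.0
W = okubo_weiss(u, v, dx=1000.0, dy=1000.0)
eddy_mask = W < (W.mean() - 0.2 * W.std())   # a common thresholding heuristic
print(eddy_mask.sum(), "grid cells flagged")
```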
NASA Astrophysics Data System (ADS)
Rogowitz, Bernice E.; Rabenhorst, David A.; Gerth, John A.; Kalin, Edward B.
1996-04-01
This paper describes a set of visual techniques, based on principles of human perception and cognition, which can help users analyze and develop intuitions about tabular data. Collections of tabular data are widely available, including, for example, multivariate time series data, customer satisfaction data, stock market performance data, multivariate profiles of companies and individuals, and scientific measurements. In our approach, we show how visual cues can help users perform a number of data mining tasks, including identifying correlations and interaction effects, finding clusters and understanding the semantics of cluster membership, identifying anomalies and outliers, and discovering multivariate relationships among variables. These cues are derived from psychological studies on perceptual organization, visual search, perceptual scaling, and color perception. These visual techniques are presented as a complement to the statistical and algorithmic methods more commonly associated with these tasks, and provide an interactive interface for the human analyst.
NASA Technical Reports Server (NTRS)
Aucoin, P. J.; Stewart, J.; Mckay, M. F. (Principal Investigator)
1980-01-01
This document presents instructions for analysts who use the EOD-LARSYS as programmed on the Purdue University IBM 370/148 (recently replaced by the IBM 3031) computer. It presents sample applications, control cards, and error messages for all processors in the system and gives detailed descriptions of the mathematical procedures and information needed to execute the system and obtain the desired output. EOD-LARSYS is the JSC version of an integrated batch system for analysis of multispectral scanner imagery data. The data included is designed for use with the as-built documentation (Volume 3) and the program listings (Volume 4). The system is operational from remote terminals at Johnson Space Center under the virtual machine/conversational monitor system environment.
Tile prediction schemes for wide area motion imagery maps in GIS
NASA Astrophysics Data System (ADS)
Michael, Chris J.; Lin, Bruce Y.
2017-11-01
Wide-area surveillance, traffic monitoring, and emergency management are just several of many applications benefiting from the incorporation of Wide-Area Motion Imagery (WAMI) maps into geographic information systems. Though the use of motion imagery as a GIS base map via the Web Map Service (WMS) standard is not a new concept, effectively streaming imagery is particularly challenging due to its large scale and the multidimensionally interactive nature of clients that use WMS. Ineffective streaming from a server to one or more clients can unnecessarily overwhelm network bandwidth and cause frustratingly large amounts of latency in visualization to the user. Seamlessly streaming WAMI through GIS requires good prediction to accurately guess the tiles of the video that will be traversed in the near future. In this study, we present an experimental framework for such prediction schemes by presenting a stochastic interaction model that represents a human user's interaction with a GIS video map. We then propose several algorithms by which the tiles of the stream may be predicted. Results collected both within the experimental framework and using human analyst trajectories show that, though each algorithm thrives under certain constraints, the novel Markovian algorithm yields the best results overall. Furthermore, we make the argument that the proposed experimental framework is sufficient for the study of these prediction schemes.
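The first-order Markovian idea mentioned above, learning tile-to-tile transitions from past viewing trajectories and prefetching the most likely next tiles, can be sketched in a few lines. This is only the flavor of such a predictor, not the paper's exact algorithm or tile identifiers.

```python
# Minimal first-order Markov tile predictor (illustrative; tile ids are assumed
# to be (col, row, zoom) tuples).
from collections import Counter, defaultdict

transitions = defaultdict(Counter)

def observe(trajectory):
    """trajectory: sequence of tile ids visited by a user session."""
    for current, nxt in zip(trajectory, trajectory[1:]):
        transitions[current][nxt] += 1

def predict(current_tile, k=3):
    """Return up to k most likely next tiles to prefetch."""
    return [tile for tile, _ in transitions[current_tile].most_common(k)]

observe([(4, 3, 2), (5, 3, 2), (5, 4, 2), (5, 3, 2), (5, 4, 2)])
print(predict((5, 3, 2)))
```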
Mapping forest height in Alaska using GLAS, Landsat composites, and airborne LiDAR
Peterson, Birgit; Nelson, Kurtis
2014-01-01
Vegetation structure, including forest canopy height, is an important input variable to fire behavior modeling systems for simulating wildfire behavior. As such, forest canopy height is one of a nationwide suite of products generated by the LANDFIRE program. In the past, LANDFIRE has relied on a combination of field observations and Landsat imagery to develop existing vegetation structure products. The paucity of field data in the remote Alaskan forests has led to a very simple forest canopy height classification for the original LANDFIRE forest height map. To better meet the needs of data users and refine the map legend, LANDFIRE incorporated ICESat Geoscience Laser Altimeter System (GLAS) data into the updating process when developing the LANDFIRE 2010 product. The high latitude of this region enabled dense coverage of discrete GLAS samples, from which forest height was calculated. Different methods for deriving height from the GLAS waveform data were applied, including an attempt to correct for slope. These methods were then evaluated and integrated into the final map according to predefined criteria. The resulting map of forest canopy height includes more height classes than the original map, thereby better depicting the heterogeneity of the landscape, and provides seamless data for fire behavior analysts and other users of LANDFIRE data.
Fusing modeling techniques to support domain analysis for reuse opportunities identification
NASA Technical Reports Server (NTRS)
Hall, Susan Main; Mcguire, Eileen
1993-01-01
Which are more useful to someone trying to understand the general design or high-level requirements of a system: functional modeling techniques or object-oriented graphical representations? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.
Toward a Cognitive Task Analysis for Biomedical Query Mediation
Hruby, Gregory W.; Cimino, James J.; Patel, Vimla; Weng, Chunhua
2014-01-01
In many institutions, data analysts use a Biomedical Query Mediation (BQM) process to facilitate data access for medical researchers. However, understanding of the BQM process is limited in the literature. To bridge this gap, we performed the initial steps of a cognitive task analysis using 31 BQM instances conducted between one analyst and 22 researchers in one academic department. We identified five top-level tasks, i.e., clarify research statement, explain clinical process, identify related data elements, locate EHR data element, and end BQM with either a database query or unmet, infeasible information needs, and 10 sub-tasks. We evaluated the BQM task model with seven data analysts from different clinical research institutions. Evaluators found all the tasks completely or semi-valid. This study contributes initial knowledge towards the development of a generalizable cognitive task representation for BQM. PMID:25954589
Effects of Motivation: Rewarding Hackers for Undetected Attacks Cause Analysts to Perform Poorly.
Maqbool, Zahid; Makhijani, Nidhi; Pammi, V S Chandrasekhar; Dutt, Varun
2017-05-01
The aim of this study was to determine how monetary motivations influence the decision making of humans performing as security analysts and hackers in a cybersecurity game. Cyberattacks are increasing at an alarming rate. As cyberattacks often cause damage to existing cyber infrastructures, it is important to understand how monetary rewards may influence the decision making of hackers and analysts in the cyber world. Currently, only limited attention has been given to this area. In an experiment, participants were randomly assigned to three between-subjects conditions (n = 26 for each condition): equal payoff, where the magnitude of monetary rewards for hackers and defenders was the same; rewarding hacker, where the magnitude of monetary reward for the hacker's successful attack was 10 times the reward for the analyst's successful defense; and rewarding analyst, where the magnitude of monetary reward for the analyst's successful defense was 10 times the reward for the hacker's successful attack. In all conditions, half of the participants were human hackers playing against Nash analysts and half were human analysts playing against Nash hackers. Results revealed that monetary rewards for human hackers and analysts caused a decrease in attack and defend actions compared with the baseline. Furthermore, rewarding human hackers for undetected attacks made analysts deviate significantly from their optimal behavior. If hackers are rewarded for their undetected attack actions, then this causes analysts to deviate from optimal defend proportions. Thus, analysts need to be trained not to become overenthusiastic in defending networks. Applications of our results are to networks where the influence of monetary rewards may cause information theft and system damage.
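To make the asymmetric-reward conditions concrete, the sketch below computes expected payoffs for a hacker attacking with probability p and an analyst defending with probability q, with the hacker's undetected-attack reward scaled by a factor. The payoff values and functional form are assumptions for illustration, not the study's exact game.

```python
# Hedged sketch of an asymmetric-reward attack/defend game (assumed payoffs).
def expected_payoffs(p_attack, q_defend, hacker_reward_scale=10.0,
                     reward=1.0, penalty=1.0):
    # Attack & defended: hacker penalized, analyst rewarded.
    # Attack & undetected: hacker gets the scaled reward, analyst penalized.
    hacker = p_attack * (q_defend * -penalty +
                         (1 - q_defend) * hacker_reward_scale * reward)
    analyst = p_attack * (q_defend * reward + (1 - q_defend) * -penalty)
    return hacker, analyst

for q in (0.2, 0.5, 0.8):
    print(q, expected_payoffs(p_attack=0.6, q_defend=q))
```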
Cross-industry Performance Modeling: Toward Cooperative Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reece, Wendy Jane; Blackman, Harold Stabler
One of the current unsolved problems in human factors is the difficulty in acquiring information from lessons learned and data collected among human performance analysts in different domains. There are several common concerns and generally accepted issues of importance for human factors, psychology and industry analysts of performance and safety. Among these are the need to incorporate lessons learned in design, to carefully consider implementation of new designs and automation, and the need to reduce human performance-based contributions to risk. In spite of shared concerns, there are several roadblocks to widespread sharing of data and lessons learned from operating experience and simulation, including the fact that very few publicly accessible data bases exist (Gertman & Blackman, 1994, and Kirwan, 1997). There is a need to draw together analysts and analytic methodologies to comprise a centralized source of data with sufficient detail to be meaningful while ensuring source anonymity. We propose that a generic source of performance data and a multi-domain data store may provide the first steps toward cooperative performance modeling and analysis across industries.
General MACOS Interface for Modeling and Analysis for Controlled Optical Systems
NASA Technical Reports Server (NTRS)
Sigrist, Norbert; Basinger, Scott A.; Redding, David C.
2012-01-01
The General MACOS Interface (GMI) for Modeling and Analysis for Controlled Optical Systems (MACOS) enables the use of MATLAB as a front-end for JPL's critical optical modeling package, MACOS. MACOS is JPL's in-house optical modeling software, which has proven to be a superb tool for advanced systems engineering of optical systems. GMI, coupled with MACOS, allows for seamless interfacing with modeling tools from other disciplines to make possible integration of dynamics, structures, and thermal models with the addition of control systems for deformable optics and other actuated optics. This software package is designed as a tool for analysts to quickly and easily use MACOS without needing to be an expert at programming MACOS. The strength of MACOS is its ability to interface with various modeling/development platforms, allowing evaluation of system performance with thermal, mechanical, and optical modeling parameter variations. GMI provides an improved means for accessing selected key MACOS functionalities. The main objective of GMI is to marry the vast mathematical and graphical capabilities of MATLAB with the powerful optical analysis engine of MACOS, thereby providing a useful tool to anyone who can program in MATLAB. GMI also improves modeling efficiency by eliminating the need to write an interface function for each task/project, reducing error sources, speeding up user/modeling tasks, and making MACOS well suited for fast prototyping.
NASA Astrophysics Data System (ADS)
Zhao, Qunhua; Santos, Eugene; Nguyen, Hien; Mohamed, Ahmed
One of the biggest challenges for intelligence analysts who participate in prevention of or response to a terrorism act is to quickly find relevant information in massive amounts of data. Along with research on information retrieval and filtering, text summarization is an effective technique to help intelligence analysts shorten their time to find critical information and make timely decisions. Multi-document summarization is particularly useful as it serves to quickly describe a collection of information. The obvious shortcoming lies in what it cannot capture, especially in more diverse collections. Thus, the question lies in the adequacy and/or usefulness of such summarizations to the target analyst. In this chapter, we report our experimental study on the sensitivity of users to the quality and content of multi-document summarization. We used the DUC 2002 collection for multi-document summarization as our testbed. Two groups of document sets were considered: (I) sets consisting of closely correlated documents with highly overlapped content; and (II) sets consisting of diverse documents covering a wide scope of topics. Intuitively, this suggests that creating a quality summary would be more difficult in the latter case. However, human evaluators were discovered to be fairly insensitive to this difference when they were asked to rank the performance of various automated summarizers. In this chapter, we examine and analyze our experiments in order to better understand this phenomenon and how we might address it to improve summarization quality. In particular, we present a new metric based on document graphs that can distinguish between the two types of document sets.
Automation for System Safety Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul
2009-01-01
This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
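The path-analysis step described in task 3 amounts to enumerating propagation paths through an extracted architecture graph. The sketch below illustrates that idea with networkx on a made-up component graph; it is not the NASA toolchain, and the node names are hypothetical.

```python
# Illustrative hazard-path enumeration over an extracted architecture graph.
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("battery", "power_bus"), ("power_bus", "flight_computer"),
    ("power_bus", "heater"), ("thermal_runaway", "battery"),
    ("flight_computer", "guidance_function"),
])

hazard_sources = ["thermal_runaway"]          # hypothetical hazard source
vulnerable = ["guidance_function"]            # hypothetical vulnerable function

for src in hazard_sources:
    for target in vulnerable:
        for path in nx.all_simple_paths(g, src, target):
            print(" -> ".join(path))
```

Anomalous configurations can then be modeled by adding or removing edges before re-running the path enumeration.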
Task-Driven Comparison of Topic Models.
Alexander, Eric; Gleicher, Michael
2016-01-01
Topic modeling, a method of statistically extracting thematic content from a large collection of texts, is used for a wide variety of tasks within text analysis. Though there are a growing number of tools and techniques for exploring single models, comparisons between models are generally reduced to a small set of numerical metrics. These metrics may or may not reflect a model's performance on the analyst's intended task, and can therefore be insufficient to diagnose what causes differences between models. In this paper, we explore task-centric topic model comparison, considering how we can both provide detail for a more nuanced understanding of differences and address the wealth of tasks for which topic models are used. We derive comparison tasks from single-model uses of topic models, which predominantly fall into the categories of understanding topics, understanding similarity, and understanding change. Finally, we provide several visualization techniques that facilitate these tasks, including buddy plots, which combine color and position encodings to allow analysts to readily view changes in document similarity.
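One task-agnostic way to quantify "understanding similarity" between two models is to match each topic in one model to its nearest topic in the other by a distributional distance. The sketch below uses Jensen-Shannon distance over synthetic topic-word distributions; it illustrates the general idea, not the paper's buddy plots or metrics.

```python
# Sketch: match topics across two models by Jensen-Shannon distance (synthetic data).
import numpy as np
from scipy.spatial.distance import jensenshannon

rng = np.random.default_rng(1)
vocab_size = 500
topics_a = rng.dirichlet(np.ones(vocab_size) * 0.1, size=10)   # model A: 10 topics
topics_b = rng.dirichlet(np.ones(vocab_size) * 0.1, size=12)   # model B: 12 topics

for i, ta in enumerate(topics_a):
    dists = [jensenshannon(ta, tb) for tb in topics_b]
    j = int(np.argmin(dists))
    print(f"topic A{i} ~ topic B{j} (JS distance {dists[j]:.3f})")
```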
NASA Technical Reports Server (NTRS)
Chamberlain, Robert G.; Duquette, William H.
2013-01-01
TRISA, the U.S. Army TRADOC G2 Intelligence Support Activity, received Athena 1 in 2009. They first used Athena 3 to support studies in 2011. This paper describes Athena 4, which they started using in October 2012. A final section discusses issues that are being considered for incorporation into Athena 5 and later. Athena's objective is to help skilled intelligence analysts anticipate the likely consequences of complex courses of action that use our country's entire power base, not just our military capabilities, for operations in troubled regions of the world. Measures of effectiveness emphasize who is in control and the effects of our actions on the attitudes and well-being of civilians. The planning horizon encompasses not weeks or months, but years. Athena is a scalable, laptop-based simulation with weekly resolution. Up to three months of simulated time can pass between game turns that require user interaction. Athena's geographic scope is nominally a country, but can be a region within a country. Geographic resolution is "neighborhoods", which are defined by the user and may be actual neighborhoods, provinces, or anything in between. Models encompass phenomena whose effects are expected to be relevant over a medium-term planning horizon of three months to three years. The scope and intrinsic complexity of the problem dictate a spiral development process. That is, the model is used during development and lessons learned are used to improve the model. Even more important is that while every version must consider the "big picture" at some level of detail, development priority is given to those issues that are most relevant to currently anticipated studies. For example, models of the delivery and effectiveness of information operations messaging were among the additions in Athena 4.
Interpretation and the psychic future.
Cooper, S H
1997-08-01
The author applies the analyst's multi-faceted awareness of his or her view of the patient's psychic future to analytic process. Loewald's (1960) interest in the way in which the analyst anticipates the future of the patient was linked to his epistemological assumptions about the analyst's superior objectivity and maturity relative to the patient. The elucidation of the authority of the analyst (e.g. Hoffman, 1991, 1994) allows us to begin to disentangle the analyst's view of the patient's psychic future from some of these epistemological assumptions. Clinical illustrations attempt to show how the analyst's awareness of this aspect of the interpretive process is often deconstructed over time and can help to understand aspects of resistance from both analyst and patient. This perspective may provide one more avenue for understanding our various modes of influence through interpretive process.
O'Mahony, James F; Newall, Anthony T; van Rosmalen, Joost
2015-12-01
Time is an important aspect of health economic evaluation, as the timing and duration of clinical events, healthcare interventions and their consequences all affect estimated costs and effects. These issues should be reflected in the design of health economic models. This article considers three important aspects of time in modelling: (1) which cohorts to simulate and how far into the future to extend the analysis; (2) the simulation of time, including the difference between discrete-time and continuous-time models, cycle lengths, and converting rates and probabilities; and (3) discounting future costs and effects to their present values. We provide a methodological overview of these issues and make recommendations to help inform both the conduct of cost-effectiveness analyses and the interpretation of their results. For choosing which cohorts to simulate and how many, we suggest analysts carefully assess potential reasons for variation in cost effectiveness between cohorts and the feasibility of subgroup-specific recommendations. For the simulation of time, we recommend using short cycles or continuous-time models to avoid biases and the need for half-cycle corrections, and provide advice on the correct conversion of transition probabilities in state transition models. Finally, for discounting, analysts should not only follow current guidance and report how discounting was conducted, especially in the case of differential discounting, but also seek to develop an understanding of its rationale. Our overall recommendations are that analysts explicitly state and justify their modelling choices regarding time and consider how alternative choices may impact on results.
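Two of the conversions discussed above are standard textbook formulas and can be stated compactly: a constant annual rate r converts to a cycle probability p = 1 - exp(-r*t), not by simple division, and future values are discounted by (1 + d)^-years. The sketch below shows only these generic formulas, not any specific model's parameters.

```python
# Worked versions of two conversions the article discusses (standard formulas).
import math

def rate_to_probability(rate_per_year: float, cycle_years: float) -> float:
    """Constant-rate conversion: p = 1 - exp(-r * t)."""
    return 1.0 - math.exp(-rate_per_year * cycle_years)

def probability_to_rate(prob: float, cycle_years: float) -> float:
    """Inverse conversion: r = -ln(1 - p) / t."""
    return -math.log(1.0 - prob) / cycle_years

def discount(value: float, years: float, annual_rate: float = 0.035) -> float:
    """Present value of a cost or effect incurred `years` from now."""
    return value / (1.0 + annual_rate) ** years

# Example: an annual event rate of 0.20 is NOT simply 0.05 per quarterly cycle.
p_quarter = rate_to_probability(0.20, 0.25)
print(round(p_quarter, 4), round(discount(1000.0, 5.0), 2))
```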
NASA Astrophysics Data System (ADS)
Dabolt, T. O.
2016-12-01
The proliferation of open data and data services continues to thrive and is creating new challenges for how researchers, policy analysts and other decision makers can quickly discover and use relevant data. While traditional metadata catalog approaches used by applications such as data.gov prove to be useful starting points for data search, they can quickly frustrate end users who are seeking ways to quickly find and then use data in machine-to-machine environments. The Geospatial Platform is overcoming these obstacles and providing end users and application developers a richer, more productive user experience. The Geospatial Platform leverages a collection of open source and commercial technology hosted on Amazon Web Services, providing an ecosystem of services delivering trusted, consistent data in open formats to all users as well as a shared infrastructure for federal partners to serve their spatial data assets. It supports a diverse array of communities of practice, ranging in topic from the 16 National Geospatial Data Asset Themes to homeland security and climate adaptation. Come learn how you can contribute your data and leverage others', or check it out on your own at https://www.geoplatform.gov/
A Dose of Reality: Radiation Analysis for Realistic Human Spacecraft
NASA Technical Reports Server (NTRS)
Barzilla, J. E.; Lee, K. T.
2017-01-01
INTRODUCTION As with most computational analyses, a tradeoff exists between problem complexity, resource availability and response accuracy when modeling radiation transport from the source to a detector. The largest amount of analyst time for setting up an analysis is often spent ensuring that any simplifications made have minimal impact on the results. The vehicle shield geometry of interest is typically simplified from the original CAD design in order to reduce computation time, but this simplification requires the analyst to "re-draw" the geometry with a limited set of volumes in order to accommodate a specific radiation transport software package. The resulting low-fidelity geometry model cannot be shared with or compared to other radiation transport software packages, and the process can be error prone with increased model complexity. The work presented here demonstrates the use of the DAGMC (Direct Accelerated Geometry for Monte Carlo) Toolkit from the University of Wisconsin to model the impacts of several space radiation sources on a CAD drawing of the US Lab module. METHODS The DAGMC toolkit workflow begins with the export of an existing CAD geometry from the native CAD format to the ACIS format. The ACIS format file is then cleaned using SpaceClaim to remove small holes and component overlaps. Metadata is then assigned to the cleaned geometry file using CUBIT/Trelis from csimsoft (Registered Trademark). The DAGMC plugin script removes duplicate shared surfaces, facets the geometry to a specified tolerance, and ensures that the faceted geometry is watertight. This step also writes the material and scoring information to a standard input file format that the analyst can alter as desired prior to running the radiation transport program. The scoring results can be transformed, via a Python script, into a 3D format that is viewable in a standard graphics program. RESULTS The CAD model of the US Lab module of the International Space Station, inclusive of all the racks and components, was simplified to remove holes and volume overlaps. Problematic features within the drawing were also removed or repaired to prevent runtime issues. The cleaned drawing was then run through the DAGMC workflow to prepare for analysis. Pilot tests modeling transport of 1 GeV proton and 800 MeV/A oxygen sources show that reasonable results are converged upon in an acceptable amount of overall computation time from drawing preparation to data analysis. The FLUKA radiation transport code will next be used to model both a GCR and a trapped radiation source. These results will then be compared with measurements that have been made by the radiation instrumentation deployed inside the US Lab module. DISCUSSION Early analyses have indicated that the DAGMC workflow is a promising toolkit for running vehicle geometries of interest to NASA through multiple radiation transport codes. In addition, recent work has shown that a realistic human phantom, provided via a subcontract with the University of Florida, can be placed inside any vehicle geometry for a combinatorial analysis. This added functionality gives the user the ability to score various parameters at the organ level, and the results can then be used as input for cancer risk models.
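The abstract mentions a Python script that turns the scoring results into a viewable 3D format. The sketch below is a generic stand-in for that step, not the actual NASA script: it writes a regular grid of dose/tally values to a legacy ASCII VTK STRUCTURED_POINTS file, which common graphics tools such as ParaView can open. The grid shape, spacing, and the random placeholder data are assumptions.

```python
# Illustrative post-processing sketch: regular 3-D tally grid -> legacy VTK file.
import numpy as np

def write_vtk_structured_points(path, values, origin=(0, 0, 0), spacing=(1, 1, 1)):
    nx, ny, nz = values.shape
    with open(path, "w") as f:
        f.write("# vtk DataFile Version 3.0\n")
        f.write("dose tally\nASCII\nDATASET STRUCTURED_POINTS\n")
        f.write(f"DIMENSIONS {nx} {ny} {nz}\n")
        f.write("ORIGIN {} {} {}\n".format(*origin))
        f.write("SPACING {} {} {}\n".format(*spacing))
        f.write(f"POINT_DATA {values.size}\n")
        f.write("SCALARS dose float 1\nLOOKUP_TABLE default\n")
        # VTK expects x varying fastest; Fortran order gives that for values[x, y, z]
        np.savetxt(f, values.flatten(order="F"), fmt="%.6e")

if __name__ == "__main__":
    dose = np.random.rand(10, 10, 10)  # placeholder for transport-code output
    write_vtk_structured_points("dose.vtk", dose)
```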
Some Observations on the Current Status of Performing Finite Element Analyses
NASA Technical Reports Server (NTRS)
Raju, Ivatury S.; Knight, Norman F., Jr; Shivakumar, Kunigal N.
2015-01-01
Aerospace structures are complex high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analysis rapidly. Many of the early career engineers of today are very proficient in the usage of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient in building complex 3D models of complicated aerospace components. However, current trends demonstrate a blind acceptance of finite element analysis results. This paper is aimed at raising awareness of this situation. Examples of commonly encountered issues are presented. To overcome the current trends, some guidelines and suggestions for analysts, senior engineers, and educators are offered.
Developing an intelligence analysis process through social network analysis
NASA Astrophysics Data System (ADS)
Waskiewicz, Todd; LaMonica, Peter
2008-04-01
Intelligence analysts are tasked with making sense of enormous amounts of data and gaining an awareness of a situation that can be acted upon. This process can be extremely difficult and time consuming. Trying to differentiate between important pieces of information and extraneous data only complicates the problem. When dealing with data containing entities and relationships, social network analysis (SNA) techniques can be employed to make this job easier. Applying network measures to social network graphs can identify the most significant nodes (entities) and edges (relationships) and help the analyst further focus on key areas of concern. Strange developed a model that identifies high value targets such as centers of gravity and critical vulnerabilities. SNA lends itself to the discovery of these high value targets and the Air Force Research Laboratory (AFRL) has investigated several network measures such as centrality, betweenness, and grouping to identify centers of gravity and critical vulnerabilities. Using these network measures, a process for the intelligence analyst has been developed to aid analysts in identifying points of tactical emphasis. Organizational Risk Analyzer (ORA) and Terrorist Modus Operandi Discovery System (TMODS) are the two applications used to compute the network measures and identify the points to be acted upon. Therefore, the result of leveraging social network analysis techniques and applications will provide the analyst and the intelligence community with more focused and concentrated analysis results allowing them to more easily exploit key attributes of a network, thus saving time, money, and manpower.
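As a small, hedged illustration of the network measures named above (centrality, betweenness, and grouping), the sketch below uses the open-source NetworkX library on a toy entity-relationship graph. ORA and TMODS are the tools actually used in the paper; their interfaces are not shown here, and the node names are invented.

```python
# Toy social network analysis: rank candidate high-value nodes by network measures.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Nodes = entities, edges = relationships (all names invented)
G = nx.Graph()
G.add_edges_from([
    ("leader", "financier"), ("leader", "courier"),
    ("courier", "cell_a"), ("courier", "cell_b"),
    ("financier", "cell_b"), ("cell_a", "cell_b"),
])

degree = nx.degree_centrality(G)                  # how connected each entity is
betweenness = nx.betweenness_centrality(G)        # brokers lying on many shortest paths
groups = list(greedy_modularity_communities(G))   # candidate clusters/cells

# Rank candidate "centers of gravity" by betweenness
for node, score in sorted(betweenness.items(), key=lambda kv: -kv[1]):
    print(f"{node:10s} betweenness={score:.2f} degree={degree[node]:.2f}")
print("groups:", groups)
```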
Towards an automated intelligence product generation capability
NASA Astrophysics Data System (ADS)
Smith, Alison M.; Hawes, Timothy W.; Nolan, James J.
2015-05-01
Creating intelligence information products is a time consuming and difficult process for analysts faced with identifying key pieces of information relevant to a complex set of information requirements. Complicating matters, these key pieces of information exist in multiple modalities scattered across data stores, buried in huge volumes of data. This results in the current predicament in which analysts find themselves: information retrieval and management consume huge amounts of time that could be better spent performing analysis. The persistent growth in data accumulation rates will only increase the amount of time spent on these tasks without a significant advance in automated solutions for information product generation. We present a product generation tool, Automated PrOduct Generation and Enrichment (APOGEE), which aims to automate the information product creation process in order to shift the bulk of the analysts' effort from data discovery and management to analysis. APOGEE discovers relevant text, imagery, video, and audio for inclusion in information products using semantic and statistical models of unstructured content. APOGEE's mixed-initiative interface, supported by highly responsive backend mechanisms, allows analysts to dynamically control the product generation process, ensuring a maximally relevant result. The combination of these capabilities results in significant reductions in the time it takes analysts to produce information products while helping to increase overall coverage. In an evaluation with a domain expert, APOGEE showed the potential to cut product generation time by a factor of 20. The result is a flexible end-to-end system that can be rapidly deployed in new operational settings.
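The abstract does not specify APOGEE's statistical models, so the sketch below is only an illustration of the general idea of statistically ranking content against an information requirement, here using TF-IDF vectors and cosine similarity from scikit-learn. The requirement text and documents are invented.

```python
# Illustrative relevance ranking of candidate content against an information requirement.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

requirement = "port activity and shipping traffic near the northern harbor"
documents = [
    "Imagery report: increased shipping traffic observed at the northern harbor.",
    "Weekly weather summary for the mountain region.",
    "HUMINT note: cargo vessels unloading at night near the port.",
]

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([requirement] + documents)
scores = cosine_similarity(matrix[0:1], matrix[1:]).ravel()

for score, doc in sorted(zip(scores, documents), reverse=True):
    print(f"{score:.2f}  {doc}")
```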
Should I use that model? Assessing the transferability of ecological models to new settings
Analysts and scientists frequently apply existing models that estimate ecological endpoints or simulate ecological processes to settings where the models have not been used previously, and where data to parameterize and validate the model may be sparse. Prior to transferring an ...
Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development
NASA Technical Reports Server (NTRS)
Kelly, Angelita C.; Basile, Lisa; Ames, Troy; Watson, Janice; Dallam, William
1987-01-01
Spacelab Data Processing Facility (SLDPF) expert system prototypes were developed to assist in the quality assurance of Spacelab and/or Attached Shuttle Payload (ASP) processed telemetry data. The SLDPF functions include the capturing, quality monitoring, processing, accounting, and forwarding of mission data to various user facilities. Prototypes for the two SLDPF functional elements, the Spacelab Output Processing System and the Spacelab Input Processing Element, are described. The prototypes have produced beneficial results, including an increase in analyst productivity, a decrease in the burden of tedious analyses, the consistent evaluation of data, and the provision of concise historical records.
Constant-Elasticity-of-Substitution Simulation
NASA Technical Reports Server (NTRS)
Reiter, G.
1986-01-01
Program simulates constant elasticity-of-substitution (CES) production function. CES function used by economic analysts to examine production costs as well as uncertainties in production. User provides such input parameters as price of labor, price of capital, and dispersion levels. CES minimizes expected cost to produce capital-uncertainty pair. By varying capital-value input, one obtains series of capital-uncertainty pairs. Capital-uncertainty pairs then used to generate several cost curves. CES program menu driven and features specific print menu for examining selected output curves. Program written in BASIC for interactive execution and implemented on IBM PC-series computer.
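The original program was interactive BASIC for the IBM PC and is not reproduced here; the sketch below is an independent Python illustration of the same idea, minimizing the cost of producing a target output under a standard CES production function. All parameter values (factor share, substitution parameter, input prices, target output) are made up for the example.

```python
# Hedged CES cost-minimization sketch (not the original BASIC program).
import numpy as np
from scipy.optimize import minimize

A, share, rho = 1.0, 0.4, 0.5            # CES parameters (sigma = 1/(1+rho))
price_capital, price_labor = 0.08, 25.0  # illustrative input prices
target_output = 100.0

def ces_output(k, l):
    """Q = A * [a*K^(-rho) + (1-a)*L^(-rho)]^(-1/rho)"""
    return A * (share * k ** (-rho) + (1 - share) * l ** (-rho)) ** (-1.0 / rho)

def total_cost(x):
    k, l = x
    return price_capital * k + price_labor * l

result = minimize(
    total_cost,
    x0=[100.0, 100.0],
    constraints=[{"type": "ineq", "fun": lambda x: ces_output(x[0], x[1]) - target_output}],
    bounds=[(1e-6, None), (1e-6, None)],
)
k_opt, l_opt = result.x
print(f"capital={k_opt:.1f}, labor={l_opt:.1f}, minimum cost={total_cost(result.x):.2f}")
```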
CLEAR: Communications Link Expert Assistance Resource
NASA Technical Reports Server (NTRS)
Hull, Larry G.; Hughes, Peter M.
1987-01-01
Communications Link Expert Assistance Resource (CLEAR) is a real-time fault diagnosis expert system for the Cosmic Background Explorer (COBE) Mission Operations Room (MOR). The CLEAR expert system is an operational prototype which assists the MOR operator/analyst by isolating and diagnosing faults in the spacecraft communication link with the Tracking and Data Relay Satellite (TDRS) during periods of real-time data acquisition. The mission domain, user requirements, hardware configuration, expert system concept, tool selection, development approach, and system design are discussed. Development approach and system implementation are emphasized. Also discussed are system architecture, tool selection, operation, and future plans.
Use of the Collaborative Optimization Architecture for Launch Vehicle Design
NASA Technical Reports Server (NTRS)
Braun, R. D.; Moore, A. A.; Kroo, I. M.
1996-01-01
Collaborative optimization is a new design architecture specifically created for large-scale distributed-analysis applications. In this approach, the problem is decomposed into a user-defined number of subspace optimization problems that are driven towards interdisciplinary compatibility and the appropriate solution by a system-level coordination process. This decentralized design strategy allows domain-specific issues to be accommodated by disciplinary analysts, while requiring interdisciplinary decisions to be reached by consensus. The present investigation focuses on application of the collaborative optimization architecture to the multidisciplinary design of a single-stage-to-orbit launch vehicle. Vehicle design, trajectory, and cost issues are directly modeled. Posed to suit the collaborative architecture, the design problem is characterized by 5 design variables and 16 constraints. Numerous collaborative solutions are obtained. Comparison of these solutions demonstrates the influence that an a priori ascent-abort criterion has on development cost. Similarly, objective-function selection is discussed, demonstrating the difference between minimum weight and minimum cost concepts. The operational advantages of the collaborative optimization
Reinforcements, ammunition limits, and termination of neutralization engagements in ASSESS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paulus, W.K.; Mondragon, J.
1991-01-01
This paper reports on the ASSESS Neutralization Analysis module (Neutralization), part of the Analytic System and Software for Evaluation of Safeguards and Security (ASSESS), a vulnerability assessment tool. Neutralization models a firefight engagement between security inspectors (SIs) and adversaries. The model has been improved to represent more realistically the addition of reinforcements to an engagement, the criteria for declaring an engagement terminated, and the amount of ammunition which security forces can use. SI reinforcements must prevent adversaries from achieving their purpose even if an initial security force has been overcome. The reinforcements must be timely. A variety of reinforcement timeliness cases can be modeled. Reinforcements that are not timely are shown to be ineffective in the calculated results. Engagements may terminate before all combatants on one side are neutralized if they recognize that they are losing. A winner is declared when the number of survivors on one side is reduced to a user-specified level. Realistically, the amount of ammunition that can be carried into an engagement is limited. Neutralization now permits the analyst to specify the number of rounds available to the security forces initially and the quantity of resupply that is introduced with reinforcements. These new capabilities all contribute toward more realistic modeling of neutralization engagements.
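The toy model below is not the ASSESS Neutralization code; it is a minimal Python sketch of the three features the abstract highlights: reinforcements that arrive at a specified time (and help little if they arrive too late), a finite ammunition supply with resupply on reinforcement, and termination when either side falls to a user-specified survivor threshold. All force sizes, hit probabilities, and thresholds are invented.

```python
# Illustrative engagement toy model (not ASSESS).
import random

def engagement(si, adv, si_rounds, p_hit=0.1, reinforce_at=20, reinforcements=4,
               quit_at=2, resupply=200, max_steps=200, seed=1):
    random.seed(seed)
    for step in range(max_steps):
        if step == reinforce_at:                       # timely vs late reinforcements
            si += reinforcements
            si_rounds += resupply
        shots = min(si, si_rounds)                     # SIs limited by ammunition on hand
        si_rounds -= shots
        adv -= sum(random.random() < p_hit for _ in range(shots))
        si -= sum(random.random() < p_hit for _ in range(adv))
        si, adv = max(si, 0), max(adv, 0)
        if adv <= quit_at or si <= quit_at or si_rounds == 0:
            break                                      # one side "recognizes it is losing"
    return si, adv, si_rounds, step

print(engagement(si=6, adv=8, si_rounds=120, reinforce_at=10))   # timely reinforcements
print(engagement(si=6, adv=8, si_rounds=120, reinforce_at=80))   # late reinforcements
```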
Staccini, Pascal; Joubert, Michel; Quaranta, Jean-François; Fieschi, Marius
2005-03-01
Today, the economic and regulatory environment, involving activity-based and prospective payment systems, healthcare quality and risk analysis, traceability of the acts performed and evaluation of care practices, accounts for the current interest in clinical and hospital information systems. The structured gathering of information relative to users' needs and system requirements is fundamental when installing such systems. This stage takes time and is generally misconstrued by caregivers and is of limited efficacy to analysts. We used a modelling technique designed for manufacturing processes (IDEF0/SADT). We enhanced the basic model of an activity with descriptors extracted from the Ishikawa cause-and-effect diagram (methods, men, materials, machines, and environment). We proposed an object data model of a process and its components, and programmed a web-based tool in an object-oriented environment. This tool makes it possible to extract the data dictionary of a given process from the description of its elements and to locate documents (procedures, recommendations, instructions) according to each activity or role. Aimed at structuring needs and storing information provided by directly involved teams regarding the workings of an institution (or at least part of it), the process-mapping approach has an important contribution to make in the analysis of clinical information systems.
Optical architecture of HoloLens mixed reality headset
NASA Astrophysics Data System (ADS)
Kress, Bernard C.; Cummings, William J.
2017-06-01
HoloLens by Microsoft Corp. is the world's first untethered Mixed Reality (MR) Head Mounted Display (HMD) system, released to developers in March 2016 as a Development Kit. In this paper we review the various display requirements and the optical hardware choices made for HoloLens. Its main achievements concern performance and comfort for the user: it is the first fully untethered MR headset, with the highest angular resolution and the industry's largest eyebox. It has the first inside-out global sensor fusion system, including precise head tracking and 3D mapping, all controlled by a fully custom on-board GPU. These achievements make HoloLens the most advanced MR system available today. Additional features may be implemented in next-generation MR headsets, leading to the ultimate experience for the user and helping to secure the fast-growing AR/MR market predicted by most analysts.
Zhang, Xindi; Warren, Jim; Corter, Arden; Goodyear-Smith, Felicity
2016-01-01
This paper describes development of a prototype data analytics portal for analysis of accumulated screening results from eCHAT (electronic Case-finding and Help Assessment Tool). eCHAT allows individuals to conduct a self-administered lifestyle and mental health screening assessment, with usage to date chiefly in the context of primary care waiting rooms. The intention is for wide roll-out to primary care clinics, including secondary school based clinics, resulting in the accumulation of population-level data. Data from a field trial of eCHAT with sexual health questions tailored to youth were used to support design of a data analytics portal for population-level data. The design process included user personas and scenarios, screen prototyping and a simulator for generating large-scale data sets. The prototype demonstrates the promise of wide-scale self-administered screening data to support a range of users including practice managers, clinical directors and health policy analysts.
A Survey of Functional Behavior Assessment Methods Used by Behavior Analysts in Practice
ERIC Educational Resources Information Center
Oliver, Anthony C.; Pratt, Leigh A.; Normand, Matthew P.
2015-01-01
To gather information about the functional behavior assessment (FBA) methods behavior analysts use in practice, we sent a web-based survey to 12,431 behavior analysts certified by the Behavior Analyst Certification Board. Ultimately, 724 surveys were returned, with the results suggesting that most respondents regularly use FBA methods, especially…
"This strange disease": adolescent transference and the analyst's sexual orientation.
Burton, John K; Gilmore, Karen
2010-08-01
The treatment of adolescents by gay analysts is uncharted territory regarding the impact of the analyst's sexuality on the analytic process. Since a core challenge of adolescence involves the integration of the adult sexual body, gender role, and reproductive capacities into evolving identity, and since adolescents seek objects in their environment to facilitate both identity formation and the establishment of autonomy from primary objects, the analyst's sexual orientation is arguably a potent influence on the outcome of adolescent development. However, because sexual orientation is a less visible characteristic of the analyst than gender, race, or age, for example, the line between reality and fantasy is less clearly demarcated. This brings up special considerations regarding discovery and disclosure in the treatment. To explore these issues, the case of a late adolescent girl in treatment with a gay male analyst is presented. In this treatment, the question of the analyst's sexual orientation, and the demand by the patient for the analyst's self-disclosure, became a transference nucleus around which the patient's individual dynamics and adolescent dilemmas could be explored and clarified.
Tutorial: Parallel Computing of Simulation Models for Risk Analysis.
Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D
2016-10-01
Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
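The article's examples are in MATLAB and R and are not reproduced here. As a hedged companion sketch in Python, the code below runs an embarrassingly parallel Monte Carlo risk simulation with a process pool: each replication is independent, so replications map directly onto worker processes. The toy loss model and its parameters are invented.

```python
# Minimal embarrassingly parallel Monte Carlo sketch for risk analysis.
import numpy as np
from multiprocessing import Pool

def one_replication(seed: int) -> float:
    """One independent simulation run: a toy loss model with two uncertain inputs."""
    rng = np.random.default_rng(seed)
    demand = rng.lognormal(mean=3.0, sigma=0.5)
    failure = rng.binomial(n=1, p=0.05)
    return demand * (10.0 if failure else 1.0)

if __name__ == "__main__":
    seeds = range(10_000)
    with Pool() as pool:                 # defaults to one worker per CPU core
        losses = pool.map(one_replication, seeds)
    print("mean loss:", np.mean(losses), "95th percentile:", np.percentile(losses, 95))
```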
Simulation of linear mechanical systems
NASA Technical Reports Server (NTRS)
Sirlin, S. W.
1993-01-01
A dynamics and controls analyst is typically presented with a structural dynamics model and must perform various input/output tests and design control laws. The required time/frequency simulations need to be done many times as models change and control designs evolve. This paper examines some simple ways that open and closed loop frequency and time domain simulations can be done using the special structure of the system equations usually available. Routines were developed to run under Pro-Matlab in a mixture of the Pro-Matlab interpreter and FORTRAN (using the .mex facility). These routines are often orders of magnitude faster than trying the typical 'brute force' approach of using built-in Pro-Matlab routines such as bode. This makes the analyst's job easier since not only does an individual run take less time, but much larger models can be attacked, often allowing the whole model reduction step to be eliminated.
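The paper's routines were written for Pro-Matlab and FORTRAN and are not shown here. The NumPy sketch below illustrates the general idea of exploiting the special structure of structural-dynamics models: for a model already in modal form, the single-input/single-output frequency response is a cheap sum over modes rather than a dense solve at every frequency. The modal frequencies, damping ratios, and influence coefficients are made-up example data.

```python
# Structure-exploiting frequency response via modal superposition (illustrative).
import numpy as np

def modal_frf(freqs_hz, wn, zeta, b, c):
    """Receptance from one input to one output for a modal-form model.

    wn, zeta : natural frequencies (rad/s) and damping ratios, one per mode
    b, c     : modal input and output influence coefficients, one per mode
    """
    w = 2.0 * np.pi * np.asarray(freqs_hz)[:, None]     # shape (n_freq, 1)
    denom = wn**2 - w**2 + 2j * zeta * wn * w           # shape (n_freq, n_modes)
    return (c * b / denom).sum(axis=1)                  # shape (n_freq,)

wn = 2 * np.pi * np.array([1.0, 3.2, 7.5])   # illustrative modal data
zeta = np.array([0.005, 0.01, 0.02])
b = np.array([0.8, -0.3, 0.1])
c = np.array([1.0, 0.5, -0.2])
freqs = np.linspace(0.1, 10.0, 2000)
H = modal_frf(freqs, wn, zeta, b, c)
print("peak magnitude:", np.abs(H).max())
```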
The future of climate science analysis in a coming era of exascale computing
NASA Astrophysics Data System (ADS)
Bates, S. C.; Strand, G.
2013-12-01
Projections of Community Earth System Model (CESM) output based on the growth of data archived over 2000-2012 at all of our computing sites (NCAR, NERSC, ORNL) show that we can expect to reach 1,000 PB (1 EB) sometime in the next decade or so. The current paradigms of using site-based archival systems to hold these data that are then accessed via portals or gateways, downloading the data to a local system, and then processing/analyzing the data will be irretrievably broken before then. From a climate modeling perspective, the expertise involved in making climate models themselves efficient on HPC systems will need to be applied to the data as well - providing fast parallel analysis tools co-resident in memory with the data, because the disk I/O bandwidth simply will not keep up with the expected arrival of exaflop systems. The ability of scientists, analysts, stakeholders and others to use climate model output to turn these data into understanding and knowledge will require significant advances in the current typical analysis tools and packages to enable these processes for these vast volumes of data. Allowing data users to enact their own analyses on model output is virtually a requirement as well - climate modelers cannot anticipate all the possibilities for analysis that users may want to do. In addition, the expertise of data scientists, and their knowledge of the model output and their knowledge of best practices in data management (metadata, curation, provenance and so on) will need to be rewarded and exploited to gain the most understanding possible from these volumes of data. In response to growing data size, demand, and future projections, the CESM output has undergone a structure evolution and the data management plan has been reevaluated and updated. The major evolution of the CESM data structure is presented here, along with the CESM experience and role within the CMIP3/CMIP5.
Mission planning for space based satellite surveillance experiments with the MSX
NASA Technical Reports Server (NTRS)
Sridharan, R.; Fishman, T.; Robinson, E.; Viggh, H.; Wiseman, A.
1994-01-01
The Midcourse Space Experiment is a BMDO-sponsored scientific satellite set for launch within the year. The satellite will collect phenomenology data on missile targets, plumes, earth limb backgrounds and deep space backgrounds in the LWIR, visible and ultra-violet spectral bands. It will also conduct functional demonstrations for space-based space surveillance. The Space-Based Visible sensor, built by Lincoln Laboratory, Massachusetts Institute of Technology, is the primary sensor on board the MSX for demonstration of space surveillance. The SBV Processing, Operations and Control Center (SPOCC) is the mission planning and commanding center for all space surveillance experiments using the SBV and other MSX instruments. The guiding principle in the SPOCC Mission Planning System was that all routine functions be automated; manual analyst input should be minimal. Major concepts are: (1) a high-level language, called SLED, for the user interface to the system; (2) a group of independent software processes which are generally run in a pipeline mode for experiment commanding but can be run independently for analyst assessment; and (3) an integrated experiment cost computation function that permits assessment of the feasibility of the experiment. This paper will report on the design, implementation and testing of the Mission Planning System.
High school students as seismic network analysts
NASA Astrophysics Data System (ADS)
Filatov, P.; Fedorenko, Yu.; Beketova, E.; Husebye, E.
2003-04-01
Many research organizations have large amounts of collected seismological data. Some data centers keep their data closed to outside scientists; others offer specific access interfaces that are not suitable for education. For the SeisSchool Network in Norway we have developed a universal interface for research and study. The main principles of our interface are: accessibility (it should provide data access for everybody, anywhere, via the Internet, without restrictions on hardware platform, operating system, Internet browser or connection bandwidth); informativity (it should visualize data, include examples of processing routines such as filters and envelopes, support phase picking and event location, and provide access to general seismology information); and scalability (it should provide storage for various types of seismic data and a multitude of services for many user levels). This interface (http://pcg1.ifjf.uib.no) helps analysts in basic research and, together with the information on our Web site, introduces students to the theory and practice of seismology. Based on our Web interface, a group of students won a Norwegian Young Scientists award. In this presentation we demonstrate the advantages of our interface, on-line data processing, and how to monitor our network in near real time.
Cost approach of health care entity intangible asset valuation.
Reilly, Robert F
2012-01-01
In the valuation synthesis and conclusion process, the analyst should consider the following question: Do the selected valuation approach(es) and method(s) accomplish the analyst's assignment? Also, does the selected valuation approach and method actually quantify the desired objective of the intangible asset analysis? The analyst should also consider whether the selected valuation approach and method analyzes the appropriate bundle of legal rights, and whether there were sufficient empirical data available to perform the selected valuation approach and method. The valuation synthesis should consider whether there were sufficient data available to make the analyst comfortable with the value conclusion, and whether the selected approach and method will be understandable to the intended audience. In the valuation synthesis and conclusion, the analyst should also consider which approaches and methods deserve the greatest consideration with respect to the intangible asset's remaining useful life (RUL). The intangible asset RUL is a consideration in each valuation approach. In the income approach, the RUL may affect the projection period for the intangible asset income subject to either yield capitalization or direct capitalization. In the cost approach, the RUL may affect the total amount of obsolescence, if any, from the estimated cost measure (that is, the intangible asset reproduction cost new or replacement cost new). In the market approach, the RUL may affect the selection, rejection, and/or adjustment of the comparable or guideline intangible asset sale and license transactional data. The experienced valuation analyst will use professional judgment to weight the various value indications to conclude a final intangible asset value, based on: the analyst's confidence in the quantity and quality of available data; the analyst's level of due diligence performed on that data; the relevance of the valuation method to the intangible asset's life cycle stage and degree of marketability; and the degree of variation in the range of value indications. Valuation analysts value health care intangible assets for a number of reasons. In addition to regulatory compliance, these reasons include various transaction, taxation, financing, litigation, accounting, bankruptcy, and planning purposes. The valuation analyst should consider all generally accepted intangible asset valuation approaches, methods, and procedures. Many valuation analysts are more familiar with market approach and income approach valuation methods. However, there are numerous instances when cost approach valuation methods are also applicable to health care intangible asset valuation. This discussion summarized the analyst's procedures and considerations with regard to the cost approach. The cost approach is often applicable to the valuation of intangible assets in the health care industry. However, the cost approach is only applicable if the valuation analyst (1) appropriately considers all of the cost components and (2) appropriately identifies and quantifies all obsolescence allowances. Regardless of the health care intangible asset or the reason for the valuation, the analyst should be familiar with all generally accepted valuation approaches and methods, and should have a clear, convincing, and cogent rationale for (1) accepting each approach and method applied and (2) rejecting each approach and method not applied. That way, the valuation analyst will best achieve the purpose and objective of the health care intangible asset valuation.
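The cost approach logic summarized above reduces to a simple calculation: an estimated replacement (or reproduction) cost new, less all identified obsolescence allowances. The sketch below is a minimal arithmetic illustration only; the dollar figures and the trained-workforce example are invented, not drawn from the article.

```python
# Minimal cost approach arithmetic: value = cost new - obsolescence allowances.
def cost_approach_value(replacement_cost_new: float,
                        physical_depreciation: float,
                        functional_obsolescence: float,
                        economic_obsolescence: float) -> float:
    return replacement_cost_new - (physical_depreciation
                                   + functional_obsolescence
                                   + economic_obsolescence)

# e.g., a trained-workforce intangible: cost to recruit and train a replacement
# workforce, less obsolescence for positions that would not be replaced.
value = cost_approach_value(replacement_cost_new=2_400_000,
                            physical_depreciation=0,
                            functional_obsolescence=300_000,
                            economic_obsolescence=150_000)
print(f"indicated value: ${value:,.0f}")
```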
Cross-Dataset Analysis and Visualization Driven by Expressive Web Services
NASA Astrophysics Data System (ADS)
Alexandru Dumitru, Mircea; Catalin Merticariu, Vlad
2015-04-01
The deluge of data that is hitting us every day from satellite and airborne sensors is changing the workflow of environmental data analysts and modelers. Web geo-services now play a fundamental role: data no longer need to be downloaded and stored in advance; instead, services interact in real time with GIS applications. Due to the very large amount of data that is curated and made available by web services, it is crucial to deploy smart solutions for optimizing network bandwidth, reducing duplication of data and moving the processing closer to the data. In this context we have created a visualization application for analysis and cross-comparison of aerosol optical thickness datasets. The application aims to help researchers identify and visualize discrepancies between datasets coming from various sources with different spatial and time resolutions. It also acts as a proof of concept for the integration of OGC Web Services under a user-friendly interface that provides beautiful visualizations of the explored data. The tool was built on top of the World Wind engine, a Java-based virtual globe built by NASA and the open source community. For data retrieval and processing we exploited the OGC Web Coverage Service, the most exciting aspect being its processing extension, the OGC Web Coverage Processing Service (WCPS) standard. A WCPS-compliant service allows a client to execute a processing query on any coverage offered by the server. By exploiting a full grammar, several different kinds of information can be retrieved from one or more datasets together: scalar condensers, cross-sectional profiles, comparison maps and plots, etc. This combination of technologies makes the application versatile and portable. Because the processing is done on the server side, only a minimal amount of data is transferred and the computation runs on a fully capable server, leaving the client hardware resources to be used for rendering the visualization. The application offers a set of features to visualize and cross-compare the datasets. Users can select a region of interest in space and time on which an aerosol map layer is plotted. Hovmoeller time-latitude and time-longitude profiles can be displayed by selecting orthogonal cross-sections on the globe. Statistics about the selected dataset are also displayed in different text and plot formats. The datasets can also be cross-compared using either the delta map tool or the merged map tool. For more advanced users, a WCPS query console is also offered, allowing users to process their data with ad-hoc queries and then choose how to display the results. Overall, the user has a rich set of tools for visualizing and cross-comparing the aerosol datasets. With our application we have shown how the NASA World Wind framework can be used to display results processed efficiently - and entirely - on the server side using the expressiveness of the OGC WCPS web service. The application serves not only as a proof of concept of a new paradigm for working with large geospatial data but also as a useful tool for environmental data analysts.
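To make the WCPS idea above concrete, the sketch below sends a hypothetical WCPS query over HTTP from Python. The endpoint URL, the coverage and axis names, and the exact request parameters are all assumptions for illustration (real servers differ; check the service's capabilities document). The query asks for a one-month time series of aerosol optical thickness at a point, encoded as CSV.

```python
# Hypothetical WCPS request sketch -- endpoint, coverage and axis names are placeholders.
import requests

WCPS_ENDPOINT = "https://example.org/rasdaman/ows"   # hypothetical endpoint

query = """
for c in (AerosolOpticalThickness)
return encode(
    c[Lat(45.0), Long(10.0), ansi("2015-01-01":"2015-01-31")],
    "csv")
"""

response = requests.get(WCPS_ENDPOINT, params={
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",   # WCS Processing extension request (assumed KVP layout)
    "query": query,
})
print(response.status_code, response.text[:200])
```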
ERIC Educational Resources Information Center
Arellano, Eduardo C.; Martinez, Mario C.
2009-01-01
This study compares the extent to which higher education policy analysts and master's and doctoral faculty of higher education and public affairs programs match on a set of competencies thought to be important to higher education policy analysis. Analysts matched master's faculty in three competencies while analysts and doctoral faculty matched in…
Modeling Spacecraft Fuel Slosh at Embry-Riddle Aeronautical University
NASA Technical Reports Server (NTRS)
Schlee, Keith L.
2007-01-01
As a NASA-sponsored GSRP Fellow, I worked with other researchers and analysts at Embry-Riddle Aeronautical University and NASA's ELV Division to investigate the effect of spacecraft fuel slosh. NASA's research into the effects of fuel slosh includes modeling the response in full-sized tanks using equipment such as the Spinning Slosh Test Rig (SSTR), located at Southwest Research Institute (SwRI). NASA and SwRI engineers analyze data taken from SSTR runs and hand-derive equations of motion to identify model parameters and characterize the sloshing motion. With guidance from my faculty advisor, Dr. Sathya Gangadharan, and NASA flight controls analysts James Sudermann and Charles Walker, I set out to automate this parameter identification process by building a simple physical experimental setup to model free surface slosh in a spherical tank with a simple pendulum analog. This setup was then modeled using Simulink and SimMechanics. The Simulink Parameter Estimation Tool was then used to identify the model parameters.
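The actual work used Simulink/SimMechanics and the Simulink Parameter Estimation Tool, which are not shown here. The hedged Python stand-in below fits the parameters of a damped pendulum analog (length and damping) to measured angle data with SciPy; the "measured" data are synthesized for the example and all values are invented.

```python
# Illustrative pendulum-analog parameter identification (not the Simulink workflow).
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def pendulum(t, y, length, damping):
    theta, omega = y
    return [omega, -(9.81 / length) * np.sin(theta) - damping * omega]

def simulate(params, t_eval, y0=(0.2, 0.0)):
    length, damping = params
    sol = solve_ivp(pendulum, (t_eval[0], t_eval[-1]), y0,
                    t_eval=t_eval, args=(length, damping), rtol=1e-8)
    return sol.y[0]

t = np.linspace(0.0, 10.0, 400)
true_params = (0.35, 0.15)                       # "unknown" slosh-analog parameters
measured = simulate(true_params, t) + np.random.default_rng(0).normal(0, 0.002, t.size)

fit = least_squares(lambda p: simulate(p, t) - measured, x0=[0.5, 0.05],
                    bounds=([0.05, 0.0], [2.0, 2.0]))
print("estimated length, damping:", fit.x)
```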
An agile acquisition decision-support workbench for evaluating ISR effectiveness
NASA Astrophysics Data System (ADS)
Stouch, Daniel W.; Champagne, Valerie; Mow, Christopher; Rosenberg, Brad; Serrin, Joshua
2011-06-01
The U.S. Air Force is consistently evolving to support current and future operations through the planning and execution of intelligence, surveillance and reconnaissance (ISR) missions. However, it is a challenge to maintain a precise awareness of current and emerging ISR capabilities to properly prepare for future conflicts. We present a decision-support tool for acquisition managers to empirically compare ISR capabilities and approaches to employing them, thereby enabling the DoD to acquire ISR platforms and sensors that provide the greatest return on investment. We have developed an analysis environment to perform modeling and simulation-based experiments to objectively compare alternatives. First, the analyst specifies an operational scenario for an area of operations by providing terrain and threat information; a set of nominated collections; sensor and platform capabilities; and processing, exploitation, and dissemination (PED) capacities. Next, the analyst selects and configures ISR collection strategies to generate collection plans. The analyst then defines customizable measures of effectiveness or performance to compute during the experiment. Finally, the analyst empirically compares the efficacy of each solution and generates concise reports to document the conclusions, providing traceable evidence for acquisition decisions. Our capability demonstrates the utility of using a workbench environment for analysts to design and run experiments. Crafting impartial metrics enables the acquisition manager to focus on evaluating solutions based on specific military needs. Finally, the metric and collection plan visualizations provide an intuitive understanding of the suitability of particular solutions. This facilitates a more agile acquisition strategy that handles rapidly changing technology in response to current military needs.
Animation of multi-flexible body systems and its use in control system design
NASA Technical Reports Server (NTRS)
Juengst, Carl; Stahlberg, Ron
1993-01-01
Animation can greatly assist the structural dynamicist and control system analyst with better understanding of how multi-flexible body systems behave. For multi-flexible body systems, the structural characteristics (mode frequencies, mode shapes, and damping) change, sometimes dramatically with large angles of rotation between bodies. With computer animation, the analyst can visualize these changes and how the system responds to active control forces and torques. A characterization of the type of system we wish to animate is presented. The lack of clear understanding of the above effects was a key element leading to the development of a multi-flexible body animation software package. The resulting animation software is described in some detail here, followed by its application to the control system analyst. Other applications of this software can be determined on an individual need basis. A number of software products are currently available that make the high-speed rendering of rigid body mechanical system simulation possible. However, such options are not available for use in rendering flexible body mechanical system simulations. The desire for a high-speed flexible body visualization tool led to the development of the Flexible Or Rigid Mechanical System (FORMS) software. This software was developed at the Center for Simulation and Design Optimization of Mechanical Systems at the University of Iowa. FORMS provides interactive high-speed rendering of flexible and/or rigid body mechanical system simulations, and combines geometry and motion information to produce animated output. FORMS is designed to be both portable and flexible, and supports a number of different user interfaces and graphical display devices. Additional features have been added to FORMS that allow special visualization results related to the nature of the flexible body geometric representations.
Assessing the performance of regional landslide early warning models: the EDuMaP method
NASA Astrophysics Data System (ADS)
Calvello, M.; Piciullo, L.
2016-01-01
A schematic of the components of regional early warning systems for rainfall-induced landslides is herein proposed, based on a clear distinction between warning models and warning systems. According to this framework an early warning system comprises a warning model as well as a monitoring and warning strategy, a communication strategy and an emergency plan. The paper proposes the evaluation of regional landslide warning models by means of an original approach, called the "event, duration matrix, performance" (EDuMaP) method, comprising three successive steps: identification and analysis of the events, i.e., landslide events and warning events derived from available landslides and warnings databases; definition and computation of a duration matrix, whose elements report the time associated with the occurrence of landslide events in relation to the occurrence of warning events, in their respective classes; evaluation of the early warning model performance by means of performance criteria and indicators applied to the duration matrix. During the first step the analyst identifies and classifies the landslide and warning events, according to their spatial and temporal characteristics, by means of a number of model parameters. In the second step, the analyst computes a time-based duration matrix with a number of rows and columns equal to the number of classes defined for the warning and landslide events, respectively. In the third step, the analyst computes a series of model performance indicators derived from a set of performance criteria, which need to be defined by considering, once again, the features of the warning model. The applicability, potentialities and limitations of the EDuMaP method are tested and discussed using real landslides and warning data from the municipal early warning system operating in Rio de Janeiro (Brazil).
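In the real method, the classification of landslide and warning events and the performance criteria are analyst choices made in steps 1 and 3. The simplified sketch below only illustrates step 2, accumulating a duration matrix from per-day time series of warning level and landslide event class, followed by a placeholder coverage indicator; the class definitions and data are invented.

```python
# Simplified duration-matrix accumulation in the spirit of the EDuMaP method.
import numpy as np

warning_levels = np.array([0, 0, 1, 2, 2, 3, 1, 0, 0, 2])    # per-day warning class
landslide_class = np.array([0, 0, 0, 1, 2, 2, 0, 0, 1, 0])   # per-day event class

n_warning, n_event = 4, 3
duration = np.zeros((n_warning, n_event))
for w, e in zip(warning_levels, landslide_class):
    duration[w, e] += 1.0        # one day spent in this (warning, event) pair

print(duration)

# Placeholder indicator (not one of the paper's criteria): fraction of
# landslide-event time (event classes > 0) covered by any warning (classes > 0).
event_time = duration[:, 1:].sum()
covered = duration[1:, 1:].sum()
print("fraction of event time under warning:", covered / event_time)
```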
Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2009-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
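As a small illustration of the graph analysis step described above (tracing possible paths from hazard sources to vulnerable entities and functions), the sketch below uses NetworkX on an invented component graph. The actual models in this work were extracted from Orion program documents; none of the node names below come from them.

```python
# Illustrative hazard-propagation path search over an architecture graph.
import networkx as nx

arch = nx.DiGraph()
arch.add_edges_from([
    ("battery", "power_bus"), ("power_bus", "flight_computer"),
    ("power_bus", "cabin_fan"), ("flight_computer", "thruster_control"),
    ("coolant_loop", "flight_computer"),
])

hazard_sources = ["battery", "coolant_loop"]
vulnerable = ["thruster_control"]

for src in hazard_sources:
    for tgt in vulnerable:
        for path in nx.all_simple_paths(arch, src, tgt):
            print(" -> ".join(path))   # candidate scenario for integration testing
```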
Analyst-to-Analyst Variability in Simulation-Based Prediction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glickman, Matthew R.; Romero, Vicente J.
This report describes findings from the culminating experiment of the LDRD project entitled "Analyst-to-Analyst Variability in Simulation-Based Prediction". For this experiment, volunteer participants solving a given test problem in engineering and statistics were interviewed at different points in their solution process. These interviews are used to trace differing solutions to differing solution processes, and differing processes to differences in reasoning, assumptions, and judgments. The issue that the experiment was designed to illuminate -- our paucity of understanding of the ways in which humans themselves have an impact on predictions derived from complex computational simulations -- is a challenging and open one. Although solution of the test problem by analyst participants in this experiment has taken much more time than originally anticipated, and is continuing past the end of this LDRD, this project has provided a rare opportunity to explore analyst-to-analyst variability in significant depth, from which we derive evidence-based insights to guide further explorations in this important area.
The patient who believes and the analyst who does not (1).
Lijtmaer, Ruth M
2009-01-01
A patient's religious beliefs and practices challenge the clinical experience and self-knowledge of the analyst owing to a great complexity of factors, and often take the form of the analyst's resistances and countertransference reactions to spiritual and religious issues. The analyst's feelings about the patient's encounters with religion and other forms of healing experiences may result in impasses and communication breakdown for a variety of reasons. These reasons include the analyst's own unresolved issues around her role as a psychoanalyst-which incorporates in some way psychoanalysis's views of religious belief-and these old conflicts may be irritated by the religious themes expressed by the patient. Vignettes from the treatments of two patients provide examples of the analyst's countertransference conflicts, particularly envy in the case of a therapist who is an atheist.
Realistic computer network simulation for network intrusion detection dataset generation
NASA Astrophysics Data System (ADS)
Payer, Garrett
2015-05-01
The KDD-99 Cup dataset is dead. While it can continue to be used as a toy example, the age of this dataset makes it all but useless for intrusion detection research and data mining. Many of the attacks used within the dataset are obsolete and do not reflect the features important for intrusion detection in today's networks. Creating a new dataset encompassing a large cross section of the attacks found on the Internet today could be useful, but would eventually fall to the same problem as the KDD-99 Cup; its usefulness would diminish after a period of time. To continue research into intrusion detection, the generation of new datasets needs to be as dynamic and as quick as the attacker. Simply examining existing network traffic and using domain experts such as intrusion analysts to label traffic is inefficient, expensive, and not scalable. The only viable methodology is simulation using technologies including virtualization, attack-toolsets such as Metasploit and Armitage, and sophisticated emulation of threat and user behavior. Simulating actual user behavior and network intrusion events dynamically not only allows researchers to vary scenarios quickly, but enables online testing of intrusion detection mechanisms by interacting with data as it is generated. As new threat behaviors are identified, they can be added to the simulation to make quicker determinations as to the effectiveness of existing and ongoing network intrusion technology, methodology and models.
Using the living laboratory framework as a basis for understanding next-generation analyst work
NASA Astrophysics Data System (ADS)
McNeese, Michael D.; Mancuso, Vincent; McNeese, Nathan; Endsley, Tristan; Forster, Pete
2013-05-01
The preparation of next generation analyst work requires alternative levels of understanding and new methodological departures from the way current work transpires. Current work practices typically do not provide a comprehensive approach that emphasizes the role of and interplay between (a) cognition, (b) emergent activities in a shared situated context, and (c) collaborative teamwork. In turn, effective and efficient problem solving fails to take place, and practice is often composed of piecemeal, techno-centric tools that isolate analysts by providing rigid, limited levels of understanding of situation awareness. This coupled with the fact that many analyst activities are classified produces a challenging situation for researching such phenomena and designing and evaluating systems to support analyst cognition and teamwork. Through our work with cyber, image, and intelligence analysts we have realized that there is more required of researchers to study human-centered designs to provide for analyst's needs in a timely fashion. This paper identifies and describes how The Living Laboratory Framework can be utilized as a means to develop a comprehensive, human-centric, and problem-focused approach to next generation analyst work, design, and training. We explain how the framework is utilized for specific cases in various applied settings (e.g., crisis management analysis, image analysis, and cyber analysis) to demonstrate its value and power in addressing an area of utmost importance to our national security. Attributes of analyst work settings are delineated to suggest potential design affordances that could help improve cognitive activities and awareness. Finally, the paper puts forth a research agenda for the use of the framework for future work that will move the analyst profession in a viable manner to address the concerns identified.
NASA Astrophysics Data System (ADS)
Knox, S.; Meier, P.; Mohammed, K.; Korteling, B.; Matrosov, E. S.; Hurford, A.; Huskova, I.; Harou, J. J.; Rosenberg, D. E.; Thilmant, A.; Medellin-Azuara, J.; Wicks, J.
2015-12-01
Capacity expansion on resource networks is essential to adapting to economic and population growth and pressures such as climate change. Engineered infrastructure systems such as water, energy, or transport networks require sophisticated and bespoke models to refine management and investment strategies. Successful modeling of such complex systems relies on good data management and advanced methods to visualize and share data. Engineered infrastructure systems are often represented as networks of nodes and links with operating rules describing their interactions. Infrastructure system management and planning can be abstracted to simulating or optimizing new operations and extensions of the network. By separating the data storage of abstract networks from manipulation and modeling, we have created a system in which infrastructure modeling across various domains is facilitated. We introduce Hydra Platform, a Free Open Source Software designed for analysts and modelers to store, manage and share network topology and data. Hydra Platform is a Python library with a web service layer for remote applications, called Apps, to connect. Apps serve various functions including network or results visualization, data export (e.g. into a proprietary format) or model execution. This client-server architecture allows users to manipulate and share centrally stored data. XML templates allow a standardised description of the data structure required for storing network data such that it is compatible with specific models. Hydra Platform represents networks in an abstract way and is therefore not bound to a single modeling domain. It is the Apps that create domain-specific functionality. Using Apps, researchers from different domains can incorporate different models within the same network, enabling cross-disciplinary modeling while minimizing errors and streamlining data sharing. Separating the Python library from the web layer allows developers to natively expand the software or build web-based apps in other languages for remote functionality. Partner CH2M is developing a commercial user interface for Hydra Platform; however, custom interfaces and visualization tools can be built. Hydra Platform is available on GitHub, while Apps will be shared on a central repository.
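To ground the node/link abstraction described above, the sketch below shows one minimal way such a network could be represented in Python. This is not Hydra Platform's actual schema or API (the abstract does not detail them); it only illustrates separating generic topology and attribute data from any domain-specific model, with all names and attribute values invented.

```python
# Generic node/link network representation (illustrative, not Hydra Platform's schema).
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    attributes: dict = field(default_factory=dict)   # e.g. {"storage_Mm3": 120}

@dataclass
class Link:
    source: str
    target: str
    attributes: dict = field(default_factory=dict)   # e.g. {"capacity_m3s": 40}

@dataclass
class Network:
    name: str
    nodes: list = field(default_factory=list)
    links: list = field(default_factory=list)

water = Network(
    name="toy water system",
    nodes=[Node("reservoir", {"storage_Mm3": 120}), Node("city", {"demand_Mm3": 3.5})],
    links=[Link("reservoir", "city", {"capacity_m3s": 40})],
)
# A domain-specific App would read this abstract network, attach its own template of
# required attributes, run a simulator or optimizer, and write results back.
print(water)
```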
Using CASE to Exploit Process Modeling in Technology Transfer
NASA Technical Reports Server (NTRS)
Renz-Olar, Cheryl
2003-01-01
A successful business will be one that has processes in place to run that business. Creating processes, reengineering processes, and continually improving processes can be accomplished through extensive modeling. Casewise(R) Corporate Modeler(TM) CASE is a computer-aided software engineering tool that will enable the Technology Transfer Department (TT) at NASA Marshall Space Flight Center (MSFC) to capture these abilities. After successful implementation of CASE, it could then go on to be applied in other departments at MSFC and other centers at NASA. The success of a business process is dependent upon the players working as a team and continuously improving the process. A good process fosters customer satisfaction as well as internal satisfaction in the organizational infrastructure. CASE provides a method for business process success through functions consisting of systems and process business models; specialized diagrams; matrix management; simulation; report generation and publishing; and linking, importing, and exporting documents and files. The software has an underlying repository or database to support these functions. The Casewise manual informs us that dynamics modeling is a technique used in business design and analysis. Feedback is used as a tool for the end users and generates different ways of dealing with the process. Feedback on this project resulted from the collection of issues through a systems analyst approach of interviews with process coordinators and Technical Points of Contact (TPOCs).
A meteorologically driven maize stress indicator model
NASA Technical Reports Server (NTRS)
Taylor, T. W.; Ravet, F. W. (Principal Investigator)
1981-01-01
A maize soil moisture and temperature stress model is described which was developed to serve as a meteorological data filter to alert commodity analysts to potential stress conditions in the major maize-producing areas of the world. The model also identifies optimum climatic conditions and planting/harvest problems associated with poor tractability.
Multilevel Analysis of Structural Equation Models via the EM Algorithm.
ERIC Educational Resources Information Center
Jo, See-Heyon
The question of how to analyze unbalanced hierarchical data generated from structural equation models has been a common problem for researchers and analysts. Among difficulties plaguing statistical modeling are estimation bias due to measurement error and the estimation of the effects of the individual's hierarchical social milieu. This paper…
NASA Astrophysics Data System (ADS)
Sadler, Laurel
2017-05-01
In today's battlefield environments, analysts are inundated with real-time data received from the tactical edge that must be evaluated and used for managing and modifying current missions as well as planning for future missions. This paper describes a framework that facilitates a Value of Information (VoI) based data analytics tool for information object (IO) analysis in a tactical and command and control (C2) environment, which reduces analyst workload by providing automated or analyst-assisted applications. It allows the analyst to adjust parameters for matching the IOs that will be received and provides agents for further filtering or fusing of the incoming data. It allows the analyst to enhance and mark up incoming IOs and/or attach comments to them; the IOs can then be re-disseminated using the VoI-based dissemination service. The analyst may also adjust the underlying parameters before re-dissemination of an IO, which adjusts the value of the IO based on the new or additional information that has been added, possibly increasing its value relative to the original. The framework is flexible and extendable, providing an easy-to-use, dynamically changing command and control decision aid that focuses and enhances the analyst workflow.
The analyst: his professional novel.
Ambrosiano, Laura
2005-12-01
The psychoanalyst needs to be in touch with a community of colleagues; he needs to feel part of a group with which he can share cognitive tension and therapeutic knowledge. Yet group ties are an aspect we analysts seldom discuss. The author defines the analyst's 'professional novel' as the emotional vicissitudes with the group that have marked the professional itinerary of every analyst; his relationship with institutions and with theories, and the emotional nuance of these relationships. The analyst's professional novel is the narrative elaboration of his professional autobiography. It is capable of transforming the individual's need to belong and the paths of identification and de-identification. Experience of the oedipal configuration allows the analyst to begin psychic work aimed at gaining spaces of separateness in his relationship with the group. This passage is marked by the work on mourning that separation involves, but also of mourning implicit in the awareness of the representative limits of our theories. Right from the start of analysis, the patient observes the emotional nuance of the analyst's connection to his group and theories; the patient notices how much this connection is governed by rigid needs to belong, and how much freedom of thought and exploration it allows the analyst. The author uses clinical examples to illustrate these hypotheses.
GLO-STIX: Graph-Level Operations for Specifying Techniques and Interactive eXploration
Stolper, Charles D.; Kahng, Minsuk; Lin, Zhiyuan; Foerster, Florian; Goel, Aakash; Stasko, John; Chau, Duen Horng
2015-01-01
The field of graph visualization has produced a wealth of visualization techniques for accomplishing a variety of analysis tasks. Therefore analysts often rely on a suite of different techniques, and visual graph analysis application builders strive to provide this breadth of techniques. To provide a holistic model for specifying network visualization techniques (as opposed to considering each technique in isolation) we present the Graph-Level Operations (GLO) model. We describe a method for identifying GLOs and apply it to identify five classes of GLOs, which can be flexibly combined to re-create six canonical graph visualization techniques. We discuss advantages of the GLO model, including potentially discovering new, effective network visualization techniques and easing the engineering challenges of building multi-technique graph visualization applications. Finally, we implement the GLOs that we identified into the GLO-STIX prototype system that enables an analyst to interactively explore a graph by applying GLOs. PMID:26005315
A Mobile Computing Solution for Collecting Functional Analysis Data on a Pocket PC
Jackson, James; Dixon, Mark R
2007-01-01
The present paper provides a task analysis for creating a computerized data system using a Pocket PC and Microsoft Visual Basic. With Visual Basic software and any handheld device running the Windows Mobile operating system, this task analysis will allow behavior analysts to program and customize their own functional analysis data-collection system. The program will allow the user to select the type of behavior to be recorded, choose between interval and frequency data collection, and summarize data for graphing and analysis. We also provide suggestions for customizing the data-collection system for idiosyncratic research and clinical needs. PMID:17624078
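For readers unfamiliar with the two recording modes mentioned in this abstract, the following minimal sketch illustrates the kind of frequency versus partial-interval bookkeeping such a data-collection program performs. It is written in Python rather than the authors' Visual Basic, and the class and field names are hypothetical.

```python
# Minimal sketch (not the authors' Pocket PC program) of the two recording modes
# described in the abstract: frequency counts and partial-interval recording.
import time
from dataclasses import dataclass, field

@dataclass
class FASession:
    behavior: str                     # target behavior chosen by the analyst
    mode: str = "frequency"           # "frequency" or "interval"
    interval_s: int = 10              # interval length for interval recording
    start: float = field(default_factory=time.time)
    events: list = field(default_factory=list)   # offsets (s) from session start

    def record(self):
        """Log one occurrence of the target behavior."""
        self.events.append(time.time() - self.start)

    def summarize(self, session_length_s: int) -> dict:
        """Summarize for graphing: total count, or percent of intervals with an event."""
        if self.mode == "frequency":
            return {"behavior": self.behavior, "count": len(self.events)}
        n = max(session_length_s // self.interval_s, 1)
        hit = [any(i * self.interval_s <= t < (i + 1) * self.interval_s for t in self.events)
               for i in range(n)]
        return {"behavior": self.behavior, "percent_intervals": 100.0 * sum(hit) / n}
```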
Monitoring Object Library Usage and Changes
NASA Technical Reports Server (NTRS)
Owen, R. K.; Craw, James M. (Technical Monitor)
1995-01-01
The NASA Ames Numerical Aerodynamic Simulation program Aeronautics Consolidated Supercomputing Facility (NAS/ACSF) supercomputing center serves over 1600 users and has numerous analysts with root access. Several tools have been developed to monitor object library usage and changes. Some of the tools do "noninvasive" monitoring and other tools implement run-time logging even for object-only libraries. The run-time logging identifies who, when, and what is being used. The benefits are that real usage can be measured, unused libraries can be discontinued, and training and optimization efforts can be focused on those numerical methods that are actually used. An overview of the tools will be given and the results will be discussed.
COPPERHEAD Operational Performance Evaluation (COPE): Computer Program User and Analyst Manual.
1981-03-01
The CommonGround Visual Paradigm for Biosurveillance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Livnat, Yarden; Jurrus, Elizabeth R.; Gundlapalli, Adi V.
2013-06-14
Biosurveillance is a critical area in the intelligence community for real-time detection of disease outbreaks. Identifying epidemics enables analysts to detect and monitor disease outbreaks that might be spread from natural causes or from possible biological warfare attacks. Containing these events and disseminating alerts requires the ability to rapidly find, classify and track harmful biological signatures. In this paper, we describe a novel visual paradigm to conduct biosurveillance using an Infectious Disease Weather Map. Our system provides a visual common ground in which users can view, explore and discover emerging concepts and correlations such as symptoms, syndromes, pathogens, and geographic locations.
Godsil, Geraldine
2018-02-01
This paper discusses the residues of a somatic countertransference that revealed its meaning several years after apparently successful analytic work had ended. Psychoanalytic and Jungian analytic ideas on primitive communication, dissociation and enactment are explored in the working through of a shared respiratory symptom between patient and analyst. Growth in the analyst was necessary so that the patient's communication at a somatic level could be understood. Bleger's concept that both the patient's and analyst's body are part of the setting was central in the working through. © 2018, The Society of Analytical Psychology.
The VRFurnace: A Virtual Reality Application for Energy System Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Peter Eric
2001-01-01
The VRFurnace is a unique VR application designed to analyze a complete coal-combustion CFD model of a power plant furnace. Although other applications have been created that analyze furnace performance, no other has included the added complications of particle tracking and the reactions associated with coal combustion. Currently the VRFurnace is a versatile analysis tool. Data translators have been written to allow data from most of the major commercial CFD software packages, as well as standard data formats of hand-written code, to be uploaded into the VR application. Because of this, almost any type of CFD model of any power plant component can be analyzed immediately. The ease of use of the VRFurnace is another of its qualities. The menu system created for the application not only guides first-time users through the various button combinations but also helps the experienced user keep track of which tool is being used. Because the VRFurnace was designed for use in the C6 device at Iowa State University's Virtual Reality Applications Center, it is naturally a collaborative project. The projection-based system allows many people to be involved in the analysis process. This type of environment opens the design process not only to CFD analysts but also to management teams and plant operators, making it easier for engineers to explain design changes. The 3D visualization allows power plant components to be studied in the context of their natural physical environments, giving engineers a chance to use their innate pattern recognition and intuitive skills to bring to light key relationships that may have previously gone unrecognized. More specifically, the tools that have been developed make better use of the third dimension that the synthetic environment provides. Whereas the plane tools make it easier to track down interesting features of a given flow field, the box tools allow the user to focus on these features and reduce the data load on the computer.
Using MetaboAnalyst 3.0 for Comprehensive Metabolomics Data Analysis.
Xia, Jianguo; Wishart, David S
2016-09-07
MetaboAnalyst (http://www.metaboanalyst.ca) is a comprehensive Web application for metabolomic data analysis and interpretation. MetaboAnalyst handles most of the common metabolomic data types from most kinds of metabolomics platforms (MS and NMR) for most kinds of metabolomics experiments (targeted, untargeted, quantitative). In addition to providing a variety of data processing and normalization procedures, MetaboAnalyst also supports a number of data analysis and data visualization tasks using a range of univariate and multivariate methods such as PCA (principal component analysis), PLS-DA (partial least squares discriminant analysis), heatmap clustering and machine learning methods. MetaboAnalyst also offers a variety of tools for metabolomic data interpretation including MSEA (metabolite set enrichment analysis), MetPA (metabolite pathway analysis), and biomarker selection via ROC (receiver operating characteristic) curve analysis, as well as time series and power analysis. This unit provides an overview of the main functional modules and the general workflow of the latest version of MetaboAnalyst (MetaboAnalyst 3.0), followed by eight detailed protocols. © 2016 John Wiley & Sons, Inc.
Recommendation Systems for Geoscience Data Portals Built by Analyzing Usage Patterns
NASA Astrophysics Data System (ADS)
Crosby, C.; Nandigam, V.; Baru, C.
2009-04-01
Since its launch five years ago, the National Science Foundation-funded GEON Project (www.geongrid.org) has been providing access to a variety of geoscience data sets such as geologic maps and other geographic information system (GIS)-oriented data, paleontologic databases, gravity and magnetics data and LiDAR topography via its online portal interface. In addition to data, the GEON Portal also provides web-based tools and other resources that enable users to process and interact with data. Examples of these tools include functions to dynamically map and integrate GIS data, compute synthetic seismograms, and to produce custom digital elevation models (DEMs) with user defined parameters such as resolution. The GEON Portal, built on the Gridsphere portal framework, allows us to capture user interaction with the system. In addition to the site access statistics captured by tools like Google Analytics, which capture hits per unit time, search key words, operating systems, browsers, and referring sites, we also record additional statistics such as which data sets are being downloaded and in what formats, processing parameters, and navigation pathways through the portal. With over four years of data now available from the GEON Portal, this record of usage is a rich resource for exploring how earth scientists discover and utilize online data sets. Furthermore, we propose that this data could ultimately be harnessed to optimize the way users interact with the data portal, design intelligent processing and data management systems, and to make recommendations on algorithm settings and other available relevant data. The paradigm of integrating popular and commonly used patterns to make recommendations to a user is well established in the world of e-commerce where users receive suggestions on books, music and other products that they may find interesting based on their website browsing and purchasing history, as well as the patterns of fellow users who have made similar selections. However, this paradigm has not yet been explored for geoscience data portals. In this presentation we provide an initial analysis of user interaction and access statistics for the GEON OpenTopography LiDAR data distribution and processing system to illustrate what they reveal about users' spatial and temporal data access patterns, data processing parameter selections, and pathways through the data portal. We also demonstrate what these usage statistics can illustrate about aspects of the data sets that are of greatest interest. Finally, we explore how these usage statistics could be used to improve the user's experience in the data portal and to optimize how data access interfaces and tools are designed and implemented.
NASA Astrophysics Data System (ADS)
Kamber, Balz S.; Chew, David M.; Petrus, Joseph A.
2014-05-01
Compared to non-destructive geochemical analyses, LA-ICP-MS consumes ca. 0.1 μm of material per ablation pulse. It is therefore to be expected that the combined analyses of ca. 200 pulses will encounter geochemical and isotopic complexities in all but the most perfect minerals. Experienced LA-ICP-MS analysts spot down-hole complexities and choose signal integration areas accordingly. In U-Pb geochronology, the task of signal integration choice is complex as the analyst wants to avoid areas of common Pb and Pb-loss and separate true (concordant) age complexity. Petrus and Kamber (2012) developed VizualAge as a tool for reducing and visualising, in real time, U-Pb geochronology data obtained by LA-ICP-MS as an add-on for the freely available U-Pb geochronology data reduction scheme of Paton et al. (2010) in Iolite. The most important feature of VizualAge is its ability to display a live concordia diagram, allowing users to inspect the data of a signal on a concordia diagram as the integration area is being adjusted, thus providing immediate visual feedback regarding discordance, uncertainty, and common lead for different regions of the signal. It can also be used to construct histograms and probability distributions, standard and Tera-Wasserburg style concordia diagrams, as well as 3D U-Th-Pb and total U-Pb concordia diagrams. More recently, Chew et al. (2014) presented a new data reduction scheme (VizualAge_UcomPbine) with much improved common Pb correction functionality. Common Pb is a problem for many U-bearing accessory minerals and an under-appreciated difficulty is the potential presence of (possibly unevenly distributed) common Pb in calibration standards, introducing systematic inaccuracy into entire datasets. One key feature of the new method is that it can correct for variable amounts of common Pb in any U-Pb accessory mineral standard as long as the standard is concordant in the U/Pb (and Th/Pb) systems after common Pb correction. Common Pb correction can be undertaken using either the 204Pb, 207Pb or 208Pb(no Th) methods. After common Pb correction to the user-selected age standard integrations, the scheme fits session-wide model U-Pb fractionation curves to the time-resolved U-Pb standard data. This down-hole fractionation model is next applied to the unknowns and sample-standard bracketing (using a user-specified interpolation method) is used to calculate final isotopic ratios and ages. 204Pb- and 208Pb(no Th)-corrected concordia diagrams and 204Pb-, 207Pb- and 208Pb(no Th)-corrected age channels can be calculated for user-specified initial Pb ratio(s). All other conventional common Pb correction methods (e.g. intercept or isochron methods on co-genetic analyses) can be performed offline. Apatite, titanite, rutile and very young zircon data will be presented, obtained using a Thermo Scientific iCAP-Qc (Q-ICP-MS) coupled to a Photon Machines Analyte Excite 193 nm ArF Excimer laser with a novel signal smoothing device. References: Chew, D.M., Petrus, J.A. and Kamber, B.S. (2014), Chemical Geology, 363, 185-199; Paton, C., Woodhead, J.D., Hellstrom, J.C., Hergt, J.M., Greig, A. and Maas, R. (2010), Geochemistry Geophysics Geosystems, 11, 1-36; Petrus, J.A. and Kamber, B.S. (2012), Geostandards and Geoanalytical Research, 36, 247-270.
NASA Astrophysics Data System (ADS)
Ferdous, Nazneen; Bhat, Chandra R.
2013-01-01
This paper proposes and estimates a spatial panel ordered-response probit model with temporal autoregressive error terms to analyze changes in urban land development intensity levels over time. Such a model structure maintains a close linkage between the land owner's decision (unobserved to the analyst) and the land development intensity level (observed by the analyst) and accommodates spatial interactions between land owners that lead to spatial spillover effects. In addition, the model structure incorporates spatial heterogeneity as well as spatial heteroscedasticity. The resulting model is estimated using a composite marginal likelihood (CML) approach that does not require any simulation machinery and that can be applied to data sets of any size. A simulation exercise indicates that the CML approach recovers the model parameters very well, even in the presence of high spatial and temporal dependence. In addition, the simulation results demonstrate that ignoring spatial dependency and spatial heterogeneity when both are actually present will lead to bias in parameter estimation. A demonstration exercise applies the proposed model to examine urban land development intensity levels using parcel-level data from Austin, Texas.
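As a point of reference, one common way to write down this kind of model is sketched below; the notation is generic, and the paper's exact specification of the weights, error structure, and thresholds may differ.

```latex
% Sketch of a spatial panel ordered-response probit with temporally
% autocorrelated errors (generic form; the paper's specification may differ).
\begin{align}
  y^{*}_{it} &= \delta \sum_{j \neq i} w_{ij}\, y^{*}_{jt} + \mathbf{x}_{it}'\boldsymbol{\beta} + \varepsilon_{it},
  & \varepsilon_{it} &= \rho\,\varepsilon_{i,t-1} + \eta_{it}, \quad \eta_{it} \sim N(0,\sigma_i^2), \\
  y_{it} &= k \quad \text{if } \psi_{k-1} < y^{*}_{it} \le \psi_{k},
  & k &= 1,\dots,K,
\end{align}
% where y*_{it} is the land owner's latent development propensity (unobserved to
% the analyst), y_{it} the observed intensity level, w_{ij} spatial weights,
% delta the spatial spillover parameter, rho the temporal autoregressive
% parameter, and the sigma_i^2 allow for spatial heteroscedasticity.
```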
Intermediate and advanced topics in multilevel logistic regression analysis
Merlo, Juan
2017-01-01
Multilevel data occur frequently in health services, population and public health, and epidemiologic research. In such research, binary outcomes are common. Multilevel logistic regression models allow one to account for the clustering of subjects within clusters of higher‐level units when estimating the effect of subject and cluster characteristics on subject outcomes. A search of the PubMed database demonstrated that the use of multilevel or hierarchical regression models is increasing rapidly. However, our impression is that many analysts simply use multilevel regression models to account for the nuisance of within‐cluster homogeneity that is induced by clustering. In this article, we describe a suite of analyses that can complement the fitting of multilevel logistic regression models. These ancillary analyses permit analysts to estimate the marginal or population‐average effect of covariates measured at the subject and cluster level, in contrast to the within‐cluster or cluster‐specific effects arising from the original multilevel logistic regression model. We describe the interval odds ratio and the proportion of opposed odds ratios, which are summary measures of effect for cluster‐level covariates. We describe the variance partition coefficient and the median odds ratio which are measures of components of variance and heterogeneity in outcomes. These measures allow one to quantify the magnitude of the general contextual effect. We describe an R 2 measure that allows analysts to quantify the proportion of variation explained by different multilevel logistic regression models. We illustrate the application and interpretation of these measures by analyzing mortality in patients hospitalized with a diagnosis of acute myocardial infarction. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:28543517
Assessing the performance of regional landslide early warning models: the EDuMaP method
NASA Astrophysics Data System (ADS)
Calvello, M.; Piciullo, L.
2015-10-01
The paper proposes the evaluation of the technical performance of a regional landslide early warning system by means of an original approach, called EDuMaP method, comprising three successive steps: identification and analysis of the Events (E), i.e. landslide events and warning events derived from available landslides and warnings databases; definition and computation of a Duration Matrix (DuMa), whose elements report the time associated with the occurrence of landslide events in relation to the occurrence of warning events, in their respective classes; evaluation of the early warning model Performance (P) by means of performance criteria and indicators applied to the duration matrix. During the first step, the analyst takes into account the features of the warning model by means of ten input parameters, which are used to identify and classify landslide and warning events according to their spatial and temporal characteristics. In the second step, the analyst computes a time-based duration matrix having a number of rows and columns equal to the number of classes defined for the warning and landslide events, respectively. In the third step, the analyst computes a series of model performance indicators derived from a set of performance criteria, which need to be defined by considering, once again, the features of the warning model. The proposed method is based on a framework clearly distinguishing between local and regional landslide early warning systems as well as among correlation laws, warning models and warning systems. The applicability, potentialities and limitations of the EDuMaP method are tested and discussed using real landslides and warnings data from the municipal early warning system operating in Rio de Janeiro (Brazil).
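To make the second step concrete, the sketch below shows one generic way to accumulate such a duration matrix and a simple coverage-style indicator. It is an illustration under assumed inputs, not the EDuMaP implementation, whose parameters and performance criteria are defined in the paper.

```python
# Generic illustration (not the EDuMaP code) of step 2: accumulate a duration
# matrix whose rows are warning classes and whose columns are landslide classes.
import numpy as np

def duration_matrix(records, n_warning_classes, n_landslide_classes):
    """records: iterable of (warning_class, landslide_class, duration_hours),
    each giving the time spent with a warning of class j in effect while
    landslide activity of class k was occurring (class 0 = none)."""
    D = np.zeros((n_warning_classes, n_landslide_classes))
    for j, k, hours in records:
        D[j, k] += hours
    return D

# Example: 3 warning classes (none, moderate, high) x 2 landslide classes (none, occurred)
D = duration_matrix([(0, 0, 700.0), (1, 0, 40.0), (1, 1, 6.0), (2, 1, 2.0), (2, 0, 4.0)], 3, 2)
# One simple indicator: fraction of landslide time covered by a non-trivial warning
coverage = D[1:, 1].sum() / D[:, 1].sum()
```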
Intermediate and advanced topics in multilevel logistic regression analysis.
Austin, Peter C; Merlo, Juan
2017-09-10
Multilevel data occur frequently in health services, population and public health, and epidemiologic research. In such research, binary outcomes are common. Multilevel logistic regression models allow one to account for the clustering of subjects within clusters of higher-level units when estimating the effect of subject and cluster characteristics on subject outcomes. A search of the PubMed database demonstrated that the use of multilevel or hierarchical regression models is increasing rapidly. However, our impression is that many analysts simply use multilevel regression models to account for the nuisance of within-cluster homogeneity that is induced by clustering. In this article, we describe a suite of analyses that can complement the fitting of multilevel logistic regression models. These ancillary analyses permit analysts to estimate the marginal or population-average effect of covariates measured at the subject and cluster level, in contrast to the within-cluster or cluster-specific effects arising from the original multilevel logistic regression model. We describe the interval odds ratio and the proportion of opposed odds ratios, which are summary measures of effect for cluster-level covariates. We describe the variance partition coefficient and the median odds ratio, which are measures of components of variance and heterogeneity in outcomes. These measures allow one to quantify the magnitude of the general contextual effect. We describe an R² measure that allows analysts to quantify the proportion of variation explained by different multilevel logistic regression models. We illustrate the application and interpretation of these measures by analyzing mortality in patients hospitalized with a diagnosis of acute myocardial infarction. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
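For reference, two of the measures named in this abstract have standard closed forms for a random-intercept logistic model with cluster-level variance sigma-squared; they are shown below on the latent-variable scale, using the usual logistic residual variance of pi^2/3.

```latex
% Standard forms of the variance partition coefficient (VPC) and the
% median odds ratio (MOR) for a random-intercept multilevel logistic model.
\begin{align}
  \mathrm{VPC} &= \frac{\sigma^2_u}{\sigma^2_u + \pi^2/3},
  & \mathrm{MOR} &= \exp\!\left(\sqrt{2\sigma^2_u}\;\Phi^{-1}(0.75)\right) \approx \exp(0.954\,\sigma_u),
\end{align}
% where sigma^2_u is the between-cluster (random-intercept) variance, pi^2/3 is
% the variance of the standard logistic distribution (the level-1 residual on the
% latent scale), and Phi^{-1} is the standard normal quantile function.
```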
Reproducibility of apatite fission-track length data and thermal history reconstruction
NASA Astrophysics Data System (ADS)
Ketcham, Richard A.; Donelick, Raymond A.; Balestrieri, Maria Laura; Zattin, Massimiliano
2009-07-01
The ability to derive detailed thermal history information from apatite fission-track analysis is predicated on the reliability of track length measurements. However, insufficient attention has been given to whether and how these measurements should be standardized. In conjunction with a fission-track workshop we conducted an experiment in which 11 volunteers measured ~50 track lengths on one or two samples. One mount contained Durango apatite with unannealed induced tracks, and one contained apatite from a crystalline rock containing spontaneous tracks with a broad length distribution caused by partial resetting. Results for both mounts showed scatter indicative of differences in measurement technique among the individual analysts. The effects of this variability on thermal history inversion were tested using the HeFTy computer program to model the spontaneous track measurements. A cooling-only scenario and a reheating scenario more consistent with the sample's geological history were posed. When a uniform initial length value from the literature was used, results among analysts were very inconsistent in both scenarios, although normalizing for track angle by projecting all lengths to a c-axis-parallel crystallographic orientation improved some aspects of congruency. When the induced track measurement was used as the basis for thermal history inversion, congruency among analysts and agreement with inversions based on previously collected data improved significantly. Further improvement was obtained by using c-axis projection. Differences among inversions that persisted could be traced to differential sampling of long- and short-track populations among analysts. The results of this study, while demonstrating the robustness of apatite fission-track thermal history inversion, nevertheless point to the necessity for a standardized length calibration schema that accounts for analyst variation.
The Pope's confessor: a metaphor relating to illness in the analyst.
Clark, R W
1995-01-01
This paper examines some of the internal and external eventualities in the situation of illness in the analyst. The current emphasis on the use of the self as part of the analyzing instrument makes impairments in the analyst's physical well-being potentially disabling to the analytic work. A recommendation is made for analysts, both individually and as a professional group, to always consider this aspect of a personal medical problem.
Desire and the female analyst.
Schaverien, J
1996-04-01
The literature on erotic transference and countertransference between female analyst and male patient is reviewed and discussed. It is known that female analysts are less likely than their male colleagues to act out sexually with their patients. It has been claimed that a) male patients do not experience sustained erotic transferences, and b) female analysts do not experience erotic countertransferences with female or male patients. These views are challenged and it is argued that, if there is less sexual acting out by female analysts, it is not because of an absence of eros in the therapeutic relationship. The literature review covers material drawn from psychoanalysis, feminist psychotherapy, Jungian analysis, as well as some sociological and cultural sources. It is organized under the following headings: the gender of the analyst, sexual acting out, erotic transference, maternal and paternal transference, gender and power, countertransference, incest taboo--mothers and sons and sexual themes in the transference.
49 CFR 1245.5 - Classification of job titles.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...
49 CFR 1245.5 - Classification of job titles.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...
49 CFR 1245.5 - Classification of job titles.
Code of Federal Regulations, 2011 CFR
2011-10-01
..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...
49 CFR 1245.5 - Classification of job titles.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...
49 CFR 1245.5 - Classification of job titles.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...
NHDPlus is a geospatial, hydrologic framework dataset that is intended for use by geospatial analysts and modelers to support water resources related applications. NHDPlus was developed by the USEPA in partnership with the US Geologic Survey
NASA Astrophysics Data System (ADS)
Brekke, L. D.; Pruitt, T.; Maurer, E. P.; Duffy, P. B.
2007-12-01
Incorporating climate change information into long-term evaluations of water and energy resources requires analysts to have access to climate projection data that have been spatially downscaled to "basin-relevant" resolution. This is necessary in order to develop system-specific hydrology and demand scenarios consistent with projected climate scenarios. Analysts currently have access to "climate model" resolution data (e.g., at LLNL PCMDI), but not spatially downscaled translations of these datasets. Motivated by a common interest in supporting regional and local assessments, the U.S. Bureau of Reclamation and LLNL (through support from the DOE National Energy Technology Laboratory) have teamed to develop an archive of downscaled climate projections (temperature and precipitation) with geographic coverage consistent with the North American Land Data Assimilation System domain, encompassing the contiguous United States. A web-based information service, hosted at LLNL Green Data Oasis, has been developed to provide Reclamation, LLNL, and other interested analysts free access to archive content. A contemporary statistical method was used to bias-correct and spatially disaggregate projection datasets, and was applied to 112 projections included in the WCRP CMIP3 multi-model dataset hosted by LLNL PCMDI (i.e. 16 GCMs and their multiple simulations of SRES A2, A1b, and B1 emissions pathways).
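The archive's bias-correction and spatial-disaggregation procedure is not reproduced here, but a quantile-mapping-style bias correction is a common component of such methods. The sketch below illustrates that idea only, with synthetic data and hypothetical function names; it is not the project's actual code and omits the spatial-disaggregation step entirely.

```python
# Minimal empirical quantile-mapping sketch of a bias-correction step
# (illustrative only; the archive's full procedure also spatially disaggregates).
import numpy as np

def quantile_map(model_hist, obs_hist, model_future, n_q=100):
    """Map each future model value through the quantile-quantile relationship
    between historical model output and observations at the same location."""
    q = np.linspace(0.005, 0.995, n_q)
    mod_q = np.quantile(model_hist, q)   # historical model quantiles
    obs_q = np.quantile(obs_hist, q)     # observed quantiles
    # Find each future value's quantile in the historical model climate,
    # then read off the observed value at that quantile.
    ranks = np.interp(model_future, mod_q, q)
    return np.interp(ranks, q, obs_q)

# Example with synthetic monthly temperatures (degrees C)
rng = np.random.default_rng(0)
corrected = quantile_map(rng.normal(14, 3, 600), rng.normal(15, 2.5, 600),
                         rng.normal(16, 3, 600))
```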
The ASAC Air Carrier Investment Model (Second Generation)
NASA Technical Reports Server (NTRS)
Wingrove, Earl R., III; Johnson, Jesse P.; Sickles, Robin C.; Good, David H.
1997-01-01
To meet its objective of assisting the U.S. aviation industry with the technological challenges of the future, NASA must identify research areas that have the greatest potential for improving the operation of the air transportation system. To accomplish this, NASA is building an Aviation System Analysis Capability (ASAC). The ASAC differs from previous NASA modeling efforts in that the economic behavior of buyers and sellers in the air transportation and aviation industries is central to its conception. To link the economics of flight with the technology of flight, ASAC requires a parametrically based model with extensions that link airline operations and investments in aircraft with aircraft characteristics. This model also must provide a mechanism for incorporating air travel demand and profitability factors into the airlines' investment decisions. Finally, the model must be flexible and capable of being incorporated into a wide-ranging suite of economic and technical models that are envisioned for ASAC. We describe a second-generation Air Carrier Investment Model that meets these requirements. The enhanced model incorporates econometric results from the supply and demand curves faced by U.S.-scheduled passenger air carriers. It uses detailed information about their fleets in 1995 to make predictions about future aircraft purchases. It provides analysts with the ability to project revenue passenger-miles flown, airline industry employment, airline operating profit margins, numbers and types of aircraft in the fleet, and changes in aircraft manufacturing employment under various user-defined scenarios.
Self-analysis and the development of an interpretation.
Campbell, Donald
2017-10-01
In spite of the fact that Freud's self-analysis was at the centre of so many of his discoveries, self-analysis remains a complex, controversial and elusive exercise. While self-analysis is often seen as emerging at the end of an analysis and then used as a criterion in assessing suitability for termination, I try to attend to the patient's resistance to self-analysis throughout an analysis. I take the view that the development of the patient's capacity for self-analysis within the analytic session contributes to the patient's growth and their creative and independent thinking during the analysis, which prepares him or her for a fuller life after the formal analysis ends. The model I will present is based on an overlapping of the patient's and the analyst's self-analysis, with recognition and use of the analyst's countertransference. My focus is on the analyst's self-analysis that is in response to a particular crisis of not knowing, which results in feeling intellectually and emotionally stuck. This paper is not a case study, but a brief look at the process I went through to arrive at a particular interpretation with a particular patient during a particular session. I will concentrate on resistances in which both patient and analyst initially rely upon what is consciously known. Copyright © 2017 Institute of Psychoanalysis.
Mainali, Dipak; Seelenbinder, John
2016-05-01
Quick and presumptive identification of seized drug samples without destroying evidence is necessary for law enforcement officials to control the trafficking and abuse of drugs. This work reports an automated screening method to detect the presence of cocaine in seized samples using portable Fourier transform infrared (FT-IR) spectrometers. The method is based on the identification of well-defined characteristic vibrational frequencies related to the functional group of the cocaine molecule and is fully automated through the use of an expert system. Traditionally, analysts look for key functional group bands in the infrared spectra and characterization of the molecules present is dependent on user interpretation. This implies the need for user expertise, especially in samples that likely are mixtures. As such, this approach is biased and also not suitable for non-experts. The method proposed in this work uses the well-established "center of gravity" peak picking mathematical algorithm and combines it with the conditional reporting feature in MicroLab software to provide an automated method that can be successfully employed by users with varied experience levels. The method reports the confidence level of cocaine present only when a certain number of cocaine related peaks are identified by the automated method. Unlike library search and chemometric methods that are dependent on the library database or the training set samples used to build the calibration model, the proposed method is relatively independent of adulterants and diluents present in the seized mixture. This automated method in combination with a portable FT-IR spectrometer provides law enforcement officials, criminal investigators, or forensic experts a quick field-based prescreening capability for the presence of cocaine in seized drug samples. © The Author(s) 2016.
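The vendor's expert-system implementation is not available here, so the sketch below only illustrates the center-of-gravity peak-picking idea and a hypothetical conditional-reporting rule. The band positions, tolerance, and hit count are placeholders for illustration, not the validated method parameters.

```python
# Illustrative center-of-gravity peak picking (not the MicroLab implementation):
# for each contiguous region of the spectrum above a threshold, report the
# intensity-weighted mean wavenumber as the peak position.
import numpy as np

def cog_peaks(wavenumbers, absorbance, threshold):
    """Return intensity-weighted peak positions for regions above the threshold."""
    idx = np.where(absorbance > threshold)[0]
    if idx.size == 0:
        return []
    runs = np.split(idx, np.where(np.diff(idx) > 1)[0] + 1)   # contiguous regions
    return [float(np.sum(wavenumbers[r] * absorbance[r]) / np.sum(absorbance[r]))
            for r in runs]

def flag_cocaine(peaks, expected=(1712.0, 1275.0, 1105.0, 728.0), tol=6.0, min_hits=3):
    """Hypothetical conditional-reporting rule: report cocaine only when enough
    characteristic bands (placeholder wavenumbers, not the validated band list)
    are matched within tol cm^-1."""
    hits = sum(any(abs(p - e) <= tol for p in peaks) for e in expected)
    return hits >= min_hits
```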
NASA Astrophysics Data System (ADS)
Reading, A. M.; Morse, P. E.; Staal, T.
2017-12-01
Geoscientific inversion outputs, such as seismic tomography contour images, are finding increasing use amongst scientific user communities that have limited knowledge of the impact of output parameter uncertainty on subsequent interpretations made from such images. We make use of a newly written computer application which enables seismic tomography images to be displayed in a performant 3D graphics environment. This facilitates the mapping of colour scales to the human visual sensorium for the interactive interpretation of contoured inversion results incorporating parameter uncertainty. Two case examples of seismic tomography inversions or contoured compilations are compared from the southern hemisphere continents of Australia and Antarctica. The Australian example is based on the AuSREM contoured seismic wavespeed model while the Antarctic example is a valuable but less well constrained result. Through adjusting the multiple colour gradients, layer separations, opacity, illumination, shadowing and background effects, we can optimise the insights obtained from the 3D structure in the inversion compilation or result. Importantly, we can also limit the display to show information in a way that is mapped to the uncertainty in the 3D result. Through this practical application, we demonstrate that the uncertainty in the result can be handled through a well-posed mapping of the parameter values to displayed colours in the knowledge of what is perceived visually by a typical human. We found that this approach maximises the chance of a useful tectonic interpretation by a diverse scientific user community. In general, we develop the idea that quantified inversion uncertainty can be used to tailor the way that the output is presented to the analyst for scientific interpretation.
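One plausible way to map a parameter value and its uncertainty to colour, in the spirit described above, is sketched below. The hue/saturation scheme and the numbers are assumptions for illustration, not the application's actual implementation.

```python
# Hypothetical uncertainty-aware colour mapping: the wavespeed anomaly drives
# the hue, while larger model uncertainty washes out the saturation.
import numpy as np
from matplotlib.colors import hsv_to_rgb

def colour_with_uncertainty(anomaly, sigma, sigma_max):
    """anomaly in [-1, 1] (normalised dVs/Vs); sigma = 1-sigma model uncertainty."""
    hue = 0.67 * (1.0 - (anomaly + 1.0) / 2.0)        # blue (fast) -> red (slow)
    sat = np.clip(1.0 - sigma / sigma_max, 0.0, 1.0)  # certain = vivid, uncertain = grey
    val = np.full_like(hue, 0.9)
    return hsv_to_rgb(np.stack([hue, sat, val], axis=-1))

rgb = colour_with_uncertainty(np.array([-0.8, 0.0, 0.6]),
                              np.array([0.05, 0.2, 0.5]), sigma_max=0.5)
```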
Individual Learning Accounts and Other Models of Financing Lifelong Learning
ERIC Educational Resources Information Center
Schuetze, Hans G.
2007-01-01
To answer the question "Financing what?" this article distinguishes several models of lifelong learning as well as a variety of lifelong learning activities. Several financing methods are briefly reviewed, however the principal focus is on Individual Learning Accounts (ILAs) which were seen by some analysts as a promising model for…
Advanced Small Modular Reactor Economics Model Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, Thomas J.
2014-10-01
The US Department of Energy Office of Nuclear Energy's Advanced Small Modular Reactor (SMR) research and development activities focus on four key areas: developing assessment methods for evaluating advanced SMR technologies and characteristics; developing and testing materials, fuels, and fabrication techniques; resolving key regulatory issues identified by the US Nuclear Regulatory Commission and industry; and developing advanced instrumentation and controls and human-machine interfaces. This report focuses on development of assessment methods to evaluate advanced SMR technologies and characteristics. Specifically, this report describes the expansion and application of the economic modeling effort at Oak Ridge National Laboratory. Analysis of the current modeling methods shows that one of the primary concerns for the modeling effort is the handling of uncertainty in cost estimates. Monte Carlo-based methods are commonly used to handle uncertainty, especially when implemented by a stand-alone script within a program such as Python or MATLAB. However, a script-based model requires each potential user to have access to a compiler and an executable capable of handling the script. Making the model accessible to multiple independent analysts is best accomplished by implementing the model in a common computing tool such as Microsoft Excel. Excel is readily available and accessible to most system analysts, but it is not designed for straightforward implementation of a Monte Carlo-based method. Using a Monte Carlo algorithm requires in-spreadsheet scripting and statistical analyses or the use of add-ons such as Crystal Ball. An alternative method uses propagation-of-error calculations in the existing Excel-based system to estimate system cost uncertainty. This method has the advantage of using Microsoft Excel as is, but it requires the use of simplifying assumptions. These assumptions do not necessarily bring into question the analytical results. In fact, the analysis shows that the propagation-of-error method introduces essentially negligible error, especially when compared to the uncertainty associated with some of the estimates themselves. The results of these uncertainty analyses generally quantify and identify the sources of uncertainty in the overall cost estimation. The obvious generalization, that capital cost uncertainty is the main driver, can be shown to be accurate for the current state of reactor cost analysis. However, the detailed analysis on a component-by-component basis helps to demonstrate which components would benefit most from research and development to decrease the uncertainty, as well as which components would benefit from research and development to decrease the absolute cost.
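The contrast between the two uncertainty treatments discussed in this report can be illustrated with a short sketch. The component costs and uncertainties below are invented for illustration only; for an independent sum, the two approaches agree closely, consistent with the report's conclusion that the simpler method introduces negligible error.

```python
# Sketch contrasting Monte Carlo sampling and first-order propagation of error
# for a total cost built from independent component estimates (invented numbers).
import numpy as np

components = {            # mean cost ($M) and 1-sigma uncertainty ($M), assumed independent
    "capital": (3000.0, 600.0),
    "O&M":     (900.0, 150.0),
    "fuel":    (400.0,  60.0),
}

# (1) Monte Carlo: sample each component and examine the distribution of the sum
rng = np.random.default_rng(42)
samples = sum(rng.normal(mu, sd, 100_000) for mu, sd in components.values())
mc_mean, mc_sd = samples.mean(), samples.std()

# (2) First-order propagation of error: for an independent sum, variances add
prop_mean = sum(mu for mu, _ in components.values())
prop_sd = np.sqrt(sum(sd**2 for _, sd in components.values()))

print(f"Monte Carlo : {mc_mean:8.1f} +/- {mc_sd:6.1f}")
print(f"Propagation : {prop_mean:8.1f} +/- {prop_sd:6.1f}")   # nearly identical here
```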
Homeland security application of the Army Soft Target Exploitation and Fusion (STEF) system
NASA Astrophysics Data System (ADS)
Antony, Richard T.; Karakowski, Joseph A.
2010-04-01
A fusion system that accommodates both text-based extracted information and more conventional sensor-derived input has been developed and demonstrated in a terrorist attack scenario as part of the Empire Challenge (EC) 09 Exercise. Although the fusion system was developed to support Army military analysts, the system, based on a set of foundational fusion principles, has direct applicability to Department of Homeland Security (DHS) and defense, law enforcement, and other applications. Several novel fusion technologies and applications were demonstrated in EC09. One such technology is location normalization that accommodates fuzzy semantic expressions, such as "behind Library A" or "across the street from the market place", as well as traditional spatial representations. Additionally, the fusion system provides a range of fusion products not supported by traditional fusion algorithms. Many of these additional capabilities have direct applicability to DHS. A formal test of the fusion system was performed during the EC09 exercise. The system demonstrated that it was able to (1) automatically form tracks, (2) help analysts visualize behavior of individuals over time, (3) link key individuals based on both explicit message-based information and discovered (fusion-derived) implicit relationships, and (4) suggest possible individuals of interest based on their association with High Value Individuals (HVI) and user-defined key locations.
SOMFlow: Guided Exploratory Cluster Analysis with Self-Organizing Maps and Analytic Provenance.
Sacha, Dominik; Kraus, Matthias; Bernard, Jurgen; Behrisch, Michael; Schreck, Tobias; Asano, Yuki; Keim, Daniel A
2018-01-01
Clustering is a core building block for data analysis, aiming to extract otherwise hidden structures and relations from raw datasets, such as particular groups that can be effectively related, compared, and interpreted. A plethora of visual-interactive cluster analysis techniques has been proposed to date; however, arriving at useful clusterings often requires several rounds of user interaction to fine-tune the data preprocessing and algorithms. We present a multi-stage Visual Analytics (VA) approach for iterative cluster refinement together with an implementation (SOMFlow) that uses Self-Organizing Maps (SOM) to analyze time series data. It supports exploration by offering the analyst a visual platform to analyze intermediate results, adapt the underlying computations, iteratively partition the data, and reflect on previous analytical activities. The history of previous decisions is explicitly visualized within a flow graph, allowing earlier cluster refinements to be compared and relations to be explored. We further leverage quality and interestingness measures to guide the analyst in the discovery of useful patterns, relations, and data partitions. We conducted two pair analytics experiments together with a subject matter expert in speech intonation research to demonstrate that the approach is effective for interactive data analysis, supporting enhanced understanding of clustering results as well as of the interactive process itself.
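For orientation, the core computation that such a tool wraps in its visual refinement loop is the standard SOM update. The sketch below is a generic SOM trainer with assumed grid size and decay schedules, not the SOMFlow code.

```python
# Minimal self-organizing map training loop (a generic SOM, not SOMFlow itself).
import numpy as np

def train_som(data, grid=(6, 6), epochs=20, lr0=0.5, sigma0=2.0, seed=0):
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h, w, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    n_steps = epochs * len(data)
    for step in range(n_steps):
        x = data[rng.integers(len(data))]
        lr = lr0 * (1.0 - step / n_steps)              # decaying learning rate
        sigma = sigma0 * (1.0 - step / n_steps) + 0.5  # shrinking neighbourhood
        # best-matching unit: node whose weight vector is closest to the sample
        bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(-1)), (h, w))
        # Gaussian neighbourhood pull of nearby nodes toward the sample
        dist2 = ((coords - np.array(bmu)) ** 2).sum(-1)
        g = np.exp(-dist2 / (2.0 * sigma ** 2))[..., None]
        weights += lr * g * (x - weights)
    return weights

som = train_som(np.random.default_rng(1).normal(size=(200, 3)))
```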
NASA Astrophysics Data System (ADS)
Piburn, J.; Stewart, R.; Myers, A.; Sorokine, A.; Axley, E.; Anderson, D.; Burdette, J.; Biddle, C.; Hohl, A.; Eberle, R.; Kaufman, J.; Morton, A.
2017-10-01
Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human-computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states, and redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and describe how others may freely access the tool.
SafetyAnalyst : software tools for safety management of specific highway sites
DOT National Transportation Integrated Search
2010-07-01
SafetyAnalyst provides a set of software tools for use by state and local highway agencies for highway safety management. SafetyAnalyst can be used by highway agencies to improve their programming of site-specific highway safety improvements. SafetyA...
Exploring the Analytical Processes of Intelligence Analysts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, George; Kuchar, Olga A.; Wolf, Katherine E.
We present an observational case study in which we investigate and analyze the analytical processes of intelligence analysts. Participating analysts in the study carry out two scenarios where they organize and triage information, conduct intelligence analysis, report results, and collaborate with one another. Through a combination of artifact analyses, group interviews, and participant observations, we explore the space and boundaries in which intelligence analysts work and operate. We also assess the implications of our findings on the use and application of relevant information technologies.
Reflections: can the analyst share a traumatizing experience with a traumatized patient?
Lijtmaer, Ruth
2010-01-01
This is a personal account of a dreadful event in the analyst's life that was similar to a patient's trauma. It is a reflection on how the analyst dealt with her own trauma, the patient's trauma, and the transference and countertransference dynamics. Included is a description of the analyst's inner struggles with self-disclosure, continuance of her professional work, and the need for persistent self-scrutiny. The meaning of objects in people's life, particularly the concept of home, will be addressed.
Common modeling system for digital simulation
NASA Technical Reports Server (NTRS)
Painter, Rick
1994-01-01
The Joint Modeling and Simulation System (J-MASS) is a tri-service investigation into a common modeling framework for the development of digital models. The basis for the success of this framework is an X-window-based, open-systems architecture; an object-based/oriented methodology; and a standard-interface approach to digital model construction, configuration, execution, and post-processing. For years Department of Defense (DOD) agencies have produced various weapon systems/technologies and, typically, digital representations of those systems/technologies. These digital representations (models) have also been developed for other reasons, such as studies and analysis or Cost Effectiveness Analysis (COEA) tradeoffs. Unfortunately, there have been no Modeling and Simulation (M&S) standards, guidelines, or efforts towards commonality in DOD M&S. The typical scenario is that an organization hires a contractor to build hardware, and in doing so a digital model may be constructed. Until recently, this model was not even obtained by the organization. Even if it was procured, it was on a unique platform, in a unique language, with unique interfaces, with the result being unique maintenance requirements. Additionally, the constructors of the model expended more effort in writing the 'infrastructure' of the model/simulation (e.g., user interface, database/database management system, data journaling/archiving, graphical presentations, environment characteristics, other components in the simulation, etc.) than in producing the model of the desired system. Other side effects include duplication of effort, varying assumptions, lack of credibility/validation, and decentralization in policy and execution. J-MASS provides the infrastructure, standards, toolset, and architecture to permit M&S developers and analysts to concentrate on their area of interest.
NASA Astrophysics Data System (ADS)
Schutte, Klamer; Burghouts, Gertjan; van der Stap, Nanda; Westerwoudt, Victor; Bouma, Henri; Kruithof, Maarten; Baan, Jan; ten Hove, Johan-Martijn
2016-10-01
The bottleneck in situation awareness is no longer in the sensing domain but rather in the data interpretation domain, since the number of sensors is rapidly increasing and it is not affordable to increase human data-analysis capacity at the same rate. Automatic image analysis can assist a human analyst by alerting when an event of interest occurs. However, common state-of-the-art image recognition systems learn representations in high-dimensional feature spaces, which makes them less suitable for generating a user-comprehensible message. Such data-driven approaches rely on large amounts of training data, which is often not available for quite rare but high-impact incidents in the security domain. The key contribution of this paper is that we present a novel real-time system for image understanding based on generic instantaneous low-level processing components (symbols) and flexible user-definable and user-understandable combinations of these components (sentences) at a higher level for the recognition of specific relevant events in the security domain. We show that the detection of an event of interest can be enhanced by utilizing recognition of multiple short-term preparatory actions.
NASA Technical Reports Server (NTRS)
Granaas, Michael M.; Rhea, Donald C.
1989-01-01
In recent years the needs of ground-based researcher-analysts to access real-time engineering data in the form of processed information have expanded rapidly. Fortunately, the capacity to deliver that information has also expanded. The development of advanced display systems is essential to the success of a research test activity. Those developed at the National Aeronautics and Space Administration (NASA) Western Aeronautical Test Range (WATR) range from simple alphanumerics to interactive mapping and graphics. These unique display systems are designed not only to meet the basic information display requirements of the user, but also to take advantage of techniques for optimizing information display. Future ground-based display systems will rely heavily not only on new technologies, but also on interaction with the human user and the productivity associated with that interaction. The psychological abilities and limitations of the user will become even more important in defining the difference between a usable and a useful display system. This paper reviews the requirements for development of real-time displays; the psychological aspects of design such as layout, color selection, real-time response rate, and interactivity of displays; and an analysis of some existing WATR displays.
Pressure And Thermal Modeling Of Rocket Launches
NASA Technical Reports Server (NTRS)
Smith, Sheldon D.; Myruski, Brian L.; Farmer, Richard C.; Freeman, Jon A.
1995-01-01
Report presents mathematical model for use in designing rocket-launching stand. Predicts pressure and thermal environment, as well as thermal responses of structures to impinging rocket-exhaust plumes. Enables relatively inexperienced analyst to determine time-varying distributions and absolute levels of pressure and heat loads on structures.
Shirazi, Mohammadali; Dhavala, Soma Sekhar; Lord, Dominique; Geedipally, Srinivas Reddy
2017-10-01
Safety analysts usually use post-modeling methods, such as Goodness-of-Fit statistics or the Likelihood Ratio Test, to decide between two or more competing distributions or models. Such metrics require all competing distributions to be fitted to the data before any comparisons can be made. Given the continuous growth in newly introduced statistical distributions, choosing the best one using such post-modeling methods is not a trivial task, in addition to all the theoretical or numerical issues the analyst may face during the analysis. Furthermore, and most importantly, these measures or tests do not provide any intuition into why a specific distribution (or model) is preferred over another (Goodness-of-Logic). This paper addresses these issues by proposing a methodology for designing heuristics for model selection based on the characteristics of the data, in terms of descriptive summary statistics, before fitting the models. The proposed methodology employs two analytic tools: (1) Monte Carlo simulations and (2) machine learning classifiers, to design easy heuristics to predict the label of the 'most-likely-true' distribution for analyzing data. The proposed methodology was applied to investigate when the recently introduced Negative Binomial Lindley (NB-L) distribution is preferred over the Negative Binomial (NB) distribution. Heuristics were designed to select the 'most-likely-true' distribution between these two distributions, given a set of prescribed summary statistics of the data. The proposed heuristics were successfully compared against classical tests for several real or observed datasets. Not only are they easy to use and free of any post-modeling inputs, but, using these heuristics, the analyst can also gain useful insight into why the NB-L is preferred over the NB, or vice versa, when modeling data. Copyright © 2017 Elsevier Ltd. All rights reserved.
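The overall pipeline (simulate, summarize, classify, read off a heuristic) can be sketched briefly. For simplicity, the example below uses Poisson versus negative binomial as a stand-in pair, since simulating the NB-Lindley is beyond a short sketch; the summary statistics, parameter ranges, and tree depth are all assumptions for illustration.

```python
# Sketch of the heuristic-design pipeline: simulate labelled datasets, compute
# summary statistics, and fit a shallow classifier whose rules serve as heuristics.
import numpy as np
from scipy import stats
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)

def summary(x):
    # descriptive statistics available before any model is fitted
    return [x.mean(), x.var() / max(x.mean(), 1e-9), stats.skew(x), (x == 0).mean()]

X, y = [], []
for _ in range(2000):                       # Monte Carlo step
    n = rng.integers(100, 1000)
    if rng.random() < 0.5:                  # label 0: Poisson-generated data
        x = rng.poisson(rng.uniform(0.5, 5.0), n); y.append(0)
    else:                                   # label 1: negative-binomial-generated data
        x = rng.negative_binomial(rng.uniform(0.5, 3.0), rng.uniform(0.2, 0.8), n); y.append(1)
    X.append(summary(x))

tree = DecisionTreeClassifier(max_depth=2).fit(X, y)   # shallow tree = readable heuristic
print(export_text(tree, feature_names=["mean", "var/mean", "skew", "p_zero"]))
```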
NHDPlus (National Hydrography Dataset Plus)
NHDPlus is a geospatial, hydrologic framework dataset that is intended for use by geospatial analysts and modelers to support water-resources-related applications. NHDPlus was developed by the USEPA in partnership with the U.S. Geological Survey.
Laser Threat Analysis System (LTAS)
NASA Astrophysics Data System (ADS)
Pfaltz, John M.; Richardson, Christina E.; Ruiz, Abel; Barsalou, Norman; Thomas, Robert J.
2002-11-01
LTAS is a totally integrated modeling and simulation environment designed for the purpose of ascertaining the susceptibility of Air Force pilots and air crews to optical radiation threats. Using LTAS, mission planners can assess the operational impact of optically directed energy weapons and countermeasures. Through various scenarios, threat analysts are able to determine the capability of laser threats and their impact on operational missions including the air crew's ability to complete their mission effectively. Additionally, LTAS allows the risk of laser use on training ranges and the requirement for laser protection to be evaluated. LTAS gives mission planners and threat analysts complete control of the threat environment including threat parameter control and placement, terrain mapping (line-of-sight), atmospheric conditions, and laser eye protection (LEP) selection. This report summarizes the design of the final version of LTAS, and the modeling methodologies implemented to accomplish analysis.
Do Sell-Side Stock Analysts Exhibit Escalation of Commitment?
Milkman, Katherine L.
2010-01-01
This paper presents evidence that when an analyst makes an out-of-consensus forecast of a company’s quarterly earnings that turns out to be incorrect, she escalates her commitment to maintaining an out-of-consensus view on the company. Relative to an analyst who was close to the consensus, the out-of-consensus analyst adjusts her forecasts for the current fiscal year’s earnings less in the direction of the quarterly earnings surprise. On average, this type of updating behavior reduces forecasting accuracy, so it does not seem to reflect superior private information. Further empirical results suggest that analysts do not have financial incentives to stand by extreme stock calls in the face of contradictory evidence. Managerial and financial market implications are discussed. PMID:21516220
Mortality, integrity, and psychoanalysis (who are you to me? Who am I to you?).
Pinsky, Ellen
2014-01-01
The author narrates her experience of mourning her therapist's sudden death. The profession has neglected implications of the analyst's mortality: what is lost or vulnerable to loss? What is that vulnerability's function? The author's process of mourning included her writing and her becoming an analyst. Both pursuits inspired reflections on mortality in two overlapping senses: bodily (the analyst is mortal and can die) and character (the analyst is mortal and can err). The subject thus expands to include impaired character and ethical violations. Paradoxically, the analyst's human limitations threaten each psychoanalytic situation, but also enable it: human imperfection animates the work. The essay ends with a specific example of integrity. © 2014 The Psychoanalytic Quarterly, Inc.
The tobacco industry's use of Wall Street analysts in shaping policy.
Alamar, B C; Glantz, S A
2004-09-01
To document how the tobacco industry has used Wall Street analysts to further its public policy objectives. Searching tobacco documents available on the internet, newspaper articles, and transcripts of public hearings. The tobacco industry used nominally independent Wall Street analysts as third parties to support the tobacco industry's legislative agenda at both national and state levels in the USA. The tobacco industry has, for example, edited the testimony of at least one analyst before he testified to the US Senate Judiciary Committee, while representing himself as independent of the industry. The tobacco industry has used undisclosed collaboration with Wall Street analysts, as they have used undisclosed relationships with research scientists and academics, to advance the interests of the tobacco industry in public policy.
The Use of Object-Oriented Analysis Methods in Surety Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.
1999-05-01
Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.
75 FR 20385 - Amended Certification Regarding Eligibility To Apply for Worker Adjustment Assistance
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-19
..., Inconen, CTS, Hi-Tec, Woods, Ciber, Kelly Services, Analysts International Corp, Comsys, Filter LLC..., Ciber, Kelly Services, Analysts International Corp, Comsys, Filter LLC, Excell, Entegee, Chipton- Ross..., Kelly Services, Analysts International Corp, Comsys, Filter LLC, Excell, Entegee, Chipton- Ross, Ian...
Subcellular object quantification with Squassh3C and SquasshAnalyst.
Rizk, Aurélien; Mansouri, Maysam; Ballmer-Hofer, Kurt; Berger, Philipp
2015-11-01
Quantitative image analysis plays an important role in contemporary biomedical research. Squassh is a method for automatic detection, segmentation, and quantification of subcellular structures and analysis of their colocalization. Here we present the applications Squassh3C and SquasshAnalyst. Squassh3C extends the functionality of Squassh to three fluorescence channels and live-cell movie analysis. SquasshAnalyst is an interactive web interface for the analysis of Squassh3C object data. It provides segmentation image overview and data exploration, figure generation, object and image filtering, and a statistical significance test in an easy-to-use interface. The overall procedure combines the Squassh3C plug-in for the free biological image processing program ImageJ and a web application working in conjunction with the free statistical environment R, and it is compatible with Linux, MacOS X, or Microsoft Windows. Squassh3C and SquasshAnalyst are available for download at www.psi.ch/lbr/SquasshAnalystEN/SquasshAnalyst.zip.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jurrus, Elizabeth R.; Hodas, Nathan O.; Baker, Nathan A.
Forensic analysis of nanoparticles is often conducted through the collection and identification of electron microscopy images to determine the origin of suspected nuclear material. Each image is carefully studied by experts for classification of materials based on texture, shape, and size. Manually inspecting large image datasets takes enormous amounts of time. However, automatic classification of large image datasets is a challenging problem due to the complexity involved in choosing image features, the lack of training data available for effective machine learning methods, and the availability of user interfaces to parse through images. Therefore, a significant need exists for automated and semi-automated methods to help analysts perform accurate image classification in large image datasets. We present INStINCt, our Intelligent Signature Canvas, as a framework for quickly organizing image data in a web based canvas framework. Images are partitioned using small sets of example images, chosen by users, and presented in an optimal layout based on features derived from convolutional neural networks.
NIST mixed stain study 3: signal intensity balance in commercial short tandem repeat multiplexes.
Duewer, David L; Kline, Margaret C; Redman, Janette W; Butler, John M
2004-12-01
Short-tandem repeat (STR) allelic intensities were collected from more than 60 forensic laboratories for a suite of seven samples as part of the National Institute of Standards and Technology-coordinated 2001 Mixed Stain Study 3 (MSS3). These interlaboratory challenge data illuminate the relative importance of intrinsic and user-determined factors affecting the locus-to-locus balance of signal intensities for currently used STR multiplexes. To varying degrees, seven of the eight commercially produced multiplexes used by MSS3 participants displayed very similar patterns of intensity differences among the different loci probed by the multiplexes for all samples, in the hands of multiple analysts, with a variety of supplies and instruments. These systematic differences reflect intrinsic properties of the individual multiplexes, not user-controllable measurement practices. To the extent that quality systems specify minimum and maximum absolute intensities for data acceptability and data interpretation schema require among-locus balance, these intrinsic intensity differences may decrease the utility of multiplex results and surely increase the cost of analysis.
Eliciting expert opinion for economic models: an applied example.
Leal, José; Wordsworth, Sarah; Legood, Rosa; Blair, Edward
2007-01-01
Expert opinion is considered as a legitimate source of information for decision-analytic modeling where required data are unavailable. Our objective was to develop a practical computer-based tool for eliciting expert opinion about the shape of the uncertainty distribution around individual model parameters. We first developed a prepilot survey with departmental colleagues to test a number of alternative approaches to eliciting opinions on the shape of the uncertainty distribution around individual parameters. This information was used to develop a survey instrument for an applied clinical example. This involved eliciting opinions from experts to inform a number of parameters involving Bernoulli processes in an economic model evaluating DNA testing for families with a genetic disease, hypertrophic cardiomyopathy. The experts were cardiologists, clinical geneticists, and laboratory scientists working with cardiomyopathy patient populations and DNA testing. Our initial prepilot work suggested that the more complex elicitation techniques advocated in the literature were difficult to use in practice. In contrast, our approach achieved a reasonable response rate (50%), provided logical answers, and was generally rated as easy to use by respondents. The computer software user interface permitted graphical feedback throughout the elicitation process. The distributions obtained were incorporated into the model, enabling the use of probabilistic sensitivity analysis. There is clearly a gap in the literature between theoretical elicitation techniques and tools that can be used in applied decision-analytic models. The results of this methodological study are potentially valuable for other decision analysts deriving expert opinion.
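The paper does not reproduce its elicitation tool; as a minimal sketch of the final step, the code below fits a Beta distribution to two elicited quantiles of a Bernoulli-process probability and samples it for probabilistic sensitivity analysis. The quantile values and the least-squares fitting approach are illustrative assumptions, not the authors' instrument.

```python
# Minimal sketch: fit a Beta(a, b) distribution to two elicited quantiles
# of a Bernoulli-process probability (e.g. test sensitivity), then sample
# it for probabilistic sensitivity analysis. Quantile values are made up.
import numpy as np
from scipy import stats, optimize

elicited = {0.05: 0.60, 0.95: 0.85}   # expert: "90% sure p lies in [0.60, 0.85]"

def loss(params):
    a, b = np.exp(params)             # keep a, b positive
    qs = stats.beta.ppf(list(elicited.keys()), a, b)
    return np.sum((qs - np.array(list(elicited.values()))) ** 2)

res = optimize.minimize(loss, x0=[1.0, 1.0], method="Nelder-Mead")
a, b = np.exp(res.x)
print(f"Beta(a={a:.2f}, b={b:.2f}), mean={a / (a + b):.3f}")

draws = stats.beta.rvs(a, b, size=10_000, random_state=1)  # feed into the PSA
```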
This art of psychoanalysis. Dreaming undreamt dreams and interrupted cries.
Ogden, Thomas H
2004-08-01
It is the art of psychoanalysis in the making, a process inventing itself as it goes, that is the subject of this paper. The author articulates succinctly how he conceives of psychoanalysis, and offers a detailed clinical illustration. He suggests that each analysand unconsciously (and ambivalently) is seeking help in dreaming his 'night terrors' (his undreamt and undreamable dreams) and his 'nightmares' (his dreams that are interrupted when the pain of the emotional experience being dreamt exceeds his capacity for dreaming). Undreamable dreams are understood as manifestations of psychotic and psychically foreclosed aspects of the personality; interrupted dreams are viewed as reflections of neurotic and other non-psychotic parts of the personality. The analyst's task is to generate conditions that may allow the analysand--with the analyst's participation--to dream the patient's previously undreamable and interrupted dreams. A significant part of the analyst's participation in the patient's dreaming takes the form of the analyst's reverie experience. In the course of this conjoint work of dreaming in the analytic setting, the analyst may get to know the analysand sufficiently well for the analyst to be able to say something that is true to what is occurring at an unconscious level in the analytic relationship. The analyst's use of language contributes significantly to the possibility that the patient will be able to make use of what the analyst has said for purposes of dreaming his own experience, thereby dreaming himself more fully into existence.
An interactive tool for semi-automatic feature extraction of hyperspectral data
NASA Astrophysics Data System (ADS)
Kovács, Zoltán; Szabó, Szilárd
2016-09-01
The spectral reflectance of the surface provides valuable information about the environment, which can be used to identify objects (e.g. land cover classification) or to estimate quantities of substances (e.g. biomass). We aimed to develop an MS Excel add-in - Hyperspectral Data Analyst (HypDA) - for a multipurpose quantitative analysis of spectral data in VBA programming language. HypDA was designed to calculate spectral indices from spectral data with user defined formulas (in all possible combinations involving a maximum of 4 bands) and to find the best correlations between the quantitative attribute data of the same object. Different types of regression models reveal the relationships, and the best results are saved in a worksheet. Qualitative variables can also be involved in the analysis carried out with separability and hypothesis testing; i.e. to find the wavelengths responsible for separating data into predefined groups. HypDA can be used both with hyperspectral imagery and spectrometer measurements. This bivariate approach requires significantly fewer observations than popular multivariate methods; it can therefore be applied to a wide range of research areas.
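HypDA itself is an MS Excel/VBA add-in; purely to illustrate the underlying brute-force idea in a language-neutral way, the sketch below scores every two-band normalized-difference index against an attribute of interest and reports the strongest correlations. The placeholder arrays and the restriction to a single index formula are assumptions, not part of HypDA.

```python
# Sketch of HypDA-style index screening: try every two-band normalized
# difference (b_i - b_j)/(b_i + b_j) and rank by correlation with an
# attribute (e.g. biomass). 'spectra' and 'attribute' are placeholder data.
import numpy as np

rng = np.random.default_rng(0)
spectra = rng.random((50, 120))        # 50 samples x 120 bands (placeholder)
attribute = rng.random(50)             # measured quantity per sample

scores = []
n_bands = spectra.shape[1]
for i in range(n_bands):
    for j in range(i + 1, n_bands):
        index = (spectra[:, i] - spectra[:, j]) / (spectra[:, i] + spectra[:, j] + 1e-12)
        r = np.corrcoef(index, attribute)[0, 1]
        scores.append((abs(r), i, j))

for r, i, j in sorted(scores, reverse=True)[:5]:
    print(f"bands ({i}, {j}): |r| = {r:.3f}")
```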
An On-line Technology Information System (OTIS) for Advanced Life Support
NASA Technical Reports Server (NTRS)
Levri, Julie A.; Boulanger, Richard; Hogan, John A.; Rodriquez, Luis
2003-01-01
OTIS is an on-line communication platform designed for smooth flow of technology information between advanced life support (ALS) technology developers, researchers, system analysts, and managers. With pathways for efficient transfer of information, several improvements in the ALS Program will result. With OTIS, it will be possible to provide programmatic information for technology developers and researchers, technical information for analysts, and managerial decision support. OTIS is a platform that enables the effective research, development, and delivery of complex systems for life support. An electronic data collection form has been developed for the solid waste element, drafted by the Solid Waste Working Group. Forms for other elements (air revitalization, water recovery, food processing, biomass production and thermal control) will also be developed, based on lessons learned from the development of the solid waste form. All forms will be developed by consultation with other working groups, comprised of experts in the area of interest. Forms will be converted to an on-line data collection interface that technology developers will use to transfer information into OTIS. Funded technology developers will log in to OTIS annually to complete the element-specific forms for their technology. The type and amount of information requested expands as the technology readiness level (TRL) increases. The completed forms will feed into a regularly updated and maintained database that will store technology information and allow for database searching. To ensure confidentiality of proprietary information, security permissions will be customized for each user. Principal investigators of a project will be able to designate certain data as proprietary and only technical monitors of a task, ALS Management, and the principal investigator will have the ability to view this information. The typical OTIS user will be able to read all non-proprietary information about all projects. Interaction with the database will occur over encrypted connections, and data will be stored on the server in an encrypted form. Implementation of OTIS will initiate a community-accessible repository of technology development information. With OTIS, ALS element leads and managers will be able to carry out informed technology selection for programmatic decisions. OTIS will also allow analysts to make accurate evaluations of technology options. Additionally, the range and specificity of information solicited will help educate technology developers of program needs. With augmentation, OTIS reporting is capable of replacing the current fiscal year-end reporting process. Overall, the system will enable more informed R&TD decisions and more rapid attainment of ALS Program goals.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-05
... Business Networks Services, Inc., Senior Analysts-Order Management, Voice Over Internet Protocol, Small And Medium Business, Tampa, Florida; Verizon Business Networks Services, Inc., Senior Coordinator-Order... Business Networks Services, Inc., Senior Analysts-Order Management, Voice Over Internet Protocol, Small and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
...,968B] Verizon Business Networks Services, Inc. Senior Analysts-Sales Impletmentation (SA-SI) Birmingham, Alabama; Verizon Business Networks Services, Inc. Senior Analysts-Sales Impletmentation (SA-SI) Service Program Delivery Division San Francisco, California; Verizon Business Networks Services, Inc.Senior...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-01
.... Securities Offering. Series 86 Research Analyst--Analysis..... From $160 to $175. Series 87 Research Analyst... Order Processing Assistant Representatives, Research Analysts and Operations Professionals, respectively... examination.\\7\\ \\6\\ PROCTOR is a computer system that is specifically designed for the administration and...
NASA Astrophysics Data System (ADS)
Palanisamy, Giriprakash; Wilson, Bruce E.; Cook, Robert B.; Lenhardt, Chris W.; Santhana Vannan, Suresh; Pan, Jerry; McMurry, Ben F.; Devarakonda, Ranjeet
2010-12-01
The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) is one of the science-oriented data centers in EOSDIS, aligned primarily with terrestrial ecology. The ORNL DAAC archives and serves data from NASA-funded field campaigns (such as BOREAS, FIFE, and LBA), regional and global data sets relevant to biogeochemical cycles, land validation studies for remote sensing, and source code for some terrestrial ecology models. Users of the ORNL DAAC include field ecologists, remote sensing scientists, modelers at various scales, synthesis scientific groups, a range of educational users (particularly baccalaureate and graduate instruction), and decision support analysts. It is clear that the wide range of users served by the ORNL DAAC have differing needs and differing capabilities for accessing and using data. It is also not possible for the ORNL DAAC, or the other data centers in EOSDIS, to develop all of the tools and interfaces to support even most of the potential uses of data directly. As is typical of Information Technology to support a research enterprise, the user needs will continue to evolve rapidly over time and users themselves cannot predict future needs, as those needs depend on the results of current investigation. The ORNL DAAC is addressing these needs by targeted implementation of web services and tools which can be consumed by other applications, so that a modeler can retrieve data in netCDF format with the Climate Forecasting convention and a field ecologist can retrieve subsets of that same data in a comma separated value format, suitable for use in Excel or R. Tools such as our MODIS Subsetting capability, the Spatial Data Access Tool (SDAT; based on OGC web services), and OPeNDAP-compliant servers such as THREDDS particularly enable such diverse means of access. We also seek interoperability of metadata, recognizing that terrestrial ecology is a field where there are a very large number of relevant data repositories. ORNL DAAC metadata is published to several metadata repositories using the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH), to increase the chances that users can find data holdings relevant to their particular scientific problem. ORNL also seeks to leverage technology across these various data projects and encourage standardization of processes and technical architecture. This standardization is behind current efforts involving the use of Drupal and Fedora Commons. This poster describes the current and planned approaches that the ORNL DAAC is taking to enable cost-effective interoperability among data centers, both across the NASA EOSDIS data centers and across the international spectrum of terrestrial ecology-related data centers. The poster will highlight the standards that we are currently using across data formats, metadata formats, and data protocols. References: [1] Devarakonda R., et al. Mercury: reusable metadata management, data discovery and access system. Earth Science Informatics (2010), 3(1): 87-94. [2] Devarakonda R., et al. Data sharing and retrieval using OAI-PMH. Earth Science Informatics (2011), 4(1): 1-5.
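OAI-PMH, mentioned above, is a plain HTTP-plus-XML protocol, so a minimal harvest can be sketched directly. The endpoint URL below is a placeholder, and resumption-token paging and error handling are omitted; this is not ORNL DAAC code.

```python
# Generic OAI-PMH harvest sketch (Dublin Core records). The endpoint URL is
# a placeholder; real harvesters must also follow resumptionToken paging.
import xml.etree.ElementTree as ET
import urllib.request

ENDPOINT = "https://example.org/oai"                      # placeholder
NS = {"oai": "http://www.openarchives.org/OAI/2.0/",
      "dc": "http://purl.org/dc/elements/1.1/"}

url = ENDPOINT + "?verb=ListRecords&metadataPrefix=oai_dc"
with urllib.request.urlopen(url) as resp:
    root = ET.fromstring(resp.read())

for record in root.findall(".//oai:record", NS):
    ident = record.find(".//oai:identifier", NS)
    title = record.find(".//dc:title", NS)
    print(ident.text if ident is not None else "?",
          "-", title.text if title is not None else "(no title)")
```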
Training and the Change Agent Role Model
ERIC Educational Resources Information Center
Leach, Wesley B.; Owens, Vyrle W.
1973-01-01
The authors discuss the qualities possessed by a model change agent and roles played by him as resident technical participant: analyst, advisor, advocate, systems linker, innovator, and trainer. Besides presenting the teaching plan for change agents, the authors call upon their Peace Corps experiences to provide specific examples of what is…
The Model Analyst’s Toolkit: Scientific Model Development, Analysis, and Validation
2015-05-20
biased for a variety of reasons, and neurological and physiological data can be corrupted by broken or improperly used sensors. If it were possible...
A meteorologically driven grain sorghum stress indicator model
NASA Technical Reports Server (NTRS)
Taylor, T. W.; Ravet, F. W. (Principal Investigator)
1981-01-01
A grain sorghum soil moisture and temperature stress model is described. It was developed to serve as a meteorological data filter to alert commodity analysts to potential stress conditions and crop phenology in selected grain sorghum production areas. The model also identifies optimum conditions on a daily basis and planting/harvest problems associated with poor trafficability.
21 CFR 1304.23 - Records for chemical analysts.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 9 2011-04-01 2011-04-01 false Records for chemical analysts. 1304.23 Section... REGISTRANTS Continuing Records § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...
21 CFR 1304.23 - Records for chemical analysts.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Records for chemical analysts. 1304.23 Section... REGISTRANTS Continuing Records § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...
21 CFR 1304.23 - Records for chemical analysts.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 9 2012-04-01 2012-04-01 false Records for chemical analysts. 1304.23 Section... REGISTRANTS Continuing Records § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...
21 CFR 1304.23 - Records for chemical analysts.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 9 2014-04-01 2014-04-01 false Records for chemical analysts. 1304.23 Section... REGISTRANTS Continuing Records § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...
78 FR 77769 - Data Collection Available for Public Comments
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-24
... comments to Amy Garcia, Program Analyst, Office of Government Contracting, Small Business Administration, 409 3rd Street, 7th Floor, Washington, DC 20416. FOR FURTHER INFORMATION CONTACT: Amy Garcia, Program Analyst, 202-205- 6842, amy.garcia@sba.gov , or Curtis B. Rich, Management Analyst, 202- 205-7030, curtis...
Osborne, Nikola K P; Taylor, Michael C; Healey, Matthew; Zajac, Rachel
2016-03-01
It is becoming increasingly apparent that contextual information can exert a considerable influence on decisions about forensic evidence. Here, we explored accuracy and contextual influence in bloodstain pattern classification, and how these variables might relate to analyst characteristics. Thirty-nine bloodstain pattern analysts with varying degrees of experience each completed measures of compliance, decision-making style, and need for closure. Analysts then examined a bloodstain pattern without any additional contextual information, and allocated votes to listed pattern types according to favoured and less favoured classifications. Next, if they believed it would assist with their classification, analysts could request items of contextual information - from commonly encountered sources of information in bloodstain pattern analysis - and update their vote allocation. We calculated a shift score for each item of contextual information based on vote reallocation. Almost all forms of contextual information influenced decision-making, with medical findings leading to the highest shift scores. Although there was a small positive association between shift scores and the degree to which analysts displayed an intuitive decision-making style, shift scores did not vary meaningfully as a function of experience or the other characteristics measured. Almost all of the erroneous classifications were made by novice analysts. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
1974-01-01
A collection of blank worksheets for use on each BRAVO problem to be analyzed is supplied, for the purposes of recording the inputs for the BRAVO analysis, working out the definition of mission equipment, recording inputs to the satellite synthesis computer program, estimating satellite earth station costs, costing terrestrial systems, and performing cost effectiveness calculations. The group of analysts working BRAVO will normally use a set of worksheets on each problem; however, the workbook pages are of sufficiently good quality that the user can duplicate them if more worksheet blanks are required than supplied. For Vol. 1, see N74-12493; for Vol. 2, see N74-14530.
X-ray detectors at the Linac Coherent Light Source.
Blaj, Gabriel; Caragiulo, Pietro; Carini, Gabriella; Carron, Sebastian; Dragone, Angelo; Freytag, Dietrich; Haller, Gunther; Hart, Philip; Hasi, Jasmine; Herbst, Ryan; Herrmann, Sven; Kenney, Chris; Markovic, Bojan; Nishimura, Kurtis; Osier, Shawn; Pines, Jack; Reese, Benjamin; Segal, Julie; Tomada, Astrid; Weaver, Matt
2015-05-01
Free-electron lasers (FELs) present new challenges for camera development compared with conventional light sources. At SLAC a variety of technologies are being used to match the demands of the Linac Coherent Light Source (LCLS) and to support a wide range of scientific applications. In this paper an overview of X-ray detector design requirements at FELs is presented and the various cameras in use at SLAC are described for the benefit of users planning experiments or analysts looking at data. Features and operation of the CSPAD camera, which is currently deployed at LCLS, are discussed, and the ePix family, a new generation of cameras under development at SLAC, is introduced.
Combining factual and heuristic knowledge in knowledge acquisition
NASA Technical Reports Server (NTRS)
Gomez, Fernando; Hull, Richard; Karr, Clark; Hosken, Bruce; Verhagen, William
1992-01-01
A knowledge acquisition technique that combines heuristic and factual knowledge represented as two hierarchies is described. These ideas were applied to the construction of a knowledge acquisition interface to the Expert System Analyst (OPERA). The goal of OPERA is to improve the operations support of the computer network in the space shuttle launch processing system. The knowledge acquisition bottleneck lies in gathering knowledge from human experts and transferring it to OPERA. OPERA's knowledge acquisition problem is approached as a classification problem-solving task, combining this approach with the use of factual knowledge about the domain. The interface was implemented in a Symbolics workstation making heavy use of windows, pull-down menus, and other user-friendly devices.
SIERRA Code Coupling Module: Arpeggio User Manual Version 4.44
DOE Office of Scientific and Technical Information (OSTI.GOV)
Subia, Samuel R.; Overfelt, James R.; Baur, David G.
2017-04-01
The SNL Sierra Mechanics code suite is designed to enable simulation of complex multiphysics scenarios. The code suite is composed of several specialized applications which can operate either in standalone mode or coupled with each other. Arpeggio is a supported utility that enables loose coupling of the various Sierra Mechanics applications by providing access to Framework services that facilitate the coupling. More importantly Arpeggio orchestrates the execution of applications that participate in the coupling. This document describes the various components of Arpeggio and their operability. The intent of the document is to provide a fast path for analysts interested in coupled applications via simple examples of its usage.
Towards a Relation Extraction Framework for Cyber-Security Concepts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Corinne L; Bridges, Robert A; Huffer, Kelly M
In order to assist security analysts in obtaining information pertaining to their network, such as novel vulnerabilities, exploits, or patches, information retrieval methods tailored to the security domain are needed. As labeled text data is scarce and expensive, we follow developments in semi-supervised NLP and implement a bootstrapping algorithm for extracting security entities and their relationships from text. The algorithm requires little input data, specifically, a few relations or patterns (heuristics for identifying relations), and incorporates an active learning component which queries the user on the most important decisions to prevent drifting away from the desired relations. Preliminary testing on a small corpus shows promising results, obtaining precision of 0.82.
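A bare-bones version of the bootstrapping loop described above can be sketched as follows. The toy sentences, the single seed pattern, the naive pattern-induction rule, and the omission of the active-learning step are all simplifications for illustration, not the authors' algorithm.

```python
# Bare-bones bootstrapping sketch: a seed pattern extracts (software, CVE)
# pairs, known pairs induce new patterns from co-occurring sentences, and
# the loop repeats. Toy data; no scoring or active learning is modeled.
import re

ENT = r"(?P<ent>[\w\. ]+)"
CVE = r"(?P<cve>CVE-\d{4}-\d+)"

sentences = [
    "OpenSSL is affected by CVE-2014-0160",
    "Apache Struts is affected by CVE-2017-5638",
    "CVE-2017-5638 was found in Apache Struts",
    "CVE-2021-44228 was found in Log4j",
]
patterns = [f"{ENT} is affected by {CVE}"]     # seed heuristic
pairs = set()

for _ in range(3):                             # bootstrap iterations
    # 1. Harvest entity/CVE pairs with the current pattern set.
    for s in sentences:
        for p in patterns:
            m = re.search(p, s)
            if m:
                pairs.add((m.group("ent").strip(), m.group("cve")))
    # 2. Induce new patterns from sentences containing a known pair.
    for ent, cve in list(pairs):
        for s in sentences:
            if ent in s and cve in s:
                candidate = s.replace(ent, ENT).replace(cve, CVE)
                if candidate not in patterns:
                    patterns.append(candidate)

print(pairs)   # includes (Log4j, CVE-2021-44228), learned via bootstrapping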
NBOD2- PROGRAM TO DERIVE AND SOLVE EQUATIONS OF MOTION FOR COUPLED N-BODY SYSTEMS
NASA Technical Reports Server (NTRS)
Frisch, H. P.
1994-01-01
The analysis of the dynamic characteristics of a complex system, such as a spacecraft or a robot, is usually best accomplished through the study of a simulation model. The simulation model must have the same dynamic characteristics as the complex system, while lending itself to mathematical quantification. The NBOD2 computer program was developed to aid in the analysis of spacecraft attitude dynamics. NBOD2 is a very general program that may be applied to a large class of problems involving coupled N-body systems. NBOD2 provides the dynamics analyst with the capability to automatically derive and numerically solve the equations of motion for any system that can be modeled as a topological tree of coupled rigid bodies, flexible bodies, point masses, and symmetrical momentum wheels. NBOD2 uses a topological tree model of the dynamic system to derive the vector-dyadic equations of motion for the system. The user builds this topological tree model by using rigid and flexible bodies, point masses, and symmetrical momentum wheels with appropriate connections. To ensure that the relative motion between contiguous bodies is kinematically constrained, NBOD2 assumes that contiguous rigid and flexible bodies are connected by physically realizable 0, 1, 2, and 3-degrees-of-freedom gimbals. These gimbals prohibit relative translational motion, while permitting up to 3 degrees of relative rotational freedom at hinge points. Point masses may have 0, 1, 2, or 3-degrees of relative translational freedom, and symmetric momentum wheels may have a single degree of rotational freedom relative to the body in which they are embedded. Flexible bodies may possess several degrees of vibrational freedom in addition to the degrees of freedom associated with the connection gimbals. Data concerning the natural modes and vibrations of the flexible bodies must be supplied by the user. NBOD2 combines the best features of the discrete-body approach and the nested body approach to reduce the topological tree to a complete set of nonlinear equations of motion in vector-dyadic form for the system being analyzed. NBOD2 can then numerically solve the equations of motion. Input to NBOD2 consists of a user-supplied description of the system to be modeled. The NBOD2 system includes an interactive, tutorial, input support program to aid the NBOD2 user in preparing input data. Output from NBOD2 consists of a listing of the complete set of nonlinear equations of motion in vector-dyadic form and any user-specified set of system state variables. The NBOD2 program is written in FORTRAN 77 for batch execution and has been implemented on a DEC VAX-11/780 computer. The NBOD2 program was developed in 1978 and last updated in 1982.
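For orientation, the single-body building block that such programs couple together is Euler's rotational equation for a rigid body carrying a momentum wheel; NBOD2's actual output is the full set of coupled, nonlinear vector-dyadic equations for the whole topological tree, so the expression below is only the simplest special case, not the program's derivation:

\mathbf{I}\,\dot{\boldsymbol{\omega}} + \dot{\mathbf{h}}_w + \boldsymbol{\omega} \times \left( \mathbf{I}\,\boldsymbol{\omega} + \mathbf{h}_w \right) = \mathbf{T}_{ext}

where \mathbf{I} is the body inertia dyadic, \boldsymbol{\omega} the body angular velocity, \mathbf{h}_w the wheel's angular momentum relative to the body, and \mathbf{T}_{ext} the external torque about the center of mass.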
The CASA Dallas Fort Worth Remote Sensing Network ICT for Urban Disaster Mitigation
NASA Astrophysics Data System (ADS)
Chandrasekar, Venkatachalam; Chen, Haonan; Philips, Brenda; Seo, Dong-jun; Junyent, Francesc; Bajaj, Apoorva; Zink, Mike; Mcenery, John; Sukheswalla, Zubin; Cannon, Amy; Lyons, Eric; Westbrook, David
2013-04-01
The dual-polarization X-band radar network developed by the U.S. National Science Foundation Engineering Center for Collaborative Adaptive Sensing of the Atmosphere (CASA) has shown great advantages for observation and prediction of hazardous weather events in the lower atmosphere (1-3 km above ground level). The network is operating through a scanning methodology called DCAS, distributed collaborative adaptive sensing, which is designed to focus on particular interesting regions of the atmosphere and disseminate information for decision-making to multiple end-users, such as emergency managers and policy analysts. Since spring 2012, CASA and the North Central Texas Council of Governments (NCTCOG) have embarked on the development of the Dallas Fort Worth (DFW) urban remote sensing network, including an 8-node network of dual-polarization X-band radars, in the populous DFW Metroplex (pop. 6.3 million in 2010). The main goal of the CASA DFW urban demonstration network is to protect the safety and prosperity of humans and ecosystems through research activities that include: 1) to demonstrate the DCAS operation paradigm developed by CASA; 2) to create high-resolution, three-dimensional mapping of the meteorological conditions; 3) to help the local emergency managers issue impacts-based warnings and forecasts for severe wind, tornado, hail, and flash flood hazards. The products of this radar network will include single and multi-radar data, vector wind retrieval, quantitative precipitation estimation and nowcasting, and numerical weather predictions. In addition, the high spatial and temporal resolution rainfall products from CASA can serve as a reliable data input for distributed hydrological models in urban areas. This paper presents the information and communication link between the radars, rainfall product generation, the hydrologic model, and the end-user community in the Dallas Fort Worth Urban Network. Specific details of the Information and Communication Technologies (ICT) between the various subsystems are presented.
CUAHSI Hydrologic Information Systems
NASA Astrophysics Data System (ADS)
Maidment, D.; Zaslavsky, I.; Tarboton, D.; Piasecki, M.; Goodall, J.
2006-12-01
The Consortium of Universities for the Advancement of Hydrologic Science, Inc (CUAHSI) has a Hydrologic Information System (HIS) project, which is supported by NSF to develop infrastructure and services to support the advance of hydrologic science in the United States. This paper provides an overview of the HIS project. A set of web services called WaterOneFlow is being developed to provide better access to water observations data (point measurements of streamflow, water quality, climate and groundwater levels) from government agencies and individual investigator projects. Successful partnerships have been created with the USGS National Water Information System, EPA Storet and the NCDC Climate Data Online. Observations catalogs have been created for stations in the measurement networks of each of these data systems so that they can be queried in a uniform manner through CUAHSI HIS, and data delivered from them directly to the user via web services. A CUAHSI Observations Data Model has been designed for storing individual investigator data and an equivalent set of web services created for that so that individual investigators can publish their data onto the internet in the same format CUAHSI is providing for the federal agency data. These data will be accessed through HIS Servers hosted at the national level by CUAHSI and also by research centers and academic departments for regional application of HIS. An individual user application called HIS Analyst will enable individual hydrologic scientists to access the information from the network of HIS Servers. The present focus is on water observations data but later development of this system will include weather and climate grid information, GIS data, remote sensing data and linkages between data and hydrologic simulation models.
NASA Astrophysics Data System (ADS)
Suftin, I.; Read, J. S.; Walker, J.
2013-12-01
Scientists prefer not having to be tied down to a specific machine or operating system in order to analyze local and remote data sets or publish work. Increasingly, analysis has been migrating to decentralized web services and data sets, using web clients to provide the analysis interface. While simplifying workflow access, analysis, and publishing of data, the move does bring with it its own unique set of issues. Web clients used for analysis typically offer workflows geared towards a single user, with steps and results that are often difficult to recreate and share with others. Furthermore, workflow results often may not be easily used as input for further analysis. Older browsers further complicate things by having no way to maintain larger chunks of information, often offloading the job of storage to the back-end server or trying to squeeze it into a cookie. It has been difficult to provide a concept of "session storage" or "workflow sharing" without a complex orchestration of the back-end for storage depending on either a centralized file system or database. With the advent of HTML5, browsers gained the ability to store more information through the use of the Web Storage API (a browser-cookie holds a maximum of 4 kilobytes). Web Storage gives us the ability to store megabytes of arbitrary data in-browser either with an expiration date or just for a session. This allows scientists to create, update, persist and share their workflow without depending on the backend to store session information, providing the flexibility for new web-based workflows to emerge. In the DSASWeb portal ( http://cida.usgs.gov/DSASweb/ ), using these techniques, the representation of every step in the analyst's workflow is stored as plain-text serialized JSON, which we can generate as a text file and provide to the analyst as an upload. This file may then be shared with others and loaded back into the application, restoring the application to the state it was in when the session file was generated. A user may then view results produced during that session or go back and alter input parameters, creating new results and producing new, unique sessions which they can then again share. This technique not only provides independence for the user to manage their session as they like, but also allows much greater freedom for the application provider to scale out without having to worry about carrying over user information or maintaining it in a central location.
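DSASweb's client code is browser JavaScript built on the Web Storage API; purely to illustrate the serialize, share, and restore round trip described above in a language-neutral way, the sketch below writes a workflow description to plain-text JSON and reads it back. The field names are hypothetical, not DSASweb's actual session schema.

```python
# Language-neutral sketch of the session round trip described above: the
# workflow state is serialized to plain-text JSON, shared as a file, and
# later restored. Field names are hypothetical, not DSASweb's schema.
import json

session = {
    "tool": "shoreline-analysis",
    "inputs": {"baseline": "baseline_2012.shp", "transect_spacing_m": 50},
    "steps": ["cast-transects", "calculate-rates"],
}

text = json.dumps(session, indent=2)   # what the user downloads and shares
restored = json.loads(text)            # what the app loads to rebuild state
assert restored == session
```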
Midgley, Nicholas
2006-01-01
Psychoanalysts have long recognized the complex interaction between clinical data and formal psychoanalytic theories. While clinical data are often used to provide "evidence" for psychoanalytic paradigms, the theoretical model used by the analyst also structures what can and cannot be seen in the data. This delicate interaction between theory and clinical data can be seen in the history of interpretations of Freud's "Analysis of a Phobia in a Five-Year-Old Boy" ("Little Hans"). Freud himself revised his reading of the case in 1926, after which a number of psychoanalysts--including Melanie Klein, Jacques Lacan, and John Bowlby--reinterpreted the case in the light of their particular models of the mind. These analysts each found "evidence" for their theoretical model within this classic case study, and in doing so they illuminated aspects of the case that had previously been obscured, while also revealing a great deal about the shifting preoccupations of psychoanalysis as a field.
Protecting medical data for decision-making analyses.
Brumen, Bostjan; Welzer, Tatjana; Druzovec, Marjan; Golob, Izidor; Jaakkola, Hannu; Rozman, Ivan; Kubalík, Jiri
2005-02-01
In this paper, we present a procedure for data protection, which can be applied before any model-building analyses are performed. In medical environments, abundant data exist, but because of the lack of knowledge, they are rarely analyzed, although they hide valuable and often life-saving knowledge. To be able to analyze the data, the analyst needs to have full access to the relevant sources, but this may be in direct contradiction with the demand that data remain secure and, more importantly in the medical area, private. This is especially the case if the data analyst is outsourced and not directly affiliated with the data owner. We address this issue and propose a solution where the model-building process is still possible while data are better protected. We consider the case where the distributions of original data values are preserved while the values themselves change, so that the resulting model is equivalent to the one built with original data.
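The paper's exact transformation is not reproduced here; as one illustrative stand-in for the idea of changing stored values while approximately preserving their distribution and per-column ordering, the sketch below replaces each column's values rank-for-rank with a bootstrap resample of the same column.

```python
# Illustrative stand-in for the idea above (not the authors' procedure):
# each record's value is replaced by the value of the same rank in a
# bootstrap resample of the column, so the marginal distribution and the
# per-column ordering are (approximately) kept while stored values change.
import numpy as np

rng = np.random.default_rng(42)

def protect_column(values):
    resample = np.sort(rng.choice(values, size=len(values), replace=True))
    ranks = np.argsort(np.argsort(values))   # rank of each original value
    return resample[ranks]

original = rng.normal(loc=120, scale=15, size=1000)   # e.g. blood pressure
protected = protect_column(original)

print(np.round([original.mean(), protected.mean()], 2))
print(np.round([np.percentile(original, 90), np.percentile(protected, 90)], 2))
```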
Supporting the Growing Needs of the GIS Industry
NASA Technical Reports Server (NTRS)
2003-01-01
Visual Learning Systems, Inc. (VLS), of Missoula, Montana, has developed a commercial software application called Feature Analyst. Feature Analyst was conceived under a Small Business Innovation Research (SBIR) contract with NASA's Stennis Space Center, and through the Montana State University TechLink Center, an organization funded by NASA and the U.S. Department of Defense to link regional companies with Federal laboratories for joint research and technology transfer. The software provides a paradigm shift to automated feature extraction, as it utilizes spectral, spatial, temporal, and ancillary information to model the feature extraction process; presents the ability to remove clutter; incorporates advanced machine learning techniques to supply unparalleled levels of accuracy; and includes an exceedingly simple interface for feature extraction.
Willemsen, Hessel
2014-11-01
In this paper I aim to outline the importance of working clinically with affect when treating severely traumatized patients who have a limited capacity to symbolize. These patients, who suffer the loss of maternal care early in life, require the analyst to be closely attuned to the patient's distress through use of the countertransference and with significantly less attention paid to the transference. It is questionable whether we can speak of transference when there is limited capacity to form internal representations. The analyst's relationship with the patient is not necessarily used to make interpretations but, instead, the analyst's reverie functions therapeutically to develop awareness and containment of affect, first in the analyst's mind and, later, in the patient's, so that, in time, a relationship between the patient's mind and the body, as the first object, is made. In contrast to general object-relations theories, in which the first object is considered to be the breast or the mother, Ferrari (2004) proposes that the body is the first object in the emerging mind. Once a relationship between mind and body is established, symbolization becomes possible following the formation of internal representations of affective states in the mind, where previously there were few. Using Ferrari's body-mind model, two clinical case vignettes underline the need to use the countertransference with patients who suffered chronic developmental trauma in early childhood. © 2014, The Society of Analytical Psychology.
The Aviation System Analysis Capability Air Carrier Cost-Benefit Model
NASA Technical Reports Server (NTRS)
Gaier, Eric M.; Edlich, Alexander; Santmire, Tara S.; Wingrove, Earl R.., III
1999-01-01
To meet its objective of assisting the U.S. aviation industry with the technological challenges of the future, NASA must identify research areas that have the greatest potential for improving the operation of the air transportation system. Therefore, NASA is developing the ability to evaluate the potential impact of various advanced technologies. By thoroughly understanding the economic impact of advanced aviation technologies and by evaluating how the new technologies will be used in the integrated aviation system, NASA aims to balance its aeronautical research program and help speed the introduction of high-leverage technologies. To meet these objectives, NASA is building the Aviation System Analysis Capability (ASAC). NASA envisions ASAC primarily as a process for understanding and evaluating the impact of advanced aviation technologies on the U.S. economy. ASAC consists of a diverse collection of models and databases used by analysts and other individuals from the public and private sectors brought together to work on issues of common interest to organizations in the aviation community. ASAC also will be a resource available to the aviation community to analyze; inform; and assist scientists, engineers, analysts, and program managers in their daily work. The ASAC differs from previous NASA modeling efforts in that the economic behavior of buyers and sellers in the air transportation and aviation industries is central to its conception. Commercial air carriers, in particular, are an important stakeholder in this community. Therefore, to fully evaluate the implications of advanced aviation technologies, ASAC requires a flexible financial analysis tool that credibly links the technology of flight with the financial performance of commercial air carriers. By linking technical and financial information, NASA ensures that its technology programs will continue to benefit the user community. In addition, the analysis tool must be capable of being incorporated into the wide-ranging suite of economic and technical models that comprise ASAC. This report describes an Air Carrier Cost-Benefit Model (CBM) that meets these requirements. The ASAC CBM is distinguished from many of the aviation cost-benefit models by its exclusive focus on commercial air carriers. The model considers such benefit categories as time and fuel savings, utilization opportunities, reliability and capacity enhancements, and safety and security improvements. The model distinguishes between benefits that are predictable and those that occur randomly. By making such a distinction, the model captures the ability of air carriers to reoptimize scheduling and crew assignments for predictable benefits. In addition, the model incorporates a life-cycle cost module for new technology, which applies the costs of nonrecurring acquisitions, recurring maintenance and operation, and training to each aircraft equipment type independently.
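As a toy illustration of the kind of calculation such a cost-benefit model performs, the snippet below discounts annual operating benefits against the nonrecurring and recurring cost categories named above; every number and the simple net-present-value formulation are invented for illustration and are not the ASAC CBM.

```python
# Toy net-present-value comparison in the spirit of the benefit and cost
# categories named above (time/fuel savings vs. acquisition, recurring
# maintenance/operations, and training). All numbers are invented.
annual_benefit = 1.8e6   # block-time and fuel savings per aircraft, $/yr
acquisition = 5.0e6      # nonrecurring cost per aircraft, $
recurring = 0.4e6        # maintenance/operations plus training, $/yr
rate, years = 0.07, 15

npv = -acquisition + sum(
    (annual_benefit - recurring) / (1 + rate) ** t for t in range(1, years + 1)
)
print(f"NPV per aircraft: ${npv:,.0f}")
```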
This study examined inter-analyst classification variability based on training site signature selection only for six classifications from a 10 km2 Landsat ETM+ image centered over a highly heterogeneous area in south-central Virginia. Six analysts classified the image...
Avila, Manuel; Graterol, Eduardo; Alezones, Jesús; Criollo, Beisy; Castillo, Dámaso; Kuri, Victoria; Oviedo, Norman; Moquete, Cesar; Romero, Marbella; Hanley, Zaida; Taylor, Margie
2012-06-01
The appearance of rice grain is a key aspect in quality determination. Mainly, this analysis is performed by expert analysts through visual observation; however, due to the subjective nature of the analysis, the results may vary among analysts. In order to evaluate the concordance between analysts from Latin-American rice quality laboratories for rice grain appearance through digital images, an inter-laboratory test was performed with ten analysts and images of 90 grains captured with a high-resolution scanner. Rice grains were classified in four categories including translucent, chalky, white belly, and damaged grain. Data were categorized using statistical parameters such as the mode and its frequency, the relative concordance, and the reproducibility parameter kappa. Additionally, a referential image gallery of typical grain for each category was constructed based on mode frequency. Results showed a kappa value of 0.49, corresponding to moderate reproducibility, attributable to subjectivity in the visual analysis of grain images. These results reveal the need to standardize the evaluation criteria among analysts to improve the confidence of the determination of rice grain appearance.
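The reported reproducibility statistic is Fleiss' kappa for multiple raters; the sketch below computes it from a grain-by-category vote table using toy counts rather than the study's data.

```python
# Fleiss' kappa from an items x categories count matrix (rows = grains,
# columns = translucent/chalky/white belly/damaged). Counts are toy values;
# each row sums to the number of analysts (here 10).
import numpy as np

votes = np.array([
    [8, 1, 1, 0],
    [2, 6, 2, 0],
    [0, 1, 1, 8],
    [5, 3, 2, 0],
])
n_items, n_raters = votes.shape[0], votes[0].sum()

p_i = np.sum(votes * (votes - 1), axis=1) / (n_raters * (n_raters - 1))
P_bar = p_i.mean()                                   # observed agreement
p_j = votes.sum(axis=0) / (n_items * n_raters)       # category proportions
P_e = np.sum(p_j ** 2)                               # chance agreement

kappa = (P_bar - P_e) / (1 - P_e)
print(f"Fleiss' kappa = {kappa:.2f}")
```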
Composable Analytic Systems for next-generation intelligence analysis
NASA Astrophysics Data System (ADS)
DiBona, Phil; Llinas, James; Barry, Kevin
2015-05-01
Lockheed Martin Advanced Technology Laboratories (LM ATL) is collaborating with Professor James Llinas, Ph.D., of the Center for Multisource Information Fusion at the University at Buffalo (State of NY), researching concepts for a mixed-initiative associate system for intelligence analysts to facilitate reduced analysis and decision times while proactively discovering and presenting relevant information based on the analyst's needs, current tasks and cognitive state. Today's exploitation and analysis systems have largely been designed for a specific sensor, data type, and operational context, leading to difficulty in directly supporting the analyst's evolving tasking and work product development preferences across complex Operational Environments. Our interactions with analysts illuminate the need to impact the information fusion, exploitation, and analysis capabilities in a variety of ways, including understanding data options, algorithm composition, hypothesis validation, and work product development. Composable Analytic Systems, an analyst-driven system that increases flexibility and capability to effectively utilize Multi-INT fusion and analytics tailored to the analyst's mission needs, holds promise to address current and future intelligence analysis needs as US forces engage threats in contested and denied environments.
Health versus money. Value judgments in the perspective of decision analysis.
Thompson, M S
1983-01-01
An important, but largely uninvestigated, value trade-off balances marginal nonhealth consumption against marginal medical care. Benefit-cost analysts have traditionally, if not fully satisfactorily, dealt with this issue by valuing health gains by their effects on productivity. Cost-effectiveness analysts compare monetary and health effects and leave their relative valuations to decision makers. A decision-analytic model using the satisfaction or utility gained from nonhealth consumption and the level of health enables one to calculate willingness to pay--a theoretically superior way of assigning monetary values to effects for benefit-cost analysis--and to determine minimally acceptable cost-effectiveness ratios. Examples show how a decision-analytic model of utility can differentiate medical actions so essential that failure to take them would be considered negligent from actions so expensive as to be unjustifiable, and can help to determine optimal legal arrangements for compensation for medical malpractice.
Pedestrian paths: why path-dependence theory leaves health policy analysis lost in space.
Brown, Lawrence D
2010-08-01
Path dependence, a model first advanced to explain puzzles in the diffusion of technology, has lately won allegiance among analysts of the politics of public policy, including health care policy. Though the central premise of the model--that past events and decisions shape options for innovation in the present and future--is indisputable (indeed path dependence is, so to speak, too shallow to be false), the approach, at least as applied to health policy, suffers from ambiguities that undercut its claims to illuminate policy projects such as managed care, on which this article focuses. Because path dependence adds little more than marginal value to familiar images of the politics of policy--incrementalism, for one--analysts might do well to put it on the back burner and pursue instead "thick descriptions" that help them to distinguish different degrees of openness to exogenous change among diverse policy arenas.
RPD-based Hypothesis Reasoning for Cyber Situation Awareness
NASA Astrophysics Data System (ADS)
Yen, John; McNeese, Michael; Mullen, Tracy; Hall, David; Fan, Xiaocong; Liu, Peng
Intelligence workers such as analysts, commanders, and soldiers often need a hypothesis reasoning framework to gain improved situation awareness of the highly dynamic cyber space. The development of such a framework requires the integration of interdisciplinary techniques, including support for distributed cognition (human-in-the-loop hypothesis generation), support for team collaboration (identification of information for hypothesis evaluation), and support for resource-constrained information collection (hypotheses competing for information collection resources). We describe a cognitively inspired framework that is built upon Klein's recognition-primed decision model and integrates the three components of Endsley's situation awareness model. The framework naturally connects the logic world of tools for cyber situation awareness with the mental world of human analysts, enabling the perception, comprehension, and prediction of cyber situations for better prevention, survival, and response to cyber attacks by adapting missions at the operational, tactical, and strategic levels.
Human/autonomy collaboration for the automated generation of intelligence products
NASA Astrophysics Data System (ADS)
DiBona, Phil; Schlachter, Jason; Kuter, Ugur; Goldman, Robert
2017-05-01
Intelligence analysis remains a manual process despite trends toward autonomy in information processing. Analysts need agile decision-support tools that can adapt to the evolving information needs of the mission, allowing the analyst to pose novel analytic questions. Our research enables analysts to provide only a constrained English specification of what the intelligence product should be. Using HTN planning, the autonomy discovers, decides, and generates a workflow of algorithms to create the intelligence product. The analyst can therefore quickly and naturally communicate to the autonomy what information product is needed, rather than how to create it.
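A minimal sketch of the HTN-style decomposition described above, in Python. The task names, methods, and primitive steps are invented for illustration and are not the authors' actual domain model.

```python
# Minimal HTN-style decomposition: a product request expands into a workflow of steps.
METHODS = {
    "produce_activity_report": [["ingest_sigint", "fuse_tracks", "summarize"]],
    "fuse_tracks": [["align_timelines", "associate_detections"]],
}
PRIMITIVE = {"ingest_sigint", "align_timelines", "associate_detections", "summarize"}

def plan(tasks):
    """Depth-first decomposition into a flat workflow of primitive algorithm steps."""
    workflow = []
    for task in tasks:
        if task in PRIMITIVE:
            workflow.append(task)
        else:
            # Use the first applicable method (no backtracking in this sketch).
            workflow.extend(plan(METHODS[task][0]))
    return workflow

print(plan(["produce_activity_report"]))
# ['ingest_sigint', 'align_timelines', 'associate_detections', 'summarize']
```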
Space Object Radiometric Modeling for Hardbody Optical Signature Database Generation
2009-09-01
This presentation summarizes recent activity in monitoring spacecraft health status using passive remote optical nonimaging ... It is beneficial to the observer/analyst to understand the fundamental optical signature variability associated with these detection and ...
The Model Analyst’s Toolkit: Scientific Model Development, Analysis, and Validation
2014-08-20
Many different synthetic series can be generated at once. If the series already exists in the dataset, it is updated to reflect the new values. ...
DIVWAG Model Documentation. Volume II. Programmer/Analyst Manual. Part 4.
1976-07-01
Table-of-contents excerpt: Model Constant Data Deck Structure; Appendix B, Movement Model Program Descriptions; Airmobile Constant Data Deck Structure; Appendix B, Airmobile Model Program Descriptions. The deck structure required by the Airmobile Model constant data load program and the data ...
ERIC Educational Resources Information Center
Watson, William J.
Occupational analysts using Comprehensive Occupational Data Analysis Programs (CODAP) make subjective decisions at various stages in their analysis of an occupation. The possibility exists that two different analysts could reach different conclusions in analyzing an occupation, and thereby provide divergent guidance to management. Two analysts,…
Meghdadi, Amir H; Irani, Pourang
2013-12-01
We propose a novel video visual analytics system for interactive exploration of surveillance video data. Our approach consists of providing analysts with various views of information related to moving objects in a video. To do this we first extract each object's movement path. We visualize each movement by (a) creating a single action shot image (a still image that coalesces multiple frames), (b) plotting its trajectory in a space-time cube and (c) displaying an overall timeline view of all the movements. The action shots provide a still view of the moving object while the path view presents movement properties such as speed and location. We also provide tools for spatial and temporal filtering based on regions of interest. This allows analysts to filter out large amounts of movement activities while the action shot representation summarizes the content of each movement. We incorporated this multi-part visual representation of moving objects in sViSIT, a tool to facilitate browsing through the video content by interactive querying and retrieval of data. Based on our interaction with security personnel who routinely interact with surveillance video data, we identified some of the most common tasks performed. This resulted in designing a user study to measure time-to-completion of the various tasks. These generally required searching for specific events of interest (targets) in videos. Fourteen different tasks were designed and a total of 120 min of surveillance video were recorded (indoor and outdoor locations recording movements of people and vehicles). The time-to-completion of these tasks were compared against a manual fast forward video browsing guided with movement detection. We demonstrate how our system can facilitate lengthy video exploration and significantly reduce browsing time to find events of interest. Reports from expert users identify positive aspects of our approach which we summarize in our recommendations for future video visual analytics systems.
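As a sketch of the action-shot idea (a still image that coalesces multiple frames), the code below composites frames by keeping, per pixel, the value that deviates most from the median background. This is a generic approach and not necessarily the sViSIT implementation; the toy frames are illustrative.

```python
# Minimal sketch: coalesce grayscale frames into a single action-shot image.
import numpy as np

def action_shot(frames):
    """Per pixel, keep the frame value that deviates most from the median background,
    so the moving object appears at each of its positions in one still image."""
    stack = np.stack(frames).astype(float)        # shape (T, H, W)
    background = np.median(stack, axis=0)
    deviation = np.abs(stack - background)
    best = np.argmax(deviation, axis=0)           # which frame to take per pixel
    return np.take_along_axis(stack, best[None], axis=0)[0]

# Toy example: a bright 'object' moving left to right across a dark scene.
frames = [np.zeros((5, 9)) for _ in range(3)]
for t, f in enumerate(frames):
    f[2, 2 + 3 * t] = 1.0
print(action_shot(frames))
```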
The Pacor 2 expert system: A case-based reasoning approach to troubleshooting
NASA Technical Reports Server (NTRS)
Sary, Charisse
1994-01-01
The Packet Processor 2 (Pacor 2) Data Capture Facility (DCF) acquires, captures, and performs level-zero processing of packet telemetry for spaceflight missions that adhere to communication services recommendations established by the Consultative Committee for Space Data Systems (CCSDS). A major goal of this project is to reduce life-cycle costs. One way to achieve this goal is to increase automation. Through automation, using expert systems, and other technologies, staffing requirements will remain static, which will enable the same number of analysts to support more missions. Analysts provide packet telemetry data evaluation and analysis services for all data received. Data that passes this evaluation is forwarded to the Data Distribution Facility (DDF) and released to scientists. Through troubleshooting, data that fails this evaluation is dumped and analyzed to determine if its quality can be improved before it is released. This paper describes a proof-of-concept prototype that troubleshoots data quality problems. The Pacor 2 expert system prototype uses the case-based reasoning (CBR) approach to development, an alternative to a rule-based approach. Because Pacor 2 is not operational, the prototype has been developed using cases that describe existing troubleshooting experience from currently operating missions. Through CBR, this experience will be available to analysts when Pacor 2 becomes operational. As Pacor 2 unique experience is gained, analysts will update the case base. In essence, analysts are training the system as they learn. Once the system has learned the cases most likely to recur, it can serve as an aide to inexperienced analysts, a refresher to experienced analysts for infrequently occurring problems, or a training tool for new analysts. The Expert System Development Methodology (ESDM) is being used to guide development.
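A minimal sketch of the case-based retrieval step described above: a new data-quality problem is matched against stored troubleshooting cases by feature overlap. The case features and fixes below are hypothetical, not the Pacor 2 case base.

```python
# Minimal CBR retrieval: return the stored case most similar to the new problem.
CASE_BASE = [
    {"features": {"crc_errors", "missing_packets"}, "fix": "request retransmit"},
    {"features": {"time_gaps", "missing_packets"},  "fix": "merge overlapping dumps"},
    {"features": {"bad_frame_sync"},                "fix": "reprocess with wider sync window"},
]

def retrieve(problem_features, case_base=CASE_BASE):
    """Return the stored case whose features overlap the problem most (Jaccard score)."""
    def score(case):
        f = case["features"]
        return len(f & problem_features) / len(f | problem_features)
    return max(case_base, key=score)

print(retrieve({"missing_packets", "time_gaps"})["fix"])   # merge overlapping dumps
```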
Alaska IPASS database preparation manual.
P. McHugh; D. Olson; C. Schallau
1989-01-01
Describes the data, their sources, and the calibration procedures used in compiling a database for the Alaska IPASS (interactive policy analysis simulation system) model. Although this manual is for Alaska, it provides generic instructions for analysts preparing databases for other geographical areas.
The analyst's authenticity: "if you see something, say something".
Goldstein, George; Suzuki, Jessica Y
2015-05-01
The history of authenticity in psychoanalysis is as old as analysis itself, but the analyst's authenticity in particular has become an increasingly important area of focus in recent decades. This article traces the development of conceptions of analytic authenticity and proposes that the analyst's spontaneous verbalization of his or her unformulated experience in session can be a potent force in the course of an analysis. We acknowledge that although analytic authenticity can be a challenging ideal for the analyst to strive for, it contains the power to transform the experience of the patient and the analyst, as well as the meaning of their work together. Whether it comes in the form of an insight-oriented comment or a simple acknowledgment of things as they seem to be, a therapist's willingness to speak aloud something that has lost its language is a powerful clinical phenomenon that transcends theoretical orientation and modality. © 2015 Wiley Periodicals, Inc.
Instruction in information structuring improves Bayesian judgment in intelligence analysts.
Mandel, David R
2015-01-01
An experiment was conducted to test the effectiveness of brief instruction in information structuring (i.e., representing and integrating information) for improving the coherence of probability judgments and binary choices among intelligence analysts. Forty-three analysts were presented with comparable sets of Bayesian judgment problems before and immediately after instruction. After instruction, analysts' probability judgments were more coherent (i.e., more additive and compliant with Bayes theorem). Instruction also improved the coherence of binary choices regarding category membership: after instruction, subjects were more likely to invariably choose the category to which they assigned the higher probability of a target's membership. The research provides a rare example of evidence-based validation of effectiveness in instruction to improve the statistical assessment skills of intelligence analysts. Such instruction could also be used to improve the assessment quality of other types of experts who are required to integrate statistical information or make probabilistic assessments.
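To make the coherence criteria concrete, the sketch below computes a posterior with Bayes' theorem and checks that judgments about complementary hypotheses are additive. The numbers are illustrative, not data from the experiment.

```python
# Minimal sketch: Bayes' theorem posterior and an additivity (coherence) check.
def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """P(H | E) from Bayes' theorem."""
    p_e = prior_h * p_e_given_h + (1 - prior_h) * p_e_given_not_h
    return prior_h * p_e_given_h / p_e

p_h = posterior(prior_h=0.3, p_e_given_h=0.8, p_e_given_not_h=0.2)
p_not_h = posterior(prior_h=0.7, p_e_given_h=0.2, p_e_given_not_h=0.8)
# Coherent judgments about H and not-H should sum to one.
print(p_h, p_not_h, abs(p_h + p_not_h - 1.0) < 1e-12)
```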
The lure of the symptom in psychoanalytic treatment.
Ogden, Thomas H; Gabbard, Glen O
2010-06-01
Psychoanalysis, which at its core is a search for truth, stands in a subversive position vis-à-vis the contemporary therapeutic culture that places a premium on symptomatic "cure." Nevertheless, analysts are vulnerable to succumbing to the internal and external pressures for the achievement of symptomatic improvement. In this communication we trace the evolution of Freud's thinking about the relationship between the aims of psychoanalysis and the alleviation of symptoms. We note that analysts today may recapitulate Freud's early struggles in their pursuit of symptom removal. We present an account of a clinical consultation in which the analytic pair were ensnared in an impasse that involved the analyst's preoccupation with the intransigence of one of the patient's symptoms. We suggest alternative ways of working with these clinical issues and offer some thoughts on how our own work as analysts and consultants to colleagues has been influenced by our understanding of what frequently occurs when the analyst becomes symptom-focused.
Self-disclosure, trauma and the pressures on the analyst.
West, Marcus
2017-09-01
This paper argues that self-disclosure is intimately related to traumatic experience and the pressures on the analyst not to re-traumatize the patient or repeat traumatic dynamics. The paper gives a number of examples of such pressures and outlines the difficulties the analyst may experience in adopting an analytic attitude - attempting to stay as closely as possible with what the patient brings. It suggests that self-disclosure may be used to try to disconfirm the patient's negative sense of themselves or the analyst, or to try to induce a positive sense of self or of the analyst which, whilst well-meaning, may be missing the point and may be prolonging the patient's distress. Examples are given of staying with the co-construction of the traumatic early relational dynamics and thus working through the traumatic complex; this attitude is compared and contrasted with some relational psychoanalytic attitudes. © 2017, The Society of Analytical Psychology.
Department of the Air Force Information Technology Program FY 95 President’s Budget
1994-03-01
Budget line descriptions: contractor hardware maintenance support, systems analyst support, software development and maintenance, operations support, configuration management, test support, and off-the-shelf software licenses.
Nothing but the truth: self-disclosure, self-revelation, and the persona of the analyst.
Levine, Susan S
2007-01-01
The question of the analyst's self-disclosure and self-revelation inhabits every moment of every psychoanalytic treatment. All self-disclosures and revelations, however, are not equivalent, and differentiating among them allows us to define a construct that can be called the analytic persona. Analysts already rely on an unarticulated concept of an analytic persona that guides them, for instance, as they decide what constitutes appropriate boundaries. Clinical examples illustrate how self-disclosures and revelations from within and without the analytic persona feel different, for both patient and analyst. The analyst plays a specific role for each patient and is both purposefully and unconsciously different in this context than in other settings. To a great degree, the self is a relational phenomenon. Our ethics call for us to tell nothing but the truth and simultaneously for us not to tell the whole truth. The unarticulated working concept of an analytic persona that many analysts have refers to the self we step out of at the close of each session and the self we step into as the patient enters the room. Attitudes toward self-disclosure and self-revelation can be considered reflections of how we conceptualize this persona.
Fault Tree in the Trenches, A Success Story
NASA Technical Reports Server (NTRS)
Long, R. Allen; Goodson, Amanda (Technical Monitor)
2000-01-01
Getting caught up in the explanation of Fault Tree Analysis (FTA) minutiae is easy. In fact, most FTA literature tends to address FTA concepts and methodology, yet there seem to be few articles addressing actual design changes resulting from the successful application of fault tree analysis. This paper demonstrates how fault tree analysis was used to identify and solve a potentially catastrophic mechanical problem at a rocket motor manufacturer. While developing the fault tree given in this example, the analyst was told by several organizations that the piece of equipment in question had already been evaluated by several committees and organizations, and that the analyst was wasting his time. The fault tree/cut set analysis resulted in a joint redesign of the control system by the tool engineering group and the fault tree analyst, as well as bragging rights for the analyst. (That the fault tree found problems where other engineering reviews had failed was not lost on the other engineering groups.) Even more interesting, this was the analyst's first fault tree, which further demonstrates how effective fault tree analysis can be in guiding (i.e., forcing) the analyst to take a methodical approach in evaluating complex systems.
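As a generic illustration of the cut set analysis mentioned above (not the rocket-motor tree itself), the sketch below expands a small, hypothetical fault tree into its minimal cut sets.

```python
# Minimal sketch: MOCUS-style expansion of a fault tree into minimal cut sets.
GATES = {
    "TOP": ("OR",  ["G1", "valve_stuck"]),
    "G1":  ("AND", ["pump_fails", "G2"]),
    "G2":  ("OR",  ["sensor_fails", "operator_error"]),
}

def cut_sets(event):
    if event not in GATES:                       # basic event
        return [frozenset([event])]
    kind, children = GATES[event]
    child_sets = [cut_sets(c) for c in children]
    if kind == "OR":
        return [cs for group in child_sets for cs in group]
    combined = [frozenset()]                     # AND gate: cross-product of children
    for group in child_sets:
        combined = [a | b for a in combined for b in group]
    return combined

def minimal(sets):
    """Drop any cut set that contains another cut set as a proper subset."""
    return [s for s in sets if not any(o < s for o in sets)]

for cs in minimal(cut_sets("TOP")):
    print(sorted(cs))
```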
Pina, Jamie; Massoudi, Barbara L; Chester, Kelley; Koyanagi, Mark
2018-06-07
Researchers and analysts have not completely examined word frequency analysis as an approach to creating a public health quality improvement taxonomy. Our objective was to develop a taxonomy of public health quality improvement concepts for an online exchange of quality improvement work. We analyzed documents, conducted an expert review, and employed a user-centered design along with a faceted search approach to make online entries searchable for users. To provide the most targeted facets to users, we used word frequency to analyze 334 published public health quality improvement documents to find the most common clusters of word meanings. We then reviewed the highest-weighted concepts and categorized their relationships to quality improvement details in our taxonomy. Next, we mapped meanings to items in our taxonomy and presented them in order of their weighted percentages in the data. Using these methods, we developed and sorted concepts in the faceted search presentation so that online exchange users could access relevant search criteria. We reviewed 50 of the top synonym clusters and identified 12 categories for our taxonomy data. The final categories were as follows: Summary; Planning and Execution Details; Health Impact; Training and Preparation; Information About the Community; Information About the Health Department; Results; Quality Improvement (QI) Staff; Information; Accreditation Details; Collaborations; and Contact Information of the Submitter. User feedback about the taxonomy elements and their presentation in our search environment has been positive. When relevant data are available, the word frequency analysis method may be useful in other taxonomy development efforts for public health.
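A minimal sketch of the word frequency step, assuming a simple tokenizer and stopword list; the documents and terms are illustrative, and the synonym-clustering stage is not shown.

```python
# Minimal sketch: rank corpus terms by their weighted percentage of total frequency.
import re
from collections import Counter

STOPWORDS = {"the", "of", "and", "a", "to", "in", "for"}

def term_weights(documents):
    counts = Counter()
    for doc in documents:
        tokens = re.findall(r"[a-z]+", doc.lower())
        counts.update(t for t in tokens if t not in STOPWORDS)
    total = sum(counts.values())
    return [(term, 100.0 * n / total) for term, n in counts.most_common()]

docs = ["Quality improvement plan for the health department",
        "Community health assessment and improvement results"]
for term, pct in term_weights(docs)[:5]:
    print(f"{term}: {pct:.1f}%")
```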
Analysis and Visualization of Relations in eLearning
NASA Astrophysics Data System (ADS)
Dráždilová, Pavla; Obadi, Gamila; Slaninová, Kateřina; Martinovič, Jan; Snášel, Václav
The popularity of eLearning systems is growing rapidly; this growth is enabled by continuing development in Internet and multimedia technologies. Web-based education has become widespread in the past few years, and various types of learning management systems facilitate the development of Web-based courses. Users of these courses form social networks through the different activities they perform. This chapter focuses on searching for the latent social networks in eLearning systems data. These data consist of students' activity records wherein latent ties among actors are embedded. The social network studied in this chapter is represented by groups of students who have similar contacts and interact in similar social circles. Different methods of data clustering analysis can be applied to these groups, and the findings show the existence of latent ties among the group members. The second part of this chapter focuses on social network visualization. Graphical representation of a social network can describe its structure very efficiently and can enable social network analysts to determine the network's degree of connectivity. Analysts can easily identify individuals with a small or large number of relationships as well as the number of independent groups in a given network. When applied to the field of eLearning, data visualization simplifies the process of monitoring the study activities of individuals or groups, as well as the planning of educational curricula, the evaluation of study processes, etc.
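As a sketch of how latent groups of students might be extracted from interaction data, the code below builds a small graph and applies modularity-based community detection with networkx. The edge list is illustrative, not course data from the chapter.

```python
# Minimal sketch: community detection over a toy student interaction graph.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

edges = [("ana", "ben"), ("ben", "carl"), ("ana", "carl"),     # one study circle
         ("dana", "eva"), ("eva", "filip"), ("dana", "filip"), # another circle
         ("carl", "dana")]                                     # weak bridge between them
g = nx.Graph(edges)

for i, community in enumerate(greedy_modularity_communities(g)):
    print(f"group {i}: {sorted(community)}")
print("degrees:", dict(g.degree()))   # connectivity of individual students
```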
Searching social networks for subgraph patterns
NASA Astrophysics Data System (ADS)
Ogaard, Kirk; Kase, Sue; Roy, Heather; Nagi, Rakesh; Sambhoos, Kedar; Sudit, Moises
2013-06-01
Software tools for Social Network Analysis (SNA) are being developed which support various types of analysis of social networks extracted from social media websites (e.g., Twitter). Once extracted and stored in a database such social networks are amenable to analysis by SNA software. This data analysis often involves searching for occurrences of various subgraph patterns (i.e., graphical representations of entities and relationships). The authors have developed the Graph Matching Toolkit (GMT) which provides an intuitive Graphical User Interface (GUI) for a heuristic graph matching algorithm called the Truncated Search Tree (TruST) algorithm. GMT is a visual interface for graph matching algorithms processing large social networks. GMT enables an analyst to draw a subgraph pattern by using a mouse to select categories and labels for nodes and links from drop-down menus. GMT then executes the TruST algorithm to find the top five occurrences of the subgraph pattern within the social network stored in the database. GMT was tested using a simulated counter-insurgency dataset consisting of cellular phone communications within a populated area of operations in Iraq. The results indicated GMT (when executing the TruST graph matching algorithm) is a time-efficient approach to searching large social networks. GMT's visual interface to a graph matching algorithm enables intelligence analysts to quickly analyze and summarize the large amounts of data necessary to produce actionable intelligence.
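The TruST algorithm itself is not reproduced here; as a generic stand-in, the sketch below runs labelled subgraph isomorphism matching with networkx over a tiny illustrative network, which captures the same "draw a pattern, find its occurrences" idea.

```python
# Minimal sketch: find occurrences of a labelled subgraph pattern in a network.
import networkx as nx
from networkx.algorithms.isomorphism import GraphMatcher, categorical_node_match

network = nx.Graph()
network.add_nodes_from([(1, {"kind": "person"}), (2, {"kind": "person"}),
                        (3, {"kind": "phone"}),  (4, {"kind": "location"})])
network.add_edges_from([(1, 3), (2, 3), (2, 4)])

pattern = nx.Graph()               # pattern: two people linked through one phone
pattern.add_nodes_from([("p1", {"kind": "person"}), ("p2", {"kind": "person"}),
                        ("ph", {"kind": "phone"})])
pattern.add_edges_from([("p1", "ph"), ("p2", "ph")])

matcher = GraphMatcher(network, pattern,
                       node_match=categorical_node_match("kind", None))
for mapping in matcher.subgraph_isomorphisms_iter():
    print(mapping)                 # network node -> pattern node
```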
NASA Technical Reports Server (NTRS)
Basile, Lisa
1988-01-01
The SLDPF is responsible for the capture, quality monitoring, processing, accounting, and shipment of Spacelab and/or Attached Shuttle Payloads (ASP) telemetry data to various user facilities. Expert systems will aid in the performance of the quality assurance and data accounting functions of the two SLDPF functional elements: the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS). Prototypes were developed for each as independent efforts. The SIPS Knowledge System Prototype (KSP) used the commercial shell OPS5+ on an IBM PC/AT; the SOPS Expert System Prototype used the expert system shell CLIPS implemented on a Macintosh personal computer. Both prototypes emulate the duties of the respective QA/DA analysts based upon analyst input and predetermined mission criteria parameters, and recommend instructions and decisions governing the reprocessing, release, or holding for further analysis of data. These prototypes demonstrated feasibility and high potential for operational systems. Increased productivity, decreased tedium, consistency, concise historical records, and a training tool for new analysts were the principal advantages. An operational configuration, taking advantage of the SLDPF network capabilities, is under development with the expert systems being installed on SUN workstations. This new configuration, in conjunction with the potential of the expert systems, will enhance the efficiency, in both time and quality, of the SLDPF's release of Spacelab/ASP data products.
NASA Technical Reports Server (NTRS)
Lovejoy, Andrew E.; Hilburger, Mark W.
2013-01-01
This document outlines a Modeling and Analysis Plan (MAP) to be followed by the SBKF analysts. It includes instructions on modeling and analysis formulation and execution, model verification and validation, identifying sources of error and uncertainty, and documentation. The goal of this MAP is to provide a standardized procedure that ensures uniformity and quality of the results produced by the project and corresponding documentation.
High-level PC-based laser system modeling
NASA Astrophysics Data System (ADS)
Taylor, Michael S.
1991-05-01
Since the inception of the Strategic Defense Initiative (SDI) there have been a multitude of comparison studies done in an attempt to evaluate the effectiveness and relative sizes of complementary, and sometimes competitive, laser weapon systems. It became increasingly apparent that what the systems analyst needed was not only a fast but also a cost-effective way to perform high-level trade studies. In the present investigation, a general procedure is presented for the development of PC-based algorithmic systems models for laser systems. This procedure points out all of the major issues that should be addressed in the design and development of such a model. Issues addressed include defining the problem to be modeled, defining a strategy for development, and, finally, effective use of the model once developed. Being a general procedure, it will allow a systems analyst to develop a model to meet specific needs. To illustrate this method of model development, a description of the Strategic Defense Simulation - Design To (SDS-DT) model developed and used by Science Applications International Corporation (SAIC) is presented. SDS-DT is a menu-driven, fast-executing, PC-based program that can be used either to calculate performance, weight, volume, and cost values for a particular design or, alternatively, to run parametrics on particular system parameters to perhaps optimize a design.
GROSS- GAMMA RAY OBSERVATORY ATTITUDE DYNAMICS SIMULATOR
NASA Technical Reports Server (NTRS)
Garrick, J.
1994-01-01
The Gamma Ray Observatory (GRO) spacecraft will constitute a major advance in gamma ray astronomy by offering the first opportunity for comprehensive observations in the range of 0.1 to 30,000 megaelectronvolts (MeV). The Gamma Ray Observatory Attitude Dynamics Simulator, GROSS, is designed to simulate this mission. The GRO Dynamics Simulator consists of three separate programs: the Standalone Profile Program; the Simulator Program, which contains the Simulation Control Input/Output (SCIO) Subsystem, the Truth Model (TM) Subsystem, and the Onboard Computer (OBC) Subsystem; and the Postprocessor Program. The Standalone Profile Program models the environment of the spacecraft and generates a profile data set for use by the simulator. This data set contains items such as individual external torques; GRO spacecraft, Tracking and Data Relay Satellite (TDRS), and solar and lunar ephemerides; and star data. The Standalone Profile Program is run before a simulation. The SCIO subsystem is the executive driver for the simulator. It accepts user input, initializes parameters, controls simulation, and generates output data files and simulation status display. The TM subsystem models the spacecraft dynamics, sensors, and actuators. It accepts ephemerides, star data, and environmental torques from the Standalone Profile Program. With these and actuator commands from the OBC subsystem, the TM subsystem propagates the current state of the spacecraft and generates sensor data for use by the OBC and SCIO subsystems. The OBC subsystem uses sensor data from the TM subsystem, a Kalman filter (for attitude determination), and control laws to compute actuator commands to the TM subsystem. The OBC subsystem also provides output data to the SCIO subsystem for output to the analysts. The Postprocessor Program is run after simulation is completed. It generates printer and CRT plots and tabular reports of the simulated data at the direction of the user. GROSS is written in FORTRAN 77 and ASSEMBLER and has been implemented on a VAX 11/780 under VMS 4.5. It has a virtual memory requirement of 255k. GROSS was developed in 1986.
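As a generic illustration of the Kalman filtering mentioned for attitude determination (not the GRO flight algorithm), the sketch below performs one predict/update cycle for a small linear system; the matrices and values are illustrative assumptions.

```python
# Minimal sketch: one Kalman filter predict/update step for a linear system.
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One cycle: state x, covariance P, measurement z, with models F, H, Q, R."""
    x_pred = F @ x                       # propagate state
    P_pred = F @ P @ F.T + Q             # propagate covariance
    y = z - H @ x_pred                   # measurement residual
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

x = np.array([0.0, 0.0]); P = np.eye(2)
F = np.array([[1.0, 1.0], [0.0, 1.0]]); H = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2); R = np.array([[0.1]])
x, P = kalman_step(x, P, z=np.array([1.2]), F=F, H=H, Q=Q, R=R)
print(x)
```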
Technology evaluation, assessment, modeling, and simulation: the TEAMS capability
NASA Astrophysics Data System (ADS)
Holland, Orgal T.; Stiegler, Robert L.
1998-08-01
The United States Marine Corps' Technology Evaluation, Assessment, Modeling and Simulation (TEAMS) capability, located at the Naval Surface Warfare Center in Dahlgren, Virginia, provides an environment for detailed test, evaluation, and assessment of live and simulated sensor and sensor-to-shooter systems for the joint warfare community. Frequent use of modeling and simulation allows for cost-effective testing, benchmarking, and evaluation of various levels of sensors and sensor-to-shooter engagements. Interconnectivity to live, instrumented equipment operating in real battle space environments and to remote modeling and simulation facilities participating in advanced distributed simulations (ADS) exercises is available to support a wide range of situational assessment requirements. TEAMS provides a valuable resource for a variety of users. Engineers, analysts, and other technology developers can use TEAMS to evaluate, assess, and analyze tactically relevant phenomenological data on tactical situations. Expeditionary warfare and USMC concept developers can use the facility to support and execute advanced warfighting experiments (AWE) to better assess operational maneuver from the sea (OMFTS) concepts, doctrines, and technology developments. Developers can use the facility to support sensor system hardware, software and algorithm development as well as combat development, acquisition, and engineering processes. Test and evaluation specialists can use the facility to plan, assess, and augment their processes. This paper presents an overview of the TEAMS capability and focuses specifically on the technical challenges associated with the integration of live sensor hardware into a synthetic environment and how those challenges are being met. Existing sensors, recent experiments and facility specifications are featured.
Using interactive visual reasoning to support sense-making: implications for design.
Kodagoda, Neesha; Attfield, Simon; Wong, B L William; Rooney, Chris; Choudhury, Sharmin Tinni
2013-12-01
This research aims to develop design guidelines for systems that support investigators and analysts in the exploration and assembly of evidence and inferences. We focus here on the problem of identifying candidate 'influencers' within a community of practice. To better understand this problem and its related cognitive and interaction needs, we conducted a user study using a system called INVISQUE (INteractive Visual Search and QUery Environment) loaded with content from the ACM Digital Library. INVISQUE supports search and manipulation of results over a freeform infinite 'canvas'. The study focuses on the representations users create and their reasoning processes. It also draws on pre-established theories and frameworks related to sense-making and cognitive work in general, which we apply as 'theoretical lenses' to consider findings and articulate solutions. Analysing the user-study data in the light of these provides some understanding of how the high-level problem of identifying key players within a domain can translate into lower-level questions and interactions. This, in turn, has informed our understanding of representation and functionality needs at a level of description that abstracts away from the specifics of the problem at hand to the class of problems of interest. We consider the study outcomes from the perspective of implications for design.
User-friendly solutions for microarray quality control and pre-processing on ArrayAnalysis.org
Eijssen, Lars M. T.; Jaillard, Magali; Adriaens, Michiel E.; Gaj, Stan; de Groot, Philip J.; Müller, Michael; Evelo, Chris T.
2013-01-01
Quality control (QC) is crucial for any scientific method producing data. Applying adequate QC introduces new challenges in the genomics field, where large amounts of data are produced with complex technologies. For DNA microarrays, specific algorithms for QC and pre-processing, including normalization, have been developed by the scientific community, especially for expression chips of the Affymetrix platform. Many of these have been implemented in the statistical scripting language R and are available from the Bioconductor repository. However, application is hampered by a lack of integrative tools that can be used by users of any experience level. To fill this gap, we developed a freely available tool for QC and pre-processing of Affymetrix gene expression results, extending, integrating and harmonizing functionality of Bioconductor packages. The tool can be easily accessed through a wizard-like web portal at http://www.arrayanalysis.org or downloaded for local use in R. The portal provides extensive documentation, including user guides, interpretation help with real output illustrations and detailed technical documentation. It assists newcomers to the field in performing state-of-the-art QC and pre-processing while offering data analysts an integral open-source package. Providing the scientific community with this easily accessible tool will help improve data quality as well as the reuse and adoption of standards. PMID:23620278
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This report documents the objectives and the conceptual and methodological approach used in the development of the Coal Production Submodule (CPS). It provides a description of the CPS for model analysts and the public. The Coal Market Module provides annual forecasts of prices, production, and consumption of coal.
Trade Space Analysis: Rotational Analyst Research Project
2015-09-01
Acronyms: POM, Program Objective Memoranda; PM, Program Manager; RFP, Request for Proposal; ROM, Rough Order Magnitude; RSM, Response Surface Method; RSE, Response Surface Equation. ... response surface method (RSM) / response surface equations (RSEs) as surrogate models. It uses the RSEs with Monte Carlo simulation to quantitatively ...
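A minimal sketch of the surrogate-based approach named above: a quadratic response surface equation stands in for an expensive model and is sampled by Monte Carlo to quantify output spread. The coefficients and input distributions are illustrative assumptions.

```python
# Minimal sketch: Monte Carlo sampling over a fitted response surface equation (RSE).
import numpy as np

rng = np.random.default_rng(0)

def rse(x1, x2):
    """Assumed fitted response surface: cost as a quadratic in two design factors."""
    return 10.0 + 2.0 * x1 + 1.5 * x2 + 0.8 * x1 * x2 + 0.3 * x1**2

# Monte Carlo over uncertain design inputs.
x1 = rng.uniform(0.0, 1.0, 100_000)
x2 = rng.normal(0.5, 0.1, 100_000)
cost = rse(x1, x2)
print(f"mean={cost.mean():.2f}  p90={np.percentile(cost, 90):.2f}")
```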
Tools of the trade -- and how to use them
NASA Astrophysics Data System (ADS)
2017-07-01
The role of physicists in finance is changing, as quantitative trading opens an exciting alternative to traditional financial modelling, and data science lures would-be 'quants' away. But the void is being steadily filled by a new type of analyst.
The Performance of Local Dependence Measures with Psychological Data
ERIC Educational Resources Information Center
Houts, Carrie R.; Edwards, Michael C.
2013-01-01
The violation of the assumption of local independence when applying item response theory (IRT) models has been shown to have a negative impact on all estimates obtained from the given model. Numerous indices and statistics have been proposed to aid analysts in the detection of local dependence (LD). A Monte Carlo study was conducted to evaluate…
Toward a Model of Lifelong Education.
ERIC Educational Resources Information Center
Knowles, Malcolm S.
Some of the criticisms that have been leveled at the educational establishment by social analysts are discussed. It is suggested that one of the new realities is that education must be a lifelong process in order to avoid the catastrophe of human obsolescence. The assumptions and elements for a new model of education as a lifelong process are…
Counter-terrorism threat prediction architecture
NASA Astrophysics Data System (ADS)
Lehman, Lynn A.; Krause, Lee S.
2004-09-01
This paper will evaluate the feasibility of constructing a system to support intelligence analysts engaged in counter-terrorism. It will discuss the use of emerging techniques to evaluate a large-scale threat data repository (or Infosphere) and to compare analyst-developed models in order to identify and discover potential threat-related activity, with an uncertainty metric used to evaluate the threat. This system will also employ psychological (or intent) modeling to incorporate combatant (i.e., terrorist) beliefs and intent. The paper will explore the feasibility of constructing a hetero-hierarchical (a hierarchy of more than one kind or type characterized by loose connection/feedback among elements of the hierarchy) agent-based framework or "family of agents" to support "evidence retrieval," defined as combing, or searching, the threat data repository and returning information with an uncertainty metric. The counter-terrorism threat prediction architecture will be guided by a series of models constructed to represent threat operational objectives, potential targets, or terrorist objectives. The approach would compare model representations against information retrieved by the agent family to isolate or identify patterns that match within reasonable measures of proximity. The central areas of discussion will be the construction of an agent framework to search the available threat-related information repository; the evaluation of results against models that represent the cultural foundations, mindset, sociology, and emotional drive of typical threat combatants (i.e., the mind and objectives of a terrorist); and the development of evaluation techniques to compare result sets with the models representing threat behavior and threat targets. The applicability of concepts surrounding Modeling Field Theory (MFT) will be discussed as the basis of this research into the development of proximity measures between the models and result sets and to provide feedback in support of model adaptation (learning). The increasingly complex demands facing analysts evaluating activity threatening the security of the United States make the family-of-agents approach to data collection (fusion) a promising area. This paper will discuss a system to support the collection and evaluation of potential threat activity as well as an approach for presentation of the information.
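The paper points to Modeling Field Theory for proximity measures; as a simpler stand-in, the sketch below scores the match between a threat model and retrieved evidence with cosine similarity over illustrative feature vectors.

```python
# Minimal sketch: a proximity score between a threat model and retrieved evidence.
import numpy as np

def proximity(model_vec, evidence_vec):
    """Cosine similarity in [0, 1] for non-negative feature vectors."""
    num = float(np.dot(model_vec, evidence_vec))
    denom = np.linalg.norm(model_vec) * np.linalg.norm(evidence_vec)
    return num / denom if denom else 0.0

# Hypothetical features: financing, travel to region, weapons procurement, chatter volume.
threat_model = np.array([0.9, 0.6, 0.8, 0.3])
evidence     = np.array([0.7, 0.1, 0.9, 0.5])
print(f"proximity = {proximity(threat_model, evidence):.2f}")
```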
NASA Technical Reports Server (NTRS)
Perino, Scott; Bayandor, Javid; Siddens, Aaron
2012-01-01
The anticipated NASA Mars Sample Return (MSR) mission requires a simple and reliable method by which to return collected Martian samples to Earth for scientific analysis. The Multi-Mission Earth Entry Vehicle (MMEEV) is NASA's proposed solution to this MSR requirement. Key aspects of the MMEEV are its reliable and passive operation, energy-absorbing foam-composite structure, and modular impact sphere (IS) design. To aid in the development of an EEV design that can be modified for various mission requirements, two fully parametric finite element models were developed. The first model was developed in an explicit finite element code and was designed to evaluate the impact response of the vehicle and payload during the final stage of the vehicle's return to Earth. The second model was developed in an explicit code and was designed to evaluate the static and dynamic structural response of the vehicle during launch and reentry. In contrast to most other FE models, which are built through a Graphical User Interface (GUI) pre-processor, the current model was developed using a coding technique that allows the analyst to quickly change nearly all aspects of the model, including geometric dimensions, material properties, load and boundary conditions, mesh properties, and analysis controls. Using the developed design tool, a full range of proposed designs can quickly be analyzed numerically and thus the design trade space for the EEV can be fully understood. An engineer can then quickly reach the best design for a specific mission and also adapt and optimize the general design for different missions.
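As a sketch of the scripted, parametric modelling approach described above (not the actual MMEEV tool), the code below holds geometry, material, and mesh parameters in one structure and expands them into a hypothetical solver input deck; the field names and card formats are invented for illustration.

```python
# Minimal sketch: a parameter set expanded into a text input deck for analysis.
from dataclasses import dataclass

@dataclass
class EEVParams:
    radius_m: float = 0.45
    foam_thickness_m: float = 0.08
    foam_density_kgm3: float = 96.0
    mesh_size_m: float = 0.01
    impact_velocity_ms: float = 40.0

def write_input_deck(p: EEVParams) -> str:
    """Expand the parameter set into a (hypothetical) solver input deck."""
    return "\n".join([
        f"*GEOMETRY, radius={p.radius_m}, foam_thickness={p.foam_thickness_m}",
        f"*MATERIAL, foam_density={p.foam_density_kgm3}",
        f"*MESH, element_size={p.mesh_size_m}",
        f"*INITIAL_VELOCITY, vz=-{p.impact_velocity_ms}",
    ])

print(write_input_deck(EEVParams(foam_thickness_m=0.10)))   # one trade-space point
```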
Transcending the caesura: reverie, dreaming and counter-dreaming.
Bergstein, Avner
2013-08-01
The author reflects about our capacity to get in touch with primitive, irrepresentable, seemingly unreachable parts of the Self and with the unrepressed unconscious. It is suggested that when the patient's dreaming comes to a halt, or encounters a caesura, the analyst dreams that which the patient cannot. Getting in touch with such primitive mental states and with the origin of the Self is aspired to, not so much for discovering historical truth or recovering unconscious content, as for generating motion between different parts of the psyche. The movement itself is what expands the mind and facilitates psychic growth. Bion's brave and daring notion of 'caesura', suggesting a link between mature emotions and thinking and intra-uterine life, serves as a model for bridging seemingly unbridgeable states of mind. Bion inspires us to 'dream' creatively, to let our minds roam freely, stressing the analyst's speculative imagination and intuition often bordering on hallucination. However, being on the seam between conscious and unconscious, dreaming subverts the psychic equilibrium and poses a threat of catastrophe as a result of the confusion it affords between the psychotic and the non-psychotic parts of the personality. Hence there is a tendency to try and evade it through a more saturated mode of thinking, often relying on external reality. The analyst's dreaming and intuition, perhaps a remnant of intra-uterine life, is elaborated as means of penetrating and transcending the caesura, thus facilitating patient and analyst to bear unbearable states of mind and the painful awareness of the unknowability of the emotional experience. This is illustrated clinically. Copyright © 2013 Institute of Psychoanalysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, J.R.; O`Neill, D.C.; Barker, B.W.
1994-10-01
The research described in this report is directed toward the development of a workstation-based data management, analysis and visualization system which can be used to improve the Air Force's capability to evaluate site specific environmental hazards. The initial prototype system described in this report is directed toward a specific application to the Massachusetts Military Reservation (formerly Otis Air Force Base) on Cape Cod, Massachusetts. This system integrates a comprehensive, on-line environmental database for the site together with a map-based graphical user interface which facilitates analyst access to the databases and analysis tools needed to characterize the subsurface geologic and hydrologic environments at the site.
ATLAS-plus: Multimedia Instruction in Embryology, Gross Anatomy, and Histology
Chapman, CM; Miller, JG; Bush, LC; Bruenger, JA; Wysor, WJ; Meininger, ET; Wolf, FM; Fischer, TV; Beaudoin, AR; Burkel, WE; MacCallum, DK; Fisher, DL; Carlson, BM
1992-01-01
ATLAS-plus [Advanced Tools for Learning Anatomical Structure] is a multimedia program used to assist in the teaching of anatomy at the University of Michigan Medical School. ATLAS-plus contains three courses: Histology, Embryology, and Gross Anatomy. In addition to the three courses, a glossary containing terms from the three courses is available. All three courses and the glossary are accessible in the ATLAS-plus environment. The ATLAS-plus environment provides a consistent set of tools and options so that the user can navigate easily and intelligently in and between the various courses and modules in the ATLAS-plus world. The program is a collaboration between anatomy and cell biology faculty, medical students, graphic artists, systems analysts, and instructional designers. PMID:1482964
Better Incident Response with SCOT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruner, Todd
2015-04-01
SCOT is an incident response management system and knowledge base designed for incident responders by incident responders. SCOT increases the effectiveness of the team without adding undue burdens. Focused on reducing the friction between analysts and their tools, SCOT enables analysts to document and share their research and response efforts in near real time. Automatically identifying indicators and correlating those indicators, SCOT helps analysts discover and respond to advanced threats.
On Murray Jackson's 1961 'Chair, couch and countertransference'.
Connolly, Angela
2015-09-01
One of the problems facing psychoanalysts of all schools is that theory has evolved at a much faster pace than practice. Whereas there has been an explosion of theory, practice has remained, at least officially, static and unchanging. It is in this sense that Murray Jackson's 1961 paper is still relevant today. Despite the rise of the new relational and intersubjective paradigms, most psychoanalysts, and not a few Jungian analysts, still seem to feel that the couch is an essential component of the analytical setting and process. If the use of the couch is usually justified by the argument that it favours regression, facilitates analytical reverie and protects the patient from the influence of the analyst, over time many important psychoanalysts have come to challenge this position. Increasingly these analysts suggest that the use of the couch may actually be incompatible with the newer theoretical models. This contention is strengthened by some of the findings coming from the neurosciences and infant research. This underlines the necessity of empirical research to verify the clinical effectiveness of these different positions, couch or face-to-face, but it is exactly this type of research that is lacking. © 2015, The Society of Analytical Psychology.
NASA Astrophysics Data System (ADS)
Le Bras, R. J.; Arora, N. S.; Kushida, N.; Kebede, F.; Feitio, P.; Tomuta, E.
2017-12-01
The International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has reached out to the broader scientific community through a series of conferences, the latest of which took place in June 2017 in Vienna, Austria. Stemming from this outreach effort, and building on research and development begun in 2009, the NET-VISA software, which follows a Bayesian modelling approach, has been developed to improve the key step of automatic association of joint seismic, hydro-acoustic, and infrasound detections. When compared with the current operational system, it has consistently been shown in off-line tests to improve the overlap with the analyst-reviewed Reviewed Event Bulletin (REB) by ten percent, for an average of 85% overlap, while the inconsistency rate is essentially the same at about 50%. Testing by analysts in realistic conditions on a few days of data has also demonstrated the software's ability to find additional events that qualify for publication in the REB. Starting in August 2017, the automatic events produced by the software will be reviewed by analysts at the CTBTO, and we report on the initial evaluation of this introduction into operations.
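As a rough illustration of how overlap and inconsistency rates of this kind can be computed, the sketch below matches automatic and reviewed events by origin-time proximity. The matching rule, tolerance, and event times are simplifications and assumptions, not the actual evaluation procedure.

```python
# Minimal sketch: overlap and inconsistency rates between two event bulletins.
def compare_bulletins(automatic, reviewed, tol_s=30.0):
    matched_reviewed = {r for r in reviewed
                        if any(abs(a - r) <= tol_s for a in automatic)}
    matched_auto = {a for a in automatic
                    if any(abs(a - r) <= tol_s for r in reviewed)}
    overlap = len(matched_reviewed) / len(reviewed)        # share of reviewed events recovered
    inconsistency = 1.0 - len(matched_auto) / len(automatic)
    return overlap, inconsistency

auto_times = [10.0, 250.0, 600.0, 950.0]    # illustrative origin times (seconds)
reb_times = [12.0, 605.0, 1200.0]
print(compare_bulletins(auto_times, reb_times))   # overlap ~0.67, inconsistency 0.5
```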
Interactive visual comparison of multimedia data through type-specific views
NASA Astrophysics Data System (ADS)
Burtner, Russ; Bohn, Shawn; Payne, Debbie
2013-01-01
Analysts who work with collections of multimedia to perform information foraging understand how difficult it is to connect information across diverse sets of mixed media. The wealth of information from blogs, social media, and news sites often can provide actionable intelligence; however, many of the tools used on these sources of content are not capable of multimedia analysis because they only analyze a single media type. As such, analysts are taxed to keep a mental model of the relationships among each of the media types when generating the broader content picture. To address this need, we have developed Canopy, a novel visual analytic tool for analyzing multimedia. Canopy provides insight into the multimedia data relationships by exploiting the linkages found in text, images, and video co-occurring in the same document and across the collection. Canopy connects derived and explicit linkages and relationships through multiple connected visualizations to aid analysts in quickly summarizing, searching, and browsing collected information to explore relationships and align content. In this paper, we will discuss the features and capabilities of the Canopy system and walk through a scenario illustrating how this system might be used in an operational environment.
Interactive visual comparison of multimedia data through type-specific views
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burtner, Edwin R.; Bohn, Shawn J.; Payne, Deborah A.
2013-02-05
Analysts who work with collections of multimedia to perform information foraging understand how difficult it is to connect information across diverse sets of mixed media. The wealth of information from blogs, social media, and news sites often can provide actionable intelligence; however, many of the tools used on these sources of content are not capable of multimedia analysis because they only analyze a single media type. As such, analysts are taxed to keep a mental model of the relationships among each of the media types when generating the broader content picture. To address this need, we have developed Canopy, a novel visual analytic tool for analyzing multimedia. Canopy provides insight into the multimedia data relationships by exploiting the linkages found in text, images, and video co-occurring in the same document and across the collection. Canopy connects derived and explicit linkages and relationships through multiple connected visualizations to aid analysts in quickly summarizing, searching, and browsing collected information to explore relationships and align content. In this paper, we will discuss the features and capabilities of the Canopy system and walk through a scenario illustrating how this system might be used in an operational environment. Keywords: Multimedia (Image/Video/Music) Visualization.
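As a sketch of the co-occurrence linkage idea described in the Canopy abstracts above, the code below links media items that appear in the same document and accumulates link weights across a small collection; the documents and item identifiers are illustrative.

```python
# Minimal sketch: build weighted links between media items that co-occur in documents.
from collections import defaultdict
from itertools import combinations

documents = [
    {"text:protest_report", "image:crowd_photo", "video:march_clip"},
    {"text:follow_up_post", "image:crowd_photo"},
    {"text:protest_report", "video:march_clip"},
]

links = defaultdict(int)
for doc in documents:
    for a, b in combinations(sorted(doc), 2):
        links[(a, b)] += 1            # weight = number of documents the pair shares

for (a, b), weight in sorted(links.items(), key=lambda kv: -kv[1]):
    print(f"{a} <-> {b}: {weight}")
```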