NASA Astrophysics Data System (ADS)
Lan, Hengxing; Martin, C. Derek; Lim, C. H.
2007-02-01
Geographic information system (GIS) modeling is used in combination with three-dimensional (3D) rockfall process modeling to assess rockfall hazards. A GIS extension, RockFall Analyst (RA), capable of efficiently handling large amounts of geospatial information related to rockfall behavior, has been developed in ArcGIS using ArcObjects and C#. The 3D rockfall model considers dynamic processes on a cell-plane basis and takes distributed parameters as inputs in the form of raster and polygon features created in GIS. RA includes two major components: particle-based rockfall process modeling and geostatistics-based rockfall raster modeling. Rockfall process simulation results, namely 3D rockfall trajectories and their velocity attributes for either point or polyline seeders, are stored in 3D shapefiles. Distributed raster modeling, based on the 3D rockfall trajectories and a spatial geostatistical technique, represents the distribution of spatial frequency, the flying and/or bouncing height, and the kinetic energy of falling rocks. A rockfall hazard distribution can be created by taking these rockfall characteristics into account. A barrier analysis tool is also provided in RA to aid barrier design. An application of these modeling techniques to a case study is provided. RA has been tested in ArcGIS 8.2, 8.3, 9.0, and 9.1.
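RA itself implements a particle-based 3D process model over raster cells inside ArcGIS, so the following is only a rough illustration of the underlying lumped-mass idea: a single rock simulated over a 2D slope profile in Python. The restitution coefficients (rn, rt), time step, release height, and stopping rule are illustrative assumptions, not RA parameters.

import numpy as np

def simulate_rockfall(profile_x, profile_z, x_start, v0=(1.0, 0.0),
                      rn=0.35, rt=0.85, dt=0.01, g=9.81, v_stop=0.5):
    """Lumped-mass rockfall over a 2D slope profile (x increasing downslope).
    rn, rt are the normal and tangential restitution coefficients."""
    ground = lambda x: np.interp(x, profile_x, profile_z)
    x, z = x_start, ground(x_start) + 0.5        # release 0.5 m above ground
    vx, vz = v0
    traj = [(x, z)]
    for _ in range(200000):
        vz -= g * dt                             # free flight under gravity
        x += vx * dt
        z += vz * dt
        if z <= ground(x):                       # impact: rebound off the slope
            z = ground(x)
            dx = 0.5                             # finite-difference local slope
            slope = (ground(x + dx) - ground(x - dx)) / (2 * dx)
            t = np.array([1.0, slope]) / np.hypot(1.0, slope)   # surface tangent
            n = np.array([-t[1], t[0]])                         # upward normal
            v = np.array([vx, vz])
            vx, vz = rt * (v @ t) * t - rn * (v @ n) * n        # damped rebound
            if np.hypot(vx, vz) < v_stop:        # rock comes to rest
                break
        traj.append((x, z))
        if x < profile_x[0] or x > profile_x[-1]:
            break
    return np.array(traj)

A full RA analysis would additionally rasterize many such trajectories to derive the spatial frequency, bounce height, and kinetic energy surfaces described above.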
Evaluation techniques and metrics for assessment of pan+MSI fusion (pansharpening)
NASA Astrophysics Data System (ADS)
Mercovich, Ryan A.
2015-05-01
Fusion of broadband panchromatic data with narrow band multispectral data - pansharpening - is a common and often studied problem in remote sensing. Many methods exist to produce data fusion results with the best possible spatial and spectral characteristics, and a number have been commercially implemented. This study examines the output products of four commercial implementations with regard to their relative strengths and weaknesses for a set of defined image characteristics and analyst use-cases. Image characteristics used are spatial detail, spatial quality, spectral integrity, and composite color quality (hue and saturation), and analyst use-cases included a variety of object detection and identification tasks. The imagery comes courtesy of the RIT SHARE 2012 collect. Two approaches are used to evaluate the pansharpening methods: analyst evaluation (qualitative measures) and image quality metrics (quantitative measures). Visual analyst evaluation results are compared with metric results to determine which metrics best measure the defined image characteristics and product use-cases and to support future rigorous characterization of the metrics' correlation with the analyst results. Because pansharpening represents a trade-off between adding spatial information from the panchromatic image and retaining spectral information from the MSI channels, the metrics examined are grouped into spatial improvement metrics and spectral preservation metrics. A single metric to quantify the quality of a pansharpening method would necessarily be a combination of weighted spatial and spectral metrics based on the importance of various spatial and spectral characteristics for the primary task of interest. Appropriate metrics and weights for such a combined metric are proposed here, based on the conducted analyst evaluation. Additionally, during this work, a metric was developed specifically focused on assessment of spatial structure improvement relative to a reference image and independent of scene content. Using analysis of Fourier transform images, a measure of high-frequency content is computed in small sub-segments of the image. The average increase in high-frequency content across the image is used as the metric, where averaging across sub-segments combats the scene-dependent nature of typical image sharpness techniques. This metric had an improved range of scores, better representing differences in the test set than other common spatial structure metrics.
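The Fourier-based spatial structure metric is only described at a high level here; the sketch below illustrates the general idea under stated assumptions: the 64-pixel block size, the 0.25 cycles/pixel radial cutoff, and the energy-ratio definition of "high-frequency content" are illustrative choices, not the exact formulation developed in the study.

import numpy as np

def high_freq_content(img, block=64, cutoff=0.25):
    """Mean fraction of FFT power above a radial frequency cutoff,
    averaged over non-overlapping sub-blocks of a single-band image."""
    scores = []
    h, w = img.shape
    freqs = np.fft.fftshift(np.fft.fftfreq(block))
    fy, fx = np.meshgrid(freqs, freqs, indexing="ij")
    radius = np.hypot(fy, fx)                    # cycles per pixel
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            tile = img[r:r + block, c:c + block].astype(float)
            power = np.abs(np.fft.fftshift(np.fft.fft2(tile))) ** 2
            scores.append(power[radius > cutoff].sum() / power.sum())
    return np.mean(scores)

def sharpening_gain(pansharpened, reference):
    """Average increase in high-frequency content relative to a reference,
    e.g. the multispectral band resampled to the panchromatic grid."""
    return high_freq_content(pansharpened) - high_freq_content(reference)

Averaging the block scores before differencing is what suppresses the scene dependence that single global sharpness measures suffer from.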
Setting analyst: A practical harvest planning technique
Olivier R.M. Halleux; W. Dale Greene
2001-01-01
Setting Analyst is an ArcView extension that facilitates practical harvest planning for ground-based systems. By modeling the travel patterns of ground-based machines, it compares different harvesting settings based on projected average skidding distance, logging costs, and site disturbance levels. Setting Analyst uses information commonly available to consulting...
User interface for ground-water modeling: Arcview extension
Tsou, Ming‐shu; Whittemore, Donald O.
2001-01-01
Numerical simulation for ground-water modeling often involves handling large input and output data sets. A geographic information system (GIS) provides an integrated platform to manage, analyze, and display disparate data and can greatly facilitate modeling efforts in data compilation, model calibration, and display of model parameters and results. Furthermore, GIS can be used to generate information for decision making through spatial overlay and processing of model results. ArcView is the most widely used Windows-based GIS software that provides a robust user-friendly interface to facilitate data handling and display. An extension is an add-on program to ArcView that provides additional specialized functions. An ArcView interface for the ground-water flow and transport models MODFLOW and MT3D was built as an extension for facilitating modeling. The extension includes preprocessing of spatially distributed (point, line, and polygon) data for model input and postprocessing of model output. An object database is used for linking user dialogs and model input files. The ArcView interface utilizes the capabilities of the 3D Analyst extension. Models can be automatically calibrated through the ArcView interface by external linking to such programs as PEST. The efficient pre- and postprocessing capabilities and calibration link were demonstrated for ground-water modeling in southwest Kansas.
Forensic steganalysis: determining the stego key in spatial domain steganography
NASA Astrophysics Data System (ADS)
Fridrich, Jessica; Goljan, Miroslav; Soukal, David; Holotyak, Taras
2005-03-01
This paper is an extension of our work on stego key search for JPEG images published at EI SPIE in 2004. We provide a more general theoretical description of the methodology, apply our approach to the spatial domain, and add a method that determines the stego key from multiple images. We show that in the spatial domain the stego key search can be made significantly more efficient by working with the noise component of the image obtained using a denoising filter. The technique is tested on the LSB embedding paradigm and on a special case of embedding by noise adding (the +/-1 embedding). The stego key search can be performed for a wide class of steganographic techniques even for secret message sizes well below those detectable using known methods. The proposed strategy may prove useful to forensic analysts and law enforcement.
Using GIS to analyze animal movements in the marine environment
Hooge, Philip N.; Eichenlaub, William M.; Solomon, Elizabeth K.; Kruse, Gordon H.; Bez, Nicolas; Booth, Anthony; Dorn, Martin W.; Hills, Susan; Lipcius, Romuald N.; Pelletier, Dominique; Roy, Claude; Smith, Stephen J.; Witherell, David B.
2001-01-01
Advanced methods for analyzing animal movements have been little used in the aquatic research environment compared to the terrestrial. In addition, despite obvious advantages of integrating geographic information systems (GIS) with spatial studies of animal movement behavior, movement analysis tools have not been integrated into GIS for either aquatic or terrestrial environments. We therefore developed software that integrates one of the most commonly used GIS programs (ArcView®) with a large collection of animal movement analysis tools. This application, the Animal Movement Analyst Extension (AMAE), can be loaded as an extension to ArcView® under multiple operating system platforms (PC, Unix, and Mac OS). It contains more than 50 functions, including parametric and nonparametric home range analyses, random walk models, habitat analyses, point and circular statistics, tests of complete spatial randomness, tests for autocorrelation and sample size, point and line manipulation tools, and animation tools. This paper describes the use of these functions in analyzing animal location data; some limited examples are drawn from a sonic-tracking study of Pacific halibut (Hippoglossus stenolepis) in Glacier Bay, Alaska. The extension is available on the Internet at www.absc.usgs.gov/glba/gistools/index.htm.
2013-01-01
Introduction There is a large disparity in health services between urban and rural areas in China, and the percentage of people who are unable to access health services because of long travel times is increasing. This paper takes Donghai County as the study unit to analyse areas with physician shortages and the characteristics of the potential spatial accessibility of health services. We analyse how the unequal distribution of health service resources and the New Cooperative Medical Scheme affect the potential spatial accessibility of health services in Donghai County. We also give some advice on how to alleviate the unequal spatial accessibility of health services in areas that are more remote and isolated. Methods The shortest travel times from hospitals to villages are calculated with an origin-destination (O-D) matrix using a GIS network extension. This paper applies an enhanced two-step floating catchment area (E2SFCA) method to study the spatial accessibility of health services and to determine areas with physician shortages in Donghai County. The sensitivity of the E2SFCA for assessing variation in the spatial accessibility of health services is checked using different impedance coefficient values. The Geostatistical Analyst and Spatial Analyst tools are used to analyse the spatial pattern and the edge effect of the potential spatial accessibility of health services. Results The results show that 69% of villages have lower potential spatial accessibility to health services than the average for Donghai County, and 79% of the village scores are lower than the average for Jiangsu Province. The potential spatial accessibility of health services diminishes greatly from the centre of the county to outlying areas. Using a smaller impedance coefficient leads to greater disparity among the villages. The spatial accessibility of health services is greater along the highways in the county. Conclusions Most villages are in underserved health service areas. An unequal distribution of health service resources and the reimbursement policies of the New Cooperative Medical Scheme have led to an edge effect regarding the spatial accessibility of health services in Donghai County, whereby people living on the edge of the county have less access to health services. Comprehensive measures should be considered to alleviate the unequal spatial accessibility of health services in areas that are more remote and isolated. PMID:23688278
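As a rough illustration of the E2SFCA computation (not the exact parameterization used for Donghai County), the sketch below uses a Gaussian distance-decay weight governed by an impedance coefficient and a fixed catchment threshold; the function name, default values, and the continuous decay form are assumptions.

import numpy as np

def e2sfca(travel_time, supply, population, beta=60.0, catchment=60.0):
    """Enhanced two-step floating catchment area accessibility scores.
    travel_time : (n_villages, n_hospitals) travel times in minutes
    supply      : (n_hospitals,) physician counts
    population  : (n_villages,) village populations
    beta        : impedance coefficient of the Gaussian decay
    catchment   : travel-time threshold beyond which the weight is zero"""
    travel_time = np.asarray(travel_time, dtype=float)
    supply = np.asarray(supply, dtype=float)
    population = np.asarray(population, dtype=float)
    w = np.exp(-travel_time ** 2 / beta)          # Gaussian distance decay
    w[travel_time > catchment] = 0.0
    # Step 1: physician-to-population ratio within each hospital catchment
    demand = w.T @ population
    ratio = np.divide(supply, demand, out=np.zeros_like(supply), where=demand > 0)
    # Step 2: sum the reachable ratios for each village, weighted by decay
    return w @ ratio

A smaller beta makes the decay steeper, which is the mechanism behind the reported finding that smaller impedance coefficients produce greater disparity among villages.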
NASA Technical Reports Server (NTRS)
Keely, Leslie
2008-01-01
This is a status report for the project entitled Planetary Spatial Analyst (PSA). This report covers activities from the project inception on October 1, 2007 to June 1, 2008. Originally a three year proposal, PSA was awarded funding for one year and required a revised work statement and budget. At the time of this writing the project is well on track both for completion of work as well as budget. The revised project focused on two objectives: build a solid connection with the target community and implement a prototype software application that provides 3D visualization and spatial analysis technologies for that community. Progress has been made for both of these objectives.
NASA Astrophysics Data System (ADS)
Łojek, Jacek
2012-01-01
The objective of this paper was to use the ArcView 3.2 application for spatial modelling of the exploration forms (pits) at the Bykowszczyzna 8 archaeological site. The site was included in a rescue archaeology programme connected with the construction of the bypass of the town of Kock on national road no. 19 (Siemiatycze-Lublin-Nisko). The principal stage of the archaeological work at Bykowszczyzna 8 involved recovering and inventorying the artefact material filling the exploration forms; once this material had been removed, characteristic storage pits remained within the site, constituting a negative image of the form fills. The shape of the pits was documented in sketches and photographs, and this documentation served as the starting point (source material) for the digitization process. The paper describes the preparation of digital documentation containing site plans at several levels of detail (for the strip, the field, and individual forms) and the generation of 3D models. Such 3D digital documentation at a specific scale enables easy archiving, clear presentation, and simple spatial analyses of the examined objects. The ArcView 3.2 programme and its extensions (Spatial Analyst and 3D Analyst), commonly used as analytical tools in geomorphology, were applied in a novel way to inventory work at the archaeological site. Traditional field sketches serve in this case only as a base for entering data into the programme, rather than as documentation material in themselves, as they used to be. The data visualization method proposed by the author opens new possibilities for using GIS platform software and is a further step in the cooperation between geographers and archaeologists.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-27
... Program Manager Import Administration, from Emeka Chukwudebe, Case Analyst, Import administration, Re..., 2011. \\4\\ See Memorandum for All Interested Parties, from Emeka Chukwudebe, Case Analyst, Import... new shipper review to 300 days if it determines that the case is extraordinarily complicated. See 19...
Rockfall hazard analysis using LiDAR and spatial modeling
NASA Astrophysics Data System (ADS)
Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho
2010-05-01
Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information of the source area of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three-dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls and the spatial attribution of their frequency and energy.
NASA Astrophysics Data System (ADS)
Ferdous, Nazneen; Bhat, Chandra R.
2013-01-01
This paper proposes and estimates a spatial panel ordered-response probit model with temporal autoregressive error terms to analyze changes in urban land development intensity levels over time. Such a model structure maintains a close linkage between the land owner's decision (unobserved to the analyst) and the land development intensity level (observed by the analyst) and accommodates spatial interactions between land owners that lead to spatial spillover effects. In addition, the model structure incorporates spatial heterogeneity as well as spatial heteroscedasticity. The resulting model is estimated using a composite marginal likelihood (CML) approach that does not require any simulation machinery and that can be applied to data sets of any size. A simulation exercise indicates that the CML approach recovers the model parameters very well, even in the presence of high spatial and temporal dependence. In addition, the simulation results demonstrate that ignoring spatial dependency and spatial heterogeneity when both are actually present will lead to bias in parameter estimation. A demonstration exercise applies the proposed model to examine urban land development intensity levels using parcel-level data from Austin, Texas.
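A minimal sketch of the pairwise composite marginal likelihood idea for a spatially correlated ordered-response probit is given below; it assumes a deliberately simplified specification in which the latent error correlation between two parcels is a single coefficient scaled by a known proximity weight, and the threshold and correlation transforms are illustrative. The paper's actual model (temporal autoregressive errors, spatial heterogeneity, and heteroscedasticity) is considerably richer.

import numpy as np
from scipy.stats import multivariate_normal

def neg_pairwise_cml(params, X, y, W, n_cat):
    """Negative pairwise CML for an ordered probit with correlated errors.
    X: (n, p) covariates; y: (n,) ordinal responses coded 0..n_cat-1;
    W: (n, n) symmetric spatial proximity weights in [0, 1];
    params stacks beta (p), threshold increments (n_cat - 1), and rho."""
    params = np.asarray(params, dtype=float)
    y = np.asarray(y, dtype=int)
    n, p = X.shape
    beta = params[:p]
    # strictly increasing thresholds built from positive increments
    psi = np.concatenate(([-np.inf],
                          np.cumsum(np.exp(params[p:p + n_cat - 1])) - 3.0,
                          [np.inf]))
    rho = np.tanh(params[-1])                     # keep correlation in (-1, 1)
    mu = X @ beta
    lo = np.clip(psi[y] - mu, -8.0, 8.0)          # finite integration limits
    hi = np.clip(psi[y + 1] - mu, -8.0, 8.0)
    ll = 0.0
    for i, j in zip(*np.triu_indices(n, k=1)):
        if W[i, j] <= 0:
            continue                              # only spatially close pairs
        r = rho * W[i, j]
        cov = [[1.0, r], [r, 1.0]]
        cdf = lambda a, b: multivariate_normal.cdf([a, b], mean=[0.0, 0.0], cov=cov)
        # rectangle probability that parcel i is in category y[i] and j in y[j]
        pr = (cdf(hi[i], hi[j]) - cdf(lo[i], hi[j])
              - cdf(hi[i], lo[j]) + cdf(lo[i], lo[j]))
        ll += np.log(max(pr, 1e-300))
    return -ll

The returned value can be handed to a generic optimizer such as scipy.optimize.minimize; because only bivariate normal probabilities are needed, no high-dimensional simulation is required, which is the practical appeal of the CML approach.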
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-19
... DEPARTMENT OF COMMERCE National Institute of Standards and Technology Proposed Information Collection; Comment Request; Manufacturing Extension Partnership (MEP) Management Information Reporting... record. Dated: April 16, 2012. Gwellnar Banks, Management Analyst, Office of the Chief Information...
NASA Astrophysics Data System (ADS)
Margitus, Michael R.; Tagliaferri, William A., Jr.; Sudit, Moises; LaMonica, Peter M.
2012-06-01
Understanding the structure and dynamics of networks is of vital importance to winning the global war on terror. To fully comprehend the network environment, analysts must be able to investigate interconnected relationships of many diverse network types simultaneously as they evolve both spatially and temporally. To remove the burden from the analyst of making mental correlations of observations and conclusions from multiple domains, we introduce the Dynamic Graph Analytic Framework (DYGRAF). DYGRAF provides the infrastructure which facilitates a layered multi-modal network analysis (LMMNA) approach that enables analysts to assemble previously disconnected, yet related, networks in a common battle space picture. In doing so, DYGRAF provides the analyst with timely situation awareness, understanding and anticipation of threats, and support for effective decision-making in diverse environments.
Cognitive task analysis of network analysts and managers for network situational awareness
NASA Astrophysics Data System (ADS)
Erbacher, Robert F.; Frincke, Deborah A.; Wong, Pak Chung; Moody, Sarah; Fink, Glenn
2010-01-01
The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The situational-awareness capabilities being developed focus on novel visualization techniques as well as data analysis techniques designed to improve the comprehensibility of the visualizations. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into ensuring we had feedback from network analysts and managers and understanding what their needs truly are. This paper discusses the cognitive task analysis methodology we followed to acquire feedback from the analysts. This paper also provides the details we acquired from the analysts on their processes, goals, concerns, etc. A final result we describe is the generation of a task-flow diagram.
KOLAM: a cross-platform architecture for scalable visualization and tracking in wide-area imagery
NASA Astrophysics Data System (ADS)
Fraser, Joshua; Haridas, Anoop; Seetharaman, Guna; Rao, Raghuveer M.; Palaniappan, Kannappan
2013-05-01
KOLAM is an open, cross-platform, interoperable, scalable and extensible framework supporting a novel multi-scale spatiotemporal dual-cache data structure for big data visualization and visual analytics. This paper focuses on the use of KOLAM for target tracking in high-resolution, high throughput wide format video also known as wide-area motion imagery (WAMI). It was originally developed for the interactive visualization of extremely large geospatial imagery of high spatial and spectral resolution. KOLAM is platform, operating system and (graphics) hardware independent, and supports embedded datasets scalable from hundreds of gigabytes to feasibly petabytes in size on clusters, workstations, desktops and mobile computers. In addition to rapid roam, zoom and hyper-jump spatial operations, a large number of simultaneously viewable embedded pyramid layers (also referred to as multiscale or sparse imagery), interactive colormap and histogram enhancement, spherical projection and terrain maps are supported. The KOLAM software architecture was extended to support airborne wide-area motion imagery by organizing spatiotemporal tiles in very large format video frames using a temporal cache of tiled pyramid cached data structures. The current version supports WAMI animation, fast intelligent inspection, trajectory visualization and target tracking (digital tagging); the latter by interfacing with external automatic tracking software. One of the critical needs for working with WAMI is a supervised tracking and visualization tool that allows analysts to digitally tag multiple targets, quickly review and correct tracking results and apply geospatial visual analytic tools on the generated trajectories. One-click manual tracking combined with multiple automated tracking algorithms are available to assist the analyst and increase human effectiveness.
Cognitive Task Analysis of Network Analysts and Managers for Network Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erbacher, Robert; Frincke, Deborah A.; Wong, Pak C.
The goal of the project was to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The goal is to improve the decision-making process such that decision makers can choose better actions. To this end, we put extensive effort into ensuring we had feedback from network analysts and managers and understood what their needs truly were. Consequently, this is the focus of this portion of the research. This paper discusses the methodology we followed to acquire this feedback from the analysts, namely a cognitive task analysis. Additionally, this paper provides the details we acquired from the analysts. This essentially provides details on their processes, goals, concerns, the data and meta-data they analyze, etc. A final result we describe is the generation of a task-flow diagram.
Evaluation of SNS Beamline Shielding Configurations using MCNPX Accelerated by ADVANTG
DOE Office of Scientific and Technical Information (OSTI.GOV)
Risner, Joel M; Johnson, Seth R.; Remec, Igor
2015-01-01
Shielding analyses for the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory pose significant computational challenges, including highly anisotropic high-energy sources, a combination of deep penetration shielding and an unshielded beamline, and a desire to obtain well-converged nearly global solutions for mapping of predicted radiation fields. The majority of these analyses have been performed using MCNPX with manually generated variance reduction parameters (source biasing and cell-based splitting and Russian roulette) that were largely based on the analyst's insight into the problem specifics. Development of the variance reduction parameters required extensive analyst time, and was often tailored to specific portions of the model phase space. We previously applied a developmental version of the ADVANTG code to an SNS beamline study to perform a hybrid deterministic/Monte Carlo analysis and showed that we could obtain nearly global Monte Carlo solutions with essentially uniform relative errors for mesh tallies that cover extensive portions of the model with typical voxel spacing of a few centimeters. The use of weight window maps and consistent biased sources produced using the FW-CADIS methodology in ADVANTG allowed us to obtain these solutions using substantially less computer time than the previous cell-based splitting approach. While those results were promising, the process of using the developmental version of ADVANTG was somewhat laborious, requiring user-developed Python scripts to drive much of the analysis sequence. In addition, limitations imposed by the size of weight-window files in MCNPX necessitated the use of relatively coarse spatial and energy discretization for the deterministic Denovo calculations that we used to generate the variance reduction parameters. We recently applied the production version of ADVANTG to this beamline analysis, which substantially streamlined the analysis process. We also tested importance function collapsing (in space and energy) capabilities in ADVANTG. These changes, along with the support for parallel Denovo calculations using the current version of ADVANTG, give us the capability to improve the fidelity of the deterministic portion of the hybrid analysis sequence, obtain improved weight-window maps, and reduce both the analyst and computational time required for the analysis process.
NASA Astrophysics Data System (ADS)
Brekke, L. D.; Pruitt, T.; Maurer, E. P.; Duffy, P. B.
2007-12-01
Incorporating climate change information into long-term evaluations of water and energy resources requires analysts to have access to climate projection data that have been spatially downscaled to "basin-relevant" resolution. This is necessary in order to develop system-specific hydrology and demand scenarios consistent with projected climate scenarios. Analysts currently have access to "climate model" resolution data (e.g., at LLNL PCMDI), but not spatially downscaled translations of these datasets. Motivated by a common interest in supporting regional and local assessments, the U.S. Bureau of Reclamation and LLNL (through support from the DOE National Energy Technology Laboratory) have teamed to develop an archive of downscaled climate projections (temperature and precipitation) with geographic coverage consistent with the North American Land Data Assimilation System domain, encompassing the contiguous United States. A web-based information service, hosted at LLNL Green Data Oasis, has been developed to provide Reclamation, LLNL, and other interested analysts free access to archive content. A contemporary statistical method was used to bias-correct and spatially disaggregate projection datasets, and was applied to 112 projections included in the WCRP CMIP3 multi-model dataset hosted by LLNL PCMDI (i.e. 16 GCMs and their multiple simulations of SRES A2, A1b, and B1 emissions pathways).
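The archive applies a specific published bias-correction and spatial-disaggregation procedure; purely as an illustration of the bias-correction half, the sketch below shows generic empirical quantile mapping of a GCM series against observations for one coarse grid cell. The function and variable names are assumptions, and the anomaly-interpolation step that performs the spatial disaggregation is omitted.

import numpy as np

def quantile_map(gcm_hist, obs_hist, gcm_future, n_q=100):
    """Empirical quantile-mapping bias correction for one grid cell.
    gcm_hist, obs_hist : overlapping historical series (model vs. observed)
    gcm_future         : projected series to be corrected"""
    q = np.linspace(0.0, 1.0, n_q)
    gcm_q = np.quantile(gcm_hist, q)              # model climatology quantiles
    obs_q = np.quantile(obs_hist, q)              # observed climatology quantiles
    # place each projected value in the model CDF, then read off the observed
    # value at the same non-exceedance probability
    prob = np.interp(gcm_future, gcm_q, q)
    return np.interp(prob, q, obs_q)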
Spatial-Operator Algebra For Robotic Manipulators
NASA Technical Reports Server (NTRS)
Rodriguez, Guillermo; Kreutz, Kenneth K.; Milman, Mark H.
1991-01-01
Report discusses spatial-operator algebra developed in recent studies of mathematical modeling, control, and design of trajectories of robotic manipulators. Provides succinct representation of mathematically complicated interactions among multiple joints and links of manipulator, thereby relieving analyst of most of tedium of detailed algebraic manipulations. Presents analytical formulation of spatial-operator algebra, describes some specific applications, summarizes current research, and discusses implementation of spatial-operator algebra in the Ada programming language.
Intelligent services for discovery of complex geospatial features from remote sensing imagery
NASA Astrophysics Data System (ADS)
Yue, Peng; Di, Liping; Wei, Yaxing; Han, Weiguo
2013-09-01
Remote sensing imagery has been commonly used by intelligence analysts to discover geospatial features, including complex ones. The overwhelming volume of routine image acquisition requires automated methods or systems for feature discovery instead of manual image interpretation. The methods of extraction of elementary ground features such as buildings and roads from remote sensing imagery have been studied extensively. The discovery of complex geospatial features, however, is still rather understudied. A complex feature, such as a Weapon of Mass Destruction (WMD) proliferation facility, is spatially composed of elementary features (e.g., buildings for hosting fuel concentration machines, cooling towers, transportation roads, and fences). Such spatial semantics, together with thematic semantics of feature types, can be used to discover complex geospatial features. This paper proposes a workflow-based approach for discovery of complex geospatial features that uses geospatial semantics and services. The elementary features extracted from imagery are archived in distributed Web Feature Services (WFSs) and discoverable from a catalogue service. Using spatial semantics among elementary features and thematic semantics among feature types, workflow-based service chains can be constructed to locate semantically-related complex features in imagery. The workflows are reusable and can provide on-demand discovery of complex features in a distributed environment.
A multi-phase network situational awareness cognitive task analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erbacher, Robert; Frincke, Deborah A.; Wong, Pak C.
Abstract The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into making certain that we had feedback from network analysts and managers and understood what their genuine needs are. This article discusses the cognitive task-analysis methodology that we followed to acquire feedback from the analysts. This article also provides the details we acquired from the analysts on their processes, goals, concerns, and the data and metadata that they analyze. Finally, we describe the generation of a novel task-flow diagram representing the activities of the target user base.
76 FR 79152 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-21
... request is for an extension of a currently approved collection. NOAA asks people who operate [email protected] . Dated: December 16, 2011. Gwellnar Banks, Management Analyst, Office of the Chief...
Visualization of Spatio-Temporal Relations in Movement Event Using Multi-View
NASA Astrophysics Data System (ADS)
Zheng, K.; Gu, D.; Fang, F.; Wang, Y.; Liu, H.; Zhao, W.; Zhang, M.; Li, Q.
2017-09-01
Spatio-temporal relations among movement events extracted from temporally varying trajectory data can provide useful information about the evolution of individual or collective movers, as well as their interactions with their spatial and temporal contexts. However, the pure statistical tools commonly used by analysts pose many difficulties, due to the large number of attributes embedded in multi-scale and multi-semantic trajectory data. The need for models that operate at multiple scales to search for relations at different locations within time and space, as well as to intuitively interpret what these relations mean, also presents challenges. Since analysts do not know where or when these relevant spatio-temporal relations might emerge, these models must compute statistical summaries of multiple attributes at different granularities. In this paper, we propose a multi-view approach to visualize the spatio-temporal relations among movement events. We describe a method for visualizing movement events and spatio-temporal relations that uses multiple displays. A visual interface is presented, and the user can interactively select or filter spatial and temporal extents to guide the knowledge discovery process. We also demonstrate how this approach can help analysts to derive and explain the spatio-temporal relations of movement events from taxi trajectory data.
75 FR 37419 - Agency Information Collection Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-29
.... If you anticipate difficulty in submitting comments within that period or if you want access to the...: Written comments should be sent to the following: Denise Clarke, Procurement Analyst, MA-612/950 L'Enfant...
ERIC Educational Resources Information Center
Mulvenon, Sean W.; Wang, Kening; Mckenzie, Sarah; Anderson, Travis
2006-01-01
Effective exploration of spatially referenced educational achievement data can help educational researchers and policy analysts speed up gaining valuable insight into datasets. This article illustrates a demo system developed in the National Office for Research on Measurement and Evaluation Systems (NORMES) for supporting Web-based interactive…
NASA Astrophysics Data System (ADS)
Dąbski, Maciej; Zmarz, Anna; Pabjanek, Piotr; Korczak-Abshire, Małgorzata; Karsznia, Izabela; Chwedorzewska, Katarzyna J.
2017-08-01
High-resolution aerial images allow detailed analyses of periglacial landforms, which is of particular importance in light of climate change and resulting changes in active layer thickness. The aim of this study is to show possibilities of using UAV-based photography to perform spatial analysis of periglacial landforms on the Demay Point peninsula, King George Island, and hence to supplement previous geomorphological studies of the South Shetland Islands. Photogrammetric flights were performed using a PW-ZOOM fixed-winged unmanned aircraft vehicle. Digital elevation models (DEM) and maps of slope and contour lines were prepared in ESRI ArcGIS 10.3 with the Spatial Analyst extension, and three-dimensional visualizations in ESRI ArcScene 10.3 software. Careful interpretation of orthophoto and DEM, allowed us to vectorize polygons of landforms, such as (i) solifluction landforms (solifluction sheets, tongues, and lobes); (ii) scarps, taluses, and a protalus rampart; (iii) patterned ground (hummocks, sorted circles, stripes, nets and labyrinths, and nonsorted nets and stripes); (iv) coastal landforms (cliffs and beaches); (v) landslides and mud flows; and (vi) stone fields and bedrock outcrops. We conclude that geomorphological studies based on commonly accessible aerial and satellite images can underestimate the spatial extent of periglacial landforms and result in incomplete inventories. The PW-ZOOM UAV is well suited to gather detailed geomorphological data and can be used in spatial analysis of periglacial landforms in the Western Antarctic Peninsula region.
National Water-Quality Assessment (NAWQA) area-characterization toolbox
Price, Curtis V.; Nakagaki, Naomi; Hitt, Kerie J.
2010-01-01
This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells. These tools are built on top of standard functionality included in ArcGIS Desktop running at the ArcInfo license level. Most of the tools require a license for the ArcGIS Spatial Analyst extension. ArcGIS is a commercial GIS software system produced by ESRI, Inc. (http://www.esri.com). The NAWQA Area-Characterization Toolbox is not supported by ESRI, Inc. or its technical support staff. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.
NASA Astrophysics Data System (ADS)
Gad, Mohamed A.; Elshehaly, Mai H.; Gračanin, Denis; Elmongui, Hicham G.
2018-02-01
This research presents a novel Trajectory-based Tracking Analyst (TTA) that can track and link spatiotemporally variable data from multiple sources. The proposed technique uses trajectory information to determine the positions of time-enabled and spatially variable scatter data at any given time through a combination of along trajectory adjustment and spatial interpolation. The TTA is applied in this research to track large spatiotemporal data of volcanic eruptions (acquired using multi-sensors) in the unsteady flow field of the atmosphere. The TTA enables tracking injections into the atmospheric flow field, the reconstruction of the spatiotemporally variable data at any desired time, and the spatiotemporal join of attribute data from multiple sources. In addition, we were able to create a smooth animation of the volcanic ash plume at interactive rates. The initial results indicate that the TTA can be applied to a wide range of multiple-source data.
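As a minimal illustration of the along-trajectory part of the idea (the TTA additionally performs spatial interpolation of scattered attribute data, which is omitted here), the sketch below reconstructs a mover's position at arbitrary query times by piecewise-linear interpolation along its observed track; the names and the linearity assumption are illustrative.

import numpy as np

def position_at(track_t, track_xy, t_query):
    """Piecewise-linear along-trajectory interpolation of position.
    track_t  : (n,) sorted observation times
    track_xy : (n, 2) observed positions
    t_query  : scalar or array of query times within the track's time span"""
    x = np.interp(t_query, track_t, track_xy[:, 0])
    y = np.interp(t_query, track_t, track_xy[:, 1])
    return np.column_stack([np.atleast_1d(x), np.atleast_1d(y)])

Attribute values carried by the moving features (e.g. ash concentration) can then be joined at the reconstructed positions, which is what enables the smooth animation and multi-source joins described above.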
Fink, Bruce
2015-02-01
What is love and what part does it play in psychoanalysis? Where are the analyst and the analysand situated in relation to the roles defined as those of the "lover" and the "beloved"? Jacques Lacan explores these and other questions in his soon-to-be-published Seminar VIII: Transference by providing an extensive commentary on Plato's most famous dialogue on love, the Symposium. This paper outlines some of the major points about love that grow out of Lacan's reading of the dialogue and examines their relevance to the analytic setting. Can the analyst be characterized as a sort of modern-day Socrates?
A generalized baleen whale call detection and classification system.
Baumgartner, Mark F; Mussoline, Sarah E
2011-05-01
Passive acoustic monitoring allows the assessment of marine mammal occurrence and distribution at greater temporal and spatial scales than is now possible with traditional visual surveys. However, the large volume of acoustic data and the lengthy and laborious task of manually analyzing these data have hindered broad application of this technique. To overcome these limitations, a generalized automated detection and classification system (DCS) was developed to efficiently and accurately identify low-frequency baleen whale calls. The DCS (1) accounts for persistent narrowband and transient broadband noise, (2) characterizes temporal variation of dominant call frequencies via pitch-tracking, and (3) classifies calls based on attributes of the resulting pitch tracks using quadratic discriminant function analysis (QDFA). Automated detections of sei whale (Balaenoptera borealis) downsweep calls and North Atlantic right whale (Eubalaena glacialis) upcalls were evaluated using recordings collected in the southwestern Gulf of Maine during the spring seasons of 2006 and 2007. The accuracy of the DCS was similar to that of a human analyst: variability in differences between the DCS and an analyst was similar to that between independent analysts, and temporal variability in call rates was similar among the DCS and several analysts.
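The DCS classifies calls from attributes of their pitch tracks using QDFA; a minimal sketch of that final step is given below, assuming illustrative pitch-track attributes (duration, frequency extremes and mean, and contour slope) and scikit-learn's quadratic discriminant analysis in place of the authors' implementation. The label names are assumptions.

import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

def pitch_track_features(times, freqs):
    """Summary attributes of one pitch track (times in s, frequencies in Hz)."""
    slope = np.polyfit(times, freqs, 1)[0]        # overall sweep rate, Hz/s
    return [times[-1] - times[0], freqs.min(), freqs.max(), freqs.mean(), slope]

def train_call_classifier(tracks, labels):
    """Fit a quadratic discriminant classifier on analyst-labelled tracks.
    tracks : list of (times, freqs) arrays from the pitch tracker
    labels : e.g. 'right_upcall', 'sei_downsweep', 'noise'"""
    X = np.array([pitch_track_features(t, f) for t, f in tracks])
    qda = QuadraticDiscriminantAnalysis()
    qda.fit(X, labels)
    return qda                                    # call .predict() on new tracks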
NASA Astrophysics Data System (ADS)
Kushida, N.; Kebede, F.; Feitio, P.; Le Bras, R.
2016-12-01
The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing and testing NET-VISA (Arora et al., 2013), a Bayesian automatic event detection and localization program, and evaluating its performance in a realistic operational mode. In our preliminary testing at the CTBTO, NET-VISA shows better performance than its currently operating automatic localization program. However, given CTBTO's role and its international context, a new technology should be introduced cautiously when it replaces a key piece of the automatic processing. We integrated the results of NET-VISA into the Analyst Review Station, which is extensively used by the analysts, so that they can check the accuracy and robustness of the Bayesian approach. We expect the workload of the analysts to be reduced because of the better performance of NET-VISA in finding missed events and getting a more complete set of stations than the current system, which has been operating for nearly twenty years. The results of a series of tests indicate that the expectations arising from the automatic tests, which show an overall overlap improvement of 11%, meaning that the missed events rate is cut by 42%, hold for the integrated interactive module as well. Analysts find new events that qualify for the CTBTO Reviewed Event Bulletin, beyond the ones analyzed through the standard procedures. Arora, N., Russell, S., and Sudderth, E., NET-VISA: Network Processing Vertically Integrated Seismic Analysis, 2013, Bull. Seismol. Soc. Am., 103, 709-729.
Three-Dimensional Display of Document Set
Lantrip, David B.; Pennock, Kelly A.; Pottier, Marc C.; Schur, Anne; Thomas, James J.; Wise, James A.
2003-06-24
A method for spatializing text content for enhanced visual browsing and analysis. The invention is applied to large text document corpora such as digital libraries, regulations and procedures, archived reports, and the like. The text content from these sources may be transformed to a spatial representation that preserves informational characteristics from the documents. The three-dimensional representation may then be visually browsed and analyzed in ways that avoid language processing and that reduce the analysts' effort.
Three-dimensional display of document set
Lantrip, David B [Oxnard, CA; Pennock, Kelly A [Richland, WA; Pottier, Marc C [Richland, WA; Schur, Anne [Richland, WA; Thomas, James J [Richland, WA; Wise, James A [Richland, WA
2006-09-26
A method for spatializing text content for enhanced visual browsing and analysis. The invention is applied to large text document corpora such as digital libraries, regulations and procedures, archived reports, and the like. The text content from these sources may be transformed to a spatial representation that preserves informational characteristics from the documents. The three-dimensional representation may then be visually browsed and analyzed in ways that avoid language processing and that reduce the analysts' effort.
Three-dimensional display of document set
Lantrip, David B [Oxnard, CA; Pennock, Kelly A [Richland, WA; Pottier, Marc C [Richland, WA; Schur, Anne [Richland, WA; Thomas, James J [Richland, WA; Wise, James A [Richland, WA
2001-10-02
A method for spatializing text content for enhanced visual browsing and analysis. The invention is applied to large text document corpora such as digital libraries, regulations and procedures, archived reports, and the like. The text content from these sources may be transformed to a spatial representation that preserves informational characteristics from the documents. The three-dimensional representation may then be visually browsed and analyzed in ways that avoid language processing and that reduce the analysts' effort.
Three-dimensional display of document set
Lantrip, David B [Oxnard, CA; Pennock, Kelly A [Richland, WA; Pottier, Marc C [Richland, WA; Schur, Anne [Richland, WA; Thomas, James J [Richland, WA; Wise, James A [Richland, WA; York, Jeremy [Bothell, WA
2009-06-30
A method for spatializing text content for enhanced visual browsing and analysis. The invention is applied to large text document corpora such as digital libraries, regulations and procedures, archived reports, and the like. The text content from these sources may be transformed to a spatial representation that preserves informational characteristics from the documents. The three-dimensional representation may then be visually browsed and analyzed in ways that avoid language processing and that reduce the analysts' effort.
OSPAR standard method and software for statistical analysis of beach litter data.
Schulz, Marcus; van Loon, Willem; Fleet, David M; Baggelaar, Paul; van der Meulen, Eit
2017-09-15
The aim of this study is to develop standard statistical methods and software for the analysis of beach litter data. The optimal ensemble of statistical methods comprises the Mann-Kendall trend test, the Theil-Sen slope estimation, the Wilcoxon step trend test and basic descriptive statistics. The application of Litter Analyst, a tailor-made software for analysing the results of beach litter surveys, to OSPAR beach litter data from seven beaches bordering on the south-eastern North Sea, revealed 23 significant trends in the abundances of beach litter types for the period 2009-2014. Litter Analyst revealed a large variation in the abundance of litter types between beaches. To reduce the effects of spatial variation, trend analysis of beach litter data can most effectively be performed at the beach or national level. Spatial aggregation of beach litter data within a region is possible, but resulted in a considerable reduction in the number of significant trends. Copyright © 2017 Elsevier Ltd. All rights reserved.
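A rough sketch of two components of the ensemble, the Mann-Kendall trend test (without a tie correction) and the Theil-Sen slope, applied to the abundance series of one litter type at one beach, is given below; Litter Analyst itself also includes the Wilcoxon step trend test and descriptive statistics, which are omitted here.

import numpy as np
from scipy import stats

def mann_kendall(y):
    """Two-sided Mann-Kendall trend test (no tie correction):
    returns the S statistic and an approximate normal p-value."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    s = sum(np.sign(y[j] - y[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, 2.0 * (1.0 - stats.norm.cdf(abs(z)))

def theil_sen(y, x):
    """Theil-Sen slope (litter items per unit of survey time)."""
    slope, intercept, low, high = stats.theilslopes(y, x)
    return slope, (low, high)

Running both per litter type and per beach, and flagging series with a Mann-Kendall p-value below 0.05, mirrors the kind of per-beach trend screening summarized above.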
How Analysts Cognitively “Connect the Dots”
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradel, Lauren; Self, Jessica S.; Endert, Alexander
2013-06-04
As analysts attempt to make sense of a collection of documents, such as intelligence analysis reports, they may wish to “connect the dots” between pieces of information that may initially seem unrelated. This process of synthesizing information requires users to make connections between pairs of documents, creating a conceptual story. We conducted a user study to analyze the process by which users connect pairs of documents and how they spatially arrange information. Users created conceptual stories that connected the dots using organizational strategies that ranged in complexity. We propose taxonomies for cognitive connections and physical structures used when trying to “connect the dots” between two documents. We compared the user-created stories with a data-mining algorithm that constructs chains of documents using co-occurrence metrics. Using the insight gained into the storytelling process, we offer design considerations for the existing data mining algorithm and corresponding tools to combine the power of data mining and the complex cognitive processing of analysts.
Damage Assessment for Disaster Relief Efforts in Urban Areas Using Optical Imagery and LiDAR Data
NASA Astrophysics Data System (ADS)
Bahr, Thomas
2014-05-01
Imagery combined with LiDAR data and LiDAR-derived products provides a significant source of geospatial data which is of use in disaster mitigation planning. Feature-rich building inventories can be constructed from tools with 3D rooftop extraction capabilities, and two-dimensional outputs such as DSMs and DTMs can be used to generate layers to support routing efforts in Spatial Analyst and Network Analyst workflows. This allows us to leverage imagery and LiDAR tools for disaster mitigation or other scenarios. Software such as ENVI, ENVI LiDAR, and ArcGIS® Spatial and Network Analyst can therefore be used in conjunction to help emergency responders route ground teams in support of disaster relief efforts. This is exemplified by a case study against the background of the magnitude 7.0 earthquake that struck Haiti's capital city of Port-au-Prince on January 12, 2010. Soon after, both LiDAR data and an 8-band WorldView-2 scene were collected to map the disaster zone. The WorldView-2 scene was orthorectified and atmospherically corrected in ENVI prior to use. ENVI LiDAR was used to extract the DSM, DTM, buildings, and debris from the LiDAR data point cloud. These datasets provide a foundation for the 2D portion of the analysis. As the data was acquired over an area of dense urbanization, the majority of ground surfaces are roads, and standing buildings and debris are actually largely separable on the basis of elevation classes. To extract the road network of Port-au-Prince, the LiDAR-based feature height information was fused with the WorldView-2 scene, using ENVI's object-based feature extraction approach. This road network was converted to a network dataset for further analysis by the ArcGIS Network Analyst. For the specific case of Haiti, the distribution of blue tarps, used as accommodations for refugees, provided a spectrally distinct target. Pure blue tarp pixel spectra were selected from the WorldView-2 scene and input as a reference into ENVI's Spectral Angle Mapper (SAM) classification routine, together with a water-shadow mask to prevent false positives. The resulting blue tarp shapefile was input into the ArcGIS Point Density tool, a feature of the Spatial Analyst toolbox. The final distribution map shows the density of blue tarps in Port-au-Prince and can be used to roughly delineate camps of refugees. Analogously, a debris density map was generated after separating the debris elevation class. The combination of this debris density map with the road network made it possible to construct an intact road network of Port-au-Prince within the ArcGIS Network Analyst. Moderate-density debris was used as a cost-increase barrier feature of the network dataset, and high-density debris was used as a total obstruction barrier feature. Based on this information, two hypothetical routing scenarios were analyzed. One involved routing a ground team between two different refugee concentration zones. For the other, potential helicopter landing zones were computed from the LiDAR-derived products and added as facility features to the Network Analyst. Routes from the helicopter landing zones to refugee concentration access points were solved using closest facility logic, again making use of the obstructed network.
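The blue-tarp mapping step relies on ENVI's Spectral Angle Mapper; as a stand-alone illustration of the underlying measure only (not the ENVI implementation), the sketch below computes the spectral angle between every pixel of a multiband cube and a reference spectrum, with an assumed, purely illustrative decision threshold.

import numpy as np

def spectral_angle(cube, reference):
    """Spectral angle (radians) between each pixel and a reference spectrum.
    cube      : (rows, cols, bands) reflectance array
    reference : (bands,) reference spectrum, e.g. a pure blue-tarp pixel"""
    dot = np.einsum('ijk,k->ij', cube, reference)
    norm = np.linalg.norm(cube, axis=2) * np.linalg.norm(reference)
    cos = np.clip(dot / np.maximum(norm, 1e-12), -1.0, 1.0)
    return np.arccos(cos)

# pixels whose angle falls below a chosen threshold are classed as blue tarp,
# e.g. mask = spectral_angle(worldview_cube, tarp_spectrum) < 0.10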
Supporting the Growing Needs of the GIS Industry
NASA Technical Reports Server (NTRS)
2003-01-01
Visual Learning Systems, Inc. (VLS), of Missoula, Montana, has developed a commercial software application called Feature Analyst. Feature Analyst was conceived under a Small Business Innovation Research (SBIR) contract with NASA's Stennis Space Center, and through the Montana State University TechLink Center, an organization funded by NASA and the U.S. Department of Defense to link regional companies with Federal laboratories for joint research and technology transfer. The software provides a paradigm shift to automated feature extraction, as it utilizes spectral, spatial, temporal, and ancillary information to model the feature extraction process; presents the ability to remove clutter; incorporates advanced machine learning techniques to supply unparalleled levels of accuracy; and includes an exceedingly simple interface for feature extraction.
76 FR 44580 - Agency Information Collection Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-26
... as soon as possible. Richard Langston, Procurement Analyst, MA-611/L'Enfant Plaza Building, U.S... . FOR FURTHER INFORMATION CONTACT: Richard Langston at the above address, or by telephone at (202) 287... Authority: 42 U.S.C. 2201. Issued in Washington, DC on July 20, 2011. Patrick Ferraro, Acting Director...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-13
... DEPARTMENT OF JUSTICE National Drug Intelligence Center [OMB Number 1105-0087] Agency Information... of Justice (DOJ), National Drug Intelligence Center (NDIC), will be submitting the following... technicians, medical examiners); and other specific groups such as drug intelligence analysts. The NDIC has...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-14
... DEPARTMENT OF COMMERCE International Trade Administration [A-570-918] Steel Wire Garment Hangers... International Trade Compliance Analyst, Office 9, regarding Third Administrative Review of Steel Wire Garment..., Import Administration; regarding the Antidumping Duty Administrative Review of Steel Garment Wire Hangers...
77 FR 36478 - Notice of Request for Extension of a Currently Approved Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-19
... support of the Single Family Housing Direct Loans and Grants programs. The collection involves the use of... consideration. FOR FURTHER INFORMATION CONTACT: Migdaliz Bernier, Finance and Loan Analyst, Single Family... information is estimated to average 6 minutes per response. Respondents: Individuals and business already...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-16
...: Notice of request for public comments regarding an extension to an existing OMB clearance. SUMMARY: Under...: Ms. Cecelia L. Davis, Procurement Analyst, Acquisition Policy Division, GSA (202) 219-0202 or email... to obtaining financial protection against damages under Government contracts (e.g., use of bonds, bid...
J-adaptive estimation with estimated noise statistics
NASA Technical Reports Server (NTRS)
Jazwinski, A. H.; Hipkins, C.
1973-01-01
The J-adaptive sequential estimator is extended to include simultaneous estimation of the noise statistics in a model for system dynamics. This extension completely automates the estimator, eliminating the requirement of an analyst in the loop. Simulations in satellite orbit determination demonstrate the efficacy of the sequential estimation algorithm.
Modeling Non-Gaussian Time Series with Nonparametric Bayesian Model.
Xu, Zhiguang; MacEachern, Steven; Xu, Xinyi
2015-02-01
We present a class of Bayesian copula models whose major components are the marginal (limiting) distribution of a stationary time series and the internal dynamics of the series. We argue that these are the two features with which an analyst is typically most familiar, and hence that these are natural components with which to work. For the marginal distribution, we use a nonparametric Bayesian prior distribution along with a cdf-inverse cdf transformation to obtain large support. For the internal dynamics, we rely on the traditionally successful techniques of normal-theory time series. Coupling the two components gives us a family of (Gaussian) copula transformed autoregressive models. The models provide coherent adjustments of time scales and are compatible with many extensions, including changes in volatility of the series. We describe basic properties of the models, show their ability to recover non-Gaussian marginal distributions, and use a GARCH modification of the basic model to analyze stock index return series. The models are found to provide better fit and improved short-range and long-range predictions than Gaussian competitors. The models are extensible to a large variety of fields, including continuous time models, spatial models, models for multiple series, models driven by external covariate streams, and non-stationary models.
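As a much-simplified sketch of the copula-transform idea (the paper uses a nonparametric Bayesian prior for the marginal and richer normal-theory dynamics, including a GARCH variant), the code below pairs an empirical marginal with a latent Gaussian AR(1) process; the AR(1) choice and all names are illustrative.

import numpy as np
from scipy.stats import norm

def fit_copula_ar1(y):
    """Empirical marginal plus latent Gaussian AR(1) dependence."""
    ranks = np.argsort(np.argsort(y)) + 1
    u = ranks / (len(y) + 1.0)                    # pseudo-observations in (0, 1)
    z = norm.ppf(u)                               # cdf-inverse cdf to the Gaussian scale
    phi = np.corrcoef(z[:-1], z[1:])[0, 1]        # lag-1 autocorrelation
    return np.sort(y), phi

def simulate(sorted_y, phi, n, seed=0):
    """Simulate a series with the fitted marginal and internal dynamics."""
    rng = np.random.default_rng(seed)
    z = np.empty(n)
    z[0] = rng.standard_normal()
    for t in range(1, n):
        z[t] = phi * z[t - 1] + np.sqrt(1.0 - phi ** 2) * rng.standard_normal()
    # Gaussian cdf back to (0, 1), then the empirical quantile of the data
    return np.quantile(sorted_y, norm.cdf(z))

The two components can be modified independently, which reflects the separation of marginal distribution and internal dynamics that the model class is built around.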
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-19
... DEPARTMENT OF COMMERCE International Trade Administration [A-570-918] Steel Wire Garment Hangers... administrative review of the antidumping duty order on steel wire garment hangers from the People's Republic of... Trade Compliance Analyst, Office 9, regarding the Second Administrative Review of Steel Wire Garment...
Market-Based Reforms in Urban Education.
ERIC Educational Resources Information Center
Ladd, Helen F.
This paper is for policymakers, advocates, and analysts who understand that the issues surrounding the introduction of more market-based mechanisms into education are complex and who accept the view that evidence is useful in sorting out the issues. It uses the market framework of demand, supply, and market pricing to organize the extensive but…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-11
... deadlines for submission of surrogate value data, surrogate value rebuttal comments, case and rebuttal briefs. See Memorandum to the File, from Javier Barrientos, Senior Case Analyst, Certain Frozen Fish... extended the deadlines for submission of case and rebuttal briefs. See Memorandum to the File, from Javier...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-01
... for submission of surrogate value data, surrogate value rebuttal comments, case and rebuttal briefs. See Memorandum to the File, from Javier Barrientos, Senior Case Analyst, Certain Frozen Fish Fillets... extended the deadlines for submission of case and rebuttal briefs. See Memorandum to the File, from Javier...
Casement, Ann
2014-02-01
The Jungian analyst Gerhard Adler left Berlin and re-settled in London in 1936. He was closely involved with the professionalization of analytical psychology internationally and in the UK, including the formation of the International Association for Analytical Psychology (IAAP) and The Society of Analytical Psychology (SAP).The tensions that arose within the latter organization led to a split that ended in the formation of the Association of Jungian Analysts (AJA). A further split at AJA resulted in the creation of another organization, the Independent Group of Analytical Psychologists (IGAP). Adler's extensive publications include his role as an editor of Jung's Collected Works and as editor of the C.G. Jung Letters. © 2014, The Society of Analytical Psychology.
Geographic analysis of forest health indicators using spatial scan statistics
John W. Coulston; Kurt H. Riitters
2003-01-01
Forest health analysts seek to define the location, extent, and magnitude of changes in forest ecosystems, to explain the observed changes when possible, and to draw attention to the unexplained changes for further investigation. The data come from a variety of sources including satellite images, field plot measurements, and low-altitude aerial surveys. Indicators...
Tsai, Yu Hsin; Stow, Douglas; Weeks, John
2013-01-01
The goal of this study was to map and quantify the number of newly constructed buildings in Accra, Ghana between 2002 and 2010 based on high spatial resolution satellite image data. Two semi-automated feature detection approaches for detecting and mapping newly constructed buildings based on QuickBird very high spatial resolution satellite imagery were analyzed: (1) post-classification comparison; and (2) bi-temporal layerstack classification. Feature Analyst software based on a spatial contextual classifier and ENVI Feature Extraction that uses a true object-based image analysis approach of image segmentation and segment classification were evaluated. Final map products representing new building objects were compared and assessed for accuracy using two object-based accuracy measures, completeness and correctness. The bi-temporal layerstack method generated more accurate results compared to the post-classification comparison method due to less confusion with background objects. The spectral/spatial contextual approach (Feature Analyst) outperformed the true object-based feature delineation approach (ENVI Feature Extraction) due to its ability to more reliably delineate individual buildings of various sizes. Semi-automated, object-based detection followed by manual editing appears to be a reliable and efficient approach for detecting and enumerating new building objects. A bivariate regression analysis was performed using neighborhood-level estimates of new building density regressed on a census-derived measure of socio-economic status, yielding an inverse relationship with R2 = 0.31 (n = 27; p = 0.00). The primary utility of the new building delineation results is to support spatial analyses of land cover and land use and demographic change. PMID:24415810
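For reference, the two object-based accuracy measures named above, completeness and correctness, reduce to simple ratios of matched object counts; the counts in this Python sketch are placeholders, not values from the study.

```python
# Object-based accuracy measures named above, computed from matched counts.
# The counts are placeholders, not results from the Accra study.
def completeness(true_positives: int, false_negatives: int) -> float:
    """Fraction of reference new buildings that were detected."""
    return true_positives / (true_positives + false_negatives)

def correctness(true_positives: int, false_positives: int) -> float:
    """Fraction of detected building objects that are real new buildings."""
    return true_positives / (true_positives + false_positives)

print(completeness(85, 15))   # 0.85
print(correctness(85, 25))    # ~0.77
```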
The Use of Object-Oriented Analysis Methods in Surety Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.
1999-05-01
Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.
Germaine, Stephen S.; O'Donnell, Michael S.; Aldridge, Cameron L.; Baer, Lori; Fancher, Tammy; McBeth, Jamie; McDougal, Robert R.; Waltermire, Robert; Bowen, Zachary H.; Diffendorfer, James; Garman, Steven; Hanson, Leanne
2012-01-01
We evaluated how well three leading information-extraction software programs (eCognition, Feature Analyst, Feature Extraction) and manual hand digitization interpreted information from remotely sensed imagery of a visually complex gas field in Wyoming. Specifically, we compared how each mapped the area of and classified the disturbance features present on each of three remotely sensed images, including 30-meter-resolution Landsat, 10-meter-resolution SPOT (Satellite Pour l'Observation de la Terre), and 0.6-meter resolution pan-sharpened QuickBird scenes. Feature Extraction mapped the spatial area of disturbance features most accurately on the Landsat and QuickBird imagery, while hand digitization was most accurate on the SPOT imagery. Footprint non-overlap error was smallest on the Feature Analyst map of the Landsat imagery, the hand digitization map of the SPOT imagery, and the Feature Extraction map of the QuickBird imagery. When evaluating feature classification success against a set of ground-truthed control points, Feature Analyst, Feature Extraction, and hand digitization classified features with similar success on the QuickBird and SPOT imagery, while eCognition classified features poorly relative to the other methods. All maps derived from Landsat imagery classified disturbance features poorly. Using the hand digitized QuickBird data as a reference and making pixel-by-pixel comparisons, Feature Extraction classified features best overall on the QuickBird imagery, and Feature Analyst classified features best overall on the SPOT and Landsat imagery. Based on the entire suite of tasks we evaluated, Feature Extraction performed best overall on the Landsat and QuickBird imagery, while hand digitization performed best overall on the SPOT imagery, and eCognition performed worst overall on all three images. Error rates for both area measurements and feature classification were prohibitively high on Landsat imagery, while QuickBird was time and cost prohibitive for mapping large spatial extents. The SPOT imagery produced map products that were far more accurate than Landsat and did so at a far lower cost than QuickBird imagery. Consideration of degree of map accuracy required, costs associated with image acquisition, software, operator and computation time, and tradeoffs in the form of spatial extent versus resolution should all be considered when evaluating which combination of imagery and information-extraction method might best serve any given land use mapping project. When resources permit, attaining imagery that supports the highest classification and measurement accuracy possible is recommended.
Fuzzification of continuous-value spatial evidence for mineral prospectivity mapping
NASA Astrophysics Data System (ADS)
Yousefi, Mahyar; Carranza, Emmanuel John M.
2015-01-01
Complexities of geological processes portrayed as features in a map (e.g., faults) are natural sources of uncertainty in decision-making for exploration of mineral deposits. Beyond these natural sources of uncertainty, knowledge-driven (e.g., fuzzy logic) mineral prospectivity mapping (MPM) incurs further uncertainty through the analyst's subjective judgment when there is no reliably proven evidential score, corresponding to the relative importance of a geological feature, that can be measured directly. In this regard, analysts apply expert opinion to assess the relative importance of spatial evidence as meaningful decision support. This paper aims to fuzzify continuous spatial data used as proxy evidence in order to facilitate and support fuzzy MPM in generating exploration target areas for further examination of undiscovered deposits. In addition, the paper proposes adapting the concept of expected value to further improve fuzzy logic MPM, because the analysis of uncertain variables can be presented in terms of their expected value. The proposed modified expected value approach to MPM is not only a multi-criteria approach; it also treats the uncertainty of geological processes, as depicted by maps or spatial data, more realistically with respect to biased weighting than classified evidential maps do, because fuzzy membership scores are defined continuously and there is, for example, no need to categorize distances from evidential features into proximity classes using arbitrary intervals. The continuous weighting approach described in this paper, followed by integration of the weighted evidence layers using the modified expected value function, can be used efficiently in either greenfields or brownfields.
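The continuous fuzzification described above can be illustrated with a logistic membership function applied to a distance-to-feature evidence layer; the function form and its parameters here are illustrative assumptions rather than the authors' calibrated choices.

```python
# Continuous fuzzification of a "distance to fault" evidence layer.
# A logistic membership function is one common choice; the slope and
# inflection distance below are illustrative, not the paper's values.
import numpy as np

def fuzzy_membership(distance_m: np.ndarray,
                     inflection_m: float = 500.0,
                     slope: float = 0.01) -> np.ndarray:
    """Map distances to [0, 1]; nearer features get higher evidential scores."""
    return 1.0 / (1.0 + np.exp(slope * (distance_m - inflection_m)))

distances = np.array([0.0, 250.0, 500.0, 1000.0, 2000.0])
print(fuzzy_membership(distances))  # smoothly decreasing scores, no arbitrary proximity classes
```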
Principles of computer processing of Landsat data for geologic applications
Taranik, James V.
1978-01-01
The main objectives of computer processing of Landsat data for geologic applications are to improve the display of image data to the analyst or to facilitate evaluation of the multispectral characteristics of the data. Interpretations of the data are made from enhanced and classified data by an analyst trained in geology. Image enhancements involve adjustments of brightness values for individual picture elements. Image classification involves determination of the brightness values of picture elements for a particular cover type. Histograms are used to display the range and frequency of occurrence of brightness values. Landsat-1 and -2 data are preprocessed at Goddard Space Flight Center (GSFC) to adjust for the detector response of the multispectral scanner (MSS). Adjustments are applied to minimize the effects of striping and to correct for bad-data lines, bad line segments, and lost individual pixel data. Because illumination conditions and landscape characteristics vary considerably and detector response changes with time, the radiometric adjustments applied at GSFC are seldom perfect and some detector striping remains in Landsat data. Rotation of the Earth under the satellite and movements of the satellite platform introduce geometric distortions in the data that must also be compensated for if image data are to be correctly displayed to the data analyst. Adjustments to Landsat data are made to compensate for variable solar illumination and for atmospheric effects. Geometric registration of Landsat data involves determination of the spatial location of a pixel in the output image and the determination of a new value for the pixel. The general objective of image enhancement is to optimize display of the data to the analyst. Contrast enhancements are employed to expand the range of brightness values in Landsat data so that the data can be efficiently recorded in a manner desired by the analyst. Spatial frequency enhancements are designed to enhance boundaries between features which have subtle differences in brightness values. Ratioing tends to reduce the effects due to topography and to emphasize changes in brightness values between two Landsat bands. Simulated natural color is produced for geologists so that the colors of materials on images appear similar to the colors of actual materials in the field. Image classification of Landsat data involves both machine-assisted delineation of multispectral patterns in four-dimensional spectral space and identification of machine-delineated multispectral patterns that represent particular cover conditions. The geological information derived from an analysis of a multispectral classification is usually related to lithology.
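Two of the enhancements described above, a linear contrast stretch and a band ratio, can be sketched in a few lines; the arrays below are synthetic stand-ins for Landsat MSS bands rather than real data.

```python
# A linear contrast stretch and a band ratio, two of the enhancements
# described above. The arrays stand in for Landsat MSS bands; values are synthetic.
import numpy as np

def contrast_stretch(band: np.ndarray, out_max: int = 255) -> np.ndarray:
    """Expand the range of brightness values to fill the display range."""
    lo, hi = band.min(), band.max()
    return ((band - lo) / (hi - lo) * out_max).astype(np.uint8)

def band_ratio(band_a: np.ndarray, band_b: np.ndarray) -> np.ndarray:
    """Ratioing suppresses topographic shading that is common to both bands."""
    return band_a.astype(float) / np.clip(band_b.astype(float), 1e-6, None)

rng = np.random.default_rng(1)
band5 = rng.integers(20, 90, size=(4, 4))   # synthetic "band" arrays
band7 = rng.integers(10, 60, size=(4, 4))

print(contrast_stretch(band5))
print(band_ratio(band7, band5))
```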
Phase 2 and phase 3 presentation grids
Joseph McCollum; Jamie K. Cochran
2009-01-01
Many forest inventory and analysis (FIA) analysts, other researchers, and FIA Spatial Data Services personnel have expressed their desire to use the FIA Phase 2 (P2) and Phase 3 (P3), and Forest Health Monitoring (FHM) grids in presentations and other analytical reports. Such uses have been prohibited due to the necessity of keeping the actual P2, P3, and FHM grids...
NASA Technical Reports Server (NTRS)
Friedman, S. Z.; Walker, R. E.; Aitken, R. B.
1986-01-01
The Image Based Information System (IBIS) has been under development at the Jet Propulsion Laboratory (JPL) since 1975. It is a collection of more than 90 programs that enable processing of image, graphical, tabular data for spatial analysis. IBIS can be utilized to create comprehensive geographic data bases. From these data, an analyst can study various attributes describing characteristics of a given study area. Even complex combinations of disparate data types can be synthesized to obtain a new perspective on spatial phenomena. In 1984, new query software was developed enabling direct Boolean queries of IBIS data bases through the submission of easily understood expressions. An improved syntax methodology, a data dictionary, and display software simplified the analysts' tasks associated with building, executing, and subsequently displaying the results of a query. The primary purpose of this report is to describe the features and capabilities of the new query software. A secondary purpose of this report is to compare this new query software to the query software developed previously (Friedman, 1982). With respect to this topic, the relative merits and drawbacks of both approaches are covered.
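The flavor of the Boolean query capability described above can be sketched with co-registered raster layers and a compound logical expression; the layer names and thresholds are hypothetical and do not reproduce IBIS syntax.

```python
# Boolean query over co-registered raster layers, in the spirit of the
# easily understood query expressions described above. Layer names and
# thresholds are hypothetical, not IBIS syntax.
import numpy as np

slope = np.array([[3, 12], [25, 7]])             # percent slope
landcover = np.array([[1, 2], [2, 1]])           # 1 = forest, 2 = urban
elevation = np.array([[300, 450], [800, 290]])   # meters

# "forest AND slope < 10 AND elevation < 500"
mask = (landcover == 1) & (slope < 10) & (elevation < 500)
print(mask)          # True where all conditions hold
print(mask.sum())    # number of qualifying cells
```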
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coram, Jamie L.; Morrow, James D.; Perkins, David Nikolaus
2015-09-01
This document describes the PANTHER R&D Application, a proof-of-concept user interface application developed under the PANTHER Grand Challenge LDRD. The purpose of the application is to explore interaction models for graph analytics, drive algorithmic improvements from an end-user point of view, and support demonstration of PANTHER technologies to potential customers. The R&D Application implements a graph-centric interaction model that exposes analysts to the algorithms contained within the GeoGraphy graph analytics library. Users define geospatial-temporal semantic graph queries by constructing search templates based on nodes, edges, and the constraints among them. Users then analyze the results of the queries using both geo-spatial and temporal visualizations. Development of this application has made user experience an explicit driver for project and algorithmic level decisions that will affect how analysts one day make use of PANTHER technologies.
ERIC Educational Resources Information Center
RONEY, MAURICE W.; AND OTHERS
DESIGNED FOR USE IN PLANNING PREPARATORY PROGRAMS, THIS CURRICULUM CAN ALSO BE USEFUL IN PLANNING EXTENSION COURSES FOR EMPLOYED PERSONS. MATERIALS WERE ADAPTED FROM A GUIDE PREPARED BY ORANGE COAST COLLEGE, CALIFORNIA, UNDER A CONTRACTUAL ARRANGEMENT WITH THE U.S. OFFICE OF EDUCATION, AND REVIEWED BY A COMMITTEE COMPOSED OF SPECIALISTS IN DATA…
NASA Technical Reports Server (NTRS)
Peters, C.; Kampe, F. (Principal Investigator)
1980-01-01
The mathematical description and implementation of the statistical estimation procedure known as the Houston integrated spatial/spectral estimator (HISSE) is discussed. HISSE is based on a normal mixture model and is designed to take advantage of spectral and spatial information of LANDSAT data pixels, utilizing the initial classification and clustering information provided by the AMOEBA algorithm. The HISSE calculates parametric estimates of class proportions which reduce the error inherent in estimates derived from typical classify and count procedures common to nonparametric clustering algorithms. It also singles out spatial groupings of pixels which are most suitable for labeling classes. These calculations are designed to aid the analyst/interpreter in labeling patches with a crop class label. Finally, HISSE's initial performance on an actual LANDSAT agricultural ground truth data set is reported.
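The contrast between classify-and-count proportions and mixture-model proportion estimates, which motivates HISSE, can be sketched with a generic normal mixture; scikit-learn's GaussianMixture stands in for the estimator here and is not the HISSE implementation.

```python
# Contrast classify-and-count with mixture-model proportion estimates.
# GaussianMixture is a generic stand-in for the normal-mixture machinery
# in HISSE, not the HISSE implementation itself.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# Two overlapping "spectral" classes with true proportions 0.7 / 0.3.
x = np.concatenate([rng.normal(0.0, 1.0, 700),
                    rng.normal(2.0, 1.0, 300)]).reshape(-1, 1)

gm = GaussianMixture(n_components=2, random_state=0).fit(x)

hard_counts = np.bincount(gm.predict(x)) / len(x)   # classify and count
soft_props = gm.weights_                            # mixture proportions

print("classify-and-count:", hard_counts)
print("mixture proportions:", soft_props)
```

With overlapping classes, the hard counts are biased toward the dominant class, while the mixture weights stay closer to the true proportions, which is the point of a parametric proportion estimator.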
NASA Astrophysics Data System (ADS)
Dempewolf, J.; Becker-Reshef, I.; Nakalembe, C. L.; Tumbo, S.; Maurice, S.; Mbilinyi, B.; Ntikha, O.; Hansen, M.; Justice, C. J.; Adusei, B.; Kongo, V.
2015-12-01
In-season monitoring of crop conditions provides critical information for agricultural policy and decision making and most importantly for food security planning and management. Nationwide agricultural monitoring in countries dominated by smallholder farming systems, generally relies on extensive networks of field data collectors. In Tanzania, extension agents make up this network and report on conditions across the country, approaching a "near-census". Data is collected on paper which is resource and time intensive, as well as prone to errors. Data quality is ambiguous and there is a general lack of clear and functional feedback loops between farmers, extension agents, analysts and decision makers. Moreover, the data are not spatially explicit, limiting the usefulness for analysis and quality of policy outcomes. Despite significant advances in remote sensing and information communication technologies (ICT) for monitoring agriculture, the full potential of these new tools is yet to be realized in Tanzania. Their use is constrained by the lack of resources, skills and infrastructure to access and process these data. The use of ICT technologies for data collection, processing and analysis is equally limited. The AgriSense-STARS project is developing and testing a system for national-scale in-season monitoring of smallholder agriculture using a combination of three main tools, 1) GLAM-East Africa, an automated MODIS satellite image processing system, 2) field data collection using GeoODK and unmanned aerial vehicles (UAVs), and 3) the Tanzania Crop Monitor, a collaborative online portal for data management and reporting. These tools are developed and applied in Tanzania through the National Food Security Division of the Ministry of Agriculture, Food Security and Cooperatives (MAFC) within a statistically representative sampling framework (area frame) that ensures data quality, representability and resource efficiency.
On the extensive unification of digital-to-analog converters and kernels
NASA Astrophysics Data System (ADS)
Liao, Yanchu
2012-09-01
System administrators agree that scalable communication is an interesting new topic in the field of steganography, and leading analysts concur. After years of unfortunate research into context-free grammar, we argue the intuitive unification of fiber-optic cables and context-free grammar. Our focus here is not on whether sensor networks and randomized algorithms can collaborate to accomplish this aim, but rather on introducing an analysis of DHTs [2] (Soupy Coil).
NASA Astrophysics Data System (ADS)
Garfinkle, Noah W.; Selig, Lucas; Perkins, Timothy K.; Calfas, George W.
2017-05-01
Increasing worldwide internet connectivity and access to sources of print and open social media have increased the near real-time availability of textual information. Capabilities to structure and integrate textual data streams can contribute to more meaningful representations of operational environment factors (i.e., Political, Military, Economic, Social, Infrastructure, Information, Physical Environment, and Time [PMESII-PT]) and tactical civil considerations (i.e., Areas, Structures, Capabilities, Organizations, People and Events [ASCOPE]). However, relying upon human analysts to encode this information as it arrives quickly proves intractable. While human analysts possess an ability to comprehend context in unstructured text far beyond that of computers, automated geoparsing (the extraction of locations from unstructured text) can empower analysts to automate sifting through datasets for areas of interest. This research evaluates existing approaches to geoprocessing and initiates the research and development of locally improved methods of tagging parts of text as possible locations, resolving possible locations into coordinates, and interfacing such results with human analysts. The objective of this ongoing research is to develop a more contextually complete picture of an area of interest (AOI), including human-geographic context for events. In particular, our research is working to make improvements to geoparsing (i.e., the extraction of spatial context from documents), which requires development, integration, and validation of named-entity recognition (NER) tools, gazetteers, and entity-attribution. This paper provides an overview of NER models and methodologies as applied to geoparsing, explores several challenges encountered, presents preliminary results from the creation of a flexible geoparsing research pipeline, and introduces ongoing and future work with the intention of contributing to the efficient geocoding of information containing valuable insights into human activities in space.
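A minimal gazetteer-lookup sketch conveys the two geoparsing steps named above (tagging candidate place names, then resolving them to coordinates); the gazetteer entries are hypothetical, and operational pipelines use trained NER models and far larger gazetteers.

```python
# Minimal gazetteer-lookup sketch of geoparsing: tag candidate place names
# in free text and resolve them to coordinates. The gazetteer entries are
# hypothetical; real pipelines use NER models and far larger gazetteers.
import re

GAZETTEER = {
    "springfield": (39.80, -89.64),
    "riverton": (41.02, -92.44),
}

def geoparse(text: str):
    hits = []
    for token in re.findall(r"[A-Z][a-z]+", text):   # naive candidate spans
        coords = GAZETTEER.get(token.lower())
        if coords is not None:
            hits.append((token, coords))
    return hits

print(geoparse("Protests were reported near Springfield and later in Riverton."))
```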
Analyzing Human-Landscape Interactions: Tools That Integrate
NASA Astrophysics Data System (ADS)
Zvoleff, Alex; An, Li
2014-01-01
Humans have transformed much of Earth's land surface, giving rise to loss of biodiversity, climate change, and a host of other environmental issues that are affecting human and biophysical systems in unexpected ways. To confront these problems, environmental managers must consider human and landscape systems in integrated ways. This means making use of data obtained from a broad range of methods (e.g., sensors, surveys), while taking into account new findings from the social and biophysical science literatures. New integrative methods (including data fusion, simulation modeling, and participatory approaches) have emerged in recent years to address these challenges, and to allow analysts to provide information that links qualitative and quantitative elements for policymakers. This paper brings attention to these emergent tools while providing an overview of the tools currently in use for analysis of human-landscape interactions. Analysts are now faced with a staggering array of approaches in the human-landscape literature—in an attempt to bring increased clarity to the field, we identify the relative strengths of each tool, and provide guidance to analysts on the areas to which each tool is best applied. We discuss four broad categories of tools: statistical methods (including survival analysis, multi-level modeling, and Bayesian approaches), GIS and spatial analysis methods, simulation approaches (including cellular automata, agent-based modeling, and participatory modeling), and mixed-method techniques (such as alternative futures modeling and integrated assessment). For each tool, we offer an example from the literature of its application in human-landscape research. Among these tools, participatory approaches are gaining prominence for analysts to make the broadest possible array of information available to researchers, environmental managers, and policymakers. Further development of new approaches of data fusion and integration across sites or disciplines pose an important challenge for future work in integrating human and landscape components.
Gandjour, Afschin; Müller, Dirk
2014-10-01
One of the major ethical concerns regarding cost-effectiveness analysis in health care has been the inclusion of life-extension costs ("it is cheaper to let people die"). For this reason, many analysts have opted to rule out life-extension costs from the analysis. However, surprisingly little has been written in the health economics literature regarding this ethical concern and the resulting practice. The purpose of this work was to present a framework and potential solution for ethical objections against life-extension costs. This work found three levels of ethical concern: (i) with respect to all life-extension costs (disease-related and -unrelated); (ii) with respect to disease-unrelated costs only; and (iii) regarding disease-unrelated costs plus disease-related costs not influenced by the intervention. Excluding all life-extension costs for ethical reasons would require-for reasons of consistency-a simultaneous exclusion of savings from reducing morbidity. At the other extreme, excluding only disease-unrelated life-extension costs for ethical reasons would require-again for reasons of consistency-the exclusion of health gains due to treatment of unrelated diseases. Therefore, addressing ethical concerns regarding the inclusion of life-extension costs necessitates fundamental changes in the calculation of cost effectiveness.
Mitrani, Judith L
2011-02-01
The author suggests a number of technical extensions/clinical applications of Frances Tustin's work with autistic children, which are applicable to the psychoanalysis of neurotic, borderline and psychotic adults. These are especially relevant to those individuals in whom early uncontained happenings (Bion) have been silently encapsulated through the use of secretive autosensual maneuvers related to autistic objects and shapes. Although such encapsulations may constitute obstacles to emotional and intellectual development, are consequential in both the relational and vocational spheres for many analysands and present unending challenges for their analysts, the author demonstrates ways in which it may be possible to detect and to modify these in a transference-centered analysis. A detailed process of differential diagnosis between autistic states and neurotic/narcissistic (object-related) states in adults is outlined, along with several clinical demonstrations of the handling of a variety of elemental terrors, including the 'dread of dissolution.' The idiosyncratic and perverse use of the analytic setting and of the analyst and issues of the analysand's motivations are considered and illustrated. A new model related to 'objects in the periphery' is introduced as an alternative to the more classical Kleinian models regarding certain responses and/or non-responses to transference interpretation. Issues a propos the countertransference are also taken up throughout. Copyright © 2011 Institute of Psychoanalysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vimmerstedt, Laura J.; Jadun, Paige; McMillan, Colin A.
This report provides projected cost and performance assumptions for electric technologies considered in the Electrification Futures Study, a detailed and comprehensive analysis of the effects of widespread electrification of end-use service demands in all major economic sectors - transportation, residential and commercial buildings, and industry - for the contiguous United States through 2050. Using extensive literature searches and expert assessment, the authors identify slow, moderate, and rapid technology advancement sensitivities on technology cost and performance, and they offer a comparative analysis of levelized cost metrics as a reference indicator of total costs. The identification and characterization of these end-use service demand technologies is fundamental to the Electrification Futures Study. This report, the larger Electrification Futures Study, and the associated data and methodologies may be useful to planners and analysts in evaluating the potential role of electrification in an uncertain future. The report could be broadly applicable for other analysts and researchers who wish to assess electrification and electric technologies.
NASA Astrophysics Data System (ADS)
Vijith, H.; Satheesh, R.
2007-09-01
Hydrogeochemistry of groundwater in upland sub-watersheds of the Meenachil river, parts of the Western Ghats, Kottayam, Kerala, India was used to assess the quality of groundwater and determine its suitability for drinking and agricultural purposes. The study area is dominated by rocks of Archaean age, with charnockite dominant over other rock types, and rubber plantations dominant over other types of vegetation. Though the study area receives heavy rainfall, it frequently faces water scarcity as well as water quality problems. Hence, a Geographical Information System (GIS) based assessment of the spatiotemporal behaviour of groundwater quality has been carried out in the region. Twenty-eight water samples were collected from different wells and analysed for major chemical constituents in both monsoon and post-monsoon seasons to determine the quality variation. Physical and chemical parameters of groundwater such as pH, dissolved oxygen (DO), total hardness (TH), chloride (Cl), nitrate (NO3) and phosphate (PO4) were determined. A surface map was prepared in ArcGIS 8.3 (spatial analyst module) to assess the quality in terms of spatial variation, and it showed that the regions of high and low water quality varied spatially during the study period. The influence of lithology on groundwater quality is negligible in this region because the majority of the area falls under a single lithology, i.e. charnockite; instead, it was found that the extensive use of fertilizers and pesticides in rubber, tea and other agricultural practices influenced the groundwater quality of the region. According to the overall assessment of the basin, all the parameters analysed are below the desirable limits of the WHO and Indian standards for drinking water. Hence, considering the pH, the groundwater in the study area is not suitable for drinking but can be used for irrigation, industrial and domestic purposes. The spatial analysis of groundwater quality patterns of the study area shows seasonal fluctuations, and these spatial patterns of physical and chemical constituents are useful in deciding water use strategies for various purposes.
ERIC Educational Resources Information Center
Johanson, Megan; Papafragou, Anna
2014-01-01
Children's overextensions of spatial language are often taken to reveal spatial biases. However, it is unclear whether extension patterns should be attributed to children's overly general spatial concepts or to a narrower notion of conceptual similarity allowing metaphor-like extensions. We describe a previously unnoticed extension of…
Multispectral scanner system parameter study and analysis software system description, volume 2
NASA Technical Reports Server (NTRS)
Landgrebe, D. A. (Principal Investigator); Mobasseri, B. G.; Wiersma, D. J.; Wiswell, E. R.; Mcgillem, C. D.; Anuta, P. E.
1978-01-01
The author has identified the following significant results. The integration of the available methods provided the analyst with the unified scanner analysis package (USAP), the flexibility and versatility of which were superior to many previous integrated techniques. The USAP consisted of three main subsystems: (1) a spatial path, (2) a spectral path, and (3) a set of analytic classification accuracy estimators which evaluated the system performance. The spatial path consisted of satellite and/or aircraft data, a data correlation analyzer, the scanner IFOV, and a random noise model. The output of the spatial path was fed into the analytic classification and accuracy predictor. The spectral path consisted of laboratory and/or field spectral data, EXOSYS data retrieval, optimum spectral function calculation, data transformation, and statistics calculation. The output of the spectral path was fed into the stratified posterior performance estimator.
DOT National Transportation Integrated Search
2009-01-01
This booklet provides an overview of SafetyAnalyst. SafetyAnalyst is a set of software tools under development to help State and local highway agencies advance their programming of site-specific safety improvements. SafetyAnalyst will incorporate sta...
A crisis in the analyst's life: self-containment, symbolization, and the holding space.
Michelle, Flax
2011-04-01
Most analysts will experience some degree of crisis in the course of their working life. This paper explores the complex interplay between the analyst's affect during a crisis in her life and the affective dynamics of the patient. The central question is "who or what holds the analyst"--especially in times of crisis. Symbolization of affect, facilitated by the analyst's self-created holding environment, is seen as a vital process in order for containment to take place. In the clinical case presented, the analyst's dog was an integral part of the analyst's self-righting through this difficult period; the dog functioned as an "analytic object" within the analysis.
Lagrangian continuum dynamics in ALEGRA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, Michael K. W.; Love, Edward
Alegra is an ALE (Arbitrary Lagrangian-Eulerian) multi-material finite element code that emphasizes large deformations and strong shock physics. The Lagrangian continuum dynamics package in Alegra uses a Galerkin finite element spatial discretization and an explicit central-difference stepping method in time. The goal of this report is to describe in detail the characteristics of this algorithm, including the conservation and stability properties. The details provided should help both researchers and analysts understand the underlying theory and numerical implementation of the Alegra continuum hydrodynamics algorithm.
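The explicit central-difference time stepping named above can be illustrated for a single degree of freedom (a linear spring); this is only the update structure under assumed mass, stiffness, and step size, not the ALEGRA implementation.

```python
# Explicit central-difference (velocity-Verlet form) time stepping for one
# degree of freedom, a linear spring, to illustrate the scheme named above.
# Mass, stiffness, and step size are assumptions, not ALEGRA inputs.
m, k = 1.0, 100.0            # mass and stiffness
dt = 0.01                    # below the stability limit 2 / sqrt(k / m) = 0.2
steps = 1000

u, v = 1.0, 0.0              # initial displacement and velocity
a = -k * u / m
for _ in range(steps):
    v_half = v + 0.5 * dt * a        # half-step velocity
    u = u + dt * v_half              # full-step displacement
    a = -k * u / m                   # new acceleration from the internal force
    v = v_half + 0.5 * dt * a        # complete the velocity update

print(u, v)   # stays bounded (oscillatory) for a stable step size
```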
Effects of Motivation: Rewarding Hackers for Undetected Attacks Cause Analysts to Perform Poorly.
Maqbool, Zahid; Makhijani, Nidhi; Pammi, V S Chandrasekhar; Dutt, Varun
2017-05-01
The aim of this study was to determine how monetary motivations influence the decision making of humans performing as security analysts and hackers in a cybersecurity game. Cyberattacks are increasing at an alarming rate. As cyberattacks often cause damage to existing cyber infrastructures, it is important to understand how monetary rewards may influence the decision making of hackers and analysts in the cyber world. Currently, only limited attention has been given to this area. In an experiment, participants were randomly assigned to three between-subjects conditions (n = 26 for each condition): equal payoff, where the magnitude of monetary rewards for hackers and defenders was the same; rewarding hacker, where the magnitude of the monetary reward for the hacker's successful attack was 10 times the reward for the analyst's successful defense; and rewarding analyst, where the magnitude of the monetary reward for the analyst's successful defense was 10 times the reward for the hacker's successful attack. In all conditions, half of the participants were human hackers playing against Nash analysts and half were human analysts playing against Nash hackers. Results revealed that monetary rewards for human hackers and analysts caused a decrease in attack and defend actions compared with the baseline. Furthermore, rewarding human hackers for undetected attacks made analysts deviate significantly from their optimal behavior. If hackers are rewarded for their undetected attack actions, analysts deviate from optimal defend proportions. Thus, analysts need to be trained not to become overenthusiastic in defending networks. Our results apply to networks where the influence of monetary rewards may cause information theft and system damage.
NASA Astrophysics Data System (ADS)
Persson, A.; Connolly, J.
2016-12-01
Peatlands or mires contain about one third of the global terrestrial carbon pool and are located on 3-6% of the global land area. In boreal and sub-arctic permafrost peatlands the soil organic carbon (SOC) pools are stable and decomposition is suspended only as long as the soil is frozen. Climate warming is projected to be greater in the high latitudes; observed mean annual air temperatures in northern Sweden have increased by 2-3°C since the 1950s. Thawing permafrost leads to new hydrological regimes, potentially leading to increased production of methane. In this study, two sets of data were analysed: (i) a stereo-pair of black and white aerial photographs acquired in August 1943 by the Swedish Airforce, with a spatial resolution of 50 cm, and (ii) a geo-rectified Worldview2 (WV2) multispectral image acquired on the 24th of July, 2013. The aerial photographs were digitized using a very high resolution camera, georeferenced and incorporated into a geodatabase. The analysis of image areas was performed by heads-up visual interpretation both on a computer monitor and through stereoscopes. The aim was to identify wet and dry areas in the palsa peatland. Feature Analyst (FA) object-oriented image analysis (OBIA) was used with the WV2 dataset to extract features that are related to the hydrological state of the mire. Feature Analyst is an extension to ArcGIS. The method uses a black box algorithm that can be adjusted with several parameters to aid classification and feature extraction in an image. Previous studies that analysed aerial photographs from 1970 and 2000 showed that there was an increase in the amount of wet areas on the Swedish palsa bog mire Stordalen. In this study we determine the change in wet areas over a seventy-year period. The central part of the palsa mire has been extensively studied, as it has been presumed that it has collapsed due to warmer temperatures in recent decades. However, our analysis shows that much of the internal hydrological patterns on this part of the palsa bog seem to be temporally stable, at least since 1943. Macro changes not identified in previous studies are observed here, where it can be seen that the extent of the palsa has retreated, in areas contiguous to streamflow, possibly in response to contact with relatively warmer streamflow.
Ice tracking techniques, implementation, performance, and applications
NASA Technical Reports Server (NTRS)
Rothrock, D. A.; Carsey, F. D.; Curlander, J. C.; Holt, B.; Kwok, R.; Weeks, W. F.
1992-01-01
Present techniques of ice tracking make use both of cross-correlation and of edge tracking, the former being more successful in heavy pack ice, the latter being critical for the broken ice of the pack margins. Algorithms must assume some constraints on the spatial variations of displacements to eliminate fliers, but must avoid introducing any errors into the spatial statistics of the measured displacement field. We draw our illustrations from the implementation of an automated tracking system for kinematic analyses of ERS-1 and JERS-1 SAR imagery at the University of Alaska - the Alaska SAR Facility's Geophysical Processor System. Analyses of the ice kinematic data that might have some general interest to analysts of cloud-derived wind fields are the spatial structure of the fields, and the evaluation and variability of average deformation and its invariants: divergence, vorticity and shear. Many problems in sea ice dynamics and mechanics can be addressed with the kinematic data from SAR.
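The cross-correlation tracking described above amounts to sliding a template from one image over a search window in the next and keeping the best normalized correlation; the arrays in this sketch are synthetic stand-ins for SAR chips.

```python
# Normalized cross-correlation matching of a small "ice" patch between two
# images. The arrays are synthetic stand-ins for SAR image chips.
import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(3)
image1 = rng.normal(size=(60, 60))
patch = image1[20:30, 20:30]                          # template from image 1
image2 = np.roll(image1, shift=(4, 6), axis=(0, 1))   # "drifted" ice field

best, best_rc = -2.0, None
for r in range(50):
    for c in range(50):
        score = ncc(patch, image2[r:r + 10, c:c + 10])
        if score > best:
            best, best_rc = score, (r, c)

print(best_rc, "displacement:", (best_rc[0] - 20, best_rc[1] - 20))  # ~(4, 6)
```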
Visual mining geo-related data using pixel bar charts
NASA Astrophysics Data System (ADS)
Hao, Ming C.; Keim, Daniel A.; Dayal, Umeshwar; Wright, Peter; Schneidewind, Joern
2005-03-01
A common approach to analyzing geo-related data is to use bar charts or x-y plots. They are intuitive and easy to use, but important information often gets lost. In this paper, we introduce a new interactive visualization technique called Geo Pixel Bar Charts, which combines the advantages of Pixel Bar Charts and interactive maps. This technique allows analysts to visualize large amounts of spatial data without aggregation and, at the same time, shows the geographical regions corresponding to the spatial data attribute. In this paper, we apply Geo Pixel Bar Charts to visually mine sales transactions and Internet usage from different locations. Our experimental results show the effectiveness of this technique at revealing data distributions and exceptions on the map.
Dynamics of analyst forecasts and emergence of complexity: Role of information disparity
Ahn, Kwangwon
2017-01-01
We report complex phenomena arising among financial analysts, who gather information and generate investment advice, and elucidate them with the help of a theoretical model. Understanding how analysts form their forecasts is important for better understanding the financial market. Carrying out big-data analysis of the analyst forecast data from I/B/E/S for nearly thirty years, we find skew distributions as evidence for the emergence of complexity, and show how information asymmetry or disparity affects how financial analysts form their forecasts. Here regulations, information dissemination throughout a fiscal year, and interactions among financial analysts are regarded as proxies for a lower level of information disparity. It is found that financial analysts with better access to information display contrasting behaviors: a few analysts become bolder and issue forecasts independent of other forecasts, while the majority of analysts issue more accurate forecasts and flock to each other. The main body of our sample of optimistic forecasts fits a log-normal distribution, with the tail displaying a power law. Based on the Yule process, we propose a model for the dynamics of issuing forecasts, incorporating interactions between analysts. The model explains the empirical data on analyst forecasts well and provides an appealing instance of understanding social phenomena from the perspective of complex systems. PMID:28498831
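A toy preferential-attachment (Yule-type) process shows how heavy-tailed size distributions of the kind reported above can emerge; the parameters are illustrative and this is not the calibrated model of the paper.

```python
# Toy Yule-type (preferential attachment) process: at each step a new unit
# either starts a new group or joins an existing group with probability
# proportional to its size. Group sizes develop a heavy right tail.
# The parameters are illustrative, not those calibrated in the paper.
import numpy as np

rng = np.random.default_rng(4)
p_new = 0.1                 # chance of starting a new group
groups = [1]

for _ in range(20000):
    if rng.random() < p_new:
        groups.append(1)
    else:
        sizes = np.array(groups, dtype=float)
        i = rng.choice(len(groups), p=sizes / sizes.sum())
        groups[i] += 1

sizes = np.sort(groups)[::-1]
print(sizes[:10])           # a few very large groups dominate (power-law-like tail)
```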
Interpretation and the psychic future.
Cooper, S H
1997-08-01
The author applies the analyst's multi-faceted awareness of his or her view of the patient's psychic future to analytic process. Loewald's (1960) interest in the way in which the analyst anticipates the future of the patient was linked to his epistemological assumptions about the analyst's superior objectivity and maturity relative to the patient. The elucidation of the authority of the analyst (e.g. Hoffman, 1991, 1994) allows us to begin to disentangle the analyst's view of the patient's psychic future from some of these epistemological assumptions. Clinical illustrations attempt to show how the analyst's awareness of this aspect of the interpretive process is often deconstructed over time and can help to understand aspects of resistance from both analyst and patient. This perspective may provide one more avenue for understanding our various modes of influence through interpretive process.
NASA Astrophysics Data System (ADS)
Czajkowski, M.; Shilliday, A.; LoFaso, N.; Dipon, A.; Van Brackle, D.
2016-09-01
In this paper, we describe and depict the Defense Advanced Research Projects Agency (DARPA)'s OrbitOutlook Data Archive (OODA) architecture. OODA is the infrastructure that DARPA's OrbitOutlook program has developed to integrate diverse data from various academic, commercial, government, and amateur space situational awareness (SSA) telescopes. At the heart of the OODA system is its world model - a distributed data store built to quickly query big data quantities of information spread out across multiple processing nodes and data centers. The world model applies a multi-index approach where each index is a distinct view on the data. This allows for analysts and analytics (algorithms) to access information through queries with a variety of terms that may be of interest to them. Our indices include: a structured global-graph view of knowledge, a keyword search of data content, an object-characteristic range search, and a geospatial-temporal orientation of spatially located data. In addition, the world model applies a federated approach by connecting to existing databases and integrating them into one single interface as a "one-stop shopping place" to access SSA information. In addition to the world model, OODA provides a processing platform for various analysts to explore and analytics to execute upon this data. Analytic algorithms can use OODA to take raw data and build information from it. They can store these products back into the world model, allowing analysts to gain situational awareness with this information. Analysts in turn would help decision makers use this knowledge to address a wide range of SSA problems. OODA is designed to make it easy for software developers who build graphical user interfaces (GUIs) and algorithms to quickly get started with working with this data. This is done through a multi-language software development kit that includes multiple application program interfaces (APIs) and a data model with SSA concepts and terms such as: space observation, observable, measurable, metadata, track, space object, catalog, expectation, and maneuver.
MetaboAnalystR: an R package for flexible and reproducible analysis of metabolomics data.
Chong, Jasmine; Xia, Jianguo
2018-06-28
The MetaboAnalyst web application has been widely used for metabolomics data analysis and interpretation. Despite its user-friendliness, the web interface has presented its inherent limitations (especially for advanced users) with regard to flexibility in creating customized workflow, support for reproducible analysis, and capacity in dealing with large data. To address these limitations, we have developed a companion R package (MetaboAnalystR) based on the R code base of the web server. The package has been thoroughly tested to ensure that the same R commands will produce identical results from both interfaces. MetaboAnalystR complements the MetaboAnalyst web server to facilitate transparent, flexible and reproducible analysis of metabolomics data. MetaboAnalystR is freely available from https://github.com/xia-lab/MetaboAnalystR. Supplementary data are available at Bioinformatics online.
Search for Spatially Extended Fermi-LAT Sources Using Two Years of Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lande, Joshua; Ackermann, Markus; Allafort, Alice
2012-07-13
Spatial extension is an important characteristic for correctly associating gamma-ray-emitting sources with their counterparts at other wavelengths and for obtaining an unbiased model of their spectra. We present a new method for quantifying the spatial extension of sources detected by the Large Area Telescope (LAT), the primary science instrument on the Fermi Gamma-ray Space Telescope (Fermi). We perform a series of Monte Carlo simulations to validate this tool and calculate the LAT threshold for detecting the spatial extension of sources. We then test all sources in the second Fermi-LAT catalog (2FGL) for extension. We report the detection of seven new spatially extended sources.
Training analysis and reanalysis in the development of the psychoanalyst.
Meyer, Jon K
2007-01-01
A psychoanalyst faces the extraordinary demand of becoming instrumental in the psychoanalytic process. In the candidate's attempt to rise to that expectation, the first step is the training analysis. As the center-piece of psychoanalytic education, it is no ordinary analysis and bears special burdens intrinsic to its multiple functions and institutionalization. Recognizing the difficulties of both analytic education and analytic practice, Freud suggested that the analyst be periodically reanalyzed; for many, reanalysis is integral to their analytic development. Indeed, an analyst is actually never "made" but is always "in the making," developing and maturing in life and in practice. Reanalysis serves to focus elements of transference and resistance, rework defenses, facilitate more extensive regression in the service of the ego, deepen emotional integration, rework those elements of psychoanalysis itself that have been incorporated into defensive structure, and further the maturation of the analyzing instrument. If analysis is our most powerful mode of initial education, reanalysis is the most powerful form of continuing education. That remarkably little attention has been paid to reanalysis is testimony to the infantile fantasies that remain invested in our personal analyses.
Using the ENTLN lightning catalog to identify thunder signals in the USArray Transportable Array
NASA Astrophysics Data System (ADS)
Tytell, J. E.; Reyes, J. C.; Vernon, F.; Sloop, C.; Heckman, S.
2013-12-01
Severe weather events can pose a challenge for seismic analysts, who regularly see non-seismic signals recorded at the stations. Sometimes the noise from thunder can be confused with signals from seismic events such as quarry blasts or earthquakes, depending on where and when the noise is observed. Automatic analysis of the data is also severely affected by large-amplitude arrivals that could safely be ignored. A comprehensive lightning catalog for the continental US, in conjunction with a travel time model for thunder arrivals, can help analysts identify some of these unknown sources. Researchers from Earthscope's USArray Transportable Array (TA) have partnered with the Earth Networks Total Lightning Network (ENTLN) in an effort to create such a catalog. Predicted thunder arrivals from some powerful meteorological systems affecting the main TA footprint will undergo extensive evaluation. We will examine the veracity of the predicted arrivals at different distances and azimuths and the time accuracy of the model. A combination of barometric pressure and seismic signals will be used to verify these arrivals.
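The travel-time prediction described above can be sketched by propagating the acoustic signal from a catalogued stroke location to a station at a constant speed of sound; the coordinates, origin time, and sound speed below are illustrative assumptions.

```python
# Predict a thunder (acoustic) arrival time at a station from a catalogued
# lightning stroke, assuming straight-line propagation at a constant speed
# of sound. Coordinates, origin time, and sound speed are illustrative.
from datetime import datetime, timedelta
from math import radians, sin, cos, asin, sqrt

SPEED_OF_SOUND_M_S = 343.0   # near-surface value; varies with temperature

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters."""
    r = 6371000.0
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2)**2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2)**2
    return 2 * r * asin(sqrt(a))

stroke_time = datetime(2013, 6, 15, 20, 31, 5)
stroke_lat, stroke_lon = 35.10, -97.30        # hypothetical ENTLN stroke
station_lat, station_lon = 35.25, -97.05      # hypothetical TA station

delay_s = haversine_m(stroke_lat, stroke_lon, station_lat, station_lon) / SPEED_OF_SOUND_M_S
print("predicted thunder arrival:", stroke_time + timedelta(seconds=delay_s))
```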
A Survey of Functional Behavior Assessment Methods Used by Behavior Analysts in Practice
ERIC Educational Resources Information Center
Oliver, Anthony C.; Pratt, Leigh A.; Normand, Matthew P.
2015-01-01
To gather information about the functional behavior assessment (FBA) methods behavior analysts use in practice, we sent a web-based survey to 12,431 behavior analysts certified by the Behavior Analyst Certification Board. Ultimately, 724 surveys were returned, with the results suggesting that most respondents regularly use FBA methods, especially…
"This strange disease": adolescent transference and the analyst's sexual orientation.
Burton, John K; Gilmore, Karen
2010-08-01
The treatment of adolescents by gay analysts is uncharted territory regarding the impact of the analyst's sexuality on the analytic process. Since a core challenge of adolescence involves the integration of the adult sexual body, gender role, and reproductive capacities into evolving identity, and since adolescents seek objects in their environment to facilitate both identity formation and the establishment of autonomy from primary objects, the analyst's sexual orientation is arguably a potent influence on the outcome of adolescent development. However, because sexual orientation is a less visible characteristic of the analyst than gender, race, or age, for example, the line between reality and fantasy is less clearly demarcated. This brings up special considerations regarding discovery and disclosure in the treatment. To explore these issues, the case of a late adolescent girl in treatment with a gay male analyst is presented. In this treatment, the question of the analyst's sexual orientation, and the demand by the patient for the analyst's self-disclosure, became a transference nucleus around which the patient's individual dynamics and adolescent dilemmas could be explored and clarified.
Public mental hospital work: pros and cons for psychiatrists.
Miller, R D
1984-09-01
The extensive literature concerning public mental hospitals has largely been written from the perspective of administrators and systems analysts; most of the reports emphasize the frustrations and problems of working in public mental hospitals and the continued exodus of psychiatrists from these facilities. The author addresses the pros and cons of such a career choice from the viewpoint of one who has been an "Indian" rather than a "chief" for a decade. He suggests that the current financial situation in both private practice and academia makes work in public mental hospitals increasingly attractive.
NASA and USGS ASTER Expedited Satellite Data Services for Disaster Situations
NASA Astrophysics Data System (ADS)
Duda, K. A.
2012-12-01
Significant international disasters related to storms, floods, volcanoes, wildfires and numerous other themes reoccur annually, often inflicting widespread human suffering and fatalities with substantial economic consequences. During and immediately after such events it can be difficult to access the affected areas and become aware of the overall impacts, but insight on the spatial extent and effects can be gleaned from above through satellite images. The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) on the Terra spacecraft has offered such views for over a decade. On short notice, ASTER continues to deliver analysts multispectral imagery at 15 m spatial resolution in near real-time to assist participating responders, emergency managers, and government officials in planning for such situations and in developing appropriate responses after they occur. The joint U.S./Japan ASTER Science Team has developed policies and procedures to ensure such ongoing support is accessible when needed. Processing and distribution of data products occurs at the NASA Land Processes Distributed Active Archive Center (LP DAAC) located at the USGS Earth Resources Observation and Science Center in South Dakota. In addition to current imagery, the long-term ASTER mission has generated an extensive collection of nearly 2.5 million global 3,600 km2 scenes since the launch of Terra in late 1999. These are archived and distributed by LP DAAC and affiliates at Japan Space Systems in Tokyo. Advanced processing is performed to create higher level products of use to researchers. These include a global digital elevation model. Such pre-event imagery provides a comparative basis for use in detecting changes associated with disasters and to monitor land use trends to portray areas of increased risk. ASTER imagery acquired via the expedited collection and distribution process illustrates the utility and relevancy of such data in crisis situations.
Cost approach of health care entity intangible asset valuation.
Reilly, Robert F
2012-01-01
In the valuation synthesis and conclusion process, the analyst should consider the following question: Does the selected valuation approach(es) and method(s) accomplish the analyst's assignment? Also, does the selected valuation approach and method actually quantify the desired objective of the intangible asset analysis? The analyst should also consider if the selected valuation approach and method analyzes the appropriate bundle of legal rights. The analyst should consider if there were sufficient empirical data available to perform the selected valuation approach and method. The valuation synthesis should consider if there were sufficient data available to make the analyst comfortable with the value conclusion. The valuation analyst should consider if the selected approach and method will be understandable to the intended audience. In the valuation synthesis and conclusion, the analyst should also consider which approaches and methods deserve the greatest consideration with respect to the intangible asset's RUL. The intangible asset RUL is a consideration of each valuation approach. In the income approach, the RUL may affect the projection period for the intangible asset income subject to either yield capitalization or direct capitalization. In the cost approach, the RUL may affect the total amount of obsolescence, if any, from the estimate cost measure (that is, the intangible reproduction cost new or replacement cost new). In the market approach, the RUL may effect the selection, rejection, and/or adjustment of the comparable or guideline intangible asset sale and license transactional data. The experienced valuation analyst will use professional judgment to weight the various value indications to conclude a final intangible asset value, based on: The analyst's confidence in the quantity and quality of available data; The analyst's level of due diligence performed on that data; The relevance of the valuation method to the intangible asset life cycle stage and degree of marketability; and The degree of variation in the range of value indications. Valuation analysts value health care intangible assets for a number of reasons. In addition to regulatory compliance reasons, these reasons include various transaction, taxation, financing, litigation, accounting, bankruptcy, and planning purposes. The valuation analyst should consider all generally accepted intangible asset valuation approaches, methods, and procedures. Many valuation analysts are more familiar with market approach and income approach valuation methods. However, there are numerous instances when cost approach valuation methods are also applicable to the health care intangible asset valuation. This discussion summarized the analyst's procedures and considerations with regard to the cost approach. The cost approach is often applicable to the valuation of intangible assets in the health care industry. However, the cost approach is only applicable if the valuation analyst (1) appropriately considers all of the cost components and (2) appropriately identifies and quantifies all obsolescence allowances. Regardless of the health care intangible asset or the reason for the valuation, the analyst should be familiar with all generally accepted valuation approaches and methods. And, the valuation analyst should have a clear, convincing, and cogent rationale for (1) accepting each approach and method applied and (2) rejecting each approach and method not applied. 
That way, the valuation analyst will best achieve the purpose and objective of the health care intangible asset valuation.
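For illustration only (not from the article), a minimal sketch of the weighted synthesis step described above, with hypothetical value indications and judgment-based weights:

```python
# Illustrative only: weighted synthesis of value indications from the three
# approaches. All figures and weights are assumptions for demonstration.
indications = {"cost": 4_100_000, "market": 4_600_000, "income": 4_350_000}
weights = {"cost": 0.5, "market": 0.2, "income": 0.3}   # judgment-based weights

concluded_value = sum(indications[k] * weights[k] for k in indications)
print(f"Concluded intangible asset value: ${concluded_value:,.0f}")
```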
ERIC Educational Resources Information Center
Arellano, Eduardo C.; Martinez, Mario C.
2009-01-01
This study compares the extent to which higher education policy analysts and master's and doctoral faculty of higher education and public affairs programs match on a set of competencies thought to be important to higher education policy analysis. Analysts matched master's faculty in three competencies while analysts and doctoral faculty matched in…
The Variability of Crater Identification Among Expert and Community Crater Analysts
NASA Astrophysics Data System (ADS)
Robbins, S. J.; Antonenko, I.; Kirchoff, M. R.; Chapman, C. R.; Fassett, C. I.; Herrick, R. R.; Singer, K.; Zanetti, M.; Lehan, C.; Huang, D.; Gay, P.
2014-04-01
Statistical studies of impact crater populations have been used to model ages of planetary surfaces for several decades [1]. This assumes that crater counts are approximately invariant and a "correct" population will be identified if the analyst is skilled and diligent. However, the reality is that crater identification is somewhat subjective, so variability between analysts, or even a single analyst's variation from day-to-day, is expected [e.g., 2, 3]. This study was undertaken to quantify that variability within an expert analyst population and between experts and minimally trained volunteers.
The relation between space and math: developmental and educational implications.
Mix, Kelly S; Cheng, Yi-Ling
2012-01-01
There is a well-known relation between spatial ability and mathematics dating back to the work of early twentieth century factor analysts. This connection is a ripe opportunity for educators, who might use spatial training to improve math learning. However, a closer look at the literature reveals gaps that impede direct application. The primary problem is that although this relation is well established in older children and adults, its emergence in early development and subsequent developmental interactions are not well documented. Moreover, there is a need for more mechanistic explanations that might be leveraged to improve math education. In this chapter, we attempt to address these issues by reviewing the existing literature to identify instances where answers are available and others where further research is needed.
Metric Learning for Hyperspectral Image Segmentation
NASA Technical Reports Server (NTRS)
Bue, Brian D.; Thompson, David R.; Gilmore, Martha S.; Castano, Rebecca
2011-01-01
We present a metric learning approach to improve the performance of unsupervised hyperspectral image segmentation. Unsupervised spatial segmentation can assist both user visualization and automatic recognition of surface features. Analysts can use spatially-continuous segments to decrease noise levels and/or localize feature boundaries. However, existing segmentation methods use task-agnostic measures of similarity. Here we learn task-specific similarity measures from training data, improving segment fidelity to classes of interest. Multiclass linear discriminant analysis produces a linear transform that optimally separates a labeled set of training classes. The transform defines a distance metric that generalizes to new scenes, enabling graph-based segmentation that emphasizes key spectral features. We describe tests based on data from the Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) in which learned metrics improve segment homogeneity with respect to mineralogical classes.
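As a hedged illustration of the approach described above (not the authors' code), the following sketch learns a multiclass LDA transform from labeled spectra and uses distances in the transformed space as the task-specific similarity measure; the array shapes and class count are assumptions:

```python
# Minimal sketch: learn an LDA-based distance metric from labeled spectra,
# then apply it to pixels from a new scene.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from scipy.spatial.distance import cdist

# Labeled training spectra: (n_samples, n_bands) with integer class labels
X_train = np.random.rand(300, 60)
y_train = np.arange(300) % 4                      # four training classes

# Fit multiclass LDA; the learned projection defines a linear transform
lda = LinearDiscriminantAnalysis(n_components=3).fit(X_train, y_train)

# Project new-scene pixels into the discriminant space
X_new = np.random.rand(1000, 60)
Z_new = lda.transform(X_new)

# Pairwise Euclidean distances in the transformed space act as the learned,
# task-specific similarity measure for graph-based segmentation
edge_weights = cdist(Z_new[:100], Z_new[:100])
print(edge_weights.shape)
```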
Analyst-to-Analyst Variability in Simulation-Based Prediction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glickman, Matthew R.; Romero, Vicente J.
This report describes findings from the culminating experiment of the LDRD project entitled, "Analyst-to-Analyst Variability in Simulation-Based Prediction". For this experiment, volunteer participants solving a given test problem in engineering and statistics were interviewed at different points in their solution process. These interviews are used to trace differing solutions to differing solution processes, and differing processes to differences in reasoning, assumptions, and judgments. The issue that the experiment was designed to illuminate -- our paucity of understanding of the ways in which humans themselves have an impact on predictions derived from complex computational simulations -- is a challenging and open one. Although solution of the test problem by analyst participants in this experiment has taken much more time than originally anticipated, and is continuing past the end of this LDRD, this project has provided a rare opportunity to explore analyst-to-analyst variability in significant depth, from which we derive evidence-based insights to guide further explorations in this important area.
The patient who believes and the analyst who does not (1).
Lijtmaer, Ruth M
2009-01-01
A patient's religious beliefs and practices challenge the clinical experience and self-knowledge of the analyst owing to a great complexity of factors, and often take the form of the analyst's resistances and countertransference reactions to spiritual and religious issues. The analyst's feelings about the patient's encounters with religion and other forms of healing experiences may result in impasses and communication breakdown for a variety of reasons. These reasons include the analyst's own unresolved issues around her role as a psychoanalyst-which incorporates in some way psychoanalysis's views of religious belief-and these old conflicts may be irritated by the religious themes expressed by the patient. Vignettes from the treatments of two patients provide examples of the analyst's countertransference conflicts, particularly envy in the case of a therapist who is an atheist.
Using the living laboratory framework as a basis for understanding next-generation analyst work
NASA Astrophysics Data System (ADS)
McNeese, Michael D.; Mancuso, Vincent; McNeese, Nathan; Endsley, Tristan; Forster, Pete
2013-05-01
The preparation of next-generation analyst work requires alternative levels of understanding and new methodological departures from the way current work transpires. Current work practices typically do not provide a comprehensive approach that emphasizes the role of and interplay between (a) cognition, (b) emergent activities in a shared situated context, and (c) collaborative teamwork. In turn, effective and efficient problem solving fails to take place, and practice is often composed of piecemeal, techno-centric tools that isolate analysts by providing rigid, limited levels of situation awareness. This, coupled with the fact that many analyst activities are classified, produces a challenging situation for researching such phenomena and for designing and evaluating systems to support analyst cognition and teamwork. Through our work with cyber, image, and intelligence analysts we have realized that more is required of researchers studying human-centered designs to provide for analysts' needs in a timely fashion. This paper identifies and describes how the Living Laboratory Framework can be utilized as a means to develop a comprehensive, human-centric, and problem-focused approach to next-generation analyst work, design, and training. We explain how the framework is utilized for specific cases in various applied settings (e.g., crisis management analysis, image analysis, and cyber analysis) to demonstrate its value and power in addressing an area of utmost importance to our national security. Attributes of analyst work settings are delineated to suggest potential design affordances that could help improve cognitive activities and awareness. Finally, the paper puts forth a research agenda for the use of the framework in future work that will move the analyst profession forward in a viable manner to address the concerns identified.
Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Waltz, Ed
2016-05-01
Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics on raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and they are examined here with a view toward matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.
Sripada, Bhaskar
2015-09-01
Freud stated that any line of investigation which recognizes transference and resistance, regardless of its results, was entitled to call itself psychoanalysis (Freud, 1914a, p. 16). Separately he wrote that psychoanalysis was the science of unconscious mental processes (Freud, 1925, p. 70). Combining these two ideas defines Essential Psychoanalysis: any line of treatment, theory, or science which recognizes the facts of the unconscious, transference, or resistance, and takes them as the starting point of its work, regardless of its results, is psychoanalysis. Freud thus formulated two conflicting definitions of psychoanalysis: Essential Psychoanalysis, applicable to all analysts regardless of their individuality, and Extensive Psychoanalysis, modeled on his own individuality. They differ in how psychoanalytic technique is viewed. For Essential Psychoanalysis, flexible recommendations constitute psychoanalytic technique, whereas for Extensive Psychoanalysis, rules constitute a key part of psychoanalytic technique.
Perspectives on the geographic stability and mobility of people in cities
Hanson, Susan
2005-01-01
A class of questions in the human environment sciences focuses on the relationship between individual or household behavior and local geographic context. Central to these questions is the nature of people's geographic mobility as well as the duration of their locational stability at varying spatial and temporal scales. The problem for researchers is that the processes of mobility/stability are temporally and spatially dynamic and therefore difficult to measure. Whereas time and space are continuous, analysts must select levels of aggregation for both length of time in place and spatial scale of place that fit with the problem in question. Previous work has emphasized mobility and suppressed stability as an analytic category. I focus here on stability and show how analyzing individuals' stability requires also analyzing their mobility. Through an empirical example centered on the relationship between entrepreneurship and place, I demonstrate how a spotlight on stability illuminates a resolution to the measurement problem by highlighting the interdependence between the time and space dimensions of stability/mobility. PMID:16230616
NASA Astrophysics Data System (ADS)
Ahmad, Sajid Rashid
With the understanding that far more research remains to be done on the development and use of innovative and functional geospatial techniques and procedures to investigate coastline changes, this thesis focussed on the integration of remote sensing, geographical information systems (GIS) and modelling techniques to provide meaningful insights on the spatial and temporal dynamics of coastline changes. One of the unique strengths of this research was the parameterization of the GIS with long-term empirical and remote sensing data. Annual empirical data from 1941--2007 were analyzed in the GIS and then modelled with statistical techniques. Data were also extracted from Landsat TM and ETM+ images. The band ratio method was used to extract the coastlines. Topographic maps were also used to extract digital map data. All data incorporated into ArcGIS 9.2 were analyzed with various modules, including Spatial Analyst, 3D Analyst, and Triangulated Irregular Networks. The Digital Shoreline Analysis System was used to analyze and predict rates of coastline change. GIS results showed the spatial locations along the coast that will either advance or retreat over time. The linear regression results highlighted temporal changes which are likely to occur along the coastline. Box-Jenkins modelling procedures were utilized to determine statistical models which best described the time series (1941--2007) of coastline change data. After several iterations and goodness-of-fit tests, second-order spatial cyclic autoregressive models, first-order autoregressive models and autoregressive moving average models were identified as being appropriate for describing the deterministic and random processes operating in Guyana's coastal system. The models highlighted not only cyclical patterns in advance and retreat of the coastline, but also the existence of short- and long-term memory processes. Long-term memory processes could be associated with mudshoal propagation and stabilization, while short-term memory processes were indicative of transitory hydrodynamic and other processes. An innovative framework for a spatio-temporal information-based system (STIBS) was developed. STIBS incorporated diverse datasets within a GIS, dynamic computer-based simulation models, and a spatial information query and graphical subsystem. Tests of the STIBS proved that it could be used to simulate and visualize temporal variability in shifting morphological states of the coastline.
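A hedged sketch of the Box-Jenkins step described above, fitting first-order autoregressive and ARMA candidates to an annual coastline-change series with statsmodels; the series here is synthetic and stands in for the 1941--2007 data:

```python
# Hedged sketch, not the thesis code: Box-Jenkins-style model fitting on an
# annual coastline-change series (values are synthetic placeholders).
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

years = pd.date_range("1941", periods=67, freq="YS")       # 1941-2007
change_m = pd.Series(np.random.randn(67).cumsum(), index=years)

# First-order autoregressive model, one of the candidate forms mentioned
ar1 = ARIMA(change_m, order=(1, 0, 0)).fit()

# An ARMA(1,1) alternative, compared via information criteria
arma11 = ARIMA(change_m, order=(1, 0, 1)).fit()
print("AIC AR(1):", round(ar1.aic, 2), " AIC ARMA(1,1):", round(arma11.aic, 2))
```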
NASA Astrophysics Data System (ADS)
Sadler, Laurel
2017-05-01
In today's battlefield environments, analysts are inundated with real-time data received from the tactical edge that must be evaluated and used for managing and modifying current missions as well as planning for future missions. This paper describes a framework that facilitates a Value of Information (VoI) based data analytics tool for information object (IO) analysis in a tactical and command and control (C2) environment, which reduces analyst workload by providing automated or analyst-assisted applications. It allows the analyst to adjust parameters for data matching of the IOs that will be received and provides agents for further filtering or fusing of the incoming data. It also allows the analyst to enhance, mark up, and/or attach comments to the incoming IOs, which can then be re-disseminated utilizing the VoI-based dissemination service. The analyst may also adjust the underlying parameters before re-dissemination of an IO, which will subsequently adjust the value of the IO based on this new or additional information, possibly increasing the value from the original. The framework is flexible and extendable, providing an easy-to-use, dynamically changing command and control decision aid that focuses and enhances the analyst workflow.
The analyst: his professional novel.
Ambrosiano, Laura
2005-12-01
The psychoanalyst needs to be in touch with a community of colleagues; he needs to feel part of a group with which he can share cognitive tension and therapeutic knowledge. Yet group ties are an aspect we analysts seldom discuss. The author defines the analyst's 'professional novel' as the emotional vicissitudes with the group that have marked the professional itinerary of every analyst: his relationship with institutions and with theories, and the emotional nuance of these relationships. The analyst's professional novel is the narrative elaboration of his professional autobiography. It is capable of transforming the individual's need to belong and the paths of identification and de-identification. Experience of the oedipal configuration allows the analyst to begin psychic work aimed at gaining spaces of separateness in his relationship with the group. This passage is marked by the work on the mourning that separation involves, but also the mourning implicit in the awareness of the representative limits of our theories. Right from the start of analysis, the patient observes the emotional nuance of the analyst's connection to his group and theories; the patient notices how much this connection is governed by rigid needs to belong, and how much freedom of thought and exploration it allows the analyst. The author uses clinical examples to illustrate these hypotheses.
NASA Astrophysics Data System (ADS)
Gao, Jing; Burt, James E.
2017-12-01
This study investigates the usefulness of a per-pixel bias-variance error decomposition (BVD) for understanding and improving spatially-explicit data-driven models of continuous variables in environmental remote sensing (ERS). BVD is a model evaluation method that originated in machine learning and has not previously been examined for ERS applications. Demonstrated with a showcase regression tree model mapping land imperviousness (0-100%) using Landsat images, our results showed that BVD can reveal sources of estimation errors, map how these sources vary across space, reveal the effects of various model characteristics on estimation accuracy, and enable in-depth comparison of different error metrics. Specifically, BVD bias maps can help analysts identify and delineate model spatial non-stationarity; BVD variance maps can indicate potential effects of ensemble methods (e.g. bagging), and inform efficient training sample allocation - training samples should capture the full complexity of the modeled process, and more samples should be allocated to regions with more complex underlying processes rather than regions covering larger areas. Through examining the relationships between model characteristics and their effects on estimation accuracy revealed by BVD for both absolute and squared errors (i.e. error is the absolute or the squared value of the difference between observation and estimate), we found that the two error metrics embody different diagnostic emphases, can lead to different conclusions about the same model, and may suggest different solutions for performance improvement. We emphasize BVD's strength in revealing the connection between model characteristics and estimation accuracy, as understanding this relationship empowers analysts to effectively steer performance through model adjustments.
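For readers unfamiliar with the decomposition, a minimal sketch (an assumption, not the paper's code) of a per-pixel bias-variance split for an ensemble of regression estimates:

```python
# Minimal sketch of a per-pixel bias-variance decomposition for squared error.
# Shapes and noise levels are illustrative assumptions.
import numpy as np

n_models, n_pixels = 25, 10_000
truth = np.random.uniform(0, 100, n_pixels)                 # reference values
estimates = truth + np.random.normal(0, 5, (n_models, n_pixels)) \
                  + np.random.normal(0, 2, n_pixels)        # shared bias term

mean_est = estimates.mean(axis=0)
bias_sq = (mean_est - truth) ** 2                # systematic error per pixel
variance = estimates.var(axis=0)                 # spread across models per pixel
expected_sq_err = ((estimates - truth) ** 2).mean(axis=0)

# For squared error with noise-free truth: E[(y_hat - y)^2] = bias^2 + variance
assert np.allclose(expected_sq_err, bias_sq + variance)
print("mean bias^2:", bias_sq.mean(), " mean variance:", variance.mean())
```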
Godsil, Geraldine
2018-02-01
This paper discusses the residues of a somatic countertransference that revealed its meaning several years after apparently successful analytic work had ended. Psychoanalytic and Jungian analytic ideas on primitive communication, dissociation and enactment are explored in the working through of a shared respiratory symptom between patient and analyst. Growth in the analyst was necessary so that the patient's communication at a somatic level could be understood. Bleger's concept that both the patient's and analyst's body are part of the setting was central in the working through. © 2018, The Society of Analytical Psychology.
Using MetaboAnalyst 3.0 for Comprehensive Metabolomics Data Analysis.
Xia, Jianguo; Wishart, David S
2016-09-07
MetaboAnalyst (http://www.metaboanalyst.ca) is a comprehensive Web application for metabolomic data analysis and interpretation. MetaboAnalyst handles most of the common metabolomic data types from most kinds of metabolomics platforms (MS and NMR) and most kinds of metabolomics experiments (targeted, untargeted, quantitative). In addition to providing a variety of data processing and normalization procedures, MetaboAnalyst also supports a number of data analysis and data visualization tasks using a range of univariate and multivariate methods such as PCA (principal component analysis), PLS-DA (partial least squares discriminant analysis), heatmap clustering and machine learning methods. MetaboAnalyst also offers a variety of tools for metabolomic data interpretation including MSEA (metabolite set enrichment analysis), MetPA (metabolite pathway analysis), and biomarker selection via ROC (receiver operating characteristic) curve analysis, as well as time series and power analysis. This unit provides an overview of the main functional modules and the general workflow of the latest version of MetaboAnalyst (MetaboAnalyst 3.0), followed by eight detailed protocols. © 2016 John Wiley & Sons, Inc.
An interactive program for computer-aided map design, display, and query: EMAPKGS2
Pouch, G.W.
1997-01-01
EMAPKGS2 is a user-friendly, PC-based electronic mapping tool for use in hydrogeologic exploration and appraisal. EMAPKGS2 allows the analyst to construct maps interactively from data stored in a relational database, perform point-oriented spatial queries such as locating all wells within a specified radius, perform geographic overlays, and export the data to other programs for further analysis. EMAPKGS2 runs under Microsoft Windows 3.1 and compatible operating systems. EMAPKGS2 is a public domain program available from the Kansas Geological Survey. EMAPKGS2 is the centerpiece of WHEAT, the Windows-based Hydrogeologic Exploration and Appraisal Toolkit, a suite of user-friendly Microsoft Windows programs for natural resource exploration and management. The principal goals in the development of WHEAT have been ease of use, hardware independence, low cost, and end-user extensibility. WHEAT's native data format is a Microsoft Access database. WHEAT stores a feature's geographic coordinates as attributes so they can be accessed easily by the user. The WHEAT programs are designed to be used in conjunction with other Microsoft Windows software to allow the natural resource scientist to perform work easily and effectively. WHEAT and EMAPKGS have been used at several of Kansas' Groundwater Management Districts and the Kansas Geological Survey on groundwater management operations, groundwater modeling projects, and geologic exploration projects. © 1997 Elsevier Science Ltd.
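A hedged illustration (not EMAPKGS2 code) of the kind of point-oriented spatial query described above, locating all wells within a specified radius of a site from coordinates stored as attributes:

```python
# Illustrative sketch of a radius query over well records whose easting and
# northing are stored as plain attributes. All records are hypothetical.
import math

wells = [                      # (id, easting_m, northing_m)
    ("W-101", 482_350.0, 4_205_110.0),
    ("W-102", 483_900.0, 4_206_480.0),
    ("W-103", 490_220.0, 4_199_730.0),
]

def wells_within_radius(records, x0, y0, radius_m):
    """Return ids of wells whose planar distance from (x0, y0) is <= radius_m."""
    return [wid for wid, x, y in records
            if math.hypot(x - x0, y - y0) <= radius_m]

print(wells_within_radius(wells, 483_000.0, 4_205_500.0, 2_000.0))
```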
The Pope's confessor: a metaphor relating to illness in the analyst.
Clark, R W
1995-01-01
This paper examines some of the internal and external eventualities in the situation of illness in the analyst. The current emphasis on the use of the self as part of the analyzing instrument makes impairments in the analyst's physical well-being potentially disabling to the analytic work. A recommendation is made for analysts, both individually and as a professional group, to always consider this aspect of a personal medical problem.
Desire and the female analyst.
Schaverien, J
1996-04-01
The literature on erotic transference and countertransference between female analyst and male patient is reviewed and discussed. It is known that female analysts are less likely than their male colleagues to act out sexually with their patients. It has been claimed that a) male patients do not experience sustained erotic transferences, and b) female analysts do not experience erotic countertransferences with female or male patients. These views are challenged and it is argued that, if there is less sexual acting out by female analysts, it is not because of an absence of eros in the therapeutic relationship. The literature review covers material drawn from psychoanalysis, feminist psychotherapy, Jungian analysis, as well as some sociological and cultural sources. It is organized under the following headings: the gender of the analyst, sexual acting out, erotic transference, maternal and paternal transference, gender and power, countertransference, incest taboo--mothers and sons and sexual themes in the transference.
Techniques for automatic large scale change analysis of temporal multispectral imagery
NASA Astrophysics Data System (ADS)
Mercovich, Ryan A.
Change detection in remotely sensed imagery is a multi-faceted problem with a wide variety of desired solutions. Automatic change detection and analysis to assist in the coverage of large areas at high resolution is a popular area of research in the remote sensing community. Beyond basic change detection, the analysis of change is essential to provide results that positively impact an image analyst's job when examining potentially changed areas. Present change detection algorithms are geared toward low resolution imagery, and require analyst input to provide anything more than a simple pixel level map of the magnitude of change that has occurred. One major problem with this approach is that change occurs in such large volume at small spatial scales that a simple change map is no longer useful. This research strives to create an algorithm based on a set of metrics that performs a large area search for change in high resolution multispectral image sequences and utilizes a variety of methods to identify different types of change. Rather than simply mapping the magnitude of any change in the scene, the goal of this research is to create a useful display of the different types of change in the image. The techniques presented in this dissertation are used to interpret large area images and provide useful information to an analyst about small regions that have undergone specific types of change while retaining image context to make further manual interpretation easier. This analyst cueing to reduce information overload in a large area search environment will have an impact in the areas of disaster recovery, search and rescue situations, and land use surveys among others. By utilizing a feature based approach founded on applying existing statistical methods and new and existing topological methods to high resolution temporal multispectral imagery, a novel change detection methodology is produced that can automatically provide useful information about the change occurring in large area and high resolution image sequences. The change detection and analysis algorithm developed could be adapted to many potential image change scenarios to perform automatic large scale analysis of change.
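As a hedged sketch of the kind of feature-based, tiled change measure discussed (not the dissertation's algorithm), the following flags tiles of a co-registered multispectral image pair whose mean spectral-angle change exceeds a threshold; the tile size and threshold are illustrative assumptions:

```python
# Hedged sketch: a simple tiled change measure for two co-registered
# multispectral images, flagging tiles with large mean spectral-angle change.
import numpy as np

def spectral_angle(a, b, eps=1e-12):
    """Per-pixel spectral angle (radians) between two (rows, cols, bands) cubes."""
    dot = (a * b).sum(axis=-1)
    norms = np.linalg.norm(a, axis=-1) * np.linalg.norm(b, axis=-1) + eps
    return np.arccos(np.clip(dot / norms, -1.0, 1.0))

def changed_tiles(img_t0, img_t1, tile=64, threshold=0.15):
    """Return (tile_row, tile_col) indices whose mean change exceeds threshold."""
    angles = spectral_angle(img_t0, img_t1)
    flagged = []
    for r in range(0, angles.shape[0] - tile + 1, tile):
        for c in range(0, angles.shape[1] - tile + 1, tile):
            if angles[r:r + tile, c:c + tile].mean() > threshold:
                flagged.append((r // tile, c // tile))
    return flagged

# Synthetic example: two 4-band images with a change inserted in one corner
t0 = np.random.rand(256, 256, 4)
t1 = t0 + np.random.normal(0, 0.01, t0.shape)
t1[:64, :64] = np.random.rand(64, 64, 4)
print(changed_tiles(t0, t1))
```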
A Self-Tuning Kalman Filter for Autonomous Navigation Using the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Truong, Son H.
1999-01-01
Most navigation systems currently operated by NASA are ground-based, and require extensive support to produce accurate results. Recently developed systems that use Kalman filter and GPS (Global Positioning Systems) data for orbit determination greatly reduce dependency on ground support, and have potential to provide significant economies for NASA spacecraft navigation. These systems, however, still rely on manual tuning from analysts. A sophisticated neuro-fuzzy component fully integrated with the flight navigation system can perform the self-tuning capability for the Kalman filter and help the navigation system recover from estimation errors in real time.
A Self-Tuning Kalman Filter for Autonomous Navigation using the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Truong, S. H.
1999-01-01
Most navigation systems currently operated by NASA are ground-based, and require extensive support to produce accurate results. Recently developed systems that use Kalman filter and GPS data for orbit determination greatly reduce dependency on ground support, and have potential to provide significant economies for NASA spacecraft navigation. These systems, however, still rely on manual tuning from analysts. A sophisticated neuro-fuzzy component fully integrated with the flight navigation system can perform the self-tuning capability for the Kalman filter and help the navigation system recover from estimation errors in real time.
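A minimal, hedged sketch of the kind of Kalman filter such a system tunes: a one-dimensional constant-velocity state estimated from noisy position fixes, with hand-set process and measurement noise standing in for the parameters the neuro-fuzzy component would adjust automatically:

```python
# Hedged sketch of a standard Kalman filter; Q and R are the tuning knobs.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition (position, velocity)
H = np.array([[1.0, 0.0]])                   # we observe position only
Q = np.diag([0.01, 0.01])                    # process noise (hand-set here)
R = np.array([[25.0]])                       # measurement noise (hand-set here)

x = np.array([[0.0], [0.0]])                 # initial state estimate
P = np.eye(2) * 100.0                        # initial covariance

truth_pos = np.cumsum(np.full(50, 3.0))      # object moving at 3 m/s
for z in truth_pos + np.random.normal(0, 5.0, 50):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the noisy position measurement z
    y = np.array([[z]]) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

print("estimated position/velocity:", x.ravel())
```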
NASA Astrophysics Data System (ADS)
Dhesi, Gurjeet; Ausloos, Marcel
2016-07-01
Following a Geometrical Brownian Motion extension into an Irrational Fractional Brownian Motion model, we re-examine agent behaviour reacting to time-dependent news on the log-returns, thereby modifying a financial market evolution. We specifically discuss how the positive or negative feedback of such irrational (or contrarian) agents to financial news or economic information acts upon the price evolution. We observe a kink-like effect reminiscent of soliton behaviour, suggesting how analysts' forecast errors induce stock prices to adjust accordingly, thereby proposing a measure of the irrational force in a market.
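A hedged illustration only, not the authors' model: a plain geometric Brownian motion price path with an ad hoc news-feedback term added to the drift, loosely in the spirit of the discussion above (all parameter names and values are assumptions):

```python
# Illustrative sketch: geometric Brownian motion with a crude news-feedback
# term in the drift. Not the Irrational Fractional Brownian Motion model.
import numpy as np

np.random.seed(0)
n_steps, dt = 1000, 1 / 252
mu, sigma = 0.05, 0.2                 # baseline drift and volatility
feedback = 0.5                        # assumed strength of reaction to news

price = np.empty(n_steps)
price[0] = 100.0
news = np.zeros(n_steps)
news[400] = 0.05                      # a single positive news shock

for t in range(1, n_steps):
    drift = mu + feedback * news[t]   # contrarian agents would flip the sign
    dW = np.random.normal(0, np.sqrt(dt))
    price[t] = price[t - 1] * np.exp((drift - 0.5 * sigma**2) * dt + sigma * dW)

print("final price:", round(price[-1], 2))
```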
BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis, Version III
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vondy, D.R.; Fowler, T.B.; Cunningham, G.W. III.
1981-06-01
This report is a condensed documentation for VERSION III of the BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis. An experienced analyst should be able to use this system routinely for solving problems by referring to this document. Individual reports must be referenced for details. This report covers basic input instructions and describes recent extensions to the modules as well as to the interface data file specifications. Some application considerations are discussed and an elaborate sample problem is used as an instruction aid. Instructions for creating the system on IBM computers are also given.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Justin; Slaughter, Andrew; Veeraraghavan, Swetha
Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON) is a finite element application that aims at analyzing the response of 3-D soil-structure systems to natural and man-made hazards such as earthquakes, floods and fire. MASTODON currently focuses on the simulation of seismic events and has the capability to perform extensive ‘source-to-site’ simulations including earthquake fault rupture, nonlinear wave propagation and nonlinear soil-structure interaction (NLSSI) analysis. MASTODON is being developed to be a dynamic probabilistic risk assessment framework that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment.
NASA Astrophysics Data System (ADS)
Appel, Marius; Lahn, Florian; Pebesma, Edzer; Buytaert, Wouter; Moulds, Simon
2016-04-01
Today's amount of freely available data requires scientists to spend large parts of their work on data management. This is especially true in environmental sciences when working with large remote sensing datasets, such as those obtained from earth-observation satellites like the Sentinel fleet. Many frameworks like SpatialHadoop or Apache Spark address scalability but target programmers rather than data analysts, and are not dedicated to imagery or array data. In this work, we use the open-source data management and analytics system SciDB to bring large earth-observation datasets closer to analysts. Its underlying data representation as multidimensional arrays fits naturally to earth-observation datasets, distributes storage and computational load over multiple instances by multidimensional chunking, and also enables efficient time-series based analyses, which are usually difficult using file- or tile-based approaches. Existing interfaces to R and Python furthermore allow for scalable analytics with relatively little learning effort. However, interfacing SciDB with file-based earth-observation datasets that come as tiled temporal snapshots requires a lot of manual bookkeeping during ingestion, and SciDB natively only supports loading data from CSV-like and custom binary-formatted files, which currently limits its practical use in earth-observation analytics. To make it easier to work with large multi-temporal datasets in SciDB, we developed software tools that enrich SciDB with earth-observation metadata and allow working with commonly used file formats: (i) the SciDB extension library scidb4geo simplifies working with spatiotemporal arrays by adding relevant metadata to the database, and (ii) the Geospatial Data Abstraction Library (GDAL) driver implementation scidb4gdal allows ingesting and exporting remote sensing imagery from and to a large number of file formats. Using added metadata on temporal resolution and coverage, the GDAL driver supports time-based ingestion of imagery into existing multi-temporal SciDB arrays. While our SciDB plugin works directly in the database, the GDAL driver has been specifically developed with a minimum of external dependencies (i.e., cURL). Source code for both tools is available from github [1]. We present these tools in a case study that demonstrates the ingestion of multi-temporal tiled earth-observation data into SciDB, followed by a time-series analysis using R and SciDBR. Through the exclusive use of open-source software, our approach supports reproducibility in scalable large-scale earth-observation analytics. In the future, these tools can be used in an automated way to let scientists work only on ready-to-use SciDB arrays and thus significantly reduce the data management workload for domain scientists. [1] https://github.com/mappl/scidb4geo and https://github.com/mappl/scidb4gdal
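As a hedged sketch of the file-based side of such a workflow (not the scidb4gdal driver itself), the following reads a georeferenced scene with GDAL's Python bindings before ingestion into an array database; the file name is a placeholder:

```python
# Hedged sketch: read a georeferenced raster with GDAL's Python bindings.
from osgeo import gdal

gdal.UseExceptions()
ds = gdal.Open("scene_2016_04_01.tif")          # hypothetical input tile

band = ds.GetRasterBand(1)
array = band.ReadAsArray()                      # numpy array of pixel values
geotransform = ds.GetGeoTransform()             # origin and pixel size
print(ds.RasterXSize, ds.RasterYSize, ds.RasterCount)
print("upper-left corner:", geotransform[0], geotransform[3])
```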
49 CFR 1245.5 - Classification of job titles.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...
49 CFR 1245.5 - Classification of job titles.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...
49 CFR 1245.5 - Classification of job titles.
Code of Federal Regulations, 2011 CFR
2011-10-01
..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...
49 CFR 1245.5 - Classification of job titles.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...
49 CFR 1245.5 - Classification of job titles.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, Dustin Yewell
Echo™ is a MATLAB-based software package designed for robust and scalable analysis of complex data workflows. An alternative to tedious, error-prone conventional processes, Echo is based on three transformative principles for data analysis: self-describing data, name-based indexing, and dynamic resource allocation. The software takes an object-oriented approach to data analysis, intimately connecting measurement data with associated metadata. Echo operations in an analysis workflow automatically track and merge metadata and computation parameters to provide a complete history of the process used to generate final results, while automated figure and report generation tools eliminate the potential to mislabel those results. History reporting and visualization methods provide straightforward auditability of analysis processes. Furthermore, name-based indexing on metadata greatly improves code readability for analyst collaboration and reduces opportunities for errors to occur. Echo efficiently manages large data sets using a framework that seamlessly allocates resources such that only the necessary computations to produce a given result are executed. Echo provides a versatile and extensible framework, allowing advanced users to add their own tools and data classes tailored to their own specific needs. Applying these transformative principles and powerful features, Echo greatly improves analyst efficiency and quality of results in many application areas.
Poore, Joshua C; Forlines, Clifton L; Miller, Sarah M; Regan, John R; Irvine, John M
2014-12-01
The decision sciences are increasingly challenged to advance methods for modeling analysts, accounting for both analytic strengths and weaknesses, to improve inferences taken from increasingly large and complex sources of data. We examine whether psychometric measures-personality, cognitive style, motivated cognition-predict analytic performance and whether psychometric measures are competitive with aptitude measures (i.e., SAT scores) as analyst sample selection criteria. A heterogeneous, national sample of 927 participants completed an extensive battery of psychometric measures and aptitude tests and was asked 129 geopolitical forecasting questions over the course of 1 year. Factor analysis reveals four dimensions among psychometric measures; dimensions characterized by differently motivated "top-down" cognitive styles predicted distinctive patterns in aptitude and forecasting behavior. These dimensions were not better predictors of forecasting accuracy than aptitude measures. However, multiple regression and mediation analysis reveals that these dimensions influenced forecasting accuracy primarily through bias in forecasting confidence. We also found that these facets were competitive with aptitude tests as forecast sampling criteria designed to mitigate biases in forecasting confidence while maximizing accuracy. These findings inform the understanding of individual difference dimensions at the intersection of analytic aptitude and demonstrate that they wield predictive power in applied, analytic domains.
Forlines, Clifton L.; Miller, Sarah M.; Regan, John R.; Irvine, John M.
2014-01-01
The decision sciences are increasingly challenged to advance methods for modeling analysts, accounting for both analytic strengths and weaknesses, to improve inferences taken from increasingly large and complex sources of data. We examine whether psychometric measures—personality, cognitive style, motivated cognition—predict analytic performance and whether psychometric measures are competitive with aptitude measures (i.e., SAT scores) as analyst sample selection criteria. A heterogeneous, national sample of 927 participants completed an extensive battery of psychometric measures and aptitude tests and was asked 129 geopolitical forecasting questions over the course of 1 year. Factor analysis reveals four dimensions among psychometric measures; dimensions characterized by differently motivated “top-down” cognitive styles predicted distinctive patterns in aptitude and forecasting behavior. These dimensions were not better predictors of forecasting accuracy than aptitude measures. However, multiple regression and mediation analysis reveals that these dimensions influenced forecasting accuracy primarily through bias in forecasting confidence. We also found that these facets were competitive with aptitude tests as forecast sampling criteria designed to mitigate biases in forecasting confidence while maximizing accuracy. These findings inform the understanding of individual difference dimensions at the intersection of analytic aptitude and demonstrate that they wield predictive power in applied, analytic domains. PMID:25983670
Elements of analytic style: Bion's clinical seminars.
Ogden, Thomas H
2007-10-01
The author finds that the idea of analytic style better describes significant aspects of the way he practices psychoanalysis than does the notion of analytic technique. The latter is comprised to a large extent of principles of practice developed by previous generations of analysts. By contrast, the concept of analytic style, though it presupposes the analyst's thorough knowledge of analytic theory and technique, emphasizes (1) the analyst's use of his unique personality as reflected in his individual ways of thinking, listening, and speaking, his own particular use of metaphor, humor, irony, and so on; (2) the analyst's drawing on his personal experience, for example, as an analyst, an analysand, a parent, a child, a spouse, a teacher, and a student; (3) the analyst's capacity to think in a way that draws on, but is independent of, the ideas of his colleagues, his teachers, his analyst, and his analytic ancestors; and (4) the responsibility of the analyst to invent psychoanalysis freshly for each patient. Close readings of three of Bion's 'Clinical seminars' are presented in order to articulate some of the elements of Bion's analytic style. Bion's style is not presented as a model for others to emulate or, worse yet, imitate; rather, it is described in an effort to help the reader consider from a different vantage point (provided by the concept of analytic style) the way in which he, the reader, practices psychoanalysis.
From fields to objects: A review of geographic boundary analysis
NASA Astrophysics Data System (ADS)
Jacquez, G. M.; Maruca, S.; Fortin, M.-J.
Geographic boundary analysis is a relatively new approach unfamiliar to many spatial analysts. It is best viewed as a technique for defining objects - geographic boundaries - on spatial fields, and for evaluating the statistical significance of characteristics of those boundary objects. This is accomplished using null spatial models representative of the spatial processes expected in the absence of boundary-generating phenomena. Close ties to the object-field dialectic eminently suit boundary analysis to GIS data. The majority of existing spatial methods are field-based in that they describe, estimate, or predict how attributes (variables defining the field) vary through geographic space. Such methods are appropriate for field representations but not object representations. As the object-field paradigm gains currency in geographic information science, appropriate techniques for the statistical analysis of objects are required. The methods reviewed in this paper are a promising foundation. Geographic boundary analysis is clearly a valuable addition to the spatial statistical toolbox. This paper presents the philosophy of, and motivations for geographic boundary analysis. It defines commonly used statistics for quantifying boundaries and their characteristics, as well as simulation procedures for evaluating their significance. We review applications of these techniques, with the objective of making this promising approach accessible to the GIS-spatial analysis community. We also describe the implementation of these methods within geographic boundary analysis software: GEM.
Web-based GIS for spatial pattern detection: application to malaria incidence in Vietnam.
Bui, Thanh Quang; Pham, Hai Minh
2016-01-01
There is great concern about how to build an interoperable health information system linking public health and health information technology within the development of public information and health surveillance programmes. Technically, some major issues remain regarding health data visualization, spatial processing of health data, health information dissemination, data sharing and the access of local communities to health information. In combination with GIS, we propose a technical framework for web-based health data visualization and spatial analysis. Data were collected from open map-servers and geocoded with the Open Data Kit package and data geocoding tools. The web-based system is designed on open-source frameworks and libraries. The system provides a web-based analyst tool for pattern detection through three spatial tests: nearest neighbour, K function, and spatial autocorrelation. The result is a web-based GIS through which end users can detect disease patterns by selecting an area and spatial test parameters, and can contribute results to managers and decision makers. The end users can be health practitioners, educators, local communities, health sector authorities and decision makers. This web-based system allows for the improvement of health-related services for public sector users as well as citizens in a secure manner. The combination of spatial statistics and web-based GIS can be a solution that helps empower health practitioners in direct and specific intersectional actions, thus providing for better analysis, control and decision-making.
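A hedged sketch of one of the three spatial tests mentioned, the nearest-neighbour index, computed for synthetic case locations in projected coordinates (the study-area extent and point count are assumptions):

```python
# Illustrative sketch: nearest-neighbour index (observed vs. expected mean
# nearest-neighbour distance under complete spatial randomness).
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
points = rng.uniform(0, 10_000, size=(200, 2))   # synthetic case locations (m)

tree = cKDTree(points)
# k=2 because each point's nearest neighbour in the tree is itself (distance 0)
dists, _ = tree.query(points, k=2)
observed_mean = dists[:, 1].mean()

area = 10_000 * 10_000                           # study-area extent (m^2)
density = len(points) / area
expected_mean = 0.5 / np.sqrt(density)           # expectation under CSR

nni = observed_mean / expected_mean              # <1 clustered, >1 dispersed
print("nearest-neighbour index:", round(nni, 3))
```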
GPM Timeline Inhibits For IT Processing
NASA Technical Reports Server (NTRS)
Dion, Shirley K.
2014-01-01
The Safety Inhibit Timeline Tool was created as one approach to capturing and understanding inhibits and controls from IT through launch. The Global Precipitation Measurement (GPM) mission, which launched from Japan in March 2014, was a joint mission under a partnership between the National Aeronautics and Space Administration (NASA) and the Japan Aerospace Exploration Agency (JAXA). GPM was one of the first NASA Goddard in-house programs that extensively used software controls. Using this tool during the GPM buildup allowed a thorough review of inhibit and safety-critical software design for hazardous subsystems such as the high gain antenna boom, solar array, and instrument deployments, transmitter turn-on, propulsion system release, and instrument radar turn-on. The GPM safety team developed a methodology to document software safety as part of the standard hazard report. As a result of this process, a new tool, the safety inhibit timeline, was created for management of inhibits and their controls during spacecraft buildup and testing during IT at GSFC and at the launch range in Japan. The Safety Inhibit Timeline Tool was a pathfinder approach for reviewing software that controls the electrical inhibits. The Safety Inhibit Timeline Tool strengthens the Safety Analyst's understanding of the removal of inhibits during the IT process with safety-critical software. With this tool, the Safety Analyst can confirm proper safe configuration of a spacecraft during each IT test, track inhibit and software configuration changes, and assess software criticality. In addition to understanding inhibits and controls during IT, the tool allows the Safety Analyst to better communicate to engineers and management the changes in inhibit states with each phase of hardware and software testing and the impact of safety risks. Lessons learned from participating in the GPM campaign at NASA and JAXA will be discussed during this session.
Optical vs. electronic enhancement of remote sensing imagery
NASA Technical Reports Server (NTRS)
Colwell, R. N.; Katibah, E. F.
1976-01-01
Basic aspects of remote sensing are considered and a description is provided of the methods which are employed in connection with the optical or electronic enhancement of remote sensing imagery. The advantages and limitations of various image enhancement methods and techniques are evaluated. It is pointed out that optical enhancement methods and techniques are currently superior to electronic ones with respect to spatial resolution and equipment cost considerations. Advantages of electronic procedures, on the other hand, are related to a greater flexibility regarding the presentation of the information as an aid for the interpretation by the image analyst.
Gravity Data for West-Central Colorado
Richard Zehner
2012-04-06
Modeled Bouguer-corrected gravity data were extracted from the Pan American Center for Earth and Environmental Studies gravity database of the U.S. at http://irpsrvgis08.utep.edu/viewers/Flex/GravityMagnetic/GravityMagnetic_CyberShare/ on 2/29/2012. The downloaded text file was opened in an Excel spreadsheet. This spreadsheet data was then converted into an ESRI point shapefile in UTM Zone 13 NAD27 projection, showing location and gravity (in milligals). The point data were then converted to a grid and contoured using ESRI Spatial Analyst. Data are from the University of Texas: Pan American Center for Earth and Environmental Studies.
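A hedged sketch of the conversion step described above, turning a table of station coordinates and gravity values into a point layer in UTM Zone 13 NAD27 (EPSG:26713 is assumed to be the intended CRS; file and column names are placeholders):

```python
# Illustrative sketch: tabular gravity stations to a projected point shapefile.
import pandas as pd
import geopandas as gpd

df = pd.read_csv("gravity_stations.csv")          # assumed columns: lon, lat, milligals
gdf = gpd.GeoDataFrame(
    df,
    geometry=gpd.points_from_xy(df["lon"], df["lat"]),
    crs="EPSG:4267",                              # NAD27 geographic coordinates
)
gdf_utm = gdf.to_crs("EPSG:26713")                # NAD27 / UTM zone 13N (assumed target)
gdf_utm.to_file("gravity_points_utm13_nad27.shp")
```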
One decade of the Data Fusion Information Group (DFIG) model
NASA Astrophysics Data System (ADS)
Blasch, Erik
2015-05-01
The revision of the Joint Directors of the Laboratories (JDL) Information Fusion model in 2004 discussed information processing, incorporated the analyst, and was coined the Data Fusion Information Group (DFIG) model. Since that time, developments in information technology (e.g., cloud computing, applications, and multimedia) have altered the role of the analyst. Data production has outpaced the analyst; however the analyst still has the role of data refinement and information reporting. In this paper, we highlight three examples being addressed by the DFIG model. One example is the role of the analyst to provide semantic queries (through an ontology) so that vast amount of data available can be indexed, accessed, retrieved, and processed. The second idea is reporting which requires the analyst to collect the data into a condensed and meaningful form through information management. The last example is the interpretation of the resolved information from data that must include contextual information not inherent in the data itself. Through a literature review, the DFIG developments in the last decade demonstrate the usability of the DFIG model to bring together the user (analyst or operator) and the machine (information fusion or manager) in a systems design.
Assessing the performance of regional landslide early warning models: the EDuMaP method
NASA Astrophysics Data System (ADS)
Calvello, M.; Piciullo, L.
2015-10-01
The paper proposes the evaluation of the technical performance of a regional landslide early warning system by means of an original approach, called the EDuMaP method, comprising three successive steps: identification and analysis of the Events (E), i.e. landslide events and warning events derived from the available landslide and warning databases; definition and computation of a Duration Matrix (DuMa), whose elements report the time associated with the occurrence of landslide events in relation to the occurrence of warning events, in their respective classes; evaluation of the early warning model Performance (P) by means of performance criteria and indicators applied to the duration matrix. During the first step, the analyst takes into account the features of the warning model by means of ten input parameters, which are used to identify and classify landslide and warning events according to their spatial and temporal characteristics. In the second step, the analyst computes a time-based duration matrix having a number of rows and columns equal to the number of classes defined for the warning and landslide events, respectively. In the third step, the analyst computes a series of model performance indicators derived from a set of performance criteria, which need to be defined by considering, once again, the features of the warning model. The proposed method is based on a framework clearly distinguishing between local and regional landslide early warning systems as well as among correlation laws, warning models and warning systems. The applicability, potentialities and limitations of the EDuMaP method are tested and discussed using real landslide and warning data from the municipal early warning system operating in Rio de Janeiro (Brazil).
Assessing the performance of regional landslide early warning models: the EDuMaP method
NASA Astrophysics Data System (ADS)
Calvello, M.; Piciullo, L.
2016-01-01
A schematic of the components of regional early warning systems for rainfall-induced landslides is herein proposed, based on a clear distinction between warning models and warning systems. According to this framework an early warning system comprises a warning model as well as a monitoring and warning strategy, a communication strategy and an emergency plan. The paper proposes the evaluation of regional landslide warning models by means of an original approach, called the "event, duration matrix, performance" (EDuMaP) method, comprising three successive steps: identification and analysis of the events, i.e., landslide events and warning events derived from the available landslide and warning databases; definition and computation of a duration matrix, whose elements report the time associated with the occurrence of landslide events in relation to the occurrence of warning events, in their respective classes; evaluation of the early warning model performance by means of performance criteria and indicators applied to the duration matrix. During the first step the analyst identifies and classifies the landslide and warning events, according to their spatial and temporal characteristics, by means of a number of model parameters. In the second step, the analyst computes a time-based duration matrix with a number of rows and columns equal to the number of classes defined for the warning and landslide events, respectively. In the third step, the analyst computes a series of model performance indicators derived from a set of performance criteria, which need to be defined by considering, once again, the features of the warning model. The applicability, potentialities and limitations of the EDuMaP method are tested and discussed using real landslide and warning data from the municipal early warning system operating in Rio de Janeiro (Brazil).
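A hedged, simplified sketch of the duration-matrix step (step two) described above; the warning and landslide class definitions and the hourly discretization are illustrative assumptions, not those of the EDuMaP application:

```python
# Simplified sketch: build a duration matrix whose entry (i, j) accumulates
# the hours spent in warning class i while landslide class j was occurring.
import numpy as np

hours = 240                                        # analysis window (h)
# Warning level issued each hour: 0 = none, 1 = moderate, 2 = high
warning = np.zeros(hours, dtype=int)
warning[50:80] = 1
warning[80:95] = 2
# Landslide-event class each hour: 0 = no event, 1 = minor, 2 = major
landslide = np.zeros(hours, dtype=int)
landslide[85:90] = 2

n_warning_classes, n_landslide_classes = 3, 3
duration = np.zeros((n_warning_classes, n_landslide_classes))
for w, l in zip(warning, landslide):
    duration[w, l] += 1.0                          # add one hour to that cell

print(duration)   # e.g. duration[2, 2] = hours of high warning during a major event
```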
SafetyAnalyst : software tools for safety management of specific highway sites
DOT National Transportation Integrated Search
2010-07-01
SafetyAnalyst provides a set of software tools for use by state and local highway agencies for highway safety management. SafetyAnalyst can be used by highway agencies to improve their programming of site-specific highway safety improvements. SafetyA...
Exploring the Analytical Processes of Intelligence Analysts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, George; Kuchar, Olga A.; Wolf, Katherine E.
We present an observational case study in which we investigate and analyze the analytical processes of intelligence analysts. Participating analysts in the study carry out two scenarios where they organize and triage information, conduct intelligence analysis, report results, and collaborate with one another. Through a combination of artifact analyses, group interviews, and participant observations, we explore the space and boundaries in which intelligence analysts work and operate. We also assess the implications of our findings on the use and application of relevant information technologies.
Reflections: can the analyst share a traumatizing experience with a traumatized patient?
Lijtmaer, Ruth
2010-01-01
This is a personal account of a dreadful event in the analyst's life that was similar to a patient's trauma. It is a reflection on how the analyst dealt with her own trauma, the patient's trauma, and the transference and countertransference dynamics. Included is a description of the analyst's inner struggles with self-disclosure, continuance of her professional work, and the need for persistent self-scrutiny. The meaning of objects in people's life, particularly the concept of home, will be addressed.
Do Sell-Side Stock Analysts Exhibit Escalation of Commitment?
Milkman, Katherine L.
2010-01-01
This paper presents evidence that when an analyst makes an out-of-consensus forecast of a company’s quarterly earnings that turns out to be incorrect, she escalates her commitment to maintaining an out-of-consensus view on the company. Relative to an analyst who was close to the consensus, the out-of-consensus analyst adjusts her forecasts for the current fiscal year’s earnings less in the direction of the quarterly earnings surprise. On average, this type of updating behavior reduces forecasting accuracy, so it does not seem to reflect superior private information. Further empirical results suggest that analysts do not have financial incentives to stand by extreme stock calls in the face of contradictory evidence. Managerial and financial market implications are discussed. PMID:21516220
Mortality, integrity, and psychoanalysis (who are you to me? Who am I to you?).
Pinsky, Ellen
2014-01-01
The author narrates her experience of mourning her therapist's sudden death. The profession has neglected implications of the analyst's mortality: what is lost or vulnerable to loss? What is that vulnerability's function? The author's process of mourning included her writing and her becoming an analyst. Both pursuits inspired reflections on mortality in two overlapping senses: bodily (the analyst is mortal and can die) and character (the analyst is mortal and can err). The subject thus expands to include impaired character and ethical violations. Paradoxically, the analyst's human limitations threaten each psychoanalytic situation, but also enable it: human imperfection animates the work. The essay ends with a specific example of integrity. © 2014 The Psychoanalytic Quarterly, Inc.
The tobacco industry's use of Wall Street analysts in shaping policy.
Alamar, B C; Glantz, S A
2004-09-01
To document how the tobacco industry has used Wall Street analysts to further its public policy objectives. Searching tobacco documents available on the internet, newspaper articles, and transcripts of public hearings. The tobacco industry used nominally independent Wall Street analysts as third parties to support the tobacco industry's legislative agenda at both national and state levels in the USA. The tobacco industry has, for example, edited the testimony of at least one analyst before he testified to the US Senate Judiciary Committee, while representing himself as independent of the industry. The tobacco industry has used undisclosed collaboration with Wall Street analysts, as they have used undisclosed relationships with research scientists and academics, to advance the interests of the tobacco industry in public policy.
75 FR 20385 - Amended Certification Regarding Eligibility To Apply for Worker Adjustment Assistance
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-19
... Inconen, CTS, Hi-Tec, Woods, Ciber, Kelly Services, Analysts International Corp, Comsys, Filter LLC, Excell, Entegee, Chipton-Ross, Ian... (truncated list of firms named in the amended certification).
Subcellular object quantification with Squassh3C and SquasshAnalyst.
Rizk, Aurélien; Mansouri, Maysam; Ballmer-Hofer, Kurt; Berger, Philipp
2015-11-01
Quantitative image analysis plays an important role in contemporary biomedical research. Squassh is a method for automatic detection, segmentation, and quantification of subcellular structures and analysis of their colocalization. Here we present the applications Squassh3C and SquasshAnalyst. Squassh3C extends the functionality of Squassh to three fluorescence channels and live-cell movie analysis. SquasshAnalyst is an interactive web interface for the analysis of Squassh3C object data. It provides segmentation image overview and data exploration, figure generation, object and image filtering, and a statistical significance test in an easy-to-use interface. The overall procedure combines the Squassh3C plug-in for the free biological image processing program ImageJ and a web application working in conjunction with the free statistical environment R, and it is compatible with Linux, MacOS X, or Microsoft Windows. Squassh3C and SquasshAnalyst are available for download at www.psi.ch/lbr/SquasshAnalystEN/SquasshAnalyst.zip.
This art of psychoanalysis. Dreaming undreamt dreams and interrupted cries.
Ogden, Thomas H
2004-08-01
It is the art of psychoanalysis in the making, a process inventing itself as it goes, that is the subject of this paper. The author articulates succinctly how he conceives of psychoanalysis, and offers a detailed clinical illustration. He suggests that each analysand unconsciously (and ambivalently) is seeking help in dreaming his 'night terrors' (his undreamt and undreamable dreams) and his 'nightmares' (his dreams that are interrupted when the pain of the emotional experience being dreamt exceeds his capacity for dreaming). Undreamable dreams are understood as manifestations of psychotic and psychically foreclosed aspects of the personality; interrupted dreams are viewed as reflections of neurotic and other non-psychotic parts of the personality. The analyst's task is to generate conditions that may allow the analysand--with the analyst's participation--to dream the patient's previously undreamable and interrupted dreams. A significant part of the analyst's participation in the patient's dreaming takes the form of the analyst's reverie experience. In the course of this conjoint work of dreaming in the analytic setting, the analyst may get to know the analysand sufficiently well for the analyst to be able to say something that is true to what is occurring at an unconscious level in the analytic relationship. The analyst's use of language contributes significantly to the possibility that the patient will be able to make use of what the analyst has said for purposes of dreaming his own experience, thereby dreaming himself more fully into existence.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-05
... Verizon Business Networks Services, Inc., Senior Analysts-Order Management, Voice Over Internet Protocol, Small and Medium Business, Tampa, Florida; Verizon Business Networks Services, Inc., Senior Coordinator-Order... (truncated notice text).
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
...,968B] Verizon Business Networks Services, Inc., Senior Analysts-Sales Implementation (SA-SI), Birmingham, Alabama; Verizon Business Networks Services, Inc., Senior Analysts-Sales Implementation (SA-SI), Service Program Delivery Division, San Francisco, California; Verizon Business Networks Services, Inc., Senior... (truncated notice text).
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-01
... Securities Offering. Series 86 Research Analyst--Analysis: from $160 to $175. Series 87 Research Analyst... Order Processing Assistant Representatives, Research Analysts and Operations Professionals, respectively... PROCTOR is a computer system that is specifically designed for the administration and... (truncated notice text).
NASA Astrophysics Data System (ADS)
Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan
2018-03-01
GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, spatial heterogeneity of preferences and the risk attitude of the analyst. The approach is applied to a pilot study for Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and can provide a more rational, objective and unbiased tool for flood susceptibility evaluation.
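To make the probabilistic OWA idea concrete, here is a minimal Python sketch, not the authors' code: grid size, criterion weights, order weights, and the Dirichlet perturbation are all assumptions, and the aggregation is a global, simplified OWA rather than the paper's local variant. Six hypothetical criterion rasters are aggregated per cell with an ordered weighted average, and the aggregation is repeated under randomly perturbed criterion weights so that the per-cell mean and standard deviation serve as a susceptibility estimate and its uncertainty.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stack of 6 standardized criterion rasters (values in [0, 1]).
n_rows, n_cols, n_criteria = 20, 20, 6
criteria = rng.random((n_criteria, n_rows, n_cols))

base_weights = np.array([0.25, 0.20, 0.20, 0.15, 0.10, 0.10])   # assumed criterion weights
order_weights = np.array([0.30, 0.25, 0.20, 0.15, 0.07, 0.03])  # assumed risk attitude

def owa(values, criterion_weights, order_weights):
    """OWA for one cell: sort values, reorder criterion weights, combine with order weights."""
    order = np.argsort(values)[::-1]      # criteria ranked from best to worst value
    z = values[order]
    u = criterion_weights[order]
    w = order_weights * u
    w = w / w.sum()
    return float(np.dot(w, z))

# Monte Carlo loop: perturb the criterion weights around the base weights.
n_sim = 100
maps = np.empty((n_sim, n_rows, n_cols))
for s in range(n_sim):
    w_s = rng.dirichlet(base_weights * 50)          # random weights centred on the base weights
    for i in range(n_rows):
        for j in range(n_cols):
            maps[s, i, j] = owa(criteria[:, i, j], w_s, order_weights)

susceptibility = maps.mean(axis=0)   # ensemble susceptibility surface
uncertainty = maps.std(axis=0)       # cell-level uncertainty due to weight variation
print(susceptibility.round(2))
print(uncertainty.round(3))
```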
NASA Technical Reports Server (NTRS)
Smith, Andrew; LaVerde, Bruce; Jones, Douglas; Towner, Robert; Hunt, Ron
2013-01-01
Fluid structural interaction problems that estimate panel vibration from an applied pressure field excitation are quite dependent on the spatial correlation of the pressure field. There is a danger of either overestimating a low-frequency response or underpredicting broadband panel response in the more modally dense bands if the pressure field spatial correlation is not accounted for adequately. Even when the analyst elects to use a fitted function for the spatial correlation, an error may be introduced if the choice of patch density is not fine enough to represent the more continuous spatial correlation function throughout the intended frequency range of interest. Both qualitative and quantitative illustrations evaluating the adequacy of different patch density assumptions to approximate the fitted spatial correlation function are provided. The actual response of a typical vehicle panel system is then evaluated in a convergence study where the patch density assumptions are varied over the same finite element model. The convergence study results are presented illustrating the impact resulting from a poor choice of patch density. The fitted correlation function used in this study represents a Diffuse Acoustic Field (DAF) excitation of the panel to produce vibration response.
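As a rough illustration of the patch-density issue, the diffuse-acoustic-field spatial correlation between two points a distance r apart is commonly modeled as sin(kr)/(kr). The Python sketch below is not the study's model; the panel size, frequency, and patch counts are invented. It evaluates that function at patch centres for a coarse and a fine discretisation of a 1 m edge, showing how coarse patches fail to resolve the correlation at higher frequencies.

```python
import numpy as np

def daf_correlation(r, k):
    """Diffuse-acoustic-field spatial correlation sin(kr)/(kr); np.sinc(x) = sin(pi x)/(pi x)."""
    return np.sinc(k * r / np.pi)

c = 343.0            # speed of sound in air, m/s
freq = 2000.0        # frequency of interest, Hz (assumed)
k = 2.0 * np.pi * freq / c
panel_length = 1.0   # 1 m panel edge (assumed)

def patch_centres(n_patches):
    edges = np.linspace(0.0, panel_length, n_patches + 1)
    return 0.5 * (edges[:-1] + edges[1:])

# Compare a coarse and a fine patch discretisation of the correlation along one panel edge.
for n in (4, 32):
    x = patch_centres(n)
    r = np.abs(x[:, None] - x[None, :])   # patch-centre-to-patch-centre separations
    corr = daf_correlation(r, k)
    print(f"{n:3d} patches: centre spacing = {x[1] - x[0]:.4f} m, "
          f"adjacent-patch correlation = {corr[0, 1]:+.3f}")
```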
21 CFR 1304.23 - Records for chemical analysts.
Code of Federal Regulations, 2011 CFR
2011-04-01
Title 21 (Food and Drugs), § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...
21 CFR 1304.23 - Records for chemical analysts.
Code of Federal Regulations, 2010 CFR
2010-04-01
Title 21 (Food and Drugs), § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...
21 CFR 1304.23 - Records for chemical analysts.
Code of Federal Regulations, 2012 CFR
2012-04-01
Title 21 (Food and Drugs), § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...
21 CFR 1304.23 - Records for chemical analysts.
Code of Federal Regulations, 2014 CFR
2014-04-01
Title 21 (Food and Drugs), § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...
78 FR 77769 - Data Collection Available for Public Comments
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-24
... comments to Amy Garcia, Program Analyst, Office of Government Contracting, Small Business Administration, 409 3rd Street, 7th Floor, Washington, DC 20416. FOR FURTHER INFORMATION CONTACT: Amy Garcia, Program Analyst, 202-205-6842, amy.garcia@sba.gov, or Curtis B. Rich, Management Analyst, 202-205-7030, curtis...
Collaborative human-machine analysis using a controlled natural language
NASA Astrophysics Data System (ADS)
Mott, David H.; Shemanski, Donald R.; Giammanco, Cheryl; Braines, Dave
2015-05-01
A key aspect of an analyst's task in providing relevant information from data is the reasoning about the implications of that data, in order to build a picture of the real world situation. This requires human cognition, based upon domain knowledge about individuals, events and environmental conditions. For a computer system to collaborate with an analyst, it must be capable of following a similar reasoning process to that of the analyst. We describe ITA Controlled English (CE), a subset of English to represent an analyst's domain knowledge and reasoning, in a form that is understandable by both analyst and machine. CE can be used to express domain rules, background data, assumptions and inferred conclusions, thus supporting human-machine interaction. A CE reasoning and modeling system can perform inferences from the data and provide the user with conclusions together with their rationale. We present a logical problem called the "Analysis Game", used for training analysts, which presents "analytic pitfalls" inherent in many problems. We explore an iterative approach to its representation in CE, where a person can develop an understanding of the problem solution by incremental construction of relevant concepts and rules. We discuss how such interactions might occur, and propose that such techniques could lead to better collaborative tools to assist the analyst and avoid the "pitfalls".
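The flavour of machine-followable rules with recorded rationale can be sketched without reproducing the actual ITA Controlled English syntax or reasoner. The toy Python forward-chainer below works over simple (subject, predicate, object) facts and if-then rules with variables; all facts, rules, and names are invented. It reports each inferred conclusion together with the premises that produced it.

```python
# Toy forward-chaining reasoner with recorded rationale (illustration only, not CE itself).
facts = {
    ("vehicle_12", "was_seen_at", "checkpoint_A"),
    ("checkpoint_A", "is_in", "region_north"),
}

# Rules: (list of premise patterns, conclusion pattern). Tokens starting with "?" are variables.
rules = [
    ([("?v", "was_seen_at", "?p"), ("?p", "is_in", "?r")],
     ("?v", "was_present_in", "?r")),
]

def match(pattern, fact, bindings):
    b = dict(bindings)
    for p, f in zip(pattern, fact):
        if p.startswith("?"):
            if b.get(p, f) != f:
                return None
            b[p] = f
        elif p != f:
            return None
    return b

def substitute(pattern, bindings):
    return tuple(bindings.get(t, t) for t in pattern)

def expand(premises, facts, bindings):
    """Yield every variable binding that satisfies all premises against the fact set."""
    if not premises:
        yield bindings
        return
    first, rest = premises[0], premises[1:]
    for fact in facts:
        b = match(first, fact, bindings)
        if b is not None:
            yield from expand(rest, facts, b)

def forward_chain(facts, rules):
    rationale = {}                         # inferred fact -> premises that produced it
    changed = True
    while changed:
        changed = False
        snapshot = set(facts)              # iterate over a frozen copy while adding to `facts`
        for premises, conclusion in rules:
            for bindings in expand(premises, snapshot, {}):
                new = substitute(conclusion, bindings)
                if new not in facts:
                    facts.add(new)
                    rationale[new] = [substitute(p, bindings) for p in premises]
                    changed = True
    return rationale

for conclusion, because in forward_chain(facts, rules).items():
    print(conclusion, "because", because)
```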
Osborne, Nikola K P; Taylor, Michael C; Healey, Matthew; Zajac, Rachel
2016-03-01
It is becoming increasingly apparent that contextual information can exert a considerable influence on decisions about forensic evidence. Here, we explored accuracy and contextual influence in bloodstain pattern classification, and how these variables might relate to analyst characteristics. Thirty-nine bloodstain pattern analysts with varying degrees of experience each completed measures of compliance, decision-making style, and need for closure. Analysts then examined a bloodstain pattern without any additional contextual information, and allocated votes to listed pattern types according to favoured and less favoured classifications. Next, if they believed it would assist with their classification, analysts could request items of contextual information - from commonly encountered sources of information in bloodstain pattern analysis - and update their vote allocation. We calculated a shift score for each item of contextual information based on vote reallocation. Almost all forms of contextual information influenced decision-making, with medical findings leading to the highest shift scores. Although there was a small positive association between shift scores and the degree to which analysts displayed an intuitive decision-making style, shift scores did not vary meaningfully as a function of experience or the other characteristics measured. Almost all of the erroneous classifications were made by novice analysts. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
This study examined inter-analyst classification variability, based only on training site signature selection, for six classifications of a 10 km2 Landsat ETM+ image centered over a highly heterogeneous area in south-central Virginia. Six analysts classified the image...
Avila, Manuel; Graterol, Eduardo; Alezones, Jesús; Criollo, Beisy; Castillo, Dámaso; Kuri, Victoria; Oviedo, Norman; Moquete, Cesar; Romero, Marbella; Hanley, Zaida; Taylor, Margie
2012-06-01
The appearance of rice grain is a key aspect of quality determination. Mainly, this analysis is performed by expert analysts through visual observation; however, due to the subjective nature of the analysis, the results may vary among analysts. In order to evaluate the concordance between analysts from Latin-American rice quality laboratories for rice grain appearance through digital images, an inter-laboratory test was performed with ten analysts and images of 90 grains captured with a high resolution scanner. Rice grains were classified into four categories: translucent, chalky, white belly, and damaged grain. Data were categorized using statistical parameters such as the mode and its frequency, the relative concordance, and the reproducibility parameter kappa. Additionally, a referential image gallery of typical grains for each category was constructed based on mode frequency. Results showed a kappa value of 0.49, corresponding to moderate reproducibility, attributable to subjectivity in the visual analysis of grain images. These results reveal the need to standardize the evaluation criteria among analysts to improve confidence in the determination of rice grain appearance.
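For a multi-rater exercise of this kind, the reproducibility parameter kappa is typically Fleiss' kappa. A minimal Python sketch follows; the count matrix is invented (the study itself used 90 grains, ten analysts, and four categories), but the formula is the standard one.

```python
import numpy as np

# Rows = grains, columns = categories; each cell = number of analysts choosing that category.
# Counts are invented for illustration; every row sums to the same number of analysts (10).
counts = np.array([
    [8, 1, 1, 0],
    [2, 6, 1, 1],
    [0, 1, 8, 1],
    [3, 3, 2, 2],
    [9, 0, 1, 0],
])

n_raters = counts.sum(axis=1)[0]                                    # analysts per item
p_item = (counts * (counts - 1)).sum(axis=1) / (n_raters * (n_raters - 1))
p_bar = p_item.mean()                                               # observed agreement
p_cat = counts.sum(axis=0) / counts.sum()                           # category proportions
p_e = (p_cat ** 2).sum()                                            # chance agreement

kappa = (p_bar - p_e) / (1.0 - p_e)
print(f"Fleiss' kappa = {kappa:.2f}")
```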
Composable Analytic Systems for next-generation intelligence analysis
NASA Astrophysics Data System (ADS)
DiBona, Phil; Llinas, James; Barry, Kevin
2015-05-01
Lockheed Martin Advanced Technology Laboratories (LM ATL) is collaborating with Professor James Llinas, Ph.D., of the Center for Multisource Information Fusion at the University at Buffalo (State of NY), researching concepts for a mixed-initiative associate system for intelligence analysts to facilitate reduced analysis and decision times while proactively discovering and presenting relevant information based on the analyst's needs, current tasks and cognitive state. Today's exploitation and analysis systems have largely been designed for a specific sensor, data type, and operational context, leading to difficulty in directly supporting the analyst's evolving tasking and work product development preferences across complex Operational Environments. Our interactions with analysts illuminate the need to impact the information fusion, exploitation, and analysis capabilities in a variety of ways, including understanding data options, algorithm composition, hypothesis validation, and work product development. Composable Analytic Systems, an analyst-driven system that increases flexibility and capability to effectively utilize Multi-INT fusion and analytics tailored to the analyst's mission needs, holds promise to address the current and future intelligence analysis needs, as US forces engage threats in contested and denied environments.
Sociocultural Behavior Influence Modelling & Assessment: Current Work and Research Frontiers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernard, Michael Lewis
A common problem associated with the effort to better assess potential behaviors of various individuals within different countries is the sheer difficulty in comprehending the dynamic nature of populations, particularly over time and considering feedback effects. This paper discusses a theory-based analytical capability designed to enable analysts to better assess the influence of events on individuals interacting within a country or region. These events can include changes in policy, man-made or natural disasters, migration, war, or other changes in environmental/economic conditions. In addition, this paper describes potential extensions of this type of research to enable more timely and accurate assessments.
A content analysis of analyst research: health care through the eyes of analysts.
Nielsen, Christian
2008-01-01
This article contributes to the understanding of how health care companies may communicate their business models by studying financial analysts' reports. The study examines the differences between the information conveyed in recurrent and fundamental analyst reports as well as whether the characteristics of the analysts and their environment affect their business model analyses. A medium-sized health care company in the medical-technology sector, internationally renowned for its state-of-the-art business reporting, was chosen as the basis for the study. An analysis of 111 fundamental and recurrent analyst reports on this company by each investment bank actively following it was conducted using a content analysis methodology. The study reveals that the recurrent analyses are concerned with evaluating the information disclosed by the health care company itself and not so much with digging up new information. It also indicates that while maintenance work might be focused on evaluating specific details, fundamental research is more concerned with extending the understanding of the general picture, i.e., the sustainability and performance of the overall business model. The amount of financial information disclosed in either type of report is not correlated to the other disclosures in the reports. In comparison to business reporting practices, the fundamental analyst reports put considerably less weight on social and sustainability, intellectual capital and corporate governance information, and they disclose much less comparable non-financial information. The suggestion made is that looking at the types of information financial analysts consider important and convey to their "customers," the investors and fund managers, constitutes a valuable indication to health care companies regarding the needs of the financial market users of their reports and other communications. There are some limitations to the possibility of applying statistical tests to the data-set as well as methodological limitations in relation to the exclusion of tables and graphs.
Incorporating Spatial Data into Enterprise Applications
NASA Astrophysics Data System (ADS)
Akiki, Pierre; Maalouf, Hoda
The main goal of this chapter is to discuss the usage of spatial data within enterprise as well as smaller line-of-business applications. In particular, this chapter proposes new methodologies for storing and manipulating vague spatial data and provides methods for visualizing both crisp and vague spatial data. It also provides a comparison between different types of spatial data, mainly 2D crisp and vague spatial data, and their respective fields of application. Additionally, it compares existing commercial relational database management systems, which are the most widely used with enterprise applications, and discusses their deficiencies in terms of spatial data support. A new spatial extension package called Spatial Extensions (SPEX) is provided in this chapter and is tested on a software prototype.
Human/autonomy collaboration for the automated generation of intelligence products
NASA Astrophysics Data System (ADS)
DiBona, Phil; Schlachter, Jason; Kuter, Ugur; Goldman, Robert
2017-05-01
Intelligence Analysis remains a manual process despite trends toward autonomy in information processing. Analysts need agile decision-support tools that can adapt to the evolving information needs of the mission, allowing the analyst to pose novel analytic questions. Our research enables the analyst to provide only a constrained English specification of what the intelligence product should be. Using HTN planning, the autonomy discovers, decides, and generates a workflow of algorithms to create the intelligence product. Therefore, the analyst can quickly and naturally communicate to the autonomy what information product is needed, rather than how to create it.
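The workflow-composition idea can be illustrated with a toy HTN-style decomposition in Python. The task, method, and algorithm names below are invented and the sketch ignores preconditions and state, so it only shows how a requested product (a compound task) expands into an ordered workflow of primitive algorithm steps; it is not the authors' planner.

```python
# Toy HTN-style decomposition: a requested intelligence product is a compound task that is
# decomposed into a workflow of primitive algorithm steps. All names are invented.

methods = {
    # compound task -> ordered list of subtasks
    "produce_activity_report": ["collect_reports", "extract_entities", "link_entities",
                                "summarize_activity"],
    "extract_entities": ["run_ner", "resolve_duplicates"],
}

primitive = {"collect_reports", "run_ner", "resolve_duplicates", "link_entities",
             "summarize_activity"}

def plan(task):
    """Depth-first HTN decomposition into an ordered workflow of primitive steps."""
    if task in primitive:
        return [task]
    if task not in methods:
        raise ValueError(f"no method for compound task {task!r}")
    workflow = []
    for subtask in methods[task]:
        workflow.extend(plan(subtask))
    return workflow

print(plan("produce_activity_report"))
# ['collect_reports', 'run_ner', 'resolve_duplicates', 'link_entities', 'summarize_activity']
```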
ERIC Educational Resources Information Center
Watson, William J.
Occupational analysts using Comprehensive Occupational Data Analysis Programs (CODAP) make subjective decisions at various stages in their analysis of an occupation. The possibility exists that two different analysts could reach different conclusions in analyzing an occupation, and thereby provide divergent guidance to management. Two analysts,…
ERIC Educational Resources Information Center
Cepeda-Cuervo, Edilberto; Núñez-Antón, Vicente
2013-01-01
In this article, a proposed Bayesian extension of the generalized beta spatial regression models is applied to the analysis of the quality of education in Colombia. We briefly review the beta distribution and describe the joint modeling approach for the mean and dispersion parameters in the spatial regression models' setting. Finally, we motivate…
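Although the article's model is Bayesian and spatial, the core beta-regression idea (a logit link for the mean and a log link for the precision) can be sketched with a plain maximum-likelihood fit. The Python example below uses synthetic data and scipy only; the coefficients, links, and sample size are assumptions, not the authors' specification.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln, expit

rng = np.random.default_rng(1)

# Synthetic scores in (0, 1) whose mean and precision both depend on one covariate.
n = 300
x = rng.normal(size=n)
mu_true = expit(0.3 + 0.8 * x)        # logit link for the mean
phi_true = np.exp(2.0 + 0.5 * x)      # log link for the precision
y = rng.beta(mu_true * phi_true, (1 - mu_true) * phi_true)

def negloglik(params):
    """Negative beta log-likelihood with shape parameters a = mu*phi, b = (1-mu)*phi."""
    b0, b1, g0, g1 = params
    mu = expit(b0 + b1 * x)
    phi = np.exp(g0 + g1 * x)
    a, b = mu * phi, (1 - mu) * phi
    return -np.sum(gammaln(a + b) - gammaln(a) - gammaln(b)
                   + (a - 1) * np.log(y) + (b - 1) * np.log(1 - y))

fit = minimize(negloglik, x0=np.zeros(4), method="BFGS")
print("estimates (b0, b1, g0, g1):", np.round(fit.x, 2))
```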
Ground Magnetic Data for West-Central Colorado
Richard Zehner
2012-03-08
Modeled ground magnetic data was extracted from the Pan American Center for Earth and Environmental Studies database at http://irpsrvgis08.utep.edu/viewers/Flex/GravityMagnetic/GravityMagnetic_CyberShare/ on 2/29/2012. The downloaded text file was then imported into an Excel spreadsheet. This spreadsheet data was converted into an ESRI point shapefile in UTM Zone 13 NAD27 projection, showing location and magnetic field strength in nano-Teslas. This point shapefile was then interpolated to an ESRI grid using an inverse-distance weighting method, using ESRI Spatial Analyst. The grid was used to create a contour map of magnetic field strength.
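A stand-alone sketch of the inverse-distance-weighting step (not the ESRI Spatial Analyst tool, and with invented station coordinates and field values) might look like the following Python code, which estimates field strength on a regular grid from scattered observations.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical scattered stations: easting, northing (m, UTM-like) and field strength (nT).
east = rng.uniform(0, 10_000, 200)
north = rng.uniform(0, 10_000, 200)
field_nt = 52_000 + 30 * np.sin(east / 2_000) + rng.normal(0, 5, 200)

def idw(px, py, xs, ys, values, power=2.0):
    """Inverse-distance-weighted estimate at (px, py) from scattered observations."""
    d = np.hypot(xs - px, ys - py)
    if np.any(d < 1e-9):                 # query point coincides with a station
        return float(values[np.argmin(d)])
    w = 1.0 / d ** power
    return float(np.sum(w * values) / np.sum(w))

# Interpolate onto a regular grid (1 km spacing here just to keep the example small).
grid_x, grid_y = np.meshgrid(np.arange(0, 10_001, 1_000), np.arange(0, 10_001, 1_000))
grid = np.array([[idw(px, py, east, north, field_nt) for px in grid_x[0]]
                 for py in grid_y[:, 0]])
print(grid.round(1))
```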
The Pacor 2 expert system: A case-based reasoning approach to troubleshooting
NASA Technical Reports Server (NTRS)
Sary, Charisse
1994-01-01
The Packet Processor 2 (Pacor 2) Data Capture Facility (DCF) acquires, captures, and performs level-zero processing of packet telemetry for spaceflight missions that adhere to communication services recommendations established by the Consultative Committee for Space Data Systems (CCSDS). A major goal of this project is to reduce life-cycle costs. One way to achieve this goal is to increase automation. Through automation, using expert systems, and other technologies, staffing requirements will remain static, which will enable the same number of analysts to support more missions. Analysts provide packet telemetry data evaluation and analysis services for all data received. Data that passes this evaluation is forwarded to the Data Distribution Facility (DDF) and released to scientists. Through troubleshooting, data that fails this evaluation is dumped and analyzed to determine if its quality can be improved before it is released. This paper describes a proof-of-concept prototype that troubleshoots data quality problems. The Pacor 2 expert system prototype uses the case-based reasoning (CBR) approach to development, an alternative to a rule-based approach. Because Pacor 2 is not operational, the prototype has been developed using cases that describe existing troubleshooting experience from currently operating missions. Through CBR, this experience will be available to analysts when Pacor 2 becomes operational. As Pacor 2 unique experience is gained, analysts will update the case base. In essence, analysts are training the system as they learn. Once the system has learned the cases most likely to recur, it can serve as an aide to inexperienced analysts, a refresher to experienced analysts for infrequently occurring problems, or a training tool for new analysts. The Expert System Development Methodology (ESDM) is being used to guide development.
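The retrieval step of case-based reasoning can be sketched very simply: past troubleshooting cases are stored as symptom vectors with an associated resolution, and the closest case to a new problem is returned. The Python example below uses invented symptoms, diagnoses, and a crude matching score; the actual Pacor 2 prototype also covers case adaptation and the updating of the case base by analysts.

```python
# Minimal case-based-reasoning retrieval sketch for troubleshooting (all cases are invented).
cases = [
    {"symptoms": {"crc_errors": 1, "gaps_in_sequence": 0, "low_signal": 1},
     "diagnosis": "degraded downlink; request retransmission of affected packets"},
    {"symptoms": {"crc_errors": 0, "gaps_in_sequence": 1, "low_signal": 0},
     "diagnosis": "missing data units; check frame synchronizer settings"},
    {"symptoms": {"crc_errors": 1, "gaps_in_sequence": 1, "low_signal": 0},
     "diagnosis": "intermittent dropouts; review station handover log"},
]

def similarity(a, b):
    """Fraction of matching symptom flags between two symptom dictionaries."""
    keys = a.keys() | b.keys()
    return sum(a.get(k, 0) == b.get(k, 0) for k in keys) / len(keys)

def retrieve(new_symptoms):
    """Return the stored case most similar to the new problem description."""
    return max(cases, key=lambda c: similarity(c["symptoms"], new_symptoms))

problem = {"crc_errors": 1, "gaps_in_sequence": 1, "low_signal": 1}
best = retrieve(problem)
print("closest past case suggests:", best["diagnosis"])
```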
Understanding the health care business model: the financial analysts' point of view.
Bukh, Per Nikolaj; Nielsen, Christian
2010-01-01
This study focuses on how financial analysts understand the strategy of a health care company and which elements, from such a strategy perspective, they perceive as constituting the cornerstone of a health care company's business model. The empirical part of this study is based on semi-structured interviews with analysts following a large health care company listed on the Copenhagen Stock Exchange. The authors analyze how the financial analysts view strategy and value creation within the framework of a business model. Further, the authors analyze whether the characteristics emerging from a comprehensive literature review are reflected in the financial analysts' perceptions of which information is decision-relevant and important to communicate to the financial markets. Among the conclusions of the study is the importance of distinguishing between the health care companies' business model and the model by which the payment of revenues is allocated between end users and reimbursing organizations.
The analyst's authenticity: "if you see something, say something".
Goldstein, George; Suzuki, Jessica Y
2015-05-01
The history of authenticity in psychoanalysis is as old as analysis itself, but the analyst's authenticity in particular has become an increasingly important area of focus in recent decades. This article traces the development of conceptions of analytic authenticity and proposes that the analyst's spontaneous verbalization of his or her unformulated experience in session can be a potent force in the course of an analysis. We acknowledge that although analytic authenticity can be a challenging ideal for the analyst to strive for, it contains the power to transform the experience of the patient and the analyst, as well as the meaning of their work together. Whether it comes in the form of an insight-oriented comment or a simple acknowledgment of things as they seem to be, a therapist's willingness to speak aloud something that has lost its language is a powerful clinical phenomenon that transcends theoretical orientation and modality. © 2015 Wiley Periodicals, Inc.
Instruction in information structuring improves Bayesian judgment in intelligence analysts.
Mandel, David R
2015-01-01
An experiment was conducted to test the effectiveness of brief instruction in information structuring (i.e., representing and integrating information) for improving the coherence of probability judgments and binary choices among intelligence analysts. Forty-three analysts were presented with comparable sets of Bayesian judgment problems before and immediately after instruction. After instruction, analysts' probability judgments were more coherent (i.e., more additive and compliant with Bayes theorem). Instruction also improved the coherence of binary choices regarding category membership: after instruction, subjects were more likely to invariably choose the category to which they assigned the higher probability of a target's membership. The research provides a rare example of evidence-based validation of effectiveness in instruction to improve the statistical assessment skills of intelligence analysts. Such instruction could also be used to improve the assessment quality of other types of experts who are required to integrate statistical information or make probabilistic assessments.
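The coherence properties being trained here are additivity and consistency with Bayes' theorem. A minimal numeric illustration in Python follows; all probabilities are invented.

```python
# Coherence checks of the kind described above, with invented numbers.
# An analyst assesses a prior P(H), a hit rate P(E|H) and a false-alarm rate P(E|~H),
# then directly states a posterior judgment P(H|E).

p_h = 0.30                # prior probability of the hypothesis
p_e_given_h = 0.80        # likelihood of the evidence if H is true
p_e_given_not_h = 0.20    # likelihood of the evidence if H is false
stated_posterior = 0.75   # the analyst's intuitive judgment

# Bayes' theorem gives the coherent posterior.
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
bayes_posterior = p_e_given_h * p_h / p_e
print(f"coherent posterior = {bayes_posterior:.2f}, stated = {stated_posterior:.2f}")

# Additivity check: judged probabilities of H and not-H given E should sum to 1.
stated_p_not_h_given_e = 0.40
print("additive?", abs(stated_posterior + stated_p_not_h_given_e - 1.0) < 1e-9)
```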
The lure of the symptom in psychoanalytic treatment.
Ogden, Thomas H; Gabbard, Glen O
2010-06-01
Psychoanalysis, which at its core is a search for truth, stands in a subversive position vis-à-vis the contemporary therapeutic culture that places a premium on symptomatic "cure." Nevertheless, analysts are vulnerable to succumbing to the internal and external pressures for the achievement of symptomatic improvement. In this communication we trace the evolution of Freud's thinking about the relationship between the aims of psychoanalysis and the alleviation of symptoms. We note that analysts today may recapitulate Freud's early struggles in their pursuit of symptom removal. We present an account of a clinical consultation in which the analytic pair were ensnared in an impasse that involved the analyst's preoccupation with the intransigence of one of the patient's symptoms. We suggest alternative ways of working with these clinical issues and offer some thoughts on how our own work as analysts and consultants to colleagues has been influenced by our understanding of what frequently occurs when the analyst becomes symptom-focused.
Self-disclosure, trauma and the pressures on the analyst.
West, Marcus
2017-09-01
This paper argues that self-disclosure is intimately related to traumatic experience and the pressures on the analyst not to re-traumatize the patient or repeat traumatic dynamics. The paper gives a number of examples of such pressures and outlines the difficulties the analyst may experience in adopting an analytic attitude - attempting to stay as closely as possible with what the patient brings. It suggests that self-disclosure may be used to try to disconfirm the patient's negative sense of themselves or the analyst, or to try to induce a positive sense of self or of the analyst which, whilst well-meaning, may be missing the point and may be prolonging the patient's distress. Examples are given of staying with the co-construction of the traumatic early relational dynamics and thus working through the traumatic complex; this attitude is compared and contrasted with some relational psychoanalytic attitudes. © 2017, The Society of Analytical Psychology.
Hybrid methods for cybersecurity analysis :
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Warren Leon,; Dunlavy, Daniel M.
2014-01-01
Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and years to hours and days for the application of new modeling and analysis capabilities to emerging threats. The development and deployment framework has been generalized into the Hybrid Framework and incorporated into several LDRD, WFO, and DOE/CSL projects and proposals. And most importantly, the Hybrid project has provided Sandia security analysts with new, scalable, extensible analytic capabilities that have resulted in alerts not detectable using their previous workflow tool sets.
Lee, Bruce Y; Wong, Kim F; Bartsch, Sarah M; Yilmaz, S Levent; Avery, Taliser R; Brown, Shawn T; Song, Yeohan; Singh, Ashima; Kim, Diane S; Huang, Susan S
2013-06-01
As healthcare systems continue to expand and interconnect with each other through patient sharing, administrators, policy makers, infection control specialists, and other decision makers may have to take account of the entire healthcare 'ecosystem' in infection control. We developed a software tool, the Regional Healthcare Ecosystem Analyst (RHEA), that can accept user-inputted data to rapidly create a detailed agent-based simulation model (ABM) of the healthcare ecosystem (ie, all healthcare facilities, their adjoining community, and patient flow among the facilities) of any region to better understand the spread and control of infectious diseases. To demonstrate RHEA's capabilities, we fed extensive data from Orange County, California, USA, into RHEA to create an ABM of a healthcare ecosystem and simulate the spread and control of methicillin-resistant Staphylococcus aureus. Various experiments explored the effects of changing different parameters (eg, degree of transmission, length of stay, and bed capacity). Our model emphasizes how individual healthcare facilities are components of integrated and dynamic networks connected via patient movement and how occurrences in one healthcare facility may affect many other healthcare facilities. A decision maker can utilize RHEA to generate a detailed ABM of any healthcare system of interest, which in turn can serve as a virtual laboratory to test different policies and interventions.
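RHEA itself builds detailed models from real regional data, but the underlying agent-based idea can be sketched in a few lines: patients move among facilities and carry colonization with them, so occurrences in one facility propagate to others. In the Python sketch below, the facility names, transfer rates, transmission probability, and initial prevalence are all invented.

```python
import random

random.seed(0)

# Toy agent-based sketch of patients moving among healthcare facilities (all values invented).
facilities = {"Hospital_A": [], "Hospital_B": [], "NursingHome_C": []}
transfer_prob = {"Hospital_A": 0.05, "Hospital_B": 0.05, "NursingHome_C": 0.02}  # per day
transmission_prob = 0.01    # per colonized co-located patient per day

class Patient:
    def __init__(self, pid, colonized=False):
        self.pid, self.colonized = pid, colonized

# Seed 300 patients, 5% initially colonized, spread across facilities.
patients = [Patient(i, colonized=random.random() < 0.05) for i in range(300)]
for p in patients:
    facilities[random.choice(list(facilities))].append(p)

for day in range(90):
    # Within-facility transmission: risk grows with the number of colonized occupants.
    for name, occupants in facilities.items():
        colonized_here = sum(p.colonized for p in occupants)
        for p in occupants:
            if not p.colonized and random.random() < 1 - (1 - transmission_prob) ** colonized_here:
                p.colonized = True
    # Patient movement: each patient may transfer to another facility.
    for name in list(facilities):
        for p in list(facilities[name]):
            if random.random() < transfer_prob[name]:
                facilities[name].remove(p)
                dest = random.choice([f for f in facilities if f != name])
                facilities[dest].append(p)

print("colonized after 90 days:", sum(p.colonized for p in patients), "of", len(patients))
```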
NASA Astrophysics Data System (ADS)
Al-Abadi, Alaa M.; Al-Shamma'a, Ayser M.; Aljabbari, Mukdad H.
2017-03-01
In this study, intrinsic groundwater vulnerability for the shallow aquifer in northeastern Missan governorate, south of Iraq, is evaluated using the commonly used DRASTIC model in the framework of a GIS environment. Preparation of the DRASTIC parameters is achieved by gathering data from different sources including field survey, geological and meteorological data, a digital elevation model (DEM) of the study area, an archival database, and published research. The different data used to build the DRASTIC model are arranged in a geospatial database using the Spatial Analyst extension of ArcGIS 10.2 software. The obtained results related to the vulnerability to general contaminants show that the study area is characterized by two vulnerability zones: low and moderate. Ninety-four percent (94%) of the study area has a low class of groundwater vulnerability to contamination, whereas the remaining 6% has moderate vulnerability. The pesticide DRASTIC index map shows that the study area is also characterized by two zones of vulnerability: low and moderate. The DRASTIC map of this version clearly shows that a small percentage (13%) of the study area has low vulnerability to contamination, and most parts have moderate vulnerability (about 87%). The final results indicate that the aquifer system in the area of interest is relatively protected from contamination originating at the ground surface. To mitigate the contamination risks in the moderate vulnerability zones, protective measures must be put in place before exploiting the aquifer and before comprehensive agricultural activities begin in the area.
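For reference, the DRASTIC index for a raster cell is the weighted sum of seven rated parameters. The Python sketch below uses invented ratings for one hypothetical cell together with the commonly cited generic and pesticide weight sets; the study derives its own ratings from the local data described above.

```python
# DRASTIC vulnerability index for one raster cell: the weighted sum of seven rated parameters.
# Ratings are invented for illustration; the weights are the commonly cited DRASTIC weights.

ratings = {                      # each parameter rated for the cell
    "Depth_to_water": 7,
    "Recharge": 6,
    "Aquifer_media": 5,
    "Soil_media": 4,
    "Topography": 9,
    "Impact_of_vadose_zone": 5,
    "Conductivity": 3,
}

generic_weights = {"Depth_to_water": 5, "Recharge": 4, "Aquifer_media": 3, "Soil_media": 2,
                   "Topography": 1, "Impact_of_vadose_zone": 5, "Conductivity": 3}
pesticide_weights = {"Depth_to_water": 5, "Recharge": 4, "Aquifer_media": 3, "Soil_media": 5,
                     "Topography": 3, "Impact_of_vadose_zone": 4, "Conductivity": 2}

def drastic_index(ratings, weights):
    """Weighted sum of the seven DRASTIC parameter ratings."""
    return sum(ratings[p] * weights[p] for p in ratings)

print("generic DRASTIC index:  ", drastic_index(ratings, generic_weights))
print("pesticide DRASTIC index:", drastic_index(ratings, pesticide_weights))
```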
Department of the Air Force Information Technology Program FY 95 President’s Budget
1994-03-01
Budget line items (figures truncated). Description: contractor hardware maintenance support, systems analyst support, software development and maintenance, and off-the-shelf... Description: contractor hardware maintenance support, systems analyst support, operations support, configuration management, test support, and off-the-shelf software license... Description: contractor hardware maintenance support, systems analyst support, operations support, and off-the-shelf software license
Micro-based fact collection tool user's manual
NASA Technical Reports Server (NTRS)
Mayer, Richard
1988-01-01
A procedure designed for use by an analyst to assist in the collection and organization of data gathered during the interview processes associated with system analysis and modeling tasks is described. The basic concept behind the development of this tool is that during the interview process, an analyst is presented with assertions of facts by the domain expert. The analyst also makes observations of the domain. These facts need to be collected and preserved in such a way as to allow them to serve as the basis for a number of decision-making processes throughout the system development process. This tool can be thought of as a computerization of the analyst's notebook.
Nothing but the truth: self-disclosure, self-revelation, and the persona of the analyst.
Levine, Susan S
2007-01-01
The question of the analyst's self-disclosure and self-revelation inhabits every moment of every psychoanalytic treatment. All self-disclosures and revelations, however, are not equivalent, and differentiating among them allows us to define a construct that can be called the analytic persona. Analysts already rely on an unarticulated concept of an analytic persona that guides them, for instance, as they decide what constitutes appropriate boundaries. Clinical examples illustrate how self-disclosures and revelations from within and without the analytic persona feel different, for both patient and analyst. The analyst plays a specific role for each patient and is both purposefully and unconsciously different in this context than in other settings. To a great degree, the self is a relational phenomenon. Our ethics call for us to tell nothing but the truth and simultaneously for us not to tell the whole truth. The unarticulated working concept of an analytic persona that many analysts have refers to the self we step out of at the close of each session and the self we step into as the patient enters the room. Attitudes toward self-disclosure and self-revelation can be considered reflections of how we conceptualize this persona.
Analyst-centered models for systems design, analysis, and development
NASA Technical Reports Server (NTRS)
Bukley, A. P.; Pritchard, Richard H.; Burke, Steven M.; Kiss, P. A.
1988-01-01
Much has been written about the possible use of Expert Systems (ES) technology for strategic defense system applications, particularly for battle management algorithms and mission planning. It is proposed that ES (or more accurately, Knowledge Based System (KBS)) technology can be used in situations for which no human expert exists, namely to create design and analysis environments that allow an analyst to rapidly pose many different possible problem resolutions in a game-like fashion and to then work through the solution space in search of the optimal solution. Portions of such an environment exist for expensive AI hardware/software combinations such as the Xerox LOOPS and Intellicorp KEE systems. Efforts to build an analyst-centered model (ACM) for a simple missile system tradeoff study using an ES programming environment, ExperOPS5, are discussed. By analyst-centered, it is meant that the focus of learning is for the benefit of the analyst, not the model. The model's environment allows the analyst to pose a variety of "what if" questions without resorting to programming changes. Although not an ES per se, the ACM would allow for a design and analysis environment that is much superior to that of current technologies.
Fault Tree in the Trenches, A Success Story
NASA Technical Reports Server (NTRS)
Long, R. Allen; Goodson, Amanda (Technical Monitor)
2000-01-01
Getting caught up in the explanation of Fault Tree Analysis (FTA) minutiae is easy. In fact, most FTA literature tends to address FTA concepts and methodology. Yet there seem to be few articles addressing actual design changes resulting from the successful application of fault tree analysis. This paper demonstrates how fault tree analysis was used to identify and solve a potentially catastrophic mechanical problem at a rocket motor manufacturer. While developing the fault tree given in this example, the analyst was told by several organizations that the piece of equipment in question had been evaluated by several committees and organizations, and that the analyst was wasting his time. The fault tree/cutset analysis resulted in a joint redesign of the control system by the tool engineering group and the fault tree analyst, as well as bragging rights for the analyst. (That the fault tree found problems where other engineering reviews had failed was not lost on the other engineering groups.) Even more interesting was that this was the analyst's first fault tree, which further demonstrates how effective fault tree analysis can be in guiding (i.e., forcing) the analyst to take a methodical approach in evaluating complex systems.
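A minimal illustration of the cutset side of fault tree analysis, with invented basic events and probabilities rather than the rocket-motor system discussed here: the Python sketch below enumerates basic-event states for a small AND/OR tree, sums the probability of the top event, and keeps only the minimal combinations of failures that cause it.

```python
from itertools import product

# Small AND/OR fault tree over basic events; names and probabilities are invented.
basic_events = {"valve_sticks": 0.01, "sensor_fails": 0.02,
                "operator_misses_alarm": 0.05, "backup_pump_fails": 0.03}

def top_event(state):
    """TOP = (valve_sticks AND backup_pump_fails) OR (sensor_fails AND operator_misses_alarm)."""
    return (state["valve_sticks"] and state["backup_pump_fails"]) or \
           (state["sensor_fails"] and state["operator_misses_alarm"])

names = list(basic_events)
p_top = 0.0
cut_sets = []
for combo in product([False, True], repeat=len(names)):
    state = dict(zip(names, combo))
    if top_event(state):
        prob = 1.0
        for n in names:                       # probability of this exact basic-event state
            prob *= basic_events[n] if state[n] else 1.0 - basic_events[n]
        p_top += prob
        failed = {n for n in names if state[n]}
        if not any(cs <= failed for cs in cut_sets):          # keep only minimal failure sets
            cut_sets = [cs for cs in cut_sets if not failed <= cs] + [failed]

print(f"top event probability ~ {p_top:.5f}")
print("minimal cut sets:", cut_sets)
```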
NASA Astrophysics Data System (ADS)
Davenport, Jack H.
2016-05-01
Intelligence analysts demand rapid information fusion capabilities to develop and maintain accurate situational awareness and understanding of dynamic enemy threats in asymmetric military operations. The ability to extract relationships between people, groups, and locations from a variety of text datasets is critical to proactive decision making. The derived network of entities must be automatically created and presented to analysts to assist in decision making. DECISIVE ANALYTICS Corporation (DAC) provides capabilities to automatically extract entities, relationships between entities, semantic concepts about entities, and network models of entities from text and multi-source datasets. DAC's Natural Language Processing (NLP) Entity Analytics model entities as complex systems of attributes and interrelationships which are extracted from unstructured text via NLP algorithms. The extracted entities are automatically disambiguated via machine learning algorithms, and resolution recommendations are presented to the analyst for validation; the analyst's expertise is leveraged in this hybrid human/computer collaborative model. Military capability is enhanced by these NLP Entity Analytics because analysts can now create/update an entity profile with intelligence automatically extracted from unstructured text, thereby fusing entity knowledge from structured and unstructured data sources. Operational and sustainment costs are reduced since analysts do not have to manually tag and resolve entities.
Homeland security application of the Army Soft Target Exploitation and Fusion (STEF) system
NASA Astrophysics Data System (ADS)
Antony, Richard T.; Karakowski, Joseph A.
2010-04-01
A fusion system that accommodates both text-based extracted information along with more conventional sensor-derived input has been developed and demonstrated in a terrorist attack scenario as part of the Empire Challenge (EC) 09 Exercise. Although the fusion system was developed to support Army military analysts, the system, based on a set of foundational fusion principles, has direct applicability to department of homeland security (DHS) & defense, law enforcement, and other applications. Several novel fusion technologies and applications were demonstrated in EC09. One such technology is location normalization that accommodates both fuzzy semantic expressions such as behind Library A, across the street from the market place, as well as traditional spatial representations. Additionally, the fusion system provides a range of fusion products not supported by traditional fusion algorithms. Many of these additional capabilities have direct applicability to DHS. A formal test of the fusion system was performed during the EC09 exercise. The system demonstrated that it was able to (1) automatically form tracks, (2) help analysts visualize behavior of individuals over time, (3) link key individuals based on both explicit message-based information as well as discovered (fusion-derived) implicit relationships, and (4) suggest possible individuals of interest based on their association with High Value Individuals (HVI) and user-defined key locations.
Thode, Aaron M; Kim, Katherine H; Blackwell, Susanna B; Greene, Charles R; Nations, Christopher S; McDonald, Trent L; Macrander, A Michael
2012-05-01
An automated procedure has been developed for detecting and localizing frequency-modulated bowhead whale sounds in the presence of seismic airgun surveys. The procedure was applied to four years of data, collected from over 30 directional autonomous recording packages deployed over a 280 km span of continental shelf in the Alaskan Beaufort Sea. The procedure has six sequential stages that begin by extracting 25-element feature vectors from spectrograms of potential call candidates. Two cascaded neural networks then classify some feature vectors as bowhead calls, and the procedure then matches calls between recorders to triangulate locations. To train the networks, manual analysts flagged 219 471 bowhead call examples from 2008 and 2009. Manual analyses were also used to identify 1.17 million transient signals that were not whale calls. The network output thresholds were adjusted to reject 20% of whale calls in the training data. Validation runs using 2007 and 2010 data found that the procedure missed 30%-40% of manually detected calls. Furthermore, 20%-40% of the sounds flagged as calls are not present in the manual analyses; however, these extra detections incorporate legitimate whale calls overlooked by human analysts. Both manual and automated methods produce similar spatial and temporal call distributions.
Better Incident Response with SCOT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruner, Todd
2015-04-01
SCOT is an incident response management system and knowledge base designed for incident responders by incident responders. SCOT increases the effectiveness of the team without adding undue burdens. Focused on reducing the friction between analysts and their tools, SCOT enables analysts to document and share their research and response efforts in near real time. Automatically identifying indicators and correlating those indicators, SCOT helps analysts discover and respond to advanced threats.
NASA Astrophysics Data System (ADS)
Bianchetti, Raechel Anne
Remotely sensed images have become a ubiquitous part of our daily lives. From novice users, aiding in search and rescue missions using tools such as TomNod, to trained analysts, synthesizing disparate data to address complex problems like climate change, imagery has become central to geospatial problem solving. Expert image analysts are continually faced with rapidly developing sensor technologies and software systems. In response to these cognitively demanding environments, expert analysts develop specialized knowledge and analytic skills to address increasingly complex problems. This study identifies the knowledge, skills, and analytic goals of expert image analysts tasked with identification of land cover and land use change. Analysts participating in this research are currently working as part of a national level analysis of land use change, and are well versed with the use of TimeSync, forest science, and image analysis. The results of this study benefit current analysts as it improves their awareness of their mental processes used during the image interpretation process. The study also can be generalized to understand the types of knowledge and visual cues that analysts use when reasoning with imagery for purposes beyond land use change studies. Here a Cognitive Task Analysis framework is used to organize evidence from qualitative knowledge elicitation methods for characterizing the cognitive aspects of the TimeSync image analysis process. Using a combination of content analysis, diagramming, semi-structured interviews, and observation, the study highlights the perceptual and cognitive elements of expert remote sensing interpretation. Results show that image analysts perform several standard cognitive processes, but flexibly employ these processes in response to various contextual cues. Expert image analysts' ability to think flexibly during their analysis process was directly related to their amount of image analysis experience. Additionally, results show that the basic Image Interpretation Elements continue to be important despite technological augmentation of the interpretation process. These results are used to derive a set of design guidelines for developing geovisual analytic tools and training to support image analysis.
Jardine, Andrew; Mullan, Narelle; Gudes, Ori; Cosford, James; Moncrieff, Simon; West, Geoff; Xiao, Jianguo; Yun, Grace; Someford, Peter
Place is of critical importance to health as it can reveal patterns of disease spread and clustering, associations with risk factors, and areas with greatest need for, or least access to, healthcare services and promotion activities. Furthermore, in order to get a good understanding of the health status and needs of a particular area, a broad range of data are required, which can often be difficult and time consuming to obtain and collate. This process has been expedited by bringing together multiple data sources and making them available in an online geo-visualisation, HealthTracks, which consists of a mapping and reporting component. The overall aim of the HealthTracks project is to make spatial health information more accessible to policymakers, analysts, planners and program managers to inform decision-making across the Department of Health Western Australia. Preliminary mapping and reporting applications were developed and have been used to inform service planning, increase awareness of the utility of spatial information, and improve efficiency in data access. The future for HealthTracks involves expanding the range of data available and developing new analytical capabilities in order to work towards providing external agencies, researchers and eventually the general public access to rich local area spatial data.
Problems of internalization: a button is a button is-not.
Rockwell, Shelley
2014-01-01
Analysts hope to help the patient internalize a relationship with the analyst that contrasts with the original archaic object relation. In this paper, the author describes particular difficulties in working with a patient whose defenses and anxieties were bulimic, her movement toward internalization inevitably undone. Several issues are considered: how does the nonsymbolizing patient come to internalize the analyst's understanding, and when this does not hold, what is the nature of the patient's subsequent methods of dispersal? When the patient can maintain connection to the analyst as a good object, even fleetingly, in the depressive position, the possibility of internalization and symbolic communication is increased. © 2014 The Psychoanalytic Quarterly, Inc.
Telling about the analyst's pregnancy.
Uyehara, L A; Austrian, S; Upton, L G; Warner, R H; Williamson, R A
1995-01-01
Pregnancy is one of several events in the life of an analyst which may affect an analysis, calling for special technical considerations. For the analyst, this exception to the tenet of anonymity, along with countertransference guilt, narcissistic preoccupation, heightened infantile conflicts, and intense patient responses, may stimulate anxiety that becomes focused on the timing and manner of informing the patient. For the patient, preoccupation with the timing of the telling may serve as a displacement from other meanings of the pregnancy. Candidate analysts may face particular difficulties managing the impact of their pregnancies on control cases. We address practical and technical considerations in telling, the transference and counter-transference surrounding it, ethical concerns, and the challenges of supervising a pregnant candidate.
USDA analyst review of the LACIE IMAGE-100 hybrid system test
NASA Technical Reports Server (NTRS)
Ashburn, P.; Buelow, K.; Hansen, H. L.; May, G. A. (Principal Investigator)
1979-01-01
Fifty operational segments from the U.S.S.R., 40 test segments from Canada, and 24 test segments from the United States were used to provide a wide range of geographic conditions for USDA analysts during a test to determine the effectiveness of labeling single pixel training fields (dots) using Procedure 1 on the I-100 hybrid system, and clustering and classifying on the Earth Resources Interactive Processing System. The analysts had additional on-line capabilities such as interactive dot labeling, class or cluster map overlay flickers, and flashing of all dots of equal spectral value. Results on the I-100 hybrid system are described and analyst problems and recommendations are discussed.
The Generic Spacecraft Analyst Assistant (gensaa): a Tool for Developing Graphical Expert Systems
NASA Technical Reports Server (NTRS)
Hughes, Peter M.
1993-01-01
During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real-time data. The analysts must watch for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As the satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable performance levels. At NASA GSFC, fault-isolation expert systems are in operation supporting this data monitoring task. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will readily support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industry.
2016-11-01
Evaluation of Visualization Tools for Computer Network Defense Analysts: Display Design, Methods, and Results for a User Study, by Christopher J Garneau and Robert F Erbacher. US Army Research Laboratory, November 2016; period covered January 2013 to September 2015. Approved for public release.
AN ANALYST'S UNCERTAINTY AND FEAR.
Chused, Judith Fingert
2016-10-01
The motivations for choosing psychoanalysis as a profession are many and differ depending on the psychology of the analyst. However, common to most psychoanalysts is the desire to forge a helpful relationship with the individuals with whom they work therapeutically. This article presents an example of what happens when an analyst is confronted by a patient for whom being in a relationship and being helped are intolerable. © 2016 The Psychoanalytic Quarterly, Inc.
Mander, Luke; Baker, Sarah J.; Belcher, Claire M.; Haselhorst, Derek S.; Rodriguez, Jacklyn; Thorn, Jessica L.; Tiwari, Shivangi; Urrego, Dunia H.; Wesseln, Cassandra J.; Punyasena, Surangi W.
2014-01-01
• Premise of the study: Humans frequently identify pollen grains at a taxonomic rank above species. Grass pollen is a classic case of this situation, which has led to the development of computational methods for identifying grass pollen species. This paper aims to provide context for these computational methods by quantifying the accuracy and consistency of human identification. • Methods: We measured the ability of nine human analysts to identify 12 species of grass pollen using scanning electron microscopy images. These are the same images that were used in computational identifications. We have measured the coverage, accuracy, and consistency of each analyst, and investigated their ability to recognize duplicate images. • Results: Coverage ranged from 87.5% to 100%. Mean identification accuracy ranged from 46.67% to 87.5%. The identification consistency of each analyst ranged from 32.5% to 87.5%, and each of the nine analysts produced considerably different identification schemes. The proportion of duplicate image pairs that were missed ranged from 6.25% to 58.33%. • Discussion: The identification errors made by each analyst, which result in a decline in accuracy and consistency, are likely related to psychological factors such as the limited capacity of human memory, fatigue and boredom, recency effects, and positivity bias. PMID:25202649
ATTDES: An Expert System for Satellite Attitude Determination and Control. 2
NASA Technical Reports Server (NTRS)
Mackison, Donald L.; Gifford, Kevin
1996-01-01
The design, analysis, and flight operations of satellite attitude determination and attitude control systems require extensive mathematical formulations, optimization studies, and computer simulation. This is best done by an analyst with extensive education and experience. The development of programs such as ATTDES permits the use of advanced techniques by those with less experience. Typical tasks include the mission analysis to select stabilization and damping schemes, attitude determination sensors and algorithms, and control system designs to meet program requirements. ATTDES is a system that includes all of these activities, including high fidelity orbit environment models that can be used for preliminary analysis, parameter selection, stabilization schemes, the development of estimators, covariance analyses, and optimization, and can support ongoing orbit activities. The modification of existing simulations to model new configurations for these purposes can be an expensive, time-consuming activity that becomes a pacing item in the development and operation of such new systems. The use of an integrated tool such as ATTDES significantly reduces the effort and time required for these tasks.
A note on notes: note taking and containment.
Levine, Howard B
2007-07-01
In extreme situations of massive projective identification, both the analyst and the patient may come to share a fantasy or belief that his or her own psychic reality will be annihilated if the psychic reality of the other is accepted or adopted (Britton 1998). In the example of Dr. M and his patient, the paradoxical dilemma around note taking had highly specific transference meanings; it was not simply an instance of the generalized human response of distracted attention that Freud (1912) had spoken of, nor was it the destabilization of analytic functioning that I tried to describe in my work with Mr. L. Whether such meanings will always exist in these situations remains a matter to be determined by further clinical experience. In reopening a dialogue about note taking during sessions, I have attempted to move the discussion away from categorical injunctions about what analysis should or should not do, and instead to foster a more nuanced, dynamic, and pair-specific consideration of the analyst's functioning in the immediate context of the analytic relationship. There is, of course, a wide variety of listening styles among analysts, and each analyst's mental functioning may be affected differently by each patient whom the analyst sees. I have raised many questions in the hopes of stimulating an expanded discussion that will allow us to share our experiences and perhaps reach additional conclusions. Further consideration may lead us to decide whether note taking may have very different meanings for other analysts and analyst-patient pairs, and whether it may serve useful functions in addition to the one that I have described.
A psychoanalytical phenomenology of perversion.
Jiménez, Juan Pablo
2004-02-01
After stating that the current tasks of psychoanalytic research should fundamentally include the exploration of the analyst's mental processes in sessions with the patient, the author describes the analytical relation as one having an intersubjective nature. Seen from the outside, the analytical relation evidences two poles: a symmetric structural pole where both analyst and patient share a single world and a single approach to reality, and a functional asymmetric pole that defines the assignment of the respective roles. In the analysis of a perverse patient, the symmetry-asymmetry polarities acquire some very particular characteristics. Seen from the perspective of the analyst's subjectivity, perversion appears in the analyst's mind as a surreptitious and unexpected transgression of the basic agreement that facilitates and structures intersubjective encounters. It may go as far as altering the Aristotelian rules of logic. When coming into contact with the psychic reality of a perverse patient, what happens in the analyst's mind is that a world takes shape. This world is misleadingly coloured by an erotisation that sooner or later will acquire some characteristics of violence. The perverse nucleus, as a false reality, remains dangling in mid-air as an experience that is inaccessible to the analyst's empathy. The only way the analyst can reach it is from the 'periphery' of the patient's psychic reality, by trying in an indirect way to lead him back to his intersubjective roots. At this point, the author's intention is to explain this intersubjective phenomenon in terms of metapsychological and empirical research-based theories. Finally, some ideas on the psychogenesis of perversion are set forth.
COMET-AR User's Manual: COmputational MEchanics Testbed with Adaptive Refinement
NASA Technical Reports Server (NTRS)
Moas, E. (Editor)
1997-01-01
The COMET-AR User's Manual provides a reference manual for the Computational Structural Mechanics Testbed with Adaptive Refinement (COMET-AR), a software system developed jointly by Lockheed Palo Alto Research Laboratory and NASA Langley Research Center under contract NAS1-18444. The COMET-AR system is an extended version of an earlier finite element based structural analysis system called COMET, also developed by Lockheed and NASA. The primary extensions are the adaptive mesh refinement capabilities and a new "object-like" database interface that makes COMET-AR easier to extend further. This User's Manual provides a detailed description of the user interface to COMET-AR from the viewpoint of a structural analyst.
Chang, Jeff; Ip, Matthew; Yang, Michael; Wong, Brendon; Power, Theresa; Lin, Lisa; Xuan, Wei; Phan, Tri Giang; Leong, Rupert W
2016-04-01
Confocal laser endomicroscopy can dynamically assess intestinal mucosal barrier defects and increased intestinal permeability (IP). These are functional features that do not have corresponding appearance on histopathology. As such, previous pathology training may not be beneficial in learning these dynamic features. This study aims to evaluate the diagnostic accuracy, learning curve, inter- and intraobserver agreement for identifying features of increased IP in experienced and inexperienced analysts and pathologists. A total of 180 endoscopic confocal laser endomicroscopy (Pentax EC-3870FK; Pentax, Tokyo, Japan) images of the terminal ileum, subdivided into 6 sets of 30, were evaluated by 6 experienced analysts, 13 inexperienced analysts, and 2 pathologists, after a 30-minute teaching session. Cell-junction enhancement, fluorescein leak, and cell dropout were used to represent increased IP and were either present or absent in each image. For each image, the diagnostic accuracy, confidence, and quality were assessed. Diagnostic accuracy was significantly higher for experienced analysts compared with inexperienced analysts from the first set (96.7% vs 83.1%, P < .001) to the third set (95% vs 89.7%, P = .127). No differences in accuracy were noted between inexperienced analysts and pathologists. Confidence (odds ratio, 8.71; 95% confidence interval, 5.58-13.57) and good image quality (odds ratio, 1.58; 95% confidence interval, 1.22-2.03) were associated with improved interpretation. Interobserver agreement κ values were high and improved with experience (experienced analysts, 0.83; inexperienced analysts, 0.73; and pathologists, 0.62). Intraobserver agreement was >0.86 for experienced observers. Features representative of increased IP can be rapidly learned with high inter- and intraobserver agreement. Confidence and image quality were significant predictors of accurate interpretation. Previous pathology training did not have an effect on learning. Copyright © 2016 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.
Netzel, Pawel
2017-01-01
The United States is increasingly becoming a multi-racial society. To understand the multiple consequences of this overall trend for our neighborhoods, we need a methodology capable of spatio-temporal analysis of racial diversity at the local level but also across the entire U.S. Furthermore, such a methodology should be accessible to stakeholders ranging from analysts to decision makers. In this paper we present a comprehensive framework for visualizing and analyzing diversity data that fulfills such requirements. The first component of our framework is a U.S.-wide, multi-year database of race sub-population grids which is freely available for download. These 30 m resolution grids have been developed using dasymetric modeling and are available for 1990, 2000, and 2010. We summarize numerous advantages of gridded population data over commonly used Census tract-aggregated data. Using these grids frees analysts from constructing their own and allows them to focus on diversity analysis. The second component of our framework is a set of U.S.-wide, multi-year diversity maps at 30 m resolution. A diversity map is our product that classifies the gridded population into 39 communities based on their degrees of diversity, dominant race, and population density. It provides spatial information on diversity in a single, easy-to-understand map that can be utilized by analysts and end users alike. Maps based on subsequent Censuses provide information about spatio-temporal dynamics of diversity. Diversity maps are accessible through the GeoWeb application SocScape (http://sil.uc.edu/webapps/socscape_usa/) for an immediate online exploration. The third component of our framework is a proposal to quantitatively analyze diversity maps using a set of landscape metrics. Because of its form, a grid-based diversity map could be thought of as a diversity “landscape” and analyzed quantitatively using landscape metrics. We give a brief summary of most pertinent metrics and demonstrate how they can be applied to diversity maps. PMID:28358862
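As an illustration of the kind of grid-based diversity calculation the framework describes, the sketch below computes a Shannon diversity index per cell from stacked sub-population counts. It is a minimal example with made-up data and a generic index, not the 39-class community scheme or the SocScape processing chain itself.

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon entropy of sub-population counts for one grid cell.

    counts: 1-D array of sub-population counts; returns 0.0 for empty cells.
    """
    total = counts.sum()
    if total == 0:
        return 0.0
    p = counts[counts > 0] / total
    return float(-(p * np.log(p)).sum())

# Hypothetical 3 x 3 neighbourhood with four sub-population bands stacked on axis 0.
grid = np.random.default_rng(0).integers(0, 50, size=(4, 3, 3))
diversity = np.apply_along_axis(shannon_diversity, 0, grid)
print(np.round(diversity, 3))
```

Landscape metrics such as patch richness or contagion could then be computed over the resulting diversity surface in the same gridded form.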
MAGIC Computer Simulation. Volume 2: Analyst Manual, Part 1
1971-05-01
A review of the subject MAGIC Computer Simulation User and Analyst Manuals has been conducted based upon a request received from the US Army. The MAGIC computer simulation generates target description data consisting of item-by-item listings of the target's components and air…
2013-11-01
…by existing cyber-attack detection tools far exceeds the analysts' cognitive capabilities. Grounded in perceptual and cognitive theory, many visual… Inspired by the sense-making theory discussed earlier, we model the analytical reasoning process of cyber analysts using three key… analyst are called “working hypotheses”); each hypothesis could trigger further actions to confirm or disconfirm it. New actions will lead to new…
Evaluating the O*NET Occupational Analysis System for Army Competency Development
2008-07-01
…Experts (SMEs) and collecting ability and skill ratings using trained analysts. The results showed that Army SMEs as well as other types of analysts could… SMEs were non-commissioned officers (NCOs) or officers with several years of experience in the Army and their occupations, and…
Artman-Meeker, Kathleen; Rosenberg, Nancy; Badgett, Natalie; Yang, Xueyan; Penney, Ashley
2017-09-01
Behavior analysts play an important role in supporting the behavior and learning of young children with disabilities in natural settings. However, there is very little research related specifically to developing the skills and competencies needed by pre-service behavior analysts. This study examined the effects of "bug-in-ear" (BIE) coaching on pre-service behavior analysts' implementation of functional communication training with pre-school children with autism in their classrooms. BIE coaching was associated with increases in the rate of functional communication training trials each intern initiated per session and in the fidelity with which interns implemented functional communication training. Adults created more intentional opportunities for children to communicate, and adults provided more systematic instruction around those opportunities.
Physics-based and human-derived information fusion for analysts
NASA Astrophysics Data System (ADS)
Blasch, Erik; Nagy, James; Scott, Steve; Okoth, Joshua; Hinman, Michael
2017-05-01
Recent trends in physics-based and human-derived information fusion (PHIF) have amplified the capabilities of analysts; however, with the big data opportunities, there is a need for open architecture designs, methods of distributed team collaboration, and visualizations. In this paper, we explore recent trends in information fusion to support user interaction and machine analytics. Challenging scenarios requiring PHIF include combining physics-based video data with human-derived text data for enhanced simultaneous tracking and identification. A driving effort would be to provide analysts with applications, tools, and interfaces that afford effective and affordable solutions for timely decision making. Fusion at scale should be developed to allow analysts to access data, call analytics routines, enter solutions, update models, and store results for distributed decision making.
NASA Technical Reports Server (NTRS)
Mcmurtry, G. J.; Petersen, G. W. (Principal Investigator)
1975-01-01
The author has identified the following significant results. It was found that the high speed man machine interaction capability is a distinct advantage of the Image 100; however, the small size of the digital computer in the system is a definite limitation. The system can be highly useful in an analysis mode in which it complements a large general purpose computer. The Image 100 was found to be extremely valuable in the analysis of aircraft MSS data where the spatial resolution begins to approach photographic quality and the analyst can exercise interpretation judgements and readily interact with the machine.
Gis-Based Accessibility Analysis of Urban Emergency Shelters: the Case of Adana City
NASA Astrophysics Data System (ADS)
Unal, M.; Uslu, C.
2016-10-01
Accessibility analysis of urban emergency shelters can help support urban disaster prevention planning. Pre-disaster emergency evacuation zoning has become a significant topic in disaster prevention and mitigation research. In this study, we assessed the serviceability of urban emergency shelters in terms of maximum capacity, usability, sufficiency, and a defined walking-time limit by employing the spatial analysis techniques of GIS Network Analyst. The methodology included the following aspects: the distribution analysis of emergency evacuation demands, the calculation of shelter space accessibility, and the optimization of evacuation destinations. This methodology was applied to Adana, a city in Turkey located within the Alpine-Himalayan orogenic system, the second major earthquake belt after the Pacific belt. It was found that the proposed methodology could help in understanding the spatial distribution of urban emergency shelters more accurately and in establishing effective future urban disaster prevention planning. Additionally, this research provides a feasible way to support emergency management in terms of shelter construction, pre-disaster evacuation drills, and rescue operations.
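A minimal sketch of the kind of walking-time accessibility query described above, written in Python with networkx on a toy street graph; the node names, travel times, walking-time limit, and shelter locations are invented for illustration and are not the Adana network or the ArcGIS Network Analyst workflow used in the study.

```python
import networkx as nx

# Assumed pedestrian network: edges carry walking time in minutes.
G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 4), ("B", "C", 6), ("C", "D", 3),
    ("B", "E", 7), ("E", "F", 5),
], weight="time")
shelters = {"D", "F"}   # hypothetical shelter nodes
limit = 15              # walking-time threshold in minutes

for node in sorted(G.nodes):
    # Shortest walking times from this demand point, truncated at the limit.
    times = nx.single_source_dijkstra_path_length(G, node, cutoff=limit, weight="time")
    reachable = sorted(s for s in shelters if s in times)
    print(f"{node}: shelters within {limit} min -> {reachable or 'none'}")
```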
Ristić, Vladica; Maksin, Marija; Nenković-Riznić, Marina; Basarić, Jelena
2018-01-15
The process of making decisions on sustainable development and construction begins in spatial and urban planning when defining the suitability of using land for sustainable construction in a protected area (PA) and its immediate and regional surroundings. The aim of this research is to propose and assess a model for evaluating land-use suitability for sustainable construction in a PA and its surroundings. The methodological approach of Multi-Criteria Decision Analysis was used in the formation of this model and adapted for the research; it was combined with the adapted Analytical Hierarchy Process and the Delphi process, and supported by a geographical information system (GIS) within the framework of ESRI ArcGIS software - Spatial Analyst. The model is applied to the case study of Sara mountain National Park in Kosovo. The result of the model is a "map of integrated assessment of land-use suitability for sustainable construction in a PA for the natural factor". Copyright © 2017 Elsevier Ltd. All rights reserved.
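For readers unfamiliar with the Analytical Hierarchy Process step mentioned above, the sketch below derives criterion weights and a consistency ratio from a pairwise comparison matrix using the principal eigenvector; the matrix values are illustrative assumptions, not the judgments elicited for the Sara mountain case study.

```python
import numpy as np

# Assumed 3-criterion pairwise comparison matrix (Saaty scale, illustrative only).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                             # criterion weights summing to 1

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
cr = ci / 0.58                              # Saaty's random index for n = 3
print("weights:", np.round(w, 3), "CR:", round(cr, 3))
```

The resulting weights would then multiply the reclassified criterion rasters (e.g., in Spatial Analyst map algebra) to form the suitability surface.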
NASA Astrophysics Data System (ADS)
Munawar, Iqra
2016-07-01
Crime mapping is a dynamic process. It can be used to assist all stages of the problem-solving process. Mapping crime can help police protect citizens more effectively. The decision to utilize a certain type of map or design element may change based on the purpose of the map, the audience, or the available data. If the purpose of a crime analysis map is to assist in the identification of a particular problem, selected data may be mapped to identify patterns of activity that have previously gone undetected. The main objective of this research was to study the spatial distribution patterns of four common crimes, i.e., narcotics, arms, burglary, and robbery, in Gujranwala City using spatial statistical techniques to identify hotspots. Hotspots, or locations of clusters, were identified using the Getis-Ord Gi* statistic. Crime analysis mapping can be used to conduct a comprehensive spatial analysis of the problem. Graphic presentations of such findings provide a powerful medium to communicate conditions, patterns, and trends, thus creating an avenue for analysts to bring about significant policy changes. Moreover, crime mapping also helps in the reduction of the crime rate.
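The hotspot step relies on the Getis-Ord Gi* statistic; a compact NumPy sketch of that statistic is shown below with a hypothetical four-cell example. Real analyses would typically use a GIS or a library such as PySAL, and the weights and incident counts here are invented.

```python
import numpy as np

def getis_ord_gi_star(x, w):
    """Gi* z-scores for every location.

    x: 1-D array of incident counts per areal unit.
    w: binary spatial-weights matrix including the self-neighbourhood (w[i, i] = 1).
    Large positive values flag hot spots; large negative values flag cold spots.
    """
    n = x.size
    x_bar = x.mean()
    s = np.sqrt((x ** 2).mean() - x_bar ** 2)
    wx = w @ x                         # local weighted sums
    w_sum = w.sum(axis=1)
    w_sq = (w ** 2).sum(axis=1)
    num = wx - x_bar * w_sum
    den = s * np.sqrt((n * w_sq - w_sum ** 2) / (n - 1))
    return num / den

# Hypothetical 4-cell example with rook contiguity plus self-neighbourhood.
x = np.array([12.0, 3.0, 2.0, 1.0])
w = np.array([[1, 1, 1, 0],
              [1, 1, 0, 1],
              [1, 0, 1, 1],
              [0, 1, 1, 1]], dtype=float)
print(np.round(getis_ord_gi_star(x, w), 2))
```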
Multivariate Statistics Applied to Seismic Phase Picking
NASA Astrophysics Data System (ADS)
Velasco, A. A.; Zeiler, C. P.; Anderson, D.; Pingitore, N. E.
2008-12-01
The initial effort of the Seismogram Picking Error from Analyst Review (SPEAR) project has been to establish a common set of seismograms to be picked by the seismological community. Currently we have 13 analysts from 4 institutions who have provided picks on the set of 26 seismograms. In comparing the picks thus far, we have identified consistent biases between picks from different institutions, effects of the experience of analysts, and the impact of signal-to-noise ratio on picks. The institutional bias in picks raises the important concern that picks will not be the same between different catalogs. This difference means less precision and accuracy when combining picks from multiple institutions. We also note that, depending on the experience level of the analyst making picks for a catalog, the error could fluctuate dramatically. However, experience level is based on the number of years spent picking seismograms, and this may not be an appropriate criterion for determining an analyst's precision. The common data set of seismograms provides a means to test an analyst's level of precision and biases. The analyst is also limited by the quality of the signal, and we show that the signal-to-noise ratio and pick error are correlated with the location, size, and distance of the event. This makes the standard estimate of picking error based on SNR more complex because additional constraints are needed to accurately constrain the measurement error. We propose to extend the current measurement of error by adding the additional constraints of institutional bias and event characteristics to the standard SNR measurement. We use multivariate statistics to model the data and provide constraints to accurately assess earthquake location and measurement errors.
Station Set Residual: Event Classification Using Historical Distribution of Observing Stations
NASA Astrophysics Data System (ADS)
Procopio, Mike; Lewis, Jennifer; Young, Chris
2010-05-01
Analysts working at the International Data Centre in support of treaty monitoring through the Comprehensive Nuclear-Test-Ban Treaty Organization spend a significant amount of time reviewing hypothesized seismic events produced by an automatic processing system. When reviewing these events to determine their legitimacy, analysts take a variety of approaches that rely heavily on training and past experience. One method used by analysts to gauge the validity of an event involves examining the set of stations involved in the detection of an event. In particular, leveraging past experience, an analyst can say that an event located in a certain part of the world is expected to be detected by Stations A, B, and C. Implicit in this statement is that such an event would usually not be detected by Stations X, Y, or Z. For some well understood parts of the world, the absence of one or more "expected" stations—or the presence of one or more "unexpected" stations—is correlated with a hypothesized event's legitimacy and to its survival to the event bulletin. The primary objective of this research is to formalize and quantify the difference between the observed set of stations detecting some hypothesized event, versus the expected set of stations historically associated with detecting similar nearby events close in magnitude. This Station Set Residual can be quantified in many ways, some of which are correlated with the analysts' determination of whether or not the event is valid. We propose that this Station Set Residual score can be used to screen out certain classes of "false" events produced by automatic processing with a high degree of confidence, reducing the analyst burden. Moreover, we propose that the visualization of the historically expected distribution of detecting stations can be immediately useful as an analyst aid during their review process.
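The abstract notes that the Station Set Residual "can be quantified in many ways"; the sketch below shows one simple possibility, scoring a hypothesized event against historical per-station detection rates for similar nearby events. The station names, rates, and scoring rule are assumptions for illustration, not the authors' formulation.

```python
# Hypothetical detection rates for one region/magnitude bin.
historical_rate = {"A": 0.95, "B": 0.90, "C": 0.80, "X": 0.05, "Y": 0.02}
observed = {"A", "C", "Y"}   # stations that detected the hypothesized event

def station_set_residual(observed, historical_rate):
    """Penalise missing 'expected' stations and present 'unexpected' ones."""
    score = 0.0
    for station, p in historical_rate.items():
        if station in observed:
            score += 1.0 - p   # surprise of seeing a rarely-detecting station
        else:
            score += p         # surprise of missing a usually-detecting station
    return score

print(round(station_set_residual(observed, historical_rate), 2))
```

A high score under such a scheme would mark the event for closer analyst review or for screening, subject to a tuned threshold.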
2016-01-01
…of data science within DIA and ensure the activities assist and inform DIA's decisionmakers, analysts, and operators. The research addressed two key… by an analyst or researcher. This type of identification can be time-consuming and potentially full of errors. GENIE learns from… analysts. The protocol can be found in Appendix A. The protocol was intended to elicit information in five broad research areas. First, we asked a…
Grand Strategy: Contending Contemporary Analyst Views and Implications for the U.S. Navy
2011-11-01
Elbridge Colby, CRM D0025423.A2/Final, November 2011. …implications for the country, the U.S. armed forces, and the U.S. Navy. Two other categories—isolationism (an oft-mentioned contender in political…
Proactive human-computer collaboration for information discovery
NASA Astrophysics Data System (ADS)
DiBona, Phil; Shilliday, Andrew; Barry, Kevin
2016-05-01
Lockheed Martin Advanced Technology Laboratories (LM ATL) is researching methods, representations, and processes for human/autonomy collaboration to scale analysis and hypothesis substantiation for intelligence analysts. This research establishes a machine-readable hypothesis representation that is commonsensical to the human analyst. The representation unifies context between the human and computer, enabling autonomy, in the form of analytic software, to support the analyst by proactively acquiring, assessing, and organizing high-value information that is needed to inform and substantiate hypotheses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.
Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations, from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigation are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine data mapping for gaps where the threat model is under-supported by either data or tools. We discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.
The reality of the other: dreaming of the analyst.
Ferruta, Anna
2009-02-01
The author discusses the obstacles to symbolization encountered when the analyst appears in the first dream of an analysis: the reality of the other is represented through the seeming recognition of the person of the analyst, who is portrayed in undisguised form. The interpretation of this first dream gives rise to reflections on the meaning of the other's reality in analysis: precisely this realistic representation indicates that the function of the other in the construction of the psychic world has been abolished. An analogous phenomenon is observed in the countertransference, as the analyst's mental processes are occluded by an exclusively self-generated interpretation of the patient's psychic world. For the analyst too, the reality of the other proves not to play a significant part in the construction of her interpretation. A 'turning-point' dream after five years bears witness to the power of the transforming function performed by the other throughout the analysis, by way of the representation of characters who stand for the necessary presence of a third party in the construction of a personal psychic reality. The author examines the mutual denial of the other's otherness, as expressed by the vicissitudes of the transference and countertransference between analyst and patient, otherness being experienced as a disturbance of self-sufficient narcissistic functioning. The paper ends with an analysis of the transformations that took place in the analytic relationship.
Gabbard, Glen O; Ogden, Thomas H
2009-04-01
One has the opportunity and responsibility to become an analyst in one's own terms in the course of the years of practice that follow the completion of formal analytic training. The authors discuss their understanding of some of the maturational experiences that have contributed to their becoming analysts in their own terms. They believe that the most important element in the process of their maturation as analysts has been the development of the capacity to make use of what is unique and idiosyncratic to each of them; each, when at his best, conducts himself as an analyst in a way that reflects his own analytic style; his own way of being with, and talking with, his patients; his own form of the practice of psychoanalysis. The types of maturational experiences that the authors examine include situations in which they have learned to listen to themselves speak with their patients and, in so doing, begin to develop a voice of their own; experiences of growth that have occurred in the context of presenting clinical material to a consultant; making self-analytic use of their experience with their patients; creating/discovering themselves as analysts in the experience of analytic writing (with particular attention paid to the maturational experience involved in writing the current paper); and responding to a need to keep changing, to be original in their thinking and behavior as analysts.
Neurotechnology for intelligence analysts
NASA Astrophysics Data System (ADS)
Kruse, Amy A.; Boyd, Karen C.; Schulman, Joshua J.
2006-05-01
Geospatial Intelligence Analysts are currently faced with an enormous volume of imagery, only a fraction of which can be processed or reviewed in a timely operational manner. Computer-based target detection efforts have failed to yield the speed, flexibility and accuracy of the human visual system. Rather than focus solely on artificial systems, we hypothesize that the human visual system is still the best target detection apparatus currently in use, and with the addition of neuroscience-based measurement capabilities it can surpass the throughput of the unaided human severalfold. Using electroencephalography (EEG), Thorpe et al. [1] described a fast signal in the brain associated with the early detection of targets in static imagery using a Rapid Serial Visual Presentation (RSVP) paradigm. This finding suggests that it may be possible to extract target detection signals from complex imagery in real time utilizing non-invasive neurophysiological assessment tools. To transform this phenomenon into a capability for defense applications, the Defense Advanced Research Projects Agency (DARPA) currently is sponsoring an effort titled Neurotechnology for Intelligence Analysts (NIA). The vision of the NIA program is to revolutionize the way that analysts handle intelligence imagery, increasing both the throughput of imagery to the analyst and overall accuracy of the assessments. Successful development of a neurobiologically-based image triage system will enable image analysts to train more effectively and process imagery with greater speed and precision.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bogen, Paul Logasa; McKenzie, Amber T; Gillen, Rob
Forensic document analysis has become an important aspect of investigation of many different kinds of crimes, from money laundering to fraud and from cybercrime to smuggling. The current workflow for analysts includes powerful tools, such as Palantir and Analyst's Notebook, for moving from evidence to actionable intelligence and tools for finding documents among the millions of files on a hard disk, such as FTK. However, analysts often leave the process of sorting through collections of seized documents to filter out the noise from the actual evidence to a highly labor-intensive manual effort. This paper presents the Redeye Analysis Workbench, a tool to help analysts move from manual sorting of a collection of documents to performing intelligent document triage over a digital library. We will discuss the tools and techniques we build upon in addition to an in-depth discussion of our tool and how it addresses two major use cases we observed analysts performing. Finally, we also include a new layout algorithm for radial graphs that is used to visualize clusters of documents in our system.
The Analyst's "Use" of Theory or Theories: The Play of Theory.
Cooper, Steven H
2017-10-01
Two clinical vignettes demonstrate a methodological approach that guides the analyst's attention to metaphors and surfaces that are the focus of different theories. Clinically, the use of different theories expands the metaphorical language with which the analyst tries to make contact with the patient's unconscious life. Metaphorical expressions may be said to relate to each other as the syntax of unconscious fantasy (Arlow 1979). The unconscious fantasy itself represents a metaphorical construction of childhood experience that has persisted, dynamically expressive and emergent into adult life. This persistence is evident in how, in some instances, long periods of an analysis focus on translating one or a few metaphors, chiefly because the manifest metaphorical expressions of a central theme regularly lead to better understanding of an unconscious fantasy. At times employing another model or theory assists in a level of self-reflection about clinical understanding and clinical decisions. The analyst's choice of theory or theories is unique to the analyst and is not prescriptive, except as illustrating a way to think about these issues. The use of multiple models in no way suggests or implies that theories may be integrated.
Racing chemistry: A century of challenges and progress.
Kremmer, Christopher
2017-09-01
Horseracing has been called 'one of the first quintessentially modern sports'. Its urge towards standardization, its mathematically set odds, its concern with weights, and its pioneering embrace of drug-testing reflect an empirical temperament crucial to its transformation from a gentleman's pastime to a global industry funded by wagering. Ironically, in the late nineteenth century, it was modern science itself, and in particular the purification and synthesis of the drugs of nature, that turned the doping of racing animals - a practice recorded in antiquity - into an organized criminal enterprise. This paper presents original research into the history of racing chemistry in Australia in the context of developments in the field worldwide. Using a case-study approach based on extensive archival materials, it reveals unpublished diaries kept by an analyst working at Sydney Racing Laboratory in the 1950s that document conflicts between scientists over identification of performance drugs in racing animals. The author presents evidence that augments and revises earlier narratives concerning the history of the establishment of laboratory control at Australian racetracks and the removal of the country's first official analyst for racing, Miss Jean Kimble. The Kimble case illustrates the inevitable political, professional, and personal pressures that bear upon drug-testing in sports, and also conflicts between scientists over standards and priorities. Copyright © 2016 John Wiley & Sons, Ltd.
Training for spacecraft technical analysts
NASA Technical Reports Server (NTRS)
Ayres, Thomas J.; Bryant, Larry
1989-01-01
Deep space missions such as Voyager rely upon a large team of expert analysts who monitor activity in the various engineering subsystems of the spacecraft and plan operations. Senior team members generally come from the spacecraft designers, and new analysts receive on-the-job training. Neither of these methods will suffice for the creation of a new team in the middle of a mission, which may be the situation during the Magellan mission. New approaches are recommended, including electronic documentation, explicit cognitive modeling, and coached practice with archived data.
What's in a name: what analyst and patient call each other.
Barron, Grace Caroline
2006-01-01
Awkward moments often arise between patient and analyst involving the question, "What do we call each other?" The manner in which the dyad address each other contains material central to the patient's inner life. Names, like dreams, deserve a privileged status as providing a royal road into the paradoxical analytic relationship and the unconscious conflicts that feed it. Whether an analyst addresses the patient formally, informally, or not at all, awareness of the issues surrounding names is important.
Smith, J. LaRue; Damar, Nancy A.; Charlet, David A.; Westenburg, Craig L.
2014-01-01
DigitalGlobe’s QuickBird satellite high-resolution multispectral imagery was classified by using Visual Learning Systems’ Feature Analyst feature extraction software to produce land-cover data sets for the Red Rock Canyon National Conservation Area and the Coyote Springs, Piute-Eldorado Valley, and Mormon Mesa Areas of Critical Environmental Concern in Clark County, Nevada. Over 1,000 vegetation field samples were collected at the stand level. The field samples were classified to the National Vegetation Classification Standard, Version 2 hierarchy at the alliance level and above. Feature extraction models were developed for vegetation on the basis of the spectral and spatial characteristics of selected field samples by using the Feature Analyst hierarchical learning process. Individual model results were merged to create one data set for the Red Rock Canyon National Conservation Area and one for each of the Areas of Critical Environmental Concern. Field sample points and photographs were used to validate and update the data set after model results were merged. Non-vegetation data layers, such as roads and disturbed areas, were delineated from the imagery and added to the final data sets. The resulting land-cover data sets are significantly more detailed than previously were available, both in resolution and in vegetation classes.
Pattern detection in stream networks: Quantifying spatial variability in fish distribution
Torgersen, Christian E.; Gresswell, Robert E.; Bateman, Douglas S.
2004-01-01
Biological and physical properties of rivers and streams are inherently difficult to sample and visualize at the resolution and extent necessary to detect fine-scale distributional patterns over large areas. Satellite imagery and broad-scale fish survey methods are effective for quantifying spatial variability in biological and physical variables over a range of scales in marine environments but are often too coarse in resolution to address conservation needs in inland fisheries management. We present methods for sampling and analyzing multiscale, spatially continuous patterns of stream fishes and physical habitat in small- to medium-size watersheds (500–1000 hectares). Geospatial tools, including geographic information system (GIS) software such as ArcInfo dynamic segmentation and ArcScene 3D analyst modules, were used to display complex biological and physical datasets. These tools also provided spatial referencing information (e.g. Cartesian and route-measure coordinates) necessary for conducting geostatistical analyses of spatial patterns (empirical semivariograms and wavelet analysis) in linear stream networks. Graphical depiction of fish distribution along a one-dimensional longitudinal profile and throughout the stream network (superimposed on a 10-metre digital elevation model) provided the spatial context necessary for describing and interpreting the relationship between landscape pattern and the distribution of coastal cutthroat trout (Oncorhynchus clarki clarki) in western Oregon, U.S.A. The distribution of coastal cutthroat trout was highly autocorrelated and exhibited a spherical semivariogram with a defined nugget, sill, and range. Wavelet analysis of the main-stem longitudinal profile revealed periodicity in trout distribution at three nested spatial scales corresponding ostensibly to landscape disturbances and the spacing of tributary junctions.
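As a concrete example of the geostatistical step described above, the sketch below computes a classical (Matheron) empirical semivariogram from route-measure coordinates and counts along a single profile; the distances, counts, and lag bins are synthetic stand-ins, not the Oregon coastal cutthroat trout data.

```python
import numpy as np

def empirical_semivariogram(d, z, bin_edges):
    """Classical (Matheron) semivariogram estimator along a 1-D route.

    d: route-measure coordinates of samples (e.g., metres along the stream).
    z: response at those coordinates (e.g., fish counts per channel unit).
    bin_edges: lag-distance bin boundaries.
    """
    lags = np.abs(d[:, None] - d[None, :])
    sqdiff = 0.5 * (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(d.size, k=1)          # use each pair once
    lags, sqdiff = lags[iu], sqdiff[iu]
    centers, gamma = [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (lags >= lo) & (lags < hi)
        if mask.any():
            centers.append(0.5 * (lo + hi))
            gamma.append(sqdiff[mask].mean())
    return np.array(centers), np.array(gamma)

# Hypothetical route-measure data (metres) and counts.
rng = np.random.default_rng(1)
d = np.sort(rng.uniform(0, 5000, 80))
z = rng.poisson(4, size=80).astype(float)
h, g = empirical_semivariogram(d, z, np.arange(0, 2500, 250))
print(np.round(g, 2))
```

The nugget, sill, and range reported in the study correspond to a model fitted to a curve of this kind.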
Standardized acquisition, storing and provision of 3D enabled spatial data
NASA Astrophysics Data System (ADS)
Wagner, B.; Maier, S.; Peinsipp-Byma, E.
2017-05-01
In the area of working with spatial data, the need for three-dimensional spatial data (city models, digital elevation models, etc.) is increasing in addition to the classic, two-dimensional geometrical data (maps, aerial images, etc.). Due to this increased demand, the acquisition, storage, and provision of 3D-enabled spatial data in Geographic Information Systems (GIS) is becoming more and more important. Existing proprietary solutions quickly reach their limits during data exchange and data delivery to other systems. They generate a large workload, which can be very costly. However, these expenses and costs can generally be reduced significantly by using standards. The aim of this research is therefore to develop a concept in the field of three-dimensional spatial data that relies on existing standards whenever possible. In this research, military image analysts are the preferred user group of the system. To achieve the objective of the widest possible use of standards for spatial 3D data, existing standards, proprietary interfaces, and standards under discussion have been analyzed. Since the GIS of the Fraunhofer IOSB used here already uses and supports OGC (Open Geospatial Consortium) and NATO STANAG (NATO Standardization Agreement) standards for the most part, special attention was paid to these standards for possible use. The most promising standard is the OGC standard 3DPS (3D Portrayal Service) with its occurrences W3DS (Web 3D Service) and WVS (Web View Service). A demo system was created, using a standardized workflow from data acquisition through storage to provision and showing the benefit of our approach.
An open-source solution for advanced imaging flow cytometry data analysis using machine learning.
Hennig, Holger; Rees, Paul; Blasi, Thomas; Kamentsky, Lee; Hung, Jane; Dao, David; Carpenter, Anne E; Filby, Andrew
2017-01-01
Imaging flow cytometry (IFC) enables the high throughput collection of morphological and spatial information from hundreds of thousands of single cells. This high content, information rich image data can in theory resolve important biological differences among complex, often heterogeneous biological samples. However, data analysis is often performed in a highly manual and subjective manner using very limited image analysis techniques in combination with conventional flow cytometry gating strategies. This approach is not scalable to the hundreds of available image-based features per cell and thus makes use of only a fraction of the spatial and morphometric information. As a result, the quality, reproducibility and rigour of results are limited by the skill, experience and ingenuity of the data analyst. Here, we describe a pipeline using open-source software that leverages the rich information in digital imagery using machine learning algorithms. Compensated and corrected raw image files (.rif) data files from an imaging flow cytometer (the proprietary .cif file format) are imported into the open-source software CellProfiler, where an image processing pipeline identifies cells and subcellular compartments allowing hundreds of morphological features to be measured. This high-dimensional data can then be analysed using cutting-edge machine learning and clustering approaches using "user-friendly" platforms such as CellProfiler Analyst. Researchers can train an automated cell classifier to recognize different cell types, cell cycle phases, drug treatment/control conditions, etc., using supervised machine learning. This workflow should enable the scientific community to leverage the full analytical power of IFC-derived data sets. It will help to reveal otherwise unappreciated populations of cells based on features that may be hidden to the human eye that include subtle measured differences in label free detection channels such as bright-field and dark-field imagery. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
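A minimal sketch of the supervised-learning step in such a pipeline, using scikit-learn on a stand-in feature matrix; in practice the features would be the per-cell measurements exported from CellProfiler and the labels would come from analyst-curated training sets in CellProfiler Analyst, neither of which is reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Synthetic stand-in: 1000 cells x 50 morphological features, three assumed classes
# (e.g., cell-cycle phases). Real labels would be supplied by the analyst.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 50))
y = rng.integers(0, 3, size=1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```

With curated labels instead of random ones, the same workflow yields a per-class report that the analyst can use to judge whether the classifier is ready to score the full data set.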
PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czuchlewski, Kristina Rodriguez; Hart, William E.
Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain-relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to allow us to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data. The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into overlooked relationships and patterns. The capability is directly relevant to the nation's nonproliferation remote-sensing activities and has broad national security applications for military and intelligence-gathering organizations.
Interpreting sources of variation in clinical gait analysis: A case study.
King, Stephanie L; Barton, Gabor J; Ranganath, Lakshminarayan R
2017-02-01
To illustrate and discuss sources of gait deviations (experimental, genuine and intentional) during a gait analysis and how these deviations inform clinical decision making. A case study of a 24-year-old male diagnosed with Alkaptonuria undergoing a routine gait analysis. 3D motion capture with the Helen-Hayes marker set was used to quantify lower-limb joint kinematics during barefoot walking along a 10 m walkway at a self-selected pace. Additional 2D video data were recorded in the sagittal and frontal plane. The patient reported no aches or pains in any joint and described his lifestyle as active. Temporal-spatial parameters were within normal ranges for his age and sex. Three sources of gait deviations were identified: the posteriorly rotated pelvis was due to an experimental error and marker misplacement; the increased rotation of the pelvis in the horizontal plane was genuine and observed in both 3D gait curves and 2D video analysis; finally, the inconsistency in knee flexion/extension combined with a seemingly innocuous interest in the consequences of abnormal gait suggested an intentional gait deviation. Gait analysis is an important analytical tool in the management of a variety of conditions that negatively impact on movement. Experienced gait analysts have the ability to recognise genuine gait adaptations, which forms part of the decision-making process for that patient. However, their role also necessitates the ability to identify and correct for experimental errors and critically evaluate when a deviation may not be genuine. Copyright © 2016 Elsevier B.V. All rights reserved.
77 FR 11617 - Data Collection Available for Public Comments and Recommendations
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-27
... the quality of the collection, to Sandra Johnston, Program Analyst, Office of Financial Assistance... CONTACT: Sandra Johnston, Program Analyst, 202-205-7528, Sandra[email protected] Curtis B. Rich...
Self-confidence in financial analysis: a study of younger and older male professional analysts.
Webster, R L; Ellis, T S
2001-06-01
Measures of reported self-confidence in performing financial analysis by 59 professional male analysts, 31 born between 1946 and 1964 and 28 born between 1965 and 1976, were investigated and reported. Self-confidence in one's ability is important in the securities industry because it affects recommendations and decisions to buy, sell, and hold securities. The respondents analyzed a set of multiyear corporate financial statements and reported their self-confidence in six separate financial areas. Data from the 59 male financial analysts were tallied and analyzed using both univariate and multivariate statistical tests. Rated self-confidence was not significantly different for the younger and the older men. These results are not consistent with a similar prior study of female analysts in which younger women showed significantly higher self-confidence than older women.
An eye tracking study of bloodstain pattern analysts during pattern classification.
Arthur, R M; Hoogenboom, J; Green, R D; Taylor, M C; de Bruin, K G
2018-05-01
Bloodstain pattern analysis (BPA) is the forensic discipline concerned with the classification and interpretation of bloodstains and bloodstain patterns at the crime scene. At present, it is unclear exactly which stain or pattern properties and their associated values are most relevant to analysts when classifying a bloodstain pattern. Eye tracking technology has been widely used to investigate human perception and cognition. Its application to forensics, however, is limited. This is the first study to use eye tracking as a tool for gaining access to the mindset of the bloodstain pattern expert. An eye tracking method was used to follow the gaze of 24 bloodstain pattern analysts during an assigned task of classifying a laboratory-generated test bloodstain pattern. With the aid of an automated image-processing methodology, the properties of selected features of the pattern were quantified leading to the delineation of areas of interest (AOIs). Eye tracking data were collected for each AOI and combined with verbal statements made by analysts after the classification task to determine the critical range of values for relevant diagnostic features. Eye-tracking data indicated that there were four main regions of the pattern that analysts were most interested in. Within each region, individual elements or groups of elements that exhibited features associated with directionality, size, colour and shape appeared to capture the most interest of analysts during the classification task. The study showed that the eye movements of trained bloodstain pattern experts and their verbal descriptions of a pattern were well correlated.
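To make the AOI-based analysis concrete, the sketch below totals fixation dwell time per area of interest from a list of fixation records; the AOI rectangles and fixations are invented, and the study's actual AOIs were delineated by automated image processing rather than the hand-specified boxes assumed here.

```python
# Hypothetical AOIs as axis-aligned rectangles (x0, y0, x1, y1) in image pixels.
aois = {"region_1": (0, 0, 200, 150), "region_2": (250, 50, 400, 200)}

# Hypothetical fixation records: (x, y, duration in milliseconds).
fixations = [
    (50, 60, 180), (120, 100, 240), (300, 120, 300), (500, 400, 200),
]

dwell = {name: 0 for name in aois}
for x, y, dur in fixations:
    for name, (x0, y0, x1, y1) in aois.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            dwell[name] += dur   # accumulate dwell time for the AOI hit

print(dwell)   # total dwell time per AOI; fixations outside all AOIs are ignored
```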
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.
2005-08-01
The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
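The community model described above is parameterized by once-through step times, a backward-iteration probability, and a rework fraction; the Monte Carlo sketch below illustrates how such a parameterization translates into an expected process time. The step durations and probabilities are illustrative assumptions, not Sandia's measured values.

```python
import random

def simulate_process(once_through, p_back, rework, trials=10000, seed=0):
    """Monte Carlo sketch of a linear process with backward iteration.

    once_through: list of step durations (arbitrary time units).
    p_back: probability that finishing a step triggers a return to the previous step.
    rework: fraction of a step's once-through time needed on repeat passes.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        t, i = 0.0, 0
        visited = [False] * len(once_through)
        while i < len(once_through):
            t += once_through[i] * (rework if visited[i] else 1.0)
            visited[i] = True
            if i > 0 and rng.random() < p_back:
                i -= 1          # backward iteration to the previous step
            else:
                i += 1
        total += t
    return total / trials

# Illustrative parameters only: setup, geometry/meshing, solve.
steps = [5.0, 20.0, 10.0]
print(round(simulate_process(steps, p_back=0.3, rework=0.5), 1))
```

Varying p_back and rework in such a model shows why cutting late-stage iteration loops and improving rework efficiency pay off nearly as much as shortening the once-through times themselves.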
Seeing, mirroring, desiring: the impact of the analyst's pregnant body on the patient's body image.
Yakeley, Jessica
2013-08-01
The paper explores the impact of the analyst's pregnant body on the course of two analyses, one of a young man and one of a young woman, specifically focusing on how each patient's visual perception and affective experience of being with the analyst's pregnant body affected their own body image and subjective experience of their body. The pre-verbal or 'subsymbolic' material evoked in the analyses contributed to a greater understanding of the patients' developmental experiences in infancy and adolescence, which had resulted in both carrying a profoundly distorted body image into adulthood. The analyst's pregnancy offered a therapeutic window in which a shift in the patient's body image could be initiated. Clinical material is presented in detail with reference to the psychoanalytic literature on the pregnant analyst, and that of the development of the body image, particularly focusing on the role of visual communication and the face. The author proposes a theory of psychic change, drawing on Bucci's multiple code theory, in which the patients' unconscious or 'subsymbolic' awareness of her pregnancy, which was manifest in their bodily responses, feeling states and dreams, as well as in the analyst's countertransference, could gradually be verbalized and understood within the transference. Thus visual perception, or 'external seeing', could gradually become 'internal seeing', or insight into unconscious phantasies, leading to a shift in the patients' internal object world towards a less persecutory state and a more realistic appraisal of their body image. Copyright © 2013 Institute of Psychoanalysis.
Collaborative interactive visualization: exploratory concept
NASA Astrophysics Data System (ADS)
Mokhtari, Marielle; Lavigne, Valérie; Drolet, Frédéric
2015-05-01
Dealing with an ever-increasing amount of data is a challenge that military intelligence analysts, or teams of analysts, face day to day. Increased individual and collective comprehension comes through collaboration between people: the better the collaboration, the better the comprehension. Nowadays, various technologies support and enhance collaboration by allowing people to connect and collaborate in settings as varied as across mobile devices, over networked computers, display walls, and tabletop surfaces, to name just a few. A powerful collaboration system includes traditional and multimodal visualization features to achieve effective human communication. Interactive visualization strengthens collaboration because this approach is conducive to incrementally building a mental assessment of the data's meaning. The purpose of this paper is to present an overview of the envisioned collaboration architecture and the interactive visualization concepts underlying the Sensemaking Support System prototype developed to support analysts in the context of the Joint Intelligence Collection and Analysis Capability project at DRDC Valcartier. It presents the current version of the architecture, discusses future capabilities to help analysts in the accomplishment of their tasks, and finally recommends collaboration and visualization technologies that allow going a step further, both as individuals and as a team.
SnapShot: Visualization to Propel Ice Hockey Analytics.
Pileggi, H; Stolper, C D; Boyle, J M; Stasko, J T
2012-12-01
Sports analysts live in a world of dynamic games flattened into tables of numbers, divorced from the rinks, pitches, and courts where they were generated. Currently, these professional analysts use R, Stata, SAS, and other statistical software packages to uncover insights from game data. Quantitative sports consultants seek a competitive advantage both for their clients and for themselves as analytics becomes increasingly valued by teams, clubs, and squads. In order for the information visualization community to support the members of this blossoming industry, it must recognize where and how visualization can enhance the existing analytical workflow. In this paper, we identify three primary stages of today's sports analyst's routine where visualization can be beneficially integrated: 1) exploring a dataspace; 2) sharing hypotheses with internal colleagues; and 3) communicating findings to stakeholders. Working closely with professional ice hockey analysts, we designed and built SnapShot, a system to integrate visualization into the hockey intelligence gathering process. SnapShot employs a variety of information visualization techniques to display shot data and, given the importance of a specific hockey statistic, shot length, introduces a new technique, the radial heat map. Through a user study, we received encouraging feedback from several professional analysts, both independent consultants and professional team personnel.
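A radial heat map of the kind described can be sketched in a few lines by binning shots by angle and length on a polar grid. The snippet below uses synthetic shot data and matplotlib; it is a hedged illustration of the general technique, not the SnapShot implementation.

```python
# Minimal sketch of a radial heat map for shot data, binned by shot angle and
# shot length (distance). Data here are synthetic; this is not SnapShot's code.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
angles = rng.uniform(-np.pi / 2, np.pi / 2, 1000)   # shot angle relative to the goal
lengths = rng.gamma(2.0, 15.0, 1000)                # shot length in feet

# Bin counts on a polar grid.
angle_edges = np.linspace(-np.pi / 2, np.pi / 2, 25)
length_edges = np.linspace(0, 100, 21)
counts, _, _ = np.histogram2d(angles, lengths, bins=[angle_edges, length_edges])

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
theta, r = np.meshgrid(angle_edges, length_edges, indexing="ij")
mesh = ax.pcolormesh(theta, r, counts, cmap="hot")
fig.colorbar(mesh, ax=ax, label="shot count")
ax.set_title("Radial heat map of shot length vs. angle (synthetic data)")
plt.show()
```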
Don't Wag the Dog: Extending the Reach of Applied Behavior Analysis
Normand, Matthew P.; Kohn, Carolynn S.
2013-01-01
We argue that the field of behavior analysis would be best served if behavior analysts worked to extend the reach of behavioral services into a more diverse range of settings and with more varied populations, with an emphasis on the establishment of new career opportunities for graduating students. This is not a new proposal, but it is a tall order; it is not difficult to see why many would choose a surer route to gainful employment. Currently, the most fruitful career path for behavior analysts in practice is in the area of autism and developmental disabilities. For the continued growth of the field of behavior analysis, however, it is important to foster new career opportunities for those trained as behavior analysts. Toward this end, we identify several fields that seem well suited to behavior analysts and summarize the training requirements and likely professional outcomes for behavior analysts who pursue education and certification in these fields. These fields were chosen because they require relatively little additional formal training, in the hope of minimizing the response effort necessary for individuals who have already completed a rigorous program of graduate study in behavior analysis. PMID:25729134
NASA Astrophysics Data System (ADS)
Le Bras, Ronan; Kushida, Noriyuki; Mialle, Pierrick; Tomuta, Elena; Arora, Nimar
2017-04-01
The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing a Bayesian method and software to perform the key step of automatic association of seismological, hydroacoustic, and infrasound (SHI) parametric data. In our preliminary testing at the CTBTO, NET-VISA shows much better performance than the currently operating automatic association module, with the rate of automatic events matching analyst-reviewed events increased by 10%, signifying that the percentage of missed events is lowered by 40%. Initial tests involving analysts also showed that the new software will complete the automatic bulletins of the CTBTO by adding previously missed events. Because products of the CTBTO are widely distributed to its member States as well as throughout the seismological community, the introduction of a new technology must be carried out carefully, and the first step of operational integration is to use NET-VISA results within the interactive analysts' software so that the analysts can check the robustness of the Bayesian approach. We report on the latest results, both on the progress of automatic processing and on the initial introduction of NET-VISA results into the analyst review process.
Conceptualisation of clinical facts in the analytic process.
Riesenberg-Malcolm, R
1994-12-01
In this paper the author discusses what she understands to be a clinical fact, stressing that it takes place within the analytic situation between patient and analyst. It is in the process of conceptualising the fact that the analyst comes to define it. In order to conceptualise, the analyst must have a frame of reference, a theoretical basis through which he perceives his patient's communications and is able to give meaning to them. In analytic work, the analyst uses his theory in mainly two ways. When working with his patient it operates mostly unconsciously, interspersed with quick, more conscious thinking. When away from the patient, theory needs to come to the front of the analyst's mind, consciously used by him. A clinical case is used to illustrate these two aspects of theoretical work. In the material presented, aspects of a first session are tentatively conceptualised. Then material from the same patient some years later is described, the method of working and the way of understanding are discussed, and thus the process of conceptualising can be illustrated. The theme of hope has been singled out as a linking point between the earlier and later pieces of material.
Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.
Stolper, Charles D; Perer, Adam; Gotz, David
2014-12-01
As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and to provide interactions that support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.
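The core requirement, an analytic that emits meaningful partial results and accepts steering between chunks, maps naturally onto a generator pattern. The sketch below is a hypothetical illustration of that pattern (the running-mean analytic and parameter names are invented), not code from Progressive Insights.

```python
# Minimal sketch of the "progressive" pattern: an analytic that yields meaningful
# partial results and accepts steering input between chunks. Hypothetical example,
# not the Progressive Insights implementation.

def progressive_mean(stream, chunk_size=1000, priority_filter=None):
    """Yield a running estimate after each chunk; the caller may update the
    priority filter between chunks to steer which records are processed first."""
    total, count = 0.0, 0
    chunk = []
    for record in stream:
        if priority_filter is not None and not priority_filter(record):
            continue
        chunk.append(record)
        if len(chunk) >= chunk_size:
            total += sum(chunk)
            count += len(chunk)
            chunk.clear()
            yield total / count          # partial result the UI can render now
    if chunk:
        total += sum(chunk)
        count += len(chunk)
        yield total / count

if __name__ == "__main__":
    data = range(1_000_000)
    for i, partial in enumerate(progressive_mean(iter(data))):
        print(f"partial estimate after chunk {i}: {partial:.1f}")
        if i == 3:            # an analyst could stop or re-steer early
            break
```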
DOT National Transportation Integrated Search
2010-06-01
This manual provides information and recommended procedures to be utilized by an agency's Weigh-in-Motion (WIM) Office Data Analyst to perform validation and quality control (QC) checks of WIM traffic data. This manual focuses on data generated by ...
SafetyAnalyst Testing and Implementation
DOT National Transportation Integrated Search
2009-03-01
SafetyAnalyst is a software tool developed by the Federal Highway Administration to assist state and local transportation agencies in analyzing safety data and managing their roadway safety programs. This research report documents the major tasks acc...
VAUD: A Visual Analysis Approach for Exploring Spatio-Temporal Urban Data.
Chen, Wei; Huang, Zhaosong; Wu, Feiran; Zhu, Minfeng; Guan, Huihua; Maciejewski, Ross
2017-10-02
Urban data is massive, heterogeneous, and spatio-temporal, posing a substantial challenge for visualization and analysis. In this paper, we design and implement a novel visual analytics approach, Visual Analyzer for Urban Data (VAUD), that supports the visualization, querying, and exploration of urban data. Our approach allows for cross-domain correlation from multiple data sources by leveraging spatio-temporal and social inter-connectedness features. Through our approach, the analyst is able to select, filter, and aggregate across multiple data sources and extract information that would be hidden from a single data subset. To illustrate the effectiveness of our approach, we provide case studies on a real urban dataset that contains the cyber-, physical-, and social information of 14 million citizens over 22 days.
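Cross-domain correlation of this kind reduces, at its simplest, to joining sources on a shared space-time key and then aggregating the combined records. The pandas sketch below is a minimal illustration with invented column names and toy data, not the VAUD implementation.

```python
# Minimal sketch (hypothetical column names) of selecting, filtering, and
# aggregating across two urban data sources linked by a shared spatial cell
# and hour-of-day key, in the spirit of cross-domain correlation.
import pandas as pd

taxi = pd.DataFrame({
    "cell_id": [1, 1, 2, 2, 3],
    "hour":    [8, 9, 8, 9, 8],
    "pickups": [120, 95, 40, 55, 10],
})
social = pd.DataFrame({
    "cell_id": [1, 2, 2, 3],
    "hour":    [8, 8, 9, 8],
    "checkins": [300, 80, 90, 5],
})

# Filter to the morning peak, then join the two sources on the space-time key.
morning = taxi[taxi["hour"] == 8].merge(social[social["hour"] == 8],
                                        on=["cell_id", "hour"], how="inner")

# Aggregate: one row per cell with both signals, something a single source cannot show.
summary = morning.groupby("cell_id")[["pickups", "checkins"]].sum()
print(summary)
```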
The use of historical imagery in the remediation of an urban hazardous waste site
Slonecker, E.T.
2011-01-01
The information derived from the interpretation of historical aerial photographs is perhaps the most basic multitemporal application of remote-sensing data. Aerial photographs dating back to the early 20th century can be extremely valuable sources of historical landscape activity. In this application, imagery from 1918 to 1927 provided a wealth of information about chemical weapons testing, storage, handling, and disposal of these hazardous materials. When analyzed by a trained photo-analyst, the 1918 aerial photographs resulted in 42 features of potential interest. When compared with current remedial activities and known areas of contamination, 33 of 42 or 78.5% of the features were spatially correlated with areas of known contamination or other remedial hazardous waste cleanup activity. © 2010 IEEE.
2011-07-21
Task Group members listed include Mr. Pierre Chao, Mr. William Phillips, Mr. Richard Spencer, and Ms. Leigh Warner; Catherine Whittington served as the Board Staff Analyst.
Modernizing the Military Retirement System
2011-05-01
Members listed include Patrick Gross, David Langstaff, Philip Odeen, Mark Ronald, Robert Stein, and Jack Zoeller; Catherine Whittington served as the Board Staff Analyst.
Development of a Nevada Statewide Database for Safety Analyst Software
DOT National Transportation Integrated Search
2017-02-02
Safety Analyst is a software package developed by the Federal Highway Administration (FHWA) and twenty-seven participating state and local agencies including the Nevada Department of Transportation (NDOT). The software package implemented many of the...
Plaut, Alfred B J
2005-02-01
In this paper the author explores the theoretical and technical issues relating to taking notes of analytic sessions, using an introspective approach. The paper discusses the lack of a consistent approach to note taking amongst analysts and sets out to demonstrate that systematic note taking can be helpful to the analyst. The author describes his discovery that an initial phase, in which as much data as possible was recorded, did not prove reliably helpful in clinical work and at first actively interfered with recall in subsequent sessions. The impact on recall of the nature of the analytic session itself and of the focus of the analyst's interest is discussed. The author then describes how he modified his note-taking technique to classify information from sessions into four categories, which enabled the analyst to select which information to record in notes. The characteristics of memory and its constructive nature are discussed in relation to the problems that arise in making accurate notes of analytic sessions.
Users manual for the US baseline corn and soybean segment classification procedure
NASA Technical Reports Server (NTRS)
Horvath, R.; Colwell, R. (Principal Investigator); Hay, C.; Metzler, M.; Mykolenko, O.; Odenweller, J.; Rice, D.
1981-01-01
A user's manual for the classification component of the FY-81 U.S. Corn and Soybean Pilot Experiment in the Foreign Commodity Production Forecasting Project of AgRISTARS is presented. This experiment is one of several major experiments in AgRISTARS designed to measure and advance remote sensing technologies for cropland inventory. The classification procedure discussed is designed to produce segment proportion estimates for corn and soybeans in the U.S. Corn Belt (Iowa, Indiana, and Illinois) using LANDSAT data. The estimates are produced by an integrated Analyst/Machine procedure. The Analyst selects acquisitions, participates in stratification, and assigns crop labels to selected samples. In concert with the Analyst, the machine digitally preprocesses LANDSAT data to remove external effects, stratifies the data into field-like units and into spectrally similar groups, statistically samples the data for Analyst labeling, and combines the labeled samples into a final estimate.
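The final combination step amounts to a stratified proportion estimate: each machine-derived stratum contributes its analyst-labeled sample proportion, weighted by the stratum's share of the segment. The sketch below illustrates that arithmetic with invented numbers; it is not the operational AgRISTARS procedure.

```python
# Toy sketch of combining analyst-labeled samples drawn from machine-derived
# strata into a segment-level crop proportion estimate (illustrative numbers only).
strata = [
    # (number of pixels in stratum, labeled sample size, samples labeled "corn")
    (50_000, 40, 28),
    (30_000, 40, 10),
    (20_000, 40, 2),
]

total_pixels = sum(n for n, _, _ in strata)
corn_proportion = sum(
    (n / total_pixels) * (labeled_corn / sample_size)
    for n, sample_size, labeled_corn in strata
)
print(f"Estimated corn proportion for the segment: {corn_proportion:.3f}")
```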
Interpersonal psychoanalysis' radical façade.
Hirsch, Irwin
2002-01-01
The participant-observation model initiated the relational turn, as well as the shift from modernism to postmodernism in psychoanalysis. This two-person, coparticipant conceptualization of the psychoanalytic situation moved psychoanalysis from the realm of alleged objective science toward intersubjectivity and hermeneutics. From this perspective, the analyst as subjective other is constantly engaged affectively with the patient in ways that are very often out of awareness. Analyst and patient both, for better or for worse, are believed to unwittingly influence one another. This description of the analytic dyad has led many to mistakenly conclude that interpersonal psychoanalysts advocate witting affective expressiveness, often in the form of deliberate self-disclosure of feelings, as part of a standard analytic stance. Upon closer examination, radical interventions are no more emblematic of interpersonal analysts than they are of analysts from most other traditions, though the interpersonalists have indeed expanded what had theretofore been a rather narrow repertoire of interventions.
A survey of functional behavior assessment methods used by behavior analysts in practice.
Oliver, Anthony C; Pratt, Leigh A; Normand, Matthew P
2015-12-01
To gather information about the functional behavior assessment (FBA) methods behavior analysts use in practice, we sent a web-based survey to 12,431 behavior analysts certified by the Behavior Analyst Certification Board. Ultimately, 724 surveys were returned, with the results suggesting that most respondents regularly use FBA methods, especially descriptive assessments. Moreover, the data suggest that the majority of students are being formally taught about the various FBA methods and that educators are emphasizing the range of FBA methods in their teaching. However, less than half of the respondents reported using functional analyses in practice, although many considered descriptive assessments and functional analyses to be the most useful FBA methods. Most respondents reported using informant and descriptive assessments more frequently than functional analyses, and a majority of respondents indicated that they "never" or "almost never" used functional analyses to identify the function of behavior. © Society for the Experimental Analysis of Behavior.
Exploring the role of contextual information in bloodstain pattern analysis: A qualitative approach.
Osborne, Nikola K P; Taylor, Michael C; Zajac, Rachel
2016-03-01
During Bloodstain Pattern Analysis (BPA), an analyst may encounter various sources of contextual information. Although contextual bias has emerged as a valid concern for the discipline, little is understood about how contextual information informs BPA. To address this issue, we asked 15 experienced bloodstain pattern analysts from New Zealand and Australia to think aloud as they classified bloodstain patterns from two homicide cases. Analysts could request items of contextual information, and were required to state how each item would inform their analysis. Pathology reports and additional photographs of the scene were the most commonly requested items of information. We coded analysts' reasons for requesting contextual information--and the way in which they integrated this information--according to thematic analysis. We identified considerable variation in both of these variables, raising important questions about the role and necessity of contextual information in decisions about bloodstain pattern evidence. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Birisan, Mihnea; Beling, Peter
2011-01-01
New generations of surveillance drones are being outfitted with numerous high definition cameras. The rapid proliferation of fielded sensors and supporting capacity for processing and displaying data will translate into ever more capable platforms, but with increased capability comes increased complexity and scale that may diminish the usefulness of such platforms to human operators. We investigate methods for alleviating strain on analysts by automatically retrieving content specific to their current task using a machine learning technique known as Multi-Instance Learning (MIL). We use MIL to create a real time model of the analysts' task and subsequently use the model to dynamically retrieve relevant content. This paper presents results from a pilot experiment in which a computer agent is assigned analyst tasks such as identifying caravanning vehicles in a simulated vehicle traffic environment. We compare agent performance between MIL aided trials and unaided trials.
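A common MIL baseline, which this sketch illustrates, represents each bag of instances by simple aggregate features and trains an ordinary classifier on bag-level labels. It is a hedged stand-in for the paper's approach, using synthetic data and scikit-learn.

```python
# Minimal sketch of a simple multi-instance learning baseline: represent each
# bag (e.g., a set of image chips from one analyst-inspected region) by the
# mean and max of its instance features, then train an ordinary classifier on
# bag-level labels. This is a common MIL baseline, not the authors' exact method.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

def make_bag(relevant, n_instances=20, n_features=5):
    """Synthetic bag: relevant bags contain a few shifted ('interesting') instances."""
    X = rng.normal(size=(n_instances, n_features))
    if relevant:
        X[:3] += 2.0            # a few instances carry the task-relevant signal
    return X

bags = [make_bag(relevant=(i % 2 == 0)) for i in range(200)]
labels = np.array([i % 2 == 0 for i in range(200)], dtype=int)

# Bag-level representation: concatenate mean and max over instances.
features = np.array([np.concatenate([b.mean(axis=0), b.max(axis=0)]) for b in bags])

clf = LogisticRegression(max_iter=1000).fit(features[:150], labels[:150])
print("held-out accuracy:", clf.score(features[150:], labels[150:]))
```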
Comparison of air-coupled GPR data analysis results determined by multiple analysts
NASA Astrophysics Data System (ADS)
Martino, Nicole; Maser, Ken
2016-04-01
Current bridge deck condition assessments using ground penetrating radar (GPR) require a trained analyst to manually interpret substructure layering information from B-scan images in order to proceed with an intended analysis (pavement thickness, concrete cover, effects of rebar corrosion, etc.). For example, a recently developed method to rapidly and accurately analyze air-coupled GPR data based on the effects of rebar corrosion requires that a user "picks" a layer of rebar reflections in each B-scan image collected along the length of the deck. These "picks" carry information such as signal amplitude and two-way travel time. When a deck is new, or has little rebar corrosion, the resulting layer of rebar reflections is readily evident and there is little room for subjectivity. However, when a deck is severely deteriorated, the rebar layer may be difficult to identify, and different analysts may make different interpretations of the appropriate layer to analyze. One highly corroded bridge deck was assessed with a number of nondestructive evaluation techniques, including 2 GHz air-coupled GPR. Two trained analysts separately selected the rebar layer in each B-scan image, choosing as much information as possible, even in areas of significant deterioration. The post-processing of the selected data points was then completed and the results from each analyst were contour plotted to observe any discrepancies. The paper describes the differences between ground-coupled and air-coupled GPR systems, the data collection and analysis methods used by the two analysts in this case study, and the results of the two analyses.
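Once both analysts' picks are exported, the comparison itself is straightforward summary statistics over co-located amplitudes. The snippet below shows one way to quantify the discrepancy, using invented pick values rather than the study's data.

```python
# Hypothetical sketch of comparing rebar-reflection picks from two analysts at
# the same deck locations: per-location amplitude differences and overall agreement.
import numpy as np

# Picked reflection amplitudes (dB) at the same set of locations; synthetic values.
analyst_a = np.array([-12.1, -15.4, -18.9, -22.3, -14.0, -19.5])
analyst_b = np.array([-12.5, -15.1, -20.2, -21.8, -13.7, -21.0])

diff = analyst_a - analyst_b
print("mean absolute difference (dB):", np.mean(np.abs(diff)))
print("max absolute difference (dB): ", np.max(np.abs(diff)))
print("correlation between picks:    ", np.corrcoef(analyst_a, analyst_b)[0, 1])
```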
Some developments from the work of Melanie Klein.
Spillius, E B
1983-01-01
This paper discusses four areas of work in which several followers of Melanie Klein in Britain have developed some of the discoveries and ideas she initiated. First, the extension of her concept of projective identification is briefly described, with emphasis on Bion's and Rosenfeld's stress on its communicative as well as its pathological aspects. Second, the extension of Klein's ideas about the epistemophilic instinct, symbolism, and projective identification is described in the work of Segal, Bion, Money-Kyrle, and Bick on the development of the capacity to think. Third, certain developments in Kleinian technique are described, with emphasis on the use of the concept of projective identification in analysing transference, and on the analysis of acting out in the transference, a trend contributed to by many Kleinian analysts but perhaps most closely associated with Betty Joseph. Finally, continued refinements in the analysis of the death instinct are briefly described, together with a discussion of the changes these refinements have led to in ideas about the organization and relations of parts of the self and internal objects.
Assessing the socioeconomic impact and value of open geospatial information
Pearlman, Francoise; Pearlman, Jay; Bernknopf, Richard; Coote, Andrew; Craglia, Massimo; Friedl, Lawrence; Gallo, Jason; Hertzfeld, Henry; Jolly, Claire; Macauley, Molly K.; Shapiro, Carl; Smart, Alan
2016-03-10
The workshop included 68 participants coming from international organizations, the U.S. public and private sectors, nongovernmental organizations, and academia. Participants included policy makers and analysts, financial analysts, economists, information scientists, geospatial practitioners, and other discipline experts.
Spatial Data Transfer Standard (SDTS), part 5 : SDTS raster profile and extensions
DOT National Transportation Integrated Search
1999-02-01
The Spatial Data Transfer Standard (SDTS) defines a general mechanism for the transfer of geographically referenced spatial data and its supporting metadata, i.e., attributes, data quality reports, coordinate reference systems, security informati...
The application test system: Experiences to date and future plans
NASA Technical Reports Server (NTRS)
May, G. A.; Ashburn, P.; Hansen, H. L. (Principal Investigator)
1979-01-01
The ATS analysis component is presented, focusing on methods by which the varied data sources are used by the ATS analyst. Analyst training and initial processing of data are discussed, along with short- and long-range plans for the ATS.
DataQs analyst guide : best practices for federal and state agency users.
DOT National Transportation Integrated Search
2014-12-01
The DataQs Analyst Guide provides practical guidance and best practices to address and resolve Requests for Data Reviews (RDRs) submitted electronically to FMCSA by motor carriers, commercial drivers, and other persons using the DataQs system...
LACIE analyst interpretation keys
NASA Technical Reports Server (NTRS)
Baron, J. G.; Payne, R. W.; Palmer, W. F. (Principal Investigator)
1979-01-01
Two interpretation aids, 'The Image Analysis Guide for Wheat/Small Grains Inventories' and 'The United States and Canadian Great Plains Regional Keys', were developed during LACIE phase 2 and implemented during phase 3 in order to provide analysts with a better understanding of the expected ranges in color variation of signatures for individual biostages and of the temporal sequences of LANDSAT signatures. The keys were tested using operational LACIE data, and the results demonstrate that their use provides improved labeling accuracy in all analyst experience groupings, in all geographic areas within the U.S. Great Plains, and during all periods of crop development.
Novick, Jack; Novick, Kerry Kelly
2012-01-01
D.W. Winnicott wrote, "One analyst cannot have enough cases to cover all contingencies..." (1958, p. 123). He was referring to the fact that any one analyst has a relatively small number of cases at any one time or even over a lifetime. He was talking then about termination, but his point applies to any issue. Here we are grateful for a detailed account of a case that offers all of us additional experience of the successes and challenges of our work.
Gostečnik, Christian; Slavič, Tanja Repič; Lukek, Saša Poljak; Pate, Tanja; Cvetek, Robert
2017-08-01
The relationship between partners and the analyst is considered the most basic means of healing in contemporary psychoanalytic theories and analyses. It also stands as one of the most fundamental phenomena of psychoanalysis, so it comes as no surprise that it has always been deliberated over as an object of great interest as well as immense controversy. This same relationship, mutually co-created by the analyst and each individual and partner in analysis, also represents the core of sanctity and sacred space in contemporary psychoanalysis.
Evaluation of Bayesian Sequential Proportion Estimation Using Analyst Labels
NASA Technical Reports Server (NTRS)
Lennington, R. K.; Abotteen, K. M. (Principal Investigator)
1980-01-01
The author has identified the following significant results. A total of ten Large Area Crop Inventory Experiment Phase 3 blind sites and analyst-interpreter labels were used in a study to compare proportional estimates obtained by the Bayes sequential procedure with estimates obtained from simple random sampling and from Procedure 1. The analyst error rate using the Bayes technique was shown to be no greater than that for the simple random sampling. Also, the segment proportion estimates produced using this technique had smaller bias and mean squared errors than the estimates produced using either simple random sampling or Procedure 1.
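The Bayes sequential idea can be illustrated with a Beta-Binomial model updated one analyst label at a time, alongside the simple-random-sampling estimate computed from the same labels. The sketch below uses synthetic labels and a uniform prior; it is not Procedure 1 or the exact estimator evaluated in the study.

```python
# Minimal sketch of Bayesian sequential proportion estimation with a Beta prior,
# updated one analyst label at a time, compared with the simple-random-sampling
# estimate from the same labels. Illustrative only; not the LACIE procedures.
import numpy as np

rng = np.random.default_rng(7)
true_proportion = 0.35
labels = rng.random(60) < true_proportion   # analyst labels: True = target crop

alpha, beta = 1.0, 1.0                      # uniform Beta(1, 1) prior
for is_crop in labels:
    alpha += int(is_crop)                   # sequential update after each label
    beta += int(not is_crop)

bayes_estimate = alpha / (alpha + beta)     # posterior mean
srs_estimate = labels.mean()                # simple random sampling estimate
print(f"Bayes posterior mean: {bayes_estimate:.3f}")
print(f"SRS estimate:         {srs_estimate:.3f}")
```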
When the analyst is ill: dimensions of self-disclosure.
Pizer, B
1997-07-01
This article examines questions related to the "inescapable," the "inadvertent," and the "deliberate" personal disclosures by an analyst. Technical and personal considerations that influence the analyst's decision to disclose, as well as the inherent responsibilities and potential clinical consequences involved in self-disclosure, are explored, with particular attention to transference-countertransference dynamics, therapeutic goals, and the negotiation of resistance. The author describes her clinical work during a period of prolonged illness, with case vignettes that illustrate how self-disclosure may be regarded as both an occasional authentic requirement and a regular intrinsic component of clinical technique.
Bring It to the Pitch: Combining Video and Movement Data to Enhance Team Sport Analysis.
Stein, Manuel; Janetzko, Halldor; Lamprecht, Andreas; Breitkreutz, Thorsten; Zimmermann, Philipp; Goldlucke, Bastian; Schreck, Tobias; Andrienko, Gennady; Grossniklaus, Michael; Keim, Daniel A
2018-01-01
Analysts in professional team sport regularly perform analyses to gain strategic and tactical insights into player and team behavior. Goals of team sport analysis regularly include identifying weaknesses of opposing teams or assessing the performance and improvement potential of a coached team. Current analysis workflows are typically based on the analysis of team videos. Analysts can also rely on techniques from information visualization to depict, e.g., player or ball trajectories. However, video analysis is typically a time-consuming process in which the analyst needs to memorize and annotate scenes. In contrast, visualization typically relies on an abstract data model, often using abstract visual mappings, and is no longer directly linked to the observed movement context. We propose a visual analytics system that tightly integrates team sport video recordings with abstract visualization of the underlying trajectory data. We apply appropriate computer vision techniques to extract trajectory data from video input. Furthermore, we apply advanced trajectory and movement analysis techniques to derive relevant team sport analytic measures for region, event, and player analysis in the case of soccer. Our system seamlessly integrates video and visualization modalities, enabling analysts to draw on the advantages of both analysis forms. Several expert studies conducted with team sport analysts indicate the effectiveness of our integrated approach.
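Once trajectories are extracted, many of the analytic measures reduce to simple kinematics over the sampled positions. The sketch below computes distance covered and mean speed for one synthetic trajectory; the sampling rate and coordinates are assumptions, not data from the system.

```python
# Small sketch of deriving basic movement measures (distance covered, mean speed)
# from an extracted player trajectory sampled at a fixed rate. Synthetic data;
# not the authors' analysis pipeline.
import numpy as np

fps = 25.0                                   # assumed video sampling rate
t = np.arange(0, 10, 1 / fps)                # 10 seconds of positions
x = 2.0 * t + 0.5 * np.sin(t)                # synthetic pitch coordinates (metres)
y = 1.0 * t + 0.3 * np.cos(t)

steps = np.hypot(np.diff(x), np.diff(y))     # per-frame displacement
distance_covered = steps.sum()               # metres over the clip
mean_speed = steps.mean() * fps              # metres per second

print(f"distance covered: {distance_covered:.1f} m")
print(f"mean speed:       {mean_speed:.2f} m/s")
```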
Kurtz, S A
1986-01-01
The space the analyst creates in his consulting room gives expression to the most primitive elements in his personality. It does this despite, and even by means of, the professional conventions it incorporates. This phenomenon is first apparent in Freud's office, where the space of the psychoanalytic situation originated. Here the room itself--filled with the antiquities he collected so passionately--met important narcissistic/symbiotic needs. In this sense it encodes a very early, unanalyzed level of relationship with his mother. It is suggested here that these phenomena, visible in Freud's office, are continuing elements of the analytic frame. Because of the character of the analyst and the structure of the relationship, the room becomes a mise-en-scène in which the narcissistic/symbiotic layers of both participants' characters are played out. Failing to recognize this may lead the analyst to treat seemingly regressive behavior as resistance and to intervene at developmental levels the patient has not achieved. Indeed, such "regressions" can only be understood as products of the situation itself. Phenomenologically, the analyst has become the corner in which he took refuge as a child; the corner to which the patient now comes for sanctuary. Because this connection is unconscious it cannot be called an alliance. Rather, it is a fortuitous interlocking that--like mother-child symbiosis--constitutes a matrix for new growth.
Psychotherapy in the aesthetic attitude.
Beebe, John
2010-04-01
Drawing upon the writings of Jungian analyst Joseph Henderson on unconscious attitudes toward culture that patients and analysts may bring to therapy, the author defines the aesthetic attitude as one of the basic ways that cultural experience is instinctively accessed and processed so that it can become part of an individual's self experience. In analytic treatment, the aesthetic attitude emerges as part of what Jung called the transcendent function to create new symbolic possibilities for the growth of consciousness. It can provide creative opportunities for new adaptation where individuation has become stuck in unconscious complexes, both personal and cultural. In contrast to formulations that have compared depth psychotherapy to religious ritual, philosophic discourse, and renewal of socialization, this paper focuses upon the considerations of beauty that make psychotherapy also an art. In psychotherapeutic work, the aesthetic attitude confronts both analyst and patient with the problem of taste, affects how the treatment is shaped and 'framed', and can grant a dimension of grace to the analyst's mirroring of the struggles that attend the patient's effort to be a more smoothly functioning human being. The patient may learn to extend the same grace to the analyst's fumbling attempts to be helpful. The author suggests that the aesthetic attitude is thus a help in the resolution of both countertransference and transference en route to psychological healing.
LG-ANALYST: linguistic geometry for master air attack planning
NASA Astrophysics Data System (ADS)
Stilman, Boris; Yakhnis, Vladimir; Umanskiy, Oleg
2003-09-01
We investigate the technical feasibility of implementing LG-ANALYST, a new software tool based on the Linguistic Geometry (LG) approach. The tool will be capable of modeling and providing solutions to Air Force related battlefield problems and of conducting multiple experiments to verify the quality of the solutions it generates. LG-ANALYST will support generation of the Fast Master Air Attack Plan (MAAP) with subsequent conversion into Air Tasking Order (ATO). An Air Force mission is modeled employing abstract board games (ABG). Such a mission may include, for example, an aircraft strike package moving to a target area with the opposing side having ground-to-air missiles, anti-aircraft batteries, fighter wings, and radars. The corresponding abstract board captures 3D air space, terrain, the aircraft trajectories, positions of the batteries, strategic features of the terrain, such as bridges, and their status, radars and illuminated space, etc. Various animated views are provided by LG-ANALYST including a 3D view for realistic representation of the battlespace and a 2D view for ease of analysis and control. LG-ANALYST will allow a user to model full scale intelligent enemy, plan in advance, re-plan and control in real time Blue and Red forces by generating optimal (or near-optimal) strategies for all sides of a conflict.
Transformations in dreaming and characters in the psychoanalytic field.
Ferro, Antonino
2009-04-01
Having reviewed certain similarities and differences between the various psychoanalytic models (historical reconstruction/development of the container and of the mind's metabolic and transformational function; the significance to be attributed to dream-type material; reality gradients of narrations; tolerability of truth/lies as polar opposites; and the form in which characters are understood in a psychoanalytic session), the author uses clinical material to demonstrate his conception of a session as a virtual reality in which the central operation is transformation in dreaming (de-construction, de-concretization, and re-dreaming), accompanied in particular by the development of this attitude in both patient and analyst as an antidote to the operations of transformation in hallucinosis that bear witness to the failure of the functions of meaning generation. The theoretical roots of this model are traced in the concept of the field and its developments as a constantly expanding oneiric holographic field; in the developments of Bion's ideas (waking dream thought and its derivatives, and the patient as signaller of the movements of the field); and in the contributions of narratology (narrative transformations and the transformations of characters and screenplays). Stress is also laid on the transition from a psychoanalysis directed predominantly towards contents to a psychoanalysis that emphasizes the development of the instruments for dreaming, feeling, and thinking. An extensive case history and a session reported in its entirety are presented so as to convey a living impression of the ongoing process, in the consulting room, of the unsaturated co-construction of an emotional reality in the throes of continuous transformation. The author also describes the technical implications of this model in terms of forms of interpretation, the countertransference, reveries, and, in particular, how the analyst listens to the patient's communications. The paper ends with an exploration of the concepts of grasping (in the sense of clinging to the known) and casting (in relation to what is as yet undefined but seeking representation and transformation) as a further oscillation of the minds of the analyst and the patient in addition to those familiar from classical psychoanalysis.
Deeny, Sarah R; Steventon, Adam
2015-01-01
Socrates described a group of people chained up inside a cave, who mistook shadows of objects on a wall for reality. This allegory comes to mind when considering ‘routinely collected data’—the massive data sets, generated as part of the routine operation of the modern healthcare service. There is keen interest in routine data and the seemingly comprehensive view of healthcare they offer, and we outline a number of examples in which they were used successfully, including the Birmingham OwnHealth study, in which routine data were used with matched control groups to assess the effect of telephone health coaching on hospital utilisation. Routine data differ from data collected primarily for the purposes of research, and this means that analysts cannot assume that they provide the full or accurate clinical picture, let alone a full description of the health of the population. We show that major methodological challenges in using routine data arise from the difficulty of understanding the gap between patient and their ‘data shadow’. Strategies to overcome this challenge include more extensive data linkage, developing analytical methods and collecting more data on a routine basis, including from the patient while away from the clinic. In addition, creating a learning health system will require greater alignment between the analysis and the decisions that will be taken; between analysts and people interested in quality improvement; and between the analysis undertaken and public attitudes regarding appropriate use of data. PMID:26065466
Identifying Severe Weather Impacts and Damage with Google Earth Engine
NASA Astrophysics Data System (ADS)
Molthan, A.; Burks, J. E.; Bell, J. R.
2015-12-01
Hazards associated with severe convective storms can lead to rapid changes in land surface vegetation. Depending upon the type of vegetation that has been impacted, the effects can be relatively short-lived, such as damage to seasonal crops that are eventually removed by harvest, or longer-lived, such as damage to a stand of trees or an expanse of forest that requires several years to recover. Since many remote sensing imagers provide their highest spatial resolution bands in the red and near-infrared to support monitoring of vegetation, these impacts can be readily identified as short-term, marked decreases in common vegetation indices such as NDVI, along with increases in land surface temperature observed at a reduced spatial resolution. The ability to identify an area of vegetation change is improved by understanding the conditions that are normal for a given time of year and location, along with the typical range of variability in a given parameter. This analysis requires a period of record well beyond the availability of near-real-time data. These activities would typically require an analyst to download large volumes of data from sensors such as NASA's MODIS (aboard Terra and Aqua) or higher-resolution imagers from the Landsat series of satellites. Google's Earth Engine offers a "big data" solution to these challenges by providing a streamlined API and the option to process the full period of record of NASA MODIS and Landsat products through relatively simple Javascript coding. This presentation will highlight efforts to date in using Earth Engine holdings to produce vegetation and land surface temperature anomalies associated with damage to agricultural and other vegetation caused by severe thunderstorms across the Central and Southeastern United States. It will show how large data holdings can be used to map severe weather damage and ascertain longer-term impacts, and will share best practices learned and challenges encountered in applying Earth Engine holdings to the analysis of severe weather damage. Other applications are also demonstrated, such as the use of Earth Engine to prepare pre-event composites that can be used to subjectively identify other severe weather impacts. Future extension to flooding and wildfires is also proposed.
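The anomaly logic behind such a workflow can be summarized independently of the Earth Engine API: compare a post-event vegetation index against a multi-year baseline for the same period and flag pixels well below the typical range. The numpy sketch below illustrates this with synthetic NDVI grids standing in for MODIS or Landsat composites.

```python
# Sketch of the vegetation-anomaly idea behind the Earth Engine workflow:
# compare a post-storm NDVI observation against a multi-year baseline for the
# same period and flag pixels that fall well below the typical range.
# Synthetic arrays stand in for MODIS/Landsat NDVI composites.
import numpy as np

rng = np.random.default_rng(1)
years, rows, cols = 10, 100, 100
baseline_stack = rng.normal(0.7, 0.05, size=(years, rows, cols))   # historical NDVI
post_event = rng.normal(0.7, 0.05, size=(rows, cols))
post_event[40:60, 40:60] -= 0.25                                   # simulated damage swath

baseline_mean = baseline_stack.mean(axis=0)
baseline_std = baseline_stack.std(axis=0)

# Standardized anomaly; strongly negative values indicate likely vegetation damage.
z = (post_event - baseline_mean) / baseline_std
damage_mask = z < -3.0
print("pixels flagged as damaged:", int(damage_mask.sum()))
```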
17 CFR 200.17 - Chief Management Analyst.
Code of Federal Regulations, 2010 CFR
2010-04-01
17 Commodity and Securities Exchanges 2 (2010-04-01), § 200.17 Chief Management Analyst. ORGANIZATION; CONDUCT AND ETHICS; AND INFORMATION AND REQUESTS; Organization and Program Management; General Organization: ...) Organizational structures and delegations of authority; (d) Management information systems and concepts; and (e) ...
17 CFR 200.17 - Chief Management Analyst.
Code of Federal Regulations, 2011 CFR
2011-04-01
17 Commodity and Securities Exchanges 2 (2011-04-01), § 200.17 Chief Management Analyst. Section 200.17, Commodity and Securities Exchanges, SECURITIES AND EXCHANGE COMMISSION; ORGANIZATION; CONDUCT AND ETHICS; AND INFORMATION AND REQUESTS; Organization and Program Management; General Organization...
DOT National Transportation Integrated Search
2016-07-01
To enable implementation of the American Association of State Highway and Transportation Officials (AASHTO) Highway Safety Manual using SafetyAnalyst (an AASHTOWare software product), the Arizona Department of Transportation (ADOT) studied the data assessment ...
Preparing Florida for deployment of SafetyAnalyst for all roads.
DOT National Transportation Integrated Search
2012-05-01
SafetyAnalyst is an advanced software system designed to provide the state and local highway agencies with a comprehensive set of tools to enhance their programming of site-specific highway safety improvements. As one of the 27 states that sponsored ...
Using ArcGIS software in the pre-hospital emergency medical system.
Manole, M; Duma, Odetta; Custură, Maria Alexandra; Petrariu, F D; Manole, Alina
2014-01-01
To measure accessibility to healthcare services in order to reveal their quality and to improve overall coverage, continuity, and other features. We used the ESRI ArcGIS 9.3 software, the Network Analyst function, and data provided by the Ambulance Service of Iasi (A.S.I.) with emergency statistics for the first four months of 2012, processed in Microsoft Office Excel 2010. As examples, we chose "St. Maria" Children's Emergency Hospital and "St. Spiridon" Emergency Hospital. ArcGIS Network Analyst finds the best route to get from one location to another or a route that includes multiple locations. Each route is characterized by three stops. The starting point is always the office of the Ambulance Service of Iasi (A.S.I.), the second stop is at the case address, and the third is at the hospital unit chosen according to the patient's diagnosis and age. The spatial distribution of emergency cases for the first four months of 2012 in these two examples is uneven, with higher concentrations in districts located in two areas of the city. The presented examples highlight the poor coverage of healthcare services for the population of Iasi, Romania, especially in the South-West area, and its vulnerability in emergency situations. Implementing such a broad project would lead to more complex analyses that would improve the situation of pre-hospital emergency medical services, with the final goal of serving the population, improving the quality of healthcare, and developing interdisciplinary relationships.
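The multi-stop routing concept (base, case address, hospital) can be illustrated on a small weighted street graph without the ArcGIS Network Analyst itself. The networkx sketch below uses hypothetical node names and travel times; it shows the chained shortest-path idea, not the actual Iasi network.

```python
# Small sketch of the multi-stop routing idea (ambulance base -> case address ->
# hospital) on a weighted street graph, using networkx rather than the ArcGIS
# Network Analyst itself. Node names and travel times are hypothetical.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("ambulance_base", "junction_a", 4),   # edge weights = travel minutes
    ("junction_a", "case_address", 3),
    ("junction_a", "junction_b", 2),
    ("case_address", "junction_b", 5),
    ("junction_b", "st_spiridon_hospital", 6),
    ("case_address", "st_spiridon_hospital", 9),
])

stops = ["ambulance_base", "case_address", "st_spiridon_hospital"]
route, total = [], 0
for origin, dest in zip(stops, stops[1:]):
    leg = nx.shortest_path(G, origin, dest, weight="weight")
    total += nx.shortest_path_length(G, origin, dest, weight="weight")
    route += leg if not route else leg[1:]   # avoid repeating the shared stop

print("route:", " -> ".join(route))
print("total travel time (min):", total)
```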
Cardoso, Ricardo Lopes; Leite, Rodrigo Oliveira; de Aquino, André Carlos Busanelli
2016-01-01
Previous research supports the view that graphs are relevant decision aids for tasks related to the interpretation of numerical information. Moreover, the literature shows that different types of graphical information can help or harm the accuracy of decision making by accountants and financial analysts. We conducted a 4×2 mixed-design experiment to examine the effects of numerical information disclosure on financial analysts' accuracy, and investigated the role of overconfidence in decision making. Results show that, compared to text, column graphs enhanced accuracy in decision making, followed by line graphs. No difference was found between table and textual disclosure. Overconfidence harmed accuracy, and both genders behaved overconfidently. Additionally, the type of disclosure (text, table, line graph, and column graph) did not affect the overconfidence of individuals, providing evidence that overconfidence is a personal trait. This study makes three contributions. First, it provides evidence from a larger sample (295) of financial analysts, rather than a smaller sample of students, that graphs are relevant decision aids for tasks related to the interpretation of numerical information. Second, it uses text as a baseline comparison to test how different ways of disclosing information (line and column graphs, and tables) can enhance the understandability of information. Third, it brings an internal factor into this process: overconfidence, a personal trait that harms the decision-making process of individuals. At the end of this paper several research paths are highlighted to further study the effect of internal factors (personal traits) on financial analysts' accuracy in decision making regarding numerical information presented in graphical form. In addition, we offer suggestions concerning some practical implications for professional accountants, auditors, financial analysts, and standard setters.
Review of Maintenance and Repair Times for Components in Technological Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
L. C. Cadwallader
2012-11-01
This report is a compilation of some unique component repair time data, and it also presents citations of more extensive reports where lists of repair times can be found. This collection of information should support analysts who seek to quantify the maintainability and availability of high technology and nuclear energy production systems. While there are newer sources of repair time information, most, if not all, of the newer sources are proprietary and cannot be shared. This report offers data that, while older, are openly accessible and can serve as reasonable estimates of repair times, at least for initial studies. Some times were found for maintenance in radiation environments, along with some guidance on multiplicative factors to use to account for work in contamination areas.
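Repair-time estimates of this kind typically feed a steady-state availability calculation, A = MTBF / (MTBF + MTTR). The sketch below shows that arithmetic with illustrative component values (not figures from the report), including a hypothetical multiplicative factor for work in a contamination area.

```python
# Minimal sketch of how repair-time estimates feed an availability calculation:
# steady-state availability A = MTBF / (MTBF + MTTR). Values are illustrative,
# not taken from the report's data tables.
def availability(mtbf_hours, mttr_hours):
    return mtbf_hours / (mtbf_hours + mttr_hours)

components = {
    # name: (mean time between failures, mean time to repair), in hours
    "vacuum_pump": (8760.0, 24.0),
    "power_supply": (17520.0, 8.0),
}

for name, (mtbf, mttr) in components.items():
    print(f"{name}: availability = {availability(mtbf, mttr):.4f}")

# A hypothetical contamination-area work factor (e.g., 1.5x repair time) lowers availability:
print("vacuum_pump with 1.5x repair factor:",
      round(availability(8760.0, 24.0 * 1.5), 4))
```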
Tool for Rapid Analysis of Monte Carlo Simulations
NASA Technical Reports Server (NTRS)
Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.
2013-01-01
Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
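The kind of triage such a tool performs can be sketched as grouping runs by outcome and ranking input parameters by how differently they are distributed in failed versus successful cases. The pandas example below uses an invented Monte Carlo table with hypothetical column names; it is an illustration of the idea, not the tool's algorithm.

```python
# Hypothetical sketch of the kind of triage such a tool performs: group Monte
# Carlo cases by failure and flag the input parameters whose values differ
# most between failed and successful runs. Column names and data are invented.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n = 5000
runs = pd.DataFrame({
    "entry_angle": rng.normal(0.0, 1.0, n),
    "wind_speed": rng.normal(10.0, 3.0, n),
    "sensor_bias": rng.normal(0.0, 0.5, n),
})
# Failures more likely at steep entry angles (synthetic ground truth).
runs["failed"] = (runs["entry_angle"] + rng.normal(0, 0.3, n)) > 1.5

failed = runs[runs["failed"]]
passed = runs[~runs["failed"]]

# Rank parameters by standardized mean difference between failed and passed runs.
scores = ((failed.mean(numeric_only=True) - passed.mean(numeric_only=True))
          / runs.std(numeric_only=True)).drop("failed").abs().sort_values(ascending=False)
print(scores)   # entry_angle should rank first, pointing the analyst to the cause
```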
Annalisa Gnoleba, MSA | Division of Cancer Prevention
Mrs. Annalisa Gnoleba is the Public Health Analyst for the Cancer Prevention Fellowship Program, Division of Cancer Prevention, National Cancer Institute. In this position, Mrs. Gnoleba serves as the analyst for developing and formulating short- and long-range public health program goals, objectives, and policies.
Report To The Secretary Of Defense - Global Logistics Management
2011-07-01
Task Group members listed include Mr. Phillips, Mr. Richard Spencer, and Ms. Leigh Warner; Ms. Catherine Whittington served as the Board's Staff Analyst. The Task Group conducted more than 30 interviews and reviewed DoD Directives and ...
Land-use planning of Volyn region (Ukraine) using Geographic Information Systems (GIS) technologies
NASA Astrophysics Data System (ADS)
Strielko, Irina; Pereira, Paulo
2014-05-01
Land-use development planning is carried out in order to create a favourable environment for human life and for sustainable socioeconomic and spatial development. Landscape planning is an important part of land-use development that aims to meet the fundamental principles of sustainable development. Geographic Information Systems (GIS) are a fundamental tool for better landscape planning at different territorial levels, providing data and maps to support decision making. The objective of this work is to create a spatio-temporal, territorial, and ecological model of development for the Volyn region (Ukraine). It is based on existing spatial raster and vector data and includes the analysis of territory dynamics as well as of the aspects responsible for it. A spatial analyst tool was used to zone the areas according to their environmental components and economic activity. This analysis is fundamental to define the basic parameters of sustainability of the Volyn region. To carry out this analysis, we determined the demographic capacity of the districts and analysed the spatial parameters of land use. On the basis of the existing natural resources, we observed that there is a need for landscape protection and for the integration of more natural areas into the Pan-European Ecological Network. Applying GIS technologies to landscape planning in the Volyn region allowed us to identify natural areas of interest and contributed to better resource management and conflict resolution. Geographic Information Systems will help to formulate and implement landscape policies, reform the existing administrative system of the Volyn region, and contribute to better sustainable development.
Informed consent as a prescription calling for debate between analysts and researchers.
Rodríguez Quiroga de Pereira, Andrea; Messina, Verónica María; Sansalone, Paula Andrea
2012-08-01
This article is a review of the international scientific literature on informed consent and its use in some of the constituent organizations of the International Psychoanalytical Association (IPA). Because psychoanalysis comprises a theory based on practice, the dearth of clinical material for study, training and research purposes is a serious problem for analysts. Supervisions, presentations at scientific societies and congresses, publications and teaching material involve patients to an extent that goes beyond the work done in their sessions. Should consent be requested in these cases? This contribution addresses controversial and long-standing issues such as informed consent and confidentiality, audio recording of treatments, knowledge production, the ambivalence of participating subjects over time and the perspective of analysts and patients respectively. The authors consider the various alternative approaches available for the handling of these ethical dilemmas without losing sight of the patient's dignity and personal rights, while also taking account of the position of the analyst. Copyright © 2012 Institute of Psychoanalysis.
Internalization, separation-individuation, and the nature of therapeutic action.
Blatt, S J; Behrends, R S
1987-01-01
Based on the assumption that the mutative factors that facilitate growth in psychoanalysis involve the same fundamental mechanisms that lead to psychological growth in normal development, this paper considers the constant oscillation between gratification and deprivation leading to internalization as the central therapeutic mechanism of the psychoanalytic process. Patients experience the analytic process as a series of gratifying involvements and experienced incompatibilities that facilitate internalization, whereby the patient recovers lost or disrupted regulatory, gratifying interactions with the analyst, which are real or fantasied, by appropriating these interactions, transforming them into their own, enduring, self-generated functions and characteristics. Patients internalize not only the analyst's interpretive activity, but also the analyst's sensitivity, compassion and acceptance, and, in addition, their own activity in relation to the analyst such as free association. Both interpretation and the therapeutic relationship can contain elements of gratifying involvement and experienced incompatibility that lead to internalization and therefore both can be mutative factors in the therapeutic process.
Thermal Hardware for the Thermal Analyst
NASA Technical Reports Server (NTRS)
Steinfeld, David
2015-01-01
The presentation will be given at the 26th Annual Thermal Fluids Analysis Workshop (TFAWS 2015) hosted by the Goddard Space Flight Center (GSFC) Thermal Engineering Branch (Code 545). NCTS 21070-1. Most thermal analysts do not have a good background in the hardware that thermally controls the spacecraft they design. SINDA and Thermal Desktop models are nice, but knowing how they apply to the actual thermal hardware (heaters, thermostats, thermistors, MLI blanketing, optical coatings, etc.) is just as important. The course will delve into the thermal hardware and its application techniques on actual spacecraft. Knowledge of how thermal hardware is used and applied will make a thermal analyst a better engineer.
Putnam, Robert F; Kincaid, Donald
2015-05-01
Horner and Sugai (2015) recently wrote a manuscript providing an overview of school-wide positive behavioral interventions and supports (PBIS) and why it is an example of applied behavior analysis at the scale of social importance. This paper describes why school-wide PBIS is important to behavior analysts, how it helps promote applied behavior analysis in schools and other organizations, and how behavior analysts can use this framework to assist them in the promotion and implementation of applied behavior analysis at both the school and organizational level and the classroom and individual level.
On dreaming one's patient: reflections on an aspect of countertransference dreams.
Brown, Lawrence J
2007-07-01
This paper explores the phenomenon of the countertransference dream. Until very recently, such dreams have tended to be seen as reflecting either unanalyzed difficulties in the analyst or unexamined conflicts in the analytic relationship. While the analyst's dream of his/her patient may represent such problems, the author argues that such dreams may also indicate the ways in which the analyst comes to know the patient on a deep, unconscious level by processing the patient's communicative projective identifications. Two extended clinical examples of the author's countertransference dreams are offered. The author also discusses the use of countertransference dreams in psychoanalytic supervision.
3D Simulation of External Flooding Events for the RISMC Pathway
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad
2015-09-01
Incorporating 3D simulations as part of the Risk-Informed Safety Margin Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; however, these can be analyzed with existing and validated simulated-physics toolkits. In this report, we describe these approaches specific to flooding-based analysis using an approach called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis adds a spatial/visual aspect to the design, improves the realism of results, and can provide visual understanding to validate the analysis of flooding.
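The particle method at the heart of this approach can be illustrated with a minimal density estimate: each particle's density is a kernel-weighted sum over its neighbors. The sketch below implements the standard 2D cubic spline kernel on a small block of particles; it is a teaching example, not the simulation toolkit used in the report.

```python
# Minimal 2D Smoothed Particle Hydrodynamics (SPH) density estimate with a cubic
# spline kernel, to illustrate the particle method underlying such flooding
# simulations. This is a teaching sketch, not the toolkit used in the report.
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 2D cubic spline kernel W(r, h) (Monaghan form)."""
    sigma = 10.0 / (7.0 * np.pi * h**2)
    q = r / h
    w = np.zeros_like(q)
    inner = q < 1.0
    outer = (q >= 1.0) & (q < 2.0)
    w[inner] = 1.0 - 1.5 * q[inner]**2 + 0.75 * q[inner]**3
    w[outer] = 0.25 * (2.0 - q[outer])**3
    return sigma * w

def sph_density(positions, masses, h):
    """rho_i = sum_j m_j W(|r_i - r_j|, h) over all particles j."""
    diffs = positions[:, None, :] - positions[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    return (masses[None, :] * cubic_spline_kernel(dists, h)).sum(axis=1)

# Regular block of "water" particles; density should be roughly uniform inside.
xs, ys = np.meshgrid(np.linspace(0, 1, 20), np.linspace(0, 1, 20))
positions = np.column_stack([xs.ravel(), ys.ravel()])
spacing = 1.0 / 19.0
masses = np.full(len(positions), 1000.0 * spacing**2)   # ~1000 kg/m^3 water column
rho = sph_density(positions, masses, h=1.3 * spacing)

print("interior density estimate:", rho.reshape(20, 20)[10, 10])
```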
Bayesian geostatistics in health cartography: the perspective of malaria.
Patil, Anand P; Gething, Peter W; Piel, Frédéric B; Hay, Simon I
2011-06-01
Maps of parasite prevalences and other aspects of infectious diseases that vary in space are widely used in parasitology. However, spatial parasitological datasets rarely, if ever, have sufficient coverage to allow exact determination of such maps. Bayesian geostatistics (BG) is a method for finding a large sample of maps that can explain a dataset, in which maps that do a better job of explaining the data are more likely to be represented. This sample represents the knowledge that the analyst has gained from the data about the unknown true map. BG provides a conceptually simple way to convert these samples to predictions of features of the unknown map, for example regional averages. These predictions account for each map in the sample, yielding an appropriate level of predictive precision.
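Converting a sample of candidate maps into a prediction of a regional average is a matter of summarizing the per-sample regional means. The sketch below illustrates this with synthetic map samples standing in for BG posterior draws; the region and grid are arbitrary.

```python
# Small sketch of turning a sample of candidate prevalence maps (as BG produces)
# into a prediction of a regional average with a credible interval. The map
# samples here are synthetic placeholders, not output of a fitted BG model.
import numpy as np

rng = np.random.default_rng(5)
n_samples, rows, cols = 500, 50, 50

# Pretend these are posterior samples of a prevalence surface on a grid.
map_samples = rng.beta(2.0, 8.0, size=(n_samples, rows, cols))

# Region of interest: a rectangular block of grid cells.
region = (slice(10, 30), slice(20, 40))
regional_means = map_samples[:, region[0], region[1]].mean(axis=(1, 2))

lower, upper = np.percentile(regional_means, [2.5, 97.5])
print(f"regional average prevalence: {regional_means.mean():.3f}")
print(f"95% credible interval: [{lower:.3f}, {upper:.3f}]")
```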
Citation Analysis and Discourse Analysis Revisited
ERIC Educational Resources Information Center
White, Howard D.
2004-01-01
John Swales's 1986 article "Citation analysis and discourse analysis" was written by a discourse analyst to introduce citation research from other fields, mainly sociology of science, to his own discipline. Here, I introduce applied linguists and discourse analysts to citation studies from information science, a complementary tradition not…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-03
... (which is having at least three years prior experience within the immediately preceding six years involving securities or financial analysis) and pass the Supervisory Analyst (Series 16) qualification examination. Rather than passing the entire Supervisory Analyst qualification examination, such person may...
Directed area search using socio-biological vision algorithms and cognitive Bayesian reasoning
NASA Astrophysics Data System (ADS)
Medasani, S.; Owechko, Y.; Allen, D.; Lu, T. C.; Khosla, D.
2010-04-01
Volitional search systems that assist the analyst by searching for specific targets or objects such as vehicles, factories, airports, etc in wide area overhead imagery need to overcome multiple problems present in current manual and automatic approaches. These problems include finding targets hidden in terabytes of information, relatively few pixels on targets, long intervals between interesting regions, time consuming analysis requiring many analysts, no a priori representative examples or templates of interest, detecting multiple classes of objects, and the need for very high detection rates and very low false alarm rates. This paper describes a conceptual analyst-centric framework that utilizes existing technology modules to search and locate occurrences of targets of interest (e.g., buildings, mobile targets of military significance, factories, nuclear plants, etc.), from video imagery of large areas. Our framework takes simple queries from the analyst and finds the queried targets with relatively minimum interaction from the analyst. It uses a hybrid approach that combines biologically inspired bottom up attention, socio-biologically inspired object recognition for volitionally recognizing targets, and hierarchical Bayesian networks for modeling and representing the domain knowledge. This approach has the benefits of high accuracy, low false alarm rate and can handle both low-level visual information and high-level domain knowledge in a single framework. Such a system would be of immense help for search and rescue efforts, intelligence gathering, change detection systems, and other surveillance systems.
Counter-responses as organizers in adolescent analysis and therapy.
Richmond, M Barrie
2004-01-01
The author introduces Counter-response as a phenomenological term to replace theory-burdened terms like counter-transference, counter-identification, and counter-resistance. He discusses the analyst's use of self (drawing on the comparison with Winnicott's use of the object) in processing the expectable destabilizing counter-reactions that occur in working therapeutically with disturbed adolescents and their parents. Further, he discusses the counter-reaction to the patient's narrative and acting-out, and how re-enactments can serve as an organizer for understanding the patient's inner life when the analyst formulates his/her counter-response. Emphasis is placed on the therapist forming his or her own narrative with the adolescent that takes into account the evoked counter-reaction. For this purpose, the author recommends the use of a combined counter-response and metaphor-orienting perspective to acknowledge and work with the denial, illusions, reversal of perspective, and catastrophic anxieties experienced with these adolescents. The counter-response perspective permits the emergence of the disturbed adolescent's novel narrative; however, since these experiences can be destabilizing or disruptive, the author also recommends the use of a personal metaphor to anticipate the reluctance to examine, process, and formulate the analyst's dysphoric counter-reaction. With the use of the counter-response, the analyst's therapeutic ideal is to achieve a more optimal balance between using accepted narrative theories and exploring novel enactment experiences. His swimming metaphor stratagem is designed to keep the analyst in these difficult encounters.
How star women build portable skills.
Groysberg, Boris
2008-02-01
In May 2004, with the war for talent in high gear, Groysberg and colleagues from Harvard Business School wrote in these pages about the risks of hiring star performers away from competitors. After studying the fortunes of more than 1,000 star stock analysts, they found that when a star switched companies, not only did his performance plunge, so did the effectiveness of the group he joined and the market value of his new company. But further analysis of the data reveals that it's not that simple. In fact, one group of analysts reliably maintained star rankings even after changing employers: women. Unlike their male counterparts, female stars who switched firms performed just as well, in the aggregate, as those who stayed put. The 189 star women in the sample (18% of the star analysts studied) achieved a higher rank after switching firms than the men did. Why the discrepancy? First, says the author, the best female analysts appear to have built their franchises on portable, external relationships with clients and the companies they covered, rather than on relationships rooted within their firms. By contrast, male analysts built up greater firm- and team-specific human capital by investing more in the internal networks and unique capabilities and resources of their own companies. Second, women took greater care when assessing a prospective new employer. In this article, Groysberg explores the reasons behind the star women's portable performance.
MetaboAnalyst 3.0--making metabolomics more meaningful.
Xia, Jianguo; Sinelnikov, Igor V; Han, Beomsoo; Wishart, David S
2015-07-01
MetaboAnalyst (www.metaboanalyst.ca) is a web server designed to permit comprehensive metabolomic data analysis, visualization and interpretation. It supports a wide range of complex statistical calculations and high quality graphical rendering functions that require significant computational resources. First introduced in 2009, MetaboAnalyst has experienced more than a 50X growth in user traffic (>50 000 jobs processed each month). In order to keep up with the rapidly increasing computational demands and a growing number of requests to support translational and systems biology applications, we performed a substantial rewrite and major feature upgrade of the server. The result is MetaboAnalyst 3.0. By completely re-implementing the MetaboAnalyst suite using the latest web framework technologies, we have been able to substantially improve its performance, capacity and user interactivity. Three new modules have also been added, including: (i) a module for biomarker analysis based on the calculation of receiver operating characteristic curves; (ii) a module for sample size estimation and power analysis for improved planning of metabolomics studies and (iii) a module to support integrative pathway analysis for both genes and metabolites. In addition, popular features found in existing modules have been significantly enhanced by upgrading the graphical output, expanding the compound libraries and by adding support for more diverse organisms. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
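As a rough, illustrative sketch of what an ROC-based biomarker module computes (written here in Python with scikit-learn; MetaboAnalyst itself is a web server, so this is not its code), candidate features can be ranked by area under the ROC curve. The data below are synthetic.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Illustrative only: rank candidate metabolite features by the area under the
# ROC curve, the core computation behind ROC-based biomarker screening.
rng = np.random.default_rng(1)
n_per_group, n_features = 40, 5
X = rng.normal(size=(2 * n_per_group, n_features))   # metabolite levels (synthetic)
y = np.repeat([0, 1], n_per_group)                   # control / case labels
X[y == 1, 0] += 1.0                                  # make feature 0 weakly discriminative

aucs = [roc_auc_score(y, X[:, j]) for j in range(n_features)]
best = int(np.argmax(aucs))
fpr, tpr, _ = roc_curve(y, X[:, best])               # ROC curve for the top-ranked feature
print("feature AUCs:", np.round(aucs, 3), "best feature index:", best)
```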
Human-machine interaction to disambiguate entities in unstructured text and structured datasets
NASA Astrophysics Data System (ADS)
Ward, Kevin; Davenport, Jack
2017-05-01
Creating entity network graphs is a manual, time-consuming process for an intelligence analyst. Beyond the traditional big data problems of information overload, individuals are often referred to by multiple names and shifting titles as they advance in their organizations over time, which quickly makes simple string or phonetic alignment methods for entities insufficient. Conversely, automated methods for relationship extraction and entity disambiguation typically produce questionable results with no way for users to vet results, correct mistakes or influence the algorithm's future results. We present an entity disambiguation tool, DRADIS, which aims to bridge the gap between human-centric and machine-centric methods. DRADIS automatically extracts entities from multi-source datasets and models them as a complex set of attributes and relationships. Entities are disambiguated across the corpus using a hierarchical model executed in Spark, allowing it to scale to operational-sized data. Resolution results are presented to the analyst complete with sourcing information for each mention and relationship, allowing analysts to quickly vet the correctness of results as well as correct mistakes. Corrected results are used by the system to refine the underlying model, allowing analysts to optimize the general model to better deal with their operational data. Providing analysts with the ability to validate and correct the model to produce a system they can trust enables them to better focus their time on producing higher quality analysis products.
The analyst's participation in the analytic process.
Levine, H B
1994-08-01
The analyst's moment-to-moment participation in the analytic process is inevitably and simultaneously determined by at least three sets of considerations. These are: (1) the application of proper analytic technique; (2) the analyst's personally-motivated responses to the patient and/or the analysis; (3) the analyst's use of him or herself to actualise, via fantasy, feeling or action, some aspect of the patient's conflicts, fantasies or internal object relationships. This formulation has relevance to our view of actualisation and enactment in the analytic process and to our understanding of a series of related issues that are fundamental to our theory of technique. These include the dialectical relationships that exist between insight and action, interpretation and suggestion, empathy and countertransference, and abstinence and gratification. In raising these issues, I do not seek to encourage or endorse wild analysis, the attempt to supply patients with 'corrective emotional experiences' or a rationalisation for acting out one's countertransferences. Rather, it is my hope that if we can better appreciate and describe these important dimensions of the analytic encounter, we can be better prepared to recognise, understand and interpret the continual streams of actualisation and enactment that are embedded in the analytic process. A deeper appreciation of the nature of the analyst's participation in the analytic process and the dimensions of the analytic process to which that participation gives rise may offer us a limited, although important, safeguard against analytic impasse.
NASA Astrophysics Data System (ADS)
Burns, R. G.; Meyer, R. W.; Cornwell, K.
2003-12-01
In-basin statistical relations allow for development of regional flood frequency and magnitude equations in the Cosumnes River and Mokelumne River drainage basins. Current equations were derived from data collected through 1975 and do not reflect newer data with some significant flooding. Physical basin characteristics (area, mean basin elevation, slope of longest reach, and mean annual precipitation) were correlated against predicted flood discharges for each of the 5, 10, 25, 50, 100, 200, and 500-year recurrence intervals in a multivariate analysis. Predicted maximum instantaneous flood discharges were determined using the PEAKFQ program with default settings for 24 stream gages within the study area presumed not affected by flow management practices. For numerical comparisons, GIS-based methods using Spatial Analyst and the Arc Hydro Tools extension were applied to derive physical basin characteristics as predictor variables from a 30 m digital elevation model (DEM) and a mean annual precipitation raster (PRISM). In a bivariate analysis, examination of Pearson correlation coefficients, the F-statistic, and t and p thresholds shows good correlation between area and flood discharges. Similar analyses show poor correlation of mean basin elevation, slope, and precipitation with flood discharge. The bivariate analysis suggests slope may not be an appropriate predictor term for use in the multivariate analysis. Precipitation and elevation correlate very well, demonstrating possible orographic effects. From the multivariate analysis, less than 6% of the variability in the correlation is unexplained for flood recurrences up to 25 years. Longer term predictions up to 500 years accrue greater uncertainty, with as much as 15% of the variability in the correlation left unexplained.
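A minimal sketch (not the authors' code) of the kind of log-log multivariate regression used to relate flood quantiles to basin characteristics is shown below; the basin values are invented for illustration.

```python
import numpy as np

# Illustrative log-log regression of a flood quantile (e.g., the 100-year peak
# discharge) on basin characteristics; all values are made up for the sketch.
area_km2  = np.array([120.0, 260.0, 540.0, 35.0, 810.0, 95.0])       # drainage area
precip_mm = np.array([900.0, 1100.0, 1300.0, 700.0, 1500.0, 850.0])  # mean annual precipitation
q100_cms  = np.array([310.0, 560.0, 980.0, 120.0, 1500.0, 260.0])    # PEAKFQ-style 100-yr estimates

# Fit log10(Q100) = b0 + b1*log10(area) + b2*log10(precip) by least squares.
X = np.column_stack([np.ones_like(area_km2), np.log10(area_km2), np.log10(precip_mm)])
coef, *_ = np.linalg.lstsq(X, np.log10(q100_cms), rcond=None)
resid = np.log10(q100_cms) - X @ coef
r2 = 1.0 - np.sum(resid ** 2) / np.sum((np.log10(q100_cms) - np.log10(q100_cms).mean()) ** 2)
print("coefficients (intercept, area, precip):", np.round(coef, 3), " R^2 =", round(r2, 3))
```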
Military Curricula for Vocational & Technical Education. Programmer/Analyst 4-4.
ERIC Educational Resources Information Center
Department of the Army, Washington, DC.
This program of instruction and various instructional materials for a secondary-postsecondary level course for programmer/analysts is one of a number of military-developed curriculum packages selected for adaptation to vocational instruction and curriculum development in a civilian setting. The eight-week, three-section course is designed to…
The Cluster Sensitivity Index: A Basic Measure of Classification Robustness
ERIC Educational Resources Information Center
Hom, Willard C.
2010-01-01
Analysts of institutional performance have occasionally used a peer grouping approach in which they compared institutions only to other institutions with similar characteristics. Because analysts historically have used cluster analysis to define peer groups (i.e., the group of comparable institutions), the author proposes and demonstrates with…
40 CFR Appendix A to Part 63 - Test Methods
Code of Federal Regulations, 2010 CFR
2010-07-01
... components by a different analyst). 3.3Surrogate Reference Materials. The analyst may use surrogate compounds... the variance of the proposed method is significantly different from that of the validated method by... variables can be determined in eight experiments rather than 128 (W.J. Youden, Statistical Manual of the...
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. Center on Education and Training for Employment.
This publication contains 25 subjects appropriate for use in a competency list for the occupation of computer programmer/analyst, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 25 units are as…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-29
... DEPARTMENT OF LABOR Employment and Training Administration [TA-W-81,827] Verizon Business Networks... Verizon Business Network Services, Inc., Senior Analyst-Service Program Delivery, Hilliard, Ohio (subject.... Specifically, the worker group supplies service program delivery services. At the request of the State of Ohio...
Staff - Patricia E. Gallagher | Alaska Division of Geological & Geophysical Surveys
Position: GIS Analyst. Currently working toward becoming a certified GIS professional. Professional experience: 2013-present, Cartographer/GIS Analyst, State of Alaska, Division of Geological & Geophysical Surveys.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-20
..., Multax, Inconen, CTS, Hi-Tec, Woods, Ciber, Kelly Services, Analysts International Corp, Comsys, Filter..., Comsys, Filter LLC, Excell, Entegee, Chipton-Ross, Ian Martin, Can-Tech, It Services, IDEX Solutions (NW..., Kelly Services, Analysts International Corp, Comsys, Filter LLC, Excell, Entegee, Chipton-Ross, Ian...
Driver Improvement Analyst; Basic Training Program. Student Study Guide.
ERIC Educational Resources Information Center
Hale, Allen
As part of the training package for Driver Improvement Analysts, this study guide is designed to serve as the basic reference source for the students/trainees. It reinforces and supplements subject material presented in the Instructor's Lesson Plans. Subjects covered are objectives and requirements, psychology of driving, characteristics of the…
From Franchise to Programming: Jobs in Cable Television.
ERIC Educational Resources Information Center
Stanton, Michael
1985-01-01
This article takes a look at some of the key jobs at every level of the cable industry. It discusses winning a franchise, building and running the system, and programming and production. Job descriptions include engineer, market analyst, programmers, financial analysts, strand mappers, customer service representatives, access coordinator, and studio…
21 CFR 1304.23 - Records for chemical analysts.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 9 2013-04-01 2013-04-01 false Records for chemical analysts. 1304.23 Section 1304.23 Food and Drugs DRUG ENFORCEMENT ADMINISTRATION, DEPARTMENT OF JUSTICE RECORDS AND REPORTS OF... authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...
2016-01-07
news. Both of these resemble typical activities of intelligence analysts in OSINT processing and production applications. We assessed two task...intelligence analysts in a number of OSINT processing and production applications. (5) Summary of the most important results In both settings
DOE Office of Scientific and Technical Information (OSTI.GOV)
BERG, MICHAEL; RILEY, MARSHALL
System assessments typically yield large quantities of data from disparate sources for an analyst to scrutinize for issues. Netmeld is used to parse input from different file formats, store the data in a common format, allow users to easily query it, and enable analysts to tie different analysis tools together using a common back-end.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-28
... Patricia Newman, Program Analyst, Office of Science Policy, National Center for Research Resources, 6701...: December 20, 2010. Meryl Sufian, Supervisory Health Science Policy Analyst, Office of Science Policy, NCRR... Evaluation of the Clinical and Translational Science Awards (CTSA) Initiative SUMMARY: Under the provisions...
28 CFR 16.96 - Exemption of Federal Bureau of Investigation Systems-limited access.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) would limit the ability of trained investigators and intelligence analysts to exercise their judgment in reporting on investigations and impede the development of criminal intelligence necessary for effective law... subsection (e)(5) would limit the ability of trained investigators and intelligence analysts to exercise...
ERIC Educational Resources Information Center
Carr, James E.; Briggs, Adam M.
2011-01-01
An annotated bibliography that summarizes the "On Terms" articles on behavior-analytic terminology from "The Behavior Analyst" is provided. Thirty-five articles published between 1979 and 2010 were identified, annotated, and classified using common behavior analysis course content frameworks. (Contains 1 table.)
NASA Astrophysics Data System (ADS)
Hosking, Michael Robert
This dissertation improves an analyst's use of simulation by offering improvements in the utilization of kriging metamodels. There are three main contributions. First, an analysis is performed of what comprises good experimental designs for practical (non-toy) problems when using a kriging metamodel. Second is an explanation and demonstration of how reduced rank decompositions can improve the performance of kriging, now referred to as reduced rank kriging. Third is the development of an extension of reduced rank kriging which solves an open question regarding the usage of reduced rank kriging in practice. This extension is called omni-rank kriging. Finally, these results are demonstrated on two case studies. The first contribution focuses on experimental design. Sequential designs are generally known to be more efficient than "one shot" designs. However, sequential designs require some sort of pilot design on which the sequential stage can be based. We seek to find good initial designs for these pilot studies, as well as designs which will be effective if there is no following sequential stage. We test a wide variety of designs over a small set of test-bed problems. Our findings indicate that analysts should take advantage of any prior information they have about their problem's shape and/or their goals in metamodeling. In the event of a total lack of information we find that Latin hypercube designs are robust default choices. Our work is most distinguished by its attention to the higher levels of dimensionality. The second contribution introduces and explains an alternative method for kriging when there is noise in the data, which we call reduced rank kriging. Reduced rank kriging is based on using a reduced rank decomposition which artificially smooths the kriging weights, similar to a nugget effect. Our primary focus will be showing how the reduced rank decomposition propagates through kriging empirically. In addition, we show further evidence for our explanation through tests of reduced rank kriging's performance over different situations. In total, reduced rank kriging is a useful tool for simulation metamodeling. For the third contribution we will answer the question of how to find the best rank for reduced rank kriging. We do this by creating an alternative method which does not need to search for a particular rank. Instead it uses all potential ranks; we call this approach omni-rank kriging. This modification realizes the potential gains from reduced rank kriging and provides a workable methodology for simulation metamodeling. Finally, we will demonstrate the use and value of these developments on two case studies, a clinic operation problem and a location problem. These cases will validate the value of this research. Simulation metamodeling always attempts to extract maximum information from limited data. Each one of these contributions will allow analysts to make better use of their constrained computational budgets.
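For orientation, the following is a minimal sketch of baseline kriging prediction with a nugget term, which is the smoothing role the dissertation attributes to the reduced rank decomposition; it is not the reduced rank or omni-rank method itself, and the covariance parameters and data are illustrative.

```python
import numpy as np

# Baseline simple-kriging sketch with a Gaussian covariance and a nugget term;
# this is not the reduced rank / omni-rank method described above.
def gauss_cov(a, b, sigma2=1.0, length=0.3):
    d = np.abs(a[:, None] - b[None, :])
    return sigma2 * np.exp(-(d / length) ** 2)

rng = np.random.default_rng(2)
x_obs = np.sort(rng.uniform(0.0, 1.0, 15))                              # simulation design points (1-D)
y_obs = np.sin(2 * np.pi * x_obs) + 0.1 * rng.normal(size=x_obs.size)   # noisy simulation responses
x_new = np.linspace(0.0, 1.0, 5)                                        # prediction points

nugget = 0.01                                                # plays the smoothing role discussed above
K = gauss_cov(x_obs, x_obs) + nugget * np.eye(x_obs.size)    # covariance among observations
k_star = gauss_cov(x_new, x_obs)                             # covariance to prediction points
y_pred = k_star @ np.linalg.solve(K, y_obs)                  # kriging predictor (zero prior mean)
print(np.round(np.column_stack([x_new, y_pred]), 3))
```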
ListingAnalyst: A program for analyzing the main output file from MODFLOW
Winston, Richard B.; Paulinski, Scott
2014-01-01
ListingAnalyst is a Windows® program for viewing the main output file from MODFLOW-2005, MODFLOW-NWT, or MODFLOW-LGR. It organizes and displays large files quickly without using excessive memory. The sections and subsections of the file are displayed in a tree-view control, which allows the user to navigate quickly to desired locations in the files. ListingAnalyst gathers error and warning messages scattered throughout the main output file and displays them all together in an error and a warning tab. A grid view displays tables in a readable format and allows the user to copy the table into a spreadsheet. The user can also search the file for terms of interest.
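A minimal, stand-alone sketch (independent of ListingAnalyst, which is a Windows program) of gathering scattered error and warning lines from a MODFLOW listing file might look like the following; the file name model.lst is hypothetical.

```python
# Minimal, independent sketch of collecting scattered error/warning lines from a
# MODFLOW listing file, similar in spirit to ListingAnalyst's error and warning
# tabs. The file name "model.lst" is hypothetical.
def collect_messages(path):
    errors, warnings = [], []
    with open(path, "r", errors="replace") as f:
        for lineno, line in enumerate(f, start=1):
            text = line.strip()
            upper = text.upper()
            if "ERROR" in upper:
                errors.append((lineno, text))
            elif "WARNING" in upper:
                warnings.append((lineno, text))
    return errors, warnings

if __name__ == "__main__":
    errs, warns = collect_messages("model.lst")
    print(f"{len(errs)} error lines, {len(warns)} warning lines")
    for lineno, text in errs[:10]:
        print(f"line {lineno}: {text}")
```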
Analytic process and dreaming about analysis.
Sirois, François
2016-12-01
Dreams about the analytic session feature a manifest content in which the analytic setting is subject to distortion while the analyst appears undisguised. Such dreams are a consistent yet infrequent occurrence in most analyses. Their specificity consists in never reproducing the material conditions of the analysis as such. This paper puts forward the following hypothesis: dreams about the session relate to some aspects of the analyst's activity. In this sense, such dreams are indicative of the transference neurosis, prefiguring transference resistances to the analytic elaboration of key conflicts. The parts taken by the patient and by the analyst are discussed in terms of their ability to signal a deepening of the analysis. Copyright © 2016 Institute of Psychoanalysis.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-21
... Shuler and Shane Subler, International Trade Compliance Analysts, to Susan Kuhbach, Director, Office 1... Compliance Analysts, Office 1, to Susan H. Kuhbach, Office Director, AD/CVD Operations, Office 1, entitled..., Office Director, AD/CVD Operations, Office 1, entitled ``Verification Report: Tianjin Pipe (Group...
Team Collaboration: The Use of Behavior Principles for Serving Students with ASD
ERIC Educational Resources Information Center
Donaldson, Amy L.; Stahmer, Aubyn C.
2014-01-01
Purpose: Speech-language pathologists (SLPs) and behavior analysts are key members of school-based teams that serve children with autism spectrum disorders (ASD). Behavior analysts approach assessment and intervention through the lens of applied behavior analysis (ABA). ABA-based interventions have been found effective for targeting skills across…
Short term evaluation of harvesting systems for ecosystem management
Michael D. Erickson; Penn Peters; Curt Hassler
1995-01-01
Continuous time/motion studies have traditionally been the basis for productivity estimates of timber harvesting systems. The detailed data from such studies permits the researcher or analyst to develop mathematical relationships based on stand, system, and stem attributes for describing machine cycle times. The resulting equation(s) allow the analyst to estimate...
The Pentagon's Military Analyst Program
ERIC Educational Resources Information Center
Valeri, Andy
2014-01-01
This article provides an investigatory overview of the Pentagon's military analyst program, what it is, how it was implemented, and how it constitutes a form of propaganda. A technical analysis of the program is applied using the theoretical framework of the propaganda model first developed by Noam Chomsky and Edward S. Herman. Definitions…
ERIC Educational Resources Information Center
Baker, Jason R.
2017-01-01
The goals of the present action research study were to understand intelligence analysts' perceptions of weapon systems visual recognition ("vis-recce") training and to determine the impact of a Critical Thinking Training (CTT) Seminar and Formative Assessments on unit-level intelligence analysts' "vis-recce" performance at a…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-02
... committee uses third-party analyst research and a proprietary fundamental process to make allocation... investment process: Step 1: The Sub-Adviser's use of third-party research consists of analyzing the consensus... analyst research and a proprietary fundamental process to make allocation decisions. Changes to the Fund's...
The Allocation of Visual Attention in Multimedia Search Interfaces
ERIC Educational Resources Information Center
Hughes, Edith Allen
2017-01-01
Multimedia analysts are challenged by the massive numbers of unconstrained video clips generated daily. Such clips can include any possible scene and events, and generally have limited quality control. Analysts who must work with such data are overwhelmed by its volume and lack of computational tools to probe it effectively. Even with advances…
Characteristics of the Navy Laboratory Warfare Center Technical Workforce
2013-09-29
Table excerpt (occupational series): Mathematics and Information Science (M&IS): Actuarial Science (1510), Computer Science (1550), Gen. Math & Statistics (1501), Mathematics (1520), Operations Research…; related job titles include network systems and data communication analysts, actuaries, mathematicians, operations research analysts, and statisticians; Social Science (SS)… The workforce was sub-divided into six broad occupational groups: Life Science, Physical Science, Engineering, Mathematics, Computer Science and Information…
The Effect of a Workload-Preview on Task-Prioritization and Task-Performance
ERIC Educational Resources Information Center
Minotra, Dev
2012-01-01
With the increased volume and sophistication of cyber attacks in recent years, maintaining situation awareness and an effective task-prioritization strategy is critical to the work of cybersecurity analysts. However, the high level of mental workload associated with the cybersecurity analyst's task limits their ability to prioritize tasks.…
Improving sensor data analysis through diverse data source integration
NASA Astrophysics Data System (ADS)
Casper, Jennifer; Albuquerque, Ronald; Hyland, Jeremy; Leveille, Peter; Hu, Jing; Cheung, Eddy; Mauer, Dan; Couture, Ronald; Lai, Barry
2009-05-01
Daily sensor data volumes are increasing from gigabytes to multiple terabytes. The manpower and resources needed to analyze the increasing amount of data are not growing at the same rate. Current volumes of diverse data, both live streaming and historical, are not fully analyzed. Analysts are largely left to analyze the individual data sources manually. This is both time consuming and mentally exhausting. Expanding data collections only exacerbate this problem. Improved data management techniques and analysis methods are required to process the increasing volumes of historical and live streaming data sources simultaneously. Improved techniques are needed to reduce an analyst's decision response time and to enable more intelligent and immediate situation awareness. This paper describes the Sensor Data and Analysis Framework (SDAF) system built to provide analysts with the ability to pose integrated queries on diverse live and historical data sources, and to plug in needed algorithms for upstream processing and filtering. The SDAF system was inspired by input and feedback from field analysts and experts. This paper presents SDAF's capabilities, implementation, and the reasoning behind implementation decisions. Finally, lessons learned from preliminary tests and deployments are captured for future work.
Embodying analysis: the body and the therapeutic process.
Martini, Salvatore
2016-02-01
This paper considers the transfer of somatic effects from patient to analyst, which gives rise to embodied countertransference, functioning as an organ of primitive communication. By means of processes of projective identification, the analyst experiences somatic disturbances within himself or herself that are connected to the split-off complexes of the analysand. The analyst's own attempt at mind-body integration ushers the patient towards a progressive understanding and acceptance of his or her inner suffering. Such experiences of psychic contagion between patient and analyst are related to Jung's 'psychology of the transference' and the idea of the 'subtle body' as an unconscious shared area. The re-attribution of meaning to pre-verbal psychic experiences within the 'embodied reverie' of the analyst enables the analytic dyad to reach the archetypal energies and structuring power of the collective unconscious. A detailed case example is presented of how the emergence of the vitalizing connection between the psyche and the soma, severed through traumatic early relations with parents or carers, allows the instinctual impulse of the Self to manifest, thereby reactivating the process of individuation. © 2016, The Society of Analytical Psychology.
Brown, Lawrence J; Miller, Martin
2002-08-01
The use of the psychoanalyst's subjective reactions as a tool to better understand his/her patient has been a central feature of clinical thinking in recent decades. While there has been much discussion and debate about the analyst's use of countertransference in individual psychoanalysis, including possible disclosure of his/her feelings to the patient, the literature on supervision has been slower to consider such matters. The attention to parallel processes in supervision has been helpful in appreciating the impact of affects arising in either the analyst/patient or the supervisor/analyst dyads upon the analytic treatment and its supervision. This contribution addresses the ways in which overlapping aspects of the personalities of the supervisor, analyst and patient may intersect and create resistances in the treatment. That three-way intersection, described here as the triadic intersubjective matrix, is considered inevitable in all supervised treatments. A clinical example from the termination phase of a supervised analysis of an adolescent is offered to illustrate these points. Finally, the question of self-disclosure as an aspect of the supervisory alliance is also discussed.
Developing an intelligence analysis process through social network analysis
NASA Astrophysics Data System (ADS)
Waskiewicz, Todd; LaMonica, Peter
2008-04-01
Intelligence analysts are tasked with making sense of enormous amounts of data and gaining an awareness of a situation that can be acted upon. This process can be extremely difficult and time consuming. Trying to differentiate between important pieces of information and extraneous data only complicates the problem. When dealing with data containing entities and relationships, social network analysis (SNA) techniques can be employed to make this job easier. Applying network measures to social network graphs can identify the most significant nodes (entities) and edges (relationships) and help the analyst further focus on key areas of concern. Strange developed a model that identifies high value targets such as centers of gravity and critical vulnerabilities. SNA lends itself to the discovery of these high value targets and the Air Force Research Laboratory (AFRL) has investigated several network measures such as centrality, betweenness, and grouping to identify centers of gravity and critical vulnerabilities. Using these network measures, a process for the intelligence analyst has been developed to aid analysts in identifying points of tactical emphasis. Organizational Risk Analyzer (ORA) and Terrorist Modus Operandi Discovery System (TMODS) are the two applications used to compute the network measures and identify the points to be acted upon. Therefore, the result of leveraging social network analysis techniques and applications will provide the analyst and the intelligence community with more focused and concentrated analysis results allowing them to more easily exploit key attributes of a network, thus saving time, money, and manpower.
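A small sketch of the network measures mentioned above, using the networkx library rather than ORA or TMODS, is given below; the toy graph and entity names are invented.

```python
import networkx as nx

# Sketch of the network measures mentioned above (degree centrality and
# betweenness) on a made-up entity/relationship graph.
G = nx.Graph()
G.add_edges_from([
    ("leader", "courier"), ("courier", "cell_a"), ("courier", "cell_b"),
    ("cell_a", "financier"), ("cell_b", "financier"), ("financier", "supplier"),
])

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)

# Rank entities by betweenness to suggest potential critical vulnerabilities.
for node, score in sorted(betweenness.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{node:10s} betweenness={score:.3f} degree={degree[node]:.3f}")
```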
Towards an automated intelligence product generation capability
NASA Astrophysics Data System (ADS)
Smith, Alison M.; Hawes, Timothy W.; Nolan, James J.
2015-05-01
Creating intelligence information products is a time consuming and difficult process for analysts faced with identifying key pieces of information relevant to a complex set of information requirements. Complicating matters, these key pieces of information exist in multiple modalities scattered across data stores, buried in huge volumes of data. This results in the current predicament analysts find themselves in: information retrieval and management consume huge amounts of time that could be better spent performing analysis. The persistent growth in data accumulation rates will only increase the amount of time spent on these tasks without a significant advance in automated solutions for information product generation. We present a product generation tool, Automated PrOduct Generation and Enrichment (APOGEE), which aims to automate the information product creation process in order to shift the bulk of the analysts' effort from data discovery and management to analysis. APOGEE discovers relevant text, imagery, video, and audio for inclusion in information products using semantic and statistical models of unstructured content. APOGEE's mixed-initiative interface, supported by highly responsive backend mechanisms, allows analysts to dynamically control the product generation process, ensuring a maximally relevant result. The combination of these capabilities results in significant reductions in the time it takes analysts to produce information products while helping to increase the overall coverage. Through evaluation with a domain expert, APOGEE has shown the potential to cut the time for product generation by 20x. The result is a flexible end-to-end system that can be rapidly deployed in new operational settings.
Where are you, my beloved? On absence, loss, and the enigma of telepathic dreams.
Eshel, Ofra
2006-12-01
The subject of dream telepathy (especially patients' telepathic dreams) and related phenomena in the psychoanalytic context has been a controversial, disturbing 'foreign body' ever since it was introduced into psychoanalysis by Freud in 1921. Telepathy - suffering (or intense feeling) at a distance (Greek: pathos + tele) - is the transfer or communication of thoughts, impressions and information over distance between two people without the normal operation of the recognized sense organs. The author offers a comprehensive historical review of the psychoanalytic literature on this controversial issue, beginning with Freud's years-long struggles over the possibility of thought-transference and dream telepathy. She then describes her own analytic encounter over the years with five patients' telepathic dreams - dreams involving precise details of the time, place, sensory impressions, and experiential states that the analyst was in at that time, which the patients could not have known through ordinary sensory perception and communication. The author's ensuing explanation combines contributory factors involving the patient, archaic communication and the analyst. Each of these patients, in early childhood, had a mother who was emotionally absent-within-absence, due to the absence of a significant figure in her own life. This primary traumatic loss was imprinted in their nascent selves and inchoate relating to others, with a fixation on a nonverbal, archaic mode of communication. The patient's telepathic dream is formed as a search engine when the analyst is suddenly emotionally absent, in order to find the analyst and thus halt the process of abandonment and prevent collapse into the despair of the early traumatization. Hence, the telepathic dream embodies an enigmatic 'impossible' extreme of patient-analyst deep-level interconnectedness and unconscious communication in the analytic process. This paper is part of the author's endeavour to grasp the true experiential scope and therapeutic significance of this dimension of fundamental patient-analyst interconnectedness.
NASA Astrophysics Data System (ADS)
Wollocko, Arthur; Danczyk, Jennifer; Farry, Michael; Jenkins, Michael; Voshell, Martin
2015-05-01
The proliferation of sensor technologies continues to impact Intelligence Analysis (IA) work domains. Historical procurement focus on sensor platform development and acquisition has resulted in increasingly advanced collection systems; however, such systems often demonstrate classic data overload conditions by placing increased burdens on already overtaxed human operators and analysts. Support technologies and improved interfaces have begun to emerge to ease that burden, but these often focus on single modalities or sensor platforms rather than underlying operator and analyst support needs, resulting in systems that do not adequately leverage their natural human attentional competencies, unique skills, and training. One particular reason why emerging support tools often fail is due to the gap between military applications and their functions, and the functions and capabilities afforded by cutting edge technology employed daily by modern knowledge workers who are increasingly "digitally native." With the entry of Generation Y into these workplaces, "net generation" analysts, who are familiar with socially driven platforms that excel at giving users insight into large data sets while keeping cognitive burdens at a minimum, are creating opportunities for enhanced workflows. By using these ubiquitous platforms, net generation analysts have trained skills in discovering new information socially, tracking trends among affinity groups, and disseminating information. However, these functions are currently under-supported by existing tools. In this paper, we describe how socially driven techniques can be contextualized to frame complex analytical threads throughout the IA process. This paper focuses specifically on collaborative support technology development efforts for a team of operators and analysts. Our work focuses on under-supported functions in current working environments, and identifies opportunities to improve a team's ability to discover new information and disseminate insightful analytic findings. We describe our Cognitive Systems Engineering approach to developing a novel collaborative enterprise IA system that combines modern collaboration tools with familiar contemporary social technologies. Our current findings detail specific cognitive and collaborative work support functions that defined the design requirements for a prototype analyst collaborative support environment.
Cimino, Cristiana; Correale, Antonello
2005-02-01
The authors claim that projective identification in the process of analysis should be considered in a circumscribed manner and seen as a very specific type of communication between the patient and the analyst, characterised through a modality that is simultaneously active, unconscious and discrete. In other words, the patient actively, though unconsciously and discretely--that is, in specific moments of the analysis--brings about particular changes in the analyst's state. From the analyst's side, the effect of this type of communication is a sudden change in his general state--a sense of passivity and coercion and a change in the state of consciousness. This altered consciousness can range from an almost automatic repetition of a relational script to a moderate or serious contraction of the field of attention to full-fledged changes in the analyst's sense of self. The authors propose the theory that this type of communication is, in fact, the expression of traumatic contents of experiences emerging from the non-declarative memory. These contents belong to a pre-symbolic and pre-representative area of the mind. They are made of inert fragments of psychic material that are felt rather than thought, which can thus be viewed as a kind of writing to be completed. These pieces of psychic material are the expression of traumatic experiences that in turn exercise a traumatic effect on the analyst, inducing an altered state of consciousness in him as well. Such material should be understood as belonging to an unrepressed unconscious. Restitution of these fragments to the patient in representable forms must take place gradually and without trying to accelerate the timing, in order to avoid the possibility that the restitution itself constitutes an acting on the part of the analyst, which would thus be a traumatic response to the traumatic action of the analytic material.
NASA Technical Reports Server (NTRS)
Romere, Paul O.; Brown, Steve Wesley
1995-01-01
Development of the Space Shuttle necessitated an extensive wind tunnel test program, with the cooperation of all the major wind tunnels in the United States. The result was approximately 100,000 hours of Space Shuttle wind tunnel testing conducted for aerodynamics, heat transfer, and structural dynamics. The test results were converted into Chrysler DATAMAN computer program format to facilitate use by analysts, a very cost effective method of collecting the wind tunnel test results from many test facilities into one centralized location. This report provides final documentation of the Space Shuttle wind tunnel program. The two-volume set covers the evolution of Space Shuttle aerodynamic configurations and gives wind tunnel test data, titles of wind tunnel data reports, sample data sets, and instructions for accessing the digital data base.
On detection and visualization techniques for cyber security situation awareness
NASA Astrophysics Data System (ADS)
Yu, Wei; Wei, Shixiao; Shen, Dan; Blowers, Misty; Blasch, Erik P.; Pham, Khanh D.; Chen, Genshe; Zhang, Hanlin; Lu, Chao
2013-05-01
Networking technologies are expanding exponentially to meet worldwide communication requirements. The rapid growth of network technologies and the pervasiveness of communications pose serious security issues. In this paper, we aim to develop an integrated network defense system with situation awareness capabilities to present useful information to human analysts. In particular, we implement a prototypical system that includes both distributed passive and active network sensors and traffic visualization features, such as 1D, 2D and 3D based network traffic displays. To effectively detect attacks, we also implement algorithms that transform real-world IP address data into images and study the patterns of attacks, using both a discrete wavelet transform (DWT) based scheme and a statistics-based scheme to detect attacks. Through an extensive simulation study, our data validate the effectiveness of our implemented defense system.
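A simplified sketch of the DWT-based idea, applied here to a one-dimensional synthetic traffic-volume series with PyWavelets rather than to images built from IP addresses, is shown below.

```python
import numpy as np
import pywt

# Simplified sketch of DWT-based anomaly detection on a 1-D traffic-volume
# series; the paper's image construction from IP addresses is not reproduced.
rng = np.random.default_rng(3)
traffic = rng.poisson(100, size=256).astype(float)   # packets per interval (synthetic)
traffic[180:188] += 400                              # injected burst standing in for an attack

coeffs = pywt.wavedec(traffic, "db4", level=3)       # multi-level discrete wavelet transform
detail = coeffs[-1]                                  # finest-scale detail coefficients
threshold = detail.mean() + 4 * detail.std()
suspect = np.where(np.abs(detail) > threshold)[0]
print("suspicious detail-coefficient indices:", suspect)
```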
Shuttle Case Study Collection Website Development
NASA Technical Reports Server (NTRS)
Ransom, Khadijah S.; Johnson, Grace K.
2012-01-01
As a continuation from summer 2012, the Shuttle Case Study Collection has been developed using lessons learned documented by NASA engineers, analysts, and contractors. Decades of information related to processing and launching the Space Shuttle is gathered into a single database to provide educators with an alternative means to teach real-world engineering processes. The goal is to provide additional engineering materials that enhance critical thinking, decision making, and problem solving skills. During this second phase of the project, the Shuttle Case Study Collection website was developed. Extensive HTML coding to link downloadable documents, videos, and images was required, as was training to learn NASA's Content Management System (CMS) for website design. As the final stage of the collection development, the website is designed to allow for distribution of information to the public as well as for case study report submissions from other educators online.
NASA Astrophysics Data System (ADS)
Ozturk, D.; Chaudhary, A.; Votava, P.; Kotfila, C.
2016-12-01
Jointly developed by Kitware and NASA Ames, GeoNotebook is an open source tool designed to give the maximum amount of flexibility to analysts, while dramatically simplifying the process of exploring geospatially indexed datasets. Packages like Fiona (backed by GDAL), Shapely, Descartes, Geopandas, and PySAL provide a stack of technologies for reading, transforming, and analyzing geospatial data. Combined with the Jupyter notebook and libraries like matplotlib/Basemap, it is possible to generate detailed geospatial visualizations. Unfortunately, the visualizations generated are either static or do not perform well for very large datasets. Also, this setup requires a great deal of boilerplate code to create and maintain. Other extensions exist to remedy these problems, but they provide a separate map for each input cell and do not support map interactions that feed back into the Python environment. To support interactive data exploration and visualization on large datasets we have developed an extension to the Jupyter notebook that provides a single dynamic map that can be managed from the Python environment, and that can communicate back with a server which can perform operations like data subsetting on a cloud-based cluster.
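A short example of the geospatial stack mentioned above (Geopandas and Shapely), subsetting a vector dataset to a region of interest, might look like the following; the file name stations.geojson and the bounding box are hypothetical.

```python
import geopandas as gpd
from shapely.geometry import box

# Small example of the geospatial stack mentioned above: read a vector dataset,
# clip it to a bounding box, and summarize. The file name is hypothetical.
gdf = gpd.read_file("stations.geojson")

# Region of interest (lon/lat bounding box); adjust to match the dataset's CRS.
roi = box(-123.0, 36.0, -121.0, 38.0)
subset = gdf[gdf.geometry.intersects(roi)]

print(f"{len(subset)} of {len(gdf)} features fall inside the region of interest")
print(subset.head())
```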
geophylobuilder 1.0: an arcgis extension for creating 'geophylogenies'.
Kidd, David M; Liu, Xianhua
2008-01-01
Evolution is inherently a spatiotemporal process; however, despite this, phylogenetic and geographical data and models remain largely isolated from one another. Geographical information systems provide a ready-made spatial modelling, analysis and dissemination environment within which phylogenetic models can be explicitly linked with their associated spatial data and subsequently integrated with other georeferenced data sets describing the biotic and abiotic environment. geophylobuilder 1.0 is an extension for the arcgis geographical information system that builds a 'geophylogenetic' data model from a phylogenetic tree and associated geographical data. Geophylogenetic database objects can subsequently be queried, spatially analysed and visualized in both 2D and 3D within a geographical information system. © 2007 The Authors.
ERIC Educational Resources Information Center
Bill & Melinda Gates Foundation, 2010
2010-01-01
This paper serves as a supplement to the "This School Works for Me: Creating Choices to Boost Achievement. A Guide for Data Analysts" report. This paper contains the following sections: (1) Initial Steps; (2) Developing Baseline District Facts; (3) Identifying Effective Options; (4) Identifying Effective Options: Preventative; and (5) Identifying…
CRS Issue Statement on State and Foreign Operations Appropriations
2010-01-12
Moss Analyst in Global Health kmoss@crs.loc.gov, 7-7314 Melissa D. Ho Analyst in Agricultural Policy mho@crs.loc.gov, 7-5342 Kennon H. Nakamura...9525 Jim Nichol Specialist in Russian and Eurasian Affairs jnichol@crs.loc.gov, 7-2289 Charles E. Hanrahan Senior Specialist in Agricultural Policy chanrahan@crs.loc.gov, 7-7235 .
ERIC Educational Resources Information Center
Smith, Tristram
2012-01-01
The extraordinary success of behavior-analytic interventions for individuals with autism spectrum disorder (ASD) has fueled the rapid growth of behavior analysis as a profession. One reason for this success is that for many years behavior analysts were virtually alone in conducting programmatic ASD intervention research. However, that era has…
78 FR 4883 - Excepted Service; Consolidated Listing of Schedules A, B, and C Exceptions
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-23
...) Professional and technical positions in grades GS-9 through 15 on the staff of the Council. (d)-(f) (Reserved... and Technology Policy-- (1) Thirty positions of Senior Policy Analyst, GS-15; Policy Analyst, GS-11/14... Secretary for Management. (2) (Reserved) (b)-(f) (Reserved) (g) Bureau of Population, Refugees, and...
Estimating two-way tables based on forest surveys
Charles T. Scott
2000-01-01
Forest survey analysts usually are interested in tables of values rather than single point estimates. A common error is to include only plots on which nonzero values of the attribute were observed when computing the variance of a mean. Similarly, analysts often exclude nonforest plots from the analysis. The development of the correct estimates of forest area, attribute...
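A toy numerical example of the error described above, comparing estimates computed over all plots with estimates that wrongly drop zero-valued plots, is sketched below; the plot values are invented.

```python
import numpy as np

# Toy example of the error described above: estimating mean volume per plot
# while wrongly dropping plots where the attribute was zero.
volumes = np.array([0.0, 0.0, 12.4, 0.0, 8.1, 0.0, 15.0, 3.2, 0.0, 0.0])  # per-plot values

correct_mean = volumes.mean()
correct_se = volumes.std(ddof=1) / np.sqrt(volumes.size)

nonzero = volumes[volumes > 0]
biased_mean = nonzero.mean()
biased_se = nonzero.std(ddof=1) / np.sqrt(nonzero.size)

print(f"all plots:    mean={correct_mean:.2f}  SE={correct_se:.2f}")
print(f"nonzero only: mean={biased_mean:.2f}  SE={biased_se:.2f}  (overstated mean)")
```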
ERIC Educational Resources Information Center
Giacobe, Nicklaus A.
2013-01-01
Cyber-security involves the monitoring a complex network of inter-related computers to prevent, identify and remediate from undesired actions. This work is performed in organizations by human analysts. These analysts monitor cyber-security sensors to develop and maintain situation awareness (SA) of both normal and abnormal activities that occur on…
ERIC Educational Resources Information Center
Ammentorp, William
There is much to be gained by using systems analysis in educational administration. Most administrators, presently relying on classical statistical techniques restricted to problems having few variables, should be trained to use more sophisticated tools such as systems analysis. The systems analyst, interested in the basic processes of a group or…
Pricing Programs Spur Growth of Renewable Energy Technologies
Golden, Colo., September 25, 2001 - A new study by the U.S. Department of Energy's (DOE) National Renewable Energy Laboratory (NREL) examines pricing programs that spur electrical production from renewable resources such as solar and wind. "The study found that the design and production ..." said NREL Energy Analyst Blair Swezey, who co-wrote the study with NREL Energy Analyst Lori ...
Visual Analysis among Novices: Training and Trend Lines as Graphic Aids
ERIC Educational Resources Information Center
Nelson, Peter M.; Van Norman, Ethan R.; Christ, Theodore J.
2017-01-01
The current study evaluated the degree to which novice visual analysts could discern trends in simulated time-series data across differing levels of variability and extreme values. Forty-five novice visual analysts were trained in general principles of visual analysis. One group received brief training on how to identify and omit extreme values.…
Science, Skepticism, and Applied Behavior Analysis
Normand, Matthew P
2008-01-01
Pseudoscientific claims concerning medical and psychological treatments of all varieties are commonplace. As behavior analysts, a sound skeptical approach to our science and practice is essential. The present paper offers an overview of science and skepticism and discusses the relationship of skepticism to behavior analysis, with an emphasis on the types of issues concerning behavior analysts in practice. PMID:22477687
Special Nuclear Material Gamma-Ray Signatures for Reachback Analysts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karpius, Peter Joseph; Myers, Steven Charles
2016-08-29
These are slides on special nuclear material gamma-ray signatures for reachback analysts for an LSS Spectroscopy course. The closing thoughts for this presentation are the following: SNM materials have definite spectral signatures that should be readily recognizable to analysts in both bare and shielded configurations. One can estimate burnup of plutonium using certain pairs of peaks that are a few keV apart. In most cases, one cannot reliably estimate uranium enrichment in an analogous way to the estimation of plutonium burnup. The origin of the most intense peaks from some SNM items may be indirect and from 'associated nuclides.' Indirect SNM signatures sometimes have commonalities with the natural gamma-ray background.
Kozlikova, Barbora; Sebestova, Eva; Sustr, Vilem; Brezovsky, Jan; Strnad, Ondrej; Daniel, Lukas; Bednar, David; Pavelka, Antonin; Manak, Martin; Bezdeka, Martin; Benes, Petr; Kotry, Matus; Gora, Artur; Damborsky, Jiri; Sochor, Jiri
2014-09-15
The transport of ligands, ions or solvent molecules into proteins with buried binding sites or through the membrane is enabled by protein tunnels and channels. CAVER Analyst is a software tool for calculation, analysis and real-time visualization of access tunnels and channels in static and dynamic protein structures. It provides an intuitive graphic user interface for setting up the calculation and interactive exploration of identified tunnels/channels and their characteristics. CAVER Analyst is a multi-platform software written in JAVA. Binaries and documentation are freely available for non-commercial use at http://www.caver.cz. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
1993-09-15
Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary function of the virtual X-34 mockup is to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).
1993-12-15
Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup are to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).
Methods for Attributing Land-Use Emissions to Products
NASA Astrophysics Data System (ADS)
Davis, S. J.; Burney, J. A.; Pongratz, J.; Caldeira, K.
2014-12-01
Roughly one-third of anthropogenic GHG emissions are caused by agricultural and forestry activities and land-use change (collectively, 'land-use emissions'). Understanding the ultimate drivers of these emissions requires attributing emissions to specific land-use activities and products. Although quantities of land-use emissions are matters of fact, the methodological choices and assumptions required to attribute those emissions to activities and products depend on research goals and data availability. We will demonstrate several possible accounting methods, highlighting the sensitivity of accounting to temporal distributions of emissions and the consequences of replacing spatially-explicit data with aggregate proxies such as production or harvested area data. Different accounting options emphasize different causes of land-use emissions (e.g., proximate or indirect drivers of deforestation). To support public policies that effectively balance competing objectives, analysts should carefully consider and communicate implications of accounting choices.
Communicating Geographical Risks in Crisis Management: The Need for Research.
French, Simon; Argyris, Nikolaos; Haywood, Stephanie M; Hort, Matthew C; Smith, Jim Q
2017-10-23
In any crisis, there is a great deal of uncertainty, often geographical uncertainty or, more precisely, spatiotemporal uncertainty. Examples include the spread of contamination from an industrial accident, drifting volcanic ash, and the path of a hurricane. Estimating spatiotemporal probabilities is usually a difficult task, but that is not our primary concern. Rather, we ask how analysts can communicate spatiotemporal uncertainty to those handling the crisis. We comment on the somewhat limited literature on the representation of spatial uncertainty on maps. We note that many cognitive issues arise and that the potential for confusion is high. We note that in the early stages of handling a crisis, the uncertainties involved may be deep, i.e., difficult or impossible to quantify in the time available. In such circumstances, we suggest the idea of presenting multiple scenarios. © 2017 Society for Risk Analysis.
An analysis of Milwaukee county land use
NASA Technical Reports Server (NTRS)
Todd, W. J.; Mausel, P. E.
1973-01-01
The identification and classification of urban and suburban phenomena through analysis of remotely-acquired sensor data can provide information of great potential value to many regional analysts. Such classifications, particularly those using spectral data obtained from satellites such as the first Earth Resources Technology Satellite (ERTS-1) orbited by NASA, allow rapid, frequent, and accurate general land use inventories that are of value in many types of spatial analyses. In this study, Milwaukee County, Wisconsin was classified into several broad land use categories on the basis of computer analysis of four bands of ERTS spectral data (ERTS Frame Number E1017-16093). Categories identified were: (1) road-central business district, (2) grass (green vegetation), (3) suburban, (4) wooded suburb, (5) heavy industry, (6) inner city, and (7) water. Overall, 90 percent accuracy was attained in classification of these urban land use categories.
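As a minimal illustration of the kind of per-pixel spectral classification described above, the sketch below assigns each four-band pixel to the nearest class mean. The class names, class means, and pixel values are invented for illustration; they are not the ERTS-1 training statistics used in the study.

```python
import numpy as np

# Minimal sketch of a minimum-distance-to-means classifier over four
# spectral bands. Class means and pixel values are illustrative only.
CLASS_MEANS = {
    "water":    np.array([12.0,  9.0,  6.0,  3.0]),
    "grass":    np.array([30.0, 28.0, 55.0, 60.0]),
    "suburban": np.array([40.0, 38.0, 45.0, 42.0]),
}

def classify_pixel(pixel: np.ndarray) -> str:
    """Assign the pixel to the class whose mean spectrum is closest (Euclidean)."""
    return min(CLASS_MEANS, key=lambda c: np.linalg.norm(pixel - CLASS_MEANS[c]))

if __name__ == "__main__":
    scene = np.array([[13.0, 10.0, 5.0, 4.0],     # likely water
                      [31.0, 27.0, 54.0, 61.0]])  # likely grass
    print([classify_pixel(p) for p in scene])
```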
Could the outcome of the 2016 US elections have been predicted from past voting patterns?
NASA Astrophysics Data System (ADS)
Schmitz, Peter M. U.; Holloway, Jennifer P.; Dudeni-Tlhone, Nontembeko; Ntlangu, Mbulelo B.; Koen, Renee
2018-05-01
In South Africa, a team of analysts has for some years used statistical techniques to predict election outcomes on election night. The prediction method uses statistical clusters based on past voting patterns to predict final election outcomes from a small number of released vote counts. With the US presidential elections of November 2016 making global media headlines shortly after successful predictions for the South African elections, the team decided to investigate adapting their method to forecast the final outcome of the US elections. In particular, it was felt that the time zone differences between states would affect the time at which results are released and thereby provide a window of opportunity for election night prediction using only the early results from the eastern side of the US. Testing the method on the US presidential elections would have two advantages: it would determine whether the core methodology could be generalised, and whether it would work to include a stronger spatial element in the modelling, since the early results released would be spatially biased due to time zone differences. This paper presents a high-level view of the overall methodology and how it was adapted to predict the results of the US presidential elections. A discussion of the clustering of spatial units within the US is also provided, and the spatial distribution of results, together with the Electoral College prediction results from both a `test-run' and the final 2016 presidential elections, are given and analysed.
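The abstract describes the method only at a high level, so the following is a toy sketch of cluster-based election-night projection under assumed data: districts are grouped into clusters, the swing observed in early-reporting districts of each cluster is applied to the historical shares of the districts still outstanding, and a turnout-weighted total is formed. All district names, shares, and weights are invented.

```python
import numpy as np
import pandas as pd

# Toy cluster-based projection: estimate a per-cluster "swing" from districts
# that have already reported, then apply it to the historical share of the
# districts still outstanding. All data are invented.
districts = pd.DataFrame({
    "cluster":    ["urban", "urban", "rural", "rural", "rural"],
    "hist_share": [0.55, 0.60, 0.40, 0.35, 0.42],        # party share last election
    "new_share":  [0.50, np.nan, 0.45, np.nan, np.nan],  # NaN = not yet reported
    "voters":     [1000, 1200, 800, 900, 700],
})

reported = districts.dropna(subset=["new_share"])
swing = (reported["new_share"] - reported["hist_share"]).groupby(reported["cluster"]).mean()

# Predicted share: observed where available, historical share plus cluster swing otherwise.
pred = districts["new_share"].fillna(
    districts["hist_share"] + districts["cluster"].map(swing).fillna(0.0)
)
national = np.average(pred, weights=districts["voters"])
print(f"Projected national share: {national:.3f}")
```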
Bio-optical data integration based on a 4 D database system approach
NASA Astrophysics Data System (ADS)
Imai, N. N.; Shimabukuro, M. H.; Carmo, A. F. C.; Alcantara, E. H.; Rodrigues, T. W. P.; Watanabe, F. S. Y.
2015-04-01
Bio-optical characterization of water bodies requires spatio-temporal data about Inherent Optical Properties and Apparent Optical Properties, which allow the comprehension of the underwater light field and support the development of models for monitoring water quality. Measurements are taken to represent optical properties along a water column, so the spectral data must be related to depth. However, the spatial positions of measurements may differ because the collecting instruments vary, and the records may not refer to the same wavelengths. An additional difficulty is that distinct instruments store data in different formats. A data integration approach is needed to make these large, multi-source data sets suitable for analysis, so that semi-empirical models can be evaluated, even automatically, preceded by preliminary quality-control tasks. In this work a solution for the stated scenario is presented, based on a spatial (geographic) database approach with an object-relational Database Management System (DBMS), chosen for its ability to represent all data collected in the field together with data obtained by laboratory analysis and remote sensing images acquired at the time of field data collection. This data integration approach leads to a 4D representation, since its coordinate system includes 3D spatial coordinates - planimetric position and depth - and the time at which each datum was taken. The PostgreSQL DBMS, extended by the PostGIS module, was adopted to provide the ability to manage spatial/geospatial data. A prototype was developed that provides the main tools an analyst needs to prepare the data sets for analysis.
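A minimal sketch of how such a 4D measurement table might look in PostgreSQL/PostGIS, driven from Python with psycopg2. The schema, table and column names, and the connection string are hypothetical; the actual prototype's design is not given in the abstract.

```python
import psycopg2  # assumes a PostgreSQL server with the PostGIS extension available

# Hypothetical 4D schema: planimetric position as a PostGIS point, plus depth,
# timestamp, wavelength, and the measured radiometric quantity.
DDL = """
CREATE EXTENSION IF NOT EXISTS postgis;
CREATE TABLE IF NOT EXISTS radiometric_sample (
    id            SERIAL PRIMARY KEY,
    geom          GEOMETRY(Point, 4326),  -- planimetric coordinates (lon/lat)
    depth_m       DOUBLE PRECISION,       -- metres below the surface
    sampled_at    TIMESTAMPTZ,            -- acquisition time
    wavelength_nm DOUBLE PRECISION,
    value         DOUBLE PRECISION,
    instrument    TEXT
);
"""

QUERY = """
SELECT wavelength_nm, AVG(value)
FROM radiometric_sample
WHERE depth_m BETWEEN %s AND %s
  AND sampled_at::date = %s
GROUP BY wavelength_nm
ORDER BY wavelength_nm;
"""

def mean_spectrum(conn, depth_min, depth_max, day):
    """Average spectrum over a depth interval for one field-campaign day."""
    with conn.cursor() as cur:
        cur.execute(QUERY, (depth_min, depth_max, day))
        return cur.fetchall()

if __name__ == "__main__":
    conn = psycopg2.connect("dbname=biooptics user=analyst")  # placeholder DSN
    with conn, conn.cursor() as cur:
        cur.execute(DDL)
    print(mean_spectrum(conn, 0.0, 2.0, "2014-05-13"))
```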
Cash across the City: Participatory Mapping & Teaching for Spatial Justice
ERIC Educational Resources Information Center
Rubel, Laurie; Lim, Vivian; Hall-Wieckert, Maren; Katz, Sara
2016-01-01
This paper explores teaching mathematics for spatial justice (Soja, 2010), as an extension of teaching mathematics for social justice (Gutstein, 2006). The study is contextualized in a 10-session curricular module focused on the spatial justice of a city's two-tiered system of personal finance institutions (mainstream vs. alternative), piloted…
NASA Astrophysics Data System (ADS)
Dabolt, T. O.
2016-12-01
Open data and data services continue to proliferate, creating new challenges for how researchers, policy analysts, and other decision makers can quickly discover and use relevant data. While traditional metadata catalog approaches used by applications such as data.gov are useful starting points for data search, they can quickly frustrate end users who are seeking ways to quickly find and then use data in machine-to-machine environments. The Geospatial Platform is overcoming these obstacles and providing end users and application developers a richer, more productive user experience. The Geospatial Platform leverages a collection of open source and commercial technology hosted on Amazon Web Services, providing an ecosystem of services that delivers trusted, consistent data in open formats to all users as well as a shared infrastructure for federal partners to serve their spatial data assets. It supports a diverse array of communities of practice, ranging in topic from the 16 National Geospatial Data Assets Themes to homeland security and climate adaptation. Come learn how you can contribute your data and leverage others' or check it out on your own at https://www.geoplatform.gov/
Spatial Arrangement of Branches in Relation to Slope and Neighbourhood Competition
SUMIDA, AKIHIRO; TERAZAWA, IKUE; TOGASHI, ASAKO; KOMIYAMA, AKIRA
2002-01-01
To gain a better understanding of the effects of spatial structure on patterns of neighbourhood competition among hardwood trees, the three‐dimensional extension of primary branches was surveyed for ten community‐grown Castanea crenata (Fagaceae) trees with respect to the positioning of neighbouring branches and the slope of the forest floor. There were significantly more branches extending towards the lower side of the slope than towards the upper side, but structural properties such as branch length and vertical angle were not affected by slope. When horizontal extension of a branch towards its neighbour was compared for a C. crenata branch and a neighbouring heterospecific, the former was significantly narrower than the latter when the inter‐branch distance (horizontal distance between the base positions of two neighbouring branches) was short (< approx. 5 m). Castanea crenata branches tended to extend in a direction avoiding neighbouring branches of heterospecifics when the inter‐branch distance was short. Furthermore, for an inter‐branch distance <3 m, the horizontal extension of a C. crenata branch was less when it was neighbouring a heterospecific branch than when neighbouring a conspecific branch. These results suggest that horizontal extension of C. crenata branches is more prone to spatial invasion by nearby neighbouring branches of heterospecifics, and that the invasion can be lessened when C. crenata trees are spatially aggregated. The reason why such an arrangement occurs is discussed in relation to the later leaf‐flush of C. crenata compared with that of other species in the forest. PMID:12096742
Working conditions, visual fatigue, and mental health among systems analysts in São Paulo, Brazil
Rocha, L; Debert-Ribeiro, M
2004-01-01
Aims: To evaluate the association between working conditions and visual fatigue and mental health among systems analysts living in São Paulo, Brazil. Methods: A cross sectional study was carried out by a multidisciplinary team. It included: ergonomic analysis of work, individual and group interviews, and 553 self applied questionnaires in two enterprises. The comparison population numbered 136 workers in different occupations. Results: The study population mainly comprised young males. Among systems analysts, visual fatigue was associated with mental workload, inadequate equipment and workstation, low level of worker participation, being a woman, and subject's attitude of fascination by the computer. Nervousness and intellectual performance were associated with mental workload, inadequate equipment, work environment, and tools. Continuing education and leisure were protective factors. Work interfering in family life was associated with mental workload, difficulties with clients, strict deadlines, subject's attitude of fascination by the computer, and finding solutions of work problems outside work. Family support, satisfaction in life and work, and adequate work environment and tools were protective factors. Work interfering in personal life was associated with subject's attitude of fascination by the computer, strict deadlines, inadequate equipment, and high level of work participation. Satisfaction in life and work and continuing education were protective factors. The comparison population did not share common working factors with the systems analysts in the regression analysis. Conclusions: The main health effects of systems analysts' work were expressed by machine anthropomorphism, being very demanding, mental acceleration, mental absorption, and difficulty in dealing with emotions. PMID:14691269
An agile acquisition decision-support workbench for evaluating ISR effectiveness
NASA Astrophysics Data System (ADS)
Stouch, Daniel W.; Champagne, Valerie; Mow, Christopher; Rosenberg, Brad; Serrin, Joshua
2011-06-01
The U.S. Air Force is consistently evolving to support current and future operations through the planning and execution of intelligence, surveillance and reconnaissance (ISR) missions. However, it is a challenge to maintain a precise awareness of current and emerging ISR capabilities to properly prepare for future conflicts. We present a decision-support tool for acquisition managers to empirically compare ISR capabilities and approaches to employing them, thereby enabling the DoD to acquire ISR platforms and sensors that provide the greatest return on investment. We have developed an analysis environment to perform modeling and simulation-based experiments to objectively compare alternatives. First, the analyst specifies an operational scenario for an area of operations by providing terrain and threat information; a set of nominated collections; sensor and platform capabilities; and processing, exploitation, and dissemination (PED) capacities. Next, the analyst selects and configures ISR collection strategies to generate collection plans. The analyst then defines customizable measures of effectiveness or performance to compute during the experiment. Finally, the analyst empirically compares the efficacy of each solution and generates concise reports to document their conclusions, providing traceable evidence for acquisition decisions. Our capability demonstrates the utility of using a workbench environment for analysts to design and run experiments. Crafting impartial metrics enables the acquisition manager to focus on evaluating solutions based on specific military needs. Finally, the metric and collection plan visualizations provide an intuitive understanding of the suitability of particular solutions. This facilitates a more agile acquisition strategy that handles rapidly changing technology in response to current military needs.
Federal Employees: Appointees Converted to Career Positions, July through September 1988
1989-01-13
Media) GS-1035-13/2 GS- 1082 -12/5 Program Analyst yes Program Analyst Temporary GS-345-12/1 GS-345-12/1 Legislative Affairs yes Congressional Liaison...Officer GS-301-13/1 GM-345-14 GS-14/1 equivalent pay MERIT SYSTEMS PROTECTION BOARD Writer/Editor yes Writer/Editor Temporary GS- 1082 -12/1 GS- 1082 -12/1
Disability Evaluation System Analysis and Research Annual Report 2017
2017-11-20
Amanda L. Kelley, MPH Program Manager, AMSARA Deputy Program Manager, AMSARA Contractor, ManTech Health Contractor, ManTech Health Christine...Toolin, MS Cordie K. Campbell, MPH Public Health Analyst, AMSARA Public Health Analyst, AMSARA Contractor, ManTech Health Contractor...ManTech Health Preventive Medicine Branch Walter Reed Army Institute of Research 503 Robert Grant Road, Forest Glen Annex Silver
Using a Model of Analysts' Judgments to Augment an Item Calibration Process
ERIC Educational Resources Information Center
Hauser, Carl; Thum, Yeow Meng; He, Wei; Ma, Lingling
2015-01-01
When conducting item reviews, analysts evaluate an array of statistical and graphical information to assess the fit of a field test (FT) item to an item response theory model. The process can be tedious, particularly when the number of human reviews (HR) to be completed is large. Furthermore, such a process leads to decisions that are susceptible…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-31
...: Small Entity Compliance Guide. SUMMARY: This document is issued under the joint authority of DOD, GSA..., contact the analyst whose name appears in the table below. Please cite FAC 2005-72 and the FAR case number... 202- 501-4755. Rules Listed in FAC 2005-72 Item Subject FAR Case Analyst *I Service 2010-010 Loeb...
Guidebook for Imputation of Missing Data. Technical Report No. 17.
ERIC Educational Resources Information Center
Wise, Lauress L.; McLaughlin, Donald H.
This guidebook is designed for data analysts who are working with computer data files that contain records with incomplete data. It indicates choices the analyst must make and the criteria for making those choices in regard to the following questions: (1) What resources are available for performing the imputation? (2) How big is the data file? (3)…
The TIGER system: a Census Bureau innovation serving data analysts.
Carbaugh, L W; Marx, R W
1990-01-01
This article describes the U.S. Census Bureau's TIGER (Topologically Integrated Geographic Encoding and Referencing) system, an automated geographic data base. The emphasis is on the availability of file extracts and their usefulness to data analysts. In addition to describing the available files, it mentions various applications for the data, explains the data limitations, and notes problems encountered to date.
ERIC Educational Resources Information Center
Menendez, Anthony L.; Mayton, Michael R.; Yurick, Amanda L.
2017-01-01
When rural school districts employ Board Certified Behavior Analysts (BCBAs) to assist in meeting the needs of students with disabilities, it is important that they be aware of the ethical and professional guidelines to which BCBAs are required to adhere. This article describes the role of these guidelines within the practice of BCBAs and presents…
ICP Corporate Customer Assessment - Sampling Plan
1995-07-01
CORPORATE CUSTOMER ASSESSMENT - SAMPLING PLAN JULY 1995 Lead Analyst: Lieutenant Commander William J. Wilkinson, USN Associate Analyst: Mr. Henry J...project developed a plan for conducting recurring surveys of Defense Logistics Agency customers , in support of the DLA Corporate Customer Assessment...Team. The primary product was a sampling plan, including stratification of customers by Military Service or Federal Agency and by commodity purchased
1976-12-01
economic analysts’ familiarity with the principles of economics appears to be of secondary concern to DoD management. Management has deemed desirous, in...and the operational requirements of the present staff. To management, the need for economic analysts to be familiar with the principles of economics is
Additional Support for the Information Systems Analyst Exam as a Valid Program Assessment Tool
ERIC Educational Resources Information Center
Carpenter, Donald A.; Snyder, Johnny; Slauson, Gayla Jo; Bridge, Morgan K.
2011-01-01
This paper presents a statistical analysis to support the notion that the Information Systems Analyst (ISA) exam can be used as a program assessment tool in addition to measuring student performance. It compares ISA exam scores earned by students in one particular Computer Information Systems program with scores earned by the same students on the…
The Divergent Paths of Behavior Analysis and Psychology: Vive la Différence!
Thyer, Bruce A
2015-05-01
Twenty years ago I suggested that behavior analysts could effect a quiet and covert takeover of the American Psychological Association (APA). I gave as precedents the operation of similar initiatives in the nineteenth and twentieth centuries, the Darwinian-inspired X-Club, and the psychoanalytically-oriented Secret Ring. Through a conscientious program of working within established APA bylaws and rules, behavior analysts could ensure that behavior analysts were nominated for every significant elective position within the APA, and move to get their colleagues placed in appointive positions, such as journal editorships, review boards, and major committees. This would be one approach to remake psychology along behavioral lines, which was an early ambition of B. F. Skinner. The community of behavior analysts ignored my suggestion, and instead pursued the path of creating an independent discipline of practitioners, one with its own degree-granting programs, conventions, journals, and legal regulation. This effort has been immensely successful, although much critical work remains to be done. In retrospect, I was wrong to suggest changing psychology from within, and I have been delighted to witness the emergence of our new and independent field.
The problem of self-disclosure in psychoanalysis.
Meissner, W W
2002-01-01
The problem of self-disclosure is explored in relation to currently shifting paradigms of the nature of the analytic relation and analytic interaction. Relational and intersubjective perspectives emphasize the role of self-disclosure as not merely allowable, but as an essential facilitating aspect of the analytic dialogue, in keeping with the role of the analyst as a contributing partner in the process. At the opposite extreme, advocates of classical anonymity stress the importance of neutrality and abstinence. The paper seeks to chart a course between unconstrained self-disclosure and absolute anonymity, both of which foster misalliances. Self-disclosure is seen as at times contributory to the analytic process, and at times deleterious. The decision whether to self-disclose, what to disclose, and when and how, should be guided by the analyst's perspective on neutrality, conceived as a mental stance in which the analyst assesses and decides what, at any given point, seems to contribute to the analytic process and the patient's therapeutic benefit. The major risk in self-disclosure is the tendency to draw the analytic interaction into the real relation between analyst and patient, thus diminishing or distorting the therapeutic alliance, mitigating transference expression, and compromising therapeutic effectiveness.
Innovative Solution to Video Enhancement
NASA Technical Reports Server (NTRS)
2001-01-01
Through a licensing agreement, Intergraph Government Solutions adapted a technology originally developed at NASA's Marshall Space Flight Center for enhanced video imaging by developing its Video Analyst(TM) System. Marshall's scientists developed the Video Image Stabilization and Registration (VISAR) technology to help FBI agents analyze video footage of the deadly 1996 Olympic Summer Games bombing in Atlanta, Georgia. VISAR technology enhanced nighttime videotapes made with hand-held camcorders, revealing important details about the explosion. Intergraph's Video Analyst System is a simple, effective, and affordable tool for video enhancement and analysis. The benefits associated with the Video Analyst System include support of full-resolution digital video, frame-by-frame analysis, and the ability to store analog video in digital format. Up to 12 hours of digital video can be stored and maintained for reliable footage analysis. The system also includes state-of-the-art features such as stabilization, image enhancement, and convolution to help improve the visibility of subjects in the video without altering underlying footage. Adaptable to many uses, Intergraph's Video Analyst System meets the stringent demands of the law enforcement industry in the areas of surveillance, crime scene footage, sting operations, and dash-mounted video cameras.
Bergin, Michael
2011-01-01
Qualitative data analysis is a complex process and demands clear thinking on the part of the analyst. However, a number of deficiencies may obstruct the research analyst during the process, leading to inconsistencies occurring. This paper is a reflection on the use of a qualitative data analysis program, NVivo 8, and its usefulness in identifying consistency and inconsistency during the coding process. The author was conducting a large-scale study of providers and users of mental health services in Ireland. He used NVivo 8 to store, code and analyse the data and this paper reflects some of his observations during the study. The demands placed on the analyst in trying to balance the mechanics of working through a qualitative data analysis program, while simultaneously remaining conscious of the value of all sources are highlighted. NVivo 8 as a qualitative data analysis program is a challenging but valuable means for advancing the robustness of qualitative research. Pitfalls can be avoided during analysis by running queries as the analyst progresses from tree node to tree node rather than leaving it to a stage whereby data analysis is well advanced.
The Hazard Mapping System (HMS)-a Multiplatform Remote Sensing Approach to Fire and Smoke Detection
NASA Astrophysics Data System (ADS)
Kibler, J.; Ruminski, M. G.
2003-12-01
The HMS is a multiplatform remote sensing approach to detecting fires and smoke over the US and adjacent areas of Canada and Mexico that has been in place since June 2002. This system is an integral part of the National Environmental Satellite and Data Information Service (NESDIS) near realtime hazard detection and mitigation efforts. The system utilizes NOAA's Geostationary Operational Environmental Satellites (GOES), Polar Operational Environmental Satellites (POES) and the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument on NASA's Terra and Aqua spacecraft. Automated detection algorithms are employed for each of the satellites for the fire detects while smoke is added by a satellite image analyst. In June 2003 the HMS underwent an upgrade. A number of features were added for users of the products generated on the HMS. Sectors covering Alaska and Hawaii were added. The use of Geographic Information System (GIS) shape files for smoke analysis is a new feature. Shape files show the progression and time of a single smoke plume as each analysis is drawn and then updated. The analyst now has the ability to view GOES, POES, and MODIS data in a single loop. This allows the fire analyst the ability to easily confirm a fire in three different data sets. The upgraded HMS has faster satellite looping and gives the analyst the ability to design a false color image for a particular region. The GOES satellites provide a relatively coarse 4 km infrared resolution at satellite subpoint for thermal fire detection but provide the advantage of a rapid update cycle. GOES imagery is updated every 15 minutes utilizing both GOES-10 and GOES-12. POES imagery from NOAA-15, NOAA-16 and NOAA-17 and MODIS from Terra and Aqua are employed with each satellite providing twice per day coverage (more frequent over Alaska). While the frequency of imagery is much less than with GOES the higher resolution of these satellites (1 km along the suborbital track) allows for detection of smaller and/or cooler burning fires. Each of the algorithms utilizes a number of temporal, thermal and contextual filters in an attempt to screen out false detects. However, false detects do get processed by the algorithms to varying degrees. Therefore, the automated fire detects from each algorithm are quality controlled by an analyst who scans the imagery and may either accept or delete fire points. The analyst also has the ability to manually add additional fire points based on the imagery. Smoke is outlined by the analyst using visible imagery, primarily GOES which provides 1 km resolution. Occasionally a smoke plume seen in visible imagery is the only indicator of a fire and would be manually added to the fire detect file. The Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) is a forecast model that projects the trajectory and dispersion of a smoke plume over a period of time. The HYSPLIT is run for fires that are selected by the analyst that are seen to be producing a significant smoke plume. The analyst defines a smoke producing area commensurate with the size of the fire and amount of smoke detected. The output is hosted on an Air Resources Lab (ARL) web site which can be accessed from the web site listed below. All of the information is posted to the web page noted below. Besides the interactive GIS presentation users can view the product in graphical jpg format. 
The analyst-edited points, as well as the unedited automated fire detects, are available for users to view directly on the web page or to download. All of the data are also archived and accessible via ftp.
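As an illustration of the general idea behind automated thermal fire detection with a contextual check, the sketch below flags pixels whose mid-infrared brightness temperature is both absolutely hot and well above the local background. The thresholds, window size, and data are invented; this is not the operational HMS/NESDIS algorithm.

```python
import numpy as np

def candidate_fires(bt4um: np.ndarray, abs_thresh=320.0, delta_thresh=10.0, win=2):
    """
    Toy thermal/contextual fire screen (not the operational HMS algorithm):
    flag a pixel when its ~4 micron brightness temperature exceeds an absolute
    threshold AND exceeds the mean of its local background by a margin.
    """
    rows, cols = bt4um.shape
    hits = []
    for i in range(rows):
        for j in range(cols):
            if bt4um[i, j] < abs_thresh:
                continue
            window = bt4um[max(0, i - win):i + win + 1, max(0, j - win):j + win + 1]
            background = (window.sum() - bt4um[i, j]) / (window.size - 1)
            if bt4um[i, j] - background > delta_thresh:
                hits.append((i, j))
    return hits

if __name__ == "__main__":
    scene = np.full((10, 10), 295.0)   # benign background (K)
    scene[4, 5] = 330.0                # hot spot
    print(candidate_fires(scene))      # -> [(4, 5)]
```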
On parts and holes: the spatial structure of the human body.
Donnelly, Maureen
2004-01-01
Spatial representation and reasoning is a central component of medical informatics. The spatial concepts most often used in medicine are not the quantitative, point-based concepts of classical geometry, but rather qualitative relations among extended objects such as body parts. A mereotopology is a formal theory of qualitative spatial relations, such as parthood and connection. This paper considers how an extension of mereotopology which includes also location relations can be used to represent and reason about the spatial structure of the human body.
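A toy finite model can make the basic mereotopological relations concrete. In the sketch below, regions are finite sets of grid cells, parthood is subset inclusion, and connection is sharing or adjacency of cells; this is only an illustration of the relations named in the abstract, not the formal theory the paper develops.

```python
# Toy finite model of mereotopological relations: a "region" is a set of grid
# cells; parthood is subset inclusion and connection is sharing or adjacency.
def part_of(x: set, y: set) -> bool:
    return x <= y

def adjacent(a, b) -> bool:
    return abs(a[0] - b[0]) + abs(a[1] - b[1]) <= 1

def connected(x: set, y: set) -> bool:
    return any(adjacent(a, b) for a in x for b in y)

def overlaps(x: set, y: set) -> bool:
    return bool(x & y)

def externally_connected(x: set, y: set) -> bool:
    return connected(x, y) and not overlaps(x, y)

hand  = {(0, 0), (0, 1)}
arm   = {(0, 0), (0, 1), (0, 2), (0, 3)}
torso = {(1, 3), (1, 4)}

print(part_of(hand, arm))                # True
print(externally_connected(arm, torso))  # True: adjacent but not overlapping
```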
Spatial patterning of fuels and fire hazard across a central U.S. deciduous forest region
Michael C. Stambaugh; Daniel C. Dey; Richard P. Guyette; Hong S. He; Joseph M. Marschall
2011-01-01
Information describing spatial and temporal variability of forest fuel conditions is essential to assessing overall fire hazard and risk. Limited information exists describing spatial characteristics of fuels in the eastern deciduous forest region, particularly in dry oak-dominated regions that historically burned relatively frequently. From an extensive fuels survey...
Occupancy Modeling Species-Environment Relationships with Non-ignorable Survey Designs.
Irvine, Kathryn M; Rodhouse, Thomas J; Wright, Wilson J; Olsen, Anthony R
2018-05-26
Statistical models supporting inferences about species occurrence patterns in relation to environmental gradients are fundamental to ecology and conservation biology. A common implicit assumption is that the sampling design is ignorable and does not need to be formally accounted for in analyses. The analyst assumes data are representative of the desired population and statistical modeling proceeds. However, if datasets from probability and non-probability surveys are combined or unequal selection probabilities are used, the design may be non-ignorable. We outline the use of pseudo-maximum likelihood estimation for site-occupancy models to account for such non-ignorable survey designs. This estimation method accounts for the survey design by properly weighting the pseudo-likelihood equation. In our empirical example, legacy and newer randomly selected locations were surveyed for bats to bridge a historic statewide effort with an ongoing nationwide program. We provide a worked example using bat acoustic detection/non-detection data and show how analysts can diagnose whether their design is ignorable. Using simulations, we assessed whether our approach is viable for modeling datasets composed of sites contributed outside of a probability design. Pseudo-maximum likelihood estimates differed from the usual maximum likelihood occupancy estimates for some bat species. Using simulations, we show the maximum likelihood estimator of species-environment relationships with non-ignorable sampling designs was biased, whereas the pseudo-likelihood estimator was design-unbiased. However, in our simulation study the designs composed of a large proportion of legacy or non-probability sites resulted in estimation issues for standard errors. These issues were likely a result of highly variable weights confounded by small sample sizes (5% or 10% sampling intensity and 4 revisits). Aggregating datasets from multiple sources logically supports larger sample sizes and potentially increases spatial extents for statistical inferences. Our results suggest that ignoring the mechanism for how locations were selected for data collection (e.g., the sampling design) could result in erroneous model-based conclusions. Therefore, in order to ensure robust and defensible recommendations for evidence-based conservation decision-making, the survey design information, in addition to the data themselves, must be available for analysts. Details for constructing the weights used in estimation and code for implementation are provided. This article is protected by copyright. All rights reserved.
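A minimal sketch of design-weighted pseudo-maximum likelihood for the simplest site-occupancy model (constant occupancy probability psi and detection probability p), assuming weights equal to inverse inclusion probabilities. The simulated detection histories and weights are invented, and the paper's actual estimator covers covariates and more general designs.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # inverse-logit

# Sketch of pseudo-maximum likelihood for a constant-psi, constant-p
# site-occupancy model with design weights w_i = 1 / inclusion probability.
# Data and weights below are simulated for illustration only.
rng = np.random.default_rng(0)
n_sites, n_visits = 200, 4
z = rng.random(n_sites) < 0.6                              # true occupancy states
y = (rng.random((n_sites, n_visits)) < 0.4) & z[:, None]   # detection histories
w = rng.choice([1.0, 3.0], size=n_sites)                   # unequal design weights

def neg_weighted_pll(theta):
    psi, p = expit(theta)                                  # keep probabilities in (0, 1)
    d = y.sum(axis=1)
    lik_detected = psi * p**d * (1 - p)**(n_visits - d)
    lik_never = psi * (1 - p)**n_visits + (1 - psi)        # occupied-but-missed or absent
    site_lik = np.where(d > 0, lik_detected, lik_never)
    return -np.sum(w * np.log(site_lik))                   # design-weighted pseudo-log-likelihood

fit = minimize(neg_weighted_pll, x0=np.zeros(2), method="Nelder-Mead")
print("psi, p estimates:", expit(fit.x))
```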
Pike, William A; Riensche, Roderick M; Best, Daniel M; Roberts, Ian E; Whyatt, Marie V; Hart, Michelle L; Carr, Norman J; Thomas, James J
2012-09-18
Systems and computer-implemented processes for storage and management of information artifacts collected by information analysts using a computing device. The processes and systems can capture a sequence of interactive operation elements that are performed by the information analyst, who is collecting an information artifact from at least one of the plurality of software applications. The information artifact can then be stored together with the interactive operation elements as a snippet on a memory device, which is operably connected to the processor. The snippet comprises a view from an analysis application, data contained in the view, and the sequence of interactive operation elements stored as a provenance representation comprising operation element class, timestamp, and data object attributes for each interactive operation element in the sequence.
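A hypothetical sketch of data structures matching the abstract's description of a snippet: the captured view, the data it contains, and a provenance trail of interactive operation elements with class, timestamp, and data-object attributes. All names are illustrative; the patented system's actual interfaces are not described here.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Any

@dataclass
class OperationElement:
    element_class: str                 # e.g. "copy", "filter", "annotate"
    timestamp: datetime
    data_object_attrs: dict[str, Any]  # attributes of the data object that was touched

@dataclass
class Snippet:
    view: str                          # identifier of the analysis-application view
    data: Any                          # the collected information artifact
    provenance: list[OperationElement] = field(default_factory=list)

    def record(self, element_class: str, **attrs) -> None:
        """Append one interactive operation element to the provenance trail."""
        self.provenance.append(OperationElement(element_class, datetime.now(), attrs))

snippet = Snippet(view="report_browser", data="suspicious transfer on 2012-09-01")
snippet.record("copy", source="doc-17", selection="paragraph 3")
print(len(snippet.provenance), snippet.provenance[0].element_class)
```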
RAVE: Rapid Visualization Environment
NASA Technical Reports Server (NTRS)
Klumpar, D. M.; Anderson, Kevin; Simoudis, Avangelos
1994-01-01
Visualization is used in the process of analyzing large, multidimensional data sets. However, the selection and creation of visualizations that are appropriate for the characteristics of a particular data set and the satisfaction of the analyst's goals is difficult. The process consists of three tasks that are performed iteratively: generate, test, and refine. The performance of these tasks requires the utilization of several types of domain knowledge that data analysts do not often have. Existing visualization systems and frameworks do not adequately support the performance of these tasks. In this paper we present the RApid Visualization Environment (RAVE), a knowledge-based system that interfaces with commercial visualization frameworks and assists a data analyst in quickly and easily generating, testing, and refining visualizations. RAVE was used for the visualization of in situ measurement data captured by spacecraft.
A Seat Around the Table: Participatory Data Analysis With People Living With Dementia.
Clarke, Charlotte L; Wilkinson, Heather; Watson, Julie; Wilcockson, Jane; Kinnaird, Lindsay; Williamson, Toby
2018-05-01
The involvement of "people with experience" in research has developed considerably in the last decade. However, involvement as co-analysts at the point of data analysis and synthesis has received very little attention-in particular, there is very little work that involves people living with dementia as co-analysts. In this qualitative secondary data analysis project, we (a) analyzed data through two theoretical lenses: Douglas's cultural theory of risk and Tronto's Ethic of Care, and (b) analyzed data in workshops with people living with dementia. The design involved cycles of presenting, interpreting, representing and reinterpreting the data, and findings between multiple stakeholders. We explore ways of involving people with experience as co-analysts and explore the role of reflexivity, multiple voicing, literary styling, and performance in participatory data analysis.
From somatic pain to psychic pain: The body in the psychoanalytic field.
Hartung, Thomas; Steinbrecher, Michael
2017-03-24
The integration of psyche and soma begins with a baby's earliest contact with his or her parents. With the help of maternal empathy and reverie, β-elements are transformed into α-elements. While we understand this to be the case, we would like to enquire what actually happens to those parts of the affect which have not been transformed. For the most part they may be dealt with by evacuation, but they can also remain within the body, subsequently contributing to psychosomatic symptoms. This paper describes how the body serves as an intermediate store between the psychic (inner) and outer reality. The authors focus on the unconscious communicative process between the analyst and the analysand, and in particular on how psychosomatic symptoms can spread to the analyst's body. The latter may become sensitive to the analysand's psychosomatic symptoms in order to better understand the psychoanalytical process. Sensory processes (visual and auditory) and psychic mechanisms such as projective identification can serve as a means for this communication. One of the first analysts to deal with this topic was Wilhelm Reich. He described one kind of psychosomatic defence like a shell, the character armour, comparing the armour formed by muscle tension with another, more psychical type of armour. This concept can be linked to Winnicott's contribution of the false self and later on to Feldman's concept of compliance as a defence. The authors link further details of the clinical material with theoretical concepts from Joyce McDougall, Piera Aulagnier, and Ricardo Rodulfo and Marilia Aisenstein. With the aid of the complex concept of projective identification, as described by Heinz Weiss, the authors discuss the important question of how the analyst gets in touch with the patient's current psychosomatic state, and describe a specific communication between the body of the psychoanalyst and the body of the patient. A vignette illustrates in greater detail the relationship between this theoretical understanding and an actual clinical example. In the session described, the analyst reacts to the patient with an intense body-countertransference, taking on the patient's symptoms for a short time. The patient, who had been unable to integrate psyche and soma (whose psyche did not indwell (Winnicott) in his body), projected the untransformed β-elements into his body, where they emerged as bodily symptoms. The body became a kind of intermediate store between inner and outer reality. By internalizing the patient's symptoms in his own body, the analyst created a bodily communication - something in between concerning the inner and the outer reality of both participants of the analytic dyad. The analyst was able to recognize his psychosomatic experience as the fear of dying, and to work through his bodily countertransference. This is described in detail. The emerging understanding of the countertransference helped the analyst to contribute to the patient's process of transforming his symptoms. The analyst was able to help the patient get in touch emotionally with many traumatic situations experienced during his life. The function of the psychosomatic symptoms was to contain the patient's fear of death. These frightening feelings could now be worked through on a psychical level; they could enter into a process of symbol formation so that the psychosomatic symptoms were no longer necessary and disappeared. Copyright © 2017 Institute of Psychoanalysis.
An Exploratory Analysis of Economic Factors in the Navy Total Force Strength Model (NTFSM)
2015-12-01
NTFSM is still in the testing phase and its overall behavior is largely unknown. In particular, the analysts that NTFSM was designed to help are...
Spacecraft software training needs assessment research
NASA Technical Reports Server (NTRS)
Ratcliff, Shirley; Golas, Katharine
1990-01-01
The problems were identified, along with their causes and potential solutions, that the management analysts were encountering in performing their jobs. It was concluded that sophisticated training applications would provide the most effective solution to a substantial portion of the analysts' problems. The remainder could be alleviated through the introduction of tools that could help make retrieval of the needed information from the vast and complex information resources feasible.
2014-03-01
for OR Analysts Dr. Jim Morris Overview of Academic Research Opportunities for OR Analysts Dr. Daniel Behringer September 21, 2012 Prize Winners...In Operations Research: Experiences and Perspectives Dr. Jim Morris and Major Brady Vaira (USAF) November 30, 2012 How Does Industry Use OR to...X X Ms. Lynda Liptak Applied Research Associates Publishing With MORS X X X X X Dr. Jim Morris US Air Force Operations Research in
Enforced Sparse Non-Negative Matrix Factorization
2016-01-23
documents to find interesting pieces of information. With limited resources, analysts often employ automated text - mining tools that highlight common...represented as an undirected bipartite graph. It has become a common method for generating topic models of text data because it is known to produce good results...model and the convergence rate of the underlying algorithm. I. Introduction A common analyst challenge is searching through large quantities of text
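As a sketch of the underlying technique, the code below implements a common multiplicative-update form of non-negative matrix factorization with an L1 penalty that encourages sparsity in the topic-document factor. It is a generic illustration, not the report's specific enforced-sparse algorithm, and the term-document matrix is random.

```python
import numpy as np

def sparse_nmf(V, k, l1=0.1, iters=200, eps=1e-9, seed=0):
    """
    Toy NMF with an L1 penalty on H (multiplicative updates). V is a
    nonnegative term-document matrix; returns W (terms x topics) and
    H (topics x documents). Generic illustration, not the report's algorithm.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k))
    H = rng.random((k, n))
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + l1 + eps)   # the l1 term pushes H entries toward 0
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

if __name__ == "__main__":
    V = np.random.default_rng(1).random((30, 12))   # fake term-document counts
    W, H = sparse_nmf(V, k=3)
    print("reconstruction error:", np.linalg.norm(V - W @ H))
    print("fraction of near-zero H entries:", np.mean(H < 1e-3))
```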
International Food Aid Programs: Background and Issues
2010-02-03
D. Ho Analyst in Agricultural Policy Charles E. Hanrahan Senior Specialist in Agricultural Policy February 3, 2010 Congressional Research Service...U.S. Treasury. 10 In United States agricultural policy , “monetization” is a P.L. 480 provision (Section 203) first included in the Food Security Act...Contact Information Melissa D. Ho Analyst in Agricultural Policy mho@crs.loc.gov, 7-5342 Charles E. Hanrahan Senior Specialist in Agricultural
ERIC Educational Resources Information Center
Haga, Wayne; Moreno, Abel; Segall, Mark
2012-01-01
In this paper, we compare the performance of Computer Information Systems (CIS) majors on the Information Systems Analyst (ISA) Certification Exam. The impact that the form of delivery of information systems coursework may have on the exam score is studied. Using a sample that spans three years, we test for significant differences between scores…
2015-12-01
Army Training and Doctrine Command (TRADOC) Analysis Center (TRAC) to the Philippines for Operation ENDURING FREEDOM – Philippines (OEF-P). PROJECT...management, doctrine and force development, training management, system testing, system acquisition, decision analysis, and resource management, as...influenced procurement decisions and reshaped Army doctrine . Additionally, CAA itself has benefited in numerous ways. Combat experience provides analysts
Testing Cognitive Behavior With Emphasis on Analytical Propensity of Service Members
2012-04-01
TRADOC Pam 525-3-1, researchers such as Allen (2008), Hutchins et al. (2004; 2007), Fingar (2011), Krizan (1999), and Treverton and Gabbard (2008), among...Joint Military Intelligence College: Washington, DC, 2003. Treverton, G. F.; Gabbard , C. B. Assessing the Tradecraft of Intelligence Analysts. The...Treverton, G. F.; Gabbard , C. B. Assessing the Tradecraft of Intelligence Analysts. The RAND Corporation, National Security Research Division
This School Works for Me: Creating Choices to Boost Achievement. A Guide for Data Analysts
ERIC Educational Resources Information Center
Bill & Melinda Gates Foundation, 2010
2010-01-01
This document is part of a series of guides designed to help school district leaders address one of the toughest challenges in American education: dropout rates of 30 percent nationwide, 50 percent in many big cities, and 60 percent or more in the lowest-performing schools. It includes tools for data analysts to drill down into the data and use…
The fate of the dream in contemporary psychoanalysis.
Loden, Susan
2003-01-01
Freud's metapsychology of dream formation has implicitly been discarded, as indicated in a brief review of trends in psychoanalytic thinking about dreams, with a focus on the relationship of the dream process to ego capacities. The current bias toward exclusive emphasis on the exploration of the analytic relationship and the transference has evolved at the expense of classical, in-depth dream interpretation, and, by extension, at the expense of strengthening the patient's capacity for self-inquiry. This trend is shown to be especially evident in the treatment of borderline patients, who today are believed by many analysts to misuse the dream in the analytic situation. An extended clinical example of a borderline patient with whom an unmodified Freudian associative technique of dream interpretation is used with good outcome illustrates the author's contrary conviction. In clinical practice, we should neglect neither the uniqueness of the dream as a central intrapsychic event nor the Freudian art of total dream analysis.
Zdeněk Kopal: Numerical Analyst
NASA Astrophysics Data System (ADS)
Křížek, M.
2015-07-01
We give a brief overview of Zdeněk Kopal's life, his activities in the Czech Astronomical Society, his collaboration with Vladimír Vand, and his studies at Charles University, Cambridge, Harvard, and MIT. Then we survey Kopal's professional life. He published 26 monographs and 20 conference proceedings. We will concentrate on Kopal's extensive monograph Numerical Analysis (1955, 1961) that is widely accepted to be the first comprehensive textbook on numerical methods. It describes, for instance, methods for polynomial interpolation, numerical differentiation and integration, numerical solution of ordinary differential equations with initial or boundary conditions, and numerical solution of integral and integro-differential equations. Special emphasis will be laid on error analysis. Kopal himself applied numerical methods to celestial mechanics, in particular to the N-body problem. He also used Fourier analysis to investigate light curves of close binaries to discover their properties. This is, in fact, a problem from mathematical analysis.
Automating Phase Change Lines and Their Labels Using Microsoft Excel(R).
Deochand, Neil
2017-09-01
Many researchers have rallied against drawn-in graphical elements and offered ways to avoid them, especially regarding the insertion of phase change lines (Deochand, Costello, & Fuqua, 2015; Dubuque, 2015; Vanselow & Bourret, 2012). However, few have offered a solution to automating the phase labels, which are often utilized in behavior analytic graphical displays (Deochand et al., 2015). Despite the fact that Microsoft Excel® is extensively utilized by behavior analysts, solutions to resolve issues in our graphing practices are not always apparent or user-friendly. Considering that the insertion of phase change lines and their labels constitutes a repetitious and laborious endeavor, any minimization of the steps needed to accomplish these graphical elements could offer substantial time savings to the field. The purpose of this report is to provide an updated way (and templates in the supplemental materials) to add phase change lines with their respective labels, which stay embedded in the graph when they are moved or updated.
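The report's templates are built in Microsoft Excel; as a hedged alternative illustration of the same idea, the matplotlib sketch below draws phase change lines and their labels from a list of phase boundaries so they remain tied to the data rather than drawn in by hand. The session data and phase names are invented.

```python
import matplotlib.pyplot as plt

# Alternative sketch (matplotlib rather than the Excel templates described above):
# phase change lines and labels are generated from a list of phase boundaries.
sessions = list(range(1, 16))
responses = [2, 3, 2, 4, 3, 8, 9, 11, 10, 12, 6, 5, 6, 4, 5]              # invented data
phases = [("Baseline", 1), ("Intervention", 6), ("Reversal", 11)]         # (label, first session)

fig, ax = plt.subplots()
ax.plot(sessions, responses, marker="o", color="black")
for label, start in phases:
    if start > 1:                                    # line halfway between conditions
        ax.axvline(start - 0.5, linestyle="--", color="black")
    ax.text(start, max(responses) + 1, label, ha="left", va="bottom")
ax.set_xlabel("Session")
ax.set_ylabel("Responses per minute")
ax.set_ylim(0, max(responses) + 3)
plt.show()
```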
Analyzing the requirements for mass production of small wind turbine generators
NASA Astrophysics Data System (ADS)
Anuskiewicz, T.; Asmussen, J.; Frankenfield, O.
Mass producibility of small wind turbine generators to give manufacturers design and cost data for profitable production operations is discussed. A 15 kW wind turbine generator for production in annual volumes from 1,000 to 50,000 units is discussed. Methodology to cost the systems effectively is explained. The process estimate sequence followed is outlined with emphasis on the process estimate sheets compiled for each component and subsystem. These data enabled analysts to develop cost breakdown profiles crucial in manufacturing decision-making. The appraisal also led to various design recommendations including replacement of aluminum towers with cost effective carbon steel towers. Extensive cost information is supplied in tables covering subassemblies, capital requirements, and levelized energy costs. The physical layout of the plant is depicted to guide manufacturers in taking advantage of the growing business opportunity now offered in conjunction with the national need for energy development.
Tool for Rapid Analysis of Monte Carlo Simulations
NASA Technical Reports Server (NTRS)
Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.
2011-01-01
Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time-consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The Tool for Rapid Analysis of Monte Carlo simulations (TRAM) has been used in recent design and analysis work for the Orion vehicle, greatly decreasing the time it takes to evaluate performance requirements. A previous version of this tool was developed to automatically identify driving design variables in Monte Carlo data sets. This paper describes a new, parallel version of TRAM implemented on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
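The following is a generic pandas sketch of the kind of triage such a tool performs, not the TRAM implementation itself: flag Monte Carlo runs that violate a requirement, then rank the dispersed inputs by how strongly their distributions differ between failed and passing runs. The variable names, requirement, and data are invented.

```python
import numpy as np
import pandas as pd

# Generic Monte Carlo triage sketch (not TRAM): flag failed runs against a
# requirement, then rank dispersed inputs by the standardized shift of their
# means between failed and passing runs.
rng = np.random.default_rng(42)
n = 5000
runs = pd.DataFrame({
    "wind_bias":  rng.normal(0, 1, n),
    "mass_error": rng.normal(0, 1, n),
    "sensor_lag": rng.normal(0, 1, n),
})
# Invented response: touchdown speed driven mostly by wind_bias.
runs["touchdown_speed"] = 2.0 + 1.5 * runs["wind_bias"] + 0.2 * rng.normal(0, 1, n)
runs["failed"] = runs["touchdown_speed"] > 4.0        # requirement violated

inputs = ["wind_bias", "mass_error", "sensor_lag"]
failed, passed = runs[runs["failed"]], runs[~runs["failed"]]
ranking = ((failed[inputs].mean() - passed[inputs].mean()) / runs[inputs].std()).abs()
print(ranking.sort_values(ascending=False))           # wind_bias should rank first
```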
Display/control requirements for automated VTOL aircraft
NASA Technical Reports Server (NTRS)
Hoffman, W. C.; Kleinman, D. L.; Young, L. R.
1976-01-01
A systematic design methodology for pilot displays in advanced commercial VTOL aircraft was developed and refined. The analyst is provided with a step-by-step procedure for conducting conceptual display/control configurations evaluations for simultaneous monitoring and control pilot tasks. The approach consists of three phases: formulation of information requirements, configuration evaluation, and system selection. Both the monitoring and control performance models are based upon the optimal control model of the human operator. Extensions to the conventional optimal control model required in the display design methodology include explicit optimization of control/monitoring attention; simultaneous monitoring and control performance predictions; and indifference threshold effects. The methodology was applied to NASA's experimental CH-47 helicopter in support of the VALT program. The CH-47 application examined the system performance of six flight conditions. Four candidate configurations are suggested for evaluation in pilot-in-the-loop simulations and eventual flight tests.
Microcomputer based software for biodynamic simulation
NASA Technical Reports Server (NTRS)
Rangarajan, N.; Shams, T.
1993-01-01
This paper presents a description of a microcomputer-based software package, called DYNAMAN, which has been developed to allow an analyst to simulate the dynamics of a system consisting of a number of mass segments linked by joints. One primary application is in predicting the motion of a human occupant in a vehicle under the influence of a variety of external forces, especially those generated during a crash event. Extensive use of a graphical user interface has been made to aid the user in setting up the input data for the simulation and in viewing the results from the simulation. Among its many applications, it has been successfully used in the prototype design of a moving seat that aids in occupant protection during a crash, by aircraft designers in evaluating occupant injury in airplane crashes, and by users in accident reconstruction for reconstructing the motion of the occupant and correlating the impacts with observed injuries.
Application of DNA-based methods in forensic entomology.
Wells, Jeffrey D; Stevens, Jamie R
2008-01-01
A forensic entomological investigation can benefit from a variety of widely practiced molecular genotyping methods. The most commonly used is DNA-based specimen identification. Other applications include the identification of insect gut contents and the characterization of the population genetic structure of a forensically important insect species. The proper application of these procedures demands that the analyst be technically expert. However, one must also be aware of the extensive list of standards and expectations that many legal systems have developed for forensic DNA analysis. We summarize the DNA techniques that are currently used in, or have been proposed for, forensic entomology and review established genetic analyses from other scientific fields that address questions similar to those in forensic entomology. We describe how accepted standards for forensic DNA practice and method validation are likely to apply to insect evidence used in a death or other forensic entomological investigation.
Neophyte experiences of football (soccer) match analysis: a multiple case study approach.
McKenna, Mark; Cowan, Daryl Thomas; Stevenson, David; Baker, Julien Steven
2018-03-05
Performance analysis is extensively used in sport, but its pedagogical application is little understood. Given its expanding role across football, this study explored the experiences of neophyte performance analysts. Experiences of six analysis interns, across three professional football clubs, were investigated as multiple cases of new match analysis. Each intern was interviewed after their first season, with archival data providing background information. Four themes emerged from qualitative analysis: (1) "building of relationships" was important, along with trust and role clarity; (2) "establishing an analysis system" was difficult due to tacit coach knowledge, but analysis was established; (3) the quality of the "feedback process" hinged on coaching styles, with balance of feedback and athlete engagement considered essential; (4) "establishing effect" was complex with no statistical effects reported; yet enhanced relationships, role clarity, and improved performances were reported. Other emic accounts are required to further understand occupational culture within performance analysis.
Visual analysis and exploration of complex corporate shareholder networks
NASA Astrophysics Data System (ADS)
Tekušová, Tatiana; Kohlhammer, Jörn
2008-01-01
The analysis of large corporate shareholder network structures is an important task in corporate governance, in financing, and in financial investment domains. In a modern economy, large structures of cross-corporation, cross-border shareholder relationships exist, forming complex networks. These networks are often difficult to analyze with traditional approaches. An efficient visualization of the networks helps to reveal the interdependent shareholding formations and the controlling patterns. In this paper, we propose an effective visualization tool that supports the financial analyst in understanding complex shareholding networks. We develop an interactive visual analysis system by combining state-of-the-art visualization technologies with economic analysis methods. Our system is capable of revealing patterns in large corporate shareholder networks, allows the visual identification of the ultimate shareholders, and supports the visual analysis of integrated cash flow and control rights. We apply our system to an extensive real-world database of shareholder relationships, showing its usefulness for effective visual analysis.
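A small networkx sketch of one analysis the abstract mentions, identifying ultimate shareholders and integrated cash-flow rights: direct stakes sit on the edges of a directed ownership graph, and integrated rights are the summed products of stakes along all ownership chains. Company names and stakes are invented, and this is not the paper's visualization system.

```python
import networkx as nx

# Edge A -> B with weight s means A directly holds a share s of B.
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("FamilyTrust", "HoldCo", 0.60),
    ("HoldCo",      "OpCo",   0.50),
    ("FamilyTrust", "OpCo",   0.10),
    ("FreeFloat",   "HoldCo", 0.40),
])

def integrated_stake(g, owner, target):
    """Sum over all simple ownership paths of the product of direct stakes."""
    total = 0.0
    for path in nx.all_simple_paths(g, owner, target):
        stake = 1.0
        for a, b in zip(path, path[1:]):
            stake *= g[a][b]["weight"]
        total += stake
    return total

ultimate = [n for n in G if G.in_degree(n) == 0]   # entities nobody else owns
for u in ultimate:
    print(u, "integrated stake in OpCo:", round(integrated_stake(G, u, "OpCo"), 3))
```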
Reid, Jeffrey G; Carroll, Andrew; Veeraraghavan, Narayanan; Dahdouli, Mahmoud; Sundquist, Andreas; English, Adam; Bainbridge, Matthew; White, Simon; Salerno, William; Buhay, Christian; Yu, Fuli; Muzny, Donna; Daly, Richard; Duyk, Geoff; Gibbs, Richard A; Boerwinkle, Eric
2014-01-29
Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples.
Using Browser Notebooks to Analyse Big Atmospheric Data-sets in the Cloud
NASA Astrophysics Data System (ADS)
Robinson, N.; Tomlinson, J.; Arribas, A.; Prudden, R.
2016-12-01
We present an account of our experience building an ecosystem for the analysis of big atmospheric data-sets. Using modern technologies, we have developed a prototype platform which is scalable and capable of analysing very large atmospheric datasets. We tested different big-data ecosystems, such as Hadoop MapReduce, Spark and Dask, in order to find the one best suited to the analysis of multidimensional binary data such as NetCDF. We make extensive use of infrastructure-as-code and containerisation to provide a platform which is reusable and which can scale to accommodate changes in demand. We make this platform readily accessible using browser-based notebooks. As a result, analysts with minimal technology experience can, in tens of lines of Python, make interactive data-visualisation web pages which analyse very large amounts of data using cutting-edge big-data technology.
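As a rough sketch of the notebook workflow described above (not the authors' platform), the following Python uses xarray with dask-backed chunks to analyse a collection of NetCDF files lazily; the file pattern, coordinate names, and the variable name "temperature" are assumptions.

```python
# Minimal notebook-style sketch of analysing large NetCDF data with
# xarray + dask (hypothetical file paths, coordinates and variable name).
import xarray as xr

# Open many NetCDF files lazily; dask chunks keep memory use bounded.
ds = xr.open_mfdataset("data/temperature_*.nc", combine="by_coords",
                       chunks={"time": 365})

# Build a lazy computation graph: monthly climatology and an area average.
clim = ds["temperature"].groupby("time.month").mean("time")
series = ds["temperature"].mean(dim=["latitude", "longitude"])

# Trigger the parallel computation only when results are needed.
clim_values = clim.compute()
print(clim_values.shape)
series.plot()  # renders a line plot in the notebook (requires matplotlib)
```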
[From psychotherapy to psychoanalysis: Max Levy-Suhl (1876-1947)].
Hermanns, Ludger M; Schröter, Michael; Stroeken, Harry
2014-01-01
From psychotherapy to psychoanalysis: Max Levy-Suhl (1876-1947). Levy-Suhl can be considered one of the great practising psychotherapists in early 20th century Berlin. He was active in various fields, including ophthalmology, forensic adolescent psychiatry and hypnosis. Prominent among his publications were two handbooks of psychotherapeutic methods. His attitude towards psychoanalysis shifted from initial criticism to acceptance. Around 1930 he experienced a kind of conversion, resulting in his training at the Berlin Institute and his becoming a member of the German Psychoanalytic Society. Forced as a Jew to emigrate in 1933, Levy-Suhl turned to the Netherlands, where he ran a psychoanalytic children's home in Amersfoort, followed by an analyst's practice in Amsterdam. He survived the German occupation, but apparently as a broken man. After the war he committed suicide. The paper is complemented by an appendix containing documents and an extensive bibliography.
Relatedness, national borders, perceptions of firms and the value of their innovations
NASA Astrophysics Data System (ADS)
Castor, Adam R.
The main goal of this dissertation is to better understand how external corporate stakeholder perceptions of relatedness affect important outcomes for companies. In pursuit of this goal, I apply the lens of category studies. Categories not only help audiences to distinguish between members of different categories, they also convey patterns of relatedness. In turn, this may have implications for understanding how audiences search, what they attend to, and how the members are ultimately valued. In the first chapter, I apply insights from social psychology to show how the nationality of audience members affects the way that they cognitively group objects into similar categories. I find that the geographic location of stock market analysts affects the degree to which they revise their earnings estimates for a given company in the wake of an earnings miss by another firm in the same industry. Foreign analysts revise their earnings estimates downward more so than do local analysts, suggesting that foreign analysts ascribe the earnings miss more broadly and tend to lump companies located in the same country into larger groups than do local analysts. In the second chapter, I demonstrate that the structure of inter-category relationships can have consequential effects for the members of a focal category. Leveraging an experimental-like design, I study the outcomes of nanotechnology patents and the pattern of forward citations across multiple patent jurisdictions. I find that members of technology categories with many close category 'neighbors' are more broadly cited than members of categories with few category 'neighbors.' My findings highlight how category embeddedness and category system structure affect the outcomes of category members, as well as the role that classification plays in the valuation of innovation. In the third chapter, I propose a novel and dynamic measure of corporate similarity that is constructed from the two-mode analyst and company coverage network. The approach creates a fine-grained continuous measure of company similarity that can be used as an alternative or supplement to existing static industry classification systems. I demonstrate the value of this new measure in the context of predicting financial market responses to merger and acquisition deals.
NASA Astrophysics Data System (ADS)
Budde, M. E.; Rowland, J.; Anthony, M.; Palka, S.; Martinez, J.; Hussain, R.
2017-12-01
The U.S. Geological Survey (USGS) supports the use of Earth observation data for food security monitoring through its role as an implementing partner of the Famine Early Warning Systems Network (FEWS NET). The USGS Earth Resources Observation and Science (EROS) Center has developed tools designed to aid food security analysts in developing assumptions of agro-climatological outcomes. There are four primary steps in developing agro-climatology assumptions: 1) understanding the climatology, 2) evaluating current climate modes, 3) interpretation of forecast information, and 4) incorporation of monitoring data. Analysts routinely forecast outcomes well in advance of the growing season, which relies on knowledge of climatology. A few months prior to the growing season, analysts can assess large-scale climate modes that might influence seasonal outcomes. Within two months of the growing season, analysts can evaluate seasonal forecast information as indicators. Once the growing season begins, monitoring data, based on remote sensing and field information, can characterize the start of season and remain integral monitoring tools throughout the duration of the season. Each subsequent step in the process can lead to modifications of the original climatology assumption. To support such analyses, we have created an agro-climatology analysis tool that characterizes each step in the assumption-building process. Satellite-based rainfall and normalized difference vegetation index (NDVI)-based products support both the climatology and monitoring steps, sea-surface temperature data and knowledge of the global climate system inform the climate modes, and precipitation forecasts at multiple scales support the interpretation of forecast information. Organizing these data for a user-specified area provides a valuable tool for food security analysts to better formulate agro-climatology assumptions that feed into food security assessments. We have also developed a knowledge base for over 80 countries that provides rainfall and NDVI-based products, including annual and seasonal summaries, historical anomalies, coefficient of variation, and the number of years below 70% of annual or seasonal averages. These products provide a quick look for analysts to assess the agro-climatology of a country.
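The summary statistics mentioned above (seasonal mean, percent-of-average anomaly, coefficient of variation, and years below 70% of average) can be illustrated with a short sketch over synthetic seasonal rainfall totals; this is illustrative only and is not the USGS tool.

```python
import numpy as np

# Sketch of the seasonal climatology summaries mentioned above, using
# synthetic seasonal rainfall totals (mm) for 40 hypothetical years.
rng = np.random.default_rng(0)
seasonal_total = rng.gamma(shape=4.0, scale=120.0, size=40)

climatology = seasonal_total.mean()                     # long-term seasonal mean
anomaly_pct = 100.0 * seasonal_total / climatology      # percent of average
coef_var = 100.0 * seasonal_total.std(ddof=1) / climatology
below_70 = np.mean(seasonal_total < 0.7 * climatology)  # fraction of years

print(f"mean={climatology:.0f} mm, CV={coef_var:.0f}%, "
      f"years below 70% of average: {below_70:.0%}")
```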
Advanced Technology Lifecycle Analysis System (ATLAS)
NASA Technical Reports Server (NTRS)
O'Neil, Daniel A.; Mankins, John C.
2004-01-01
Developing credible mass and cost estimates for space exploration and development architectures requires multidisciplinary analysis based on physics calculations and parametric estimates derived from historical systems. Within the National Aeronautics and Space Administration (NASA), concurrent engineering environment (CEE) activities integrate discipline-oriented analysis tools through a computer network and accumulate the results of a multidisciplinary analysis team via a centralized database or spreadsheet. Each minute of a design and analysis study within a concurrent engineering environment is expensive due to the size of the team and supporting equipment. The Advanced Technology Lifecycle Analysis System (ATLAS) reduces the cost of architecture analysis by capturing the knowledge of discipline experts in system-oriented spreadsheet models. A framework with a user interface presents a library of system models to an architecture analyst. The analyst selects models of launchers, in-space transportation systems, and excursion vehicles, as well as space and surface infrastructure such as propellant depots, habitats, and solar power satellites. After assembling the architecture from the selected models, the analyst can create a campaign comprised of missions spanning several years. The ATLAS controller passes analyst-specified parameters to the models and data among the models. An integrator workbook calls a history-based parametric cost model to determine the costs. The integrator also estimates the flight rates, launched masses, and architecture benefits over the years of the campaign. An accumulator workbook presents the analytical results in a series of bar graphs. In no way does ATLAS compete with a CEE; instead, ATLAS complements a CEE by ensuring that the time of the experts is well spent. Using ATLAS, an architecture analyst can perform technology sensitivity analysis, study many scenarios, and see the impact of design decisions. When the analyst is satisfied with the system configurations, technology portfolios, and deployment strategies, he or she can present the concepts to a team, which will conduct a detailed, discipline-oriented analysis within a CEE. An analog to this approach is the music industry, where a songwriter creates the lyrics and music before entering a recording studio.
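As a hedged sketch of the kind of history-based parametric cost estimating relationship such spreadsheet models call (not ATLAS itself), a power-law CER with hypothetical coefficients and element masses might look like this:

```python
# Minimal sketch of a history-based parametric cost estimating relationship
# (CER) of the power-law form commonly used in spreadsheet cost models.
# The coefficients and element masses below are hypothetical.
def element_cost(dry_mass_kg, a=0.3, b=0.85):
    """Cost in $M as a power law of dry mass; a and b would be fit to history."""
    return a * dry_mass_kg ** b

campaign = {"launcher": 95000.0, "transfer_stage": 12000.0, "lander": 8000.0}

total = sum(element_cost(m) for m in campaign.values())
for name, mass in campaign.items():
    print(f"{name:15s} {element_cost(mass):8.1f} $M")
print(f"{'campaign total':15s} {total:8.1f} $M")
```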
Extensions under development for the HEVC standard
NASA Astrophysics Data System (ADS)
Sullivan, Gary J.
2013-09-01
This paper discusses standardization activities for extending the capabilities of the High Efficiency Video Coding (HEVC) standard - the first edition of which was completed in early 2013. These near-term extensions are focused on three areas: range extensions (such as enhanced chroma formats, monochrome video, and increased bit depth), bitstream scalability extensions for spatial and fidelity scalability, and 3D video extensions (including stereoscopic/multi-view coding, and probably also depth map coding and combinations thereof). Standardization extensions on each of these topics will be completed by mid-2014, and further work beyond that timeframe is also discussed.
St. Pierre, Tim G.; House, Michael J.; Bangma, Sander J.; Pang, Wenjie; Bathgate, Andrew; Gan, Eng K.; Ayonrinde, Oyekoya T.; Bhathal, Prithi S.; Clouston, Andrew; Olynyk, John K.; Adams, Leon A.
2016-01-01
Background and Aims: Validation of non-invasive methods of liver fat quantification requires a reference standard. However, using standard histopathology assessment of liver biopsies is problematical because of poor repeatability. We aimed to assess a stereological method of measuring volumetric liver fat fraction (VLFF) in liver biopsies and to use the method to validate a magnetic resonance imaging method for measurement of VLFF. Methods: VLFFs were measured in 59 subjects (1) by three independent analysts using a stereological point counting technique combined with the Delesse principle on liver biopsy histological sections and (2) by three independent analysts using the HepaFat-Scan® technique on magnetic resonance images of the liver. Bland-Altman statistics and intraclass correlation (IC) were used to assess the repeatability of each method and the bias between the methods of liver fat fraction measurement. Results: Inter-analyst repeatability coefficients for the stereology and HepaFat-Scan® methods were 8.2 (95% CI 7.7–8.8)% and 2.4 (95% CI 2.2–2.5)% VLFF respectively. IC coefficients were 0.86 (95% CI 0.69–0.93) and 0.990 (95% CI 0.985–0.994) respectively. Small biases (≤3.4%) were observable between two pairs of analysts using stereology, while no significant biases were observable between any of the three pairs of analysts using HepaFat-Scan®. A bias of 1.4±0.5% VLFF was observed between the HepaFat-Scan® method and the stereological method. Conclusions: Repeatability of the stereological method is superior to the previously reported performance of assessment of hepatic steatosis by histopathologists and is a suitable reference standard for validating non-invasive methods of measurement of VLFF. PMID:27501242
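A minimal sketch of the Bland-Altman quantities named above, computed for hypothetical paired VLFF measurements from two analysts (conventions for the repeatability coefficient vary, so the paper's definition should be checked):

```python
import numpy as np

# Sketch of Bland-Altman bias and agreement limits for paired volumetric
# liver fat fraction (VLFF, %) measurements from two analysts (hypothetical).
a1 = np.array([3.1, 7.8, 12.4, 5.0, 20.3, 9.1])
a2 = np.array([3.6, 7.1, 13.0, 5.9, 19.5, 8.4])

diff = a1 - a2
bias = diff.mean()                # systematic offset between the two analysts
loa = 1.96 * diff.std(ddof=1)     # half-width of the 95% limits of agreement
# A repeatability coefficient is often reported as 1.96 * SD of the paired
# differences; conventions vary, so verify against the study's definition.
print(f"bias={bias:.2f}% VLFF, limits of agreement = bias ± {loa:.2f}% VLFF")
```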
Utilizing semantic Wiki technology for intelligence analysis at the tactical edge
NASA Astrophysics Data System (ADS)
Little, Eric
2014-05-01
Challenges exist for intelligence analysts to efficiently and accurately process large amounts of data collected from a myriad of available data sources. These challenges are even more evident for analysts who must operate within small military units at the tactical edge. In such environments, decisions must be made quickly without guaranteed access to the kinds of large-scale data sources available to analysts working at intelligence agencies. Improved technologies must be provided to analysts at the tactical edge to make informed, reliable decisions, since this is often a critical collection point for important intelligence data. To aid tactical edge users, new types of intelligent, automated technology interfaces are required to allow them to rapidly explore information associated with the intersection of hard and soft data fusion, such as multi-INT signals, semantic models, social network data, and natural language processing of text. The ability to fuse these types of data is paramount to providing decision superiority. For these types of applications, we have developed BLADE. BLADE allows users to dynamically add, delete and link data via a semantic wiki, allowing for improved interaction between different users. Analysts can see information updates in near-real-time because a common underlying set of semantic models operates within a triple store that allows for updates on related data points from independent users tracking different items (persons, events, locations, organizations, etc.). The wiki can capture pictures, videos and related information. New information added directly to pages is automatically updated in the triple store, and its provenance and pedigree are tracked over time, making the data more trustworthy and more easily integrated with other users' pages.
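As a hedged, generic illustration of the add-and-query pattern behind a semantic-wiki triple store (not the BLADE data model), the sketch below uses rdflib with hypothetical URIs and predicates:

```python
# Minimal sketch of adding and querying linked entities in an RDF triple
# store with rdflib; the namespace, URIs and predicates are hypothetical.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/intel/")
g = Graph()

# One analyst links a person to an event observed at a location.
g.add((EX.person_42, RDF.type, EX.Person))
g.add((EX.person_42, EX.attended, EX.event_7))
g.add((EX.event_7, EX.occurredAt, EX.location_3))
g.add((EX.event_7, EX.reportedBy, Literal("analyst_A")))

# Another analyst's page can immediately query the shared graph.
results = g.query("""
    PREFIX ex: <http://example.org/intel/>
    SELECT ?person ?location WHERE {
        ?person ex:attended ?event .
        ?event ex:occurredAt ?location .
    }""")
for person, location in results:
    print(person, "was linked to", location)
```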
Reproducibility of apatite fission-track length data and thermal history reconstruction
NASA Astrophysics Data System (ADS)
Ketcham, Richard A.; Donelick, Raymond A.; Balestrieri, Maria Laura; Zattin, Massimiliano
2009-07-01
The ability to derive detailed thermal history information from apatite fission-track analysis is predicated on the reliability of track length measurements. However, insufficient attention has been given to whether and how these measurements should be standardized. In conjunction with a fission-track workshop we conducted an experiment in which 11 volunteers measured ~50 track lengths on one or two samples. One mount contained Durango apatite with unannealed induced tracks, and one contained apatite from a crystalline rock containing spontaneous tracks with a broad length distribution caused by partial resetting. Results for both mounts showed scatter indicative of differences in measurement technique among the individual analysts. The effects of this variability on thermal history inversion were tested using the HeFTy computer program to model the spontaneous track measurements. A cooling-only scenario and a reheating scenario more consistent with the sample's geological history were posed. When a uniform initial length value from the literature was used, results among analysts were very inconsistent in both scenarios, although normalizing for track angle by projecting all lengths to a c-axis parallel crystallographic orientation improved some aspects of congruency. When the induced track measurement was used as the basis for thermal history inversion, congruency among analysts, and agreement with inversions based on previously collected data, improved significantly. Further improvement was obtained by using c-axis projection. Differences among inversions that persisted could be traced to differential sampling of long- and short-track populations among analysts. The results of this study, while demonstrating the robustness of apatite fission-track thermal history inversion, nevertheless point to the necessity for a standardized length calibration schema that accounts for analyst variation.
Development of a Comprehensive Database System for Safety Analyst
Paz, Alexander; Veeramisti, Naveen; Khanal, Indira; Baker, Justin
2015-01-01
This study addressed barriers associated with the use of Safety Analyst, a state-of-the-art tool that has been developed to assist during the entire Traffic Safety Management process but that is not widely used due to a number of challenges as described in this paper. As part of this study, a comprehensive database system and tools to provide data to multiple traffic safety applications, with a focus on Safety Analyst, were developed. A number of data management tools were developed to extract, collect, transform, integrate, and load the data. The system includes consistency-checking capabilities to ensure the adequate insertion and update of data into the database. This system focused on data from roadways, ramps, intersections, and traffic characteristics for Safety Analyst. To test the proposed system and tools, data from Clark County, which is the largest county in Nevada and includes the cities of Las Vegas, Henderson, Boulder City, and North Las Vegas, was used. The database and Safety Analyst together help identify the sites with the potential for safety improvements. Specifically, this study examined the results from two case studies. The first case study, which identified sites having a potential for safety improvements with respect to fatal and all injury crashes, included all roadway elements and used default and calibrated Safety Performance Functions (SPFs). The second case study identified sites having a potential for safety improvements with respect to fatal and all injury crashes, specifically regarding intersections; it used default and calibrated SPFs as well. Conclusions were developed for the calibration of safety performance functions and the classification of site subtypes. Guidelines were provided about the selection of a particular network screening type or performance measure for network screening. PMID:26167531
Developing Guidelines for Assessing Visual Analytics Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholtz, Jean
2011-07-01
In this paper, we develop guidelines for evaluating visual analytic environments based on a synthesis of reviews for the entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge and from a user study with professional intelligence analysts. By analyzing the 2009 VAST Challenge reviews we gained a better understanding of what is important to our reviewers, both visualization researchers and professional analysts. We also report on a small user study with professional analysts to determine the important factors that they use in evaluating visual analysis systems. We then looked at guidelines developed by researchers in various domains and synthesized these into an initial set for use by others in the community. In a second part of the user study, we looked at guidelines for a new aspect of visual analytic systems – the generation of reports. Future visual analytic systems have been challenged to help analysts generate their reports. In our study we worked with analysts to understand the criteria they used to evaluate the quality of analytic reports. We propose that this knowledge will be useful as researchers look at systems to automate some of the report generation. Based on these efforts, we produced initial guidelines for evaluating visual analytic environments and for the evaluation of analytic reports. It is important to understand that these guidelines are initial drafts and are limited in scope because of the type of tasks for which the visual analytic systems used in the studies in this paper were designed. More research and refinement are needed by the Visual Analytics Community to provide additional evaluation guidelines for different types of visual analytic environments.
Basic Characteristics and Spatial Patterns of Pseudo-Settlements--Taking Dalian as An Example.
Gao, Jiaji; Zhang, Yingjia; Li, Xueming
2016-01-20
A person's living behavior patterns are closely related to three types of settlements: real-life settlements, imagined settlements, and pseudo-settlements. The term "pseudo-settlement" (PS) refers to the places that are selectively recorded and represented after the mass media choose and restructure residence information. As the mass media rapidly develop and people's ways of obtaining information gradually change, PS has already become one of the main ways for people to recognize and understand real-life settlements, as well as to describe their impressions of imagined settlements. PS also has a profound impact on tourism, employment, investment, migration, real estate development, etc. Thus, the study of PSs has important theoretical and practical significance. This paper takes residential quarters represented in the mass media as its object of study, elaborates the concept of PSs, and establishes a pseudo-settlement index system for Dalian. From three aspects, including pseudo-buildings, pseudo-districts and pseudo-culture, this paper uses the ArcGIS 10.0 kernel density tool (Spatial Analyst) to analyze and interpret the basic characteristics and spatial patterns of 14 elements of the PS in Dalian. Through systemic clustering analysis, it identifies eight major types of PSs in Dalian. It then systematically elaborates the current situation and characteristics of the spatial pattern of PSs in Dalian, namely: regionally concentrated, widely scattered, and blank spaces without pseudo-settlements. Finally, this paper discusses the mechanism of formation of PSs in Dalian.
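A minimal sketch of the Spatial Analyst kernel-density step described above; the geodatabase path, feature class, weight field, and parameters are hypothetical, and an ArcGIS Spatial Analyst license is assumed.

```python
# Minimal sketch of a kernel-density surface with ArcGIS Spatial Analyst.
# Paths, the weight field and parameters are hypothetical, and a Spatial
# Analyst license is required.
import arcpy
from arcpy.sa import KernelDensity

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\dalian_ps.gdb"

# Point features of pseudo-settlement elements, weighted by media frequency.
density = KernelDensity(in_features="ps_elements",
                        population_field="media_count",
                        cell_size=100,
                        search_radius=2000)
density.save("ps_kernel_density")
arcpy.CheckInExtension("Spatial")
```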
Merchant, Nathan D; Fristrup, Kurt M; Johnson, Mark P; Tyack, Peter L; Witt, Matthew J; Blondel, Philippe; Parks, Susan E
2015-01-01
1. Many organisms depend on sound for communication, predator/prey detection and navigation. The acoustic environment can therefore play an important role in ecosystem dynamics and evolution. A growing number of studies are documenting acoustic habitats and their influences on animal development, behaviour, physiology and spatial ecology, which has led to increasing demand for passive acoustic monitoring (PAM) expertise in the life sciences. However, as yet, there has been no synthesis of data processing methods for acoustic habitat monitoring, which presents an unnecessary obstacle to would-be PAM analysts. 2. Here, we review the signal processing techniques needed to produce calibrated measurements of terrestrial and aquatic acoustic habitats. We include a supplemental tutorial and template computer codes in matlab and r, which give detailed guidance on how to produce calibrated spectrograms and statistical analyses of sound levels. Key metrics and terminology for the characterisation of biotic, abiotic and anthropogenic sound are covered, and their application to relevant monitoring scenarios is illustrated through example data sets. To inform study design and hardware selection, we also include an up-to-date overview of terrestrial and aquatic PAM instruments. 3. Monitoring of acoustic habitats at large spatiotemporal scales is becoming possible through recent advances in PAM technology. This will enhance our understanding of the role of sound in the spatial ecology of acoustically sensitive species and inform spatial planning to mitigate the rising influence of anthropogenic noise in these ecosystems. As we demonstrate in this work, progress in these areas will depend upon the application of consistent and appropriate PAM methodologies. PMID:25954500
Merchant, Nathan D; Fristrup, Kurt M; Johnson, Mark P; Tyack, Peter L; Witt, Matthew J; Blondel, Philippe; Parks, Susan E
2015-03-01
1. Many organisms depend on sound for communication, predator/prey detection and navigation. The acoustic environment can therefore play an important role in ecosystem dynamics and evolution. A growing number of studies are documenting acoustic habitats and their influences on animal development, behaviour, physiology and spatial ecology, which has led to increasing demand for passive acoustic monitoring (PAM) expertise in the life sciences. However, as yet, there has been no synthesis of data processing methods for acoustic habitat monitoring, which presents an unnecessary obstacle to would-be PAM analysts. 2. Here, we review the signal processing techniques needed to produce calibrated measurements of terrestrial and aquatic acoustic habitats. We include a supplemental tutorial and template computer codes in matlab and r, which give detailed guidance on how to produce calibrated spectrograms and statistical analyses of sound levels. Key metrics and terminology for the characterisation of biotic, abiotic and anthropogenic sound are covered, and their application to relevant monitoring scenarios is illustrated through example data sets. To inform study design and hardware selection, we also include an up-to-date overview of terrestrial and aquatic PAM instruments. 3. Monitoring of acoustic habitats at large spatiotemporal scales is becoming possible through recent advances in PAM technology. This will enhance our understanding of the role of sound in the spatial ecology of acoustically sensitive species and inform spatial planning to mitigate the rising influence of anthropogenic noise in these ecosystems. As we demonstrate in this work, progress in these areas will depend upon the application of consistent and appropriate PAM methodologies.
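As a rough Python sketch of the calibration chain discussed above (the paper's own tutorial code is in MATLAB and R), the following converts raw counts to pressure and computes a Welch power spectral density; the sensitivity, gain, ADC scaling, and the stand-in signal are hypothetical.

```python
# Minimal sketch (not the paper's MATLAB/R tutorial code) of a calibrated
# power spectral density from a hydrophone recording. Calibration values
# and the input signal are hypothetical.
import numpy as np
from scipy.signal import welch

fs = 48_000                       # sample rate (Hz)
x_counts = np.random.default_rng(1).normal(size=fs * 10)  # stand-in signal

# Convert raw counts to pressure (uPa) using the end-to-end sensitivity:
# hydrophone sensitivity, recorder gain, and ADC scaling (all hypothetical).
sensitivity_db = -165.0           # dB re 1 V/uPa (hydrophone)
gain_db = 20.0                    # recorder gain (dB)
volts_per_count = 1.0 / 2**15     # 16-bit ADC, +/-1 V full scale
x_upa = x_counts * volts_per_count / 10 ** ((sensitivity_db + gain_db) / 20)

# Welch PSD in uPa^2/Hz, then express as dB re 1 uPa^2/Hz.
f, pxx = welch(x_upa, fs=fs, nperseg=fs, window="hann")
psd_db = 10 * np.log10(pxx)
print(f"median level {np.median(psd_db):.1f} dB re 1 uPa^2/Hz")
```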
Basic Characteristics and Spatial Patterns of Pseudo-Settlements—Taking Dalian as An Example
Gao, Jiaji; Zhang, Yingjia; Li, Xueming
2016-01-01
A person’s living behavior patterns are closely related to three types of settlements: real-life settlements, imagined settlements, and pseudo-settlements. The term “pseudo-settlement” (PS) refers to the places that are selectively recorded and represented after the mass media choose and restructure residence information. As the mass media rapidly develop and people’s ways of obtaining information gradually change, PS has already become one of the main ways for people to recognize and understand real-life settlements, as well as to describe their impressions of imagined settlements. PS also has a profound impact on tourism, employment, investment, migration, real estate development, etc. Thus, the study of PSs has important theoretical and practical significance. This paper takes residential quarters represented in the mass media as its object of study, elaborates the concept of PSs, and establishes a pseudo-settlement index system for Dalian. From three aspects, including pseudo-buildings, pseudo-districts and pseudo-culture, this paper uses the ArcGIS 10.0 kernel density tool (Spatial Analyst) to analyze and interpret the basic characteristics and spatial patterns of 14 elements of the PS in Dalian. Through systemic clustering analysis, it identifies eight major types of PSs in Dalian. It then systematically elaborates the current situation and characteristics of the spatial pattern of PSs in Dalian, namely: regionally concentrated, widely scattered, and blank spaces without pseudo-settlements. Finally, this paper discusses the mechanism of formation of PSs in Dalian. PMID:26805859
Deeny, Sarah R; Steventon, Adam
2015-08-01
Socrates described a group of people chained up inside a cave, who mistook shadows of objects on a wall for reality. This allegory comes to mind when considering 'routinely collected data'-the massive data sets, generated as part of the routine operation of the modern healthcare service. There is keen interest in routine data and the seemingly comprehensive view of healthcare they offer, and we outline a number of examples in which they were used successfully, including the Birmingham OwnHealth study, in which routine data were used with matched control groups to assess the effect of telephone health coaching on hospital utilisation.Routine data differ from data collected primarily for the purposes of research, and this means that analysts cannot assume that they provide the full or accurate clinical picture, let alone a full description of the health of the population. We show that major methodological challenges in using routine data arise from the difficulty of understanding the gap between patient and their 'data shadow'. Strategies to overcome this challenge include more extensive data linkage, developing analytical methods and collecting more data on a routine basis, including from the patient while away from the clinic. In addition, creating a learning health system will require greater alignment between the analysis and the decisions that will be taken; between analysts and people interested in quality improvement; and between the analysis undertaken and public attitudes regarding appropriate use of data.
1993-09-15
Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) in Huntsville, Alabama began to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models were used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup were to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provided general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC). The X-34 program was cancelled in 2001.
Marshall Engineers Use Virtual Reality
NASA Technical Reports Server (NTRS)
1993-01-01
Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup are to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).
Generating Ground Reference Data for a Global Impervious Surface Survey
NASA Technical Reports Server (NTRS)
Tilton, James C.; De Colstoun, Eric Brown; Wolfe, Robert E.; Tan, Bin; Huang, Chengquan
2012-01-01
We are developing an approach for generating ground reference data in support of a project to produce a 30m impervious cover data set of the entire Earth for the years 2000 and 2010 based on the Landsat Global Land Survey (GLS) data set. Since sufficient ground reference data for training and validation is not available from ground surveys, we are developing an interactive tool, called HSegLearn, to facilitate the photo-interpretation of 1 to 2 m spatial resolution imagery data, which we will use to generate the needed ground reference data at 30m. Through the submission of selected region objects and positive or negative examples of impervious surfaces, HSegLearn enables an analyst to automatically select groups of spectrally similar objects from a hierarchical set of image segmentations produced by the HSeg image segmentation program at an appropriate level of segmentation detail, and label these region objects as either impervious or nonimpervious.
Investigative change detection: identifying new topics using lexicon-based search
NASA Astrophysics Data System (ADS)
Hintz, Kenneth J.
2002-08-01
In law enforcement there is much textual data which needs to be searched in order to detect new threats. A new methodology which can be applied to this need is the automatic searching of the contents of documents from known sources to construct a lexicon of words used by that source. When analyzing future documents, the occurrence of words which have not been lexiconized is indicative of the introduction of a new topic into the source's lexicon, which should be examined in its context by an analyst. A system analogous to this has been built and used to detect Fads and Categories on web sites. Fad refers to the first appearance of a word not in the lexicon; Category refers to the repeated appearance of a Fad word and the exceeding of some frequency or spatial occurrence metric indicating a permanence to the Category.
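A minimal sketch of the lexicon-based Fad/Category idea described above; the tokenizer and the promotion threshold are illustrative assumptions, not the paper's parameters.

```python
# Sketch of lexicon-based change detection: words not in the source's lexicon
# are flagged as Fads; Fads that keep recurring are promoted to Categories.
import re
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-z']+", text.lower())

def build_lexicon(baseline_docs):
    """Words already used by the source (its lexicon)."""
    lexicon = set()
    for doc in baseline_docs:
        lexicon.update(tokenize(doc))
    return lexicon

CATEGORY_THRESHOLD = 3   # illustrative: repeat count that promotes a Fad

def scan(new_doc, lexicon, candidate_counts):
    """Flag Fads (first-seen words) and Categories (persistent Fads)."""
    fads, categories = [], []
    for word in set(tokenize(new_doc)):
        if word in lexicon:
            continue
        candidate_counts[word] += 1
        if candidate_counts[word] == 1:
            fads.append(word)                 # new topic candidate
        elif candidate_counts[word] >= CATEGORY_THRESHOLD:
            categories.append(word)           # recurring new topic
            lexicon.add(word)                 # now part of the lexicon
    return fads, categories

# usage: counts = Counter(); scan(doc, build_lexicon(old_docs), counts)
```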
Computer Applications and Virtual Environments (CAVE)
NASA Technical Reports Server (NTRS)
1993-01-01
Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup are to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).
Metric Learning to Enhance Hyperspectral Image Segmentation
NASA Technical Reports Server (NTRS)
Thompson, David R.; Castano, Rebecca; Bue, Brian; Gilmore, Martha S.
2013-01-01
Unsupervised hyperspectral image segmentation can reveal spatial trends that show the physical structure of the scene to an analyst. They highlight borders and reveal areas of homogeneity and change. Segmentations are independently helpful for object recognition, and assist with automated production of symbolic maps. Additionally, a good segmentation can dramatically reduce the number of effective spectra in an image, enabling analyses that would otherwise be computationally prohibitive. Specifically, using an over-segmentation of the image instead of individual pixels can reduce noise and potentially improve the results of statistical post-analysis. In this innovation, a metric learning approach is presented to improve the performance of unsupervised hyperspectral image segmentation. The prototype demonstrations attempt a superpixel segmentation in which the image is conservatively over-segmented; that is, the single surface features may be split into multiple segments, but each individual segment, or superpixel, is ensured to have homogenous mineralogy.
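As a generic stand-in for the over-segmentation step (not the article's metric-learning method), the sketch below produces a conservative superpixel segmentation of a synthetic hyperspectral cube with scikit-image (>= 0.19) and averages the spectra within each superpixel.

```python
# Generic superpixel over-segmentation sketch (scikit-image SLIC). This
# illustrates only the over-segmentation and spectrum-averaging step, not
# the article's metric learning. The input cube is synthetic.
import numpy as np
from skimage.segmentation import slic

cube = np.random.rand(128, 128, 30)      # rows x cols x bands (synthetic)

# Conservative over-segmentation: many small, spectrally homogeneous segments.
labels = slic(cube, n_segments=800, compactness=0.1, channel_axis=-1)

# Mean spectrum per superpixel greatly reduces the number of effective spectra.
seg_ids = np.unique(labels)
mean_spectra = np.stack([cube[labels == s].mean(axis=0) for s in seg_ids])
print(mean_spectra.shape)                # (number of superpixels, bands)
```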
Computer Applications and Virtual Environments (CAVE)
NASA Technical Reports Server (NTRS)
1993-01-01
Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) in Huntsville, Alabama began to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models were used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup were to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provided general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC). The X-34 program was cancelled in 2001.
Spatial modelling for tsunami evacuation route in Parangtritis Village
NASA Astrophysics Data System (ADS)
Juniansah, A.; Tyas, B. I.; Tama, G. C.; Febriani, K. R.; Farda, N. M.
2018-04-01
A tsunami is a series of huge sea waves that commonly occurs because of oceanic plate movement or tectonic activity under the sea. As a sudden-onset hazard, tsunamis have harmed many people over the years. Parangtritis village is one of the areas with high tsunami hazard risk in Indonesia and needs effective tsunami risk reduction. The aims of this study are to model tsunami susceptibility, to evaluate existing assembly points, and to suggest effective evacuation routes. The susceptibility map was created using the ALOS PALSAR DEM and a surface roughness coefficient. The tsunami modelling employed the inundation model developed by Berryman (2006). The results are used to determine new assembly points based on Sentinel-2A imagery and to determine the most effective evacuation routes using the Network Analyst tool. This model can be used to create a detailed-scale evacuation route, but it is unrepresentative for assembly points that are far from the road network.
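A rough sketch of the inland height-loss formulation commonly attributed to Berryman (2006) in studies of this kind; the coefficients should be verified against the original source, and the roughness, slope, and wave-height values are hypothetical.

```python
# Sketch of the tsunami height-loss formulation commonly cited from
# Berryman (2006); verify the coefficients against the original source.
# Surface roughness (n), slope and initial height are hypothetical.
import math

def height_loss_per_metre(h, n, slope_deg):
    """Loss of tsunami height per metre of inundation distance."""
    return (167.0 * n**2) / (h ** (1.0 / 3.0)) + 5.0 * math.sin(math.radians(slope_deg))

def inundation_distance(h0, n, slope_deg, step=1.0):
    """March inland in 1 m steps until the remaining height reaches zero.
    (Some implementations keep the loss fixed at the coastline height h0.)"""
    h, distance = h0, 0.0
    while h > 0:
        h -= height_loss_per_metre(h, n, slope_deg) * step
        distance += step
    return distance

# Example: 3 m wave at the coastline, built-up land (n ~ 0.045), 1 degree slope.
print(f"{inundation_distance(3.0, 0.045, 1.0):.0f} m inland")
```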
Toward a Cognitive Task Analysis for Biomedical Query Mediation
Hruby, Gregory W.; Cimino, James J.; Patel, Vimla; Weng, Chunhua
2014-01-01
In many institutions, data analysts use a Biomedical Query Mediation (BQM) process to facilitate data access for medical researchers. However, understanding of the BQM process is limited in the literature. To bridge this gap, we performed the initial steps of a cognitive task analysis using 31 BQM instances conducted between one analyst and 22 researchers in one academic department. We identified five top-level tasks, i.e., clarify research statement, explain clinical process, identify related data elements, locate EHR data element, and end BQM with either a database query or unmet, infeasible information needs, and 10 sub-tasks. We evaluated the BQM task model with seven data analysts from different clinical research institutions. Evaluators found all the tasks completely or semi-valid. This study contributes initial knowledge towards the development of a generalizable cognitive task representation for BQM. PMID:25954589
Toward a cognitive task analysis for biomedical query mediation.
Hruby, Gregory W; Cimino, James J; Patel, Vimla; Weng, Chunhua
2014-01-01
In many institutions, data analysts use a Biomedical Query Mediation (BQM) process to facilitate data access for medical researchers. However, understanding of the BQM process is limited in the literature. To bridge this gap, we performed the initial steps of a cognitive task analysis using 31 BQM instances conducted between one analyst and 22 researchers in one academic department. We identified five top-level tasks, i.e., clarify research statement, explain clinical process, identify related data elements, locate EHR data element, and end BQM with either a database query or unmet, infeasible information needs, and 10 sub-tasks. We evaluated the BQM task model with seven data analysts from different clinical research institutions. Evaluators found all the tasks completely or semi-valid. This study contributes initial knowledge towards the development of a generalizable cognitive task representation for BQM.
Panel: What could be Jungian about Human Rights work?
Berg, Astrid; Salman, Tawfiq; Troudart, Tristan
2011-06-01
The question of whether Jungian analysts should move beyond the consulting room to engage with mental health issues that pertain to the collective is the focus of this paper. Two narratives are presented: one from the view point of a psychiatrist in Occupied Palestine, the other from the conflicted situation which faces an Israeli analyst. Despite the strong ambivalence that is experienced on both sides, there is a willingness to meet and to take a standpoint without necessarily coming to a resolution. A third position is offered by describing experiences from the South African perspective. The African notion of Ubuntu is offered as a moral entry point that states that community goes beyond one's own; from this point of view, Jungian analysts can do no other than to act. © 2011, The Society of Analytical Psychology.
Malberg, Norka T
2012-01-01
Ms. Todd's paper illustrates both the value of the analytic frame and the relevance of a flexible approach in response to the external reality. In this case, the impingement of the outside environment became an ongoing threat to the analyst's thinking and to the development of a safe and predictable therapeutic relationship. Ms. Todd's narrative of Joey's three-and-a-half-year analysis emphasizes the impact of external interference on the analyst's capacity to experience difficult affects with and for the patient. In addition, it highlights the importance of recognizing and working through one's countertransference resistance. This commentary focuses on Ms. Todd's work with Joey, so I will only refer to her work with his parents and other providers as it is reflected in her analysis.
Public information, dissemination, and behavior analysis
Morris, Edward K.
1985-01-01
Behavior analysts have become increasingly concerned about inaccuracies and misconceptions in the public, educational, and professional information portraying their activities, but have done little to correct these views. The present paper has two purposes in this regard. First, the paper describes some of the conditions that have given rise to these concerns. Second, and more important, the paper surveys various procedures and programs for the dissemination of public information that may correct inaccuracies and misconceptions. Special consideration is also given to issues involving (a) the assessment of the problem, (b) the content and means of dissemination, (c) the possible contributions of behavior analysts to current misunderstandings, and (d) relationships between behavior analysts and the media. The dissemination of accurate and unbiased information constitutes an important new undertaking for behavior analysis. The future of the field may depend in part on such activity. PMID:22478623
The threat of male-to-female erotic transference.
Celenza, Andrea
2006-01-01
Vignettes from an ongoing psychoanalysis with a patient, Michael, are presented to illustrate the various dimensions of the erotic transference at different phases of the treatment. The relation to power, the experience and expression of aggression, how these may be organized by gender, and the female analyst's countertransference are discussed as potentially fostering or inhibitory in the development of an erotic transference. Traditional sociocultural gender stereotypes kept alive in fantasy can cause female analysts to subtly foreclose the impending threat of an intense erotic transference with male analysands due to a fear of outwardly directed male aggression. It is suggested that the maternal/containing transference can be unconsciously fostered by both analyst and analysand to defensively avoid expression of the aggressivized erotic transference in its full intensity. Similarities and differences in cases of sexual boundary violations with opposite-gender pairings are discussed.
A note on the history of the Norwegian Psychoanalytic Society from 1933 to 1945.
Anthi, Per; Haugsgjerd, Svein
2013-08-01
The Norwegian analysts, who were trained in Berlin before 1933, were drawn into a struggle against fascism, informed by politically leftist analysts who worked at the Berlin Institute. The Norwegian group, including the analysts Wilhelm Reich and Otto Fenichel, was committed to Marxist or social democratic ideologies in order to combat fascism and Nazism. They were a source of inspiration but also of conflict. After the war the leadership of the IPA was sceptical about the Norwegian group because of its former connections with Die Linke, as well as its relations with Wilhelm Reich. This paper in part considers the courageous efforts of Nic Waal, whom Ernest Jones used as a delegate and courier to solve problems for the IPA and who was unjustly treated after the war. Copyright © 2013 Institute of Psychoanalysis.
RipleyGUI: software for analyzing spatial patterns in 3D cell distributions
Hansson, Kristin; Jafari-Mamaghani, Mehrdad; Krieger, Patrik
2013-01-01
The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. To facilitate the quantification of neuronal cell patterns we have developed RipleyGUI, a MATLAB-based software that can be used to detect patterns in the 3D distribution of cells. RipleyGUI uses Ripley's K-function to analyze spatial distributions. In addition the software contains statistical tools to determine quantitative statistical differences, and tools for spatial transformations that are useful for analyzing non-stationary point patterns. The software has a graphical user interface making it easy to use without programming experience, and an extensive user manual explaining the basic concepts underlying the different statistical tools used to analyze spatial point patterns. The described analysis tool can be used for determining the spatial organization of neurons that is important for a detailed study of structure-function relationships. For example, neocortex that can be subdivided into six layers based on cell density and cell types can also be analyzed in terms of organizational principles distinguishing the layers. PMID:23658544
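As a minimal illustration of the statistic behind the software (not RipleyGUI itself), the sketch below estimates a 3D Ripley's K function without edge correction for a synthetic point pattern and compares it with the expectation under complete spatial randomness.

```python
# Minimal sketch of an (edge-correction-free) 3D Ripley's K estimate for a
# point pattern observed in a box of volume V; the point data are synthetic.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
points = rng.uniform(0, 100, size=(200, 3))   # cell positions in a 100^3 box
volume = 100.0 ** 3

def ripley_k(points, radii, volume):
    """K(r) = V / (n(n-1)) * number of ordered pairs closer than r."""
    n = len(points)
    d = pdist(points)                          # unordered pair distances
    return np.array([volume * 2.0 * np.sum(d <= r) / (n * (n - 1))
                     for r in radii])

radii = np.linspace(1, 25, 25)
k_obs = ripley_k(points, radii, volume)
k_csr = 4.0 / 3.0 * np.pi * radii ** 3         # expectation under complete
print(np.round(k_obs[-1] / k_csr[-1], 2))      # spatial randomness (CSR)
```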
SPATIAL SCALE OF AUTOCORRELATION IN WISCONSIN FROG AND TOAD SURVEY DATA
The degree to which local population dynamics are correlated with nearby sites has important implications for metapopulation dynamics and landscape management. Spatially extensive monitoring data can be used to evaluate large-scale population dynamic processes. Our goals in this ...
Selected annotated bibliographies for adaptive filtering of digital image data
Mayers, Margaret; Wood, Lynnette
1988-01-01
Digital spatial filtering is an important tool both for enhancing the information content of satellite image data and for implementing cosmetic effects which make the imagery more interpretable and appealing to the eye. Spatial filtering is a context-dependent operation that alters the gray level of a pixel by computing a weighted average formed from the gray level values of other pixels in the immediate vicinity. Traditional spatial filtering involves passing a particular filter or set of filters over an entire image. This assumes that the filter parameter values are appropriate for the entire image, which in turn is based on the assumption that the statistics of the image are constant over the image. However, the statistics of an image may vary widely over the image, requiring an adaptive or "smart" filter whose parameters change as a function of the local statistical properties of the image. Then a pixel would be averaged only with more typical members of the same population. This annotated bibliography cites some of the work done in the area of adaptive filtering. The methods usually fall into two categories, (a) those that segment the image into subregions, each assumed to have stationary statistics, and use a different filter on each subregion, and (b) those that use a two-dimensional "sliding window" to continuously estimate the filter parameters; these methods may operate in either the spatial or frequency domain, or may utilize both domains. They may be used to deal with images degraded by space-variant noise, to suppress undesirable local radiometric statistics while enforcing desirable (user-defined) statistics, to treat problems where space-variant point spread functions are involved, to segment images into regions of constant value for classification, or to "tune" images in order to remove (nonstationary) variations in illumination, noise, contrast, shadows, or haze. Since adaptive filtering, like nonadaptive filtering, is used in image processing to accomplish various goals, this bibliography is organized in subsections based on application areas. Contrast enhancement, edge enhancement, noise suppression, and smoothing are typically performed in order to correct for degradations introduced by the imaging process (for example, degradations due to the optics and electronics of the sensor, or to blurring caused by the intervening atmosphere, uniform motion, or defocused optics). Some of the papers listed may apply to more than one of the above categories; when this happens the paper is listed under the category for which the paper's emphasis is greatest. A list of survey articles is also supplied. These articles are general discussions on adaptive filters and reviews of work done. Finally, a short list of miscellaneous articles is included which were felt to be sufficiently important to list, but which do not fit into any of the above categories. This bibliography, listing items published from 1970 through 1987, is extensive, but by no means complete. It is intended as a guide for scientists and image analysts, listing references for background information as well as areas of significant development in adaptive filtering.
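As a concrete example of the "sliding window local statistics" family surveyed here, a simple Lee-type adaptive filter can be sketched as follows; the window size and assumed noise variance are illustrative choices, not values from the bibliography.

```python
# Sketch of a simple adaptive (Lee-type) filter: each pixel is pulled toward
# the local mean only as far as the local statistics allow, so flat regions
# are smoothed heavily while detailed regions are preserved.
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(image, size=7, noise_var=0.01):
    local_mean = uniform_filter(image, size)
    local_sq_mean = uniform_filter(image ** 2, size)
    local_var = np.maximum(local_sq_mean - local_mean ** 2, 0.0)

    # Weight approaches 1 in busy regions (preserve detail) and 0 in flat
    # regions (smooth heavily) -- the adaptive behaviour described above.
    weight = local_var / (local_var + noise_var)
    return local_mean + weight * (image - local_mean)

noisy = np.clip(np.random.default_rng(0).normal(0.5, 0.1, (256, 256)), 0, 1)
filtered = lee_filter(noisy, size=7, noise_var=0.01)
print(filtered.shape)
```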
The QUELCE Method: Using Change Drivers to Estimate Program Costs
2016-08-01
QUELCE computes a distribution of program costs based on Monte Carlo analysis of program cost drivers—assessed via analyses of dependency structure...possible scenarios. These include a dependency structure matrix to understand the interaction of change drivers for a specific project a...performed by the SEI or by company analysts. From the workshop results, analysts create a dependency structure matrix (DSM) of the change drivers
Can’t We All Just Get Along? Improving the Law Enforcement-Intelligence Community Relationship
2007-06-01
program that honored analysts, executives, authors, and agencies for exceptional intelligence writing and products. Some cross-pollination between...enforcement was primarily still reactive rather than proactive.8 There was more evidence of cross-pollination between local law enforcement and members of...capability, infrastructure, and other conventional foreign intelligence problems. Over the years, these CD analysts had little if any interaction or
2017-09-01
meta-analytic review and theoretical integration. Journal of Personality and Social Psychology, 65(4), 681. Karr-Wisniewski, P., & Lu, Y. (2010...dissertation applies attribution theory, a product of cognitive psychology, to evaluate how analysts collectively and individually make attributions in...Likewise, many researchers agree that anomaly detection is an integral component for insider threat analysis (Brdiczka, Liu, Price, Shen, Patil, Chow
Arab Threat Perceptions and the Future of the U.S. Military Presence in the Middle East
2015-10-01
knowledge, and provides solutions to strategic Army issues affecting the national security community. The Peacekeeping and Stability Operations...analysts concern topics having strategic implications for the Army, the Department of Defense, and the larger national security community. In addition...update the national security community on the research of our analysts, recent and forthcoming publications, and upcoming conferences sponsored by
Defense AT and L. Volume 45, Issue 1
2016-02-01
and government organizations. She currently is a senior research analyst for the MCBL Science and Technology Branch at Fort Leavenworth, Kansas...core functionality and interface design. Analysts from the Army S&T and MC user communities participated, including MCBL, Army Research Laboratory...Mica R. Endsley, Ph.D. Programs can use the 60-year foundation of scientific research and engineering in the field of human factors to develop robust
The Crime-Terror Nexus and the Threat to U.S. Homeland Security
2015-12-01
described by analysts as falling into the “gray area phenomenon.” The three case studies, the analysis, and conclusion of this thesis support the...sub-national groups are protean in nature; they are best described by analysts as falling into the “gray area phenomenon.” The three case studies, the...1 A. WHY IS THE CRIME-TERROR NEXUS A PROBLEM WORTHY OF RESEARCH
Coordinated Displays to Assist Cyber Defenders
2016-09-23
suspicious activity, such as the occurrence of a network event that is similar to a known attack signature, the system generates an alert which is then...presented to a human computer network defense analyst, or more succinctly, a network analyst, who must evaluate the veracity of that alert. To...display and select an alert to investigate further. Though alerts generally include some information about the nature of a potential threat, the
Analysis of the Research and Studies Program at the United States Military Academy
2004-09-01
operational assessment methodology, efficiency analysis, recruiting analysis especially marketing effects and capability analysis and modeling. Lieutenant...Finally, and arguably the most compelling rationale is the market force of increased funding. Figure 3 below shows the increase in funding received by...to integrate in a team of analysts from other departments to assist in the effort. First, bringing in analysts from other departments gave those
Discovering and Analyzing Deviant Communities: Methods and Experiments
2014-10-01
analysis. Sinkholing. Sinkholing is the current method of choice for botnet analysis and defense [3]. In this approach, the analyst deceives bots into...from the bots to the botnet. There are several drawbacks to sinkholing and shutting down botnets. The biggest issue is the complexity and time...involved in conducting a sinkholing campaign. Normally, sinkholing involves a coordinated effort from the analyst, ISPs, and law enforcement officials
Assessing the Value of Structured Analytic Techniques in the U.S. Intelligence Community
2016-01-01
Analytic Techniques, and Why Do Analysts Use Them? SATs are methods of organizing and stimulating thinking about intelligence problems. These methods... thinking; and imaginative thinking techniques encourage new perspectives, insights, and alternative scenarios. Among the many SATs in use today, the...more transparent, so that other analysts and customers can better understand how the judgments were reached. SATs also facilitate group involvement
Multi-Sensor Information Integration and Automatic Understanding
2008-05-27
distributions for target tracks and class which are utilized by an active learning cueing management framework to optimally task the appropriate sensor...modality to cued regions of interest. Moreover, this active learning approach also facilitates analyst cueing to help resolve track ambiguities in complex...scenes. We intend to leverage SIG’s active learning with analyst cueing under future efforts with ONR and other DoD agencies. Obtaining long-term
Multi-Sensor Information Integration and Automatic Understanding
2008-08-27
distributions for target tracks and class which are utilized by an active learning cueing management framework to optimally task the appropriate sensor modality...to cued regions of interest. Moreover, this active learning approach also facilitates analyst cueing to help resolve track ambiguities in complex...scenes. We intend to leverage SIG’s active learning with analyst cueing under future efforts with ONR and other DoD agencies. Obtaining long-term
A review method for UML requirements analysis model employing system-side prototyping.
Ogata, Shinpei; Matsuura, Saeko
2013-12-01
User interface prototyping is an effective method for users to validate the requirements defined by analysts at an early stage of software development. However, a user interface prototype offers weak support for analysts verifying the consistency of specifications of a system's internal aspects, such as business logic. Such inconsistency causes considerable rework because it often makes it impossible for developers to realize the system from the specifications. Functional prototyping is an effective way for analysts to verify this consistency, but it is costly and requires more detailed specifications. In this paper, we propose a review method that lets analysts efficiently verify the consistency among several different kinds of UML diagrams by employing system-side prototyping without a detailed model. The system-side prototype does not implement any business logic; instead, it visualizes the results of integrating the UML diagrams as Web pages. The usefulness of the proposal was evaluated by applying it to the development of a Library Management System (LMS) for a laboratory, conducted by a group. The proposal proved useful for discovering serious inconsistencies caused by misunderstandings among the members of the group.
God of the hinge: treating LGBTQIA patients.
Boland, Annie
2017-11-01
This paper looks at systems of gender within the context of analysis. It explores the unique challenges of individuation faced by transsexual, transgender, gender queer, gender non-conforming, cross-dressing and intersex patients. To receive patients generously we need to learn how a binary culture produces profound and chronic trauma. These patients wrestle with being who they are whilst simultaneously receiving negative projections and feeling invisible. Although these patients often present with the same struggles as gender-conforming individuals, understanding the specifically gendered aspect of their identity is imperative. An analyst's unconscious bias may lead to iatrogenic shaming. The author argues that rigorous, humble inquiry into the analyst's transphobia can be transformative for patient, analyst, and the work itself. Analysis may, then, provide gender-variant patients with their first remembered and numinous experience of authentic connection to self. Conjuring the image of a hinge, securely placed in the neutral region of a third space, creates a transpositive analytic temenos. Invoking the spirit of the Trickster in the construction of this matrix supports the full inclusion of gender-variant patients. Nuanced attunement scaffolds mirroring and the possibility of play. Being mindful that gender is sturdy and delicate as well as mercurial and defined enriches the analyst's listening. © 2017, The Society of Analytical Psychology.
Innovating Big Data Computing Geoprocessing for Analysis of Engineered-Natural Systems
NASA Astrophysics Data System (ADS)
Rose, K.; Baker, V.; Bauer, J. R.; Vasylkivska, V.
2016-12-01
Big data computing and analytical techniques offer opportunities to improve predictions about subsurface systems while quantifying and characterizing associated uncertainties from these analyses. Spatial analysis of subsurface natural and engineered systems, big data and otherwise, is based on variable-resolution, discontinuous, and often point-driven data to represent continuous phenomena. We will present examples from two spatio-temporal methods that have been adapted for use with big datasets and big data geo-processing capabilities. The first approach uses regional earthquake data to evaluate spatio-temporal trends associated with natural and induced seismicity. The second algorithm, the Variable Grid Method (VGM), is a flexible approach that presents spatial trends and patterns, such as those resulting from interpolation methods, while simultaneously visualizing and quantifying uncertainty in the underlying spatial datasets. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analyses to efficiently consume and utilize large geospatial data in these custom analytical algorithms through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom `Big Data' geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations.
NASA Astrophysics Data System (ADS)
Rose, K.; Bauer, J. R.; Baker, D. V.
2015-12-01
As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining connection to, and communicating the uncertainty in, the underlying spatial datasets. The VGM outputs a simultaneous visualization representative of the spatial data analyses and quantification of underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, or uncertainty calculated from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations. The presentation includes examples of the approach being applied to a range of subsurface, geospatial studies (e.g. induced seismicity risk).
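The ESRI/Hadoop implementation described above is the authors' own; purely as a conceptual sketch, the snippet below illustrates the core variable-grid idea in plain numpy under simplifying assumptions: grid cells are aggregated to a coarser resolution wherever an accompanying uncertainty raster exceeds a threshold, so the displayed resolution itself communicates confidence. The block size, threshold, and toy surfaces are invented.

```python
import numpy as np

def variable_grid(values, uncertainty, coarse=8, threshold=0.5):
    """Return (value, cell_size) rasters: blocks whose mean uncertainty exceeds
    `threshold` are reported at the coarse resolution, the rest keep the
    native resolution."""
    ny, nx = values.shape
    out_val = values.copy()
    out_size = np.ones_like(values)
    for i in range(0, ny, coarse):
        for j in range(0, nx, coarse):
            block_u = uncertainty[i:i+coarse, j:j+coarse]
            if block_u.mean() > threshold:
                block_v = values[i:i+coarse, j:j+coarse]
                out_val[i:i+coarse, j:j+coarse] = block_v.mean()
                out_size[i:i+coarse, j:j+coarse] = coarse
    return out_val, out_size

# Toy interpolated surface with uncertainty that grows from west to east
rng = np.random.default_rng(1)
vals = rng.normal(size=(64, 64))
unc = np.linspace(0, 1, 64)[None, :].repeat(64, axis=0)
vgm_vals, vgm_cells = variable_grid(vals, unc)
```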
Narragansett Bay (NB) has been extensively sampled over the last 50 years by various government agencies, academic institutions, and private groups. To date, most spatial research conducted within the estuary has employed deterministic sampling designs. Several studies have used ...
Genome-scale modelling of microbial metabolism with temporal and spatial resolution.
Henson, Michael A
2015-12-01
Most natural microbial systems have evolved to function in environments with temporal and spatial variations. A major limitation to understanding such complex systems is the lack of mathematical modelling frameworks that connect the genomes of individual species and temporal and spatial variations in the environment to system behaviour. The goal of this review is to introduce the emerging field of spatiotemporal metabolic modelling based on genome-scale reconstructions of microbial metabolism. The extension of flux balance analysis (FBA) to account for both temporal and spatial variations in the environment is termed spatiotemporal FBA (SFBA). Following a brief overview of FBA and its established dynamic extension, the SFBA problem is introduced and recent progress is described. Three case studies are reviewed to illustrate the current state-of-the-art and possible future research directions are outlined. The author posits that SFBA is the next frontier for microbial metabolic modelling and a rapid increase in methods development and system applications is anticipated. © 2015 Authors; published by Portland Press Limited.
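As a minimal sketch of the FBA core that SFBA builds on (not of any genome-scale reconstruction), the example below solves the standard FBA linear program, maximize c·v subject to S·v = 0 with flux bounds, for an invented three-reaction toy network using scipy's linear programming routine. In dynamic and spatiotemporal extensions this same LP is re-solved at each time step and grid location with bounds set by local substrate concentrations.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network (not a genome-scale model): uptake -> A, A -> B, B -> biomass
#             R_uptake  R_convert  R_biomass
S = np.array([[ 1,       -1,        0],    # metabolite A
              [ 0,        1,       -1]])   # metabolite B

c = np.array([0.0, 0.0, 1.0])              # objective: biomass flux
bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 mmol/gDW/h (assumed)

# FBA: maximize c.v  subject to  S v = 0 (steady state) and the flux bounds
res = linprog(-c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)            # expected [10, 10, 10]
```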
Developing GIS-based eastern equine encephalitis vector-host models in Tuskegee, Alabama.
Jacob, Benjamin G; Burkett-Cadena, Nathan D; Luvall, Jeffrey C; Parcak, Sarah H; McClure, Christopher J W; Estep, Laura K; Hill, Geoffrey E; Cupp, Eddie W; Novak, Robert J; Unnasch, Thomas R
2010-02-24
A site near Tuskegee, Alabama was examined for vector-host activities of eastern equine encephalomyelitis virus (EEEV). Land cover maps of the study site were created in ArcInfo 9.2 from QuickBird data encompassing visible and near-infrared (NIR) band information (0.45 to 0.72 microm) acquired July 15, 2008. Georeferenced mosquito and bird sampling sites, and their associated land cover attributes from the study site, were overlaid onto the satellite data. SAS 9.1.4 was used to explore univariate statistics and to generate regression models using the field and remote-sampled mosquito and bird data. Regression models indicated that Culex erracticus and Northern Cardinals were the most abundant mosquito and bird species, respectively. Spatial linear prediction models were then generated in Geostatistical Analyst Extension of ArcGIS 9.2. Additionally, a model of the study site was generated, based on a Digital Elevation Model (DEM), using ArcScene extension of ArcGIS 9.2. For total mosquito count data, a first-order trend ordinary kriging process was fitted to the semivariogram at a partial sill of 5.041 km, nugget of 6.325 km, lag size of 7.076 km, and range of 31.43 km, using 12 lags. For total adult Cx. erracticus count, a first-order trend ordinary kriging process was fitted to the semivariogram at a partial sill of 5.764 km, nugget of 6.114 km, lag size of 7.472 km, and range of 32.62 km, using 12 lags. For the total bird count data, a first-order trend ordinary kriging process was fitted to the semivariogram at a partial sill of 4.998 km, nugget of 5.413 km, lag size of 7.549 km and range of 35.27 km, using 12 lags. For the Northern Cardinal count data, a first-order trend ordinary kriging process was fitted to the semivariogram at a partial sill of 6.387 km, nugget of 5.935 km, lag size of 8.549 km and a range of 41.38 km, using 12 lags. Results of the DEM analyses indicated a statistically significant inverse linear relationship between total sampled mosquito data and elevation (R2 = -.4262; p < .0001), with a standard deviation (SD) of 10.46, and total sampled bird data and elevation (R2 = -.5111; p < .0001), with a SD of 22.97. DEM statistics also indicated a significant inverse linear relationship between total sampled Cx. erracticus data and elevation (R2 = -.4711; p < .0001), with a SD of 11.16, and the total sampled Northern Cardinal data and elevation (R2 = -.5831; p < .0001), SD of 11.42. These data demonstrate that GIS/remote sensing models and spatial statistics can capture space-varying functional relationships between field-sampled mosquito and bird parameters for determining risk for EEEV transmission.
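The study's surfaces were produced with the ArcGIS Geostatistical Analyst extension; purely as an illustration of the ordinary kriging computation it performs, the sketch below implements the semivariogram form of ordinary kriging with a spherical model in plain numpy. The nugget, partial sill, range, and sample points are illustrative values, not the fitted parameters reported above.

```python
import numpy as np

def spherical(h, nugget, psill, a):
    """Spherical semivariogram with nugget, partial sill, and range a."""
    g = np.where(h <= a,
                 nugget + psill * (1.5 * h / a - 0.5 * (h / a) ** 3),
                 nugget + psill)
    return np.where(h == 0, 0.0, g)

def ordinary_krige(xy, z, x0, nugget=0.5, psill=5.0, a=30.0):
    """Ordinary kriging estimate at x0 from samples (xy, z)."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    K = np.empty((n + 1, n + 1))
    K[:n, :n] = spherical(d, nugget, psill, a)
    K[n, :n] = K[:n, n] = 1.0           # Lagrange multiplier row/column
    K[n, n] = 0.0
    d0 = np.linalg.norm(xy - x0, axis=1)
    k = np.append(spherical(d0, nugget, psill, a), 1.0)
    w = np.linalg.solve(K, k)
    return float(w[:n] @ z)

# Hypothetical trap locations (km coordinates) and counts
rng = np.random.default_rng(3)
pts = rng.uniform(0, 50, size=(25, 2))
counts = 10 + 0.2 * pts[:, 0] + rng.normal(0, 1, 25)
print(ordinary_krige(pts, counts, np.array([25.0, 25.0])))
```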
NASA Technical Reports Server (NTRS)
Butler, T. G.
1985-01-01
Some of the problems that confront an analyst in free-body modeling to satisfy rigid-body conditions are discussed, and some remedies for these problems are presented. The problems of detecting these culprits at various levels within the analysis are examined. A new method within NASTRAN for checking the model for defects very early in the analysis, without requiring the analyst to bear the expense of an eigenvalue analysis before discovering these defects, is outlined.
2010-12-01
the Queen (in Right of Canada), as represented by the Minister of ...organizational, as well as to the biases of analysts and intelligence agencies. Such a finding calls for an in-depth review of the ...intelligence production process that could be explained by the application of the knowledge and methods acquired in
NASA Technical Reports Server (NTRS)
Coberly, W. A.; Tubbs, J. D.; Odell, P. L.
1979-01-01
The overall success of large-scale crop inventories of agricultural regions using Landsat multispectral scanner data is highly dependent upon the labeling of training data by analyst/photointerpreters. The principal analyst tool in labeling training data is a false color infrared composite of Landsat bands 4, 5, and 7. In this paper, this color display is investigated and its influence upon classification errors is partially determined.
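As a small illustration of the display described (not code from the paper), the sketch below assembles the standard false color infrared composite from Landsat MSS bands, mapping the near-infrared band 7 to red, band 5 to green, and band 4 to blue, with a simple percentile contrast stretch. The band arrays are assumed to be already loaded as 2-D numpy arrays.

```python
import numpy as np

def false_color_ir(band7, band5, band4):
    """False-color infrared composite from Landsat MSS bands:
    NIR (band 7) -> red, red (band 5) -> green, green (band 4) -> blue."""
    def stretch(b, lo=2, hi=98):
        p_lo, p_hi = np.percentile(b, [lo, hi])      # percentile contrast stretch
        return np.clip((b - p_lo) / (p_hi - p_lo + 1e-9), 0, 1)
    return np.dstack([stretch(band7), stretch(band5), stretch(band4)])

# Hypothetical 100x100 band arrays; vegetation appears bright red in the result
rng = np.random.default_rng(0)
b4, b5, b7 = (rng.uniform(0, 255, (100, 100)) for _ in range(3))
rgb = false_color_ir(b7, b5, b4)   # ready for display, e.g. with matplotlib imshow
```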
The Case for Licensure of Applied Behavior Analysts
Dorsey, Michael F; Weinberg, Michael; Zane, Thomas; Guidi, Megan M
2009-01-01
The evolution of the field of applied behavior analysis to a practice-oriented profession has created the need to ensure that the consumers of these services are adequately protected. We review the limitations of the current board certification process and present a rationale for the establishment of licensing standards for applied behavior analysts on a state-by-state basis. Recommendations for securing the passage of a licensure bill also are discussed. PMID:22477697
CTC Sentinel. Volume 9, Issue 6, June 2016
2016-06-01
Greece. He is a Ph.D. candidate at the Free State University in South Africa and a senior analyst at the Research Institute for European and... research by police and intelligence analysts revealed that “approximately 40 percent of the several thousand Islamist extremists across the country...story Richard Walton argues the threat to Euro 2016, which concludes on July 10, is more acute than for any other international sporting event in history
2017-10-01
significant pressure upon Air Force imagery analysts to exhibit expertise in multiple disciplines including full-motion video, electro-optical still...disciplines varies, but the greatest divergence is between full-motion video and all other forms of still imagery. This paper delves into three...motion video discipline were to be created. The research reveals several positive aspects of this course of action but precautions would be required
Some Thoughts on Self-Disclosure.
Richards, Arnold
2018-04-01
This paper explores the pros and cons of self-disclosure and self-revelation by the analyst. It takes as its starting point a paper by Jeffrey Stern that shows a mixed but generally positive outcome of an incident of self-disclosure. The trend in more recent times has been toward somewhat more self-disclosure, with modern analysts' views on a continuum. The author discusses an example from his own practice, in which he delayed self-disclosure for some time but did reveal facts about himself, and how this had a mostly positive outcome. He concludes by distinguishing self-disclosure, which entails stating facts about the self, from self-revelation, in which the analyst reveals his feelings about some specifics from his own life or in the patient's disclosure. Such revelation is not likely to be beneficial to the therapeutic alliance in its early stages, but may be of value as the analytic relationship and trust develop over a longer time.
A note on consummation and termination.
Calef, V; Weinshel, E M
1983-01-01
The sensation sometimes expressed by analytic patients, most notably during termination, of having left some "unfinished business" (to which they hope to return) is not necessarily simply a judgment about the analysis; frequently it is an affective component of the wish for consummation which has not been granted by the analysis. Simultaneously, it expresses the defense against that very consummation. The wish to give the analyst a gift is in some sense the direct opposite, or more correctly, expresses the defense more openly as a bribe and warning to the analyst that he should not expect or hope for consummation of the instinctual wishes which have been the center of analytic work; i.e., it is a defense against the fulfillment of those wishes almost as if the analyst, by attempting to analyze them, insists upon their enactment. Nevertheless, and despite the apparent contradiction, both affects, which serve similar functions, may appear simultaneously.
On evading analysis by becoming an analyst.
Meredith-Owen, William
2007-09-01
This paper considers what implications Bion's famous anecdote about 'some patients getting better and others going on to become psycho-analysts' might have in clinical practice. It explores key stages in the post-qualification analyses of three practitioners whose training analyses had left them qualified but restless and dissatisfied with their ongoing work. It suggests that a significant common factor in these unsatisfactory outcomes was the weakness of these analysands' egos, understood as their inability to enjoy coniunctios, and their profound fear of accessing the source of the problem. This had led to an unwitting investment in spurious super-ego driven alternatives such as professional qualification rather than face the initially bleak realization (of 'nameless dread') that could initiate analysis and individuation. Because of the containment and reward implicit in the training environment it is argued that training analysts--despite their experience and expertise--remain vulnerable to being recruited into an ameliorative fantasy that blocks the transference and inhibits development.
The case of David: on the couch for sixty minutes, nine years of once-a-week treatment.
Kavaler-Adler, Susan
2005-06-01
This paper illustrates a unique case of object relations psychoanalytic psychotherapy on a once-a-week treatment basis. The work of developmental mourning that would be thought to require two to five sessions a week was accomplished on a once-a-week basis. The analyst adjusted the treatment hour, in this one case, to 60 minutes, as opposed to the 45- or 50-minute hour. When treatment began, the analyst made an intuitive judgment to increase the patient's one session a week--which the patient made clear was all he was ready to do--to 60 minutes. The analyst made time in her practice for this 60-minute session and has continued with the patient using this format for 9 years of treatment. This has led up to the current stage of treatment, which has been so critical to the patient's self-integration process.
King, Robert; Parker, Simon; Mouzakis, Kon; Fletcher, Winston; Fitzgerald, Patrick
2007-11-01
The Integrated Task Modeling Environment (ITME) is a user-friendly software tool that has been developed to automatically recode low-level data into an empirical record of meaningful task performance. The present research investigated and validated the performance of the ITME software package by conducting complex simulation missions and comparing the task analyses produced by ITME with task analyses produced by experienced video analysts. A very high interrater reliability (≥ .94) existed between experienced video analysts and the ITME for the task analyses produced for each mission. The mean session time:analysis time ratio was 1:24 using video analysis techniques and 1:5 using the ITME. It was concluded that the ITME produced task analyses that were as reliable as those produced by experienced video analysts, and significantly reduced the time cost associated with these analyses.
Interpreting Black-Box Classifiers Using Instance-Level Visual Explanations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tamagnini, Paolo; Krause, Josua W.; Dasgupta, Aritra
2017-05-14
To realize the full potential of machine learning in diverse real-world domains, it is necessary for model predictions to be readily interpretable and actionable for the human in the loop. Analysts, who are the users but not the developers of machine learning models, often do not trust a model because of the lack of transparency in associating predictions with the underlying data space. To address this problem, we propose Rivelo, a visual analytic interface that enables analysts to understand the causes behind predictions of binary classifiers by interactively exploring a set of instance-level explanations. These explanations are model-agnostic, treating a model as a black box, and they help analysts in interactively probing the high-dimensional binary data space for detecting features relevant to predictions. We demonstrate the utility of the interface with a case study analyzing a random forest model on the sentiment of Yelp reviews about doctors.
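Rivelo's explanation engine is the authors' own; in the same model-agnostic spirit, the sketch below illustrates one simple kind of instance-level explanation on synthetic binary data: each feature of a single instance is flipped and the change in the random forest's predicted probability is recorded as that feature's contribution. The dataset, labels, and feature semantics are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(500, 10))                         # binary "word present" features
y = (((X[:, 0] == 1) & (X[:, 3] == 0)) | (X[:, 7] == 1)).astype(int)  # synthetic sentiment label

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

def explain_instance(model, x):
    """Model-agnostic, instance-level explanation: contribution of each binary
    feature measured by the change in P(class=1) when that feature is flipped."""
    base = model.predict_proba(x.reshape(1, -1))[0, 1]
    deltas = []
    for j in range(len(x)):
        x_flip = x.copy()
        x_flip[j] = 1 - x_flip[j]
        deltas.append(base - model.predict_proba(x_flip.reshape(1, -1))[0, 1])
    return base, np.array(deltas)

prob, contributions = explain_instance(model, X[0])
top = np.argsort(-np.abs(contributions))[:3]
print(f"P(positive)={prob:.2f}; most influential features: {top}")
```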
Boundary and analytic attitude: reflections on a summer holiday break.
Wright, Susanna
2016-06-01
The effect of a boundary in analytic work at the summer holiday break is discussed in relation to archetypal experiences of exclusion, loss and limitation. Some attempts by patients to mitigate an analyst's act of separation are reviewed as enactments, and in particular the meanings of a gift made by one patient. Analytic attitude towards enactment from within different schools of practice is sketched, with reference to the effect on the analyst of departing from the received practice of their own allegiance. A theory is adumbrated that the discomfort of 'contravening the rules' has a useful effect in sparking the analyst into consciousness, with greater attention to salient features in an individual case. Interpretation as an enactment is briefly considered, along with the possible effects of containing the discomfort of a patient's enactment in contrast to confronting it with interpretation. © 2016, The Society of Analytical Psychology.
Situational Awareness of Network System Roles (SANSR)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huffer, Kelly M; Reed, Joel W
In a large enterprise it is difficult for cyber security analysts to know what services and roles every machine on the network is performing (e.g., file server, domain name server, email server). Using network flow data, already collected by most enterprises, we developed a proof-of-concept tool that discovers the roles of a system using both clustering and categorization techniques. The tool's role information would allow cyber analysts to detect consequential changes in the network, initiate incident response plans, and optimize their security posture. The results of this proof-of-concept tool proved to be quite accurate on three real data sets. We will present the algorithms used in the tool, describe the results of preliminary testing, provide visualizations of the results, and discuss areas for future work. Without this kind of situational awareness, cyber analysts cannot quickly diagnose an attack or prioritize remedial actions.
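The abstract does not spell out the tool's algorithms, so the following is only a hypothetical sketch of the general idea: per-host features summarizing flow records (here, fractions of traffic on a few service ports and a bytes-out/bytes-in ratio) are clustered with k-means so that hosts with similar traffic profiles fall into candidate roles. The feature choices and synthetic hosts are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-host features aggregated from flow records: fractions of
# traffic on four service ports plus a mean bytes-out/bytes-in ratio.
rng = np.random.default_rng(7)
web = np.column_stack([rng.beta(1, 20, (40, 2)), rng.beta(20, 2, (40, 2)),
                       rng.normal(5, 1, (40, 1))])
dns = np.column_stack([rng.beta(1, 20, (15, 1)), rng.beta(20, 2, (15, 1)),
                       rng.beta(1, 20, (15, 2)), rng.normal(1, 0.2, (15, 1))])
clients = np.column_stack([rng.beta(1, 20, (100, 4)), rng.normal(0.2, 0.05, (100, 1))])
hosts = np.vstack([web, dns, clients])

X = StandardScaler().fit_transform(hosts)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
# Each cluster is a candidate role (e.g. web server, DNS server, workstation);
# a later change in a host's cluster assignment is a cue for the analyst.
print(np.bincount(labels))
```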
A demonstration that the adaptation of electronic instrumentation and towed survey strategies are effective in providing rapid, spatially extensive, and cost effective data for assessment of the Great Lakes.
NASA Astrophysics Data System (ADS)
Syed, N. H.; Rehman, A. A.; Hussain, D.; Ishaq, S.; Khan, A. A.
2017-11-01
Morphometric analysis is vital for any watershed investigation and is indispensable for flood risk assessment in sub-watershed basins. The present study was undertaken to carry out a critical evaluation and assessment of sub-watershed morphological parameters for flood risk assessment of the Central Karakorum National Park (CKNP), where a geographic information system and remote sensing (GIS & RS) approach was used for quantifying the parameters and mapping the sub-watershed units. An ASTER DEM was used as the geospatial data source for watershed delineation and the stream network. Morphometric analysis was carried out using the spatial analyst tool of ArcGIS 10.2. The parameters included bifurcation ratio (Rb), drainage texture (Rt), circularity ratio (Rc), elongation ratio (Re), drainage density (Dd), stream length (Lu), stream order (Su), slope, and basin length (Lb), each calculated separately. The analysis revealed that stream orders range from 1 to 6 and that the total number of stream segments of all orders is 52. A multi-criteria analysis process was used to calculate the risk factor. As a result, a map of sub-watershed prioritization was developed using a weighted, standardized risk factor. These results help in understanding the sensitivity of the different sub-watersheds of the study area to flash floods and lead to better management of the mountainous regions with respect to flash floods.
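For readers unfamiliar with the listed parameters, the sketch below computes a few of them from hypothetical measurements using the standard Horton/Strahler, Schumm, and Miller definitions (mean bifurcation ratio, drainage density, elongation ratio, circularity ratio). The stream counts and basin geometry are invented, not CKNP values.

```python
import math

# Hypothetical sub-watershed measurements (illustrative only)
stream_counts = {1: 30, 2: 12, 3: 6, 4: 3, 5: 1}   # streams per Strahler order
total_stream_length_km = 210.0
area_km2 = 95.0
perimeter_km = 52.0
basin_length_km = 18.0

# Mean bifurcation ratio Rb = N_u / N_(u+1), averaged over successive orders
ratios = [stream_counts[u] / stream_counts[u + 1] for u in range(1, max(stream_counts))]
Rb = sum(ratios) / len(ratios)

Dd = total_stream_length_km / area_km2                        # drainage density (km/km^2)
Re = (2.0 / basin_length_km) * math.sqrt(area_km2 / math.pi)  # elongation ratio (Schumm)
Rc = 4.0 * math.pi * area_km2 / perimeter_km**2               # circularity ratio (Miller)

print(f"Rb={Rb:.2f}  Dd={Dd:.2f}  Re={Re:.2f}  Rc={Rc:.2f}")
```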
Spatial epidemiology of bovine tuberculosis in Mexico.
Martínez, Horacio Zendejas; Suazo, Feliciano Milián; Cuador Gil, José Quintín; Bello, Gustavo Cruz; Anaya Escalera, Ana María; Márquez, Gabriel Huitrón; Casanova, Leticia García
2007-01-01
The purpose of this study was to use geographic information systems (GIS) and geo-statistical methods of ordinary kriging to predict the prevalence and distribution of bovine tuberculosis (TB) in Jalisco, Mexico. A random sample of 2 287 herds selected from a set of 48 766 was used for the analysis. Spatial location of herds was obtained by either a personal global positioning system (GPS), a database from the Instituto Nacional de Estadística Geografía e Informática (INEGI) or Google Earth. Information on TB prevalence was provided by the Jalisco Commission for the Control and Eradication of Tuberculosis (COEETB). Prediction of TB was obtained using ordinary kriging in the geostatistical analyst module in ArcView8. A predicted high prevalence area of TB matching the distribution of dairy cattle was observed. This prediction was in agreement with the prevalence calculated on the total 48 766 herds. Validation was performed taking estimated values of TB prevalence at each municipality, extracted from the kriging surface and then compared with the real prevalence values using a correlation test, giving a value of 0.78, indicating that GIS and kriging are reliable tools for the estimation of TB distribution based on a random sample. This resulted in a significant savings of resources.
Andrew T. Hudak; Jeffrey S. Evans; Nicholas L. Crookston; Michael J. Falkowski; Brant K. Steigers; Rob Taylor; Halli Hemingway
2008-01-01
Stand exams are the principal means by which timber companies monitor and manage their forested lands. Airborne LiDAR surveys sample forest stands at much finer spatial resolution and broader spatial extent than is practical on the ground. In this paper, we developed models that leverage spatially intensive and extensive LiDAR data and a stratified random sample of...
GRODY - GAMMA RAY OBSERVATORY DYNAMICS SIMULATOR IN ADA
NASA Technical Reports Server (NTRS)
Stark, M.
1994-01-01
Analysts use a dynamics simulator to test the attitude control system algorithms used by a satellite. The simulator must simulate the hardware, dynamics, and environment of the particular spacecraft and provide user services which enable the analyst to conduct experiments. Researchers at Goddard's Flight Dynamics Division developed GRODY alongside GROSS (GSC-13147), a FORTRAN simulator which performs the same functions, in a case study to assess the feasibility and effectiveness of the Ada programming language for flight dynamics software development. They used popular object-oriented design techniques to link the simulator's design with its function. GRODY is designed for analysts familiar with spacecraft attitude analysis. The program supports maneuver planning as well as analytical testing and evaluation of the attitude determination and control system used on board the Gamma Ray Observatory (GRO) satellite. GRODY simulates the GRO on-board computer and Control Processor Electronics. The analyst/user sets up and controls the simulation. GRODY allows the analyst to check and update parameter values and ground commands, obtain simulation status displays, interrupt the simulation, analyze previous runs, and obtain printed output of simulation runs. The video terminal screen display allows visibility of command sequences, full-screen display and modification of parameters using input fields, and verification of all input data. Data input available for modification includes alignment and performance parameters for all attitude hardware, simulation control parameters which determine simulation scheduling and simulator output, initial conditions, and on-board computer commands. GRODY generates eight types of output: simulation results data set, analysis report, parameter report, simulation report, status display, plots, diagnostic output (which helps the user trace any problems that have occurred during a simulation), and a permanent log of all runs and errors. The analyst can send results output in graphical or tabular form to a terminal, disk, or hardcopy device, and can choose to have any or all items plotted against time or against each other. Goddard researchers developed GRODY on a VAX 8600 running VMS version 4.0. For near real time performance, GRODY requires a VAX at least as powerful as a model 8600 running VMS 4.0 or a later version. To use GRODY, the VAX needs an Ada Compilation System (ACS), Code Management System (CMS), and 1200K memory. GRODY is written in Ada and FORTRAN.
Royle, J. Andrew; Chandler, Richard B.; Sollmann, Rahel; Gardner, Beth
2013-01-01
Spatial Capture-Recapture provides a revolutionary extension of traditional capture-recapture methods for studying animal populations using data from live trapping, camera trapping, DNA sampling, acoustic sampling, and related field methods. This book is a conceptual and methodological synthesis of spatial capture-recapture modeling. As a comprehensive how-to manual, this reference contains detailed examples of a wide range of relevant spatial capture-recapture models for inference about population size and spatial and temporal variation in demographic parameters. Practicing field biologists studying animal populations will find this book to be a useful resource, as will graduate students and professionals in ecology, conservation biology, and fisheries and wildlife management.
Spatial-Operator Algebra For Flexible-Link Manipulators
NASA Technical Reports Server (NTRS)
Jain, Abhinandan; Rodriguez, Guillermo
1994-01-01
Method of computing dynamics of multiple-flexible-link robotic manipulators based on spatial-operator algebra, which was originally applied to rigid-link manipulators. Aspects of spatial-operator-algebra approach described in several previous articles in NASA Tech Briefs, most recently "Robot Control Based on Spatial-Operator Algebra" (NPO-17918). In extension of spatial-operator algebra to manipulators with flexible links, each link represented by finite-element model: mass of flexible link apportioned among smaller, lumped-mass rigid bodies, and coupling of motions expressed in terms of vibrational modes. This leads to operator expression for modal-mass matrix of link.
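As a rough numerical illustration of the modal reduction described (not of the spatial-operator formulation itself), the sketch below lumps a flexible link into three masses, solves the generalized eigenproblem K phi = omega^2 M phi, and projects the mass matrix onto the resulting mass-normalized modes to obtain the modal-mass matrix. The masses and stiffness values are invented.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical flexible link: 3 lumped masses joined by 2 elastic segments
m = np.array([2.0, 1.5, 1.0])                 # kg
k = 1.0e4                                     # N/m per segment
M = np.diag(m)
K = k * np.array([[ 1, -1,  0],
                  [-1,  2, -1],
                  [ 0, -1,  1]], dtype=float)

# Generalized eigenproblem K phi = w^2 M phi; eigh returns M-orthonormal modes
w2, Phi = eigh(K, M)
M_modal = Phi.T @ M @ Phi        # modal-mass matrix, ~identity for these modes
print(np.round(M_modal, 6))
print("natural frequencies (rad/s):", np.sqrt(np.clip(w2, 0, None)))
```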
Spatially-protected Topology and Group Cohomology in Band Insulators
NASA Astrophysics Data System (ADS)
Alexandradinata, A.
This thesis investigates band topologies which rely fundamentally on spatial symmetries. A basic geometric property that distinguishes spatial symmetries is how they transform the spatial origin. Point groups consist of spatial transformations that preserve the spatial origin, while un-split extensions of the point groups by spatial translations are referred to as nonsymmorphic space groups. The first part of the thesis addresses topological phases with discretely-robust surface properties: we introduce theories for the Cnv point groups, as well as certain nonsymmorphic groups that involve glide reflections. These band insulators admit a powerful characterization through the geometry of quasimomentum space; parallel transport in this space is represented by the Wilson loop. The non-symmorphic topology we study is naturally described by a further extension of the nonsymmorphic space group by quasimomentum translations (the Wilson loop), thus placing real and quasimomentum space on equal footing -- here, we introduce the language of group cohomology into the theory of band insulators. The second part of the thesis addresses topological phases without surface properties -- their only known physical consequences are discrete signatures in parallel transport. We provide two such case studies with spatial-inversion and discrete-rotational symmetries respectively. One lesson learned here regards the choice of parameter loops in which we carry out transport -- the loop must be chosen to exploit the symmetry that protects the topology. While straight loops are popular for their connection with the geometric theory of polarization, we show that bent loops also have utility in topological band theory.
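As a concrete, self-contained illustration of quasimomentum parallel transport via a Wilson loop (not one of the thesis's models), the sketch below computes the discretized one-band Wilson loop, i.e. the Zak phase, for the two-band SSH chain, whose inversion symmetry quantizes the phase to 0 or pi. The hopping amplitudes are chosen purely for illustration.

```python
import numpy as np

def ssh_h(k, t1, t2):
    """Bloch Hamiltonian of the SSH chain (illustrative two-band model)."""
    f = t1 + t2 * np.exp(1j * k)
    return np.array([[0, f], [np.conj(f), 0]])

def zak_phase(t1, t2, nk=401):
    """Discretized Wilson loop of the occupied band over the Brillouin zone:
    W = prod_j <u(k_j)|u(k_j+1)>, Zak phase = -Im log W."""
    ks = np.linspace(0.0, 2.0 * np.pi, nk)
    u = [np.linalg.eigh(ssh_h(k, t1, t2))[1][:, 0] for k in ks]  # lower band
    u[-1] = u[0]                     # close the loop in a periodic gauge
    W = 1.0 + 0.0j
    for j in range(nk - 1):
        W *= np.vdot(u[j], u[j + 1])
    return -np.angle(W)

print(zak_phase(1.0, 0.5))   # trivial phase: ~0
print(zak_phase(0.5, 1.0))   # topological phase: ~+/- pi
```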
2008-03-01
amount of arriving data, extract actionable information, and integrate it with prior knowledge. Add to that the pressures of today’s fusion center...information, and integrate it with prior knowledge. Add to that the pressures of today’s fusion center climate and it becomes clear that analysts, police... fusion centers, including specifics about how these problems manifest at the Illinois State Police (ISP) Statewide Terrorism and Intelligence Center
An Introduction to the Mission Risk Diagnostic for Incident Management Capabilities (MRD-IMC)
2014-05-01
objectives. Analysts applying the MRD-IMC evaluate a set of systemic risk factors (called drivers) to aggregate decision-making data and provide decision...function is in position to achieve its mission and objective(s) [Alberts 2012]. To accomplish this goal, analysts applying the MRD-IMC evaluate a...evaluation of IM processes and capabilities. The MRD-IMC comprises the following three core tasks: 1. Identify the mission and objective(s
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolton, P.
The purpose of this task was to support ESH-3 in providing Airborne Release Fraction and Respirable Fraction training to safety analysts at LANL who perform accident analysis, hazard analysis, safety analysis, and/or risk assessments at nuclear facilities. The task included preparation of materials for and the conduct of two 3-day training courses covering the following topics: safety analysis process; calculation model; aerosol physics concepts for safety analysis; and overview of empirically derived airborne release fractions and respirable fractions.
Unlocking User-Centered Design Methods for Building Cyber Security Visualizations
2015-08-07
have rarely linked these methods to a final, deployed tool. Goodall et al. interviewed analysts to derive requirements for a network security tool [14... Goodall, W. Lutters, and A. Komlodi. The work of intrusion detection: rethinking the role of security analysts. AMCIS 2004 Proceedings, 2004. [14] J. R... Goodall, A. A. Ozok, W. G. Lutters, P. Rheingans, and A. Komlodi. A user-centered approach to visualizing network traffic for intrusion
2008-03-01
thinking as a specialized course. None of these schools requires a foreign language. Only Cal State Fullerton and Sacramento State require students to...research methods. None of the schools require a foreign language. AMU offers critical thinking as an elective from its General Program for students...analysts should learn more about the religion. Question Six: Instruction in which of these types of philosophies (Western, Eastern, Middle Eastern, or
On the question of self-disclosure by the analyst: error or advance in technique?
Jacobs, T
1999-04-01
The question of self-disclosure by the analyst and its uses in treatment is an issue widely debated today. In this paper, the author reviews this controversial technique from historical and contemporary points of view, delineates several forms of self-disclosure, and, by means of several clinical examples, discusses the effects on the patient and the analytic process of utilizing one or another kind of self-disclosure in these particular situations.
Insurgent Design: The Re-Emergence of Al-Qaida from 9/11 to the Present
2015-12-01
Analysts disagree on how to characterize al-Qa’ida’s evolution. One...Analysts disagree on how to characterize al-Qa’ida’s evolution. One perspective regards jihadi-Islamism in general to be self-marginalizing. A...impossible. I would also like to thank Dr. Craig Whiteside and Dr. Siamak Naficy, both of whom provided critical input to my research methodology and
When is Analysis Sufficient? A Study of how Professional Intelligence Analysts Judge Rigor
2007-05-01
investors, the marketing researcher assembling an analysis of a competitor’s new products for a corporate executive, and the military analyst preparing a...previously mentioned. In all instances of analysis, the risk of shallowness is fundamental—for both the middle school student and the marketing researcher...natural gas energy policy to respond to the changing consumption of a limited resource in a dynamic energy market. The next critical facet of the
Concept Development and Experimentation Policy and Process: How Analysis Provides Rigour
2010-04-01
modelling and simulation techniques, but in reality the main tool in use is common sense and logic. The main goal of the OA analyst is to bring forward those...doing so she should distinguish between the ideal and the intended or desired models to approach the reality as much as possible. Subsequently, the...and collection of measurements to be conducted. In doing so the analyst must take care to distinguish between the actual and the perceived reality. From
Quadrennial Review of Military Compensation (7th). Allowances. Major Topical Summary (MTS) 3
1992-08-01
Colonel D. Cragin Shelton, ANG Compensation Analyst Major Daniel J. Arena, USA Compensation Analyst QRMC SUPPORT Mr. William H. Warnock Director...of living in the 84 randomly selected areas, in rank order. The QRMC also had Runzheimer survey what were ' William H. Albright, Benjamin R. Baker...Directorate of Plans, Programs and Analysis, 1990. Albright, William H. et al., A Reference Guide to the 1984 Military Health Services System Beneficiary
Common Bolted Joint Analysis Tool
NASA Technical Reports Server (NTRS)
Imtiaz, Kauser
2011-01-01
Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.
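The program's internal equations are not reproduced here; the sketch below only illustrates, with textbook relations and invented numbers, two of the quantities such a tool reports: preload estimated from installation torque via the short-form relation T = K*d*F, and a tensile margin of safety MS = allowable/(factor of safety * applied) - 1. The fastener size, nut factor, joint stiffness factor, and loads are assumptions.

```python
import math

# Hypothetical 1/4-28 UNF steel fastener (illustrative values only)
d = 0.250 * 0.0254          # nominal diameter, m
A_t = 0.0364 * 0.0254**2    # tensile stress area, m^2 (from thread tables)
torque = 9.0                # installation torque, N*m
K = 0.2                     # nut factor (dry assembly assumption)
F_ext = 3000.0              # applied external tensile load, N
S_ty = 8.6e8                # yield strength, Pa
FS = 1.25                   # factor of safety on yield

preload = torque / (K * d)                 # short-form torque equation T = K*d*F
joint_C = 0.25                             # assumed joint stiffness (load) factor
bolt_load = preload + joint_C * F_ext      # bolt sees preload plus its share of F_ext
stress = bolt_load / A_t
ms_yield = S_ty / (FS * stress) - 1.0      # margin of safety against yield

print(f"preload = {preload:,.0f} N, bolt stress = {stress/1e6:,.0f} MPa, MS = {ms_yield:+.2f}")
```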
Learning patterns of life from intelligence analyst chat
NASA Astrophysics Data System (ADS)
Schneider, Michael K.; Alford, Mark; Babko-Malaya, Olga; Blasch, Erik; Chen, Lingji; Crespi, Valentino; HandUber, Jason; Haney, Phil; Nagy, Jim; Richman, Mike; Von Pless, Gregory; Zhu, Howie; Rhodes, Bradley J.
2016-05-01
Our Multi-INT Data Association Tool (MIDAT) learns patterns of life (POL) of a geographical area from video analyst observations called out in textual reporting. Typical approaches to learning POLs from video make use of computer vision algorithms to extract locations in space and time of various activities. Such approaches are subject to the detection and tracking performance of the video processing algorithms. Numerous examples of human analysts monitoring live video streams annotating or "calling out" relevant entities and activities exist, such as security analysis, crime-scene forensics, news reports, and sports commentary. This user description typically corresponds with textual capture, such as chat. Although the purpose of these text products is primarily to describe events as they happen, organizations typically archive the reports for extended periods. This archive provides a basis to build POLs. Such POLs are useful for diagnosis to assess activities in an area based on historical context, and for consumers of products, who gain an understanding of historical patterns. MIDAT combines natural language processing, multi-hypothesis tracking, and Multi-INT Activity Pattern Learning and Exploitation (MAPLE) technologies in an end-to-end lab prototype that processes textual products produced by video analysts, infers POLs, and highlights anomalies relative to those POLs with links to "tracks" of related activities performed by the same entity. MIDAT technologies perform well, achieving, for example, a 90% F1-value on extracting activities from the textual reports.
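MIDAT's actual pipeline couples natural language processing, multi-hypothesis tracking, and MAPLE; the toy sketch below only illustrates the final pattern-of-life step under strong simplifications: already-parsed (time, location, activity) call-outs are tallied into an hour-of-day frequency table, and an observation is scored as more anomalous the rarer its (location, activity, hour) combination. The records are synthetic.

```python
from collections import Counter
from datetime import datetime

# Synthetic analyst call-outs already parsed to (timestamp, location, activity)
callouts = [
    (datetime(2016, 5, 1, 8, 10), "gate_3", "truck arrival"),
    (datetime(2016, 5, 1, 8, 40), "gate_3", "truck arrival"),
    (datetime(2016, 5, 2, 9, 5),  "gate_3", "truck arrival"),
    (datetime(2016, 5, 2, 23, 50), "gate_3", "truck arrival"),   # unusual hour
    (datetime(2016, 5, 3, 8, 20), "gate_3", "truck arrival"),
]

# Pattern of life: relative frequency of each (location, activity, hour) triple
counts = Counter((loc, act, ts.hour) for ts, loc, act in callouts)
total = sum(counts.values())

def anomaly_score(ts, loc, act):
    """Higher score = rarer in the learned pattern of life."""
    p = counts[(loc, act, ts.hour)] / total
    return 1.0 - p

for ts, loc, act in callouts:
    print(ts, loc, act, f"score={anomaly_score(ts, loc, act):.2f}")
```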
Self-analysis and the development of an interpretation.
Campbell, Donald
2017-10-01
In spite of the fact that Freud's self-analysis was at the centre of so many of his discoveries, self-analysis remains a complex, controversial and elusive exercise. While self-analysis is often seen as emerging at the end of an analysis and then used as a criterion in assessing the suitability for termination, I try to attend to the patient's resistance to self-analysis throughout an analysis. I take the view that the development of the patient's capacity for self-analysis within the analytic session contributes to the patient's growth and their creative and independent thinking during the analysis, which prepares him or her for a fuller life after the formal analysis ends. The model I will present is based on an overlapping of the patient's and the analyst's self-analysis, with recognition and use of the analyst's counter-transference. My focus is on the analyst's self-analysis that is in response to a particular crisis of not knowing, which results in feeling intellectually and emotionally stuck. This paper is not a case study, but a brief look at the process I went through to arrive at a particular interpretation with a particular patient during a particular session. I will concentrate on resistances in which both patient and analyst initially rely upon what is consciously known. Copyright © 2017 Institute of Psychoanalysis.
Toward an ethics of psychoanalysis: a critical reading of Lacan's ethics.
Kirshner, Lewis A
2012-12-01
Lacan's seminar The Ethics of Psychoanalysis (1959-1960) pursues, from a Freudian perspective, a fundamental philosophical question classically addressed by Aristotle's Nichomachean Ethics: How is human life best lived and fulfilled? Is there an ethic of this type intrinsic to psychoanalysis? Lacan placed the problem of desire at the center of his Ethics. His notorious self-authorized freedom from convention and probable crossing of limits (see Roudinesco 1993) may have led mainstream analysts to ignore his admonition: "At every moment we need to know what our effective relationship is to the desire to do good, to the desire to cure" (Lacan 1959-1960, p. 219). This means that the analyst's desire, as well as the patient's, is always in play in his attempt to sustain an ethical position. An examination of Lacan's seminar highlights this link, but also points to a number of unresolved issues. The patient's desire is a complex matter, readily entangled in neurotic compromise, defense, and transference, and the analyst's commitment to it is also problematic because of the inevitable co-presence of his own desire. Lacan suggested that more emphasis be placed in training on the desire of the analyst, but beyond that a proposal is advanced for the institutionalization of a "third" as reviewer and interlocutor in routine analytic practice. Analysis may not be a discipline that can be limited to a dyadic treatment relationship.
Contacting a 19 month-old mute autistic girl: a clinical narrative.
Busch de Ahumada, Luisa C; Ahumada, Jorge L
2015-02-01
Conveying that psychoanalysis offers rich opportunities for the very early treatment of autistic spectrum disorders, this clinical communication unfolds the clinical process of a 19-month-old 'shell-type' encapsulated mute autistic girl. It details how, in a four-weekly-sessions schedule, infant Lila evolved within two years from being emotionally out-of-contact to the affective aliveness of oedipal involvement. Following Frances Tustin's emphasis on the analyst's 'quality of attention' and Justin Call's advice that in baby-mother interaction the infant is the initiator and the mother is the follower, it is described how the analyst must, amid excruciating non-response, even-mindedly sustain her attention in order to meet the child half-way at those infrequent points where flickers of initiative on her side are adumbrated. This helps attain evanescent 'moments of contact' which coalesce later into 'moments of sharing', eventually leading to acknowledgment of the analyst's humanness and a receptiveness for to-and-fro communication. Thus the 'primal dialogue' (Spitz) is reawakened and, by experiencing herself in the mirror of the analyst, the child's sense of I-ness is reinstated. As evinced by the literature, the mainstream stance rests on systematic early interpretation of the transference, which has in our view strongly deterred progress in the psychoanalytic treatment of autistic spectrum disorders. Copyright © 2014 Institute of Psychoanalysis.