Sample records for business process analyst

  1. 78 FR 17722 - Technological Upgrades to Registration and Recordation Functions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-22

    ... 2000, the Copyright Office initiated a comprehensive business process reengineering initiative intended... outside consultants and business analysts, the Office identified opportunities for efficiency enhancements... business processes and the automated production of public copyright records. Funding available for the...

  2. 78 FR 14359 - Verizon Business Networks Services, Inc., Senior Analysts-Order Management, Voice Over Internet...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-05

    ... Business Networks Services, Inc., Senior Analysts-Order Management, Voice Over Internet Protocol, Small And Medium Business, Tampa, Florida; Verizon Business Networks Services, Inc., Senior Coordinator-Order... Business Networks Services, Inc., Senior Analysts-Order Management, Voice Over Internet Protocol, Small and...

  3. 78 FR 47778 - Verizon Business Networks Services, Inc. Senior Analysts-Sales Implementation (SA-SI) Birmingham...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-06

    ...,968B] Verizon Business Networks Services, Inc. Senior Analysts-Sales Implementation (SA-SI) Birmingham, Alabama; Verizon Business Networks Services, Inc. Senior Analysts-Sales Implementation (SA-SI) Service Program Delivery Division San Francisco, California; Verizon Business Networks Services, Inc. Senior...

  4. AN 8-WEEK SUMMER INSTITUTE TRAINING PROGRAM TO RETRAIN OFFICE EDUCATION TEACHERS FOR TEACHING BUSINESS ELECTRONIC DATA PROCESSING.

    ERIC Educational Resources Information Center

    CARTER, FOREST C.

    AN 8-WEEK SEMINAR WAS HELD TO RETRAIN TEACHERS WITH A MINIMUM OF 3-YEARS' EXPERIENCE IN BUSINESS OR OFFICE EDUCATION TO TEACH BUSINESS DATA PROCESSING AND PROGRAMING TECHNIQUES. THE OBJECTIVES WERE TO ASSIST IN THE KNOWLEDGE AND SKILL DEVELOPMENT NECESSARY FOR PREPARING COMPUTER PROGRAMERS AND APPLICATION ANALYSTS, AND TO DEVELOP COURSE MATERIAL,…

  5. Understanding the health care business model: the financial analysts' point of view.

    PubMed

    Bukh, Per Nikolaj; Nielsen, Christian

    2010-01-01

    This study focuses on how financial analysts understand the strategy of a health care company and which elements, from such a strategy perspective, they perceive as constituting the cornerstone of a health care company's business model. The empirical part of this study is based on semi-structured interviews with analysts following a large health care company listed on the Copenhagen Stock Exchange. The authors analyze how the financial analysts view strategy and value creation within the framework of a business model. Further, the authors analyze whether the characteristics emerging from a comprehensive literature review are reflected in the financial analysts' perceptions of which information is decision-relevant and important to communicate to the financial markets. Among the conclusions of the study is the importance of distinguishing between a health care company's business model and the model by which revenues are allocated between end users and reimbursing organizations.

  6. Towards an Understanding of the Business Process Analyst: An Analysis of Competencies

    ERIC Educational Resources Information Center

    Sonteya, Thembela; Seymour, Lisa

    2012-01-01

    The increase in adoption of business process management (BPM) and service oriented architecture (SOA) has created a high demand for qualified professionals with a plethora of skills. However, despite the growing amount of literature available on the topics of BPM and SOA, little research has been conducted around developing a detailed list of…

  7. 77 FR 65581 - Verizon Business Networks Services, Inc., Senior Analyst, Service Program Delivery (SA-SPD...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-29

    ... DEPARTMENT OF LABOR Employment and Training Administration [TA-W-81,827] Verizon Business Networks... Verizon Business Network Services, Inc., Senior Analyst-Service Program Delivery, Hilliard, Ohio (subject.... Specifically, the worker group supplies service program delivery services. At the request of the State of Ohio...

  8. A content analysis of analyst research: health care through the eyes of analysts.

    PubMed

    Nielsen, Christian

    2008-01-01

    This article contributes to the understanding of how health care companies may communicate their business models by studying financial analysts' reports. The study examines the differences between the information conveyed in recurrent and fundamental analyst reports as well as whether the characteristics of the analysts and their environment affect their business model analyses. A medium-sized health care company in the medical-technology sector, internationally renowned for its state-of-the-art business reporting, was chosen as the basis for the study. An analysis of 111 fundamental and recurrent analyst reports on this company by each investment bank actively following it was conducted using a content analysis methodology. The study reveals that the recurrent analyses are concerned with evaluating the information disclosed by the health care company itself and not so much with digging up new information. It also indicates that while maintenance work might be focused on evaluating specific details, fundamental research is more concerned with extending the understanding of the general picture, i.e., the sustainability and performance of the overall business model. The amount of financial information disclosed in either type of report is not correlated to the other disclosures in the reports. In comparison to business reporting practices, the fundamental analyst reports put considerably less weight on social and sustainability information, intellectual capital, and corporate governance information, and they disclose much less comparable non-financial information. The suggestion made is that looking at the types of information financial analysts consider important and convey to their "customers," the investors and fund managers, constitutes a valuable indication to health care companies regarding the needs of the financial market users of their reports and other communications. There are some limitations to the possibility of applying statistical tests to the data set, as well as methodological limitations in relation to the exclusion of tables and graphs.

  9. Identifying influential factors of business process performance using dependency analysis

    NASA Astrophysics Data System (ADS)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures that represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain deeper insight into why certain KPI targets are not met.
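    To make the dependency-tree idea concrete, here is a minimal sketch with fabricated process data; the metric names (approval_time_s, service_latency_ms, retries), the KPI definition, and the use of scikit-learn's DecisionTreeClassifier are illustrative assumptions, not the framework described in the paper.

    ```python
    # Sketch only: a decision tree explaining a binary KPI from lower-level
    # process and QoS metrics. All column names and data are invented.
    import numpy as np
    import pandas as pd
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(0)
    n = 500
    metrics = pd.DataFrame({
        "approval_time_s": rng.exponential(120, n),    # process metric (hypothetical)
        "service_latency_ms": rng.normal(200, 50, n),  # QoS metric (hypothetical)
        "retries": rng.integers(0, 4, n),              # infrastructure metric (hypothetical)
    })
    # KPI: order fulfilled within 10 minutes (hypothetical target definition)
    kpi_met = (metrics["approval_time_s"] + metrics["retries"] * 60 < 600).astype(int)

    tree = DecisionTreeClassifier(max_depth=3).fit(metrics, kpi_met)
    print(export_text(tree, feature_names=list(metrics.columns)))  # drill-down view
    ```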

  10. Training Guide for the Management Analyst Industrial Engineer Technician

    DTIC Science & Technology

    1979-07-01

    contemporary work operations, and blending traditional and modern organization concepts, the student develops the facility to analyze and create organization...training, the attendee will know the functions of a computer as it processes business data to produce information for improved management. He will...action which is most cost-effective when considering proposed investments. Emphasis is placed on the adaptation of general business practices to

  11. Understanding interfirm relationships in business ecosystems with interactive visualization.

    PubMed

    Basole, Rahul C; Clear, Trustin; Hu, Mengdie; Mehrotra, Harshit; Stasko, John

    2013-12-01

    Business ecosystems are characterized by large, complex, and global networks of firms, often from many different market segments, all collaborating, partnering, and competing to create and deliver new products and services. Given the rapidly increasing scale, complexity, and rate of change of business ecosystems, as well as economic and competitive pressures, analysts are faced with the formidable task of quickly understanding the fundamental characteristics of these interfirm networks. Existing tools, however, are predominantly query- or list-centric with limited interactive, exploratory capabilities. Guided by a field study of corporate analysts, we have designed and implemented dotlink360, an interactive visualization system that provides capabilities to gain systemic insight into the compositional, temporal, and connective characteristics of business ecosystems. dotlink360 consists of novel, multiple connected views enabling the analyst to explore, discover, and understand interfirm networks for a focal firm, specific market segments or countries, and the entire business ecosystem. System evaluation by a small group of prototypical users shows supporting evidence of the benefits of our approach. This design study contributes to the relatively unexplored, but promising area of exploratory information visualization in market research and business strategy.

  12. 78 FR 77769 - Data Collection Available for Public Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-24

    ... comments to Amy Garcia, Program Analyst, Office of Government Contracting, Small Business Administration, 409 3rd Street, 7th Floor, Washington, DC 20416. FOR FURTHER INFORMATION CONTACT: Amy Garcia, Program Analyst, 202-205-6842, amy.garcia@sba.gov, or Curtis B. Rich, Management Analyst, 202-205-7030, curtis...

  13. Ontology-Based Information Extraction for Business Intelligence

    NASA Astrophysics Data System (ADS)

    Saggion, Horacio; Funk, Adam; Maynard, Diana; Bontcheva, Kalina

    Business Intelligence (BI) requires the acquisition and aggregation of key pieces of knowledge from multiple sources in order to provide valuable information to customers or feed statistical BI models and tools. The massive amount of information available to business analysts makes information extraction and other natural language processing tools key enablers for the acquisition and use of that semantic information. We describe the application of ontology-based extraction and merging in the context of a practical e-business application for the EU MUSING Project where the goal is to gather international company intelligence and country/region information. The results of our experiments so far are very promising and we are now in the process of building a complete end-to-end solution.

  14. Computer Rehabilitation Training for the Severely Disabled.

    ERIC Educational Resources Information Center

    Louisiana State Univ., Baton Rouge.

    The Computer Rehabilitation Training Program for the Severely Disabled is a job-oriented training program to prepare physically handicapped persons to become computer programmers and analysts. The program is operated by: a nonprofit organization of Baton Rouge-area business people interested in data processing; the Department of Social Services,…

  15. Oncology Modeling for Fun and Profit! Key Steps for Busy Analysts in Health Technology Assessment.

    PubMed

    Beca, Jaclyn; Husereau, Don; Chan, Kelvin K W; Hawkins, Neil; Hoch, Jeffrey S

    2018-01-01

    In evaluating new oncology medicines, two common modeling approaches are state transition (e.g., Markov and semi-Markov) and partitioned survival. Partitioned survival models have become more prominent in oncology health technology assessment processes in recent years. Our experience in conducting and evaluating models for economic evaluation has highlighted many important and practical pitfalls. As there is little guidance available on best practices for those who wish to conduct them, we provide guidance in the form of 'Key steps for busy analysts,' who may have very little time and require highly favorable results. Our guidance highlights the continued need for rigorous conduct and transparent reporting of economic evaluations regardless of the modeling approach taken, and the importance of modeling that better reflects reality, which includes better approaches to considering plausibility, estimating relative treatment effects, dealing with post-progression effects, and appropriate characterization of the uncertainty from modeling itself.
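    As a rough illustration of the partitioned-survival approach named above, the toy sketch below derives state occupancy from two survival curves; the exponential curve shapes and rates are invented and are not taken from any actual evaluation.

    ```python
    # Toy partitioned-survival sketch: state membership is read off survival
    # curves rather than derived from transition probabilities. Rates are invented.
    import numpy as np

    t = np.linspace(0, 60, 61)                # months
    pfs = np.exp(-0.05 * t)                   # progression-free survival (assumed exponential)
    osv = np.exp(-0.02 * t)                   # overall survival (assumed exponential)

    progression_free = pfs
    progressed = np.clip(osv - pfs, 0, None)  # alive but progressed
    dead = 1 - osv

    # Mean months in each state via trapezoidal integration of the occupancy curves
    print("progression-free:", round(float(np.trapz(progression_free, t)), 1))
    print("progressed:      ", round(float(np.trapz(progressed, t)), 1))
    ```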

  16. 76 FR 174 - International Business Machines (IBM), Global Sales Operations Organization, Sales and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-03

    ...] International Business Machines (IBM), Global Sales Operations Organization, Sales and Distribution Business Manager Roles; One Teleworker Located in Charleston, WV; International Business Machines (IBM), Global Sales Operations Organization, Sales and Distribution Business Unit, Relations Analyst and Band 8...

  17. The Tail of BPM

    NASA Astrophysics Data System (ADS)

    Kruba, Steve; Meyer, Jim

    Business process management suites (BPMS's) represent one of the fastest growing segments in the software industry as organizations automate their key business processes. As this market matures, it is interesting to compare it to Chris Anderson's 'Long Tail.' Although the 2004 "Long Tail" article in Wired magazine was primarily about the media and entertainment industries, it has since been applied (and perhaps misapplied) to other markets. Analysts describe a "Tail of BPM" market that is, perhaps, several times larger than the traditional BPMS product market. This paper will draw comparisons between the concepts in Anderson's article (and subsequent book) and the BPM solutions market.

  18. Data Visualization: An Exploratory Study into the Software Tools Used by Businesses

    ERIC Educational Resources Information Center

    Diamond, Michael; Mattia, Angela

    2017-01-01

    Data visualization is a key component to business and data analytics, allowing analysts in businesses to create tools such as dashboards for business executives. Various software packages allow businesses to create these tools in order to manipulate data for making informed business decisions. The focus is to examine what skills employers are…

  19. Data Visualization: An Exploratory Study into the Software Tools Used by Businesses

    ERIC Educational Resources Information Center

    Diamond, Michael; Mattia, Angela

    2015-01-01

    Data visualization is a key component to business and data analytics, allowing analysts in businesses to create tools such as dashboards for business executives. Various software packages allow businesses to create these tools in order to manipulate data for making informed business decisions. The focus is to examine what skills employers are…

  20. 78 FR 9447 - Data Collection Available for Public Comments and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-08

    ... SMALL BUSINESS ADMINISTRATION Data Collection Available for Public Comments and Recommendations... requirements. SUMMARY: On December 10, 2012, the Small Business Administration (SBA) published the 60-day..., Program Analyst, Office of Business Development, U.S. Small Business Administration, 409 3rd Street 8th...

  1. Using CASE to Exploit Process Modeling in Technology Transfer

    NASA Technical Reports Server (NTRS)

    Renz-Olar, Cheryl

    2003-01-01

    A successful business will be one that has processes in place to run that business. Creating processes, reengineering processes, and continually improving processes can be accomplished through extensive modeling. Casewise(R) Corporate Modeler(TM) CASE is a computer-aided software engineering tool that will enable the Technology Transfer Department (TT) at NASA Marshall Space Flight Center (MSFC) to capture these abilities. After successful implementation of CASE, it could then go on to be applied in other departments at MSFC and other centers at NASA. The success of a business process is dependent upon the players working as a team and continuously improving the process. A good process fosters customer satisfaction as well as internal satisfaction in the organizational infrastructure. CASE provides a method for business process success through functions consisting of business models of systems and processes; specialized diagrams; matrix management; simulation; report generation and publishing; and linking, importing, and exporting documents and files. The software has an underlying repository or database to support these functions. The Casewise manual informs us that dynamics modeling is a technique used in business design and analysis. Feedback is used as a tool for the end users and generates different ways of dealing with the process. Feedback on this project resulted from the collection of issues through systems-analyst interviews with process coordinators and Technical Points of Contact (TPOCs).

  2. Supporting the Growing Needs of the GIS Industry

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Visual Learning Systems, Inc. (VLS), of Missoula, Montana, has developed a commercial software application called Feature Analyst. Feature Analyst was conceived under a Small Business Innovation Research (SBIR) contract with NASA's Stennis Space Center, and through the Montana State University TechLink Center, an organization funded by NASA and the U.S. Department of Defense to link regional companies with Federal laboratories for joint research and technology transfer. The software provides a paradigm shift to automated feature extraction, as it utilizes spectral, spatial, temporal, and ancillary information to model the feature extraction process; presents the ability to remove clutter; incorporates advanced machine learning techniques to supply unparalleled levels of accuracy; and includes an exceedingly simple interface for feature extraction.

  3. 77 FR 46550 - Data Collection Available for Public Comments and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-03

    ... SMALL BUSINESS ADMINISTRATION Data Collection Available for Public Comments and Recommendations... of 1995, this notice announces the Small Business Administration's intentions to request approval on... Nathaniel Bishop, Program Analyst, Office of Economic Development, Small Business Administration, 409 3rd...

  4. 78 FR 9667 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ... data are used widely and are valuable tools for analysts of business cycle conditions, including..., Department of the Treasury, and the business community. New orders serve as an indicator of future production... following formula: Affected Public: Business or other for-profit. Frequency: Annually. Respondent's...

  5. 77 FR 8804 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-15

    ..., is frequently used to monitor the business cycle. This survey provides an essential component of the... planning and analysis to business firms, trade associations, research and consulting agencies, and academia... project future movements in manufacturing activity. These statistics are valuable for analysts of business...

  6. 78 FR 76696 - Data Collection Available for Public Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-18

    ... SMALL BUSINESS ADMINISTRATION Data Collection Available for Public Comments ACTION: 60-Day notice... 1995, this notice announces the Small Business Administration's intentions to request approval on a new..., Financial Analyst, Office of Financial Assistance, Small Business Administration, 409 3rd Street, 8th Floor...

  7. Computer Programmer/Analyst.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication contains 25 subjects appropriate for use in a competency list for the occupation of computer programmer/analyst, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 25 units are as…

  8. 78 FR 45198 - Federal Acquisition Regulation; Submission for OMB Review; Anti-Kickback Procedures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-26

    ... INFORMATION CONTACT: Ms. Cecelia L. Davis, Procurement Analyst, Office of Governmentwide Acquisition Policy..., Contractor Business Ethics Compliance Program and Disclosure Requirements. Response: It is important to... business. In the normal course of business, a company that is concerned about ethical behavior will take...

  9. ELECTRONIC DATA PROCESSING--I, A SUGGESTED 2-YEAR POST HIGH SCHOOL CURRICULUM FOR COMPUTER PROGRAMERS AND BUSINESS APPLICATIONS ANALYSTS.

    ERIC Educational Resources Information Center

    RONEY, MAURICE W.; AND OTHERS

    DESIGNED FOR USE IN PLANNING PREPARATORY PROGRAMS, THIS CURRICULUM CAN ALSO BE USEFUL IN PLANNING EXTENSION COURSES FOR EMPLOYED PERSONS. MATERIALS WERE ADAPTED FROM A GUIDE PREPARED BY ORANGE COAST COLLEGE, CALIFORNIA, UNDER A CONTRACTUAL ARRANGEMENT WITH THE U.S. OFFICE OF EDUCATION, AND REVIEWED BY A COMMITTEE COMPOSED OF SPECIALISTS IN DATA…

  10. Intelligence Dissemination to the Warfighter

    DTIC Science & Technology

    2007-12-01

    that prevent other JWICS users from exchanging data. The CIA conducts most of their business on the CIAnet, which can pull data from JWICS but...data. Spreadsheets and word processors, in order to retain a high level of user-friendliness, handle several complex background processes that...the "complex adaptive systems", where the onus is placed equally on the analyst and on the tools to be receptive and adaptable. It is the

  11. Sharing adverse drug event data using business intelligence technology.

    PubMed

    Horvath, Monica M; Cozart, Heidi; Ahmad, Asif; Langman, Matthew K; Ferranti, Jeffrey

    2009-03-01

    Duke University Health System uses computerized adverse drug event surveillance as an integral part of medication safety at 2 community hospitals and an academic medical center. This information must be swiftly communicated to organizational patient safety stakeholders to find opportunities to improve patient care; however, this process is encumbered by highly manual methods of preparing the data. Following the examples of other industries, we deployed a business intelligence tool to provide dynamic safety reports on adverse drug events. Once data were migrated into the health system data warehouse, we developed census-adjusted reports with user-driven prompts. Drill down functionality enables navigation from aggregate trends to event details by clicking report graphics. Reports can be accessed by patient safety leadership either through an existing safety reporting portal or the health system performance improvement Web site. Elaborate prompt screens allow many varieties of reports to be created quickly by patient safety personnel without consultation with the research analyst. The reduction in research analyst workload because of business intelligence implementation made this individual available to additional patient safety projects thereby leveraging their talents more effectively. Dedicated liaisons are essential to ensure clear communication between clinical and technical staff throughout the development life cycle. Design and development of the business intelligence model for adverse drug event data must reflect the eccentricities of the operational system, especially as new areas of emphasis evolve. Future usability studies examining the data presentation and access model are needed.
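    A hedged sketch of the kind of census-adjusted aggregate such reports might compute (adverse drug events per 1,000 patient-days by month) is shown below using pandas; the figures and column names are invented, not Duke's actual data model.

    ```python
    # Illustrative census-adjusted rate: ADE count per 1,000 patient-days by month.
    # Data and schema are made up for the sketch.
    import pandas as pd

    events = pd.DataFrame({
        "month": ["2009-01", "2009-01", "2009-02"],
        "ade_count": [14, 9, 11],
        "patient_days": [4200, 3100, 4000],
    })
    summary = events.groupby("month")[["ade_count", "patient_days"]].sum()
    summary["rate_per_1000_pd"] = 1000 * summary["ade_count"] / summary["patient_days"]
    print(summary)   # aggregate view; drill-down would join back to event details
    ```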

  12. Task Lists for Business, Marketing and Management Occupations, 1988: Cluster Matrices for Business, Marketing and Management Occupations. Education for Employment Task Lists.

    ERIC Educational Resources Information Center

    Fonseca, Linda Lafferty

    Developed in Illinois, this document contains three components. The first component consists of employability task lists for the business, marketing, and management occupations of first-line supervisors and manager/supervisors; file clerks; traffic, shipping, and receiving clerks; records management analysts; adjustment clerks; and customer…

  13. 78 FR 20168 - Twenty Fourth Meeting: RTCA Special Committee 203, Unmanned Aircraft Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-03

    ... Washington, DC, on March 28, 2013. Paige Williams, Management Analyst, NextGen, Business Operations Group... Introductions Review Meeting Agenda Review/Approval of Twenty Third Plenary Meeting Summary Leadership Update... for Unmanned Aircraft Systems and Minimum Aviation System Performance Standards Other Business Adjourn...

  14. A Methodological Approach for Training Analysts of Small Business Problems.

    ERIC Educational Resources Information Center

    Mackness, J. R.

    1986-01-01

    Steps in a small business analysis are discussed: understand how company activities interact internally and with markets and suppliers; know the relative importance of controllable management variables; understand the social atmosphere within the company; analyze the operations of the company; define main problem areas; identify possible actions…

  15. 78 FR 12136 - Fifty Eighth Meeting: RTCA Special Committee 186, Automatic Dependent Surveillance-Broadcast (ADS-B)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-21

    .... Issued in Washington, DC, on February 19, 2013. Paige Williams, Management Analyst, Business Operations... Trajectory Management Other? Other Business. None Identified Review Action Items/Work Programs. Adjourn...) [ssquf] Flight-deck Interval Management (FIM) [ssquf] CAVS and CDTI Assisted Pilot Procedures (CAPP...

  16. A review method for UML requirements analysis model employing system-side prototyping.

    PubMed

    Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    User interface prototyping is an effective method for users to validate the requirements defined by analysts at an early stage of software development. However, a user interface prototype offers the analysts little support for verifying the consistency of specifications concerning internal aspects of a system, such as business logic. As a result, such inconsistency incurs substantial rework costs because it often makes it impossible for developers to realize the system from the specifications. Functional prototyping is an effective way for analysts to verify this consistency, but it is costly and requires more detailed specifications. In this paper, we propose a review method that lets analysts efficiently verify the consistency among several different kinds of UML diagrams by employing system-side prototyping without a detailed model. The system-side prototype does not implement any business logic; instead, it visualizes the results of integrating the UML diagrams as Web pages. We evaluated the usefulness of our proposal by applying it to the group development of a Library Management System (LMS) for a laboratory. As a result, the proposal was useful for discovering serious inconsistencies caused by misunderstandings among the members of the group.

  17. Program on stimulating operational private sector use of Earth observation satellite information

    NASA Technical Reports Server (NTRS)

    Eastwood, L. F., Jr.; Foshage, J.; Gomez, G.; Kirkpatrick, B.; Konig, B.; Stein, R. (Principal Investigator)

    1981-01-01

    Ideas for new businesses specializing in using remote sensing and computerized spatial data systems were developed. Each such business serves as an 'information middleman', buying raw satellite or aircraft imagery, processing these data, combining them in a computer system with customer-specific information, and marketing the resulting information products. Examples of the businesses the project designed are: (1) an agricultural facility site evaluation firm; (2) a mass media grocery price and supply analyst and forecaster; (3) a management service for privately held woodlots; (4) a brokerage for insulation and roofing contractors, based on infrared imagery; (5) an expanded real estate information service. In addition, more than twenty-five other commercially attractive ideas in agribusiness, forestry, mining, real estate, urban planning and redevelopment, and consumer information were created. The commercial feasibility of the five businesses was assessed. This assessment included market surveys, revenue projections, cost analyses, and profitability studies. The results show that there are large and enthusiastic markets willing to pay for the services these businesses offer, and that the businesses could operate profitably.

  18. Architecture for Business Intelligence in the Healthcare Sector

    NASA Astrophysics Data System (ADS)

    Lee, Sang Young

    2018-03-01

    The healthcare environment is growing to include not only traditional information systems but also business intelligence platforms. Executive leaders, consultants, and analysts no longer need to spend hours designing and developing typical reports or charts; the entire solution can be completed using Business Intelligence software. The current paper highlights the advantages of big data analytics and business intelligence in the healthcare industry. We focus our discussion on the intelligent techniques and methodologies that have recently been used for business intelligence in healthcare.

  19. 75 FR 1662 - Small Business Size Standards: Waiver of the Nonmanufacturer Rule

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-12

    ... Garcia, Program Analyst, U.S. Small Business Administration, Office of Government Contracting, 409 3rd Street, SW., Suite 8800, Washington, DC 20416. FOR FURTHER INFORMATION CONTACT: Amy Garcia, by telephone at (202) 205-6842; by FAX at (202) 481-1630; or by e-mail at Amy.garcia@sba.gov. SUPPLEMENTARY...

  20. Catholic Business Schools and the Crisis of the Academic Industry

    ERIC Educational Resources Information Center

    Hoevel, Carlos

    2012-01-01

    According to many analysts, after the dot-com, housing and financial bubbles, the next bubble to burst may be that of higher education and especially business education schools. Given this possible scenario, there are two ways one might interpret the current crisis in education, accompanied by two proposals for addressing the problems. According…

  1. 76 FR 18303 - Federal Acquisition Regulation; Federal Acquisition Circular 2005-51; Introduction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-01

    ... Acquisition Circular 2005-51; Introduction; Women-Owned Small Business (WOSB) Program; Clarification of...- 4755. List of Rules in FAC 2005-51 Item Subject FAR case Analyst I Women-Owned Small Business 2010-015... following these item summaries. FAC 2005-51 amends the FAR as specified below: Item I--Women-Owned Small...

  2. 77 FR 12911 - Federal Acquisition Regulation; Federal Acquisition Circular 2005-56; Introduction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... 2005-56 Item Subject FAR Case Analyst I Women-Owned Small Business 2010-015 Morgan (WOSB) Program. II... following these item summaries. FAC 2005-56 amends the FAR as specified below: Item I--Women-Owned Small... agencies in achieving the 5 percent statutory goal for contracting with women-owned small businesses. This...

  3. 78 FR 5554 - Data Collection Available for Public Comments and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ..., to Gina, Supervisory Administrative Specialist, Office of Disaster, Small Business Administration..., Supervisor Administrative Specialist, 202-205-6458 [email protected] Curtis B. Rich, Management Analyst, 202...

  4. VIGOR: Interactive Visual Exploration of Graph Query Results.

    PubMed

    Pienta, Robert; Hohman, Fred; Endert, Alex; Tamersoy, Acar; Roundy, Kevin; Gates, Chris; Navathe, Shamkant; Chau, Duen Horng

    2018-01-01

    Finding patterns in graphs has become a vital challenge in many domains, from biological systems and network security to finance (e.g., finding money laundering rings of bankers and business owners). While there is significant interest in graph databases and querying techniques, less research has focused on helping analysts make sense of underlying patterns within a group of subgraph results. Visualizing graph query results is challenging, requiring effective summarization of a large number of subgraphs, each having potentially shared node-values, rich node features, and flexible structure across queries. We present VIGOR, a novel interactive visual analytics system for exploring and making sense of query results. VIGOR uses multiple coordinated views, leveraging different data representations and organizations to streamline analysts' sensemaking process. VIGOR contributes: (1) an exemplar-based interaction technique, where an analyst starts with a specific result and relaxes constraints to find other similar results or starts with only the structure (i.e., without node value constraints), and adds constraints to narrow in on specific results; and (2) a novel feature-aware subgraph result summarization. Through a collaboration with Symantec, we demonstrate how VIGOR helps tackle real-world problems through the discovery of security blindspots in a cybersecurity dataset with over 11,000 incidents. We also evaluate VIGOR with a within-subjects study, demonstrating VIGOR's ease of use over a leading graph database management system, and its ability to help analysts understand their results at higher speed and make fewer errors.
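    For readers unfamiliar with the subgraph query results VIGOR summarizes, a minimal sketch using networkx subgraph matching follows; the toy graph, the ring-motif query, and the node names are illustrative assumptions and are unrelated to VIGOR's own engine or the Symantec data.

    ```python
    # Minimal subgraph-matching sketch: find every 3-node ring in a toy graph.
    # Node names and the pattern are invented for illustration.
    import networkx as nx
    from networkx.algorithms.isomorphism import GraphMatcher

    data = nx.Graph()
    data.add_edges_from([("bankerA", "ownerB"), ("ownerB", "shellC"),
                         ("shellC", "bankerA"), ("ownerB", "vendorD")])
    pattern = nx.cycle_graph(3)   # query: a 3-node ring (e.g., a laundering-ring motif)

    matches = list(GraphMatcher(data, pattern).subgraph_isomorphisms_iter())
    print(f"{len(matches)} matches (each orientation of the ring counted separately)")
    ```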

  5. An Investigation of Techniques for Detecting Data Anomalies in Earned Value Management Data

    DTIC Science & Technology

    2011-12-01

    Management Studio Harte Hanks Trillium Software Trillium Software System IBM Info Sphere Foundation Tools Informatica Data Explorer Informatica ...Analyst Informatica Developer Informatica Administrator Pitney Bowes Business Insight Spectrum SAP BusinessObjects Data Quality Management DataFlux...menting quality monitoring efforts and tracking data quality improvements Informatica http://www.informatica.com/products_services/Pages/index.aspx

  6. 76 FR 18324 - Federal Acquisition Regulation; Federal Acquisition Circular 2005-51; Small Entity Compliance Guide

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-01

    ... (202) 501-4755. List of Rules in FAC 2005-51 Item Subject FAR case Analyst *I Women-Owned Small 2010...--Women-Owned Small Business (WOSB) Program (FAR Case 2010-015) (Interim) This interim rule amends the FAR to add subpart 19.15, Women-Owned Small Business Program, which will assist Federal agencies in...

  7. 75 FR 75171 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-02

    ... of existing facilities are critical to evaluating productivity growth, the ability of U.S. business... improve estimates of capital stocks for productivity analysis. In addition, industry analysts use the data...

  8. National Freight Demand Modeling - Bridging the Gap between Freight Flow Statistics and U.S. Economic Patterns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, Shih-Miao; Hwang, Ho-Ling

    2007-01-01

    This paper describes the development of national freight demand models for 27 industry sectors covered by the 2002 Commodity Flow Survey. It postulates that the national freight demands are consistent with U.S. business patterns. Furthermore, the study hypothesizes that the flow of goods, which makes up the national production processes of industries, is coherent with the information described in the 2002 Annual Input-Output Accounts developed by the Bureau of Economic Analysis. The model estimation framework hinges largely on the assumption that a relatively simple relationship exists between freight production/consumption and business patterns for each industry defined by the three-digit North American Industry Classification System (NAICS) industry codes. The national freight demand model for each selected industry sector consists of two models: a freight generation model and a freight attraction model. Thus, a total of 54 simple regression models were estimated under this study. Preliminary results indicated promising freight generation and freight attraction models. Among all models, only four of them had an R2 value lower than 0.70. With additional modeling efforts, these freight demand models could be enhanced to allow transportation analysts to assess regional economic impacts associated with temporary loss of transportation services on U.S. transportation network infrastructures. Using such freight demand models and available U.S. business forecasts, future national freight demands could be forecasted within certain degrees of accuracy. These freight demand models could also enable transportation analysts to further disaggregate the CFS state-level origin-destination tables to county or zip code level.
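    A minimal sketch of the per-sector generation/attraction regressions described above is given below; the county-level employment and tonnage figures are fabricated, and simple least-squares on a single business-pattern variable is only an assumption about the model form.

    ```python
    # Sketch: one freight generation model and one freight attraction model for a
    # single (hypothetical) NAICS-3 sector, each a simple linear regression.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    employment = np.array([[1200], [3400], [560], [7800], [2100], [4300]])  # made-up employment
    tons_generated = np.array([15.1, 40.2, 7.3, 92.5, 26.0, 51.8])          # made-up ktons shipped
    tons_attracted = np.array([12.4, 35.9, 8.1, 80.2, 23.3, 47.0])          # made-up ktons received

    gen_model = LinearRegression().fit(employment, tons_generated)
    att_model = LinearRegression().fit(employment, tons_attracted)
    print("generation R^2:", round(gen_model.score(employment, tons_generated), 3))
    print("attraction R^2:", round(att_model.score(employment, tons_attracted), 3))
    ```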

  9. 78 FR 43962 - Meeting: Cancellation of RTCA Program Management Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-22

    .... Issued in Washington, DC, on July 12, 2013. Paige Williams, Management Analyst, NextGen, Business... Management Committee AGENCY: Federal Aviation Administration (FAA), U.S. Department of Transportation (DOT).

  10. 77 FR 37732 - Fourteenth Meeting: RTCA Special Committee 223, Airport Surface Wireless Communications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-22

    ..., Introductions, and Administrative Remarks by Special Committee Leadership. Designated Federal Official (DFO): Mr..., DC, on June 15, 2012. Kathy Hitt, Program Analyst, Business Operations Branch, Federal Aviation...

  11. Interpretation and the psychic future.

    PubMed

    Cooper, S H

    1997-08-01

    The author applies the analyst's multi-faceted awareness of his or her view of the patient's psychic future to the analytic process. Loewald's (1960) interest in the way in which the analyst anticipates the future of the patient was linked to his epistemological assumptions about the analyst's superior objectivity and maturity relative to the patient. The elucidation of the authority of the analyst (e.g. Hoffman, 1991, 1994) allows us to begin to disentangle the analyst's view of the patient's psychic future from some of these epistemological assumptions. Clinical illustrations attempt to show how the analyst's awareness of this aspect of the interpretive process is often deconstructed over time and can help to understand aspects of resistance from both analyst and patient. This perspective may provide one more avenue for understanding our various modes of influence through the interpretive process.

  12. 78 FR 13395 - Meeting: RTCA Special Committee 223, Airport Surface Wireless Communications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-27

    ... Review: Convergence Sub-layer. Security. MAC Layer. Physical Layer. PICS. CRSL. Review/Approval of MOPS... Washington, DC, on February 21, 2013. Paige Williams, Management Analyst, NextGen, Business Operations Group...

  13. Analyst-to-Analyst Variability in Simulation-Based Prediction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glickman, Matthew R.; Romero, Vicente J.

    This report describes findings from the culminating experiment of the LDRD project entitled "Analyst-to-Analyst Variability in Simulation-Based Prediction". For this experiment, volunteer participants solving a given test problem in engineering and statistics were interviewed at different points in their solution process. These interviews are used to trace differing solutions to differing solution processes, and differing processes to differences in reasoning, assumptions, and judgments. The issue that the experiment was designed to illuminate -- our paucity of understanding of the ways in which humans themselves have an impact on predictions derived from complex computational simulations -- is a challenging and open one. Although solution of the test problem by analyst participants in this experiment has taken much more time than originally anticipated, and is continuing past the end of this LDRD, this project has provided a rare opportunity to explore analyst-to-analyst variability in significant depth, from which we derive evidence-based insights to guide further explorations in this important area.

  14. 77 FR 43410 - Data Collection Available for Public Comments and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-24

    ... the collections, to Sandra Johnston, Program Analyst, Office of Financial Assistance, Small Business... Premier Certified Lenders Program (PCLP) transfers considerable authority and autonomy to Premier...) that are held in account at financial institutions and about SBIC borrowings from financial...

  15. Data management in clinical research: Synthesizing stakeholder perspectives.

    PubMed

    Johnson, Stephen B; Farach, Frank J; Pelphrey, Kevin; Rozenblit, Leon

    2016-04-01

    This study assesses data management needs in clinical research from the perspectives of researchers, software analysts and developers. This is a mixed-methods study that employs sublanguage analysis in an innovative manner to link the assessments. We performed content analysis using sublanguage theory on transcribed interviews conducted with researchers at four universities. A business analyst independently extracted potential software features from the transcriptions, which were translated into the sublanguage. This common sublanguage was then used to create survey questions for researchers, analysts and developers about the desirability and difficulty of features. Results were synthesized using the common sublanguage to compare stakeholder perceptions with the original content analysis. Individual researchers exhibited significant diversity of perspectives that did not correlate by role or site. Researchers had mixed feelings about their technologies, and sought improvements in integration, interoperability and interaction as well as engaging with study participants. Researchers and analysts agreed that data integration has higher desirability and mobile technology has lower desirability but disagreed on the desirability of data validation rules. Developers agreed that data integration and validation are the most difficult to implement. Researchers perceive tasks related to study execution, analysis and quality control as highly strategic, in contrast with tactical tasks related to data manipulation. Researchers have only partial technologic support for analysis and quality control, and poor support for study execution. Software for data integration and validation appears critical to support clinical research, but may be expensive to implement. Features to support study workflow, collaboration and engagement have been underappreciated, but may prove to be easy successes. Software developers should consider the strategic goals of researchers with regard to the overall coordination of research projects and teams, workflow connecting data collection with analysis and processes for improving data quality. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. The AskIT Service Desk: A Model for Improving Productivity and Reducing Costs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashcraft, Phillip Lynn; Fogle, Blythe G.; Cummings, Susan M.

    This was prepared for the business process improvement presentation to the Department of Energy. Los Alamos National Laboratory provides a single point of contact, the AskIT Service Desk, to address issues that impact customer productivity. At the most basic level, what customers want is for their calls to be received, to get a response from a knowledgeable analyst, and to have their issues resolved and their requests fulfilled. Providing a centralized, single point of contact service desk makes initiating technical or business support simple for the customer and improves the odds of immediately resolving the issue or correctly escalating the request to the next support level when necessary. Fulfilling customer requests through automated workflow also improves customer productivity and reduces costs. Finally, customers should be provided the option to solve their own problems through easy access to self-help resources such as frequently asked questions (FAQs) and how-to guides. To accomplish this, everyone who provides and supports services must understand how these processes and functions work together. Service providers and those who support services must “speak the same language” and share common objectives. The Associate Directorate for Business Innovation (ADBI) began the journey to improve services by selecting a known service delivery framework (Information Technology Infrastructure Library, or ITIL). From this framework, components that contribute significant business value were selected.

  17. Small Business Innovations

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The purpose of QASE RT is to enable system analysts and software engineers to evaluate performance and reliability implications of design alternatives. The program resulted from two Small Business Innovation Research (SBIR) projects. After receiving a description of the system architecture and workload from the user, QASE RT translates the system description into simulation models and executes them. Simulation provides detailed performance evaluation. The results of the evaluations are service and response times, offered load and device utilizations and functional availability.
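    The performance quantities the abstract mentions (utilization, response time) can be illustrated with a single-queue approximation; the M/M/1 formula and the arrival/service rates below are assumptions for illustration, not QASE RT's simulation models.

    ```python
    # Toy M/M/1 approximation of utilization and mean response time.
    # Rates are invented; a real evaluation would come from the simulation itself.
    arrival_rate = 8.0    # requests per second offered by the workload (assumed)
    service_rate = 10.0   # requests per second the device can serve (assumed)

    utilization = arrival_rate / service_rate
    response_time = 1.0 / (service_rate - arrival_rate)   # mean response time, seconds
    print(f"utilization = {utilization:.0%}, mean response time = {response_time:.2f} s")
    ```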

  18. Micro-based fact collection tool user's manual

    NASA Technical Reports Server (NTRS)

    Mayer, Richard

    1988-01-01

    A procedure designed for use by an analyst to assist in the collection and organization of data gathered during the interview processes associated with system analysis and modeling tasks is described. The basic concept behind the development of this tool is that during the interview process an analyst is presented with assertions of facts by the domain expert. The analyst also makes observations of the domain. These facts need to be collected and preserved in such a way as to allow them to serve as the basis for a number of decision-making processes throughout the system development process. This tool can be thought of as a computerization of the analyst's notebook.

  19. A note on consummation and termination.

    PubMed

    Calef, V; Weinshel, E M

    1983-01-01

    The sensation sometimes expressed by analytic patients, most notably during termination, of having left some "unfinished business" (to which they hope to return) is not necessarily simply a judgment about the analysis; frequently it is an affective component of the wish for consummation which has not been granted by the analysis. Simultaneously, it expresses the defense against that very consummation. The wish to give the analyst a gift is in some sense the direct opposite, or more correctly, expresses the defense more openly as a bribe and warning to the analyst that he should not expect or hope for consummation of the instinctual wishes which have been the center of analytic work; i.e., it is a defense against the fulfillment of those wishes, almost as if the analyst, by attempting to analyze them, insists upon their enactment. Nevertheless, and despite the apparent contradiction, both affects, which serve similar functions, may appear simultaneously.

  20. 76 FR 78010 - General Services Administration Acquisition Regulation; Information Collection; Contract...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-15

    ... collection of information is accurate and based on valid assumptions and methodology; and ways to enhance the...: February 13, 2012. FOR FURTHER INFORMATION CONTACT: Ms. Dana Munson, Procurement Analyst, General Services.../or business confidential information provided. SUPPLEMENTARY INFORMATION: A. Purpose Under certain...

  1. 78 FR 16246 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-14

    ...: Business or other for-profit organizations. Frequency: On occasion. Respondent's Obligation: Voluntary. OMB... submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection..., 2013. Gwellnar Banks, Management Analyst, Office of the Chief Information Officer. [FR Doc. 2013-05870...

  2. 78 FR 77194 - Data Collection Available for Public Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-20

    ... these institutions, SBA requires them to submit audited financial statements annually as well as interim, quarterly financial statements and other reports to facilitate the Agency's oversight of these lenders... comments to Andrea Giles, Supervisory Financial Analyst, Office of Credit Risk Management, Small Business...

  3. 75 FR 76517 - Data Collection Available for Public Comments and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-08

    ..., including current income. Title: ``Personal Financial Statement.'' Description of Respondents: SBA... Branch, Office of Financial Assistance, Small Business Administration, 409 3rd Street, 8th Floor... Financial Assistance, 202-205-7530, Curtis B. Rich, Management Analyst, 202-205-7030, [email protected

  4. Rescuing Dogs in the Frederick Community | Poster

    Cancer.gov

    Many Frederick National Lab employees have a favorite cause to which they volunteer a significant amount of time. For Dianna Kelly, IT program manager/scientific program analyst, Office of Scientific Operations, and Courtney Kennedy, associate technical project manager, Business Enterprise Systems, that cause is dog rescue.

  5. Addressing the Need for Independence in the CSE Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Ferragut, Erik M; Sheldon, Frederick T

    2011-01-01

    Information system security risk, defined as the product of the monetary losses associated with security incidents and the probability that they occur, is a suitable decision criterion when considering different information system architectures. Risk assessment is the widely accepted process used to understand, quantify, and document the effects of undesirable events on organizational objectives so that risk management, continuity of operations planning, and contingency planning can be performed. One technique, the Cyberspace Security Econometrics System (CSES), is a methodology for estimating security costs to stakeholders as a function of possible risk postures. In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain, as a result of security breakdowns. Additional work has applied CSES to specific business cases. The current state-of-the-art of CSES addresses independent events. In typical usage, analysts create matrices that capture their expert opinion, and then use those matrices to quantify costs to stakeholders. This expansion generalizes CSES to the common real-world case where events may be dependent.
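    The "risk as the product of monetary losses and probability" criterion can be sketched numerically; the stakeholder-by-event loss matrix and event probabilities below are hypothetical, and the independent-event assumption is exactly the limitation the CSES extension above is meant to relax.

    ```python
    # Hypothetical illustration of expected loss per stakeholder, assuming
    # independent events: risk_s = sum_e P(event e) * loss(s, e).
    import numpy as np

    stakeholders = ["Operations", "Finance", "Customers"]
    events = ["data breach", "outage", "fraud"]

    loss = np.array([       # loss in k$ to each stakeholder per event (made up)
        [500, 120,  40],
        [300,  20, 250],
        [150, 200,  10],
    ])
    prob = np.array([0.02, 0.10, 0.01])   # annual event probabilities (made up)

    risk = loss @ prob
    for name, value in zip(stakeholders, risk):
        print(f"{name}: expected annual loss ~ ${value:.1f}k")
    ```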

  6. Concept of operations for knowledge discovery from Big Data across enterprise data warehouses

    NASA Astrophysics Data System (ADS)

    Sukumar, Sreenivas R.; Olama, Mohammed M.; McNair, Allen W.; Nutaro, James J.

    2013-05-01

    The success of data-driven business in government, science, and private industry is driving the need for seamless integration of intra- and inter-enterprise data sources to extract knowledge nuggets in the form of correlations, trends, patterns and behaviors previously not discovered due to physical and logical separation of datasets. Today, as the volume, velocity, variety, and complexity of enterprise data keep increasing, next-generation analysts are facing several challenges in the knowledge extraction process. Towards addressing these challenges, data-driven organizations that rely on the success of their analysts have to make investment decisions for sustainable data/information systems and knowledge discovery. Options that organizations are considering are newer storage/analysis architectures, better analysis machines, redesigned analysis algorithms, collaborative knowledge management tools, and query builders, amongst many others. In this paper, we present a concept of operations for enabling knowledge discovery that data-driven organizations can leverage towards making their investment decisions. We base our recommendations on the experience gained from integrating multi-agency enterprise data warehouses at the Oak Ridge National Laboratory to design the foundation of future knowledge-nurturing data-system architectures.

  7. Exploring the Analytical Processes of Intelligence Analysts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Kuchar, Olga A.; Wolf, Katherine E.

    We present an observational case study in which we investigate and analyze the analytical processes of intelligence analysts. Participating analysts in the study carry out two scenarios where they organize and triage information, conduct intelligence analysis, report results, and collaborate with one another. Through a combination of artifact analyses, group interviews, and participant observations, we explore the space and boundaries in which intelligence analysts work and operate. We also assess the implications of our findings on the use and application of relevant information technologies.

  8. Stormy weather. Echoes of Columbia/HCA heard as Tenet overhauls management amid scrutiny of outlier payments and investor protests.

    PubMed

    Galloro, Vince

    2002-11-11

    What began as a small shower of questions quickly transformed into a storm of investor protest last week as Tenet Healthcare Corp. underwent a management overhaul and analysts asked questions about its fiscal future. Tenet's business strategy came under scrutiny because a large proportion of its Medicare revenue was generated by so-called outlier payments. Chairman and CEO Jeffrey Barbakow (left) says that's not how he wants to do business.

  9. The Great Strategy Debate: NATO’s Evolution in the 1960s

    DTIC Science & Technology

    1991-01-01

    business. But even his severest critics were hard-pressed to refute the notion that McNamara’s plans flowed from a coherent strategic concept. McNamara’s...sensed this even before he had the data to prove it. In contrast to Western intelligence analysts, who tended to focus solely on the Soviet Union, McNamara...energy. For a period, it disrupted normal business and diverted the Alliance’s attention from other pressing matters. In addition, de Gaulle denied NATO

  10. DART system analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.

    2005-08-01

    The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
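    The arithmetic behind the once-through time, iteration probability, and rework fraction can be sketched with a simple geometric-series model; the function and the example numbers below are illustrative assumptions, not the parameterization DART derived from its interviews.

    ```python
    # Expected time for one process step with once-through time t, probability p of
    # a backward iteration after each pass, and rework fraction r per repeat pass.
    # Numbers are invented for illustration.
    def expected_step_time(t: float, p: float, r: float) -> float:
        # expected repeat passes form a geometric series: p + p**2 + ... = p / (1 - p)
        return t + (p / (1 - p)) * r * t

    base = expected_step_time(t=10.0, p=0.30, r=0.6)     # e.g., meshing, 10 days once-through
    faster = expected_step_time(t=10.0, p=0.15, r=0.6)   # halving the iteration probability
    print(f"expected step time: {base:.1f} days -> {faster:.1f} days")
    ```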

  11. Human Subject Research Protocol: Computer-Aided Human Centric Cyber Situation Awareness: Understanding Cognitive Processes of Cyber Analysts

    DTIC Science & Technology

    2013-11-01

    by existing cyber-attack detection tools far exceeds the analysts’ cognitive capabilities. Grounded in perceptual and cognitive theory, many visual...Processes Inspired by the sense-making theory discussed earlier, we model the analytical reasoning process of cyber analysts using three key...analyst are called “working hypotheses”); each hypothesis could trigger further actions to confirm or disconfirm it. New actions will lead to new

  12. 78 FR 70014 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection... quantity and of comparable quality so as to render the control ineffective. Affected Public: Businesses and... by fax to (202) 395-5167. Dated: November 18, 2013. Gwellnar Banks, Management Analyst, Office of the...

  13. 76 FR 4188 - Federal Acquisition Regulation; Public Access to the Federal Awardee Performance and Integrity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-24

    ...: Mr. Edward Loeb, Procurement Analyst, at (202) 501-0650 for clarification of content. For information... business ethics and quality of prospective contractors competing for Federal contracts. That rulemaking... when the interim rule is published, the Department of Defense's Director of Defense Procurement and...

  14. 75 FR 38673 - Federal Acquisition Regulation; Federal Acquisition Circular 2005-43; Introduction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-02

    ... Web site at http://www.fema.gov/business/contractor.shtm . The Registry covers domestic disaster and... Registry of Disaster Response Contractors (FAR Case 2008-035): This final rule adopts, without change, the...

  15. 77 FR 16313 - Data Collection Available for Public Comments and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-20

    ..., Innovation and Technology Analyst, Office of Technology, Small Business Administration, 409 3rd Street, 6th Floor, Washington, DC 20416. FOR FURTHER INFORMATION CONTACT: Eric Eide, Innovation and Technology, 202-205-7507, [email protected]; 202-205-7576, [email protected]; Curtis B. Rich, Management...

  16. 77 FR 71028 - Twelfth Meeting: RTCA Special Committee 223, Airport Surface Wireless Communications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-28

    ... by Special Committee Leadership Agenda Overview Review/Approve prior Plenary Meeting Summary and... committee at any time. Issued in Washington, DC, on November 8, 2012. Richard F. Gonzalez, Management Analyst, Business Operations Group, Federal Aviation Administration. [FR Doc. 2012-28854 Filed 11-27-12; 8...

  17. 77 FR 36478 - Notice of Request for Extension of a Currently Approved Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-19

    ... support of the Single Family Housing Direct Loans and Grants programs. The collection involves the use of... consideration. FOR FURTHER INFORMATION CONTACT: Migdaliz Bernier, Finance and Loan Analyst, Single Family... information is estimated to average 6 minutes per response. Respondents: Individuals and business already...

  18. Labor Markets in Imbalance: Review of Qualitative Evidence.

    ERIC Educational Resources Information Center

    Medoff, James L.; Wiener, Jonathan B.

    Recent statistical investigations indicate that labor market imbalance has increased during the past decade and has had important deleterious effects on the nation's inflation and productivity growth records. These investigations reflect a growing difficulty in filling skilled jobs at a given unemployment rate. Business community analysts attribute the growing…

  19. Looking back to inform the future: The role of cognition in forest disturbance characterization from remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Bianchetti, Raechel Anne

    Remotely sensed images have become a ubiquitous part of our daily lives. From novice users, aiding in search and rescue missions using tools such as TomNod, to trained analysts, synthesizing disparate data to address complex problems like climate change, imagery has become central to geospatial problem solving. Expert image analysts are continually faced with rapidly developing sensor technologies and software systems. In response to these cognitively demanding environments, expert analysts develop specialized knowledge and analytic skills to address increasingly complex problems. This study identifies the knowledge, skills, and analytic goals of expert image analysts tasked with identification of land cover and land use change. Analysts participating in this research are currently working as part of a national level analysis of land use change, and are well versed with the use of TimeSync, forest science, and image analysis. The results of this study benefit current analysts by improving their awareness of the mental processes they use during image interpretation. The study can also be generalized to understand the types of knowledge and visual cues that analysts use when reasoning with imagery for purposes beyond land use change studies. Here a Cognitive Task Analysis framework is used to organize evidence from qualitative knowledge elicitation methods for characterizing the cognitive aspects of the TimeSync image analysis process. Using a combination of content analysis, diagramming, semi-structured interviews, and observation, the study highlights the perceptual and cognitive elements of expert remote sensing interpretation. Results show that image analysts perform several standard cognitive processes, but flexibly employ these processes in response to various contextual cues. Expert image analysts' ability to think flexibly during their analysis process was directly related to their amount of image analysis experience. Additionally, results show that the basic Image Interpretation Elements continue to be important despite technological augmentation of the interpretation process. These results are used to derive a set of design guidelines for developing geovisual analytic tools and training to support image analysis.

  20. How star women build portable skills.

    PubMed

    Groysberg, Boris

    2008-02-01

    In May 2004, with the war for talent in high gear, Groysberg and colleagues from Harvard Business School wrote in these pages about the risks of hiring star performers away from competitors. After studying the fortunes of more than 1,000 star stock analysts, they found that when a star switched companies, not only did his performance plunge, so did the effectiveness of the group he joined and the market value of his new company. But further analysis of the data reveals that it's not that simple. In fact, one group of analysts reliably maintained star rankings even after changing employers: women. Unlike their male counterparts, female stars who switched firms performed just as well, in the aggregate, as those who stayed put. The 189 star women in the sample (18% of the star analysts studied) achieved a higher rank after switching firms than the men did. Why the discrepancy? First, says the author, the best female analysts appear to have built their franchises on portable, external relationships with clients and the companies they covered, rather than on relationships rooted within their firms. By contrast, male analysts built up greater firm- and team-specific human capital by investing more in the internal networks and unique capabilities and resources of their own companies. Second, women took greater care when assessing a prospective new employer. In this article, Groysberg explores the reasons behind the star women's portable performance.

  1. A crisis in the analyst's life: self-containment, symbolization, and the holding space.

    PubMed

    Flax, Michelle

    2011-04-01

    Most analysts will experience some degree of crisis in the course of their working life. This paper explores the complex interplay between the analyst's affect during a crisis in her life and the affective dynamics of the patient. The central question is "who or what holds the analyst"--especially in times of crisis. Symbolization of affect, facilitated by the analyst's self-created holding environment, is seen as a vital process in order for containment to take place. In the clinical case presented, the analyst's dog was an integral part of the analyst's self-righting through this difficult period; the dog functioned as an "analytic object" within the analysis.

  2. Human/autonomy collaboration for the automated generation of intelligence products

    NASA Astrophysics Data System (ADS)

    DiBona, Phil; Schlachter, Jason; Kuter, Ugur; Goldman, Robert

    2017-05-01

    Intelligence Analysis remains a manual process despite trends toward autonomy in information processing. Analysts need agile decision-support tools that can adapt to the evolving information needs of the mission, allowing the analyst to pose novel analytic questions. Our research enables analysts to provide only a constrained English specification of what the intelligence product should be. Using HTN planning, the autonomy discovers, decides, and generates a workflow of algorithms to create the intelligence product. Therefore, the analyst can quickly and naturally communicate to the autonomy what information product is needed, rather than how to create it.
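
    To make the HTN-planning idea concrete, here is a minimal, illustrative decomposition planner. The task names, methods, and the "intelligence product" domain below are invented for the example; the paper's actual planner and hypothesis representation are not described at this level of detail.

```python
# Minimal hierarchical task network (HTN) style decomposition.
# Domain content (tasks, methods, primitives) is invented for illustration.

PRIMITIVES = {"query_sources", "geocode_entities", "cluster_reports", "render_map"}

# Each compound task maps to an ordered list of subtasks.
METHODS = {
    "produce_activity_map": ["gather_reports", "analyze_reports", "render_map"],
    "gather_reports": ["query_sources"],
    "analyze_reports": ["geocode_entities", "cluster_reports"],
}

def plan(task):
    """Recursively decompose a task into an ordered workflow of primitives."""
    if task in PRIMITIVES:
        return [task]
    if task not in METHODS:
        raise ValueError(f"no method or primitive for task: {task}")
    workflow = []
    for subtask in METHODS[task]:
        workflow.extend(plan(subtask))
    return workflow

if __name__ == "__main__":
    # The analyst states only *what* product is needed; the planner decides *how*.
    print(plan("produce_activity_map"))
    # ['query_sources', 'geocode_entities', 'cluster_reports', 'render_map']
```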

  3. Enhancing the value delivered by the statistician throughout drug discovery and development: putting statistical science into regulated pharmaceutical innovation.

    PubMed

    Enas, G G; Andersen, J S

    With the dawn of the 21st century, the pharmaceutical industry faces a dramatically different constellation of business and scientific predictors of success than those of just a few years ago. Significant advances in science at the genetic, molecular and cellular levels, combined with progress demonstrated around the globe with drug regulations, have increased business and competitive opportunities. This has occurred in search of better and cheaper medicines that reach patients with unmet medical needs as quickly as possible. Herein lie new opportunities for those who can help business and regulatory leaders make good decisions about drug development and market authorization as quickly and efficiently as possible in the presence of uncertainty. The statistician is uniquely trained and qualified to render such value. We show how the statistician can contribute to the process of drug innovation from the very early stages of drug discovery until patients, payers and regulators are satisfied. Indeed, the very nature of regulated innovation demands that efficient and effective processes are implemented which yield the right information for good decision making. The statistician can take the lead in setting a strategy that directs such processes in the direction of greatest value. This demands skills that enable one to identify important sources of variability and uncertainty and then leverage those skills to make decisions. If such decisions call for more information, then the statistician can render experimental designs which generate the right information needed to make the decision in an efficient, timely manner. To add value to the enterprise, statisticians will have to become more intimately associated with business and regulatory decisions by building on their traditional roles (for example, numerical analyst, tactician) and unique skill sets (for example, analysis, computation, logical thought and work process, precision, accuracy). Business and regulatory savvy, coupled with excellent communication and interpersonal skills, will allow statisticians to help create the knowledge needed to drive success in the future. Copyright 2001 John Wiley & Sons, Ltd.

  4. The Arts in Contemporary Education

    ERIC Educational Resources Information Center

    Eger, John M.

    2008-01-01

    The demand for a new workforce to meet the challenges of a global knowledge economy is rapidly increasing. As a special report in Business Week magazine observed last year: "The game is changing. It isn't just about math and science anymore. It's about creativity, imagination, and, above all, innovation." Most analysts studying the new global…

  5. 77 FR 1666 - Proposed Information Collection; Comment Request; Statement by Ultimate Consignee and Purchaser

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-11

    ... end-use and end-user of the U.S. origin commodities to be exported. The information will assist the... information collection). Affected Public: Business or other for-profit organizations. Estimated Number of... become a matter of public record. Dated: January 5, 2012. Gwellnar Banks, Management Analyst, Office of...

  6. An Empirical Study of Enterprise Conceptual Modeling

    NASA Astrophysics Data System (ADS)

    Anaby-Tavor, Ateret; Amid, David; Fisher, Amit; Ossher, Harold; Bellamy, Rachel; Callery, Matthew; Desmond, Michael; Krasikov, Sophia; Roth, Tova; Simmonds, Ian; de Vries, Jacqueline

    Business analysts, business architects, and solution consultants use a variety of practices and methods in their quest to understand business. The resulting work products could end up being transitioned into the formal world of software requirement definitions or as recommendations for all kinds of business activities. We describe an empirical study about the nature of these methods, diagrams, and home-grown conceptual models as reflected in real practice at IBM. We identify the models as artifacts of "enterprise conceptual modeling". We study important features of these models, suggest practical classifications, and discuss their usage. Our survey shows that the "enterprise conceptual modeling" arena presents a variety of descriptive models, each used by a relatively small group of colleagues. Together they form a "long tail" that extends from "drawings" on one end to "standards" on the other.

  7. Concept of Operations for Collaboration and Discovery from Big Data Across Enterprise Data Warehouses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olama, Mohammed M; Nutaro, James J; Sukumar, Sreenivas R

    2013-01-01

    The success of data-driven business in government, science, and private industry is driving the need for seamless integration of intra- and inter-enterprise data sources to extract knowledge nuggets in the form of correlations, trends, patterns and behaviors previously not discovered due to physical and logical separation of datasets. Today, as volume, velocity, variety and complexity of enterprise data keeps increasing, the next generation analysts are facing several challenges in the knowledge extraction process. Towards addressing these challenges, data-driven organizations that rely on the success of their analysts have to make investment decisions for sustainable data/information systems and knowledge discovery. Options that organizations are considering are newer storage/analysis architectures, better analysis machines, redesigned analysis algorithms, collaborative knowledge management tools, and query builders amongst many others. In this paper, we present a concept of operations for enabling knowledge discovery that data-driven organizations can leverage towards making their investment decisions. We base our recommendations on the experience gained from integrating multi-agency enterprise data warehouses at the Oak Ridge National Laboratory to design the foundation of future knowledge nurturing data-system architectures.

  8. 77 FR 12635 - Self-Regulatory Organizations; Financial Industry Regulatory Authority, Inc.; Notice of Filing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-01

    .... Securities Offering. Series 86 Research Analyst--Analysis..... From $160 to $175. Series 87 Research Analyst... Order Processing Assistant Representatives, Research Analysts and Operations Professionals, respectively... examination. PROCTOR is a computer system that is specifically designed for the administration and...

  9. One decade of the Data Fusion Information Group (DFIG) model

    NASA Astrophysics Data System (ADS)

    Blasch, Erik

    2015-05-01

    The revision of the Joint Directors of the Laboratories (JDL) Information Fusion model in 2004 discussed information processing, incorporated the analyst, and was coined the Data Fusion Information Group (DFIG) model. Since that time, developments in information technology (e.g., cloud computing, applications, and multimedia) have altered the role of the analyst. Data production has outpaced the analyst; however, the analyst still has the role of data refinement and information reporting. In this paper, we highlight three examples being addressed by the DFIG model. One example is the role of the analyst to provide semantic queries (through an ontology) so that the vast amounts of data available can be indexed, accessed, retrieved, and processed. The second idea is reporting, which requires the analyst to collect the data into a condensed and meaningful form through information management. The last example is the interpretation of the resolved information from data that must include contextual information not inherent in the data itself. Through a literature review, the DFIG developments in the last decade demonstrate the usability of the DFIG model to bring together the user (analyst or operator) and the machine (information fusion or manager) in a systems design.

  10. The Start of a Tech Revolution

    ERIC Educational Resources Information Center

    Dyrli, Kurt O.

    2009-01-01

    We are at the start of a revolution in the use of computers, one that analysts predict will rival the development of the PC in its significance. Companies such as Google, HP, Amazon, Sun Microsystems, Sony, IBM, and Apple are orienting their entire business models toward this change, and software maker SAS has announced plans for a $70 million…

  11. Coping with Volume and Variety in Temporal Event Sequences: Strategies for Sharpening Analytic Focus.

    PubMed

    Fan Du; Shneiderman, Ben; Plaisant, Catherine; Malik, Sana; Perer, Adam

    2017-06-01

    The growing volume and variety of data presents both opportunities and challenges for visual analytics. Addressing these challenges is needed for big data to provide valuable insights and novel solutions for business, security, social media, and healthcare. In the case of temporal event sequence analytics, it is the number of events in the data and the variety of temporal sequence patterns that challenge users of visual analytic tools. This paper describes 15 strategies for sharpening analytic focus that analysts can use to reduce the data volume and pattern variety. Four groups of strategies are proposed: (1) extraction strategies, (2) temporal folding, (3) pattern simplification strategies, and (4) iterative strategies. For each strategy, we provide examples of the use and impact of this strategy on volume and/or variety. Examples are selected from 20 case studies gathered from our own work, the literature, or email interviews with individuals who conducted the analyses and developers who observed analysts using the tools. Finally, we discuss how these strategies might be combined and report on the feedback from 10 senior event sequence analysts.
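
    One of the strategy groups above, temporal folding, can be shown in a few lines: timestamps are folded onto hour-of-day so that sequences recorded on different days collapse onto a common daily pattern, reducing apparent variety. The event log below is invented, and the fold_by_hour helper is a hypothetical name, not part of the paper's tooling.

```python
from datetime import datetime
from collections import Counter

# Invented event log: (timestamp, event type)
events = [
    ("2017-03-01 08:55", "login"), ("2017-03-01 09:10", "report"),
    ("2017-03-02 08:50", "login"), ("2017-03-02 09:20", "report"),
    ("2017-03-03 14:05", "login"), ("2017-03-03 14:30", "report"),
]

def fold_by_hour(events):
    """Temporal folding: drop the date, keep hour-of-day, to expose daily rhythm."""
    folded = Counter()
    for ts, kind in events:
        hour = datetime.strptime(ts, "%Y-%m-%d %H:%M").hour
        folded[(hour, kind)] += 1
    return folded

if __name__ == "__main__":
    for (hour, kind), count in sorted(fold_by_hour(events).items()):
        print(f"{hour:02d}:00  {kind:<7} x{count}")
```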

  12. The analyst's participation in the analytic process.

    PubMed

    Levine, H B

    1994-08-01

    The analyst's moment-to-moment participation in the analytic process is inevitably and simultaneously determined by at least three sets of considerations. These are: (1) the application of proper analytic technique; (2) the analyst's personally-motivated responses to the patient and/or the analysis; (3) the analyst's use of him or herself to actualise, via fantasy, feeling or action, some aspect of the patient's conflicts, fantasies or internal object relationships. This formulation has relevance to our view of actualisation and enactment in the analytic process and to our understanding of a series of related issues that are fundamental to our theory of technique. These include the dialectical relationships that exist between insight and action, interpretation and suggestion, empathy and countertransference, and abstinence and gratification. In raising these issues, I do not seek to encourage or endorse wild analysis, the attempt to supply patients with 'corrective emotional experiences' or a rationalisation for acting out one's countertransferences. Rather, it is my hope that if we can better appreciate and describe these important dimensions of the analytic encounter, we can be better prepared to recognise, understand and interpret the continual streams of actualisation and enactment that are embedded in the analytic process. A deeper appreciation of the nature of the analyst's participation in the analytic process and the dimensions of the analytic process to which that participation gives rise may offer us a limited, although important, safeguard against analytic impasse.

  13. 76 FR 24548 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing of Proposed Rule Change Relating...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-02

    ... committee uses third-party analyst research and a proprietary fundamental process to make allocation... investment process: Step 1: The Sub-Adviser's use of third-party research consists of analyzing the consensus... analyst research and a proprietary fundamental process to make allocation decisions. Changes to the Fund's...

  14. Knowledge Style Profiling: An Exploration of Cognitive, Temperament, Demographic and Organizational Characteristics among Decision Makers Using Advanced Analytical Technologies

    ERIC Educational Resources Information Center

    Polito, Vincent A., Jr.

    2010-01-01

    The objective of this research was to explore the possibilities of identifying knowledge style factors that could be used as central elements of a professional business analyst's (PBA) performance attributes at work for those decision makers that use advanced analytical technologies on decision making tasks. Indicators of knowledge style were…

  15. The Classification and Evaluation of Computer-Aided Software Engineering Tools

    DTIC Science & Technology

    1990-09-01

    International Business Machines Corporation. Customizer is a Registered Trademark of Index Technology Corporation. Data Analyst is a Registered Trademark of...years, a rapid series of new approaches have been adopted including: information engineering, entity-relationship modeling, automatic code generation...support true information sharing among tools and automated consistency checking. Moreover, the repository must record and manage the relationships and

  16. Can Khan Move the Bell Curve to the Right?

    ERIC Educational Resources Information Center

    Kronholz, June

    2012-01-01

    More than 1 million people have watched the online video in which Salman Khan--a charming MIT math whiz, Harvard Business School graduate, and former Boston hedge-fund analyst--explains how he began tutoring his cousins in math by posting short lessons for them on YouTube. Other people began watching the lessons and sending Khan adulatory notes.…

  17. Iridium: failures & successes

    NASA Astrophysics Data System (ADS)

    Christensen, CarissaBryce; Beard, Suzette

    2001-03-01

    This paper will provide an overview of the Iridium business venture in terms of the challenges faced, the successes achieved, and the causes of the ultimate failure of the venture — bankruptcy and system de-orbit. The paper will address technical, business, and policy issues. The intent of the paper is to provide a balanced and accurate overview of the Iridium experience, to aid future decision-making by policy makers, the business community, and technical experts. Key topics will include the history of the program, the objectives and decision-making of Motorola, the market research and analysis conducted, partnering strategies and their impact, consumer equipment availability, and technical issues — target performance, performance achieved, technical accomplishments, and expected and unexpected technical challenges. The paper will use as sources trade media and business articles on the Iridium program, technical papers and conference presentations, Wall Street analysts' reports, and, where possible, interviews with participants and close observers.

  18. A catalog of automated analysis methods for enterprise models.

    PubMed

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of Business and Information Technologies elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models, this process can be complicated; omissions or miscalculations are very likely. This situation has fostered research into automated analysis methods for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels; thus, some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.
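
    As a sketch of what one automated analysis method over an enterprise model might look like, the code below checks a toy model for business processes that lack a supporting application and for applications that support nothing. The metamodel, element names, and helper functions are invented for illustration and do not correspond to the catalog or tool described in the paper.

```python
# Toy enterprise model: elements and "supports" relations (invented for illustration).
processes = {"Order Handling", "Invoicing", "Customer Onboarding"}
applications = {"ERP", "CRM"}
supports = {            # application -> business processes it supports
    "ERP": {"Order Handling", "Invoicing"},
    "CRM": set(),        # deployed but currently supporting nothing
}

def unsupported_processes(processes, supports):
    """Automated analysis: business processes with no supporting application."""
    covered = set().union(*supports.values()) if supports else set()
    return sorted(processes - covered)

def unused_applications(applications, supports):
    """Automated analysis: applications that support no business process."""
    return sorted(app for app in applications if not supports.get(app))

if __name__ == "__main__":
    print("unsupported processes:", unsupported_processes(processes, supports))
    print("unused applications:  ", unused_applications(applications, supports))
```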

  19. Analyzing the requirements for mass production of small wind turbine generators

    NASA Astrophysics Data System (ADS)

    Anuskiewicz, T.; Asmussen, J.; Frankenfield, O.

    Mass producibility of small wind turbine generators is examined to give manufacturers the design and cost data needed for profitable production operations, using a 15 kW wind turbine generator produced in annual volumes from 1,000 to 50,000 units as the study case. The methodology used to cost the systems effectively is explained. The process estimate sequence followed is outlined, with emphasis on the process estimate sheets compiled for each component and subsystem. These data enabled analysts to develop cost breakdown profiles crucial to manufacturing decision-making. The appraisal also led to various design recommendations, including replacement of aluminum towers with cost-effective carbon steel towers. Extensive cost information is supplied in tables covering subassemblies, capital requirements, and levelized energy costs. The physical layout of the plant is depicted to guide manufacturers in taking advantage of the growing business opportunity now offered in conjunction with the national need for energy development.
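
    Since the study reports levelized energy costs, a brief reminder of how such a figure is typically computed may help: annualize the capital cost with a capital recovery factor, add annual O&M, and divide by annual energy output. The sketch below uses placeholder numbers, not values from the study.

```python
def capital_recovery_factor(rate, years):
    """Annualize an up-front cost over `years` at discount `rate`."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def levelized_cost_of_energy(capital_cost, annual_om, rate, years, annual_kwh):
    """Simplified LCOE in $/kWh: annualized capital plus O&M over annual output."""
    annualized_capital = capital_recovery_factor(rate, years) * capital_cost
    return (annualized_capital + annual_om) / annual_kwh

if __name__ == "__main__":
    # Placeholder figures for a small (15 kW class) wind turbine -- not study data.
    lcoe = levelized_cost_of_energy(
        capital_cost=30_000.0,  # installed cost, $
        annual_om=600.0,        # operations and maintenance, $/yr
        rate=0.08,              # discount rate
        years=20,               # service life
        annual_kwh=30_000.0,    # annual energy production, kWh
    )
    print(f"LCOE is roughly ${lcoe:.3f}/kWh under these assumptions")
```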

  20. Dynamics of analyst forecasts and emergence of complexity: Role of information disparity

    PubMed Central

    Ahn, Kwangwon

    2017-01-01

    We report complex phenomena arising among financial analysts, who gather information and generate investment advice, and elucidate them with the help of a theoretical model. Understanding how analysts form their forecasts is important in better understanding the financial market. Carrying out big-data analysis of the analyst forecast data from I/B/E/S for nearly thirty years, we find skew distributions as evidence for emergence of complexity, and show how information asymmetry or disparity affects how financial analysts form their forecasts. Here, regulations, information dissemination throughout a fiscal year, and interactions among financial analysts are regarded as proxies for a lower level of information disparity. It is found that financial analysts with better access to information display contrasting behaviors: a few analysts become bolder and issue forecasts independent of other forecasts, while the majority of analysts issue more accurate forecasts and flock to each other. The main body of our sample of optimistic forecasts fits a log-normal distribution, with the tail displaying a power law. Based on the Yule process, we propose a model for the dynamics of issuing forecasts, incorporating interactions between analysts. Explaining the empirical data on analyst forecasts nicely, this provides an appealing instance of understanding social phenomena from the perspective of complex systems. PMID:28498831
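
    A minimal preferential-attachment (Yule-type) simulation illustrates how heavy-tailed size distributions of the kind reported can emerge from simple reinforcement; the parameters and the yule_group_sizes helper are arbitrary illustrations, not the authors' model of forecast dynamics.

```python
import random
from collections import Counter

def yule_group_sizes(n_events=50_000, p_new=0.05, seed=1):
    """Preferential-attachment sketch: each new event either starts a new group
    (probability p_new) or joins an existing group in proportion to its size."""
    rng = random.Random(seed)
    urn = [0]          # one "ball" per member; the ball value is the group id
    n_groups = 1
    for _ in range(n_events):
        if rng.random() < p_new:
            group = n_groups
            n_groups += 1
        else:
            group = rng.choice(urn)   # proportional to current group size
        urn.append(group)
    return Counter(urn)               # group id -> group size

if __name__ == "__main__":
    sizes = sorted(yule_group_sizes().values(), reverse=True)
    # A few groups dominate while most stay small: the signature of a heavy tail.
    print("largest group sizes:", sizes[:5])
    print("number of groups:   ", len(sizes))
```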

  1. Trials with errors--preserving the integrity of clinical trials.

    PubMed

    Guyer, R L

    2000-01-01

    The crucial final test of medical research is the clinical trial, which determines whether a drug or discovery really is an effective therapy. All people who participate in clinical trials--researchers, sponsors, volunteers, analysts, reviewers, overseers, others--have opportunities to strengthen or weaken the integrity of the trial system by their behavior. Medical research is now officially married to business, and "profitable" connotes something different to each partner. Only if research and business can profit in parallel will the alliance succeed. Every person who is involved in the medical research business faces temptations and must choose how to react. Each has power and must choose how to wield it. Several centuries before this marriage, the Englishman Izaak Walton noted that "Health is...a blessing that money cannot buy."

  2. Mathematical Modelling for the Evaluation of Automated Speech Recognition Systems--Research Area 3.3.1 (c)

    DTIC Science & Technology

    2016-01-07

    news. Both of these resemble typical activities of intelligence analysts in OSINT processing and production applications. We assessed two task...intelligence analysts in a number of OSINT processing and production applications. (5) Summary of the most important results In both settings

  3. Conceptualisation of clinical facts in the analytic process.

    PubMed

    Riesenberg-Malcolm, R

    1994-12-01

    In this paper the author discusses what she understands to be a clinical fact, stressing that it takes place within the analytic situation between patient and analyst. It is in the process of conceptualising the fact that the analyst comes to define it. In order to conceptualise, the analyst must have a frame of reference, a theoretical basis through which he perceives his patient's communications and is able to give meaning to them. In analytic work, the analyst uses his theory in mainly two ways. When working with his patient it operates mostly unconsciously, interspersed with quick, more conscious thinking. When away from the patient, theory needs to come to the front of the analyst's mind, consciously used by him. A clinical case is used to illustrate these two aspects of theoretical work. In the material presented, aspects of a first session are tentatively conceptualised. Then material from the same patient some years later is described; the method of working and the way of understanding are discussed, and thus the process of conceptualising is illustrated. The theme of hope has been singled out as a linking point between the earlier and later pieces of material.

  4. "This strange disease": adolescent transference and the analyst's sexual orientation.

    PubMed

    Burton, John K; Gilmore, Karen

    2010-08-01

    The treatment of adolescents by gay analysts is uncharted territory regarding the impact of the analyst's sexuality on the analytic process. Since a core challenge of adolescence involves the integration of the adult sexual body, gender role, and reproductive capacities into evolving identity, and since adolescents seek objects in their environment to facilitate both identity formation and the establishment of autonomy from primary objects, the analyst's sexual orientation is arguably a potent influence on the outcome of adolescent development. However, because sexual orientation is a less visible characteristic of the analyst than gender, race, or age, for example, the line between reality and fantasy is less clearly demarcated. This brings up special considerations regarding discovery and disclosure in the treatment. To explore these issues, the case of a late adolescent girl in treatment with a gay male analyst is presented. In this treatment, the question of the analyst's sexual orientation, and the demand by the patient for the analyst's self-disclosure, became a transference nucleus around which the patient's individual dynamics and adolescent dilemmas could be explored and clarified.

  5. Two-Bin Kanban: Ordering Impact at Navy Medical Center San Diego

    DTIC Science & Technology

    2016-06-17

    pretest (2013 data set) and posttest (2015 data set) analysis to avoid having the findings influenced by price changes. DMLSS does not track shipping...statistics based on those observations (Kabacoff, 2011, p. 112). Replacing the groups of observations with summary statistics allows the analyst...listed on the Acquisition Research Program website (www.acquisitionresearch.net). Acquisition Research Program Graduate School of Business & Public

  6. Using a Model of Analysts' Judgments to Augment an Item Calibration Process

    ERIC Educational Resources Information Center

    Hauser, Carl; Thum, Yeow Meng; He, Wei; Ma, Lingling

    2015-01-01

    When conducting item reviews, analysts evaluate an array of statistical and graphical information to assess the fit of a field test (FT) item to an item response theory model. The process can be tedious, particularly when the number of human reviews (HR) to be completed is large. Furthermore, such a process leads to decisions that are susceptible…

  7. Station Set Residual: Event Classification Using Historical Distribution of Observing Stations

    NASA Astrophysics Data System (ADS)

    Procopio, Mike; Lewis, Jennifer; Young, Chris

    2010-05-01

    Analysts working at the International Data Centre in support of treaty monitoring through the Comprehensive Nuclear-Test-Ban Treaty Organization spend a significant amount of time reviewing hypothesized seismic events produced by an automatic processing system. When reviewing these events to determine their legitimacy, analysts take a variety of approaches that rely heavily on training and past experience. One method used by analysts to gauge the validity of an event involves examining the set of stations involved in the detection of an event. In particular, leveraging past experience, an analyst can say that an event located in a certain part of the world is expected to be detected by Stations A, B, and C. Implicit in this statement is that such an event would usually not be detected by Stations X, Y, or Z. For some well understood parts of the world, the absence of one or more "expected" stations—or the presence of one or more "unexpected" stations—is correlated with a hypothesized event's legitimacy and to its survival to the event bulletin. The primary objective of this research is to formalize and quantify the difference between the observed set of stations detecting some hypothesized event, versus the expected set of stations historically associated with detecting similar nearby events close in magnitude. This Station Set Residual can be quantified in many ways, some of which are correlated with the analysts' determination of whether or not the event is valid. We propose that this Station Set Residual score can be used to screen out certain classes of "false" events produced by automatic processing with a high degree of confidence, reducing the analyst burden. Moreover, we propose that the visualization of the historically expected distribution of detecting stations can be immediately useful as an analyst aid during their review process.
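
    One way to formalize the idea is to score an observed set of detecting stations against the historical detection frequencies of stations for nearby events of similar magnitude. The sketch below is an illustrative surprisal-style scoring function, not the authors' formulation; the station names and frequencies are invented.

```python
import math

# Hypothetical historical detection frequencies for one region/magnitude bin:
# the fraction of past analyst-approved events detected by each station.
HISTORY = {"A": 0.95, "B": 0.90, "C": 0.80, "X": 0.05, "Y": 0.03, "Z": 0.02}

def station_set_residual(observed, history):
    """Sum of surprisal-like penalties: missing an expected station or seeing an
    unexpected one both add to the residual; a higher score means less typical."""
    eps = 1e-3
    residual = 0.0
    for station, p in history.items():
        p = min(max(p, eps), 1 - eps)
        if station in observed:
            residual += -math.log(p)        # small penalty if p is high
        else:
            residual += -math.log(1 - p)    # large penalty if p is high
    return residual

if __name__ == "__main__":
    print(station_set_residual({"A", "B", "C"}, HISTORY))   # typical set -> low score
    print(station_set_residual({"X", "Y"}, HISTORY))        # atypical set -> high score
```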

  8. Mortality, integrity, and psychoanalysis (who are you to me? Who am I to you?).

    PubMed

    Pinsky, Ellen

    2014-01-01

    The author narrates her experience of mourning her therapist's sudden death. The profession has neglected implications of the analyst's mortality: what is lost or vulnerable to loss? What is that vulnerability's function? The author's process of mourning included her writing and her becoming an analyst. Both pursuits inspired reflections on mortality in two overlapping senses: bodily (the analyst is mortal and can die) and character (the analyst is mortal and can err). The subject thus expands to include impaired character and ethical violations. Paradoxically, the analyst's human limitations threaten each psychoanalytic situation, but also enable it: human imperfection animates the work. The essay ends with a specific example of integrity. © 2014 The Psychoanalytic Quarterly, Inc.

  9. Gamification: The Intersection between Behavior Analysis and Game Design Technologies.

    PubMed

    Morford, Zachary H; Witts, Benjamin N; Killingsworth, Kenneth J; Alavosius, Mark P

    2014-05-01

    Deterding et al. (Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments, USA 15: 9-15, 2011) report a recent rise in popularity of video game inspired software designed to address issues in a variety of areas, including health, energy conservation, education, and business. These applications have been based on the concept of gamification, which involves a process by which nongame activities are designed to be more like a game. We provide examples of how gamification has been used to increase health-related behavior, energy consumption, academic performance, and other socially-significant behavior. We argue that behavior analytic research and practice stands to benefit from incorporating successful elements of game design. Lastly, we provide suggestions for behavior analysts regarding applied and basic research related to gamification.

  10. Internalization, separation-individuation, and the nature of therapeutic action.

    PubMed

    Blatt, S J; Behrends, R S

    1987-01-01

    Based on the assumption that the mutative factors that facilitate growth in psychoanalysis involve the same fundamental mechanisms that lead to psychological growth in normal development, this paper considers the constant oscillation between gratification and deprivation leading to internalization as the central therapeutic mechanism of the psychoanalytic process. Patients experience the analytic process as a series of gratifying involvements and experienced incompatibilities that facilitate internalization, whereby the patient recovers lost or disrupted regulatory, gratifying interactions with the analyst, which are real or fantasied, by appropriating these interactions, transforming them into their own, enduring, self-generated functions and characteristics. Patients internalize not only the analyst's interpretive activity, but also the analyst's sensitivity, compassion and acceptance, and, in addition, their own activity in relation to the analyst such as free association. Both interpretation and the therapeutic relationship can contain elements of gratifying involvement and experienced incompatibility that lead to internalization and therefore both can be mutative factors in the therapeutic process.

  11. Cognitive task analysis of network analysts and managers for network situational awareness

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.; Frincke, Deborah A.; Wong, Pak Chung; Moody, Sarah; Fink, Glenn

    2010-01-01

    The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The situational-awareness capabilities being developed focus on novel visualization techniques as well as data analysis techniques designed to improve the comprehensibility of the visualizations. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into ensuring we had feedback from network analysts and managers and understood what their needs truly were. This paper discusses the cognitive task analysis methodology we followed to acquire feedback from the analysts. This paper also provides the details we acquired from the analysts on their processes, goals, concerns, etc. A final result we describe is the generation of a task-flow diagram.

  12. Informatics in neurocritical care: new ideas for Big Data.

    PubMed

    Flechet, Marine; Grandas, Fabian Güiza; Meyfroidt, Geert

    2016-04-01

    Big data is the new hype in business and healthcare. Data storage and processing has become cheap, fast, and easy. Business analysts and scientists are trying to design methods to mine these data for hidden knowledge. Neurocritical care is a field that typically produces large amounts of patient-related data, and these data are increasingly being digitized and stored. This review will try to look beyond the hype, and focus on possible applications in neurointensive care amenable to Big Data research that can potentially improve patient care. The first challenge in Big Data research will be the development of large, multicenter, and high-quality databases. These databases could be used to further investigate recent findings from mathematical models, developed in smaller datasets. Randomized clinical trials and Big Data research are complementary. Big Data research might be used to identify subgroups of patients that could benefit most from a certain intervention, or can be an alternative in areas where randomized clinical trials are not possible. The processing and the analysis of the large amount of patient-related information stored in clinical databases is beyond normal human cognitive ability. Big Data research applications have the potential to discover new medical knowledge, and improve care in the neurointensive care unit.

  13. TelCoVis: Visual Exploration of Co-occurrence in Urban Human Mobility Based on Telco Data.

    PubMed

    Wu, Wenchao; Xu, Jiayi; Zeng, Haipeng; Zheng, Yixian; Qu, Huamin; Ni, Bing; Yuan, Mingxuan; Ni, Lionel M

    2016-01-01

    Understanding co-occurrence in urban human mobility (i.e. people from two regions visit an urban place during the same time span) is of great value in a variety of applications, such as urban planning, business intelligence, social behavior analysis, as well as containing contagious diseases. In recent years, the widespread use of mobile phones brings an unprecedented opportunity to capture large-scale and fine-grained data to study co-occurrence in human mobility. However, due to the lack of systematic and efficient methods, it is challenging for analysts to carry out in-depth analyses and extract valuable information. In this paper, we present TelCoVis, an interactive visual analytics system, which helps analysts leverage their domain knowledge to gain insight into the co-occurrence in urban human mobility based on telco data. Our system integrates visualization techniques with new designs and combines them in a novel way to enhance analysts' perception for a comprehensive exploration. In addition, we propose to study the correlations in co-occurrence (i.e. people from multiple regions visit different places during the same time span) by means of biclustering techniques that allow analysts to better explore coordinated relationships among different regions and identify interesting patterns. The case studies based on a real-world dataset and interviews with domain experts have demonstrated the effectiveness of our system in gaining insights into co-occurrence and facilitating various analytical tasks.
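
    The biclustering step can be illustrated with an off-the-shelf routine applied to a synthetic region-by-region co-occurrence matrix. The sketch below uses scikit-learn's SpectralBiclustering purely as an example; it is unrelated to the TelCoVis implementation, and the data are synthetic.

```python
from sklearn.datasets import make_checkerboard
from sklearn.cluster import SpectralBiclustering

# Synthetic "co-occurrence" matrix: rows and columns stand in for urban regions,
# cell values for how often people from the two regions visit places together.
data, rows, cols = make_checkerboard(
    shape=(30, 30), n_clusters=(3, 3), noise=5, random_state=0
)

model = SpectralBiclustering(n_clusters=(3, 3), random_state=0)
model.fit(data)

# Each region gets a row-cluster and a column-cluster label; regions sharing a
# bicluster exhibit coordinated co-occurrence patterns worth inspecting.
print("row clusters:   ", model.row_labels_[:10])
print("column clusters:", model.column_labels_[:10])
```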

  14. Business Leaders in Action Results for America, Leadership Report for 2010

    DTIC Science & Technology

    2010-03-23

    continuing to make BENS the best source of mission-critical business practice in national security affairs. We will continue BENS’ work in those areas w...challenges in cyber security, the national debt, intelligence operations and technological innovation, our country needs leaders like you to improve... Intelligence – BENS members are providing intelligence analysts with a deeper understanding of the financial industry and its importance to national

  15. Modelling Agent-Environment Interaction in Multi-Agent Simulations with Affordances

    DTIC Science & Technology

    2010-04-01

    allow operations analysts to conduct statistical studies comparing the effectiveness of different systems or tactics in different scenarios. Instead of...in a Monte-Carlo batch mode, producing statistical outcomes for particular measures of effectiveness. They typically also run at many times faster...Combined with annotated signs, the affordances allowed the traveller agents to find their way around the virtual airport and to conduct their business

  16. Transformation in the Developing World: An Analysis of Colombia’s Security Transformation

    DTIC Science & Technology

    2004-09-01

    familiar with this subject also took significant amounts of time from their busy schedules to grant telephone interviews and provide written guidance. For...allocations. Instead, there was a decrease in intelligence operative and analyst positions. Compared to the familiarity of the...the developing nation is not concerned with preparing a diverse global response strategy for an unknown enemy. Rather, it is generally familiar with

  17. Using MetaboAnalyst 3.0 for Comprehensive Metabolomics Data Analysis.

    PubMed

    Xia, Jianguo; Wishart, David S

    2016-09-07

    MetaboAnalyst (http://www.metaboanalyst.ca) is a comprehensive Web application for metabolomic data analysis and interpretation. MetaboAnalyst handles most of the common metabolomic data types from most kinds of metabolomics platforms (MS and NMR) for most kinds of metabolomics experiments (targeted, untargeted, quantitative). In addition to providing a variety of data processing and normalization procedures, MetaboAnalyst also supports a number of data analysis and data visualization tasks using a range of univariate and multivariate methods such as PCA (principal component analysis), PLS-DA (partial least squares discriminant analysis), heatmap clustering and machine learning methods. MetaboAnalyst also offers a variety of tools for metabolomic data interpretation including MSEA (metabolite set enrichment analysis), MetPA (metabolite pathway analysis), and biomarker selection via ROC (receiver operating characteristic) curve analysis, as well as time series and power analysis. This unit provides an overview of the main functional modules and the general workflow of the latest version of MetaboAnalyst (MetaboAnalyst 3.0), followed by eight detailed protocols. © 2016 by John Wiley & Sons, Inc.

  18. Global Logistics Management

    DTIC Science & Technology

    2011-07-21

    Phillips, Richard Spencer, and Leigh Warner. Catherine Whittington served as the Board Staff Analyst. PROCESS: The Task Group conducted more than...(Chair), Mr. Pierre Chao, Mr. William Phillips, Mr. Richard Spencer, Ms. Leigh Warner; DBB Staff Analyst: Catherine Whittington. Methodology

  19. Cognitive Task Analysis of Network Analysts and Managers for Network Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erbacher, Robert; Frincke, Deborah A.; Wong, Pak C.

    The goal of the project was to create a set of next generation cyber situational awareness capabilities with applications to other domains in the long term. The goal is to improve the decision making process such that decision makers can choose better actions. To this end, we put extensive effort into ensuring we had feedback from network analysts and managers and understood what their needs truly were. Consequently, this is the focus of this portion of the research. This paper discusses the methodology we followed to acquire this feedback from the analysts, namely a cognitive task analysis. Additionally, this paper provides the details we acquired from the analysts. This essentially provides details on their processes, goals, concerns, the data and meta-data they analyze, etc. A final result we describe is the generation of a task-flow diagram.

  20. Section 4: Requirements Intertwining

    NASA Astrophysics Data System (ADS)

    Loucopoulos, Pericles

    Business analysts are being asked to develop increasingly complex and varied business systems that need to cater to the changing and dynamic market conditions of the new economy. This is particularly acute in today’s turbulent business environment where powerful forces such as deregulation, globalisation, mergers, advances in information and telecommunications technologies, and increasing education of people provide opportunities for organising work in ways that have never before been possible. Enterprises attempt to create wealth either by getting better at improving their products and services or by harnessing creativity and human-centred management to create innovative solutions. In these business settings, requirements become critical in bridging system solutions to organisational and societal problems. They intertwine organisational, social, cognitive, and implementation considerations and they can provide unique insights to change in systems and their business context. Such design situations often involve multiple stakeholders from different participating organisations, subcontractors, divisions, etc., who may have a diversity of expertise, come from different organisational cultures and often have competing goals. The success or failure of many projects depends, to a large extent, on understanding the contextual setting of requirements and their interaction amongst a diverse population of stakeholders.

  1. Subcellular object quantification with Squassh3C and SquasshAnalyst.

    PubMed

    Rizk, Aurélien; Mansouri, Maysam; Ballmer-Hofer, Kurt; Berger, Philipp

    2015-11-01

    Quantitative image analysis plays an important role in contemporary biomedical research. Squassh is a method for automatic detection, segmentation, and quantification of subcellular structures and analysis of their colocalization. Here we present the applications Squassh3C and SquasshAnalyst. Squassh3C extends the functionality of Squassh to three fluorescence channels and live-cell movie analysis. SquasshAnalyst is an interactive web interface for the analysis of Squassh3C object data. It provides segmentation image overview and data exploration, figure generation, object and image filtering, and a statistical significance test in an easy-to-use interface. The overall procedure combines the Squassh3C plug-in for the free biological image processing program ImageJ and a web application working in conjunction with the free statistical environment R, and it is compatible with Linux, MacOS X, or Microsoft Windows. Squassh3C and SquasshAnalyst are available for download at www.psi.ch/lbr/SquasshAnalystEN/SquasshAnalyst.zip.

  2. High Level Information Fusion (HLIF) with nested fusion loops

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Gosnell, Michael; Fischer, Amber

    2013-05-01

    Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Through the years, some common frameworks have emerged for dealing with information fusion—perhaps the most ubiquitous being the JDL Data Fusion Group and their initial 4-level data fusion model. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analyses within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address challenges of high level information fusion and handle bias, ambiguity, and uncertainty (BAU) for Situation Modeling, Threat Modeling, and Threat Prediction. It combines JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high level information fusion (HLIF) to affect lower levels without the double counting of information or other biasing issues. The initial FURNACE project was focused on the underlying algorithms to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analyst's efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but also benefit the larger homeland defense, law enforcement, and business intelligence markets.

  3. Embodying analysis: the body and the therapeutic process.

    PubMed

    Martini, Salvatore

    2016-02-01

    This paper considers the transfer of somatic effects from patient to analyst, which gives rise to embodied countertransference, functioning as an organ of primitive communication. By means of processes of projective identification, the analyst experiences somatic disturbances within himself or herself that are connected to the split-off complexes of the analysand. The analyst's own attempt at mind-body integration ushers the patient towards a progressive understanding and acceptance of his or her inner suffering. Such experiences of psychic contagion between patient and analyst are related to Jung's 'psychology of the transference' and the idea of the 'subtle body' as an unconscious shared area. The re-attribution of meaning to pre-verbal psychic experiences within the 'embodied reverie' of the analyst enables the analytic dyad to reach the archetypal energies and structuring power of the collective unconscious. A detailed case example is presented of how the emergence of the vitalizing connection between the psyche and the soma, severed through traumatic early relations with parents or carers, allows the instinctual impulse of the Self to manifest, thereby reactivating the process of individuation. © 2016, The Society of Analytical Psychology.

  4. Proactive human-computer collaboration for information discovery

    NASA Astrophysics Data System (ADS)

    DiBona, Phil; Shilliday, Andrew; Barry, Kevin

    2016-05-01

    Lockheed Martin Advanced Technology Laboratories (LM ATL) is researching methods, representations, and processes for human/autonomy collaboration to scale analysis and hypotheses substantiation for intelligence analysts. This research establishes a machine-readable hypothesis representation that is commonsensical to the human analyst. The representation unifies context between the human and computer, enabling autonomy, in the form of analytic software, to support the analyst by proactively acquiring, assessing, and organizing high-value information that is needed to inform and substantiate hypotheses.

  5. Toward a Visualization-Supported Workflow for Cyber Alert Management using Threat Models and Human-Centered Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.

    Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations, from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigation are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine data mapping for gaps where the threat model is under-supported by either data or tools. We discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.
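
    A simple way to express the data-to-threat-model mapping described here is a coverage table from ATT&CK tactics to the data sources an analyst actually has; gaps then fall out mechanically. In the sketch below the tactic names are real ATT&CK tactic names, but the data-source mapping and the helper function are invented for illustration.

```python
# Illustrative mapping of ATT&CK tactics to locally available data sources.
# The tactic names come from the ATT&CK Matrix; the data sources are made up.
coverage = {
    "Initial Access":   ["email gateway logs", "proxy logs"],
    "Execution":        ["endpoint process logs"],
    "Persistence":      [],
    "Lateral Movement": ["authentication logs"],
    "Exfiltration":     [],
}

def under_supported(coverage):
    """Tactics with no mapped data source: candidate gaps for tooling or collection."""
    return [tactic for tactic, sources in coverage.items() if not sources]

if __name__ == "__main__":
    for tactic in under_supported(coverage):
        print("no data mapped for tactic:", tactic)
```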

  6. Neurotechnology for intelligence analysts

    NASA Astrophysics Data System (ADS)

    Kruse, Amy A.; Boyd, Karen C.; Schulman, Joshua J.

    2006-05-01

    Geospatial Intelligence Analysts are currently faced with an enormous volume of imagery, only a fraction of which can be processed or reviewed in a timely operational manner. Computer-based target detection efforts have failed to yield the speed, flexibility and accuracy of the human visual system. Rather than focus solely on artificial systems, we hypothesize that the human visual system is still the best target detection apparatus currently in use, and with the addition of neuroscience-based measurement capabilities it can surpass the throughput of the unaided human severalfold. Using electroencephalography (EEG), Thorpe et al. [1] described a fast signal in the brain associated with the early detection of targets in static imagery using a Rapid Serial Visual Presentation (RSVP) paradigm. This finding suggests that it may be possible to extract target detection signals from complex imagery in real time utilizing non-invasive neurophysiological assessment tools. To transform this phenomenon into a capability for defense applications, the Defense Advanced Research Projects Agency (DARPA) is currently sponsoring an effort titled Neurotechnology for Intelligence Analysts (NIA). The vision of the NIA program is to revolutionize the way that analysts handle intelligence imagery, increasing both the throughput of imagery to the analyst and the overall accuracy of the assessments. Successful development of a neurobiologically-based image triage system will enable image analysts to train more effectively and process imagery with greater speed and precision.

  7. The application test system: Experiences to date and future plans

    NASA Technical Reports Server (NTRS)

    May, G. A.; Ashburn, P.; Hansen, H. L. (Principal Investigator)

    1979-01-01

    The ATS analysis component is presented, focusing on the methods by which the varied data sources are used by the ATS analyst. Analyst training and initial processing of data are discussed, along with short- and long-term plans for the ATS.

  8. NET-VISA, a Bayesian method next-generation automatic association software. Latest developments and operational assessment.

    NASA Astrophysics Data System (ADS)

    Le Bras, Ronan; Kushida, Noriyuki; Mialle, Pierrick; Tomuta, Elena; Arora, Nimar

    2017-04-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing a Bayesian method and software to perform the key step of automatic association of seismological, hydroacoustic, and infrasound (SHI) parametric data. In our preliminary testing at the CTBTO, NET-VISA shows much better performance than its currently operating automatic association module: the rate of automatic events matching analyst-reviewed events is increased by 10%, meaning that the percentage of missed events is lowered by about 40%. Initial tests involving analysts also showed that the new software will complete the automatic bulletins of the CTBTO by adding previously missed events. Because products of the CTBTO are widely distributed to its member States as well as throughout the seismological community, the introduction of a new technology must be carried out carefully, and the first step of operational integration is to use NET-VISA results within the interactive analysts' software so that the analysts can check the robustness of the Bayesian approach. We report on the latest results, both on the progress of automatic processing and on the initial introduction of NET-VISA results into the analyst review process.

  9. A feasibility study for Arizona's roadway safety management process using the Highway Safety Manual and SafetyAnalyst : final report.

    DOT National Transportation Integrated Search

    2016-07-01

    To enable implementation of the American Association of State Highway and Transportation Officials (AASHTO) Highway Safety Manual using SafetyAnalyst (an AASHTOWare software product), the Arizona Department of Transportation (ADOT) studied the data assessment ...

  10. USDA analyst review of the LACIE IMAGE-100 hybrid system test

    NASA Technical Reports Server (NTRS)

    Ashburn, P.; Buelow, K.; Hansen, H. L.; May, G. A. (Principal Investigator)

    1979-01-01

    Fifty operational segments from the U.S.S.R., 40 test segments from Canada, and 24 test segments from the United States were used to provide a wide range of geographic conditions for USDA analysts during a test to determine the effectiveness of labeling single pixel training fields (dots) using Procedure 1 on the 1-100 hybrid system, and clustering and classifying on the Earth Resources Interactive Processing System. The analysts had additional on-line capabilities such as interactive dot labeling, class or cluster map overlay flickers, and flashing of all dots of equal spectral value. Results on the 1-100 hybrid system are described and analyst problems and recommendations are discussed.

  11. Report To The Secretary Of Defense - Global Logistics Management

    DTIC Science & Technology

    2011-07-01

    Spencer, and Leigh Warner. Catherine Whittington served as the Board's Staff Analyst. Process: the Task Group conducted more than 30 interviews... Methodology: reviewed DoD Directives and...

  12. Toward a Cognitive Task Analysis for Biomedical Query Mediation

    PubMed Central

    Hruby, Gregory W.; Cimino, James J.; Patel, Vimla; Weng, Chunhua

    2014-01-01

    In many institutions, data analysts use a Biomedical Query Mediation (BQM) process to facilitate data access for medical researchers. However, understanding of the BQM process is limited in the literature. To bridge this gap, we performed the initial steps of a cognitive task analysis using 31 BQM instances conducted between one analyst and 22 researchers in one academic department. We identified five top-level tasks, i.e., clarify research statement, explain clinical process, identify related data elements, locate EHR data element, and end BQM with either a database query or unmet, infeasible information needs, and 10 sub-tasks. We evaluated the BQM task model with seven data analysts from different clinical research institutions. Evaluators found all the tasks completely or semi-valid. This study contributes initial knowledge towards the development of a generalizable cognitive task representation for BQM. PMID:25954589

  13. Toward a cognitive task analysis for biomedical query mediation.

    PubMed

    Hruby, Gregory W; Cimino, James J; Patel, Vimla; Weng, Chunhua

    2014-01-01

    In many institutions, data analysts use a Biomedical Query Mediation (BQM) process to facilitate data access for medical researchers. However, understanding of the BQM process is limited in the literature. To bridge this gap, we performed the initial steps of a cognitive task analysis using 31 BQM instances conducted between one analyst and 22 researchers in one academic department. We identified five top-level tasks, i.e., clarify research statement, explain clinical process, identify related data elements, locate EHR data element, and end BQM with either a database query or unmet, infeasible information needs, and 10 sub-tasks. We evaluated the BQM task model with seven data analysts from different clinical research institutions. Evaluators found all the tasks completely or semi-valid. This study contributes initial knowledge towards the development of a generalizable cognitive task representation for BQM.

  14. The problem of self-disclosure in psychoanalysis.

    PubMed

    Meissner, W W

    2002-01-01

    The problem of self-disclosure is explored in relation to currently shifting paradigms of the nature of the analytic relation and analytic interaction. Relational and intersubjective perspectives emphasize the role of self-disclosure as not merely allowable, but as an essential facilitating aspect of the analytic dialogue, in keeping with the role of the analyst as a contributing partner in the process. At the opposite extreme, advocates of classical anonymity stress the importance of neutrality and abstinence. The paper seeks to chart a course between unconstrained self-disclosure and absolute anonymity, both of which foster misalliances. Self-disclosure is seen as at times contributory to the analytic process, and at times deleterious. The decision whether to self-disclose, what to disclose, and when and how, should be guided by the analyst's perspective on neutrality, conceived as a mental stance in which the analyst assesses and decides what, at any given point, seems to contribute to the analytic process and the patient's therapeutic benefit. The major risk in self-disclosure is the tendency to draw the analytic interaction into the real relation between analyst and patient, thus diminishing or distorting the therapeutic alliance, mitigating transference expression, and compromising therapeutic effectiveness.

  15. NVivo 8 and consistency in data analysis: reflecting on the use of a qualitative data analysis program.

    PubMed

    Bergin, Michael

    2011-01-01

    Qualitative data analysis is a complex process and demands clear thinking on the part of the analyst. However, a number of deficiencies may obstruct the research analyst during the process, leading to inconsistencies occurring. This paper is a reflection on the use of a qualitative data analysis program, NVivo 8, and its usefulness in identifying consistency and inconsistency during the coding process. The author was conducting a large-scale study of providers and users of mental health services in Ireland. He used NVivo 8 to store, code and analyse the data and this paper reflects some of his observations during the study. The demands placed on the analyst in trying to balance the mechanics of working through a qualitative data analysis program, while simultaneously remaining conscious of the value of all sources are highlighted. NVivo 8 as a qualitative data analysis program is a challenging but valuable means for advancing the robustness of qualitative research. Pitfalls can be avoided during analysis by running queries as the analyst progresses from tree node to tree node rather than leaving it to a stage whereby data analysis is well advanced.

  16. Developing an intelligence analysis process through social network analysis

    NASA Astrophysics Data System (ADS)

    Waskiewicz, Todd; LaMonica, Peter

    2008-04-01

    Intelligence analysts are tasked with making sense of enormous amounts of data and gaining an awareness of a situation that can be acted upon. This process can be extremely difficult and time consuming. Trying to differentiate between important pieces of information and extraneous data only complicates the problem. When dealing with data containing entities and relationships, social network analysis (SNA) techniques can be employed to make this job easier. Applying network measures to social network graphs can identify the most significant nodes (entities) and edges (relationships) and help the analyst further focus on key areas of concern. Strange developed a model that identifies high value targets such as centers of gravity and critical vulnerabilities. SNA lends itself to the discovery of these high value targets and the Air Force Research Laboratory (AFRL) has investigated several network measures such as centrality, betweenness, and grouping to identify centers of gravity and critical vulnerabilities. Using these network measures, a process for the intelligence analyst has been developed to aid analysts in identifying points of tactical emphasis. Organizational Risk Analyzer (ORA) and Terrorist Modus Operandi Discovery System (TMODS) are the two applications used to compute the network measures and identify the points to be acted upon. Therefore, the result of leveraging social network analysis techniques and applications will provide the analyst and the intelligence community with more focused and concentrated analysis results allowing them to more easily exploit key attributes of a network, thus saving time, money, and manpower.
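
    A minimal sketch of the kind of measures named above, using the open-source networkx library rather than ORA or TMODS; the entity graph is invented for illustration. Nodes with high betweenness are the candidate brokers, and therefore possible centers of gravity, that an analyst might examine first.

    ```python
    # Compute degree and betweenness centrality over a toy entity-relationship graph
    # to surface structurally significant nodes for analyst attention.
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([
        ("leader", "financier"), ("leader", "recruiter"),
        ("financier", "supplier"), ("recruiter", "cell_a"),
        ("recruiter", "cell_b"), ("supplier", "cell_a"),
    ])

    degree = nx.degree_centrality(G)            # how connected each entity is
    betweenness = nx.betweenness_centrality(G)  # who brokers paths between groups

    for node in sorted(G, key=lambda n: betweenness[n], reverse=True):
        print(f"{node:10s} degree={degree[node]:.2f} betweenness={betweenness[node]:.2f}")
    ```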

  17. Towards an automated intelligence product generation capability

    NASA Astrophysics Data System (ADS)

    Smith, Alison M.; Hawes, Timothy W.; Nolan, James J.

    2015-05-01

    Creating intelligence information products is a time-consuming and difficult process for analysts faced with identifying key pieces of information relevant to a complex set of information requirements. Complicating matters, these key pieces of information exist in multiple modalities scattered across data stores, buried in huge volumes of data. This results in the current predicament analysts find themselves in: information retrieval and management consume huge amounts of time that could be better spent performing analysis. The persistent growth in data accumulation rates will only increase the amount of time spent on these tasks without a significant advance in automated solutions for information product generation. We present a product generation tool, Automated PrOduct Generation and Enrichment (APOGEE), which aims to automate the information product creation process in order to shift the bulk of the analysts' effort from data discovery and management to analysis. APOGEE discovers relevant text, imagery, video, and audio for inclusion in information products using semantic and statistical models of unstructured content. APOGEE's mixed-initiative interface, supported by highly responsive backend mechanisms, allows analysts to dynamically control the product generation process, ensuring a maximally relevant result. The combination of these capabilities results in significant reductions in the time it takes analysts to produce information products while helping to increase overall coverage. In an evaluation with a domain expert, APOGEE showed the potential to cut product generation time by 20x. The result is a flexible end-to-end system that can be rapidly deployed in new operational settings.

  18. Collaborative human-machine analysis using a controlled natural language

    NASA Astrophysics Data System (ADS)

    Mott, David H.; Shemanski, Donald R.; Giammanco, Cheryl; Braines, Dave

    2015-05-01

    A key aspect of an analyst's task in providing relevant information from data is the reasoning about the implications of that data, in order to build a picture of the real world situation. This requires human cognition, based upon domain knowledge about individuals, events and environmental conditions. For a computer system to collaborate with an analyst, it must be capable of following a similar reasoning process to that of the analyst. We describe ITA Controlled English (CE), a subset of English used to represent an analyst's domain knowledge and reasoning in a form that is understandable by both analyst and machine. CE can be used to express domain rules, background data, assumptions and inferred conclusions, thus supporting human-machine interaction. A CE reasoning and modeling system can perform inferences from the data and provide the user with conclusions together with their rationale. We present a logical problem called the "Analysis Game", used for training analysts, which presents "analytic pitfalls" inherent in many problems. We explore an iterative approach to its representation in CE, where a person can develop an understanding of the problem solution by incremental construction of relevant concepts and rules. We discuss how such interactions might occur, and propose that such techniques could lead to better collaborative tools to assist the analyst and avoid the "pitfalls".

  19. The Independent Technical Analysis Process Final Report 2006-2007.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duberstein, Corey; Ham, Kenneth; Dauble, Dennis

    2007-03-01

    The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities. The Independent Technical Analysis Process (ITAP) was created to provide non-routine analysis for fish and wildlife agencies and tribes in particular, and the public in general, on matters related to juvenile and adult salmon and steelhead passage through the mainstem hydrosystem. The process was designed to maintain the independence of analysts and reviewers from parties requesting analyses, to avoid potential bias in technical products. The objectives identified for this project were to administer a rigorous, transparent process to deliver unbiased technical assistance necessary to coordinate recommendations for storage reservoir and river operations that avoid potential conflicts between anadromous and resident fish. Seven work elements, designated by numbered categories in the Pisces project tracking system, were created to define and accomplish project goals as follows: (1) 118 Coordination - Coordinate technical analysis and review process: (a) Retain expertise for analyst/reviewer roles. (b) Draft research directives. (c) Send directive to the analyst. (d) Coordinate two independent reviews of the draft report. (e) Ensure reviewer comments are addressed within the final report. (2) 162 Analyze/Interpret Data - Implement the independent aspects of the project. (3) 122 Provide Technical Review - Implement the review process for the analysts. (4) 132 Produce Annual Report - Produce the FY06 annual progress report in Pisces. (5) 161 Disseminate Raw/Summary Data and Results - Post technical products on the ITAP web site. (6) 185 Produce Pisces Status Report - Provide periodic status reports to BPA. (7) 119 Manage and Administer Projects - Project/contract administration.

  20. This art of psychoanalysis. Dreaming undreamt dreams and interrupted cries.

    PubMed

    Ogden, Thomas H

    2004-08-01

    It is the art of psychoanalysis in the making, a process inventing itself as it goes, that is the subject of this paper. The author articulates succinctly how he conceives of psychoanalysis, and offers a detailed clinical illustration. He suggests that each analysand unconsciously (and ambivalently) is seeking help in dreaming his 'night terrors' (his undreamt and undreamable dreams) and his 'nightmares' (his dreams that are interrupted when the pain of the emotional experience being dreamt exceeds his capacity for dreaming). Undreamable dreams are understood as manifestations of psychotic and psychically foreclosed aspects of the personality; interrupted dreams are viewed as reflections of neurotic and other non-psychotic parts of the personality. The analyst's task is to generate conditions that may allow the analysand--with the analyst's participation--to dream the patient's previously undreamable and interrupted dreams. A significant part of the analyst's participation in the patient's dreaming takes the form of the analyst's reverie experience. In the course of this conjoint work of dreaming in the analytic setting, the analyst may get to know the analysand sufficiently well for the analyst to be able to say something that is true to what is occurring at an unconscious level in the analytic relationship. The analyst's use of language contributes significantly to the possibility that the patient will be able to make use of what the analyst has said for purposes of dreaming his own experience, thereby dreaming himself more fully into existence.

  1. How Analysts Cognitively “Connect the Dots”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradel, Lauren; Self, Jessica S.; Endert, Alexander

    2013-06-04

    As analysts attempt to make sense of a collection of documents, such as intelligence analysis reports, they may wish to “connect the dots” between pieces of information that may initially seem unrelated. This process of synthesizing information requires users to make connections between pairs of documents, creating a conceptual story. We conducted a user study to analyze the process by which users connect pairs of documents and how they spatially arrange information. Users created conceptual stories that connected the dots using organizational strategies that ranged in complexity. We propose taxonomies for the cognitive connections and physical structures used when trying to “connect the dots” between two documents. We compared the user-created stories with a data-mining algorithm that constructs chains of documents using co-occurrence metrics. Using the insight gained into the storytelling process, we offer design considerations for the existing data-mining algorithm and corresponding tools to combine the power of data mining with the complex cognitive processing of analysts.
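
    A toy sketch of chaining documents by co-occurrence, not the algorithm evaluated in the study: starting from one document, it repeatedly steps to the unused document with the highest term overlap until the target document becomes the closest continuation. The document texts are invented.

    ```python
    # Greedy "connect the dots": build a chain of documents between two endpoints
    # using Jaccard overlap of their term sets as a simple co-occurrence metric.
    def terms(doc):
        return set(doc.lower().split())

    def overlap(a, b):
        return len(terms(a) & terms(b)) / len(terms(a) | terms(b))

    def connect_the_dots(docs, start, end, max_steps=5):
        chain, current = [start], start
        unused = {k: v for k, v in docs.items() if k not in (start, end)}
        for _ in range(max_steps):
            best_unused = max((overlap(docs[current], v) for v in unused.values()), default=0.0)
            if overlap(docs[current], docs[end]) >= best_unused:
                break  # the endpoint is now the closest continuation
            nxt = max(unused, key=lambda k: overlap(docs[current], docs[k]))
            chain.append(nxt)
            current = nxt
            del unused[nxt]
        return chain + [end]

    docs = {
        "d1": "meeting arranged at the harbor warehouse",
        "d2": "warehouse lease paid by a shell company",
        "d3": "shell company wired funds overseas",
        "d4": "overseas account linked to equipment purchase",
    }
    print(connect_the_dots(docs, "d1", "d4"))  # ['d1', 'd2', 'd3', 'd4']
    ```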

  2. Research on Heterogeneous Data Exchange based on XML

    NASA Astrophysics Data System (ADS)

    Li, Huanqin; Liu, Jinfeng

    Integration of multiple data sources is becoming increasingly important for enterprises that cooperate closely with their partners for e-commerce. OLAP gives analysts and decision makers fast access to various materialized views from data warehouses. However, many corporations have internal business applications deployed on different platforms. This paper introduces a model for heterogeneous data exchange based on XML. The system can exchange and share data among the different sources. The method used to realize the heterogeneous data exchange is given in this paper.
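
    A minimal sketch of the general idea (not the paper's model): records from two differently structured sources are normalized into a shared XML exchange format so that either system can consume the other's data. The tag and field names are illustrative assumptions.

    ```python
    # Normalize heterogeneous customer records into one XML exchange document.
    import xml.etree.ElementTree as ET

    source_a = {"cust_id": "17", "cust_name": "Acme Ltd"}    # e.g. a relational export
    source_b = {"id": "23", "name": "Borealis GmbH"}         # e.g. a legacy application dump

    def to_exchange_xml(record, id_key, name_key):
        customer = ET.Element("customer")
        ET.SubElement(customer, "id").text = record[id_key]
        ET.SubElement(customer, "name").text = record[name_key]
        return customer

    root = ET.Element("customers")
    root.append(to_exchange_xml(source_a, "cust_id", "cust_name"))
    root.append(to_exchange_xml(source_b, "id", "name"))
    print(ET.tostring(root, encoding="unicode"))
    ```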

  3. Space shuttle/payload interface analysis. (Study 2.4) Volume 4: Business Risk and Value of Operations in Space (BRAVO). Part 2: User's manual

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The BRAVO User's Manual is presented which describes the BRAVO methodology in terms of step-by-step procedures, so that it may be used as a tool for a team of analysts performing cost effectiveness analyses on potential future space applications. BRAVO requires a relatively general set of input information and a relatively small expenditure of resources. For Vol. 1, see N74-12493; for Vol. 2, see N74-14530.

  4. Media and Influence

    NASA Astrophysics Data System (ADS)

    Bennett, William H.

    The public information media provides information on current events (news), entertainment (programming), and opinions offered by trusted public sources (e.g., business, academic or religious spokespersons, journalists, and government officials). Consequently, it is a major force in shaping a populace's attitudes toward significant social issues and is of great interest to intervention planners. The chapter attempts to provide modelers and intervention analysts alike with sufficient understanding of media mechanisms and current research that they can begin contributing to, and benefiting from, this important area of study.

  5. A multi-phase network situational awareness cognitive task analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erbacher, Robert; Frincke, Deborah A.; Wong, Pak C.

    The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into making certain that we had feedback from network analysts and managers and understood what their genuine needs are. This article discusses the cognitive task-analysis methodology that we followed to acquire feedback from the analysts. This article also provides the details we acquired from the analysts on their processes, goals, concerns, and the data and metadata that they analyze. Finally, we describe the generation of a novel task-flow diagram representing the activities of the target user base.

  6. Storing and managing information artifacts collected by information analysts using a computing device

    DOEpatents

    Pike, William A; Riensche, Roderick M; Best, Daniel M; Roberts, Ian E; Whyatt, Marie V; Hart, Michelle L; Carr, Norman J; Thomas, James J

    2012-09-18

    Systems and computer-implemented processes for storage and management of information artifacts collected by information analysts using a computing device. The processes and systems can capture a sequence of interactive operation elements that are performed by the information analyst, who is collecting an information artifact from at least one of the plurality of software applications. The information artifact can then be stored together with the interactive operation elements as a snippet on a memory device, which is operably connected to the processor. The snippet comprises a view from an analysis application, data contained in the view, and the sequence of interactive operation elements stored as a provenance representation comprising operation element class, timestamp, and data object attributes for each interactive operation element in the sequence.
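
    A minimal data-structure sketch of the kind of record the abstract describes: a "snippet" bundling the captured view, its data, and a provenance trail of interactive operation elements (class, timestamp, data-object attributes). The field names are illustrative, not the patented schema.

    ```python
    # Store an information artifact together with the operations that produced it.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class OperationElement:
        element_class: str      # e.g. "copy", "highlight", "query"
        timestamp: datetime
        data_attributes: dict   # attributes of the data object acted upon

    @dataclass
    class Snippet:
        view: str               # identifier of the analysis-application view
        data: str               # content captured from that view
        provenance: list = field(default_factory=list)

        def record(self, element_class, **attributes):
            self.provenance.append(
                OperationElement(element_class, datetime.now(timezone.utc), attributes))

    snip = Snippet(view="timeline_view", data="suspicious transfer noted in report 17")
    snip.record("highlight", source="report_17.txt", offset=442)
    print(len(snip.provenance), snip.provenance[0].element_class)
    ```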

  7. RAVE: Rapid Visualization Environment

    NASA Technical Reports Server (NTRS)

    Klumpar, D. M.; Anderson, Kevin; Simoudis, Avangelos

    1994-01-01

    Visualization is used in the process of analyzing large, multidimensional data sets. However, the selection and creation of visualizations that are appropriate for the characteristics of a particular data set and the satisfaction of the analyst's goals are difficult. The process consists of three tasks that are performed iteratively: generate, test, and refine. The performance of these tasks requires the utilization of several types of domain knowledge that data analysts do not often have. Existing visualization systems and frameworks do not adequately support the performance of these tasks. In this paper we present the RApid Visualization Environment (RAVE), a knowledge-based system that interfaces with commercial visualization frameworks and assists a data analyst in quickly and easily generating, testing, and refining visualizations. RAVE was used for the visualization of in situ measurement data captured by spacecraft.

  8. The Role of Teamwork in the Analysis of Big Data: A Study of Visual Analytics and Box Office Prediction.

    PubMed

    Buchanan, Verica; Lu, Yafeng; McNeese, Nathan; Steptoe, Michael; Maciejewski, Ross; Cooke, Nancy

    2017-03-01

    Historically, domains such as business intelligence would require a single analyst to engage with data, develop a model, answer operational questions, and predict future behaviors. However, as the problems and domains become more complex, organizations are employing teams of analysts to explore and model data to generate knowledge. Furthermore, given the rapid increase in data collection, organizations are struggling to develop practices for intelligence analysis in the era of big data. Currently, a variety of machine learning and data mining techniques are available to model data and to generate insights and predictions, and developments in the field of visual analytics have focused on how to effectively link data mining algorithms with interactive visuals to enable analysts to explore, understand, and interact with data and data models. Although studies have explored the role of single analysts in the visual analytics pipeline, little work has explored the role of teamwork and visual analytics in the analysis of big data. In this article, we present an experiment integrating statistical models, visual analytics techniques, and user experiments to study the role of teamwork in predictive analytics. We frame our experiment around the analysis of social media data for box office prediction problems and compare the prediction performance of teams, groups, and individuals. Our results indicate that a team's performance is mediated by the team's characteristics such as openness of individual members to others' positions and the type of planning that goes into the team's analysis. These findings have important implications for how organizations should create teams in order to make effective use of information from their analytic models.

  9. Collaborative human-machine analysis to disambiguate entities in unstructured text and structured datasets

    NASA Astrophysics Data System (ADS)

    Davenport, Jack H.

    2016-05-01

    Intelligence analysts demand rapid information fusion capabilities to develop and maintain accurate situational awareness and understanding of dynamic enemy threats in asymmetric military operations. The ability to extract relationships between people, groups, and locations from a variety of text datasets is critical to proactive decision making. The derived network of entities must be automatically created and presented to analysts to assist in decision making. DECISIVE ANALYTICS Corporation (DAC) provides capabilities to automatically extract entities, relationships between entities, semantic concepts about entities, and network models of entities from text and multi-source datasets. DAC's Natural Language Processing (NLP) Entity Analytics model entities as complex systems of attributes and interrelationships which are extracted from unstructured text via NLP algorithms. The extracted entities are automatically disambiguated via machine learning algorithms, and resolution recommendations are presented to the analyst for validation; the analyst's expertise is leveraged in this hybrid human/computer collaborative model. Military capability is enhanced by these NLP Entity Analytics because analysts can now create/update an entity profile with intelligence automatically extracted from unstructured text, thereby fusing entity knowledge from structured and unstructured data sources. Operational and sustainment costs are reduced since analysts do not have to manually tag and resolve entities.

  10. Improving sensor data analysis through diverse data source integration

    NASA Astrophysics Data System (ADS)

    Casper, Jennifer; Albuquerque, Ronald; Hyland, Jeremy; Leveille, Peter; Hu, Jing; Cheung, Eddy; Mauer, Dan; Couture, Ronald; Lai, Barry

    2009-05-01

    Daily sensor data volumes are increasing from gigabytes to multiple terabytes. The manpower and resources needed to analyze the increasing amount of data are not growing at the same rate. Current volumes of diverse data, both live streaming and historical, are not fully analyzed. Analysts are left mostly to analyze the individual data sources manually. This is both time consuming and mentally exhausting. Expanding data collections only exacerbate this problem. Improved data management techniques and analysis methods are required to process the increasing volumes of historical and live streaming data sources simultaneously. Improved techniques are needed to reduce an analyst's decision response time and to enable more intelligent and immediate situation awareness. This paper describes the Sensor Data and Analysis Framework (SDAF) system, built to provide analysts with the ability to pose integrated queries on diverse live and historical data sources and to plug in needed algorithms for upstream processing and filtering. The SDAF system was inspired by input and feedback from field analysts and experts. This paper presents SDAF's capabilities, implementation, and the reasoning behind implementation decisions. Finally, lessons learned from preliminary tests and deployments are captured for future work.

  11. Counter-responses as organizers in adolescent analysis and therapy.

    PubMed

    Richmond, M Barrie

    2004-01-01

    The author introduces Counter-response as a phenomenological term to replace theory-burdened terms like counter-transference, counter-identification, and counter-resistance. He discusses the analyst's use of self (drawing on the comparison with Winnicott's use of the object) in processing the expectable destabilizing counter-reactions that occur in working therapeutically with disturbed adolescents and their parents. Further, he discusses the counter-reaction to the patient's narrative and acting-out, and how re-enactments can serve as an organizer for understanding the patient's inner life when the analyst formulates his/her counter-response. Emphasis is placed on the therapist forming his or her own narrative with the adolescent that takes into account the evoked counter-reaction. For this purpose, the author recommends the use of a combined counter-response and metaphor-orienting perspective to acknowledge and work with the denial, illusions, reversal of perspective, and catastrophic anxieties experienced with these adolescents. The counter-response perspective permits the emergence of the disturbed adolescent's novel narrative; however, since these experiences can be destabilizing or disruptive, the author also recommends the use of a personal metaphor to anticipate the reluctance to examine, process, and formulate the analyst's dysphoric counter-reaction. With the use of the counter-response, the analyst's therapeutic ideal is to achieve a more optimal balance between using accepted narrative theories and exploring novel enactment experiences. His swimming metaphor stratagem is designed to keep the analyst in these difficult encounters.

  12. Prerequisites for Systems Analysts: Analytic and Management Demands of a New Approach to Educational Administration.

    ERIC Educational Resources Information Center

    Ammentorp, William

    There is much to be gained by using systems analysis in educational administration. Most administrators, presently relying on classical statistical techniques restricted to problems having few variables, should be trained to use more sophisticated tools such as systems analysis. The systems analyst, interested in the basic processes of a group or…

  13. Four rules for taking your message to Wall Street.

    PubMed

    Hutton, A

    2001-05-01

    Managers fail to communicate effectively with Wall Street for all sorts of reasons. But neglecting the investment community--particularly the analysts whose opinions shape the market and whose recommendations often make or break a company's share price--can knock the most carefully conceived and brilliantly executed strategy off course. The companies that struggle the most with providing good information to analysts are those in rapidly evolving industries, where the gap between traditional performance metrics and economic realities is at its widest. In these industries, a company's strategy and the variables that govern its performance can change radically in a short time. What's more, the metrics used to report performance often fail to capture the drivers of value in today's information economy. Few accounting measures are helpful when it comes to assessing the intangible assets--knowledge, skilled employees, and so forth--on which many of today's fastest-growing companies build their strategies. According to Amy Hutton, an associate professor at Harvard Business School, there are four basic rules for clear communications with Wall Street. First, make sure that your company's financial reporting reflects your strategy as closely as possible. Second, popularize the nonfinancial metrics that best predict--and flatter--the performance of your businesses. Third, appoint managers with recognized credibility to your strategic operations. Finally, cultivate the market experts who cover the industries in which you seek to compete. Hutton shows how AOL successfully followed these rules as it significantly changed its strategic direction and competitive arena.

  14. Cost approach of health care entity intangible asset valuation.

    PubMed

    Reilly, Robert F

    2012-01-01

    In the valuation synthesis and conclusion process, the analyst should consider the following question: Do the selected valuation approach(es) and method(s) accomplish the analyst's assignment? Also, does the selected valuation approach and method actually quantify the desired objective of the intangible asset analysis? The analyst should also consider if the selected valuation approach and method analyzes the appropriate bundle of legal rights. The analyst should consider if there were sufficient empirical data available to perform the selected valuation approach and method. The valuation synthesis should consider if there were sufficient data available to make the analyst comfortable with the value conclusion. The valuation analyst should consider if the selected approach and method will be understandable to the intended audience. In the valuation synthesis and conclusion, the analyst should also consider which approaches and methods deserve the greatest consideration with respect to the intangible asset's RUL. The intangible asset RUL is a consideration of each valuation approach. In the income approach, the RUL may affect the projection period for the intangible asset income subject to either yield capitalization or direct capitalization. In the cost approach, the RUL may affect the total amount of obsolescence, if any, from the estimated cost measure (that is, the intangible reproduction cost new or replacement cost new). In the market approach, the RUL may affect the selection, rejection, and/or adjustment of the comparable or guideline intangible asset sale and license transactional data. The experienced valuation analyst will use professional judgment to weight the various value indications to conclude a final intangible asset value, based on: the analyst's confidence in the quantity and quality of available data; the analyst's level of due diligence performed on that data; the relevance of the valuation method to the intangible asset life cycle stage and degree of marketability; and the degree of variation in the range of value indications. Valuation analysts value health care intangible assets for a number of reasons. In addition to regulatory compliance reasons, these reasons include various transaction, taxation, financing, litigation, accounting, bankruptcy, and planning purposes. The valuation analyst should consider all generally accepted intangible asset valuation approaches, methods, and procedures. Many valuation analysts are more familiar with market approach and income approach valuation methods. However, there are numerous instances when cost approach valuation methods are also applicable to the health care intangible asset valuation. This discussion summarized the analyst's procedures and considerations with regard to the cost approach. The cost approach is often applicable to the valuation of intangible assets in the health care industry. However, the cost approach is only applicable if the valuation analyst (1) appropriately considers all of the cost components and (2) appropriately identifies and quantifies all obsolescence allowances. Regardless of the health care intangible asset or the reason for the valuation, the analyst should be familiar with all generally accepted valuation approaches and methods. And the valuation analyst should have a clear, convincing, and cogent rationale for (1) accepting each approach and method applied and (2) rejecting each approach and method not applied. That way, the valuation analyst will best achieve the purpose and objective of the health care intangible asset valuation.
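
    The weighting step at the end of the synthesis can be pictured with a small numeric sketch; the value indications and weights below are invented for illustration and are not a valuation opinion.

    ```python
    # Blend value indications from the three approaches using judgment-based weights.
    indications = {"cost": 4_200_000, "market": 4_800_000, "income": 5_100_000}
    weights     = {"cost": 0.2,       "market": 0.3,       "income": 0.5}

    concluded_value = sum(indications[a] * weights[a] for a in indications)
    print(f"concluded intangible asset value: ${concluded_value:,.0f}")  # $4,830,000
    ```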

  15. A Graph is Worth a Thousand Words: How Overconfidence and Graphical Disclosure of Numerical Information Influence Financial Analysts Accuracy on Decision Making.

    PubMed

    Cardoso, Ricardo Lopes; Leite, Rodrigo Oliveira; de Aquino, André Carlos Busanelli

    2016-01-01

    Previous research supports that graphs are relevant decision aids for tasks related to the interpretation of numerical information. Moreover, the literature shows that different types of graphical information can help or harm the accuracy of decision making by accountants and financial analysts. We conducted a 4×2 mixed-design experiment to examine the effects of numerical information disclosure on financial analysts' accuracy, and investigated the role of overconfidence in decision making. Results show that, compared to text, column graphs enhanced accuracy in decision making, followed by line graphs. No difference was found between table and textual disclosure. Overconfidence harmed accuracy, and both genders behaved overconfidently. Additionally, the type of disclosure (text, table, line graph and column graph) did not affect the overconfidence of individuals, providing evidence that overconfidence is a personal trait. This study makes three contributions. First, it provides evidence from a larger sample (295 financial analysts, rather than a smaller sample of students) that graphs are relevant decision aids for tasks related to the interpretation of numerical information. Second, it uses text as a baseline comparison to test how different forms of information disclosure (line and column graphs, and tables) can enhance understandability of information. Third, it brings an internal factor to this process: overconfidence, a personal trait that harms the decision-making process of individuals. At the end of this paper several research paths are highlighted to further study the effect of internal factors (personal traits) on financial analysts' accuracy in decision making regarding numerical information presented in graphical form. In addition, we offer suggestions concerning some practical implications for professional accountants, auditors, financial analysts and standard setters.

  16. From somatic pain to psychic pain: The body in the psychoanalytic field.

    PubMed

    Hartung, Thomas; Steinbrecher, Michael

    2017-03-24

    The integration of psyche and soma begins with a baby's earliest contact with his or her parents. With the help of maternal empathy and reverie, β-elements are transformed into α-elements. While we understand this to be the case, we would like to enquire what actually happens to those parts of the affect which have not been transformed. For the most part they may be dealt with by evacuation, but they can also remain within the body, subsequently contributing to psychosomatic symptoms. This paper describes how the body serves as an intermediate store between the psychic (inner) and outer reality. The authors focus on the unconscious communicative process between the analyst and the analysand, and in particular on how psychosomatic symptoms can spread to the analyst's body. The latter may become sensitive to the analysand's psychosomatic symptoms in order to better understand the psychoanalytical process. Sensory processes (visual and auditory) and psychic mechanisms such as projective identification can serve as a means for this communication. One of the first analysts to deal with this topic was Wilhelm Reich. He described one kind of psychosomatic defence as being like a shell, the character armour, comparing the armour formed by muscle tension with another, more psychical type of armour. This concept can be linked to Winnicott's contribution of the false self and later on to Feldman's concept of compliance as a defence. The authors link further details of the clinical material with theoretical concepts from Joyce McDougall, Piera Aulagnier, and Ricardo Rodulfo and Marilia Aisenstein. With the aid of the complex concept of projective identification, as described by Heinz Weiss, the authors discuss the important question of how the analyst gets in touch with the patient's current psychosomatic state, and describe a specific communication between the body of the psychoanalyst and the body of the patient. A vignette illustrates in greater detail the relationship between this theoretical understanding and an actual clinical example. In the session described, the analyst reacts to the patient with an intense body-countertransference, taking on the patient's symptoms for a short time. The patient, who had been unable to integrate psyche and soma (whose psyche did not indwell (Winnicott) in his body), projected the untransformed β-elements into his body, where they emerged as bodily symptoms. The body became a kind of intermediate store between inner and outer reality. By internalizing the patient's symptoms in his own body, the analyst created a bodily communication - something in between concerning the inner and the outer reality of both participants of the analytic dyad. The analyst was able to recognize his psychosomatic experience as the fear of dying, and to work through his bodily countertransference. This is described in detail. The emerging understanding of the countertransference helped the analyst to contribute to the patient's process of transforming his symptoms. The analyst was able to help the patient get in touch emotionally with many traumatic situations experienced during his life. The function of the psychosomatic symptoms was to contain the patient's fear of death. These frightening feelings could now be worked through on a psychical level; they could enter into a process of symbol formation so that the psychosomatic symptoms were no longer necessary and disappeared. Copyright © 2017 Institute of Psychoanalysis.

  17. Automatic theory generation from analyst text files using coherence networks

    NASA Astrophysics Data System (ADS)

    Shaffer, Steven C.

    2014-05-01

    This paper describes a three-phase process of extracting knowledge from analyst textual reports. Phase 1 involves performing natural language processing on the source text to extract subject-predicate-object triples. In phase 2, these triples are fed into a coherence network analysis process, using a genetic algorithm optimization. Finally, the highest-value sub-networks are processed into a semantic network graph for display. Initial work on a well-known data set (a Wikipedia article on Abraham Lincoln) has shown excellent results without any specific tuning. Next, we ran the process on the SYNthetic Counter-INsurgency (SYNCOIN) data set, developed at Penn State, yielding interesting and potentially useful results.
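
    A minimal sketch of phases 1 and 3 under simplifying assumptions: the subject-predicate-object triples are taken as given (rather than extracted by NLP), and the genetic-algorithm coherence scoring of phase 2 is replaced by a simple degree threshold. The triples are illustrative.

    ```python
    # Assemble pre-extracted triples into a semantic network and keep the most
    # connected nodes as a stand-in for the "highest-value sub-networks".
    import networkx as nx

    triples = [
        ("Lincoln", "born_in", "Kentucky"),
        ("Lincoln", "elected", "President"),
        ("Lincoln", "issued", "Emancipation Proclamation"),
        ("Kentucky", "part_of", "United States"),
    ]

    G = nx.DiGraph()
    for subj, pred, obj in triples:
        G.add_edge(subj, obj, label=pred)

    core = [n for n, d in G.degree() if d >= 2]
    print("core nodes:", core)
    print("edges:", list(G.subgraph(core).edges(data="label")))
    ```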

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bogen, Paul Logasa; McKenzie, Amber T; Gillen, Rob

    Forensic document analysis has become an important aspect of the investigation of many different kinds of crimes, from money laundering to fraud and from cybercrime to smuggling. The current workflow for analysts includes powerful tools, such as Palantir and Analyst's Notebook, for moving from evidence to actionable intelligence, and tools for finding documents among the millions of files on a hard disk, such as FTK. However, analysts often leave the process of sorting through collections of seized documents, to filter out the noise from the actual evidence, to a highly labor-intensive manual effort. This paper presents the Redeye Analysis Workbench, a tool to help analysts move from manual sorting of a collection of documents to performing intelligent document triage over a digital library. We discuss the tools and techniques we build upon, in addition to providing an in-depth discussion of our tool and how it addresses two major use cases we observed analysts performing. Finally, we also include a new layout algorithm for radial graphs that is used to visualize clusters of documents in our system.

  19. Conserving analyst attention units: use of multi-agent software and CEP methods to assist information analysis

    NASA Astrophysics Data System (ADS)

    Rimland, Jeffrey; McNeese, Michael; Hall, David

    2013-05-01

    Although the capability of computer-based artificial intelligence techniques for decision-making and situational awareness has seen notable improvement over the last several decades, the current state of the art still falls short of creating computer systems capable of autonomously making complex decisions and judgments in many domains where data is nuanced and accountability is high. However, there is a great deal of potential for hybrid systems in which software applications augment human capabilities by focusing the analyst's attention on relevant information elements, based on both a priori knowledge of the analyst's goals and the processing/correlation of a series of data streams too numerous and heterogeneous for the analyst to digest without assistance. Researchers at Penn State University are exploring ways in which an information framework influenced by Klein's Recognition-Primed Decision (RPD) model, Endsley's model of situational awareness, and the Joint Directors of Laboratories (JDL) data fusion process model can be implemented through a novel combination of Complex Event Processing (CEP) and Multi-Agent Software (MAS). Though originally designed for stock market and financial applications, the high-performance, data-driven nature of CEP techniques provides a natural complement to the proven capabilities of MAS systems for modeling naturalistic decision-making, performing process adjudication, and optimizing networked processing and cognition via the use of "mobile agents." This paper addresses the challenges and opportunities of such a framework for augmenting human observational capability as well as enabling collaborative context-aware reasoning in both human teams and hybrid human/software-agent teams.

  20. On dreaming one's patient: reflections on an aspect of countertransference dreams.

    PubMed

    Brown, Lawrence J

    2007-07-01

    This paper explores the phenomenon of the countertransference dream. Until very recently, such dreams have tended to be seen as reflecting either unanalyzed difficulties in the analyst or unexamined conflicts in the analytic relationship. While the analyst's dream of his/her patient may represent such problems, the author argues that such dreams may also indicate the ways in which the analyst comes to know the patient on a deep, unconscious level by processing the patient's communicative projective identifications. Two extended clinical examples of the author's countertransference dreams are offered. The author also discusses the use of countertransference dreams in psychoanalytic supervision.

  1. The Pacor 2 expert system: A case-based reasoning approach to troubleshooting

    NASA Technical Reports Server (NTRS)

    Sary, Charisse

    1994-01-01

    The Packet Processor 2 (Pacor 2) Data Capture Facility (DCF) acquires, captures, and performs level-zero processing of packet telemetry for spaceflight missions that adhere to communication services recommendations established by the Consultative Committee for Space Data Systems (CCSDS). A major goal of this project is to reduce life-cycle costs. One way to achieve this goal is to increase automation. Through automation using expert systems and other technologies, staffing requirements will remain static, which will enable the same number of analysts to support more missions. Analysts provide packet telemetry data evaluation and analysis services for all data received. Data that passes this evaluation is forwarded to the Data Distribution Facility (DDF) and released to scientists. Through troubleshooting, data that fails this evaluation is dumped and analyzed to determine if its quality can be improved before it is released. This paper describes a proof-of-concept prototype that troubleshoots data quality problems. The Pacor 2 expert system prototype uses the case-based reasoning (CBR) approach to development, an alternative to a rule-based approach. Because Pacor 2 is not operational, the prototype has been developed using cases that describe existing troubleshooting experience from currently operating missions. Through CBR, this experience will be available to analysts when Pacor 2 becomes operational. As experience unique to Pacor 2 is gained, analysts will update the case base. In essence, analysts are training the system as they learn. Once the system has learned the cases most likely to recur, it can serve as an aid to inexperienced analysts, a refresher for experienced analysts facing infrequently occurring problems, or a training tool for new analysts. The Expert System Development Methodology (ESDM) is being used to guide development.
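
    A toy illustration of the case-based reasoning idea described above, not the Pacor 2 prototype itself: past troubleshooting cases are stored with symptom features, and the closest case to a new data-quality problem is retrieved by set similarity. The cases are invented.

    ```python
    # Retrieve the stored troubleshooting case most similar to a new problem.
    case_base = [
        {"symptoms": {"crc_errors", "frame_gaps"}, "fix": "request retransmission of the affected frames"},
        {"symptoms": {"time_regression"},          "fix": "re-run level-zero processing with a corrected clock file"},
        {"symptoms": {"crc_errors", "low_signal"}, "fix": "flag the pass for a ground-station check"},
    ]

    def retrieve(symptoms):
        def score(case):
            s = case["symptoms"]
            return len(s & symptoms) / len(s | symptoms)  # Jaccard similarity
        return max(case_base, key=score)

    new_problem = {"crc_errors", "frame_gaps", "low_signal"}
    print(retrieve(new_problem)["fix"])
    ```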

  2. Multi-Instance Learning Models for Automated Support of Analysts in Simulated Surveillance Environments

    NASA Technical Reports Server (NTRS)

    Birisan, Mihnea; Beling, Peter

    2011-01-01

    New generations of surveillance drones are being outfitted with numerous high definition cameras. The rapid proliferation of fielded sensors and supporting capacity for processing and displaying data will translate into ever more capable platforms, but with increased capability comes increased complexity and scale that may diminish the usefulness of such platforms to human operators. We investigate methods for alleviating strain on analysts by automatically retrieving content specific to their current task using a machine learning technique known as Multi-Instance Learning (MIL). We use MIL to create a real time model of the analysts' task and subsequently use the model to dynamically retrieve relevant content. This paper presents results from a pilot experiment in which a computer agent is assigned analyst tasks such as identifying caravanning vehicles in a simulated vehicle traffic environment. We compare agent performance between MIL aided trials and unaided trials.
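
    A minimal sketch of the standard multi-instance learning assumption (a bag is positive if at least one instance in it is positive), not the model trained in the study: a hand-set instance scorer stands in for a learned one, purely to show how bag-level labels derive from instance-level evidence.

    ```python
    # Label a "bag" (one task window of observed vehicles) by the max instance score.
    def instance_score(vehicle):
        # Assumed toy scorer: closely spaced vehicles on the same heading look caravan-like.
        return 1.0 if vehicle["gap_m"] < 50 and vehicle["same_heading"] else 0.0

    def bag_label(bag, threshold=0.5):
        return max(instance_score(v) for v in bag) >= threshold

    frame = [
        {"gap_m": 220, "same_heading": False},
        {"gap_m": 35,  "same_heading": True},
    ]
    print("caravan activity in frame:", bag_label(frame))  # True
    ```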

  3. Presentation of the results of a Bayesian automatic event detection and localization program to human analysts

    NASA Astrophysics Data System (ADS)

    Kushida, N.; Kebede, F.; Feitio, P.; Le Bras, R.

    2016-12-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing and testing NET-VISA (Arora et al., 2013), a Bayesian automatic event detection and localization program, and evaluating its performance in a realistic operational mode. In our preliminary testing at the CTBTO, NET-VISA shows better performance than its currently operating automatic localization program. However, given CTBTO's role and its international context, a new technology should be introduced cautiously when it replaces a key piece of the automatic processing. We integrated the results of NET-VISA into the Analyst Review Station, extensively used by the analysts, so that they can check the accuracy and robustness of the Bayesian approach. We expect the workload of the analysts to be reduced because of the better performance of NET-VISA in finding missed events and obtaining a more complete set of stations than the current system, which has been operating for nearly twenty years. The results of a series of tests indicate that the expectations arising from the automatic tests, which show an overall overlap improvement of 11%, meaning that the missed-events rate is cut by 42%, hold for the integrated interactive module as well. New events are found by analysts, which qualify for the CTBTO Reviewed Event Bulletin, beyond the ones analyzed through the standard procedures. Arora, N., Russell, S., and Sudderth, E., NET-VISA: Network Processing Vertically Integrated Seismic Analysis, 2013, Bull. Seismol. Soc. Am., 103, 709-729.
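
    The quoted relationship between the overlap gain and the cut in missed events can be reproduced with simple arithmetic, assuming the 11% is an 11-percentage-point gain and taking a baseline overlap of about 74% (the baseline is assumed here, not stated in the abstract).

    ```python
    # Illustrative arithmetic only: an 11-point overlap gain over an assumed 74%
    # baseline corresponds to roughly the quoted 42% reduction in missed events.
    baseline_overlap = 0.74               # assumed fraction of reviewed events matched automatically
    improved_overlap = baseline_overlap + 0.11
    missed_before = 1 - baseline_overlap  # 0.26
    missed_after = 1 - improved_overlap   # 0.15
    print(f"missed-event reduction: {(missed_before - missed_after) / missed_before:.0%}")  # 42%
    ```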

  4. Learning patterns of life from intelligence analyst chat

    NASA Astrophysics Data System (ADS)

    Schneider, Michael K.; Alford, Mark; Babko-Malaya, Olga; Blasch, Erik; Chen, Lingji; Crespi, Valentino; HandUber, Jason; Haney, Phil; Nagy, Jim; Richman, Mike; Von Pless, Gregory; Zhu, Howie; Rhodes, Bradley J.

    2016-05-01

    Our Multi-INT Data Association Tool (MIDAT) learns patterns of life (POL) of a geographical area from video analyst observations called out in textual reporting. Typical approaches to learning POLs from video make use of computer vision algorithms to extract locations in space and time of various activities. Such approaches are subject to the detection and tracking performance of the video processing algorithms. Numerous examples of human analysts monitoring live video streams annotating or "calling out" relevant entities and activities exist, such as security analysis, crime-scene forensics, news reports, and sports commentary. This user description typically corresponds with textual capture, such as chat. Although the purpose of these text products is primarily to describe events as they happen, organizations typically archive the reports for extended periods. This archive provides a basis to build POLs. Such POLs are useful for diagnosis to assess activities in an area based on historical context, and for consumers of products, who gain an understanding of historical patterns. MIDAT combines natural language processing, multi-hypothesis tracking, and Multi-INT Activity Pattern Learning and Exploitation (MAPLE) technologies in an end-to-end lab prototype that processes textual products produced by video analysts, infers POLs, and highlights anomalies relative to those POLs with links to "tracks" of related activities performed by the same entity. MIDAT technologies perform well, achieving, for example, a 90% F1-value on extracting activities from the textual reports.

  5. Shuttle user analysis (study 2.2). Volume 3: Business risk and value of operations in space (BRAVO). Part 2: User's manual

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The purpose of the BRAVO User's Manual is to describe the BRAVO methodology in terms of step-by-step procedures. The BRAVO methodology then becomes a tool which a team of analysts can utilize to perform cost effectiveness analyses on potential future space applications with a relatively general set of input information and a relatively small expenditure of resources. An overview of the BRAVO procedure is given by describing the complete procedure in a general form.

  6. A psychoanalytical phenomenology of perversion.

    PubMed

    Jiménez, Juan Pablo

    2004-02-01

    After stating that the current tasks of psychoanalytic research should fundamentally include the exploration of the analyst's mental processes in sessions with the patient, the author describes the analytical relation as one having an intersubjective nature. Seen from the outside, the analytical relation evidences two poles: a symmetric structural pole where both analyst and patient share a single world and a single approach to reality, and a functional asymmetric pole that defines the assignment of the respective roles. In the analysis of a perverse patient, the symmetry-asymmetry polarities acquire some very particular characteristics. Seen from the perspective of the analyst's subjectivity, perversion appears in the analyst's mind as a surreptitious and unexpected transgression of the basic agreement that facilitates and structures intersubjective encounters. It may go as far as altering the Aristotelian rules of logic. When coming into contact with the psychic reality of a perverse patient, what happens in the analyst's mind is that a world takes shape. This world is misleadingly coloured by an erotisation that sooner or later will acquire some characteristics of violence. The perverse nucleus, as a false reality, remains dangling in mid-air as an experience that is inaccessible to the analyst's empathy. The only way the analyst can reach it is from the 'periphery' of the patient's psychic reality, by trying in an indirect way to lead him back to his intersubjective roots. At this point, the author's intention is to explain this intersubjective phenomenon in terms of metapsychological and empirical research-based theories. Finally, some ideas on the psychogenesis of perversion are set forth.

  7. The reality of the other: dreaming of the analyst.

    PubMed

    Ferruta, Anna

    2009-02-01

    The author discusses the obstacles to symbolization encountered when the analyst appears in the first dream of an analysis: the reality of the other is represented through the seeming recognition of the person of the analyst, who is portrayed in undisguised form. The interpretation of this first dream gives rise to reflections on the meaning of the other's reality in analysis: precisely this realistic representation indicates that the function of the other in the construction of the psychic world has been abolished. An analogous phenomenon is observed in the countertransference, as the analyst's mental processes are occluded by an exclusively self-generated interpretation of the patient's psychic world. For the analyst too, the reality of the other proves not to play a significant part in the construction of her interpretation. A 'turning-point' dream after five years bears witness to the power of the transforming function performed by the other throughout the analysis, by way of the representation of characters who stand for the necessary presence of a third party in the construction of a personal psychic reality. The author examines the mutual denial of the other's otherness, as expressed by the vicissitudes of the transference and countertransference between analyst and patient, otherness being experienced as a disturbance of self-sufficient narcissistic functioning. The paper ends with an analysis of the transformations that took place in the analytic relationship.

  8. On becoming a psychoanalyst.

    PubMed

    Gabbard, Glen O; Ogden, Thomas H

    2009-04-01

    One has the opportunity and responsibility to become an analyst in one's own terms in the course of the years of practice that follow the completion of formal analytic training. The authors discuss their understanding of some of the maturational experiences that have contributed to their becoming analysts in their own terms. They believe that the most important element in the process of their maturation as analysts has been the development of the capacity to make use of what is unique and idiosyncratic to each of them; each, when at his best, conducts himself as an analyst in a way that reflects his own analytic style; his own way of being with, and talking with, his patients; his own form of the practice of psychoanalysis. The types of maturational experiences that the authors examine include situations in which they have learned to listen to themselves speak with their patients and, in so doing, begin to develop a voice of their own; experiences of growth that have occurred in the context of presenting clinical material to a consultant; making self-analytic use of their experience with their patients; creating/discovering themselves as analysts in the experience of analytic writing (with particular attention paid to the maturational experience involved in writing the current paper); and responding to a need to keep changing, to be original in their thinking and behavior as analysts.

  9. Analytic process and dreaming about analysis.

    PubMed

    Sirois, François

    2016-12-01

    Dreams about the analytic session feature a manifest content in which the analytic setting is subject to distortion while the analyst appears undisguised. Such dreams are a consistent yet infrequent occurrence in most analyses. Their specificity consists in never reproducing the material conditions of the analysis as such. This paper puts forward the following hypothesis: dreams about the session relate to some aspects of the analyst's activity. In this sense, such dreams are indicative of the transference neurosis, prefiguring transference resistances to the analytic elaboration of key conflicts. The parts taken by the patient and by the analyst are discussed in terms of their ability to signal a deepening of the analysis. Copyright © 2016 Institute of Psychoanalysis.

  10. SnapShot: Visualization to Propel Ice Hockey Analytics.

    PubMed

    Pileggi, H; Stolper, C D; Boyle, J M; Stasko, J T

    2012-12-01

    Sports analysts live in a world of dynamic games flattened into tables of numbers, divorced from the rinks, pitches, and courts where they were generated. Currently, these professional analysts use R, Stata, SAS, and other statistical software packages for uncovering insights from game data. Quantitative sports consultants seek a competitive advantage both for their clients and for themselves as analytics becomes increasingly valued by teams, clubs, and squads. In order for the information visualization community to support the members of this blossoming industry, it must recognize where and how visualization can enhance the existing analytical workflow. In this paper, we identify three primary stages of today's sports analyst's routine where visualization can be beneficially integrated: 1) exploring a dataspace; 2) sharing hypotheses with internal colleagues; and 3) communicating findings to stakeholders. Working closely with professional ice hockey analysts, we designed and built SnapShot, a system to integrate visualization into the hockey intelligence gathering process. SnapShot employs a variety of information visualization techniques to display shot data, yet given the importance of a specific hockey statistic, shot length, we introduce a technique, the radial heat map. Through a user study, we received encouraging feedback from several professional analysts, both independent consultants and professional team personnel.
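
    A toy sketch of the radial-heat-map idea (binning shots by length and angle from the goal and coloring cells by frequency); the coordinate convention, bin sizes, and random data are assumptions for illustration, not the SnapShot implementation:

        # Sketch: bin shot locations into a radial heat map (shot length x angle bins).
        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(0)
        x, y = rng.normal(0, 20, 500), rng.uniform(0, 60, 500)   # toy shot coordinates, goal at origin
        r = np.hypot(x, y)                                        # shot length
        theta = np.arctan2(y, x)                                  # shot angle

        r_edges = np.linspace(0, 60, 7)
        t_edges = np.linspace(0, np.pi, 13)
        counts, _, _ = np.histogram2d(theta, r, bins=[t_edges, r_edges])

        ax = plt.subplot(projection="polar")
        ax.pcolormesh(t_edges, r_edges, counts.T, cmap="Reds")    # radial heat map of shot frequency
        ax.set_title("Shots by length and angle (toy data)")
        plt.show()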

  11. OpinionFlow: Visual Analysis of Opinion Diffusion on Social Media.

    PubMed

    Wu, Yingcai; Liu, Shixia; Yan, Kai; Liu, Mengchen; Wu, Fangzhao

    2014-12-01

    It is important for many different applications such as government and business intelligence to analyze and explore the diffusion of public opinions on social media. However, the rapid propagation and great diversity of public opinions on social media pose great challenges to effective analysis of opinion diffusion. In this paper, we introduce a visual analysis system called OpinionFlow to empower analysts to detect opinion propagation patterns and glean insights. Inspired by the information diffusion model and the theory of selective exposure, we develop an opinion diffusion model to approximate opinion propagation among Twitter users. Accordingly, we design an opinion flow visualization that combines a Sankey graph with a tailored density map in one view to visually convey diffusion of opinions among many users. A stacked tree is used to allow analysts to select topics of interest at different levels. The stacked tree is synchronized with the opinion flow visualization to help users examine and compare diffusion patterns across topics. Experiments and case studies on Twitter data demonstrate the effectiveness and usability of OpinionFlow.
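
    A toy sketch in the spirit of the selective-exposure-inspired diffusion model described above (a user is more likely to engage with, and drift toward, opinions close to their own); the graph, adoption rule, and parameters are illustrative assumptions only, not the OpinionFlow model:

        # Sketch: toy opinion diffusion with selective exposure on a follower graph.
        import math
        import random

        random.seed(1)
        opinions = {u: random.uniform(-1, 1) for u in range(20)}        # opinion score in [-1, 1]
        reads = {u: random.sample(range(20), 3) for u in range(20)}     # whose posts each user reads

        def step(opinions, reads, sharpness=2.0, rate=0.3):
            new = dict(opinions)
            for u, sources in reads.items():
                for v in sources:
                    # Selective exposure: the closer v's opinion is to u's, the more likely u engages.
                    if random.random() < math.exp(-sharpness * abs(opinions[u] - opinions[v])):
                        new[u] += rate * (opinions[v] - opinions[u])
            return new

        for _ in range(10):
            opinions = step(opinions, reads)
        print(sorted(round(o, 2) for o in opinions.values()))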

  12. The Insider Threat to Cybersecurity: How Group Process and Ignorance Affect Analyst Accuracy and Promptitude

    DTIC Science & Technology

    2017-09-01

    meta-analytic review and theoretical integration. Journal of Personality and Social Psychology, 65(4), 681. Karr-Wisniewski, P., & Lu, Y. (2010)... dissertation applies attribution theory, a product of cognitive psychology, to evaluate how analysts collectively and individually make attributions in... Likewise, many researchers agree that anomaly detection is an integral component for insider threat analysis (Brdiczka, Liu, Price, Shen, Patil, Chow

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Air Products and Chemicals (AP) is looking for buyers for its 50% stake in American Ref-Fuel, its waste-to-energy venture with Browning-Ferris. The company also announced plans to buy back 10% of its stock and says it will use the proceeds of the divestiture to partially fund the repurchase. The company declines to estimate the value of its stake in American Ref-Fuel, which runs four waste-to-energy plants. AP says the business is profitable but that it wants to focus on its industrial gases and chemical businesses. Meanwhile, AP CEO H.A. Wagner issued a caution over the company's second-quarter results, due out later this month. "While we expect our financial performance in the second quarter to be in the range of current analyst estimates, operating income in our gases segment remains at a level similar to that of the most recent quarter. Improving results in this area continues to be management's top priority," he says. Operating income for the gases business was down 5% in the first quarter compared with the year earlier.

  14. Critical product features' identification using an opinion analyzer.

    PubMed

    Shamim, Azra; Balakrishnan, Vimala; Tahir, Muhammad; Shiraz, Muhammad

    2014-01-01

    The increasing use and ubiquity of the Internet facilitate dissemination of word-of-mouth through blogs, online forums, newsgroups, and consumers' reviews. Online consumers' reviews present tremendous opportunities and challenges for consumers and marketers. One of the challenges is to develop interactive marketing practices for making connections with target consumers that capitalize on consumer-to-consumer communications for generating product adoption. Opinion mining is employed in marketing to help consumers and enterprises in the analysis of online consumers' reviews by highlighting the strengths and weaknesses of the products. This paper describes an opinion mining system based on novel review and feature ranking methods to empower consumers and enterprises for identifying critical product features from enormous consumers' reviews. Consumers and business analysts are the main target group for the proposed system, who want to explore consumers' feedback for determining purchase decisions and enterprise strategies. We evaluate the proposed system on a real dataset. Results show that integration of review and feature-ranking methods improves the decision making processes significantly.

  15. Critical Product Features' Identification Using an Opinion Analyzer

    PubMed Central

    Shamim, Azra; Balakrishnan, Vimala

    2014-01-01

    The increasing use and ubiquity of the Internet facilitate dissemination of word-of-mouth through blogs, online forums, newsgroups, and consumers' reviews. Online consumers' reviews present tremendous opportunities and challenges for consumers and marketers. One of the challenges is to develop interactive marketing practices for making connections with target consumers that capitalize on consumer-to-consumer communications for generating product adoption. Opinion mining is employed in marketing to help consumers and enterprises in the analysis of online consumers' reviews by highlighting the strengths and weaknesses of the products. This paper describes an opinion mining system based on novel review and feature ranking methods to empower consumers and enterprises for identifying critical product features from enormous consumers' reviews. Consumers and business analysts are the main target group for the proposed system, who want to explore consumers' feedback for determining purchase decisions and enterprise strategies. We evaluate the proposed system on a real dataset. Results show that integration of review and feature-ranking methods improves the decision making processes significantly. PMID:25506612

  16. Emissions Scenario Portal for Visualization of Low Carbon Pathways

    NASA Astrophysics Data System (ADS)

    Friedrich, J.; Hennig, R. J.; Mountford, H.; Altamirano, J. C.; Ge, M.; Fransen, T.

    2016-12-01

    This proposal for a presentation is centered on a new project developed collaboratively by the World Resources Institute (WRI), Google Inc., and the Deep Decarbonization Pathways Project (DDPP). The project aims to develop an online, open portal, the Emissions Scenario Portal (ESP), to enable users to easily visualize a range of future greenhouse gas emission pathways linked to different scenarios of economic and energy developments, drawing from a variety of modeling tools. It is targeted to users who are not modelling experts, but instead policy analysts or advisors, investment analysts, and similar users who draw on modelled scenarios to inform their work, and who can benefit from better access to, and transparency around, the wide range of emerging scenarios on ambitious climate action. The ESP will provide information from scenarios in a visually appealing and easy-to-understand manner that enables these users to recognize the opportunities to reduce GHG emissions, the implications of the different scenarios, and the underlying assumptions. To facilitate the application of the portal and tools in policy dialogues, a series of country-specific and potentially sector-specific workshops with key decision-makers and analysts, supported by relevant analysis, will be organized by the key partners and also in broader collaboration with others who might wish to convene relevant groups around the information. This project will provide opportunities for modelers to increase their outreach and visibility in the public space and to directly interact with key audiences of emissions scenarios, such as policy analysts and advisors. The information displayed on the portal will cover a wide range of indicators, sectors and important scenario characteristics such as macroeconomic information, emission factors, and policy as well as technology assumptions in order to facilitate comparison. These indicators have been selected based on existing standards (such as the IIASA AR5 database, the Greenhouse Gas Protocol and accounting literature) and stakeholder consultations. Examples of use cases include: technical advisers for governments; NGO/civil society advocates; investors and bankers; modelers and academics; and business sustainability officers.

  17. Where are you, my beloved? On absence, loss, and the enigma of telepathic dreams.

    PubMed

    Eshel, Ofra

    2006-12-01

    The subject of dream telepathy (especially patients' telepathic dreams) and related phenomena in the psychoanalytic context has been a controversial, disturbing 'foreign body' ever since it was introduced into psychoanalysis by Freud in 1921. Telepathy, suffering (or intense feeling) at a distance (Greek: pathos + tele), is the transfer or communication of thoughts, impressions and information over distance between two people without the normal operation of the recognized sense organs. The author offers a comprehensive historical review of the psychoanalytic literature on this controversial issue, beginning with Freud's years-long struggles over the possibility of thought-transference and dream telepathy. She then describes her own analytic encounter over the years with five patients' telepathic dreams: dreams involving precise details of the time, place, sensory impressions, and experiential states that the analyst was in at that time, which the patients could not have known through ordinary sensory perception and communication. The author's ensuing explanation combines contributory factors involving patient, archaic communication and analyst. Each of these patients, in early childhood, had a mother who was emotionally absent-within-absence, due to the absence of a significant figure in her own life. This primary traumatic loss was imprinted in their nascent selves and inchoate relating to others, with a fixation on a nonverbal, archaic mode of communication. The patient's telepathic dream is formed as a search engine when the analyst is suddenly emotionally absent, in order to find the analyst and thus halt the process of abandonment and prevent collapse into the despair of the early traumatization. Hence, the telepathic dream embodies an enigmatic 'impossible' extreme of patient-analyst deep-level interconnectedness and unconscious communication in the analytic process. This paper is part of the author's endeavour to grasp the true experiential scope and therapeutic significance of this dimension of fundamental patient-analyst interconnectedness.

  18. FOCIS: A forest classification and inventory system using LANDSAT and digital terrain data

    NASA Technical Reports Server (NTRS)

    Strahler, A. H.; Franklin, J.; Woodcook, C. E.; Logan, T. L.

    1981-01-01

    Accurate, cost-effective stratification of forest vegetation and timber inventory is the primary goal of a Forest Classification and Inventory System (FOCIS). Conventional timber stratification using photointerpretation can be time-consuming, costly, and inconsistent from analyst to analyst. FOCIS was designed to overcome these problems by using machine processing techniques to extract and process tonal, textural, and terrain information from registered LANDSAT multispectral and digital terrain data. Comparison of samples from timber strata identified by FOCIS and by conventional procedures showed that both have about the same potential to reduce the variance of timber volume estimates over simple random sampling.

  19. RUPTURES IN THE ANALYTIC SETTING AND DISTURBANCES IN THE TRANSFORMATIONAL FIELD OF DREAMS.

    PubMed

    Brown, Lawrence J

    2015-10-01

    This paper explores some implications of Bleger's (1967, 2013) concept of the analytic situation, which he views as comprising the analytic setting and the analytic process. The author discusses Bleger's idea of the analytic setting as the depositary for projected painful aspects in either the analyst or patient or both-affects that are then rendered as nonprocess. In contrast, the contents of the analytic process are subject to an incessant process of transformation (Green 2005). The author goes on to enumerate various components of the analytic setting: the nonhuman, object relational, and the analyst's "person" (including mental functioning). An extended clinical vignette is offered as an illustration. © 2015 The Psychoanalytic Quarterly, Inc.

  20. ICAP - An Interactive Cluster Analysis Procedure for analyzing remotely sensed data

    NASA Technical Reports Server (NTRS)

    Wharton, S. W.; Turner, B. J.

    1981-01-01

    An Interactive Cluster Analysis Procedure (ICAP) was developed to derive classifier training statistics from remotely sensed data. ICAP differs from conventional clustering algorithms by allowing the analyst to optimize the cluster configuration by inspection, rather than by manipulating process parameters. Control of the clustering process alternates between the algorithm, which creates new centroids and forms clusters, and the analyst, who can evaluate and elect to modify the cluster structure. Clusters can be deleted, or lumped together pairwise, or new centroids can be added. A summary of the cluster statistics can be requested to facilitate cluster manipulation. The principal advantage of this approach is that it allows prior information (when available) to be used directly in the analysis, since the analyst interacts with ICAP in a straightforward manner, using basic terms with which he is more likely to be familiar. Results from testing ICAP showed that an informed use of ICAP can improve classification, as compared to an existing cluster analysis procedure.
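
    A compact sketch of the alternating control ICAP describes: the algorithm assigns samples to the nearest centroid and recomputes statistics, then the analyst may delete, lump (merge), or add centroids before the next pass. The command vocabulary mirrors the operations named in the abstract; the function names, data, and command format are illustrative assumptions, not the original implementation.

        # Sketch: analyst-in-the-loop clustering in the spirit of ICAP.
        import numpy as np

        def assign(data, centroids):
            d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
            return d.argmin(axis=1)

        def update(data, labels, centroids):
            return np.array([data[labels == k].mean(axis=0) if np.any(labels == k) else c
                             for k, c in enumerate(centroids)])

        def apply_analyst_edit(centroids, command, *args):
            if command == "delete":                      # drop a centroid
                return np.delete(centroids, args[0], axis=0)
            if command == "lump":                        # merge two centroids pairwise
                i, j = args
                merged = (centroids[i] + centroids[j]) / 2
                return np.vstack([np.delete(centroids, [i, j], axis=0), merged])
            if command == "add":                         # add a new centroid
                return np.vstack([centroids, np.asarray(args[0], dtype=float)])
            return centroids

        rng = np.random.default_rng(0)
        data = rng.normal(size=(300, 4))                 # e.g. 4-band spectral samples
        centroids = data[rng.choice(len(data), 5, replace=False)]
        for cmd in [("none",), ("lump", 0, 1), ("add", [2.0, 2.0, 2.0, 2.0])]:
            labels = assign(data, centroids)             # algorithm forms clusters
            centroids = update(data, labels, centroids)
            centroids = apply_analyst_edit(centroids, *cmd)   # analyst inspects and edits
        print(len(centroids), "clusters after analyst edits")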

  1. An eye tracking study of bloodstain pattern analysts during pattern classification.

    PubMed

    Arthur, R M; Hoogenboom, J; Green, R D; Taylor, M C; de Bruin, K G

    2018-05-01

    Bloodstain pattern analysis (BPA) is the forensic discipline concerned with the classification and interpretation of bloodstains and bloodstain patterns at the crime scene. At present, it is unclear exactly which stain or pattern properties and their associated values are most relevant to analysts when classifying a bloodstain pattern. Eye tracking technology has been widely used to investigate human perception and cognition. Its application to forensics, however, is limited. This is the first study to use eye tracking as a tool for gaining access to the mindset of the bloodstain pattern expert. An eye tracking method was used to follow the gaze of 24 bloodstain pattern analysts during an assigned task of classifying a laboratory-generated test bloodstain pattern. With the aid of an automated image-processing methodology, the properties of selected features of the pattern were quantified leading to the delineation of areas of interest (AOIs). Eye tracking data were collected for each AOI and combined with verbal statements made by analysts after the classification task to determine the critical range of values for relevant diagnostic features. Eye-tracking data indicated that there were four main regions of the pattern that analysts were most interested in. Within each region, individual elements or groups of elements that exhibited features associated with directionality, size, colour and shape appeared to capture the most interest of analysts during the classification task. The study showed that the eye movements of trained bloodstain pattern experts and their verbal descriptions of a pattern were well correlated.
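
    A small sketch of the kind of per-AOI summary such eye-tracking data supports (total dwell time and share of attention per area of interest, aggregated from timestamped gaze samples); the AOI labels, sampling rate, and data layout are hypothetical simplifications:

        # Sketch: aggregate gaze samples into dwell time per area of interest (AOI).
        from collections import defaultdict

        gaze_samples = [
            (0.00, "directional spines"), (0.02, "directional spines"), (0.04, "central stain group"),
            (0.06, "central stain group"), (0.08, "central stain group"), (0.10, None),  # None = outside all AOIs
            (0.12, "peripheral satellite stains"), (0.14, "peripheral satellite stains"),
        ]

        sample_interval = 0.02                     # seconds between samples (assumed)
        dwell = defaultdict(float)
        for _, aoi in gaze_samples:
            if aoi is not None:
                dwell[aoi] += sample_interval

        total = sum(dwell.values())
        for aoi, t in sorted(dwell.items(), key=lambda kv: -kv[1]):
            print(f"{aoi}: {t:.2f} s ({100 * t / total:.0f}% of on-AOI time)")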

  2. The Case for Licensure of Applied Behavior Analysts

    PubMed Central

    Dorsey, Michael F; Weinberg, Michael; Zane, Thomas; Guidi, Megan M

    2009-01-01

    The evolution of the field of applied behavior analysis to a practice-oriented profession has created the need to ensure that the consumers of these services are adequately protected. We review the limitations of the current board certification process and present a rationale for the establishment of licensing standards for applied behavior analysts on a state-by-state basis. Recommendations for securing the passage of a licensure bill also are discussed. PMID:22477697

  3. Macro-economic factors influencing the architectural business model shift in the pharmaceutical industry.

    PubMed

    Dierks, Raphaela Marie Louisa; Bruyère, Olivier; Reginster, Jean-Yves; Richy, Florent-Frederic

    2016-10-01

    Technological innovations, new regulations, increasing costs of drug production and new demands are only a few key drivers of a projected alteration in the pharmaceutical industry. The purpose of this review is to understand the macro-economic factors responsible for the business model revolution in order to possess a competitive advantage over market players. Areas covered: Existing literature on macro-economic factors changing the pharmaceutical landscape has been reviewed to present a clear image of the current market environment. Expert commentary: Literature shows that pharmaceutical companies are facing an architectural alteration; however, the evidence on the rationale driving the transformation is still outstanding. Mergers & acquisitions (M&A) deals and collaborations are headlining the papers. Q1 2016 did show a major slowdown in M&A deals by volume since 2013 (with deal cancellations of Pfizer and Allergan, or the downfall of Valeant), but pharmaceutical analysts remain confident that this shortfall was a consequence of the equity market volatility. It seems likely that the shift to an M&A model will become apparent during the remainder of 2016, with deal announcements of Abbott Laboratories, AbbVie and Sanofi worth USD 45 billion showing the appetite of big pharma companies to shift from the fully vertically integrated business model to more horizontal business models.

  4. Using participatory design to develop (public) health decision support systems through GIS.

    PubMed

    Dredger, S Michelle; Kothari, Anita; Morrison, Jason; Sawada, Michael; Crighton, Eric J; Graham, Ian D

    2007-11-27

    Organizations that collect substantial data for decision-making purposes are often characterized as being 'data rich' but 'information poor'. Maps and mapping tools can be very useful for research transfer in converting locally collected data into information. Challenges involved in incorporating GIS applications into the decision-making process within the non-profit (public) health sector include a lack of financial resources for software acquisition and training for non-specialists to use such tools. This on-going project has two primary phases. This paper critically reflects on Phase 1: the participatory design (PD) process of developing a collaborative web-based GIS tool. A case study design is being used whereby the case is defined as the data analyst and manager dyad (a two person team) in selected Ontario Early Year Centres (OEYCs). Multiple cases are used to support the reliability of findings. With nine producer/user pair participants, the goal in Phase 1 was to identify barriers to map production, and through the participatory design process, develop a web-based GIS tool suited for data analysts and their managers. This study has been guided by the Ottawa Model of Research Use (OMRU) conceptual framework. Due to wide variations in OEYC structures, only some data analysts used mapping software and there was no consistency or standardization in the software being used. Consequently, very little sharing of maps and data occurred among data analysts. Using PD, this project developed a web-based mapping tool (EYEMAP) that was easy to use, protected proprietary data, and permitted limited and controlled sharing between participants. By providing data analysts with training on its use, the project also ensured that data analysts would not break cartographic conventions (e.g. using a choropleth map for count data). Interoperability was built into the web-based solution; that is, EYEMAP can read many different standard mapping file formats (e.g. ESRI, MapInfo, CSV). Based on the evaluation of Phase 1, the PD process has served both as a facilitator and a barrier. In terms of successes, the PD process identified two key components that are important to users: increased data/map sharing functionality and interoperability. Some of the challenges affected developers and users; both individually and as a collective. From a development perspective, this project experienced difficulties in obtaining personnel skilled in web application development and GIS. For users, some data sharing barriers are beyond what a technological tool can address (e.g. third party data). Lastly, the PD process occurs in real time; both a strength and a limitation. Programmatic changes at the provincial level and staff turnover at the organizational level made it difficult to maintain buy-in as participants changed over time. The impacts of these successes and challenges will be evaluated more concretely at the end of Phase 2. PD approaches, by their very nature, encourage buy-in to the development process, better address user needs, and create a sense of user investment and ownership.

  5. Bring It to the Pitch: Combining Video and Movement Data to Enhance Team Sport Analysis.

    PubMed

    Stein, Manuel; Janetzko, Halldor; Lamprecht, Andreas; Breitkreutz, Thorsten; Zimmermann, Philipp; Goldlucke, Bastian; Schreck, Tobias; Andrienko, Gennady; Grossniklaus, Michael; Keim, Daniel A

    2018-01-01

    Analysts in professional team sport regularly perform analysis to gain strategic and tactical insights into player and team behavior. Goals of team sport analysis regularly include identification of weaknesses of opposing teams, or assessing performance and improvement potential of a coached team. Current analysis workflows are typically based on the analysis of team videos. Also, analysts can rely on techniques from Information Visualization, to depict e.g., player or ball trajectories. However, video analysis is typically a time-consuming process, where the analyst needs to memorize and annotate scenes. In contrast, visualization typically relies on an abstract data model, often using abstract visual mappings, and is not directly linked to the observed movement context anymore. We propose a visual analytics system that tightly integrates team sport video recordings with abstract visualization of underlying trajectory data. We apply appropriate computer vision techniques to extract trajectory data from video input. Furthermore, we apply advanced trajectory and movement analysis techniques to derive relevant team sport analytic measures for region, event and player analysis in the case of soccer analysis. Our system seamlessly integrates video and visualization modalities, enabling analysts to draw on the advantages of both analysis forms. Several expert studies conducted with team sport analysts indicate the effectiveness of our integrated approach.
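
    A brief sketch of the kind of movement-derived measure such systems compute once trajectories have been extracted from video (distance covered and mean speed for one player); the trajectory format and the 25 Hz sampling rate are assumptions for illustration:

        # Sketch: distance covered and mean speed from a sampled player trajectory.
        # Positions are (x, y) in metres at a fixed sampling rate.
        import math

        def distance_and_speed(track, hz=25.0):
            dist = sum(math.dist(a, b) for a, b in zip(track, track[1:]))
            duration = (len(track) - 1) / hz
            return dist, dist / duration if duration else 0.0

        track = [(0.0, 0.0), (0.3, 0.1), (0.7, 0.1), (1.2, 0.4), (1.8, 0.9)]  # toy samples
        dist, speed = distance_and_speed(track)
        print(f"distance {dist:.2f} m, mean speed {speed:.2f} m/s")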

  6. Psychotherapy in the aesthetic attitude.

    PubMed

    Beebe, John

    2010-04-01

    Drawing upon the writings of Jungian analyst Joseph Henderson on unconscious attitudes toward culture that patients and analysts may bring to therapy, the author defines the aesthetic attitude as one of the basic ways that cultural experience is instinctively accessed and processed so that it can become part of an individual's self experience. In analytic treatment, the aesthetic attitude emerges as part of what Jung called the transcendent function to create new symbolic possibilities for the growth of consciousness. It can provide creative opportunities for new adaptation where individuation has become stuck in unconscious complexes, both personal and cultural. In contrast to formulations that have compared depth psychotherapy to religious ritual, philosophic discourse, and renewal of socialization, this paper focuses upon the considerations of beauty that make psychotherapy also an art. In psychotherapeutic work, the aesthetic attitude confronts both analyst and patient with the problem of taste, affects how the treatment is shaped and 'framed', and can grant a dimension of grace to the analyst's mirroring of the struggles that attend the patient's effort to be a more smoothly functioning human being. The patient may learn to extend the same grace to the analyst's fumbling attempts to be helpful. The author suggests that the aesthetic attitude is thus a help in the resolution of both countertransference and transference en route to psychological healing.

  7. Utilizing semantic Wiki technology for intelligence analysis at the tactical edge

    NASA Astrophysics Data System (ADS)

    Little, Eric

    2014-05-01

    Challenges exist for intelligence analysts to efficiently and accurately process large amounts of data collected from a myriad of available data sources. These challenges are even more evident for analysts who must operate within small military units at the tactical edge. In such environments, decisions must be made quickly without guaranteed access to the kinds of large-scale data sources available to analysts working at intelligence agencies. Improved technologies must be provided to analysts at the tactical edge to make informed, reliable decisions, since this is often a critical collection point for important intelligence data. To aid tactical edge users, new types of intelligent, automated technology interfaces are required to allow them to rapidly explore information associated with the intersection of hard and soft data fusion, such as multi-INT signals, semantic models, social network data, and natural language processing of text. The ability to fuse these types of data is paramount to providing decision superiority. For these types of applications, we have developed BLADE. BLADE allows users to dynamically add, delete and link data via a semantic wiki, allowing for improved interaction between different users. Analysts can see information updates in near-real-time due to a common underlying set of semantic models operating within a triple store that allows for updates on related data points from independent users tracking different items (persons, events, locations, organizations, etc.). The wiki can capture pictures, videos and related information. New information added directly to pages is automatically updated in the triple store and its provenance and pedigree is tracked over time, making that data more trustworthy and easily integrated with other users' pages.
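
    A minimal sketch of the triple-store idea behind such a semantic wiki, using the open-source rdflib package: each page edit becomes a set of triples, which other users' views can then query. The namespace, properties, and entities here are invented placeholders, not BLADE's actual semantic models.

        # Sketch: store linked analyst observations as RDF triples and query them.
        from rdflib import Graph, Literal, Namespace, RDF

        EX = Namespace("http://example.org/intel/")   # placeholder namespace
        g = Graph()

        g.add((EX.event42, RDF.type, EX.Sighting))
        g.add((EX.event42, EX.involvesPerson, EX.person7))
        g.add((EX.event42, EX.atLocation, EX.checkpoint3))
        g.add((EX.event42, EX.reportedBy, Literal("analyst_chat_2014-05-01")))

        # New pages or edits become new triples; related items are retrievable by all users.
        q = """
        SELECT ?event ?loc WHERE {
          ?event a <http://example.org/intel/Sighting> ;
                 <http://example.org/intel/atLocation> ?loc .
        }
        """
        for row in g.query(q):
            print(row.event, row.loc)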

  8. Rockfall hazard analysis using LiDAR and spatial modeling

    NASA Astrophysics Data System (ADS)

    Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho

    2010-05-01

    Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information of the source area of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls and the spatial attribution of their frequency and energy.
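
    A simplified sketch of the physical quantities such rockfall modeling reports along a path (fall velocity and kinetic energy). The lossless free-fall relation v = sqrt(2*g*h) and the block mass below are placeholder assumptions; Rockfall Analyst itself uses terrain-dependent process models calibrated against the historical rockfall records.

        # Sketch: first-order rockfall velocity and kinetic energy from drop height.
        # Real models add restitution and friction losses along the 3D path.
        import math

        G = 9.81            # gravitational acceleration, m/s^2
        mass = 250.0        # kg, assumed block mass

        for drop_height in (5.0, 20.0, 60.0):                     # metres of descent along the path
            v = math.sqrt(2 * G * drop_height)                    # impact velocity, m/s
            energy_kj = 0.5 * mass * v ** 2 / 1000.0              # kinetic energy, kJ
            print(f"h={drop_height:5.1f} m  v={v:5.1f} m/s  E={energy_kj:6.1f} kJ")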

  9. Modeling a terminology-based electronic nursing record system: an object-oriented approach.

    PubMed

    Park, Hyeoun-Ae; Cho, InSook; Byeun, NamSoo

    2007-10-01

    The aim of this study was to present our perspectives on healthcare information analysis at a conceptual level and the lessons learned from our experience with the development of a terminology-based enterprise electronic nursing record system - which was one of the components of an EMR system at a tertiary teaching hospital in Korea - using an object-oriented system analysis and design concept. To ensure a systematic approach and effective collaboration, the department of nursing constituted a system modeling team comprising a project manager, systems analysts, user representatives, an object-oriented methodology expert, and healthcare informaticists (including the authors). A rational unified process (RUP) and the Unified Modeling Language were used as a development process and for modeling notation, respectively. From the scenario and RUP approach, user requirements were formulated into use case sets and the sequence of activities in the scenario was depicted in an activity diagram. The structure of the system was presented in a class diagram. This approach allowed us to identify clearly the structural and behavioral states and important factors of a terminology-based ENR system (e.g., business concerns and system design concerns) according to the viewpoints of both domain and technical experts.

  10. An Introduction to the Mission Risk Diagnostic for Incident Management Capabilities (MRD-IMC)

    DTIC Science & Technology

    2014-05-01

    objectives. Analysts applying the MRD-IMC evaluate a set of systemic risk factors (called drivers) to aggregate decision-making data and provide decision... function is in position to achieve its mission and objective(s) [Alberts 2012]. To accomplish this goal, analysts applying the MRD-IMC evaluate a... evaluation of IM processes and capabilities. The MRD-IMC comprises the following three core tasks: 1. Identify the mission and objective(s

  11. Safety analysts training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolton, P.

    The purpose of this task was to support ESH-3 in providing Airborne Release Fraction and Respirable Fraction training to safety analysts at LANL who perform accident analysis, hazard analysis, safety analysis, and/or risk assessments at nuclear facilities. The task included preparation of materials for and the conduct of two 3-day training courses covering the following topics: safety analysis process; calculation model; aerosol physics concepts for safety analysis; and overview of empirically derived airborne release fractions and respirable fractions.

  12. On the question of self-disclosure by the analyst: error or advance in technique?

    PubMed

    Jacobs, T

    1999-04-01

    The question of self-disclosure by the analyst and its uses in treatment is an issue widely debated today. In this paper, the author reviews this controversial technique from historical and contemporary points of view, delineates several forms of self-disclosure, and, by means of several clinical examples, discusses the effects on the patient and the analytic process of utilizing one or another kind of self-disclosure in these particular situations.

  13. Concept Development and Experimentation Policy and Process: How Analysis Provides Rigour

    DTIC Science & Technology

    2010-04-01

    modelling and simulation techniques, but in reality the main tool in use is common sense and logic. The main goal of the OA analyst is to bring forward those... In doing so she should distinguish between the ideal and the intended or desired models to approach the reality as much as possible. Subsequently, the... and collection of measurements to be conducted. In doing so the analyst must take care to distinguish between the actual and the perceived reality. From

  14. Examination of suspicious objects by virus analysts

    NASA Astrophysics Data System (ADS)

    Ananin, E. V.; Ananina, I. S.; Nikishova, A. V.

    2018-05-01

    The paper presents data on the urgency of virus threats. For antivirus software to work properly, data on new virus implementations must be added to its database, which requires that all suspicious objects be investigated. This is a dangerous process and should be carried out in a virtual system, but even then the main system is not fully protected. A diagram of a secure workplace for a virus analyst is therefore proposed, containing software for its protection, together with the settings needed to ensure the security of the process of investigating suspicious objects. The proposed approach makes it possible to minimize the risks posed by the virus.

  15. ICAP: An Interactive Cluster Analysis Procedure for analyzing remotely sensed data. [to classify the radiance data to produce a thematic map

    NASA Technical Reports Server (NTRS)

    Wharton, S. W.

    1980-01-01

    An Interactive Cluster Analysis Procedure (ICAP) was developed to derive classifier training statistics from remotely sensed data. The algorithm interfaces the rapid numerical processing capacity of a computer with the human ability to integrate qualitative information. Control of the clustering process alternates between the algorithm, which creates new centroids and forms clusters, and the analyst, who evaluates and can elect to modify the cluster structure. Clusters can be deleted or lumped pairwise, or new centroids can be added. A summary of the cluster statistics can be requested to facilitate cluster manipulation. The ICAP was implemented in APL (A Programming Language), an interactive computer language. The flexibility of the algorithm was evaluated using data from different LANDSAT scenes to simulate two situations: one in which the analyst is assumed to have no prior knowledge about the data and wishes to have the clusters formed more or less automatically; and the other in which the analyst is assumed to have some knowledge about the data structure and wishes to use that information to closely supervise the clustering process. For comparison, an existing clustering method was also applied to the two data sets.

  16. Thick- and thin-skinned organisations and enactment in borderline and narcissistic disorders.

    PubMed

    Bateman, A W

    1998-02-01

    In this paper the author argues that enactment is any mutual action within the patient/analyst relationship that arises in the context of difficulties in countertransference work. Such enactment is common during the treatment of borderline and narcissistic disorders. In order to delineate different forms of enactment, which in his view may be either to the detriment or to the benefit of the analytic process, the author describes a patient who was identified primarily with a sadistic mother and who threatened the analyst with a knife during treatment. Three levels of enactment involving countertransference responses are described of which two, namely a collusive countertransference and a defensive countertransference, were detrimental to the analytic process. The third level of enactment was beneficial but only because the intervention by the analyst was independent of the analytic process and yet in response to it. The author uses Rosenfeld's distinction between thin-skinned and thick-skinned narcissists to illustrate how enactment is most likely when a patient moves between thick-skinned and thin-skinned narcissistic positions. Nevertheless the move between thin and thick-skinned positions presents an opportunity for effective interpretation, allowing progress in treatment.

  17. Comparison of air-coupled GPR data analysis results determined by multiple analysts

    NASA Astrophysics Data System (ADS)

    Martino, Nicole; Maser, Ken

    2016-04-01

    Current bridge deck condition assessments using ground penetrating radar (GPR) require a trained analyst to manually interpret substructure layering information from B-scan images in order to proceed with an intended analysis (pavement thickness, concrete cover, effects of rebar corrosion, etc.). For example, a recently developed method to rapidly and accurately analyze air-coupled GPR data based on the effects of rebar corrosion requires that a user "picks" a layer of rebar reflections in each B-scan image collected along the length of the deck. These "picks" have information like signal amplitude and two-way travel time. When a deck is new, or has little rebar corrosion, the resulting layer of rebar reflections is readily evident and there is little room for subjectivity. However, when a deck is severely deteriorated, the rebar layer may be difficult to identify, and different analysts may make different interpretations of the appropriate layer to analyze. One highly corroded bridge deck was assessed with a number of nondestructive evaluation techniques including 2 GHz air-coupled GPR. Two trained analysts separately selected the rebar layer in each B-scan image, choosing as much information as possible, even in areas of significant deterioration. The post processing of the selected data points was then completed and the results from each analyst were contour plotted to observe any discrepancies. The paper describes the differences between ground-coupled and air-coupled GPR systems, the data collection and analysis methods used by two different analysts for one case study, and the results of the two different analyses.
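
    A small sketch of one way to quantify the discrepancy between two analysts' rebar-layer picks, using difference statistics on the picked reflection amplitudes at common scan positions; the array layout and values are assumptions for illustration, not the study's data:

        # Sketch: compare rebar-layer amplitude picks made by two analysts.
        # Arrays hold picked reflection amplitude (dB) at the same scan positions.
        import numpy as np

        analyst_a = np.array([-12.1, -14.3, -18.7, -22.0, -15.2, -13.8])
        analyst_b = np.array([-12.4, -14.0, -20.1, -21.5, -15.6, -13.9])

        diff = analyst_a - analyst_b
        print("mean difference (dB):", round(diff.mean(), 2))
        print("max abs difference (dB):", round(np.abs(diff).max(), 2))
        print("correlation:", round(np.corrcoef(analyst_a, analyst_b)[0, 1], 3))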

  18. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive), for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on student's confidence in his ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option will produce the most effective results.

  19. The triadic intersubjective matrix in supervision: the use of disclosure to work through painful affects.

    PubMed

    Brown, Lawrence J; Miller, Martin

    2002-08-01

    The use of the psychoanalyst's subjective reactions as a tool to better understand his/her patient has been a central feature of clinical thinking in recent decades. While there has been much discussion and debate about the analyst's use of countertransference in individual psychoanalysis, including possible disclosure of his/her feelings to the patient, the literature on supervision has been slower to consider such matters. The attention to parallel processes in supervision has been helpful in appreciating the impact of affects arising in either the analyst/patient or the supervisor/analyst dyads upon the analytic treatment and its supervision. This contribution addresses the ways in which overlapping aspects of the personalities of the supervisor, analyst and patient may intersect and create resistances in the treatment. That three-way intersection, described here as the triadic intersubjective matrix, is considered inevitable in all supervised treatments. A clinical example from the termination phase of a supervised analysis of an adolescent is offered to illustrate these points. Finally, the question of self-disclosure as an aspect of the supervisory alliance is also discussed.

  20. Behavior analysis and public policy

    PubMed Central

    Fawcett, Stephen B.; Bernstein, Gail S.; Czyzewski, Mare J.; Greene, Brandon F.; Hannah, Gerald T.; Iwata, Brian A.; Jason, Leonard A.; Mathews, R. Mark; Morris, Edward K.; Otis-Wilborn, Amy; Seekins, Tom; Winett, Richard A.

    1988-01-01

    The Task Force on Public Policy was created to examine ways for behavior analysts to be more functional citizen scientists in the policymaking arena. This report informs readers about the contexts and processes of policymaking, and it outlines issues regarding the roles of behavior analysts in creating policy-relevant conceptual analyses, generating research data, and communicating policy-relevant information. We also discuss a possible role for the professional association in enhancing analysis, research, and advocacy on policies relevant to the public interest. PMID:22477991

  1. An Analysis of Airline Costs. Lecture Notes for MIT Courses. 16.73 Airline Management and Marketing

    NASA Technical Reports Server (NTRS)

    Simpson, R. W.

    1972-01-01

    The cost analyst must understand the operations of the airline and how the activities of the airline are measured, as well as how the costs are incurred and recorded. The data source is usually a cost accounting process. This provides data on the cumulated expenses in various categories over a time period like a quarter, or year, and must be correlated by the analyst with cumulated measures of airline activity which seem to be causing this expense.

  2. Idealization of the analyst by the young adult.

    PubMed

    Chused, J F

    1987-01-01

    Idealization is an intrapsychic process that serves many functions. In addition to its use defensively and for gratification of libidinal and aggressive drive derivatives, it can contribute to developmental progression, particularly during late adolescence and young adulthood. During an analysis, it is important to recognize all the determinants of idealization, including those related to the reworking of developmental conflicts. If an analyst understands idealization solely as a manifestation of pathology, he may interfere with his patient's use of it for the development of autonomous functioning.

  3. Human Processes in Intelligence Analysis: Phase I Overview

    DTIC Science & Technology

    1979-12-01

    ... several operating definitions were adopted. A basic definition was that ... the model is based on field observations made from the ... Similarly, the IMINT analyst who understands the problems of the reconnaissance pilot has ... computer data bases, such as those of different analysts ...

  4. Famine Early Warning Systems Network (FEWS NET) Agro-climatology Analysis Tools and Knowledge Base Products for Food Security Applications

    NASA Astrophysics Data System (ADS)

    Budde, M. E.; Rowland, J.; Anthony, M.; Palka, S.; Martinez, J.; Hussain, R.

    2017-12-01

    The U.S. Geological Survey (USGS) supports the use of Earth observation data for food security monitoring through its role as an implementing partner of the Famine Early Warning Systems Network (FEWS NET). The USGS Earth Resources Observation and Science (EROS) Center has developed tools designed to aid food security analysts in developing assumptions of agro-climatological outcomes. There are four primary steps to developing agro-climatology assumptions, including: 1) understanding the climatology, 2) evaluating current climate modes, 3) interpretation of forecast information, and 4) incorporation of monitoring data. Analysts routinely forecast outcomes well in advance of the growing season, which relies on knowledge of climatology. A few months prior to the growing season, analysts can assess large-scale climate modes that might influence seasonal outcomes. Within two months of the growing season, analysts can evaluate seasonal forecast information as indicators. Once the growing season begins, monitoring data, based on remote sensing and field information, can characterize the start of season and remain integral monitoring tools throughout the duration of the season. Each subsequent step in the process can lead to modifications of the original climatology assumption. To support such analyses, we have created an agro-climatology analysis tool that characterizes each step in the assumption building process. Satellite-based rainfall and normalized difference vegetation index (NDVI)-based products support both the climatology and monitoring steps, sea-surface temperature data and knowledge of the global climate system inform the climate modes, and precipitation forecasts at multiple scales support the interpretation of forecast information. Organizing these data for a user-specified area provides a valuable tool for food security analysts to better formulate agro-climatology assumptions that feed into food security assessments. We have also developed a knowledge base for over 80 countries that provides rainfall and NDVI-based products, including annual and seasonal summaries, historical anomalies, coefficient of variation, and number of years below 70% of annual or seasonal averages. These products provide a quick look for analysts to assess the agro-climatology of a country.
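
    A compact sketch of the knowledge-base style summaries listed above (seasonal anomaly, coefficient of variation, and count of years below 70% of the seasonal average); the rainfall totals are made-up values for illustration, not FEWS NET data:

        # Sketch: seasonal rainfall summary statistics of the kind listed in the abstract.
        import numpy as np

        seasonal_totals_mm = np.array([412, 365, 298, 505, 441, 250, 390, 352, 460, 310], float)  # toy history
        current_season_mm = 335.0

        mean = seasonal_totals_mm.mean()
        anomaly_pct = 100.0 * (current_season_mm - mean) / mean
        cv = seasonal_totals_mm.std(ddof=1) / mean                       # coefficient of variation
        years_below_70pct = int((seasonal_totals_mm < 0.7 * mean).sum())

        print(f"historical mean: {mean:.0f} mm")
        print(f"current season anomaly: {anomaly_pct:+.1f}%")
        print(f"coefficient of variation: {cv:.2f}")
        print(f"years below 70% of average: {years_below_70pct}")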

  5. Frizzled to finance: one PhD's path from a Drosophila lab to Wall Street.

    PubMed

    Taylor, Job

    2016-06-01

    An equity research analyst's job is to determine whether the price of a stock is likely to go up or down. For science-based businesses, particularly biotechnology companies, a PhD in the life sciences can be very helpful in making this determination. I transitioned from a postdoc position to working in equity research. Here I present information on how I made the transition, an overview of the day-to-day activities of an analyst, and thoughts on how to prepare to look for a job in finance. There are significant positives to working on Wall Street, including exposure to cutting-edge clinical/translational research, access to some of the best scientists in the world, a dynamic work environment, and compensation that generally exceeds academic salaries. This comes at the cost of some independence and the satisfaction of being able to call oneself a scientist. © 2016 Taylor. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  6. Co-Creativity and Interactive Repair: Commentary on Berta Bornstein's "The Analysis of a Phobic Child".

    PubMed

    Harrison, Alexandra

    2014-01-01

    My comments focus on a consideration of three issues central to child psychoanalysis stimulated by rereading the classic paper by Berta Bornstein, "The Analysis of a Phobic Child: Some Problems of Theory and Technique in Child Analysis": (1) the importance of "co-creativity" and its use in analysis to repair disruptions in the mother-child relationship; (2) working analytically with the "inner world of the child"; and (3) the fundamental importance of multiple simultaneous meaning-making processes. I begin with a discussion of current thinking about the importance of interactive processes in developmental and therapeutic change and then lead to the concepts of "co-creativity" and interactive repair, elements that are missing in the "Frankie" paper. The co-creative process that I outline includes multiple contributions that Frankie and his caregivers brought to their relationships--his mother, his father, his nurse, and even his analyst. I then address the question of how child analysts can maintain a central focus on the inner world of the child while still taking into account the complex nature of co-creativity in the change process. Finally, I discuss insights into the multiple simultaneous meaning-making processes in the analytic relationship to effect therapeutic change, including what I call the "sandwich model," an attempt to organize this complexity so that it is more accessible to the practicing clinician. In terms of the specific case of Frankie, my reading of the case suggests that failure to repair disruptions in the mother-child relationship from infancy through the time of the analytic treatment was central to Frankie's problems. My hypothesis is that, rather than the content of his analyst's interpretations, what was helpful to Frankie in the analysis was the series of attempts at interactive repair in the analytic process. Unfortunately, the case report does not offer data to test this hypothesis. Indeed, one concluding observation from my reading of this classic case is how useful it would be for the contemporary analyst to pay attention to the multifaceted co-creative process in order to explain and foster the therapeutic change that can occur in analysis.

  7. The Environmental Protection Agency in the Early Trump Administration: Prelude to Regulatory Capture.

    PubMed

    Dillon, Lindsey; Sellers, Christopher; Underhill, Vivian; Shapiro, Nicholas; Ohayon, Jennifer Liss; Sullivan, Marianne; Brown, Phil; Harrison, Jill; Wylie, Sara

    2018-04-01

    We explore and contextualize changes at the Environmental Protection Agency (EPA) over the first 6 months of the Trump administration, arguing that its pro-business direction is enabling a form of regulatory capture. We draw on news articles, public documents, and a rapid response, multisited interview study of current and retired EPA employees to (1) document changes associated with the new administration, (2) contextualize and compare the current pro-business makeover with previous ones, and (3) publicly convey findings in a timely manner. The lengthy, combined experience of interviewees with previous Republican and Democratic administrations made them valuable analysts for assessing recent shifts at the Scott Pruitt-led EPA and the extent to which these shifts steer the EPA away from its stated mission to "protect human and environmental health." Considering the extent of its pro-business leanings in the absence of mitigating power from the legislative branch, we conclude that its regulatory capture has become likely, more so than at similar moments in the agency's 47-year history. The public and environmental health consequences of regulatory capture of the EPA will probably be severe and far-reaching.

  9. MetaboAnalyst 3.0--making metabolomics more meaningful.

    PubMed

    Xia, Jianguo; Sinelnikov, Igor V; Han, Beomsoo; Wishart, David S

    2015-07-01

    MetaboAnalyst (www.metaboanalyst.ca) is a web server designed to permit comprehensive metabolomic data analysis, visualization and interpretation. It supports a wide range of complex statistical calculations and high quality graphical rendering functions that require significant computational resources. First introduced in 2009, MetaboAnalyst has experienced more than a 50X growth in user traffic (>50 000 jobs processed each month). In order to keep up with the rapidly increasing computational demands and a growing number of requests to support translational and systems biology applications, we performed a substantial rewrite and major feature upgrade of the server. The result is MetaboAnalyst 3.0. By completely re-implementing the MetaboAnalyst suite using the latest web framework technologies, we have been able to substantially improve its performance, capacity and user interactivity. Three new modules have also been added including: (i) a module for biomarker analysis based on the calculation of receiver operating characteristic curves; (ii) a module for sample size estimation and power analysis for improved planning of metabolomics studies; and (iii) a module to support integrative pathway analysis for both genes and metabolites. In addition, popular features found in existing modules have been significantly enhanced by upgrading the graphical output, expanding the compound libraries and by adding support for more diverse organisms. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
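    The biomarker module described above is organized around receiver operating characteristic (ROC) curves. As an illustration of the underlying calculation only (this is not MetaboAnalyst's code, and the input file, column names, and cutoff rule are hypothetical), a minimal Python sketch with scikit-learn might look like:

        # Minimal ROC sketch for a single candidate biomarker (illustrative only;
        # the CSV file and column names are hypothetical, not MetaboAnalyst's API).
        import pandas as pd
        from sklearn.metrics import roc_curve, roc_auc_score

        data = pd.read_csv("metabolite_concentrations.csv")    # hypothetical input
        y = (data["group"] == "case").astype(int)               # 1 = case, 0 = control
        score = data["citrate_uM"]                              # hypothetical metabolite

        fpr, tpr, thresholds = roc_curve(y, score)
        auc = roc_auc_score(y, score)
        print(f"AUC for candidate biomarker: {auc:.3f}")

        # One common cutoff choice: maximize Youden's J statistic (tpr - fpr).
        best = (tpr - fpr).argmax()
        print(f"Suggested threshold: {thresholds[best]:.2f}")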

  10. Human-machine interaction to disambiguate entities in unstructured text and structured datasets

    NASA Astrophysics Data System (ADS)

    Ward, Kevin; Davenport, Jack

    2017-05-01

    Creating entity network graphs is a manual, time-consuming process for an intelligence analyst. Beyond the traditional big data problems of information overload, individuals are often referred to by multiple names and shifting titles as they advance in their organizations over time, which quickly makes simple string or phonetic alignment methods for entities insufficient. Conversely, automated methods for relationship extraction and entity disambiguation typically produce questionable results with no way for users to vet results, correct mistakes or influence the algorithm's future results. We present an entity disambiguation tool, DRADIS, which aims to bridge the gap between human-centric and machine-centric methods. DRADIS automatically extracts entities from multi-source datasets and models them as a complex set of attributes and relationships. Entities are disambiguated across the corpus using a hierarchical model executed in Spark, allowing it to scale to operational-sized data. Resolution results are presented to the analyst complete with sourcing information for each mention and relationship, allowing analysts to quickly vet the correctness of results as well as correct mistakes. Corrected results are used by the system to refine the underlying model, allowing analysts to optimize the general model to better deal with their operational data. Providing analysts with the ability to validate and correct the model to produce a system they can trust enables them to better focus their time on producing higher quality analysis products.
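    To make the idea of disambiguating mentions by their attributes and relationships concrete, here is a toy pairwise scoring sketch in Python; it is not the DRADIS hierarchical Spark model, and the field names, weights, and example mentions are invented:

        # Toy pairwise entity-resolution score (illustrative; not the DRADIS algorithm).
        from difflib import SequenceMatcher

        def mention_similarity(a, b):
            """Combine name similarity, title agreement, and shared associates."""
            name_sim = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
            title_sim = 1.0 if a["title"] == b["title"] else 0.0
            shared = len(set(a["associates"]) & set(b["associates"]))
            assoc_sim = shared / max(1, len(set(a["associates"]) | set(b["associates"])))
            # Hypothetical weights; a production system would learn or tune these.
            return 0.5 * name_sim + 0.2 * title_sim + 0.3 * assoc_sim

        m1 = {"name": "Abu Khalid", "title": "commander", "associates": ["Farouk", "Said"]}
        m2 = {"name": "A. Khaled",  "title": "commander", "associates": ["Said", "Omar"]}
        print(mention_similarity(m1, m2))   # a high score suggests the same underlying entity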

  11. Using participatory design to develop (public) health decision support systems through GIS

    PubMed Central

    Dredger, S Michelle; Kothari, Anita; Morrison, Jason; Sawada, Michael; Crighton, Eric J; Graham, Ian D

    2007-01-01

    Background Organizations that collect substantial data for decision-making purposes are often characterized as being 'data rich' but 'information poor'. Maps and mapping tools can be very useful for research transfer in converting locally collected data into information. Challenges involved in incorporating GIS applications into the decision-making process within the non-profit (public) health sector include a lack of financial resources for software acquisition and training for non-specialists to use such tools. This on-going project has two primary phases. This paper critically reflects on Phase 1: the participatory design (PD) process of developing a collaborative web-based GIS tool. Methods A case study design is being used whereby the case is defined as the data analyst and manager dyad (a two-person team) in selected Ontario Early Year Centres (OEYCs). Multiple cases are used to support the reliability of findings. With nine producer/user pair participants, the goal in Phase 1 was to identify barriers to map production, and through the participatory design process, develop a web-based GIS tool suited for data analysts and their managers. This study has been guided by the Ottawa Model of Research Use (OMRU) conceptual framework. Results Due to wide variations in OEYC structures, only some data analysts used mapping software and there was no consistency or standardization in the software being used. Consequently, very little sharing of maps and data occurred among data analysts. Using PD, this project developed a web-based mapping tool (EYEMAP) that was easy to use, protected proprietary data, and permitted limited and controlled sharing between participants. By providing data analysts with training on its use, the project also ensured that data analysts would not break cartographic conventions (e.g. using a choropleth map for count data). Interoperability was built into the web-based solution; that is, EYEMAP can read many different standard mapping file formats (e.g. ESRI, MapInfo, CSV). Discussion Based on the evaluation of Phase 1, the PD process has served both as a facilitator and a barrier. In terms of successes, the PD process identified two key components that are important to users: increased data/map sharing functionality and interoperability. Some of the challenges affected developers and users, both individually and as a collective. From a development perspective, this project experienced difficulties in obtaining personnel skilled in web application development and GIS. For users, some data sharing barriers are beyond what a technological tool can address (e.g. third party data). Lastly, the PD process occurs in real time, which is both a strength and a limitation. Programmatic changes at the provincial level and staff turnover at the organizational level made it difficult to maintain buy-in as participants changed over time. The impacts of these successes and challenges will be evaluated more concretely at the end of Phase 2. Conclusion PD approaches, by their very nature, encourage buy-in to the development process, better address user needs, and create a sense of user investment and ownership. PMID:18042298
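    The cartographic convention mentioned above (a choropleth map is appropriate for rates, not raw counts) can be illustrated with a small Python/geopandas sketch; the file names, columns, and rate definition here are hypothetical, and this is not EYEMAP code:

        # Illustrative sketch: map a rate rather than a raw count as a choropleth.
        # File and column names are hypothetical.
        import geopandas as gpd
        import pandas as pd

        regions = gpd.read_file("oeyc_regions.shp")          # hypothetical boundaries
        counts = pd.read_csv("program_visits.csv")           # region_id, visits, child_pop

        merged = regions.merge(counts, on="region_id")
        merged["visits_per_1000_children"] = 1000 * merged["visits"] / merged["child_pop"]

        # Choropleth of the rate; raw counts would instead call for graduated symbols.
        ax = merged.plot(column="visits_per_1000_children", cmap="Blues", legend=True)
        ax.set_title("Program visits per 1,000 children")
        ax.figure.savefig("visits_rate_map.png", dpi=150)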

  12. Essentials of psychoanalytic cure: a symposium. Introduction and survey of some previous views.

    PubMed

    Osman, M P; Tabachnick, N D

    1988-01-01

    Friedman (1978) suggested that implicit in the theories of the psychoanalytic process a classification of three separate trends can be identified. In the first instance, there is what could be called "understanding," whether it be intellectual or emotional. Second, there is "attachment," which refers to curative measures based on some "binding emotional reaction to the analyst." And third, and less explicitly, there is "integration," which refers to the development of a synthesis that has the effect of harmonizing parts of the mind or elevating psychic functioning to a higher level. Freud's writings embodied all three of these trends. The participants of the symposium at Marienbad, being strongly influenced by Strachey's emphasis on superego alteration through introjection, placed the greater stress on attachment. Loewald, emphasizing as he does the importance of the patient's identification with the analyst as a corrective reliving of the origins of identification in childhood, highlights attachment while also relating it to understanding. Stone and Gitelson also focused on the beneficial aspects of the affective link to the analyst and the important function served by this link in facilitating understanding of the analyst's interpretation. At the Edinburgh conference, however, aside from Gitelson and Nacht, who viewed attachment as an integrating or structuring aspect of the analytic process, the participants placed their confidence almost completely on "understanding" strictly through interpretation. In the latest debate between the proponents of self psychology and the object relations approach proposed by Kernberg, many aspects of these previous discussions and controversies have resurfaced (Friedman, 1978). Kohut, utilizing Freud's concept that links gratification and minimal frustration together as the developer of structure, relied on the empathic bond between patient and analyst as a basic component of the process of cure. Kernberg, however, relying predominantly on the conveying of insight through interpretation, is suspicious that this emphasis on attachment might reduce the clarity of understanding and in general prevent meaningful change from occurring. This sounds very much like the reaction of most participants of the Edinburgh symposium to the proposals of Gitelson.(ABSTRACT TRUNCATED AT 400 WORDS)

  13. Validation in the clinical process: four settings for objectification of the subjectivity of understanding.

    PubMed

    Beland, H

    1994-12-01

    Clinical material is presented for discussion with the aim of exemplifying the author's conceptions of validation in a number of sessions and in psychoanalytic research and of making them verifiable, susceptible to consensus and/or falsifiable. Since Freud's postscript to the Dora case, the first clinical validation in the history of psychoanalysis, validation has been group-related and society-related, that is to say, it combines the evidence of subjectivity with the consensus of the research community (the scientific community). Validation verifies the conformity of the unconscious transference meaning with the analyst's understanding. The deciding criterion is the patient's reaction to the interpretation. In terms of the theory of science, validation in the clinical process corresponds to experimental testing of truth in the sphere of inanimate nature. Four settings of validation can be distinguished: the analyst's self-supervision during the process of understanding, which goes from incomprehension to comprehension (container-contained, PS-->D, selected fact); the patient's reaction to the interpretation (insight) and the analyst's assessment of the reaction; supervision and second thoughts; and discussion in groups and publications leading to consensus. It is a peculiarity of psychoanalytic research that in the event of positive validation the three criteria of truth (evidence, consensus and utility) coincide.

  14. The CPAT 2.0.2 Domain Model - How CPAT 2.0.2 "Thinks" From an Analyst Perspective.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waddell, Lucas; Muldoon, Frank; Melander, Darryl J.

    To help effectively plan the management and modernization of their large and diverse fleets of vehicles, the Program Executive Office Ground Combat Systems (PEO GCS) and the Program Executive Office Combat Support and Combat Service Support (PEO CS&CSS) commissioned the development of a large-scale portfolio planning optimization tool. This software, the Capability Portfolio Analysis Tool (CPAT), creates a detailed schedule that optimally prioritizes the modernization or replacement of vehicles within the fleet--respecting numerous business rules associated with fleet structure, budgets, industrial base, research and testing, etc., while maximizing overall fleet performance through time. This report contains a description of the organizational fleet structure and a thorough explanation of the business rules that the CPAT formulation follows involving performance, scheduling, production, and budgets. This report, which is an update to the original CPAT domain model published in 2015 (SAND2015-4009), covers important new CPAT features.
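    To give a flavor of the "maximize fleet performance subject to a budget and business rules" problem that CPAT formulates at far larger scale, here is a deliberately tiny greedy sketch in Python; the vehicle families, costs, gains, and budget are invented, and CPAT itself uses a full optimization formulation rather than a greedy heuristic:

        # Toy budget-constrained modernization prioritization (illustrative only).
        budget = 900.0   # hypothetical annual budget, $M
        candidates = [
            # (vehicle family, upgrade cost $M, performance gain, earliest start year)
            ("IFV-A",      400.0, 9.0, 2026),
            ("Tank-B",     350.0, 7.5, 2025),
            ("APC-C",      200.0, 4.0, 2025),
            ("Recovery-D", 150.0, 2.0, 2027),
        ]

        # Greedy: fund the best performance-per-dollar first until the budget is spent.
        remaining = budget
        plan = []
        for name, cost, gain, start in sorted(candidates, key=lambda c: c[2] / c[1], reverse=True):
            if cost <= remaining:
                plan.append((name, start))
                remaining -= cost

        print("Funded this cycle:", plan, "| unspent ($M):", remaining)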

  15. Cybersim: geographic, temporal, and organizational dynamics of malware propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santhi, Nandakishore; Yan, Guanhua; Eidenbenz, Stephan

    2010-01-01

    Cyber-infractions into a nation's strategic security envelope pose a constant and daunting challenge. We present the modular CyberSim tool which has been developed in response to the need to realistically simulate, at a national level, software vulnerabilities and resulting malware propagation in online social networks. The CyberSim suite (a) can generate realistic scale-free networks from a database of geocoordinated computers to closely model social networks arising from personal and business email contacts and online communities; (b) maintains for each host a list of installed software, along with the latest published vulnerabilities; (d) allows designated initial nodes where malware gets introduced; (e) simulates, using distributed discrete event-driven technology, the spread of malware exploiting a specific vulnerability, with packet delay and user online behavior models; and (f) provides a graphical visualization of the spread of infection, its severity, businesses affected, etc., to the analyst. We present sample simulations on a national-level network with millions of computers.
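    A minimal sketch of malware spreading over a scale-free contact network, in the spirit of the simulations described (but far simpler than CyberSim's distributed discrete-event engine, and with invented parameters), using Python and networkx:

        # Minimal malware-spread sketch on a scale-free contact network (illustrative;
        # the infection probability, network size, and seed node are hypothetical).
        import random
        import networkx as nx

        random.seed(1)
        G = nx.barabasi_albert_graph(n=5000, m=3)     # scale-free email/contact graph

        p_infect = 0.05                               # hypothetical per-contact probability
        infected = {0}                                # designated initial node
        for step in range(20):
            new = set()
            for u in infected:
                for v in G.neighbors(u):
                    if v not in infected and random.random() < p_infect:
                        new.add(v)
            infected |= new
            print(f"step {step:2d}: {len(infected)} hosts infected")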

  16. W-026, Waste Receiving and Processing Facility data management system validation and verification report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmer, M.E.

    1997-12-05

    This V and V Report includes analysis of two revisions of the DMS [data management system] System Requirements Specification (SRS) and the Preliminary System Design Document (PSDD); the source code for the DMS Communication Module (DMSCOM) messages; the source code for selected DMS screens; and the code for the BWAS Simulator. BDM Federal analysts used a series of matrices to: compare the requirements in the System Requirements Specification (SRS) to the specifications found in the System Design Document (SDD), to ensure the design supports the business functions; compare the discrete parts of the SDD with each other, to ensure that the design is consistent and cohesive; compare the source code of the DMS Communication Module with the specifications, to ensure that the resultant messages will support the design; compare the source code of selected screens to the specifications, to ensure that the resultant system screens will support the design; and compare the source code of the BWAS simulator with the requirements for interfacing with DMS messages and data transfers relating to the BWAS operations.
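    The requirements-to-design comparison performed with matrices can be pictured as a toy traceability check in Python; the requirement and design-element identifiers below are invented, and the real V&V matrices covered much more than identifier coverage:

        # Toy traceability check: every SRS requirement should map to at least one
        # SDD design element (illustrative; identifiers are invented).
        srs_requirements = {"DMS-001", "DMS-002", "DMS-003", "DMS-004"}
        sdd_trace = {
            "SCR-LOGIN": {"DMS-001"},
            "MSG-BWAS":  {"DMS-002", "DMS-003"},
        }

        covered = set().union(*sdd_trace.values())
        untraced = srs_requirements - covered
        print("Requirements with no design coverage:", sorted(untraced))  # ['DMS-004']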

  17. Inserting Phase Change Lines into Microsoft Excel® Graphs.

    PubMed

    Dubuque, Erick M

    2015-10-01

    Microsoft Excel® is a popular graphing tool used by behavior analysts to visually display data. However, this program is not always friendly to the graphing conventions used by behavior analysts. For example, adding phase change lines has typically been a cumbersome process involving the insertion of line objects that do not move when new data is added to a graph. The purpose of this article is to describe a novel way to add phase change lines that move when new data is added and when graphs are resized.
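    An analogous approach outside Excel (not the article's technique) is to draw the phase change line in data coordinates so it stays anchored as new sessions are appended; a minimal matplotlib sketch with invented data:

        # Behavior-analytic style time-series plot with a phase change line drawn in
        # data coordinates (illustrative only; the data values are invented).
        import matplotlib.pyplot as plt

        sessions = list(range(1, 13))
        responses = [2, 3, 2, 4, 3, 9, 11, 12, 10, 13, 12, 14]
        phase_change_after = 5          # baseline ends after session 5

        fig, ax = plt.subplots()
        ax.plot(sessions, responses, marker="o", color="black")
        ax.axvline(phase_change_after + 0.5, linestyle="--", color="black")  # phase line
        ax.set_xlabel("Session")
        ax.set_ylabel("Responses per minute")
        fig.savefig("phase_change_example.png", dpi=150)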

  18. The case of David: on the couch for sixty minutes, nine years of once-a-week treatment.

    PubMed

    Kavaler-Adler, Susan

    2005-06-01

    This paper illustrates a unique case of object relations psychoanalytic psychotherapy on a once-a-week treatment basis. The work of developmental mourning that would be thought to require two to five sessions a week was accomplished on a once-a-week basis. The analyst adjusted the treatment hour, in this one case, to 60 minutes, as opposed to the 45- or 50-minute hour. When treatment began, the analyst made an intuitive judgment to increase the patient's one session a week--which the patient made clear was all he was ready to do--to 60 minutes. The analyst made time in her practice for this 60-minute session and has continued with the patient using this format for 9 years of treatment. This has led up to the current stage of treatment, which has been so critical to the patient's self-integration process.

  19. Lacie phase 1 Classification and Mensuration Subsystem (CAMS) rework experiment

    NASA Technical Reports Server (NTRS)

    Chhikara, R. S.; Hsu, E. M.; Liszcz, C. J.

    1976-01-01

    An experiment was designed to test the ability of the Classification and Mensuration Subsystem rework operations to improve wheat proportion estimates for segments that had been processed previously. Sites selected for the experiment included three in Kansas and three in Texas, with the remaining five distributed in Montana and North and South Dakota. The acquisition dates were selected to be representative of imagery available in actual operations. No more than one acquisition per biophase was used, and biophases were determined by actual crop calendars. All sites were worked by each of four Analyst-Interpreter/Data Processing Analyst Teams who reviewed the initial processing of each segment and accepted or reworked it for an estimate of the proportion of small grains in the segment. Classification results, acquisitions and classification errors, and performance results between CAMS regular and ITS rework are tabulated.

  20. SafetyAnalyst

    DOT National Transportation Integrated Search

    2009-01-01

    This booklet provides an overview of SafetyAnalyst. SafetyAnalyst is a set of software tools under development to help State and local highway agencies advance their programming of site-specific safety improvements. SafetyAnalyst will incorporate sta...

  1. Effects of Motivation: Rewarding Hackers for Undetected Attacks Cause Analysts to Perform Poorly.

    PubMed

    Maqbool, Zahid; Makhijani, Nidhi; Pammi, V S Chandrasekhar; Dutt, Varun

    2017-05-01

    The aim of this study was to determine how monetary motivations influence decision making of humans performing as security analysts and hackers in a cybersecurity game. Cyberattacks are increasing at an alarming rate. As cyberattacks often cause damage to existing cyber infrastructures, it is important to understand how monetary rewards may influence decision making of hackers and analysts in the cyber world. Currently, only limited attention has been given to this area. In an experiment, participants were randomly assigned to three between-subjects conditions (n = 26 for each condition): equal payoff, where the magnitude of monetary rewards for hackers and defenders was the same; rewarding hacker, where the magnitude of monetary reward for the hacker's successful attack was 10 times the reward for the analyst's successful defense; and rewarding analyst, where the magnitude of monetary reward for the analyst's successful defense was 10 times the reward for the hacker's successful attack. In all conditions, half of the participants were human hackers playing against Nash analysts and half were human analysts playing against Nash hackers. Results revealed that monetary rewards for human hackers and analysts caused a decrease in attack and defend actions compared with the baseline. Furthermore, rewarding human hackers for undetected attacks made analysts deviate significantly from their optimal behavior. If hackers are rewarded for their undetected attack actions, then this causes analysts to deviate from optimal defend proportions. Thus, analysts need to be trained not to become overenthusiastic in defending networks. Applications of our results are to networks where the influence of monetary rewards may cause information theft and system damage.

  2. Robotics-assisted mass spectrometry assay platform enabled by open-source electronics.

    PubMed

    Chiu, Shih-Hao; Urban, Pawel L

    2015-02-15

    Mass spectrometry (MS) is an important analytical technique with numerous applications in clinical analysis, biochemistry, environmental analysis, geology and physics. Its success builds on the ability of MS to determine molecular weights of analytes and elucidate their structures. However, sample handling prior to MS requires a lot of attention and labor. In this work we aimed to automate the processing of samples for MS so that analyses could be conducted without much supervision by experienced analysts. The goal of this study was to develop a robotics and information technology-oriented platform that could control the whole analysis process including sample delivery, reaction-based assay, data acquisition, and interaction with the analyst. The proposed platform incorporates a robotic arm for handling sample vials delivered to the laboratory, and several auxiliary devices which facilitate and secure the analysis process. They include: a multi-relay board, infrared sensors, photo-interrupters, gyroscopes, force sensors, a fingerprint scanner, a barcode scanner, a touch screen panel, and an internet interface. The control of all the building blocks is achieved through implementation of open-source electronics (Arduino), and enabled by custom-written programs in the C language. The advantages of the proposed system include: low cost, simplicity, small size, as well as facile automation of sample delivery and processing without the intervention of the analyst. It is envisaged that this simple robotic system may be the forerunner of automated laboratories dedicated to mass spectrometric analysis of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.
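    As a rough host-side illustration of driving one relay channel through open-source electronics (the published platform ran custom C programs on the Arduino itself; the serial port, pin number, and timing below are hypothetical), a pyFirmata sketch:

        # Minimal relay-actuation sketch from a host computer using pyFirmata
        # (illustrative only; not the published platform's firmware or wiring).
        import time
        from pyfirmata import Arduino

        board = Arduino("/dev/ttyACM0")   # hypothetical serial port
        RELAY_PIN = 7                     # hypothetical relay channel

        board.digital[RELAY_PIN].write(1)   # energize relay: e.g., start sample delivery
        time.sleep(2.0)                     # hold for an assumed transfer time
        board.digital[RELAY_PIN].write(0)   # de-energize relay
        board.exit()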

  3. Methods of Estimating Strategic Intentions

    DTIC Science & Technology

    1982-05-01

    "Mental images of future events" which might be brought to reality. To assess this step in the intention process the analyst may have to consider the ... [Remainder is garbled front-matter residue; recoverable section headings: "Description of the Intention Estimation Process and Related Analytical Aids"; "Summary of Aids and Process"; "Bibliography".]

  4. Space shuttle/payload interface analysis. Volume 4: Business Risk and Value of Operations in Space (BRAVO). Part 3: Workbook

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A collection of blank worksheets for use on each BRAVO problem to be analyzed is supplied, for the purposes of recording the inputs for the BRAVO analysis, working out the definition of mission equipment, recording inputs to the satellite synthesis computer program, estimating satellite earth station costs, costing terrestrial systems, and cost effectiveness calculations. The group of analysts working BRAVO will normally use a set of worksheets on each problem; however, the workbook pages are of sufficiently good quality that the user can duplicate them if more worksheet blanks are required than supplied. For Vol. 1, see N74-12493; for Vol. 2, see N74-14530.

  5. NREL Develops OpenEI.org, a Public Website Where Energy Data can be Generated, Shared, and Compared (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2013-12-01

    The National Renewable Energy Laboratory (NREL) has developed OpenEI.org, a public, open, data-sharing platform where consumers, analysts, industry experts, and energy decision makers can go to boost their energy IQs, search for energy data, share data, and get access to energy applications. The free site blends elements of social media, linked open-data practices, and MediaWiki-based technology to build a collaborative environment for creating and sharing energy data with the world. The result is a powerful platform that is helping government and industry leaders around the world define policy options, make informed investment decisions, and create new businesses.

  6. IT Security Support for the Spaceport Command Control System Development

    NASA Technical Reports Server (NTRS)

    Varise, Brian

    2014-01-01

    My job title is IT Security support for the Spaceport Command & Control System Development. As a cyber-security analyst it is my job to ensure NASA's information stays safe from cyber threats, such as viruses, malware, and denial-of-service attacks, by establishing and enforcing system access controls. Security is very important in the world of technology and it is used everywhere from personal computers to giant networks run by government agencies worldwide. Without constant monitoring and analysis, businesses, public organizations and government agencies are vulnerable to potentially harmful infiltration of their computer information systems. It is my responsibility to ensure authorized access by examining improper access, reporting violations, revoking access, monitoring information requests from new programming, and recommending improvements. My department oversees the Launch Control System and networks. An audit will be conducted for the LCS based on compliance with the Federal Information Security Management Act (FISMA) and the National Institute of Standards and Technology (NIST) guidelines. I recently finished analyzing the SANS top 20 critical controls to give cost-effective recommendations on various software and hardware products for compliance. Upon completion of this internship, I will have successfully completed my duties as well as gained knowledge that will be helpful to my career in the future as a Cyber Security Analyst.

  7. Proposed Conceptual Requirements for the CTBT Knowledge Base,

    DTIC Science & Technology

    1995-08-14

    knowledge available to automated processing routines and human analysts are significant, and solving these problems is an essential step in ensuring...knowledge storage in a CTBT system. In addition to providing regional knowledge to automated processing routines, the knowledge base will also address

  8. ASPECTS: an automation-assisted SPE method development system.

    PubMed

    Li, Ming; Chou, Judy; King, Kristopher W; Yang, Liyu

    2013-07-01

    A typical conventional SPE method development (MD) process usually involves deciding the chemistry of the sorbent and eluent based on information about the analyte; experimentally preparing and trying out various combinations of adsorption chemistry and elution conditions; quantitatively evaluating the various conditions; and comparing quantitative results from all combinations of conditions to select the best condition for method qualification. The second and fourth steps have mostly been performed manually until now. We developed an automation-assisted system that expedites the conventional SPE MD process by automating 99% of the second step, and expedites the fourth step by automatically processing the results data and presenting it to the analyst in a user-friendly format. The automation-assisted SPE MD system greatly reduces the manual labor in SPE MD work, prevents analyst errors from causing misinterpretation of quantitative results, and shortens data analysis and interpretation time.
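    The automated "fourth step" (collating quantitative results across all sorbent/eluent combinations and surfacing the best condition) can be sketched with pandas; this is not the ASPECTS implementation, and the recoveries, CV cutoff, and condition names are invented:

        # Toy version of the 'compare all conditions' step (illustrative only).
        import pandas as pd

        results = pd.DataFrame(
            {
                "sorbent":  ["C18", "C18", "HLB", "HLB", "MCX", "MCX"],
                "eluent":   ["MeOH", "ACN", "MeOH", "ACN", "MeOH", "ACN"],
                "recovery": [62.0, 71.0, 88.0, 84.0, 93.0, 79.0],   # percent
                "cv":       [9.0, 7.5, 4.2, 5.1, 3.8, 6.9],         # percent RSD
            }
        )

        # Keep reproducible conditions (hypothetical CV cutoff), then rank by recovery.
        acceptable = results[results["cv"] <= 6.0]
        best = acceptable.sort_values("recovery", ascending=False).iloc[0]
        print("Condition recommended for qualification:")
        print(best)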

  9. Supporting Handoff in Asynchronous Collaborative Sensemaking Using Knowledge-Transfer Graphs.

    PubMed

    Zhao, Jian; Glueck, Michael; Isenberg, Petra; Chevalier, Fanny; Khan, Azam

    2018-01-01

    During asynchronous collaborative analysis, handoff of partial findings is challenging because externalizations produced by analysts may not adequately communicate their investigative process. To address this challenge, we developed techniques to automatically capture and help encode tacit aspects of the investigative process based on an analyst's interactions, and streamline explicit authoring of handoff annotations. We designed our techniques to mediate awareness of analysis coverage, support explicit communication of progress and uncertainty with annotation, and implicit communication through playback of investigation histories. To evaluate our techniques, we developed an interactive visual analysis system, KTGraph, that supports an asynchronous investigative document analysis task. We conducted a two-phase user study to characterize a set of handoff strategies and to compare investigative performance with and without our techniques. The results suggest that our techniques promote the use of more effective handoff strategies, help increase an awareness of prior investigative process and insights, as well as improve final investigative outcomes.
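    One simple way to picture the captured investigation history is as a small directed graph linking an analyst's actions to the documents they touched, which a successor could replay; a toy networkx sketch with invented events (KTGraph's capture model is considerably richer):

        # Toy 'investigation history' graph for handoff playback (illustrative only;
        # document names and actions are invented).
        import networkx as nx

        hist = nx.DiGraph()
        events = [
            ("open",      "doc_12"),
            ("highlight", "doc_12"),
            ("annotate",  "doc_12"),
            ("open",      "doc_31"),
        ]
        prev = None
        for i, (action, doc) in enumerate(events):
            node = f"e{i}:{action}"
            hist.add_node(node, document=doc)
            if prev is not None:
                hist.add_edge(prev, node)        # temporal order for playback
            prev = node

        touched = {d["document"] for _, d in hist.nodes(data=True)}
        print("Documents covered so far:", sorted(touched))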

  10. A reconsideration of the clinical work of Harold Searles.

    PubMed

    Benatar, May

    2008-01-01

    A rereading of the work of Harold Searles in light of the more contemporary paradigm of dissociation refreshes our insights into transference and countertransference phenomena with patients who rely heavily on dissociation. Searles's perspective on the dependence of the analyst (therapist) on his or her patients, the reality of patients' projections, and patients' need to cure their analyst can be helpful in resolving conundrums and impasses in the treatment of dissociative disordered patients. The reciprocity of dissociative phenomena between patients and therapists, wherein therapists must own the reality of their own dissociative processes in the therapeutic transaction, recontextualizes Searles's clinical contributions.

  11. An agile acquisition decision-support workbench for evaluating ISR effectiveness

    NASA Astrophysics Data System (ADS)

    Stouch, Daniel W.; Champagne, Valerie; Mow, Christopher; Rosenberg, Brad; Serrin, Joshua

    2011-06-01

    The U.S. Air Force is consistently evolving to support current and future operations through the planning and execution of intelligence, surveillance and reconnaissance (ISR) missions. However, it is a challenge to maintain a precise awareness of current and emerging ISR capabilities to properly prepare for future conflicts. We present a decision-support tool for acquisition managers to empirically compare ISR capabilities and approaches to employing them, thereby enabling the DoD to acquire ISR platforms and sensors that provide the greatest return on investment. We have developed an analysis environment to perform modeling and simulation-based experiments to objectively compare alternatives. First, the analyst specifies an operational scenario for an area of operations by providing terrain and threat information; a set of nominated collections; sensor and platform capabilities; and processing, exploitation, and dissemination (PED) capacities. Next, the analyst selects and configures ISR collection strategies to generate collection plans. The analyst then defines customizable measures of effectiveness or performance to compute during the experiment. Finally, the analyst empirically compares the efficacy of each solution and generates concise reports to document their conclusions, providing traceable evidence for acquisition decisions. Our capability demonstrates the utility of using a workbench environment for analysts to design and run experiments. Crafting impartial metrics enables the acquisition manager to focus on evaluating solutions based on specific military needs. Finally, the metric and collection plan visualizations provide an intuitive understanding of the suitability of particular solutions. This facilitates a more agile acquisition strategy that handles rapidly changing technology in response to current military needs.
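    The analyst-defined measure-of-effectiveness comparison can be pictured with a tiny weighted-scoring sketch in Python; the candidate plans, metrics, and weights below are invented and are not the workbench's built-in measures:

        # Toy measure-of-effectiveness (MOE) comparison across candidate collection
        # plans (illustrative only; numbers and weights are invented).
        plans = {
            "Plan A (HALE UAS heavy)": {"coverage": 0.82, "timeliness": 0.60, "ped_load": 0.90},
            "Plan B (mixed fleet)":    {"coverage": 0.74, "timeliness": 0.78, "ped_load": 0.65},
        }
        weights = {"coverage": 0.5, "timeliness": 0.3, "ped_load": -0.2}   # penalize PED load

        def moe(metrics):
            return sum(weights[k] * metrics[k] for k in weights)

        for name, metrics in sorted(plans.items(), key=lambda kv: moe(kv[1]), reverse=True):
            print(f"{name}: MOE = {moe(metrics):.3f}")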

  12. Employing socially driven techniques for framing, contextualization, and collaboration in complex analytical threads

    NASA Astrophysics Data System (ADS)

    Wollocko, Arthur; Danczyk, Jennifer; Farry, Michael; Jenkins, Michael; Voshell, Martin

    2015-05-01

    The proliferation of sensor technologies continues to impact Intelligence Analysis (IA) work domains. Historical procurement focus on sensor platform development and acquisition has resulted in increasingly advanced collection systems; however, such systems often demonstrate classic data overload conditions by placing increased burdens on already overtaxed human operators and analysts. Support technologies and improved interfaces have begun to emerge to ease that burden, but these often focus on single modalities or sensor platforms rather than underlying operator and analyst support needs, resulting in systems that do not adequately leverage their natural human attentional competencies, unique skills, and training. One particular reason why emerging support tools often fail is due to the gap between military applications and their functions, and the functions and capabilities afforded by cutting edge technology employed daily by modern knowledge workers who are increasingly "digitally native." With the entry of Generation Y into these workplaces, "net generation" analysts, who are familiar with socially driven platforms that excel at giving users insight into large data sets while keeping cognitive burdens at a minimum, are creating opportunities for enhanced workflows. By using these ubiquitous platforms, net generation analysts have trained skills in discovering new information socially, tracking trends among affinity groups, and disseminating information. However, these functions are currently under-supported by existing tools. In this paper, we describe how socially driven techniques can be contextualized to frame complex analytical threads throughout the IA process. This paper focuses specifically on collaborative support technology development efforts for a team of operators and analysts. Our work focuses on under-supported functions in current working environments, and identifies opportunities to improve a team's ability to discover new information and disseminate insightful analytic findings. We describe our Cognitive Systems Engineering approach to developing a novel collaborative enterprise IA system that combines modern collaboration tools with familiar contemporary social technologies. Our current findings detail specific cognitive and collaborative work support functions that defined the design requirements for a prototype analyst collaborative support environment.

  13. Projective identification and consciousness alteration: a bridge between psychoanalysis and neuroscience?

    PubMed

    Cimino, Cristiana; Correale, Antonello

    2005-02-01

    The authors claim that projective identification in the process of analysis should be considered in a circumscribed manner and seen as a very specific type of communication between the patient and the analyst, characterised through a modality that is simultaneously active, unconscious and discrete. In other words, the patient actively, though unconsciously and discretely--that is, in specific moments of the analysis--brings about particular changes in the analyst's state. From the analyst's side, the effect of this type of communication is a sudden change in his general state--a sense of passivity and coercion and a change in the state of consciousness. This altered consciousness can range from an almost automatic repetition of a relational script to a moderate or serious contraction of the field of attention to full-fledged changes in the analyst's sense of self. The authors propose the theory that this type of communication is, in fact, the expression of traumatic contents of experiences emerging from the non-declarative memory. These contents belong to a pre-symbolic and pre-representative area of the mind. They are made of inert fragments of psychic material that are felt rather than thought, which can thus be viewed as a kind of writing to be completed. These pieces of psychic material are the expression of traumatic experiences that in turn exercise a traumatic effect on the analyst, inducing an altered state of consciousness in him as well. Such material should be understood as belonging to an unrepressed unconscious. Restitution of these fragments to the patient in representable forms must take place gradually and without trying to accelerate the timing, in order to avoid the possibility that the restitution itself would constitute an acting on the part of the analyst, which would thus be a traumatic response to the traumatic action of the analytic material.

  14. MetaboAnalystR: an R package for flexible and reproducible analysis of metabolomics data.

    PubMed

    Chong, Jasmine; Xia, Jianguo

    2018-06-28

    The MetaboAnalyst web application has been widely used for metabolomics data analysis and interpretation. Despite its user-friendliness, the web interface has presented its inherent limitations (especially for advanced users) with regard to flexibility in creating customized workflow, support for reproducible analysis, and capacity in dealing with large data. To address these limitations, we have developed a companion R package (MetaboAnalystR) based on the R code base of the web server. The package has been thoroughly tested to ensure that the same R commands will produce identical results from both interfaces. MetaboAnalystR complements the MetaboAnalyst web server to facilitate transparent, flexible and reproducible analysis of metabolomics data. MetaboAnalystR is freely available from https://github.com/xia-lab/MetaboAnalystR. Supplementary data are available at Bioinformatics online.

  15. Simplified Processing Method for Meter Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowler, Kimberly M.; Colotelo, Alison H. A.; Downs, Janelle L.

    2015-11-01

    A simple, quick method for processing metered data that can be used for Army Metered Data Management System (MDMS) and Logistics Innovation Agency data, but may also be useful for other large data sets. It is intended for large data sets where the analyst has little information about the buildings.
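    A minimal pandas sketch of the kind of quick-look aggregation such a method targets, assuming a simple interval-data layout (the file name, column names, and completeness cutoff are hypothetical, not the MDMS export format):

        # Quick-look aggregation of interval meter data (illustrative only).
        import pandas as pd

        df = pd.read_csv("meter_readings.csv", parse_dates=["timestamp"])
        df = df.set_index("timestamp")

        # Monthly electricity use per building, skipping buildings with sparse data.
        monthly = df.groupby("building_id")["kwh"].resample("MS").sum().unstack(0)
        counts = df.groupby("building_id")["kwh"].count()
        keep = counts[counts >= 1000].index          # hypothetical completeness cutoff
        print(monthly[keep].describe())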

  16. The Policy Formation Process: A Conceptual Framework for Analysis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Fuchs, E. F.

    1972-01-01

    A conceptual framework for analysis which is intended to assist both the policy analyst and the policy researcher in their empirical investigations into policy phenomena is developed. It is meant to facilitate understanding of the policy formation process by focusing attention on the basic forces shaping the main features of policy formation as a dynamic social-political-organizational process. The primary contribution of the framework lies in its capability to suggest useful ways of looking at policy formation reality. It provides the analyst and the researcher with a group of indicators which suggest where to look and what to look for when attempting to analyze and understand the mix of forces which energize, maintain, and direct the operation of strategic level policy systems. The framework also highlights interconnections, linkage, and relational patterns between and among important variables. The framework offers an integrated set of conceptual tools which facilitate understanding of and research on the complex and dynamic set of variables which interact in any major strategic level policy formation process.

  17. Retro-Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrell, Paul; Hanson, Paige; Ardi, Calvin

    2016-11-04

    A system for processing network packet capture streams, extracting metadata and generating flow records (via Argus). The system can be used by network security operators and analysts to enable forensic investigations for network security events.
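    For illustration only, a small Python/scapy sketch that rolls a packet capture up into flow-like byte counts; this is neither Argus nor the Retro-Future code, and the capture file name is hypothetical:

        # Illustrative flow summarization from a pcap using scapy.
        from collections import Counter
        from scapy.all import rdpcap, IP

        flows = Counter()
        for pkt in rdpcap("capture.pcap"):          # hypothetical capture file
            if IP in pkt:
                key = (pkt[IP].src, pkt[IP].dst, pkt[IP].proto)
                flows[key] += len(pkt)              # accumulate bytes per flow key

        for (src, dst, proto), nbytes in flows.most_common(10):
            print(f"{src} -> {dst} proto={proto}: {nbytes} bytes")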

  18. How the analytic process is captured and absorbed into the familiar, the feared, and the desired.

    PubMed

    Waska, Robert

    2012-10-01

    In their transference efforts to maintain psychic equilibrium (Joseph, 1989), some patients will do their best to convert their analysts into familiar, dreaded, or desired internal objects which they then react to or relate to. The interpersonal, interactional, and intrapsychic pull for the absorption and utilization of the analyst into a predesigned and pathologically limited figure creates countertransference struggles and phases of enactment that can go unnoticed, denied, or justified. Even when analysts maintain their analytic balance, the patient can manipulate, mishear, and transform words, actions, and intentions into very specific archaic objects or part objects. Case material is used to illustrate the way in which patients attempt to turn the analytic process and the therapeutic relationship into an acting out of wished-for or painfully familiar self and object interactions. This method of subsuming the analytic method can be quite subtle, or it can be very obvious but still extremely difficult to shift, interpret, or recover from. Indeed, the analyst can easily be drawn into this perversion of analytic procedure and end up participating in various enactments. With such patients, the nature of the unconscious fantasies projected into the transference matrix and the intensity of the patient's object relational conflicts almost guarantee some degree of ongoing countertransference acting out. So, the ongoing and repetitive interpretive style needed with such patients is both helpful and healing, even as it often becomes a contribution to the fundamental pathology the patient repeats in the clinical setting. Although the transference dynamic being examined could be understood from a number of theoretical perspectives, the author focuses on the Kleinian psychoanalytic method.

  19. A Graph is Worth a Thousand Words: How Overconfidence and Graphical Disclosure of Numerical Information Influence Financial Analysts Accuracy on Decision Making

    PubMed Central

    Leite, Rodrigo Oliveira; de Aquino, André Carlos Busanelli

    2016-01-01

    Previous research supports the view that graphs are relevant decision aids for tasks related to the interpretation of numerical information. Moreover, the literature shows that different types of graphical information can help or harm the accuracy of decision making by accountants and financial analysts. We conducted a 4×2 mixed-design experiment to examine the effects of numerical information disclosure on financial analysts' accuracy, and investigated the role of overconfidence in decision making. Results show that, compared to text, column graphs enhanced accuracy in decision making, followed by line graphs. No difference was found between table and textual disclosure. Overconfidence harmed accuracy, and both genders behaved overconfidently. Additionally, the type of disclosure (text, table, line graph and column graph) did not affect the overconfidence of individuals, providing evidence that overconfidence is a personal trait. This study makes three contributions. First, it provides evidence from a larger sample size (295) of financial analysts, instead of a smaller sample of students, that graphs are relevant decision aids for tasks related to the interpretation of numerical information. Second, it uses text as a baseline comparison to test how different ways of disclosing information (line and column graphs, and tables) can enhance understandability of information. Third, it brings an internal factor to this process: overconfidence, a personal trait that harms the decision-making process of individuals. At the end of this paper several research paths are highlighted to further study the effect of internal factors (personal traits) on financial analysts' accuracy in decision making regarding numerical information presented in graphical form. In addition, we offer suggestions concerning some practical implications for professional accountants, auditors, financial analysts and standard setters. PMID:27508519

  20. Analysis and Characterization | Bioenergy | NREL

    Science.gov Websites

    NREL's team of bioenergy analysts ... [Remainder of this web page record is navigation and image-caption residue: "equipment in a lab"; "Biomass Characterization"; "Photo of NREL's Biochemical Process Development Unit".]

  1. Ceci n'est pas une micromachine.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yarberry, Victor R.; Diegert, Carl F.

    2010-03-01

    The image created in reflected light DIC can often be interpreted as a true three-dimensional representation of the surface geometry, provided a clear distinction can be realized between raised and lowered regions in the specimen. It may be helpful if our definition of saliency embraces work on the human visual system (HVS) as well as the more abstract work on saliency, as it is certain that understanding by humans will always stand between recording of a useful signal from all manner of sensors and so-called actionable intelligence. A DARPA/DSO program lays down this requirement in a current program (Kruse 2010): The vision for the Neurotechnology for Intelligence Analysts (NIA) Program is to revolutionize the way that analysts handle intelligence imagery, increasing both the throughput of imagery to the analyst and overall accuracy of the assessments. Current computer-based target detection capabilities cannot process vast volumes of imagery with the speed, flexibility, and precision of the human visual system.

  2. Behavior analysis and the study of human aging

    PubMed Central

    Derenne, Adam; Baron, Alan

    2002-01-01

    As the population of older adults continues to rise, psychologists along with other behavioral and social scientists have shown increasing interest in this age group. Although behavior analysts have contributed to research on aging, the focus has been on applications that remedy age-related deficits, rather than a concern with aging as a developmental process. In particular, there has been little interest in the central theoretical questions that have guided gerontologists. How does behavior change with advancing years, and what are the sources of those changes? We consider the possibility that this neglect reflects the long-standing commitment of behavior analysts to variables that can be experimentally manipulated, a requirement that excludes the key variable—age itself. We review the options available to researchers and present strategies that minimize deviations from the traditional features of behavior-analytic designs. Our comments are predicated on the view that aging issues within contemporary society are far too important for behavior analysts to ignore. PMID:22478383

  3. Real-Time Visualization of Network Behaviors for Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.

    Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.
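    A toy example of the "baseline versus off-normal" idea behind such in-stream analytics (not the deployed system's models; the counts and thresholds are invented):

        # Toy rolling-baseline check for off-normal traffic volume (illustrative only).
        from collections import deque

        window = deque(maxlen=60)          # last 60 one-minute connection counts

        def check(count, threshold=3.0):
            """Flag counts more than `threshold` std deviations above the rolling mean."""
            if len(window) >= 10:
                mean = sum(window) / len(window)
                var = sum((x - mean) ** 2 for x in window) / len(window)
                std = var ** 0.5 or 1.0
                if count > mean + threshold * std:
                    print(f"off-normal: {count} connections/min (baseline {mean:.0f})")
            window.append(count)

        for c in [100, 96, 104, 98, 101, 99, 103, 97, 102, 100, 480]:
            check(c)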

  4. Privacy and disclosure in psychoanalysis.

    PubMed

    Kantrowitz, Judy L

    2009-08-01

    The tension between privacy and disclosure in psychoanalysis operates in various ways in analyst, supervisee, and supervisor. Analysts need to maintain the privacy of their patients by keeping their material confidential; they also need to know and share their own internal conscious conflicts to be able to discover unconscious conflicts and their characterological ramifications. Clinical writing is one vehicle for the exploration, discovery, and communication of transference-countertransference issues and other conflicts stimulated by clinical work, but it does not provide the perspective that comes from sharing with another person. Telling a trusted colleague what we think and feel in relation to our patients and ourselves enables us to see our blind spots, as well as providing perspective and affect containment in our work. Mutuality in peer supervision tends to reduce the transference. The special problems of privacy and disclosure in psychoanalytic training are addressed, as are the ways the analyst's belief in maintaining privacy may affect the analytic process and therapeutic relationship.

  5. A Survey of Functional Behavior Assessment Methods Used by Behavior Analysts in Practice

    ERIC Educational Resources Information Center

    Oliver, Anthony C.; Pratt, Leigh A.; Normand, Matthew P.

    2015-01-01

    To gather information about the functional behavior assessment (FBA) methods behavior analysts use in practice, we sent a web-based survey to 12,431 behavior analysts certified by the Behavior Analyst Certification Board. Ultimately, 724 surveys were returned, with the results suggesting that most respondents regularly use FBA methods, especially…

  6. Knowledge Building in Asynchronous Discussion Groups: Going Beyond Quantitative Analysis

    ERIC Educational Resources Information Center

    Schrire, Sarah

    2006-01-01

    This contribution examines the methodological challenges involved in defining the collaborative knowledge-building processes occurring in asynchronous discussion and proposes an approach that could advance understanding of these processes. The written protocols that are available to the analyst provide an exact record of the instructional…

  7. X-Ray Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-20

    Radiographic Image Acquisition & Processing Software for Security Markets. Used in operation of commercial x-ray scanners and manipulation of x-ray images for emergency responders including State, Local, Federal, and US Military bomb technicians and analysts.

  8. Does Educational Preparation Match Professional Practice: The Case of Higher Education Policy Analysts

    ERIC Educational Resources Information Center

    Arellano, Eduardo C.; Martinez, Mario C.

    2009-01-01

    This study compares the extent to which higher education policy analysts and master's and doctoral faculty of higher education and public affairs programs match on a set of competencies thought to be important to higher education policy analysis. Analysts matched master's faculty in three competencies while analysts and doctoral faculty matched in…

  9. The Variability of Crater Identification Among Expert and Community Crater Analysts

    NASA Astrophysics Data System (ADS)

    Robbins, S. J.; Antonenko, I.; Kirchoff, M. R.; Chapman, C. R.; Fassett, C. I.; Herrick, R. R.; Singer, K.; Zanetti, M.; Lehan, C.; Huang, D.; Gay, P.

    2014-04-01

    Statistical studies of impact crater populations have been used to model ages of planetary surfaces for several decades [1]. This assumes that crater counts are approximately invariant and a "correct" population will be identified if the analyst is skilled and diligent. However, the reality is that crater identification is somewhat subjective, so variability between analysts, or even a single analyst's variation from day-to-day, is expected [e.g., 2, 3]. This study was undertaken to quantify that variability within an expert analyst population and between experts and minimally trained volunteers.

  10. SAFARI, an On-Line Text-Processing System User's Manual.

    ERIC Educational Resources Information Center

    Chapin, P.G.; And Others.

    This report describes for the potential user a set of procedures for processing textual materials on-line. In this preliminary model an information analyst can scan through messages, reports, and other documents on a display scope and select relevant facts, which are processed linguistically and then stored in the computer in the form of logical…

  11. Conducting Qualitative Data Analysis: Qualitative Data Analysis as a Metaphoric Process

    ERIC Educational Resources Information Center

    Chenail, Ronald J.

    2012-01-01

    In the second of a series of "how-to" essays on conducting qualitative data analysis, Ron Chenail argues the process can best be understood as a metaphoric process. From this orientation he suggests researchers follow Kenneth Burke's notion of metaphor and see qualitative data analysis as the analyst systematically considering the "this-ness" of…

  12. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  13. A New Process for Organizing Assessments of Social, Economic, and Environmental Outcomes: Case Study of Wildland Fire Management in the USA

    EPA Science Inventory

    Ecological risk assessments typically are organized using the processes of planning (a discussion among managers, stakeholders, and analysts to clarify ecosystem management goals and assessment scope) and problem formulation (evaluation of existing information to generate hypothe...

  14. Test and Evaluation of Neural Network Applications for Seismic Signal Discrimination

    DTIC Science & Technology

    1992-09-28

    IMS) for automated processing and interpretation of regional seismic data. Also reported is the result of a preliminary study on the application of...of analyst-verified events that were missed by the automated processing decreased by more than a factor of 2 (about 10 events/week). The second

  15. Development of a Comprehensive Database System for Safety Analyst

    PubMed Central

    Paz, Alexander; Veeramisti, Naveen; Khanal, Indira; Baker, Justin

    2015-01-01

    This study addressed barriers associated with the use of Safety Analyst, a state-of-the-art tool that has been developed to assist during the entire Traffic Safety Management process but that is not widely used due to a number of challenges as described in this paper. As part of this study, a comprehensive database system and tools to provide data to multiple traffic safety applications, with a focus on Safety Analyst, were developed. A number of data management tools were developed to extract, collect, transform, integrate, and load the data. The system includes consistency-checking capabilities to ensure the adequate insertion and update of data into the database. This system focused on data from roadways, ramps, intersections, and traffic characteristics for Safety Analyst. To test the proposed system and tools, data from Clark County, which is the largest county in Nevada and includes the cities of Las Vegas, Henderson, Boulder City, and North Las Vegas, was used. The database and Safety Analyst together help identify the sites with the potential for safety improvements. Specifically, this study examined the results from two case studies. The first case study, which identified sites having a potential for safety improvements with respect to fatal and all injury crashes, included all roadway elements and used default and calibrated Safety Performance Functions (SPFs). The second case study identified sites having a potential for safety improvements with respect to fatal and all injury crashes, specifically regarding intersections; it used default and calibrated SPFs as well. Conclusions were developed for the calibration of safety performance functions and the classification of site subtypes. Guidelines were provided about the selection of a particular network screening type or performance measure for network screening. PMID:26167531
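    As an illustration of the consistency-checking step described above, the sketch below runs a basic validity check over a hypothetical roadway-segment table; the schema, column names, and rules are invented for illustration and are not the study's actual database design.

    ```python
    # Minimal sketch of a Safety-Analyst-style input consistency check, assuming a
    # hypothetical roadway-segment table; column names and rules are illustrative only.
    import sqlite3

    def check_segments(conn: sqlite3.Connection) -> list[str]:
        """Flag records that would violate basic roadway-inventory input rules."""
        problems = []
        cur = conn.execute("SELECT segment_id, length_mi, aadt FROM segments")
        for segment_id, length_mi, aadt in cur:
            if length_mi is None or length_mi <= 0:
                problems.append(f"{segment_id}: non-positive segment length")
            if aadt is None or aadt < 0:
                problems.append(f"{segment_id}: missing or negative AADT")
        return problems

    if __name__ == "__main__":
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE segments (segment_id TEXT, length_mi REAL, aadt INTEGER)")
        conn.executemany(
            "INSERT INTO segments VALUES (?, ?, ?)",
            [("S-001", 0.52, 18400), ("S-002", -0.1, 9100), ("S-003", 1.30, None)],
        )
        for line in check_segments(conn):
            print(line)
    ```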

  16. The patient who believes and the analyst who does not (1).

    PubMed

    Lijtmaer, Ruth M

    2009-01-01

    A patient's religious beliefs and practices challenge the clinical experience and self-knowledge of the analyst owing to a great complexity of factors, and often take the form of the analyst's resistances and countertransference reactions to spiritual and religious issues. The analyst's feelings about the patient's encounters with religion and other forms of healing experiences may result in impasses and communication breakdown for a variety of reasons. These reasons include the analyst's own unresolved issues around her role as a psychoanalyst-which incorporates in some way psychoanalysis's views of religious belief-and these old conflicts may be irritated by the religious themes expressed by the patient. Vignettes from the treatments of two patients provide examples of the analyst's countertransference conflicts, particularly envy in the case of a therapist who is an atheist.

  17. Self-analysis and the development of an interpretation.

    PubMed

    Campbell, Donald

    2017-10-01

    Although Freud's self-analysis was at the centre of so many of his discoveries, self-analysis remains a complex, controversial and elusive exercise. While self-analysis is often seen as emerging at the end of an analysis and then used as a criterion in assessing suitability for termination, I try to attend to the patient's resistance to self-analysis throughout an analysis. I take the view that the development of the patient's capacity for self-analysis within the analytic session contributes to the patient's growth and their creative and independent thinking during the analysis, which prepares him or her for a fuller life after the formal analysis ends. The model I will present is based on an overlapping of the patient's and the analyst's self-analysis, with recognition and use of the analyst's counter-transference. My focus is on the analyst's self-analysis that is in response to a particular crisis of not knowing, which results in feeling intellectually and emotionally stuck. This paper is not a case study, but a brief look at the process I went through to arrive at a particular interpretation with a particular patient during a particular session. I will concentrate on resistances in which both patient and analyst initially rely upon what is consciously known. Copyright © 2017 Institute of Psychoanalysis.

  18. Contacting a 19 month-old mute autistic girl: a clinical narrative.

    PubMed

    Busch de Ahumada, Luisa C; Ahumada, Jorge L

    2015-02-01

    Conveying that psychoanalysis offers rich opportunities for the very early treatment of autistic spectrum disorders, this clinical communication unfolds the clinical process of a 19-month-old 'shell-type' encapsulated mute autistic girl. It details how, on a schedule of four sessions per week, infant Lila evolved within two years from being emotionally out-of-contact to the affective aliveness of oedipal involvement. Following Frances Tustin's emphasis on the analyst's 'quality of attention' and Justin Call's advice that in baby-mother interaction the infant is the initiator and the mother is the follower, it is described how the analyst must, amid excruciating non-response, even-mindedly sustain her attention in order to meet the child half-way at those infrequent points where flickers of initiative on her side are adumbrated. This helps attain evanescent 'moments of contact' which coalesce later into 'moments of sharing', eventually leading to acknowledgment of the analyst's humanness and a receptiveness to to-and-fro communication. Thus the 'primal dialogue' (Spitz) is reawakened and, by experiencing herself in the mirror of the analyst, the child's sense of I-ness is reinstated. As evinced by the literature, the mainstream stance rests on systematic early interpretation of the transference, which has in our view strongly deterred progress in the psychoanalytic treatment of autistic spectrum disorders. Copyright © 2014 Institute of Psychoanalysis.

  19. AEDT sensor path methods using BADA4

    DOT National Transportation Integrated Search

    2017-06-01

    This report documents the development and use of sensor path data processing in the Federal Aviation Administration's (FAA's) Aviation Environmental Design Tool (AEDT). The methods are primarily intended to assist analysts with using AEDT to determ...

  20. Microlithography and resist technology information at your fingertips via SciFinder

    NASA Astrophysics Data System (ADS)

    Konuk, Rengin; Macko, John R.; Staggenborg, Lisa

    1997-07-01

    Finding and retrieving the information you need about microlithography and resist technology in a timely fashion can make or break your competitive edge in today's business environment. Chemical Abstracts Service (CAS) provides the most complete and comprehensive database of the chemical literature in the CAplus, REGISTRY, and CASREACT files including 13 million document references, 15 million substance records and over 1.2 million reactions. This includes comprehensive coverage of positive and negative resist formulations and processing, photoacid generation, silylation, single and multilayer resist systems, photomasks, dry and wet etching, photolithography, electron-beam, ion-beam and x-ray lithography technologies and process control, optical tools, exposure systems, radiation sources and steppers. Journal articles, conference proceedings and patents related to microlithography and resist technology are analyzed and indexed by scientific information analysts with strong technical background in these areas. The full CAS database, which is updated weekly with new information, is now available at your desktop, via a convenient, user-friendly tool called 'SciFinder.' Author, subject and chemical substance searching is simplified by SciFinder's smart search features. Chemical substances can be searched by chemical structure, chemical name, CAS registry number or molecular formula. Drawing chemical structures in SciFinder is easy and does not require compliance with CA conventions. Built-in intelligence of SciFinder enables users to retrieve substances with multiple components, tautomeric forms and salts.

  1. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorensek, M.; Hamm, L.; Garcia, H.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.
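    As a toy illustration of the inverse problem mentioned above (inferring facility conditions from collected observations), the sketch below performs a naive Bayes update over invented operating modes and likelihoods; none of the categories or numbers come from the paper.

    ```python
    # Toy sketch of inferring which operating mode best explains a set of
    # observations; modes, priors, and likelihoods are purely illustrative.
    priors = {"normal_ops": 0.7, "maintenance": 0.2, "undeclared_campaign": 0.1}
    likelihoods = {   # P(observation | mode), invented numbers
        "elevated_power_draw": {"normal_ops": 0.3, "maintenance": 0.1, "undeclared_campaign": 0.8},
        "trucks_at_loading_dock": {"normal_ops": 0.4, "maintenance": 0.6, "undeclared_campaign": 0.7},
    }

    def posterior(observations: list[str]) -> dict[str, float]:
        """Naive Bayes update of the mode probabilities given the observations."""
        scores = dict(priors)
        for obs in observations:
            for mode in scores:
                scores[mode] *= likelihoods[obs][mode]
        total = sum(scores.values())
        return {mode: score / total for mode, score in scores.items()}

    print(posterior(["elevated_power_draw", "trucks_at_loading_dock"]))
    ```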

  2. Listening to and Sharing of Self in Psychoanalytic Supervision: The Supervisor's Self-Perspective.

    PubMed

    Watkins, C Edward

    2016-08-01

    Just as the analyst's self-perspective is critical to effective analytic process, the supervisor's self-perspective is accordingly critical to effective supervision process. But the supervisor's self-perspective has received virtually no attention as a listening/experiencing perspective in the psychoanalytic supervision literature. In this paper, the author defines the supervisor's self-perspective and considers five ways by which it contributes to an effective supervisory process: (1) sharing one's own impressions of/reactions to patients; (2) sharing personal disclosures about the supervisee-patient relationship; (3) sharing personal disclosures about the supervisee as a developing analytic therapist; (4) sharing personal disclosures about the supervisor-supervisee relationship; and (5) using one's own self-reflection as a check and balance for supervisory action. The supervisor's self-perspective provides the missing supervisory voice in the triadic complement of subject-other-self, has the potential to be eminently educative across the treatment/supervision dyads, and serves as a prototype for the supervisee's own development and use of analytic (or analyst) self-perspective.

  3. Convergence in full motion video processing, exploitation, and dissemination and activity based intelligence

    NASA Astrophysics Data System (ADS)

    Phipps, Marja; Lewis, Gina

    2012-06-01

    Over the last decade, intelligence capabilities within the Department of Defense/Intelligence Community (DoD/IC) have evolved from ad hoc, single source, just-in-time, analog processing; to multi source, digitally integrated, real-time analytics; to multi-INT, predictive Processing, Exploitation and Dissemination (PED). Full Motion Video (FMV) technology and motion imagery tradecraft advancements have greatly contributed to Intelligence, Surveillance and Reconnaissance (ISR) capabilities during this timeframe. Imagery analysts have exploited events, missions and high value targets, generating and disseminating critical intelligence reports within seconds of occurrence across operationally significant PED cells. Now, we go beyond FMV, enabling All-Source Analysts to effectively deliver ISR information in a multi-INT sensor rich environment. In this paper, we explore the operational benefits and technical challenges of an Activity Based Intelligence (ABI) approach to FMV PED. Existing and emerging ABI features within FMV PED frameworks are discussed, to include refined motion imagery tools, additional intelligence sources, activity relevant content management techniques and automated analytics.

  4. Situated, strategic, and AI-Enhanced technology introduction to healthcare.

    PubMed

    Bushko, Renata G

    2005-01-01

    We work hard on creating AI-wings for physicians to let them fly higher and faster in diagnosing patients--a task that physicians do not want to automate. What we do not work hard on is determining the ENVIRONMENT in which physicians' AI wings are supposed to function. It seems to be a job for social/business analysts that have their own separate kingdom. For the sake of all of us (potential patients!) social/business consultants and their methodologies should not be treated as a separate kingdom. The most urgent task is to achieve synergy between (1) AI/Fuzzy/Neural research, (2) Applied medical AI, (3) Social/Business research on medical institutions. We need this synergy in order to assure humanistic medical technology; technology flexible and sensitive enough to facilitate healthcare work while leaving space for human pride and creativity. In order to achieve humanistic technology, designers should consider the impact of technological breakthroughs on the organizations in which this technology will function and the nature of work of humans destined to use this technology. Situated (different for each organization), Strategic (based on an in-depth knowledge of Healthcare business), and AI-Enhanced (ended with a dynamic model) method for introducing technology to Healthcare allows identifying areas where technology can make medical work easier. Using this method before automating human work will get us closer to the ideal where there is no discontinuity between design and use of programs; where the technology matches users' needs perfectly--the world with humanistic technology and healthcare workers with AI-wings.

  5. How Continental Bank outsourced its "crown jewels.".

    PubMed

    Huber, R L

    1993-01-01

    No industry relies more on information than banking does, yet Continental, one of America's largest banks, outsources its information technology. Why? Because that's the best way to service the customers that form the core of the bank's business, says vice chairman Dick Huber. In the late 1970s and early 1980s, Continental participated heavily with Penn Square Bank in energy investments. When falling energy prices burst Penn Square's bubble in 1982, Continental was stuck with more than $1 billion in bad loans. Eight years later when Dick Huber came on board, Continental was working hard to restore its once solid reputation. Executives had made many tough decisions already, altering the bank's focus from retail to business banking and laying off thousands of employees. Yet management still needed to cut costs and improve services to stay afloat. Regulators, investors, and analysts were watching every step. Continental executives, eager to focus on the bank's core mission of serving business customers, decided to outsource one in-house service after another--from cafeteria services to information technology. While conventional wisdom holds that banks must retain complete internal control of IT, Continental bucked this argument when it entered into a ten-year, multimillion-dollar contract with Integrated Systems Solutions Corporation. Continental is already reaping benefits from outsourcing IT. Most important, Continental staffers today focus on their true core competencies: intimate knowledge of customers' needs and relationships with customers.

  6. Using the living laboratory framework as a basis for understanding next-generation analyst work

    NASA Astrophysics Data System (ADS)

    McNeese, Michael D.; Mancuso, Vincent; McNeese, Nathan; Endsley, Tristan; Forster, Pete

    2013-05-01

    The preparation of next generation analyst work requires alternative levels of understanding and new methodological departures from the way current work transpires. Current work practices typically do not provide a comprehensive approach that emphasizes the role of and interplay between (a) cognition, (b) emergent activities in a shared situated context, and (c) collaborative teamwork. In turn, effective and efficient problem solving fails to take place, and practice is often composed of piecemeal, techno-centric tools that isolate analysts by providing rigid, limited levels of understanding of situation awareness. This coupled with the fact that many analyst activities are classified produces a challenging situation for researching such phenomena and designing and evaluating systems to support analyst cognition and teamwork. Through our work with cyber, image, and intelligence analysts we have realized that there is more required of researchers to study human-centered designs to provide for analyst's needs in a timely fashion. This paper identifies and describes how The Living Laboratory Framework can be utilized as a means to develop a comprehensive, human-centric, and problem-focused approach to next generation analyst work, design, and training. We explain how the framework is utilized for specific cases in various applied settings (e.g., crisis management analysis, image analysis, and cyber analysis) to demonstrate its value and power in addressing an area of utmost importance to our national security. Attributes of analyst work settings are delineated to suggest potential design affordances that could help improve cognitive activities and awareness. Finally, the paper puts forth a research agenda for the use of the framework for future work that will move the analyst profession in a viable manner to address the concerns identified.

  7. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Waltz, Ed

    2016-05-01

    Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics from raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed; these are examined for how they match analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  8. A research and experimentation framework for exploiting VoI-based methods within analyst workflows in tactical operation centers

    NASA Astrophysics Data System (ADS)

    Sadler, Laurel

    2017-05-01

    In today's battlefield environments, analysts are inundated with real-time data received from the tactical edge that must be evaluated and used for managing and modifying current missions as well as planning for future missions. This paper describes a framework that facilitates a Value of Information (VoI) based data analytics tool for information object (IO) analysis in a tactical and command and control (C2) environment, which reduces analyst work load by providing automated or analyst assisted applications. It allows the analyst to adjust parameters for data matching of the IOs that will be received and provides agents for further filtering or fusing of the incoming data. It allows for analyst enhancement and markup to be made to and/or comments to be attached to the incoming IOs, which can then be re-disseminated utilizing the VoI based dissemination service. The analyst may also adjust the underlying parameters before re-dissemination of an IO, which will subsequently adjust the value of the IO based on this new/additional information that has been added, possibly increasing the value from the original. The framework is flexible and extendable, providing an easy to use, dynamically changing Command and Control decision aid that focuses and enhances the analyst workflow.
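    A minimal sketch of how a Value-of-Information score for an information object might be recomputed after analyst markup is given below; the attributes, weights, and bonus rule are assumptions made for illustration, not the framework's actual scoring model.

    ```python
    # Illustrative sketch only: a toy VoI score for an information object (IO),
    # recomputed after analyst markup, using assumed weights.
    from dataclasses import dataclass, field

    @dataclass
    class InformationObject:
        source_reliability: float   # 0..1
        content_relevance: float    # 0..1, set by the data-matching parameters
        timeliness: float           # 0..1, decays with age
        annotations: list[str] = field(default_factory=list)

    def voi_score(io: InformationObject) -> float:
        base = 0.4 * io.source_reliability + 0.4 * io.content_relevance + 0.2 * io.timeliness
        # Analyst enhancement can raise the value before re-dissemination.
        bonus = min(0.2, 0.05 * len(io.annotations))
        return min(1.0, base + bonus)

    io = InformationObject(source_reliability=0.7, content_relevance=0.6, timeliness=0.9)
    print(f"initial VoI: {voi_score(io):.2f}")
    io.annotations.append("confirmed by second sensor track")
    print(f"after markup: {voi_score(io):.2f}")
    ```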

  9. The analyst: his professional novel.

    PubMed

    Ambrosiano, Laura

    2005-12-01

    The psychoanalyst needs to be in touch with a community of colleagues; he needs to feel part of a group with which he can share cognitive tension and therapeutic knowledge. Yet group ties are an aspect we analysts seldom discuss. The author defines the analyst's 'professional novel' as the emotional vicissitudes with the group that have marked the professional itinerary of every analyst; his relationship with institutions and with theories, and the emotional nuance of these relationships. The analyst's professional novel is the narrative elaboration of his professional autobiography. It is capable of transforming the individual's need to belong and the paths of identification and de-identification. Experience of the oedipal configuration allows the analyst to begin psychic work aimed at gaining spaces of separateness in his relationship with the group. This passage is marked by the work on mourning that separation involves, but also of mourning implicit in the awareness of the representative limits of our theories. Right from the start of analysis, the patient observes the emotional nuance of the analyst's connection to his group and theories; the patient notices how much this connection is governed by rigid needs to belong, and how much freedom of thought and exploration it allows the analyst. The author uses clinical examples to illustrate these hypotheses.

  10. Yes-No Questions in the Third-Turn Position: Pedagogical Discourse Processes

    ERIC Educational Resources Information Center

    Lee, Yo-An

    2008-01-01

    Yes-No (Y/N) questions are distinctive in calling for a bipolar response. Some Y/N questions predispose one answer over the other. Conversation analysts have examined the sequential relevance of this predisposition and found the institutional character of social actions enacted in Y/N questioning processes. Classroom interaction is one such…

  11. Residues in the analyst of the patient's symbiotic connection at a somatic level: unrepresented states in the patient and analyst.

    PubMed

    Godsil, Geraldine

    2018-02-01

    This paper discusses the residues of a somatic countertransference that revealed its meaning several years after apparently successful analytic work had ended. Psychoanalytic and Jungian analytic ideas on primitive communication, dissociation and enactment are explored in the working through of a shared respiratory symptom between patient and analyst. Growth in the analyst was necessary so that the patient's communication at a somatic level could be understood. Bleger's concept that both the patient's and analyst's body are part of the setting was central in the working through. © 2018, The Society of Analytical Psychology.

  12. Quantity and unit extraction for scientific and technical intelligence analysis

    NASA Astrophysics Data System (ADS)

    David, Peter; Hawes, Timothy

    2017-05-01

    Scientific and Technical (S&T) intelligence analysts consume huge amounts of data to understand how scientific progress and engineering efforts affect current and future military capabilities. One of the most important types of information S&T analysts exploit is the quantities discussed in their source material. Frequencies, ranges, size, weight, power, and numerous other properties and measurements describing the performance characteristics of systems and the engineering constraints that define them must be culled from source documents before quantified analysis can begin. Automating the process of finding and extracting the relevant quantities from a wide range of S&T documents is difficult because information about quantities and their units is often contained in unstructured text with ad hoc conventions used to convey their meaning. Currently, even simple tasks, such as searching for documents that discuss RF frequencies in a band of interest, are labor-intensive and error-prone. This research addresses the challenges facing development of a document processing capability that extracts quantities and units from S&T data, and how Natural Language Processing algorithms can be used to overcome these challenges.
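    The sketch below illustrates the simplest form of the extraction task described above: pulling numeric quantities and frequency units out of unstructured text with a regular expression and normalizing them to hertz. A production capability would need far broader unit coverage and NLP-based disambiguation; the pattern and unit table here are illustrative only.

    ```python
    # Minimal sketch of quantity-and-unit extraction: find frequency mentions in
    # free text and normalize them to hertz. Unit list is deliberately tiny.
    import re

    UNIT_FACTORS = {"hz": 1.0, "khz": 1e3, "mhz": 1e6, "ghz": 1e9}
    PATTERN = re.compile(r"(\d+(?:\.\d+)?)\s*(GHz|MHz|kHz|Hz)", re.IGNORECASE)

    def extract_frequencies(text: str) -> list[float]:
        """Return all frequencies mentioned in the text, converted to Hz."""
        hits = []
        for value, unit in PATTERN.findall(text):
            hits.append(float(value) * UNIT_FACTORS[unit.lower()])
        return hits

    sample = "The receiver covers 2.4 GHz and 5.8 GHz bands, with an IF of 70 MHz."
    print(extract_frequencies(sample))   # [2400000000.0, 5800000000.0, 70000000.0]
    ```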

  13. Analytic Steering: Inserting Context into the Information Dialog

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bohn, Shawn J.; Calapristi, Augustin J.; Brown, Shyretha D.

    2011-10-23

    An analyst’s intrinsic domain knowledge is a primary asset in almost any analysis task. Unstructured text analysis systems that apply unsupervised content analysis approaches can be more effective if they can leverage this domain knowledge in a manner that augments the information discovery process without obfuscating new or unexpected content. Current unsupervised approaches rely upon the prowess of the analyst to submit the right queries or observe generalized document and term relationships from ranked or visual results. We propose a new approach which allows the user to control or steer the analytic view within the unsupervised space. This process is controlled through the data characterization process via user-supplied context in the form of a collection of key terms. We show that steering with an appropriate choice of key terms can provide better relevance to the analytic domain and still enable the analyst to uncover unexpected relationships; this paper discusses cases where various analytic steering approaches can provide enhanced analysis results and cases where analytic steering can have a negative impact on the analysis process.
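    A minimal sketch of the steering idea, assuming a plain bag-of-words characterization rather than the authors' system: analyst-supplied key terms are up-weighted before documents are ranked, so the analytic view shifts toward the supplied context.

    ```python
    # Illustrative sketch (not the authors' implementation): steer a simple
    # bag-of-words ranking by boosting analyst-supplied key terms.
    from collections import Counter
    import math

    def vectorize(text: str, steer_terms: set[str], boost: float = 3.0) -> Counter:
        counts = Counter(word.lower().strip(".,") for word in text.split())
        for term in steer_terms:
            if term in counts:
                counts[term] *= boost   # analyst context raises the term weight
        return counts

    def cosine(a: Counter, b: Counter) -> float:
        dot = sum(a[k] * b[k] for k in a)
        norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    docs = ["reactor fuel reprocessing schedule", "quarterly fuel procurement budget"]
    steer = {"reprocessing"}
    query = vectorize("fuel reprocessing", steer)
    for doc in docs:
        print(f"{cosine(query, vectorize(doc, steer)):.3f}  {doc}")
    ```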

  14. On psychobiology in psychoanalysis - salivary cortisol and secretory IgA as psychoanalytic process parameters

    PubMed Central

    Euler, Sebastian; Schimpf, Heinrich; Hennig, Jürgen; Brosig, Burkhard

    2005-01-01

    This study investigates the psychobiological impact of psychoanalysis in its four-hour setting. During a period of five weeks, 20 consecutive hours of psychoanalysis were evaluated, involving two patients and their analysts. Before and after each session, saliva samples were taken and analysed for cortisol (sCortisol) and secretory immunoglobulin A (sIgA). Four time-series (n=80 observations) resulted and were evaluated by "Pooled Time Series Analysis" (PTSA) for significant level changes and setting-mediated rhythms. Over all sessions, sCortisol levels were reduced and sIgA secretion augmented parallel to the analytic work. In one analytic dyad a significant rhythm within the four-hour setting was observed with an increase of sCortisol in sessions 2 and 3 of the week. Psychoanalysis may, therefore, have some psychobiological impact on patients and analysts alike and may modulate immunological and endocrinological processes. PMID:19742067

  15. Software life cycle dynamic simulation model: The organizational performance submodel

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1985-01-01

    The submodel structure of a software life cycle dynamic simulation model is described. The software process is divided into seven phases, each with product, staff, and funding flows. The model is subdivided into an organizational response submodel, a management submodel, a management influence interface, and a model analyst interface. The concentration here is on the organizational response model, which simulates the performance characteristics of a software development subject to external and internal influences. These influences emanate from two sources: the model analyst interface, which configures the model to simulate the response of an implementing organization subject to its own internal influences, and the management submodel that exerts external dynamic control over the production process. A complete characterization is given of the organizational response submodel in the form of parameterized differential equations governing product, staffing, and funding levels. The parameter values and functions are allocated to the two interfaces.
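    The sketch below gives a toy discrete-time analogue of the kind of parameterized rate equations described above, with product accumulating from applied staff effort and staffing relaxing toward the level that current funding can support; the functional forms and constants are illustrative, not the submodel's actual equations.

    ```python
    # Toy discrete-time analogue of coupled product/staffing/funding dynamics.
    # Forms and constants are illustrative only.
    def simulate(months: int, funding_rate: float, productivity: float = 0.8,
                 hire_rate: float = 0.3, cost_per_person: float = 1.0):
        product, staff = 0.0, 0.0
        history = []
        for _ in range(months):
            target_staff = funding_rate / cost_per_person   # staff the funding supports
            staff += hire_rate * (target_staff - staff)     # staffing response
            product += productivity * staff                 # product accumulation
            history.append((round(staff, 2), round(product, 1)))
        return history

    for month, (staff, product) in enumerate(simulate(6, funding_rate=10.0), start=1):
        print(f"month {month}: staff={staff}, product={product}")
    ```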

  16. Developing an Automated Method for Detection of Operationally Relevant Ocean Fronts and Eddies

    NASA Astrophysics Data System (ADS)

    Rogers-Cotrone, J. D.; Cadden, D. D. H.; Rivera, P.; Wynn, L. L.

    2016-02-01

    Since the early 1990s, the U.S. Navy has utilized an observation-based process for identification of frontal systems and eddies. These Ocean Feature Assessments (OFA) rely on trained analysts to identify and position ocean features using satellite-observed sea surface temperatures. Meanwhile, as enhancements and expansion of the Navy's HYbrid Coordinate Ocean Model (HYCOM) and Regional Navy Coastal Ocean Model (RNCOM) domains have proceeded, the Naval Oceanographic Office (NAVO) has provided Tactical Oceanographic Feature Assessments (TOFA) that are based on data-validated model output but also rely on analyst identification of significant features. A recently completed project has migrated OFA production to the ArcGIS-based Acoustic Reach-back Cell Ocean Analysis Suite (ARCOAS), enabling use of additional observational datasets and significantly decreasing production time; however, it has highlighted inconsistencies inherent to this analyst-based identification process. Current efforts are focused on development of an automated method for detecting operationally significant fronts and eddies that integrates model output and observational data on a global scale. Previous attempts to employ techniques from the scientific community have been unable to meet the production tempo at NAVO. Thus, a system that incorporates existing techniques (Marr-Hildreth, Okubo-Weiss, etc.) with internally-developed feature identification methods (from model-derived physical and acoustic properties) is required. Ongoing expansions to the ARCOAS toolset have shown promising early results.
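    One of the existing techniques named above, the Okubo-Weiss criterion, can be sketched as follows for a gridded velocity field: strongly negative values of the parameter flag vorticity-dominated, eddy-like regions. The grid, spacing, and synthetic vortex are illustrative only.

    ```python
    # Sketch of the Okubo-Weiss parameter W = strain^2 - vorticity^2 on a grid.
    import numpy as np

    def okubo_weiss(u: np.ndarray, v: np.ndarray, dx: float, dy: float) -> np.ndarray:
        du_dy, du_dx = np.gradient(u, dy, dx)   # axis 0 is y, axis 1 is x
        dv_dy, dv_dx = np.gradient(v, dy, dx)
        normal_strain = du_dx - dv_dy
        shear_strain = dv_dx + du_dy
        vorticity = dv_dx - du_dy
        return normal_strain**2 + shear_strain**2 - vorticity**2

    # Synthetic solid-body vortex: W is strongly negative at the core.
    y, x = np.mgrid[-5:5:51j, -5:5:51j]
    u, v = -y, x
    w = okubo_weiss(u, v, dx=0.2, dy=0.2)
    print("W at vortex core:", w[25, 25])
    ```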

  17. The Pope's confessor: a metaphor relating to illness in the analyst.

    PubMed

    Clark, R W

    1995-01-01

    This paper examines some of the internal and external eventualities in the situation of illness in the analyst. The current emphasis on the use of the self as part of the analyzing instrument makes impairments in the analyst's physical well-being potentially disabling to the analytic work. A recommendation is made for analysts, both individually and as a professional group, to always consider this aspect of a personal medical problem.

  18. Desire and the female analyst.

    PubMed

    Schaverien, J

    1996-04-01

    The literature on erotic transference and countertransference between female analyst and male patient is reviewed and discussed. It is known that female analysts are less likely than their male colleagues to act out sexually with their patients. It has been claimed that a) male patients do not experience sustained erotic transferences, and b) female analysts do not experience erotic countertransferences with female or male patients. These views are challenged and it is argued that, if there is less sexual acting out by female analysts, it is not because of an absence of eros in the therapeutic relationship. The literature review covers material drawn from psychoanalysis, feminist psychotherapy, Jungian analysis, as well as some sociological and cultural sources. It is organized under the following headings: the gender of the analyst, sexual acting out, erotic transference, maternal and paternal transference, gender and power, countertransference, incest taboo--mothers and sons and sexual themes in the transference.

  19. [Proposal of a method for collective analysis of work-related accidents in the hospital setting].

    PubMed

    Osório, Claudia; Machado, Jorge Mesquita Huet; Minayo-Gomez, Carlos

    2005-01-01

    The article presents a method for the analysis of work-related accidents in hospitals, with the double aim of analyzing accidents in light of actual work activity and enhancing the vitality of the various professions that comprise hospital work. This process involves both research and intervention, combining knowledge output with training of health professionals, fostering expanded participation by workers in managing their daily work. The method consists of stimulating workers to recreate the situation in which a given accident occurred, shifting themselves to the position of observers of their own work. In the first stage of analysis, workers are asked to show the work analyst how the accident occurred; in the second stage, the work accident victim and analyst jointly record the described series of events in a diagram; in the third, the resulting record is re-discussed and further elaborated; in the fourth, the work accident victim and analyst evaluate and implement measures aimed to prevent the accident from recurring. The article concludes by discussing the method's possibilities and limitations in the hospital setting.

  20. Countertransference

    PubMed Central

    BOYER, L. BRYCE

    1994-01-01

    Freud’s ambivalently negative attitude toward countertransference discouraged systematic study until some psychoanalysts, predominantly Kleinians, began to treat patients with narcissistic neuroses. Recognizing the need to understand the unconscious and conscious contribution of the analyst to the therapeutic process, Heimann, Rosenfeld, Balint, and Racker pioneered in serious study of countertransference. Racker and Boyer found that unresolved countertransference problems contributed significantly to unfavorable responses to psychoanalysis in seriously disturbed patients. Searles, Giovacchini, Ogden, and Volkan have like-wise furthered countertransference research. Following a historical review, the author delineates his personal approach to understanding patients, especially seriously disturbed ones, in terms of the ongoing introjection of patient and analyst of each other’s projections. This approach stems from Rosenfeld’s initial propositions. PMID:22700186

  1. A self-psychological approach to the study of biography: the interplay of narratives in psychoanalysis and biography.

    PubMed

    Hershberg, Sandra G

    2009-04-01

    This chapter is an exploration of the psychoanalytic aspects of biography and the biographical aspects of psychoanalysis. The narratives that emerge from biography and psychoanalytic treatment incorporate elements of empathy, ideology (theory), and transference/countertransference and are co-constructed within an intersubjective field involving the subjectivities of both participants, the biographer and her subject and the analyst and her analysand. I will provide examples that demonstrate the way in which these processes play out in the biographical realm. Correspondingly, I will illustrate the way in which the analyst's biography and analysand's autobiography change in the course of the psychoanalytic treatment. Salient differences between biographical and psychoanalytic endeavors are also discussed.

  2. ON THE ANALYST'S IDENTIFICATION WITH THE PATIENT: THE CASE OF J.-B. PONTALIS AND G. PEREC.

    PubMed

    Schwartz, Henry P

    2016-01-01

    The writer Georges Perec was in psychoanalysis with Jean-Bertrand Pontalis for four years in the early 1970s. In this essay, the author presents the exceptional interest this analyst took in this patient and the ways in which that interest manifested itself in his work, psychoanalytic and otherwise. Many correlative factors suggest that identificatory processes persisted beyond the treatment and were maintained into Pontalis's later life. While this paper is primarily intended to provide evidence to support this view of a specific case, the author closes by reflecting that this may be a more general phenomenon and the reasons for this. © 2016 The Psychoanalytic Quarterly, Inc.

  3. 49 CFR 1245.5 - Classification of job titles.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...

  4. 49 CFR 1245.5 - Classification of job titles.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...

  5. 49 CFR 1245.5 - Classification of job titles.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...

  6. 49 CFR 1245.5 - Classification of job titles.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...

  7. 49 CFR 1245.5 - Classification of job titles.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...

  8. Katherine Dykes | NREL

    Science.gov Websites

    Profile fragment (truncated in the source listing): positions mentioned include work on a Smarter Cities Marketing Insights 2.0 initiative, a data quality analyst role at EnerNOC, a wind program analyst role at Green Energy Ohio in 2005, and a data analyst role.

  9. Elements of analytic style: Bion's clinical seminars.

    PubMed

    Ogden, Thomas H

    2007-10-01

    The author finds that the idea of analytic style better describes significant aspects of the way he practices psychoanalysis than does the notion of analytic technique. The latter is comprised to a large extent of principles of practice developed by previous generations of analysts. By contrast, the concept of analytic style, though it presupposes the analyst's thorough knowledge of analytic theory and technique, emphasizes (1) the analyst's use of his unique personality as reflected in his individual ways of thinking, listening, and speaking, his own particular use of metaphor, humor, irony, and so on; (2) the analyst's drawing on his personal experience, for example, as an analyst, an analysand, a parent, a child, a spouse, a teacher, and a student; (3) the analyst's capacity to think in a way that draws on, but is independent of, the ideas of his colleagues, his teachers, his analyst, and his analytic ancestors; and (4) the responsibility of the analyst to invent psychoanalysis freshly for each patient. Close readings of three of Bion's 'Clinical seminars' are presented in order to articulate some of the elements of Bion's analytic style. Bion's style is not presented as a model for others to emulate or, worse yet, imitate; rather, it is described in an effort to help the reader consider from a different vantage point (provided by the concept of analytic style) the way in which he, the reader, practices psychoanalysis.

  10. The SAZA study: implementing health financing reform in South Africa and Zambia.

    PubMed

    Gilson, Lucy; Doherty, Jane; Lake, Sally; McIntyre, Di; Mwikisa, Chris; Thomas, Stephen

    2003-03-01

    This paper explores the policy-making process in the 1990s in two countries, South Africa and Zambia, in relation to health care financing reforms. While much of the analysis of health reform programmes has looked at design issues, assuming that a technically sound design is the primary requirement of effective policy change, this paper explores the political and bureaucratic realities shaping the pattern of policy change and its impacts. Through a case study approach, it provides a picture of the policy environment and processes in the two countries, specifically considering the extent to which technical analysts and technical knowledge were able to shape policy change. The two countries' experiences indicate the strong influence of political factors and actors over which health care financing policies were implemented, and which not, as well as over the details of policy design. Moments of political transition in both countries provided political leaders, specifically Ministers of Health, with windows of opportunity in which to introduce new policies. However, these transitions, and the changes in administrative structures introduced with them, also created environments that constrained the processes of reform design and implementation and limited the equity and sustainability gains achieved by the policies. Technical analysts, working either inside or outside government, had varying and often limited influence. In part, this reflected the limits of their own capacity as well as weaknesses in the way they were used in policy development. In addition, the analysts were constrained by the fact that their preferred policies often received only weak political support. Focusing almost exclusively on designing policy reforms, these analysts gave little attention to generating adequate support for the policy options they proposed. Finally, the country experiences showed that front-line health workers, middle level managers and the public had important influences over policy implementation and its impacts. The limited attention given to communicating policy changes to, or consulting with, these actors only heightened the potential for reforms to result in unanticipated and unwanted impacts. The strength of the paper lies in its 'thick description' of the policy process in each country, an empirical case study approach to policy that is under-represented in the literature. While such an approach allows only a cautious drawing of general conclusions, it suggests a number of ways in which to strengthen the implementation of financing policies in each country.

  11. Trauma and the transference-countertransference: working with the bad object and the wounded self.

    PubMed

    West, Marcus

    2013-02-01

    This paper focuses on the transference-countertransference dynamics that manifest in work with those individuals who experienced severe early relational trauma and, in particular, childhood sexual abuse. The literature is surveyed from Davies and Frawley's (1992a) seminal paper through to more current trauma-related and sensorimotor approaches, which deepen our understanding greatly. The rapidly shifting, powerful, conflicting and kaleidoscopic transference-countertransference dynamics are explored in the light of these views and in relation to a lengthy clinical example. The author elucidates the dual-aspect of the traumatic complex, whereby the abuser figure, which is disavowed by the patient, becomes manifest in persecuting the analyst for the 'wounds' that the analysis evokes. The paper also explores the particular nature of the splitting processes, whereby pressure is put on the analyst to adopt an idealized role, in particular to act as a self-object, in order to enable the patient to safely express and 'be' themselves in an attempt to make up for what was not possible in childhood; the analyst will necessarily fail in this task. In the context of powerful masochisto-sadistic dynamics, the analyst's masochism is likely to be called up in the spirit of caring 'humanity' (another inevitable enactment), which can impede the progress of the analysis if not addressed. The extreme woundedness, intense affect and moral outrage associated with these dynamics are characteristic and compelling. Issues relating to disclosure, enactment and analytic attitude are also discussed. © 2013, The Society of Analytical Psychology.

  12. Toward a Model of Lifelong Education.

    ERIC Educational Resources Information Center

    Knowles, Malcolm S.

    Some of the criticisms that have been leveled at the educational establishment by social analysts are discussed. It is suggested that one of the new realities is that education must be a lifelong process in order to avoid the catastrophe of human obsolescence. The assumptions and elements for a new model of education as a lifelong process are…

  13. AeroADL: applying the integration of the Suomi-NPP science algorithms with the Algorithm Development Library to the calibration and validation task

    NASA Astrophysics Data System (ADS)

    Houchin, J. S.

    2014-09-01

    A common problem for the off-line validation of the calibration algorithms and algorithm coefficients is being able to run science data through the exact same software used for on-line calibration of that data. The Joint Polar Satellite System (JPSS) program solved part of this problem by making the Algorithm Development Library (ADL) available, which allows the operational algorithm code to be compiled and run on a desktop Linux workstation using flat file input and output. However, this solved only part of the problem, as the toolkit and methods to initiate the processing of data through the algorithms were geared specifically toward the algorithm developer, not the calibration analyst. In algorithm development mode, a limited number of sets of test data are staged for the algorithm once, and then run through the algorithm over and over as the software is developed and debugged. In calibration analyst mode, we are continually running new data sets through the algorithm, which requires significant effort to stage each of those data sets for the algorithm without additional tools. AeroADL solves this second problem by providing a set of scripts that wrap the ADL tools, providing both efficient means to stage and process an input data set, to override static calibration coefficient look-up-tables (LUT) with experimental versions of those tables, and to manage a library containing multiple versions of each of the static LUT files in such a way that the correct set of LUTs required for each algorithm are automatically provided to the algorithm without analyst effort. Using AeroADL, The Aerospace Corporation's analyst team has demonstrated the ability to quickly and efficiently perform analysis tasks for both the VIIRS and OMPS sensors with minimal training on the software tools.
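    A hypothetical sketch of the staging-and-override pattern described above follows; the directory layout, file names, and LUT identifiers are invented for illustration and do not reflect the actual AeroADL scripts or ADL interfaces.

    ```python
    # Hypothetical sketch: stage granules and a baseline LUT set for an off-line
    # run, then swap in experimental LUTs. Paths and names are invented.
    from pathlib import Path
    import shutil

    def stage_run(granules: list[Path], default_luts: Path,
                  overrides: dict[str, Path], workdir: Path) -> Path:
        """Build a self-contained input directory for one off-line algorithm run."""
        inputs = workdir / "input"
        inputs.mkdir(parents=True, exist_ok=True)
        for granule in granules:
            shutil.copy(granule, inputs / granule.name)     # stage science granules
        luts = workdir / "luts"
        luts.mkdir(exist_ok=True)
        for lut_file in default_luts.glob("*.bin"):         # start from the baseline LUT set
            shutil.copy(lut_file, luts / lut_file.name)
        for name, experimental in overrides.items():        # swap in experimental tables
            shutil.copy(experimental, luts / name)
        return workdir

    # Example call (paths are placeholders and do not exist here):
    # stage_run([Path("granule1.h5")], Path("luts/baseline"),
    #           {"gain_table.bin": Path("luts/candidate/gain_table.bin")},
    #           Path("run_001"))
    ```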

  14. The OrbitOutlook Data Archive

    NASA Astrophysics Data System (ADS)

    Czajkowski, M.; Shilliday, A.; LoFaso, N.; Dipon, A.; Van Brackle, D.

    2016-09-01

    In this paper, we describe and depict the Defense Advanced Research Projects Agency (DARPA)'s OrbitOutlook Data Archive (OODA) architecture. OODA is the infrastructure that DARPA's OrbitOutlook program has developed to integrate diverse data from various academic, commercial, government, and amateur space situational awareness (SSA) telescopes. At the heart of the OODA system is its world model - a distributed data store built to quickly query big data quantities of information spread out across multiple processing nodes and data centers. The world model applies a multi-index approach where each index is a distinct view on the data. This allows for analysts and analytics (algorithms) to access information through queries with a variety of terms that may be of interest to them. Our indices include: a structured global-graph view of knowledge, a keyword search of data content, an object-characteristic range search, and a geospatial-temporal orientation of spatially located data. In addition, the world model applies a federated approach by connecting to existing databases and integrating them into one single interface as a "one-stop shopping place" to access SSA information. In addition to the world model, OODA provides a processing platform for various analysts to explore and analytics to execute upon this data. Analytic algorithms can use OODA to take raw data and build information from it. They can store these products back into the world model, allowing analysts to gain situational awareness with this information. Analysts in turn would help decision makers use this knowledge to address a wide range of SSA problems. OODA is designed to make it easy for software developers who build graphical user interfaces (GUIs) and algorithms to quickly get started with working with this data. This is done through a multi-language software development kit that includes multiple application program interfaces (APIs) and a data model with SSA concepts and terms such as: space observation, observable, measurable, metadata, track, space object, catalog, expectation, and maneuver.
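    The multi-index idea can be sketched in miniature as below, with the same record reachable through a keyword index and a numeric range index; the class and its methods are illustrative and are not the OODA APIs.

    ```python
    # Toy multi-index store: one record, several views onto it.
    from bisect import bisect_left, bisect_right
    from collections import defaultdict

    class MultiIndexStore:
        def __init__(self):
            self.records = {}
            self.keyword_index = defaultdict(set)
            self.magnitude_index = []            # sorted (magnitude, record_id) pairs

        def add(self, record_id: str, keywords: list[str], magnitude: float):
            self.records[record_id] = {"keywords": keywords, "magnitude": magnitude}
            for keyword in keywords:
                self.keyword_index[keyword].add(record_id)
            self.magnitude_index.append((magnitude, record_id))
            self.magnitude_index.sort()

        def by_keyword(self, keyword: str) -> set[str]:
            return self.keyword_index[keyword]

        def by_magnitude(self, lo: float, hi: float) -> list[str]:
            i = bisect_left(self.magnitude_index, (lo, ""))
            j = bisect_right(self.magnitude_index, (hi, "\uffff"))
            return [record_id for _, record_id in self.magnitude_index[i:j]]

    store = MultiIndexStore()
    store.add("obs-1", ["track", "geo"], 6.2)
    store.add("obs-2", ["track", "maneuver"], 9.8)
    print(store.by_keyword("maneuver"), store.by_magnitude(5.0, 7.0))
    ```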

  15. SafetyAnalyst : software tools for safety management of specific highway sites

    DOT National Transportation Integrated Search

    2010-07-01

    SafetyAnalyst provides a set of software tools for use by state and local highway agencies for highway safety management. SafetyAnalyst can be used by highway agencies to improve their programming of site-specific highway safety improvements. SafetyA...

  16. Reflections: can the analyst share a traumatizing experience with a traumatized patient?

    PubMed

    Lijtmaer, Ruth

    2010-01-01

    This is a personal account of a dreadful event in the analyst's life that was similar to a patient's trauma. It is a reflection on how the analyst dealt with her own trauma, the patient's trauma, and the transference and countertransference dynamics. Included is a description of the analyst's inner struggles with self-disclosure, continuance of her professional work, and the need for persistent self-scrutiny. The meaning of objects in people's life, particularly the concept of home, will be addressed.

  17. Do Sell-Side Stock Analysts Exhibit Escalation of Commitment?

    PubMed Central

    Milkman, Katherine L.

    2010-01-01

    This paper presents evidence that when an analyst makes an out-of-consensus forecast of a company’s quarterly earnings that turns out to be incorrect, she escalates her commitment to maintaining an out-of-consensus view on the company. Relative to an analyst who was close to the consensus, the out-of-consensus analyst adjusts her forecasts for the current fiscal year’s earnings less in the direction of the quarterly earnings surprise. On average, this type of updating behavior reduces forecasting accuracy, so it does not seem to reflect superior private information. Further empirical results suggest that analysts do not have financial incentives to stand by extreme stock calls in the face of contradictory evidence. Managerial and financial market implications are discussed. PMID:21516220

  18. The tobacco industry's use of Wall Street analysts in shaping policy.

    PubMed

    Alamar, B C; Glantz, S A

    2004-09-01

    Objective: To document how the tobacco industry has used Wall Street analysts to further its public policy objectives. Methods: Searches of tobacco documents available on the internet, newspaper articles, and transcripts of public hearings. Results: The tobacco industry used nominally independent Wall Street analysts as third parties to support its legislative agenda at both national and state levels in the USA. The industry, for example, edited the testimony of at least one analyst before he testified to the US Senate Judiciary Committee while representing himself as independent of the industry. Conclusion: The tobacco industry has used undisclosed collaboration with Wall Street analysts, as it has used undisclosed relationships with research scientists and academics, to advance its interests in public policy.

  19. 75 FR 20385 - Amended Certification Regarding Eligibility To Apply for Worker Adjustment Assistance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-19

    ..., Inconen, CTS, Hi-Tec, Woods, Ciber, Kelly Services, Analysts International Corp, Comsys, Filter LLC..., Ciber, Kelly Services, Analysts International Corp, Comsys, Filter LLC, Excell, Entegee, Chipton- Ross..., Kelly Services, Analysts International Corp, Comsys, Filter LLC, Excell, Entegee, Chipton- Ross, Ian...

  20. Third Party Services for Enabling Business-to-Business Interactions

    NASA Astrophysics Data System (ADS)

    Shrivastava, Santosh

    Business-to-business (B2B) interactions concerned with the fulfilment of a given business function (e.g., order processing) require business partners to exchange electronic business documents and to act on them. This activity can be viewed as the business partners taking part in the execution of a shared business process, where each partner is responsible for performing their part in the process. Naturally, business process executions at each partner must be coordinated at run-time to ensure that the partners are performing mutually consistent actions (e.g., the seller is not shipping a product when the corresponding order has been cancelled by the buyer). A number of factors combine to make the task of business process coordination surprisingly hard:
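    As a minimal illustration of the run-time consistency check implied above, the sketch below has a coordinator refuse an action that conflicts with the shared order state, so the seller cannot ship an order the buyer has already cancelled; the states and transition rules are invented for illustration.

    ```python
    # Toy coordinator: only actions consistent with the shared order state succeed.
    ALLOWED = {
        ("placed", "ship"): "shipped",
        ("placed", "cancel"): "cancelled",
        ("shipped", "deliver"): "delivered",
    }

    def apply_action(order_state: str, action: str) -> str:
        next_state = ALLOWED.get((order_state, action))
        if next_state is None:
            raise ValueError(f"'{action}' is inconsistent with order state '{order_state}'")
        return next_state

    state = apply_action("placed", "cancel")
    print(state)                              # cancelled
    try:
        apply_action(state, "ship")           # seller cannot ship a cancelled order
    except ValueError as err:
        print("rejected:", err)
    ```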

  1. Business Performer-Centered Design of User Interfaces

    NASA Astrophysics Data System (ADS)

    Sousa, Kênia; Vanderdonckt, Jean

    Business Performer-Centered Design of User Interfaces is a new design methodology that adopts business process (BP) definition and a business performer perspective for managing the life cycle of user interfaces of enterprise systems. In this methodology, when the organization has a business process culture, the business processes of the organization are first defined according to a traditional methodology for this kind of artifact. These business processes are then transformed into a series of task models that represent the interactive parts of the business processes that will ultimately lead to interactive systems. When the organization has its enterprise systems but has not yet modeled its business processes, the user interfaces of the systems help derive task models, which are then used to derive the business processes. The double linking between a business process and a task model, and between a task model and a user interface model, makes it possible to ensure traceability of the artifacts across multiple paths and enables a more active participation of business performers in analyzing the resulting user interfaces. In this paper, we outline how a human perspective is tied to a model-driven perspective.

  2. Extracting business vocabularies from business process models: SBVR and BPMN standards-based approach

    NASA Astrophysics Data System (ADS)

    Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis

    2013-10-01

    Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common practice of Information Systems Development, business modeling activities are still largely empirical. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
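
    A minimal sketch of the extraction idea, assuming a BPMN 2.0 XML file and harvesting only task and data-object names as candidate vocabulary terms (the cited SBVR-based approach covers far more than this):

        # Sketch: harvest candidate business-vocabulary terms from a BPMN 2.0 model.
        # Illustrative only; the cited approach maps BPMN elements to SBVR far more richly.
        import xml.etree.ElementTree as ET

        BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"

        def candidate_terms(bpmn_file):
            """Return names of tasks and data objects as candidate vocabulary entries."""
            tree = ET.parse(bpmn_file)
            terms = set()
            for tag in ("task", "userTask", "serviceTask", "dataObject"):
                for element in tree.getroot().iter(f"{{{BPMN_NS}}}{tag}"):
                    name = (element.get("name") or "").strip()
                    if name:
                        terms.add(name)
            return sorted(terms)

        # Usage (hypothetical file name):
        # print(candidate_terms("order_handling.bpmn"))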

  3. Technologies and problems of reengineering of the business processes of company

    NASA Astrophysics Data System (ADS)

    Silka, Dmitriy

    2017-10-01

    Management of a combination of business processes is a modern approach in the field of business management. Together with many other management approaches, business processes allow us to identify all the resultant actions. The article presents the modern view on the essence of business processes as well as general approaches to their allocation. Principles of construction and business process re-engineering are proposed. Recommendations on how to perform re-engineering under highly cyclic dynamics of business activity are provided.

  4. On Alternative Formulations for Linearised Miss Distance Analysis

    DTIC Science & Technology

    2013-05-01

    is traditionally employed by analysts as part of the solution process. To gain further insight into the nature of the missile-target engagement...a constant. Thus, following this process, the revised block diagram model for the linearised equations is presented in Figure 13. This model is... process is known as reducing the block to its fundamental closed loop form and has been achieved here using standard block diagram algebra. This

  5. Estimating costs in the economic evaluation of medical technologies.

    PubMed

    Luce, B R; Elixhauser, A

    1990-01-01

    The complexities and nuances of evaluating the costs associated with providing medical technologies are often underestimated by analysts engaged in economic evaluations. This article describes the theoretical underpinnings of cost estimation, emphasizing the importance of accounting for opportunity costs and marginal costs. The various types of costs that should be considered in an analysis are described; a listing of specific cost elements may provide a helpful guide to analysis. The process of identifying and estimating costs is detailed, and practical recommendations for handling the challenges of cost estimation are provided. The roles of sensitivity analysis and discounting are characterized, as are determinants of the types of costs to include in an analysis. Finally, common problems facing the analyst are enumerated with suggestions for managing these problems.
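
    A short illustration of two of the techniques named above, discounting future costs to present value and a one-way sensitivity analysis on the discount rate; the cost figures are invented:

        # Sketch: present-value costing with discounting and a simple one-way
        # sensitivity analysis on the discount rate. Figures are invented.

        def present_value(costs_by_year, rate):
            """Discount a list of yearly costs (year 0 first) to present value."""
            return sum(cost / (1.0 + rate) ** year for year, cost in enumerate(costs_by_year))

        annual_costs = [10_000, 4_000, 4_000, 4_000]   # hypothetical programme costs
        base_case = present_value(annual_costs, rate=0.05)
        print(f"Base case (5% discount rate): {base_case:,.0f}")

        # One-way sensitivity analysis: vary the discount rate only.
        for rate in (0.0, 0.03, 0.05, 0.07):
            print(f"rate={rate:.0%}  PV={present_value(annual_costs, rate):,.0f}")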

  6. Neutrality, abstinence, and the therapeutic alliance.

    PubMed

    Meissner, W W

    1998-01-01

    Concepts of neutrality and abstinence are discussed in terms of the variant opinions about them, pro and con, with particular reference to efforts to dispense with them based on the unavoidable role of the analyst's personal influence and subjectivity in the analytic process. Stereotypes of both neutrality and abstinence are examined, and the therapeutic alliance established as the most appropriate context within which to articulate the essential and constructive role of effective analytic neutrality and abstinence. The alliance is not possible without the persistent exercise of both neutrality and abstinence; conversely, other components of the alliance are intended to facilitate and preserve neutrality and abstinence on the part of both analyst and analysand. These elements are essential factors in effective analytic practice.

  7. Software for Partly Automated Recognition of Targets

    NASA Technical Reports Server (NTRS)

    Opitz, David; Blundell, Stuart; Bain, William; Morris, Matthew; Carlson, Ian; Mangrich, Mark; Selinsky, T.

    2002-01-01

    The Feature Analyst is a computer program for assisted (partially automated) recognition of targets in images. This program was developed to accelerate the processing of high-resolution satellite image data for incorporation into geographic information systems (GIS). This program creates an advanced user interface that embeds proprietary machine-learning algorithms in commercial image-processing and GIS software. A human analyst provides samples of target features from multiple sets of data, then the software develops a data-fusion model that automatically extracts the remaining features from selected sets of data. The program thus leverages the natural ability of humans to recognize objects in complex scenes, without requiring the user to explain the human visual recognition process by means of lengthy software. Two major subprograms are the reactive agent and the thinking agent. The reactive agent strives to quickly learn the user's tendencies while the user is selecting targets and to increase the user's productivity by immediately suggesting the next set of pixels that the user may wish to select. The thinking agent utilizes all available resources, taking as much time as needed, to produce the most accurate autonomous feature-extraction model possible.
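
    The proprietary algorithms are not described in the abstract, but the "learn from user-selected examples, then suggest similar pixels" idea can be illustrated with a simple nearest-centroid stand-in (purely hypothetical, not the Feature Analyst method):

        # Sketch: a toy stand-in for the "reactive agent" idea -- learn from pixels the
        # user has labelled as target and suggest the most similar unlabelled pixels.
        import numpy as np

        def suggest_pixels(labelled_target, unlabelled, k=5):
            """Rank unlabelled pixel feature vectors by distance to the target centroid."""
            centroid = labelled_target.mean(axis=0)
            distances = np.linalg.norm(unlabelled - centroid, axis=1)
            return np.argsort(distances)[:k]        # indices of the k closest pixels

        rng = np.random.default_rng(0)
        target_samples = rng.normal(loc=0.8, scale=0.05, size=(20, 4))   # user-selected pixels
        candidates = rng.uniform(size=(1000, 4))                          # rest of the scene
        print("Suggested pixel indices:", suggest_pixels(target_samples, candidates))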

  8. Setting analyst: A practical harvest planning technique

    Treesearch

    Olivier R.M. Halleux; W. Dale Greene

    2001-01-01

    Setting Analyst is an ArcView extension that facilitates practical harvest planning for ground-based systems. By modeling the travel patterns of ground-based machines, it compares different harvesting settings based on projected average skidding distance, logging costs, and site disturbance levels. Setting Analyst uses information commonly available to consulting...
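
    A toy calculation of one of the quantities mentioned, projected average skidding distance, taken here as the mean straight-line distance from each stump to its nearest landing (Setting Analyst itself models machine travel patterns in ArcView; coordinates below are invented):

        # Sketch: projected average skidding distance for a harvest setting, computed
        # as the mean straight-line distance from each stump to its nearest landing.
        import math

        def average_skidding_distance(stumps, landings):
            total = 0.0
            for sx, sy in stumps:
                total += min(math.hypot(sx - lx, sy - ly) for lx, ly in landings)
            return total / len(stumps)

        stumps = [(10, 20), (55, 80), (120, 40), (200, 150)]   # hypothetical coordinates (m)
        landings = [(0, 0), (150, 100)]
        print(f"Average skidding distance: {average_skidding_distance(stumps, landings):.1f} m")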

  9. 21 CFR 1304.23 - Records for chemical analysts.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 9 2011-04-01 2011-04-01 false Records for chemical analysts. 1304.23 Section... REGISTRANTS Continuing Records § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...

  10. 21 CFR 1304.23 - Records for chemical analysts.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Records for chemical analysts. 1304.23 Section... REGISTRANTS Continuing Records § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...

  11. 21 CFR 1304.23 - Records for chemical analysts.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 9 2012-04-01 2012-04-01 false Records for chemical analysts. 1304.23 Section... REGISTRANTS Continuing Records § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...

  12. 21 CFR 1304.23 - Records for chemical analysts.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 9 2014-04-01 2014-04-01 false Records for chemical analysts. 1304.23 Section... REGISTRANTS Continuing Records § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...

  13. Bloodstain pattern classification: Accuracy, effect of contextual information and the role of analyst characteristics.

    PubMed

    Osborne, Nikola K P; Taylor, Michael C; Healey, Matthew; Zajac, Rachel

    2016-03-01

    It is becoming increasingly apparent that contextual information can exert a considerable influence on decisions about forensic evidence. Here, we explored accuracy and contextual influence in bloodstain pattern classification, and how these variables might relate to analyst characteristics. Thirty-nine bloodstain pattern analysts with varying degrees of experience each completed measures of compliance, decision-making style, and need for closure. Analysts then examined a bloodstain pattern without any additional contextual information, and allocated votes to listed pattern types according to favoured and less favoured classifications. Next, if they believed it would assist with their classification, analysts could request items of contextual information - from commonly encountered sources of information in bloodstain pattern analysis - and update their vote allocation. We calculated a shift score for each item of contextual information based on vote reallocation. Almost all forms of contextual information influenced decision-making, with medical findings leading to the highest shift scores. Although there was a small positive association between shift scores and the degree to which analysts displayed an intuitive decision-making style, shift scores did not vary meaningfully as a function of experience or the other characteristics measured. Almost all of the erroneous classifications were made by novice analysts. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
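
    The paper's exact shift-score formula is not given here; one plausible reading, half the total absolute change in vote allocation normalized by the number of votes, can be sketched as follows (numbers invented):

        # Sketch: one plausible shift score for a single item of contextual information,
        # taken as half the total absolute change in vote allocation (0 = no change,
        # 1 = all votes moved). The paper's exact formula may differ.

        def shift_score(votes_before, votes_after):
            categories = set(votes_before) | set(votes_after)
            moved = sum(abs(votes_after.get(c, 0) - votes_before.get(c, 0)) for c in categories)
            total = sum(votes_before.values())
            return (moved / 2) / total if total else 0.0

        before = {"impact spatter": 7, "expirated": 2, "transfer": 1}
        after  = {"impact spatter": 4, "expirated": 5, "transfer": 1}   # after seeing medical findings
        print(f"Shift score: {shift_score(before, after):.2f}")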

  14. Disruptive Civil Technologies: Six Technologies With Potential Impacts on US Interests Out to 2025

    DTIC Science & Technology

    2008-04-01

    power (geopolitical, military, economic, or social cohesion). The six disruptive technologies were identified through a process carried out by...clustering, development of technology descriptors, screening, and prioritizing, analysts down-selected from 102 potentially disruptive technologies. They

  15. Sourcing Life Cycle Inventory Data

    EPA Science Inventory

    The collection and validation of quality lifecycle inventory (LCI) data can be the most difficult and time-consuming aspect of developing a life cycle assessment (LCA). Large amounts of process and production data are needed to complete the LCI. For many studies, the LCA analyst ...

  16. Decisions, Decisions....

    ERIC Educational Resources Information Center

    White, Owen Roberts

    1985-01-01

    The author reviews systems providing objective guidelines to facilitate ongoing, daily instructional decisions, focusing on those which utilize the sensitive datum and uniform charting procedures of precision teaching. Potential users are warned that the special education teacher must remain a critical and vigilant analyst of the learning process.…

  17. Modeling Business Processes in Public Administration

    NASA Astrophysics Data System (ADS)

    Repa, Vaclav

    During more than 10 years of its existence, business process modeling has become a regular part of organization management practice. It is mostly regarded as a part of information system development, or even as a way to implement some supporting technology (for instance, a workflow system). Although I do not agree with such a reduction of the real meaning of a business process, it is necessary to admit that information technologies play an essential role in business processes (see [1] for more information). Consequently, an information system is inseparable from a business process itself because it is a cornerstone of the general basic infrastructure of a business. This fact impacts all dimensions of business process management. One of these dimensions is methodology, which postulates that information systems development provide business process management with exact methods and tools for modeling business processes. The methodology underlying the approach presented in this paper also has its roots in information systems development methodology.

  18. Information-Pooling Bias in Collaborative Security Incident Correlation Analysis.

    PubMed

    Rajivan, Prashanth; Cooke, Nancy J

    2018-03-01

    Incident correlation is a vital step in the cybersecurity threat detection process. This article presents research on the effect of group-level information-pooling bias on collaborative incident correlation analysis in a synthetic task environment. Past research has shown that uneven information distribution biases people to share information that is known to most team members and prevents them from sharing any unique information available to them. The effect of such biases on security team collaborations is largely unknown. Thirty 3-person teams performed two threat detection missions involving information sharing and correlating security incidents. Incidents were predistributed to each person in the team based on the hidden profile paradigm. Participant teams, randomly assigned to three experimental groups, used different collaboration aids during Mission 2. Communication analysis revealed that participant teams were 3 times more likely to discuss security incidents commonly known to the majority. Unaided team collaboration was inefficient in finding associations between security incidents uniquely available to each member of the team. Visualizations that augment perceptual processing and recognition memory were found to mitigate the bias. The data suggest that (a) security analyst teams, when conducting collaborative correlation analysis, could be inefficient in pooling unique information from their peers; (b) employing off-the-shelf collaboration tools in cybersecurity defense environments is inadequate; and (c) collaborative security visualization tools developed with the human cognitive limitations of security analysts in mind are necessary. Potential applications of this research include development of team training procedures and collaboration tools for security analysts.

  19. Qualitative Information in Annual Reports & the Detection of Corporate Fraud: A Natural Language Processing Perspective

    ERIC Educational Resources Information Center

    Goel, Sunita

    2009-01-01

    High profile cases of fraudulent financial reporting such as those that occurred at Enron and WorldCom have shaken public confidence in the U.S. financial reporting process and have raised serious concerns about the roles of auditors, regulators, and analysts in financial reporting. In order to address these concerns and restore public confidence,…

  20. A new process for organizing assessments of social, economic, and environmental outcomes: case study of wildland fire management in the USA

    Treesearch

    Randall JF Bruins; Wayne R Jr. Munns; Stephen J Botti; Steve Brink; David Cleland; Larry Kapustka; Danny Lee; al. et

    2010-01-01

    Ecological risk assessments typically are organized using the processes of planning (a discussion among managers, stakeholders, and analysts to clarify ecosystem management goals and assessment scope) and problem formulation (evaluation of existing information to generate hypotheses about adverse ecological effects, select assessment endpoints, and develop an analysis...

  1. Analysis of a Memory Device Failure

    NASA Technical Reports Server (NTRS)

    Nicolas, David P.; Devaney, John; Gores, Mark; Dicken, Howard

    1998-01-01

    The recent failure of a vintage memory device presented a unique challenge to failure analysts. Normally device layouts, fabrication parameters and other technical information were available to assist the analyst in the analysis. However, this device was out of production for many years and the manufacturer was no longer in business, so the information was not available. To further complicate this analysis, the package leads were all but removed making additional electrical testing difficult. Under these conditions, new and innovative methods were used to analyze the failure. The external visual exam, radiography, PIND, and leak testing were performed with nominal results. Since electrical testing was precluded by the short lead lengths, the device was delidded to expose the internal structures for microscopic examination. No failure mechanism was identified. The available electrical data suggested an ESD or low level EOS type mechanism which left no visible surface damage. Due to parallel electrical paths, electrical probing on the chip failed to locate the failure site. Two non-destructive Scanning Electron Microscopy techniques, CIVA (Charge Induced Voltage Alteration) and EBIC (Electron Beam Induced Current), and a liquid crystal decoration technique which detects localized heating were employed to aid in the analysis. CIVA and EBIC isolated two faults in the input circuitry, and the liquid crystal technique further localized two hot spots in regions on two input gates. Removal of the glassivation and metallization revealed multiple failure sites located in the gate oxide of two input transistors suggesting machine (testing) induced damage.

  2. GPM Timeline Inhibits For IT Processing

    NASA Technical Reports Server (NTRS)

    Dion, Shirley K.

    2014-01-01

    The Safety Inhibit Timeline Tool was created as one approach to capturing and understanding inhibits and controls from IT through launch. Global Precipitation Measurement (GPM) Mission, which launched from Japan in March 2014, was a joint mission under a partnership between the National Aeronautics and Space Administration (NASA) and the Japan Aerospace Exploration Agency (JAXA). GPM was one of the first NASA Goddard in-house programs that extensively used software controls. Using this tool during the GPM buildup allowed a thorough review of inhibit and safety critical software design for hazardous subsystems such as the high gain antenna boom, solar array, and instrument deployments, transmitter turn-on, propulsion system release, and instrument radar turn-on. The GPM safety team developed a methodology to document software safety as part of the standard hazard report. As a result of this process, a new tool safety inhibit timeline was created for management of inhibits and their controls during spacecraft buildup and testing during IT at GSFC and at the launch range in Japan. The Safety Inhibit Timeline Tool was a pathfinder approach for reviewing software that controls the electrical inhibits. The Safety Inhibit Timeline Tool strengthens the Safety Analysts understanding of the removal of inhibits during the IT process with safety critical software. With this tool, the Safety Analyst can confirm proper safe configuration of a spacecraft during each IT test, track inhibit and software configuration changes, and assess software criticality. In addition to understanding inhibits and controls during IT, the tool allows the Safety Analyst to better communicate to engineers and management the changes in inhibit states with each phase of hardware and software testing and the impact of safety risks. Lessons learned from participating in the GPM campaign at NASA and JAXA will be discussed during this session.

  3. On Murray Jackson's 1961 'Chair, couch and countertransference'.

    PubMed

    Connolly, Angela

    2015-09-01

    One of the problems facing psychoanalysts of all schools is that theory has evolved at a much faster pace than practice. Whereas there has been an explosion of theory, practice has remained, at least officially, static and unchanging. It is in this sense that Murray Jackson's 1961 paper is still relevant today. Despite the rise of the new relational and intersubjective paradigms, most psychoanalysts, and not a few Jungian analysts, still seem to feel that the couch is an essential component of the analytical setting and process. If the use of the couch is usually justified by the argument that it favours regression, facilitates analytical reverie and protects the patient from the influence of the analyst, over time many important psychoanalysts have come to challenge this position. Increasingly these analysts suggest that the use of the couch may actually be incompatible with the newer theoretical models. This contention is strengthened by some of the findings coming from the neurosciences and infant research. This underlines the necessity of empirical research to verify the clinical effectiveness of these different positions, couch or face-to-face, but it is exactly this type of research that is lacking. © 2015, The Society of Analytical Psychology.

  4. Contextualized Language and Transferential Aspects of Context.

    PubMed

    Movahedi, Siamak

    2015-08-01

    The analytic process, in which the patient's and the analyst's internal characters struggle to create a script through the analysand's mouth and the analyst's pen, resembles Pirandello's Six Characters in Search of an Author (1921): different characters come together on the analytic stage to rehearse the analyst's role as coauthor of a play that depicts the ongoing analytic saga. To compose the text for the interplay of characters, the author must search for contexts that may confer meaning upon the words and actions of characters. This involves a search for a mise en scène (stage) that will assign mise en sens (meaning) to the actors' role-specific dialogue. Yet mise en scène, in the theatrical sense, is a set of iconic signs set with its own décor, props, and costumes. In contrast, the psychoanalytic scene is a symbolic stage for the play of words--words that may contain unconscious codes for switching into particular language games. A clinical case report describes a struggle with the contextual analysis of an aspect of a treatment that involved reported episodes of verbal indiscretions "taken out of context" with unwanted consequences. © 2015 by the American Psychoanalytic Association.

  5. Analytic neutrality, anonymity, abstinence, and elective self-disclosure.

    PubMed

    Shill, Merton A

    2004-01-01

    Recent contributions to the psychoanalytic literature propose new ways of understanding analytic neutrality, anonymity, abstinence, and self-disclosure. They advocate elective self-disclosure by the analyst as an antidote to the allegedly game-playing quality of transference and resistance analysis. The analytic relationship, they assert, becomes unreal when attempts are made to observe the principles of neutrality and abstinence. Both are seen as ill-conceived because of the irreducible subjectivity and unwarranted authority of the analyst. These relational and interactional views are criticized because (1) they ignore the fact that transference and resistance analysis have from Freud onward been accepted as minimal criteria qualifying a clinical process as psychoanalytic; (2) elective self-disclosure carries metapsychological implications dismissing not only Freud's theory of motivation but motivation as a basic feature of human personality; (3) they do not recognize interpersonal relations as mental events and so do not consider the ego's ability to create intrapsychic representations of object relations; (4) elective self-disclosures within the empathic parameters of the analytic situation are themselves unreal compared to the reality of the patient's experience with other objects. Abstinence and neutrality as ideals facilitate maintenance of an internal holding environment or container for the analyst's countertransference.

  6. Visual communication in the psychoanalytic situation.

    PubMed

    Kanzer, M

    1980-01-01

    The relationship between verbal and visual aspects of the analytic proceedings shows them blended integrally in the experiences of both patient and analyst and in contributing to the insights derived during the treatment. Areas in which the admixture of the verbal and visual occur are delineated. Awareness of the visual aspects gives substance to the operations of empathy, intuition, acting out, working through, etc. Some typical features of visual 'language" are noted and related to the analytic situation. As such they can be translated with the use of logic and consciousness on the analyst's part, not mere random eruptions of intuition. The original significance of dreams as a royal road to the unconscious is confirmed-but we also find in them insights to be derived with higher mental processes. Finally, dyadic aspects of the formation and aims of dreams during analysis are pointed out, with important implications for the analyst's own self-supervision of his techniques and 'real personality" and their effects upon the patient. how remarkable that Dora's dreams, all too belatedly teaching Freud about their transference implications, still have so much more to communicate that derives from his capacity to record faithfully observations he was not yet ready to explain.

  7. Utilizing functional near-infrared spectroscopy for prediction of cognitive workload in noisy work environments.

    PubMed

    Gabbard, Ryan; Fendley, Mary; Dar, Irfaan A; Warren, Rik; Kashou, Nasser H

    2017-10-01

    Occupational noise frequently occurs in the work environment in military intelligence, surveillance, and reconnaissance operations. It impacts cognitive performance by acting as a stressor, potentially interfering with the analysts' decision-making process. We investigated the effects of different noise stimuli on analysts' performance and workload in anomaly detection by simulating a noisy work environment. We utilized functional near-infrared spectroscopy (fNIRS) to quantify oxy-hemoglobin (HbO) and deoxy-hemoglobin concentration changes in the prefrontal cortex (PFC), as well as behavioral measures, which included eye tracking, reaction time, and accuracy rate. We hypothesized that noisy environments would have a negative effect on the participant in terms of anomaly detection performance due to the increase in workload, which would be reflected by an increase in PFC activity. We found that HbO for some of the channels analyzed was significantly different across noise types ([Formula: see text]). Our results also indicated that HbO activation in the PFC was greater for short-intermittent noise stimuli than for long-intermittent noises. These approaches, using fNIRS in conjunction with an understanding of the impact on human analysts in anomaly detection, could potentially lead to better performance by optimizing work environments.

  8. Visualization of Spatio-Temporal Relations in Movement Event Using Multi-View

    NASA Astrophysics Data System (ADS)

    Zheng, K.; Gu, D.; Fang, F.; Wang, Y.; Liu, H.; Zhao, W.; Zhang, M.; Li, Q.

    2017-09-01

    Spatio-temporal relations among movement events extracted from temporally varying trajectory data can provide useful information about the evolution of individual or collective movers, as well as their interactions with their spatial and temporal contexts. However, the pure statistical tools commonly used by analysts pose many difficulties, due to the large number of attributes embedded in multi-scale and multi-semantic trajectory data. The need for models that operate at multiple scales to search for relations at different locations within time and space, as well as intuitively interpret what these relations mean, also presents challenges. Since analysts do not know where or when these relevant spatio-temporal relations might emerge, these models must compute statistical summaries of multiple attributes at different granularities. In this paper, we propose a multi-view approach to visualize the spatio-temporal relations among movement events. We describe a method for visualizing movement events and spatio-temporal relations that uses multiple displays. A visual interface is presented, and the user can interactively select or filter spatial and temporal extents to guide the knowledge discovery process. We also demonstrate how this approach can help analysts to derive and explain the spatio-temporal relations of movement events from taxi trajectory data.

  9. Modes of therapeutic action.

    PubMed

    Jones, E E

    1997-12-01

    The dialectic in psychoanalysis between theories about the mutative effects of interpretation and psychological knowledge and those concerning the effects of interpersonal interaction constitutes an important tension for approaches to psychoanalytic technique. This essay briefly summarises the thinking around these alternative conceptualisations of therapeutic action, and introduces a new empirically derived model, that of 'repetitive interaction structure', which attempts to bridge therapeutic action by insight and by relationship. Interaction structure is a way of formulating those aspects of the analytic process that have come to be termed intersubjectivity, transference-countertransference enactments and role responsiveness. The concept operationalises important aspects of interpersonal interaction, and can help specify the two-person patterns that emerge in an analysis. Patient and analyst interact in repetitive ways; these patterns of interaction, which are slow to change, probably reflect the psychological structure of both patient and analyst, whether psychic structure is conceptualised in terms of object-representations or compromise formations and impulse-defence configurations. Therapeutic action is located in the experience, recognition and understanding by patient and analyst of these repetitive interactions. Interaction structures stress the importance of the intrapsychic as a basis for what becomes manifest in the interactive field. Clinical illustrations from a psychoanalysis are provided, and research on repetitive interaction structures is described.

  10. A COMPARISON OF INTER-ANALYST DIFFERENCES IN THE CLASSIFICATION OF A LANDSAT TEM+ SCENE IN SOUTH-CENTRAL VIRGINIA

    EPA Science Inventory

    This study examined inter-analyst classification variability based on training site signature selection only for six classifications from a 10 km2 Landsat ETM+ image centered over a highly heterogeneous area in south-central Virginia. Six analysts classified the image...

  11. [Concordance among analysts from Latin-American laboratories for rice grain appearance determination using a gallery of digital images].

    PubMed

    Avila, Manuel; Graterol, Eduardo; Alezones, Jesús; Criollo, Beisy; Castillo, Dámaso; Kuri, Victoria; Oviedo, Norman; Moquete, Cesar; Romero, Marbella; Hanley, Zaida; Taylor, Margie

    2012-06-01

    The appearance of rice grain is a key aspect in quality determination. Mainly, this analysis is performed by expert analysts through visual observation; however, due to the subjective nature of the analysis, the results may vary among analysts. In order to evaluate the concordance among analysts from Latin-American rice quality laboratories in assessing rice grain appearance through digital images, an inter-laboratory test was performed with ten analysts and images of 90 grains captured with a high-resolution scanner. Rice grains were classified into four categories: translucent, chalky, white belly, and damaged grain. Data were categorized using statistical parameters such as the mode and its frequency, the relative concordance, and the reproducibility parameter kappa. Additionally, a referential image gallery of typical grains for each category was constructed based on mode frequency. Results showed a kappa value of 0.49, corresponding to moderate reproducibility, attributable to subjectivity in the visual analysis of grain images. These results reveal the need to standardize the evaluation criteria among analysts to improve the confidence of the determination of rice grain appearance.
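
    The reported reproducibility figure is a kappa statistic; as an illustration, and assuming a Fleiss-type kappa for multiple analysts (the study may have used a different variant), it can be computed like this:

        # Sketch: Fleiss' kappa for agreement among several analysts classifying grains
        # into categories. Only meant to show how such a reproducibility figure arises.

        def fleiss_kappa(counts):
            """counts[i][j] = number of analysts assigning grain i to category j."""
            n_items = len(counts)
            n_raters = sum(counts[0])
            p_j = [sum(row[j] for row in counts) / (n_items * n_raters)
                   for j in range(len(counts[0]))]
            p_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
                   for row in counts]
            p_bar = sum(p_i) / n_items
            p_e = sum(p * p for p in p_j)
            return (p_bar - p_e) / (1 - p_e)

        # Hypothetical counts: 4 grains, 10 analysts, categories
        # [translucent, chalky, white belly, damaged]
        votes = [
            [9, 1, 0, 0],
            [2, 6, 2, 0],
            [0, 3, 7, 0],
            [1, 1, 2, 6],
        ]
        print(f"Fleiss' kappa: {fleiss_kappa(votes):.2f}")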

  12. Composable Analytic Systems for next-generation intelligence analysis

    NASA Astrophysics Data System (ADS)

    DiBona, Phil; Llinas, James; Barry, Kevin

    2015-05-01

    Lockheed Martin Advanced Technology Laboratories (LM ATL) is collaborating with Professor James Llinas, Ph.D., of the Center for Multisource Information Fusion at the University at Buffalo (State of NY), researching concepts for a mixed-initiative associate system for intelligence analysts to facilitate reduced analysis and decision times while proactively discovering and presenting relevant information based on the analyst's needs, current tasks and cognitive state. Today's exploitation and analysis systems have largely been designed for a specific sensor, data type, and operational context, leading to difficulty in directly supporting the analyst's evolving tasking and work product development preferences across complex Operational Environments. Our interactions with analysts illuminate the need to impact the information fusion, exploitation, and analysis capabilities in a variety of ways, including understanding data options, algorithm composition, hypothesis validation, and work product development. Composable Analytic Systems, an analyst-driven system that increases flexibility and the capability to effectively utilize Multi-INT fusion and analytics tailored to the analyst's mission needs, holds promise to address current and future intelligence analysis needs as US forces engage threats in contested and denied environments.

  13. 40 CFR Appendix C to Part 425 - Definition and Procedure for the Determination of the Method Detection Limit 1

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... It is recognized that the experience of the analyst is important to this process. However, the..., G.D., Quave, S.A., and Budde, W.L., “Trace Analysis for Wastewaters,” Environmental Science and...

  14. 40 CFR Appendix C to Part 425 - Definition and Procedure for the Determination of the Method Detection Limit 1

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... It is recognized that the experience of the analyst is important to this process. However, the..., G.D., Quave, S.A., and Budde, W.L., “Trace Analysis for Wastewaters,” Environmental Science and...

  15. 40 CFR Appendix C to Part 425 - Definition and Procedure for the Determination of the Method Detection Limit 1

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... It is recognized that the experience of the analyst is important to this process. However, the..., G.D., Quave, S.A., and Budde, W.L., “Trace Analysis for Wastewaters,” Environmental Science and...

  16. 40 CFR Appendix C to Part 425 - Definition and Procedure for the Determination of the Method Detection Limit 1

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... It is recognized that the experience of the analyst is important to this process. However, the..., G.D., Quave, S.A., and Budde, W.L., “Trace Analysis for Wastewaters,” Environmental Science and...

  17. Enactment controversies: a critical review of current debates.

    PubMed

    Ivey, Gavin

    2008-02-01

    This critical review of the current disputes concerning countertransference enactment systematically outlines the various issues and the perspectives adopted by the relevant psychoanalytic authors. In the light of this, the 'common ground' hypothesis concerning the unifying influence of contemporary countertransference theory is challenged. While the existence of enactments, minimally defined as the analyst's inadvertent actualization of the patient's transference fantasies, is widely accepted, controversies abound regarding their specific scope, nature, prevalence, relationship to countertransference experience, impact on the analytic process, the role played by the analyst's subjectivity, and their correct handling. Rather than taking a stand based on ideological allegiance to any particular psychoanalytic school or philosophical position, the author argues that the relative merits of contending perspectives are best evaluated with reference to close process scrutiny of the context, manifestation and impact of specific enactments on patients' intrapsychic functioning and the analytic relationship. A detailed account of an interpretative enactment provides a context for the author's position on these debates.

  18. Aspect-Oriented Business Process Modeling with AO4BPMN

    NASA Astrophysics Data System (ADS)

    Charfi, Anis; Müller, Heiko; Mezini, Mira

    Many crosscutting concerns in business processes need to be addressed already at the business process modeling level such as compliance, auditing, billing, and separation of duties. However, existing business process modeling languages including OMG's Business Process Modeling Notation (BPMN) lack appropriate means for expressing such concerns in a modular way. In this paper, we motivate the need for aspect-oriented concepts in business process modeling languages and propose an aspect-oriented extension to BPMN called AO4BPMN. We also present a graphical editor supporting that extension.
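
    As a rough illustration of the aspect-oriented idea, separate from the AO4BPMN notation itself, the sketch below weaves a crosscutting audit concern into a toy activity list using a pointcut and an advice function (all names hypothetical):

        # Sketch: weaving a crosscutting "audit" concern into a business process model.
        # Toy data structures; AO4BPMN itself is a graphical BPMN extension, not Python.

        process = ["receive order", "check credit", "approve payment", "ship goods"]

        # Pointcut: which activities the aspect applies to (here: anything touching money).
        def payment_pointcut(activity):
            return "credit" in activity or "payment" in activity

        # Advice: what to add around a matched activity.
        def audit_advice(activity):
            return [f"log start of '{activity}'", activity, f"log result of '{activity}'"]

        def weave(activities, pointcut, advice):
            woven = []
            for activity in activities:
                woven.extend(advice(activity) if pointcut(activity) else [activity])
            return woven

        for step in weave(process, payment_pointcut, audit_advice):
            print(step)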

  19. Integrated Business Process Adaptation towards Friction-Free Business-to-Business Collaboration

    ERIC Educational Resources Information Center

    Shan, Zhe

    2011-01-01

    One key issue in process-aware E-commerce collaboration is the orchestration of business processes of multiple business partners throughout a supply chain network in an automated and seamless way. Since each partner has its own internal processes with different control flow structures and message interfaces, the real challenge lies in verifying…

  20. Enhancing Student Learning of Enterprise Integration and Business Process Orientation through an ERP Business Simulation Game

    ERIC Educational Resources Information Center

    Seethamraju, Ravi

    2011-01-01

    The sophistication of the integrated world of work and increased recognition of business processes as critical corporate assets require graduates to develop "process orientation" and an "integrated view" of business. Responding to these dynamic changes in business organizations, business schools are also continuing to modify…

  1. Articulating the Resources for Business Process Analysis and Design

    ERIC Educational Resources Information Center

    Jin, Yulong

    2012-01-01

    Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…

  2. Milestones of mathematical model for business process management related to cost estimate documentation in petroleum industry

    NASA Astrophysics Data System (ADS)

    Khamidullin, R. I.

    2018-05-01

    The paper is devoted to milestones of an optimal mathematical model for a business process related to the cost estimate documentation compiled during construction and reconstruction of oil and gas facilities. It describes the study and analysis of fundamental issues in the petroleum industry caused by economic instability and deterioration of business strategy. Business process management is presented as business process modeling aimed at improving the studied business process, covering the main optimization criteria and recommendations for improving the above-mentioned business model.

  3. An analysis of Milwaukee county land use

    NASA Technical Reports Server (NTRS)

    Todd, W. J.; Mausel, P. E.

    1973-01-01

    The identification and classification of urban and suburban phenomena through analysis of remotely-acquired sensor data can provide information of great potential value to many regional analysts. Such classifications, particularly those using spectral data obtained from satellites such as the first Earth Resources Technology Satellite (ERTS-1) orbited by NASA, allow rapid frequent and accurate general land use inventories that are of value in many types of spatial analyses. In this study, Milwaukee County, Wisconsin was classified into several broad land use categories on the basis of computer analysis of four bands of ERTS spectral data (ERTS Frame Number E1017-16093). Categories identified were: (1) road-central business district, (2) grass (green vegetation), (3) suburban, (4) wooded suburb, (5) heavy industry, (6) inner city, and (7) water. Overall, 90 percent accuracy was attained in classification of these urban land use categories.

  4. GASP- General Aviation Synthesis Program. Volume 1: Main program. Part 1: Theoretical development

    NASA Technical Reports Server (NTRS)

    Hague, D.

    1978-01-01

    The General Aviation Synthesis Program performs tasks generally associated with aircraft preliminary design and allows an analyst the capability of performing parametric studies in a rapid manner. GASP emphasizes small fixed-wing aircraft employing propulsion systems varying from a single piston engine with fixed-pitch propeller through twin turboprop/turbofan powered business or transport type aircraft. The program, which may be operated from a computer terminal in either the batch or interactive graphic mode, is comprised of modules representing the various technical disciplines integrated into a computational flow which ensures that the interacting effects of design variables are continuously accounted for in the aircraft sizing procedure. The model is a useful tool for comparing configurations, assessing aircraft performance and economics, performing tradeoff and sensitivity studies, and assessing the impact of advanced technologies on aircraft performance and economics.

  5. Improve Performance of Data Warehouse by Query Cache

    NASA Astrophysics Data System (ADS)

    Gour, Vishal; Sarangdevot, S. S.; Sharma, Anand; Choudhary, Vinod

    2010-11-01

    The primary goal of a data warehouse is to free the information locked up in the operational database so that decision makers and business analysts can make queries, analyses, and plans regardless of data changes in the operational database. As the number of queries is large, there is in certain cases a reasonable probability that the same query is submitted by one or more users at different times. Each time a query is executed, all the data of the warehouse are analyzed to generate the result of that query. In this paper we study how using a query cache improves the performance of a data warehouse and try to identify the common problems faced. These problems are faced by data warehouse administrators in minimizing response time and improving overall query efficiency, particularly when the data warehouse is updated at regular intervals.
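
    A minimal sketch of the idea, assuming a cache keyed on normalized query text that is emptied whenever the warehouse is refreshed (the executor and queries below are stand-ins):

        # Sketch: a simple result cache for a data warehouse, keyed on normalised query
        # text and emptied whenever the warehouse is refreshed. Illustrative only.

        class QueryCache:
            def __init__(self, execute):
                self._execute = execute          # function that really runs the query
                self._results = {}

            def run(self, sql):
                key = " ".join(sql.lower().split())     # crude normalisation
                if key not in self._results:
                    self._results[key] = self._execute(sql)
                return self._results[key]

            def invalidate(self):
                """Call after each scheduled warehouse load so stale results are dropped."""
                self._results.clear()

        # Usage with a stand-in executor:
        cache = QueryCache(execute=lambda sql: f"rows for: {sql}")
        cache.run("SELECT region, SUM(sales) FROM facts GROUP BY region")
        cache.run("select region, sum(sales)  from facts group by region")  # served from cache
        cache.invalidate()                                                    # after nightly ETL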

  6. The Hazard Mapping System (HMS)-a Multiplatform Remote Sensing Approach to Fire and Smoke Detection

    NASA Astrophysics Data System (ADS)

    Kibler, J.; Ruminski, M. G.

    2003-12-01

    The HMS is a multiplatform remote sensing approach to detecting fires and smoke over the US and adjacent areas of Canada and Mexico that has been in place since June 2002. This system is an integral part of the National Environmental Satellite, Data, and Information Service (NESDIS) near-realtime hazard detection and mitigation efforts. The system utilizes NOAA's Geostationary Operational Environmental Satellites (GOES), Polar Operational Environmental Satellites (POES) and the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument on NASA's Terra and Aqua spacecraft. Automated detection algorithms are employed for each of the satellites for the fire detects, while smoke is added by a satellite image analyst. In June 2003 the HMS underwent an upgrade, and a number of features were added for users of the products generated on the HMS. Sectors covering Alaska and Hawaii were added. The use of Geographic Information System (GIS) shape files for smoke analysis is a new feature; shape files show the progression and time of a single smoke plume as each analysis is drawn and then updated. The analyst now has the ability to view GOES, POES, and MODIS data in a single loop, which allows the fire analyst to easily confirm a fire in three different data sets. The upgraded HMS has faster satellite looping and gives the analyst the ability to design a false color image for a particular region. The GOES satellites provide a relatively coarse 4 km infrared resolution at satellite subpoint for thermal fire detection but provide the advantage of a rapid update cycle; GOES imagery is updated every 15 minutes utilizing both GOES-10 and GOES-12. POES imagery from NOAA-15, NOAA-16 and NOAA-17 and MODIS from Terra and Aqua are employed, with each satellite providing twice-per-day coverage (more frequent over Alaska). While the frequency of imagery is much less than with GOES, the higher resolution of these satellites (1 km along the suborbital track) allows for detection of smaller and/or cooler burning fires. Each of the algorithms utilizes a number of temporal, thermal and contextual filters in an attempt to screen out false detects. However, false detects do get processed by the algorithms to varying degrees. Therefore, the automated fire detects from each algorithm are quality controlled by an analyst who scans the imagery and may either accept or delete fire points. The analyst also has the ability to manually add additional fire points based on the imagery. Smoke is outlined by the analyst using visible imagery, primarily GOES, which provides 1 km resolution. Occasionally a smoke plume seen in visible imagery is the only indicator of a fire and would be manually added to the fire detect file. The Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model is a forecast model that projects the trajectory and dispersion of a smoke plume over a period of time. HYSPLIT is run for analyst-selected fires that are seen to be producing a significant smoke plume. The analyst defines a smoke-producing area commensurate with the size of the fire and the amount of smoke detected. The output is hosted on an Air Resources Lab (ARL) web site which can be accessed from the web site listed below. All of the information is posted to the web page noted below. Besides the interactive GIS presentation, users can view the product in graphical jpg format. The analyst-edited points as well as the unedited automated fire detects are available for users to view directly on the web page or to download. All of the data are also archived and accessible via ftp.

  7. The Similarity of Job Types Reported from Two Independent Analyses of Occupational Data. Interim Report. April 2, 1973-October 12, 1973.

    ERIC Educational Resources Information Center

    Watson, William J.

    Occupational analysts using Comprehensive Occupational Data Analysis Programs (CODAP) make subjective decisions at various stages in their analysis of an occupation. The possibility exists that two different analysts could reach different conclusions in analyzing an occupation, and thereby provide divergent guidance to management. Two analysts,…

  8. A case history: from traumatic repetition towards psychic representability.

    PubMed

    Bichi, Estela L

    2008-06-01

    This paper is devoted principally to a case history concerning an analytic process extending over a period of almost ten years. The patient is B, who consulted the author after a traumatic episode. Although that was her reason for commencing treatment, a history of previous traumatogenic situations, including a rape during her adolescence, subsequently came to light. The author describes three stages of the treatment, reflected in three different settings in accordance with the work done by both patient and analyst in enabling B to own and work through her infantile and adult traumatic experiences. The process of transformation of traumatic traces lacking psychic representation, which was undertaken by both members of the analytic couple from the beginning of the treatment, was eventually approached in a particular way on the basis of their respective creative capacities, which facilitated the patient's psychic progress towards representability and the possibility of working through the experiences of the past. Much of the challenge of this case involved the analyst's capacity to maintain and at the same time consolidate her analytic posture within her internal setting, while doing her best to overcome any possible misfit (Balint, 1968) between her own technique and the specific complexities of the individual patient. The account illustrates the alternation of phases, at the beginning of the analysis, of remembering and interpretation on the one hand and of the representational void and construction on the other. In the case history proper and in her detailed summing up, the author refers to the place of the analyst during the analytic process, the involvement of her psychic functioning, and the importance of her capacity to work on and make use of her countertransference and self-analytic introspection, with a view to neutralizing any influence that aspects of her 'real person' might have had on the analytic field and on the complex processes taking place within it.

  9. An Application of Business Process Management to Health Care Facilities.

    PubMed

    Hassan, Mohsen M D

    The purpose of this article is to help health care facility managers and personnel identify significant elements of their facilities to address, and steps and actions to follow, when applying business process management to them. The ABPMP (Association of Business Process Management Professionals) life-cycle model of business process management is adopted, and steps from Lean, business process reengineering, and Six Sigma, and actions from operations management, are presented to implement it. Managers of health care facilities can find in business process management an approach to improving their facilities that is more comprehensive than Lean, Six Sigma, business process reengineering, and ad hoc approaches, and that does not conflict with them because many of their elements can be included under its umbrella. Furthermore, the suggested application of business process management can relieve managers of selecting among these approaches and guide them, as well as provide them with specific steps and actions that they can follow. This article fills a gap in the literature by presenting a much-needed comprehensive application of business process management to health care facilities that has specific steps and actions for implementation.

  10. NPTool: Towards Scalability and Reliability of Business Process Management

    NASA Astrophysics Data System (ADS)

    Braghetto, Kelly Rosa; Ferreira, João Eduardo; Pu, Calton

    Currently, one important challenge in business process management is to provide scalability and reliability of business process executions at the same time. This difficulty becomes more accentuated when the execution control involves countless complex business processes. This work presents NavigationPlanTool (NPTool), a tool to control the execution of business processes. NPTool is supported by the Navigation Plan Definition Language (NPDL), a language for business process specification that uses process algebra as its formal foundation. NPTool implements the NPDL language as a SQL extension. The main contribution of this paper is a description of NPTool showing how process algebra features combined with a relational database model can be used to provide scalable and reliable control of the execution of business processes. The next steps for NPTool include reuse of control-flow patterns and support for data flow management.
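
    The following toy interpreter illustrates the underlying algebraic idea, sequence and exclusive choice operators whose composition enumerates the valid execution traces of a plan; it is not NPDL or its SQL extension:

        # Sketch: enumerating the valid execution traces of a tiny process-algebra
        # expression (sequence and exclusive choice). NPDL itself is realised as an
        # SQL extension; this toy only illustrates the underlying algebraic idea.

        def seq(*branches):      # sequential composition: traces concatenate in order
            def traces():
                result = [[]]
                for branch in branches:
                    result = [t + bt for t in result for bt in branch()]
                return result
            return traces

        def choice(*branches):   # exclusive choice: traces of exactly one branch
            return lambda: [t for branch in branches for t in branch()]

        def act(name):           # atomic activity
            return lambda: [[name]]

        plan = seq(act("receive order"),
                   choice(act("approve"), act("reject")),
                   act("archive"))

        for trace in plan():
            print(" -> ".join(trace))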

  11. Amie Sluiter | NREL

    Science.gov Websites

    biomass analysis methods and is primary author on 11 Laboratory Analytical Procedures, which are ... spectroscopic analysis methods. These methods allow analysts to predict the composition of feedstock and process ... Patent No. 6,737,258 (2002). Featured publications: "Improved methods for the determination of drying...

  12. Should I use that model? Assessing the transferability of ecological models to new settings

    EPA Science Inventory

    Analysts and scientists frequently apply existing models that estimate ecological endpoints or simulate ecological processes to settings where the models have not been used previously, and where data to parameterize and validate the model may be sparse. Prior to transferring an ...

  13. Three Perspectives on Innovation--the Technological, the Political, and the Cultural. Draft.

    ERIC Educational Resources Information Center

    House, Ernest R.

    Awareness of the three analytical perspectives on educational innovation leads to better understanding of educational change processes and better innovation strategies and policies. The three perspectives--technological, political, and cultural--are "screens" of facts, values, and presuppositions through which analysts view innovation.…

  14. The analyst's authenticity: "if you see something, say something".

    PubMed

    Goldstein, George; Suzuki, Jessica Y

    2015-05-01

    The history of authenticity in psychoanalysis is as old as analysis itself, but the analyst's authenticity in particular has become an increasingly important area of focus in recent decades. This article traces the development of conceptions of analytic authenticity and proposes that the analyst's spontaneous verbalization of his or her unformulated experience in session can be a potent force in the course of an analysis. We acknowledge that although analytic authenticity can be a challenging ideal for the analyst to strive for, it contains the power to transform the experience of the patient and the analyst, as well as the meaning of their work together. Whether it comes in the form of an insight-oriented comment or a simple acknowledgment of things as they seem to be, a therapist's willingness to speak aloud something that has lost its language is a powerful clinical phenomenon that transcends theoretical orientation and modality. © 2015 Wiley Periodicals, Inc.

  15. Instruction in information structuring improves Bayesian judgment in intelligence analysts.

    PubMed

    Mandel, David R

    2015-01-01

    An experiment was conducted to test the effectiveness of brief instruction in information structuring (i.e., representing and integrating information) for improving the coherence of probability judgments and binary choices among intelligence analysts. Forty-three analysts were presented with comparable sets of Bayesian judgment problems before and immediately after instruction. After instruction, analysts' probability judgments were more coherent (i.e., more additive and compliant with Bayes theorem). Instruction also improved the coherence of binary choices regarding category membership: after instruction, subjects were more likely to invariably choose the category to which they assigned the higher probability of a target's membership. The research provides a rare example of evidence-based validation of effectiveness in instruction to improve the statistical assessment skills of intelligence analysts. Such instruction could also be used to improve the assessment quality of other types of experts who are required to integrate statistical information or make probabilistic assessments.
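
    The two coherence criteria mentioned, additivity and compliance with Bayes theorem, can be made concrete for a single binary hypothesis as follows (the judged and input probabilities are invented):

        # Sketch: the two coherence checks described above, made concrete for a single
        # binary hypothesis H given evidence E. Numbers are invented.

        def bayes_posterior(prior_h, p_e_given_h, p_e_given_not_h):
            """Normative posterior P(H|E) from Bayes theorem."""
            numerator = prior_h * p_e_given_h
            return numerator / (numerator + (1 - prior_h) * p_e_given_not_h)

        # An analyst's stated judgments (hypothetical):
        judged_h_given_e = 0.70
        judged_not_h_given_e = 0.20

        # Check 1 -- additivity: P(H|E) + P(not H|E) should equal 1.
        print("additivity gap:", abs(1.0 - (judged_h_given_e + judged_not_h_given_e)))

        # Check 2 -- agreement with Bayes theorem, given the same analyst's inputs.
        normative = bayes_posterior(prior_h=0.30, p_e_given_h=0.80, p_e_given_not_h=0.25)
        print(f"normative posterior: {normative:.2f}, judged: {judged_h_given_e:.2f}")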

  16. The lure of the symptom in psychoanalytic treatment.

    PubMed

    Ogden, Thomas H; Gabbard, Glen O

    2010-06-01

    Psychoanalysis, which at its core is a search for truth, stands in a subversive position vis-à-vis the contemporary therapeutic culture that places a premium on symptomatic "cure." Nevertheless, analysts are vulnerable to succumbing to the internal and external pressures for the achievement of symptomatic improvement. In this communication we trace the evolution of Freud's thinking about the relationship between the aims of psychoanalysis and the alleviation of symptoms. We note that analysts today may recapitulate Freud's early struggles in their pursuit of symptom removal. We present an account of a clinical consultation in which the analytic pair were ensnared in an impasse that involved the analyst's preoccupation with the intransigence of one of the patient's symptoms. We suggest alternative ways of working with these clinical issues and offer some thoughts on how our own work as analysts and consultants to colleagues has been influenced by our understanding of what frequently occurs when the analyst becomes symptom-focused.

  17. Self-disclosure, trauma and the pressures on the analyst.

    PubMed

    West, Marcus

    2017-09-01

    This paper argues that self-disclosure is intimately related to traumatic experience and the pressures on the analyst not to re-traumatize the patient or repeat traumatic dynamics. The paper gives a number of examples of such pressures and outlines the difficulties the analyst may experience in adopting an analytic attitude - attempting to stay as closely as possible with what the patient brings. It suggests that self-disclosure may be used to try to disconfirm the patient's negative sense of themselves or the analyst, or to try to induce a positive sense of self or of the analyst which, whilst well-meaning, may be missing the point and may be prolonging the patient's distress. Examples are given of staying with the co-construction of the traumatic early relational dynamics and thus working through the traumatic complex; this attitude is compared and contrasted with some relational psychoanalytic attitudes. © 2017, The Society of Analytical Psychology.

  18. Principles of computer processing of Landsat data for geologic applications

    USGS Publications Warehouse

    Taranik, James V.

    1978-01-01

    The main objectives of computer processing of Landsat data for geologic applications are to improve the display of image data to the analyst or to facilitate evaluation of the multispectral characteristics of the data. Interpretations of the data are made from enhanced and classified data by an analyst trained in geology. Image enhancements involve adjustments of brightness values for individual picture elements. Image classification involves determination of the brightness values of picture elements for a particular cover type. Histograms are used to display the range and frequency of occurrence of brightness values. Landsat-1 and -2 data are preprocessed at Goddard Space Flight Center (GSFC) to adjust for the detector response of the multispectral scanner (MSS). Adjustments are applied to minimize the effects of striping and to compensate for bad-data lines, line segments, and lost individual pixel data. Because illumination conditions and landscape characteristics vary considerably and detector response changes with time, the radiometric adjustments applied at GSFC are seldom perfect and some detector striping remains in Landsat data. Rotation of the Earth under the satellite and movements of the satellite platform introduce geometric distortions in the data that must also be compensated for if image data are to be correctly displayed to the data analyst. Adjustments to Landsat data are made to compensate for variable solar illumination and for atmospheric effects. Geometric registration of Landsat data involves determination of the spatial location of a pixel in the output image and the determination of a new value for the pixel. The general objective of image enhancement is to optimize display of the data to the analyst. Contrast enhancements are employed to expand the range of brightness values in Landsat data so that the data can be efficiently recorded in a manner desired by the analyst. Spatial frequency enhancements are designed to enhance boundaries between features that have subtle differences in brightness values. Ratioing tends to reduce the effects of topography and to emphasize changes in brightness values between two Landsat bands. Simulated natural color is produced for geologists so that the colors of materials on images appear similar to the colors of the actual materials in the field. Image classification of Landsat data involves both machine-assisted delineation of multispectral patterns in four-dimensional spectral space and identification of machine-delineated multispectral patterns that represent particular cover conditions. The geological information derived from an analysis of a multispectral classification is usually related to lithology.
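    The following minimal Python sketch illustrates two of the enhancement operations described above, a linear contrast stretch and band ratioing; it is not the GSFC preprocessing itself, and the pixel values and band names are invented.

    ```python
    import numpy as np

    # Illustrative sketch (not the actual Landsat preprocessing) of two enhancements
    # described above: a linear contrast stretch and a band ratio.

    def linear_stretch(band, out_min=0, out_max=255):
        """Expand the range of brightness values to fill the display range."""
        b = band.astype(float)
        lo, hi = b.min(), b.max()
        return ((b - lo) / (hi - lo) * (out_max - out_min) + out_min).astype(np.uint8)

    def band_ratio(band_a, band_b, eps=1e-6):
        """Ratioing suppresses topographic shading common to both bands."""
        return band_a.astype(float) / (band_b.astype(float) + eps)

    # Hypothetical 2x2 pixel values for two MSS bands.
    band5 = np.array([[20, 40], [60, 80]])
    band7 = np.array([[10, 10], [30, 40]])
    print(linear_stretch(band5))
    print(band_ratio(band5, band7))
    ```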

  19. Bureaucracy and creativity: do they make companionable bedfellows?

    PubMed

    Wiener, Jan

    2017-11-01

    This essay will look at the benefits and weaknesses of the increasingly bureaucratic nature of training structures and processes in the training of Jungian psychotherapists and analysts. The author will draw on her experiences during two different periods of time as Director of Training at the Society of Analytical Psychology in London with observations on and discussion about some of the changes that have evolved. By way of contrast, she will offer some comparisons with developments in the training of Jungian analysts in countries with little or no legacy of an analytic culture. Here, there is a need to professionalize training in Jungian analysis but the attendant growth of bureaucracy can easily come to echo the politics of non-democratic regimes. © 2017, The Society of Analytical Psychology.

  20. Business Process Design Method Based on Business Event Model for Enterprise Information System Integration

    NASA Astrophysics Data System (ADS)

    Kobayashi, Takashi; Komoda, Norihisa

    Traditional business process design methods, of which the use case is the most typical, provide no useful framework for designing the activity sequence. As a result, design efficiency and quality vary widely with the designer's experience and skill. In this paper, to solve this problem, we propose business events and their state transition model (a basic business event model) based on the language/action perspective, a result from the cognitive science domain. In business process design using this model, we decide event occurrence conditions so that the events synchronize with one another. We also propose a design pattern for deciding the event occurrence conditions (a business event improvement strategy). Lastly, we apply the business process design method based on the business event model and the business event improvement strategy to a credit card issue process and estimate its effect.
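    A minimal sketch, assuming a toy credit card issue process, of how event states and occurrence conditions might be represented; the states, actions, and transition table below are illustrative only and are not the authors' model.

    ```python
    # Hypothetical sketch of an event state-transition idea: each business event
    # moves through states, and an occurrence condition gates the transition so
    # that related events stay synchronized. Names are invented for illustration.

    TRANSITIONS = {
        ("application_received", "approve"): "credit_check_started",
        ("credit_check_started", "pass"): "card_issue_requested",
        ("card_issue_requested", "issue"): "card_issued",
    }

    def fire(state, action, occurrence_condition=lambda: True):
        """Advance the event state only if the occurrence condition holds."""
        if not occurrence_condition():
            return state  # condition not met; the event does not occur yet
        return TRANSITIONS.get((state, action), state)

    state = "application_received"
    state = fire(state, "approve")
    state = fire(state, "pass", occurrence_condition=lambda: True)
    print(state)  # -> card_issue_requested
    ```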

  1. Multidimensional Data Modeling for Business Process Analysis

    NASA Astrophysics Data System (ADS)

    Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.

    The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering business process modeling in conformity with the multidimensional data model. Since the business process model and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward way to converge these models.
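    As a rough illustration of the multidimensional view of process data, the sketch below treats process events as facts with dimensional attributes and aggregates a measure along those dimensions; the schema and values are invented and are not taken from the paper.

    ```python
    import pandas as pd

    # Hypothetical sketch of treating business process events as facts in a
    # multidimensional model: each row is a fact with dimensional attributes
    # (process, activity, performer) and a measure (duration), which can then be
    # aggregated along any dimension as in a data warehouse.

    facts = pd.DataFrame([
        {"process": "order", "activity": "approve", "performer": "clerk", "duration_min": 12},
        {"process": "order", "activity": "ship",    "performer": "ops",   "duration_min": 45},
        {"process": "claim", "activity": "approve", "performer": "clerk", "duration_min": 30},
    ])

    # Roll the facts up by process and activity (a tiny "cube" slice).
    cube = facts.pivot_table(index="process", columns="activity",
                             values="duration_min", aggfunc="mean")
    print(cube)
    ```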

  2. Business Development Process

    DTIC Science & Technology

    2001-10-31

    Attorney Docket No. 83042, "Business Development Process" (Distribution Statement A: approved for public release, distribution unlimited). Statement of Government Interest: the invention described herein may be manufactured and used by or for the ... Field of the Invention: this invention generally relates to a business development process for assessing new business ideas.

  3. A method of demand-driven and data-centric Web service configuration for flexible business process implementation

    NASA Astrophysics Data System (ADS)

    Xu, Boyi; Xu, Li Da; Fei, Xiang; Jiang, Lihong; Cai, Hongming; Wang, Shuai

    2017-08-01

    Facing rapidly changing business environments, implementation of flexible business processes is crucial but difficult, especially in data-intensive application areas. This study aims to provide scalable and easily accessible information resources to leverage business process management. In this article, with a resource-oriented approach, enterprise data resources are represented as data-centric Web services, grouped on demand according to business requirements, and configured dynamically to adapt to changing business processes. First, a configurable architecture, CIRPA, involving an information resource pool is proposed to act as a scalable and dynamic platform that virtualises enterprise information resources as data-centric Web services. By exposing data-centric resources as REST services at larger granularities, tenant-isolated information resources can be accessed during business process execution. Second, a dynamic information resource pool is designed to support configurable, on-demand data access during business process execution. CIRPA also isolates transaction data from business processes while supporting the composition of diverse business processes. Finally, a case study applying our method in a logistics application shows that CIRPA provides enhanced performance in both static service encapsulation and dynamic service execution in a cloud computing environment.
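    The sketch below illustrates the general idea of exposing enterprise data as coarse-grained REST resources with a tenant identifier in the URI; it is not the CIRPA implementation, and the resource names, fields, and routes are hypothetical (Flask is used only for brevity).

    ```python
    from flask import Flask, jsonify

    # Generic, hypothetical sketch of exposing enterprise data as read-only,
    # tenant-scoped REST resources, in the spirit of the data-centric services
    # described above. Resource names and fields are invented.

    app = Flask(__name__)

    SHIPMENTS = {
        "S-1001": {"status": "in_transit", "carrier": "ACME", "eta": "2017-09-01"},
        "S-1002": {"status": "delivered", "carrier": "ACME", "eta": "2017-08-20"},
    }

    @app.route("/tenants/<tenant_id>/shipments/<shipment_id>")
    def get_shipment(tenant_id, shipment_id):
        # Tenant isolation would normally be enforced here; omitted for brevity.
        record = SHIPMENTS.get(shipment_id)
        if record is None:
            return jsonify({"error": "not found"}), 404
        return jsonify({"tenant": tenant_id, "shipment": shipment_id, **record})

    if __name__ == "__main__":
        app.run(port=5000)
    ```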

  4. Department of the Air Force Information Technology Program FY 95 President’s Budget

    DTIC Science & Technology

    1994-03-01

    Description: contractor hardware maintenance support, systems analyst support, software development and maintenance, operations support, configuration management, test support, and off-the-shelf software licenses.

  5. Nothing but the truth: self-disclosure, self-revelation, and the persona of the analyst.

    PubMed

    Levine, Susan S

    2007-01-01

    The question of the analyst's self-disclosure and self-revelation inhabits every moment of every psychoanalytic treatment. All self-disclosures and revelations, however, are not equivalent, and differentiating among them allows us to define a construct that can be called the analytic persona. Analysts already rely on an unarticulated concept of an analytic persona that guides them, for instance, as they decide what constitutes appropriate boundaries. Clinical examples illustrate how self-disclosures and revelations from within and without the analytic persona feel different, for both patient and analyst. The analyst plays a specific role for each patient and is both purposefully and unconsciously different in this context than in other settings. To a great degree, the self is a relational phenomenon. Our ethics call for us to tell nothing but the truth and simultaneously for us not to tell the whole truth. The unarticulated working concept of an analytic persona that many analysts have refers to the self we step out of at the close of each session and the self we step into as the patient enters the room. Attitudes toward self-disclosure and self-revelation can be considered reflections of how we conceptualize this persona.

  6. Analyst-centered models for systems design, analysis, and development

    NASA Technical Reports Server (NTRS)

    Bukley, A. P.; Pritchard, Richard H.; Burke, Steven M.; Kiss, P. A.

    1988-01-01

    Much has been written about the possible use of Expert Systems (ES) technology for strategic defense system applications, particularly for battle management algorithms and mission planning. It is proposed that ES (or, more accurately, Knowledge Based System (KBS)) technology can be used in situations for which no human expert exists, namely to create design and analysis environments that allow an analyst to rapidly pose many different possible problem resolutions in a game-like fashion and then work through the solution space in search of the optimal solution. Portions of such an environment exist for expensive AI hardware/software combinations such as the Xerox LOOPS and Intellicorp KEE systems. Efforts to build an analyst-centered model (ACM) using an ES programming environment, ExperOPS5, for a simple missile system tradeoff study are discussed. By analyst-centered, it is meant that the focus of learning is for the benefit of the analyst, not the model. The model's environment allows the analyst to pose a variety of "what if" questions without resorting to programming changes. Although not an ES per se, the ACM would allow for a design and analysis environment that is much superior to that of current technologies.

  7. Fault Tree in the Trenches, A Success Story

    NASA Technical Reports Server (NTRS)

    Long, R. Allen; Goodson, Amanda (Technical Monitor)

    2000-01-01

    Getting caught up in the explanation of Fault Tree Analysis (FTA) minutiae is easy. In fact, most FTA literature tends to address FTA concepts and methodology, yet there seem to be few articles addressing actual design changes resulting from the successful application of fault tree analysis. This paper demonstrates how fault tree analysis was used to identify and solve a potentially catastrophic mechanical problem at a rocket motor manufacturer. While developing the fault tree given in this example, the analyst was told by several organizations that the piece of equipment in question had already been evaluated by several committees and organizations, and that the analyst was wasting his time. The fault tree/cut-set analysis resulted in a joint redesign of the control system by the tool engineering group and the fault tree analyst, as well as bragging rights for the analyst. (That the fault tree found problems where other engineering reviews had failed was not lost on the other engineering groups.) Even more interesting was that this was the analyst's first fault tree, which further demonstrates how effective fault tree analysis can be in guiding (i.e., forcing) the analyst to take a methodical approach in evaluating complex systems.
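    For readers unfamiliar with the cut-set arithmetic behind such an analysis, the sketch below computes a top-event probability from minimal cut sets by inclusion-exclusion, assuming independent basic events; the events, probabilities, and cut sets are invented and unrelated to the rocket motor case.

    ```python
    from itertools import combinations

    # Hypothetical sketch of fault tree cut-set arithmetic: the top event occurs if
    # every basic event in at least one minimal cut set occurs. All values invented.

    basic_event_prob = {"valve_fails": 0.01, "sensor_fails": 0.02, "operator_error": 0.05}
    minimal_cut_sets = [{"valve_fails", "sensor_fails"}, {"operator_error"}]

    def cut_set_prob(events):
        p = 1.0
        for event in events:
            p *= basic_event_prob[event]
        return p

    def top_event_prob(cut_sets):
        """Inclusion-exclusion over cut sets (assumes independent basic events)."""
        total = 0.0
        for r in range(1, len(cut_sets) + 1):
            for combo in combinations(cut_sets, r):
                union = set().union(*combo)
                total += ((-1) ** (r + 1)) * cut_set_prob(union)
        return total

    print(top_event_prob(minimal_cut_sets))  # ~0.0502 for these invented numbers
    ```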

  8. Service Oriented Architecture for Coast Guard Command and Control

    DTIC Science & Technology

    2007-03-01

    Acronyms include BPEL4WS (the Business Process Execution Language for Web Services), BPMN (Business Process Modeling Notation), and CASP (Computer Aided Search Planning). Business Process Modeling Notation (BPMN) provides a standardized graphical notation for drawing business processes in a workflow.

  9. Business Process Reengineering for Quality Improvement.

    DTIC Science & Technology

    1995-07-01

    Understanding Business Value ... Understanding Business Process ... targets which will promote the survival and growth of the business. ... Many businesses that have survived the test of time ...

  10. Improvement of hospital processes through business process management in Qaem Teaching Hospital: A work in progress.

    PubMed

    Yarmohammadian, Mohammad H; Ebrahimipour, Hossein; Doosty, Farzaneh

    2014-01-01

    In a world of continuously changing business environments, organizations have no option but to deal with this level of transformation in order to meet the resulting demands. Therefore, many companies need to continually review and improve their processes to maintain their competitive advantages in an uncertain environment. Meeting these challenges requires implementing the most efficient possible business processes, geared to the needs of the industry and the market segments that the organization serves globally. In the last 10 years, total quality management, business process reengineering, and business process management (BPM) have been some of the management tools applied by organizations to increase business competitiveness. This paper is an original article that presents the implementation of the BPM approach in the healthcare domain, which allows an organization to improve and review its critical business processes. The project was performed in Qaem Teaching Hospital in Mashhad city, Iran, and consists of four distinct steps: (1) identify business processes, (2) document the process, (3) analyze and measure the process, and (4) improve the process. Implementing BPM in Qaem Teaching Hospital changed the nature of management by allowing the organization to avoid the complexity of disparate, siloed systems. BPM instead enabled the organization to focus on business processes at a higher level.

  11. Aspects of the BPRIM Language for Risk Driven Process Engineering

    NASA Astrophysics Data System (ADS)

    Sienou, Amadou; Lamine, Elyes; Pingaud, Hervé; Karduck, Achim

    Nowadays organizations are exposed to frequent changes in the business environment, requiring continuous alignment of business processes with business strategies. This agility requires methods promoted in enterprise engineering approaches. Risk consideration in enterprise engineering is becoming increasingly important as the business environment grows more competitive and unpredictable. Business processes are subject to the same quality requirements as material and human resources. Thus, process management is expected to address not only value creation challenges but also those related to value preservation. Our research considers risk-driven business process design as an integral part of enterprise engineering. A graphical modelling language for risk-driven business process engineering was introduced in earlier research. This paper extends the language and addresses questions related to modelling risk in an organisational context.

  12. Modeling Business Processes of the Social Insurance Fund in Information System Runa WFE

    NASA Astrophysics Data System (ADS)

    Kataev, M. Yu; Bulysheva, L. A.; Xu, Li D.; Loseva, N. V.

    2016-08-01

    Business processes are gradually becoming a tool for organizing staff work and making document management systems more efficient, and most published work concentrates on these directions. However, business processes are still poorly implemented in public institutions, where the main existing processes are very difficult to formalize. We attempt to build a system of business processes for a state agency, the Russian Social Insurance Fund (SIF, also referred to as the FSS), where virtually all processes, whatever their inputs, have the same output: a public service. The parameters of state services (as a rule, time limits) are set by state laws and regulations. The article provides a brief overview of the FSS, the formulation of requirements for its business processes, the justification of the choice of software for modeling business processes, the creation of models in the Runa WFE system, and the optimization of a model of one of the main FSS business processes. The result is an optimized model of an FSS business process.

  13. Memory Forensics: Review of Acquisition and Analysis Techniques

    DTIC Science & Technology

    2013-11-01

    Processes running on modern multitasking operating systems operate on an abstraction of RAM called virtual memory [7]. In these systems ... information such as user names, email addresses, and passwords [7]. Analysts also use tools such as WinHex to identify headers or other suspicious data within ...

  14. Improved processes for meeting the data requirements for implementing the Highway Safety Manual (HSM) and Safety Analyst in Florida.

    DOT National Transportation Integrated Search

    2014-03-01

    Recent research in highway safety has focused on more advanced and statistically proven techniques of highway safety analysis. This project focuses on the two most recent safety analysis tools, the Highway Safety Manual (HSM) and Safety Analyst...

  15. A Graph Oriented Approach for Network Forensic Analysis

    ERIC Educational Resources Information Center

    Wang, Wei

    2010-01-01

    Network forensic analysis is a process that analyzes intrusion evidence captured from a networked environment to identify suspicious entities and stepwise actions in an attack scenario. Unfortunately, the overwhelming amount and low quality of output from security sensors make it difficult for analysts to obtain a succinct high-level view of complex…

  16. Stage Evolution of Office Automation Technological Change and Organizational Learning.

    ERIC Educational Resources Information Center

    Sumner, Mary

    1985-01-01

    A study was conducted to identify stage characteristics in terms of technology, applications, the role and responsibilities of the office automation organization, and planning and control strategies; and to describe the respective roles of data processing professionals, office automation analysts, and users in office automation systems development…

  17. Glendale Central Library Circulation Department Work Simplication Study.

    ERIC Educational Resources Information Center

    Applegate, H. C.

    An analyst from the staff of the City Manager of Glendale, California, conducted a work simplification study for the central library to adjust public service operations in the library's new quarters. This report presents the study's recommendations concerning the book circulation process. Following an outline of the organization and functions of…

  18. Using Performance Analysis for Training in an Organization Implementing ISO-9000 Manufacturing Practices: A Case Study.

    ERIC Educational Resources Information Center

    Kunneman, Dale E.; Sleezer, Catherine M.

    2000-01-01

    This case study examines the application of the Performance Analysis for Training (PAT) Model in an organization that was implementing ISO-9000 (International Standards Organization) processes for manufacturing practices. Discusses the interaction of organization characteristics, decision maker characteristics, and analyst characteristics to…

  19. 78 FR 39284 - Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-01

    ... environmental justice assessments of its regulatory actions for years. This experience and body of work... set of questions to guide analysts in evaluating potential environmental justice concerns in EPA rules.... This guidance takes into account EPA's past experience in integrating EJ into the rulemaking process...

  20. Rubber Plastics Processing Industry Training Board

    ERIC Educational Resources Information Center

    Industrial Training International, 1974

    1974-01-01

    The training adviser's role is changing from trainer to problem analyst. Some of the problems being dealt with include: (1) the school to industry transition, (2) new training methods for the 16 to 18 year old entry worker, (3) foreign language training, (4) safety programs, and (5) tire-fitter training. (MW)

  1. Opportunities for Applied Behavior Analysis in the Total Quality Movement.

    ERIC Educational Resources Information Center

    Redmon, William K.

    1992-01-01

    This paper identifies critical components of recent organizational quality improvement programs and specifies how applied behavior analysis can contribute to quality technology. Statistical Process Control and Total Quality Management approaches are compared, and behavior analysts are urged to build their research base and market behavior change…

  2. Democratic Nation-Building in South Africa.

    ERIC Educational Resources Information Center

    Rhoodie, Nic, Ed.; Liebenberg, Ian, Ed.

    This book is a collection of essays by 50 eminent experts/analysts representing a broad range of ideological perspectives and interest groups. Its aim is to contribute to the process of democratic nation-building and the creation of a culture of tolerance by educating South Africans about the intricacies of community reconciliation and…

  3. Noah Pflaum | NREL

    Science.gov Websites

    Noah joined NREL in 2017 after having worked as a consulting building energy analyst. His work aims to smooth the integration of building energy modeling into the building design process. Noah applies a variety of analytical techniques to solve problems associated with building performance.

  4. Hypersonic and Supersonic Flow Roadmaps Using Bibliometrics and Database Tomography.

    ERIC Educational Resources Information Center

    Kostoff, R. N.; Eberhart, Henry J.; Toothman, Darrell Ray

    1999-01-01

    Database Tomography (DT) is a textual database-analysis system consisting of algorithms for extracting multiword phrase frequencies and proximities from a large textual database, to augment interpretative capabilities of the expert human analyst. Describes use of the DT process, supplemented by literature bibliometric analyses, to derive technical…

  5. Better Incident Response with SCOT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruner, Todd

    2015-04-01

    SCOT is an incident response management system and knowledge base designed for incident responders by incident responders. SCOT increases the effectiveness of the team without adding undue burdens. Focused on reducing the friction between analysts and their tools, SCOT enables analysts to document and share their research and response efforts in near real time. Automatically identifying indicators and correlating those indicators, SCOT helps analysts discover and respond to advanced threats.

  6. On talking-as-dreaming.

    PubMed

    Ogden, Thomas H

    2007-06-01

    Many patients are unable to engage in waking-dreaming in the analytic setting in the form of free association or in any other form. The author has found that "talking-as-dreaming" has served as a form of waking-dreaming in which such patients have been able to begin to dream formerly undreamable experience. Such talking is a loosely structured form of conversation between patient and analyst that is often marked by primary process thinking and apparent non sequiturs. Talking-as-dreaming superficially appears to be "unanalytic" in that it may seem to consist "merely" of talking about such topics as books, films, etymology, baseball, the taste of chocolate, the structure of light, and so on. When an analysis is "a going concern," talking-as-dreaming moves unobtrusively into and out of talking about dreaming. The author provides two detailed clinical examples of analytic work with patients who had very little capacity to dream in the analytic setting. In the first clinical example, talking-as-dreaming served as a form of thinking and relating in which the patient was able for the first time to dream her own (and, in a sense, her father's) formerly unthinkable, undreamable experience. The second clinical example involves the use of talking-as-dreaming as an emotional experience in which the formerly "invisible" patient was able to begin to dream himself into existence. The analyst, while engaging with a patient in talking-as-dreaming, must remain keenly aware that it is critical that the difference in roles of patient and analyst be a continuously felt presence; that the therapeutic goals of analysis be firmly held in mind; and that the patient be given the opportunity to dream himself into existence (as opposed to being dreamt up by the analyst).

  7. A visual analytic framework for data fusion in investigative intelligence

    NASA Astrophysics Data System (ADS)

    Cai, Guoray; Gross, Geoff; Llinas, James; Hall, David

    2014-05-01

    Intelligence analysis depends on data fusion systems to provide capabilities for detecting and tracking important objects, events, and their relationships in connection with an analytical situation. However, automated data fusion technologies are not mature enough to offer reliable and trustworthy information for situation awareness. Given the trend of increasing sophistication of data fusion algorithms and the loss of transparency in the data fusion process, analysts are left out of the data fusion process cycle with little to no control over, or confidence in, the data fusion outcome. Following the recent rethinking of data fusion as a human-centered process, this paper proposes a conceptual framework towards developing an alternative data fusion architecture. This idea is inspired by recent advances in our understanding of human cognitive systems, the science of visual analytics, and the latest thinking about human-centered data fusion. Our conceptual framework is supported by an analysis of the limitations of existing fully automated data fusion systems, in which the effectiveness of important algorithmic decisions depends on the availability of expert knowledge or knowledge of the analyst's mental state in an investigation. The success of this effort will result in next-generation data fusion systems that can be better trusted while maintaining high throughput.

  8. Simulating the Composite Propellant Manufacturing Process

    NASA Technical Reports Server (NTRS)

    Williamson, Suzanne; Love, Gregory

    2000-01-01

    There is a strategic interest in understanding how the propellant manufacturing process contributes to military capabilities outside the United States. The paper will discuss how system dynamics (SD) has been applied to rapidly assess the capabilities and vulnerabilities of a specific composite propellant production complex. These facilities produce a commonly used solid propellant with military applications. The authors will explain how an SD model can be configured to match a specific production facility followed by a series of scenarios designed to analyze operational vulnerabilities. By using the simulation model to rapidly analyze operational risks, the analyst gains a better understanding of production complexities. There are several benefits of developing SD models to simulate chemical production. SD is an effective tool for characterizing complex problems, especially the production process where the cascading effect of outages quickly taxes common understanding. By programming expert knowledge into an SD application, these tools are transformed into a knowledge management resource that facilitates rapid learning without requiring years of experience in production operations. It also permits the analyst to rapidly respond to crisis situations and other time-sensitive missions. Most importantly, the quantitative understanding gained from applying the SD model lends itself to strategic analysis and planning.
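    A minimal stock-and-flow sketch in the spirit of the system dynamics approach described above: an outage in one production stage cascades into lost downstream output; all rates and the outage window are invented and do not describe any real facility.

    ```python
    # Hypothetical stock-and-flow simulation: ingredients flow into a mixing stock
    # and out to casting; an outage in mixing cascades into lost downstream output.
    # Rates, horizon, and the outage window are invented for illustration.

    def simulate(days=30, mix_rate=10.0, cast_rate=10.0, outage=range(10, 15)):
        mixed_inventory = 0.0
        cast_total = 0.0
        history = []
        for day in range(days):
            inflow = 0.0 if day in outage else mix_rate         # mixer down during outage
            outflow = min(cast_rate, mixed_inventory + inflow)   # casting limited by stock
            mixed_inventory += inflow - outflow
            cast_total += outflow
            history.append((day, mixed_inventory, cast_total))
        return history

    for day, inventory, total in simulate():
        print(f"day {day:2d}: mixed inventory {inventory:5.1f}, cumulative cast {total:6.1f}")
    ```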

  9. Echo™ User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey, Dustin Yewell

    Echo™ is a MATLAB-based software package designed for robust and scalable analysis of complex data workflows. An alternative to tedious, error-prone conventional processes, Echo is based on three transformative principles for data analysis: self-describing data, name-based indexing, and dynamic resource allocation. The software takes an object-oriented approach to data analysis, intimately connecting measurement data with associated metadata. Echo operations in an analysis workflow automatically track and merge metadata and computation parameters to provide a complete history of the process used to generate final results, while automated figure and report generation tools eliminate the potential to mislabel those results. History reporting and visualization methods provide straightforward auditability of analysis processes. Furthermore, name-based indexing on metadata greatly improves code readability for analyst collaboration and reduces opportunities for errors to occur. Echo efficiently manages large data sets using a framework that seamlessly allocates resources such that only the necessary computations to produce a given result are executed. Echo provides a versatile and extensible framework, allowing advanced users to add their own tools and data classes tailored to their own specific needs. Applying these transformative principles and powerful features, Echo greatly improves analyst efficiency and quality of results in many application areas.

  10. Software for Partly Automated Recognition of Targets

    NASA Technical Reports Server (NTRS)

    Opitz, David; Blundell, Stuart; Bain, William; Morris, Matthew; Carlson, Ian; Mangrich, Mark

    2003-01-01

    The Feature Analyst is a computer program for assisted (partially automated) recognition of targets in images. This program was developed to accelerate the processing of high-resolution satellite image data for incorporation into geographic information systems (GIS). This program creates an advanced user interface that embeds proprietary machine-learning algorithms in commercial image-processing and GIS software. A human analyst provides samples of target features from multiple sets of data, then the software develops a data-fusion model that automatically extracts the remaining features from selected sets of data. The program thus leverages the natural ability of humans to recognize objects in complex scenes, without requiring the user to explain the human visual recognition process by means of lengthy software. Two major subprograms are the reactive agent and the thinking agent. The reactive agent strives to quickly learn the user's tendencies while the user is selecting targets and to increase the user's productivity by immediately suggesting the next set of pixels that the user may wish to select. The thinking agent utilizes all available resources, taking as much time as needed, to produce the most accurate autonomous feature-extraction model possible.
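    The Feature Analyst's machine-learning algorithms are proprietary; the toy sketch below only mirrors the workflow described above, fitting a simple nearest-centroid model to analyst-labeled sample pixels and classifying the rest, with invented spectral values.

    ```python
    import numpy as np

    # Toy analogue (not the proprietary algorithm) of the workflow above: the
    # analyst labels a few sample pixels, a model is fit, and the remaining
    # pixels are classified automatically. Class names and values are invented.

    def fit_centroids(samples):
        """samples: {class_name: array of shape (n, bands) spectral vectors}."""
        return {name: vecs.mean(axis=0) for name, vecs in samples.items()}

    def classify(pixels, centroids):
        names = list(centroids)
        cents = np.stack([centroids[n] for n in names])             # (k, bands)
        dists = np.linalg.norm(pixels[:, None, :] - cents, axis=2)  # (n, k)
        return [names[i] for i in dists.argmin(axis=1)]

    analyst_samples = {
        "road":  np.array([[0.30, 0.28, 0.25], [0.32, 0.30, 0.27]]),
        "grass": np.array([[0.10, 0.40, 0.15], [0.12, 0.45, 0.14]]),
    }
    centroids = fit_centroids(analyst_samples)
    unlabeled = np.array([[0.31, 0.29, 0.26], [0.11, 0.42, 0.15]])
    print(classify(unlabeled, centroids))  # -> ['road', 'grass']
    ```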

  11. Rethinking Intelligence to Integrate Counterterrorism into the Local Law Enforcement Mission

    DTIC Science & Technology

    2007-03-01

    A needle-in-the-haystack problem, also referred to as the wheat-versus-chaff problem: valuable information must be separated from unimportant information and processed before analysts can yield any useful intelligence. ... Despite the perception that Chicago is an aging Rust Belt city, some experts report that it has the largest high-technology and information ...

  12. Modeling interdependencies between business and communication processes in hospitals.

    PubMed

    Brigl, Birgit; Wendt, Thomas; Winter, Alfred

    2003-01-01

    The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, there are no tools available that specialize in modeling information-driven business processes and their consequences for the communication between information processing tools. Therefore, we present an approach that facilitates the representation and analysis of business processes and the resulting communication processes between application components, together with their interdependencies. This approach aims not only to visualize those processes, but also to evaluate whether there are weaknesses in the information processing infrastructure that hinder the smooth implementation of the business processes.

  13. A Framework for Business Process Change Requirements Analysis

    NASA Astrophysics Data System (ADS)

    Grover, Varun; Otim, Samuel

    The ability to quickly and continually adapt business processes to accommodate evolving requirements and opportunities is critical for success in competitive environments. Without appropriate linkage between redesign decisions and strategic inputs, identifying processes that need to be modified will be difficult. In this paper, we draw attention to the analysis of business process change requirements in support of process change initiatives. Business process redesign is a multifaceted phenomenon involving processes, organizational structure, management systems, human resource architecture, and many other aspects of organizational life. To be successful, the business process initiative should focus not only on identifying the processes to be redesigned, but also pay attention to various enablers of change. Above all, a framework is just a blueprint; management must lead change. We hope our modest contribution will draw attention to the broader framing of requirements for business process change.

  14. An Information System Development Method Connecting Business Process Modeling and its Experimental Evaluation

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao

    Business process modeling (BPM) is gaining attention as a means of analyzing and improving the business process. BPM analyses the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a business process that produces value, as a TO-BE model. However, research on techniques that seamlessly connect the business process improvements obtained through BPM to the implementation of the information system is rarely reported. If the business model obtained through BPM is converted into UML and the implementation is carried out with UML techniques, an improvement in the efficiency of information system implementation can be expected. In this paper, we describe a system development method that converts the process model obtained through BPM into UML, and the method is evaluated by modeling a prototype of a parts procurement system. In the evaluation, a comparison is made with the case where the system is implemented by conventional UML techniques without going via BPM.

  15. Problems of internalization: a button is a button is-not.

    PubMed

    Rockwell, Shelley

    2014-01-01

    Analysts hope to help the patient internalize a relationship with the analyst that contrasts with the original archaic object relation. In this paper, the author describes particular difficulties in working with a patient whose defenses and anxieties were bulimic, her movement toward internalization inevitably undone. Several issues are considered: how does the nonsymbolizing patient come to internalize the analyst's understanding, and when this does not hold, what is the nature of the patient's subsequent methods of dispersal? When the patient can maintain connection to the analyst as a good object, even fleetingly, in the depressive position, the possibility of internalization and symbolic communication is increased. © 2014 The Psychoanalytic Quarterly, Inc.

  16. Telling about the analyst's pregnancy.

    PubMed

    Uyehara, L A; Austrian, S; Upton, L G; Warner, R H; Williamson, R A

    1995-01-01

    Pregnancy is one of several events in the life of an analyst which may affect an analysis, calling for special technical considerations. For the analyst, this exception to the tenet of anonymity, along with countertransference guilt, narcissistic preoccupation, heightened infantile conflicts, and intense patient responses, may stimulate anxiety that becomes focused on the timing and manner of informing the patient. For the patient, preoccupation with the timing of the telling may serve as a displacement from other meanings of the pregnancy. Candidate analysts may face particular difficulties managing the impact of their pregnancies on control cases. We address practical and technical considerations in telling, the transference and counter-transference surrounding it, ethical concerns, and the challenges of supervising a pregnant candidate.

  17. An analysis of e-business adoption by Indonesian manufacturing SMEs: A conceptual framework

    NASA Astrophysics Data System (ADS)

    Saptadi, Singgih; Pratama, Hanggar; Sudirman, Iman; Aisha, Atya Nur; Bernadhi, Brav Deva

    2017-11-01

    Much research has shown IT's contribution to business. Considering the contribution of SMEs to the Indonesian economy, improving the competitiveness of SMEs is a concern in Indonesia's development. However, many studies have shown that IT projects often fail to deliver business performance, so it is important to understand the patterns of e-business adoption that improve a company's business performance. Using a business process approach, we have studied SMEs' e-business initiatives in terms of which business processes have been supported with IT and the business performance that SMEs gained from these e-business initiatives. However, we have not yet studied the intensity of the IT implemented for SMEs' business processes. This paper presents a conceptual framework that relates business performance to the intensity of e-business adoption. We also propose some antecedents that may relate to the intensity of e-business adoption.

  18. Outpatient imaging center valuations: do you need a fair-market value analysis?

    PubMed

    Koonsman, G S

    2001-01-01

    Typically, outpatient diagnostic imaging centers are formed as partnerships between radiologists, radiologists and hospitals, and/or radiologists and diagnostic imaging center management companies. As a result of these partnership structures, the question of equity valuation frequently arises. It is important to understand not only when an independent valuation is required, but also what type of valuation needs to be performed; the type of valuation may vary based on its intended use. In partnerships that involve hospitals and physicians, the federal anti-kickback statutes (fraud and abuse laws) require that all transactions between referring physicians and hospitals be consummated at fair-market value. In addition, tax-exempt hospitals that enter into partnerships with physicians are required to enter into those transactions at fair-market value or risk losing their tax-exempt status. Fair-market value is also typically the standard of value that partnerships strive to use when conducting equity transactions with shareholders. Qualifications required of those who perform independent fair-market value opinions include proper business valuation training with a focus on valuation as a primary business, and a focus on the healthcare industry, specifically on the valuation of diagnostic imaging centers. In order to perform a reasonable business valuation analysis, the appraiser must have access to a significant amount of financial, operational, and legal information. The analyst must be able to understand the history of the imaging center as well as the projected future of the center. Ultimately, a valuation is a measurement of the estimated future cash flows of the center, risk adjusted, in order to quantify the present value of those cash flows.
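    As a rough illustration of the valuation arithmetic mentioned in the last sentence, the sketch below discounts a set of projected, risk-adjusted cash flows to a present value with a Gordon-growth terminal value; all figures and rates are invented.

    ```python
    # Hypothetical sketch of discounting estimated future cash flows at a
    # risk-adjusted rate to a present value. All inputs are invented.

    def present_value(cash_flows, discount_rate, terminal_growth=0.0):
        pv = 0.0
        for year, cf in enumerate(cash_flows, start=1):
            pv += cf / (1 + discount_rate) ** year
        # Gordon-growth terminal value based on the final projected year.
        terminal = cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
        pv += terminal / (1 + discount_rate) ** len(cash_flows)
        return pv

    projected_cash_flows = [1.2e6, 1.3e6, 1.35e6, 1.4e6, 1.45e6]  # five-year projection
    print(round(present_value(projected_cash_flows, discount_rate=0.14, terminal_growth=0.03)))
    ```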

  19. Technical Guidance for Assessing Environmental Justice in ...

    EPA Pesticide Factsheets

    The Technical Guidance for Assessing Environmental Justice in Regulatory Analysis (also referred to as the Environmental Justice Technical Guidance or EJTG) is intended for use by Agency analysts, including risk assessors, economists, and other analytic staff who conduct analyses to evaluate EJ concerns in the context of regulatory actions. Senior EPA managers and decision makers may also find this document useful for understanding analytic expectations and ensuring that EJ concerns are appropriately considered in the development of analyses to support regulatory actions under EPA's action development process. Specifically, the document outlines approaches and methods to help Agency analysts evaluate EJ concerns. The document provides overarching direction to analysts by outlining a series of questions that will ensure the decision maker has appropriate information about baseline risks across population groups and how those risks are distributed under the options being considered. In addition, the document provides a set of recommendations and requirements, as well as best practices, for use in analyzing and reporting results from consideration of EJ concerns. These principles will help ensure consistency, quality, and transparency across regulatory actions, while allowing for the flexibility needed across different regulatory actions. The purpose of the EJTG is to ensure consistency, quality, and transparency in considering environmental justice, while allowing for flexibility.

  20. Annotation Graphs: A Graph-Based Visualization for Meta-Analysis of Data Based on User-Authored Annotations.

    PubMed

    Zhao, Jian; Glueck, Michael; Breslav, Simon; Chevalier, Fanny; Khan, Azam

    2017-01-01

    User-authored annotations of data can support analysts in the activity of hypothesis generation and sensemaking, where it is not only critical to document key observations, but also to communicate insights between analysts. We present annotation graphs, a dynamic graph visualization that enables meta-analysis of data based on user-authored annotations. The annotation graph topology encodes annotation semantics, which describe the content of and relations between data selections, comments, and tags. We present a mixed-initiative approach to graph layout that integrates an analyst's manual manipulations with an automatic method based on similarity inferred from the annotation semantics. Various visual graph layout styles reveal different perspectives on the annotation semantics. Annotation graphs are implemented within C8, a system that supports authoring annotations during exploratory analysis of a dataset. We apply principles of Exploratory Sequential Data Analysis (ESDA) in designing C8, and further link these to an existing task typology in the visualization literature. We develop and evaluate the system through an iterative user-centered design process with three experts, situated in the domain of analyzing HCI experiment data. The results suggest that annotation graphs are effective as a method of visually extending user-authored annotations to data meta-analysis for discovery and organization of ideas.
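    A minimal sketch, assuming the networkx library, of the annotation-graph idea: annotations become nodes, edge weights come from tag similarity, and a force-directed layout respects positions the analyst has pinned; the annotations and tags are invented, and this is not the C8 implementation.

    ```python
    import networkx as nx

    # Hypothetical sketch: annotation nodes, edges weighted by tag similarity
    # (Jaccard), and a spring layout that keeps analyst-pinned positions fixed.

    annotations = {
        "a1": {"tags": {"outlier", "condition_A"}},
        "a2": {"tags": {"outlier", "condition_B"}},
        "a3": {"tags": {"learning_effect"}},
    }

    def jaccard(s, t):
        return len(s & t) / len(s | t) if (s | t) else 0.0

    G = nx.Graph()
    G.add_nodes_from(annotations)
    for u in annotations:
        for v in annotations:
            if u < v:
                w = jaccard(annotations[u]["tags"], annotations[v]["tags"])
                if w > 0:
                    G.add_edge(u, v, weight=w)

    pinned = {"a1": (1.0, 1.0)}  # position fixed by the analyst's manual manipulation
    layout = nx.spring_layout(G, pos=pinned, fixed=list(pinned), weight="weight", seed=1)
    print(layout)
    ```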

  1. A survey of tools and resources for the next generation analyst

    NASA Astrophysics Data System (ADS)

    Hall, David L.; Graham, Jake; Catherman, Emily

    2015-05-01

    We have previously argued that a combination of trends in information technology (IT) and the changing habits of people using IT provides opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance and reconnaissance (ISR) on a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities, including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones; ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling; iii) intelligent interconnections due to advances in "web N" capabilities; and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools, and decision aids. A summary of tools is provided along with links to web sites for tool access.

  2. The Generic Spacecraft Analyst Assistant (gensaa): a Tool for Developing Graphical Expert Systems

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M.

    1993-01-01

    During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real-time data. The analysts must watch for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As the satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable performance levels. At NASA GSFC, fault-isolation expert systems are in operation supporting this data monitoring task. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will readily support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industry.
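    GenSAA itself is a graphical tool for building expert systems; the sketch below is only a hypothetical illustration of the kind of limit-checking, fault-isolation rules such a system might host, with invented telemetry parameters and thresholds.

    ```python
    # Hypothetical rule-based telemetry monitoring sketch: values are checked
    # against limits and matching rules flag a suspect condition for the analyst.
    # Parameter names, values, and thresholds are invented.

    telemetry = {"bus_voltage": 26.1, "battery_temp": 41.5, "wheel_speed_rpm": 2900}

    rules = [
        ("bus_voltage", lambda v: v < 27.0, "possible battery discharge anomaly"),
        ("battery_temp", lambda v: v > 40.0, "battery over-temperature"),
        ("wheel_speed_rpm", lambda v: v > 6000, "reaction wheel overspeed"),
    ]

    def evaluate(telemetry, rules):
        findings = []
        for parameter, condition, diagnosis in rules:
            if parameter in telemetry and condition(telemetry[parameter]):
                findings.append(f"{parameter}={telemetry[parameter]}: {diagnosis}")
        return findings or ["no anomalies flagged"]

    for finding in evaluate(telemetry, rules):
        print(finding)
    ```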

  3. 75 FR 68806 - Statement of Organization, Functions and Delegations of Authority

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-09

    ... Agency business applications architectures, the engineering of business processes, the building and... architecture, engineers technology for business processes, builds, deploys, maintains and manages enterprise systems and data collections efforts; (5) applies business applications architecture to process specific...

  4. Community of Interest Engagement Process Plan

    DTIC Science & Technology

    2012-02-09

    ... and input from Subject Matter Experts (SMEs), as shown in the far left of Figure 2. The team may prepare a Business Process Model Notation (BPMN) diagram. Business Process Modeling Notation (BPMN) is a method of illustrating business processes in the form of a ... Acronyms: BPMN, Business Process Modeling Notation; COI, Community of Interest.

  5. Evaluation of Visualization Tools for Computer Network Defense Analysts: Display Design, Methods, and Results for a User Study

    DTIC Science & Technology

    2016-11-01

    Evaluation of Visualization Tools for Computer Network Defense Analysts: Display Design, Methods, and Results for a User Study, by Christopher J Garneau and Robert F Erbacher, US Army Research Laboratory, November 2016 (reporting on work performed January 2013 to September 2015). Approved for public release.

  6. AN ANALYST'S UNCERTAINTY AND FEAR.

    PubMed

    Chused, Judith Fingert

    2016-10-01

    The motivations for choosing psychoanalysis as a profession are many and differ depending on the psychology of the analyst. However, common to most psychoanalysts is the desire to forge a helpful relationship with the individuals with whom they work therapeutically. This article presents an example of what happens when an analyst is confronted by a patient for whom being in a relationship and being helped are intolerable. © 2016 The Psychoanalytic Quarterly, Inc.

  7. Accuracy and consistency of grass pollen identification by human analysts using electron micrographs of surface ornamentation

    PubMed Central

    Mander, Luke; Baker, Sarah J.; Belcher, Claire M.; Haselhorst, Derek S.; Rodriguez, Jacklyn; Thorn, Jessica L.; Tiwari, Shivangi; Urrego, Dunia H.; Wesseln, Cassandra J.; Punyasena, Surangi W.

    2014-01-01

    • Premise of the study: Humans frequently identify pollen grains at a taxonomic rank above species. Grass pollen is a classic case of this situation, which has led to the development of computational methods for identifying grass pollen species. This paper aims to provide context for these computational methods by quantifying the accuracy and consistency of human identification. • Methods: We measured the ability of nine human analysts to identify 12 species of grass pollen using scanning electron microscopy images. These are the same images that were used in computational identifications. We have measured the coverage, accuracy, and consistency of each analyst, and investigated their ability to recognize duplicate images. • Results: Coverage ranged from 87.5% to 100%. Mean identification accuracy ranged from 46.67% to 87.5%. The identification consistency of each analyst ranged from 32.5% to 87.5%, and each of the nine analysts produced considerably different identification schemes. The proportion of duplicate image pairs that were missed ranged from 6.25% to 58.33%. • Discussion: The identification errors made by each analyst, which result in a decline in accuracy and consistency, are likely related to psychological factors such as the limited capacity of human memory, fatigue and boredom, recency effects, and positivity bias. PMID:25202649
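    The sketch below shows, with invented labels, how the accuracy and consistency measures discussed here can be computed: per-analyst accuracy against the true species and Cohen's kappa for agreement between two analysts; it is not the study's actual analysis code.

    ```python
    from collections import Counter

    # Hypothetical sketch of the measures discussed above; species labels invented.

    def accuracy(assigned, truth):
        return sum(a == t for a, t in zip(assigned, truth)) / len(truth)

    def cohens_kappa(rater1, rater2):
        n = len(rater1)
        observed = sum(a == b for a, b in zip(rater1, rater2)) / n
        c1, c2 = Counter(rater1), Counter(rater2)
        expected = sum(c1[label] * c2[label] for label in set(c1) | set(c2)) / (n * n)
        return (observed - expected) / (1 - expected)

    truth    = ["sp1", "sp2", "sp1", "sp3", "sp2", "sp1"]
    analyst1 = ["sp1", "sp2", "sp1", "sp1", "sp2", "sp1"]
    analyst2 = ["sp1", "sp2", "sp2", "sp1", "sp2", "sp1"]
    print(accuracy(analyst1, truth), cohens_kappa(analyst1, analyst2))
    ```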

  8. T-Check in Technologies for Interoperability: Business Process Management in a Web Services Context

    DTIC Science & Technology

    2008-09-01

    Figures include a UML sequence diagram, a BPMN diagram of the Order Processing business process, the T-Check process for technology evaluation, a notional system architecture, a flow chart of the Order Processing business process, and Order Processing activities. Figure 3 (created with Intalio BPMS Designer [Intalio 2008]) shows a BPMN view of the Order Processing business process that is used in the ...

  9. A note on notes: note taking and containment.

    PubMed

    Levine, Howard B

    2007-07-01

    In extreme situations of massive projective identification, both the analyst and the patient may come to share a fantasy or belief that his or her own psychic reality will be annihilated if the psychic reality of the other is accepted or adopted (Britton 1998). In the example of Dr. M and his patient, the paradoxical dilemma around note taking had highly specific transference meanings; it was not simply an instance of the generalized human response of distracted attention that Freud (1912) had spoken of, nor was it the destabilization of analytic functioning that I tried to describe in my work with Mr. L. Whether such meanings will always exist in these situations remains a matter to be determined by further clinical experience. In reopening a dialogue about note taking during sessions, I have attempted to move the discussion away from categorical injunctions about what analysis should or should not do, and instead to foster a more nuanced, dynamic, and pair-specific consideration of the analyst's functioning in the immediate context of the analytic relationship. There is, of course, a wide variety of listening styles among analysts, and each analyst's mental functioning may be affected differently by each patient whom the analyst sees. I have raised many questions in the hopes of stimulating an expanded discussion that will allow us to share our experiences and perhaps reach additional conclusions. Further consideration may lead us to decide whether note taking may have very different meanings for other analysts and analyst-patient pairs, and whether it may serve useful functions in addition to the one that I have described.

  10. Conducting meta-analyses of HIV prevention literatures from a theory-testing perspective.

    PubMed

    Marsh, K L; Johnson, B T; Carey, M P

    2001-09-01

    Using illustrations from HIV prevention research, the current article advocates approaching meta-analysis as a theory-testing scientific method rather than as merely a set of rules for quantitative analysis. Like other scientific methods, meta-analysis has central concerns with internal, external, and construct validity. The focus of a meta-analysis should only rarely be merely describing the effects of health promotion; rather, it should be on understanding and explaining phenomena and the processes underlying them. The methodological decisions meta-analysts make in conducting reviews should be guided by a consideration of the underlying goals of the review (e.g., simply effect size estimation or, preferably, theory testing). From the advocated perspective that a health behavior meta-analyst should test theory, the authors present a number of issues to be considered during the conduct of meta-analyses.
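    As a small worked example of the effect-size estimation step mentioned above, the sketch below pools invented study effects with fixed-effect inverse-variance weighting; a theory-testing review would go further (e.g., heterogeneity and moderator analysis).

    ```python
    import math

    # Minimal fixed-effect inverse-variance pooling sketch; the per-study effects
    # and variances below are invented for illustration only.

    studies = [
        {"effect": 0.35, "variance": 0.02},   # e.g., standardized mean differences
        {"effect": 0.20, "variance": 0.05},
        {"effect": 0.50, "variance": 0.04},
    ]

    weights = [1.0 / s["variance"] for s in studies]
    pooled = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    print(f"pooled effect = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
    ```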

  11. Design Through Manufacturing: The Solid Model-Finite Element Analysis Interface

    NASA Technical Reports Server (NTRS)

    Rubin, Carol

    2002-01-01

    State-of-the-art computer aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts reflecting every detail of the finished product. Ideally, in the aerospace industry, these models should fulfill two very important functions: (1) provide numerical control information for automated manufacturing of precision parts, and (2) enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in aircraft and space vehicles. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. Presently, the process of preparing CAD models for FEA consumes a great deal of the analyst's time.

  12. The learning curve, interobserver, and intraobserver agreement of endoscopic confocal laser endomicroscopy in the assessment of mucosal barrier defects.

    PubMed

    Chang, Jeff; Ip, Matthew; Yang, Michael; Wong, Brendon; Power, Theresa; Lin, Lisa; Xuan, Wei; Phan, Tri Giang; Leong, Rupert W

    2016-04-01

    Confocal laser endomicroscopy can dynamically assess intestinal mucosal barrier defects and increased intestinal permeability (IP). These are functional features that do not have a corresponding appearance on histopathology. As such, previous pathology training may not be beneficial in learning these dynamic features. This study aims to evaluate the diagnostic accuracy, learning curve, and inter- and intraobserver agreement for identifying features of increased IP in experienced and inexperienced analysts and pathologists. A total of 180 endoscopic confocal laser endomicroscopy (Pentax EC-3870FK; Pentax, Tokyo, Japan) images of the terminal ileum, subdivided into 6 sets of 30, were evaluated by 6 experienced analysts, 13 inexperienced analysts, and 2 pathologists, after a 30-minute teaching session. Cell-junction enhancement, fluorescein leak, and cell dropout were used to represent increased IP and were either present or absent in each image. For each image, the diagnostic accuracy, confidence, and quality were assessed. Diagnostic accuracy was significantly higher for experienced analysts compared with inexperienced analysts from the first set (96.7% vs 83.1%, P < .001) to the third set (95% vs 89.7%, P = .127). No differences in accuracy were noted between inexperienced analysts and pathologists. Confidence (odds ratio, 8.71; 95% confidence interval, 5.58-13.57) and good image quality (odds ratio, 1.58; 95% confidence interval, 1.22-2.03) were associated with improved interpretation. Interobserver agreement κ values were high and improved with experience (experienced analysts, 0.83; inexperienced analysts, 0.73; and pathologists, 0.62). Intraobserver agreement was >0.86 for experienced observers. Features representative of increased IP can be rapidly learned with high inter- and intraobserver agreement. Confidence and image quality were significant predictors of accurate interpretation. Previous pathology training did not have an effect on learning. Copyright © 2016 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.
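
    The interobserver agreement above is reported as Cohen's κ. A minimal sketch of how such agreement between two raters making present/absent calls is computed follows; the ratings are hypothetical and are not taken from the study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum((counts_a[c] / n) * (counts_b[c] / n)
                   for c in set(counts_a) | set(counts_b))
    return (observed - expected) / (1 - expected)

# Hypothetical present/absent calls for fluorescein leak on 10 images.
analyst_1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
analyst_2 = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
print(f"kappa = {cohens_kappa(analyst_1, analyst_2):.2f}")
```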

  13. The Many Faces of a Software Engineer in a Research Community

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marinovici, Maria C.; Kirkham, Harold

    2013-10-14

    The ability to gather, analyze and make decisions based on real world data is changing nearly every field of human endeavor. These changes are particularly challenging for software engineers working in a scientific community, designing and developing large, complex systems. To avoid the creation of a communications gap (almost a language barrier), the software engineers should possess an ‘adaptive’ skill. In the science and engineering research community, the software engineers must be responsible for more than creating mechanisms for storing and analyzing data. They must also develop a fundamental scientific and engineering understanding of the data. This paper looks at the many faces that a software engineer should have: developer, domain expert, business analyst, security expert, project manager, tester, user experience professional, etc. Observations made during work on a power-systems scientific software development are analyzed and extended to describe more generic software development projects.

  14. Competing on talent analytics.

    PubMed

    Davenport, Thomas H; Harris, Jeanne; Shapiro, Jeremy

    2010-10-01

    Do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel? Leading-edge companies such as Google, Best Buy, Procter & Gamble, and Sysco use sophisticated data-collection technology and analysis to answer these questions, leveraging a range of analytics to improve the way they attract and retain talent, connect their employee data to business performance, differentiate themselves from competitors, and more. The authors present the six key ways in which companies track, analyze, and use data about their people, ranging from a simple baseline of metrics to monitor the organization's overall health to custom modeling for predicting future head count depending on various "what if" scenarios. They go on to show that companies competing on talent analytics manage data and technology at an enterprise level, support what analytical leaders do, choose realistic targets for analysis, and hire analysts with strong interpersonal skills as well as broad expertise.

  15. Advancing Development and Greenhouse Gas Reductions in Vietnam's Wind Sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bilello, D.; Katz, J.; Esterly, S.

    2014-09-01

    Clean energy development is a key component of Vietnam's Green Growth Strategy, which establishes a target to reduce greenhouse gas (GHG) emissions from domestic energy activities by 20-30 percent by 2030 relative to a business-as-usual scenario. Vietnam has significant wind energy resources, which, if developed, could help the country reach this target while providing ancillary economic, social, and environmental benefits. Given Vietnam's ambitious clean energy goals and the relatively nascent state of wind energy development in the country, this paper seeks to fulfill two primary objectives: to distill timely and useful information to provincial-level planners, analysts, and project developers as they evaluate opportunities to develop local wind resources; and, to provide insights to policymakers on how coordinated efforts may help advance large-scale wind development, deliver near-term GHG emission reductions, and promote national objectives in the context of a low emission development framework.

  16. Frizzled to finance: one PhD’s path from a Drosophila lab to Wall Street

    PubMed Central

    Taylor, Job

    2016-01-01

    An equity research analyst’s job is to determine whether the price of a stock is likely to go up or down. For science-based businesses, particularly biotechnology companies, a PhD in the life sciences can be very helpful in making this determination. I transitioned from a postdoc position to working in equity research. Here I present information on how I made the transition, an overview of the day-to-day activities of an analyst, and thoughts on how to prepare to look for a job in finance. There are significant positives to working on Wall Street, including exposure to cutting-edge clinical/translational research, access to some of the best scientists in the world, a dynamic work environment, and compensation that generally exceeds academic salaries. This comes at the cost of some independence and the satisfaction of being able to call oneself a scientist. PMID:27235096

  17. Shuttle user analysis (study 2.2): Volume 3. Business Risk And Value of Operations in space (BRAVO). Part 4: Computer programs and data look-up

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Computer program listings as well as graphical and tabulated data needed by the analyst to perform a BRAVO analysis were examined. Graphical aid which can be used to determine the earth coverage of satellites in synchronous equatorial orbits was described. A listing for satellite synthesis computer program as well as a sample printout for the DSCS-11 satellite program and a listing of the symbols used in the program were included. The APL language listing for the payload program cost estimating computer program was given. This language is compatible with many of the time sharing remote terminals computers used in the United States. Data on the intelsat communications network was studied. Costs for telecommunications systems leasing, line of sight microwave relay communications systems, submarine telephone cables, and terrestrial power generation systems were also described.

  18. Survey of business process management: challenges and solutions

    NASA Astrophysics Data System (ADS)

    Alotaibi, Youseef; Liu, Fei

    2017-09-01

    The current literature shows that creating a good business process model (PM) framework is not an easy task. A successful business PM should be able to ensure accurate alignment between business processes (BPs) and information technology (IT) designs, provide security protection, manage the rapidly changing business environment and BPs, manage customer power, be flexible for reengineering, and ensure that IT goals can be easily derived from business goals so that an information system (IS) can be easily implemented. This article presents an overview of research in the business PM domain. We present a review of the challenges facing business PMs, such as misalignment between business and IT, difficulty of deriving IT goals from business goals, creating a secure business PM, reengineering BPs, managing the rapidly changing BP and business environment, and managing customer power. We also present the limitations of existing business PM frameworks. Finally, we outline several guidelines for creating a good business PM and possible further research directions in the business PM domain.

  19. Artifact-Based Transformation of IBM Global Financing

    NASA Astrophysics Data System (ADS)

    Chao, Tian; Cohn, David; Flatgard, Adrian; Hahn, Sandy; Linehan, Mark; Nandi, Prabir; Nigam, Anil; Pinel, Florian; Vergo, John; Wu, Frederick Y.

    IBM Global Financing (IGF) is transforming its business using the Business Artifact Method, an innovative business process modeling technique that identifies key business artifacts and traces their life cycles as they are processed by the business. IGF is a complex, global business operation with many business design challenges. The Business Artifact Method is a fundamental shift in how to conceptualize, design and implement business operations. The Business Artifact Method was extended to solve the problem of designing a global standard for a complex, end-to-end process while supporting local geographic variations. Prior to employing the Business Artifact method, process decomposition, Lean and Six Sigma methods were each employed on different parts of the financing operation. Although they provided critical input to the final operational model, they proved insufficient for designing a complete, integrated, standard operation. The artifact method resulted in a business operations model that was at the right level of granularity for the problem at hand. A fully functional rapid prototype was created early in the engagement, which facilitated an improved understanding of the redesigned operations model. The resulting business operations model is being used as the basis for all aspects of business transformation in IBM Global Financing.
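
    The Business Artifact Method centers on key business artifacts and the life cycles they move through as the business processes them. A minimal sketch of that idea follows; the artifact name, states, and transitions are hypothetical and are not IGF's actual operational model.

```python
# Hypothetical "FinancingDeal" artifact: the allowed state transitions define its life cycle.
LIFECYCLE = {
    "drafted":  {"priced"},
    "priced":   {"approved", "declined"},
    "approved": {"active"},
    "active":   {"closed"},
    "declined": set(),
    "closed":   set(),
}

class FinancingDeal:
    def __init__(self, deal_id):
        self.deal_id = deal_id
        self.state = "drafted"
        self.history = ["drafted"]

    def advance(self, new_state):
        # Reject any transition the life cycle does not permit.
        if new_state not in LIFECYCLE[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state
        self.history.append(new_state)

deal = FinancingDeal("D-001")
for step in ("priced", "approved", "active", "closed"):
    deal.advance(step)
print(deal.deal_id, deal.history)
```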

  20. Business Process Management

    NASA Astrophysics Data System (ADS)

    Hantry, Francois; Papazoglou, Mike; van den Heuvel, Willem-Jan; Haque, Rafique; Whelan, Eoin; Carroll, Noel; Karastoyanova, Dimka; Leymann, Frank; Nikolaou, Christos; Lammersdorf, Winfried; Hacid, Mohand-Said

    Business process management is one of the core drivers of business innovation and is based on strategic technology and capable of creating and successfully executing end-to-end business processes. The trend will be to move from relatively stable, organization-specific applications to more dynamic, high-value ones where business process interactions and trends are examined closely to understand more accurately an application's requirements. Such collaborative, complex end-to-end service interactions give rise to the concept of Service Networks (SNs).

  1. Spacelab Data Processing Facility (SLDPF) quality assurance expert systems development

    NASA Technical Reports Server (NTRS)

    Kelly, Angelita C.; Basile, Lisa; Ames, Troy; Watson, Janice; Dallam, William

    1987-01-01

    Spacelab Data Processing Facility (SLDPF) expert system prototypes were developed to assist in the quality assurance of Spacelab and/or Attached Shuttle Payload (ASP) processed telemetry data. The SLDPF functions include the capturing, quality monitoring, processing, accounting, and forwarding of mission data to various user facilities. Prototypes for the two SLDPF functional elements, the Spacelab Output Processing System and the Spacelab Input Processing Element, are described. The prototypes have produced beneficial results including an increase in analyst productivity, a decrease in the burden of tedious analyses, the consistent evaluation of data, and the providing of concise historical records.

  3. 13 CFR 120.383 - Restrictions on loan processing.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Restrictions on loan processing. 120.383 Section 120.383 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Special Purpose Loans Defense Economic Transition Assistance § 120.383 Restrictions on loan processing...

  4. 13 CFR 120.383 - Restrictions on loan processing.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Restrictions on loan processing. 120.383 Section 120.383 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Special Purpose Loans Defense Economic Transition Assistance § 120.383 Restrictions on loan processing...

  5. 13 CFR 120.383 - Restrictions on loan processing.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Restrictions on loan processing. 120.383 Section 120.383 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Special Purpose Loans Defense Economic Transition Assistance § 120.383 Restrictions on loan processing...

  6. 13 CFR 120.383 - Restrictions on loan processing.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Restrictions on loan processing. 120.383 Section 120.383 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Special Purpose Loans Defense Economic Transition Assistance § 120.383 Restrictions on loan processing...

  7. Teaching Tip: Using a Group Role-Play Exercise to Engage Students in Learning Business Processes and ERP

    ERIC Educational Resources Information Center

    Shen, Yide; Nicholson, Jennifer; Nicholson, Darren

    2015-01-01

    With the increasing process-centric focus and proliferation of Enterprise Resource Planning (ERP) systems in organizations, it is imperative for business graduates to understand cross-functional business processes and ERP system's role in supporting business processes. However, this topic can be rather abstract and dry to undergraduate students,…

  8. Manual LANDSAT data analysis for crop type identification

    NASA Technical Reports Server (NTRS)

    Hay, C. M. (Principal Investigator)

    1979-01-01

    The process of manual identification of crop type by human analysts, and the problems in LACIE that were associated with manual crop identification measurement procedures, are described. Research undertaken in cooperation with LACIE operations by the supporting research community to effect solutions to, or obtain greater understanding of, these problems is discussed.

  9. The Responsiveness of Public Schools to Their Clientele. Milestone 1: Report of Progress.

    ERIC Educational Resources Information Center

    Zeigler, L. Harmon; And Others

    An analysis of the literature dealing with the responsiveness of public institutions to their clientele constitutes the main body of this interim project report. The analysts adopted Dahl and Lindblom's classification of political decision-making processes for summarizing the range of governing systems possible in public education. These four…

  10. 78 FR 66899 - Proposed Information Collection; Comment Request; Commercial Fisheries Seafood Processor Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-07

    ... Act of 1996 (and reauthorized in 2007), NMFS is required to enumerate the economic impacts of the... allow analysts to estimate the economic contributions and impacts of marine fish processing to each... paper forms. Methods of submittal include email of electronic forms, and mail and facsimile transmission...

  11. Registered Domestic Partnerships, Same-Sex Marriage, and the Pursuit of Equality in California

    ERIC Educational Resources Information Center

    Willetts, Marion C.

    2011-01-01

    Policies in California are examined to inform analysts of the process by which legal recognition of same-sex relationships may be achieved. Content analysis was conducted of relevant legislation, court cases, and voter initiatives, along with interviews with state legislators to gain an eyewitness understanding of the social climate surrounding…

  12. Improved processes for meeting the data requirements for implementing the Highway Safety Manual (HSM) and Safety Analyst in Florida : [summary].

    DOT National Transportation Integrated Search

    2014-03-01

    Similar to an ill patient, road safety issues can : also be diagnosed, if the right tools are available. : Statistics on roadway incidents can locate areas : that have a high rate of incidents and require : a solution, such as better signage, lightin...

  13. An Advanced Simulation Framework for Parallel Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Li, P. P.; Tyrrell, R. Yeung D.; Adhami, N.; Li, T.; Henry, H.

    1994-01-01

    Discrete-event simulation (DEVS) users have long been faced with a three-way trade-off of balancing execution time, model fidelity, and the number of objects simulated. Because of the limits of computer processing power, the analyst is often forced to settle for less than the desired performance in one or more of these areas.
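
    The three-way trade-off described above follows from the event-loop structure at the heart of any discrete-event simulator: more objects and higher fidelity mean more events on the queue per unit of simulated time. A minimal sketch of that loop is shown below; the arrival process and horizon are hypothetical.

```python
import heapq
import itertools

_counter = itertools.count()

def push(queue, time, handler):
    # The counter breaks ties so handler functions are never compared directly.
    heapq.heappush(queue, (time, next(_counter), handler))

def simulate(initial_events, horizon):
    """Minimal discrete-event loop: pop the earliest event, run its handler,
    and let the handler schedule follow-on events."""
    queue = []
    for t, handler in initial_events:
        push(queue, t, handler)
    processed = 0
    while queue:
        t, _, handler = heapq.heappop(queue)
        if t > horizon:
            break
        processed += 1
        for next_t, next_handler in handler(t):
            push(queue, next_t, next_handler)
    return processed

# Hypothetical arrival process: each arrival schedules the next one 2.0 time units later.
def arrival(t):
    return [(t + 2.0, arrival)]

print(simulate([(0.0, arrival)], horizon=50.0))   # 26 events processed
```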

  14. Considering Materiality in Educational Policy: Messy Objects and Multiple Reals

    ERIC Educational Resources Information Center

    Fenwick, Tara; Edwards, Richard

    2011-01-01

    Educational analysts need new ways to engage with policy processes in a networked world of complex transnational connections. In this discussion, Tara Fenwick and Richard Edwards argue for a greater focus on materiality in educational policy as a way to trace the heterogeneous interactions and precarious linkages that enact policy as complex…

  15. An automated data exploitation system for airborne sensors

    NASA Astrophysics Data System (ADS)

    Chen, Hai-Wen; McGurr, Mike

    2014-06-01

    Advanced wide area persistent surveillance (WAPS) sensor systems on manned or unmanned airborne vehicles are essential for wide-area urban security monitoring in order to protect our people and our warfighter from terrorist attacks. Currently, human (imagery) analysts process huge data collections from full motion video (FMV) for data exploitation and analysis (real-time and forensic), providing slow and inaccurate results. An Automated Data Exploitation System (ADES) is urgently needed. In this paper, we present a recently developed ADES for airborne vehicles under heavy urban background clutter conditions. This system includes four processes: (1) fast image registration, stabilization, and mosaicking; (2) advanced non-linear morphological moving target detection; (3) robust multiple target (vehicles, dismounts, and human) tracking (up to 100 target tracks); and (4) moving or static target/object recognition (super-resolution). Test results with real FMV data indicate that our ADES can reliably detect, track, and recognize multiple vehicles under heavy urban background clutters. Furthermore, our example shows that ADES as a baseline platform can provide capability for vehicle abnormal behavior detection to help imagery analysts quickly trace down potential threats and crimes.

  16. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
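
    The modularization described above ultimately combines module reliabilities into a system figure. REST itself uses RML models and failure-mode-effects simulation, so the sketch below shows only the elementary series/parallel combination that such modular models rest on; the module values are hypothetical.

```python
def series(*reliabilities):
    """All components must survive: the product of the module reliabilities."""
    r = 1.0
    for x in reliabilities:
        r *= x
    return r

def parallel(*reliabilities):
    """Redundant components: the system fails only if every one fails."""
    q = 1.0
    for x in reliabilities:
        q *= (1.0 - x)
    return 1.0 - q

# Hypothetical modules: two redundant processors in series with a bus and a sensor.
processors = parallel(0.95, 0.95)
system = series(processors, 0.999, 0.98)
print(f"system reliability ~ {system:.4f}")
```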

  17. Software Tool Integrating Data Flow Diagrams and Petri Nets

    NASA Technical Reports Server (NTRS)

    Thronesbery, Carroll; Tavana, Madjid

    2010-01-01

    Data Flow Diagram - Petri Net (DFPN) is a software tool for analyzing other software to be developed. The full name of this program reflects its design, which combines the benefit of data-flow diagrams (which are typically favored by software analysts) with the power and precision of Petri-net models, without requiring specialized Petri-net training. (A Petri net is a particular type of directed graph, a description of which would exceed the scope of this article.) DFPN assists a software analyst in drawing and specifying a data-flow diagram, then translates the diagram into a Petri net, then enables graphical tracing of execution paths through the Petri net for verification, by the end user, of the properties of the software to be developed. In comparison with prior means of verifying the properties of software to be developed, DFPN makes verification by the end user more nearly certain, thereby making it easier to identify and correct misconceptions earlier in the development process, when correction is less expensive. After the verification by the end user, DFPN generates a printable system specification in the form of descriptions of processes and data.
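
    DFPN's translation from a data-flow diagram to a Petri net, and the tracing of execution paths through it, rest on ordinary token-firing semantics: a transition may fire when all of its input places hold tokens. The sketch below is a minimal, hypothetical two-step flow, not DFPN's actual output.

```python
class PetriNet:
    """Places hold tokens; a transition fires by consuming one token from each
    input place and producing one token in each output place."""

    def __init__(self, marking, transitions):
        self.marking = dict(marking)        # place -> token count
        self.transitions = transitions      # name -> (input places, output places)

    def enabled(self):
        return [name for name, (inputs, _) in self.transitions.items()
                if all(self.marking.get(p, 0) > 0 for p in inputs)]

    def fire(self, name):
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Hypothetical net derived from a two-process data-flow diagram:
# raw data is validated, then transformed into a report.
net = PetriNet(
    marking={"raw_data": 1},
    transitions={
        "validate":  (["raw_data"], ["clean_data"]),
        "transform": (["clean_data"], ["report"]),
    },
)
while net.enabled():
    transition = net.enabled()[0]
    print("firing:", transition)
    net.fire(transition)
print("final marking:", net.marking)
```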

  18. An artificial neural network system to identify alleles in reference electropherograms.

    PubMed

    Taylor, Duncan; Harrison, Ash; Powers, David

    2017-09-01

    Electropherograms are produced in great numbers in forensic DNA laboratories as part of everyday criminal casework. Before the results of these electropherograms can be used they must be scrutinised by analysts to determine what the identified data tells them about the underlying DNA sequences and what is purely an artefact of the DNA profiling process. This process of interpreting the electropherograms can be time consuming and is prone to subjective differences between analysts. Recently it was demonstrated that artificial neural networks could be used to classify information within an electropherogram as allelic (i.e. representative of a DNA fragment present in the DNA extract) or as one of several different categories of artefactual fluorescence that arise as a result of generating an electropherogram. We extend that work here to demonstrate a series of algorithms and artificial neural networks that can be used to identify peaks on an electropherogram and classify them. We demonstrate the functioning of the system on several profiles and compare the results to a leading commercial DNA profile reading system. Copyright © 2017 Elsevier B.V. All rights reserved.
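
    The task described above, separating allelic peaks from artefactual fluorescence, can be illustrated without the authors' neural networks. The sketch below uses a synthetic trace, simple local-maximum peak detection, and a crude stutter-ratio rule as a stand-in for the trained classifier; every value in it is invented.

```python
import numpy as np

# Synthetic single-dye electropherogram trace (RFU versus scan number).
rng = np.random.default_rng(0)
trace = rng.normal(0, 5, 1000)
for centre, height in [(200, 900), (208, 120), (600, 750)]:   # two alleles plus one stutter
    trace += height * np.exp(-0.5 * ((np.arange(1000) - centre) / 2.0) ** 2)

def find_peaks(y, threshold=50.0, min_sep=5):
    """Local maxima above an analytical threshold, keeping only the highest
    candidate within min_sep scans of any already accepted peak."""
    candidates = [i for i in range(1, len(y) - 1)
                  if y[i] > threshold and y[i] >= y[i - 1] and y[i] >= y[i + 1]]
    peaks = []
    for i in sorted(candidates, key=lambda i: -y[i]):
        if all(abs(i - p) >= min_sep for p in peaks):
            peaks.append(i)
    return sorted(peaks)

def classify(peaks, y, stutter_ratio=0.20, window=12):
    """Crude stand-in for a trained classifier: a peak much smaller than a
    close, larger neighbour is labelled a stutter artefact."""
    labels = {}
    for p in peaks:
        neighbours = [q for q in peaks if q != p and abs(q - p) <= window]
        labels[p] = ("stutter artefact"
                     if any(y[p] < stutter_ratio * y[q] for q in neighbours)
                     else "allelic")
    return labels

peaks = find_peaks(trace)
for scan, label in classify(peaks, trace).items():
    print(f"scan {scan}: height {trace[scan]:.0f} RFU -> {label}")
```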

  19. 13 CFR 124.204 - How does SBA process applications for 8(a) BD program admission?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false How does SBA process applications for 8(a) BD program admission? 124.204 Section 124.204 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION 8(a) BUSINESS DEVELOPMENT/SMALL DISADVANTAGED BUSINESS STATUS DETERMINATIONS 8(a) Business...

  20. MAGIC Computer Simulation. Volume 2: Analyst Manual, Part 1

    DTIC Science & Technology

    1971-05-01

    A review of the subject MAGIC Computer Simulation User and Analyst Manuals has been conducted based upon a request received from the US Army. The MAGIC computer simulation generates target description data consisting of item-by-item listings of the target’s components and air ...

  1. Evaluating the O*NET Occupational Analysis System for Army Competency Development

    DTIC Science & Technology

    2008-07-01

    ... Experts (SMEs) and collecting ability and skill ratings using trained analysts. The results showed that Army SMEs as well as other types of analysts could ... using trained analysts. SMEs were non-commissioned officers (NCOs) or officers with several years of experience in the Army and their occupations, and ...

  2. The Effects of Bug-in-Ear Coaching on Pre-Service Behavior Analysts' Use of Functional Communication Training.

    PubMed

    Artman-Meeker, Kathleen; Rosenberg, Nancy; Badgett, Natalie; Yang, Xueyan; Penney, Ashley

    2017-09-01

    Behavior analysts play an important role in supporting the behavior and learning of young children with disabilities in natural settings. However, there is very little research related specifically to developing the skills and competencies needed by pre-service behavior analysts. This study examined the effects of "bug-in-ear" (BIE) coaching on pre-service behavior analysts' implementation of functional communication training with pre-school children with autism in their classrooms. BIE coaching was associated with increases in the rate of functional communication training trials each intern initiated per session and in the fidelity with which interns implemented functional communication training. Adults created more intentional opportunities for children to communicate, and adults provided more systematic instruction around those opportunities.

  3. Physics-based and human-derived information fusion for analysts

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Nagy, James; Scott, Steve; Okoth, Joshua; Hinman, Michael

    2017-05-01

    Recent trends in physics-based and human-derived information fusion (PHIF) have amplified the capabilities of analysts; however, with the big data opportunities, there is a need for open architecture designs, methods of distributed team collaboration, and visualizations. In this paper, we explore recent trends in information fusion to support user interaction and machine analytics. Challenging scenarios requiring PHIF include combining physics-based video data with human-derived text data for enhanced simultaneous tracking and identification. A driving effort would be to provide analysts with applications, tools, and interfaces that afford effective and affordable solutions for timely decision making. Fusion at scale should be developed to allow analysts to access data, call analytics routines, enter solutions, update models, and store results for distributed decision making.

  4. Army Program Value Added Analysis 90-97 (VAA 90-97)

    DTIC Science & Technology

    1991-08-01

    ... affordability or duplication of capability. The AHP process appears to hold the greatest possibilities in this regard. ... to provide the logical skeleton in which to build an alternative’s effectiveness value. The analytical hierarchy process (AHP) is particularly ... likely to be, at first cut, very fuzzy. Thus, the issue clarification step is inherently iterative. As the analyst gathers more and more information ...
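
    The analytical hierarchy process referred to in these excerpts derives an alternative's effectiveness value from pairwise comparison judgments. A minimal sketch of how AHP priority weights and a consistency check are computed is given below; the comparison matrix is hypothetical and is unrelated to the study's actual data.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three alternatives
# (entry [i, j] is how strongly alternative i is preferred to alternative j).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The principal eigenvector of A gives the AHP priority weights.
eigenvalues, eigenvectors = np.linalg.eig(A)
principal = np.argmax(eigenvalues.real)
weights = np.abs(eigenvectors[:, principal].real)
weights /= weights.sum()

# Consistency ratio CI / RI, with RI = 0.58 for a 3 x 3 matrix (Saaty's table).
n = A.shape[0]
ci = (eigenvalues.real[principal] - n) / (n - 1)
cr = ci / 0.58

print("priority weights:", np.round(weights, 3))
print(f"consistency ratio: {cr:.3f}")
```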

  5. Analyst Performance Measures. Volume 1: Persistent Surveillance Data Processing, Storage and Retrieval

    DTIC Science & Technology

    2011-09-01

    ... solutions to address these important challenges. The Air Force is seeking innovative architectures to process and store massive data sets in a flexible ... Google Earth, the Video LAN Client (VLC) media player, and the Environmental Systems Research Institute corporation’s (ESRI) ArcGIS product — to ... Earth, Quantum GIS, VLC Media Player, NASA WorldWind, ESRI ArcGIS and many others. Open source GIS and media visualization software can also be ...

  6. An Aspect-Oriented Framework for Business Process Improvement

    NASA Astrophysics Data System (ADS)

    Pourshahid, Alireza; Mussbacher, Gunter; Amyot, Daniel; Weiss, Michael

    Recently, many organizations invested in Business Process Management Systems (BPMSs) in order to automate and monitor their processes. Business Activity Monitoring is one of the essential modules of a BPMS as it provides the core monitoring capabilities. Although the natural step after process monitoring is process improvement, most of the existing systems do not provide the means to help users with the improvement step. In this paper, we address this issue by proposing an aspect-oriented framework that allows the impact of changes to business processes to be explored with what-if scenarios based on the most appropriate process redesign patterns among several possibilities. As the four cornerstones of a BPMS are process, goal, performance and validation views, these views need to be aligned automatically by any approach that intends to support automated improvement of business processes. Our framework therefore provides means to reflect process changes also in the other views of the business process. A health care case study presented as a proof of concept suggests that this novel approach is feasible.

  7. Leveraging Big-Data for Business Process Analytics

    ERIC Educational Resources Information Center

    Vera-Baquero, Alejandro; Colomo Palacios, Ricardo; Stantchev, Vladimir; Molloy, Owen

    2015-01-01

    Purpose: This paper aims to present a solution that enables organizations to monitor and analyse the performance of their business processes by means of Big Data technology. Business process improvement can drastically influence in the profit of corporations and helps them to remain viable. However, the use of traditional Business Intelligence…

  8. On the Risk Management and Auditing of SOA Based Business Processes

    NASA Astrophysics Data System (ADS)

    Orriens, Bart; Heuvel, Willem-Jan V./D.; Papazoglou, Mike

    SOA-enabled business processes stretch across many cooperating and coordinated systems, possibly crossing organizational boundaries, and technologies like XML and Web services are used for making system-to-system interactions commonplace. Business processes form the foundation for all organizations, and as such, are impacted by industry regulations. This requires organizations to review their business processes and ensure that they meet the compliance standards set forth in legislation. In this paper we sketch a SOA-based service risk management and auditing methodology including a compliance enforcement and verification system that assures verifiable business process compliance. This is done on the basis of a knowledge-based system that allows internal control systems to be integrated into business processes in conformance with pre-defined compliance rules, monitors both the normal process behavior and that of the control systems during process execution, and logs these behaviors to facilitate retrospective auditing.

  9. Developing cloud-based Business Process Management (BPM): a survey

    NASA Astrophysics Data System (ADS)

    Mercia; Gunawan, W.; Fajar, A. N.; Alianto, H.; Inayatulloh

    2018-03-01

    In today’s highly competitive business environment, modern enterprises struggle to cut unnecessary costs, eliminate waste and deliver huge benefits for the organization. Companies are increasingly turning to a more flexible IT environment to help them realize this goal. For this reason, the article applies cloud-based Business Process Management (BPM), which enables a focus on modeling, monitoring and process management. Cloud-based BPM consists of business processes, business information and IT resources, which help build real-time intelligence systems based on business management and cloud technology. Cloud computing is a paradigm that involves procuring dynamically measurable resources over the internet as an IT resource service. A cloud-based BPM service addresses common problems faced by traditional BPM, especially in promoting flexible, event-driven business processes to exploit opportunities in the marketplace.

  10. Multivariate Statistics Applied to Seismic Phase Picking

    NASA Astrophysics Data System (ADS)

    Velasco, A. A.; Zeiler, C. P.; Anderson, D.; Pingitore, N. E.

    2008-12-01

    The initial effort of the Seismogram Picking Error from Analyst Review (SPEAR) project has been to establish a common set of seismograms to be picked by the seismological community. Currently, 13 analysts from 4 institutions have provided picks on the set of 26 seismograms. In comparing the picks thus far, we have identified consistent biases between picks from different institutions; effects of the experience of analysts; and the impact of signal-to-noise on picks. The institutional bias in picks brings up the important concern that picks will not be the same between different catalogs. This difference means less precision and accuracy when combining picks from multiple institutions. We also note that depending on the experience level of the analyst making picks for a catalog, the error could fluctuate dramatically. However, the experience level is based on the number of years spent picking seismograms, and this may not be an appropriate criterion for determining an analyst's precision. The common data set of seismograms provides a means to test an analyst's level of precision and biases. The analyst is also limited by the quality of the signal, and we show that the signal-to-noise ratio and pick error are correlated to the location, size and distance of the event. This makes the standard estimate of picking error based on SNR more complex because additional constraints are needed to accurately constrain the measurement error. We propose to extend the current measurement of error by adding the additional constraints of institutional bias and event characteristics to the standard SNR measurement. We use multivariate statistics to model the data and provide constraints to accurately assess earthquake location and measurement errors.
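
    Adding institutional bias and event characteristics to an SNR-based error model is, at bottom, a multivariate regression problem. The sketch below is a hedged illustration with synthetic numbers, not SPEAR data: pick error is regressed on SNR plus dummy variables for the analyst's institution.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical picks: residual pick error (s) as a function of SNR (dB)
# and the institution the analyst belongs to.
n = 60
snr = rng.uniform(2, 30, n)
institution = rng.integers(0, 3, n)                    # three institutions
bias = np.array([0.00, 0.05, -0.03])[institution]      # per-institution bias (s)
pick_error = 0.4 - 0.012 * snr + bias + rng.normal(0, 0.03, n)

# Design matrix: intercept, SNR, and dummy variables for institutions 1 and 2.
X = np.column_stack([
    np.ones(n),
    snr,
    (institution == 1).astype(float),
    (institution == 2).astype(float),
])
coefficients, *_ = np.linalg.lstsq(X, pick_error, rcond=None)
for name, value in zip(["intercept", "snr", "institution_1", "institution_2"], coefficients):
    print(f"{name:>14s}: {value:+.4f}")
```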

  11. Defining the Roles, Responsibilities, and Functions for Data Science Within the Defense Intelligence Agency

    DTIC Science & Technology

    2016-01-01

    ... of data science within DIA and ensure the activities assist and inform DIA’s decisionmakers, analysts, and operators. The research addressed two key ... by an analyst or researcher. This type of identification can be time-consuming and potentially full of errors. GENIE learns from ... analysts. The protocol can be found in Appendix A. The protocol was intended to elicit information in five broad research areas. First, we asked a ...

  12. Grand Strategy: Contending Contemporary Analyst Views and Implications for the U.S. Navy

    DTIC Science & Technology

    2011-11-01

    Grand Strategy: Contending Contemporary Analyst Views and Implications for the U.S. Navy. Elbridge Colby, CRM D0025423.A2/Final, November 2011. ... implications for the country, the U.S. armed forces, and the U.S. Navy. Two other categories—isolationism (an oft-mentioned contender in political ...

  13. Impact of Growing Business on Software Processes

    NASA Astrophysics Data System (ADS)

    Nikitina, Natalja; Kajko-Mattsson, Mira

    When growing their businesses, software organizations should not only put effort into developing and executing their business strategies, but also into managing and improving their internal software development processes and aligning them with their business growth strategies. Only in this way can they ensure that their businesses grow in a healthy and sustainable way. In this paper, we map out one software company's business growth over the course of its historical events and identify its impact on the company's software production processes and capabilities. The impact concerns benefits, challenges, problems and lessons learned. The most important lesson learned is that although business growth became a stimulus for rethinking and improving software processes, the organization lacked guidelines for aligning those processes with business growth. Finally, the paper generates research questions that provide a platform for future research.

  14. Concept for facilitating analyst-mediated interpretation of qualitative chromatographic-mass spectral data: an alternative to manual examination of extracted ion chromatograms.

    PubMed

    Borges, Chad R

    2007-07-01

    A chemometrics-based data analysis concept has been developed as a substitute for manual inspection of extracted ion chromatograms (XICs), which facilitates rapid, analyst-mediated interpretation of GC- and LC/MS(n) data sets from samples undergoing qualitative batchwise screening for prespecified sets of analytes. Automatic preparation of data into two-dimensional row space-derived scatter plots (row space plots) eliminates the need to manually interpret hundreds to thousands of XICs per batch of samples while keeping all interpretation of raw data directly in the hands of the analyst, saving great quantities of human time without loss of integrity in the data analysis process. For a given analyte, two analyte-specific variables are automatically collected by a computer algorithm and placed into a data matrix (i.e., placed into row space): the first variable is the ion abundance corresponding to scan number x and analyte-specific m/z value y, and the second variable is the ion abundance corresponding to scan number x and analyte-specific m/z value z (a second ion). These two variables serve as the two axes of the aforementioned row space plots. In order to collect appropriate scan number (retention time) information, it is necessary to analyze, as part of every batch, a sample containing a mixture of all analytes to be tested. When pure standard materials of tested analytes are unavailable, but representative ion m/z values are known and retention time can be approximated, data are evaluated based on two-dimensional scores plots from principal component analysis of small time range(s) of mass spectral data. The time-saving efficiency of this concept is directly proportional to the percentage of negative samples and to the total number of samples processed simultaneously.
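
    The two row-space variables described above are simply the abundances of two analyte-specific ions at the analyte's scan number, collected for every sample in the batch. The sketch below assembles those coordinates from synthetic data; the batch dimensions, scan number, and ion indices are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical batch: 8 samples x 300 scans x 4 monitored m/z channels.
n_samples, n_scans, n_channels = 8, 300, 4
batch = rng.normal(50, 10, (n_samples, n_scans, n_channels))

# Samples 0-3 are "positive": the analyte elutes around scan 150 on channels 1 and 2.
for s in range(4):
    batch[s, 148:153, 1] += 800
    batch[s, 148:153, 2] += 350

ANALYTE_SCAN = 150          # retention time taken from the all-analyte mixture run
ION_Y, ION_Z = 1, 2         # indices of the two analyte-specific m/z channels

# Row-space variables: abundance of ion y and ion z at the analyte's scan number.
var_y = batch[:, ANALYTE_SCAN, ION_Y]
var_z = batch[:, ANALYTE_SCAN, ION_Z]

for sample, (y, z) in enumerate(zip(var_y, var_z)):
    print(f"sample {sample}: ion y = {y:7.1f}, ion z = {z:7.1f}")
# Plotting var_y against var_z gives the two-dimensional row-space plot;
# positive samples cluster away from the origin along the analyte's ion ratio.
```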

  15. A Comparative of business process modelling techniques

    NASA Astrophysics Data System (ADS)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    There are now many business process modelling techniques. This article examines the differences among them, explaining the definition and structure of each technique. The paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on two criteria: notation and how each technique works when implemented in Somerleyton Animal Park. Each technique is presented with its advantages and disadvantages. The final conclusion recommends the business process modelling techniques that are easy to use and serve as the basis for evaluating further modelling techniques.

  16. Extending BPM Environments of Your Choice with Performance Related Decision Support

    NASA Astrophysics Data System (ADS)

    Fritzsche, Mathias; Picht, Michael; Gilani, Wasif; Spence, Ivor; Brown, John; Kilpatrick, Peter

    What-if Simulations have been identified as one solution for business performance related decision support. Such support is especially useful in cases where it can be automatically generated out of Business Process Management (BPM) Environments from the existing business process models and performance parameters monitored from the executed business process instances. Currently, some of the available BPM Environments offer basic-level performance prediction capabilities. However, these functionalities are normally too limited to be generally useful for performance related decision support at business process level. In this paper, an approach is presented which allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimizations or a combination of such solutions into already existing BPM environments. The approach abstracts from process modelling techniques which enable automatic decision support spanning processes across numerous BPM Environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.

  17. Use of Statechart Assertions for Modeling Human-in-the-Loop Security Analysis and Decision-Making Processes

    DTIC Science & Technology

    2012-06-01

    ... checking leads to an improvement in the quality and success of enterprise software development. Business Process Modeling Notation (BPMN) is an emerging standard that allows business processes to be captured in a standardized format. BPMN lacks formal semantics, which leaves many of its features ...

  18. The Analyst's "Use" of Theory or Theories: The Play of Theory.

    PubMed

    Cooper, Steven H

    2017-10-01

    Two clinical vignettes demonstrate a methodological approach that guides the analyst's attention to metaphors and surfaces that are the focus of different theories. Clinically, the use of different theories expands the metaphorical language with which the analyst tries to make contact with the patient's unconscious life. Metaphorical expressions may be said to relate to each other as the syntax of unconscious fantasy (Arlow 1979). The unconscious fantasy itself represents a metaphorical construction of childhood experience that has persisted, dynamically expressive and emergent into adult life. This persistence is evident in how, in some instances, long periods of an analysis focus on translating one or a few metaphors, chiefly because the manifest metaphorical expressions of a central theme regularly lead to better understanding of an unconscious fantasy. At times employing another model or theory assists in a level of self-reflection about clinical understanding and clinical decisions. The analyst's choice of theory or theories is unique to the analyst and is not prescriptive, except as illustrating a way to think about these issues. The use of multiple models in no way suggests or implies that theories may be integrated.

  19. 12 CFR 217.122 - Qualification requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... related processes; (ii) Have and document a process (which must capture business environment and internal... current business activities, risk profile, technological processes, and risk management processes; and (ii... assessment systems. (D) Business environment and internal control factors. The Board-regulated institution...

  20. 12 CFR 324.122 - Qualification requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... related processes; (ii) Have and document a process (which must capture business environment and internal... current business activities, risk profile, technological processes, and risk management processes; and (ii... assessment systems. (D) Business environment and internal control factors. The FDIC-supervised institution...

  1. Two years into the storm over pricing to and collecting from the uninsured--a hospital valuation expert examines the risk/return dynamics and asks: would fair pricing and fair medical debt repayment plans increase yields to hospitals and simultaneously mitigate these controversies?

    PubMed

    Unland, James J

    2005-01-01

    As the controversies over 501(c)(3) "charitable" hospitals' pricing, collections, and charity care practices that emerged in the winter and spring of 2003 continue unabated--now involving government officials from city councils and county boards to state attorneys general and Congress as well as numerous class action lawsuits--a hospital valuation expert and risk analyst looks at the fundamental economic and strategic issues, concluding that the risk/return dynamics are out of whack in that hospitals are facing mushrooming, multifaceted troubles over what has been a very low net yield patient population. After interviewing patient account representatives at hospitals and conducting other research, this analyst asks: Should attention have been focused at the national and state hospital association levels in 2003 to take steps to increase the net yield to hospitals from the uninsured population through more equitable pricing and better medical debt repayment terms, steps that might have mitigated these controversies? Many hospitals and hospital associations have been so intent on proving hospitals' legal right to charge "list price" to and sue the uninsured that they have overlooked a simple yet effective business premise that many hospital patient accounts representatives already fully know: Fair pricing and fair payment terms are actually good business. The author asserts that the controversies that emerged in 2003 actually represented a significant opportunity that, with a different approach, would likely have resulted in hospitals being able to collect significantly more money from the uninsured population while, at the same time, lessening or even avoiding the destructive ramifications that have occurred in the form of investigations, legislation, and lawsuits. To realize higher net yields from the uninsured, highly specific leadership steps need to be taken uniquely at national and state "association" levels in order to avoid the negative financial consequences of fragmented actions that can cause individual hospitals to become "magnets" for the uninsured. Steps at the individual hospital level need to be preceded by coordinated leadership at the "association" level if these difficult controversies are to be transformed into an opportunity for more revenue from the uninsured, an opportunity that existed in 2003 and before.

  2. Dynamic Graph Analytic Framework (DYGRAF): greater situation awareness through layered multi-modal network analysis

    NASA Astrophysics Data System (ADS)

    Margitus, Michael R.; Tagliaferri, William A., Jr.; Sudit, Moises; LaMonica, Peter M.

    2012-06-01

    Understanding the structure and dynamics of networks is of vital importance to winning the global war on terror. To fully comprehend the network environment, analysts must be able to investigate interconnected relationships of many diverse network types simultaneously as they evolve both spatially and temporally. To remove the burden from the analyst of making mental correlations of observations and conclusions from multiple domains, we introduce the Dynamic Graph Analytic Framework (DYGRAF). DYGRAF provides the infrastructure that facilitates a layered multi-modal network analysis (LMMNA) approach that enables analysts to assemble previously disconnected, yet related, networks in a common battle space picture. In doing so, DYGRAF provides the analyst with timely situation awareness, understanding and anticipation of threats, and support for effective decision-making in diverse environments.
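
    Layered multi-modal network analysis of the kind DYGRAF supports can be sketched by tagging each separately observed network with its layer and composing the layers into one picture. The example below is hypothetical (invented nodes and layer names) and uses the networkx library rather than DYGRAF itself.

```python
import networkx as nx

# Hypothetical single-mode networks observed separately by different analysts.
comms = nx.Graph(name="communications")
comms.add_edges_from([("A", "B"), ("B", "C")])

finance = nx.Graph(name="financial")
finance.add_edges_from([("B", "D"), ("D", "E")])

travel = nx.Graph(name="travel")
travel.add_edges_from([("C", "E")])

# Tag each edge with its source layer, then compose the layers into a single
# multi-modal view of the network environment.
for layer in (comms, finance, travel):
    nx.set_edge_attributes(layer, layer.name, "layer")
combined = nx.compose_all([comms, finance, travel])

centrality = nx.betweenness_centrality(combined)
print("nodes:", sorted(combined.nodes))
print("most central (bridging) node:", max(centrality, key=centrality.get))
```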

  3. Video enhancement workbench: an operational real-time video image processing system

    NASA Astrophysics Data System (ADS)

    Yool, Stephen R.; Van Vactor, David L.; Smedley, Kirk G.

    1993-01-01

    Video image sequences can be exploited in real-time, giving analysts rapid access to information for military or criminal investigations. Video-rate dynamic range adjustment subdues fluctuations in image intensity, thereby assisting discrimination of small or low-contrast objects. Contrast-regulated unsharp masking enhances differentially shadowed or otherwise low-contrast image regions. Real-time removal of localized hotspots, when combined with automatic histogram equalization, may enhance resolution of objects directly adjacent. In video imagery corrupted by zero-mean noise, real-time frame averaging can assist resolution and location of small or low-contrast objects. To maximize analyst efficiency, lengthy video sequences can be screened automatically for low-frequency, high-magnitude events. Combined zoom, roam, and automatic dynamic range adjustment permit rapid analysis of facial features captured by video cameras recording crimes in progress. When trying to resolve small objects in murky seawater, stereo video places the moving imagery in an optimal setting for human interpretation.
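
    Two of the operations named above, unsharp masking and frame averaging, are simple enough to sketch directly. The example below is a synthetic, offline illustration in plain NumPy, not the workbench's real-time implementation; the image, noise level, and kernel size are all invented.

```python
import numpy as np

def unsharp_mask(frame, blur_size=5, amount=1.5):
    """Sharpen by adding back the difference between the frame and a box-blurred copy."""
    kernel = np.ones((blur_size, blur_size)) / blur_size**2
    pad = blur_size // 2
    padded = np.pad(frame, pad, mode="edge")
    blurred = np.zeros_like(frame, dtype=float)
    height, width = frame.shape
    for i in range(height):
        for j in range(width):
            blurred[i, j] = np.sum(padded[i:i + blur_size, j:j + blur_size] * kernel)
    return np.clip(frame + amount * (frame - blurred), 0, 255)

def frame_average(frames):
    """Temporal averaging suppresses zero-mean noise by roughly the square root of N."""
    return np.mean(np.stack(frames), axis=0)

# Hypothetical noisy video: a dim square buried in zero-mean Gaussian noise.
rng = np.random.default_rng(3)
clean = np.zeros((64, 64))
clean[24:40, 24:40] = 40.0
frames = [clean + rng.normal(0, 20, clean.shape) for _ in range(16)]

averaged = frame_average(frames)
enhanced = unsharp_mask(averaged)
print("noise std before averaging:", round(float(np.std(frames[0] - clean)), 1))
print("noise std after averaging: ", round(float(np.std(averaged - clean)), 1))
```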

  4. Visualization of multi-INT fusion data using Java Viewer (JVIEW)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Aved, Alex; Nagy, James; Scott, Stephen

    2014-05-01

    Visualization is important for multi-intelligence fusion and we demonstrate issues for presenting physics-derived (i.e., hard) and human-derived (i.e., soft) fusion results. Physics-derived solutions (e.g., imagery) typically involve sensor measurements that are objective, while human-derived (e.g., text) typically involve language processing. Both results can be geographically displayed for user-machine fusion. Attributes of an effective and efficient display are not well understood, so we demonstrate issues and results for filtering, correlation, and association of data for users - be they operators or analysts. Operators require near-real time solutions while analysts have the opportunities of non-real time solutions for forensic analysis. In a use case, we demonstrate examples using the JVIEW concept that has been applied to piloting, space situation awareness, and cyber analysis. Using the open-source JVIEW software, we showcase a big data solution for multi-intelligence fusion application for context-enhanced information fusion.

  5. Teleconsultation in school settings: linking classroom teachers and behavior analysts through web-based technology.

    PubMed

    Frieder, Jessica E; Peterson, Stephanie M; Woodward, Judy; Crane, Jaelee; Garner, Marlane

    2009-01-01

    This paper describes a technically driven, collaborative approach to assessing the function of problem behavior using web-based technology. A case example is provided to illustrate the process used in this pilot project. A school team conducted a functional analysis with a child who demonstrated challenging behaviors in a preschool setting. Behavior analysts at a university setting provided the school team with initial workshop trainings, on-site visits, e-mail and phone communication, as well as live web-based feedback on functional analysis sessions. The school personnel implemented the functional analysis with high fidelity and scored the data reliably. Outcomes of the project suggest that there is great potential for collaboration via the use of web-based technologies for ongoing assessment and development of effective interventions. However, an empirical evaluation of this model should be conducted before wide-scale adoption is recommended.

  6. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Which are more useful to someone trying to understand the general design or high-level requirements of a system: functional modeling techniques or object-oriented graphical representations? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.

  7. Developing Analogy Cost Estimates for Space Missions

    NASA Technical Reports Server (NTRS)

    Shishko, Robert

    2004-01-01

    The analogy approach in cost estimation combines actual cost data from similar existing systems, activities, or items with adjustments for a new project's technical, physical or programmatic differences to derive a cost estimate for the new system. This method is normally used early in a project cycle when there is insufficient design/cost data to use as a basis for (or insufficient time to perform) a detailed engineering cost estimate. The major limitation of this method is that it relies on the judgment and experience of the analyst/estimator. The analyst must ensure that the best analogy or analogies have been selected, and that appropriate adjustments have been made. While analogy costing is common, there is a dearth of advice in the literature on the 'adjustment methodology', especially for hardware projects. This paper discusses some potential approaches that can improve rigor and repeatability in the analogy costing process.
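
    In its simplest form, the adjustment step amounts to scaling the analog's actual cost by multiplicative factors for the new project's technical, physical, and programmatic differences. The sketch below is purely illustrative; the analog cost and the factors are invented, and a real estimate would justify each factor from data or expert judgment.

```python
# Hypothetical analogy estimate: start from the actual cost of a similar,
# existing system and apply adjustment factors for known differences.
analog_cost_musd = 120.0    # actual cost of the analogous system, $M

adjustments = {
    "mass growth (+25%)":              1.15,   # physical difference
    "new detector technology":         1.20,   # technical difference
    "compressed development schedule": 1.10,   # programmatic difference
    "design reuse from the analog":    0.90,
}

estimate = analog_cost_musd
for reason, factor in adjustments.items():
    estimate *= factor
    print(f"{reason:<34s} x{factor:.2f} -> {estimate:7.1f} $M")

print(f"analogy estimate: {estimate:.1f} $M (analog was {analog_cost_musd:.1f} $M)")
```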

  8. Systematic drug repositioning through mining adverse event data in ClinicalTrials.gov.

    PubMed

    Su, Eric Wen; Sanger, Todd M

    2017-01-01

    Drug repositioning (i.e., drug repurposing) is the process of discovering new uses for marketed drugs. Historically, such discoveries were serendipitous. However, the rapid growth in electronic clinical data and text mining tools makes it feasible to systematically identify drugs with the potential to be repurposed. Described here is a novel method of drug repositioning by mining ClinicalTrials.gov. The text mining tools I2E (Linguamatics) and PolyAnalyst (Megaputer) were utilized. An I2E query extracts "Serious Adverse Events" (SAE) data from randomized trials in ClinicalTrials.gov. Through a statistical algorithm, a PolyAnalyst workflow ranks the drugs where the treatment arm has fewer predefined SAEs than the control arm, indicating that potentially the drug is reducing the level of SAE. Hypotheses could then be generated for the new use of these drugs based on the predefined SAE that is indicative of disease (for example, cancer).
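
    The ranking step, favoring drugs whose treatment arm shows fewer of a predefined SAE than the control arm, can be illustrated with a simple two-proportion statistic. The actual PolyAnalyst workflow and its algorithm are not reproduced here; the sketch below is a hedged stand-in with invented counts.

```python
import math

# Hypothetical per-trial counts of a predefined SAE (e.g., a cancer-related event):
# (drug, SAEs in treatment arm, N treatment, SAEs in control arm, N control)
trials = [
    ("drug_A", 4, 250, 15, 245),
    ("drug_B", 12, 300, 14, 310),
    ("drug_C", 2, 180, 11, 175),
]

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for the difference between two proportions (pooled standard error)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Rank drugs by how strongly the treatment arm shows fewer of the predefined SAE
# than the control arm (most negative z first): the repositioning signal.
for drug, *counts in sorted(trials, key=lambda t: two_proportion_z(*t[1:])):
    print(f"{drug}: z = {two_proportion_z(*counts):+.2f}")
```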

  9. Study on automatic ECT data evaluation by using neural network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Komatsu, H.; Matsumoto, Y.; Badics, Z.

    1994-12-31

    At the in-service inspection of the steam generator (SG) tubing in Pressurized Water Reactor (PWR) plants, eddy current testing (ECT) has been widely used at each outage. At present, ECT data evaluation is mainly performed by ECT data analysts, and it therefore has the following problems. Only the ECT signal configuration on the impedance trajectory is used in the evaluation. It is an enormously time-consuming process. The evaluation result is influenced by the ability and experience of the analyst. In particular, it is difficult to identify the true defect signal hidden in background signals such as lift-off noise and deposit signals. In this work, the authors studied the possibility of applying a neural network to ECT data evaluation. It was demonstrated that the neural network is effective in identifying the nature of a defect when several optimum input parameters are selected to categorize the raw ECT signals.
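
    The study's network architecture and input parameters are not reproduced here; the toy sketch below (assuming scikit-learn is available) only illustrates the general idea of feeding a few signal-derived features to a small neural network that separates defect signals from background. The feature choices and synthetic data are assumptions of this example.

      # Toy classification of ECT signal features with a small neural network.
      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(0)
      # Hypothetical features per signal: amplitude, phase angle (deg), noise index.
      defect_sig = np.column_stack([rng.normal(2.0, 0.3, 50),
                                    rng.normal(90.0, 10.0, 50),
                                    rng.normal(0.2, 0.05, 50)])
      background = np.column_stack([rng.normal(0.5, 0.2, 50),
                                    rng.normal(10.0, 10.0, 50),
                                    rng.normal(0.8, 0.05, 50)])
      X = np.vstack([defect_sig, background])
      y = np.array([1] * 50 + [0] * 50)        # 1 = defect, 0 = background noise

      clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
      print(clf.predict([[1.9, 85.0, 0.25]]))  # expected: [1], i.e. defect-like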

  10. Sensory empathy and enactment.

    PubMed

    Zanocco, Giorgio; De Marchi, Alessandra; Pozzi, Francesco

    2006-02-01

    The authors propose the concept of sensory empathy which emerges through contact between analyst and patient as they get in touch with an area concerning the primary bond. This area is not so much based on thoughts and fantasies as it is on physical sensations. Sensory empathy has to do with that instrument described by Freud as pertaining to the unconscious of any human, which enables one person to interpret unconscious communications of another person. The authors link this concept to that of enactment precisely because the latter concerns unconscious, early elements that find in the act a first meaningful expression. It involves both analyst and patient. In other words, the authors wish to emphasize the importance of the analytical process maintaining contact with that immense field of human interaction that can be defined as primary sensory area and which becomes intertwined with the evolution of affects. Clinical examples are provided to clarify these hypotheses.

  11. Collaborative business process support in eHealth: integrating IHE profiles through ebXML business process specification language.

    PubMed

    Dogac, Asuman; Kabak, Yildiray; Namli, Tuncay; Okcan, Alper

    2008-11-01

    Integrating the Healthcare Enterprise (IHE) specifies integration profiles describing selected real-world use cases to facilitate the interoperability of healthcare information resources. When realizing a complex real-world scenario, IHE profiles are combined by grouping the related IHE actors. Grouping IHE actors implies that the associated business processes (IHE profiles) that the actors are involved in must be combined, that is, the choreography of the resulting collaborative business process must be determined by deciding on the execution sequence of transactions coming from different profiles. There are many IHE profiles, and each user or vendor may support a different set of IHE profiles that fits its business needs. However, determining the precedence of all the involved transactions manually for each possible combination of the profiles is a very tedious task. In this paper, we describe how to obtain the overall business process automatically when IHE actors are grouped. For this purpose, we represent the IHE profiles through a standard, machine-processable language, namely, the Organization for the Advancement of Structured Information Standards (OASIS) ebusiness eXtensible Markup Language (ebXML) Business Process Specification (ebBP) Language. We define the precedence rules among the transactions of the IHE profiles, again, in a machine-processable way. Then, through a graphical tool, we allow users to select the actors to be grouped and automatically produce the overall business process in a machine-processable format.
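
    The paper encodes precedence in ebBP; purely as an illustration of the underlying idea, the sketch below states a few hypothetical cross-profile precedence constraints as a graph and derives one admissible execution order with a topological sort. The transaction names and rules are examples, not taken from the IHE specifications or the paper's ebBP files.

      # Derive an execution order for grouped transactions from precedence rules.
      from graphlib import TopologicalSorter

      precedence = {  # transaction -> set of transactions that must complete first
          "Register Document Set": {"Provide and Register Document Set"},
          "Query Registry": {"Register Document Set"},
          "Retrieve Document": {"Register Document Set", "Query Registry"},
      }
      print(list(TopologicalSorter(precedence).static_order()))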

  12. Training for spacecraft technical analysts

    NASA Technical Reports Server (NTRS)

    Ayres, Thomas J.; Bryant, Larry

    1989-01-01

    Deep space missions such as Voyager rely upon a large team of expert analysts who monitor activity in the various engineering subsystems of the spacecraft and plan operations. Senior team members generally come from the spacecraft designers, and new analysts receive on-the-job training. Neither of these methods will suffice for the creation of a new team in the middle of a mission, which may be the situation during the Magellan mission. New approaches are recommended, including electronic documentation, explicit cognitive modeling, and coached practice with archived data.

  13. What's in a name: what analyst and patient call each other.

    PubMed

    Barron, Grace Caroline

    2006-01-01

    Awkward moments often arise between patient and analyst involving the question, "What do we call each other?" The manner in which the dyad address each other contains material central to the patient's inner life. Names, like dreams, deserve a privileged status as providing a royal road into the paradoxical analytic relationship and the unconscious conflicts that feed it. Whether an analyst addresses the patient formally, informally, or not at all, awareness of the issues surrounding names is important.

  14. A framework for service enterprise workflow simulation with multi-agents cooperation

    NASA Astrophysics Data System (ADS)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is a key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service-workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address these issues. The social rationality of agents is introduced into the proposed framework. By adopting rationality as one social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.
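
    A minimal sketch of the scheduling idea follows: each activity instance is offered to the agent whose score best balances its own speed on that activity against its current load, with a per-agent weight standing in for the social rationality factor. The scoring rule and parameters are this sketch's own simplification, not the framework's algorithm.

      # Toy agent-based dispatcher for workflow activity instances.
      class Agent:
          def __init__(self, name, service_times, rationality):
              self.name = name
              self.service_times = service_times   # activity type -> time units
              self.rationality = rationality       # 0 = purely self-interested
              self.busy_until = 0.0

          def score(self, activity):
              speed = -self.service_times[activity]   # self-interest: be fast
              load = -self.busy_until                 # social: keep the team balanced
              return (1 - self.rationality) * speed + self.rationality * load

      def dispatch(agents, activities):
          assignment = []
          for act in activities:
              best = max(agents, key=lambda a: a.score(act))
              best.busy_until += best.service_times[act]
              assignment.append((act, best.name))
          return assignment

      agents = [Agent("clerk", {"review": 2.0, "approve": 5.0}, rationality=0.3),
                Agent("manager", {"review": 4.0, "approve": 1.0}, rationality=0.7)]
      print(dispatch(agents, ["review", "review", "approve", "review", "approve"]))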

  15. Enhancement of the Acquisition Process for a Combat System-A Case Study to Model the Workflow Processes for an Air Defense System Acquisition

    DTIC Science & Technology

    2009-12-01

    Business Process Modeling BPMN Business Process Modeling Notation SoA Service-oriented Architecture UML Unified Modeling Language CSP...system developers. Supporting technologies include Business Process Modeling Notation (BPMN), Unified Modeling Language (UML), model-driven architecture

  16. Separating Business Logic from Medical Knowledge in Digital Clinical Workflows Using Business Process Model and Notation and Arden Syntax.

    PubMed

    de Bruin, Jeroen S; Adlassnig, Klaus-Peter; Leitich, Harald; Rappelsberger, Andrea

    2018-01-01

    Evidence-based clinical guidelines have a major positive effect on the physician's decision-making process. Computer-executable clinical guidelines allow for automated guideline marshalling during a clinical diagnostic process, thus improving the decision-making process. Our objective was the implementation of a digital clinical guideline for the prevention of mother-to-child transmission of hepatitis B as a computerized workflow, thereby separating business logic from medical knowledge and decision-making. We used Activiti, a Business Process Model and Notation language system, for business logic and workflow modeling. Medical decision-making was performed by an Arden-Syntax-based medical rule engine, which is part of the ARDENSUITE software. We succeeded in creating an electronic clinical workflow for the prevention of mother-to-child transmission of hepatitis B in which institution-specific medical decision-making processes could be adapted without modifying the workflow's business logic. Separation of business logic and medical decision-making results in more easily reusable electronic clinical workflows.
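
    The paper's implementation uses Activiti for the workflow and an Arden-Syntax rule engine for the medical knowledge; the plain-Python sketch below only mirrors that separation of concerns, with the workflow delegating every medical decision to a pluggable rule callable. The rule content, names, and recommendation text are hypothetical, not the published guideline.

      # Workflow ("business logic") kept separate from the medical decision rule.
      def hbv_prophylaxis_rule(patient):
          """Hypothetical stand-in for a medical logic module."""
          if patient.get("mother_hbsag_positive"):
              return "administer HBIG and HBV vaccine within 12 hours of birth"
          return "follow routine HBV vaccination schedule"

      def newborn_workflow(patient, decide=hbv_prophylaxis_rule):
          """Business logic: step ordering only; decisions are delegated."""
          steps = ["admit newborn", "retrieve maternal serology"]
          steps.append(decide(patient))            # medical decision point
          steps.append("document recommendation in EHR")
          return steps

      print(newborn_workflow({"mother_hbsag_positive": True}))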

  17. Medical Management: Process Analysis Study Report

    DTIC Science & Technology

    2011-10-28

    in Medical Management (care coordinator, case manager, PCM, clinic nurses, referral management shop, utilization management?, etc). The goal is to... Enterprise Nursing Procedure Manual, revealed that fact from the Navy's perspective. An OASD(HA) TRICARE Management Activity (TMA) Senior... Requirements Analyst, Clinical Information Management (IM) and retired Army Colonel Nurse, Patricia Kinder, essentially told us no single application suite

  18. Moving Rivers, Shifting Streams: Perspectives on the Existence of a Policy Window

    ERIC Educational Resources Information Center

    Galligan, Ann M.; Burgess, Chris N.

    2005-01-01

    This article represents differing perspectives on the creation and establishment of the Rhode Island Arts Learning Network (ALN). At the heart of this discussion is whether or not the Rhode Island task force in charge of this process took advantage of what noted public policy analyst John Kingdon refers to as a "policy window" where…

  19. How is physiology relevant to behavior analysis?

    PubMed Central

    Reese, Hayne W.

    1996-01-01

    Physiology is an important biological science; but behavior analysis is not a biological science, and behavior analysts can safely ignore biological processes. However, ignoring products of biological processes might be a serious mistake. The important products include behavior, instinctive drift, behavior potentials, hunger, and many developmental milestones and events. Physiology deals with the sources of such products; behavior analysis can deal with how the products affect behavior, which can be understood without understanding their sources. PMID:22478240

  20. COMBATXXI, JDAFS, and LBC Integration Requirements for EASE

    DTIC Science & Technology

    2015-10-06

    process as linear and as new data is made available, any previous analysis is obsolete and has to start the process over again. Figure 2 proposes a...final line of the manifest file names the scenario file associated with the run. Under the usual practice, the analyst now starts the COMBATXXI...describes which events are to be logged. Finally the scenario is started with the click of a button. The simulation generates logs of a couple of sorts

  1. A Framework for Integrating Environmental Justice in Regulatory Analysis

    PubMed Central

    Nweke, Onyemaechi C.

    2011-01-01

    With increased interest in integrating environmental justice into the process for developing environmental regulations in the United States, analysts and decision makers are confronted with the question of what methods and data can be used to assess disproportionate environmental health impacts. However, as a first step to identifying data and methods, it is important that analysts understand what information on equity impacts is needed for decision making. Such knowledge originates from clearly stated equity objectives and the reflection of those objectives throughout the analytical activities that characterize Regulatory Impact Analysis (RIA), a process that is traditionally used to inform decision making. The framework proposed in this paper advocates structuring analyses to explicitly provide pre-defined output on equity impacts. Specifically, the proposed framework emphasizes: (a) defining equity objectives for the proposed regulatory action at the onset of the regulatory process, (b) identifying specific and related sub-objectives for key analytical steps in the RIA process, and (c) developing explicit analytical/research questions to assure that stated sub-objectives and objectives are met. In proposing this framework, it is envisioned that information on equity impacts informs decision-making in regulatory development, and that this is achieved through a systematic and consistent approach that assures linkages between stated equity objectives, regulatory analyses, selection of policy options, and the design of compliance and enforcement activities. PMID:21776235

  2. RockFall analyst: A GIS extension for three-dimensional and spatially distributed rockfall hazard modeling

    NASA Astrophysics Data System (ADS)

    Lan, Hengxing; Derek Martin, C.; Lim, C. H.

    2007-02-01

    Geographic information system (GIS) modeling is used in combination with three-dimensional (3D) rockfall process modeling to assess rockfall hazards. A GIS extension, RockFall Analyst (RA), which is capable of effectively handling large amounts of geospatial information relative to rockfall behaviors, has been developed in ArcGIS using ArcObjects and C#. The 3D rockfall model considers dynamic processes on a cell plane basis. It uses inputs of distributed parameters in terms of raster and polygon features created in GIS. Two major components are included in RA: particle-based rockfall process modeling and geostatistics-based rockfall raster modeling. Rockfall process simulation results, 3D rockfall trajectories and their velocity features either for point seeders or polyline seeders are stored in 3D shape files. Distributed raster modeling, based on 3D rockfall trajectories and a spatial geostatistical technique, represents the distribution of spatial frequency, the flying and/or bouncing height, and the kinetic energy of falling rocks. A distribution of rockfall hazard can be created by taking these rockfall characteristics into account. A barrier analysis tool is also provided in RA to aid barrier design. An application of these modeling techniques to a case study is provided. The RA has been tested in ArcGIS 8.2, 8.3, 9.0 and 9.1.
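
    As a toy illustration of the particle-based component, the sketch below advances a single rock over a one-dimensional slope profile under gravity and damps the vertical rebound at each impact with a restitution coefficient. The profile, coefficients, and time step are invented for the example; RockFall Analyst itself performs the simulation in three dimensions on GIS rasters.

      # Toy 1-D rockfall trajectory with damped bounces on a planar slope.
      G = 9.81

      def slope_height(x):
          return max(0.0, 50.0 - 0.5 * x)          # 50 m crest, 1:2 slope

      def simulate(x=0.0, vx=3.0, vz=0.0, restitution=0.35, dt=0.01, max_t=60.0):
          z = slope_height(x) + 5.0                # release 5 m above the slope
          t, impacts = 0.0, []
          while t < max_t:
              x, z, vz = x + vx * dt, z + vz * dt, vz - G * dt
              if z <= slope_height(x):             # impact: damp vertical rebound
                  z = slope_height(x)
                  vz = -vz * restitution
                  impacts.append((round(x, 1), round(z, 1)))
                  if abs(vz) < 0.5:                # rebound too small: rock settles
                      break
              t += dt
          return impacts

      print(simulate())   # (x, z) impact points down the slope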

  3. Spacelab data processing facility (SLDPF) quality assurance (QA)/data accounting (DA) expert systems - Transition from prototypes to operational systems

    NASA Technical Reports Server (NTRS)

    Basile, Lisa

    1988-01-01

    The SLDPF is responsible for the capture, quality monitoring, processing, accounting, and shipment of Spacelab and/or Attached Shuttle Payloads (ASP) telemetry data to various user facilities. Expert systems will aid in the performance of the quality assurance and data accounting functions of the two SLDPF functional elements: the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS). Prototypes were developed for each as independent efforts. The SIPS Knowledge System Prototype (KSP) used the commercial shell OPS5+ on an IBM PC/AT; the SOPS Expert System Prototype used the expert system shell CLIPS implemented on a Macintosh personal computer. Both prototypes emulate the duties of the respective QA/DA analysts based upon analyst input and predetermined mission criteria parameters, and recommend instructions and decisions governing the reprocessing, release, or holding for further analysis of data. These prototypes demonstrated feasibility and high potential for operational systems. Increase in productivity, decrease of tedium, consistency, concise historical records, and a training tool for new analysts were the principal advantages. An operational configuration, taking advantage of the SLDPF network capabilities, is under development with the expert systems being installed on SUN workstations. This new configuration in conjunction with the potential of the expert systems will enhance the efficiency, in both time and quality, of the SLDPF's release of Spacelab/AST data products.
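
    The prototypes themselves were written in OPS5+ and CLIPS; purely to illustrate the kind of rule they encode, the sketch below expresses a release/reprocess/hold decision as plain Python conditions over hypothetical quality and accounting metrics. The metric names and thresholds are not the SLDPF mission criteria.

      # Illustrative QA/DA decision rules for one processed data set.
      def qa_da_decision(dataset):
          if dataset["frame_sync_error_rate"] > 0.05:          # fraction of frames
              return "reprocess"
          if dataset["gap_seconds"] > 30 or dataset["accounting_mismatch"]:
              return "hold for further analysis"
          return "release to user facility"

      print(qa_da_decision({"frame_sync_error_rate": 0.01,
                            "gap_seconds": 2,
                            "accounting_mismatch": False}))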

  4. Spacelab data processing facility (SLDPF) Quality Assurance (QA)/Data Accounting (DA) expert systems: Transition from prototypes to operational systems

    NASA Technical Reports Server (NTRS)

    Basile, Lisa

    1988-01-01

    The SLDPF is responsible for the capture, quality monitoring, processing, accounting, and shipment of Spacelab and/or Attached Shuttle Payloads (ASP) telemetry data to various user facilities. Expert systems will aid in the performance of the quality assurance and data accounting functions of the two SLDPF functional elements: the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS). Prototypes were developed for each as independent efforts. The SIPS Knowledge System Prototype (KSP) used the commercial shell OPS5+ on an IBM PC/AT; the SOPS Expert System Prototype used the expert system shell CLIPS implemented on a Macintosh personal computer. Both prototypes emulate the duties of the respective QA/DA analysts based upon analyst input and predetermined mission criteria parameters, and recommend instructions and decisions governing the reprocessing, release, or holding for further analysis of data. These prototypes demonstrated feasibility and high potential for operational systems. Increase in productivity, decrease of tedium, consistency, concise historical records, and a training tool for new analysts were the principal advantages. An operational configuration, taking advantage of the SLDPF network capabilities, is under development with the expert systems being installed on SUN workstations. This new configuration in conjunction with the potential of the expert systems will enhance the efficiency, in both time and quality, of the SLDPF's release of Spacelab/AST data products.

  5. Conceptual framework for the mapping of management process with information technology in a business process.

    PubMed

    Rajarathinam, Vetrickarthick; Chellappa, Swarnalatha; Nagarajan, Asha

    2015-01-01

    This study on a component framework reveals the importance of mapping management processes with technology in a business environment. We define ERP as a software tool that has to provide a business solution but does not necessarily integrate all the departments. Any business process can be classified as a management process, an operational process, or a supportive process. We went through the entire management process and were able to identify the influencing components to be mapped with a technology for a business solution. Governance, strategic management, and decision making are thoroughly discussed, and the need for mapping these components with the ERP is clearly explained. We also suggest that implementation of this framework might reduce ERP failures and, in particular, rectify ERP misfit.

  6. Conceptual Framework for the Mapping of Management Process with Information Technology in a Business Process

    PubMed Central

    Chellappa, Swarnalatha; Nagarajan, Asha

    2015-01-01

    This study on a component framework reveals the importance of mapping management processes with technology in a business environment. We define ERP as a software tool that has to provide a business solution but does not necessarily integrate all the departments. Any business process can be classified as a management process, an operational process, or a supportive process. We went through the entire management process and were able to identify the influencing components to be mapped with a technology for a business solution. Governance, strategic management, and decision making are thoroughly discussed, and the need for mapping these components with the ERP is clearly explained. We also suggest that implementation of this framework might reduce ERP failures and, in particular, rectify ERP misfit. PMID:25861688

  7. Language and the psychoanalytic process: psychoanalysis and Vygotskian psychology. II.

    PubMed

    Wilson, A; Weinstein, L

    1992-01-01

    This paper follows our previous one, where we described a psychoanalytic conception of language, thought, and internalization that is informed by the thinking of Lev Vygotsky. Here, several aspects of the analytic process which allow for the understanding of ineffable experiences in the analysand's history and the analytic situation are investigated: specifically, primal repression, metaphor, and the role of speech in free association. It is suggested that Freud's notion of primal repression be revived and redefined as one aspect of the descriptive unconscious. Some implications of primal repression for transference and resistance are explored. The metaphoric in its broad sense is examined as one example of how early dynamic experiences embedded in the process of language acquisition can be reached within the clinical situation. It is proposed that an understanding of free association is enhanced by awareness of distinctions between inner, egocentric, and social speech. The basic rule can be interpreted as an invitation for the analysand to use inner speech in collaboration with the analyst as best he or she can. Further, the aliveness and degree of superficiality of the analysis can be seen as a function of the analyst's ability to appreciate the properties of inner speech and foster the conditions in the analysis that allow for its unfolding.

  8. Teaching Business Process Management with Simulation in Graduate Business Programs: An Integrative Approach

    ERIC Educational Resources Information Center

    Saraswat, Satya Prakash; Anderson, Dennis M.; Chircu, Alina M.

    2014-01-01

    This paper describes the development and evaluation of a graduate level Business Process Management (BPM) course with process modeling and simulation as its integral component, being offered at an accredited business university in the Northeastern U.S. Our approach is similar to that found in other Information Systems (IS) education papers, and…

  9. Enhanced detection and visualization of anomalies in spectral imagery

    NASA Astrophysics Data System (ADS)

    Basener, William F.; Messinger, David W.

    2009-05-01

    Anomaly detection algorithms applied to hyperspectral imagery are able to reliably identify man-made objects from a natural environment based on statistical/geometric likelihood. The process is more robust than target identification, which requires precise prior knowledge of the object of interest, but has an inherently higher false alarm rate. Standard anomaly detection algorithms measure deviation of pixel spectra from a parametric model (either statistical or linear mixing) estimating the image background. The topological anomaly detector (TAD) creates a fully non-parametric, graph-theory-based, topological model of the image background and measures deviation from this background using codensity. In this paper we present a large-scale comparative test of TAD against 80+ targets in four full HYDICE images using the entire canonical target set for generation of ROC curves. TAD will be compared against several statistics-based detectors including local RX and subspace RX. Even a perfect anomaly detection algorithm would have a high practical false alarm rate in most scenes simply because the user/analyst is not interested in every anomalous object. To assist the analyst in identifying and sorting objects of interest, we investigate coloring of the anomalies with principal components projections using statistics computed from the anomalies. This gives a very useful colorization of anomalies in which objects of similar material tend to have the same color, enabling an analyst to quickly sort and identify anomalies of highest interest.
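
    TAD itself is graph-based and non-parametric; for contrast, the sketch below implements the Mahalanobis-distance scoring that underlies the RX family of detectors mentioned above (here in its simplest global form) on a synthetic image cube, flagging the pixel farthest from the scene mean. Cube size, band count, and the implanted anomaly are arbitrary choices for the illustration.

      # Global RX anomaly scores on a synthetic hyperspectral cube.
      import numpy as np

      rng = np.random.default_rng(1)
      cube = rng.normal(size=(64, 64, 20))        # rows x cols x bands (background)
      cube[10, 12] += 4.0                         # implant one anomalous pixel

      pixels = cube.reshape(-1, cube.shape[-1])
      mean = pixels.mean(axis=0)
      cov_inv = np.linalg.inv(np.cov(pixels, rowvar=False))
      diff = pixels - mean
      rx = np.einsum("ij,jk,ik->i", diff, cov_inv, diff).reshape(cube.shape[:2])

      print(np.unravel_index(rx.argmax(), rx.shape))   # expected: (10, 12)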

  10. 77 FR 11617 - Data Collection Available for Public Comments and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-27

    ... the quality of the collection, to Sandra Johnston, Program Analyst, Office of Financial Assistance... CONTACT: Sandra Johnston, Program Analyst, 202-205-7528, Sandra[email protected] Curtis B. Rich...

  11. Self-confidence in financial analysis: a study of younger and older male professional analysts.

    PubMed

    Webster, R L; Ellis, T S

    2001-06-01

    Measures of reported self-confidence in performing financial analysis by 59 professional male analysts, 31 born between 1946 and 1964 and 28 born between 1965 and 1976, were investigated and reported. Self-confidence in one's ability is important in the securities industry because it affects recommendations and decisions to buy, sell, and hold securities. The respondents analyzed a set of multiyear corporate financial statements and reported their self-confidence in six separate financial areas. Data from the 59 male financial analysts were tallied and analyzed using both univariate and multivariate statistical tests. Rated self-confidence was not significantly different for the younger and the older men. These results are not consistent with a similar prior study of female analysts in which younger women showed significantly higher self-confidence than older women.

  12. Seeing, mirroring, desiring: the impact of the analyst's pregnant body on the patient's body image.

    PubMed

    Yakeley, Jessica

    2013-08-01

    The paper explores the impact of the analyst's pregnant body on the course of two analyses, of a young man and a young woman, specifically focusing on how each patient's visual perception and affective experience of being with the analyst's pregnant body affected their own body image and subjective experience of their body. The pre-verbal or 'subsymbolic' material evoked in the analyses contributed to a greater understanding of the patients' developmental experiences in infancy and adolescence, which had resulted in both carrying a profoundly distorted body image into adulthood. The analyst's pregnancy offered a therapeutic window in which a shift in the patient's body image could be initiated. Clinical material is presented in detail with reference to the psychoanalytic literature on the pregnant analyst, and that of the development of the body image, particularly focusing on the role of visual communication and the face. The author proposes a theory of psychic change, drawing on Bucci's multiple code theory, in which the patients' unconscious or 'subsymbolic' awareness of the pregnancy, manifest in their bodily responses, feeling states and dreams, as well as in the analyst's countertransference, could gradually be verbalized and understood within the transference. Thus visual perception, or 'external seeing', could gradually become 'internal seeing', or insight into unconscious phantasies, leading to a shift in the patients' internal object world towards a less persecutory state and a more realistic appraisal of their body image. Copyright © 2013 Institute of Psychoanalysis.

  13. Business Process Reengineering in the Inventory Management to Improve Aircraft Maintenance Operations in the Indonesian Air Force

    DTIC Science & Technology

    2006-06-01

    Headquarters (MABES TNI) for priority analysis. After that, MABES TNI submits the proposals to the DOD for procurement processes. (Republic of Indonesia... James E., Ernst and Young, "The New Industrial Engineering: Information Technology and Business Process Redesign." In Business Process Reengineering... The Art of Balancing, Harvard Business Review, November-December 1993. Grover, Varun, Teng, James T.C., and Fiedler, Kirk D., "Technological and

  14. Beyond business process redesign: redefining Baxter's business network.

    PubMed

    Short, J E; Venkatraman, N

    1992-01-01

    Business process redesign has focused almost exclusively on improving the firm's internal operations. Although internal efficiency and effectiveness are important objectives, the authors argue that business network redesign--reconceptualizing the role of the firm and its key business processes in the larger business network--is of greater strategic importance. To support their argument, they analyze the evolution of Baxter's ASAP system, one of the most publicized but inadequately understood strategic information systems of the 1980s. They conclude by examining whether ASAP's early successes have positioned the firm well for the changing hospital supplies marketplace of the 1990s.

  15. Impact of peculiar features of construction of transport infrastructure on the choice of tools for reengineering of business processes

    NASA Astrophysics Data System (ADS)

    Khripko, Elena

    2017-10-01

    In the present article we study the issues of organizational resistance to the reengineering of business processes in the construction of transport infrastructure. Reengineering in a company of the transport sector is, first and foremost, an innovative component of business strategy. We analyze the choice of forward and reverse reengineering tools and the terms of their application in connection with organizational resistance. Reengineering is defined taking into account four aspects: fundamentality, radicality, abruptness, and business process. We describe the stages of reengineering and analyze the key requirements for newly created business processes.

  16. The RiverFish Approach to Business Process Modeling: Linking Business Steps to Control-Flow Patterns

    NASA Astrophysics Data System (ADS)

    Zuliane, Devanir; Oikawa, Marcio K.; Malkowski, Simon; Alcazar, José Perez; Ferreira, João Eduardo

    Despite the recent advances in the area of Business Process Management (BPM), today’s business processes have largely been implemented without clearly defined conceptual modeling. This results in growing difficulties for identification, maintenance, and reuse of rules, processes, and control-flow patterns. To mitigate these problems in future implementations, we propose a new approach to business process modeling using conceptual schemas, which represent hierarchies of concepts for rules and processes shared among collaborating information systems. This methodology bridges the gap between conceptual model description and identification of actual control-flow patterns for workflow implementation. We identify modeling guidelines that are characterized by clear phase separation, step-by-step execution, and process building through diagrams and tables. The separation of business process modeling in seven mutually exclusive phases clearly delimits information technology from business expertise. The sequential execution of these phases leads to the step-by-step creation of complex control-flow graphs. The process model is refined through intuitive table and diagram generation in each phase. Not only does the rigorous application of our modeling framework minimize the impact of rule and process changes, but it also facilitates the identification and maintenance of control-flow patterns in BPM-based information system architectures.

  17. An Information System Development Method Combining Business Process Modeling with Executable Modeling and its Evaluation by Prototyping

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Hirabayashi, Syuichi; Suzuki, Ryo; Mitsui, Hiroyasu; Koizumi, Hisao

    Business in the enterprise is so closely related to the information system that business activities are difficult without it. A system design technique that takes the business process into account and enables quick system development is therefore requested. In addition, the demands on development cost are also more severe than before. To cope with this situation, the modeling technology named BPM (Business Process Management/Modeling) is drawing attention and becoming important as a key technology. BPM is a technology to model business activities as business processes and visualize them to improve business efficiency. However, a general methodology to develop an information system using the analysis results of BPM does not exist, and only a few development cases have been reported. This paper proposes an information system development method combining business process modeling with executable modeling. We describe a guideline to support consistency of development and development efficiency, and a framework enabling development of the information system from the model. We have prototyped the information system with the proposed method, and our experience has shown that the methodology is valuable.
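
    As a hedged illustration of developing the system "from the model", the sketch below interprets a small declarative process description directly, so that editing the model changes the executed behaviour without touching the interpreter. The task names, handlers, and model format are invented for the example and are not the method's notation.

      # Tiny interpreter that executes a declarative business-process model.
      def receive_order(ctx):  ctx["order_id"] = 42
      def check_stock(ctx):    ctx["in_stock"] = True
      def ship_goods(ctx):     ctx["shipped"] = ctx.get("in_stock", False)

      HANDLERS = {"Receive order": receive_order,
                  "Check stock": check_stock,
                  "Ship goods": ship_goods}

      process_model = ["Receive order", "Check stock", "Ship goods"]   # the "model"

      def run(model, handlers):
          ctx = {}
          for task in model:
              handlers[task](ctx)      # each model task dispatches to its handler
          return ctx

      print(run(process_model, HANDLERS))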

  18. Process-orientated psychoanalytic work in initial interviews and the importance of the opening scene.

    PubMed

    Wegner, Peter

    2014-06-01

    From the very first moment of the initial interview to the end of a long course of psychoanalysis, the unconscious exchange between analysand and analyst, and the analysis of the relationship between transference and countertransference, are at the heart of psychoanalytic work. Drawing on initial interviews with a psychosomatically and depressively ill student, a psychoanalytic understanding of initial encounters is worked out. The opening scene of the first interview already condenses the central psychopathology - a clinging to the primary object because it was never securely experienced as present by the patient. The author outlines the development of some psychoanalytic theories concerning the initial interview and demonstrates their specific importance as background knowledge for the clinical situation in the following domains: the 'diagnostic position', the 'therapeutic position', the 'opening scene', the 'countertransference' and the 'analyst's free-floating introspectiveness'. More recent investigations refer to 'process qualities' of the analytic relationship, such as 'synchronization' and 'self-efficacy'. The latter seeks to describe how much time passes between the interview sessions before constructive or destructive inner processes gain ground in the patient, and what significance this may have for the decision about the treatment that follows. All these factors combined can lead to establishing a differential process-orientated indication that also takes account of the fact that being confronted with the fear of unconscious processes of exchange is specific to the psychoanalytic profession. Copyright © 2014 Institute of Psychoanalysis.

  19. A literature review on business process modelling: new frontiers of reusability

    NASA Astrophysics Data System (ADS)

    Aldin, Laden; de Cesare, Sergio

    2011-08-01

    Business process modelling (BPM) has become fundamental for modern enterprises due to the increasing rate of organisational change. As a consequence, business processes need to be continuously (re-)designed as well as subsequently aligned with the corresponding enterprise information systems. One major problem associated with the design of business processes is reusability. Reuse of business process models has the potential of increasing the efficiency and effectiveness of BPM. This article critically surveys the existing literature on the problem of BPM reusability and more specifically on the state-of-the-art research that can provide or suggest the 'elements' required for the development of a methodology aimed at discovering reusable conceptual artefacts in the form of patterns. The article initially clarifies the definitions of business process and business process model; then, it sets out to explore the previous research conducted in areas that have an impact on reusability in BPM. The article concludes by distilling directions for future research towards the development of a patterns-based approach to BPM; an approach that brings together the contributions made by the research community in the areas of process mining and discovery, declarative approaches and ontologies.

  20. 29 CFR 570.130 - Employment of certain youth inside and outside of places of business that use power-driven...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... business that use power-driven machinery to process wood products. 570.130 Section 570.130 Labor... youth inside and outside of places of business that use power-driven machinery to process wood products... business that use power-driven machinery to process wood products. The provisions of this exemption are...
