Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-01
.... Securities Offering. Series 86 Research Analyst--Analysis..... From $160 to $175. Series 87 Research Analyst... Order Processing Assistant Representatives, Research Analysts and Operations Professionals, respectively... examination.\7\ \6\ PROCTOR is a computer system that is specifically designed for the administration and...
Characteristics of the Navy Laboratory Warfare Center Technical Workforce
2013-09-29
Mathematics and Information Science (M&IS) Actuarial Science 1510 Computer Science 1550 Gen. Math & Statistics 1501 Mathematics 1520 Operations...Admin. Network Systems & Data Communication Analysts Actuaries Mathematicians Operations Research Analyst Statisticians Social Science (SS...workforce was sub-divided into six broad occupational groups: Life Science, Physical Science, Engineering, Mathematics, Computer Science and Information
Pike, William A; Riensche, Roderick M; Best, Daniel M; Roberts, Ian E; Whyatt, Marie V; Hart, Michelle L; Carr, Norman J; Thomas, James J
2012-09-18
Systems and computer-implemented processes for storage and management of information artifacts collected by information analysts using a computing device. The processes and systems can capture a sequence of interactive operation elements that are performed by the information analyst, who is collecting an information artifact from at least one of the plurality of software applications. The information artifact can then be stored together with the interactive operation elements as a snippet on a memory device, which is operably connected to the processor. The snippet comprises a view from an analysis application, data contained in the view, and the sequence of interactive operation elements stored as a provenance representation comprising operation element class, timestamp, and data object attributes for each interactive operation element in the sequence.
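To illustrate the structure this abstract describes, a minimal sketch follows; all names and fields are hypothetical and are not taken from the patent:

```python
# Minimal sketch (hypothetical names) of a snippet: a captured view, its data,
# and a provenance list of interactive operation elements, each carrying an
# operation class, a timestamp, and data-object attributes.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Any


@dataclass
class OperationElement:
    op_class: str                    # e.g., "select", "copy", "query"
    timestamp: datetime
    data_attributes: dict[str, Any]  # attributes of the data object acted on


@dataclass
class Snippet:
    view: str                        # identifier of the analysis-application view
    data: Any                        # data contained in the view
    provenance: list[OperationElement] = field(default_factory=list)

    def record(self, op_class: str, **attributes: Any) -> None:
        """Append one interactive operation element to the provenance trail."""
        self.provenance.append(
            OperationElement(op_class, datetime.now(), dict(attributes)))


# Example: capture an artifact together with its operation sequence.
snip = Snippet(view="timeline-view", data={"doc_id": 42, "text": "..."})
snip.record("select", source="browser", region="paragraph-3")
snip.record("copy", target="notebook")
print(len(snip.provenance), "operation elements captured")
```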
5 CFR 551.210 - Computer employees.
Code of Federal Regulations, 2012 CFR
2012-01-01
..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...
5 CFR 551.210 - Computer employees.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...
5 CFR 551.210 - Computer employees.
Code of Federal Regulations, 2013 CFR
2013-01-01
..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...
5 CFR 551.210 - Computer employees.
Code of Federal Regulations, 2014 CFR
2014-01-01
..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...
ERIC Educational Resources Information Center
Haga, Wayne; Moreno, Abel; Segall, Mark
2012-01-01
In this paper, we compare the performance of Computer Information Systems (CIS) majors on the Information Systems Analyst (ISA) Certification Exam. The impact that the form of delivery of information systems coursework may have on the exam score is studied. Using a sample that spans three years, we test for significant differences between scores…
Working conditions, visual fatigue, and mental health among systems analysts in São Paulo, Brazil
Rocha, L; Debert-Ribeiro, M
2004-01-01
Aims: To evaluate the association between working conditions and visual fatigue and mental health among systems analysts living in São Paulo, Brazil. Methods: A cross-sectional study was carried out by a multidisciplinary team. It included: ergonomic analysis of work, individual and group interviews, and 553 self-applied questionnaires in two enterprises. The comparison population numbered 136 workers in different occupations. Results: The study population mainly comprised young males. Among systems analysts, visual fatigue was associated with mental workload, inadequate equipment and workstation, low level of worker participation, being a woman, and subject's attitude of fascination by the computer. Nervousness and intellectual performance were associated with mental workload, inadequate equipment, work environment, and tools. Continuing education and leisure were protective factors. Work interfering in family life was associated with mental workload, difficulties with clients, strict deadlines, subject's attitude of fascination by the computer, and finding solutions of work problems outside work. Family support, satisfaction in life and work, and adequate work environment and tools were protective factors. Work interfering in personal life was associated with subject's attitude of fascination by the computer, strict deadlines, inadequate equipment, and high level of work participation. Satisfaction in life and work and continuing education were protective factors. The comparison population did not share common working factors with the systems analysts in the regression analysis. Conclusions: The main health effects of systems analysts' work were expressed by machine anthropomorphism, being very demanding, mental acceleration, mental absorption, and difficulty in dealing with emotions. PMID:14691269
Additional Support for the Information Systems Analyst Exam as a Valid Program Assessment Tool
ERIC Educational Resources Information Center
Carpenter, Donald A.; Snyder, Johnny; Slauson, Gayla Jo; Bridge, Morgan K.
2011-01-01
This paper presents a statistical analysis to support the notion that the Information Systems Analyst (ISA) exam can be used as a program assessment tool in addition to measuring student performance. It compares ISA exam scores earned by students in one particular Computer Information Systems program with scores earned by the same students on the…
29 CFR 541.400 - General rule for computer employees.
Code of Federal Regulations, 2011 CFR
2011-07-01
... machine operating systems; or (4) A combination of the aforementioned duties, the performance of which... systems analysts, computer programmers, software engineers or other similarly skilled workers in the... computer employees whose primary duty consists of: (1) The application of systems analysis techniques and...
29 CFR 541.400 - General rule for computer employees.
Code of Federal Regulations, 2010 CFR
2010-07-01
... machine operating systems; or (4) A combination of the aforementioned duties, the performance of which... systems analysts, computer programmers, software engineers or other similarly skilled workers in the... computer employees whose primary duty consists of: (1) The application of systems analysis techniques and...
29 CFR 541.400 - General rule for computer employees.
Code of Federal Regulations, 2013 CFR
2013-07-01
... machine operating systems; or (4) A combination of the aforementioned duties, the performance of which... systems analysts, computer programmers, software engineers or other similarly skilled workers in the... computer employees whose primary duty consists of: (1) The application of systems analysis techniques and...
29 CFR 541.400 - General rule for computer employees.
Code of Federal Regulations, 2012 CFR
2012-07-01
... machine operating systems; or (4) A combination of the aforementioned duties, the performance of which... systems analysts, computer programmers, software engineers or other similarly skilled workers in the... computer employees whose primary duty consists of: (1) The application of systems analysis techniques and...
29 CFR 541.400 - General rule for computer employees.
Code of Federal Regulations, 2014 CFR
2014-07-01
... machine operating systems; or (4) A combination of the aforementioned duties, the performance of which... systems analysts, computer programmers, software engineers or other similarly skilled workers in the... computer employees whose primary duty consists of: (1) The application of systems analysis techniques and...
Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.
Stolper, Charles D; Perer, Adam; Gotz, David
2014-12-01
As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and provide interactions to support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.
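A minimal sketch of the progressive workflow described above (illustrative only, not the Progressive Insights implementation; the event data are invented):

```python
# Minimal sketch: the analytic emits meaningful partial results per chunk, and
# the analyst can steer it toward a subspace of interest between chunks.
from collections import Counter
from typing import Iterable, Iterator


def progressive_pattern_counts(sequences: Iterable[list[str]],
                               chunk_size: int = 100) -> Iterator[Counter]:
    """Yield running counts of adjacent event pairs after every chunk of sequences."""
    counts: Counter = Counter()
    for i, seq in enumerate(sequences, start=1):
        counts.update(zip(seq, seq[1:]))   # adjacent event pairs
        if i % chunk_size == 0:
            yield counts.copy()            # partial result for the visualization
    yield counts                           # final result


# Example: the analyst prioritizes a subspace by restricting the stream to
# sequences that start with an event of interest ("admit").
data = [["admit", "lab", "discharge"], ["admit", "imaging", "lab"]] * 250
for partial in progressive_pattern_counts(s for s in data if s[0] == "admit"):
    print("partial top pairs:", partial.most_common(2))
```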
49 CFR 1245.5 - Classification of job titles.
Code of Federal Regulations, 2013 CFR
2013-10-01
..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...
49 CFR 1245.5 - Classification of job titles.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...
49 CFR 1245.5 - Classification of job titles.
Code of Federal Regulations, 2011 CFR
2011-10-01
..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...
49 CFR 1245.5 - Classification of job titles.
Code of Federal Regulations, 2014 CFR
2014-10-01
..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...
49 CFR 1245.5 - Classification of job titles.
Code of Federal Regulations, 2012 CFR
2012-10-01
..., Computer Programmer, Computer Analyst, Market Analyst, Pricing Analyst, Employment Supervisor, Research..., Traveling Auditors or Accountants Title is descriptive Traveling Auditor, Accounting Specialist Auditors... 21; adds new titles. 207 Supervising and Chief Claim Agents Title is descriptive Chief Claim Agent...
MAGIC Computer Simulation. Volume 2: Analyst Manual, Part 1
1971-05-01
A review of the subject MAGIC Computer Simulation User and Analyst Manuals has been conducted based upon a request received from the US Army. The MAGIC computer simulation generates target description data consisting of item-by-item listings of the target's components and air...
Collaborative human-machine analysis using a controlled natural language
NASA Astrophysics Data System (ADS)
Mott, David H.; Shemanski, Donald R.; Giammanco, Cheryl; Braines, Dave
2015-05-01
A key aspect of an analyst's task in providing relevant information from data is reasoning about the implications of that data in order to build a picture of the real-world situation. This requires human cognition, based upon domain knowledge about individuals, events and environmental conditions. For a computer system to collaborate with an analyst, it must be capable of following a similar reasoning process to that of the analyst. We describe ITA Controlled English (CE), a subset of English for representing an analyst's domain knowledge and reasoning in a form that is understandable by both analyst and machine. CE can be used to express domain rules, background data, assumptions and inferred conclusions, thus supporting human-machine interaction. A CE reasoning and modeling system can perform inferences from the data and provide the user with conclusions together with their rationale. We present a logical problem called the "Analysis Game", used for training analysts, which presents "analytic pitfalls" inherent in many problems. We explore an iterative approach to its representation in CE, where a person can develop an understanding of the problem solution by incremental construction of relevant concepts and rules. We discuss how such interactions might occur, and propose that such techniques could lead to better collaborative tools to assist the analyst and avoid the "pitfalls".
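The rule-plus-rationale behavior described here can be illustrated with a minimal forward-chaining sketch (plain Python, not the ITA CE engine; the facts and rule are invented):

```python
# Minimal sketch: apply a rule to simple facts and report each derived
# conclusion together with the rule and variable bindings that produced it.
facts = {("member_of", "p1", "groupA"), ("located_in", "groupA", "regionX")}

# Each rule: (name, premise templates, conclusion template); variables start with "?".
rules = [
    ("presence", [("member_of", "?p", "?g"), ("located_in", "?g", "?r")],
     ("possibly_located_in", "?p", "?r")),
]


def match(template, fact, bindings):
    """Unify one premise template with one fact; return extended bindings or None."""
    if len(template) != len(fact):
        return None
    b = dict(bindings)
    for t, f in zip(template, fact):
        if t.startswith("?"):
            if b.get(t, f) != f:
                return None
            b[t] = f
        elif t != f:
            return None
    return b


def infer(facts, rules):
    derived = []
    for name, premises, conclusion in rules:
        stack = [({}, premises)]
        while stack:
            bindings, remaining = stack.pop()
            if not remaining:
                concl = tuple(bindings.get(t, t) for t in conclusion)
                derived.append((concl, name, dict(bindings)))
                continue
            for fact in facts:
                b = match(remaining[0], fact, bindings)
                if b is not None:
                    stack.append((b, remaining[1:]))
    return derived


for conclusion, rule_name, rationale in infer(facts, rules):
    print(conclusion, "because rule", rule_name, "with", rationale)
```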
Neurotechnology for intelligence analysts
NASA Astrophysics Data System (ADS)
Kruse, Amy A.; Boyd, Karen C.; Schulman, Joshua J.
2006-05-01
Geospatial Intelligence Analysts are currently faced with an enormous volume of imagery, only a fraction of which can be processed or reviewed in a timely operational manner. Computer-based target detection efforts have failed to yield the speed, flexibility and accuracy of the human visual system. Rather than focus solely on artificial systems, we hypothesize that the human visual system is still the best target detection apparatus currently in use, and with the addition of neuroscience-based measurement capabilities it can surpass the throughput of the unaided human severalfold. Using electroencephalography (EEG), Thorpe et al.1 described a fast signal in the brain associated with the early detection of targets in static imagery using a Rapid Serial Visual Presentation (RSVP) paradigm. This finding suggests that it may be possible to extract target detection signals from complex imagery in real time utilizing non-invasive neurophysiological assessment tools. To transform this phenomenon into a capability for defense applications, the Defense Advanced Research Projects Agency (DARPA) currently is sponsoring an effort titled Neurotechnology for Intelligence Analysts (NIA). The vision of the NIA program is to revolutionize the way that analysts handle intelligence imagery, increasing both the throughput of imagery to the analyst and overall accuracy of the assessments. Successful development of a neurobiologically-based image triage system will enable image analysts to train more effectively and process imagery with greater speed and precision.
2016-11-01
Evaluation of Visualization Tools for Computer Network Defense Analysts: Display Design, Methods, and Results for a User Study, by Christopher J Garneau and Robert F Erbacher, US Army Research Laboratory, November 2016 (approved for public release); reporting period January 2013–September 2015.
An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.
Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V
2014-07-01
We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely coupled service-oriented architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus, an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.
An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology
Deodhar, Suruchi; Bisset, Keith R.; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V.
2014-01-01
We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely coupled service-oriented architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus, an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity. PMID:25530914
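A minimal sketch of the interactive start/pause/intervene/roll-back loop emphasized in both records above (a toy stand-in, not the described environment's service API):

```python
# Minimal sketch: a toy SIR-like counter model stands in for the
# individual-based model; the analyst runs it, pauses to assess state, applies
# a dynamic intervention, or rolls back to a checkpointed day.
import copy


class EpidemicSim:
    def __init__(self, infected=10, susceptible=10000, beta=0.3):
        self.day = 0
        self.state = {"S": susceptible, "I": infected, "R": 0}
        self.beta = beta
        self._checkpoints = {0: copy.deepcopy(self.state)}

    def step(self):
        new_inf = min(self.state["S"], int(self.beta * self.state["I"]))
        recovered = int(0.1 * self.state["I"])
        self.state["S"] -= new_inf
        self.state["I"] += new_inf - recovered
        self.state["R"] += recovered
        self.day += 1
        self._checkpoints[self.day] = copy.deepcopy(self.state)

    def intervene(self, new_beta):
        """Dynamic intervention formulated from the current state (e.g., distancing)."""
        self.beta = new_beta

    def rollback(self, day):
        self.state = copy.deepcopy(self._checkpoints[day])
        self.day = day


sim = EpidemicSim()
for _ in range(10):              # run, then "pause" to assess the state
    sim.step()
print("day", sim.day, sim.state)
if sim.state["I"] > 100:         # steer the experiment based on the state
    sim.intervene(new_beta=0.1)
sim.rollback(5)                  # or roll back and replay an alternative scenario
print("rolled back to day", sim.day, sim.state)
```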
Employment Trends in Computer Occupations. Bulletin 2101.
ERIC Educational Resources Information Center
Howard, H. Philip; Rothstein, Debra E.
In 1980, 1,455,000 persons worked in computer occupations. Two in five were systems analysts or programmers; one in five was a keypunch operator; one in 20 was a computer service technician; and more than one in three were computer and peripheral equipment operators. Employment was concentrated in major urban centers in four major industry…
ERIC Educational Resources Information Center
Strober, Myra H.; Arnold, Carolyn L.
This discussion of the impact of new computer occupations on women's employment patterns is divided into four major sections. The first section describes the six computer-related occupations to be analyzed: (1) engineers; (2) computer scientists and systems analysts; (3) programmers; (4) electronic technicians; (5) computer operators; and (6) data…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pike, Bill
Data—lots of data—generated in seconds and piling up on the internet, streaming and stored in countless databases. Big data is important for commerce, society and our nation’s security. Yet the volume, velocity, variety and veracity of data is simply too great for any single analyst to make sense of alone. It requires advanced, data-intensive computing. Simply put, data-intensive computing is the use of sophisticated computers to sort through mounds of information and present analysts with solutions in the form of graphics, scenarios, formulas, new hypotheses and more. This scientific capability is foundational to PNNL’s energy, environment and security missions. Senior Scientist and Division Director Bill Pike and his team are developing analytic tools that are used to solve important national challenges, including cyber systems defense, power grid control systems, intelligence analysis, climate change and scientific exploration.
Coordinated Displays to Assist Cyber Defenders
2016-09-23
suspicious activity, such as the occurrence of a network event that is similar to a known attack signature, the system generates an alert which is then...presented to a human computer network defense analyst, or more succinctly, a network analyst, who must evaluate the veracity of that alert. To...display and select an alert to investigate further. Though alerts generally include some information about the nature of a potential threat, the
The Outlook for Computer Professions: 1985 Rewrites the Program.
ERIC Educational Resources Information Center
Drake, Larry
1986-01-01
The author states that graduates of junior college programs who learn COBOL will continue to find jobs, but employers will increasingly seek college graduates when filling positions for computer programmers and systems analysts. Areas of growth for computer applications (services, military, data communications, and artificial intelligence) are…
ERIC Educational Resources Information Center
Sargent, John
The Office of Technology Policy analyzed Bureau of Labor Statistics' growth projections for the core occupational classifications of IT (information technology) workers to assess future demand in the United States. Classifications studied were computer engineers, systems analysts, computer programmers, database administrators, computer support…
29 CFR 541.0 - Introductory statement.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR REGULATIONS DEFINING AND... secondary schools), or in the capacity of an outside sales employee, as such terms are defined and delimited... requirements for computer systems analysts, computer programmers, software engineers, and other similarly...
29 CFR 541.0 - Introductory statement.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR REGULATIONS DEFINING AND... secondary schools), or in the capacity of an outside sales employee, as such terms are defined and delimited... requirements for computer systems analysts, computer programmers, software engineers, and other similarly...
29 CFR 541.0 - Introductory statement.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR REGULATIONS DEFINING AND... secondary schools), or in the capacity of an outside sales employee, as such terms are defined and delimited... requirements for computer systems analysts, computer programmers, software engineers, and other similarly...
29 CFR 541.0 - Introductory statement.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Regulations Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR REGULATIONS DEFINING AND... secondary schools), or in the capacity of an outside sales employee, as such terms are defined and delimited... requirements for computer systems analysts, computer programmers, software engineers, and other similarly...
One decade of the Data Fusion Information Group (DFIG) model
NASA Astrophysics Data System (ADS)
Blasch, Erik
2015-05-01
The revision of the Joint Directors of Laboratories (JDL) Information Fusion model in 2004 discussed information processing, incorporated the analyst, and was coined the Data Fusion Information Group (DFIG) model. Since that time, developments in information technology (e.g., cloud computing, applications, and multimedia) have altered the role of the analyst. Data production has outpaced the analyst; however, the analyst still has the role of data refinement and information reporting. In this paper, we highlight three examples being addressed by the DFIG model. One example is the role of the analyst to provide semantic queries (through an ontology) so that the vast amount of data available can be indexed, accessed, retrieved, and processed. The second is reporting, which requires the analyst to collect the data into a condensed and meaningful form through information management. The last example is the interpretation of the resolved information from data, which must include contextual information not inherent in the data itself. Through a literature review, the DFIG developments in the last decade demonstrate the usability of the DFIG model to bring together the user (analyst or operator) and the machine (information fusion or manager) in a systems design.
Collaborative interactive visualization: exploratory concept
NASA Astrophysics Data System (ADS)
Mokhtari, Marielle; Lavigne, Valérie; Drolet, Frédéric
2015-05-01
Dealing with an ever-increasing amount of data is a challenge that military intelligence analysts, individually or in teams, face day to day. Increased individual and collective comprehension comes through collaboration between people: the better the collaboration, the better the comprehension. Nowadays, various technologies support and enhance collaboration by allowing people to connect and collaborate in settings as varied as mobile devices, networked computers, display walls, and tabletop surfaces, to name just a few. A powerful collaboration system includes traditional and multimodal visualization features to achieve effective human communication. Interactive visualization strengthens collaboration because this approach is conducive to incrementally building a mental assessment of the data meaning. The purpose of this paper is to present an overview of the envisioned collaboration architecture and the interactive visualization concepts underlying the Sensemaking Support System prototype developed to support analysts in the context of the Joint Intelligence Collection and Analysis Capability project at DRDC Valcartier. It presents the current version of the architecture, discusses future capabilities to help analysts in the accomplishment of their tasks, and finally recommends collaboration and visualization technologies that allow analysts to go a step further, both as individuals and as a team.
IMAT (Integrated Multidisciplinary Analysis Tool) user's guide for the VAX/VMS computer
NASA Technical Reports Server (NTRS)
Meissner, Frances T. (Editor)
1988-01-01
The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system for the VAX/VMS computer developed at the Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. Center on Education and Training for Employment.
This publication contains 25 subjects appropriate for use in a competency list for the occupation of computer programmer/analyst, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 25 units are as…
MetaboAnalyst 3.0--making metabolomics more meaningful.
Xia, Jianguo; Sinelnikov, Igor V; Han, Beomsoo; Wishart, David S
2015-07-01
MetaboAnalyst (www.metaboanalyst.ca) is a web server designed to permit comprehensive metabolomic data analysis, visualization and interpretation. It supports a wide range of complex statistical calculations and high quality graphical rendering functions that require significant computational resources. First introduced in 2009, MetaboAnalyst has experienced more than a 50X growth in user traffic (>50 000 jobs processed each month). In order to keep up with the rapidly increasing computational demands and a growing number of requests to support translational and systems biology applications, we performed a substantial rewrite and major feature upgrade of the server. The result is MetaboAnalyst 3.0. By completely re-implementing the MetaboAnalyst suite using the latest web framework technologies, we have been able to substantially improve its performance, capacity and user interactivity. Three new modules have also been added, including: (i) a module for biomarker analysis based on the calculation of receiver operating characteristic curves; (ii) a module for sample size estimation and power analysis for improved planning of metabolomics studies; and (iii) a module to support integrative pathway analysis for both genes and metabolites. In addition, popular features found in existing modules have been significantly enhanced by upgrading the graphical output, expanding the compound libraries and by adding support for more diverse organisms. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
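The new biomarker module is built on ROC analysis; the calculation it performs for a single candidate marker can be sketched generically (this is not MetaboAnalyst code, and the data are simulated):

```python
# Minimal ROC sketch for one candidate biomarker, using simulated
# concentrations for controls (label 0) and cases (label 1).
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
labels = np.r_[np.zeros(40), np.ones(40)]
concentration = np.r_[rng.normal(1.0, 0.3, 40), rng.normal(1.6, 0.4, 40)]

fpr, tpr, thresholds = roc_curve(labels, concentration)
auc = roc_auc_score(labels, concentration)
print(len(thresholds), "operating points on the ROC curve")
print(f"AUC = {auc:.3f}")   # area under the ROC curve for this marker
```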
Some Observations on the Current Status of Performing Finite Element Analyses
NASA Technical Reports Server (NTRS)
Raju, Ivatury S.; Knight, Norman F., Jr; Shivakumar, Kunigal N.
2015-01-01
Aerospace structures are complex high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analysis rapidly. Many of the early-career engineers of today are very proficient in the usage of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient in building complex 3D models of complicated aerospace components. However, current trends demonstrate blind acceptance of finite element analysis results. This paper is aimed at raising awareness of this situation. Examples of common encounters are presented. To overcome the current trends, some guidelines and suggestions for analysts, senior engineers, and educators are offered.
Computer-Aided Group Problem Solving for Unified Life Cycle Engineering (ULCE)
1989-02-01
defining the problem, generating alternative solutions, evaluating alternatives, selecting alternatives, and implementing the solution. Systems...specialist in group dynamics, assists the group in formulating the problem and selecting a model framework. The analyst provides the group with computer...allocating resources, evaluating and selecting options, making judgments explicit, and analyzing dynamic systems. c. University of Rhode Island Drs. Geoffery
Proactive human-computer collaboration for information discovery
NASA Astrophysics Data System (ADS)
DiBona, Phil; Shilliday, Andrew; Barry, Kevin
2016-05-01
Lockheed Martin Advanced Technology Laboratories (LM ATL) is researching methods, representations, and processes for human/autonomy collaboration to scale analysis and hypothesis substantiation for intelligence analysts. This research establishes a machine-readable hypothesis representation that is commonsensical to the human analyst. The representation unifies context between the human and the computer, enabling autonomy, in the form of analytic software, to support the analyst by proactively acquiring, assessing, and organizing high-value information that is needed to inform and substantiate hypotheses.
NASA Astrophysics Data System (ADS)
Davenport, Jack H.
2016-05-01
Intelligence analysts demand rapid information fusion capabilities to develop and maintain accurate situational awareness and understanding of dynamic enemy threats in asymmetric military operations. The ability to extract relationships between people, groups, and locations from a variety of text datasets is critical to proactive decision making. The derived network of entities must be automatically created and presented to analysts to assist in decision making. DECISIVE ANALYTICS Corporation (DAC) provides capabilities to automatically extract entities, relationships between entities, semantic concepts about entities, and network models of entities from text and multi-source datasets. DAC's Natural Language Processing (NLP) Entity Analytics model entities as complex systems of attributes and interrelationships which are extracted from unstructured text via NLP algorithms. The extracted entities are automatically disambiguated via machine learning algorithms, and resolution recommendations are presented to the analyst for validation; the analyst's expertise is leveraged in this hybrid human/computer collaborative model. Military capability is enhanced by these NLP Entity Analytics because analysts can now create/update an entity profile with intelligence automatically extracted from unstructured text, thereby fusing entity knowledge from structured and unstructured data sources. Operational and sustainment costs are reduced since analysts do not have to manually tag and resolve entities.
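A minimal sketch of the entity-resolution recommendation step (hypothetical similarity-based grouping, not DAC's machine learning algorithms):

```python
# Minimal sketch: group name mentions by string similarity and emit merge
# recommendations that an analyst can confirm or reject.
from difflib import SequenceMatcher


def similar(a: str, b: str, threshold: float = 0.8) -> bool:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold


def resolve(mentions):
    """Greedily group mentions; each group is a candidate resolved entity."""
    groups: list[list[str]] = []
    for m in mentions:
        for g in groups:
            if similar(m, g[0]):
                g.append(m)
                break
        else:
            groups.append([m])
    return groups


mentions = ["John Smith", "Jon Smith", "J0hn Smith", "Jane Doe"]
for group in resolve(mentions):
    print("recommend merging:", group)
```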
NASA Astrophysics Data System (ADS)
Rimland, Jeffrey; McNeese, Michael; Hall, David
2013-05-01
Although the capability of computer-based artificial intelligence techniques for decision-making and situational awareness has seen notable improvement over the last several decades, the current state of the art still falls short of creating computer systems capable of autonomously making complex decisions and judgments in many domains where data is nuanced and accountability is high. However, there is a great deal of potential for hybrid systems in which software applications augment human capabilities by focusing the analyst's attention on relevant information elements based on both a priori knowledge of the analyst's goals and the processing/correlation of a series of data streams too numerous and heterogeneous for the analyst to digest without assistance. Researchers at Penn State University are exploring ways in which an information framework influenced by Klein's Recognition-Primed Decision (RPD) model, Endsley's model of situational awareness, and the Joint Directors of Laboratories (JDL) data fusion process model can be implemented through a novel combination of Complex Event Processing (CEP) and Multi-Agent Software (MAS). Though originally designed for stock market and financial applications, the high-performance, data-driven nature of CEP techniques provides a natural complement to the proven capabilities of MAS systems for modeling naturalistic decision-making, performing process adjudication, and optimizing networked processing and cognition via the use of "mobile agents." This paper addresses the challenges and opportunities of such a framework for augmenting human observational capability as well as enabling collaborative context-aware reasoning in both human teams and hybrid human/software agent teams.
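A minimal sketch of the complex event processing component mentioned above (a hypothetical rule and data, not the Penn State framework):

```python
# Minimal CEP-style rule: flag event "A" followed by event "B" from the same
# source within a time window -- the kind of composite event that would be
# routed to a software agent or surfaced to the analyst.
from collections import defaultdict

WINDOW = 30.0  # seconds


def detect(stream):
    """stream: iterable of (timestamp, source, event_type); yields composite events."""
    last_a = defaultdict(lambda: None)
    for ts, source, event in stream:
        if event == "A":
            last_a[source] = ts
        elif event == "B" and last_a[source] is not None and ts - last_a[source] <= WINDOW:
            yield (source, last_a[source], ts)


events = [(0.0, "s1", "A"), (5.0, "s2", "B"), (12.0, "s1", "B"), (100.0, "s1", "B")]
for source, t_a, t_b in detect(events):
    print(f"composite event on {source}: A at {t_a}s followed by B at {t_b}s")
```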
Bring It to the Pitch: Combining Video and Movement Data to Enhance Team Sport Analysis.
Stein, Manuel; Janetzko, Halldór; Lamprecht, Andreas; Breitkreutz, Thorsten; Zimmermann, Philipp; Goldlücke, Bastian; Schreck, Tobias; Andrienko, Gennady; Grossniklaus, Michael; Keim, Daniel A
2018-01-01
Analysts in professional team sport regularly perform analysis to gain strategic and tactical insights into player and team behavior. Goals of team sport analysis regularly include identification of weaknesses of opposing teams, or assessing performance and improvement potential of a coached team. Current analysis workflows are typically based on the analysis of team videos. Also, analysts can rely on techniques from Information Visualization, to depict e.g., player or ball trajectories. However, video analysis is typically a time-consuming process, where the analyst needs to memorize and annotate scenes. In contrast, visualization typically relies on an abstract data model, often using abstract visual mappings, and is not directly linked to the observed movement context anymore. We propose a visual analytics system that tightly integrates team sport video recordings with abstract visualization of underlying trajectory data. We apply appropriate computer vision techniques to extract trajectory data from video input. Furthermore, we apply advanced trajectory and movement analysis techniques to derive relevant team sport analytic measures for region, event and player analysis in the case of soccer analysis. Our system seamlessly integrates video and visualization modalities, enabling analysts to draw on the advantages of both analysis forms. Several expert studies conducted with team sport analysts indicate the effectiveness of our integrated approach.
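Once trajectories are extracted from video, simple movement measures can be derived from them; a minimal generic sketch (not the authors' system; the coordinates are invented):

```python
# Minimal sketch: distance covered and average speed from a sequence of pitch
# coordinates sampled at a fixed video frame rate.
import math


def distance_and_speed(trajectory, fps=25.0):
    """trajectory: list of (x, y) positions in metres, one per video frame."""
    dist = sum(math.dist(a, b) for a, b in zip(trajectory, trajectory[1:]))
    duration = (len(trajectory) - 1) / fps
    return dist, dist / duration if duration > 0 else 0.0


# Hypothetical trajectory of one player over four frames.
player = [(10.0, 5.0), (10.4, 5.1), (10.9, 5.3), (11.5, 5.6)]
total_m, speed_mps = distance_and_speed(player)
print(f"distance {total_m:.2f} m, mean speed {speed_mps:.2f} m/s")
```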
User's guide to the Reliability Estimation System Testbed (REST)
NASA Technical Reports Server (NTRS)
Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam
1992-01-01
The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
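The modularization idea, computing module reliabilities separately and then combining them for the total system, can be sketched for the simplest case of independent modules (illustrative only, not RML/REST):

```python
# Minimal sketch: combine module reliabilities for series and parallel
# (redundant) arrangements of independent modules.
def series(*reliabilities):
    """System works only if every module works."""
    r = 1.0
    for x in reliabilities:
        r *= x
    return r


def parallel(*reliabilities):
    """System works if at least one redundant module works."""
    q = 1.0
    for x in reliabilities:
        q *= (1.0 - x)
    return 1.0 - q


# Hypothetical system: two redundant processors in series with one bus module.
r_processor, r_bus = 0.995, 0.9990
print(f"system reliability = {series(parallel(r_processor, r_processor), r_bus):.6f}")
```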
SAMSAN- MODERN NUMERICAL METHODS FOR CLASSICAL SAMPLED SYSTEM ANALYSIS
NASA Technical Reports Server (NTRS)
Frisch, H. P.
1994-01-01
SAMSAN was developed to aid the control system analyst by providing a self-consistent set of computer algorithms that support large order control system design and evaluation studies, with an emphasis placed on sampled system analysis. Control system analysts have access to a vast array of published algorithms to solve an equally large spectrum of controls related computational problems. The analyst usually spends considerable time and effort bringing these published algorithms to an integrated operational status and often finds them less general than desired. SAMSAN reduces the burden on the analyst by providing a set of algorithms that have been well tested and documented, and that can be readily integrated for solving control system problems. Algorithm selection for SAMSAN has been biased toward numerical accuracy for large order systems with computational speed and portability being considered important but not paramount. In addition to containing relevant subroutines from EISPACK for eigen-analysis and from LINPACK for the solution of linear systems and related problems, SAMSAN contains the following not so generally available capabilities: 1) Reduction of a real non-symmetric matrix to block diagonal form via a real similarity transformation matrix which is well conditioned with respect to inversion, 2) Solution of the generalized eigenvalue problem with balancing and grading, 3) Computation of all zeros of the determinant of a matrix of polynomials, 4) Matrix exponentiation and the evaluation of integrals involving the matrix exponential, with option to first block diagonalize, 5) Root locus and frequency response for single variable transfer functions in the S, Z, and W domains, 6) Several methods of computing zeros for linear systems, and 7) The ability to generate documentation "on demand". All matrix operations in the SAMSAN algorithms assume non-symmetric matrices with real double precision elements. There is no fixed size limit on any matrix in any SAMSAN algorithm; however, it is generally agreed by experienced users, and in the numerical error analysis literature, that computation with non-symmetric matrices of order greater than about 200 should be avoided or treated with extreme care. SAMSAN attempts to support the needs of application oriented analysis by providing: 1) a methodology with unlimited growth potential, 2) a methodology to insure that associated documentation is current and available "on demand", 3) a foundation of basic computational algorithms that most controls analysis procedures are based upon, 4) a set of check out and evaluation programs which demonstrate usage of the algorithms on a series of problems which are structured to expose the limits of each algorithm's applicability, and 5) capabilities which support both a priori and a posteriori error analysis for the computational algorithms provided. The SAMSAN algorithms are coded in FORTRAN 77 for batch or interactive execution and have been implemented on a DEC VAX computer under VMS 4.7. An effort was made to assure that the FORTRAN source code was portable and thus SAMSAN may be adaptable to other machine environments. The documentation is included on the distribution tape or can be purchased separately at the price below. SAMSAN version 2.0 was developed in 1982 and updated to version 3.0 in 1988.
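Two of the listed capabilities, eigen-analysis of a non-symmetric state matrix and matrix exponentiation for sampled systems, can be sketched with modern numerical libraries (SAMSAN itself is FORTRAN 77; this is an illustrative sketch with an invented state matrix, not SAMSAN code):

```python
# Minimal sketch: eigenvalues of a non-symmetric continuous-time state matrix
# and the matrix exponential used to discretize a sampled system,
# x[k+1] = exp(A*T) x[k].
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])   # hypothetical continuous-time state matrix
T = 0.1                        # sample period

eigenvalues, _ = np.linalg.eig(A)
Phi = expm(A * T)              # discrete-time state-transition matrix

print("eigenvalues:", eigenvalues)
print("Phi =\n", Phi)
```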
Uncertainty Quantification Techniques of SCALE/TSUNAMI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Mueller, Don
2011-01-01
The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k-eff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for a gap in the validation data or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.
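The propagation described here follows the "sandwich rule", relative variance of the response = S^T C S, with S the relative sensitivity coefficients and C the relative covariance matrix of the nuclear data; a minimal numerical sketch with invented values (not SCALE/TSUNAMI code):

```python
# Minimal sketch: propagate cross-section covariance data through sensitivity
# coefficients to an uncertainty in a computed response such as k-eff.
import numpy as np

# Hypothetical relative sensitivities of k-eff to three cross sections.
S = np.array([0.35, -0.12, 0.08])

# Hypothetical relative covariance matrix (uncertainties and correlations).
C = np.array([[2.5e-4, 1.0e-5, 0.0],
              [1.0e-5, 4.0e-4, 2.0e-5],
              [0.0,    2.0e-5, 9.0e-4]])

rel_variance = S @ C @ S
print(f"relative uncertainty in k-eff: {np.sqrt(rel_variance) * 100:.3f}%")
```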
GRODY - GAMMA RAY OBSERVATORY DYNAMICS SIMULATOR IN ADA
NASA Technical Reports Server (NTRS)
Stark, M.
1994-01-01
Analysts use a dynamics simulator to test the attitude control system algorithms used by a satellite. The simulator must simulate the hardware, dynamics, and environment of the particular spacecraft and provide user services which enable the analyst to conduct experiments. Researchers at Goddard's Flight Dynamics Division developed GRODY alongside GROSS (GSC-13147), a FORTRAN simulator which performs the same functions, in a case study to assess the feasibility and effectiveness of the Ada programming language for flight dynamics software development. They used popular object-oriented design techniques to link the simulator's design with its function. GRODY is designed for analysts familiar with spacecraft attitude analysis. The program supports maneuver planning as well as analytical testing and evaluation of the attitude determination and control system used on board the Gamma Ray Observatory (GRO) satellite. GRODY simulates the GRO on-board computer and Control Processor Electronics. The analyst/user sets up and controls the simulation. GRODY allows the analyst to check and update parameter values and ground commands, obtain simulation status displays, interrupt the simulation, analyze previous runs, and obtain printed output of simulation runs. The video terminal screen display allows visibility of command sequences, full-screen display and modification of parameters using input fields, and verification of all input data. Data input available for modification includes alignment and performance parameters for all attitude hardware, simulation control parameters which determine simulation scheduling and simulator output, initial conditions, and on-board computer commands. GRODY generates eight types of output: simulation results data set, analysis report, parameter report, simulation report, status display, plots, diagnostic output (which helps the user trace any problems that have occurred during a simulation), and a permanent log of all runs and errors. The analyst can send results output in graphical or tabular form to a terminal, disk, or hardcopy device, and can choose to have any or all items plotted against time or against each other. Goddard researchers developed GRODY on a VAX 8600 running VMS version 4.0. For near real time performance, GRODY requires a VAX at least as powerful as a model 8600 running VMS 4.0 or a later version. To use GRODY, the VAX needs an Ada Compilation System (ACS), Code Management System (CMS), and 1200K memory. GRODY is written in Ada and FORTRAN.
Mander, Luke; Baker, Sarah J.; Belcher, Claire M.; Haselhorst, Derek S.; Rodriguez, Jacklyn; Thorn, Jessica L.; Tiwari, Shivangi; Urrego, Dunia H.; Wesseln, Cassandra J.; Punyasena, Surangi W.
2014-01-01
• Premise of the study: Humans frequently identify pollen grains at a taxonomic rank above species. Grass pollen is a classic case of this situation, which has led to the development of computational methods for identifying grass pollen species. This paper aims to provide context for these computational methods by quantifying the accuracy and consistency of human identification. • Methods: We measured the ability of nine human analysts to identify 12 species of grass pollen using scanning electron microscopy images. These are the same images that were used in computational identifications. We have measured the coverage, accuracy, and consistency of each analyst, and investigated their ability to recognize duplicate images. • Results: Coverage ranged from 87.5% to 100%. Mean identification accuracy ranged from 46.67% to 87.5%. The identification consistency of each analyst ranged from 32.5% to 87.5%, and each of the nine analysts produced considerably different identification schemes. The proportion of duplicate image pairs that were missed ranged from 6.25% to 58.33%. • Discussion: The identification errors made by each analyst, which result in a decline in accuracy and consistency, are likely related to psychological factors such as the limited capacity of human memory, fatigue and boredom, recency effects, and positivity bias. PMID:25202649
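The two per-analyst measures reported here, accuracy and consistency, are straightforward to compute; a minimal sketch with invented labels (not the study's data):

```python
# Minimal sketch: accuracy against the true species labels, and consistency of
# an analyst's identifications across duplicate image pairs.
def accuracy(identifications, truth):
    answered = [k for k in identifications if identifications[k] is not None]
    correct = sum(identifications[k] == truth[k] for k in answered)
    return correct / len(answered)


def consistency(identifications, duplicate_pairs):
    same = sum(identifications[a] == identifications[b] for a, b in duplicate_pairs)
    return same / len(duplicate_pairs)


truth = {"img1": "Poa annua", "img2": "Zea mays", "img3": "Poa annua"}
analyst = {"img1": "Poa annua", "img2": "Poa annua", "img3": "Poa annua"}
print("accuracy:", accuracy(analyst, truth))                      # 2/3 correct
print("consistency:", consistency(analyst, [("img1", "img3")]))   # 1.0
```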
Large-scale structural analysis: The structural analyst, the CSM Testbed and the NAS System
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Mccleary, Susan L.; Macy, Steven C.; Aminpour, Mohammad A.
1989-01-01
The Computational Structural Mechanics (CSM) activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM testbed methods development environment is presented and some numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.
BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis, Version III
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vondy, D.R.; Fowler, T.B.; Cunningham, G.W. III.
1981-06-01
This report is a condensed documentation for VERSION III of the BOLD VENTURE COMPUTATION SYSTEM for nuclear reactor core analysis. An experienced analyst should be able to use this system routinely for solving problems by referring to this document. Individual reports must be referenced for details. This report covers basic input instructions and describes recent extensions to the modules as well as to the interface data file specifications. Some application considerations are discussed and an elaborate sample problem is used as an instruction aid. Instructions for creating the system on IBM computers are also given.
Assessing the performance of regional landslide early warning models: the EDuMaP method
NASA Astrophysics Data System (ADS)
Calvello, M.; Piciullo, L.
2015-10-01
The paper proposes the evaluation of the technical performance of a regional landslide early warning system by means of an original approach, called EDuMaP method, comprising three successive steps: identification and analysis of the Events (E), i.e. landslide events and warning events derived from available landslides and warnings databases; definition and computation of a Duration Matrix (DuMa), whose elements report the time associated with the occurrence of landslide events in relation to the occurrence of warning events, in their respective classes; evaluation of the early warning model Performance (P) by means of performance criteria and indicators applied to the duration matrix. During the first step, the analyst takes into account the features of the warning model by means of ten input parameters, which are used to identify and classify landslide and warning events according to their spatial and temporal characteristics. In the second step, the analyst computes a time-based duration matrix having a number of rows and columns equal to the number of classes defined for the warning and landslide events, respectively. In the third step, the analyst computes a series of model performance indicators derived from a set of performance criteria, which need to be defined by considering, once again, the features of the warning model. The proposed method is based on a framework clearly distinguishing between local and regional landslide early warning systems as well as among correlation laws, warning models and warning systems. The applicability, potentialities and limitations of the EDuMaP method are tested and discussed using real landslides and warnings data from the municipal early warning system operating in Rio de Janeiro (Brazil).
Assessing the performance of regional landslide early warning models: the EDuMaP method
NASA Astrophysics Data System (ADS)
Calvello, M.; Piciullo, L.
2016-01-01
A schematic of the components of regional early warning systems for rainfall-induced landslides is herein proposed, based on a clear distinction between warning models and warning systems. According to this framework an early warning system comprises a warning model as well as a monitoring and warning strategy, a communication strategy and an emergency plan. The paper proposes the evaluation of regional landslide warning models by means of an original approach, called the "event, duration matrix, performance" (EDuMaP) method, comprising three successive steps: identification and analysis of the events, i.e., landslide events and warning events derived from available landslides and warnings databases; definition and computation of a duration matrix, whose elements report the time associated with the occurrence of landslide events in relation to the occurrence of warning events, in their respective classes; evaluation of the early warning model performance by means of performance criteria and indicators applied to the duration matrix. During the first step the analyst identifies and classifies the landslide and warning events, according to their spatial and temporal characteristics, by means of a number of model parameters. In the second step, the analyst computes a time-based duration matrix with a number of rows and columns equal to the number of classes defined for the warning and landslide events, respectively. In the third step, the analyst computes a series of model performance indicators derived from a set of performance criteria, which need to be defined by considering, once again, the features of the warning model. The applicability, potentialities and limitations of the EDuMaP method are tested and discussed using real landslides and warning data from the municipal early warning system operating in Rio de Janeiro (Brazil).
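A simplified sketch of the duration matrix computation follows (illustrative only; the full EDuMaP definition involves additional parameters and event classification rules):

```python
# Minimal sketch: for each time step, add its duration to the matrix cell
# indexed by the warning class issued and the class of landslide activity
# that actually occurred.
import numpy as np

warning_classes = ["no warning", "moderate", "high"]
landslide_classes = ["no landslides", "1-2 landslides", ">2 landslides"]


def landslide_class(n):
    return 0 if n == 0 else (1 if n <= 2 else 2)


# Hypothetical daily records: (warning class index, number of landslides).
days = [(0, 0), (0, 0), (1, 0), (1, 1), (2, 3), (2, 0), (0, 1)]

D = np.zeros((len(warning_classes), len(landslide_classes)))
for warning, n_landslides in days:
    D[warning, landslide_class(n_landslides)] += 1.0   # duration of one day

print(D)  # e.g., D[2, 0] is time under "high" warning with no landslides (a false alert)
```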
Ceci n'est pas une micromachine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yarberry, Victor R.; Diegert, Carl F.
2010-03-01
The image created in reflected light DIC can often be interpreted as a true three-dimensional representation of the surface geometry, provided a clear distinction can be realized between raised and lowered regions in the specimen. It may be helpful if our definition of saliency embraces work on the human visual system (HVS) as well as the more abstract work on saliency, as it is certain that understanding by humans will always stand between recording of a useful signal from all manner of sensors and so-called actionable intelligence. A DARPA/DSO program lays down this requirement in a current program (Kruse 2010): The vision for the Neurotechnology for Intelligence Analysts (NIA) Program is to revolutionize the way that analysts handle intelligence imagery, increasing both the throughput of imagery to the analyst and overall accuracy of the assessments. Current computer-based target detection capabilities cannot process vast volumes of imagery with the speed, flexibility, and precision of the human visual system.
Advanced Technology Lifecycle Analysis System (ATLAS)
NASA Technical Reports Server (NTRS)
O'Neil, Daniel A.; Mankins, John C.
2004-01-01
Developing credible mass and cost estimates for space exploration and development architectures requires multidisciplinary analysis based on physics calculations and parametric estimates derived from historical systems. Within the National Aeronautics and Space Administration (NASA), concurrent engineering environment (CEE) activities integrate discipline-oriented analysis tools through a computer network and accumulate the results of a multidisciplinary analysis team via a centralized database or spreadsheet. Each minute of a design and analysis study within a concurrent engineering environment is expensive due to the size of the team and supporting equipment. The Advanced Technology Lifecycle Analysis System (ATLAS) reduces the cost of architecture analysis by capturing the knowledge of discipline experts into system-oriented spreadsheet models. A framework with a user interface presents a library of system models to an architecture analyst. The analyst selects models of launchers, in-space transportation systems, and excursion vehicles, as well as space and surface infrastructure such as propellant depots, habitats, and solar power satellites. After assembling the architecture from the selected models, the analyst can create a campaign composed of missions spanning several years. The ATLAS controller passes analyst-specified parameters to the models and data among the models. An integrator workbook calls a history-based parametric analysis cost model to determine the costs. Also, the integrator estimates the flight rates, launched masses, and architecture benefits over the years of the campaign. An accumulator workbook presents the analytical results in a series of bar graphs. In no way does ATLAS compete with a CEE; instead, ATLAS complements a CEE by ensuring that the time of the experts is well spent. Using ATLAS, an architecture analyst can perform technology sensitivity analysis, study many scenarios, and see the impact of design decisions. When the analyst is satisfied with the system configurations, technology portfolios, and deployment strategies, he or she can present the concepts to a team, which will conduct a detailed, discipline-oriented analysis within a CEE. An analog to this approach is the music industry, where a songwriter creates the lyrics and music before entering a recording studio.
Accounting for Systems Analysts in the 21st Century
ERIC Educational Resources Information Center
Giordano, Thomas; McAleer, Brenda; Szakas, Joseph S.
2010-01-01
Computer Information System (CIS) majors are required to successfully complete an introductory accounting course. Given the current forces in the financial world, the appropriateness of this course warrants scrutiny as to whether it properly serves the student, and the degree to which it continues to meet the IS 2002 outcomes. The current business…
User’s Guide for the SAS (Stand-Off Attack Simulation) Computer Model.
1982-01-15
Prepared for the Director, Defense Nuclear Agency. SAS is an effective survivability and security system design tool which allows an analyst to compare the relative effectiveness of selected...mounted against other systems during uploading for dispersal or for non-emergency relocation. GLCM and LANCE must be mobilized and formed into convoys
Instrument Systems Analysis and Verification Facility (ISAVF) users guide
NASA Technical Reports Server (NTRS)
Davis, J. F.; Thomason, J. O.; Wolfgang, J. L.
1985-01-01
The ISAVF facility is primarily an interconnected system of computers, special-purpose real-time hardware, and associated generalized software systems, which will permit the Instrument System Analysts, Design Engineers, and Instrument Scientists to perform trade-off studies, specification development, instrument modeling, and verification of the instrument hardware performance. It is not the intent of the ISAVF to duplicate or replace existing special-purpose facilities such as the Code 710 Optical Laboratories or the Code 750 Test and Evaluation facilities. The ISAVF will provide data acquisition and control services for these facilities, as needed, using remote computer stations attached to the main ISAVF computers via dedicated communication lines.
User's guide for the thermal analyst's help desk expert system
NASA Technical Reports Server (NTRS)
Ormsby, Rachel A.
1994-01-01
A guide for users of the Thermal Analyst's Help Desk is provided. Help Desk is an expert system that runs on a DOS based personal computer and operates within the EXSYS expert system shell. Help Desk is an analysis tool designed to provide users having various degrees of experience with the capability to determine first approximations of thermal capacity for spacecraft and instruments. The five analyses supported in Help Desk are: surface area required for a radiating surface, equilibrium temperature of a surface, enclosure temperature and heat loads for a defined position in orbit, enclosure temperature and heat loads over a complete orbit, and selection of appropriate surface properties. The two geometries supported by Help Desk are a single flat plate and a rectangular box enclosure.
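The equilibrium-temperature analysis mentioned above can be approximated, to first order, with a standard radiative balance. The sketch below assumes a flat plate absorbing sunlight on one face and radiating from both; the surface properties and solar flux are illustrative values, not those built into Help Desk.

```python
# First-approximation equilibrium temperature of a sunlit flat plate, similar
# in spirit to one of the Help Desk analyses. Property values are illustrative.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
solar_flux = 1361.0      # W/m^2 near Earth
absorptivity = 0.3       # solar absorptance (assumed surface property)
emissivity = 0.85        # infrared emittance (assumed surface property)

# Plate absorbs on one face and radiates from both faces (assumption).
absorbed = absorptivity * solar_flux
T_eq = (absorbed / (2.0 * emissivity * SIGMA)) ** 0.25
print(f"equilibrium temperature: {T_eq:.1f} K")
```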
NASA Technical Reports Server (NTRS)
Shooman, Martin L.
1991-01-01
Many of the most challenging reliability problems of our present decade involve complex distributed systems such as interconnected telephone switching computers, air traffic control centers, aircraft and space vehicles, and local area and wide area computer networks. In addition to the challenge of complexity, modern fault-tolerant computer systems require very high levels of reliability, e.g., avionic computers with MTTF goals of one billion hours. Most analysts find that it is too difficult to model such complex systems without computer-aided design programs. In response to this need, NASA has developed a suite of computer-aided reliability modeling programs beginning with CARE 3 and including a group of new programs such as HARP, HARP-PC, the Reliability Analysts Workbench (a combination of the model solvers SURE, STEM, and PAWS with the common front-end model ASSIST), and the Fault Tree Compiler. The HARP program is studied, and how well the user can model systems using this program is investigated. One of the important objectives will be to study how user-friendly this program is, e.g., how easy it is to model the system, provide the input information, and interpret the results. The experiences of the author and his graduate students who used HARP in two graduate courses are described. Some brief comparisons were made with the ARIES program, which the students also used. Theoretical studies of the modeling techniques used in HARP are also included. Of course, no answer can be more accurate than the fidelity of the model; thus, an appendix is included which discusses modeling accuracy. A broad viewpoint is taken and all problems which occurred in the use of HARP are discussed. Such problems include computer system problems, installation manual problems, user manual problems, program inconsistencies, program limitations, confusing notation, long run times, accuracy problems, etc.
A Mobile Computing Solution for Collecting Functional Analysis Data on a Pocket PC
ERIC Educational Resources Information Center
Jackson, James; Dixon, Mark R.
2007-01-01
The present paper provides a task analysis for creating a computerized data system using a Pocket PC and Microsoft Visual Basic. With Visual Basic software and any handheld device running the Windows Mobile operating system, this task analysis will allow behavior analysts to program and customize their own functional analysis data-collection…
AGARD Flight Test Techniques Series. Volume 2. Identification of Dynamic Systems
1985-01-01
should not depend upon it to solve the problem autonomously. The analyst’s strong point is in formulating the problem; the computer’s strength is in...of derivation for the output-error method is to reduce the problem to the static form of Chapter 5. We will see that the dynamic system makes the
1989-01-01
the FAA Computing Environment ...him in advance by analysts and developers -- an electronic version of the Performance Indicators report. Ease of Use. pcEXPRESS has an automatic link...overcome within the required timeframe. These advanced features of the EXPRESS system allow the fastest possible response to changing executive information
The Use of Object-Oriented Analysis Methods in Surety Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.
1999-05-01
Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.
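To make the idea of deriving a traditional reliability result from an object model concrete, here is a minimal sketch under strong simplifying assumptions (series/parallel structure only, invented component names and probabilities); the actual methodology supports far richer behavioral models and multiple derived analysis types.

```python
# Minimal sketch: a reliability figure derived from a common object model.
# Component names, structure, and probabilities are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class Component:
    name: str
    p: float                       # probability the component fails on demand
    def p_fail(self) -> float:
        return self.p

@dataclass
class Series:                      # system path fails if ANY child fails
    children: List
    def p_fail(self) -> float:
        p_all_ok = 1.0
        for c in self.children:
            p_all_ok *= 1.0 - c.p_fail()
        return 1.0 - p_all_ok

@dataclass
class Parallel:                    # redundant: fails only if ALL children fail
    children: List
    def p_fail(self) -> float:
        p = 1.0
        for c in self.children:
            p *= c.p_fail()
        return p

# Object model: a sensor feeding two redundant processors and one actuator.
system = Series([
    Component("sensor", 1e-3),
    Parallel([Component("cpu-A", 5e-4), Component("cpu-B", 5e-4)]),
    Component("actuator", 2e-3),
])
print(f"system failure probability: {system.p_fail():.2e}")
```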
ERIC Educational Resources Information Center
Goldman, Charles I.
The manual is part of a series to assist in planning procedures for local and State vocational agencies. It details steps required to process a local education agency's data after the data have been coded onto keypunch forms. Program, course, and overhead data are input into a computer data base and error checks are performed. A computer model is…
Automatic cloud tracking applied to GOES and Meteosat observations
NASA Technical Reports Server (NTRS)
Endlich, R. M.; Wolf, D. E.
1981-01-01
An improved automatic processing method for the tracking of cloud motions as revealed by satellite imagery is presented and applications of the method to GOES observations of Hurricane Eloise and Meteosat water vapor and infrared data are presented. The method is shown to involve steps of picture smoothing, target selection and the calculation of cloud motion vectors by the matching of a group at a given time with its best likeness at a later time, or by a cross-correlation computation. Cloud motion computations can be made in as many as four separate layers simultaneously. For data of 4 and 8 km resolution in the eye of Hurricane Eloise, the automatic system is found to provide results comparable in accuracy and coverage to those obtained by NASA analysts using the Atmospheric and Oceanographic Information Processing System, with results obtained by the pattern recognition and cross correlation computations differing by only fractions of a pixel. For Meteosat water vapor data from the tropics and midlatitudes, the automatic motion computations are found to be reliable only in areas where the water vapor fields contained small-scale structure, although excellent results are obtained using Meteosat IR data in the same regions. The automatic method thus appears to be competitive in accuracy and coverage with motion determination by human analysts.
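The block-matching step can be illustrated with a small cross-correlation search. In the sketch below, synthetic arrays stand in for the satellite image frames, and the shift that maximizes the correlation of a target block with candidate blocks in the later frame is reported as the motion vector; window sizes and search ranges are arbitrary choices.

```python
# Illustrative sketch of cross-correlation block matching for cloud tracking.
# Synthetic data stand in for GOES/Meteosat imagery; all sizes are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
frame1 = rng.random((64, 64))
true_shift = (3, 5)                                   # rows, cols
frame2 = np.roll(frame1, true_shift, axis=(0, 1))     # "advected" cloud field

target = frame1[20:36, 20:36]                         # 16x16 target block

best, best_score = (0, 0), -np.inf
for dr in range(-8, 9):                               # search window
    for dc in range(-8, 9):
        cand = frame2[20 + dr:36 + dr, 20 + dc:36 + dc]
        score = np.corrcoef(target.ravel(), cand.ravel())[0, 1]
        if score > best_score:
            best, best_score = (dr, dc), score

print(f"estimated displacement (rows, cols): {best}, correlation {best_score:.3f}")
```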
Training Knowledge Bots for Physics-Based Simulations Using Artificial Neural Networks
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.; Wong, Jay Ming
2014-01-01
Millions of complex physics-based simulations are required for design of an aerospace vehicle. These simulations are usually performed by highly trained and skilled analysts, who execute, monitor, and steer each simulation. Analysts rely heavily on their broad experience that may have taken 20-30 years to accumulate. In addition, the simulation software is complex in nature, requiring significant computational resources. Simulations of systems of systems become even more complex and are beyond human capacity to effectively learn their behavior. IBM has developed machines that can learn and compete successfully with a chess grandmaster and the most successful Jeopardy contestants. These machines are capable of learning some complex problems much faster than humans can learn. In this paper, we propose using artificial neural networks to train knowledge bots to identify the idiosyncrasies of simulation software and recognize patterns that can lead to successful simulations. We examine the use of knowledge bots for applications of computational fluid dynamics (CFD), trajectory analysis, commercial finite-element analysis software, and slosh propellant dynamics. We will show that machine learning algorithms can be used to learn the idiosyncrasies of computational simulations and identify regions of instability without including any additional information about their mathematical form or applied discretization approaches.
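A toy version of the "knowledge bot" idea: a small neural network trained on the outcomes of past runs to predict whether a new set of inputs is likely to produce a stable simulation. The CFL-like stability rule, parameter names, and network size below are invented for illustration and are not taken from the paper.

```python
# Toy illustration of training a "knowledge bot" on past simulation outcomes.
# The stability rule and parameter names are invented for this sketch.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n = 2000
dt = rng.uniform(1e-4, 1e-1, n)          # time step
dx = rng.uniform(1e-3, 1e-1, n)          # mesh spacing
stable = (dt < 0.5 * dx).astype(int)     # pretend CFL-like success criterion

X = np.column_stack([np.log10(dt), np.log10(dx)])
bot = MLPClassifier(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
bot.fit(X[:1500], stable[:1500])
print("held-out accuracy:", bot.score(X[1500:], stable[1500:]))
```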
Quest: The Interactive Test Analysis System.
ERIC Educational Resources Information Center
Adams, Raymond J.; Khoo, Siek-Toon
The Quest program offers a comprehensive test and questionnaire analysis environment by providing a data analyst (a computer program) with access to the most recent developments in Rasch measurement theory, as well as a range of traditional analysis procedures. This manual helps the user use Quest to construct and validate variables based on…
2013-11-01
by existing cyber-attack detection tools far exceeds the analysts’ cognitive capabilities. Grounded in perceptual and cognitive theory, many visual... Inspired by the sense-making theory discussed earlier, we model the analytical reasoning process of cyber analysts using three key...analyst are called “working hypotheses”); each hypothesis could trigger further actions to confirm or disconfirm it. New actions will lead to new
Model documentation report: Transportation sector model of the National Energy Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-03-01
This report documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model. This document serves three purposes. First, it is a reference document providing a detailed description of TRAN for model analysts, users, and the public. Second, this report meets the legal requirements of the Energy Information Administration (EIA) to provide adequate documentation in support of its statistical and forecast reports (Public Law 93-275, 57(b)(1)). Third, it permits continuity in model development by providing documentation from which energy analysts can undertake model enhancements, data updates, and parameter refinements.
Analyst-to-Analyst Variability in Simulation-Based Prediction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glickman, Matthew R.; Romero, Vicente J.
This report describes findings from the culminating experiment of the LDRD project entitled "Analyst-to-Analyst Variability in Simulation-Based Prediction". For this experiment, volunteer participants solving a given test problem in engineering and statistics were interviewed at different points in their solution process. These interviews are used to trace differing solutions to differing solution processes, and differing processes to differences in reasoning, assumptions, and judgments. The issue that the experiment was designed to illuminate -- our paucity of understanding of the ways in which humans themselves have an impact on predictions derived from complex computational simulations -- is a challenging and open one. Although solution of the test problem by analyst participants in this experiment has taken much more time than originally anticipated, and is continuing past the end of this LDRD, this project has provided a rare opportunity to explore analyst-to-analyst variability in significant depth, from which we derive evidence-based insights to guide further explorations in this important area.
Chinese-English Automation and Computer Technology Dictionary, Volume 2.
1980-08-01
The purpose of the series is to provide rapid reference tools for translators, abstractors, and research analysts concerned with scientific and... Sample entries cover terms such as search/exploration, heuristic, tracer wire, system interrogation, system research, and system utility program.
Integrated multidisciplinary analysis tool IMAT users' guide
NASA Technical Reports Server (NTRS)
Meissner, Frances T. (Editor)
1988-01-01
The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite controls systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.
Creating an iPhone Application for Collecting Continuous ABC Data
ERIC Educational Resources Information Center
Whiting, Seth W.; Dixon, Mark R.
2012-01-01
This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to…
ERIC Educational Resources Information Center
Carlin, Anna; Manson, Daniel P.; Zhu, Jake
2010-01-01
With the projected higher demand for Network Systems Analysts and increasing computer crime, network security specialists are an organization's first line of defense. The principle function of this paper is to provide the evolution of Collegiate Cyber Defense Competitions (CCDC), event planning required, soliciting sponsors, recruiting personnel…
Quantifying Security Threats and Their Impact
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aissa, Anis Ben; Abercrombie, Robert K; Sheldon, Frederick T
In earlier works, we present a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper we illustrate this infrastructure by means of a sample example involving an e-commerce application.
Quantifying Security Threats and Their Potential Impacts: A Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aissa, Anis Ben; Abercrombie, Robert K; Sheldon, Frederick T
In earlier works, we present a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper, we illustrate this infrastructure by means of an e-commerce application.
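A generic expected-loss calculation in the spirit of this infrastructure might look like the sketch below; the stakeholders, threat probabilities, and loss figures are invented, and the matrix product shown is a simplification rather than the CSES formulation itself.

```python
# Generic per-stakeholder expected-loss sketch for an e-commerce setting.
# All stakeholders, threat probabilities, and dollar figures are invented.
import numpy as np

stakeholders = ["customer", "merchant", "payment processor"]
threats = ["data breach", "denial of service", "fraud"]

p_threat = np.array([0.05, 0.20, 0.10])     # annual probability of each threat

# loss[i, j]: loss (in $) to stakeholder i if threat j materializes
loss = np.array([
    [5_000,      50,  2_000],
    [80_000, 20_000, 15_000],
    [30_000,  5_000, 60_000],
])

expected_loss = loss @ p_threat             # $/year per stakeholder
for s, e in zip(stakeholders, expected_loss):
    print(f"{s:>18}: ${e:,.0f} per year")
```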
Methods for nuclear air-cleaning-system accident-consequence assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrae, R.W.; Bolstad, J.W.; Gregory, W.S.
1982-01-01
This paper describes a multilaboratory research program that is directed toward addressing many questions that analysts face when performing air cleaning accident consequence assessments. The program involves developing analytical tools and supportive experimental data that will be useful in making more realistic assessments of accident source terms within and up to the atmospheric boundaries of nuclear fuel cycle facilities. The types of accidents considered in this study include fires, explosions, spills, tornadoes, criticalities, and equipment failures. The main focus of the program is developing an accident analysis handbook (AAH). We will describe the contents of the AAH, which include descriptions of selected nuclear fuel cycle facilities, process unit operations, source-term development, and accident consequence analyses. Three computer codes designed to predict gas and material propagation through facility air cleaning systems are described. These computer codes address accidents involving fires (FIRAC), explosions (EXPAC), and tornadoes (TORAC). The handbook relies on many illustrative examples to show the analyst how to approach accident consequence assessments. We will use the FIRAC code and a hypothetical fire scenario to illustrate the accident analysis capability.
IT Security Support for the Spaceport Command Control System Development
NASA Technical Reports Server (NTRS)
Varise, Brian
2014-01-01
My job title is IT Security support for the Spaceport Command & Control System Development. As a cyber-security analyst, it is my job to ensure NASA's information stays safe from cyber threats, such as viruses, malware, and denial-of-service attacks, by establishing and enforcing system access controls. Security is very important in the world of technology, and it is used everywhere from personal computers to giant networks run by government agencies worldwide. Without constant monitoring and analysis, businesses, public organizations, and government agencies are vulnerable to potentially harmful infiltration of their computer information systems. It is my responsibility to ensure authorized access by examining improper access, reporting violations, revoking access, monitoring information requests from new programming, and recommending improvements. My department oversees the Launch Control System and networks. An audit will be conducted for the LCS based on compliance with the Federal Information Security Management Act (FISMA) and the National Institute of Standards and Technology (NIST). I recently finished analyzing the SANS top 20 critical controls to give cost-effective recommendations on various software and hardware products for compliance. Upon completion of this internship, I will have successfully completed my duties as well as gained knowledge that will be helpful to my future career as a Cyber Security Analyst.
Payload Operations Control Center (POCC). [spacelab flight operations
NASA Technical Reports Server (NTRS)
Shipman, D. L.; Noneman, S. R.; Terry, E. S.
1981-01-01
The Spacelab payload operations control center (POCC) timeline analysis program which is used to provide POCC activity and resource information as a function of mission time is described. This program is fully automated and interactive, and is equipped with tutorial displays. The tutorial displays are sufficiently detailed for use by a program analyst having no computer experience. The POCC timeline analysis program is designed to operate on the VAX/VMS version V2.1 computer system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Jared M; Ferber, Aaron E; Greenlee, Elliot D
Akatosh is a highly configurable system based on the integration of the capabilities of one or more Intrusion Detection Systems (IDS) and automated forensic analysis. Akatosh reduces the false positive rates of IDSs and alleviates costs of incident response by pointing forensic personnel to the root cause of an incident on affected endpoint devices. Akatosh is able to analyze a computer system in near real-time and provide operations and forensic analyst personnel with continuous feedback on the impact of malware and software on deployed systems. Additionally, Akatosh provides the ability to look back into any prior state in the history of the computer system along with the ability to compare one or more prior system states with any other prior state.
SAFARI, an On-Line Text-Processing System User's Manual.
ERIC Educational Resources Information Center
Chapin, P.G.; And Others.
This report describes for the potential user a set of procedures for processing textual materials on-line. In this preliminary model an information analyst can scan through messages, reports, and other documents on a display scope and select relevant facts, which are processed linguistically and then stored in the computer in the form of logical…
Hot Jobs for the 21st Century. Facts on Working Women No. 97-4.
ERIC Educational Resources Information Center
Women's Bureau (DOL), Washington, DC.
Between 1994 and 2005, employment in the United States will rise from 127 million to 144.7 million, an increase of 14 percent, with women's labor force growth expected to be twice that of men. Growing occupations requiring a Bachelor's degree or above include the following: lawyers, physicians, systems analysts, computer engineers, management…
Developing an intelligence analysis process through social network analysis
NASA Astrophysics Data System (ADS)
Waskiewicz, Todd; LaMonica, Peter
2008-04-01
Intelligence analysts are tasked with making sense of enormous amounts of data and gaining an awareness of a situation that can be acted upon. This process can be extremely difficult and time consuming. Trying to differentiate between important pieces of information and extraneous data only complicates the problem. When dealing with data containing entities and relationships, social network analysis (SNA) techniques can be employed to make this job easier. Applying network measures to social network graphs can identify the most significant nodes (entities) and edges (relationships) and help the analyst further focus on key areas of concern. Strange developed a model that identifies high value targets such as centers of gravity and critical vulnerabilities. SNA lends itself to the discovery of these high value targets and the Air Force Research Laboratory (AFRL) has investigated several network measures such as centrality, betweenness, and grouping to identify centers of gravity and critical vulnerabilities. Using these network measures, a process for the intelligence analyst has been developed to aid analysts in identifying points of tactical emphasis. Organizational Risk Analyzer (ORA) and Terrorist Modus Operandi Discovery System (TMODS) are the two applications used to compute the network measures and identify the points to be acted upon. Therefore, the result of leveraging social network analysis techniques and applications will provide the analyst and the intelligence community with more focused and concentrated analysis results allowing them to more easily exploit key attributes of a network, thus saving time, money, and manpower.
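The kind of network-measure screening described here can be sketched with NetworkX in a few lines (the papers use ORA and TMODS; the toy entity/relationship data below are invented):

```python
# Small sketch of screening a social network with centrality measures.
# The entities and relationships are invented for illustration only.
import networkx as nx

edges = [
    ("courier", "financier"), ("courier", "cell leader"),
    ("cell leader", "bomb maker"), ("cell leader", "recruiter"),
    ("recruiter", "new member A"), ("recruiter", "new member B"),
    ("financier", "front company"),
]
G = nx.Graph(edges)

degree = nx.degree_centrality(G)            # who has the most direct ties
betweenness = nx.betweenness_centrality(G)  # who brokers paths between others

def top(measure, k=3):
    return sorted(measure, key=measure.get, reverse=True)[:k]

print("highest degree centrality:    ", top(degree))
print("highest betweenness (brokers):", top(betweenness))
```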
1988-11-01
system, using graphic techniques which enable users, analysts, and designers to get a clear and common picture of the system and how its parts fit...boxes into hierarchies suitable for computer implementation. Structured Design uses tools, especially graphic ones, to render systems readily... Keywords: LSA, processes, data flows, data stores, external entities, overall systems design process.
Multi-brain fusion and applications to intelligence analysis
NASA Astrophysics Data System (ADS)
Stoica, A.; Matran-Fernandez, A.; Andreou, D.; Poli, R.; Cinel, C.; Iwashita, Y.; Padgett, C.
2013-05-01
In a rapid serial visual presentation (RSVP) images are shown at an extremely rapid pace. Yet, the images can still be parsed by the visual system to some extent. In fact, the detection of specific targets in a stream of pictures triggers a characteristic electroencephalography (EEG) response that can be recognized by a brain-computer interface (BCI) and exploited for automatic target detection. Research funded by DARPA's Neurotechnology for Intelligence Analysts program has achieved speed-ups in sifting through satellite images when adopting this approach. This paper extends the use of BCI technology from individual analysts to collaborative BCIs. We show that the integration of information in EEGs collected from multiple operators results in performance improvements compared to the single-operator case.
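A toy numerical illustration of why fusing evidence across operators helps: each simulated operator produces a noisy single-trial score, and averaging the scores raises detection accuracy. The noise level, threshold, and number of operators are arbitrary assumptions, not values from the study.

```python
# Toy fusion of single-trial scores from several simulated "operators".
# Noise levels, threshold, and operator count are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(42)
n_trials, n_ops = 5000, 4
labels = rng.integers(0, 2, n_trials)               # 1 = target present

# Each operator's score: weak signal plus independent noise.
scores = labels[:, None] + rng.normal(0.0, 2.0, (n_trials, n_ops))

acc_single = ((scores[:, 0] > 0.5) == labels).mean()
acc_fused = ((scores.mean(axis=1) > 0.5) == labels).mean()
print(f"single operator accuracy: {acc_single:.3f}")
print(f"fused ({n_ops} operators): {acc_fused:.3f}")
```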
The Allocation of Visual Attention in Multimedia Search Interfaces
ERIC Educational Resources Information Center
Hughes, Edith Allen
2017-01-01
Multimedia analysts are challenged by the massive numbers of unconstrained video clips generated daily. Such clips can include any possible scene and events, and generally have limited quality control. Analysts who must work with such data are overwhelmed by its volume and lack of computational tools to probe it effectively. Even with advances…
NASA Technical Reports Server (NTRS)
1974-01-01
Computer program listings as well as graphical and tabulated data needed by the analyst to perform a BRAVO analysis were examined. A graphical aid that can be used to determine the earth coverage of satellites in synchronous equatorial orbits was described. A listing for the satellite synthesis computer program as well as a sample printout for the DSCS-II satellite program and a listing of the symbols used in the program were included. The APL language listing for the payload program cost estimating computer program was given. This language is compatible with many of the time-sharing remote-terminal computers used in the United States. Data on the Intelsat communications network were studied. Costs for telecommunications systems leasing, line-of-sight microwave relay communications systems, submarine telephone cables, and terrestrial power generation systems were also described.
1982-08-01
though the two groups were different in terms of scientific interests and academic orientation scores (the aviation supply sample scored higher on... 51 Chemists/Physicists, 50 Marine Officers-Communication, 49 Marine Officers-Data Systems, 48 Engineers, 47 Biologists, 46 Systems Analysts/Computer...
NASA Astrophysics Data System (ADS)
El Bekri, Nadia; Angele, Susanne; Ruckhäberle, Martin; Peinsipp-Byma, Elisabeth; Haelke, Bruno
2015-10-01
This paper introduces an interactive recognition assistance system for imaging reconnaissance. This system supports aerial image analysts on missions during two main tasks: object recognition and infrastructure analysis. Object recognition concentrates on the classification of one single object. Infrastructure analysis deals with the description of the components of an infrastructure and the recognition of the infrastructure type (e.g. military airfield). Based on satellite or aerial images, aerial image analysts are able to extract single object features and thereby recognize different object types. It is one of the most challenging tasks in imaging reconnaissance. Currently, there are no high-potential ATR (automatic target recognition) applications available; as a consequence, the human observer cannot be replaced entirely. State-of-the-art ATR applications cannot match human perception and interpretation in equal measure. Why is this still such a critical issue? First, cluttered and noisy images make it difficult to automatically extract, classify and identify object types. Second, due to changed warfare and the rise of asymmetric threats it is nearly impossible to create an underlying data set containing all features, objects or infrastructure types. Many other factors, such as environmental parameters or aspect angles, further complicate the application of ATR. Due to the lack of suitable ATR procedures, the human factor is still important and so far irreplaceable. In order to use the potential benefits of human perception and computational methods in a synergistic way, both are unified in an interactive assistance system. RecceMan® (Reconnaissance Manual) offers two different modes for aerial image analysts on missions: the object recognition mode and the infrastructure analysis mode. The aim of the object recognition mode is to recognize a certain object type based on the object features that originated from the image signatures. The infrastructure analysis mode pursues the goal of analyzing the function of the infrastructure. The image analyst visually extracts certain target object signatures, assigns them to corresponding object features and is finally able to recognize the object type. The system offers him the possibility to assign the image signatures to features given by sample images. The underlying data set contains a wide range of object features and object types for different domains such as ships or land vehicles. Each domain has its own feature tree developed by aerial image analyst experts. By selecting the corresponding features, the possible solution set of objects is automatically reduced and matches only the objects that contain the selected features. Moreover, we give an outlook on current research in the field of ground target analysis, in which we deal with partly automated methods to extract image signatures and assign them to the corresponding features. This research includes methods for automatically determining the orientation of an object and geometric features such as the width and length of the object. This step makes it possible to automatically reduce the set of possible object types offered to the image analyst by the interactive recognition assistance system.
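The feature-based filtering step can be illustrated with a minimal sketch: the analyst selects observed features and the candidate set shrinks to the object types that exhibit all of them. The feature names and object types below are invented and far simpler than a real RecceMan feature tree.

```python
# Minimal sketch of feature-based candidate reduction.
# Object types and feature names are invented for illustration.
object_types = {
    "frigate":        {"hull", "superstructure", "gun turret", "helideck"},
    "container ship": {"hull", "superstructure", "deck containers"},
    "tank":           {"tracks", "gun turret"},
    "truck":          {"wheels", "cargo bed"},
}

def candidates(selected_features):
    """Return the object types that contain every selected feature."""
    return [name for name, feats in object_types.items()
            if selected_features <= feats]

print(candidates({"gun turret"}))            # ['frigate', 'tank']
print(candidates({"hull", "gun turret"}))    # ['frigate']
```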
Rethinking Visual Analytics for Streaming Data Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris
In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition are necessary components of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. In addition, each of these examples also highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead to divide the task between a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources, and to streamline the sensemaking process on data that is massive, complex, incomplete, and uncertain in scenarios requiring human judgment.
2015-01-01
field effective command and control systems within the framework of current policies and processes. Cost Considerations in Cloud Computing...www.rand.org/t/PE113 Finds that cloud provider costs can vary compared with traditional information system alternatives because of different cost structures...for analysts evaluating new cloud investments.
Estimating two-way tables based on forest surveys
Charles T. Scott
2000-01-01
Forest survey analysts usually are interested in tables of values rather than single point estimates. A common error is to include only plots on which nonzero values of the attribute were observed when computing the variance of a mean. Similarly, analysts often exclude nonforest plots from the analysis. The development of the correct estimates of forest area, attribute...
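A small worked example of the error described above, with invented plot values: dropping the zero plots conditions the estimate on "volume was observed," and both the mean and its standard error change.

```python
# Worked example of the common error: excluding zero plots when estimating
# a per-plot mean and its standard error. The plot values are invented.
import statistics as st

plot_volumes = [0, 0, 0, 12.4, 0, 7.1, 0, 0, 3.3, 0]   # attribute per plot

nonzero_only = [v for v in plot_volumes if v > 0]

mean_wrong = st.mean(nonzero_only)          # conditional on "attribute observed"
mean_right = st.mean(plot_volumes)          # what the survey actually estimates

se_wrong = st.stdev(nonzero_only) / len(nonzero_only) ** 0.5
se_right = st.stdev(plot_volumes) / len(plot_volumes) ** 0.5

print(f"nonzero plots only: mean={mean_wrong:.2f}, SE={se_wrong:.2f}")
print(f"all plots:          mean={mean_right:.2f}, SE={se_right:.2f}")
```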
ERIC Educational Resources Information Center
Giacobe, Nicklaus A.
2013-01-01
Cyber-security involves monitoring a complex network of inter-related computers to prevent, identify, and remediate undesired actions. This work is performed in organizations by human analysts. These analysts monitor cyber-security sensors to develop and maintain situation awareness (SA) of both normal and abnormal activities that occur on…
Composable Analytic Systems for next-generation intelligence analysis
NASA Astrophysics Data System (ADS)
DiBona, Phil; Llinas, James; Barry, Kevin
2015-05-01
Lockheed Martin Advanced Technology Laboratories (LM ATL) is collaborating with Professor James Llinas, Ph.D., of the Center for Multisource Information Fusion at the University at Buffalo (State of NY), researching concepts for a mixed-initiative associate system for intelligence analysts to facilitate reduced analysis and decision times while proactively discovering and presenting relevant information based on the analyst's needs, current tasks and cognitive state. Today's exploitation and analysis systems have largely been designed for a specific sensor, data type, and operational context, leading to difficulty in directly supporting the analyst's evolving tasking and work product development preferences across complex Operational Environments. Our interactions with analysts illuminate the need to impact the information fusion, exploitation, and analysis capabilities in a variety of ways, including understanding data options, algorithm composition, hypothesis validation, and work product development. Composable Analytic Systems, an analyst-driven system that increases flexibility and capability to effectively utilize Multi-INT fusion and analytics tailored to the analyst's mission needs, holds promise to address current and future intelligence analysis needs, as US forces engage threats in contested and denied environments.
An intelligent interface for satellite operations: Your Orbit Determination Assistant (YODA)
NASA Technical Reports Server (NTRS)
Schur, Anne
1988-01-01
An intelligent interface is often characterized by the ability to adapt evaluation criteria as the environment and user goals change. Some factors that impact these adaptations are redefinition of task goals and, hence, user requirements; time criticality; and system status. To implement adaptations affected by these factors, a new set of capabilities must be incorporated into the human-computer interface design. These capabilities include: (1) dynamic update and removal of control states based on user inputs, (2) generation and removal of logical dependencies as change occurs, (3) uniform and smooth interfacing to numerous processes, databases, and expert systems, and (4) unobtrusive on-line assistance to users. These concepts were applied and incorporated into a human-computer interface using artificial intelligence techniques to create a prototype expert system, Your Orbit Determination Assistant (YODA). YODA is a smart interface that supports, in real time, orbit analysts who must determine the location of a satellite during the station acquisition phase of a mission. Also described is the integration of four knowledge sources required to support the orbit determination assistant: orbital mechanics, spacecraft specifications, characteristics of the mission support software, and orbit analyst experience. This initial effort is continuing with expansion of YODA's capabilities, including evaluation of results of the orbit determination task.
Animation of multi-flexible body systems and its use in control system design
NASA Technical Reports Server (NTRS)
Juengst, Carl; Stahlberg, Ron
1993-01-01
Animation can greatly assist the structural dynamicist and control system analyst with better understanding of how multi-flexible body systems behave. For multi-flexible body systems, the structural characteristics (mode frequencies, mode shapes, and damping) change, sometimes dramatically with large angles of rotation between bodies. With computer animation, the analyst can visualize these changes and how the system responds to active control forces and torques. A characterization of the type of system we wish to animate is presented. The lack of clear understanding of the above effects was a key element leading to the development of a multi-flexible body animation software package. The resulting animation software is described in some detail here, followed by its application to the control system analyst. Other applications of this software can be determined on an individual need basis. A number of software products are currently available that make the high-speed rendering of rigid body mechanical system simulation possible. However, such options are not available for use in rendering flexible body mechanical system simulations. The desire for a high-speed flexible body visualization tool led to the development of the Flexible Or Rigid Mechanical System (FORMS) software. This software was developed at the Center for Simulation and Design Optimization of Mechanical Systems at the University of Iowa. FORMS provides interactive high-speed rendering of flexible and/or rigid body mechanical system simulations, and combines geometry and motion information to produce animated output. FORMS is designed to be both portable and flexible, and supports a number of different user interfaces and graphical display devices. Additional features have been added to FORMS that allow special visualization results related to the nature of the flexible body geometric representations.
Decision support methods for the detection of adverse events in post-marketing data.
Hauben, M; Bate, A
2009-04-01
Spontaneous reporting is a crucial component of post-marketing drug safety surveillance despite its significant limitations. The size and complexity of some spontaneous reporting system databases represent a challenge for drug safety professionals who traditionally have relied heavily on the scientific and clinical acumen of the prepared mind. Computer algorithms that calculate statistical measures of reporting frequency for huge numbers of drug-event combinations are increasingly used to support pharmacovigilance analysts screening large spontaneous reporting system databases. After an overview of pharmacovigilance and spontaneous reporting systems, we discuss the theory and application of contemporary computer algorithms in regular use, those under development, and the practical considerations involved in the implementation of computer algorithms within a comprehensive and holistic drug safety signal detection program.
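One widely used reporting-frequency measure is the proportional reporting ratio (PRR), computed from a 2x2 table of report counts; the sketch below uses invented counts, and a real screen would repeat the calculation over every drug-event combination in the database.

```python
# Proportional reporting ratio (PRR) from a 2x2 table of report counts.
# The counts are invented; a real screen loops over all drug-event pairs.
def prr(a, b, c, d):
    """a: reports of drug & event, b: drug & other events,
       c: other drugs & event,     d: other drugs & other events."""
    return (a / (a + b)) / (c / (c + d))

a, b, c, d = 25, 975, 500, 98_500
print(f"PRR = {prr(a, b, c, d):.1f}")   # values well above 1 flag the pair for review
```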
NASA Technical Reports Server (NTRS)
Cooke, C. H.
1975-01-01
STICAP (Stiff Circuit Analysis Program) is a FORTRAN 4 computer program written for the CDC-6400-6600 computer series and SCOPE 3.0 operating system. It provides the circuit analyst a tool for automatically computing the transient responses and frequency responses of large linear time invariant networks, both stiff and nonstiff (algorithms and numerical integration techniques are described). The circuit description and user's program input language is engineer-oriented, making simple the task of using the program. Engineering theories underlying STICAP are examined. A user's manual is included which explains user interaction with the program and gives results of typical circuit design applications. Also, the program structure from a systems programmer's viewpoint is depicted and flow charts and other software documentation are given.
Department of the Air Force Information Technology Program FY 95 President’s Budget
1994-03-01
2095 2200 552 900 1032 Description: Contractor hardware maintenance support, systems analyst support, software development and maintenance, and off-the...hardware maintenance support, systems analyst support, operations support, configuration management, test support, and off-the-shelf software license...2419 2505 2594 Description: Contractor hardware maintenance support, systems analyst support, operations support, and off-the-shelf software license
Analyst-centered models for systems design, analysis, and development
NASA Technical Reports Server (NTRS)
Bukley, A. P.; Pritchard, Richard H.; Burke, Steven M.; Kiss, P. A.
1988-01-01
Much has been written about the possible use of Expert Systems (ES) technology for strategic defense system applications, particularly for battle management algorithms and mission planning. It is proposed that ES (or, more accurately, Knowledge-Based System (KBS)) technology can be used in situations for which no human expert exists, namely to create design and analysis environments that allow an analyst to rapidly pose many different possible problem resolutions in a game-like fashion and to then work through the solution space in search of the optimal solution. Portions of such an environment exist for expensive AI hardware/software combinations such as the Xerox LOOPS and Intellicorp KEE systems. Efforts are discussed to build an analyst-centered model (ACM) using an ES programming environment, ExperOPS5, for a simple missile system tradeoff study. By analyst-centered, it is meant that the focus of learning is for the benefit of the analyst, not the model. The model's environment allows the analyst to pose a variety of what-if questions without resorting to programming changes. Although not an ES per se, the ACM would allow for a design and analysis environment that is much superior to that of current technologies.
The Pacor 2 expert system: A case-based reasoning approach to troubleshooting
NASA Technical Reports Server (NTRS)
Sary, Charisse
1994-01-01
The Packet Processor 2 (Pacor 2) Data Capture Facility (DCF) acquires, captures, and performs level-zero processing of packet telemetry for spaceflight missions that adhere to communication services recommendations established by the Consultative Committee for Space Data Systems (CCSDS). A major goal of this project is to reduce life-cycle costs. One way to achieve this goal is to increase automation. Through automation, using expert systems, and other technologies, staffing requirements will remain static, which will enable the same number of analysts to support more missions. Analysts provide packet telemetry data evaluation and analysis services for all data received. Data that passes this evaluation is forwarded to the Data Distribution Facility (DDF) and released to scientists. Through troubleshooting, data that fails this evaluation is dumped and analyzed to determine if its quality can be improved before it is released. This paper describes a proof-of-concept prototype that troubleshoots data quality problems. The Pacor 2 expert system prototype uses the case-based reasoning (CBR) approach to development, an alternative to a rule-based approach. Because Pacor 2 is not operational, the prototype has been developed using cases that describe existing troubleshooting experience from currently operating missions. Through CBR, this experience will be available to analysts when Pacor 2 becomes operational. As Pacor 2 unique experience is gained, analysts will update the case base. In essence, analysts are training the system as they learn. Once the system has learned the cases most likely to recur, it can serve as an aide to inexperienced analysts, a refresher to experienced analysts for infrequently occurring problems, or a training tool for new analysts. The Expert System Development Methodology (ESDM) is being used to guide development.
NASA Technical Reports Server (NTRS)
Mcmurtry, G. J.; Petersen, G. W. (Principal Investigator)
1975-01-01
The author has identified the following significant results. It was found that the high-speed man-machine interaction capability is a distinct advantage of the Image 100; however, the small size of the digital computer in the system is a definite limitation. The system can be highly useful in an analysis mode in which it complements a large general-purpose computer. The Image 100 was found to be extremely valuable in the analysis of aircraft MSS data where the spatial resolution begins to approach photographic quality and the analyst can exercise interpretation judgements and readily interact with the machine.
COMET-AR User's Manual: COmputational MEchanics Testbed with Adaptive Refinement
NASA Technical Reports Server (NTRS)
Moas, E. (Editor)
1997-01-01
The COMET-AR User's Manual provides a reference manual for the Computational Structural Mechanics Testbed with Adaptive Refinement (COMET-AR), a software system developed jointly by Lockheed Palo Alto Research Laboratory and NASA Langley Research Center under contract NAS1-18444. The COMET-AR system is an extended version of an earlier finite element based structural analysis system called COMET, also developed by Lockheed and NASA. The primary extensions are the adaptive mesh refinement capabilities and a new "object-like" database interface that makes COMET-AR easier to extend further. This User's Manual provides a detailed description of the user interface to COMET-AR from the viewpoint of a structural analyst.
The Generic Spacecraft Analyst Assistant (gensaa): a Tool for Developing Graphical Expert Systems
NASA Technical Reports Server (NTRS)
Hughes, Peter M.
1993-01-01
During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real-time data. The analysts must watch for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As the satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable performance levels. At NASA GSFC, fault-isolation expert systems are in operation supporting this data monitoring task. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will readily support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industry.
Payload crew training scheduler (PACTS) user's manual
NASA Technical Reports Server (NTRS)
Shipman, D. L.
1980-01-01
The operation of the payload specialist training scheduler (PACTS) is discussed in this user's manual which is used to schedule payload specialists for mission training on the Spacelab experiments. The PACTS program is a fully automated interactive, computerized scheduling program equipped with tutorial displays. The tutorial displays are sufficiently detailed for use by a program analyst having no computer experience. The PACTS program is designed to operate on the UNIVAC 1108 computer system, and has the capability to load output into a PDP 11/45 Interactive Graphics Display System for printing schedules. The program has the capacity to handle up to three overlapping Spacelab missions.
Guidebook for Imputation of Missing Data. Technical Report No. 17.
ERIC Educational Resources Information Center
Wise, Lauress L.; McLaughlin, Donald H.
This guidebook is designed for data analysts who are working with computer data files that contain records with incomplete data. It indicates choices the analyst must make and the criteria for making those choices in regard to the following questions: (1) What resources are available for performing the imputation? (2) How big is the data file? (3)…
Design of Mariner 9 Science Sequences using Interactive Graphics Software
NASA Technical Reports Server (NTRS)
Freeman, J. E.; Sturms, F. M., Jr.; Webb, W. A.
1973-01-01
This paper discusses the analyst/computer system used to design the daily science sequences required to carry out the desired Mariner 9 science plan. The Mariner 9 computer environment, the development and capabilities of the science sequence design software, and the techniques followed in the daily mission operations are discussed. Included is a discussion of the overall mission operations organization and the individual components which played an essential role in the sequence design process. A summary of actual sequences processed, a discussion of problems encountered, and recommendations for future applications are given.
USDA analyst review of the LACIE IMAGE-100 hybrid system test
NASA Technical Reports Server (NTRS)
Ashburn, P.; Buelow, K.; Hansen, H. L.; May, G. A. (Principal Investigator)
1979-01-01
Fifty operational segments from the U.S.S.R., 40 test segments from Canada, and 24 test segments from the United States were used to provide a wide range of geographic conditions for USDA analysts during a test to determine the effectiveness of labeling single-pixel training fields (dots) using Procedure 1 on the I-100 hybrid system, and clustering and classifying on the Earth Resources Interactive Processing System. The analysts had additional on-line capabilities such as interactive dot labeling, class or cluster map overlay flickers, and flashing of all dots of equal spectral value. Results on the I-100 hybrid system are described and analyst problems and recommendations are discussed.
Tutorial: Parallel Computing of Simulation Models for Risk Analysis.
Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D
2016-10-01
Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
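Following the article's theme, a minimal embarrassingly parallel Monte Carlo sketch is shown below in Python rather than the article's MATLAB and R examples; the stand-in simulation model and replication count are arbitrary.

```python
# Minimal embarrassingly parallel Monte Carlo sketch: independent replications
# of a stand-in stochastic model are spread across a process pool.
import multiprocessing as mp
import random

def one_replication(seed):
    """One simulation run; here just a placeholder stochastic model."""
    rng = random.Random(seed)
    return sum(rng.gauss(0, 1) for _ in range(10_000))

if __name__ == "__main__":
    seeds = range(200)                     # one seed per independent replication
    with mp.Pool() as pool:
        results = pool.map(one_replication, seeds)
    mean = sum(results) / len(results)
    print(f"{len(results)} replications, mean outcome {mean:.2f}")
```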
NASA Technical Reports Server (NTRS)
Fournelle, John; Carpenter, Paul
2006-01-01
Modern electron microprobe systems have become increasingly sophisticated. These systems utilize either UNIX or PC computer systems for measurement, automation, and data reduction. These systems have undergone major improvements in processing, storage, display, and communications, due to increased capabilities of hardware and software. Instrument specifications are typically utilized at the time of purchase and concentrate on hardware performance. The microanalysis community includes analysts, researchers, software developers, and manufacturers, who could benefit from exchange of ideas and the ultimate development of core community specifications (CCS) for hardware and software components of microprobe instrumentation and operating systems.
CREATING AN IPHONE APPLICATION FOR COLLECTING CONTINUOUS ABC DATA
Whiting, Seth W; Dixon, Mark R
2012-01-01
This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to an e-mail account after observations have concluded. Further suggestions are provided to customize the ABC data-collection system for individual preferences and clinical needs. PMID:23060682
Creating an iPhone application for collecting continuous ABC data.
Whiting, Seth W; Dixon, Mark R
2012-01-01
This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to an e-mail account after observations have concluded. Further suggestions are provided to customize the ABC data-collection system for individual preferences and clinical needs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abercrombie, Robert K; Sheldon, Frederick T; Grimaila, Michael R
2010-01-01
In earlier works, we presented a computational infrastructure that allows an analyst to estimate the security of a system in terms of the loss that each stakeholder stands to sustain as a result of security breakdowns. In this paper, we discuss how this infrastructure can be used in the subject domain of mission assurance, defined as the full life-cycle engineering process to identify and mitigate design, production, test, and field support deficiencies of mission success. We address the opportunity to apply the Cyberspace Security Econometrics System (CSES) to Carnegie Mellon University and Software Engineering Institute's Mission Assurance Analysis Protocol (MAAP) in this context.
Red gaming in support of the war on terrorism : Sandia Red Game report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, Judy Hennessey; Whitley, John B.; Craft, Richard Layne, II
2004-02-01
The Advanced Concepts Group (ACG) at Sandia National Laboratories is exploring the use of Red Teaming to help intelligence analysts with two key processes: determining what a piece or pieces of information might imply and deciding what other pieces of information need to be found to support or refute hypotheses about what actions a suspected terrorist organization might be pursuing. In support of this effort, the ACG hosted a terrorism red gaming event in Albuquerque on July 22-24, 2003. The game involved two 'red teams' playing the roles of two terrorist cells - one focused on implementing an RDD attack on the DC subway system and one focused on a bio attack against the same target - and two 'black teams' playing the role of the intelligence collection system and of intelligence analysts trying to decide what plans the red teams might be pursuing. This exercise successfully engaged human experts to seed a proposed compute engine with detailed operational plans for hypothetical terrorist scenarios.
Occupational Conversion Index: Enlisted/Officer/Civilian
1993-09-01
14A/B OMA Integrated Weapons Technician AT 7975 APQ-126 FLR IMA Technician AT 7976 C-8!85 ASCU IMA Technician AT 7978 APM-446 Radar System Test...Operator/Maintainer AT 6705 CASS Test Station Inter Main, Calibration/Adv Maint Technician AT 7923 ASCU & Tactical Computer SSE IMA Technician...GENERAL ARMY 93F Field Artillery Meteorological Crewmember NAVY AG Aerographer's Mate AG 7412 Analyst-Forecaster MARINE CORPS 0847 a Artillery
Visually Exploring Transportation Schedules.
Palomo, Cesar; Guo, Zhan; Silva, Cláudio T; Freire, Juliana
2016-01-01
Public transportation schedules are designed by agencies to optimize service quality under multiple constraints. However, real service usually deviates from the plan. Therefore, transportation analysts need to identify, compare and explain both eventual and systemic performance issues that must be addressed so that better timetables can be created. The purely statistical tools commonly used by analysts pose many difficulties due to the large number of attributes at trip- and station-level for planned and real service. Also challenging is the need for models at multiple scales to search for patterns at different times and stations, since analysts do not know exactly where or when relevant patterns might emerge and need to compute statistical summaries for multiple attributes at different granularities. To aid in this analysis, we worked in close collaboration with a transportation expert to design TR-EX, a visual exploration tool developed to identify, inspect and compare spatio-temporal patterns for planned and real transportation service. TR-EX combines two new visual encodings inspired by Marey's Train Schedule: Trips Explorer for trip-level analysis of frequency, deviation and speed; and Stops Explorer for station-level study of delay, wait time, reliability and performance deficiencies such as bunching. To tackle overplotting and to provide a robust representation for a large number of trips and stops at multiple scales, the system supports variable kernel bandwidths to achieve the level of detail required by users for different tasks. We justify our design decisions based on specific analysis needs of transportation analysts. We provide anecdotal evidence of the efficacy of TR-EX through a series of case studies that explore NYC subway service, which illustrate how TR-EX can be used to confirm hypotheses and derive new insights through visual exploration.
Seymour, A. C.; Dale, J.; Hammill, M.; Halpin, P. N.; Johnston, D. W.
2017-01-01
Estimating animal populations is critical for wildlife management. Aerial surveys are used for generating population estimates, but can be hampered by cost, logistical complexity, and human risk. Additionally, human counts of organisms in aerial imagery can be tedious and subjective. Automated approaches show promise, but can be constrained by long setup times and difficulty discriminating animals in aggregations. We combine unmanned aircraft systems (UAS), thermal imagery and computer vision to improve traditional wildlife survey methods. During spring 2015, we flew fixed-wing UAS equipped with thermal sensors, imaging two grey seal (Halichoerus grypus) breeding colonies in eastern Canada. Human analysts counted and classified individual seals in imagery manually. Concurrently, an automated classification and detection algorithm discriminated seals based upon temperature, size, and shape of thermal signatures. Automated counts were within 95–98% of human estimates; at Saddle Island, the model estimated 894 seals compared to analyst counts of 913, and at Hay Island estimated 2188 seals compared to analysts’ 2311. The algorithm improves upon shortcomings of computer vision by effectively recognizing seals in aggregations while keeping model setup time minimal. Our study illustrates how UAS, thermal imagery, and automated detection can be combined to efficiently collect population data critical to wildlife management. PMID:28338047
NASA Astrophysics Data System (ADS)
Seymour, A. C.; Dale, J.; Hammill, M.; Halpin, P. N.; Johnston, D. W.
2017-03-01
Estimating animal populations is critical for wildlife management. Aerial surveys are used for generating population estimates, but can be hampered by cost, logistical complexity, and human risk. Additionally, human counts of organisms in aerial imagery can be tedious and subjective. Automated approaches show promise, but can be constrained by long setup times and difficulty discriminating animals in aggregations. We combine unmanned aircraft systems (UAS), thermal imagery and computer vision to improve traditional wildlife survey methods. During spring 2015, we flew fixed-wing UAS equipped with thermal sensors, imaging two grey seal (Halichoerus grypus) breeding colonies in eastern Canada. Human analysts counted and classified individual seals in imagery manually. Concurrently, an automated classification and detection algorithm discriminated seals based upon temperature, size, and shape of thermal signatures. Automated counts were within 95-98% of human estimates; at Saddle Island, the model estimated 894 seals compared to analyst counts of 913, and at Hay Island estimated 2188 seals compared to analysts’ 2311. The algorithm improves upon shortcomings of computer vision by effectively recognizing seals in aggregations while keeping model setup time minimal. Our study illustrates how UAS, thermal imagery, and automated detection can be combined to efficiently collect population data critical to wildlife management.
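A minimal sketch of the kind of temperature, size, and shape screening described above, run on a synthetic thermal frame. The threshold values, the assumed single-seal footprint, and the image itself are illustrative assumptions, not the authors' algorithm or data.

```python
# Illustrative detection by thermal threshold and connected components.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
frame = rng.normal(loc=-5.0, scale=1.0, size=(200, 200))   # cold background, deg C
frame[50:56, 60:66] = 25.0                                  # warm blob ~ one animal
frame[120:130, 100:112] = 25.0                              # larger blob ~ aggregation

warm = frame > 10.0                          # assumed temperature threshold
labels, n_blobs = ndimage.label(warm)        # connected components
sizes = ndimage.sum(warm, labels, index=range(1, n_blobs + 1))

animals_per_blob = np.maximum(1, np.round(sizes / 36.0))    # assumed per-seal footprint
print("estimated count:", int(animals_per_blob.sum()))
```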
41 CFR 51-8.12 - Fee schedule.
Code of Federal Regulations, 2011 CFR
2011-07-01
.... (c) Instances in which fees may not be charged are as follows: (1) No charge shall be made for the... salary. When the services of a computer programmer or a computer program analyst are required in...
41 CFR 51-8.12 - Fee schedule.
Code of Federal Regulations, 2010 CFR
2010-07-01
.... (c) Instances in which fees may not be charged are as follows: (1) No charge shall be made for the... salary. When the services of a computer programmer or a computer program analyst are required in...
NASA Astrophysics Data System (ADS)
Chen, Siyue; Leung, Henry; Dondo, Maxwell
2014-05-01
As computer network security threats increase, many organizations implement multiple Network Intrusion Detection Systems (NIDS) to maximize the likelihood of intrusion detection and provide a comprehensive understanding of intrusion activities. However, NIDS trigger a massive number of alerts on a daily basis. This can be overwhelming for computer network security analysts since it is a slow and tedious process to manually analyse each alert produced. Thus, automated and intelligent clustering of alerts is important to reveal the structural correlation of events by grouping alerts with common features. As the nature of computer network attacks, and therefore alerts, is not known in advance, unsupervised alert clustering is a promising approach to achieve this goal. We propose a joint optimization technique for feature selection and clustering to aggregate similar alerts and to reduce the number of alerts that analysts have to handle individually. More precisely, each identified feature is assigned a binary value, which reflects the feature's saliency. This value is treated as a hidden variable and incorporated into a likelihood function for clustering. Since computing the optimal solution of the likelihood function directly is analytically intractable, we use the Expectation-Maximisation (EM) algorithm to iteratively update the hidden variable and use it to maximize the expected likelihood. Our empirical results, using a labelled Defense Advanced Research Projects Agency (DARPA) 2000 reference dataset, show that the proposed method gives better results than the EM clustering without feature selection in terms of the clustering accuracy.
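As a simplified stand-in for the clustering step described above (it omits the per-feature saliency variable and uses synthetic alert features rather than the DARPA data), the sketch below fits a Gaussian mixture, whose parameters are estimated with EM, to a small set of alerts and reports the resulting group sizes.

```python
# Simplified EM-based alert clustering sketch; feature selection is not modeled.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical alert features: [source port, dest port, packet count, duration]
alerts = np.vstack([
    rng.normal([5000, 80, 200, 2.0], [200, 1, 50, 0.5], size=(100, 4)),
    rng.normal([6000, 22, 10, 30.0], [200, 1, 5, 5.0], size=(100, 4)),
])

X = StandardScaler().fit_transform(alerts)
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)  # fitted via EM
groups = gmm.predict(X)
print("alerts per cluster:", np.bincount(groups))
```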
The QUELCE Method: Using Change Drivers to Estimate Program Costs
2016-08-01
QUELCE computes a distribution of program costs based on Monte Carlo analysis of program cost drivers—assessed via analyses of dependency structure...possible scenarios. These include a dependency structure matrix to understand the interaction of change drivers for a specific project a...performed by the SEI or by company analysts. From the workshop results, analysts create a dependency structure matrix (DSM) of the change drivers
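A toy illustration of the Monte Carlo idea sketched above: change drivers fire with some probability, a dependency structure matrix propagates their interactions, and the resulting cost distribution is summarized. The driver probabilities, DSM entries, and cost impacts are invented and do not reflect QUELCE's actual calibration or workshop outputs.

```python
# Illustrative Monte Carlo over change drivers linked by a dependency structure matrix.
import numpy as np

rng = np.random.default_rng(2)
p_driver = np.array([0.3, 0.2, 0.1])          # baseline probability each driver fires
dsm = np.array([[0.0, 0.4, 0.0],              # dsm[i, j]: extra probability that
                [0.0, 0.0, 0.5],              # driver j fires given driver i fired
                [0.0, 0.0, 0.0]])
cost_impact = np.array([1.0e6, 2.5e6, 4.0e6]) # added cost if a driver fires
base_cost = 10.0e6

costs = []
for _ in range(20_000):
    fired = rng.random(3) < p_driver
    fired = fired | (rng.random(3) < fired @ dsm)   # one round of DSM propagation
    costs.append(base_cost + cost_impact @ fired)
costs = np.array(costs)
print("median cost:", np.median(costs), "80th pct:", np.percentile(costs, 80))
```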
NASA Technical Reports Server (NTRS)
Birisan, Mihnea; Beling, Peter
2011-01-01
New generations of surveillance drones are being outfitted with numerous high definition cameras. The rapid proliferation of fielded sensors and supporting capacity for processing and displaying data will translate into ever more capable platforms, but with increased capability comes increased complexity and scale that may diminish the usefulness of such platforms to human operators. We investigate methods for alleviating strain on analysts by automatically retrieving content specific to their current task using a machine learning technique known as Multi-Instance Learning (MIL). We use MIL to create a real time model of the analysts' task and subsequently use the model to dynamically retrieve relevant content. This paper presents results from a pilot experiment in which a computer agent is assigned analyst tasks such as identifying caravanning vehicles in a simulated vehicle traffic environment. We compare agent performance between MIL aided trials and unaided trials.
Nuclear reactor descriptions for space power systems analysis
NASA Technical Reports Server (NTRS)
Mccauley, E. W.; Brown, N. J.
1972-01-01
For the small, high performance reactors required for space electric applications, adequate neutronic analysis is of crucial importance, but in terms of computational time consumed, nuclear calculations probably yield the least amount of detail for mission analysis study. It has been found possible, after generation of only a few designs of a reactor family in elaborate thermomechanical and nuclear detail to use simple curve fitting techniques to assure desired neutronic performance while still performing the thermomechanical analysis in explicit detail. The resulting speed-up in computation time permits a broad detailed examination of constraints by the mission analyst.
ERIC Educational Resources Information Center
Shaw, David C.; Johnson, Dorothy M.
The complete comprehension of this paper requires a firm grasp of both mathematical demography and FORTRAN programming. The paper aims at the establishment of a language with which complex demographic manipulations can be briefly expressed in a form intelligible both to demographic analysts and to computers. The Demographic Computer Library (DCL)…
Anomalous event diagnosis for environmental satellite systems
NASA Technical Reports Server (NTRS)
Ramsay, Bruce H.
1993-01-01
The National Oceanic and Atmospheric Administration's (NOAA) National Environmental Satellite, Data, and Information Service (NESDIS) is responsible for the operation of the NOAA geostationary and polar orbiting satellites. NESDIS provides a wide array of operational meteorological and oceanographic products and services and operates various computer and communication systems on a 24-hour, seven days per week schedule. The Anomaly Reporting System contains a database of anomalous events regarding the operations of the Geostationary Operational Environmental Satellite (GOES), communication, or computer systems that have degraded or caused the loss of GOES imagery. Data is currently entered manually via an automated query user interface. There are 21 possible symptoms (e.g., No Data), and 73 possible causes (e.g., Sectorizer - World Weather Building) of an anomalous event. The determination of an event's cause(s) is made by the on-duty computer operator, who enters the event in a paper based daily log, and by the analyst entering the data into the reporting system. The determination of the event's cause(s) impacts both the operational status of these systems, and the performance evaluation of the on-site computer and communication operations contractor.
Computer Rehabilitation Training for the Severely Disabled.
ERIC Educational Resources Information Center
Louisiana State Univ., Baton Rouge.
The Computer Rehabilitation Training Program for the Severely Disabled is a job-oriented training program to prepare physically handicapped persons to become computer programmers and analysts. The program is operated by: a nonprofit organization of Baton Rouge-area business people interested in data processing; the Department of Social Services,…
Micro-based fact collection tool user's manual
NASA Technical Reports Server (NTRS)
Mayer, Richard
1988-01-01
A procedure designed for use by an analyst to assist in the collection and organization of data gathered during the interview processes associated with system analysis and modeling tasks is described. The basic concept behind the development of this tool is that during the interview process an analyst is presented with assertions of facts by the domain expert. The analyst also makes observations of the domain. These facts need to be collected and preserved in such a way as to allow them to serve as the basis for a number of decision making processes throughout the system development process. This tool can be thought of as a computerization of the analyst's notebook.
Syntax-directed content analysis of videotext: application to a map detection recognition system
NASA Astrophysics Data System (ADS)
Aradhye, Hrishikesh; Herson, James A.; Myers, Gregory
2003-01-01
Video is an increasingly important and ever-growing source of information to the intelligence and homeland defense analyst. A capability to automatically identify the contents of video imagery would enable the analyst to index relevant foreign and domestic news videos in a convenient and meaningful way. To this end, the proposed system aims to help determine the geographic focus of a news story directly from video imagery by detecting and geographically localizing political maps from news broadcasts, using the results of videotext recognition in lieu of a computationally expensive, scale-independent shape recognizer. Our novel method for the geographic localization of a map is based on the premise that the relative placement of text superimposed on a map roughly corresponds to the geographic coordinates of the locations the text represents. Our scheme extracts and recognizes videotext, and iteratively identifies the geographic area, while allowing for OCR errors and artistic freedom. The fast and reliable recognition of such maps by our system may provide valuable context and supporting evidence for other sources, such as speech recognition transcripts. The concepts of syntax-directed content analysis of videotext presented here can be extended to other content analysis systems.
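The premise that relative text placement roughly tracks geographic coordinates can be illustrated with a small least-squares fit from pixel positions of recognized place names to their gazetteer coordinates. The pixel positions and place names below are hypothetical, and the real system's iterative, OCR-error-tolerant matching is not shown.

```python
# Sketch of the core geolocation idea: fit an affine map from on-screen label
# positions (pixels) to geographic coordinates of the recognized place names.
import numpy as np

# (x_pixel, y_pixel) of recognized labels and their approximate gazetteer (lon, lat)
pix = np.array([[120, 300], [400, 280], [260, 90]], dtype=float)
geo = np.array([[-0.1, 51.5], [2.35, 48.85], [4.9, 52.37]])   # London, Paris, Amsterdam

A = np.hstack([pix, np.ones((len(pix), 1))])       # affine design matrix
coeffs, *_ = np.linalg.lstsq(A, geo, rcond=None)   # least-squares fit

corner = np.array([0.0, 0.0, 1.0]) @ coeffs        # map the image origin to lon/lat
print("estimated lon/lat of image origin:", corner)
```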
Operating experience with LEAP from the perspective of the computing applications analyst
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ford, W.E. III; Horwedel, J.E.; McAdoo, J.W.
1981-05-01
The Long-Term Energy Analysis Program (LEAP), which was used for the energy price-quantity projections in the 1978 Annual Report to Congress (ARC '78) and used in an ORNL research program to develop and demonstrate a procedure for evaluating energy-economic modeling computer codes and the important results derived therefrom, is discussed. The LEAP system used in the ORNL research, the mechanics of executing LEAP, and the personnel skills required to execute the system are described. In addition, a LEAP sample problem, subroutine hierarchical flowcharts, and input tables for the ARC '78 energy-economic model are included. Results of a study to test the capability of the LEAP system used in the ORNL research to reproduce the ARC '78 results credited to LEAP are presented.
Setting analyst: A practical harvest planning technique
Olivier R.M. Halleux; W. Dale Greene
2001-01-01
Setting Analyst is an ArcView extension that facilitates practical harvest planning for ground-based systems. By modeling the travel patterns of ground-based machines, it compares different harvesting settings based on projected average skidding distance, logging costs, and site disturbance levels. Setting Analyst uses information commonly available to consulting...
A Graphical User-Interface for Propulsion System Analysis
NASA Technical Reports Server (NTRS)
Curlett, Brian P.; Ryall, Kathleen
1992-01-01
NASA LeRC uses a series of computer codes to calculate installed propulsion system performance and weight. The need to evaluate more advanced engine concepts with a greater degree of accuracy has resulted in an increase in complexity of this analysis system. Therefore, a graphical user interface was developed to allow the analyst to more quickly and easily apply these codes. The development of this interface and the rationale for the approach taken are described. The interface consists of a method of pictorially representing and editing the propulsion system configuration, forms for entering numerical data, on-line help and documentation, post processing of data, and a menu system to control execution.
A graphical user-interface for propulsion system analysis
NASA Technical Reports Server (NTRS)
Curlett, Brian P.; Ryall, Kathleen
1993-01-01
NASA LeRC uses a series of computer codes to calculate installed propulsion system performance and weight. The need to evaluate more advanced engine concepts with a greater degree of accuracy has resulted in an increase in complexity of this analysis system. Therefore, a graphical user interface was developed to allow the analyst to more quickly and easily apply these codes. The development of this interface and the rationale for the approach taken are described. The interface consists of a method of pictorially representing and editing the propulsion system configuration, forms for entering numerical data, on-line help and documentation, post processing of data, and a menu system to control execution.
Thruput Analysis of AFLC CYBER 73 Computers.
1981-12-01
Ref 2:14). This decision permitted a fast conversion effort with minimum programmer/analyst experience (Ref 34). Recently, as the conversion effort...converted (Ref 1:2). Moreover, many of the large data-file and machine-time-consuming systems were not included in the earlier...by LMT personnel revealed that during certain periods, i.e., 0000-0800, the machine is normally reserved for the large resource-consuming programs
NASA Technical Reports Server (NTRS)
Basile, Lisa
1988-01-01
The SLDPF is responsible for the capture, quality monitoring, processing, accounting, and shipment of Spacelab and/or Attached Shuttle Payloads (ASP) telemetry data to various user facilities. Expert systems will aid in the performance of the quality assurance and data accounting functions of the two SLDPF functional elements: the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS). Prototypes were developed for each as independent efforts. The SIPS Knowledge System Prototype (KSP) used the commercial shell OPS5+ on an IBM PC/AT; the SOPS Expert System Prototype used the expert system shell CLIPS implemented on a Macintosh personal computer. Both prototypes emulate the duties of the respective QA/DA analysts based upon analyst input and predetermined mission criteria parameters, and recommend instructions and decisions governing the reprocessing, release, or holding for further analysis of data. These prototypes demonstrated feasibility and high potential for operational systems. Increase in productivity, decrease of tedium, consistency, concise historical records, and a training tool for new analysts were the principal advantages. An operational configuration, taking advantage of the SLDPF network capabilities, is under development with the expert systems being installed on SUN workstations. This new configuration in conjunction with the potential of the expert systems will enhance the efficiency, in both time and quality, of the SLDPF's release of Spacelab/AST data products.
NASA Technical Reports Server (NTRS)
Basile, Lisa
1988-01-01
The SLDPF is responsible for the capture, quality monitoring, processing, accounting, and shipment of Spacelab and/or Attached Shuttle Payloads (ASP) telemetry data to various user facilities. Expert systems will aid in the performance of the quality assurance and data accounting functions of the two SLDPF functional elements: the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS). Prototypes were developed for each as independent efforts. The SIPS Knowledge System Prototype (KSP) used the commercial shell OPS5+ on an IBM PC/AT; the SOPS Expert System Prototype used the expert system shell CLIPS implemented on a Macintosh personal computer. Both prototypes emulate the duties of the respective QA/DA analysts based upon analyst input and predetermined mission criteria parameters, and recommend instructions and decisions governing the reprocessing, release, or holding for further analysis of data. These prototypes demonstrated feasibility and high potential for operational systems. Increase in productivity, decrease of tedium, consistency, concise historical records, and a training tool for new analysts were the principal advantages. An operational configuration, taking advantage of the SLDPF network capabilities, is under development with the expert systems being installed on SUN workstations. This new configuration in conjunction with the potential of the expert systems will enhance the efficiency, in both time and quality, of the SLDPF's release of Spacelab/AST data products.
Jade: using on-demand cloud analysis to give scientists back their flow
NASA Astrophysics Data System (ADS)
Robinson, N.; Tomlinson, J.; Hilson, A. J.; Arribas, A.; Powell, T.
2017-12-01
The UK's Met Office generates 400 TB of weather and climate data every day by running physical models on its Top 20 supercomputer. As data volumes explode, there is a danger that analysis workflows become dominated by watching progress bars rather than thinking about science. We have been researching how we can use distributed computing to allow analysts to process these large volumes of high-velocity data in a way that is easy, effective and cheap. Our prototype analysis stack, Jade, tries to encapsulate this. Functionality includes: an under-the-hood Dask engine which parallelises and distributes computations without the need to retrain analysts; hybrid compute clusters (AWS, Alibaba, and local compute) comprising many thousands of cores; clusters which autoscale up and down in response to calculation load using Kubernetes and balance the cluster across providers based on the current price of compute; and lazy data access from cloud storage via containerised OpenDAP. This technology stack allows us to perform calculations many orders of magnitude faster than is possible on local workstations. It is also possible to outperform dedicated local compute clusters, as cloud compute can, in principle, scale much further. The use of ephemeral compute resources also makes this implementation cost efficient.
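A minimal sketch of the under-the-hood Dask pattern mentioned above: the analyst writes ordinary array code, Dask builds a task graph, and compute() executes it in parallel. Cluster provisioning (Kubernetes autoscaling, cloud workers, OpenDAP access) is configured separately and is not shown; the random array here is a stand-in for a model field.

```python
# Minimal Dask sketch: lazy, chunked array computation executed in parallel.
import dask.array as da

# A large, lazily evaluated array standing in for a model field; in practice
# data would be read from storage (e.g. via OpenDAP) rather than generated.
field = da.random.random((10_000, 10_000), chunks=(1_000, 1_000))

zonal_mean = field.mean(axis=1)          # builds a task graph; nothing runs yet
result = zonal_mean.compute()            # executes in parallel across workers
print(result.shape)
```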
Dust in the wind: challenges for urban aerodynamics
NASA Astrophysics Data System (ADS)
Boris, Jay P.
2007-04-01
The fluid dynamics of airflow through a city controls the transport and dispersion of airborne contaminants. This is urban aerodynamics, not meteorology. The average flow, large-scale fluctuations and turbulence are closely coupled to the building geometry. Buildings create large "rooster-tail" wakes; there are systematic fountain flows up the backs of tall buildings; and dust in the wind can move perpendicular to or even against the locally prevailing wind. Requirements for better prediction accuracy demand time-dependent, three-dimensional CFD computations that include solar heating and buoyancy, complete landscape and building geometry specification including foliage, and realistic wind fluctuations. This fundamental prediction capability is necessary to assess urban visibility and line-of-sight sensor performance in street canyons and rugged terrain. Computing urban aerodynamics accurately is clearly a time-dependent High Performance Computing (HPC) problem. In an emergency, on the other hand, prediction technology to assess crisis information, sensor performance, and obscured line-of-sight propagation in the face of industrial spills, transportation accidents, or terrorist attacks has very tight time requirements that suggest simple approximations which tend to produce inaccurate results. In the past we have had to choose one or the other: a fast, inaccurate model or a slow accurate model. Using new fluid-dynamic principles, an urban-oriented emergency assessment system called CT-Analyst® was invented that solves this dilemma. It produces HPC-quality results for airborne contaminant scenarios nearly instantly and has unique new capabilities suited to sensor optimization. This presentation treats the design and use of CT-Analyst and discusses the developments needed for widespread use with advanced sensor and communication systems.
Computer-aided tracking and characterization of homicides and sexual assaults (CATCH)
NASA Astrophysics Data System (ADS)
Kangas, Lars J.; Terrones, Kristine M.; Keppel, Robert D.; La Moria, Robert D.
1999-03-01
When a serial offender strikes, it usually means that the investigation is unprecedented for that police agency. The volume of incoming leads and pieces of information in the case(s) can be overwhelming as evidenced by the thousands of leads gathered in the Ted Bundy Murders, Atlanta Child Murders, and the Green River Murders. Serial cases can be long term investigations in which the suspect remains unknown and continues to perpetrate crimes. With state and local murder investigative systems beginning to crop up, it will become important to manage that information in a timely and efficient way by developing computer programs to assist in that task. One vital function will be to compare violent crime cases from different jurisdictions so investigators can approach the investigation knowing that similar cases exist. CATCH (Computer Aided Tracking and Characterization of Homicides) is being developed to assist crime investigations by assessing likely characteristics of unknown offenders, by relating a specific crime case to other cases, and by providing a tool for clustering similar cases that may be attributed to the same offenders. CATCH is a collection of tools that assist the crime analyst in the investigation process by providing advanced data mining and visualization capabilities. These tools include clustering maps, query tools, geographic maps, timelines, etc. Each tool is designed to give the crime analyst a different view of the case data. The clustering tools in CATCH are based on artificial neural networks (ANNs). The ANNs learn to cluster similar cases from approximately 5000 murders and 3000 sexual assaults residing in a database. The clustering algorithm is applied to parameters describing modus operandi (MO), signature characteristics of the offenders, and other parameters describing the victim and offender. The proximity of cases within a two-dimensional representation of the clusters allows the analyst to identify similar or serial murders and sexual assaults.
A Mobile Computing Solution for Collecting Functional Analysis Data on a Pocket PC
Jackson, James; Dixon, Mark R
2007-01-01
The present paper provides a task analysis for creating a computerized data system using a Pocket PC and Microsoft Visual Basic. With Visual Basic software and any handheld device running the Windows Mobile operating system, this task analysis will allow behavior analysts to program and customize their own functional analysis data-collection system. The program will allow the user to select the type of behavior to be recorded, choose between interval and frequency data collection, and summarize data for graphing and analysis. We also provide suggestions for customizing the data-collection system for idiosyncratic research and clinical needs. PMID:17624078
HAL/SM language specification. [programming languages and computer programming for space shuttles
NASA Technical Reports Server (NTRS)
Williams, G. P. W., Jr.; Ross, C.
1975-01-01
A programming language is presented for the flight software of the NASA Space Shuttle program. It is intended to satisfy virtually all of the flight software requirements of the space shuttle. To achieve this, it incorporates a wide range of features, including applications-oriented data types and organizations, real time control mechanisms, and constructs for systems programming tasks. It is a higher order language designed to allow programmers, analysts, and engineers to communicate with the computer in a form approximating natural mathematical expression. Parts of the English language are combined with standard notation to provide a tool that readily encourages programming without demanding computer hardware expertise. Block diagrams and flow charts are included. The semantics of the language is discussed.
Dynamics of analyst forecasts and emergence of complexity: Role of information disparity
Ahn, Kwangwon
2017-01-01
We report complex phenomena arising among financial analysts, who gather information and generate investment advice, and elucidate them with the help of a theoretical model. Understanding how analysts form their forecasts is important in better understanding the financial market. Carrying out big-data analysis of the analyst forecast data from I/B/E/S for nearly thirty years, we find skew distributions as evidence for the emergence of complexity, and show how information asymmetry or disparity affects how financial analysts form their forecasts. Here, regulations, information dissemination throughout a fiscal year, and interactions among financial analysts are regarded as proxies for a lower level of information disparity. It is found that financial analysts with better access to information display contrasting behaviors: a few analysts become bolder and issue forecasts independent of other forecasts, while the majority of analysts issue more accurate forecasts and flock to each other. The main body of our sample of optimistic forecasts fits a log-normal distribution, with the tail displaying a power law. Based on the Yule process, we propose a model for the dynamics of issuing forecasts that incorporates interactions between analysts. Explaining the empirical data on analyst forecasts well, this model provides an appealing instance of understanding social phenomena from the perspective of complex systems. PMID:28498831
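A toy preferential-attachment (Yule-type) simulation of the herding dynamic described above: each new forecast either copies an existing one or is issued independently. The herding probability and the log-normal draw are illustrative assumptions, not parameters fitted to the I/B/E/S data.

```python
# Toy Yule-type sketch of herding in forecast issuance.
import numpy as np

rng = np.random.default_rng(3)
herd_prob = 0.8                                       # assumed chance of flocking
forecasts = [rng.lognormal(mean=0.0, sigma=0.3)]      # first analyst's forecast

for _ in range(5_000):
    if rng.random() < herd_prob:
        forecasts.append(rng.choice(forecasts))                   # flock to an existing value
    else:
        forecasts.append(rng.lognormal(mean=0.0, sigma=0.3))      # bold, independent forecast

values, counts = np.unique(np.round(forecasts, 3), return_counts=True)
print("largest herd size:", counts.max(), "out of", len(forecasts), "forecasts")
```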
User's manual for the Graphical Constituent Loading Analysis System (GCLAS)
Koltun, G.F.; Eberle, Michael; Gray, J.R.; Glysson, G.D.
2006-01-01
This manual describes the Graphical Constituent Loading Analysis System (GCLAS), an interactive cross-platform program for computing the mass (load) and average concentration of a constituent that is transported in stream water over a period of time. GCLAS computes loads as a function of an equal-interval streamflow time series and an equal- or unequal-interval time series of constituent concentrations. The constituent-concentration time series may be composed of measured concentrations or a combination of measured and estimated concentrations. GCLAS is not intended for use in situations where concentration data (or an appropriate surrogate) are collected infrequently or where an appreciable number of the concentration values are censored. It is assumed that the constituent-concentration time series used by GCLAS adequately represents the true time-varying concentration. Commonly, measured constituent concentrations are collected at a frequency that is less than ideal (from a load-computation standpoint), so estimated concentrations must be inserted in the time series to better approximate the expected chemograph. GCLAS provides tools to facilitate estimation and entry of instantaneous concentrations for that purpose. Water-quality samples collected for load computation frequently are collected in a single vertical or at a single point in a stream cross section. Several factors, some of which may vary as a function of time and (or) streamflow, can affect whether the sample concentrations are representative of the mean concentration in the cross section. GCLAS provides tools to aid the analyst in assessing whether concentrations in samples collected in a single vertical or at a single point in a stream cross section exhibit systematic bias with respect to the mean concentrations. In cases where bias is evident, the analyst can construct coefficient relations in GCLAS to reduce or eliminate the observed bias. GCLAS can export load and concentration data in formats suitable for entry into the U.S. Geological Survey's National Water Information System. GCLAS can also import and export data in formats that are compatible with various commonly used spreadsheet and statistics programs.
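The core load arithmetic such a program performs can be sketched in a few lines: instantaneous load is streamflow times concentration times a unit-conversion factor, averaged over the analysis period. The example series below are synthetic, and the 0.0027 factor is the commonly used conversion from (ft3/s x mg/L) to short tons per day; this is an illustration, not GCLAS code.

```python
# Illustrative constituent-load computation from equal-interval time series.
import numpy as np

hours = np.arange(0, 24, 1.0)                        # equal-interval times, hours
q_cfs = 500 + 200 * np.sin(2 * np.pi * hours / 24)   # synthetic streamflow, ft3/s
c_mgl = 30 + 10 * np.cos(2 * np.pi * hours / 24)     # synthetic concentration, mg/L

# 0.0027 converts (ft3/s)*(mg/L) to short tons per day.
inst_load_tons_per_day = 0.0027 * q_cfs * c_mgl
daily_mean_load = inst_load_tons_per_day.mean()       # equal-interval time average
print(f"mean daily load: {daily_mean_load:.1f} tons/day")
```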
Mission planning for space based satellite surveillance experiments with the MSX
NASA Technical Reports Server (NTRS)
Sridharan, R.; Fishman, T.; Robinson, E.; Viggh, H.; Wiseman, A.
1994-01-01
The Midcourse Space Experiment is a BMDO-sponsored scientific satellite set for launch within the year. The satellite will collect phenomenology data on missile targets, plumes, earth limb backgrounds and deep space backgrounds in the LWIR, visible and ultra-violet spectral bands. It will also conduct functional demonstrations for space-based space surveillance. The Space-Based Visible sensor, built by Lincoln Laboratory, Massachusetts Institute of Technology, is the primary sensor on board the MSX for demonstration of space surveillance. The SBV Processing, Operations and Control Center (SPOCC) is the mission planning and commanding center for all space surveillance experiments using the SBV and other MSX instruments. The guiding principle in the SPOCC Mission Planning System was that all routine functions be automated. Manual analyst input should be minimal. Major concepts are: (1) A high-level language, called SLED, for user interface to the system; (2) A group of independent software processes which would generally be run in a pipe-line mode for experiment commanding but can be run independently for analyst assessment; (3) An integrated experiment cost computation function that permits assessment of the feasibility of the experiment. This paper will report on the design, implementation and testing of the Mission Planning System.
System and method for measuring residual stress
Prime, Michael B.
2002-01-01
The present invention is a method and system for determining the residual stress within an elastic object. In the method, an elastic object is cut along a path having a known configuration. The cut creates a portion of the object having a new free surface. The free surface then deforms to a contour which is different from the path. Next, the contour is measured to determine how much deformation has occurred across the new free surface. Points defining the contour are collected in an empirical data set. The portion of the object is then modeled in a computer simulator. The points in the empirical data set are entered into the computer simulator. The computer simulator then calculates the residual stress along the path which caused the points within the object to move to the positions measured in the empirical data set. The calculated residual stress is then presented in a useful format to an analyst.
ERIC Educational Resources Information Center
Ammentorp, William
There is much to be gained by using systems analysis in educational administration. Most administrators, presently relying on classical statistical techniques restricted to problems having few variables, should be trained to use more sophisticated tools such as systems analysis. The systems analyst, interested in the basic processes of a group or…
Effects of Motivation: Rewarding Hackers for Undetected Attacks Cause Analysts to Perform Poorly.
Maqbool, Zahid; Makhijani, Nidhi; Pammi, V S Chandrasekhar; Dutt, Varun
2017-05-01
The aim of this study was to determine how monetary motivations influence decision making of humans performing as security analysts and hackers in a cybersecurity game. Cyberattacks are increasing at an alarming rate. As cyberattacks often cause damage to existing cyber infrastructures, it is important to understand how monetary rewards may influence decision making of hackers and analysts in the cyber world. Currently, only limited attention has been given to this area. In an experiment, participants were randomly assigned to three between-subjects conditions (n = 26 for each condition): equal payoff, where the magnitude of monetary rewards for hackers and defenders was the same; rewarding hacker, where the magnitude of monetary reward for hacker's successful attack was 10 times the reward for analyst's successful defense; and rewarding analyst, where the magnitude of monetary reward for analyst's successful defense was 10 times the reward for hacker's successful attack. In all conditions, half of the participants were human hackers playing against Nash analysts and half were human analysts playing against Nash hackers. Results revealed that monetary rewards for human hackers and analysts caused a decrease in attack and defend actions compared with the baseline. Furthermore, rewarding human hackers for undetected attacks made analysts deviate significantly from their optimal behavior. If hackers are rewarded for their undetected attack actions, then this causes analysts to deviate from optimal defend proportions. Thus, analysts need to be trained not to become overenthusiastic in defending networks. Applications of our results are to networks where the influence of monetary rewards may cause information theft and system damage.
Short term evaluation of harvesting systems for ecosystem management
Michael D. Erickson; Penn Peters; Curt Hassler
1995-01-01
Continuous time/motion studies have traditionally been the basis for productivity estimates of timber harvesting systems. The detailed data from such studies permits the researcher or analyst to develop mathematical relationships based on stand, system, and stem attributes for describing machine cycle times. The resulting equation(s) allow the analyst to estimate...
1990-09-01
to own a Mercedes Benz. However, based on the market-determined price few people are willing to pay for one. Yet, we don’t hear complaints that there...of Electrical and Electronic Engineers, Fox Morris Personnel Consultants, The San Francisco Newspapers Advertising department and other sources. The...conducted on 12 December 1980 among 20 top metro markets revealed that companies across the board plan to increase their hiring in 1981 by anywhere from 10
A data base processor semantics specification package
NASA Technical Reports Server (NTRS)
Fishwick, P. A.
1983-01-01
A Semantics Specification Package (DBPSSP) for the Intel Data Base Processor (DBP) is defined. DBPSSP serves as a collection of cross assembly tools that allow the analyst to assemble request blocks on the host computer for passage to the DBP. The assembly tools discussed in this report may be effectively used in conjunction with a DBP compatible data communications protocol to form a query processor, precompiler, or file management system for the database processor. The source modules representing the components of DBPSSP are fully commented and included.
Fault Tree in the Trenches, A Success Story
NASA Technical Reports Server (NTRS)
Long, R. Allen; Goodson, Amanda (Technical Monitor)
2000-01-01
Getting caught up in the explanation of Fault Tree Analysis (FTA) minutiae is easy. In fact, most FTA literature tends to address FTA concepts and methodology. Yet there seems to be few articles addressing actual design changes resulting from the successful application of fault tree analysis. This paper demonstrates how fault tree analysis was used to identify and solve a potentially catastrophic mechanical problem at a rocket motor manufacturer. While developing the fault tree given in this example, the analyst was told by several organizations that the piece of equipment in question had been evaluated by several committees and organizations, and that the analyst was wasting his time. The fault tree/cutset analysis resulted in a joint-redesign of the control system by the tool engineering group and the fault tree analyst, as well as bragging rights for the analyst. (That the fault tree found problems where other engineering reviews had failed was not lost on the other engineering groups.) Even more interesting was that this was the analyst's first fault tree which further demonstrates how effective fault tree analysis can be in guiding (i.e., forcing) the analyst to take a methodical approach in evaluating complex systems.
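For readers unfamiliar with the cut-set arithmetic that underlies such an analysis, the sketch below bounds a top-event probability from minimal cut sets of basic events. The events, probabilities, and cut sets are hypothetical and have no connection to the rocket-motor case discussed.

```python
# Minimal cut-set arithmetic: approximate top-event probability of a fault tree.
import numpy as np

basic = {"valve_stuck": 1e-3, "sensor_fail": 5e-4,
         "operator_error": 2e-3, "controller_fault": 1e-4}   # hypothetical failure probs
min_cut_sets = [{"valve_stuck", "sensor_fail"},               # both must fail
                {"controller_fault"},                         # single-point failure
                {"operator_error", "sensor_fail"}]

cut_probs = [np.prod([basic[e] for e in cs]) for cs in min_cut_sets]
top_event = 1.0 - np.prod([1.0 - p for p in cut_probs])       # min-cut upper bound
print(f"top-event probability ~ {top_event:.2e}")
```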
Risk Assessment Methodology Based on the NISTIR 7628 Guidelines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abercrombie, Robert K; Sheldon, Frederick T; Hauser, Katie R
2013-01-01
Earlier work describes computational models of critical infrastructure that allow an analyst to estimate the security of a system in terms of the impact of loss per stakeholder resulting from security breakdowns. Here, we consider how to identify, monitor and estimate risk impact and probability for different smart grid stakeholders. Our constructive method leverages currently available standards and defined failure scenarios. We utilize the National Institute of Standards and Technology (NIST) Interagency or Internal Reports (NISTIR) 7628 as a basis to apply Cyberspace Security Econometrics system (CSES) for comparing design principles and courses of action in making security-related decisions.
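The stakeholder calculation described above can be sketched, roughly, as a chain of matrix products from threat probabilities through component and requirement dependencies to stakeholder stakes. All of the matrices below are invented for illustration, and the exact factorization used by CSES may differ.

```python
# Rough numeric sketch of a mean-failure-cost style calculation per stakeholder.
import numpy as np

stakes = np.array([[1000.0, 500.0],     # $ at stake per stakeholder per requirement
                   [ 200.0, 800.0]])    # rows: stakeholders, cols: requirements
dependency = np.array([[0.9, 0.1],      # P(requirement fails | component fails)
                       [0.2, 0.8]])     # rows: requirements, cols: components
impact = np.array([[0.5, 0.0],          # P(component fails | threat materialises)
                   [0.1, 0.7]])         # rows: components, cols: threats
threat_prob = np.array([0.01, 0.005])   # P(threat materialises per period)

mean_failure_cost = stakes @ dependency @ impact @ threat_prob
for i, cost in enumerate(mean_failure_cost):
    print(f"stakeholder {i}: expected loss {cost:.2f} per period")
```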
Computational Control Workstation: Users' perspectives
NASA Technical Reports Server (NTRS)
Roithmayr, Carlos M.; Straube, Timothy M.; Tave, Jeffrey S.
1993-01-01
A Workstation has been designed and constructed for rapidly simulating motions of rigid and elastic multibody systems. We examine the Workstation from the point of view of analysts who use the machine in an industrial setting. Two aspects of the device distinguish it from other simulation programs. First, one uses a series of windows and menus on a computer terminal, together with a keyboard and mouse, to provide a mathematical and geometrical description of the system under consideration. The second hallmark is a facility for animating simulation results. An assessment of the amount of effort required to numerically describe a system to the Workstation is made by comparing the process to that used with other multibody software. The apparatus for displaying results as a motion picture is critiqued as well. In an effort to establish confidence in the algorithms that derive, encode, and solve equations of motion, simulation results from the Workstation are compared to answers obtained with other multibody programs. Our study includes measurements of computational speed.
NASA Technical Reports Server (NTRS)
Young, Gerald W.; Clemons, Curtis B.
2004-01-01
The focus of this Cooperative Agreement between the Computational Materials Laboratory (CML) of the Processing Science and Technology Branch of the NASA Glenn Research Center (GRC) and the Department of Theoretical and Applied Mathematics at The University of Akron was in the areas of system development of the CML workstation environment, modeling of microgravity and earth-based material processing systems, and joint activities in laboratory projects. These efforts complement each other as the majority of the modeling work involves numerical computations to support laboratory investigations. Coordination and interaction between the modelers, system analysts, and laboratory personnel are essential toward providing the most effective simulations and communication of the simulation results. Toward these means, The University of Akron personnel involved in the agreement worked at the Applied Mathematics Research Laboratory (AMRL) in the Department of Theoretical and Applied Mathematics while maintaining a close relationship with the personnel of the Computational Materials Laboratory at GRC. Network communication between both sites has been established. A summary of the projects we undertook during the time period 9/1/03 - 6/30/04 is included.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-05
... information used mainly by government and private analysts to project future population growth, to analyze.... All interviews are conducted using computer-assisted interviewing III. Data OMB Control Number: 0607-0610. Form Number: There are no forms. We conduct all interviewing on computers. Type of Review...
Better Incident Response with SCOT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruner, Todd
2015-04-01
SCOT is an incident response management system and knowledge base designed for incident responders by incident responders. SCOT increases the effectiveness of the team without adding undue burdens. Focused on reducing the friction between analysts and their tools, SCOT enables analysts to document and share their research and response efforts in near real time. Automatically identifying indicators and correlating those indicators, SCOT helps analysts discover and respond to advanced threats.
NASA Astrophysics Data System (ADS)
Lawrence, G.; Barnard, C.; Viswanathan, V.
1986-11-01
Historically, wave optics computer codes have been paraxial in nature. Folded systems could be modeled by "unfolding" the optical system. Calculation of optical aberrations is, in general, left for the analyst to do with off-line codes. While such paraxial codes were adequate for the simpler systems being studied 10 years ago, current problems such as phased arrays, ring resonators, coupled resonators, and grazing incidence optics require a major advance in analytical capability. This paper describes extension of the physical optics codes GLAD and GLAD V to include a global coordinate system and exact ray aberration calculations. The global coordinate system allows components to be positioned and rotated arbitrarily. Exact aberrations are calculated for components in aligned or misaligned configurations by using ray tracing to compute optical path differences and diffraction propagation. Optical path lengths between components and beam rotations in complex mirror systems are calculated accurately so that coherent interactions in phased arrays and coupled devices may be treated correctly.
MD-11 PCA - Research flight team photo
NASA Technical Reports Server (NTRS)
1995-01-01
On Aug. 30, 1995, the McDonnell Douglas MD-11 transport aircraft landed equipped with a computer-assisted engine control system that has the potential to increase flight safety. In landings at NASA Dryden Flight Research Center, Edwards, California, on August 29 and 30, the aircraft demonstrated software used in the aircraft's flight control computer that essentially landed the MD-11 without a need for the pilot to manipulate the flight controls significantly. In partnership with McDonnell Douglas Aerospace (MDA), with Pratt & Whitney and Honeywell helping to design the software, NASA developed this propulsion-controlled aircraft (PCA) system following a series of incidents in which hydraulic failures resulted in the loss of flight controls. This new system enables a pilot to operate and land the aircraft safely when its normal, hydraulically-activated control surfaces are disabled. This August 29, 1995, photo shows the MD-11 team. Back row, left to right: Tim Dingen, MDA pilot; John Miller, MD-11 Chief pilot (MDA); Wayne Anselmo, MD-11 Flight Test Engineer (MDA); Gordon Fullerton, PCA Project pilot; Bill Burcham, PCA Chief Engineer; Rudey Duran, PCA Controls Engineer (MDA); John Feather, PCA Controls Engineer (MDA); Daryl Townsend, Crew Chief; Henry Hernandez, aircraft mechanic; Bob Baron, PCA Project Manager; Don Hermann, aircraft mechanic; Jerry Cousins, aircraft mechanic; Eric Petersen, PCA Manager (Honeywell); Trindel Maine, PCA Data Engineer; Jeff Kahler, PCA Software Engineer (Honeywell); Steve Goldthorpe, PCA Controls Engineer (MDA). Front row, left to right: Teresa Hass, Senior Project Management Analyst; Hollie Allingham (Aguilera), Senior Project Management Analyst; Taher Zeglum, PCA Data Engineer (MDA); Drew Pappas, PCA Project Manager (MDA); John Burken, PCA Control Engineer.
NASA Technical Reports Server (NTRS)
Wharton, S. W.
1980-01-01
An Interactive Cluster Analysis Procedure (ICAP) was developed to derive classifier training statistics from remotely sensed data. The algorithm interfaces the rapid numerical processing capacity of a computer with the human ability to integrate qualitative information. Control of the clustering process alternates between the algorithm, which creates new centroids and forms clusters, and the analyst, who evaluates and elects to modify the cluster structure. Clusters can be deleted or lumped pairwise, or new centroids can be added. A summary of the cluster statistics can be requested to facilitate cluster manipulation. The ICAP was implemented in APL (A Programming Language), an interactive computer language. The flexibility of the algorithm was evaluated using data from different LANDSAT scenes to simulate two situations: one in which the analyst is assumed to have no prior knowledge about the data and wishes to have the clusters formed more or less automatically; and the other in which the analyst is assumed to have some knowledge about the data structure and wishes to use that information to closely supervise the clustering process. For comparison, an existing clustering method was also applied to the two data sets.
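A toy sketch of the alternation ICAP describes, written in Python rather than APL: the algorithm assigns points and updates centroids, then an analyst action (here, lumping two clusters the analyst judges similar) modifies the cluster structure before the next pass. The data, centroid count, and the analyst's choice are all synthetic assumptions.

```python
# Toy illustration (not ICAP itself): one clustering pass, then a pairwise "lump".
import numpy as np

rng = np.random.default_rng(4)
pixels = np.vstack([rng.normal(m, 0.3, size=(50, 2))
                    for m in ([0, 0], [3, 3], [3.2, 3.2])])
centroids = pixels[rng.choice(len(pixels), size=3, replace=False)]

# Algorithm's turn: assign each pixel to its nearest centroid, then recompute centroids.
dist = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
labels = dist.argmin(axis=1)
centroids = np.array([pixels[labels == k].mean(axis=0) for k in range(3)])

# Analyst's turn: after reviewing the cluster summary, lump clusters 1 and 2
# (a hypothetical choice) into a single centroid before the next pass.
keep = centroids[[0]]
lumped = centroids[[1, 2]].mean(axis=0, keepdims=True)
centroids = np.vstack([keep, lumped])
print("centroids after lumping:\n", centroids)
```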
The N-BOD2 user's and programmer's manual
NASA Technical Reports Server (NTRS)
Frisch, H. P.
1978-01-01
A general purpose digital computer program was developed and designed to aid in the analysis of spacecraft attitude dynamics. The program provides the analyst with the capability of automatically deriving and numerically solving the equations of motion of any system that can be modeled as a topological tree of coupled rigid bodies, flexible bodies, point masses, and symmetrical momentum wheels. Two modes of output are available. The composite system equations of motion may be outputted on a line printer in a symbolic form that may be easily translated into common vector-dyadic notation, or the composite system equations of motion may be solved numerically and any desirable set of system state variables outputted as a function of time.
NACA Computer at the Lewis Flight Propulsion Laboratory
1951-02-21
A female computer at the National Advisory Committee for Aeronautics (NACA) Lewis Flight Propulsion Laboratory with a slide rule and Friden adding machine to make computations. The computer staff was introduced during World War II to relieve short-handed research engineers of some of the tedious computational work. The Computing Section was staffed by “computers,” young female employees, who often worked overnight when most of the tests were run. The computers obtained test data from the manometers and other instruments, made the initial computations, and plotted the data graphically. Researchers then analyzed the data and summarized the findings in a report or made modifications and ran the test again. There were over 400 female employees at the laboratory in 1944, including 100 computers. The use of computers was originally planned only for the duration of the war. The system was so successful that it was extended into the 1960s. The computers and analysts were located in the Altitude Wind Tunnel Shop and Office Building office wing during the 1940s and transferred to the new 8- by 6-Foot Supersonic Wind Tunnel in 1948.
NASA Technical Reports Server (NTRS)
White, Allan L.; Palumbo, Daniel L.
1991-01-01
Semi-Markov processes have proved to be an effective and convenient tool to construct models of systems that achieve reliability by redundancy and reconfiguration. These models are able to depict complex system architectures and to capture the dynamics of fault arrival and system recovery. A disadvantage of this approach is that the models can be extremely large, which poses both a model and a computational problem. Techniques are needed to reduce the model size. Because these systems are used in critical applications where failure can be expensive, there must be an analytically derived bound for the error produced by the model reduction technique. A model reduction technique called trimming is presented that can be applied to a popular class of systems. Automatic model generation programs were written to help the reliability analyst produce models of complex systems. This method, trimming, is easy to implement and the error bound easy to compute. Hence, the method lends itself to inclusion in an automatic model generator.
Directed area search using socio-biological vision algorithms and cognitive Bayesian reasoning
NASA Astrophysics Data System (ADS)
Medasani, S.; Owechko, Y.; Allen, D.; Lu, T. C.; Khosla, D.
2010-04-01
Volitional search systems that assist the analyst by searching for specific targets or objects such as vehicles, factories, airports, etc in wide area overhead imagery need to overcome multiple problems present in current manual and automatic approaches. These problems include finding targets hidden in terabytes of information, relatively few pixels on targets, long intervals between interesting regions, time consuming analysis requiring many analysts, no a priori representative examples or templates of interest, detecting multiple classes of objects, and the need for very high detection rates and very low false alarm rates. This paper describes a conceptual analyst-centric framework that utilizes existing technology modules to search and locate occurrences of targets of interest (e.g., buildings, mobile targets of military significance, factories, nuclear plants, etc.), from video imagery of large areas. Our framework takes simple queries from the analyst and finds the queried targets with relatively minimum interaction from the analyst. It uses a hybrid approach that combines biologically inspired bottom up attention, socio-biologically inspired object recognition for volitionally recognizing targets, and hierarchical Bayesian networks for modeling and representing the domain knowledge. This approach has the benefits of high accuracy, low false alarm rate and can handle both low-level visual information and high-level domain knowledge in a single framework. Such a system would be of immense help for search and rescue efforts, intelligence gathering, change detection systems, and other surveillance systems.
Ethical Guidelines for Computer Security Researchers: "Be Reasonable"
NASA Astrophysics Data System (ADS)
Sassaman, Len
For most of its existence, the field of computer science has been lucky enough to avoid ethical dilemmas by virtue of its relatively benign nature. The subdisciplines of programming methodology research, microprocessor design, and so forth have little room for the greater questions of human harm. Other, more recently developed sub-disciplines, such as data mining, social network analysis, behavioral profiling, and general computer security, however, open the door to abuse of users by practitioners and researchers. It is therefore the duty of the men and women who chart the course of these fields to set rules for themselves regarding what sorts of actions on their part are to be considered acceptable and what should be avoided or handled with caution out of ethical concerns. This paper deals solely with the issues faced by computer security researchers, be they vulnerability analysts, privacy system designers, malware experts, or reverse engineers.
Training augmentation device for the Air Force satellite Control Network
NASA Technical Reports Server (NTRS)
Shoates, Keith B.
1993-01-01
From the 1960s into the early 1980s, satellite operations and control were conducted by Air Force Systems Command (AFSC), now Air Force Materiel Command (AFMC), out of the Satellite Control Facility at Onizuka AFB, CA. AFSC was responsible for acquiring satellite command and control systems and conducting routine satellite operations. The daily operations, consisting of satellite health and status contacts and station-keeping activities, were performed for AFSC by a Mission Control Team (MCT) staffed by civilian contractors who were responsible for providing their own technically 'qualified' personnel as satellite operators. An MCT consists of five positions: mission planner, ground controller, planner analyst, orbit analyst, and ranger controller. Most of the training consisted of On-the-Job-Training (OJT) with junior personnel apprenticed to senior personnel until they could demonstrate job proficiency. With most of the satellite operators having 15 to 25 years of experience, there was minimal risk to the mission. In the mid-1980s, Air Force Space Command (AFSPACECOM) assumed operational responsibility for a newly established control node at Falcon AFB (FAFB) in CO. The satellites and ground system program offices (SPO's) are organized under AFSC's Space and Missile Systems Center (SMC) to function as a systems engineering and acquisition agency for AFSPACECOM. The collection of the satellite control nodes, ground tracking stations, computer processing equipment, and connecting communications links is referred to as the Air Force Satellite Control Network (AFSCN).
A review method for UML requirements analysis model employing system-side prototyping.
Ogata, Shinpei; Matsuura, Saeko
2013-12-01
User interface prototyping is an effective method for users to validate the requirements defined by analysts at an early stage of software development. However, a user interface prototype system offers weak support for the analysts to verify the consistency of specifications about internal aspects of a system, such as business logic. As a result, such inconsistency causes substantial rework, because it often makes it impossible for the developers to realize the system based on the specifications. For verifying such consistency, functional prototyping is an effective method for the analysts, but it is costly and requires more detailed specifications. In this paper, we propose a review method by which analysts can efficiently verify the consistency among several different kinds of UML diagrams by employing system-side prototyping without a detailed model. The system-side prototype system does not have any functions to achieve business logic, but it visualizes the results of the integration among the UML diagrams as Web pages. The usefulness of our proposal was evaluated by applying it to the development of a Library Management System (LMS) for a laboratory; this development was conducted by a group. As a result, our proposal was useful for discovering serious inconsistencies caused by misunderstandings among the members of the group.
SWToolbox: A surface-water tool-box for statistical analysis of streamflow time series
Kiang, Julie E.; Flynn, Kate; Zhai, Tong; Hummel, Paul; Granato, Gregory
2018-03-07
This report is a user guide for the low-flow analysis methods provided with version 1.0 of the Surface Water Toolbox (SWToolbox) computer program. The software combines functionality from two software programs—U.S. Geological Survey (USGS) SWSTAT and U.S. Environmental Protection Agency (EPA) DFLOW. Both of these programs have been used primarily for computation of critical low-flow statistics. The main analysis methods are the computation of hydrologic frequency statistics such as the 7-day minimum flow that occurs on average only once every 10 years (7Q10), computation of design flows including biologically based flows, and computation of flow-duration curves and duration hydrographs. Other annual, monthly, and seasonal statistics can also be computed. The interface facilitates retrieval of streamflow discharge data from the USGS National Water Information System and outputs text reports for a record of the analysis. Tools for graphing data and screening tests are available to assist the analyst in conducting the analysis.
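As a rough sketch of the kind of low-flow statistic SWToolbox computes, the fragment below estimates a 7Q10 from a daily discharge series: it forms 7-day moving averages, takes annual minima, and fits a log-normal distribution to read off the 10-year low flow. SWToolbox and DFLOW use more rigorous fitting (typically log-Pearson Type III) and handle climatic years and zero flows; the log-normal fit and synthetic data here are simplifying assumptions.

```python
# Minimal sketch (assumptions: daily flows in a pandas Series indexed by date;
# a log-normal fit stands in for the log-Pearson Type III fit used in practice).
import numpy as np
import pandas as pd
from scipy import stats

def seven_q_ten(daily_flow: pd.Series) -> float:
    """Estimate the 7Q10: the annual 7-day-minimum flow with a 10-year recurrence."""
    seven_day = daily_flow.rolling(window=7).mean()                 # 7-day moving average
    annual_min = seven_day.groupby(seven_day.index.year).min().dropna()
    logs = np.log(annual_min[annual_min > 0])
    mu, sigma = logs.mean(), logs.std(ddof=1)
    # 10-year low flow = 0.1 quantile of the fitted distribution of annual minima
    return float(np.exp(stats.norm.ppf(0.1, loc=mu, scale=sigma)))

# Example with synthetic data:
idx = pd.date_range("1990-01-01", "2019-12-31", freq="D")
flows = pd.Series(np.random.lognormal(mean=3.0, sigma=0.5, size=len(idx)), index=idx)
print(f"7Q10 estimate: {seven_q_ten(flows):.1f}")
```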
Evaluating the O*NET Occupational Analysis System for Army Competency Development
2008-07-01
Experts (SMEs) and collecting ability and skill ratings using trained analysts. The results showed that Army SMEs as well as other types of analysts could... using trained analysts. SMEs were non-commissioned officers (NCOs) or officers with several years of experience in the Army and their occupations, and
Interdisciplinary analysis procedures in the modeling and control of large space-based structures
NASA Technical Reports Server (NTRS)
Cooper, Paul A.; Stockwell, Alan E.; Kim, Zeen C.
1987-01-01
The paper describes a computer software system called the Integrated Multidisciplinary Analysis Tool, IMAT, that has been developed at NASA Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven interactive executive program, IMAT links a relational database to commercial structural and controls analysis codes. The paper describes the procedures followed to analyze a complex satellite structure and control system. The codes used to accomplish the analysis are described, and an example is provided of an application of IMAT to the analysis of a reference space station subject to a rectangular pulse loading at its docking port.
Innovative Solution to Video Enhancement
NASA Technical Reports Server (NTRS)
2001-01-01
Through a licensing agreement, Intergraph Government Solutions adapted a technology originally developed at NASA's Marshall Space Flight Center for enhanced video imaging by developing its Video Analyst(TM) System. Marshall's scientists developed the Video Image Stabilization and Registration (VISAR) technology to help FBI agents analyze video footage of the deadly 1996 Olympic Summer Games bombing in Atlanta, Georgia. VISAR technology enhanced nighttime videotapes made with hand-held camcorders, revealing important details about the explosion. Intergraph's Video Analyst System is a simple, effective, and affordable tool for video enhancement and analysis. The benefits associated with the Video Analyst System include support of full-resolution digital video, frame-by-frame analysis, and the ability to store analog video in digital format. Up to 12 hours of digital video can be stored and maintained for reliable footage analysis. The system also includes state-of-the-art features such as stabilization, image enhancement, and convolution to help improve the visibility of subjects in the video without altering underlying footage. Adaptable to many uses, Intergraph's Video Analyst System meets the stringent demands of the law enforcement industry in the areas of surveillance, crime scene footage, sting operations, and dash-mounted video cameras.
Modernizing the Military Retirement System
2011-05-01
Patrick Gross, David Langstaff, Philip Odeen, Mark Ronald, Robert Stein, and Jack Zoeller. Catherine Whittington served as the Board Staff Analyst...
Space Debris Surfaces - Probability of no penetration versus impact velocity and obliquity
NASA Technical Reports Server (NTRS)
Elfer, N.; Meibaum, R.; Olsen, G.
1992-01-01
A collection of computer codes called Space Debris Surfaces (SD-SURF) has been developed to assist in the design and analysis of space debris protection systems. An SD-SURF analysis will show which obliquities and velocities are most likely to cause a penetration, helping the analyst select a shield design best suited to the predominant penetration mechanism. Examples of the interaction between space vehicle geometry, the space debris environment, and the penetration and critical damage ballistic limit surfaces of the shield under consideration are presented.
Two-Dimensional Grids About Airfoils and Other Shapes
NASA Technical Reports Server (NTRS)
Sorenson, R.
1982-01-01
The GRAPE computer program generates two-dimensional finite-difference grids about airfoils and other shapes by use of the Poisson differential equation. GRAPE can be used with any boundary shape, even one specified by tabulated points and including a limited number of sharp corners. Numerically stable and computationally fast, GRAPE provides the aerodynamic analyst with an efficient and consistent means of grid generation.
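For readers unfamiliar with elliptic grid generation, the sketch below shows a generic Winslow-type smoother that iterates the transformed Laplace equations on an initial algebraic grid around an annulus. It is only a simplified illustration of the class of methods GRAPE belongs to; GRAPE itself solves Poisson equations with source terms that control grid clustering, which are omitted here.

```python
# Minimal sketch: Jacobi iteration of the Winslow (transformed Laplace) grid
# equations on a simple annular region (assumed geometry; not the GRAPE code).
import numpy as np

def elliptic_smooth(x, y, iterations=200):
    """Smooth an initial structured grid by iterating the Winslow equations."""
    for _ in range(iterations):
        x_xi = 0.5 * (x[2:, 1:-1] - x[:-2, 1:-1])
        y_xi = 0.5 * (y[2:, 1:-1] - y[:-2, 1:-1])
        x_eta = 0.5 * (x[1:-1, 2:] - x[1:-1, :-2])
        y_eta = 0.5 * (y[1:-1, 2:] - y[1:-1, :-2])
        alpha = x_eta**2 + y_eta**2
        beta = x_xi * x_eta + y_xi * y_eta
        gamma = x_xi**2 + y_xi**2
        for u in (x, y):
            u_cross = 0.25 * (u[2:, 2:] - u[2:, :-2] - u[:-2, 2:] + u[:-2, :-2])
            u[1:-1, 1:-1] = (
                alpha * (u[2:, 1:-1] + u[:-2, 1:-1])
                + gamma * (u[1:-1, 2:] + u[1:-1, :-2])
                - 2.0 * beta * u_cross
            ) / (2.0 * (alpha + gamma))
    return x, y

# Initial algebraic grid between an inner circle (r = 1) and an outer circle (r = 5)
ni, nj = 41, 21
xi, eta = np.meshgrid(np.linspace(0.0, 2.0 * np.pi, ni),
                      np.linspace(1.0, 5.0, nj), indexing="ij")
x, y = eta * np.cos(xi), eta * np.sin(xi)
x, y = elliptic_smooth(x.copy(), y.copy())
```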
NASA Astrophysics Data System (ADS)
Tibi, R.; Young, C. J.; Gonzales, A.; Ballard, S.; Encarnacao, A. V.
2016-12-01
The matched filtering technique involving the cross-correlation of a waveform of interest with archived signals from a template library has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive, and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this study, we introduce an Approximate Nearest Neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation without requiring a complex distributed computing system. Our method begins with a projection into a reduced dimensionality space based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors is accomplished by using randomized K-dimensional trees. We used the approach to search for matches to each of 2700 analyst-reviewed signal detections reported for May 2010 for the IMS station MKAR. The template library in this case consists of a dataset of more than 200,000 analyst-reviewed signal detections for the same station from 2002-2014 (excluding May 2010). Of these signal detections, 60% are teleseismic first P and 15% are regional phases (Pn, Pg, Sn, and Lg). The analyses, performed on a standard desktop computer, show that the proposed approach performs the search of the large template libraries about 20 times faster than the standard full linear search, while achieving recall rates greater than 80%, with the recall rate increasing for higher correlation values. To decide whether to confirm a match, we use a hybrid method involving a cluster approach for queries with two or more matches, and a correlation score for single matches. Of the signal detections that passed our confirmation process, 52% were teleseismic first P, and 30% were regional phases.
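A hedged sketch of the core idea, projecting waveforms into a low-dimensional space of correlations with a random subset of templates and then doing a fast nearest-neighbor search, is shown below. It uses SciPy's cKDTree rather than the randomized k-dimensional trees mentioned in the abstract, and random synthetic waveforms stand in for the MKAR template library.

```python
# Minimal sketch (assumptions: equal-length, normalized waveforms; a SciPy
# KD-tree stands in for the study's randomized k-d trees; data are synthetic).
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

def normalize(w):
    w = w - w.mean()
    return w / (np.linalg.norm(w) + 1e-12)

def project(waveforms, anchors):
    """Map each waveform to its vector of correlations with the anchor templates."""
    return waveforms @ anchors.T   # rows are unit vectors, so dot product = correlation

# Template library (random stand-ins for archived, analyst-reviewed detections)
n_templates, n_samples, n_anchors = 5000, 512, 32
templates = np.array([normalize(w) for w in rng.standard_normal((n_templates, n_samples))])
anchors = templates[rng.choice(n_templates, n_anchors, replace=False)]

tree = cKDTree(project(templates, anchors))   # index the reduced-dimension features

query = normalize(templates[123] + 0.2 * rng.standard_normal(n_samples))
_, idx = tree.query(project(query[None, :], anchors), k=5)
# Confirm candidates with full correlation only for the few nearest neighbors
scores = templates[idx[0]] @ query
print(list(zip(idx[0], np.round(scores, 2))))
```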
NASA Astrophysics Data System (ADS)
Wollocko, Arthur; Danczyk, Jennifer; Farry, Michael; Jenkins, Michael; Voshell, Martin
2015-05-01
The proliferation of sensor technologies continues to impact Intelligence Analysis (IA) work domains. A historical procurement focus on sensor platform development and acquisition has resulted in increasingly advanced collection systems; however, such systems often demonstrate classic data overload conditions by placing increased burdens on already overtaxed human operators and analysts. Support technologies and improved interfaces have begun to emerge to ease that burden, but these often focus on single modalities or sensor platforms rather than underlying operator and analyst support needs, resulting in systems that do not adequately leverage their natural human attentional competencies, unique skills, and training. One particular reason why emerging support tools often fail is the gap between military applications and their functions, and the functions and capabilities afforded by cutting-edge technology employed daily by modern knowledge workers who are increasingly "digitally native." With the entry of Generation Y into these workplaces, "net generation" analysts, who are familiar with socially driven platforms that excel at giving users insight into large data sets while keeping cognitive burdens at a minimum, are creating opportunities for enhanced workflows. By using these ubiquitous platforms, net generation analysts have trained skills in discovering new information socially, tracking trends among affinity groups, and disseminating information. However, these functions are currently under-supported by existing tools. In this paper, we describe how socially driven techniques can be contextualized to frame complex analytical threads throughout the IA process. This paper focuses specifically on collaborative support technology development efforts for a team of operators and analysts. Our work focuses on under-supported functions in current working environments, and identifies opportunities to improve a team's ability to discover new information and disseminate insightful analytic findings. We describe our Cognitive Systems Engineering approach to developing a novel collaborative enterprise IA system that combines modern collaboration tools with familiar contemporary social technologies. Our current findings detail specific cognitive and collaborative work support functions that defined the design requirements for a prototype analyst collaborative support environment.
Using the living laboratory framework as a basis for understanding next-generation analyst work
NASA Astrophysics Data System (ADS)
McNeese, Michael D.; Mancuso, Vincent; McNeese, Nathan; Endsley, Tristan; Forster, Pete
2013-05-01
The preparation of next-generation analyst work requires alternative levels of understanding and new methodological departures from the way current work transpires. Current work practices typically do not provide a comprehensive approach that emphasizes the role of and interplay between (a) cognition, (b) emergent activities in a shared situated context, and (c) collaborative teamwork. In turn, effective and efficient problem solving fails to take place, and practice is often composed of piecemeal, techno-centric tools that isolate analysts by providing rigid, limited levels of understanding of situation awareness. This, coupled with the fact that many analyst activities are classified, produces a challenging situation for researching such phenomena and designing and evaluating systems to support analyst cognition and teamwork. Through our work with cyber, image, and intelligence analysts, we have realized that there is more required of researchers to study human-centered designs to provide for analysts' needs in a timely fashion. This paper identifies and describes how The Living Laboratory Framework can be utilized as a means to develop a comprehensive, human-centric, and problem-focused approach to next-generation analyst work, design, and training. We explain how the framework is utilized for specific cases in various applied settings (e.g., crisis management analysis, image analysis, and cyber analysis) to demonstrate its value and power in addressing an area of utmost importance to our national security. Attributes of analyst work settings are delineated to suggest potential design affordances that could help improve cognitive activities and awareness. Finally, the paper puts forth a research agenda for the use of the framework for future work that will move the analyst profession in a viable manner to address the concerns identified.
The application test system: Experiences to date and future plans
NASA Technical Reports Server (NTRS)
May, G. A.; Ashburn, P.; Hansen, H. L. (Principal Investigator)
1979-01-01
The ATS analysis component is presented, focusing on methods by which the varied data sources are used by the ATS analyst. Analyst training and initial processing of data are discussed, along with short- and long-term plans for the ATS.
DataQs analyst guide : best practices for federal and state agency users.
DOT National Transportation Integrated Search
2014-12-01
The DataQs Analyst Guide provides practical guidance and best practices to address and resolve Requests for Data Reviews (RDRs) submitted electronically to FMCSA by motor carriers, commercial drivers, and other persons using the DataQs system...
Degree counting and Shadow system for Toda system of rank two: One bubbling
NASA Astrophysics Data System (ADS)
Lee, Youngae; Lin, Chang-Shou; Wei, Juncheng; Yang, Wen
2018-04-01
We initiate the program for computing the Leray-Schauder topological degree for Toda systems of rank two. This program still contains many challenging problems for analysts. As the first step, we prove that if a sequence of solutions (u_1^k, u_2^k) blows up, then one of h_j e^{u_j^k} / ∫_M h_j e^{u_j^k} dv_g, j = 1, 2, tends to a sum of Dirac measures. This is the so-called phenomenon of weak concentration. Our purposes in this article are (i) to introduce the shadow system that arises from the bubbling phenomena when one of the parameters ρ_i crosses 4π while ρ_j ∉ 4πN, where 1 ≤ i ≠ j ≤ 2; (ii) to show how to calculate the topological degree of Toda systems by computing the topological degree of the general shadow systems; (iii) to calculate the topological degree of the shadow system for one-point blow up. We believe that the degree counting formula for the shadow system would be useful in other problems.
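For context, the rank-two (SU(3)) Toda system studied in this line of work is usually written as follows on a compact surface (M, g) with unit volume; the normalization below is the standard one in this literature and may differ in minor ways from the authors' conventions.

```latex
% Rank-two (SU(3)) Toda system on a compact surface (M, g), assuming vol_g(M) = 1.
\begin{aligned}
\Delta_g u_1 + 2\rho_1\left(\frac{h_1 e^{u_1}}{\int_M h_1 e^{u_1}\, dv_g} - 1\right)
             -  \rho_2\left(\frac{h_2 e^{u_2}}{\int_M h_2 e^{u_2}\, dv_g} - 1\right) &= 0,\\
\Delta_g u_2 -  \rho_1\left(\frac{h_1 e^{u_1}}{\int_M h_1 e^{u_1}\, dv_g} - 1\right)
             + 2\rho_2\left(\frac{h_2 e^{u_2}}{\int_M h_2 e^{u_2}\, dv_g} - 1\right) &= 0.
\end{aligned}
```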
The TIGER system: a Census Bureau innovation serving data analysts.
Carbaugh, L W; Marx, R W
1990-01-01
This article describes the U.S. Census Bureau's TIGER (Topologically Integrated Geographic Encoding and Referencing) system, an automated geographic data base. The emphasis is on the availability of file extracts and their usefulness to data analysts. In addition to describing the available files, it mentions various applications for the data, explains the data limitations, and notes problems encountered to date.
Travelogue--a newcomer encounters statistics and the computer.
Bruce, Peter
2011-11-01
Computer-intensive methods have revolutionized statistics, giving rise to new areas of analysis and expertise in predictive analytics, image processing, pattern recognition, machine learning, genomic analysis, and more. Interest naturally centers on the new capabilities the computer allows the analyst to bring to the table. This article, instead, focuses on the account of how computer-based resampling methods, with their relative simplicity and transparency, enticed one individual, untutored in statistics or mathematics, on a long journey into learning statistics, then teaching it, then starting an education institution.
17 CFR 200.17 - Chief Management Analyst.
Code of Federal Regulations, 2010 CFR
2010-04-01
...; CONDUCT AND ETHICS; AND INFORMATION AND REQUESTS Organization and Program Management General Organization...) Organizational structures and delegations of authority; (d) Management information systems and concepts; and (e... 17 Commodity and Securities Exchanges 2 2010-04-01 2010-04-01 false Chief Management Analyst. 200...
Preparing Florida for deployment of SafetyAnalyst for all roads.
DOT National Transportation Integrated Search
2012-05-01
SafetyAnalyst is an advanced software system designed to provide the state and local highway agencies with a comprehensive set of tools to enhance their programming of site-specific highway safety improvements. As one of the 27 states that sponsored ...
Comparative analysis of techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Hitt, E. F.; Bridgman, M. S.; Robinson, A. C.
1981-01-01
Performability analysis is a technique developed for evaluating the effectiveness of fault-tolerant computing systems in multiphase missions. Performability was evaluated for its accuracy, practical usefulness, and relative cost. The evaluation was performed by applying performability and the fault tree method to a set of sample problems ranging from simple to moderately complex. The problems involved as many as five outcomes, two to five mission phases, permanent faults, and some functional dependencies. Transient faults and software errors were not considered. A different analyst was responsible for each technique. Significantly more time and effort were required to learn performability analysis than the fault tree method. Performability is inherently as accurate as fault tree analysis. For the sample problems, fault trees were more practical and less time consuming to apply, while performability required less ingenuity and was more checkable. Performability offers some advantages for evaluating very complex problems.
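As a minimal illustration of the fault tree method referenced above (independent basic events, simple AND/OR gates), the toy example below computes a top-event probability for a duplex computer with a shared power bus; the probabilities are assumed values, and this is not one of the study's sample problems.

```python
# Minimal sketch: fault tree with independent basic events (assumed probabilities).
def and_gate(*probs):
    """All inputs must fail: product of failure probabilities."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """Any input failing causes failure: 1 - product of success probabilities."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Toy avionics example: the function fails if both redundant computers fail,
# or the shared power bus fails.
p_computer = 1e-3
p_bus = 1e-5
p_top = or_gate(and_gate(p_computer, p_computer), p_bus)
print(f"Top-event probability: {p_top:.2e}")
```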
Computer Applications and Virtual Environments (CAVE)
NASA Technical Reports Server (NTRS)
1993-01-01
Virtual Reality (VR) can provide cost-effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup are to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).
Computer Applications and Virtual Environments (CAVE)
NASA Technical Reports Server (NTRS)
1993-01-01
Virtual Reality (VR) can provide cost-effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) in Huntsville, Alabama, began to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models were used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup were to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provided general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC). The X-34 program was cancelled in 2001.
Tibi, Rigobert; Young, Christopher; Gonzales, Antonio; ...
2017-07-04
The matched filtering technique that uses the cross correlation of a waveform of interest with archived signals from a template library has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this paper, we introduce an approximate nearest neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation. Our method begins with a projection into a reduced dimensionality space, based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors for a query waveform is accomplished by iteratively comparing it with the neighbors of its immediate neighbors. We used the approach to search for matches to each of ~2300 analyst-reviewed signal detections reported in May 2010 for the International Monitoring System station MKAR. The template library in this case consists of a data set of more than 200,000 analyst-reviewed signal detections for the same station from February 2002 to July 2016 (excluding May 2010). Of these signal detections, 73% are teleseismic first P and 17% regional phases (Pn, Pg, Sn, and Lg). Finally, the analyses performed on a standard desktop computer show that the proposed ANN approach performs a search of the large template libraries about 25 times faster than the standard full linear search and achieves recall rates greater than 80%, with the recall rate increasing for higher correlation thresholds.
NASA Technical Reports Server (NTRS)
Hughes, Peter M.; Luczak, Edward C.
1991-01-01
Flight Operations Analysts (FOAs) in the Payload Operations Control Center (POCC) are responsible for monitoring a satellite's health and safety. As satellites become more complex and data rates increase, FOAs are quickly approaching a level of information saturation. The FOAs in the spacecraft control center for the COBE (Cosmic Background Explorer) satellite are currently using a fault isolation expert system named the Communications Link Expert Assistance Resource (CLEAR) to assist in isolating and correcting communications link faults. Due to the success of CLEAR and several other systems in the control center domain, many other monitoring and fault isolation expert systems will likely be developed to support control center operations during the early 1990s. To facilitate the development of these systems, a project was initiated to develop a domain-specific tool, named the Generic Spacecraft Analyst Assistant (GenSAA). GenSAA will enable spacecraft analysts to easily build simple real-time expert systems that perform spacecraft monitoring and fault isolation functions. Lessons learned during the development of several expert systems at Goddard are described; these lessons established the foundation of GenSAA's objectives and offer insight into how problems may be avoided in future projects. This is followed by a description of the capabilities, architecture, and usage of GenSAA along with a discussion of its application to future NASA missions.
The 1991 version of the plume impingement computer program. Volume 1: Description
NASA Technical Reports Server (NTRS)
Bender, Robert L.; Somers, Richard E.; Prendergast, Maurice J.; Clayton, Joseph P.; Smith, Sheldon D.
1991-01-01
The objective of this contract was to continue development of a vacuum plume impingement evaluator to provide an analyst with a capability for rapid assessment of thruster plume impingement scenarios. The research was divided into three areas: Plume Impingement Computer Program (PLIMP) modification/validation; graphics development; and documentation in the form of a Plume Handbook and PLIMP Input Guide.
R&D100 Finalist: Neuromorphic Cyber Microscope
DOE Office of Scientific and Technical Information (OSTI.GOV)
Follett, David; Naegle, John; Suppona, Roger
The Neuromorphic Cyber Microscope provides security analysts with unprecedented visibility of their network, computer and storage assets. This processor is the world's first practical application of neuromorphic technology to a major computer science mission. Working with Lewis Rhodes Labs, engineers at Sandia National Laboratories have created a device that is orders of magnitude faster at analyzing data to identify cyber-attacks.
Software for Partly Automated Recognition of Targets
NASA Technical Reports Server (NTRS)
Opitz, David; Blundell, Stuart; Bain, William; Morris, Matthew; Carlson, Ian; Mangrich, Mark; Selinsky, T.
2002-01-01
The Feature Analyst is a computer program for assisted (partially automated) recognition of targets in images. This program was developed to accelerate the processing of high-resolution satellite image data for incorporation into geographic information systems (GIS). This program creates an advanced user interface that embeds proprietary machine-learning algorithms in commercial image-processing and GIS software. A human analyst provides samples of target features from multiple sets of data, then the software develops a data-fusion model that automatically extracts the remaining features from selected sets of data. The program thus leverages the natural ability of humans to recognize objects in complex scenes, without requiring the user to explain the human visual recognition process by means of lengthy software. Two major subprograms are the reactive agent and the thinking agent. The reactive agent strives to quickly learn the user's tendencies while the user is selecting targets and to increase the user's productivity by immediately suggesting the next set of pixels that the user may wish to select. The thinking agent utilizes all available resources, taking as much time as needed, to produce the most accurate autonomous feature-extraction model possible.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.
2005-08-01
The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
2015-12-01
Army Training and Doctrine Command (TRADOC) Analysis Center (TRAC) to the Philippines for Operation ENDURING FREEDOM – Philippines (OEF-P). PROJECT...management, doctrine and force development, training management, system testing, system acquisition, decision analysis, and resource management, as...influenced procurement decisions and reshaped Army doctrine. Additionally, CAA itself has benefited in numerous ways. Combat experience provides analysts
PitScan: Computer-Assisted Feature Detection
NASA Astrophysics Data System (ADS)
Wagner, R. V.; Robinson, M. S.
2018-04-01
We developed PitScan to assist in searching the very large LROC image dataset for pits — unusual <200m wide vertical-walled holes in the Moon's surface. PitScan reduces analysts' workload by pre-filtering images to identify possible pits.
Intelligent Data Analysis in the 21st Century
NASA Astrophysics Data System (ADS)
Cohen, Paul; Adams, Niall
When IDA began, data sets were small and clean, data provenance and management were not significant issues, workflows and grid computing and cloud computing didn’t exist, and the world was not populated with billions of cellphone and computer users. The original conception of intelligent data analysis — automating some of the reasoning of skilled data analysts — has not been updated to account for the dramatic changes in what skilled data analysis means, today. IDA might update its mission to address pressing problems in areas such as climate change, habitat loss, education, and medicine. It might anticipate data analysis opportunities five to ten years out, such as customizing educational trajectories to individual students, and personalizing medical protocols. Such developments will elevate the conference and our community by shifting our focus from arbitrary measures of the performance of isolated algorithms to the practical, societal value of intelligent data analysis systems.
Design of Computer-Related Workstations in Relation to Job Functions and Productivity.
1984-12-01
Inadequate / Neutral / Adequate ratings by job category: Management 36.6% / 17.3 / 46.2; Computer Prog. 45.1 / 25.3 / 29.7; Systems Analyst 44.6 / 20.0 / 35.4; Functional Analyst ... there are variables which affect facility design and layout. In a professional computer-related environment, satisfaction with...
GenSAA: A tool for advancing satellite monitoring with graphical expert systems
NASA Technical Reports Server (NTRS)
Hughes, Peter M.; Luczak, Edward C.
1993-01-01
During numerous contacts with a satellite each day, spacecraft analysts must closely monitor real time data for combinations of telemetry parameter values, trends, and other indications that may signify a problem or failure. As satellites become more complex and the number of data items increases, this task is becoming increasingly difficult for humans to perform at acceptable performance levels. At the NASA Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these initial efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analyst Assistant (GenSAA) is being developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. Although initially domain-specific in nature, this powerful tool will support the development of highly graphical expert systems for data monitoring purposes throughout the space and commercial industry.
Combining factual and heuristic knowledge in knowledge acquisition
NASA Technical Reports Server (NTRS)
Gomez, Fernando; Hull, Richard; Karr, Clark; Hosken, Bruce; Verhagen, William
1992-01-01
A knowledge acquisition technique that combines heuristic and factual knowledge represented as two hierarchies is described. These ideas were applied to the construction of a knowledge acquisition interface to the Expert System Analyst (OPERA). The goal of OPERA is to improve the operations support of the computer network in the space shuttle launch processing system. The knowledge acquisition bottleneck lies in gathering knowledge from human experts and transferring it to OPERA. OPERA's knowledge acquisition problem is approached as a classification problem-solving task, combining this approach with the use of factual knowledge about the domain. The interface was implemented in a Symbolics workstation making heavy use of windows, pull-down menus, and other user-friendly devices.
Development of a Comprehensive Database System for Safety Analyst
Paz, Alexander; Veeramisti, Naveen; Khanal, Indira; Baker, Justin
2015-01-01
This study addressed barriers associated with the use of Safety Analyst, a state-of-the-art tool that has been developed to assist during the entire Traffic Safety Management process but that is not widely used due to a number of challenges as described in this paper. As part of this study, a comprehensive database system and tools to provide data to multiple traffic safety applications, with a focus on Safety Analyst, were developed. A number of data management tools were developed to extract, collect, transform, integrate, and load the data. The system includes consistency-checking capabilities to ensure the adequate insertion and update of data into the database. This system focused on data from roadways, ramps, intersections, and traffic characteristics for Safety Analyst. To test the proposed system and tools, data from Clark County, which is the largest county in Nevada and includes the cities of Las Vegas, Henderson, Boulder City, and North Las Vegas, was used. The database and Safety Analyst together help identify the sites with the potential for safety improvements. Specifically, this study examined the results from two case studies. The first case study, which identified sites having a potential for safety improvements with respect to fatal and all injury crashes, included all roadway elements and used default and calibrated Safety Performance Functions (SPFs). The second case study identified sites having a potential for safety improvements with respect to fatal and all injury crashes, specifically regarding intersections; it used default and calibrated SPFs as well. Conclusions were developed for the calibration of safety performance functions and the classification of site subtypes. Guidelines were provided about the selection of a particular network screening type or performance measure for network screening. PMID:26167531
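As a hedged sketch of the SPF calibration step mentioned above, the fragment below applies a generic HSM-style segment SPF and computes a local calibration factor as the ratio of total observed to total predicted crashes. The functional form, coefficients, and data are illustrative assumptions; Safety Analyst's actual SPFs, site subtypes, and calibration workflow are more involved.

```python
# Minimal sketch (assumptions: generic negative binomial SPF of the form
# N = exp(a) * AADT^b * L with illustrative coefficients and made-up data).
import numpy as np

def predicted_crashes(aadt, length_mi, a=-7.5, b=1.0):
    """Predicted crashes per year for a roadway segment from a generic SPF."""
    return np.exp(a) * aadt**b * length_mi

def calibration_factor(observed, aadt, length_mi):
    """Ratio of total observed to total predicted crashes over the calibration sites."""
    return observed.sum() / predicted_crashes(aadt, length_mi).sum()

# Example: a handful of local segments used to calibrate the default SPF
observed = np.array([3, 1, 4, 0, 2], dtype=float)        # crashes per year
aadt = np.array([12000, 8000, 20000, 5000, 15000], dtype=float)
length_mi = np.array([0.8, 1.2, 0.5, 2.0, 1.0])

C = calibration_factor(observed, aadt, length_mi)
print(f"Local calibration factor: {C:.2f}")
```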
Developing Guidelines for Assessing Visual Analytics Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholtz, Jean
2011-07-01
In this paper, we develop guidelines for evaluating visual analytic environments based on a synthesis of reviews for the entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge and from a user study with professional intelligence analysts. By analyzing the 2009 VAST Challenge reviews we gained a better understanding of what is important to our reviewers, both visualization researchers and professional analysts. We also report on a small user study with professional analysts to determine the important factors that they use in evaluating visual analysis systems. We then looked at guidelines developed by researchers in various domains and synthesized these into an initial set for use by others in the community. In a second part of the user study, we looked at guidelines for a new aspect of visual analytic systems – the generation of reports. Future visual analytic systems have been challenged to help analysts generate their reports. In our study we worked with analysts to understand the criteria they used to evaluate the quality of analytic reports. We propose that this knowledge will be useful as researchers look at systems to automate some of the report generation. Based on these efforts, we produced some initial guidelines for evaluating visual analytic environments and for evaluation of analytic reports. It is important to understand that these guidelines are initial drafts and are limited in scope because of the type of tasks for which the visual analytic systems used in the studies in this paper were designed. More research and refinement is needed by the Visual Analytics Community to provide additional evaluation guidelines for different types of visual analytic environments.
MAGIC Computer Simulation. Volume 1: User Manual
1970-07-01
vulnerability and MAGIC programs. A three-digit code is assigned to each component of the target, such as armor, gun tube; and a two-digit code is assigned to... A review of the subject MAGIC Computer Simulation User and Analyst Manuals has been conducted based upon a request received from the US Army...
NASA Technical Reports Server (NTRS)
Aucoin, P. J.; Stewart, J.; Mckay, M. F. (Principal Investigator)
1980-01-01
This document presents instructions for analysts who use the EOD-LARSYS as programmed on the Purdue University IBM 370/148 (recently replaced by the IBM 3031) computer. It presents sample applications, control cards, and error messages for all processors in the system and gives detailed descriptions of the mathematical procedures and information needed to execute the system and obtain the desired output. EOD-LARSYS is the JSC version of an integrated batch system for analysis of multispectral scanner imagery data. The data included is designed for use with the as built documentation (volume 3) and the program listings (volume 4). The system is operational from remote terminals at Johnson Space Center under the virtual machine/conversational monitor system environment.
Cutter Resource Effectiveness Evaluation (CREE) Program : A Guide for Users and Analysts
DOT National Transportation Integrated Search
1978-03-01
The Cutter Resource Effectiveness Evaluation (CREE) project has developed a sophisticated, user-oriented computer model which can evaluate the effectiveness of any existing Coast Guard craft, or the effectiveness of any of a number of proposed altern...
Overview of the land analysis system (LAS)
Quirk, Bruce K.; Olseson, Lyndon R.
1987-01-01
The Land Analysis System (LAS) is a fully integrated digital analysis system designed to support remote sensing, image processing, and geographic information systems research. LAS is being developed through a cooperative effort between the National Aeronautics and Space Administration Goddard Space Flight Center and the U.S. Geological Survey Earth Resources Observation Systems (EROS) Data Center. LAS has over 275 analysis modules capable of performing input and output, radiometric correction, geometric registration, signal processing, logical operations, data transformation, classification, spatial analysis, nominal filtering, conversion between raster and vector data types, and display manipulation of image and ancillary data. LAS is currently implemented using the Transportable Applications Executive (TAE). While TAE was designed primarily to be transportable, it still provides the necessary components for a standard user interface, terminal handling, input and output services, display management, and intersystem communications. With TAE the analyst uses the same interface to the processing modules regardless of the host computer or operating system. LAS was originally implemented at EROS on a Digital Equipment Corporation computer system under the Virtual Memory System (VMS) operating system with DeAnza displays and is presently being converted to run on a Gould Power Node and Sun workstation under the Berkeley Software Distribution (BSD) UNIX operating system.
From Franchise to Programming: Jobs in Cable Television.
ERIC Educational Resources Information Center
Stanton, Michael
1985-01-01
This article takes a look at some of the key jobs at every level of the cable industry. It discusses winning a franchise, building and running the system, and programming and production. Job descriptions include engineer, market analyst, programmers, financial analysts, strand mappers, customer service representatives, access coordinator, and studio…
2016-01-07
news. Both of these resemble typical activities of intelligence analysts in OSINT processing and production applications. We assessed two task...intelligence analysts in a number of OSINT processing and production applications. (5) Summary of the most important results In both settings
DOE Office of Scientific and Technical Information (OSTI.GOV)
BERG, MICHAEL; RILEY, MARSHALL
System assessments typically yield large quantities of data from disparate sources for an analyst to scrutinize for issues. Netmeld is used to parse input from different file formats, store the data in a common format, allow users to easily query it, and enable analysts to tie different analysis tools together using a common back-end.
28 CFR 16.96 - Exemption of Federal Bureau of Investigation Systems-limited access.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) would limit the ability of trained investigators and intelligence analysts to exercise their judgment in reporting on investigations and impede the development of criminal intelligence necessary for effective law... subsection (e)(5) would limit the ability of trained investigators and intelligence analysts to exercise...
An agile acquisition decision-support workbench for evaluating ISR effectiveness
NASA Astrophysics Data System (ADS)
Stouch, Daniel W.; Champagne, Valerie; Mow, Christopher; Rosenberg, Brad; Serrin, Joshua
2011-06-01
The U.S. Air Force is consistently evolving to support current and future operations through the planning and execution of intelligence, surveillance and reconnaissance (ISR) missions. However, it is a challenge to maintain a precise awareness of current and emerging ISR capabilities to properly prepare for future conflicts. We present a decision-support tool for acquisition managers to empirically compare ISR capabilities and approaches to employing them, thereby enabling the DoD to acquire ISR platforms and sensors that provide the greatest return on investment. We have developed an analysis environment to perform modeling and simulation-based experiments to objectively compare alternatives. First, the analyst specifies an operational scenario for an area of operations by providing terrain and threat information; a set of nominated collections; sensor and platform capabilities; and processing, exploitation, and dissemination (PED) capacities. Next, the analyst selects and configures ISR collection strategies to generate collection plans. The analyst then defines customizable measures of effectiveness or performance to compute during the experiment. Finally, the analyst empirically compares the efficacy of each solution and generates concise reports to document their conclusions, providing traceable evidence for acquisition decisions. Our capability demonstrates the utility of using a workbench environment for analysts to design and run experiments. Crafting impartial metrics enables the acquisition manager to focus on evaluating solutions based on specific military needs. Finally, the metric and collection plan visualizations provide an intuitive understanding of the suitability of particular solutions. This facilitates a more agile acquisition strategy that handles rapidly changing technology in response to current military needs.
ERIC Educational Resources Information Center
Baker, Jason R.
2017-01-01
The goals of the present action research study were to understand intelligence analysts' perceptions of weapon systems visual recognition ("vis-recce") training and to determine the impact of a Critical Thinking Training (CTT) Seminar and Formative Assessments on unit-level intelligence analysts' "vis-recce" performance at a…
Improving sensor data analysis through diverse data source integration
NASA Astrophysics Data System (ADS)
Casper, Jennifer; Albuquerque, Ronald; Hyland, Jeremy; Leveille, Peter; Hu, Jing; Cheung, Eddy; Mauer, Dan; Couture, Ronald; Lai, Barry
2009-05-01
Daily sensor data volumes are increasing from gigabytes to multiple terabytes. The manpower and resources needed to analyze the increasing amount of data are not growing at the same rate. Current volumes of diverse data, both live streaming and historical, are not fully analyzed. Analysts are left mostly to analyze the individual data sources manually. This is both time-consuming and mentally exhausting. Expanding data collections only exacerbate this problem. Improved data management techniques and analysis methods are required to process the increasing volumes of historical and live streaming data sources simultaneously. Improved techniques are needed to reduce an analyst's decision response time and to enable more intelligent and immediate situation awareness. This paper describes the Sensor Data and Analysis Framework (SDAF) system built to provide analysts with the ability to pose integrated queries on diverse live and historical data sources, and plug in needed algorithms for upstream processing and filtering. The SDAF system was inspired by input and feedback from field analysts and experts. This paper presents SDAF's capabilities, implementation, and reasoning behind implementation decisions. Finally, lessons learned from preliminary tests and deployments are captured for future work.
RAVE: Rapid Visualization Environment
NASA Technical Reports Server (NTRS)
Klumpar, D. M.; Anderson, Kevin; Simoudis, Avangelos
1994-01-01
Visualization is used in the process of analyzing large, multidimensional data sets. However, the selection and creation of visualizations that are appropriate for the characteristics of a particular data set and the satisfaction of the analyst's goals is difficult. The process consists of three tasks that are performed iteratively: generate, test, and refine. The performance of these tasks requires the utilization of several types of domain knowledge that data analysts do not often have. Existing visualization systems and frameworks do not adequately support the performance of these tasks. In this paper we present the RApid Visualization Environment (RAVE), a knowledge-based system that interfaces with commercial visualization frameworks and assists a data analyst in quickly and easily generating, testing, and refining visualizations. RAVE was used for the visualization of in situ measurement data captured by spacecraft.
Locating CVBEM collocation points for steady state heat transfer problems
Hromadka, T.V.
1985-01-01
The Complex Variable Boundary Element Method or CVBEM provides a highly accurate means of developing numerical solutions to steady state two-dimensional heat transfer problems. The numerical approach exactly solves the Laplace equation and satisfies the boundary conditions at specified points on the boundary by means of collocation. The accuracy of the approximation depends upon the nodal point distribution specified by the numerical analyst. In order to develop subsequent, refined approximation functions, four techniques for selecting additional collocation points are presented. The techniques are compared as to the governing theory, representation of the error of approximation on the problem boundary, the computational costs, and the ease of use by the numerical analyst. ?? 1985.
NASA Astrophysics Data System (ADS)
Piburn, J.; Stewart, R.; Myers, A.; Sorokine, A.; Axley, E.; Anderson, D.; Burdette, J.; Biddle, C.; Hohl, A.; Eberle, R.; Kaufman, J.; Morton, A.
2017-10-01
Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization have shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human-computer interaction experts led to a complete interface redesign that deeply immerses the analyst within an ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states and redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and inform how others may freely access the tool.
User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Coleman, Kayla; Gilkey, Lindsay N.
Sandia’s Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a physics-based computational model. This can lend efficiency and rigor to manual parameter perturbation studies already being conducted by analysts. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. Dakota algorithms enrich complex science and engineering models, enabling an analyst to answer crucial questions: Sensitivity: Which are the most important input factors or parameters entering the simulation, and how do they influence key outputs? Uncertainty: What is the uncertainty or variability in simulation output, given uncertainties in input parameters? How safe, reliable, robust, or variable is my system (quantification of margins and uncertainty, QMU)? Optimization: What parameter values yield the best performing design or operating condition, given constraints? Calibration: What models and/or parameters best match experimental data? In general, Dakota is the Consortium for Advanced Simulation of Light Water Reactors (CASL) delivery vehicle for verification, validation, and uncertainty quantification (VUQ) algorithms. It permits ready application of the VUQ methods described above to simulation codes by CASL researchers, code developers, and application engineers.
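To make the "ensemble of simulations" idea concrete, the sketch below runs a plain Monte Carlo parameter study of a toy model and summarizes output uncertainty and crude input sensitivities. It is ordinary Python offered as a generic illustration of the kind of study Dakota automates; it is not Dakota input syntax, and the model and input ranges are assumptions.

```python
# Minimal sketch (assumptions: a toy model and uniform input ranges; this is a
# plain-Python Monte Carlo study, not Dakota itself or its input format).
import numpy as np

def model(x1, x2):
    """Stand-in simulation: some scalar response of two uncertain inputs."""
    return x1**2 + 0.5 * x1 * x2 + np.sin(x2)

rng = np.random.default_rng(42)
n = 5000
x1 = rng.uniform(0.8, 1.2, n)    # uncertain parameter 1 (assumed range)
x2 = rng.uniform(-0.5, 0.5, n)   # uncertain parameter 2 (assumed range)
y = model(x1, x2)

print(f"mean = {y.mean():.3f}, std = {y.std(ddof=1):.3f}")
print(f"95% interval = [{np.quantile(y, 0.025):.3f}, {np.quantile(y, 0.975):.3f}]")
# Crude sensitivity ranking via correlation of inputs with the output
for name, x in (("x1", x1), ("x2", x2)):
    print(f"corr({name}, y) = {np.corrcoef(x, y)[0, 1]:.2f}")
```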
Human-machine interaction to disambiguate entities in unstructured text and structured datasets
NASA Astrophysics Data System (ADS)
Ward, Kevin; Davenport, Jack
2017-05-01
Creating entity network graphs is a manual, time-consuming process for an intelligence analyst. Beyond the traditional big data problems of information overload, individuals are often referred to by multiple names and shifting titles as they advance in their organizations over time, which quickly makes simple string or phonetic alignment methods for entities insufficient. Conversely, automated methods for relationship extraction and entity disambiguation typically produce questionable results with no way for users to vet results, correct mistakes, or influence the algorithm's future results. We present an entity disambiguation tool, DRADIS, which aims to bridge the gap between human-centric and machine-centric methods. DRADIS automatically extracts entities from multi-source datasets and models them as a complex set of attributes and relationships. Entities are disambiguated across the corpus using a hierarchical model executed in Spark, allowing it to scale to operational-sized data. Resolution results are presented to the analyst complete with sourcing information for each mention and relationship, allowing analysts to quickly vet the correctness of results as well as correct mistakes. Corrected results are used by the system to refine the underlying model, allowing analysts to optimize the general model to better deal with their operational data. Providing analysts with the ability to validate and correct the model to produce a system they can trust enables them to better focus their time on producing higher quality analysis products.
NASA Astrophysics Data System (ADS)
Gendron, Marlin Lee
During Mine Warfare (MIW) operations, MIW analysts perform change detection by visually comparing historical sidescan sonar imagery (SSI) collected by a sidescan sonar with recently collected SSI in an attempt to identify objects (which might be explosive mines) placed at sea since the last time the area was surveyed. This dissertation presents a data structure and three algorithms, developed by the author, that are part of an automated change detection and classification (ACDC) system. MIW analysts at the Naval Oceanographic Office are currently using ACDC to reduce the amount of time required to perform change detection. The dissertation introductory chapter gives background information on change detection and ACDC, and describes how SSI is produced from raw sonar data. Chapter 2 presents the author's Geospatial Bitmap (GB) data structure, which is capable of storing information geographically and is utilized by the three algorithms. This chapter shows that a GB data structure used in a polygon-smoothing algorithm ran between 1.3 and 48.4 times faster than a sparse matrix data structure. Chapter 3 describes the GB clustering algorithm, which is the author's repeatable, order-independent method for clustering. Results from tests performed in this chapter show that the time to cluster a set of points is not affected by the distribution or the order of the points. In Chapter 4, the author presents his real-time computer-aided detection (CAD) algorithm that automatically detects mine-like objects on the seafloor in SSI. The author ran his GB-based CAD algorithm on real SSI data, and results of these tests indicate that his real-time CAD algorithm performs comparably to or better than other non-real-time CAD algorithms. The author presents his computer-aided search (CAS) algorithm in Chapter 5. CAS helps MIW analysts locate mine-like features that are geospatially close to previously detected features. A comparison between the CAS and a great circle distance algorithm shows that the CAS performs geospatial searching 1.75 times faster on large data sets. Finally, the concluding chapter of this dissertation gives important details on how the completed ACDC system will function, and discusses the author's future research to develop additional algorithms and data structures for ACDC.
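The Geospatial Bitmap idea of storing occupancy information geographically can be sketched as a sparse grid of single-bit cells supporting fast set, test, and intersection operations; the cell size, encoding, and example coordinates below are illustrative assumptions, not the dissertation's actual design.

```python
# A minimal geospatial-bitmap sketch: a fixed-resolution lat/lon grid whose cells
# act as single bits, enabling quick membership tests and set operations between surveys.

CELL_DEG = 0.01  # grid resolution in degrees (assumed for illustration)

def cell_index(lat, lon):
    """Map a coordinate to integer (row, col) grid indices."""
    return int((lat + 90.0) / CELL_DEG), int((lon + 180.0) / CELL_DEG)

class GeoBitmap:
    def __init__(self):
        self.bits = set()          # sparse set of occupied (row, col) cells

    def set(self, lat, lon):
        self.bits.add(cell_index(lat, lon))

    def test(self, lat, lon):
        return cell_index(lat, lon) in self.bits

    def intersect(self, other):
        """Cells occupied in both bitmaps, e.g., contacts seen in two surveys."""
        return self.bits & other.bits

old_survey = GeoBitmap()
new_survey = GeoBitmap()
old_survey.set(30.123, -88.456)
new_survey.set(30.123, -88.456)
new_survey.set(30.200, -88.500)
print(len(new_survey.bits - old_survey.bits), "cell(s) with new contacts")
```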
An application of computer aided requirements analysis to a real time deep space system
NASA Technical Reports Server (NTRS)
Farny, A. M.; Morris, R. V.; Hartsough, C.; Callender, E. D.; Teichroew, D.; Chikofsky, E.
1981-01-01
The entire procedure of incorporating the requirements and goals of a space flight project into integrated, time-ordered sequences of spacecraft commands is called the uplink process. The Uplink Process Control Task (UPCT) was created to examine the uplink process and determine ways to improve it. The Problem Statement Language/Problem Statement Analyzer (PSL/PSA), designed to assist the designer/analyst/engineer in the preparation of specifications of an information system, is used as a supporting tool to aid in the analysis. Attention is given to a definition of the uplink process, the definition of PSL/PSA, the construction of a PSA database, the value of analysis to the study of the uplink process, and the PSL/PSA lessons learned.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bogen, Paul Logasa; McKenzie, Amber T; Gillen, Rob
Forensic document analysis has become an important aspect of investigation of many different kinds of crimes, from money laundering to fraud and from cybercrime to smuggling. The current workflow for analysts includes powerful tools, such as Palantir and Analyst's Notebook, for moving from evidence to actionable intelligence, and tools for finding documents among the millions of files on a hard disk, such as FTK. However, analysts are often left to sort through collections of seized documents by hand to filter out the noise from the actual evidence, a highly labor-intensive manual effort. This paper presents the Redeye Analysis Workbench, a tool to help analysts move from manual sorting of a collection of documents to performing intelligent document triage over a digital library. We discuss the tools and techniques we build upon, in addition to an in-depth discussion of our tool and how it addresses two major use cases we observed analysts performing. Finally, we also include a new layout algorithm for radial graphs that is used to visualize clusters of documents in our system.
A Method for Aircraft Concept Selection Using Multicriteria Interactive Genetic Algorithms
NASA Technical Reports Server (NTRS)
Buonanno, Michael; Mavris, Dimitri
2005-01-01
The problem of aircraft concept selection has become increasingly difficult in recent years as a result of a change from performance as the primary evaluation criterion of aircraft concepts to the current situation in which environmental effects, economics, and aesthetics must also be evaluated and considered in the earliest stages of the decision-making process. This has prompted a shift from design using historical data regression techniques for metric prediction to the use of physics-based analysis tools that are capable of analyzing designs outside of the historical database. The use of optimization methods with these physics-based tools, however, has proven difficult because of the tendency of optimizers to exploit assumptions present in the models and drive the design towards a solution which, while promising to the computer, may be infeasible due to factors not considered by the computer codes. In addition to this difficulty, the number of discrete options available at this stage may be unmanageable due to the combinatorial nature of the concept selection problem, leading the analyst to arbitrarily choose a sub-optimum baseline vehicle. These concept decisions, such as the type of control surface scheme to use, though extremely important, are frequently made without sufficient understanding of their impact on the important system metrics because of a lack of computational resources or analysis tools. This paper describes a hybrid subjective/quantitative optimization method and its application to the concept selection of a Small Supersonic Transport. The method uses Genetic Algorithms to operate on a population of designs and promote improvement by varying more than sixty parameters governing the vehicle geometry, mission, and requirements. In addition to using computer codes for evaluation of quantitative criteria such as gross weight, expert input is also considered to account for criteria such as aeroelasticity or manufacturability which may be impossible or too computationally expensive to consider explicitly in the analysis. Results indicate that concepts resulting from the use of this method represent designs which are promising to both the computer and the analyst, and that a mapping between concepts and requirements that would not otherwise be apparent is revealed.
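A minimal sketch of the hybrid subjective/quantitative idea follows: a small genetic algorithm whose fitness blends a quantitative surrogate with a stand-in "expert" preference score. The objective functions, weighting, and operators are invented for illustration and do not reproduce the paper's method.

```python
import random

random.seed(1)

def quantitative_score(x):
    """Hypothetical physics-based metric, e.g., a negative gross-weight surrogate."""
    return -sum((xi - 0.6) ** 2 for xi in x)

def expert_score(x):
    """Stand-in for an interactive analyst rating (here a canned preference)."""
    return -abs(x[0] - x[1])   # the 'analyst' prefers balanced designs

def fitness(x, w_expert=0.3):
    return (1 - w_expert) * quantitative_score(x) + w_expert * expert_score(x)

def evolve(pop_size=20, n_vars=4, generations=30):
    pop = [[random.random() for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                 # keep the fitter half
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            # blend crossover plus small Gaussian mutation
            children.append([(ai + bi) / 2 + random.gauss(0, 0.05) for ai, bi in zip(a, b)])
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())
```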
QuEST for malware type-classification
NASA Astrophysics Data System (ADS)
Vaughan, Sandra L.; Mills, Robert F.; Grimaila, Michael R.; Peterson, Gilbert L.; Oxley, Mark E.; Dube, Thomas E.; Rogers, Steven K.
2015-05-01
Current cyber-related security and safety risks are unprecedented, due in no small part to information overload and skilled cyber-analyst shortages. Advances in decision support and Situation Awareness (SA) tools are required to support analysts in risk mitigation. Inspired by human intelligence, research in Artificial Intelligence (AI) and Computational Intelligence (CI) has provided successful engineering solutions in complex domains including cyber. Current AI approaches aggregate large volumes of data to infer the general from the particular, i.e. inductive reasoning (pattern-matching), and generally cannot infer answers not previously programmed. Humans, by contrast, are rarely able to reason over large volumes of data, yet have successfully reached the top of the food chain by inferring situations from partial or even partially incorrect information, i.e. abductive reasoning (pattern-completion): generating a hypothetical explanation of observations. In order to achieve an engineering advantage in computational decision support and SA, we leverage recent research in human consciousness and the role consciousness plays in decision making, modeling qualia, the units of subjective experience which generate consciousness. This paper introduces a novel computational implementation of a Cognitive Modeling Architecture (CMA) which incorporates concepts of consciousness. We apply our model to the malware type-classification task. The underlying methodology and theories are generalizable to many domains.
Analysis of field test data on residential heating and cooling
NASA Astrophysics Data System (ADS)
Talbert, S. G.
1980-12-01
The computer program, using field site data collected on 48 homes located in six cities in different climatic regions of the United States, is discussed. In addition, a User's Guide was prepared for the computer program, which is contained in a separate two-volume document entitled User's Guide for REAP: Residential Energy Analysis Program. Feasibility studies were conducted pertaining to potential improvements for REAP, including: the addition of an oil-furnace model; improving the infiltration subroutine; adding active and/or passive solar subroutines; incorporating a thermal energy storage model; and providing dual HVAC systems (e.g., heat pump-gas furnace). The purpose of REAP is to enable building designers and energy analysts to evaluate how such factors as building design, weather conditions, internal heat loads, and HVAC equipment performance influence the energy requirements of residential buildings.
GASP- General Aviation Synthesis Program. Volume 1: Main program. Part 1: Theoretical development
NASA Technical Reports Server (NTRS)
Hague, D.
1978-01-01
The General Aviation Synthesis Program performs tasks generally associated with aircraft preliminary design and allows an analyst the capability of performing parametric studies in a rapid manner. GASP emphasizes small fixed-wing aircraft employing propulsion systems varying from a single piston engine with fixed pitch propeller through twin turboprop/turbofan powered business or transport type aircraft. The program, which may be operated from a computer terminal in either the batch or interactive graphic mode, is comprised of modules representing the various technical disciplines integrated into a computational flow which ensures that the interacting effects of design variables are continuously accounted for in the aircraft sizing procedure. The model is a useful tool for comparing configurations, assessing aircraft performance and economics, performing tradeoff and sensitivity studies, and assessing the impact of advanced technologies on aircraft performance and economics.
Learning patterns of life from intelligence analyst chat
NASA Astrophysics Data System (ADS)
Schneider, Michael K.; Alford, Mark; Babko-Malaya, Olga; Blasch, Erik; Chen, Lingji; Crespi, Valentino; HandUber, Jason; Haney, Phil; Nagy, Jim; Richman, Mike; Von Pless, Gregory; Zhu, Howie; Rhodes, Bradley J.
2016-05-01
Our Multi-INT Data Association Tool (MIDAT) learns patterns of life (POL) of a geographical area from video analyst observations called out in textual reporting. Typical approaches to learning POLs from video make use of computer vision algorithms to extract locations in space and time of various activities. Such approaches are subject to the detection and tracking performance of the video processing algorithms. Numerous examples of human analysts monitoring live video streams annotating or "calling out" relevant entities and activities exist, such as security analysis, crime-scene forensics, news reports, and sports commentary. This user description typically corresponds with textual capture, such as chat. Although the purpose of these text products is primarily to describe events as they happen, organizations typically archive the reports for extended periods. This archive provides a basis to build POLs. Such POLs are useful for diagnosis to assess activities in an area based on historical context, and for consumers of products, who gain an understanding of historical patterns. MIDAT combines natural language processing, multi-hypothesis tracking, and Multi-INT Activity Pattern Learning and Exploitation (MAPLE) technologies in an end-to-end lab prototype that processes textual products produced by video analysts, infers POLs, and highlights anomalies relative to those POLs with links to "tracks" of related activities performed by the same entity. MIDAT technologies perform well, achieving, for example, a 90% F1-value on extracting activities from the textual reports.
Intra- and inter-rater reliability of digital image analysis for skin color measurement
Sommers, Marilyn; Beacham, Barbara; Baker, Rachel; Fargo, Jamison
2013-01-01
Background We determined the intra- and inter-rater reliability of data from digital image color analysis between an expert and novice analyst. Methods Following training, the expert and novice independently analyzed 210 randomly ordered images. Both analysts used Adobe® Photoshop lasso or color sampler tools based on the type of image file. After color correction with Pictocolor® in camera software, they recorded L*a*b* (L*=light/dark; a*=red/green; b*=yellow/blue) color values for all skin sites. We computed intra-rater and inter-rater agreement within anatomical region, color value (L*, a*, b*), and technique (lasso, color sampler) using a series of one-way intra-class correlation coefficients (ICCs). Results Results of ICCs for intra-rater agreement showed high levels of internal consistency reliability within each rater for the lasso technique (ICC ≥ 0.99) and somewhat lower, yet acceptable, level of agreement for the color sampler technique (ICC = 0.91 for expert, ICC = 0.81 for novice). Skin L*, skin b*, and labia L* values reached the highest level of agreement (ICC ≥ 0.92) and skin a*, labia b*, and vaginal wall b* were the lowest (ICC ≥ 0.64). Conclusion Data from novice analysts can achieve high levels of agreement with data from expert analysts with training and the use of a detailed, standard protocol. PMID:23551208
Intra- and inter-rater reliability of digital image analysis for skin color measurement.
Sommers, Marilyn; Beacham, Barbara; Baker, Rachel; Fargo, Jamison
2013-11-01
We determined the intra- and inter-rater reliability of data from digital image color analysis between an expert and novice analyst. Following training, the expert and novice independently analyzed 210 randomly ordered images. Both analysts used Adobe(®) Photoshop lasso or color sampler tools based on the type of image file. After color correction with Pictocolor(®) in camera software, they recorded L*a*b* (L*=light/dark; a*=red/green; b*=yellow/blue) color values for all skin sites. We computed intra-rater and inter-rater agreement within anatomical region, color value (L*, a*, b*), and technique (lasso, color sampler) using a series of one-way intra-class correlation coefficients (ICCs). Results of ICCs for intra-rater agreement showed high levels of internal consistency reliability within each rater for the lasso technique (ICC ≥ 0.99) and somewhat lower, yet acceptable, level of agreement for the color sampler technique (ICC = 0.91 for expert, ICC = 0.81 for novice). Skin L*, skin b*, and labia L* values reached the highest level of agreement (ICC ≥ 0.92) and skin a*, labia b*, and vaginal wall b* were the lowest (ICC ≥ 0.64). Data from novice analysts can achieve high levels of agreement with data from expert analysts with training and the use of a detailed, standard protocol. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
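For reference, a one-way intra-class correlation of the kind reported above can be computed from an images-by-analysts matrix of color values as sketched below; the ratings are invented for illustration and the analysis is a simplified stand-in for the full study design.

```python
# One-way intra-class correlation, ICC(1), from a small ratings matrix.
import numpy as np

ratings = np.array([      # rows = images (targets), columns = the two analysts
    [61.2, 60.8],
    [55.0, 54.1],
    [70.3, 69.9],
    [48.7, 50.2],
    [66.5, 66.0],
])

n, k = ratings.shape
grand_mean = ratings.mean()
target_means = ratings.mean(axis=1)

# One-way ANOVA mean squares: between targets and within targets.
ms_between = k * ((target_means - grand_mean) ** 2).sum() / (n - 1)
ms_within = ((ratings - target_means[:, None]) ** 2).sum() / (n * (k - 1))

icc1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(f"ICC(1) = {icc1:.3f}")
```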
Software for Partly Automated Recognition of Targets
NASA Technical Reports Server (NTRS)
Opitz, David; Blundell, Stuart; Bain, William; Morris, Matthew; Carlson, Ian; Mangrich, Mark
2003-01-01
The Feature Analyst is a computer program for assisted (partially automated) recognition of targets in images. This program was developed to accelerate the processing of high-resolution satellite image data for incorporation into geographic information systems (GIS). This program creates an advanced user interface that embeds proprietary machine-learning algorithms in commercial image-processing and GIS software. A human analyst provides samples of target features from multiple sets of data, then the software develops a data-fusion model that automatically extracts the remaining features from selected sets of data. The program thus leverages the natural ability of humans to recognize objects in complex scenes, without requiring the user to explain the human visual recognition process by means of lengthy software. Two major subprograms are the reactive agent and the thinking agent. The reactive agent strives to quickly learn the user's tendencies while the user is selecting targets and to increase the user's productivity by immediately suggesting the next set of pixels that the user may wish to select. The thinking agent utilizes all available resources, taking as much time as needed, to produce the most accurate autonomous feature-extraction model possible.
Ko, Sungahn; Zhao, Jieqiong; Xia, Jing; Afzal, Shehzad; Wang, Xiaoyu; Abram, Greg; Elmqvist, Niklas; Kne, Len; Van Riper, David; Gaither, Kelly; Kennedy, Shaun; Tolone, William; Ribarsky, William; Ebert, David S
2014-12-01
We present VASA, a visual analytics platform consisting of a desktop application, a component model, and a suite of distributed simulation components for modeling the impact of societal threats such as weather, food contamination, and traffic on critical infrastructure such as supply chains, road networks, and power grids. Each component encapsulates a high-fidelity simulation model that together form an asynchronous simulation pipeline: a system of systems of individual simulations with a common data and parameter exchange format. At the heart of VASA is the Workbench, a visual analytics application providing three distinct features: (1) low-fidelity approximations of the distributed simulation components using local simulation proxies to enable analysts to interactively configure a simulation run; (2) computational steering mechanisms to manage the execution of individual simulation components; and (3) spatiotemporal and interactive methods to explore the combined results of a simulation run. We showcase the utility of the platform using examples involving supply chains during a hurricane as well as food contamination in a fast food restaurant chain.
Reproducibility of apatite fission-track length data and thermal history reconstruction
NASA Astrophysics Data System (ADS)
Ketcham, Richard A.; Donelick, Raymond A.; Balestrieri, Maria Laura; Zattin, Massimiliano
2009-07-01
The ability to derive detailed thermal history information from apatite fission-track analysis is predicated on the reliability of track length measurements. However, insufficient attention has been given to whether and how these measurements should be standardized. In conjunction with a fission-track workshop we conducted an experiment in which 11 volunteers measured ~50 track lengths on one or two samples. One mount contained Durango apatite with unannealed induced tracks, and one contained apatite from a crystalline rock containing spontaneous tracks with a broad length distribution caused by partial resetting. Results for both mounts showed scatter indicative of differences in measurement technique among the individual analysts. The effects of this variability on thermal history inversion were tested using the HeFTy computer program to model the spontaneous track measurements. A cooling-only scenario and a reheating scenario more consistent with the sample's geological history were posed. When a uniform initial length value from the literature was used, results among analysts were very inconsistent in both scenarios, although normalizing for track angle by projecting all lengths to a c-axis parallel crystallographic orientation improved some aspects of congruency. When the induced track measurement was used as the basis for thermal history inversion, congruency among analysts, and agreement with inversions based on previously collected data, improved significantly. Further improvement was obtained by using c-axis projection. Differences among inversions that persisted could be traced to differential sampling of long- and short-track populations among analysts. The results of this study, while demonstrating the robustness of apatite fission-track thermal history inversion, nevertheless point to the necessity for a standardized length calibration schema that accounts for analyst variation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, S.E.
This paper highlights the use of the CAPE-OPEN (CO) standard interfaces in the Advanced Process Engineering Co-Simulator (APECS) developed at the National Energy Technology Laboratory (NETL). The APECS system uses the CO unit operation, thermodynamic, and reaction interfaces to provide its plug-and-play co-simulation capabilities, including the integration of process simulation with computational fluid dynamics (CFD) simulation. APECS also relies heavily on the use of a CO COM/CORBA bridge for running process/CFD co-simulations on multiple operating systems. For process optimization in the face of multiple and sometimes conflicting objectives, APECS offers stochastic modeling and multi-objective optimization capabilities developed to comply with the CO software standard. At NETL, systems analysts are applying APECS to a wide variety of advanced power generation systems, ranging from small fuel cell systems to commercial-scale power plants including the coal-fired, gasification-based FutureGen power and hydrogen production plant.
Federal Employees: Appointees Converted to Career Positions, July through September 1988
1989-01-13
Table excerpt: appointee-to-career conversion listings giving position titles and GS grade/step before and after conversion, including Program Analyst, Legislative Affairs/Congressional Liaison Officer, and Merit Systems Protection Board Writer/Editor entries.
Disability Evaluation System Analysis and Research Annual Report 2017
2017-11-20
Personnel listing excerpt: AMSARA program managers and public health analysts (ManTech Health contractors), Preventive Medicine Branch, Walter Reed Army Institute of Research, Forest Glen Annex.
A generalized baleen whale call detection and classification system.
Baumgartner, Mark F; Mussoline, Sarah E
2011-05-01
Passive acoustic monitoring allows the assessment of marine mammal occurrence and distribution at greater temporal and spatial scales than is now possible with traditional visual surveys. However, the large volume of acoustic data and the lengthy and laborious task of manually analyzing these data have hindered broad application of this technique. To overcome these limitations, a generalized automated detection and classification system (DCS) was developed to efficiently and accurately identify low-frequency baleen whale calls. The DCS (1) accounts for persistent narrowband and transient broadband noise, (2) characterizes temporal variation of dominant call frequencies via pitch-tracking, and (3) classifies calls based on attributes of the resulting pitch tracks using quadratic discriminant function analysis (QDFA). Automated detections of sei whale (Balaenoptera borealis) downsweep calls and North Atlantic right whale (Eubalaena glacialis) upcalls were evaluated using recordings collected in the southwestern Gulf of Maine during the spring seasons of 2006 and 2007. The accuracy of the DCS was similar to that of a human analyst: variability in differences between the DCS and an analyst was similar to that between independent analysts, and temporal variability in call rates was similar among the DCS and several analysts.
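A minimal sketch of the classification step, quadratic discriminant analysis over pitch-track attributes, is shown below using scikit-learn; the feature set, call values, and regularization are illustrative assumptions, not the DCS implementation.

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

# Columns: [duration_s, start_freq_hz, frequency_slope_hz_per_s] for each detected call.
X = np.array([
    [1.4,  82, -30], [1.6,  90, -35], [1.3,  85, -28], [1.5,  88, -33],   # downsweep-like calls
    [0.8, 120,  60], [0.7, 110,  55], [0.9, 130,  65], [0.6, 115,  50],   # upcall-like calls
])
y = np.array(["sei", "sei", "sei", "sei", "right", "right", "right", "right"])

# Regularization keeps the per-class covariance estimates stable for this tiny sample.
clf = QuadraticDiscriminantAnalysis(reg_param=0.1).fit(X, y)
print(clf.predict([[1.45, 86, -31], [0.75, 118, 58]]))
```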
Real-Time Visualization of Network Behaviors for Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.
Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.
Calibration Issues and Operating System Requirements for Electron-Probe Microanalysis
NASA Technical Reports Server (NTRS)
Carpenter, P.
2006-01-01
Instrument purchase requirements and dialogue with manufacturers have established hardware parameters for alignment, stability, and reproducibility, which have helped improve the precision and accuracy of electron microprobe analysis (EPMA). The development of correction algorithms and the accurate solution to quantitative analysis problems requires the minimization of systematic errors and relies on internally consistent data sets. Improved hardware and computer systems have resulted in better automation of vacuum systems, stage and wavelength-dispersive spectrometer (WDS) mechanisms, and x-ray detector systems which have improved instrument stability and precision. Improved software now allows extended automated runs involving diverse setups and better integrates digital imaging and quantitative analysis. However, instrumental performance is not regularly maintained, as WDS are aligned and calibrated during installation but few laboratories appear to check and maintain this calibration. In particular, detector deadtime (DT) data is typically assumed rather than measured, due primarily to the difficulty and inconvenience of the measurement process. This is a source of fundamental systematic error in many microprobe laboratories and is unknown to the analyst, as the magnitude of DT correction is not listed in output by microprobe operating systems. The analyst must remain vigilant to deviations in instrumental alignment and calibration, and microprobe system software must conveniently verify the necessary parameters. Microanalysis of mission critical materials requires an ongoing demonstration of instrumental calibration. Possible approaches to improvements in instrument calibration, quality control, and accuracy will be discussed. Development of a set of core requirements based on discussions with users, researchers, and manufacturers can yield documents that improve and unify the methods by which instruments can be calibrated. These results can be used to continue improvements of EPMA.
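To illustrate why an assumed rather than measured deadtime is a real source of systematic error, the sketch below applies the standard non-paralyzable deadtime correction at a fixed observed count rate for several candidate deadtime values; the rates and deadtimes are illustrative assumptions, not values from any particular instrument.

```python
# Non-paralyzable deadtime correction: N_true = N_obs / (1 - N_obs * tau).

def deadtime_correct(observed_cps, tau_s):
    return observed_cps / (1.0 - observed_cps * tau_s)

observed = 50_000.0                    # counts per second seen by the detector (assumed)
for tau_us in (1.0, 1.5, 2.0):         # candidate deadtimes in microseconds (assumed)
    true_rate = deadtime_correct(observed, tau_us * 1e-6)
    print(f"tau = {tau_us:.1f} us -> corrected rate = {true_rate:,.0f} cps "
          f"({100 * (true_rate / observed - 1):.1f}% correction)")
```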
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Zhenhua; Rose, Adam Z.; Prager, Fynnwin
The state of the art approach to economic consequence analysis (ECA) is computable general equilibrium (CGE) modeling. However, such models contain thousands of equations and cannot readily be incorporated into computerized systems used by policy analysts to yield estimates of economic impacts of various types of transportation system failures due to natural hazards, human-related attacks, or technological accidents. This paper presents a reduced-form approach to simplify the analytical content of CGE models to make them more transparent and enhance their utilization potential. The reduced-form CGE analysis is conducted by first running simulations one hundred times, varying key parameters, such as magnitude of the initial shock, duration, location, remediation, and resilience, according to a Latin Hypercube sampling procedure. Statistical analysis is then applied to the "synthetic data" results in the form of both ordinary least squares and quantile regression. The analysis yields linear equations that are incorporated into a computerized system and utilized along with Monte Carlo simulation methods for propagating uncertainties in economic consequences. Although our demonstration and discussion focuses on aviation system disruptions caused by terrorist attacks, the approach can be applied to a broad range of threat scenarios.
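The reduced-form workflow (Latin Hypercube sampling of key parameters, a batch of model runs, then a regression fit to the synthetic results) can be sketched as below; the toy "model", parameter set, and coefficients are assumptions for illustration only, not the paper's CGE model.

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, n_dims):
    """Basic Latin Hypercube design on the unit hypercube."""
    samples = np.empty((n_samples, n_dims))
    for j in range(n_dims):
        perm = rng.permutation(n_samples)
        samples[:, j] = (perm + rng.random(n_samples)) / n_samples
    return samples

def toy_model(shock, duration, resilience):
    """Stand-in for a full model run: loss grows with shock and duration, shrinks with resilience."""
    return 5.0 * shock + 2.0 * duration - 3.0 * resilience + rng.normal(0, 0.1)

X = latin_hypercube(100, 3)                      # columns: shock, duration, resilience
y = np.array([toy_model(*row) for row in X])

# Ordinary least squares on the synthetic data yields the reduced-form linear equation.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept, shock, duration, resilience coefficients:", np.round(coef, 2))
```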
Situational Awareness of Network System Roles (SANSR)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huffer, Kelly M; Reed, Joel W
In a large enterprise it is difficult for cyber security analysts to know what services and roles every machine on the network is performing (e.g., file server, domain name server, email server). Using network flow data, already collected by most enterprises, we developed a proof-of-concept tool that discovers the roles of a system using both clustering and categorization techniques. The tool's role information would allow cyber analysts to detect consequential changes in the network, initiate incident response plans, and optimize their security posture. The results of this proof-of-concept tool proved to be quite accurate on three real data sets. We will present the algorithms used in the tool, describe the results of preliminary testing, provide visualizations of the results, and discuss areas for future work. Without this kind of situational awareness, cyber analysts cannot quickly diagnose an attack or prioritize remedial actions.
1990-10-01
involving a heavy artillery barrage, the impact point output alone could consume upwards of 10,000 pages of computer paper. For this reason, AURA provides...but pervasive factor: the asset allocation model must be compatible with the mathematical behavior of the input data. Thus, for example, if assets are...described as expendable during repair or decontamination activities, it must have HOMELINKS which appear in the consuming repair SUBCHAINs
NASA Technical Reports Server (NTRS)
Smith, Leigh M.; Parker, Nelson C. (Technical Monitor)
2002-01-01
This paper analyzes the use of Computer Aided Design (CAD) packages at NASA's Marshall Space Flight Center (MSFC). It examines the effectiveness of recent efforts to standardize CAD practices across MSFC engineering activities. An assessment of the roles played by management, designers, analysts, and manufacturers in this initiative will be explored. Finally, solutions are presented for better integration of CAD across MSFC in the future.
SnapShot: Visualization to Propel Ice Hockey Analytics.
Pileggi, H; Stolper, C D; Boyle, J M; Stasko, J T
2012-12-01
Sports analysts live in a world of dynamic games flattened into tables of numbers, divorced from the rinks, pitches, and courts where they were generated. Currently, these professional analysts use R, Stata, SAS, and other statistical software packages for uncovering insights from game data. Quantitative sports consultants seek a competitive advantage both for their clients and for themselves as analytics becomes increasingly valued by teams, clubs, and squads. In order for the information visualization community to support the members of this blossoming industry, it must recognize where and how visualization can enhance the existing analytical workflow. In this paper, we identify three primary stages of today's sports analyst's routine where visualization can be beneficially integrated: 1) exploring a dataspace; 2) sharing hypotheses with internal colleagues; and 3) communicating findings to stakeholders. Working closely with professional ice hockey analysts, we designed and built SnapShot, a system to integrate visualization into the hockey intelligence gathering process. SnapShot employs a variety of information visualization techniques to display shot data, yet given the importance of a specific hockey statistic, shot length, we introduce a technique, the radial heat map. Through a user study, we received encouraging feedback from several professional analysts, both independent consultants and professional team personnel.
Development of Automatic Control of Bayer Plant Digestion
NASA Astrophysics Data System (ADS)
Riffaud, J. P.
Supervisory computer control has been achieved in Alcan's Bayer Plants at Arvida, Quebec, Canada. The purpose of the automatic control system is to stabilize, and consequently increase, the alumina/caustic ratio within the digester train and in the blow-off liquor. Measurements of the electrical conductivity of the liquor are obtained from electrodeless conductivity meters. These signals, along with several others, are scanned by the computer and converted to engineering units, using specific relationships which are updated periodically for calibration purposes. On regular time intervals, values of ratio are compared to target values and adjustments are made to the bauxite flow entering the digesters. Dead time compensation included in the control algorithm enables a faster rate for corrections. Modification of production rate is achieved through careful timing of various flow changes. Calibration of the conductivity meters is achieved by sampling at intervals the liquor flowing through them, and analysing it with a thermometric titrator. Calibration of the thermometric titrator is done at intervals with a standard solution. Calculations for both calibrations are performed by computer from data entered by the analyst. The computer was used for on-line data collection, modelling of the digester system, calculation of disturbances, and simulation of control strategies before implementing the most successful strategy in the Plant. Control of ratio has been improved by the integrated system, resulting in increased Plant productivity.
NASA Technical Reports Server (NTRS)
Hamilton, M. L.; Burriss, W. L.
1972-01-01
Detailed cycle steady-state performance data are presented for the final auxiliary power unit (APU) system configuration. The selection configuration is a hydrogen-oxygen APU incorporating a recuperator to utilize the exhaust energy and using the cycle hydrogen flow as a means of cooling the component heat loads. The data are given in the form of computer printouts and provide the following: (1) verification of the adequacy of the design to meet the problem statement for steady-state performance; (2) overall system performance data for the vehicle system analyst to determine propellant consumption and hydraulic fluid temperature as a function for varying mission profiles, propellant inlet conditions, etc.; and (3) detailed component performance and cycle state point data to show what is happening in the cycle as a function of the external forcing functions.
Simple uncertainty propagation for early design phase aircraft sizing
NASA Astrophysics Data System (ADS)
Lenz, Annelise
Many designers and systems analysts are aware of the uncertainty inherent in their aircraft sizing studies; however, few incorporate methods to address and quantify this uncertainty. Many aircraft design studies use semi-empirical predictors based on a historical database and contain uncertainty -- a portion of which can be measured and quantified. In cases where historical information is not available, surrogate models built from higher-fidelity analyses often provide predictors for design studies where the computational cost of directly using the high-fidelity analyses is prohibitive. These surrogate models contain uncertainty, some of which is quantifiable. However, rather than quantifying this uncertainty, many designers merely include a safety factor or design margin in the constraints to account for the variability between the predicted and actual results. This can become problematic if a designer does not estimate the amount of variability correctly, which then can result in either an "over-designed" or "under-designed" aircraft. "Under-designed" and some "over-designed" aircraft will likely require design changes late in the process and will ultimately require more time and money to create; other "over-designed" aircraft concepts may not require design changes, but could end up being more costly than necessary. Including and propagating uncertainty early in the design phase so designers can quantify some of the errors in the predictors could help mitigate the extent of this additional cost. The method proposed here seeks to provide a systematic approach for characterizing a portion of the uncertainties that designers are aware of and propagating it throughout the design process in a procedure that is easy to understand and implement. Using Monte Carlo simulations that sample from quantified distributions will allow a systems analyst to use a carpet plot-like approach to make statements like: "The aircraft is 'P'% likely to weigh 'X' lbs or less, given the uncertainties quantified" without requiring the systems analyst to have substantial knowledge of probabilistic methods. A semi-empirical sizing study of a small single-engine aircraft serves as an example of an initial version of this simple uncertainty propagation. The same approach is also applied to a variable-fidelity concept study using a NASA-developed transonic Hybrid Wing Body aircraft.
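A minimal sketch of the proposed propagation step follows: sample the quantified input uncertainties, evaluate a stand-in sizing relation for each draw, and read off percentiles to make statements of the form quoted above. The weight relation and distributions are invented for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Assumed input uncertainties for two sizing parameters.
empty_weight_factor = rng.normal(0.62, 0.03, n)     # uncertain empirical regression factor
specific_fuel_burn = rng.normal(0.45, 0.05, n)      # uncertain engine parameter
payload = 800.0                                      # fixed requirement, lb

# Toy sizing relation standing in for the semi-empirical predictor chain.
gross_weight = (payload + 1200.0 * specific_fuel_burn) / (1.0 - empty_weight_factor)

for p in (50, 90, 95):
    print(f"The aircraft is {p}% likely to weigh {np.percentile(gross_weight, p):,.0f} lb or less")
```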
Model documentation report: Residential sector demand module of the national energy modeling system
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This report documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Residential Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, and FORTRAN source code. This reference document provides a detailed description for energy analysts, other users, and the public. The NEMS Residential Sector Demand Module is currently used for mid-term forecasting purposes and energy policy analysis over the forecast horizon of 1993 through 2020. The model generates forecasts of energy demand for the residential sector by service, fuel, and Census Division. Policy impacts resulting from new technologies, market incentives, and regulatory changes can be estimated using the module. 26 refs., 6 figs., 5 tabs.
NASA Astrophysics Data System (ADS)
Budde, M. E.; Rowland, J.; Anthony, M.; Palka, S.; Martinez, J.; Hussain, R.
2017-12-01
The U.S. Geological Survey (USGS) supports the use of Earth observation data for food security monitoring through its role as an implementing partner of the Famine Early Warning Systems Network (FEWS NET). The USGS Earth Resources Observation and Science (EROS) Center has developed tools designed to aid food security analysts in developing assumptions of agro-climatological outcomes. There are four primary steps to developing agro-climatology assumptions: 1) understanding the climatology, 2) evaluating current climate modes, 3) interpretation of forecast information, and 4) incorporation of monitoring data. Analysts routinely forecast outcomes well in advance of the growing season, which relies on knowledge of climatology. A few months prior to the growing season, analysts can assess large-scale climate modes that might influence seasonal outcomes. Within two months of the growing season, analysts can evaluate seasonal forecast information as indicators. Once the growing season begins, monitoring data, based on remote sensing and field information, can characterize the start of season and remain integral monitoring tools throughout the duration of the season. Each subsequent step in the process can lead to modifications of the original climatology assumption. To support such analyses, we have created an agro-climatology analysis tool that characterizes each step in the assumption building process. Satellite-based rainfall and normalized difference vegetation index (NDVI)-based products support both the climatology and monitoring steps, sea-surface temperature data and knowledge of the global climate system inform the climate modes, and precipitation forecasts at multiple scales support the interpretation of forecast information. Organizing these data for a user-specified area provides a valuable tool for food security analysts to better formulate agro-climatology assumptions that feed into food security assessments. We have also developed a knowledge base for over 80 countries that provides rainfall and NDVI-based products, including annual and seasonal summaries, historical anomalies, coefficient of variation, and number of years below 70% of annual or seasonal averages. These products provide a quick look for analysts to assess the agro-climatology of a country.
NASA Astrophysics Data System (ADS)
Bianchetti, Raechel Anne
Remotely sensed images have become a ubiquitous part of our daily lives. From novice users, aiding in search and rescue missions using tools such as TomNod, to trained analysts, synthesizing disparate data to address complex problems like climate change, imagery has become central to geospatial problem solving. Expert image analysts are continually faced with rapidly developing sensor technologies and software systems. In response to these cognitively demanding environments, expert analysts develop specialized knowledge and analytic skills to address increasingly complex problems. This study identifies the knowledge, skills, and analytic goals of expert image analysts tasked with identification of land cover and land use change. Analysts participating in this research are currently working as part of a national level analysis of land use change, and are well versed with the use of TimeSync, forest science, and image analysis. The results of this study benefit current analysts as it improves their awareness of their mental processes used during the image interpretation process. The study also can be generalized to understand the types of knowledge and visual cues that analysts use when reasoning with imagery for purposes beyond land use change studies. Here a Cognitive Task Analysis framework is used to organize evidence from qualitative knowledge elicitation methods for characterizing the cognitive aspects of the TimeSync image analysis process. Using a combination of content analysis, diagramming, semi-structured interviews, and observation, the study highlights the perceptual and cognitive elements of expert remote sensing interpretation. Results show that image analysts perform several standard cognitive processes, but flexibly employ these processes in response to various contextual cues. Expert image analysts' ability to think flexibly during their analysis process was directly related to their amount of image analysis experience. Additionally, results show that the basic Image Interpretation Elements continue to be important despite technological augmentation of the interpretation process. These results are used to derive a set of design guidelines for developing geovisual analytic tools and training to support image analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Judith Alice; Long, Kevin Nicholas
2018-05-01
Sylgard® 184/Glass Microballoon (GMB) potting material is currently used in many NW systems. Analysts need a macroscale constitutive model that can predict material behavior under complex loading and damage evolution. To address this need, ongoing modeling and experimental efforts have focused on study of damage evolution in these materials. Micromechanical finite element simulations that resolve individual GMB and matrix components promote discovery and better understanding of the material behavior. With these simulations, we can study the role of the GMB volume fraction, time-dependent damage, behavior under confined vs. unconfined compression, and the effects of partial damage. These simulations are challenging and push the boundaries of capability even with the high performance computing tools available at Sandia. We summarize the major challenges and the current state of this modeling effort, as an exemplar of micromechanical modeling needs that can motivate advances in future computing efforts.
Station Set Residual: Event Classification Using Historical Distribution of Observing Stations
NASA Astrophysics Data System (ADS)
Procopio, Mike; Lewis, Jennifer; Young, Chris
2010-05-01
Analysts working at the International Data Centre in support of treaty monitoring through the Comprehensive Nuclear-Test-Ban Treaty Organization spend a significant amount of time reviewing hypothesized seismic events produced by an automatic processing system. When reviewing these events to determine their legitimacy, analysts take a variety of approaches that rely heavily on training and past experience. One method used by analysts to gauge the validity of an event involves examining the set of stations involved in the detection of an event. In particular, leveraging past experience, an analyst can say that an event located in a certain part of the world is expected to be detected by Stations A, B, and C. Implicit in this statement is that such an event would usually not be detected by Stations X, Y, or Z. For some well understood parts of the world, the absence of one or more "expected" stations—or the presence of one or more "unexpected" stations—is correlated with a hypothesized event's legitimacy and to its survival to the event bulletin. The primary objective of this research is to formalize and quantify the difference between the observed set of stations detecting some hypothesized event, versus the expected set of stations historically associated with detecting similar nearby events close in magnitude. This Station Set Residual can be quantified in many ways, some of which are correlated with the analysts' determination of whether or not the event is valid. We propose that this Station Set Residual score can be used to screen out certain classes of "false" events produced by automatic processing with a high degree of confidence, reducing the analyst burden. Moreover, we propose that the visualization of the historically expected distribution of detecting stations can be immediately useful as an analyst aid during their review process.
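One simple way to quantify such a residual is to penalize expected stations that are missing and observed stations that are historically unexpected, as sketched below; the weighted-mismatch scoring and the station detection probabilities are assumptions for illustration, not the authors' exact formulation.

```python
def station_set_residual(observed, expected_rates):
    """Score a hypothesized event's station set against historical expectations.

    observed: set of station names that detected the event.
    expected_rates: dict mapping station name -> historical detection probability
                    for events in this region and magnitude range (assumed values).
    """
    missing_penalty = sum(p for s, p in expected_rates.items() if s not in observed)
    unexpected_penalty = sum(1.0 - expected_rates.get(s, 0.0) for s in observed)
    return missing_penalty + unexpected_penalty

expected = {"A": 0.95, "B": 0.90, "C": 0.80, "X": 0.05}
print(station_set_residual({"A", "B", "C"}, expected))   # low residual: looks legitimate
print(station_set_residual({"X", "Y"}, expected))        # high residual: suspicious event
```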
NPV Sensitivity Analysis: A Dynamic Excel Approach
ERIC Educational Resources Information Center
Mangiero, George A.; Kraten, Michael
2017-01-01
Financial analysts generally create static formulas for the computation of NPV. When they do so, however, it is not readily apparent how sensitive the value of NPV is to changes in multiple interdependent and interrelated variables. It is the aim of this paper to analyze this variability by employing a dynamic, visually graphic presentation using…
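The sensitivity the article examines can be sketched outside Excel as a small loop that recomputes NPV over a grid of interdependent inputs; the cash flows, growth rates, and discount rates below are illustrative assumptions.

```python
def npv(rate, cash_flows):
    """Net present value with the first cash flow at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

initial_outlay = -1000.0
for growth in (0.00, 0.03, 0.06):                     # interdependent variable 1: cash-flow growth
    flows = [initial_outlay] + [300 * (1 + growth) ** t for t in range(1, 6)]
    for rate in (0.08, 0.10, 0.12):                   # interdependent variable 2: discount rate
        print(f"growth={growth:.0%}  rate={rate:.0%}  NPV={npv(rate, flows):8.2f}")
```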
Support-Staff Jobs Double in 20 Years, Outpacing Enrollment
ERIC Educational Resources Information Center
Brainard, Jeffrey; Fain, Paul; Masterson, Kathryn
2009-01-01
Colleges have added managers and support personnel at a steady, vigorous clip over the past 20 years, new research shows, far outpacing the growth in student enrollment and instructors. Support staff--like budget analysts, computer specialists, and loan counselors--nearly doubled from 1987 to 2007. Meanwhile, jobs for instructors increased by only…
2013-03-01
Proliferation Treaty OSINT Open Source Intelligence SAFF Safing, Arming, Fuzing, Firing SIAM Situational Influence Assessment Module SME Subject...expertise. One of the analysts can also be trained to tweak CAST logic as needed. In this initial build, only open-source intelligence ( OSINT ) will
SOLVE The performance analyst for hardwood sawmills
Jeff Palmer; Jan Wiedenbeck; Elizabeth Porterfield
2009-01-01
Presents the user's manual and CD-ROM for SOLVE, a computer program that helps sawmill managers improve efficiency and solve problems commonly found in hardwood sawmills. SOLVE provides information on key operational factors including log size distribution, lumber grade yields, lumber recovery factor and overrun, and break-even log costs. (Microsoft Windows Edition)...
An Advanced Simulation Framework for Parallel Discrete-Event Simulation
NASA Technical Reports Server (NTRS)
Li, P. P.; Tyrrell, R. Yeung D.; Adhami, N.; Li, T.; Henry, H.
1994-01-01
Discrete-event simulation (DEVS) users have long been faced with a three-way trade-off of balancing execution time, model fidelity, and number of objects simulated. Because of the limits of computer processing power, the analyst is often forced to settle for less than desired performance in one or more of these areas.
Soviet Computers and Cybernetics: Shortcomings and Military Applications.
1980-06-01
Contents excerpt (Footnotes, Bibliography, Introduction): Military scientific technological...exploration which have alarmed some Western analysts. America’s scientific and technological advantages are integral elements in the delicate world balance...inferior quantity only up to a point, where superior numbers take over. A major element in the military scientific technological competition between
Evaluation techniques and metrics for assessment of pan+MSI fusion (pansharpening)
NASA Astrophysics Data System (ADS)
Mercovich, Ryan A.
2015-05-01
Fusion of broadband panchromatic data with narrow band multispectral data - pansharpening - is a common and often studied problem in remote sensing. Many methods exist to produce data fusion results with the best possible spatial and spectral characteristics, and a number have been commercially implemented. This study examines the output products of 4 commercial implementations with regard to their relative strengths and weaknesses for a set of defined image characteristics and analyst use-cases. Image characteristics used are spatial detail, spatial quality, spectral integrity, and composite color quality (hue and saturation), and analyst use-cases include a variety of object detection and identification tasks. The imagery comes courtesy of the RIT SHARE 2012 collect. Two approaches are used to evaluate the pansharpening methods: analyst evaluation (qualitative measures) and image quality metrics (quantitative measures). Visual analyst evaluation results are compared with metric results to determine which metrics best measure the defined image characteristics and product use-cases, and to support future rigorous characterization of the metrics' correlation with the analyst results. Because pansharpening represents a trade between adding spatial information from the panchromatic image and retaining spectral information from the MSI channels, the metrics examined are grouped into spatial improvement metrics and spectral preservation metrics. A single metric to quantify the quality of a pansharpening method would necessarily be a combination of weighted spatial and spectral metrics based on the importance of various spatial and spectral characteristics for the primary task of interest. Appropriate metrics and weights for such a combined metric are proposed here, based on the conducted analyst evaluation. Additionally, during this work, a metric was developed specifically focused on assessment of spatial structure improvement relative to a reference image and independent of scene content. Using analysis of Fourier transform images, a measure of high-frequency content is computed in small sub-segments of the image. The average increase in high-frequency content across the image is used as the metric, where averaging across sub-segments combats the scene-dependent nature of typical image sharpness techniques. This metric had an improved range of scores, better representing differences in the test set than other common spatial structure metrics.
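A rough sketch of the sub-segment Fourier measure described above follows: the share of spectral energy beyond a radial frequency cutoff, averaged over image tiles, compared between a smoother and a higher-frequency test image. Tile size, cutoff, and the synthetic test images are assumptions for illustration, not the paper's exact metric.

```python
import numpy as np

def high_freq_content(image, tile=32, cutoff=0.25):
    """Average fraction of FFT energy above a radial frequency cutoff, per tile."""
    scores = []
    freqs = np.fft.fftshift(np.fft.fftfreq(tile))
    fy, fx = np.meshgrid(freqs, freqs, indexing="ij")
    radius = np.sqrt(fx ** 2 + fy ** 2)
    for r in range(0, image.shape[0] - tile + 1, tile):
        for c in range(0, image.shape[1] - tile + 1, tile):
            spec = np.abs(np.fft.fftshift(np.fft.fft2(image[r:r + tile, c:c + tile]))) ** 2
            scores.append(spec[radius > cutoff].sum() / spec.sum())
    return float(np.mean(scores))

rng = np.random.default_rng(0)
smooth = rng.random((128, 128)).cumsum(axis=1) / 100   # smooth-ish reference image
sharp = rng.random((128, 128))                          # noisy, high-frequency image
print(high_freq_content(smooth), high_freq_content(sharp))
```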
NSRD-15:Computational Capability to Substantiate DOE-HDBK-3010 Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, David; Bignell, John; Dingreville, Remi Philippe Michel
Safety basis analysts throughout the U.S. Department of Energy (DOE) complex rely heavily on the information provided in the DOE Handbook, DOE-HDBK-3010, Airborne Release Fractions/Rates and Respirable Fractions for Nonreactor Nuclear Facilities, to determine radionuclide source terms from postulated accident scenarios. In calculating source terms, analysts tend to use the DOE Handbook's bounding values on airborne release fractions (ARFs) and respirable fractions (RFs) for various categories of insults (representing potential accident release categories). This is typically due to both time constraints and the avoidance of regulatory critique. Unfortunately, these bounding ARFs/RFs represent extremely conservative values. Moreover, they were derived from very limited small-scale bench/laboratory experiments and/or from engineering judgment. Thus, the basis for the data may not be representative of the actual unique accident conditions and configurations being evaluated. The goal of this research is to develop a more accurate and defensible method to determine bounding values for the DOE Handbook using state-of-the-art multi-physics-based computer codes.
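For context, handbook ARF/RF values typically enter the standard five-factor source-term relation, ST = MAR x DR x ARF x RF x LPF; the sketch below shows how a bounding ARF/RF choice compares with hypothetical scenario-specific values. All numbers are illustrative assumptions, not values from the handbook or this research.

```python
def source_term(mar_g, dr, arf, rf, lpf=1.0):
    """Respirable source term (grams): material-at-risk x damage ratio x ARF x RF x leak path factor."""
    return mar_g * dr * arf * rf * lpf

mar = 1000.0        # material at risk, g (assumed)
dr = 0.1            # damage ratio (assumed)
bounding = source_term(mar, dr, arf=1e-3, rf=1.0)      # bounding-style ARF/RF choice (illustrative)
refined = source_term(mar, dr, arf=2e-4, rf=0.5)       # hypothetical scenario-specific values
print(f"bounding ST = {bounding:.3f} g, refined ST = {refined:.3f} g")
```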
The discounting model selector: Statistical software for delay discounting applications.
Gilroy, Shawn P; Franck, Christopher T; Hantula, Donald A
2017-05-01
Original, open-source computer software was developed and validated against established delay discounting methods in the literature. The software executed approximate Bayesian model selection methods from user-supplied temporal discounting data and computed the effective delay 50 (ED50) from the best performing model. Software was custom-designed to enable behavior analysts to conveniently apply recent statistical methods to temporal discounting data with the aid of a graphical user interface (GUI). The results of independent validation of the approximate Bayesian model selection methods indicated that the program provided results identical to those of the original source paper and its methods. Monte Carlo simulation (n = 50,000) confirmed that the true model was selected most often in each setting. Simulation code and data for this study were posted to an online repository for use by other researchers. The model selection approach was applied to three existing delay discounting data sets from the literature in addition to the data from the source paper. Comparisons of model-selected ED50 were consistent with traditional indices of discounting. Conceptual issues related to the development and use of computer software by behavior analysts and the opportunities afforded by free and open-sourced software are discussed, and a review of possible expansions of this software is provided. © 2017 Society for the Experimental Analysis of Behavior.
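As a small worked example of the quantity the software reports, the sketch below fits one candidate discounting model (Mazur's hyperbolic, V = A / (1 + kD)) to invented indifference points and reports ED50 = 1/k, the delay at which value falls to half; it does not reproduce the approximate Bayesian model selection itself.

```python
import numpy as np
from scipy.optimize import curve_fit

delays = np.array([1, 7, 30, 90, 180, 365], dtype=float)      # days (illustrative)
values = np.array([0.95, 0.85, 0.60, 0.40, 0.30, 0.20])       # fraction of delayed amount (illustrative)

def hyperbolic(d, k):
    """Mazur's hyperbolic discounting of a normalized amount."""
    return 1.0 / (1.0 + k * d)

(k,), _ = curve_fit(hyperbolic, delays, values, p0=[0.01])
print(f"k = {k:.4f} per day, ED50 = {1.0 / k:.1f} days")
```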
Kang, Youn-Ah; Stasko, J
2012-12-01
While the formal evaluation of systems in visual analytics is still relatively uncommon, particularly rare are case studies of prolonged system use by domain analysts working with their own data. Conducting case studies can be challenging, but it can be a particularly effective way to examine whether visual analytics systems are truly helping expert users to accomplish their goals. We studied the use of a visual analytics system for sensemaking tasks on documents by six analysts from a variety of domains. We describe their application of the system along with the benefits, issues, and problems that we uncovered. Findings from the studies identify features that visual analytics systems should emphasize as well as missing capabilities that should be addressed. These findings inform design implications for future systems.
NASA Technical Reports Server (NTRS)
1974-01-01
A collection of blank worksheets for use on each BRAVO problem to be analyzed is supplied, for the purposes of recording the inputs for the BRAVO analysis, working out the definition of mission equipment, recording inputs to the satellite synthesis computer program, estimating satellite earth station costs, costing terrestrial systems, and cost effectiveness calculations. The group of analysts working BRAVO will normally use a set of worksheets on each problem; however, the workbook pages are of sufficiently good quality that the user can duplicate them if more worksheet blanks are required than are supplied. For Vol. 1, see N74-12493; for Vol. 2, see N74-14530.
Accurate Inventories Of Irrigated Land
NASA Technical Reports Server (NTRS)
Wall, S.; Thomas, R.; Brown, C.
1992-01-01
System for taking land-use inventories overcomes two problems in estimating extent of irrigated land: only small portion of large state surveyed in given year, and aerial photographs made on one day of the year do not provide adequate picture of areas growing more than one crop per year. Developed for state of California as guide to controlling, protecting, conserving, and distributing water within state. Adaptable to any large area in which large amounts of irrigation water are needed for agriculture. Combination of satellite images, aerial photography, and ground surveys yields data for computer analysis. Analyst also consults agricultural statistics, current farm reports, weather reports, and maps. These information sources aid in interpreting patterns, colors, textures, and shapes on Landsat images.
NASA Technical Reports Server (NTRS)
Coen, Peter G.
1991-01-01
A new computer technique for the analysis of transport aircraft sonic boom signature characteristics was developed. This new technique, based on linear theory methods, combines the previously separate equivalent area and F function development with a signature propagation method using a single geometry description. The new technique was implemented in a stand-alone computer program and was incorporated into an aircraft performance analysis program. Through these implementations, both configuration designers and performance analysts are given new capabilities to rapidly analyze an aircraft's sonic boom characteristics throughout the flight envelope.
Zhu, Xinjie; Zhang, Qiang; Ho, Eric Dun; Yu, Ken Hung-On; Liu, Chris; Huang, Tim H; Cheng, Alfred Sze-Lok; Kao, Ben; Lo, Eric; Yip, Kevin Y
2017-09-22
A genomic signal track is a set of genomic intervals associated with values of various types, such as measurements from high-throughput experiments. Analysis of signal tracks requires complex computational methods, which often make the analysts focus too much on the detailed computational steps rather than on their biological questions. Here we propose Signal Track Query Language (STQL) for simple analysis of signal tracks. It is a Structured Query Language (SQL)-like declarative language, which means one only specifies what computations need to be done but not how these computations are to be carried out. STQL provides a rich set of constructs for manipulating genomic intervals and their values. To run STQL queries, we have developed the Signal Track Analytical Research Tool (START, http://yiplab.cse.cuhk.edu.hk/start/ ), a system that includes a Web-based user interface and a back-end execution system. The user interface helps users select data from our database of around 10,000 commonly-used public signal tracks, manage their own tracks, and construct, store and share STQL queries. The back-end system automatically translates STQL queries into optimized low-level programs and runs them on a computer cluster in parallel. We use STQL to perform 14 representative analytical tasks. By repeating these analyses using bedtools, Galaxy and custom Python scripts, we show that the STQL solution is usually the simplest, and the parallel execution achieves significant speed-up with large data files. Finally, we describe how a biologist with minimal formal training in computer programming self-learned STQL to analyze DNA methylation data we produced from 60 pairs of hepatocellular carcinoma (HCC) samples. Overall, STQL and START provide a generic way for analyzing a large number of genomic signal tracks in parallel easily.
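To make the notion of a signal track concrete, the fragment below shows, in plain Python rather than STQL syntax, the kind of interval-overlap aggregation that STQL lets an analyst express declaratively in a single query. The track contents and field layout are hypothetical.

# For each interval in track A, sum the values of overlapping intervals in track B.
# Intervals are (chrom, start, end, value) tuples; half-open coordinates assumed.

def overlaps(a, b):
    return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

def sum_overlapping_values(track_a, track_b):
    results = []
    for a in track_a:
        total = sum(b[3] for b in track_b if overlaps(a, b))
        results.append((a[0], a[1], a[2], total))
    return results

track_a = [("chr1", 100, 200, 1.0), ("chr1", 500, 600, 2.0)]   # hypothetical
track_b = [("chr1", 150, 180, 0.7), ("chr1", 550, 700, 1.3)]   # hypothetical
print(sum_overlapping_values(track_a, track_b))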
The Hazard Mapping System (HMS)-a Multiplatform Remote Sensing Approach to Fire and Smoke Detection
NASA Astrophysics Data System (ADS)
Kibler, J.; Ruminski, M. G.
2003-12-01
The HMS is a multiplatform remote sensing approach to detecting fires and smoke over the US and adjacent areas of Canada and Mexico that has been in place since June 2002. This system is an integral part of the National Environmental Satellite and Data Information Service (NESDIS) near-real-time hazard detection and mitigation efforts. The system utilizes NOAA's Geostationary Operational Environmental Satellites (GOES), Polar Operational Environmental Satellites (POES) and the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument on NASA's Terra and Aqua spacecraft. Automated detection algorithms are employed for each of the satellites for the fire detects, while smoke is added by a satellite image analyst. In June 2003 the HMS underwent an upgrade, and a number of features were added for users of the products generated on the HMS. Sectors covering Alaska and Hawaii were added. The use of Geographic Information System (GIS) shape files for smoke analysis is a new feature. Shape files show the progression and time of a single smoke plume as each analysis is drawn and then updated. The analyst now has the ability to view GOES, POES, and MODIS data in a single loop, which allows the fire analyst to easily confirm a fire in three different data sets. The upgraded HMS has faster satellite looping and gives the analyst the ability to design a false-color image for a particular region. The GOES satellites provide a relatively coarse 4 km infrared resolution at satellite subpoint for thermal fire detection but provide the advantage of a rapid update cycle. GOES imagery is updated every 15 minutes utilizing both GOES-10 and GOES-12. POES imagery from NOAA-15, NOAA-16 and NOAA-17 and MODIS from Terra and Aqua are employed, with each satellite providing twice-per-day coverage (more frequent over Alaska). While the frequency of imagery is much less than with GOES, the higher resolution of these satellites (1 km along the suborbital track) allows for detection of smaller and/or cooler burning fires. Each of the algorithms utilizes a number of temporal, thermal and contextual filters in an attempt to screen out false detects. However, false detects do get processed by the algorithms to varying degrees. Therefore, the automated fire detects from each algorithm are quality controlled by an analyst who scans the imagery and may either accept or delete fire points. The analyst also has the ability to manually add additional fire points based on the imagery. Smoke is outlined by the analyst using visible imagery, primarily GOES, which provides 1 km resolution. Occasionally a smoke plume seen in visible imagery is the only indicator of a fire and would be manually added to the fire detect file. The Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model is a forecast model that projects the trajectory and dispersion of a smoke plume over a period of time. HYSPLIT is run for fires, selected by the analyst, that are seen to be producing a significant smoke plume. The analyst defines a smoke-producing area commensurate with the size of the fire and the amount of smoke detected. The output is hosted on an Air Resources Lab (ARL) web site which can be accessed from the web site listed below. All of the information is posted to the web page noted below. Besides the interactive GIS presentation, users can view the product in graphical JPG format.
The analyst-edited points, as well as the unedited automated fire detects, are available for users to view directly on the web page or to download. All of the data are also archived and can be accessed via FTP.
Defining the role of a PACS technologist.
Cabrera, Alfred
2002-01-01
As hospitals convert from conventional image processing to picture archiving and communication systems (PACS) technology, new job opportunities arise for PACS analysts, PACS system administrators, PACS operators, and PACS trainers. To support a PACS, these positions require education in computer information systems and work experience in information technology. At Texas Children's Hospital, new roles for radiologic technologists (RT) in supporting the operation of PACS were not recognized until after implementation of the filmless system. A new position entitled PACS technologist was created, but roles and responsibilities largely were undefined. The inadequate job description contributed to problems with appropriate utilization of the PACS technologist. The primary role of the technologist was nebulous, and the priority of tasks was undefined. There was an excessive volume of information and technology to be mastered. The role represented a new paradigm, so no template for the job description was available that encompassed the array of functions to be performed. The result was a "morph" of the RT and PACS analyst job descriptions that was contrived and unworkable. The role of the PACS technologist is vital to the operation of the radiology department that uses PACS. There is a well-established need for cross-training of RTs in PACS. PACS technology is not taught in RT training programs. There are recurrent communication problems between RT and Information Technology (IT) personnel. The PACS technologist can participate in a number of activities that improve the overall level of proficiency in the imaging operation, such as specialized PACS training for RTs, collection and analysis of quality control data, and planning for installations of PACS acquisition modalities. RTs have acquired knowledge of medical terminology and human anatomy, imaging modalities, and workflow. These qualifications constitute a common basis for communication with other RTs, physicians, and other health care providers. In addition, the appropriate candidate for PACS technologist should have computer software and hardware knowledge, interpersonal skills, oral and written communication skills, and analytical skills to troubleshoot issues. This report will describe the evolution of a more accurate job description for the PACS technologist, the relationship between the PACS technologist and the RT supervisor, and specific tasks that are appropriate for the PACS technologist to perform.
Visualization of Spatio-Temporal Relations in Movement Event Using Multi-View
NASA Astrophysics Data System (ADS)
Zheng, K.; Gu, D.; Fang, F.; Wang, Y.; Liu, H.; Zhao, W.; Zhang, M.; Li, Q.
2017-09-01
Spatio-temporal relations among movement events extracted from temporally varying trajectory data can provide useful information about the evolution of individual or collective movers, as well as their interactions with their spatial and temporal contexts. However, the pure statistical tools commonly used by analysts pose many difficulties, due to the large number of attributes embedded in multi-scale and multi-semantic trajectory data. The need for models that operate at multiple scales to search for relations at different locations within time and space, as well as intuitively interpret what these relations mean, also presents challenges. Since analysts do not know where or when these relevant spatio-temporal relations might emerge, these models must compute statistical summaries of multiple attributes at different granularities. In this paper, we propose a multi-view approach to visualize the spatio-temporal relations among movement events. We describe a method for visualizing movement events and spatio-temporal relations that uses multiple displays. A visual interface is presented, and the user can interactively select or filter spatial and temporal extents to guide the knowledge discovery process. We also demonstrate how this approach can help analysts to derive and explain the spatio-temporal relations of movement events from taxi trajectory data.
KAM (Knowledge Acquisition Module): A tool to simplify the knowledge acquisition process
NASA Technical Reports Server (NTRS)
Gettig, Gary A.
1988-01-01
Analysts, knowledge engineers and information specialists are faced with increasing volumes of time-sensitive data in text form, either as free text or highly structured text records. Rapid access to the relevant data in these sources is essential. However, due to the volume and organization of the contents, and limitations of human memory and association, frequently: (1) important information is not located in time; (2) reams of irrelevant data are searched; and (3) interesting or critical associations are missed due to physical or temporal gaps involved in working with large files. The Knowledge Acquisition Module (KAM) is a microcomputer-based expert system designed to assist knowledge engineers, analysts, and other specialists in extracting useful knowledge from large volumes of digitized text and text-based files. KAM formulates non-explicit, ambiguous, or vague relations, rules, and facts into a manageable and consistent formal code. A library of system rules or heuristics is maintained to control the extraction of rules, relations, assertions, and other patterns from the text. These heuristics can be added, deleted or customized by the user. The user can further control the extraction process with optional topic specifications. This allows the user to cluster extracts based on specific topics. Because KAM formalizes diverse knowledge, it can be used by a variety of expert systems and automated reasoning applications. KAM can also perform important roles in computer-assisted training and skill development. Current research efforts include the applicability of neural networks to aid in the extraction process and the conversion of these extracts into standard formats.
Understanding interfirm relationships in business ecosystems with interactive visualization.
Basole, Rahul C; Clear, Trustin; Hu, Mengdie; Mehrotra, Harshit; Stasko, John
2013-12-01
Business ecosystems are characterized by large, complex, and global networks of firms, often from many different market segments, all collaborating, partnering, and competing to create and deliver new products and services. Given the rapidly increasing scale, complexity, and rate of change of business ecosystems, as well as economic and competitive pressures, analysts are faced with the formidable task of quickly understanding the fundamental characteristics of these interfirm networks. Existing tools, however, are predominantly query- or list-centric with limited interactive, exploratory capabilities. Guided by a field study of corporate analysts, we have designed and implemented dotlink360, an interactive visualization system that provides capabilities to gain systemic insight into the compositional, temporal, and connective characteristics of business ecosystems. dotlink360 consists of novel, multiple connected views enabling the analyst to explore, discover, and understand interfirm networks for a focal firm, specific market segments or countries, and the entire business ecosystem. System evaluation by a small group of prototypical users shows supporting evidence of the benefits of our approach. This design study contributes to the relatively unexplored, but promising area of exploratory information visualization in market research and business strategy.
TelCoVis: Visual Exploration of Co-occurrence in Urban Human Mobility Based on Telco Data.
Wu, Wenchao; Xu, Jiayi; Zeng, Haipeng; Zheng, Yixian; Qu, Huamin; Ni, Bing; Yuan, Mingxuan; Ni, Lionel M
2016-01-01
Understanding co-occurrence in urban human mobility (i.e. people from two regions visit an urban place during the same time span) is of great value in a variety of applications, such as urban planning, business intelligence, social behavior analysis, as well as containing contagious diseases. In recent years, the widespread use of mobile phones brings an unprecedented opportunity to capture large-scale and fine-grained data to study co-occurrence in human mobility. However, due to the lack of systematic and efficient methods, it is challenging for analysts to carry out in-depth analyses and extract valuable information. In this paper, we present TelCoVis, an interactive visual analytics system, which helps analysts leverage their domain knowledge to gain insight into the co-occurrence in urban human mobility based on telco data. Our system integrates visualization techniques with new designs and combines them in a novel way to enhance analysts' perception for a comprehensive exploration. In addition, we propose to study the correlations in co-occurrence (i.e. people from multiple regions visit different places during the same time span) by means of biclustering techniques that allow analysts to better explore coordinated relationships among different regions and identify interesting patterns. The case studies based on a real-world dataset and interviews with domain experts have demonstrated the effectiveness of our system in gaining insights into co-occurrence and facilitating various analytical tasks.
VIGOR: Interactive Visual Exploration of Graph Query Results.
Pienta, Robert; Hohman, Fred; Endert, Alex; Tamersoy, Acar; Roundy, Kevin; Gates, Chris; Navathe, Shamkant; Chau, Duen Horng
2018-01-01
Finding patterns in graphs has become a vital challenge in many domains, from biological systems and network security to finance (e.g., finding money laundering rings of bankers and business owners). While there is significant interest in graph databases and querying techniques, less research has focused on helping analysts make sense of underlying patterns within a group of subgraph results. Visualizing graph query results is challenging, requiring effective summarization of a large number of subgraphs, each having potentially shared node-values, rich node features, and flexible structure across queries. We present VIGOR, a novel interactive visual analytics system, for exploring and making sense of query results. VIGOR uses multiple coordinated views, leveraging different data representations and organizations to streamline analysts' sensemaking process. VIGOR contributes: (1) an exemplar-based interaction technique, where an analyst starts with a specific result and relaxes constraints to find other similar results or starts with only the structure (i.e., without node value constraints), and adds constraints to narrow in on specific results; and (2) a novel feature-aware subgraph result summarization. Through a collaboration with Symantec, we demonstrate how VIGOR helps tackle real-world problems through the discovery of security blindspots in a cybersecurity dataset with over 11,000 incidents. We also evaluate VIGOR with a within-subjects study, demonstrating VIGOR's ease of use over a leading graph database management system, and its ability to help analysts understand their results at higher speed and make fewer errors.
Airborne ladar man-in-the-loop operations in tactical environments
NASA Astrophysics Data System (ADS)
Grobmyer, Joseph E., Jr.; Lum, Tommy; Morris, Robert E.; Hard, Sarah J.; Pratt, H. L.; Florence, Tom; Peddycoart, Ed
2004-09-01
The U.S. Army Research, Development and Engineering Command (RDECOM) is developing approaches and processes that will exploit the characteristics of current and future Laser Radar (LADAR) sensor systems for critical man-in-the-loop tactical processes. The importance of timely and accurate target detection, classification, identification, and engagement for Future Combat Systems (FCS) has been documented and is viewed as a critical enabling factor for FCS survivability and lethality. Recent work has demonstrated the feasibility of using low-cost but relatively capable personal-computer-class systems to exploit the information available in LADAR sensor frames to present the warfighter or analyst with compelling and usable imagery for use in the target identification and engagement processes in near real time. The advantages of LADAR imagery are significant in environments presenting cover for targets and the associated difficulty for automated target recognition (ATR) technologies.
Some guidance on preparing validation plans for the DART Full System Models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, Genetha Anne; Hough, Patricia Diane; Hills, Richard Guy
2009-03-01
Planning is an important part of computational model verification and validation (V&V) and the requisite planning document is vital for effectively executing the plan. The document provides a means of communicating intent to the typically large group of people, from program management to analysts to test engineers, who must work together to complete the validation activities. This report provides guidelines for writing a validation plan. It describes the components of such a plan and includes important references and resources. While the initial target audience is the DART Full System Model teams in the nuclear weapons program, the guidelines are generally applicable to other modeling efforts. Our goal in writing this document is to provide a framework for consistency in validation plans across weapon systems, different types of models, and different scenarios. Specific details contained in any given validation plan will vary according to application requirements and available resources.
Feature extraction and classification of clouds in high resolution panchromatic satellite imagery
NASA Astrophysics Data System (ADS)
Sharghi, Elan
The development of sophisticated remote sensing sensors is rapidly increasing, and the vast amount of satellite imagery collected is too much to be analyzed manually by a human image analyst. It has become necessary to develop a tool to automate the job of an image analyst. This tool would need to intelligently detect and classify objects of interest through computer vision algorithms. Existing software called the Rapid Image Exploitation Resource (RAPIER®) was designed by engineers at Space and Naval Warfare Systems Center Pacific (SSC PAC) to perform exactly this function. This software automatically searches for anomalies in the ocean and reports the detections as possible ship objects. However, if the image contains a high percentage of cloud coverage, a high number of false positives are triggered by the clouds. The focus of this thesis is to explore various feature extraction and classification methods to accurately distinguish clouds from ship objects. A texture analysis method, line detection using the Hough transform, and edge detection using wavelets are explored as possible feature extraction methods. The features are then supplied to a K-Nearest Neighbors (KNN) or Support Vector Machine (SVM) classifier. Parameter options for these classifiers are explored and the optimal parameters are determined.
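The sketch below illustrates the final classification stage described above, assuming texture and edge features have already been extracted into a feature matrix. It is not the RAPIER implementation; the feature vectors, labels, and parameter grids are hypothetical placeholders.

import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))      # hypothetical feature vectors (texture/edge measures)
y = rng.integers(0, 2, size=200)   # hypothetical labels: 0 = ship, 1 = cloud

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Explore parameter options for each classifier with a small grid search.
knn = GridSearchCV(KNeighborsClassifier(), {"n_neighbors": [1, 3, 5, 7]}, cv=5)
svm = GridSearchCV(SVC(), {"C": [0.1, 1, 10], "gamma": ["scale", 0.1]}, cv=5)

for name, model in [("KNN", knn), ("SVM", svm)]:
    model.fit(X_train, y_train)
    print(name, model.best_params_, "test accuracy:", model.score(X_test, y_test))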
Principles of computer processing of Landsat data for geologic applications
Taranik, James V.
1978-01-01
The main objectives of computer processing of Landsat data for geologic applications are to improve the display of image data to the analyst or to facilitate evaluation of the multispectral characteristics of the data. Interpretations of the data are made from enhanced and classified data by an analyst trained in geology. Image enhancements involve adjustments of brightness values for individual picture elements. Image classification involves determination of the brightness values of picture elements for a particular cover type. Histograms are used to display the range and frequency of occurrence of brightness values. Landsat-1 and -2 data are preprocessed at Goddard Space Flight Center (GSFC) to adjust for the detector response of the multispectral scanner (MSS). Adjustments are applied to minimize the effects of striping, adjust for bad-data lines and line segments, and for lost individual pixel data. Because illumination conditions and landscape characteristics vary considerably and detector response changes with time, the radiometric adjustments applied at GSFC are seldom perfect and some detector striping remains in Landsat data. Rotation of the Earth under the satellite and movements of the satellite platform introduce geometric distortions in the data that must also be compensated for if image data are to be correctly displayed to the data analyst. Adjustments to Landsat data are made to compensate for variable solar illumination and for atmospheric effects. Geometric registration of Landsat data involves determination of the spatial location of a pixel in the output image and the determination of a new value for the pixel. The general objective of image enhancement is to optimize display of the data to the analyst. Contrast enhancements are employed to expand the range of brightness values in Landsat data so that the data can be efficiently recorded in a manner desired by the analyst. Spatial frequency enhancements are designed to enhance boundaries between features which have subtle differences in brightness values. Ratioing tends to reduce the effects due to topography and it tends to emphasize changes in brightness values between two Landsat bands. Simulated natural color is produced for geologists so that the colors of materials on images appear similar to colors of actual materials in the field. Image classification of Landsat data involves both machine-assisted delineation of multispectral patterns in four-dimensional spectral space and identification of machine-delineated multispectral patterns that represent particular cover conditions. The geological information derived from an analysis of a multispectral classification is usually related to lithology.
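As a minimal sketch of two enhancements described above, the fragment below applies a linear contrast stretch and a band ratio to 8-bit bands held as NumPy arrays. The percentile limits, band choice, and random test data are assumptions for illustration.

import numpy as np

def contrast_stretch(band, low_pct=2, high_pct=98):
    """Expand the occupied brightness range to the full 0-255 display range."""
    lo, hi = np.percentile(band, [low_pct, high_pct])
    stretched = np.clip((band - lo) / (hi - lo), 0, 1)
    return (stretched * 255).astype(np.uint8)

def band_ratio(band_a, band_b, eps=1e-6):
    """Ratio of two bands; suppresses topographic shading and emphasizes spectral change."""
    return band_a.astype(float) / (band_b.astype(float) + eps)

band5 = np.random.randint(20, 120, (512, 512), dtype=np.uint8)   # hypothetical data
band7 = np.random.randint(10, 100, (512, 512), dtype=np.uint8)
display = contrast_stretch(band5)
ratio57 = band_ratio(band5, band7)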
Critical event management with geographic information system technology
NASA Astrophysics Data System (ADS)
Booth, John F.; Young, Jeffrey M.
1997-02-01
Critical event management at the Los Angeles County Regional Criminal Information Clearinghouse (LACRCIC) provides for the deconfliction of operations, such as reverse stings, arrests, undercover buys/busts, searches, surveillances, and site surveys in the Los Angeles, Orange, Riverside, and San Bernardino county area. During these operations, the opportunity for officer-to-officer confrontation is high, possibly causing a worst-case scenario -- officers drawing on each other, resulting in friendly-fire injuries or casualties. In order to prevent local, state, and federal agencies in the Los Angeles area from experiencing this scenario, the LACRCIC provides around-the-clock critical event management services via its secure war room. The war room maintains a multicounty detailed street-level map base and geographic information system (GIS) application to support this effort. Operations are telephoned in by the participating agencies and posted in the critical event management system by war room analysts. The application performs both a proximity search around the address and a commonality-of-suspects search. If a conflict is found, the system alerts the analyst by sounding an audible alarm and flashing the conflicting events on the automated basemap. The analyst then notifies the respective agencies of the conflicting critical events so coordination or rescheduling can occur.
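As a rough sketch of the two checks described (a proximity search around an event's location and a commonality-of-suspects search), the fragment below flags conflicts between a newly posted event and existing events. The record fields, the 1000 ft radius, and the flat event list are assumptions for illustration, not details of the LACRCIC system.

import math

def haversine_ft(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in feet."""
    r_ft = 20_902_000  # mean Earth radius in feet
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r_ft * math.asin(math.sqrt(a))

def find_conflicts(new_event, posted_events, radius_ft=1000):
    conflicts = []
    for event in posted_events:
        near = haversine_ft(new_event["lat"], new_event["lon"],
                            event["lat"], event["lon"]) <= radius_ft
        shared = set(new_event["suspects"]) & set(event["suspects"])
        if near or shared:
            conflicts.append((event["agency"], "proximity" if near else "suspect", shared))
    return conflicts

posted = [{"agency": "LAPD", "lat": 34.0522, "lon": -118.2437, "suspects": ["DOE, JOHN"]}]
new = {"agency": "DEA", "lat": 34.0530, "lon": -118.2440, "suspects": ["ROE, RICHARD"]}
print(find_conflicts(new, posted))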
FOCIS: A forest classification and inventory system using LANDSAT and digital terrain data
NASA Technical Reports Server (NTRS)
Strahler, A. H.; Franklin, J.; Woodcook, C. E.; Logan, T. L.
1981-01-01
Accurate, cost-effective stratification of forest vegetation and timber inventory is the primary goal of a Forest Classification and Inventory System (FOCIS). Conventional timber stratification using photointerpretation can be time-consuming, costly, and inconsistent from analyst to analyst. FOCIS was designed to overcome these problems by using machine processing techniques to extract and process tonal, textural, and terrain information from registered LANDSAT multispectral and digital terrain data. Comparison of samples from timber strata identified by conventional procedures showed that both have about the same potential to reduce the variance of timber volume estimates over simple random sampling.
Preventing Stress Disorders for Law Enforcement Officers Exposed to Disturbing Media
2016-09-01
Discusses approaches used for soldiers suffering from post-traumatic stress disorder (PTSD), including group therapy, cognitive behavioral therapy, and service dogs, and recommends further research. Subject terms: child pornography, child exploitation, group therapy, counterintelligence analyst, computer forensics, forensic examiner.
The Start of a Tech Revolution
ERIC Educational Resources Information Center
Dyrli, Kurt O.
2009-01-01
We are at the start of a revolution in the use of computers, one that analysts predict will rival the development of the PC in its significance. Companies such as Google, HP, Amazon, Sun Microsystems, Sony, IBM, and Apple are orienting their entire business models toward this change, and software maker SAS has announced plans for a $70 million…
Workshop report on large-scale matrix diagonalization methods in chemistry theory institute
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bischof, C.H.; Shepard, R.L.; Huss-Lederman, S.
The Large-Scale Matrix Diagonalization Methods in Chemistry theory institute brought together 41 computational chemists and numerical analysts. The goal was to understand the needs of the computational chemistry community in problems that utilize matrix diagonalization techniques. This was accomplished by reviewing the current state of the art and looking toward future directions in matrix diagonalization techniques. This institute occurred about 20 years after a related meeting of similar size. During those 20 years the Davidson method continued to dominate the problem of finding a few extremal eigenvalues for many computational chemistry problems. Work on non-diagonally dominant and non-Hermitian problems as well as parallel computing has also brought new methods to bear. The changes and similarities in problems and methods over the past two decades offered an interesting viewpoint for the success in this area. One important area covered by the talks was overviews of the source and nature of the chemistry problems. The numerical analysts were uniformly grateful for the efforts to convey a better understanding of the problems and issues faced in computational chemistry. An important outcome was an understanding of the wide range of eigenproblems encountered in computational chemistry. The workshop covered problems involving self-consistent-field (SCF), configuration interaction (CI), intramolecular vibrational relaxation (IVR), and scattering problems. In atomic structure calculations using the Hartree-Fock method (SCF), the symmetric matrices can range from order hundreds to thousands. These matrices often include large clusters of eigenvalues which can be as much as 25% of the spectrum. However, if CI methods are also used, the matrix size can be between 10^4 and 10^9, where only one or a few extremal eigenvalues and eigenvectors are needed. Working with very large matrices has led to the development of ...
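For context, the Davidson method mentioned above expands a small subspace with diagonally preconditioned correction vectors until the lowest Ritz pair converges. The sketch below is a textbook-style illustration on a small, diagonally dominant test matrix, not code from the institute; the tolerance, preconditioner guard, and test matrix are assumptions.

import numpy as np

def davidson_lowest(A, tol=1e-8, max_iter=100):
    """Lowest eigenpair of a symmetric, diagonally dominant matrix via Davidson iteration."""
    n = A.shape[0]
    diag = np.diag(A)
    # Start from the unit vector on the smallest diagonal element.
    V = np.zeros((n, 1))
    V[np.argmin(diag), 0] = 1.0
    for _ in range(max_iter):
        # Rayleigh-Ritz step in the current subspace.
        H = V.T @ A @ V
        theta, s = np.linalg.eigh(H)
        theta, s = theta[0], s[:, 0]
        u = V @ s                      # Ritz vector
        r = A @ u - theta * u          # residual
        if np.linalg.norm(r) < tol:
            return theta, u
        # Diagonal (Davidson) preconditioner for the correction vector.
        denom = theta - diag
        denom[np.abs(denom) < 1e-12] = 1e-12
        t = r / denom
        # Orthogonalize against the subspace and expand it.
        t -= V @ (V.T @ t)
        t /= np.linalg.norm(t)
        V = np.hstack([V, t[:, None]])
    return theta, u

rng = np.random.default_rng(1)
A = np.diag(np.arange(1.0, 101.0)) + 1e-2 * rng.standard_normal((100, 100))
A = (A + A.T) / 2   # small, symmetric, diagonally dominant test matrix (hypothetical)
print(davidson_lowest(A)[0])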
Plutonium Critical Mass Curve Comparison to Mass at Upper Subcritical Limit (USL) Using Whisper
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alwin, Jennifer Louise; Zhang, Ning
Whisper is computational software designed to assist the nuclear criticality safety analyst with validation studies with the MCNP® Monte Carlo radiation transport package. Standard approaches to validation rely on the selection of benchmarks based upon expert judgment. Whisper uses sensitivity/uncertainty (S/U) methods to select benchmarks relevant to a particular application or set of applications being analyzed. Using these benchmarks, Whisper computes a calculational margin. Whisper attempts to quantify the margin of subcriticality (MOS) from errors in software and uncertainties in nuclear data. The combination of the Whisper-derived calculational margin and MOS comprises the baseline upper subcritical limit (USL), to which an additional margin may be applied by the nuclear criticality safety analyst as appropriate to ensure subcriticality. A series of critical mass curves for plutonium, similar to those found in Figure 31 of LA-10860-MS, have been generated using MCNP6.1.1 and the iterative parameter study software, WORM_Solver. The baseline USL for each of the data points of the curves was then computed using Whisper 1.1. The USL was then used to determine the equivalent mass for the plutonium metal-water system. ANSI/ANS-8.1 states that it is acceptable to use handbook data, such as the data directly from LA-10860-MS, as it is already considered validated (Section 4.3.4: "Use of subcritical limit data provided in ANSI/ANS standards or accepted reference publications does not require further validation."). This paper attempts to take a novel approach to visualize traditional critical mass curves and allows comparison with the amount of mass for which k_eff is equal to the USL (calculational margin + margin of subcriticality). However, the intent is to plot the critical mass data along with the USL, not to suggest that already accepted handbook data should have new and more rigorous requirements for validation.
Principles and tools for collaborative entity-based intelligence analysis.
Bier, Eric A; Card, Stuart K; Bodnar, John W
2010-01-01
Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.
PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czuchlewski, Kristina Rodriguez; Hart, William E.
Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain-relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to allow us to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data. The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into overlooked relationships and patterns. The capability is directly relevant to the nation's nonproliferation remote-sensing activities and has broad national security applications for military and intelligence-gathering organizations.
1993-09-15
Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup are to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).
1993-12-15
Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis in the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup are to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).
A model for anomaly classification in intrusion detection systems
NASA Astrophysics Data System (ADS)
Ferreira, V. O.; Galhardi, V. V.; Gonçalves, L. B. L.; Silva, R. C.; Cansian, A. M.
2015-09-01
Intrusion Detection Systems (IDS) are traditionally divided into two types according to the detection methods they employ, namely (i) misuse detection and (ii) anomaly detection. Anomaly detection has been widely used and its main advantage is the ability to detect new attacks. However, the analysis of the anomalies generated can become expensive, since they often carry no clear information about the malicious events they represent. In this context, this paper presents a model for automated classification of alerts generated by an anomaly-based IDS. The main goal is either to classify the detected anomalies into well-defined attack taxonomies or to identify whether an alert is a false positive misclassified by the IDS. Some common attacks on computer networks were considered and we achieved important results that can equip security analysts with better resources for their analyses.
Comparison of air-coupled GPR data analysis results determined by multiple analysts
NASA Astrophysics Data System (ADS)
Martino, Nicole; Maser, Ken
2016-04-01
Current bridge deck condition assessments using ground penetrating radar (GPR) require a trained analyst to manually interpret substructure layering information from B-scan images in order to proceed with an intended analysis (pavement thickness, concrete cover, effects of rebar corrosion, etc.). For example, a recently developed method to rapidly and accurately analyze air-coupled GPR data based on the effects of rebar corrosion requires that a user "pick" a layer of rebar reflections in each B-scan image collected along the length of the deck. These "picks" carry information such as signal amplitude and two-way travel time. When a deck is new, or has little rebar corrosion, the resulting layer of rebar reflections is readily evident and there is little room for subjectivity. However, when a deck is severely deteriorated, the rebar layer may be difficult to identify, and different analysts may make different interpretations of the appropriate layer to analyze. One highly corroded bridge deck was assessed with a number of nondestructive evaluation techniques including 2 GHz air-coupled GPR. Two trained analysts separately selected the rebar layer in each B-scan image, choosing as much information as possible, even in areas of significant deterioration. The post-processing of the selected data points was then completed and the results from each analyst were contour plotted to observe any discrepancies. The paper describes the differences between ground-coupled and air-coupled GPR systems, the data collection and analysis methods used by two different analysts for one case study, and the results of the two different analyses.
Signal Quality and the Reliability of Seismic Observations
NASA Astrophysics Data System (ADS)
Zeiler, C. P.; Velasco, A. A.; Pingitore, N. E.
2009-12-01
The ability to detect, time and measure seismic phases depends on the location, size, and quality of the recorded signals. Additional constraints are an analyst’s familiarity with a seismogenic zone and with the seismic stations that record the energy. Quantification and qualification of an analyst’s ability to detect, time and measure seismic signals has not been calculated or fully assessed. The fundamental measurement for computing the accuracy of a seismic measurement is the signal quality. Several methods have been proposed to measure signal quality; however, the signal-to-noise ratio (SNR) has been adopted as a short-term average over the long-term average. While the standard SNR is an easy and computationally inexpensive term, the overall statistical significance has not been computed for seismic measurement analysis. The prospect of canonizing the process of cataloging seismic arrivals hinges on the ability to repeat measurements made by different methods and analysts. The first step in canonizing phase measurements has been done by the IASPEI, which established a reference for accepted practices in naming seismic phases. The New Manual for Seismological Observatory Practices (NMSOP, 2002) outlines key observations for seismic phases recorded at different distances and proposes to quantify timing uncertainty with a user-specified windowing technique. However, this added measurement would not completely remove bias introduced by different techniques used by analysts to time seismic arrivals. The general guideline to time a seismic arrival is to record the time where a noted change in frequency and/or amplitude begins. This is generally achieved by enhancing the arrivals through filtering or beam forming. However, these enhancements can alter the characteristics of the arrival and how the arrival will be measured. Furthermore, each enhancement has user-specified parameters that can vary between analysts and this results in reduced ability to repeat measurements between analysts. The SPEAR project (Zeiler and Velasco, 2009) has started to explore the effects of comparing measurements from the same seismograms. Initial results showed that experience and the signal quality are the leading contributors to pick differences. However, the traditional SNR method of measuring signal quality was replaced by a Wide-band Spectral Ratio (WSR) due to a decrease in scatter. This observation brings up an important question of what is the best way to measure signal quality. We compare various methods (traditional SNR, WSR, power spectral density plots, Allan Variance) that have been proposed to measure signal quality and discuss which method provides the best tool to compare arrival time uncertainty.
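To make the short-term-average over long-term-average (STA/LTA) signal-quality measure concrete, the sketch below computes it for a synthetic trace. The window lengths, the centered (non-causal) averaging, and the test signal are assumptions for illustration, not parameters from the SPEAR project.

import numpy as np

def sta_lta(trace, sta_len=50, lta_len=500):
    """STA/LTA ratio at each sample, using centered moving averages of signal energy."""
    energy = trace.astype(float) ** 2
    sta = np.convolve(energy, np.ones(sta_len) / sta_len, mode="same")
    lta = np.convolve(energy, np.ones(lta_len) / lta_len, mode="same")
    return sta / np.maximum(lta, 1e-12)

# Hypothetical trace: background noise with an "arrival" added halfway through.
rng = np.random.default_rng(3)
trace = rng.normal(0, 1, 5000)
trace[2500:2600] += 8 * np.sin(np.linspace(0, 20 * np.pi, 100))
print("peak STA/LTA:", sta_lta(trace).max())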
ATTDES: An Expert System for Satellite Attitude Determination and Control. 2
NASA Technical Reports Server (NTRS)
Mackison, Donald L.; Gifford, Kevin
1996-01-01
The design, analysis, and flight operations of satellite attitude determination and attitude control systems require extensive mathematical formulations, optimization studies, and computer simulation. This is best done by an analyst with extensive education and experience. The development of programs such as ATTDES permits the use of advanced techniques by those with less experience. Typical tasks include the mission analysis to select stabilization and damping schemes, attitude determination sensors and algorithms, and control system designs to meet program requirements. ATTDES is a system that includes all of these activities, including high-fidelity orbit environment models that can be used for preliminary analysis, parameter selection, stabilization schemes, the development of estimators, covariance analyses, and optimization, and it can support ongoing orbit activities. The modification of existing simulations to model new configurations for these purposes can be an expensive, time-consuming activity that becomes a pacing item in the development and operation of such new systems. The use of an integrated tool such as ATTDES significantly reduces the effort and time required for these tasks.
Design Through Manufacturing: The Solid Model-Finite Element Analysis Interface
NASA Technical Reports Server (NTRS)
Rubin, Carol
2002-01-01
State-of-the-art computer aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts reflecting every detail of the finished product. Ideally, in the aerospace industry, these models should fulfill two very important functions: (1) provide numerical control information for automated manufacturing of precision parts, and (2) enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in aircraft and space vehicles. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. Presently, the process of preparing CAD models for FEA consumes a great deal of the analyst's time.
DOT National Transportation Integrated Search
2009-01-01
This booklet provides an overview of SafetyAnalyst. SafetyAnalyst is a set of software tools under development to help State and local highway agencies advance their programming of site-specific safety improvements. SafetyAnalyst will incorporate sta...
A crisis in the analyst's life: self-containment, symbolization, and the holding space.
Michelle, Flax
2011-04-01
Most analysts will experience some degree of crisis in the course of their working life. This paper explores the complex interplay between the analyst's affect during a crisis in her life and the affective dynamics of the patient. The central question is "who or what holds the analyst"--especially in times of crisis. Symbolization of affect, facilitated by the analyst's self-created holding environment, is seen as a vital process in order for containment to take place. In the clinical case presented, the analyst's dog was an integral part of the analyst's self-righting through this difficult period; the dog functioned as an "analytic object" within the analysis.
An Introduction to the Mission Risk Diagnostic for Incident Management Capabilities (MRD-IMC)
2014-05-01
Analysts applying the MRD-IMC evaluate a set of systemic risk factors (called drivers) to aggregate decision-making data and provide decision makers with an indication of whether the incident management (IM) function is in position to achieve its mission and objective(s) [Alberts 2012]. The MRD-IMC supports evaluation of IM processes and capabilities and comprises the following three core tasks: (1) identify the mission and objective(s)...
Quadrennial Review of Military Compensation (7th). Allowances. Major Topical Summary (MTS) 3
1992-08-01
Colonel D. Cragin Shelton, ANG Compensation Analyst; Major Daniel J. Arena, USA Compensation Analyst; QRMC Support: Mr. William H. Warnock, Director. ...cost of living in the 84 randomly selected areas, in rank order. The QRMC also had Runzheimer survey... References include Albright, William H. et al., A Reference Guide to the 1984 Military Health Services System Beneficiary Survey, Directorate of Plans, Programs and Analysis, 1990.
NASA Technical Reports Server (NTRS)
Smith, W. W.
1973-01-01
A Langley Research Center version of NASTRAN Level 15.1.0 designed to provide the analyst with an added tool for debugging massive NASTRAN input data is described. The program checks all NASTRAN input data cards and displays on a CRT the graphic representation of the undeformed structure. In addition, the program permits the display and alteration of input data and allows reexecution without physically resubmitting the job. Core requirements on the CDC 6000 computer are approximately 77,000 octal words of central memory.
Synfuel program analysis. Volume 2: VENVAL users manual
NASA Astrophysics Data System (ADS)
Muddiman, J. B.; Whelan, J. W.
1980-07-01
This volume is intended for program analysts and is a user's manual for the VENVAL model. It contains specific explanations of input data requirements and programming procedures for the use of this model. VENVAL is a generalized computer program to aid in the evaluation of prospective private sector production ventures. The program can project interrelated values of installed capacity, production, sales revenue, operating costs, depreciation, investment, debt, earnings, taxes, return on investment, depletion, and cash flow measures. It can also compute related public sector and other external costs and revenues if unit costs are furnished.
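As a rough illustration of the kind of interrelated projection VENVAL automates, the sketch below computes yearly after-tax cash flow and a net present value from revenue, operating cost, straight-line depreciation, and taxes. All figures, rates, and the single-investment structure are hypothetical and are not taken from the VENVAL manual.

investment = 500.0          # capital investment, $M (hypothetical)
life_years = 10
tax_rate = 0.40
discount_rate = 0.10
revenue = [120.0] * life_years
op_cost = [45.0] * life_years
depreciation = investment / life_years   # straight-line depreciation

npv = -investment
for year in range(1, life_years + 1):
    taxable = revenue[year - 1] - op_cost[year - 1] - depreciation
    taxes = max(taxable, 0.0) * tax_rate
    cash_flow = taxable - taxes + depreciation   # add back non-cash depreciation
    npv += cash_flow / (1 + discount_rate) ** year
print(f"Net present value: {npv:.1f} $M")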
Interpolation Method Needed for Numerical Uncertainty Analysis of Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Groves, Curtis; Ilie, Marcel; Schallhorn, Paul
2014-01-01
Using Computational Fluid Dynamics (CFD) to predict a flow field is an approximation to the exact problem, and uncertainties exist. There is a method to approximate the errors in CFD via Richardson's extrapolation. This method is based on progressive grid refinement. To estimate the errors in an unstructured grid, the analyst must interpolate between at least three grids. This paper describes a study to find an appropriate interpolation scheme that can be used in Richardson's extrapolation or another uncertainty method to approximate errors.
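For reference, the sketch below applies classic three-grid Richardson extrapolation: the observed order of accuracy follows from the solution changes between successive grids, and the extrapolated value and a grid convergence index (GCI) follow from it. The solution values, refinement ratio, and safety factor are hypothetical, not results from this study.

import math

f1, f2, f3 = 6.012, 6.049, 6.187   # fine, medium, coarse grid solutions (hypothetical)
r = 2.0                             # constant grid refinement ratio

p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)     # observed order of accuracy
f_exact = f1 + (f1 - f2) / (r**p - 1.0)               # Richardson-extrapolated value
gci_fine = 1.25 * abs((f2 - f1) / f1) / (r**p - 1.0)  # grid convergence index (Fs = 1.25)

print(f"observed order p = {p:.2f}")
print(f"extrapolated value = {f_exact:.4f}")
print(f"GCI (fine grid) = {100 * gci_fine:.2f}%")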
A preliminary computer pattern analysis of satellite images of mature extratropical cyclones
NASA Technical Reports Server (NTRS)
Burfeind, Craig R.; Weinman, James A.; Barkstrom, Bruce R.
1987-01-01
This study has applied computerized pattern analysis techniques to the location and classification of features of several mature extratropical cyclones that were depicted in GOES satellite images. These features include the location of the center of the cyclone vortex core and the location of the associated occluded front. The cyclone type was classified in accord with the scheme of Troup and Streten. The present analysis was implemented on a personal computer; results were obtained within approximately one or two minutes without the intervention of an analyst.
Capabilities and Advantages of Cloud Computing in the Implementation of Electronic Health Record
Ahmadi, Maryam; Aslani, Nasim
2018-01-01
Background: With regard to the high cost of the Electronic Health Record (EHR), in recent years the use of new technologies, in particular cloud computing, has increased. The purpose of this study was to systematically review the studies conducted in the field of cloud computing. Methods: The present study was a systematic review conducted in 2017. Searches were performed in the Scopus, Web of Science, IEEE, PubMed and Google Scholar databases using keyword combinations. From the 431 articles initially retrieved, 27 were selected for review after applying the inclusion and exclusion criteria. Data were gathered with a purpose-built checklist and analyzed using a content analysis method. Results: The findings of this study showed that cloud computing is a very widespread technology. It spans domains such as cost, security and privacy, scalability, mutual performance and interoperability, implementation platform and independence of cloud computing, search and exploration capability, error reduction and quality improvement, structure, flexibility, and sharing ability, all of which bear on the electronic health record. Conclusion: According to the findings of the present study, the capabilities of cloud computing are useful in implementing EHR in a variety of contexts. It also provides wide opportunities for managers, analysts and providers of health information systems. Considering the advantages and domains of cloud computing in the establishment of EHR, it is recommended to use this technology. PMID:29719309
NASTRAN user's guide (Level 17.5)
NASA Technical Reports Server (NTRS)
Field, E. I.; Herting, D. N.; Morgan, M. J.
1979-01-01
The user's guide is a handbook for engineers and analysts who use the NASTRAN finite element computer program; it supplements the NASTRAN Theoretical Manual (NASA SP-221), the NASTRAN User's Manual (NASA SP-222), the NASTRAN Programmer's Manual (NASA SP-223), and the NASTRAN Demonstration Program Manual (NASA SP-224). It provides modeling hints, describes attributes of the program, and gives references to the four manuals listed.
The Classification and Evaluation of Computer-Aided Software Engineering Tools
1990-09-01
International Business Machines Corporation. Customizer is a Registered Trademark of Index Technology Corporation. Data Analyst is a Registered Trademark of...years, a rapid series of new approaches has been adopted, including: information engineering, entity-relationship modeling, automatic code generation...support true information sharing among tools and automated consistency checking. Moreover, the repository must record and manage the relationships and
Manhunting: Counter-Network Organization for Irregular Warfare
2009-09-01
received his B.A. in International Relations from the University of Kansas in 1985 and M.S. in Computer Applications Management from Lesley University...collection management, and targeting. b. Operational level processes begin with intelligence on the adversary capabilities with granular focus to mitigate...managers work with analysts, targeting experts, decision makers, planners, and operations personnel, guiding multidisciplinary intelligence collection to
ERIC Educational Resources Information Center
Tyner, Bryan C.; Fienup, Daniel M.
2015-01-01
Graphing is socially significant for behavior analysts; however, graphing can be difficult to learn. Video modeling (VM) may be a useful instructional method but lacks evidence for effective teaching of computer skills. A between-groups design compared the effects of VM, text-based instruction, and no instruction on graphing performance.…
New Perspectives on Popular Culture, Science and Technology: Web Browsers and the New Illiteracy
ERIC Educational Resources Information Center
Charters, Elizabeth
2004-01-01
Analysts predict that the knowledge economy of the near future will require people to be both computer literate and print literate. However, some of the reading and thinking habits of current college students suggest that electronic media such as web browsers may be limiting the new generation's ability to absorb and process what they read. Their…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, Dustin Yewell
Echo™ is a MATLAB-based software package designed for robust and scalable analysis of complex data workflows. An alternative to tedious, error-prone conventional processes, Echo is based on three transformative principles for data analysis: self-describing data, name-based indexing, and dynamic resource allocation. The software takes an object-oriented approach to data analysis, intimately connecting measurement data with associated metadata. Echo operations in an analysis workflow automatically track and merge metadata and computation parameters to provide a complete history of the process used to generate final results, while automated figure and report generation tools eliminate the potential to mislabel those results. History reporting and visualization methods provide straightforward auditability of analysis processes. Furthermore, name-based indexing on metadata greatly improves code readability for analyst collaboration and reduces opportunities for errors to occur. Echo efficiently manages large data sets using a framework that seamlessly allocates resources such that only the necessary computations to produce a given result are executed. Echo provides a versatile and extensible framework, allowing advanced users to add their own tools and data classes tailored to their own specific needs. Applying these transformative principles and powerful features, Echo greatly improves analyst efficiency and quality of results in many application areas.
Relatedness, national borders, perceptions of firms and the value of their innovations
NASA Astrophysics Data System (ADS)
Castor, Adam R.
The main goal of this dissertation is to better understand how external corporate stakeholder perceptions of relatedness affect important outcomes for companies. In pursuit of this goal, I apply the lens of category studies. Categories not only help audiences to distinguish between members of different categories, they also convey patterns of relatedness. In turn, this may have implications for understanding how audiences search, what they attend to, and how the members are ultimately valued. In the first chapter, I apply insights from social psychology to show how the nationality of audience members affects the way that they cognitively group objects into similar categories. I find that the geographic location of stock market analysts affects the degree to which they will revise their earnings estimates for a given company in the wake of an earnings miss by another firm in the same industry. Foreign analysts revise their earnings estimates downward more so than do local analysts, suggesting that foreign analysts ascribe the earnings miss more broadly and tend to lump companies located in the same country into larger groups than do local analysts. In the second chapter, I demonstrate that the structure of inter-category relationships can have consequential effects for the members of a focal category. Leveraging an experimental-like design, I study the outcomes of nanotechnology patents and the pattern of forward citations across multiple patent jurisdictions. I find that members of technology categories with many close category 'neighbors' are more broadly cited than members of categories with few category 'neighbors.' My findings highlight how category embeddedness and category system structure affect the outcomes of category members, as well as the role that classification plays in the valuation of innovation. In the third chapter, I propose a novel and dynamic measure of corporate similarity that is constructed from the two-mode analyst and company coverage network. The approach creates a fine-grained continuous measure of company similarity that can be used as an alternative or supplement to existing static industry classification systems. I demonstrate the value of this new measure in the context of predicting financial market responses to merger and acquisition deals.
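As a hedged illustration of the coverage-network idea in the third chapter (not the dissertation's actual measure), the sketch below builds a small two-mode analyst-by-company coverage matrix with hypothetical tickers and derives a continuous company-similarity score from shared analyst coverage using cosine similarity.

```python
import numpy as np

# Hypothetical two-mode coverage data: which analysts cover which companies.
analysts = ["a1", "a2", "a3", "a4"]
companies = ["ACME", "BOLT", "CORE"]
coverage = np.array([
    # ACME BOLT CORE
    [1, 1, 0],  # a1
    [1, 1, 0],  # a2
    [0, 1, 1],  # a3
    [0, 0, 1],  # a4
])

def cosine_similarity_matrix(m):
    """Company-by-company similarity: cosine of the analyst-coverage columns."""
    norms = np.linalg.norm(m, axis=0, keepdims=True)
    unit = m / np.where(norms == 0, 1, norms)
    return unit.T @ unit

sim = cosine_similarity_matrix(coverage.astype(float))
for i, ci in enumerate(companies):
    for j, cj in enumerate(companies):
        if j > i:
            print(f"{ci}-{cj}: {sim[i, j]:.2f}")
```

Companies covered by many of the same analysts score close to 1, companies with no shared coverage score 0, giving a continuous alternative to a fixed industry classification.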
Design Science Methodology Applied to a Chemical Surveillance Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Zhuanyi; Han, Kyungsik; Charles-Smith, Lauren E.
Public health surveillance systems gain significant benefits from integrating existing early incident detection systems, supported by closed data sources, with open source data. However, identifying potential alerting incidents relies on finding accurate, reliable sources and presenting the high volume of data in a way that increases analysts' work efficiency, a challenge for any system that leverages open source data. In this paper, we present the design concept and the applied design science research methodology of ChemVeillance, a chemical analyst surveillance system. Our work portrays a system design and approach that translates theoretical methodology into practice, creating a powerful surveillance system built for specific use cases. Researchers, designers, developers, and related professionals in the health surveillance community can build upon the principles and methodology described here to enhance and broaden current surveillance systems, leading to improved situational awareness based on a robust integrated early warning system.
Examination of suspicious objects by virus analysts
NASA Astrophysics Data System (ADS)
Ananin, E. V.; Ananina, I. S.; Nikishova, A. V.
2018-05-01
The paper presents data on the urgency of virus threats. For antiviruses to work properly, data on every new virus implementation must be added to their databases, which means that all suspicious objects have to be investigated. This is a dangerous process and should be done in a virtual system; however, even that is not entirely safe for the main system. A diagram of a secure workplace for a virus analyst is therefore proposed, including the software needed for its protection. Settings to ensure the security of the process of investigating suspicious objects are also proposed. The proposed approach allows minimizing the risks caused by the virus.
A survey of tools and resources for the next generation analyst
NASA Astrophysics Data System (ADS)
Hall, David L.; Graham, Jake; Catherman, Emily
2015-05-01
We have previously argued that a combination of trends in information technology (IT) and changing habits of people using IT provides opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance and reconnaissance (ISR) using a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities, including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones; ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling; iii) intelligent interconnections due to advances in "web N" capabilities; and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of the digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools and decision aids. A summary of tools is provided along with links to web sites for tool access.
Geiss, S; Einax, J W
2001-07-01
Detection limit, reporting limit and limit of quantitation are analytical parameters that describe the power of analytical methods. These parameters are used for internal quality assurance and externally for competing, especially in the case of trace analysis in environmental compartments. The wide variety of possibilities for computing or obtaining these measures in the literature and in legislative rules makes any comparison difficult. Additionally, a host of terms has been used within the analytical community to describe detection and quantitation capabilities. Without trying to impose an order on this variety of terms, this paper aims to provide a practical proposal for answering the analysts' main questions concerning the quality measures above. These main questions and the related parameters are explained and demonstrated graphically. Estimation and verification of these parameters are the two steps needed to obtain real measures. A rule for practical verification is given in a table, from which the analyst can read what to measure, what to estimate and which criteria have to be fulfilled. The parameters verified in this manner (detection limit, reporting limit and limit of quantitation) are then comparable, and the analyst is responsible for the unambiguity and reliability of these measures.
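The paper's own estimation and verification rules are given in its table; as a separate, commonly used convention (not necessarily the authors' proposal), detection and quantitation limits are often estimated from a calibration line as 3.3 and 10 times the residual standard deviation divided by the slope. A minimal sketch with hypothetical calibration data:

```python
import numpy as np

# Hypothetical calibration data: concentration (arbitrary units) vs. instrument signal.
conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0, 20.0])
signal = np.array([0.8, 3.1, 5.2, 11.9, 23.0, 45.1])

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
s_res = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))  # residual standard deviation

lod = 3.3 * s_res / slope   # detection limit under this convention
loq = 10.0 * s_res / slope  # limit of quantitation under this convention

print(f"slope={slope:.3f}, LOD≈{lod:.2f}, LOQ≈{loq:.2f} (concentration units)")
```

Any value estimated this way would still need the kind of verification step the paper describes before being reported.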
Interpretation and the psychic future.
Cooper, S H
1997-08-01
The author applies the analyst's multi-faceted awareness of his or her view of the patient's psychic future to the analytic process. Loewald's (1960) interest in the way in which the analyst anticipates the future of the patient was linked to his epistemological assumptions about the analyst's superior objectivity and maturity relative to the patient. The elucidation of the authority of the analyst (e.g. Hoffman, 1991, 1994) allows us to begin to disentangle the analyst's view of the patient's psychic future from some of these epistemological assumptions. Clinical illustrations attempt to show how the analyst's awareness of this aspect of the interpretive process is often deconstructed over time and can help in understanding aspects of resistance from both analyst and patient. This perspective may provide one more avenue for understanding our various modes of influence through the interpretive process.
NASA Technical Reports Server (NTRS)
Hughes, Peter M.; Shirah, Gregory W.; Luczak, Edward C.
1994-01-01
At NASA's Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analysts Assistant (GenSAA), was developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. This paper describes GenSAA's capabilities and how it is supporting monitoring functions of current and future NASA missions for a variety of satellite monitoring applications ranging from subsystem health and safety to spacecraft attitude. Finally, this paper addresses efforts to generalize GenSAA's data interface for more widespread usage throughout the space and commercial industry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrell, Paul; Hanson, Paige; Ardi, Calvin
2016-11-04
A system for processing network packet capture streams, extracting metadata and generating flow records (via Argus). The system can be used by network security operators and analysts to enable forensic investigations for network security events.
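As a hedged, simplified illustration of the flow-record idea in the record above (this is not Argus or its API; the packet tuples are hypothetical), the sketch below aggregates pre-parsed packets into flow records keyed by the conventional 5-tuple.

```python
from collections import defaultdict

# Hypothetical pre-parsed packets: (timestamp, src, dst, sport, dport, proto, bytes)
packets = [
    (1.00, "10.0.0.5", "10.0.0.9", 52000, 443, "tcp", 120),
    (1.02, "10.0.0.5", "10.0.0.9", 52000, 443, "tcp", 1460),
    (1.10, "10.0.0.9", "10.0.0.5", 443, 52000, "tcp", 600),
]

flows = defaultdict(lambda: {"packets": 0, "bytes": 0, "first": None, "last": None})
for ts, src, dst, sport, dport, proto, size in packets:
    key = (src, dst, sport, dport, proto)   # unidirectional 5-tuple flow key
    f = flows[key]
    f["packets"] += 1
    f["bytes"] += size
    f["first"] = ts if f["first"] is None else f["first"]
    f["last"] = ts

for key, f in flows.items():
    print(key, f)
```

A forensic analyst would then query records like these for unusual peers, volumes, or time windows rather than reading raw packet captures.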
NASA Technical Reports Server (NTRS)
Elfer, N.; Meibaum, R.; Olsen, G.
1995-01-01
A unique collection of computer codes, Space Debris Surfaces (SD_SURF), have been developed to assist in the design and analysis of space debris protection systems. SD_SURF calculates and summarizes a vehicle's vulnerability to space debris as a function of impact velocity and obliquity. An SD_SURF analysis will show which velocities and obliquities are the most probable to cause a penetration. This determination can help the analyst select a shield design that is best suited to the predominant penetration mechanism. The analysis also suggests the most suitable parameters for development or verification testing. The SD_SURF programs offer the option of either FORTRAN programs or Microsoft-EXCEL spreadsheets and macros. The FORTRAN programs work with BUMPERII. The EXCEL spreadsheets and macros can be used independently or with selected output from the SD_SURF FORTRAN programs. Examples will be presented of the interaction between space vehicle geometry, the space debris environment, and the penetration and critical damage ballistic limit surfaces of the shield under consideration.
A simple node and conductor data generator for SINDA
NASA Technical Reports Server (NTRS)
Gottula, Ronald R.
1992-01-01
This paper presents a simple, automated method to generate NODE and CONDUCTOR DATA for thermal math models. The method uses personal computer spreadsheets to create SINDA inputs. It was developed in order to make SINDA modeling less time consuming and serves as an alternative to graphical methods. Anyone with some experience using a personal computer can easily implement this process. The user develops spreadsheets that automatically calculate capacitances and conductances based on material properties and dimensional data. The necessary node and conductor information is then taken from the spreadsheets and automatically arranged into the proper format, ready for insertion directly into the SINDA model. This technique provides a number of benefits to the SINDA user, such as a reduction in the number of hand calculations and the ability to very quickly generate a parametric set of NODE and CONDUCTOR DATA blocks. It also provides advantages over graphical thermal modeling systems by retaining the analyst's complete visibility into the thermal network and by permitting user comments anywhere within the DATA blocks.
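A minimal sketch of the spreadsheet idea described above, assuming simple lumped-parameter formulas (capacitance = density × specific heat × volume, conductance = k·A/L) and an illustrative comma-separated card layout rather than an actual SINDA deck format:

```python
# Illustrative only: compute lumped thermal capacitances and conductances and
# emit them as simple comma-separated "cards"; real SINDA decks use fixed formats.
nodes = [
    # node_id, density [kg/m^3], specific heat [J/kg-K], volume [m^3], initial T [C]
    (1, 2700.0, 900.0, 1.0e-4, 20.0),   # small aluminum block (assumed properties)
    (2, 2700.0, 900.0, 2.0e-4, 20.0),
]
conductors = [
    # cond_id, node_a, node_b, conductivity [W/m-K], area [m^2], length [m]
    (101, 1, 2, 167.0, 1.0e-3, 0.05),
]

node_cards = []
for nid, rho, cp, vol, t0 in nodes:
    capacitance = rho * cp * vol          # J/K
    node_cards.append(f"{nid}, {t0:.1f}, {capacitance:.3f}")

conductor_cards = []
for cid, na, nb, k, area, length in conductors:
    conductance = k * area / length       # W/K
    conductor_cards.append(f"{cid}, {na}, {nb}, {conductance:.4f}")

print("NODE DATA")
print("\n".join(node_cards))
print("CONDUCTOR DATA")
print("\n".join(conductor_cards))
```

Because the material and dimensional inputs stay visible in the table, the analyst keeps the full network in view and can sweep parameters simply by editing a column and regenerating the cards.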
Using soft-hard fusion for misinformation detection and pattern of life analysis in OSINT
NASA Astrophysics Data System (ADS)
Levchuk, Georgiy; Shabarekh, Charlotte
2017-05-01
Today's battlefields are shifting to "denied areas", where the use of U.S. military air and ground assets is limited. To succeed, U.S. intelligence analysts increasingly rely on available open-source intelligence (OSINT), which is fraught with inconsistencies, biased reporting and fake news. Analysts need automated tools for retrieval of information from OSINT sources, and these solutions must identify and resolve conflicting and deceptive information. In this paper, we present a misinformation detection model (MDM) which converts text to attributed knowledge graphs and runs graph-based analytics to identify misinformation. At the core of our solution are the identification of knowledge conflicts in the fused multi-source knowledge graph and semi-supervised learning to compute locally consistent reliability and credibility scores for the documents and sources, respectively. We present a validation of the proposed method using an open source dataset constructed from the online investigations of the MH17 downing in Eastern Ukraine.
1993-09-15
Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. The Marshall Space Flight Center (MSFC) in Huntsville, Alabama began to utilize VR for design analysis of the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models were used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup were to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provided general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC). The X-34 program was cancelled in 2001.
Model verification of large structural systems. [space shuttle model response
NASA Technical Reports Server (NTRS)
Lee, L. T.; Hasselman, T. K.
1978-01-01
A computer program for the application of parameter identification to the structural dynamic models of the space shuttle and other large models with hundreds of degrees of freedom is described. Finite element, dynamic, analytic, and modal models are used to represent the structural system. The interface with math models is such that output from any structural analysis program applied to any structural configuration can be used directly. Processed data from either sine-sweep tests or resonant dwell tests are directly usable. The program uses measured modal data to condition the prior analytic model so as to improve the frequency match between model and test. A Bayesian estimator generates an improved analytical model, and a linear estimator is used in an iterative fashion on highly nonlinear equations. Mass and stiffness scaling parameters are generated for an improved finite element model, and the optimum set of parameters is obtained in one step.
1964-12-01
Jeanette Scissum joined NASA’s Marshall Space Flight Center in 1964 after earning bachelor's and master's degrees in mathematics from Alabama A&M University. Scissum published a NASA report in 1967, “Survey of Solar Cycle Prediction Models,” which put forward techniques for improved forecasting of the sunspot cycle. In the mid-1970s she worked as a space scientist in the Space Environment Branch of Marshall’s Space Sciences Laboratory and later led activities in Marshall’s Atmospheric, Magnetospheric, and Plasmas in Space project. In 1975, Scissum wrote an article for the National Technical Association, “Equal Employment Opportunity and the Supervisor – A Counselor’s View,” which argued that many discrimination complaints could be avoided “through adequate and meaningful communication.” Scissum later worked at NASA Headquarters as a computer systems analyst responsible for analyzing and directing NASA management information and technical support systems.
Marshall Engineers Use Virtual Reality
NASA Technical Reports Server (NTRS)
1993-01-01
Virtual Reality (VR) can provide cost effective methods to design and evaluate components and systems for maintenance and refurbishment operations. Marshall Space Flight Center (MSFC) is beginning to utilize VR for design analysis of the X-34 experimental reusable space vehicle. Analysts at MSFC's Computer Applications and Virtual Environments (CAVE) used Head Mounted Displays (HMD) (pictured), spatial trackers and gesture inputs as a means to animate or inhabit a properly sized virtual human model. These models are used in a VR scenario as a way to determine functionality of space and maintenance requirements for the virtual X-34. The primary functions of the virtual X-34 mockup are to support operations development and design analysis for engine removal, the engine compartment and the aft fuselage. This capability provides general visualization support to engineers and designers at MSFC and to the System Design Freeze Review at Orbital Sciences Corporation (OSC).
Jenson, Susan K.; Domingue, Julia O.
1988-01-01
The first phase of analysis is a conditioning phase that generates three data sets: the original DEM with depressions filled, a data set indicating the flow direction for each cell, and a flow accumulation data set in which each cell receives a value equal to the total number of cells that drain to it. The original DEM and these three derivative data sets can then be processed in a variety of ways to optionally delineate drainage networks, overland paths, watersheds for user-specified locations, sub-watersheds for the major tributaries of a drainage network, or pour point linkages between watersheds. The computer-generated drainage lines and watershed polygons and the pour point linkage information can be transferred to vector-based geographic information systems for further analysis. Comparisons between these computer-generated features and their manually delineated counterparts generally show close agreement, indicating that these software tools will save analyst time spent in manual interpretation and digitizing.
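As a rough sketch of the kind of processing described (not the authors' software), the following computes a D8 flow-direction grid and a flow-accumulation grid for a small DEM, assuming depressions have already been filled.

```python
import numpy as np

def d8_flow_direction(dem):
    """For each cell, the index of its steepest-descent neighbor (-1 for pits/outlets)."""
    rows, cols = dem.shape
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    flowdir = np.full((rows, cols), -1, dtype=int)
    for r in range(rows):
        for c in range(cols):
            best, best_drop = -1, 0.0
            for k, (dr, dc) in enumerate(offsets):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    drop = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                    if drop > best_drop:
                        best, best_drop = k, drop
            flowdir[r, c] = best
    return flowdir, offsets

def flow_accumulation(dem):
    """Each cell's value = number of upstream cells that drain to it."""
    flowdir, offsets = d8_flow_direction(dem)
    rows, cols = dem.shape
    acc = np.zeros((rows, cols), dtype=int)
    # Visit cells from highest to lowest so upstream totals are final before use.
    for idx in np.argsort(dem, axis=None)[::-1]:
        r, c = divmod(idx, cols)
        k = flowdir[r, c]
        if k >= 0:
            dr, dc = offsets[k]
            acc[r + dr, c + dc] += acc[r, c] + 1
    return acc

dem = np.array([[5., 4., 3.],
                [4., 3., 2.],
                [3., 2., 1.]])
print(flow_accumulation(dem))   # the lowest corner collects all 8 other cells
```

Thresholding the accumulation grid then yields candidate drainage lines, and tracing upstream from a pour point yields its watershed.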
NASA Astrophysics Data System (ADS)
Garfinkle, Noah W.; Selig, Lucas; Perkins, Timothy K.; Calfas, George W.
2017-05-01
Increasing worldwide internet connectivity and access to sources of print and open social media have increased the near real-time availability of textual information. Capabilities to structure and integrate textual data streams can contribute to more meaningful representations of operational environment factors (i.e., Political, Military, Economic, Social, Infrastructure, Information, Physical Environment, and Time [PMESII-PT]) and tactical civil considerations (i.e., Areas, Structures, Capabilities, Organizations, People and Events [ASCOPE]). However, relying upon human analysts to encode this information as it arrives quickly proves intractable. While human analysts possess an ability to comprehend context in unstructured text far beyond that of computers, automated geoparsing (the extraction of locations from unstructured text) can empower analysts by automating the sifting of datasets for areas of interest. This research evaluates existing approaches to geoparsing and initiates the research and development of locally improved methods of tagging parts of text as possible locations, resolving possible locations into coordinates, and interfacing such results with human analysts. The objective of this ongoing research is to develop a more contextually complete picture of an area of interest (AOI), including human-geographic context for events. In particular, our research is working to make improvements to geoparsing (i.e., the extraction of spatial context from documents), which requires development, integration, and validation of named-entity recognition (NER) tools, gazetteers, and entity attribution. This paper provides an overview of NER models and methodologies as applied to geoparsing, explores several challenges encountered, presents preliminary results from the creation of a flexible geoparsing research pipeline, and introduces ongoing and future work with the intention of contributing to the efficient geocoding of information containing valuable insights into human activities in space.
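As a hedged sketch of the geoparsing pipeline described above (not the authors' implementation), the snippet below tags candidate place names with an off-the-shelf NER model and resolves them through a small, purely illustrative gazetteer; real systems would use far larger gazetteers and disambiguation logic.

```python
import spacy  # assumes the en_core_web_sm model has been downloaded separately

# Tiny illustrative gazetteer: place name -> (lat, lon). Real gazetteers are far larger.
GAZETTEER = {
    "paris": (48.8566, 2.3522),
    "cairo": (30.0444, 31.2357),
}

nlp = spacy.load("en_core_web_sm")

def geoparse(text):
    """Return (mention, entity label, coordinates-or-None) for location-like entities."""
    doc = nlp(text)
    results = []
    for ent in doc.ents:
        if ent.label_ in ("GPE", "LOC", "FAC"):
            coords = GAZETTEER.get(ent.text.lower())
            results.append((ent.text, ent.label_, coords))
    return results

print(geoparse("Supplies were moved from Cairo toward the river crossing near Paris."))
```

Mentions that resolve to coordinates can be plotted against an area of interest, while unresolved mentions are flagged for the analyst, which is the human-in-the-loop step the paper emphasizes.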
MetaboAnalystR: an R package for flexible and reproducible analysis of metabolomics data.
Chong, Jasmine; Xia, Jianguo
2018-06-28
The MetaboAnalyst web application has been widely used for metabolomics data analysis and interpretation. Despite its user-friendliness, the web interface has inherent limitations (especially for advanced users) with regard to flexibility in creating customized workflows, support for reproducible analysis, and capacity for dealing with large data sets. To address these limitations, we have developed a companion R package (MetaboAnalystR) based on the R code base of the web server. The package has been thoroughly tested to ensure that the same R commands will produce identical results from both interfaces. MetaboAnalystR complements the MetaboAnalyst web server to facilitate transparent, flexible and reproducible analysis of metabolomics data. MetaboAnalystR is freely available from https://github.com/xia-lab/MetaboAnalystR. Supplementary data are available at Bioinformatics online.
Code of Federal Regulations, 2010 CFR
2010-04-01
... officer (GS-11/4) and the analyst (GS-12/4) is computed as follows (hours): Gross number of working hours in 52 40-hr weeks, 2,080; deductions: ... d, 208; Sick leave—13 d, 104; Total, 384; Net number of working hours, 1,696; Working hour equivalent of Government contributions for employee retirement...
ERIC Educational Resources Information Center
RONEY, MAURICE W.; AND OTHERS
DESIGNED FOR USE IN PLANNING PREPARATORY PROGRAMS, THIS CURRICULUM CAN ALSO BE USEFUL IN PLANNING EXTENSION COURSES FOR EMPLOYED PERSONS. MATERIALS WERE ADAPTED FROM A GUIDE PREPARED BY ORANGE COAST COLLEGE, CALIFORNIA, UNDER A CONTRACTUAL ARRANGEMENT WITH THE U.S. OFFICE OF EDUCATION, AND REVIEWED BY A COMMITTEE COMPOSED OF SPECIALISTS IN DATA…
Training Guide for the Management Analyst Industrial Engineer Technician
1979-07-01
contemporary work operations, and blending traditional and modern organization concepts, the student develops the facility to analyze and create organization...training, the attendee will know the functions of a computer as it processes business data to produce information for improved management. He will...action which is most cost effective when considering proposed investments. Emphasis is placed on the adaptation of general business practices to
ERIC Educational Resources Information Center
CARTER, FOREST C.
AN 8-WEEK SEMINAR WAS HELD TO RETRAIN TEACHERS WITH A MINIMUM OF 3-YEARS' EXPERIENCE IN BUSINESS OR OFFICE EDUCATION TO TEACH BUSINESS DATA PROCESSING AND PROGRAMING TECHNIQUES. THE OBJECTIVES WERE TO ASSIST IN THE KNOWLEDGE AND SKILL DEVELOPMENT NECESSARY FOR PREPARING COMPUTER PROGRAMERS AND APPLICATION ANALYSTS, AND TO DEVELOP COURSE MATERIAL,…
NASA Technical Reports Server (NTRS)
Aster, R. W.; Chamberlain, R. G.; Zendejas, S. C.; Lee, T. S.; Malhotra, S.
1986-01-01
Company-wide or process-wide production is simulated. The Improved Price Estimation Guidelines (IPEG) program provides simple, accurate estimates of the prices of manufactured products. Simplification of SAMIS allows an analyst with limited time and computing resources to perform a greater number of sensitivity studies. Although developed for the photovoltaic industry, it is readily adaptable to any standard assembly-line type of manufacturing industry. The IPEG program estimates the annual production price per unit. The IPEG/PC program is written in TURBO PASCAL.
Evaluation of SNS Beamline Shielding Configurations using MCNPX Accelerated by ADVANTG
DOE Office of Scientific and Technical Information (OSTI.GOV)
Risner, Joel M; Johnson, Seth R.; Remec, Igor
2015-01-01
Shielding analyses for the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory pose significant computational challenges, including highly anisotropic high-energy sources, a combination of deep penetration shielding and an unshielded beamline, and a desire to obtain well-converged nearly global solutions for mapping of predicted radiation fields. The majority of these analyses have been performed using MCNPX with manually generated variance reduction parameters (source biasing and cell-based splitting and Russian roulette) that were largely based on the analyst's insight into the problem specifics. Development of the variance reduction parameters required extensive analyst time, and was often tailored to specific portions of the model phase space. We previously applied a developmental version of the ADVANTG code to an SNS beamline study to perform a hybrid deterministic/Monte Carlo analysis and showed that we could obtain nearly global Monte Carlo solutions with essentially uniform relative errors for mesh tallies that cover extensive portions of the model with typical voxel spacing of a few centimeters. The use of weight window maps and consistent biased sources produced using the FW-CADIS methodology in ADVANTG allowed us to obtain these solutions using substantially less computer time than the previous cell-based splitting approach. While those results were promising, the process of using the developmental version of ADVANTG was somewhat laborious, requiring user-developed Python scripts to drive much of the analysis sequence. In addition, limitations imposed by the size of weight-window files in MCNPX necessitated the use of relatively coarse spatial and energy discretization for the deterministic Denovo calculations that we used to generate the variance reduction parameters. We recently applied the production version of ADVANTG to this beamline analysis, which substantially streamlined the analysis process. We also tested importance function collapsing (in space and energy) capabilities in ADVANTG. These changes, along with the support for parallel Denovo calculations using the current version of ADVANTG, give us the capability to improve the fidelity of the deterministic portion of the hybrid analysis sequence, obtain improved weight-window maps, and reduce both the analyst and computational time required for the analysis process.
NASA Astrophysics Data System (ADS)
Kushida, N.; Kebede, F.; Feitio, P.; Le Bras, R.
2016-12-01
The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing and testing NET-VISA (Arora et al., 2013), a Bayesian automatic event detection and localization program, and evaluating its performance in a realistic operational mode. In our preliminary testing at the CTBTO, NET-VISA shows better performance than its currently operating automatic localization program. However, given CTBTO's role and its international context, a new technology should be introduced cautiously when it replaces a key piece of the automatic processing. We integrated the results of NET-VISA into the Analyst Review Station, extensively used by the analysts, so that they can check the accuracy and robustness of the Bayesian approach. We expect the workload of the analysts to be reduced because of the better performance of NET-VISA in finding missed events and producing a more complete set of stations than the current system, which has been operating for nearly twenty years. The results of a series of tests indicate that the expectations borne out by the automatic tests, which show an overall overlap improvement of 11%, meaning that the missed-events rate is cut by 42%, hold for the integrated interactive module as well. New events are found by analysts, which qualify for the CTBTO Reviewed Event Bulletin, beyond the ones analyzed through the standard procedures. Arora, N., Russell, S., and Sudderth, E., NET-VISA: Network Processing Vertically Integrated Seismic Analysis, 2013, Bull. Seismol. Soc. Am., 103, 709-729.
Towards an automated intelligence product generation capability
NASA Astrophysics Data System (ADS)
Smith, Alison M.; Hawes, Timothy W.; Nolan, James J.
2015-05-01
Creating intelligence information products is a time consuming and difficult process for analysts faced with identifying key pieces of information relevant to a complex set of information requirements. Complicating matters, these key pieces of information exist in multiple modalities scattered across data stores, buried in huge volumes of data. This results in the current predicament in which analysts find themselves: information retrieval and management consume huge amounts of time that could be better spent performing analysis. The persistent growth in data accumulation rates will only increase the amount of time spent on these tasks without a significant advance in automated solutions for information product generation. We present a product generation tool, Automated PrOduct Generation and Enrichment (APOGEE), which aims to automate the information product creation process in order to shift the bulk of the analysts' effort from data discovery and management to analysis. APOGEE discovers relevant text, imagery, video, and audio for inclusion in information products using semantic and statistical models of unstructured content. APOGEE's mixed-initiative interface, supported by highly responsive backend mechanisms, allows analysts to dynamically control the product generation process, ensuring a maximally relevant result. The combination of these capabilities results in significant reductions in the time it takes analysts to produce information products while helping to increase overall coverage. In an evaluation with a domain expert, APOGEE showed the potential to cut product generation time by a factor of 20. The result is a flexible end-to-end system that can be rapidly deployed in new operational settings.
Geographic Visualization of Power-Grid Dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sukumar, Sreenivas R.
2015-06-18
The visualization enables the simulation analyst to see changes in the frequency through time and space. With this technology, the analyst has a bird's eye view of the frequency at loads and generators as the simulated power system responds to the loss of a generator, spikes in load, and other contingencies. The significance of a contingency to the operation of an electrical power system depends critically on how the resulting transients evolve in time and space. Consequently, these dynamic events can only be understood when seen in their proper geographic context. This understanding is indispensable to engineers working on the next generation of distributed sensing and control systems for the smart grid. By making possible a natural and intuitive presentation of dynamic behavior, our new visualization technology is a situational-awareness tool for power-system engineers.
A Survey of Functional Behavior Assessment Methods Used by Behavior Analysts in Practice
ERIC Educational Resources Information Center
Oliver, Anthony C.; Pratt, Leigh A.; Normand, Matthew P.
2015-01-01
To gather information about the functional behavior assessment (FBA) methods behavior analysts use in practice, we sent a web-based survey to 12,431 behavior analysts certified by the Behavior Analyst Certification Board. Ultimately, 724 surveys were returned, with the results suggesting that most respondents regularly use FBA methods, especially…
"This strange disease": adolescent transference and the analyst's sexual orientation.
Burton, John K; Gilmore, Karen
2010-08-01
The treatment of adolescents by gay analysts is uncharted territory regarding the impact of the analyst's sexuality on the analytic process. Since a core challenge of adolescence involves the integration of the adult sexual body, gender role, and reproductive capacities into evolving identity, and since adolescents seek objects in their environment to facilitate both identity formation and the establishment of autonomy from primary objects, the analyst's sexual orientation is arguably a potent influence on the outcome of adolescent development. However, because sexual orientation is a less visible characteristic of the analyst than gender, race, or age, for example, the line between reality and fantasy is less clearly demarcated. This brings up special considerations regarding discovery and disclosure in the treatment. To explore these issues, the case of a late adolescent girl in treatment with a gay male analyst is presented. In this treatment, the question of the analyst's sexual orientation, and the demand by the patient for the analyst's self-disclosure, became a transference nucleus around which the patient's individual dynamics and adolescent dilemmas could be explored and clarified.
48 CFR 244.304 - Surveillance.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Surveillance. 244.304 Section 244.304 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT... Reviews 244.304 Surveillance. (b) The ACO, or the purchasing system analyst (PSA) with the concurrence of...
The Independent Technical Analysis Process Final Report 2006-2007.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duberstein, Corey; Ham, Kenneth; Dauble, Dennis
2007-03-01
The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities. The Independent Technical Analysis Process (ITAP) was created to provide non-routine analysis for fish and wildlife agencies and tribes in particular, and the public in general, on matters related to juvenile and adult salmon and steelhead passage through the mainstem hydrosystem. The process was designed to maintain the independence of analysts and reviewers from parties requesting analyses, to avoid potential bias in technical products. The objectives identified for this project were to administer a rigorous, transparent process to deliver unbiased technical assistance necessary to coordinate recommendations for storage reservoir and river operations that avoid potential conflicts between anadromous and resident fish. Seven work elements, designated by numbered categories in the Pisces project tracking system, were created to define and accomplish project goals as follows: (1) 118 Coordination - coordinate the technical analysis and review process: (a) retain expertise for analyst/reviewer roles; (b) draft research directives; (c) send the directive to the analyst; (d) coordinate two independent reviews of the draft report; (e) ensure reviewer comments are addressed within the final report. (2) 162 Analyze/Interpret Data - implement the independent aspects of the project. (3) 122 Provide Technical Review - implement the review process for the analysts. (4) 132 Produce Annual Report - produce the FY06 annual progress report with Pisces. (5) 161 Disseminate Raw/Summary Data and Results - post technical products on the ITAP web site. (6) 185 Produce Pisces Status Report - provide periodic status reports to BPA. (7) 119 Manage and Administer Projects - project/contract administration.
NASA Astrophysics Data System (ADS)
Kuzma, H. A.; Arehart, E.; Louie, J. N.; Witzleben, J. L.
2012-04-01
Listening to the waveforms generated by earthquakes is not new. The recordings of seismometers have been sped up and played to generations of introductory seismology students, published on educational websites and even included in the occasional symphony. The modern twist on earthquakes as music is an interest in using state-of-the-art computer algorithms for seismic data processing and evaluation. Algorithms such as Hidden Markov Models, Bayesian Network models and Support Vector Machines have been highly developed for applications in speech recognition, and might also be adapted for automatic seismic data analysis. Over the last three years, the International Data Centre (IDC) of the Comprehensive Test Ban Treaty Organization (CTBTO) has supported an effort to apply computer learning and data mining algorithms to IDC data processing, particularly to the problem of weeding through automatically generated event bulletins to find events which are non-physical and would otherwise have to be eliminated by the hand of highly trained human analysts. Analysts are able to evaluate events, distinguish between phases, pick new phases and build new events by looking at waveforms displayed on a computer screen. Human ears, however, are much better suited to waveform processing than are the eyes. Our hypothesis is that combining an auditory representation of seismic events with visual waveforms would reduce the time it takes to train an analyst and the time they need to evaluate an event. Since it takes almost two years for a person of extraordinary diligence to become a professional analyst and IDC contracts are limited to seven years by Treaty, faster training would significantly improve IDC operations. Furthermore, once a person learns to distinguish between true and false events by ear, various forms of audio compression can be applied to the data. The compression scheme which yields the smallest data set in which relevant signals can still be heard is likely an excellent candidate from which to draw features that can be fed into machine learning algorithms since it contains a compact numerical representation of the information that humans need to evaluate events. The challenge in this work is that, although it is relatively easy to pick out earthquake arrivals in waveform data from a single station, when stations are combined the addition of background noise tends to confuse and overwhelm the listener. To solve this problem, we rely on techniques such as the slowing down of recordings without altering the pitch, which are used by ethnomusicologists to understand highly complex rhythms and sounds. We work with professional musicians and recorders to mix the data from different seismic stations in a way which reduces noise and preserves the uniqueness of each station.
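One simple way to realize the "earthquakes as audio" idea sketched above (a hedged illustration, not the IDC's processing chain) is to write a seismic trace out as a WAV file whose sample rate is the original rate multiplied by a speed-up factor, so hours of ground motion compress into seconds of sound.

```python
import numpy as np
from scipy.io import wavfile

# Hypothetical seismic trace: 40 samples/s for one hour (placeholder noise here).
seis_rate_hz = 40
trace = np.random.default_rng(0).standard_normal(seis_rate_hz * 3600)

# Speed up by a constant factor simply by declaring a higher playback rate.
speedup = 400                              # 1 hour of data -> 9 seconds of audio
audio_rate_hz = seis_rate_hz * speedup     # 16,000 Hz playback

# Normalize to the 16-bit integer range before writing.
scaled = np.int16(trace / np.max(np.abs(trace)) * 32767)
wavfile.write("event_audio.wav", audio_rate_hz, scaled)
```

Pitch-preserving time stretching, as mentioned in the abstract, needs a dedicated resampling or phase-vocoder step rather than this simple rate change.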
Data management and analysis for the Earth System Grid
NASA Astrophysics Data System (ADS)
Williams, D. N.; Ananthakrishnan, R.; Bernholdt, D. E.; Bharathi, S.; Brown, D.; Chen, M.; Chervenak, A. L.; Cinquini, L.; Drach, R.; Foster, I. T.; Fox, P.; Hankin, S.; Henson, V. E.; Jones, P.; Middleton, D. E.; Schwidder, J.; Schweitzer, R.; Schuler, R.; Shoshani, A.; Siebenlist, F.; Sim, A.; Strand, W. G.; Wilhelmi, N.; Su, M.
2008-07-01
The international climate community is expected to generate hundreds of petabytes of simulation data within the next five to seven years. This data must be accessed and analyzed by thousands of analysts worldwide in order to provide accurate and timely estimates of the likely impact of climate change on physical, biological, and human systems. Climate change is thus not only a scientific challenge of the first order but also a major technological challenge. In order to address this technological challenge, the Earth System Grid Center for Enabling Technologies (ESG-CET) has been established within the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC)-2 program, with support from the offices of Advanced Scientific Computing Research and Biological and Environmental Research. ESG-CET's mission is to provide climate researchers worldwide with access to the data, information, models, analysis tools, and computational capabilities required to make sense of enormous climate simulation datasets. Its specific goals are to (1) make data more useful to climate researchers by developing Grid technology that enhances data usability; (2) meet specific distributed database, data access, and data movement needs of national and international climate projects; (3) provide a universal and secure web-based data access portal for broad multi-model data collections; and (4) provide a wide-range of Grid-enabled climate data analysis tools and diagnostic methods to international climate centers and U.S. government agencies. Building on the successes of the previous Earth System Grid (ESG) project, which has enabled thousands of researchers to access tens of terabytes of data from a small number of ESG sites, ESG-CET is working to integrate a far larger number of distributed data providers, high-bandwidth wide-area networks, and remote computers in a highly collaborative problem-solving environment.
Digital Stratigraphy: Contextual Analysis of File System Traces in Forensic Science.
Casey, Eoghan
2017-12-28
This work introduces novel methods for conducting forensic analysis of file allocation traces, collectively called digital stratigraphy. These in-depth forensic analysis methods can provide insight into the origin, composition, distribution, and time frame of strata within storage media. Using case examples and empirical studies, this paper illuminates the successes, challenges, and limitations of digital stratigraphy. This study also shows how understanding file allocation methods can provide insight into concealment activities and how real-world computer usage can complicate digital stratigraphy. Furthermore, this work explains how forensic analysts have misinterpreted traces of normal file system behavior as indications of concealment activities. This work raises awareness of the value of taking the overall context into account when analyzing file system traces. This work calls for further research in this area and for forensic tools to provide necessary information for such contextual analysis, such as highlighting mass deletion, mass copying, and potential backdating. © 2017 American Academy of Forensic Sciences.
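As a hedged, generic illustration of one contextual cue mentioned above, the sketch below flags possible mass deletion by looking for bursts of file-record deletion timestamps within a short window; it stands in for, and is much simpler than, the file-system-specific analysis the paper describes.

```python
from datetime import datetime, timedelta

# Hypothetical deletion timestamps recovered from file system metadata.
deletions = [
    datetime(2017, 3, 1, 9, 0, 1), datetime(2017, 3, 1, 9, 0, 2),
    datetime(2017, 3, 1, 9, 0, 2), datetime(2017, 3, 1, 9, 0, 3),
    datetime(2017, 3, 5, 14, 30, 0),
]

def mass_deletion_bursts(times, window=timedelta(seconds=10), threshold=3):
    """Return (window start, count) for sliding windows holding >= threshold deletions."""
    times = sorted(times)
    bursts = []
    start = 0
    for end in range(len(times)):
        while times[end] - times[start] > window:
            start += 1
        count = end - start + 1
        if count >= threshold:
            bursts.append((times[start], count))
    return bursts

print(mass_deletion_bursts(deletions))
```

A burst like the one flagged here is only a prompt for contextual review, since routine activity such as cache clearing can produce the same trace.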
Comparative case study between D3 and highcharts on lustre data visualization
NASA Astrophysics Data System (ADS)
ElTayeby, Omar; John, Dwayne; Patel, Pragnesh; Simmerman, Scott
2013-12-01
One of the challenging tasks in visual analytics is to target clustered time-series data sets, since it is important for data analysts to discover patterns changing over time while keeping their focus on particular subsets. In order to leverage the human ability to quickly perceive these patterns visually, multivariate features should be implemented according to the attributes available. A comparative case study was done using two JavaScript libraries to demonstrate the differences in their capabilities. A web-based application to monitor the Lustre file system for systems administrators and operations teams has been developed using D3 and Highcharts. Lustre file systems are responsible for managing Remote Procedure Calls (RPCs), which include input/output (I/O) requests between clients and Object Storage Targets (OSTs). The objective of this application is to provide time-series visuals of these calls and of the storage patterns of users on Kraken, a University of Tennessee High Performance Computing (HPC) resource at Oak Ridge National Laboratory (ORNL).
Real-time scalable visual analysis on mobile devices
NASA Astrophysics Data System (ADS)
Pattath, Avin; Ebert, David S.; May, Richard A.; Collins, Timothy F.; Pike, William
2008-02-01
Interactive visual presentation of information can help an analyst gain faster and better insight from data. When combined with situational or context information, visualization on mobile devices is invaluable to in-field responders and investigators. However, several challenges are posed by the form-factor of mobile devices in developing such systems. In this paper, we classify these challenges into two broad categories - issues in general mobile computing and issues specific to visual analysis on mobile devices. Using NetworkVis and Infostar as example systems, we illustrate some of the techniques that we employed to overcome many of the identified challenges. NetworkVis is an OpenVG-based real-time network monitoring and visualization system developed for Windows Mobile devices. Infostar is a flash-based interactive, real-time visualization application intended to provide attendees access to conference information. Linked time-synchronous visualization, stylus/button-based interactivity, vector graphics, overview-context techniques, details-on-demand and statistical information display are some of the highlights of these applications.
Cost approach of health care entity intangible asset valuation.
Reilly, Robert F
2012-01-01
In the valuation synthesis and conclusion process, the analyst should consider the following question: Do the selected valuation approach(es) and method(s) accomplish the analyst's assignment? Also, does the selected valuation approach and method actually quantify the desired objective of the intangible asset analysis? The analyst should also consider if the selected valuation approach and method analyzes the appropriate bundle of legal rights. The analyst should consider if there were sufficient empirical data available to perform the selected valuation approach and method. The valuation synthesis should consider if there were sufficient data available to make the analyst comfortable with the value conclusion. The valuation analyst should consider if the selected approach and method will be understandable to the intended audience. In the valuation synthesis and conclusion, the analyst should also consider which approaches and methods deserve the greatest consideration with respect to the intangible asset's RUL. The intangible asset RUL is a consideration in each valuation approach. In the income approach, the RUL may affect the projection period for the intangible asset income subject to either yield capitalization or direct capitalization. In the cost approach, the RUL may affect the total amount of obsolescence, if any, from the estimated cost measure (that is, the intangible reproduction cost new or replacement cost new). In the market approach, the RUL may affect the selection, rejection, and/or adjustment of the comparable or guideline intangible asset sale and license transactional data. The experienced valuation analyst will use professional judgment to weight the various value indications to conclude a final intangible asset value, based on: The analyst's confidence in the quantity and quality of available data; The analyst's level of due diligence performed on that data; The relevance of the valuation method to the intangible asset life cycle stage and degree of marketability; and The degree of variation in the range of value indications. Valuation analysts value health care intangible assets for a number of reasons. In addition to regulatory compliance reasons, these reasons include various transaction, taxation, financing, litigation, accounting, bankruptcy, and planning purposes. The valuation analyst should consider all generally accepted intangible asset valuation approaches, methods, and procedures. Many valuation analysts are more familiar with market approach and income approach valuation methods. However, there are numerous instances when cost approach valuation methods are also applicable to the health care intangible asset valuation. This discussion summarized the analyst's procedures and considerations with regard to the cost approach. The cost approach is often applicable to the valuation of intangible assets in the health care industry. However, the cost approach is only applicable if the valuation analyst (1) appropriately considers all of the cost components and (2) appropriately identifies and quantifies all obsolescence allowances. Regardless of the health care intangible asset or the reason for the valuation, the analyst should be familiar with all generally accepted valuation approaches and methods. And, the valuation analyst should have a clear, convincing, and cogent rationale for (1) accepting each approach and method applied and (2) rejecting each approach and method not applied.
That way, the valuation analyst will best achieve the purpose and objective of the health care intangible asset valuation.
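As a hedged numerical illustration of the cost approach components discussed above (all figures are hypothetical and not guidance from the article), the sketch below starts from replacement cost new and subtracts obsolescence allowances, with one allowance prorated against the remaining useful life.

```python
# Hypothetical cost-approach illustration for an intangible asset (e.g., a trained workforce).
replacement_cost_new = 1_000_000.0   # assumed cost to recreate the asset today

# Obsolescence allowances (all figures assumed for illustration).
functional_obsolescence = 100_000.0

# A life-based allowance prorated by remaining useful life: if only 6 of an
# expected 10 years of utility remain, 40% of the remaining cost is treated as expired.
expected_life_years = 10
remaining_useful_life_years = 6
life_based_allowance = (replacement_cost_new - functional_obsolescence) * (
    1 - remaining_useful_life_years / expected_life_years
)

indicated_value = replacement_cost_new - functional_obsolescence - life_based_allowance
print(f"Indicated value: ${indicated_value:,.0f}")  # $540,000 under these assumptions
```

The arithmetic is simple; the analyst's judgment lies in supporting the cost measure and each obsolescence allowance, as the article emphasizes.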
Application Reuse Library for Software, Requirements, and Guidelines
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Thronesbery, Carroll
1994-01-01
Better designs are needed for expert systems and other operations automation software, for more reliable, usable and effective human support. A prototype computer-aided Application Reuse Library shows feasibility of supporting concurrent development and improvement of advanced software by users, analysts, software developers, and human-computer interaction experts. Such a library expedites development of quality software, by providing working, documented examples, which support understanding, modification and reuse of requirements as well as code. It explicitly documents and implicitly embodies design guidelines, standards and conventions. The Application Reuse Library provides application modules with Demo-and-Tester elements. Developers and users can evaluate applicability of a library module and test modifications, by running it interactively. Sub-modules provide application code and displays and controls. The library supports software modification and reuse, by providing alternative versions of application and display functionality. Information about human support and display requirements is provided, so that modifications will conform to guidelines. The library supports entry of new application modules from developers throughout an organization. Example library modules include a timer, some buttons and special fonts, and a real-time data interface program. The library prototype is implemented in the object-oriented G2 environment for developing real-time expert systems.
2012-10-01
education of a new generation of data fusion analysts. Jacob L. Graham, College of Information Sciences & Technology, Pennsylvania State University, University Park, PA, U.S.A., jgraham@ist.psu.edu; David L. Hall, College of Information Sciences & Technology, Pennsylvania State University.
3D Simulation of External Flooding Events for the RISMC Pathway
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad
2015-09-01
Incorporating 3D simulations as part of the Risk-Informed Safety Margins Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; however, these can be analyzed with existing and validated simulated-physics toolkits. In this report, we describe these approaches specific to flooding-based analysis using an approach called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis provides a spatial/visual aspect to the design, improves the realism of results, and can provide visual understanding to validate the analysis of flooding.
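For readers unfamiliar with Smoothed Particle Hydrodynamics, the sketch below shows the basic density-summation step with a cubic-spline kernel; it is a minimal illustration of the SPH formalism, not the RISMC flooding toolkit itself.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 3D cubic-spline (Monaghan) smoothing kernel."""
    q = r / h
    sigma = 1.0 / (np.pi * h**3)
    if q <= 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q <= 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0

def sph_density(positions, masses, h):
    """Density at each particle: rho_i = sum_j m_j * W(|r_i - r_j|, h)."""
    n = len(positions)
    rho = np.zeros(n)
    for i in range(n):
        for j in range(n):
            r = np.linalg.norm(positions[i] - positions[j])
            rho[i] += masses[j] * cubic_spline_kernel(r, h)
    return rho

# Tiny illustrative particle set (units arbitrary).
pos = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])
mass = np.full(3, 1.0)
print(sph_density(pos, mass, h=0.2))
```

A full flooding simulation adds pressure and viscosity forces, boundary handling, and time integration on top of this density step, typically with neighbor search rather than the all-pairs loop used here for clarity.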
Creation of lumped parameter thermal model by the use of finite elements
NASA Technical Reports Server (NTRS)
1978-01-01
In the finite difference technique, the thermal network is represented by an analogous electrical network. The development of this network model, which is used to describe a physical system, often requires tedious manual data preparation and checkout by the analyst; this effort can be greatly reduced through the use of computer programs that automatically develop the mathematical model and associated input data and graphically display the analytical model to facilitate model verification. Three separate programs are involved, linked through common mass storage files and data card formats. These programs, SPAR, CINGEN, and GEOMPLT, are used to (1) develop thermal models for the MITAS II thermal analyzer program; (2) produce geometry plots of the thermal network; and (3) produce temperature distribution and time history plots.
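To make the electrical-network analogy concrete, here is a minimal sketch (not MITAS II or SPAR code) that integrates a two-node lumped-parameter thermal network with explicit Euler steps. The node capacitances, conductances, heat load, and boundary temperature are illustrative assumptions.

```python
# Minimal lumped-parameter thermal network: two nodes coupled by a
# conductance, node 2 also tied to a fixed boundary temperature.
# Electrical analogy: temperature ~ voltage, heat flow ~ current,
# thermal capacitance ~ capacitance, thermal conductance ~ 1/resistance.

C = [500.0, 800.0]      # J/K, thermal capacitance of each node (assumed)
G12 = 2.0               # W/K, conductance between node 1 and node 2 (assumed)
G2b = 1.5               # W/K, conductance from node 2 to the boundary (assumed)
T_boundary = 250.0      # K, fixed boundary temperature (assumed)
Q1 = 50.0               # W, internal heat load on node 1 (assumed)

T = [300.0, 290.0]      # K, initial node temperatures
dt = 1.0                # s, explicit Euler time step
for step in range(3600):            # simulate one hour
    q12 = G12 * (T[0] - T[1])       # heat flow, node 1 -> node 2
    q2b = G2b * (T[1] - T_boundary) # heat flow, node 2 -> boundary
    T[0] += dt * (Q1 - q12) / C[0]
    T[1] += dt * (q12 - q2b) / C[1]

print(f"T1 = {T[0]:.1f} K, T2 = {T[1]:.1f} K after 1 hour")
```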
ERIC Educational Resources Information Center
Arellano, Eduardo C.; Martinez, Mario C.
2009-01-01
This study compares the extent to which higher education policy analysts and master's and doctoral faculty of higher education and public affairs programs match on a set of competencies thought to be important to higher education policy analysis. Analysts matched master's faculty in three competencies while analysts and doctoral faculty matched in…
The Variability of Crater Identification Among Expert and Community Crater Analysts
NASA Astrophysics Data System (ADS)
Robbins, S. J.; Antonenko, I.; Kirchoff, M. R.; Chapman, C. R.; Fassett, C. I.; Herrick, R. R.; Singer, K.; Zanetti, M.; Lehan, C.; Huang, D.; Gay, P.
2014-04-01
Statistical studies of impact crater populations have been used to model ages of planetary surfaces for several decades [1]. This assumes that crater counts are approximately invariant and a "correct" population will be identified if the analyst is skilled and diligent. However, the reality is that crater identification is somewhat subjective, so variability between analysts, or even a single analyst's variation from day-to-day, is expected [e.g., 2, 3]. This study was undertaken to quantify that variability within an expert analyst population and between experts and minimally trained volunteers.
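A hedged sketch of one way such inter-analyst variability might be quantified (not the study's actual methodology): given each analyst's crater count for the same image region, compute the dispersion of the counts relative to their mean. The counts below are invented for illustration.

```python
import statistics

# Hypothetical crater counts for one image region, one value per analyst.
expert_counts    = [412, 389, 447, 401, 378]        # illustrative
volunteer_counts = [356, 482, 301, 509, 430, 275]   # illustrative

def relative_dispersion(counts):
    """Standard deviation of the counts as a fraction of their mean."""
    return statistics.stdev(counts) / statistics.mean(counts)

print(f"expert relative dispersion   : {relative_dispersion(expert_counts):.2%}")
print(f"volunteer relative dispersion: {relative_dispersion(volunteer_counts):.2%}")
```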
iTTVis: Interactive Visualization of Table Tennis Data.
Wu, Yingcai; Lan, Ji; Shu, Xinhuan; Ji, Chenyang; Zhao, Kejian; Wang, Jiachen; Zhang, Hui
2018-01-01
The rapid development of information technology paved the way for the recording of fine-grained data, such as stroke techniques and stroke placements, during a table tennis match. This data recording creates opportunities to analyze and evaluate matches from new perspectives. Nevertheless, the increasingly complex data poses a significant challenge to make sense of and gain insights into. Analysts usually employ tedious and cumbersome methods which are limited to watching videos and reading statistical tables. However, existing sports visualization methods cannot be applied to visualizing table tennis competitions due to different competition rules and particular data attributes. In this work, we collaborate with data analysts to understand and characterize the sophisticated domain problem of analysis of table tennis data. We propose iTTVis, a novel interactive table tennis visualization system, which to our knowledge, is the first visual analysis system for analyzing and exploring table tennis data. iTTVis provides a holistic visualization of an entire match from three main perspectives, namely, time-oriented, statistical, and tactical analyses. The proposed system with several well-coordinated views not only supports correlation identification through statistics and pattern detection of tactics with a score timeline but also allows cross analysis to gain insights. Data analysts have obtained several new insights by using iTTVis. The effectiveness and usability of the proposed system are demonstrated with four case studies.
Mehmood, Irfan; Sajjad, Muhammad; Baik, Sung Wook
2014-01-01
Wireless capsule endoscopy (WCE) has great advantages over traditional endoscopy because it is portable and easy to use, especially in remote monitoring health-services. However, during the WCE process, the large amount of captured video data demands a significant deal of computation to analyze and retrieve informative video frames. In order to facilitate efficient WCE data collection and browsing task, we present a resource- and bandwidth-aware WCE video summarization framework that extracts the representative keyframes of the WCE video contents by removing redundant and non-informative frames. For redundancy elimination, we use Jeffrey-divergence between color histograms and inter-frame Boolean series-based correlation of color channels. To remove non-informative frames, multi-fractal texture features are extracted to assist the classification using an ensemble-based classifier. Owing to the limited WCE resources, it is impossible for the WCE system to perform computationally intensive video summarization tasks. To resolve computational challenges, mobile-cloud architecture is incorporated, which provides resizable computing capacities by adaptively offloading video summarization tasks between the client and the cloud server. The qualitative and quantitative results are encouraging and show that the proposed framework saves information transmission cost and bandwidth, as well as the valuable time of data analysts in browsing remote sensing data. PMID:25225874
Mehmood, Irfan; Sajjad, Muhammad; Baik, Sung Wook
2014-09-15
Wireless capsule endoscopy (WCE) has great advantages over traditional endoscopy because it is portable and easy to use, especially in remote monitoring health-services. However, during the WCE process, the large amount of captured video data demands a significant deal of computation to analyze and retrieve informative video frames. In order to facilitate efficient WCE data collection and browsing task, we present a resource- and bandwidth-aware WCE video summarization framework that extracts the representative keyframes of the WCE video contents by removing redundant and non-informative frames. For redundancy elimination, we use Jeffrey-divergence between color histograms and inter-frame Boolean series-based correlation of color channels. To remove non-informative frames, multi-fractal texture features are extracted to assist the classification using an ensemble-based classifier. Owing to the limited WCE resources, it is impossible for the WCE system to perform computationally intensive video summarization tasks. To resolve computational challenges, mobile-cloud architecture is incorporated, which provides resizable computing capacities by adaptively offloading video summarization tasks between the client and the cloud server. The qualitative and quantitative results are encouraging and show that the proposed framework saves information transmission cost and bandwidth, as well as the valuable time of data analysts in browsing remote sensing data.
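The abstract uses Jeffrey divergence between color histograms as the redundancy measure. As a hedged sketch of that idea, and not the authors' implementation, the function below computes a symmetric Jeffrey divergence between two normalized histograms using one common definition from the literature; the toy histograms are illustrative.

```python
import numpy as np

def jeffrey_divergence(p, q, eps=1e-12):
    """Symmetric Jeffrey divergence between two normalized histograms.

    Uses the common form D(p,q) = sum_i [p_i*log(p_i/m_i) + q_i*log(q_i/m_i)],
    with m_i = (p_i + q_i)/2; other definitions exist in the literature.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    m = 0.5 * (p + q)
    return float(np.sum(p * np.log(p / m) + q * np.log(q / m)))

# Toy 16-bin color histograms for consecutive frames (illustrative).
rng = np.random.default_rng(1)
frame_a = rng.random(16)
frame_b = frame_a + 0.05 * rng.random(16)   # nearly identical frame
frame_c = rng.random(16)                    # unrelated frame

print("near-duplicate divergence:", jeffrey_divergence(frame_a, frame_b))
print("distinct-frame divergence:", jeffrey_divergence(frame_a, frame_c))
# A frame whose divergence from its predecessor falls below a chosen
# threshold would be treated as redundant and dropped from the summary.
```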
Effect of Variations in IRU Integration Time Interval On Accuracy of Aqua Attitude Estimation
NASA Technical Reports Server (NTRS)
Natanson, G. A.; Tracewell, Dave
2003-01-01
During Aqua launch support, attitude analysts noticed several anomalies in Onboard Computer (OBC) rates and in rates computed by the ground Attitude Determination System (ADS). These included: 1) periodic jumps in the OBC pitch rate every 2 minutes; 2) spikes in ADS pitch rate every 4 minutes; 3) close agreement between pitch rates computed by ADS and those derived from telemetered OBC quaternions (in contrast to the step-wise pattern observed for telemetered OBC rates); 4) spikes of +/- 10 milliseconds in telemetered IRU integration time every 4 minutes (despite the fact that telemetered time tags of any two sequential IRU measurements were always 1 second apart from each other). An analysis presented in the paper explains this anomalous behavior by a small average offset of about 0.5 +/- 0.05 microsec in the time interval between two sequential accumulated angle measurements. It is shown that errors in the estimated pitch angle due to neglecting the aforementioned variations in the integration time interval by the OBC is within +/- 2 arcseconds. Ground attitude solutions are found to be accurate enough to see the effect of the variations on the accuracy of the estimated pitch angle.
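A hedged back-of-the-envelope check of the reported magnitude: at the nominal orbital pitch rate of an Earth-pointing spacecraft in a roughly 99-minute orbit (an assumed value, used only for illustration), neglecting a 10 millisecond variation in the rate-integration interval produces an angle error on the order of the ±2 arcseconds quoted above.

```python
# Assumed nominal values for an Earth-pointing spacecraft in low Earth orbit.
orbital_period_s = 98.8 * 60.0                 # s (assumed orbital period)
pitch_rate_deg_s = 360.0 / orbital_period_s    # ~0.061 deg/s nominal pitch rate

# Telemetered IRU integration time showed spikes of +/- 10 ms.
integration_time_error_s = 10e-3

# Angle error from neglecting that variation over one integration interval.
angle_error_deg = pitch_rate_deg_s * integration_time_error_s
angle_error_arcsec = angle_error_deg * 3600.0

print(f"pitch rate  : {pitch_rate_deg_s:.4f} deg/s")
print(f"angle error : {angle_error_arcsec:.2f} arcsec (order of the reported +/- 2 arcsec)")
```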
Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.
2016-08-18
This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output.The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces.In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.
One-to-One Wisdom: Expert Tips on How to Approach Professional Development in Laptop Environments
ERIC Educational Resources Information Center
Cutter, Chris
2006-01-01
Laptop computing programs have been in K-12 schools since the 1990s, but in recent months one-to-one learning seems to have reemerged as a top topic in education technology circles. According to Tim Wiley, senior analyst at research firm Eduventures, about 1,000 of the 15,000 school districts in the United States currently have one-to-one…
An Analysis of Defense Information and Information Technology Articles: A Sixteen-Year Perspective
2009-03-01
exploratory,” or “subjective” (Denzin & Lincoln, 2000). Existing Research This research is based on content analysis methodologies utilized by Carter...same codes (Denzin & Lincoln, 2000). Different analysts should code the same text in a similar manner (Weber, 1990). Typically, researchers compute...chosen. Krippendorff recommends an agreement level of at least .70 (Krippendorff, 2004). Some scholars use a cut-off rate of .80 (Denzin & Lincoln
System-of-Systems Technology-Portfolio-Analysis Tool
NASA Technical Reports Server (NTRS)
O'Neil, Daniel; Mankins, John; Feingold, Harvey; Johnson, Wayne
2012-01-01
Advanced Technology Life-cycle Analysis System (ATLAS) is a system-of-systems technology-portfolio-analysis software tool. ATLAS affords capabilities to (1) compare estimates of the mass and cost of an engineering system based on competing technological concepts; (2) estimate life-cycle costs of an outer-space-exploration architecture for a specified technology portfolio; (3) collect data on state-of-the-art and forecasted technology performance, and on operations and programs; and (4) calculate an index of the relative programmatic value of a technology portfolio. ATLAS facilitates analysis by providing a library of analytical spreadsheet models for a variety of systems. A single analyst can assemble a representation of a system of systems from the models and build a technology portfolio. Each system model estimates mass, and life-cycle costs are estimated by a common set of cost models. Other components of ATLAS include graphical-user-interface (GUI) software, algorithms for calculating the aforementioned index, a technology database, a report generator, and a form generator for creating the GUI for the system models. At the time of this reporting, ATLAS is a prototype, embodied in Microsoft Excel and several thousand lines of Visual Basic for Applications that run on both Windows and Macintosh computers.
Enhanced detection and visualization of anomalies in spectral imagery
NASA Astrophysics Data System (ADS)
Basener, William F.; Messinger, David W.
2009-05-01
Anomaly detection algorithms applied to hyperspectral imagery are able to reliably identify man-made objects from a natural environment based on statistical/geometric likelihood. The process is more robust than target identification, which requires precise prior knowledge of the object of interest, but has an inherently higher false alarm rate. Standard anomaly detection algorithms measure deviation of pixel spectra from a parametric model (either statistical or linear mixing) estimating the image background. The topological anomaly detector (TAD) creates a fully non-parametric, graph theory-based, topological model of the image background and measures deviation from this background using codensity. In this paper we present a large-scale comparative test of TAD against 80+ targets in four full HYDICE images using the entire canonical target set for generation of ROC curves. TAD will be compared against several statistics-based detectors including local RX and subspace RX. Even a perfect anomaly detection algorithm would have a high practical false alarm rate in most scenes simply because the user/analyst is not interested in every anomalous object. To assist the analyst in identifying and sorting objects of interest, we investigate coloring of the anomalies with principal components projections using statistics computed from the anomalies. This gives a very useful colorization of anomalies in which objects of similar material tend to have the same color, enabling an analyst to quickly sort and identify anomalies of highest interest.
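For reference, the global RX detector mentioned above scores each pixel by its Mahalanobis distance from the scene background statistics. The sketch below is a generic implementation run on a synthetic image cube; it is not the TAD algorithm and is not tied to the HYDICE data used in the paper.

```python
import numpy as np

def rx_detector(cube):
    """Global RX anomaly scores for a hyperspectral cube (rows, cols, bands).

    Each pixel's score is its squared Mahalanobis distance from the scene
    mean under the background covariance estimated from all pixels.
    """
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands).astype(float)
    mean = pixels.mean(axis=0)
    centered = pixels - mean
    cov = np.cov(centered, rowvar=False)
    cov_inv = np.linalg.pinv(cov)             # pseudo-inverse for stability
    scores = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
    return scores.reshape(rows, cols)

# Synthetic 50x50 image with 30 bands; one implanted spectral outlier.
rng = np.random.default_rng(42)
cube = rng.normal(0.0, 1.0, size=(50, 50, 30))
cube[25, 25, :] += 6.0                         # man-made-like anomaly

scores = rx_detector(cube)
print("anomaly located at:", np.unravel_index(scores.argmax(), scores.shape))
```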
The patient who believes and the analyst who does not (1).
Lijtmaer, Ruth M
2009-01-01
A patient's religious beliefs and practices challenge the clinical experience and self-knowledge of the analyst owing to a great complexity of factors, and often take the form of the analyst's resistances and countertransference reactions to spiritual and religious issues. The analyst's feelings about the patient's encounters with religion and other forms of healing experiences may result in impasses and communication breakdown for a variety of reasons. These reasons include the analyst's own unresolved issues around her role as a psychoanalyst-which incorporates in some way psychoanalysis's views of religious belief-and these old conflicts may be irritated by the religious themes expressed by the patient. Vignettes from the treatments of two patients provide examples of the analyst's countertransference conflicts, particularly envy in the case of a therapist who is an atheist.
78 FR 79694 - Privacy Act of 1974; Notice of an Updated System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-31
... analysts and managers. System information also may be used: a. In any legal proceeding, where pertinent, to...] Privacy Act of 1974; Notice of an Updated System of Records AGENCY: General Services Administration. ACTION: Notice.
Integrated system dynamics toolbox for water resources planning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reno, Marissa Devan; Passell, Howard David; Malczynski, Leonard A.
2006-12-01
Public mediated resource planning is quickly becoming the norm rather than the exception. Unfortunately, supporting tools are lacking that interactively engage the public in the decision-making process and integrate over the myriad values that influence water policy. In the pages of this report we document the first steps toward developing a specialized decision framework to meet this need; specifically, a modular and generic resource-planning ''toolbox''. The technical challenge lies in the integration of the disparate systems of hydrology, ecology, climate, demographics, economics, policy and law, each of which influence the supply and demand for water. Specifically, these systems, their associated processes, and most importantly the constitutive relations that link them must be identified, abstracted, and quantified. For this reason, the toolbox forms a collection of process modules and constitutive relations that the analyst can ''swap'' in and out to model the physical and social systems unique to their problem. This toolbox with all of its modules is developed within the common computational platform of system dynamics linked to a Geographical Information System (GIS). Development of this resource-planning toolbox represents an important foundational element of the proposed interagency center for Computer Aided Dispute Resolution (CADRe). The Center's mission is to manage water conflict through the application of computer-aided collaborative decision-making methods. The Center will promote the use of decision-support technologies within collaborative stakeholder processes to help stakeholders find common ground and create mutually beneficial water management solutions. The Center will also serve to develop new methods and technologies to help federal, state and local water managers find innovative and balanced solutions to the nation's most vexing water problems. The toolbox is an important step toward achieving the technology development goals of this center.
Advancing the Implementation of Hydrologic Models as Web-based Applications
NASA Astrophysics Data System (ADS)
Dahal, P.; Tarboton, D. G.; Castronova, A. M.
2017-12-01
Advanced computer simulations are required to understand hydrologic phenomenon such as rainfall-runoff response, groundwater hydrology, snow hydrology, etc. Building a hydrologic model instance to simulate a watershed requires investment in data (diverse geospatial datasets such as terrain, soil) and computer resources, typically demands a wide skill set from the analyst, and the workflow involved is often difficult to reproduce. This work introduces a web-based prototype infrastructure in the form of a web application that provides researchers with easy to use access to complete hydrological modeling functionality. This includes creating the necessary geospatial and forcing data, preparing input files for a model by applying complex data preprocessing, running the model for a user defined watershed, and saving the results to a web repository. The open source Tethys Platform was used to develop the web app front-end Graphical User Interface (GUI). We used HydroDS, a webservice that provides data preparation processing capability to support backend computations used by the app. Results are saved in HydroShare, a hydrologic information system that supports the sharing of hydrologic data, model and analysis tools. The TOPographic Kinematic APproximation and Integration (TOPKAPI) model served as the example for which we developed a complete hydrologic modeling service to demonstrate the approach. The final product is a complete modeling system accessible through the web to create input files, and run the TOPKAPI hydrologic model for a watershed of interest. We are investigating similar functionality for the preparation of input to Regional Hydro-Ecological Simulation System (RHESSys). Key Words: hydrologic modeling, web services, hydrologic information system, HydroShare, HydroDS, Tethys Platform
NASA Astrophysics Data System (ADS)
Le Bras, R. J.; Arora, N. S.; Kushida, N.; Kebede, F.; Feitio, P.; Tomuta, E.
2017-12-01
The International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has reached out to the broader scientific community through a series of conferences, the latest of which took place in June 2017 in Vienna, Austria. Stemming from this outreach effort, and following the inception of research and development efforts in 2009, the NET-VISA software, which follows a Bayesian modelling approach, has been developed to improve the key step of automatic association of joint seismic, hydro-acoustic, and infrasound detections. When compared with the current operational system, it has consistently been shown in off-line tests to improve the overlap with the analyst-reviewed Reviewed Event Bulletin (REB) by ten percent, for an average of 85% overlap, while the inconsistency rate remains essentially the same at about 50%. Testing by analysts in realistic conditions on a few days of data has also demonstrated the software's performance in finding additional events which qualify for publication in the REB. Starting in August 2017, the automatic events produced by the software will be reviewed by analysts at the CTBTO, and we report on the initial evaluation of this introduction into operations.
Interactive visual comparison of multimedia data through type-specific views
NASA Astrophysics Data System (ADS)
Burtner, Russ; Bohn, Shawn; Payne, Debbie
2013-01-01
Analysts who work with collections of multimedia to perform information foraging understand how difficult it is to connect information across diverse sets of mixed media. The wealth of information from blogs, social media, and news sites often can provide actionable intelligence; however, many of the tools used on these sources of content are not capable of multimedia analysis because they only analyze a single media type. As such, analysts are taxed to keep a mental model of the relationships among each of the media types when generating the broader content picture. To address this need, we have developed Canopy, a novel visual analytic tool for analyzing multimedia. Canopy provides insight into the multimedia data relationships by exploiting the linkages found in text, images, and video co-occurring in the same document and across the collection. Canopy connects derived and explicit linkages and relationships through multiple connected visualizations to aid analysts in quickly summarizing, searching, and browsing collected information to explore relationships and align content. In this paper, we will discuss the features and capabilities of the Canopy system and walk through a scenario illustrating how this system might be used in an operational environment.
Interactive visual comparison of multimedia data through type-specific views
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burtner, Edwin R.; Bohn, Shawn J.; Payne, Deborah A.
2013-02-05
Analysts who work with collections of multimedia to perform information foraging understand how difficult it is to connect information across diverse sets of mixed media. The wealth of information from blogs, social media, and news sites often can provide actionable intelligence; however, many of the tools used on these sources of content are not capable of multimedia analysis because they only analyze a single media type. As such, analysts are taxed to keep a mental model of the relationships among each of the media types when generating the broader content picture. To address this need, we have developed Canopy, a novel visual analytic tool for analyzing multimedia. Canopy provides insight into the multimedia data relationships by exploiting the linkages found in text, images, and video co-occurring in the same document and across the collection. Canopy connects derived and explicit linkages and relationships through multiple connected visualizations to aid analysts in quickly summarizing, searching, and browsing collected information to explore relationships and align content. In this paper, we will discuss the features and capabilities of the Canopy system and walk through a scenario illustrating how this system might be used in an operational environment. Keywords: Multimedia (Image/Video/Music) Visualization.
God of the hinge: treating LGBTQIA patients.
Boland, Annie
2017-11-01
This paper looks at systems of gender within the context of analysis. It explores the unique challenges of individuation faced by transsexual, transgender, gender queer, gender non-conforming, cross-dressing and intersex patients. To receive patients generously we need to learn how a binary culture produces profound and chronic trauma. These patients wrestle with being who they are whilst simultaneously receiving negative projections and feeling invisible. While often presenting with the struggles of gender conforming individuals, understanding the specifically gendered aspect of their identity is imperative. An analyst's unconscious bias may lead to iatrogenic shaming. The author argues that rigorous, humble inquiry into the analyst's transphobia can be transformative for patient, analyst, and the work itself. Analysis may, then, provide gender-variant patients with their first remembered and numinous experience of authentic connection to self. Conjuring the image of a hinge, securely placed in the neutral region of a third space, creates a transpositive analytic temenos. Invoking the spirit of the Trickster in the construction of this matrix supports the full inclusion of gender-variant patients. Nuanced attunement scaffolds mirroring and the possibility of play. Being mindful that gender is sturdy and delicate as well as mercurial and defined enriches the analyst's listening. © 2017, The Society of Analytical Psychology.
Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Waltz, Ed
2016-05-01
Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics applied to raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and these are examined here with a view to matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.
Fault trees for decision making in systems analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lambert, Howard E.
1975-10-09
The application of fault tree analysis (FTA) to system safety and reliability is presented within the framework of system safety analysis. The concepts and techniques involved in manual and automated fault tree construction are described and their differences noted. The theory of mathematical reliability pertinent to FTA is presented with emphasis on engineering applications. An outline of the quantitative reliability techniques of the Reactor Safety Study is given. Concepts of probabilistic importance are presented within the fault tree framework and applied to the areas of system design, diagnosis and simulation. The computer code IMPORTANCE ranks basic events and cut sets according to a sensitivity analysis. A useful feature of the IMPORTANCE code is that it can accept relative failure data as input. The output of the IMPORTANCE code can assist an analyst in finding weaknesses in system design and operation, suggest the most optimal course of system upgrade, and determine the optimal location of sensors within a system. A general simulation model of system failure in terms of fault tree logic is described. The model is intended for efficient diagnosis of the causes of system failure in the event of a system breakdown. It can also be used to assist an operator in making decisions under a time constraint regarding the future course of operations. The model is well suited for computer implementation. New results incorporated in the simulation model include an algorithm to generate repair checklists on the basis of fault tree logic and a one-step-ahead optimization procedure that minimizes the expected time to diagnose system failure.
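As a small illustration of the probabilistic-importance idea, and not the IMPORTANCE code itself, the sketch below evaluates a toy fault tree from its minimal cut sets by exact enumeration of basic-event states and ranks the basic events by Birnbaum importance. The cut sets and failure probabilities are assumed for illustration.

```python
from itertools import product

# Minimal cut sets of a small example fault tree (basic events A, B, C).
# The top event occurs if every basic event in some cut set has failed.
cut_sets = [{"A", "B"}, {"A", "C"}, {"B", "C"}]   # 2-out-of-3 failure logic

# Assumed basic-event failure probabilities (illustrative relative data).
q = {"A": 1e-3, "B": 5e-4, "C": 2e-3}

def top_probability(q):
    """Exact top-event probability by enumerating basic-event states."""
    events = sorted(q)
    total = 0.0
    for states in product([0, 1], repeat=len(events)):
        state = dict(zip(events, states))
        prob = 1.0
        for e in events:
            prob *= q[e] if state[e] else (1.0 - q[e])
        failed = {e for e in events if state[e]}
        if any(cs <= failed for cs in cut_sets):
            total += prob
    return total

def birnbaum(event):
    """Birnbaum importance: P(top | event failed) - P(top | event working)."""
    return top_probability({**q, event: 1.0}) - top_probability({**q, event: 0.0})

print("top-event probability:", top_probability(q))
for e in sorted(q, key=birnbaum, reverse=True):
    print(f"  Birnbaum importance of {e}: {birnbaum(e):.3e}")
```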
Advancing satellite operations with intelligent graphical monitoring systems
NASA Technical Reports Server (NTRS)
Hughes, Peter M.; Shirah, Gregory W.; Luczak, Edward C.
1993-01-01
For nearly twenty-five years, spacecraft missions have been operated in essentially the same manner: human operators monitor displays filled with alphanumeric text watching for limit violations or other indicators that signal a problem. The task is performed predominately by humans. Only in recent years have graphical user interfaces and expert systems been accepted within the control center environment to help reduce operator workloads. Unfortunately, the development of these systems is often time consuming and costly. At the NASA Goddard Space Flight Center (GSFC), a new domain specific expert system development tool called the Generic Spacecraft Analyst Assistant (GenSAA) has been developed. Through the use of a highly graphical user interface and point-and-click operation, GenSAA facilitates the rapid, 'programming-free' construction of intelligent graphical monitoring systems to serve as real-time, fault-isolation assistants for spacecraft analysts. Although specifically developed to support real-time satellite monitoring, GenSAA can support the development of intelligent graphical monitoring systems in a variety of space and commercial applications.
NASA Astrophysics Data System (ADS)
Sadler, Laurel
2017-05-01
In today's battlefield environments, analysts are inundated with real-time data received from the tactical edge that must be evaluated and used for managing and modifying current missions as well as planning for future missions. This paper describes a framework that facilitates a Value of Information (VoI) based data analytics tool for information object (IO) analysis in a tactical and command and control (C2) environment, which reduces analyst work load by providing automated or analyst assisted applications. It allows the analyst to adjust parameters for data matching of the IOs that will be received and provides agents for further filtering or fusing of the incoming data. It allows for analyst enhancement and markup to be made to and/or comments to be attached to the incoming IOs, which can then be re-disseminated utilizing the VoI based dissemination service. The analyst may also adjust the underlying parameters before re-dissemination of an IO, which will subsequently adjust the value of the IO based on this new/additional information that has been added, possibly increasing the value from the original. The framework is flexible and extendable, providing an easy to use, dynamically changing Command and Control decision aid that focuses and enhances the analyst workflow.
The analyst: his professional novel.
Ambrosiano, Laura
2005-12-01
The psychoanalyst needs to be in touch with a community of colleagues; he needs to feel part of a group with which he can share cognitive tension and therapeutic knowledge. Yet group ties are an aspect we analysts seldom discuss. The author defines the analyst's 'professional novel' as the emotional vicissitudes with the group that have marked the professional itinerary of every analyst; his relationship with institutions and with theories, and the emotional nuance of these relationships. The analyst's professional novel is the narrative elaboration of his professional autobiography. It is capable of transforming the individual's need to belong and the paths of identification and de-identification. Experience of the oedipal configuration allows the analyst to begin psychic work aimed at gaining spaces of separateness in his relationship with the group. This passage is marked by the work on mourning that separation involves, but also of mourning implicit in the awareness of the representative limits of our theories. Right from the start of analysis, the patient observes the emotional nuance of the analyst's connection to his group and theories; the patient notices how much this connection is governed by rigid needs to belong, and how much freedom of thought and exploration it allows the analyst. The author uses clinical examples to illustrate these hypotheses.
Godsil, Geraldine
2018-02-01
This paper discusses the residues of a somatic countertransference that revealed its meaning several years after apparently successful analytic work had ended. Psychoanalytic and Jungian analytic ideas on primitive communication, dissociation and enactment are explored in the working through of a shared respiratory symptom between patient and analyst. Growth in the analyst was necessary so that the patient's communication at a somatic level could be understood. Bleger's concept that both the patient's and analyst's body are part of the setting was central in the working through. © 2018, The Society of Analytical Psychology.
NASA Astrophysics Data System (ADS)
Yager, Kevin; Albert, Thomas; Brower, Bernard V.; Pellechia, Matthew F.
2015-06-01
The domain of Geospatial Intelligence Analysis is rapidly shifting toward a new paradigm of Activity Based Intelligence (ABI) and information-based Tipping and Cueing. General requirements for an advanced ABIAA system present significant challenges in architectural design, computing resources, data volumes, workflow efficiency, data mining and analysis algorithms, and database structures. These sophisticated ABI software systems must include advanced algorithms that automatically flag activities of interest in less time and within larger data volumes than can be processed by human analysts. In doing this, they must also maintain the geospatial accuracy necessary for cross-correlation of multi-intelligence data sources. Historically, serial architectural workflows have been employed in ABIAA system design for tasking, collection, processing, exploitation, and dissemination. These simpler architectures may produce implementations that solve short term requirements; however, they have serious limitations that preclude them from being used effectively in an automated ABIAA system with multiple data sources. This paper discusses modern ABIAA architectural considerations providing an overview of an advanced ABIAA system and comparisons to legacy systems. It concludes with a recommended strategy and incremental approach to the research, development, and construction of a fully automated ABIAA system.
Using MetaboAnalyst 3.0 for Comprehensive Metabolomics Data Analysis.
Xia, Jianguo; Wishart, David S
2016-09-07
MetaboAnalyst (http://www.metaboanalyst.ca) is a comprehensive Web application for metabolomic data analysis and interpretation. MetaboAnalyst handles most of the common metabolomic data types from most kinds of metabolomics platforms (MS and NMR) for most kinds of metabolomics experiments (targeted, untargeted, quantitative). In addition to providing a variety of data processing and normalization procedures, MetaboAnalyst also supports a number of data analysis and data visualization tasks using a range of univariate and multivariate methods such as PCA (principal component analysis), PLS-DA (partial least squares discriminant analysis), heatmap clustering and machine learning methods. MetaboAnalyst also offers a variety of tools for metabolomic data interpretation including MSEA (metabolite set enrichment analysis), MetPA (metabolite pathway analysis), and biomarker selection via ROC (receiver operating characteristic) curve analysis, as well as time series and power analysis. This unit provides an overview of the main functional modules and the general workflow of the latest version of MetaboAnalyst (MetaboAnalyst 3.0), followed by eight detailed protocols. © 2016 by John Wiley & Sons, Inc.
Simulation and analysis of differential global positioning system for civil helicopter operations
NASA Technical Reports Server (NTRS)
Denaro, R. P.; Cabak, A. R.
1983-01-01
A Differential Global Positioning System (DGPS) computer simulation was developed to provide a versatile tool for assessing DGPS-referenced civil helicopter navigation. The civil helicopter community will probably be an early user of the GPS capability because of unique mission requirements which include offshore exploration and low altitude transport into remote areas not currently served by ground based Navaids. The Monte Carlo simulation provided a sufficiently high fidelity dynamic motion and propagation environment to enable accurate comparisons of alternative differential GPS implementations and navigation filter tradeoffs. The analyst is provided with the capability to adjust most aspects of the system, the helicopter flight profile, the receiver Kalman filter, and the signal propagation environment to assess differential GPS performance and parameter sensitivities. Preliminary analysis was conducted to evaluate alternative implementations of the differential navigation algorithm in both the position and measurement domains. Results are presented to show that significant performance gains are achieved when compared with conventional GPS, but that differences due to DGPS implementation techniques were small. System performance was relatively insensitive to the update rates of the error correction information.
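To illustrate the distinction drawn above between position-domain and measurement-domain implementations, the sketch below applies a position-domain differential correction: the reference station's known surveyed position minus its GPS-derived position becomes a correction added to the user (helicopter) solution. All coordinates and error magnitudes are invented for illustration, and the sketch omits the Kalman filter entirely.

```python
import numpy as np

# Position-domain differential GPS, simplest form: the reference-station
# solution error is assumed to be common-mode for nearby users.

# Surveyed (true) position of the reference station, ECEF meters (assumed).
ref_true = np.array([-2694685.0, -4293642.0, 3857878.0])

# GPS-derived positions; errors are illustrative and largely common-mode.
common_error = np.array([4.0, -3.0, 2.5])          # e.g., atmospheric + ephemeris
ref_gps = ref_true + common_error + np.array([0.3, -0.2, 0.1])
heli_true = ref_true + np.array([12000.0, -5000.0, 300.0])   # helicopter ~13 km away
heli_gps = heli_true + common_error + np.array([-0.4, 0.5, 0.2])

# Reference station computes the correction from its known survey point.
correction = ref_true - ref_gps

# User applies the broadcast correction to its own solution.
heli_corrected = heli_gps + correction

print("uncorrected error (m):", np.linalg.norm(heli_gps - heli_true))
print("corrected error   (m):", np.linalg.norm(heli_corrected - heli_true))
```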
Exploring the Human Fabric through an Analyst's Eyes
NASA Astrophysics Data System (ADS)
Belov, Nadya; Patti, Jeff; Wilcox, Saki; Almanzar, Rafael; Kim, Janet; Kellogg, Jennifer; Dang, Steven
The nature and type of conflicts drastically changed in the last half of the twentieth century. Wars are no longer limited to the field; they are supplemented with guerrilla warfare and other asymmetric warfare tactics including domestic terrorism. Domestic terrorism has demonstrated a need for improved homeland security capabilities. Establishing and maintaining the understanding of the key players and the underlying social networks is essential to combating asynchronous warfare tactics. Herein, we identify the key challenges addressed by our Collection/Exploitation Decision System (CEDS) that assist analysts in maintaining an up-to-date understanding of dynamic human networks.
Models, Data, and War: a Critique of the Foundation for Defense Analyses.
1980-03-12
scientific formulation; An "objective" solution; Analysis of a squishy problem; A judgmental formulation; A potential for distortion; A subjective...inextricably tied to those judgments. Different analysts, with apparently identical knowledge of a real world problem, may develop plausible formulations...configured is a concrete theoretical statement." 2/ The formulation of a computer model--conceiving a mathematical representation of the real world
Air Vehicles Division Computational Structural Analysis Facilities Policy and Guidelines for Users
2005-05-01
"Thermal" as appropriate and the tolerance set to "default". b) Create the model geometry. c) Create the finite elements. d) Create the...linear, non-linear, dynamic, thermal, acoustic analysis. The modelling of composite materials, creep, fatigue and plasticity are also covered...perform professional, high quality finite element analysis (FEA). FE analysts from many tasks within AVD are using the facilities to conduct FEA with
Evaluation of the Presentation of Network Data via Visualization Tools for Network Analysts
2014-03-01
A. (eds.) The Human Computer Interaction Handbook, pp. 544–582. Lawrence Erlbaum Associates, Mahwah, NJ, 2003. 4. Goodall, John R. Introduction to...of either display type being used in the analysis of cyber security tasks. Goodall (19) is one of few whose work focused on comparing user...relating source IP address to destination IP address and time, Goodall remains the only known approach comparing tabular and graphical displays
Tree value system: users guide.
J.K. Ayer Sachet; D.G. Briggs; R.D. Fight
1989-01-01
This paper instructs resource analysts on use of the Tree Value System (TREEVAL). TREEVAL is a microcomputer system of programs for calculating tree or stand values and volumes based on predicted product recovery. Designed for analyzing silvicultural decisions, the system can also be used for appraisals and for evaluating log bucking. The system calculates results...
Lecture Notes on Criticality Safety Validation Using MCNP & Whisper
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise
Training classes for nuclear criticality safety, MCNP documentation. The need for, and problems surrounding, validation of computer codes and data are considered first. Then some background for MCNP & Whisper is given--best practices for Monte Carlo criticality calculations, neutron spectra, S(α,β) thermal neutron scattering data, nuclear data sensitivities, covariance data, and correlation coefficients. Whisper is computational software designed to assist the nuclear criticality safety analyst with validation studies with the Monte Carlo radiation transport package MCNP. Whisper's methodology (benchmark selection – Ck's, weights; extreme value theory – bias, bias uncertainty; MOS for nuclear data uncertainty – GLLS) and usage are discussed.
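As a rough sketch of the benchmark-weighting idea named above (weights based on Ck correlation coefficients), and not Whisper's actual algorithm, the snippet below computes a weighted calculational bias from hypothetical calculated-minus-experimental keff differences for a small benchmark suite. All numbers are invented for illustration.

```python
import numpy as np

# Hypothetical benchmark suite: calculated and experimental keff values,
# plus a similarity weight for each benchmark relative to the application
# (e.g., derived from nuclear-data sensitivity correlation coefficients Ck).
calc_keff = np.array([0.9987, 1.0012, 0.9993, 1.0005, 0.9978])
exp_keff  = np.array([1.0000, 1.0000, 1.0000, 1.0000, 1.0000])
ck_weight = np.array([0.95, 0.90, 0.80, 0.97, 0.85])   # assumed similarity weights

diffs = calc_keff - exp_keff             # per-benchmark bias contributions
weights = ck_weight / ck_weight.sum()

bias = float(np.sum(weights * diffs))                              # weighted mean bias
bias_sd = float(np.sqrt(np.sum(weights * (diffs - bias) ** 2)))    # weighted spread

print(f"weighted bias        : {bias:+.5f}")
print(f"weighted bias spread : {bias_sd:.5f}")
# Whisper additionally applies extreme value theory and a GLLS-based margin
# for nuclear-data uncertainty; those steps are not reproduced here.
```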
Requirements for a multifunctional code architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tiihonen, O.; Juslin, K.
1997-07-01
The present paper studies a set of requirements for a multifunctional simulation software architecture in the light of experiences gained in developing and using the APROS simulation environment. The huge steps taken in the development of computer hardware and software during the last ten years are changing the status of the traditional nuclear safety analysis software. The affordable computing power on the safety analyst's table by far exceeds the possibilities offered to him/her ten years ago. At the same time the features of everyday office software tend to set standards for the way the input data and calculational results are managed.
Validation of a common data model for active safety surveillance research
Ryan, Patrick B; Reich, Christian G; Hartzema, Abraham G; Stang, Paul E
2011-01-01
Objective Systematic analysis of observational medical databases for active safety surveillance is hindered by the variation in data models and coding systems. Data analysts often find robust clinical data models difficult to understand and ill suited to support their analytic approaches. Further, some models do not facilitate the computations required for systematic analysis across many interventions and outcomes for large datasets. Translating the data from these idiosyncratic data models to a common data model (CDM) could facilitate both the analysts' understanding and the suitability for large-scale systematic analysis. In addition to facilitating analysis, a suitable CDM has to faithfully represent the source observational database. Before beginning to use the Observational Medical Outcomes Partnership (OMOP) CDM and a related dictionary of standardized terminologies for a study of large-scale systematic active safety surveillance, the authors validated the model's suitability for this use by example. Validation by example To validate the OMOP CDM, the model was instantiated into a relational database, data from 10 different observational healthcare databases were loaded into separate instances, a comprehensive array of analytic methods that operate on the data model was created, and these methods were executed against the databases to measure performance. Conclusion There was acceptable representation of the data from 10 observational databases in the OMOP CDM using the standardized terminologies selected, and a range of analytic methods was developed and executed with sufficient performance to be useful for active safety surveillance. PMID:22037893
The Pope's confessor: a metaphor relating to illness in the analyst.
Clark, R W
1995-01-01
This paper examines some of the internal and external eventualities in the situation of illness in the analyst. The current emphasis on the use of the self as part of the analyzing instrument makes impairments in the analyst's physical well-being potentially disabling to the analytic work. A recommendation is made for analysts, both individually and as a professional group, to always consider this aspect of a personal medical problem.
Desire and the female analyst.
Schaverien, J
1996-04-01
The literature on erotic transference and countertransference between female analyst and male patient is reviewed and discussed. It is known that female analysts are less likely than their male colleagues to act out sexually with their patients. It has been claimed that a) male patients do not experience sustained erotic transferences, and b) female analysts do not experience erotic countertransferences with female or male patients. These views are challenged and it is argued that, if there is less sexual acting out by female analysts, it is not because of an absence of eros in the therapeutic relationship. The literature review covers material drawn from psychoanalysis, feminist psychotherapy, Jungian analysis, as well as some sociological and cultural sources. It is organized under the following headings: the gender of the analyst, sexual acting out, erotic transference, maternal and paternal transference, gender and power, countertransference, incest taboo--mothers and sons and sexual themes in the transference.
Organizational Performance and Customer Value
ERIC Educational Resources Information Center
Tosti, Donald; Herbst, Scott A.
2009-01-01
While behavior systems analysts have recognized the importance of the consumer of organizational products (i.e., receiving system) in developing models of organizational change, few have offered a systematic assessment of the relationship between consumer and organizational practices. In this article we will discuss how a behavior systems approach…
75 FR 22187 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-27
... (VA) proposes to amend the existing system of records titled ``Compensation, Pension, Education, and... by adding a new system location and a new routine use regarding transfer of educational benefits... Analyst, Education Service (225C), Department of Veterans Affairs, 810 Vermont Avenue, NW., Washington, DC...
78 FR 20168 - Twenty Fourth Meeting: RTCA Special Committee 203, Unmanned Aircraft Systems
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-03
... Washington, DC, on March 28, 2013. Paige Williams, Management Analyst, NextGen, Business Operations Group... Introductions Review Meeting Agenda Review/Approval of Twenty Third Plenary Meeting Summary Leadership Update... for Unmanned Aircraft Systems and Minimum Aviation System Performance Standards Other Business Adjourn...
What value may geographic information systems add to the art of identifying crash countermeasures?
DOT National Transportation Integrated Search
1999-04-01
Geographic Information Systems (GIS) can be employed to relate, organize, and analyze roadway and crash data, thereby facilitating crash countermeasure identification and evaluation. GIS cannot, however, replace the critical role of the local analyst...
Correlation of ERTS MSS data and earth coordinate systems
NASA Technical Reports Server (NTRS)
Malila, W. A. (Principal Investigator); Hieber, R. H.; Mccleer, A. P.
1973-01-01
The author has identified the following significant results. Experience has revealed a problem in the analysis and interpretation of ERTS-1 multispectral scanner (MSS) data. The problem is one of accurately correlating ERTS-1 MSS pixels with analysis areas specified on aerial photographs or topographic maps for training recognition computers and/or evaluating recognition results. It is difficult for an analyst to accurately identify which ERTS-1 pixels on a digital image display belong to specific areas and test plots, especially when they are small. A computer-aided procedure to correlate coordinates from topographic maps and/or aerial photographs with ERTS-1 data coordinates has been developed. In the procedure, a map transformation from earth coordinates to ERTS-1 scan line and point numbers is calculated using selected ground control points and the method of least squares. The map transformation is then applied to the earth coordinates of selected areas to obtain the corresponding ERTS-1 point and line numbers. An optional provision allows moving the boundaries of the plots inward by variable distances so the selected pixels will not overlap adjacent features.
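The map transformation described above can be illustrated with a small least-squares fit: given ground control points with both map (easting, northing) and image (point, line) coordinates, an affine transform is estimated and then applied to a plot-boundary coordinate. The control points below are invented for illustration, and the affine form is an assumption since the original procedure's exact transform is not specified here.

```python
import numpy as np

# Ground control points: map easting/northing (m) and the ERTS point/line
# coordinates identified for each control point (all values illustrative).
map_xy = np.array([[500000.0, 4100000.0],
                   [512000.0, 4101500.0],
                   [505500.0, 4092000.0],
                   [498000.0, 4087500.0]])
img_pl = np.array([[1010.5, 2200.0],     # (point, line) for each control point
                   [1160.2, 2188.7],
                   [1078.9, 2305.4],
                   [ 985.4, 2360.1]])

# Fit an affine transform [point, line] = [E, N, 1] @ coeffs by least squares.
design = np.hstack([map_xy, np.ones((len(map_xy), 1))])
coeffs, *_ = np.linalg.lstsq(design, img_pl, rcond=None)

def map_to_image(easting, northing):
    """Apply the fitted earth-to-image transform to one map coordinate."""
    return np.array([easting, northing, 1.0]) @ coeffs

# Apply the fitted transform to a test-plot corner given on the map.
print("plot corner maps to (point, line):", map_to_image(503000.0, 4095000.0))
```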
The combined use of order tracking techniques for enhanced Fourier analysis of order components
NASA Astrophysics Data System (ADS)
Wang, K. S.; Heyns, P. S.
2011-04-01
Order tracking is one of the most important vibration analysis techniques for diagnosing faults in rotating machinery. It can be performed in many different ways, each of these with distinct advantages and disadvantages. However, in the end the analyst will often use Fourier analysis to transform the data from a time series to frequency or order spectra. It is therefore surprising that the study of the Fourier analysis of order-tracked systems seems to have been largely ignored in the literature. This paper considers the frequently used Vold-Kalman filter-based order tracking and computed order tracking techniques. The main pros and cons of each technique for Fourier analysis are discussed and the sequential use of Vold-Kalman filtering and computed order tracking is proposed as a novel idea to enhance the results of Fourier analysis for determining the order components. The advantages of the combined use of these order tracking techniques are demonstrated numerically on an SDOF rotor simulation model. Finally, the approach is also demonstrated on experimental data from a real rotating machine.
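As a minimal sketch of computed order tracking only (not the Vold-Kalman filter and not the paper's code), the snippet below resamples a vibration signal from uniform time samples to uniform shaft-angle samples using a tachometer-derived angle, then takes an FFT so that spectral bins correspond to orders. The simulated run-up signal, sample rate, and block size are illustrative assumptions.

```python
import numpy as np

fs = 2048.0                          # Hz, sample rate (assumed)
t = np.arange(0, 10, 1 / fs)         # 10 s record

# Simulated run-up: shaft speed ramps 10 -> 30 Hz; response dominated by order 3.
shaft_freq = 10.0 + 2.0 * t                              # Hz, instantaneous shaft frequency
shaft_angle = 2 * np.pi * np.cumsum(shaft_freq) / fs     # rad, integrated shaft angle
signal = np.sin(3 * shaft_angle) + 0.2 * np.random.default_rng(0).normal(size=t.size)

# Computed order tracking: resample the signal at equal increments of angle.
revs = shaft_angle / (2 * np.pi)                 # cumulative shaft revolutions
samples_per_rev = 64
uniform_revs = np.arange(0, np.floor(revs[-1]), 1.0 / samples_per_rev)
angle_domain_signal = np.interp(uniform_revs, revs, signal)

# FFT in the angle domain: frequency axis is now in orders (cycles per revolution).
n_rev_block = 32                                  # analyze a 32-revolution block
block = angle_domain_signal[: n_rev_block * samples_per_rev]
spectrum = np.abs(np.fft.rfft(block)) / len(block)
orders = np.fft.rfftfreq(len(block), d=1.0 / samples_per_rev)

print("dominant order:", orders[spectrum[1:].argmax() + 1])   # expect ~3
```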
Visualization techniques for computer network defense
NASA Astrophysics Data System (ADS)
Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew
2011-06-01
Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.
Elements of analytic style: Bion's clinical seminars.
Ogden, Thomas H
2007-10-01
The author finds that the idea of analytic style better describes significant aspects of the way he practices psychoanalysis than does the notion of analytic technique. The latter is comprised to a large extent of principles of practice developed by previous generations of analysts. By contrast, the concept of analytic style, though it presupposes the analyst's thorough knowledge of analytic theory and technique, emphasizes (1) the analyst's use of his unique personality as reflected in his individual ways of thinking, listening, and speaking, his own particular use of metaphor, humor, irony, and so on; (2) the analyst's drawing on his personal experience, for example, as an analyst, an analysand, a parent, a child, a spouse, a teacher, and a student; (3) the analyst's capacity to think in a way that draws on, but is independent of, the ideas of his colleagues, his teachers, his analyst, and his analytic ancestors; and (4) the responsibility of the analyst to invent psychoanalysis freshly for each patient. Close readings of three of Bion's 'Clinical seminars' are presented in order to articulate some of the elements of Bion's analytic style. Bion's style is not presented as a model for others to emulate or, worse yet, imitate; rather, it is described in an effort to help the reader consider from a different vantage point (provided by the concept of analytic style) the way in which he, the reader, practices psychoanalysis.
Analyzing Human-Landscape Interactions: Tools That Integrate
NASA Astrophysics Data System (ADS)
Zvoleff, Alex; An, Li
2014-01-01
Humans have transformed much of Earth's land surface, giving rise to loss of biodiversity, climate change, and a host of other environmental issues that are affecting human and biophysical systems in unexpected ways. To confront these problems, environmental managers must consider human and landscape systems in integrated ways. This means making use of data obtained from a broad range of methods (e.g., sensors, surveys), while taking into account new findings from the social and biophysical science literatures. New integrative methods (including data fusion, simulation modeling, and participatory approaches) have emerged in recent years to address these challenges, and to allow analysts to provide information that links qualitative and quantitative elements for policymakers. This paper brings attention to these emergent tools while providing an overview of the tools currently in use for analysis of human-landscape interactions. Analysts are now faced with a staggering array of approaches in the human-landscape literature—in an attempt to bring increased clarity to the field, we identify the relative strengths of each tool, and provide guidance to analysts on the areas to which each tool is best applied. We discuss four broad categories of tools: statistical methods (including survival analysis, multi-level modeling, and Bayesian approaches), GIS and spatial analysis methods, simulation approaches (including cellular automata, agent-based modeling, and participatory modeling), and mixed-method techniques (such as alternative futures modeling and integrated assessment). For each tool, we offer an example from the literature of its application in human-landscape research. Among these tools, participatory approaches are gaining prominence for analysts to make the broadest possible array of information available to researchers, environmental managers, and policymakers. Further development of new approaches of data fusion and integration across sites or disciplines pose an important challenge for future work in integrating human and landscape components.
ASPECTS: an automation-assisted SPE method development system.
Li, Ming; Chou, Judy; King, Kristopher W; Yang, Liyu
2013-07-01
A typical conventional SPE method development (MD) process usually involves deciding the chemistry of the sorbent and eluent based on information about the analyte; experimentally preparing and trying out various combinations of adsorption chemistry and elution conditions; quantitatively evaluating the various conditions; and comparing quantitative results from all combination of conditions to select the best condition for method qualification. The second and fourth steps have mostly been performed manually until now. We developed an automation-assisted system that expedites the conventional SPE MD process by automating 99% of the second step, and expedites the fourth step by automatically processing the results data and presenting it to the analyst in a user-friendly format. The automation-assisted SPE MD system greatly saves the manual labor in SPE MD work, prevents analyst errors from causing misinterpretation of quantitative results, and shortens data analysis and interpretation time.
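To illustrate the combinatorial evaluation being automated (a hedged sketch, not the ASPECTS software), the snippet below scores hypothetical recovery and matrix-effect results for each sorbent/eluent combination and reports the best-performing condition. The condition names, numbers, and composite scoring rule are all invented for illustration.

```python
# Hypothetical SPE method-development grid: analyte recovery (%) and
# matrix effect (%) measured for each sorbent x eluent combination.
results = {
    ("mixed-mode cation", "2% formic acid in MeOH"): {"recovery": 92.0, "matrix_effect": 8.0},
    ("mixed-mode cation", "5% NH4OH in MeOH"):       {"recovery": 71.0, "matrix_effect": 5.0},
    ("reversed-phase",    "2% formic acid in MeOH"): {"recovery": 85.0, "matrix_effect": 22.0},
    ("reversed-phase",    "5% NH4OH in MeOH"):       {"recovery": 88.0, "matrix_effect": 15.0},
}

def score(metrics):
    # Simple composite: reward recovery, penalize matrix effect (assumed weighting).
    return metrics["recovery"] - 0.5 * metrics["matrix_effect"]

ranked = sorted(results.items(), key=lambda kv: score(kv[1]), reverse=True)
for (sorbent, eluent), metrics in ranked:
    print(f"{sorbent:18s} | {eluent:24s} | "
          f"recovery {metrics['recovery']:5.1f}% | score {score(metrics):5.1f}")

best_sorbent, best_eluent = ranked[0][0]
print("\nSelected condition for qualification:", best_sorbent, "/", best_eluent)
```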
Systems Analysts in Higher Education: Some Concerns.
ERIC Educational Resources Information Center
Alden, John W.
This paper focuses on the concerns associated with the use of systems analysis in higher education. One fear is that systems analysis will increase the need for centralized authority and highly structured activity, thus contributing to further alienation and dehumanization. A second objection pertains to the increased requirement for specifying…
Archiving a Software Development Project
2013-04-01
an ongoing monitoring system that identifies attempts and requests for retrieval, and ensures that the attempts and requests cannot proceed without...Intelligence Division Peter Fisher has worked as a consultant, systems analyst, software developer and project manager in Australia, Holland, the USA...4 3.1.3 DRMS – Defence Records Management System
Monterey Bay study. [analysis of Landsat 1 multispectral band scanner data
NASA Technical Reports Server (NTRS)
Bizzell, R. M.; Wade, L. C.
1975-01-01
The multispectral scanner capabilities of LANDSAT 1 were tested over California's Monterey Bay area and portions of the San Joaquin Valley. Using both computer aided and image interpretive processing techniques, the LANDSAT 1 data were analyzed to determine their potential application in terms of land use and agriculture. Utilizing LANDSAT 1 data, analysts were able to provide the identifications and areal extent of the individual land use categories ranging from very general to highly specific levels (e.g., from agricultural lands to specific field crop types and even the different stages of growth). It is shown that the LANDSAT system is useful in the identification of major crop species and the delineation of numerous land use categories on a global basis and that repeated surveillance would permit the monitoring of changes in seasonal growth characteristics of crops as well as the assessment of various cultivation practices with a minimum of onsite observation. The LANDSAT system is demonstrated to be useful in the planning and development of resource programs on earth.
Human-system interfaces for space cognitive awareness
NASA Astrophysics Data System (ADS)
Ianni, J.
Space situational awareness is a human activity. We have advanced sensors and automation capabilities but these continue to be tools for humans to use. The reality is, however, that humans cannot take full advantage of the power of these tools due to time constraints, cognitive limitations, poor tool integration, poor human-system interfaces, and other reasons. Some excellent tools may never be used in operations and, even if they were, they may not be well suited to provide a cohesive and comprehensive picture. Recognizing this, the Air Force Research Laboratory (AFRL) is applying cognitive science principles to increase the knowledge derived from existing tools and creating new capabilities to help space analysts and decision makers. At the center of this research is Sensemaking Support Environment technology. The concept is to create cognitive-friendly computer environments that connect critical and creative thinking for holistic decision making. AFRL is also investigating new visualization technologies for multi-sensor exploitation and space weather, human-to-human collaboration technologies, and other technology that will be discussed in this paper.
An artificial intelligence approach to lithostratigraphic correlation using geophysical well logs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olea, R.A.; Davis, J.C.
1986-01-01
Computer programs for lithostratigraphic correlation of well logs have achieved limited success. Their algorithms are based on an oversimplified view of the manual process used by analysts to establish geologically correct correlations. The programs experience difficulties if the correlated rocks deviate from an ideal geometry of perfectly homogeneous, parallel layers of infinite extent. Artificial intelligence provides a conceptual basis for formulating the task of lithostratigraphic correlation, leading to more realistic procedures. A prototype system using the "production rule" approach of expert systems successfully correlates well logs in areas of stratigraphic complexity. Two digitized logs are used per well, one for curve matching and the other for lithologic comparison. The software has been successfully used to correlate more than 100,000 ft (30 480 m) of section, through clastic sequences in Louisiana and through carbonate sequences in Kansas. Correlations have been achieved even in the presence of faults, unconformities, facies changes, and lateral variations in bed thickness.
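The following toy sketch illustrates the flavor of a production rule for accepting a candidate correlation between two wells; the similarity measure, threshold, and lithology check are assumptions for illustration, not the published system's rules.

```python
# Illustrative sketch only: a toy "production rule" for accepting a candidate
# correlation between two well-log intervals (names and thresholds assumed).
import numpy as np

def curve_similarity(seg_a, seg_b):
    """Pearson correlation of two equally sampled log segments."""
    return float(np.corrcoef(seg_a, seg_b)[0, 1])

def rule_accept(seg_a, seg_b, litho_a, litho_b, min_similarity=0.8):
    """Fire only if both the curve-matching log and the lithology log agree."""
    if curve_similarity(seg_a, seg_b) < min_similarity:
        return False
    if litho_a != litho_b:
        return False
    return True

depth = np.linspace(0, 30, 100)
well1 = np.sin(depth / 4.0) + 0.05 * np.random.default_rng(0).normal(size=100)
well2 = np.sin(depth / 4.0) + 0.05 * np.random.default_rng(1).normal(size=100)
print(rule_accept(well1, well2, "shale", "shale"))   # True for this toy data
```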
DOT National Transportation Integrated Search
2009-11-01
The Great Lakes Maritime Information Delivery System (GLMIDS) is designed to facilitate the acquisition, storage, management, analysis and exchange of data between analysts and decision-makers within maritime commerce. (See http://maritime.utoledo.ed...
SafetyAnalyst : software tools for safety management of specific highway sites
DOT National Transportation Integrated Search
2010-07-01
SafetyAnalyst provides a set of software tools for use by state and local highway agencies for highway safety management. SafetyAnalyst can be used by highway agencies to improve their programming of site-specific highway safety improvements. SafetyA...
Exploring the Analytical Processes of Intelligence Analysts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, George; Kuchar, Olga A.; Wolf, Katherine E.
We present an observational case study in which we investigate and analyze the analytical processes of intelligence analysts. Participating analysts in the study carry out two scenarios where they organize and triage information, conduct intelligence analysis, report results, and collaborate with one another. Through a combination of artifact analyses, group interviews, and participant observations, we explore the space and boundaries in which intelligence analysts work and operate. We also assess the implications of our findings on the use and application of relevant information technologies.
Reflections: can the analyst share a traumatizing experience with a traumatized patient?
Lijtmaer, Ruth
2010-01-01
This is a personal account of a dreadful event in the analyst's life that was similar to a patient's trauma. It is a reflection on how the analyst dealt with her own trauma, the patient's trauma, and the transference and countertransference dynamics. Included is a description of the analyst's inner struggles with self-disclosure, continuance of her professional work, and the need for persistent self-scrutiny. The meaning of objects in people's life, particularly the concept of home, will be addressed.
Do Sell-Side Stock Analysts Exhibit Escalation of Commitment?
Milkman, Katherine L.
2010-01-01
This paper presents evidence that when an analyst makes an out-of-consensus forecast of a company’s quarterly earnings that turns out to be incorrect, she escalates her commitment to maintaining an out-of-consensus view on the company. Relative to an analyst who was close to the consensus, the out-of-consensus analyst adjusts her forecasts for the current fiscal year’s earnings less in the direction of the quarterly earnings surprise. On average, this type of updating behavior reduces forecasting accuracy, so it does not seem to reflect superior private information. Further empirical results suggest that analysts do not have financial incentives to stand by extreme stock calls in the face of contradictory evidence. Managerial and financial market implications are discussed. PMID:21516220
Mortality, integrity, and psychoanalysis (who are you to me? Who am I to you?).
Pinsky, Ellen
2014-01-01
The author narrates her experience of mourning her therapist's sudden death. The profession has neglected implications of the analyst's mortality: what is lost or vulnerable to loss? What is that vulnerability's function? The author's process of mourning included her writing and her becoming an analyst. Both pursuits inspired reflections on mortality in two overlapping senses: bodily (the analyst is mortal and can die) and character (the analyst is mortal and can err). The subject thus expands to include impaired character and ethical violations. Paradoxically, the analyst's human limitations threaten each psychoanalytic situation, but also enable it: human imperfection animates the work. The essay ends with a specific example of integrity. © 2014 The Psychoanalytic Quarterly, Inc.
The tobacco industry's use of Wall Street analysts in shaping policy.
Alamar, B C; Glantz, S A
2004-09-01
To document how the tobacco industry has used Wall Street analysts to further its public policy objectives. Searching tobacco documents available on the internet, newspaper articles, and transcripts of public hearings. The tobacco industry used nominally independent Wall Street analysts as third parties to support the tobacco industry's legislative agenda at both national and state levels in the USA. The tobacco industry has, for example, edited the testimony of at least one analyst before he testified to the US Senate Judiciary Committee, while representing himself as independent of the industry. The tobacco industry has used undisclosed collaboration with Wall Street analysts, as they have used undisclosed relationships with research scientists and academics, to advance the interests of the tobacco industry in public policy.
Interpolation Method Needed for Numerical Uncertainty
NASA Technical Reports Server (NTRS)
Groves, Curtis E.; Ilie, Marcel; Schallhorn, Paul A.
2014-01-01
Using Computational Fluid Dynamics (CFD) to predict a flow field is an approximation to the exact problem, and uncertainties exist. Errors in CFD can be approximated via Richardson's extrapolation, a method based on progressive grid refinement. To estimate the errors, the analyst must interpolate between at least three grids. This paper describes a study to find an appropriate interpolation scheme that can be used in Richardson's extrapolation or other uncertainty methods to approximate errors.
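A minimal worked example of Richardson extrapolation on three grid solutions is sketched below; the solution values and refinement ratio are invented, and the grid convergence index factor of 1.25 follows common practice rather than this paper.

```python
# Worked sketch of grid-convergence estimation via Richardson extrapolation,
# using three solutions on successively refined grids (values are invented).
import math

f1, f2, f3 = 0.9713, 0.9682, 0.9571   # fine, medium, coarse solutions (assumed)
r = 2.0                                # grid refinement ratio

p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)   # observed order of accuracy
f_exact = f1 + (f1 - f2) / (r**p - 1.0)             # Richardson extrapolate
gci_fine = 1.25 * abs((f1 - f2) / f1) / (r**p - 1)  # grid convergence index

print(f"observed order p   = {p:.2f}")
print(f"extrapolated value = {f_exact:.4f}")
print(f"GCI (fine grid)    = {100 * gci_fine:.2f}%")
```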
CASL Dakota Capabilities Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Simmons, Chris; Williams, Brian J.
2017-10-10
The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.
Design Through Analysis (DTA) roadmap vision.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blacker, Teddy Dean; Adams, Charles R.; Hoffman, Edward L.
2004-10-01
The Design through Analysis Realization Team (DART) will provide analysts with a complete toolset that reduces the time to create, generate, analyze, and manage the data generated in a computational analysis. The toolset will be both easy to learn and easy to use. The DART Roadmap Vision provides for progressive improvements that will reduce the Design through Analysis (DTA) cycle time by 90-percent over a three-year period while improving both the quality and accountability of the analyses.
SoccerStories: a kick-off for visual soccer analysis.
Perin, Charles; Vuillemot, Romain; Fekete, Jean-Daniel
2013-12-01
This article presents SoccerStories, a visualization interface to support analysts in exploring soccer data and communicating interesting insights. Currently, most analyses of such data relate to statistics on individual players or teams. However, the soccer analysts we collaborated with consider that quantitative analysis alone does not convey the right picture of the game, as context, player positions and phases of player actions are the most relevant aspects. We designed SoccerStories to support the current practice of soccer analysts and to enrich it, both in the analysis and communication stages. Our system provides an overview+detail interface of game phases, and their aggregation into a series of connected visualizations, each visualization being tailored for actions such as a series of passes or a goal attempt. To evaluate our tool, we ran two qualitative user studies on recent games using SoccerStories with data from one of the world's leading live sports data providers. The first study resulted in a series of four articles on soccer tactics, by a tactics analyst, who said he would not have been able to write these otherwise. The second study consisted of an exploratory follow-up to investigate design alternatives for embedding soccer phases into word-sized graphics. For both experiments, we received very enthusiastic feedback, and participants are considering further use of SoccerStories to enhance their current workflow.
Zhao, Jian; Glueck, Michael; Breslav, Simon; Chevalier, Fanny; Khan, Azam
2017-01-01
User-authored annotations of data can support analysts in the activity of hypothesis generation and sensemaking, where it is not only critical to document key observations, but also to communicate insights between analysts. We present annotation graphs, a dynamic graph visualization that enables meta-analysis of data based on user-authored annotations. The annotation graph topology encodes annotation semantics, which describe the content of and relations between data selections, comments, and tags. We present a mixed-initiative approach to graph layout that integrates an analyst's manual manipulations with an automatic method based on similarity inferred from the annotation semantics. Various visual graph layout styles reveal different perspectives on the annotation semantics. Annotation graphs are implemented within C8, a system that supports authoring annotations during exploratory analysis of a dataset. We apply principles of Exploratory Sequential Data Analysis (ESDA) in designing C8, and further link these to an existing task typology in the visualization literature. We develop and evaluate the system through an iterative user-centered design process with three experts, situated in the domain of analyzing HCI experiment data. The results suggest that annotation graphs are effective as a method of visually extending user-authored annotations to data meta-analysis for discovery and organization of ideas.
Rule-based expert system for maritime anomaly detection
NASA Astrophysics Data System (ADS)
Roy, Jean
2010-04-01
Maritime domain operators/analysts have a mandate to be aware of all that is happening within their areas of responsibility. This mandate derives from the needs to defend sovereignty, protect infrastructures, counter terrorism, detect illegal activities, etc., and it has become more challenging in the past decade, as commercial shipping turned into a potential threat. In particular, a huge portion of the data and information made available to the operators/analysts is mundane, from maritime platforms going about normal, legitimate activities, and it is very challenging for them to detect and identify the non-mundane. To achieve such anomaly detection, they must establish numerous relevant situational facts from a variety of sensor data streams. Unfortunately, many of the facts of interest just cannot be observed; the operators/analysts thus use their knowledge of the maritime domain and their reasoning faculties to infer these facts. As they are often overwhelmed by the large amount of data and information, automated reasoning tools could be used to support them by inferring the necessary facts, ultimately providing indications and warning on a small number of anomalous events worthy of their attention. Along this line of thought, this paper describes a proof-of-concept prototype of a rule-based expert system implementing automated rule-based reasoning in support of maritime anomaly detection.
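A minimal sketch of the rule-based reasoning idea follows; the rules, fact names, and thresholds are illustrative assumptions rather than the prototype's actual knowledge base.

```python
# Minimal sketch of rule-based reasoning for maritime anomaly detection
# (rules and thresholds are illustrative assumptions): each rule inspects
# established situational facts about a vessel track and emits an indication
# when it fires.
def rule_ais_gap(track):
    if track["ais_gap_hours"] > 6:
        return "AIS transponder silent for an extended period"

def rule_speed_vs_type(track):
    if track["vessel_type"] == "cargo" and track["speed_knots"] > 25:
        return "speed inconsistent with declared vessel type"

def rule_loitering(track):
    if track["in_shipping_lane"] and track["speed_knots"] < 1:
        return "loitering inside a shipping lane"

RULES = [rule_ais_gap, rule_speed_vs_type, rule_loitering]

def assess(track):
    return [msg for rule in RULES if (msg := rule(track))]

track = {"vessel_type": "cargo", "speed_knots": 0.4,
         "in_shipping_lane": True, "ais_gap_hours": 9}
print(assess(track))   # two indications fire for this toy track
```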
NASA Astrophysics Data System (ADS)
Brekke, L. D.; Pruitt, T.; Maurer, E. P.; Duffy, P. B.
2007-12-01
Incorporating climate change information into long-term evaluations of water and energy resources requires analysts to have access to climate projection data that have been spatially downscaled to "basin-relevant" resolution. This is necessary in order to develop system-specific hydrology and demand scenarios consistent with projected climate scenarios. Analysts currently have access to "climate model" resolution data (e.g., at LLNL PCMDI), but not spatially downscaled translations of these datasets. Motivated by a common interest in supporting regional and local assessments, the U.S. Bureau of Reclamation and LLNL (through support from the DOE National Energy Technology Laboratory) have teamed to develop an archive of downscaled climate projections (temperature and precipitation) with geographic coverage consistent with the North American Land Data Assimilation System domain, encompassing the contiguous United States. A web-based information service, hosted at LLNL Green Data Oasis, has been developed to provide Reclamation, LLNL, and other interested analysts free access to archive content. A contemporary statistical method was used to bias-correct and spatially disaggregate projection datasets, and was applied to 112 projections included in the WCRP CMIP3 multi-model dataset hosted by LLNL PCMDI (i.e. 16 GCMs and their multiple simulations of SRES A2, A1b, and B1 emissions pathways).
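The bias-correction step of such workflows is often implemented as empirical quantile mapping; the sketch below shows that idea on synthetic data and is a simplification, not the specific method used for the archive.

```python
# Hedged sketch of empirical quantile mapping, a common simplification of the
# bias-correction step in bias-correction/spatial-disaggregation workflows.
# All data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
obs      = rng.gamma(2.0, 3.0, size=3000)   # observed historical precipitation
gcm_hist = rng.gamma(2.0, 4.0, size=3000)   # GCM over the same historical period
gcm_fut  = rng.gamma(2.2, 4.0, size=3000)   # GCM projection to be corrected

def quantile_map(x, model_hist, observed):
    """Replace each model value with the observed value at the same quantile."""
    quantiles = np.searchsorted(np.sort(model_hist), x) / len(model_hist)
    quantiles = np.clip(quantiles, 0.0, 1.0)
    return np.quantile(observed, quantiles)

corrected = quantile_map(gcm_fut, gcm_hist, obs)
print("raw future mean      :", gcm_fut.mean().round(2))
print("corrected future mean:", corrected.mean().round(2))
```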
The development of a reliable amateur boxing performance analysis template.
Thomson, Edward; Lamb, Kevin; Nicholas, Ceri
2013-01-01
The aim of this study was to devise a valid performance analysis system for the assessment of the movement characteristics associated with competitive amateur boxing and assess its reliability using analysts of varying experience of the sport and performance analysis. Key performance indicators to characterise the demands of an amateur contest (offensive, defensive and feinting) were developed and notated using a computerised notational analysis system. Data were subjected to intra- and inter-observer reliability assessment using median sign tests and calculating the proportion of agreement within predetermined limits of error. For all performance indicators, intra-observer reliability revealed non-significant differences between observations (P > 0.05) and high agreement was established (80-100%) regardless of whether exact or the reference value of ±1 was applied. Inter-observer reliability was less impressive for both analysts (amateur boxer and experienced analyst), with the proportion of agreement ranging from 33-100%. Nonetheless, there was no systematic bias between observations for any indicator (P > 0.05), and the proportion of agreement within the reference range (±1) was 100%. A reliable performance analysis template has been developed for the assessment of amateur boxing performance and is available for use by researchers, coaches and athletes to classify and quantify the movement characteristics of amateur boxing.
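The agreement statistic described above can be sketched as follows; the per-indicator counts are invented, and the calculation shows exact agreement and agreement within the ±1 reference limit.

```python
# Sketch (assumed data) of the reliability calculation described above: the
# proportion of performance-indicator counts on which two observations agree,
# both exactly and within a reference limit of +/-1.
obs_1 = [12, 7, 4, 15, 9, 3, 6]    # first observation, per indicator
obs_2 = [12, 8, 4, 14, 9, 3, 7]    # second observation

def agreement(a, b, tolerance=0):
    hits = sum(abs(x - y) <= tolerance for x, y in zip(a, b))
    return 100.0 * hits / len(a)

print("exact agreement: %.0f%%" % agreement(obs_1, obs_2))
print("within +/-1    : %.0f%%" % agreement(obs_1, obs_2, tolerance=1))
```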
75 FR 20385 - Amended Certification Regarding Eligibility To Apply for Worker Adjustment Assistance
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-19
..., Inconen, CTS, Hi-Tec, Woods, Ciber, Kelly Services, Analysts International Corp, Comsys, Filter LLC..., Ciber, Kelly Services, Analysts International Corp, Comsys, Filter LLC, Excell, Entegee, Chipton-Ross..., Kelly Services, Analysts International Corp, Comsys, Filter LLC, Excell, Entegee, Chipton-Ross, Ian...
Subcellular object quantification with Squassh3C and SquasshAnalyst.
Rizk, Aurélien; Mansouri, Maysam; Ballmer-Hofer, Kurt; Berger, Philipp
2015-11-01
Quantitative image analysis plays an important role in contemporary biomedical research. Squassh is a method for automatic detection, segmentation, and quantification of subcellular structures and analysis of their colocalization. Here we present the applications Squassh3C and SquasshAnalyst. Squassh3C extends the functionality of Squassh to three fluorescence channels and live-cell movie analysis. SquasshAnalyst is an interactive web interface for the analysis of Squassh3C object data. It provides segmentation image overview and data exploration, figure generation, object and image filtering, and a statistical significance test in an easy-to-use interface. The overall procedure combines the Squassh3C plug-in for the free biological image processing program ImageJ and a web application working in conjunction with the free statistical environment R, and it is compatible with Linux, MacOS X, or Microsoft Windows. Squassh3C and SquasshAnalyst are available for download at www.psi.ch/lbr/SquasshAnalystEN/SquasshAnalyst.zip.
Nurses using futuristic technology in today's healthcare setting.
Wolf, Debra M; Kapadia, Amar; Kintzel, Jessie; Anton, Bonnie B
2009-01-01
Human-computer interaction (HCI) here takes the form of nurses using voice-assisted technology within a clinical setting to document patient care in real time, retrieve patient information from care plans, and complete routine tasks. This is a reality for clinicians today in acute and long-term care settings. Voice-assisted documentation provides accurate, hands-free and eyes-free documentation while enabling effective communication and task management. The speech technology increases the accuracy of documentation while interfacing directly with the electronic health record (EHR). Using a lightweight headset and a small, fist-sized wireless computer, verbal responses to easy-to-follow cues are converted into a database system, allowing staff to obtain individualized care status reports on demand. To further assist staff in their daily process, this innovative technology allows staff to send and receive pages as needed. This paper will discuss how leading-edge, award-winning technology is being integrated within the United States. Collaborative efforts between clinicians and analysts will be discussed, reflecting the interactive design-and-build functionality. Features such as the system's voice responses and directed cues will be shared, along with how easily data can be documented, viewed, and retrieved. Outcome data will be presented on how the technology affected the organization's quality outcomes, financial reimbursement, and employees' satisfaction.
This art of psychoanalysis. Dreaming undreamt dreams and interrupted cries.
Ogden, Thomas H
2004-08-01
It is the art of psychoanalysis in the making, a process inventing itself as it goes, that is the subject of this paper. The author articulates succinctly how he conceives of psychoanalysis, and offers a detailed clinical illustration. He suggests that each analysand unconsciously (and ambivalently) is seeking help in dreaming his 'night terrors' (his undreamt and undreamable dreams) and his 'nightmares' (his dreams that are interrupted when the pain of the emotional experience being dreamt exceeds his capacity for dreaming). Undreamable dreams are understood as manifestations of psychotic and psychically foreclosed aspects of the personality; interrupted dreams are viewed as reflections of neurotic and other non-psychotic parts of the personality. The analyst's task is to generate conditions that may allow the analysand--with the analyst's participation--to dream the patient's previously undreamable and interrupted dreams. A significant part of the analyst's participation in the patient's dreaming takes the form of the analyst's reverie experience. In the course of this conjoint work of dreaming in the analytic setting, the analyst may get to know the analysand sufficiently well for the analyst to be able to say something that is true to what is occurring at an unconscious level in the analytic relationship. The analyst's use of language contributes significantly to the possibility that the patient will be able to make use of what the analyst has said for purposes of dreaming his own experience, thereby dreaming himself more fully into existence.
Huff, Andrew G; Breit, Nathan; Allen, Toph; Whiting, Karissa; Kiley, Christopher
2016-01-01
The Global Rapid Identification of Threats System (GRITS) is a biosurveillance application that enables infectious disease analysts to monitor nontraditional information sources (e.g., social media, online news outlets, ProMED-mail reports, and blogs) for infectious disease threats. GRITS analyzes these textual data sources by identifying, extracting, and succinctly visualizing epidemiologic information and suggests potentially associated infectious diseases. This manuscript evaluates and verifies the diagnoses that GRITS performs and discusses novel aspects of the software package. Via GRITS' web interface, infectious disease analysts can examine dynamic visualizations of GRITS' analyses and explore historical infectious disease emergence events. The GRITS API can be used to continuously analyze information feeds, and the API enables GRITS technology to be easily incorporated into other biosurveillance systems. GRITS is a flexible tool that can be modified to conduct sophisticated medical report triaging, expanded to include customized alert systems, and tailored to address other biosurveillance needs.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-05
... Business Networks Services, Inc., Senior Analysts-Order Management, Voice Over Internet Protocol, Small And Medium Business, Tampa, Florida; Verizon Business Networks Services, Inc., Senior Coordinator-Order... Business Networks Services, Inc., Senior Analysts-Order Management, Voice Over Internet Protocol, Small and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
...,968B] Verizon Business Networks Services, Inc. Senior Analysts-Sales Implementation (SA-SI) Birmingham, Alabama; Verizon Business Networks Services, Inc. Senior Analysts-Sales Implementation (SA-SI) Service Program Delivery Division San Francisco, California; Verizon Business Networks Services, Inc. Senior...
User Interface Developed for Controls/CFD Interdisciplinary Research
NASA Technical Reports Server (NTRS)
1996-01-01
The NASA Lewis Research Center, in conjunction with the University of Akron, is developing analytical methods and software tools to create a cross-discipline "bridge" between controls and computational fluid dynamics (CFD) technologies. Traditionally, the controls analyst has used simulations based on large lumping techniques to generate low-order linear models convenient for designing propulsion system controls. For complex, high-speed vehicles such as the High Speed Civil Transport (HSCT), simulations based on CFD methods are required to capture the relevant flow physics. The use of CFD should also help reduce the development time and costs associated with experimentally tuning the control system. The initial application for this research is the High Speed Civil Transport inlet control problem. A major aspect of this research is the development of a controls/CFD interface for non-CFD experts, to facilitate the interactive operation of CFD simulations and the extraction of reduced-order, time-accurate models from CFD results. A distributed computing approach for implementing the interface is being explored. Software being developed as part of the Integrated CFD and Experiments (ICE) project provides the basis for the operating environment, including run-time displays and information (data base) management. Message-passing software is used to communicate between the ICE system and the CFD simulation, which can reside on distributed, parallel computing systems. Initially, the one-dimensional Large-Perturbation Inlet (LAPIN) code is being used to simulate a High Speed Civil Transport type inlet. LAPIN can model real supersonic inlet features, including bleeds, bypasses, and variable geometry, such as translating or variable-ramp-angle centerbodies. Work is in progress to use parallel versions of the multidimensional NPARC code.
Processing, Cataloguing and Distribution of UAS Images in Near Real Time
NASA Astrophysics Data System (ADS)
Runkel, I.
2013-08-01
Why are UAS generating such hype? UAS make data capture flexible, fast and easy. For many applications this is more important than a perfect photogrammetric aerial image block. To ensure that the advantage of fast data capture holds up to the end of the processing chain, all intermediate steps such as data processing and data dissemination to the customer need to be flexible and fast as well. GEOSYSTEMS has established the whole processing workflow as a server/client solution. This is the focus of the presentation. Depending on the image acquisition system, the image data can be downlinked during the flight to the data processing computer, or it is stored on a mobile device and hooked up to the data processing computer after the flight campaign. The image project manager reads the data from the device and georeferences the images according to the position data. The metadata is converted into an ISO-conformant format and subsequently all georeferenced images are catalogued in the raster data management system ERDAS APOLLO. APOLLO provides the data, respectively the images, as OGC-conformant services to the customer. Within seconds the UAV images are ready to use for GIS applications, image processing or direct interpretation via web applications - wherever you want. The whole processing chain is built in a generic manner. It can be adapted to a multitude of applications. The UAV imagery can be processed and catalogued as single orthoimages or as an image mosaic. Furthermore, image data from various cameras can be fused. By using WPS (web processing services), image enhancement and image analysis workflows such as change detection layers can be calculated and provided to the image analysts. The WPS processing runs directly on the raster data management server. The image analyst has no data and no software on his local computer. This workflow has proven to be fast, stable and accurate. It is designed to support time-critical applications for security demands - the images can be checked and interpreted in near real time. For sensitive areas it gives you the possibility to inform remote decision makers or interpretation experts in order to provide them situational awareness, wherever they are. For monitoring and inspection tasks it speeds up the process of data capture and data interpretation. The fully automated workflow of data pre-processing, data georeferencing, data cataloguing and data dissemination in near real time was developed based on the Intergraph products ERDAS IMAGINE, ERDAS APOLLO and GEOSYSTEMS METAmorph!IT. It is offered as an adaptable solution by GEOSYSTEMS GmbH.
NASA Astrophysics Data System (ADS)
Phipps, Marja; Capel, David; Srinivasan, James
2014-06-01
Motion imagery capabilities within the Department of Defense/Intelligence Community (DoD/IC) have advanced significantly over the last decade, attempting to meet continuously growing data collection, video processing and analytical demands in operationally challenging environments. The motion imagery tradecraft has evolved accordingly, enabling teams of analysts to effectively exploit data and generate intelligence reports across multiple phases in structured Full Motion Video (FMV) Processing Exploitation and Dissemination (PED) cells. Yet now the operational requirements are drastically changing. The exponential growth in motion imagery data continues, but to this the community adds multi-INT data, interoperability with existing and emerging systems, expanded data access, nontraditional users, collaboration, automation, and support for ad hoc configurations beyond the current FMV PED cells. To break from the legacy system lifecycle, we look toward a technology application and commercial adoption model that will meet these future Intelligence, Surveillance and Reconnaissance (ISR) challenges. In this paper, we explore the application of cutting-edge computer vision technology to meet existing FMV PED shortfalls and address future capability gaps. For example, real-time georegistration services developed from computer-vision-based feature tracking, multiple-view geometry, and statistical methods allow the fusion of motion imagery with other georeferenced information sources - providing unparalleled situational awareness. We then describe how these motion imagery capabilities may be readily deployed in a dynamically integrated analytical environment, employing an extensible framework, leveraging scalable enterprise-wide infrastructure and following commercial best practices.
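As an illustration of feature-tracking-based georegistration, the sketch below fits a robust homography between a video frame and a georeferenced base image using standard OpenCV calls; the file names are placeholders and this is a generic example, not the system described in the paper.

```python
# Illustrative sketch of feature-based georegistration: fit a robust (RANSAC)
# homography that maps a motion-imagery frame into a georeferenced basemap.
# File names are placeholders.
import cv2
import numpy as np

frame = cv2.imread("fmv_frame.png", cv2.IMREAD_GRAYSCALE)           # video frame
basemap = cv2.imread("reference_ortho.png", cv2.IMREAD_GRAYSCALE)   # georeferenced base

orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(frame, None)
kp2, des2 = orb.detectAndCompute(basemap, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# H maps frame pixels into basemap coordinates, the geometric core of
# registering motion imagery to a reference image.
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=3.0)
print("inlier ratio:", inliers.mean())
```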
Description and Documentation of the Dental School Dental Delivery System.
ERIC Educational Resources Information Center
Chase, Rosen and Wallace, Inc., Alexandria, VA.
A study was undertaken to describe and document the dental school dental delivery system using an integrated systems approach. In late 1976 and early 1977, a team of systems analysts and dental consultants visited three dental schools to observe the delivery of dental services and patient flow and to interview administrative staff and faculty.…
Generalized Data Management Systems--Some Perspectives.
ERIC Educational Resources Information Center
Minker, Jack
A Generalized Data Management System (GDMS) is a software environment provided as a tool for analysts, administrators, and programmers who are responsible for the maintenance, query and analysis of a data base to permit the manipulation of newly defined files and data with the existing programs and system. Because the GDMS technology is believed…
Library Effectiveness: A Systems Approach.
ERIC Educational Resources Information Center
Morse, Philip M.
Addressed to both librarians and systems analysts, this book attempts to apply the analytic methods of operations research and systems analysis to the operating problems of the library. The first part of the book discusses theoretical models with emphasis on the pattern of book use, on its change with time and on the problem of estimating and…
DOT National Transportation Integrated Search
1981-12-01
The purpose of this report is to present a general procedure for using the SOS software to analyze AGT systems. Data to aid the analyst in specifying input information, required as input to the software, are summarized in the appendices. The data are...
21 CFR 1304.23 - Records for chemical analysts.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 9 2011-04-01 2011-04-01 false Records for chemical analysts. 1304.23 Section... REGISTRANTS Continuing Records § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...
21 CFR 1304.23 - Records for chemical analysts.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Records for chemical analysts. 1304.23 Section... REGISTRANTS Continuing Records § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...
21 CFR 1304.23 - Records for chemical analysts.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 9 2012-04-01 2012-04-01 false Records for chemical analysts. 1304.23 Section... REGISTRANTS Continuing Records § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...
21 CFR 1304.23 - Records for chemical analysts.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 9 2014-04-01 2014-04-01 false Records for chemical analysts. 1304.23 Section... REGISTRANTS Continuing Records § 1304.23 Records for chemical analysts. (a) Each person registered or authorized (by § 1301.22(b) of this chapter) to conduct chemical analysis with controlled substances shall...
78 FR 77769 - Data Collection Available for Public Comments
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-24
... comments to Amy Garcia, Program Analyst, Office of Government Contracting, Small Business Administration, 409 3rd Street, 7th Floor, Washington, DC 20416. FOR FURTHER INFORMATION CONTACT: Amy Garcia, Program Analyst, 202-205-6842, amy.garcia@sba.gov , or Curtis B. Rich, Management Analyst, 202-205-7030, curtis...
Osborne, Nikola K P; Taylor, Michael C; Healey, Matthew; Zajac, Rachel
2016-03-01
It is becoming increasingly apparent that contextual information can exert a considerable influence on decisions about forensic evidence. Here, we explored accuracy and contextual influence in bloodstain pattern classification, and how these variables might relate to analyst characteristics. Thirty-nine bloodstain pattern analysts with varying degrees of experience each completed measures of compliance, decision-making style, and need for closure. Analysts then examined a bloodstain pattern without any additional contextual information, and allocated votes to listed pattern types according to favoured and less favoured classifications. Next, if they believed it would assist with their classification, analysts could request items of contextual information - from commonly encountered sources of information in bloodstain pattern analysis - and update their vote allocation. We calculated a shift score for each item of contextual information based on vote reallocation. Almost all forms of contextual information influenced decision-making, with medical findings leading to the highest shift scores. Although there was a small positive association between shift scores and the degree to which analysts displayed an intuitive decision-making style, shift scores did not vary meaningfully as a function of experience or the other characteristics measured. Almost all of the erroneous classifications were made by novice analysts. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
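One plausible way to compute a shift score from vote reallocation is sketched below; the scoring rule and the vote counts are assumptions for illustration, since the study's exact bookkeeping is not reproduced here.

```python
# Sketch of one plausible "shift score" from vote reallocation (the exact
# scoring used in the study is not reproduced here; this is an assumption):
# total vote movement between pre-context and post-context allocations.
def shift_score(votes_before, votes_after):
    """Sum of absolute per-pattern vote changes, halved so that moving one
    vote from one pattern type to another counts as a shift of 1."""
    patterns = set(votes_before) | set(votes_after)
    moved = sum(abs(votes_after.get(p, 0) - votes_before.get(p, 0))
                for p in patterns)
    return moved / 2

before = {"impact": 6, "cast-off": 3, "expiration": 1}
after  = {"impact": 3, "cast-off": 3, "expiration": 4}   # after seeing medical findings
print(shift_score(before, after))   # -> 3.0
```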
Transient dynamics capability at Sandia National Laboratories
NASA Technical Reports Server (NTRS)
Attaway, Steven W.; Biffle, Johnny H.; Sjaardema, G. D.; Heinstein, M. W.; Schoof, L. A.
1993-01-01
A brief overview of the transient dynamics capabilities at Sandia National Laboratories, with an emphasis on recent new developments and current research is presented. In addition, the Sandia National Laboratories (SNL) Engineering Analysis Code Access System (SEACAS), which is a collection of structural and thermal codes and utilities used by analysts at SNL, is described. The SEACAS system includes pre- and post-processing codes, analysis codes, database translation codes, support libraries, Unix shell scripts for execution, and an installation system. SEACAS is used at SNL on a daily basis as a production, research, and development system for the engineering analysts and code developers. Over the past year, approximately 190 days of CPU time were used by SEACAS codes on jobs running from a few seconds up to two and one-half days of CPU time. SEACAS is running on several different systems at SNL including Cray Unicos, Hewlett Packard PH-UX, Digital Equipment Ultrix, and Sun SunOS. An overview of SEACAS, including a short description of the codes in the system, are presented. Abstracts and references for the codes are listed at the end of the report.
NASA Astrophysics Data System (ADS)
Kibler, J. M.; Ruminski, M. G.; Simko, J. J.; McNamara, D. P.; Kasheta, T.
2004-12-01
The Hazard Mapping System (HMS) is a multiplatform remote sensing approach to detecting fires and smoke over the US, Canada, Mexico and Central America. This system is an integral part of the Satellite Services Division's near real-time hazards detection and mitigation efforts. The system utilizes NOAA's Geostationary Operational Environmental Satellites (GOES), Polar Operational Environmental Satellites (POES), the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument on NASA's Terra and Aqua spacecraft and the Defense Meteorological Satellite Program Operational Linescan System (OLS) sensor (F14 and F15). Automated detection algorithms are employed for each of the satellites (except DMSP OLS) for the fire detects while smoke is annotated by a satellite analyst. Fire detects can also be added by the satellite analyst. Major customers for this product include the National Weather Service, Storm Prediction Center, US Forest Service, Environmental Protection Agency, research science teams, as well as numerous federal, state and local land and air quality managers. In 2004 the HMS was upgraded by adding the Canadian, Mexican, and Central American sectors for hotspot and smoke detection. These sectors can be easily turned off or on by changing flags in the system configuration file. This enables analysis in sectors only during their respective burning seasons. The Alaskan and Canadian sectors are typically turned off in the winter season and the Mexican sector is cut off after the March-May burning season. But sectors can also be easily added or restarted if, for instance, smoke from a region is affecting the United States. Various ancillary data sources are used in the HMS to aid the analysis. Stable Lights is a static product that identifies stable sources of light from the OLS sensor and is usually associated with cities and urban areas. It appears on the screen as a transparent overlay on the satellite imagery being displayed. This capability can assist the analyst by screening out heat sources where stable lights are present. Vegetation type and water overlays aid in the decision to add or delete a fire point. Water sources many times may cause false detects due to low sun angles during sunrise and sunset or due to temperature contrast between land and water at night. This overlay aids in quickly finding these false detects. Vegetation overlays assist the analyst by showing what type of land is present near the hotspot in question. For example, fires are more likely in forest or grassland than desert or barren lands. The SSD fire team is currently assessing the feasibility of a descriptive smoke text product and would like to incorporate additional datasets for the monitoring of fires, smoke, dust, and air pollution. The HMS is a dynamic product that changes with the needs of our analysts and customers.
Rich Language Analysis for Counterterrorism
NASA Astrophysics Data System (ADS)
Guidère, Mathieu; Howard, Newton; Argamon, Shlomo
Accurate and relevant intelligence is critical for effective counterterrorism. Too much irrelevant information is as bad or worse than not enough information. Modern computational tools promise to provide better search and summarization capabilities to help analysts filter and select relevant and key information. However, to do this task effectively, such tools must have access to levels of meaning beyond the literal. Terrorists operating in context-rich cultures like fundamentalist Islam use messages with multiple levels of interpretation, which are easily misunderstood by non-insiders. This chapter discusses several kinds of such “encryption” used by terrorists and insurgents in the Arabic language, and how knowledge of such methods can be used to enhance computational text analysis techniques for use in counterterrorism.
Some Issues of Electrical Systems Modeling in Course of PSA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lankin, Michael
2002-07-01
Electrical power supply systems are one of the essential parts of nuclear power plants. The distinctive feature of these systems, from the PSA analyst's point of view, is the significant number of bi-directional dependencies present within electrical systems. This paper describes an approach that has been used for electrical systems modeling in the course of the Kola 4 NPP Level 1 PSA. (authors)
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-06
..., Division of Transplantation, Healthcare Systems Bureau, Health Resources and Services Administration... INFORMATION CONTACT: Mesmin Germain, Public Health Analyst, Division of Transplantation, Healthcare Systems... is based on donor and recipient incomes of 300 percent or less of the HHS Poverty Guidelines...
2008-06-10
Presentation, Directorate of Information Management, Competitive Sourcing In-Process Review, July 13, 2006. Opening Statement, The Honorable John O...computer analysts, and clerks under a Department of the Treasury contract...secretary John W. Snow. The company is headed by two...former high-ranking executives of KBR, formerly known as Kellogg Brown & Root. Al Neffgen, IAP's chief executive, was chief operating officer for a
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vang, Leng; Prescott, Steven R; Smith, Curtis
In a collaborative scientific research arena it is important to have an environment where analysts have shared access to information documents and software tools and are able to accurately maintain and track historical changes in models. A new cloud-based environment would be accessible remotely from anywhere regardless of computing platform, provided that the platform has Internet access and proper browser capabilities. Information stored in this environment would be restricted based on user-assigned credentials. This report reviews the development of Cloud-based Architecture Capabilities (CAC) as a web portal for PRA tools.
2014-06-01
intelligence analysis processes. However, as has been noted in previous work (e.g., [42]), there are a number of important differences between the nature of the...problem encountered in the context of the ELICIT task and the problems dealt with by intelligence analysts. Perhaps most importantly, the fact that a...see Section 7)...departure from the reality of most intelligence analysis situations: in most real-world intelligence analysis problems agents have
NASA Technical Reports Server (NTRS)
1987-01-01
The proceedings of the conference are presented. The objective was to provide a forum for the discussion of the structure and status of existing computer programs which are used to simulate the dynamics of a variety of tether applications in space. A major topic was different simulation models and the process of validating them. Guidance on future work in these areas was obtained from a panel discussion; the panel was composed of resource and technical managers and dynamic analysts in the tether field. The conclusions of this panel are also presented.
NASA Technical Reports Server (NTRS)
Isaacson, D.; Marchesin, D.; Paes-Leme, P. J.
1980-01-01
This paper is an expanded version of a talk given at the 1979 T.I.C.O.M. conference. It is a self-contained introduction, for applied mathematicians and numerical analysts, to quantum mechanics and quantum field theory. It also contains a brief description of the authors' numerical approach to the problems of quantum field theory, which may best be summarized by the question: Can we compute the eigenvalues and eigenfunctions of Schrödinger operators in infinitely many variables?
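In finitely many variables the question is routine; for instance, the sketch below computes the lowest eigenvalues of a one-dimensional Schrödinger operator (harmonic oscillator) by finite differences, purely as a small worked example.

```python
# Small worked example in the spirit of the question above: lowest eigenvalues
# of a one-dimensional Schrodinger operator (harmonic oscillator, with
# hbar = m = omega = 1) computed by finite differences.
import numpy as np

n, L = 1000, 20.0
x = np.linspace(-L / 2, L / 2, n)
h = x[1] - x[0]

# -1/2 d^2/dx^2 + x^2/2 on a uniform grid (Dirichlet boundaries).
main = 1.0 / h**2 + 0.5 * x**2
off  = -0.5 / h**2 * np.ones(n - 1)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

eigenvalues = np.linalg.eigvalsh(H)[:4]
print(eigenvalues)           # close to the exact values 0.5, 1.5, 2.5, 3.5
```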
1976-08-01
[Fragment of a variable-definition table from the original report: entries such as IT (target type), PK, PKK, POWER, and PPEAF(7) (the square of the range at which ...). The surrounding text notes that damaged targets are removed from the column, that a standard linear interpolation technique is used, and that some targets are considered to be shielded by terrain features.]
An Application of the H-Function to Curve-Fitting and Density Estimation.
1983-12-01
equations into a model that is linear in its coefficients. Nonlinear least squares estimation is a relatively new area developed to accommodate models which...to converge on a solution (10:9-10). For the simple linear model and when general assumptions are made, the Gauss-Markov theorem states that the...distribution. For example, if the analyst wants to model the time between arrivals to a queue for a computer simulation, he infers the true probability
This study examined inter-analyst classification variability based on training site signature selection only for six classifications from a 10 km2 Landsat ETM+ image centered over a highly heterogeneous area in south-central Virginia. Six analysts classified the image...
Avila, Manuel; Graterol, Eduardo; Alezones, Jesús; Criollo, Beisy; Castillo, Dámaso; Kuri, Victoria; Oviedo, Norman; Moquete, Cesar; Romero, Marbella; Hanley, Zaida; Taylor, Margie
2012-06-01
The appearance of rice grain is a key aspect in quality determination. Mainly, this analysis is performed by expert analysts through visual observation; however, due to the subjective nature of the analysis, the results may vary among analysts. In order to evaluate the concordance between analysts from Latin-American rice quality laboratories for rice grain appearance through digital images, an inter-laboratory test was performed with ten analysts and images of 90 grains captured with a high resolution scanner. Rice grains were classified into four categories: translucent, chalky, white belly, and damaged grain. Data were categorized using statistical parameters such as the mode and its frequency, the relative concordance, and the reproducibility parameter kappa. Additionally, a referential image gallery of typical grains for each category was constructed based on mode frequency. Results showed a kappa value of 0.49, corresponding to moderate reproducibility, attributable to subjectivity in the visual analysis of grain images. These results reveal the need to standardize the evaluation criteria among analysts to improve the confidence in the determination of rice grain appearance.
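For multiple raters and nominal categories, reproducibility of this kind is commonly summarized with Fleiss' kappa; the sketch below computes it on invented counts (rows are grains, columns are the four categories) and is not the study's data.

```python
# Sketch of the multi-rater Fleiss' kappa statistic on made-up counts:
# rows are grains, columns are the four categories, entries are how many
# of the ten analysts chose that category.
import numpy as np

# columns: translucent, chalky, white belly, damaged
counts = np.array([
    [8, 1, 1, 0],
    [2, 6, 2, 0],
    [0, 1, 9, 0],
    [5, 3, 1, 1],
    [0, 0, 1, 9],
])

def fleiss_kappa(n_ij):
    N, _ = n_ij.shape
    n = n_ij.sum(axis=1)[0]                      # raters per subject
    p_j = n_ij.sum(axis=0) / (N * n)             # category proportions
    P_i = (np.sum(n_ij**2, axis=1) - n) / (n * (n - 1))
    P_bar, P_e = P_i.mean(), np.sum(p_j**2)
    return (P_bar - P_e) / (1 - P_e)

print(round(fleiss_kappa(counts), 2))
```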
NASA Technical Reports Server (NTRS)
Hoffer, R. M. (Principal Investigator); Knowlton, D. J.; Dean, M. E.
1981-01-01
A set of training statistics for the 30 meter resolution simulated thematic mapper MSS data was generated based on land use/land cover classes. In addition to this supervised data set, a nonsupervised multicluster block of training statistics is being defined in order to compare the classification results and evaluate the effect of the different training selection methods on classification performance. Two test data sets, defined using a stratified sampling procedure incorporating a grid system with dimensions of 50 lines by 50 columns, and another set based on an analyst supervised set of test fields were used to evaluate the classifications of the TMS data. The supervised training data set generated training statistics, and a per point Gaussian maximum likelihood classification of the 1979 TMS data was obtained. The August 1980 MSS data was radiometrically adjusted. The SAR data was redigitized and the SAR imagery was qualitatively analyzed.
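A per-point Gaussian maximum likelihood classifier reduces to choosing the class with the largest Gaussian log-likelihood for each pixel vector; the sketch below illustrates this with synthetic class statistics, not the actual training statistics from the study.

```python
# Minimal sketch of a per-point (per-pixel) Gaussian maximum likelihood
# classifier; the class means, covariances, and pixel are synthetic stand-ins.
import numpy as np

def discriminant(x, mean, cov):
    """Gaussian log-likelihood of pixel vector x (constant term dropped)."""
    diff = x - mean
    return -0.5 * (np.log(np.linalg.det(cov)) +
                   diff @ np.linalg.inv(cov) @ diff)

classes = {
    "corn":   (np.array([62.0, 48.0, 110.0]), np.diag([30.0, 25.0, 60.0])),
    "forest": (np.array([40.0, 30.0,  95.0]), np.diag([20.0, 18.0, 40.0])),
    "water":  (np.array([25.0, 15.0,  12.0]), np.diag([10.0,  8.0,  6.0])),
}

pixel = np.array([58.0, 45.0, 104.0])
best = max(classes, key=lambda c: discriminant(pixel, *classes[c]))
print(best)    # -> "corn" for this synthetic pixel
```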
User's Manual for Space Debris Surfaces (SD_SURF)
NASA Technical Reports Server (NTRS)
Elfer, N. C.
1996-01-01
A unique collection of computer codes, Space Debris Surfaces (SD_SURF), has been developed to assist in the design and analysis of space debris protection systems. SD_SURF calculates and summarizes a vehicle's vulnerability to space debris as a function of impact velocity and obliquity. An SD_SURF analysis will show which velocities and obliquities are the most probable to cause a penetration. This determination can help the analyst select a shield design which is best suited to the predominant penetration mechanism. The analysis also indicates the most suitable parameters for development or verification testing. The SD_SURF tools are offered both as FORTRAN programs and as Microsoft EXCEL spreadsheets and macros. The FORTRAN programs work with BUMPERII version 1.2a or 1.3 (Cosmic released). The EXCEL spreadsheets and macros can be used independently or with selected output from the SD_SURF FORTRAN programs.
Scientific visualization of volumetric radar cross section data
NASA Astrophysics Data System (ADS)
Wojszynski, Thomas G.
1992-12-01
For aircraft design and mission planning, designers, threat analysts, mission planners, and pilots require a Radar Cross Section (RCS) central tendency with its associated distribution about a specified aspect and its relation to a known threat. Historically, RCS data sets have been statically analyzed to evaluate a d profile. However, Scientific Visualization, the application of computer graphics techniques to produce pictures of complex physical phenomena, appears to be a more promising tool to interpret this data. This work describes data reduction techniques and a surface rendering algorithm to construct and display a complex polyhedron from adjacent contours of RCS data. Data reduction is accomplished by sectorizing the data and characterizing the statistical properties of the data. Color, lighting, and orientation cues are added to complete the visualization system. The tool may be useful for synthesis, design, and analysis of complex, low observable air vehicles.
Görg, Carsten; Liu, Zhicheng; Kihm, Jaeyeon; Choo, Jaegul; Park, Haesun; Stasko, John
2013-10-01
Investigators across many disciplines and organizations must sift through large collections of text documents to understand and piece together information. Whether they are fighting crime, curing diseases, deciding what car to buy, or researching a new field, inevitably investigators will encounter text documents. Taking a visual analytics approach, we integrate multiple text analysis algorithms with a suite of interactive visualizations to provide a flexible and powerful environment that allows analysts to explore collections of documents while sensemaking. Our particular focus is on the process of integrating automated analyses with interactive visualizations in a smooth and fluid manner. We illustrate this integration through two example scenarios: an academic researcher examining InfoVis and VAST conference papers and a consumer exploring car reviews while pondering a purchase decision. Finally, we provide lessons learned toward the design and implementation of visual analytics systems for document exploration and understanding.
A content analysis of analyst research: health care through the eyes of analysts.
Nielsen, Christian
2008-01-01
This article contributes to the understanding of how health care companies may communicate their business models by studying financial analysts' reports. The study examines the differences between the information conveyed in recurrent and fundamental analyst reports, as well as whether the characteristics of the analysts and their environment affect their business model analyses. A medium-sized health care company in the medical-technology sector, internationally renowned for its state-of-the-art business reporting, was chosen as the basis for the study. An analysis of 111 fundamental and recurrent analyst reports on this company, by each investment bank actively following it, was conducted using a content analysis methodology. The study reveals that the recurrent analyses are concerned with evaluating the information disclosed by the health care company itself and not so much with digging up new information. It also indicates that while maintenance work might be focused on evaluating specific details, fundamental research is more concerned with extending the understanding of the general picture, i.e., the sustainability and performance of the overall business model. The amount of financial information disclosed in either type of report is not correlated with the other disclosures in the reports. In comparison to business reporting practices, the fundamental analyst reports put considerably less weight on social and sustainability information, intellectual capital, and corporate governance information, and they disclose much less comparable non-financial information. The suggestion made is that looking at the types of information financial analysts consider important and convey to their "customers," the investors and fund managers, constitutes a valuable indication to health care companies regarding the needs of the financial market users of their reports and other communications. There are some limitations on applying statistical tests to the data set, as well as methodological limitations relating to the exclusion of tables and graphs.
Human/autonomy collaboration for the automated generation of intelligence products
NASA Astrophysics Data System (ADS)
DiBona, Phil; Schlachter, Jason; Kuter, Ugur; Goldman, Robert
2017-05-01
Intelligence Analysis remains a manual process despite trends toward autonomy in information processing. Analysts need agile decision-support tools that can adapt to the evolving information needs of the mission, allowing the analyst to pose novel analytic questions. Our research enables analysts to provide only a constrained-English specification of what the intelligence product should be. Using HTN planning, the autonomy discovers, decides, and generates a workflow of algorithms to create the intelligence product. The analyst can therefore quickly and naturally communicate to the autonomy what information product is needed, rather than how to create it.
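To make the HTN-planning idea above concrete, the following is a minimal sketch of how a hierarchical task network might decompose a high-level product request into a workflow of analytic algorithms. The task names, method table, and top-level request mapping are hypothetical illustrations, not the authors' system.

```python
# Minimal HTN-style decomposition sketch (hypothetical task/method names,
# not the authors' system). A compound task is expanded via its method
# until only primitive algorithm steps remain, yielding a workflow.

METHODS = {
    # compound task -> ordered subtasks
    "produce_activity_report": ["collect_reports", "extract_entities",
                                "link_entities", "summarize"],
    "collect_reports": ["query_message_store", "filter_by_region"],
}

PRIMITIVES = {"query_message_store", "filter_by_region",
              "extract_entities", "link_entities", "summarize"}


def plan(task):
    """Recursively expand a task into a flat workflow of primitive steps."""
    if task in PRIMITIVES:
        return [task]
    if task not in METHODS:
        raise ValueError(f"no method for compound task: {task}")
    workflow = []
    for subtask in METHODS[task]:
        workflow.extend(plan(subtask))
    return workflow


if __name__ == "__main__":
    # The analyst's constrained-English request is assumed to have been
    # mapped to the top-level task name already.
    print(plan("produce_activity_report"))
```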
Use of the Homeland-Defense Operational Planning System (HOPS) for Emergency Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durling, Jr., R L; Price, D E
2005-12-16
The Homeland-Defense Operational Planning System (HOPS) is a new operational planning tool leveraging Lawrence Livermore National Laboratory's expertise in weapons systems and in sparse information analysis to support the defense of the U.S. homeland. HOPS provides planners with a basis to make decisions to protect against acts of terrorism, focusing on the defense of facilities critical to U.S. infrastructure. Criticality of facilities, structures, and systems is evaluated on a composite matrix of specific projected casualty, economic, and sociopolitical impact bins. Based on these criteria, significant previously unidentified vulnerabilities are identified and secured. To provide insight into potential successes by malevolent actors, HOPS analysts strive to base their efforts mainly on unclassified open-source data. However, more cooperation is needed between HOPS analysts and facility representatives to provide an advantage to those whose task is to defend these facilities. Evaluated facilities include: refineries, major ports, nuclear power plants and other nuclear licensees, dams, government installations, convention centers, sports stadiums, tourist venues, and public and freight transportation systems. A generalized summary of analyses of U.S. infrastructure facilities will be presented.
Risk Assessment Using The Homeland-Defense Operational Planning System (HOPS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, D E; Durling, R L
2005-10-10
The Homeland-Defense Operational Planning System (HOPS) is a new operational planning tool leveraging Lawrence Livermore National Laboratory's expertise in weapons systems and in sparse information analysis to support the defense of the U.S. homeland. HOPS provides planners with a basis to make decisions to protect against acts of terrorism, focusing on the defense of facilities critical to U.S. infrastructure. Criticality of facilities, structures, and systems is evaluated on a composite matrix of specific projected casualty, economic, and sociopolitical impact bins. Based on these criteria, significant previously unidentified vulnerabilities are identified and secured. To provide insight into potential successes by malevolent actors, HOPS analysts strive to base their efforts mainly on unclassified open-source data. However, more cooperation is needed between HOPS analysts and facility representatives to provide an advantage to those whose task is to defend these facilities. Evaluated facilities include: refineries, major ports, nuclear power plants and other nuclear licensees, dams, government installations, convention centers, sports stadiums, tourist venues, and public and freight transportation systems. A generalized summary of analyses of U.S. infrastructure facilities will be presented.
Memory Forensics: Review of Acquisition and Analysis Techniques
2013-11-01
Management Overview: Processes running on modern multitasking operating systems operate on an abstraction of RAM, called virtual memory [7]. In these systems...information such as user names, email addresses and passwords [7]. Analysts also use tools such as WinHex to identify headers or other suspicious data within
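The excerpt above describes carving artifacts such as email addresses out of raw memory. The sketch below shows one simple way this kind of pattern scan might be done over a dump file; the file name is a placeholder, and the approach stands in for, rather than reproduces, tools such as WinHex or Volatility.

```python
# Sketch of the kind of artifact carving the excerpt describes: scanning a
# raw memory image for email-address-like byte patterns. The file name is
# a placeholder; real analyses use dedicated memory-forensics tools.
import re

EMAIL = re.compile(rb"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def carve_emails(path, chunk=1 << 20, overlap=256):
    """Yield unique email-like strings found in a raw memory dump."""
    seen = set()
    tail = b""
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            data = tail + block
            for match in EMAIL.finditer(data):
                s = match.group().decode("ascii", "replace")
                if s not in seen:
                    seen.add(s)
                    yield s
            tail = data[-overlap:]  # keep an overlap so patterns spanning chunks survive

if __name__ == "__main__":
    for addr in carve_emails("memdump.raw"):  # placeholder path
        print(addr)
```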
Cost Benefit Analysis of LH2 Pad B
NASA Technical Reports Server (NTRS)
Mott, Brittany
2013-01-01
This analysis is used to evaluate, from a cost and benefit perspective, potential outcomes when replacing the pressurization switches and the pressurization system to meet the needs of the LH2 storage system at Pad B. This also includes alternatives, tangible and intangible benefits, and the results of the analysis.
Systems Analysis | Hydrogen and Fuel Cells | NREL
Analysts also develop least-cost scenarios for hydrogen infrastructure rollout in support of the opportunities for multi-sector integration using hydrogen systems, as well as the capability and cost associated with the H2USA public-private collaboration.
KOLAM: a cross-platform architecture for scalable visualization and tracking in wide-area imagery
NASA Astrophysics Data System (ADS)
Fraser, Joshua; Haridas, Anoop; Seetharaman, Guna; Rao, Raghuveer M.; Palaniappan, Kannappan
2013-05-01
KOLAM is an open, cross-platform, interoperable, scalable and extensible framework supporting a novel multi-scale spatiotemporal dual-cache data structure for big data visualization and visual analytics. This paper focuses on the use of KOLAM for target tracking in high-resolution, high-throughput, wide-format video, also known as wide-area motion imagery (WAMI). It was originally developed for the interactive visualization of extremely large geospatial imagery of high spatial and spectral resolution. KOLAM is platform, operating system and (graphics) hardware independent, and supports embedded datasets scalable from hundreds of gigabytes to feasibly petabytes in size on clusters, workstations, desktops and mobile computers. In addition to rapid roam, zoom and hyper-jump spatial operations, a large number of simultaneously viewable embedded pyramid layers (also referred to as multiscale or sparse imagery), interactive colormap and histogram enhancement, spherical projection and terrain maps are supported. The KOLAM software architecture was extended to support airborne wide-area motion imagery by organizing spatiotemporal tiles in very large format video frames using a temporal cache of tiled pyramid cached data structures. The current version supports WAMI animation, fast intelligent inspection, trajectory visualization and target tracking (digital tagging); the latter by interfacing with external automatic tracking software. One of the critical needs for working with WAMI is a supervised tracking and visualization tool that allows analysts to digitally tag multiple targets, quickly review and correct tracking results, and apply geospatial visual analytic tools to the generated trajectories. One-click manual tracking combined with multiple automated tracking algorithms is available to assist the analyst and increase human effectiveness.
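As a rough illustration of the spatiotemporal caching idea described in the abstract, the sketch below implements a least-recently-used tile cache keyed by (frame, pyramid level, row, column). It is a deliberate simplification, not KOLAM's actual dual-cache data structure.

```python
# Simplified sketch of a spatiotemporal tile cache in the spirit of the
# dual-cache idea described above (not KOLAM's implementation): tiles are
# keyed by (frame, pyramid level, row, col) and evicted LRU, so spatial
# roam within a frame and temporal playback across frames both hit the cache.
from collections import OrderedDict

class TileCache:
    def __init__(self, capacity=1024, loader=None):
        self.capacity = capacity
        self.loader = loader          # callable: key -> tile data
        self._cache = OrderedDict()   # key -> tile, ordered by recency

    def get(self, frame, level, row, col):
        key = (frame, level, row, col)
        if key in self._cache:
            self._cache.move_to_end(key)      # mark as recently used
            return self._cache[key]
        tile = self.loader(key)               # decode from disk/network on a miss
        self._cache[key] = tile
        if len(self._cache) > self.capacity:
            self._cache.popitem(last=False)   # evict the least recently used tile
        return tile

# Usage with a dummy loader that just labels the tile it was asked for.
cache = TileCache(capacity=4, loader=lambda key: f"tile{key}")
print(cache.get(0, 2, 10, 7))
```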
Gas chromatography - mass spectrometry data processing made easy.
Johnsen, Lea G; Skou, Peter B; Khakimov, Bekzod; Bro, Rasmus
2017-06-23
Evaluation of GC-MS data may be challenging due to the high complexity of the data, including overlapped, embedded, retention-time-shifted and low S/N ratio peaks. In this work, we demonstrate a new approach, PARAFAC2 based Deconvolution and Identification System (PARADISe), for processing raw GC-MS data. PARADISe is a computer-platform-independent, freely available software package incorporating a number of newly developed algorithms in a coherent framework. It offers a solution for analysts dealing with complex chromatographic data. It allows extraction of chemical/metabolite information directly from the raw data. Using PARADISe requires only a few inputs from the analyst to process GC-MS data and subsequently converts raw netCDF data files into a compiled peak table. Furthermore, the method is generally robust towards minor variations in the input parameters. The method automatically performs peak identification based on deconvoluted mass spectra using the integrated NIST search engine and generates an identification report. In this paper, we compare PARADISe with AMDIS and ChromaTOF in terms of peak quantification and show that PARADISe is more robust to user-defined settings and that these are easier (and much fewer) to set. PARADISe is based on non-proprietary, scientifically evaluated approaches, and we show here that PARADISe can handle more overlapping signals and lower signal-to-noise peaks, and does so in a manner that requires only about an hour's worth of work regardless of the number of samples. We also show that there are no non-detects in PARADISe, meaning that all compounds are detected in all samples.
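For readers unfamiliar with curve-resolution-style deconvolution, the toy example below separates two co-eluting synthetic peaks into elution profiles and spectra with a plain bilinear alternating-least-squares fit. This is a deliberately simplified stand-in for illustration only; PARADISe itself uses the PARAFAC2 model, which additionally handles retention-time shifts across samples.

```python
# Toy illustration of deconvolving co-eluting GC-MS peaks in one retention
# time window. This is a plain bilinear alternating-least-squares sketch
# (X ~ C @ S.T, with non-negativity crudely enforced by clipping), NOT the
# PARAFAC2 model used by PARADISe, and the data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)                         # scan index within the window
C_true = np.stack([np.exp(-((t - m) / 0.05) ** 2)  # two overlapping elution profiles
                   for m in (0.45, 0.55)], axis=1)
S_true = rng.random((80, 2))                       # two mass spectra (80 m/z channels)
X = C_true @ S_true.T + 0.01 * rng.random((200, 80))

k = 2
C = rng.random((200, k))
for _ in range(200):                               # alternating least squares
    S = np.clip(np.linalg.lstsq(C, X, rcond=None)[0].T, 0, None)
    C = np.clip(np.linalg.lstsq(S, X.T, rcond=None)[0].T, 0, None)

resid = np.linalg.norm(X - C @ S.T) / np.linalg.norm(X)
print(f"relative residual: {resid:.3f}")
```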
ERIC Educational Resources Information Center
Cantor, Jeffrey A.; Hobson, Edward N.
The development of a test design methodology used to construct a criterion-referenced System Achievement Test (CR-SAT) for selected Naval enlisted classification (NEC) in the Strategic Weapon System (SWS) of the United States Navy is described. Subject matter experts, training data analysts and educational specialists developed a comprehensive…
Psychoanalysis in Crisis: The Danger of Ideology.
Richards, Arnold
2015-06-01
Psychoanalysis is in crisis. Its prestige with the public has plummeted, along with its economic viability and even its numbers. There are fewer analytic candidates and fewer patients, less insurance coverage, less presence in departments of psychiatry, and less prestige among the traditional academic disciplines. Analysts are getting older, and there are fewer and fewer young ones to replace us. A once-fascinated public now distrusts analysts as unscientific, deluded, authoritarian, reactionary, arrogant, sexist, and/or passé. This paper examines some causes of this decline within psychoanalysis itself as well as possibilities for reform. The status of psychoanalysis as a science is in question, although Freud considered it an empirical science and modified his theories to fit new facts. In reality, however, transmission of psychoanalytic knowledge through the training analyst system has led to its perpetuation as an ideology, rather than a science, and to the formation of oligarchies in the structure of psychoanalytic organizations and some institutes. Psychoanalysis is nothing if not an exploratory endeavor, and it thrives in an open environment. Psychoanalytic theory becomes ideology when exploration, testing, and challenge are suppressed. There are many analysts for whom psychoanalysis is neither ideology nor theology, but an intellectually stimulating and emotionally rewarding human and humane endeavor, in which convention is enlivened by creative challenge and innovation is disciplined by tradition. In that form, it is too valuable to lose. It is time for us to step back and reclaim our citizenship in the larger intellectual world of curiosity, creativity, and freedom.
ERIC Educational Resources Information Center
Mulvenon, Sean W.; Wang, Kening; Mckenzie, Sarah; Anderson, Travis
2006-01-01
Effective exploration of spatially referenced educational achievement data can help educational researchers and policy analysts gain valuable insight into datasets more quickly. This article illustrates a demo system developed in the National Office for Research on Measurement and Evaluation Systems (NORMES) for supporting Web-based interactive…
ERIC Educational Resources Information Center
Watson, William J.
Occupational analysts using Comprehensive Occupational Data Analysis Programs (CODAP) make subjective decisions at various stages in their analysis of an occupation. The possibility exists that two different analysts could reach different conclusions in analyzing an occupation, and thereby provide divergent guidance to management. Two analysts,…
Supporting the Growing Needs of the GIS Industry
NASA Technical Reports Server (NTRS)
2003-01-01
Visual Learning Systems, Inc. (VLS), of Missoula, Montana, has developed a commercial software application called Feature Analyst. Feature Analyst was conceived under a Small Business Innovation Research (SBIR) contract with NASA's Stennis Space Center, and through the Montana State University TechLink Center, an organization funded by NASA and the U.S. Department of Defense to link regional companies with Federal laboratories for joint research and technology transfer. The software provides a paradigm shift to automated feature extraction, as it utilizes spectral, spatial, temporal, and ancillary information to model the feature extraction process; presents the ability to remove clutter; incorporates advanced machine learning techniques to supply unparalleled levels of accuracy; and includes an exceedingly simple interface for feature extraction.
Geospatial intelligence workforce
NASA Astrophysics Data System (ADS)
Showstack, Randy
2013-02-01
A report on the future U.S. workforce for geospatial intelligence, requested by the U.S. National Geospatial-Intelligence Agency (NGA), found that the agency—which hires about 300 scientists and analysts annually—is probably finding sufficient experts to fill the needs in all of its core areas, with the possible exception of geographic information systems (GIS) and remote sensing. The report by the U.S. National Research Council, released on 25 January, noted that competition for GIS applications analysts is strong. While there appear to be enough cartographers, photogrammetrists, and geodesists to meet NGA's current needs in those core areas, the report cautioned that future shortages in these areas seem likely because of a relatively small number of graduates.
Understanding the health care business model: the financial analysts' point of view.
Bukh, Per Nikolaj; Nielsen, Christian
2010-01-01
This study focuses on how financial analysts understand the strategy of a health care company and which elements, from such a strategy perspective, they perceive as constituting the cornerstone of a health care company's business model. The empirical part of this study is based on semi-structured interviews with analysts following a large health care company listed on the Copenhagen Stock Exchange. The authors analyse how the financial analysts view strategy and value creation within the framework of a business model. Further, the authors analyze whether the characteristics emerging from a comprehensive literature review are reflected in the financial analysts' perceptions of which information is decision-relevant and important to communicate to the financial markets. Among the conclusions of the study is the importance of distinguishing between the health care company's business model and the model by which the payment of revenues is allocated between end users and reimbursing organizations.
The analyst's authenticity: "if you see something, say something".
Goldstein, George; Suzuki, Jessica Y
2015-05-01
The history of authenticity in psychoanalysis is as old as analysis itself, but the analyst's authenticity in particular has become an increasingly important area of focus in recent decades. This article traces the development of conceptions of analytic authenticity and proposes that the analyst's spontaneous verbalization of his or her unformulated experience in session can be a potent force in the course of an analysis. We acknowledge that although analytic authenticity can be a challenging ideal for the analyst to strive for, it contains the power to transform the experience of the patient and the analyst, as well as the meaning of their work together. Whether it comes in the form of an insight-oriented comment or a simple acknowledgment of things as they seem to be, a therapist's willingness to speak aloud something that has lost its language is a powerful clinical phenomenon that transcends theoretical orientation and modality.
Instruction in information structuring improves Bayesian judgment in intelligence analysts.
Mandel, David R
2015-01-01
An experiment was conducted to test the effectiveness of brief instruction in information structuring (i.e., representing and integrating information) for improving the coherence of probability judgments and binary choices among intelligence analysts. Forty-three analysts were presented with comparable sets of Bayesian judgment problems before and immediately after instruction. After instruction, analysts' probability judgments were more coherent (i.e., more additive and compliant with Bayes' theorem). Instruction also improved the coherence of binary choices regarding category membership: after instruction, subjects were more likely to invariably choose the category to which they assigned the higher probability of a target's membership. The research provides a rare example of evidence-based validation of the effectiveness of instruction intended to improve the statistical assessment skills of intelligence analysts. Such instruction could also be used to improve the assessment quality of other types of experts who are required to integrate statistical information or make probabilistic assessments.
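A small worked example of the kind of Bayesian update and additivity check the instruction targets is sketched below; the probabilities are illustrative and are not taken from the study.

```python
# Hypothetical example of the kind of Bayesian update the instruction
# targets: the numbers are illustrative, not from the study.
prior = 0.30            # P(hostile intent) before the new report
p_report_if_h = 0.80    # P(report observed | hostile)
p_report_if_not = 0.20  # P(report observed | not hostile)

evidence = p_report_if_h * prior + p_report_if_not * (1 - prior)
posterior = p_report_if_h * prior / evidence
print(f"posterior P(hostile | report) = {posterior:.3f}")   # ~0.632

# Coherence check: the two complementary posteriors must sum to 1
# (additivity), one of the properties the study measured.
posterior_not = p_report_if_not * (1 - prior) / evidence
assert abs(posterior + posterior_not - 1.0) < 1e-12
```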
The lure of the symptom in psychoanalytic treatment.
Ogden, Thomas H; Gabbard, Glen O
2010-06-01
Psychoanalysis, which at its core is a search for truth, stands in a subversive position vis-à-vis the contemporary therapeutic culture that places a premium on symptomatic "cure." Nevertheless, analysts are vulnerable to succumbing to the internal and external pressures for the achievement of symptomatic improvement. In this communication we trace the evolution of Freud's thinking about the relationship between the aims of psychoanalysis and the alleviation of symptoms. We note that analysts today may recapitulate Freud's early struggles in their pursuit of symptom removal. We present an account of a clinical consultation in which the analytic pair were ensnared in an impasse that involved the analyst's preoccupation with the intransigence of one of the patient's symptoms. We suggest alternative ways of working with these clinical issues and offer some thoughts on how our own work as analysts and consultants to colleagues has been influenced by our understanding of what frequently occurs when the analyst becomes symptom-focused.
Self-disclosure, trauma and the pressures on the analyst.
West, Marcus
2017-09-01
This paper argues that self-disclosure is intimately related to traumatic experience and the pressures on the analyst not to re-traumatize the patient or repeat traumatic dynamics. The paper gives a number of examples of such pressures and outlines the difficulties the analyst may experience in adopting an analytic attitude - attempting to stay as closely as possible with what the patient brings. It suggests that self-disclosure may be used to try to disconfirm the patient's negative sense of themselves or the analyst, or to try to induce a positive sense of self or of the analyst which, whilst well-meaning, may be missing the point and may be prolonging the patient's distress. Examples are given of staying with the co-construction of the traumatic early relational dynamics and thus working through the traumatic complex; this attitude is compared and contrasted with some relational psychoanalytic attitudes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meadows, D.L.; Weinberg, A.M.; Boyd, J.
Geologist James Boyd, physicist Alvin Weinberg, and systems analyst Dennis Meadows participated in a debate at which they forecast the cost and availability of world resources. Highlights of the debate and questions and comments from their audience are presented here. A range of optimism is evident in the predictions. Boyd foresees that energy and resource problems will be solved by technology, while Meadows contends that no solutions are possible until institutional and political constraints are lifted to allow resource development. Weinberg takes a middle view and proposes substitution of new resources for those, like fossil fuels, that are nearing depletion. The role of the market system is debated with disagreement over whether energy development should or can respond to a free market--and whether per capita energy consumption will increase or decline with limited economic growth. Policies governing access to fossil fuels and metals in the future are felt to be central to the issue. (DCK)
Automation for System Safety Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul
2009-01-01
This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
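The graph-analysis step described above can be illustrated with a small sketch that enumerates propagation paths from hazard sources to vulnerable functions in a toy architecture graph; the node names are hypothetical, and the real Orion-derived models are far larger.

```python
# Sketch of the graph-analysis step described above: given an architecture
# model as a directed graph, enumerate possible propagation paths from
# hazard sources to vulnerable functions. Node names are hypothetical.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("thruster_leak", "prop_pressure_sensor"),
    ("prop_pressure_sensor", "flight_software"),
    ("flight_software", "attitude_control"),
    ("power_bus_fault", "flight_software"),
])

hazard_sources = ["thruster_leak", "power_bus_fault"]
vulnerable = ["attitude_control"]

for src in hazard_sources:
    for dst in vulnerable:
        for path in nx.all_simple_paths(G, src, dst):
            print(" -> ".join(path))   # candidate scenario for integration testing
```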
Developing an Automated Method for Detection of Operationally Relevant Ocean Fronts and Eddies
NASA Astrophysics Data System (ADS)
Rogers-Cotrone, J. D.; Cadden, D. D. H.; Rivera, P.; Wynn, L. L.
2016-02-01
Since the early 1990s, the U.S. Navy has utilized an observation-based process for identification of frontal systems and eddies. These Ocean Feature Assessments (OFA) rely on trained analysts to identify and position ocean features using satellite-observed sea surface temperatures. Meanwhile, as enhancements and expansion of the Navy's HYbrid Coordinate Ocean Model (HYCOM) and Regional Navy Coastal Ocean Model (RNCOM) domains have proceeded, the Naval Oceanographic Office (NAVO) has provided Tactical Oceanographic Feature Assessments (TOFA) that are based on data-validated model output but also rely on analyst identification of significant features. A recently completed project has migrated OFA production to the ArcGIS-based Acoustic Reach-back Cell Ocean Analysis Suite (ARCOAS), enabling use of additional observational datasets and significantly decreasing production time; however, it has highlighted inconsistencies inherent to this analyst-based identification process. Current efforts are focused on development of an automated method for detecting operationally significant fronts and eddies that integrates model output and observational data on a global scale. Previous attempts to employ techniques from the scientific community have been unable to meet the production tempo at NAVO. Thus, a system that incorporates existing techniques (Marr-Hildreth, Okubo-Weiss, etc.) with internally developed feature identification methods (from model-derived physical and acoustic properties) is required. Ongoing expansions to the ARCOAS toolset have shown promising early results.
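One of the named ingredients, Marr-Hildreth edge detection, can be sketched as a Laplacian-of-Gaussian applied to a sea surface temperature field, with strong zero crossings flagged as candidate front pixels. The example below uses a synthetic field and is an illustration of the technique, not the NAVO production algorithm.

```python
# Marr-Hildreth-style front detection sketch: Laplacian-of-Gaussian of a
# synthetic sea-surface-temperature field, with strong zero crossings
# flagged as candidate front pixels. Illustration only.
import numpy as np
from scipy.ndimage import gaussian_laplace

# Synthetic SST: warm water south of a sinuous east-west front, plus noise.
y, x = np.mgrid[0:200, 0:200]
front = 100 + 15 * np.sin(x / 30.0)
sst = 18.0 + 4.0 / (1.0 + np.exp((y - front) / 3.0))
sst += 0.05 * np.random.default_rng(1).standard_normal(sst.shape)

log = gaussian_laplace(sst, sigma=3.0)

# Candidate front pixels: sign change of the LoG (north-south) with a
# sufficiently steep local change.
zero_cross = (np.sign(log[:-1, :]) != np.sign(log[1:, :]))
steep = np.abs(np.diff(log, axis=0)) > 0.01
candidates = zero_cross & steep
print(f"candidate front pixels: {candidates.sum()}")
```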
75 FR 76729 - Market Access Agreement
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-09
... falls below pre-established financial performance thresholds. The draft amendment (MAA Amendment) is... System banks and the Funding Corporation that establishes certain financial performance criteria. Under... Agreement). FOR FURTHER INFORMATION CONTACT: Chris Wilson, Financial Analyst, Office of Regulatory Policy...
A data mart for operations analysis.
Isken, M W; Littig, S J; West, M
2001-01-01
In this article we describe the evolution and architecture of a data mart developed to address the modeling and analysis needs of healthcare operations analysts. More specifically, the data mart is used in projects relating to demand analysis, forecasting, capacity planning, and service system design for a healthcare system consisting of a large tertiary care hospital and a smaller community hospital. The primary focus of the mart is on the detailed movement of inpatients through each hospital, although most component data tables include outpatient information such as emergency center visits, surgical cases, cardiac catheterization cases, and short-stay visits. We show that the data mart goes well beyond consolidating data from different sources by including a number of complex, precalculated fields, data structures, and function libraries that are specific to the needs of operations analysts. We discuss several outstanding and challenging design issues that should be of interest to the data warehouse vendor community.
Simulation of linear mechanical systems
NASA Technical Reports Server (NTRS)
Sirlin, S. W.
1993-01-01
A dynamics and controls analyst is typically presented with a structural dynamics model and must perform various input/output tests and design control laws. The required time/frequency simulations need to be done many times as models change and control designs evolve. This paper examines some simple ways that open and closed loop frequency and time domain simulations can be done using the special structure of the system equations usually available. Routines were developed to run under Pro-Matlab in a mixture of the Pro-Matlab interpreter and FORTRAN (using the .mex facility). These routines are often orders of magnitude faster than trying the typical 'brute force' approach of using built-in Pro-Matlab routines such as bode. This makes the analyst's job easier since not only does an individual run take less time, but much larger models can be attacked, often allowing the whole model reduction step to be eliminated.
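The "special structure" exploited in such simulations is typically the diagonal modal form of the structural equations, which lets the frequency response be accumulated mode by mode instead of solving a full state-space problem at each frequency. A small sketch of that idea follows, in Python rather than Pro-Matlab and with illustrative modal data.

```python
# Sketch of the kind of structure exploitation described above: with a
# structural model in modal (diagonal) form, the frequency response is
# summed mode by mode rather than formed from the full state-space model
# at every frequency. All modal values are illustrative.
import numpy as np

omega_n = np.array([2.0, 9.0, 25.0]) * 2 * np.pi   # modal frequencies [rad/s]
zeta    = np.array([0.01, 0.02, 0.02])             # modal damping ratios
phi_in  = np.array([0.8, 0.3, 0.1])                # actuator modal gains
phi_out = np.array([0.6, 0.5, 0.2])                # sensor modal gains

freqs = np.linspace(0.1, 40.0, 2000) * 2 * np.pi   # evaluation grid [rad/s]

# H(w) = sum_i phi_out_i * phi_in_i / (w_i^2 - w^2 + 2j*zeta_i*w_i*w)
denom = (omega_n[:, None] ** 2 - freqs[None, :] ** 2
         + 2j * zeta[:, None] * omega_n[:, None] * freqs[None, :])
H = np.sum((phi_out * phi_in)[:, None] / denom, axis=0)

print("peak magnitude:", np.abs(H).max())
```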
Developing Analogy Cost Estimates for Space Missions
NASA Technical Reports Server (NTRS)
Shishko, Robert
2004-01-01
The analogy approach in cost estimation combines actual cost data from similar existing systems, activities, or items with adjustments for a new project's technical, physical or programmatic differences to derive a cost estimate for the new system. This method is normally used early in a project cycle when there is insufficient design/cost data to use as a basis for (or insufficient time to perform) a detailed engineering cost estimate. The major limitation of this method is that it relies on the judgment and experience of the analyst/estimator. The analyst must ensure that the best analogy or analogies have been selected, and that appropriate adjustments have been made. While analogy costing is common, there is a dearth of advice in the literature on the 'adjustment methodology', especially for hardware projects. This paper discusses some potential approaches that can improve rigor and repeatability in the analogy costing process.
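The adjustment arithmetic itself is simple, as the hypothetical sketch below shows: an analogous system's actual cost is scaled by multiplicative factors that the analyst must justify. All values are made up for illustration.

```python
# Toy illustration of the analogy adjustment arithmetic: start from the
# actual cost of a similar flight instrument and apply multiplicative
# factors for the new design's differences. All numbers are hypothetical.
analog_cost = 42.0        # $M, actual cost of the analogous instrument

adjustments = {
    "mass growth (1.3x heavier)":      1.15,
    "higher detector technology risk": 1.20,
    "design reuse of electronics":     0.90,
}

estimate = analog_cost
for reason, factor in adjustments.items():
    estimate *= factor
    print(f"{reason:38s} x{factor:<5.2f} -> {estimate:6.1f} $M")

print(f"analogy estimate for the new instrument: {estimate:.1f} $M")
```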
Nothing but the truth: self-disclosure, self-revelation, and the persona of the analyst.
Levine, Susan S
2007-01-01
The question of the analyst's self-disclosure and self-revelation inhabits every moment of every psychoanalytic treatment. All self-disclosures and revelations, however, are not equivalent, and differentiating among them allows us to define a construct that can be called the analytic persona. Analysts already rely on an unarticulated concept of an analytic persona that guides them, for instance, as they decide what constitutes appropriate boundaries. Clinical examples illustrate how self-disclosures and revelations from within and without the analytic persona feel different, for both patient and analyst. The analyst plays a specific role for each patient and is both purposefully and unconsciously different in this context than in other settings. To a great degree, the self is a relational phenomenon. Our ethics call for us to tell nothing but the truth and simultaneously for us not to tell the whole truth. The unarticulated working concept of an analytic persona that many analysts have refers to the self we step out of at the close of each session and the self we step into as the patient enters the room. Attitudes toward self-disclosure and self-revelation can be considered reflections of how we conceptualize this persona.
A method for discrimination of noise and EMG signal regions recorded during rhythmic behaviors.
Ying, Rex; Wall, Christine E
2016-12-08
Analyses of muscular activity during rhythmic behaviors provide critical data for biomechanical studies. Electrical potentials measured from muscles using electromyography (EMG) require discrimination of noise regions as the first step in analysis. An experienced analyst can accurately identify the onset and offset of EMG, but this process takes hours to analyze a short (10-15 s) record of rhythmic EMG bursts. Existing computational techniques reduce this time but have limitations. These include a universal threshold for delimiting noise regions (i.e., a single signal value for identifying the EMG signal onset and offset), pre-processing using wide time intervals that dampen sensitivity to EMG signal characteristics, poor performance when a low-frequency component (e.g., DC offset) is present, and high computational complexity leading to poor time efficiency. We present a new statistical method and MATLAB script (EMG-Extractor) that includes an adaptive algorithm to discriminate noise regions from EMG that avoids these limitations and allows multi-channel datasets to be processed. We evaluate the EMG-Extractor with EMG data on mammalian jaw-adductor muscles during mastication, a rhythmic behavior typified by low-amplitude onsets/offsets and a complex signal pattern. The EMG-Extractor consistently and accurately distinguishes noise from EMG in a manner similar to that of an experienced analyst. It outputs the raw EMG signal region in a form ready for further analysis.
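A much-simplified version of the burst-detection problem is sketched below using a moving-RMS envelope and a channel-specific multiple-of-baseline threshold on synthetic data. It illustrates the onset/offset task; it is not the EMG-Extractor's adaptive algorithm.

```python
# Simplified sketch of burst onset/offset detection on one EMG channel using
# a moving-RMS envelope and a channel-specific (rather than universal)
# threshold. Illustration of the problem, not the EMG-Extractor algorithm.
import numpy as np

def detect_bursts(emg, fs, win_s=0.025, ratio=3.0):
    """Return (onset, offset) sample indices of detected EMG bursts."""
    emg = emg - np.mean(emg)                       # crude DC-offset removal
    win = max(1, int(win_s * fs))
    rms = np.sqrt(np.convolve(emg ** 2, np.ones(win) / win, mode="same"))
    thresh = ratio * np.median(rms)                # channel-specific baseline multiple
    active = rms > thresh
    edges = np.flatnonzero(np.diff(active.astype(int)))
    onsets = edges[active[edges + 1]]              # rising edges
    offsets = edges[~active[edges + 1]]            # falling edges
    return list(zip(onsets, offsets))

# Synthetic test: noise with two 0.3 s bursts, sampled at 2 kHz.
fs = 2000
rng = np.random.default_rng(0)
sig = 0.05 * rng.standard_normal(4 * fs)
for start in (1.0, 2.5):
    i = int(start * fs)
    sig[i:i + int(0.3 * fs)] += 0.5 * rng.standard_normal(int(0.3 * fs))
print(detect_bursts(sig, fs))
```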
Hydrologic modeling for monitoring water availability in Africa and the Middle East
NASA Astrophysics Data System (ADS)
McNally, A.; Getirana, A.; Arsenault, K. R.; Peters-Lidard, C. D.; Verdin, J. P.
2015-12-01
Drought impacts water resources required by crops and communities, in turn threatening lives and livelihoods. Early warning systems, which rely on inputs from hydro-climate models, are used to help manage risk and provide humanitarian assistance to the right place at the right time. However, translating advancements in hydro-climate science into action is a persistent and time-consuming challenge: scientists and decision-makers need to work together to enhance the salience, credibility, and legitimacy of the hydrological data products being produced. One organization that tackles this challenge is the Famine Early Warning Systems Network (FEWS NET), which has been using evidence-based approaches to address food security since the 1980s. In this presentation, we describe the FEWS NET Land Data Assimilation System (FLDAS), developed by FEWS NET and NASA hydrologic scientists to maximize the use of limited hydro-climatic observations for humanitarian applications. The FLDAS, an instance of the NASA Land Information System (LIS), is comprised of land surface models driven by satellite rainfall inputs already familiar to FEWS NET food security analysts. First, we evaluate the quality of model outputs over parts of the Middle East and Africa using remotely sensed soil moisture and vegetation indices. We then describe derived water availability indices that have been identified by analysts as potentially useful sources of information. Specifically, we demonstrate how the Baseline Water Stress and Drought Severity indices detect recent water availability crisis events in the Tigris-Euphrates Basin and the Gaborone Reservoir, Botswana. Finally, we discuss ongoing work to deliver this information to FEWS NET analysts in a timely and user-friendly manner, with the ultimate goal of integrating these water availability metrics into regular decision-making activities.
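The abstract names the indices without defining them, so the sketch below shows only the generic building block of many drought-severity measures: ranking the current modeled soil moisture against its own climatology. The data and the 20th-percentile flag are illustrative assumptions, not the FLDAS definitions.

```python
# Generic percentile-of-climatology ranking of modeled soil moisture, a
# common building block of drought-severity indices. The exact FLDAS index
# definitions are not given above, so this is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(2)
# 30 years of monthly root-zone soil moisture for one grid cell (synthetic).
clim = rng.normal(loc=0.25, scale=0.04, size=(30, 12))
current = 0.17                     # this month's modeled value
month = 6                          # July (0-based)

history = clim[:, month]
percentile = 100.0 * np.mean(history <= current)
print(f"soil-moisture percentile vs. climatology: {percentile:.0f}%")
if percentile < 20:
    print("flag for analysts: below the 20th percentile (drought watch)")
```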
Visualization Techniques for Computer Network Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beaver, Justin M; Steed, Chad A; Patton, Robert M
2011-01-01
Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.
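The semi-supervised step mentioned above can be illustrated by propagating a handful of analyst-confirmed labels to a large set of unlabeled alert events; the features and labels below are synthetic, and the abstract does not specify ORCA's actual classifiers or feature set.

```python
# Sketch of a semi-supervised labeling step: propagate a few analyst-
# confirmed labels (benign / suspicious) to many unlabeled alert events.
# Features and labels are synthetic; not ORCA's actual pipeline.
import numpy as np
from sklearn.semi_supervised import LabelSpreading

rng = np.random.default_rng(0)
benign = rng.normal([0.0, 0.0], 0.5, size=(500, 2))
suspicious = rng.normal([3.0, 3.0], 0.5, size=(500, 2))
X = np.vstack([benign, suspicious])

y = np.full(len(X), -1)        # -1 marks unlabeled alerts
y[:5] = 0                      # a few analyst-confirmed benign events
y[500:505] = 1                 # a few analyst-confirmed suspicious events

model = LabelSpreading(kernel="knn", n_neighbors=10).fit(X, y)
print("events flagged suspicious:", int((model.transduction_ == 1).sum()))
```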
Damage Assessment for Disaster Relief Efforts in Urban Areas Using Optical Imagery and LiDAR Data
NASA Astrophysics Data System (ADS)
Bahr, Thomas
2014-05-01
Imagery combined with LiDAR data and LiDAR-derived products provides a significant source of geospatial data which is of use in disaster mitigation planning. Feature-rich building inventories can be constructed from tools with 3D rooftop extraction capabilities, and two-dimensional outputs such as DSMs and DTMs can be used to generate layers to support routing efforts in Spatial Analyst and Network Analyst workflows. This allows us to leverage imagery and LiDAR tools for disaster mitigation or other scenarios. Software such as ENVI, ENVI LiDAR, and ArcGIS® Spatial and Network Analyst can therefore be used in conjunction to help emergency responders route ground teams in support of disaster relief efforts. This is exemplified by a case study against the background of the magnitude 7.0 earthquake that struck Haiti's capital city of Port-au-Prince on January 12, 2010. Soon after, both LiDAR data and an 8-band WorldView-2 scene were collected to map the disaster zone. The WorldView-2 scene was orthorectified and atmospherically corrected in ENVI prior to use. ENVI LiDAR was used to extract the DSM, DTM, buildings, and debris from the LiDAR point cloud. These datasets provide a foundation for the 2D portion of the analysis. As the data were acquired over an area of dense urbanization, the majority of ground surfaces are roads, and standing buildings and debris are largely separable on the basis of elevation classes. To extract the road network of Port-au-Prince, the LiDAR-based feature height information was fused with the WorldView-2 scene using ENVI's object-based feature extraction approach. This road network was converted to a network dataset for further analysis by the ArcGIS Network Analyst. For the specific case of Haiti, the distribution of blue tarps, used as accommodations for refugees, provided a spectrally distinct target. Pure blue-tarp pixel spectra were selected from the WorldView-2 scene and input as a reference into ENVI's Spectral Angle Mapper (SAM) classification routine, together with a water-shadow mask to prevent false positives. The resulting blue-tarp shapefile was input into the ArcGIS Point Density tool, a feature of the Spatial Analyst toolbox. The final distribution map shows the density of blue tarps in Port-au-Prince and can be used to roughly delineate refugee camps. Analogously, a debris density map was generated after separating the debris elevation class. Combining this debris density map with the road network made it possible to construct an intact road network of Port-au-Prince within the ArcGIS Network Analyst. Moderate-density debris was used as a cost-increase barrier feature of the network dataset, and high-density debris was used as a total obstruction barrier feature. Based on this information, two hypothetical routing scenarios were analyzed. One involved routing a ground team between two different refugee concentration zones. For the other, potential helicopter landing zones were computed from the LiDAR-derived products and added as facility features to the Network Analyst. Routes from the helicopter landing zones to refugee concentration access points were solved using closest-facility logic, again making use of the obstructed network.
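The Spectral Angle Mapper used in this workflow reduces to a per-pixel angle between an observed spectrum and a reference spectrum. A minimal sketch follows; the 8-band reflectance values and the 0.10-radian threshold are made up for illustration and do not come from the study.

```python
# Minimal spectral angle mapper (SAM) of the kind applied above: each pixel
# spectrum is compared with a reference (blue-tarp) spectrum and accepted if
# the angle falls below a threshold. The 8-band values are hypothetical.
import numpy as np

reference = np.array([0.10, 0.14, 0.35, 0.40, 0.22, 0.12, 0.08, 0.07])  # blue tarp (made up)

def sam_angle(pixels, ref):
    """Spectral angle (radians) between each pixel spectrum and ref."""
    num = pixels @ ref
    denom = np.linalg.norm(pixels, axis=-1) * np.linalg.norm(ref)
    return np.arccos(np.clip(num / denom, -1.0, 1.0))

# A tiny 2x2 "image" with 8 bands: two tarp-like pixels, two others.
image = np.array([[[0.11, 0.15, 0.34, 0.41, 0.21, 0.11, 0.08, 0.06],
                   [0.30, 0.28, 0.25, 0.22, 0.20, 0.18, 0.17, 0.16]],
                  [[0.09, 0.13, 0.36, 0.39, 0.23, 0.13, 0.09, 0.07],
                   [0.05, 0.06, 0.07, 0.09, 0.30, 0.45, 0.50, 0.52]]])

angles = sam_angle(image, reference)
print(np.round(angles, 3))
print("blue-tarp mask:\n", angles < 0.10)   # illustrative angle threshold (radians)
```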
Code inspection instructional validation
NASA Technical Reports Server (NTRS)
Orr, Kay; Stancil, Shirley
1992-01-01
The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive) for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. The conclusions of this study follow. (1) The course is instructionally effective. (2) The simulation has a positive effect on students' confidence in their ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full-motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates that interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option that involves creation of new source code and data files but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes it will produce the most effective results.
Methods for the evaluation of alternative disaster warning systems
NASA Technical Reports Server (NTRS)
Agnew, C. E.; Anderson, R. J., Jr.; Lanen, W. N.
1977-01-01
For each of the methods identified, a theoretical basis is provided and an illustrative example is described. The example includes sufficient realism and detail to enable an analyst to conduct an evaluation of other systems. The methods discussed in the study include equal capability cost analysis, consumers' surplus, and statistical decision theory.
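As an illustration of the statistical-decision-theory method mentioned above, the sketch below compares two hypothetical warning systems by expected annual cost, combining detection probability, false-alarm rate, and system cost. All numbers are invented for the example.

```python
# Illustrative expected-cost comparison of two warning systems in the
# statistical-decision-theory spirit described above. Probabilities and
# costs are hypothetical.
p_event = 0.02                      # annual probability of the hazard

systems = {
    # name: (P(detect | event), P(false alarm), annual system cost $M)
    "System A": (0.90, 0.05, 1.0),
    "System B": (0.75, 0.01, 0.4),
}
loss_missed = 50.0                  # $M loss if the event goes unwarned
cost_false_alarm = 0.5              # $M per false alarm (evacuations, etc.)

for name, (p_detect, p_fa, annual_cost) in systems.items():
    expected = (annual_cost
                + p_event * (1 - p_detect) * loss_missed
                + (1 - p_event) * p_fa * cost_false_alarm)
    print(f"{name}: expected annual cost = {expected:.2f} $M")
```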
…focuses on thermal system and economic analysis for low-temperature and co-produced hydrothermal systems. His mechanical systems and economic analysis work has led to his most recent work with the Marine and Hydrokinetic (MHK) team, where he is one of the lead techno-economic analysts for wave and current energy.
DOT National Transportation Integrated Search
2011-06-01
This report describes an accuracy assessment of extracted features derived from three subsets of a Quickbird pan-sharpened high-resolution satellite image for the area of the Port of Los Angeles, CA. Visual Learning Systems' Feature Analyst and D...
Getting Staff to Use Data Systems
ERIC Educational Resources Information Center
Riley, Sheila
2006-01-01
In this article, John Forbes, administrative analyst for the 80,000-student Fresno Unified School District in Fresno, and Terrence Young, chief information officer for the 70,000-student Guilford County Schools in Greensboro, North Carolina, share their strategies for getting staff on board with Web-based data systems. These are the strategies:…
APMS 3.0 Flight Analyst Guide: Aviation Performance Measuring System
NASA Technical Reports Server (NTRS)
Jay, Griff; Prothero, Gary; Romanowski, Timothy; Lynch, Robert; Lawrence, Robert; Rosenthal, Loren
2004-01-01
The Aviation Performance Measuring System (APMS) is a method, embodied in software, that uses mathematical algorithms and related procedures to analyze digital flight data extracted from aircraft flight data recorders. APMS consists of an integrated set of tools used to perform two primary functions: (a) flight data importation and (b) flight data analysis.